This help page contains recommended use cases for each Adobe Analytics tool. Tools should be considered in the order they are listed. If a certain tool does not meet the need, move to the next one for consideration.
For more on Adobe Analytics product comparisons, see the product comparison documentation.
Video: a comparison of various Adobe Analytics tools.
Adobe Analytics Reporting User Interfaces
Analysis Workspace should be the go-to user interface for all of your reporting and analysis needs. Adobe continues to invest in this product and releases monthly updates. If there is a task you cannot accomplish in Analysis Workspace, consider the other interfaces below.
When there is metadata you want to associate with a collected value (eVar, prop, marketing channel)
Rule builder: use when the values collected for a variable follow a predictable format, e.g. delimited values. This approach lets you set up rules once and largely "set it and forget it".
Browser importer: use when values are not predictably formatted, or when you have a finite list of values that needs a one-time update. This approach requires ongoing monitoring of the classifications for new values.
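Conceptually, a Rule Builder rule applies a regular expression to each collected value and maps the captured pieces into classification columns. The sketch below illustrates the idea with a hypothetical delimited tracking-code format and made-up column names; the actual rules are configured in the Classification Rule Builder UI, not in code.

```python
import re

# Hypothetical delimited format for a tracking code: channel:campaign:creative
RULE = re.compile(r"^(?P<channel>[^:]+):(?P<campaign>[^:]+):(?P<creative>[^:]+)$")

def classify(tracking_code):
    """Mimic a Rule Builder regex rule: split a delimited value
    into classification columns. Returns None for non-matching values,
    which would simply remain unclassified."""
    m = RULE.match(tracking_code)
    if not m:
        return None
    return {"Channel": m.group("channel"),
            "Campaign": m.group("campaign"),
            "Creative": m.group("creative")}

print(classify("email:spring_sale:banner_a"))
```

Because the rule is defined once over the format rather than over specific values, new tracking codes that follow the format are classified automatically, which is what makes this approach "set it and forget it".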
When there is offline data you want permanently written into Adobe Analytics
Summary: simple data uploads, by day or limited dimensions
Transaction ID: data uploads that connect an online endpoint to offline data, and fully associate imported data to a visitor snapshot captured online (e.g. orders complete online, and get returned offline)
Full Processing: time-stamped data sources, processed as if they were hits collected by Adobe servers. That is, the data is inserted directly into the visitor journey.
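A summary Data Sources upload is a tab-delimited file whose header row maps columns to report suite variables. The sketch below generates such a file; the column labels and values are illustrative assumptions, so check the file format against the template generated by the Data Sources manager for your report suite.

```python
import csv
import io

def build_summary_file(rows):
    """Build a tab-delimited summary Data Sources file.
    Column labels here ("Evar 1", "Event 1") are illustrative;
    use the template from the Data Sources manager for real uploads."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
    writer.writerow(["Date", "Evar 1", "Event 1"])  # header row maps columns to variables
    for date, evar_value, event_count in rows:
        writer.writerow([date, evar_value, event_count])
    return buf.getvalue()

print(build_summary_file([("01/15/2024", "store-42", 12)]))
```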
When you engage with a 3rd-party provider that has built a supported connection with Adobe Analytics. Integration apps typically incorporate summary-level data into Adobe Analytics permanently and automatically, on a recurring basis.
Data Insertion API and Bulk Data Insertion API are both methods to submit server-side collection data to Adobe Analytics. Data Insertion API calls are made one event at a time. Bulk Data Insertion API accepts CSV-formatted files containing event data, one event per row. If you are working on a new implementation of server-side collection, we recommend using Bulk Data Insertion API.
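The sketch below assembles a gzipped CSV batch in the one-event-per-row shape that Bulk Data Insertion API expects. The column names, endpoint URL, and headers shown in the comment are assumptions based on Adobe's public documentation; verify them against the current Bulk Data Insertion API reference before use.

```python
import csv
import gzip
import io

def build_bdia_batch(events):
    """Gzip a CSV of events, one event per row, for a Bulk Data
    Insertion API upload. The field names below are assumptions;
    confirm required columns in Adobe's BDIA documentation."""
    fieldnames = ["timestamp", "marketingCloudVisitorID", "reportSuiteID", "pageName"]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(events)
    return gzip.compress(buf.getvalue().encode("utf-8"))

events = [{"timestamp": "1712345678",
           "marketingCloudVisitorID": "1234",
           "reportSuiteID": "examplersid",
           "pageName": "Home Page"}]
payload = build_bdia_batch(events)

# Uploading the payload (endpoint, header names, and auth are assumptions;
# see Adobe's BDIA docs for the current values):
# requests.post("https://analytics-collection.adobe.io/aa/collect/v1/events",
#               headers={"Authorization": "Bearer <token>"},
#               files={"file": ("events.csv.gz", payload)})
```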
If you want to incorporate Adobe Audience Manager (AAM) audience data such as demographic information (e.g. gender or income level), psychographic information (e.g. interests and hobbies), CRM data, or ad impression data into any Analytics workflow.
If you want uploaded CRM data to be time based, because this integration sends new information to Analytics hit by hit.
To utilize the most granular data feed we can provide (visitor ID, hit).
If the client wants Adobe data stored in a client-side database, at the most granular level we can send.
If the client wants to develop a Business Intelligence (BI) tool or input hit-level Adobe data into a 3rd-party tool.
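A data feed typically delivers hit-level data as a tab-delimited file accompanied by a separate column-headers file. The sketch below joins the two into per-hit records; the column names and sample rows are hypothetical stand-ins for the real feed files.

```python
import csv
import io

# Illustrative stand-ins for a feed's column_headers.tsv and hit_data.tsv.
COLUMN_HEADERS = "hitid_high\thitid_low\tvisid_high\tvisid_low\tpagename\n"
HIT_DATA = ("1\t2\t10\t20\tHome Page\n"
            "1\t3\t10\t20\tProduct Page\n")

def parse_feed(headers_tsv, data_tsv):
    """Zip the column-headers file with each tab-delimited data row
    to produce one dict per hit."""
    columns = headers_tsv.strip().split("\t")
    reader = csv.reader(io.StringIO(data_tsv), delimiter="\t")
    return [dict(zip(columns, row)) for row in reader]

hits = parse_feed(COLUMN_HEADERS, HIT_DATA)
print(hits[0]["pagename"])
```

Loading the records into a client-side database or a BI tool from here is straightforward, since each hit is already a flat key-value record.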
Reporting APIs should be used when the other visualization options do not meet your needs. The three API options are:
Fully Processed: when you want feature-rich data (including visits, visitors, and segments). This is typical Analytics UI summarized data, available within ~30-90 minutes. Can be used through Report Builder.
Real-Time: when you want to view a few metrics and dimensions with seconds of latency. This is limited, partially processed, summarized data that is available within ~30 seconds. Includes unique algorithms for most popular, gainers, and losers. Can be used through Report Builder.
Live Stream: when you want a stream of partially-processed hit-level Analytics data within seconds of collection. This is partially processed data, available within ~30 seconds. Available for Analytics Premium only. Requires some way to visualize the data, typically through an Engineering Services engagement.
Engineering Services should be used when:
The other Adobe tools don’t meet your needs.
You want a custom experience.
You want a fully automated solution.
You want to reach many devices.
You have multiple data sources.
You have complex data ETL (Extract-Transform-Load) requirements.