This topic provides answers to common questions.
For transaction ID data sources to tie offline data to online events, you must enable Transaction ID Recording. See Transaction ID Recording for more information.
Data Sources does not carry any additional fees beyond standard server calls. Server-call charges apply only to full-processing data source types, where individual hits are sent in as rows of data. Traffic and aggregate-level data sources do not incur additional costs.
Each row in a Data Sources file that begins with a pound sign (#) is treated as a comment.
Yes. Because many marketing reports are keyed from the date column, Data Sources requires a date column.
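Putting these two points together, a minimal Data Sources file might look like the following (tab-delimited; the column names and values here are hypothetical, for illustration only):

```
# Call center revenue upload -- rows beginning with # are comments
Date	Evar 1	Event 1
01/15/2015	call-center	1200.00
01/16/2015	call-center	980.50
```

Note that the date column is present even though every other column is optional.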
Adobe recommends you select new, unused variables to import data using Data Sources. If you are uncertain about the configuration of your data file, or want to better understand the risks of re-using variables, contact Customer Care.
Processing pauses if the total size of pending uploads exceeds 50 MB and does not resume until the total drops below 50 MB. To limit delays in generating reports, do not upload more than 90 days of data per day.
Data Sources data never overwrite existing report data. Instead, data uploaded using Data Sources is added to the existing data.
When you upload Data Sources data, you are uploading the metrics that will be available in the report interface.
For example, if you are uploading Call Center Revenue for products you sell on your site, you can have that Call Center Revenue in the same report as Online Revenue. However, you will not be able to use it in conjunction with Visits, because you didn’t upload the number of Visits with it. Adobe can only report on the metrics and elements that you uploaded through Data Sources (in addition to the regular marketing report metrics).
The value is decreased accordingly.
The Traffic data source uploads much faster because it merely updates summary values in the appropriate tables. A Generic data source with conversion data (events and so on) creates a hit for every value in each column to be processed.
The example above creates 553 hits to be processed in the cache system.
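The counting rule can be sketched in a few lines of Python (the rows and values below are made up; this illustrates the one-hit-per-populated-value rule, not Adobe's implementation):

```python
# Hypothetical generic data source rows: (date, event1, event2).
# One hit is created for each populated event value.
rows = [
    ("01/15/2015", "1200.00", ""),   # one populated event -> 1 hit
    ("01/16/2015", "980.50", "3"),   # two populated events -> 2 hits
    ("01/17/2015", "", ""),          # no populated events  -> 0 hits
]
hits = sum(1 for _, e1, e2 in rows for value in (e1, e2) if value)
print(hits)  # 3
```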
Because the Data Source process (for Generic, non-Traffic data sources) builds individual hits that are processed through the cache, the subrelation process runs but the correlation process does not. Pathing could in principle be processed, but each hit would constitute its own visit, so no pathing is generated. Pathing data is generated for web log imports.
If the extension of a Data Source upload file or a classification file is capitalized, the file is not processed; Data Source upload file extensions must be lowercase. For example, file.TXT, file.FIN, .TAB, and .FIN are not processed, while .txt and .fin are.
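If you generate filenames programmatically, it may be safest to normalize the extension before upload. A small sketch (the function name is ours, not Adobe's):

```python
from pathlib import Path

def normalized_name(filename: str) -> str:
    """Return the filename with its extension lowercased,
    since files with uppercase extensions are not processed."""
    p = Path(filename)
    return p.stem + p.suffix.lower()

print(normalized_name("file.TXT"))  # file.txt
print(normalized_name("file.FIN"))  # file.fin
```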
You can add as many events as you like; however, the wizard allows only three. Once the template file is created, you can add more events as needed.
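For example, after the wizard generates the template, extra event columns can be appended by hand to the header row. A hypothetical header carrying five events rather than the wizard's three:

```
Date	Evar 1	Event 1	Event 2	Event 3	Event 4	Event 5
```

Each data row then supplies a value (or leaves a blank) for every column in the header.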
If you have a Data Source file where one or more of the records do not have the same number of columns as the header record, the following will occur.
Data Sources information can be rolled up; however, Adobe Customer Care must reprocess the rollup from the historical date to include the historical data. For example, if the current date is 31 October 2015 and you upload data for 1-15 August 2015 using Data Sources, the rollup must be set to reprocess beginning with 1 August 2015, so that the newly imported data is included.
Also note that data should never be uploaded directly into a rollup report suite using Data Sources. If you need this data included in a rollup, it should be imported into a standard report suite, also called a child suite to the rollup. Contact Adobe Customer Care for more information.
Data Sources does not report data on an hourly basis. When you run a report for a single day, the data can be broken down only by hour, so nothing shows in the report. Data appears only at a granularity of daily or higher, which you can get by running a weekly or monthly report.
The number of Unique Visitors in a web-server log is calculated as the number of distinct combinations of IP Address and User Agent in the web log. Each unique combination of these two items counts as a Unique Visitor. If the User Agent column is blank (or not included in the web log), Unique Visitor counts cannot be identified, and the entire upload counts as just one Unique Visitor (even if there are multiple IP addresses).
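The counting rule can be sketched in a few lines (the IP addresses and agents below are made up; this illustrates the rule as described, not Adobe's implementation):

```python
def unique_visitors(records):
    """Count Unique Visitors as the web-log import does: each
    distinct (IP address, User-Agent) pair is one visitor.  If the
    User-Agent column is blank for every record, the whole upload
    counts as a single Unique Visitor."""
    if not records:
        return 0
    if all(not ua for _, ua in records):
        return 1
    return len({(ip, ua) for ip, ua in records})

log = [
    ("203.0.113.7", "Mozilla/5.0"),
    ("203.0.113.7", "Mozilla/5.0"),  # duplicate pair, same visitor
    ("203.0.113.9", "Mozilla/5.0"),  # new IP, new visitor
    ("203.0.113.7", "curl/8.0"),     # new agent, new visitor
]
print(unique_visitors(log))                          # 3
print(unique_visitors([(ip, "") for ip, _ in log]))  # 1
```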
In Data Sources, the report suite ID is the first part of the login appended by a random number that identifies the specific data source that was set up. For example,
In version 15, Data Sources behave differently based on the source type:
The data feed contains any transaction ID metrics that have been received. However, if you upload transaction ID data for a date in the past, the only way to get that data is to download the data feed again for that day.
No for full processing; yes for transaction ID. Full-processing data sources are processed using separate visitor profiles, so even if the visitor IDs match, the hits are not tied together from an eVar allocation perspective. Transaction ID data sources are tied to the main visitor profile, so persisting eVars are allocated to events uploaded using transaction ID.
No. eVars uploaded via Transaction ID data sources will only read from the stored profile info, not update the profile.
No. eVars are the only variables that are saved in the snapshot of the visitor profile.
Full processing supports only legacy events list formats, excluding numeric, currency, or counter (greater than 1) event values passed directly in the events list, that is, "eventNN=#.##". In other words, a counter event is supported only when it is passed in the events column of the data source file, and it increments by 1.
If numeric, currency, or counter (greater than 1) events are required, use the product list:
s.products="Footwear;Running Shoes;1;99.99;event1=4.50";
s.products="Footwear;Running Shoes;1;99.99;event1=4.50|event4=1.99";
After the .fin file is uploaded, it’s important that you log out of the Data Sources FTP site. The reason is that Analytics uses logout events as a trigger to indicate that files are ready for processing. If you are programmatically uploading the files, it is important that your automated process also logs out of the FTP site after the files have been uploaded.
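A minimal upload sketch using the standard library's ftplib, assuming the .fin trigger file shares the data file's base name (the host and credentials are placeholders, not real endpoints):

```python
from ftplib import FTP
from pathlib import Path

def fin_name(data_file: str) -> str:
    """The .fin trigger file with the same base name as the data file."""
    return str(Path(data_file).with_suffix(".fin"))

def upload(host: str, user: str, password: str, data_file: str) -> None:
    """Upload the data file, then the .fin file, then log out --
    the logout is what signals that the files are ready for processing."""
    ftp = FTP(host)
    try:
        ftp.login(user, password)
        for name in (data_file, fin_name(data_file)):
            with open(name, "rb") as f:
                ftp.storbinary(f"STOR {Path(name).name}", f)
    finally:
        ftp.quit()  # explicit logout triggers processing
```

The `finally` clause matters: even if an upload fails midway, the session is closed cleanly rather than left dangling.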
Verify that your filenames follow the correct format. Leading or trailing whitespace in a filename prevents the file from being recognized and picked up by the Adobe ingestion process.