Sensor automates the acquisition of data from your Internet channel by eliminating most of the manual work traditionally involved in data collection.
In many cases, using Sensor can vastly simplify your data management process.
Today’s large Internet, extranet, and intranet sites often run on an array of web servers. The logs and data they produce can be very large and cumbersome to manage. For example, if your site runs on 30 web servers, an employee (yours or your outsourced service provider’s) typically must pull and consolidate the log files from all 30 servers before running reports on them. Installing Sensor on each of your web servers automates this entire process, reducing your expenses and making data available in real time.
To automate this process, Sensor collects raw information about the traffic on a website directly from each web server. The raw data that Sensor captures is called event data and is similar to the type of data that your web server records in its log files.
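To illustrate the kind of per-request information involved, the sketch below parses a standard Common Log Format line (the record format many web servers write) into named fields. This is only an analogy for the reader: the line, field names, and parsing code here are illustrative, not Sensor's actual event data format.

```python
import re

# A Common Log Format line, the kind of per-request record a web
# server writes to its log files; Sensor's event data captures
# similar information about each HTTP request (illustrative only).
line = '192.0.2.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326'

pattern = (r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
           r'"(?P<method>\S+) (?P<uri>\S+) \S+" '
           r'(?P<status>\d+) (?P<bytes>\d+)')
fields = re.match(pattern, line).groupdict()
# fields now maps names like "host", "uri", and "status" to values
```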
To capture this data, instrumentation within Sensor records information about each HTTP request that your web server processes. Sensor then buffers the information to protect against network failure and securely transmits the information via HTTP/S to the data workbench server that you specify.
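The buffer-then-transmit pattern described above can be sketched as follows. This is a minimal illustration of the general technique, assuming a hypothetical `EventBuffer` class and `send` callable; it is not Sensor's actual implementation.

```python
import json
from collections import deque

class EventBuffer:
    """Sketch of a buffer-then-transmit pattern: events are held
    locally until delivery succeeds, so a network failure cannot
    lose them. Names and structure are illustrative assumptions."""

    def __init__(self, send):
        self.send = send        # transmits one serialized event; may raise OSError
        self.pending = deque()  # events buffered until delivery succeeds

    def record(self, event):
        # Buffer every event before any transmission attempt.
        self.pending.append(event)

    def flush(self):
        # Deliver buffered events in order; on failure, stop and
        # keep the undelivered events for a later retry.
        delivered = 0
        while self.pending:
            try:
                self.send(json.dumps(self.pending[0]))
            except OSError:
                break           # network failure: nothing is lost
            self.pending.popleft()
            delivered += 1
        return delivered

# Usage: a transport that fails while the network is "down".
sent = []
network_up = False
def transport(payload):
    if not network_up:
        raise OSError("network down")
    sent.append(payload)

buf = EventBuffer(transport)
buf.record({"uri": "/index.html"})
buf.record({"uri": "/about.html"})
first = buf.flush()    # network down: delivers 0, keeps both events
network_up = True
second = buf.flush()   # network restored: delivers both events
```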
After the data workbench server receives the data, it processes the data and stores it in highly compressed .vsl format files, allowing you to maintain very large amounts of data easily on inexpensive hardware.
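The .vsl format itself is proprietary, but the storage benefit of compressing repetitive, log-like event data can be illustrated generically. The sketch below, using Python's standard `gzip` module (not the actual .vsl codec), shows that such records compress well and round-trip losslessly.

```python
import gzip
import json

# Synthetic log-like event records (illustrative data, not .vsl content).
events = [
    {"timestamp": 1700000000 + i, "uri": f"/page/{i % 20}", "status": 200}
    for i in range(1000)
]
raw = "\n".join(json.dumps(e) for e in events).encode("utf-8")

# Repetitive per-request records compress to a fraction of their raw size.
packed = gzip.compress(raw)

# Decompression recovers the original data exactly (lossless storage).
restored = gzip.decompress(packed)
```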
For information about the event data fields collected by Sensor in .vsl files, see Event Data Record Fields.