Tracking Overview
This documentation covers tracking in version 2.x of the SDK.
Player Events
Tracking core playback includes tracking media load, media start, media pause, and media complete. Although not mandatory, tracking buffering and seeking are also core components of content playback tracking. In your media player API, identify the player events that correspond to the Media SDK tracking calls, and code your event handlers to call the tracking APIs and to populate the required and optional variables.
On media load
- Create the media object
- Populate metadata
- Call trackSessionStart; for example: trackSessionStart(mediaObject, contextData)
On media start
- Call trackPlay
On pause/resume
- Call trackPause
- Call trackPlay when playback resumes
On media complete
- Call trackComplete
On media abort
- Call trackSessionEnd
When scrubbing starts
- Call trackEvent(SeekStart)
When scrubbing ends
- Call trackEvent(SeekComplete)
When buffering starts
- Call trackEvent(BufferStart)
When buffering ends
- Call trackEvent(BufferComplete)
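One way to wire this mapping up is sketched below: listeners on an HTML5 video element forward each player event to the corresponding tracking call. This is only an illustration; it assumes an already-configured Media Heartbeat instance named mediaHeartbeat, omits the media-load/session-start handling covered in the Implement steps below, and uses the HTML5 waiting and playing events as approximate stand-ins for buffer start/complete (the full example at the end of this page uses player-specific buffering/buffered events instead).

/* Sketch: forward HTML5 <video> events to the corresponding tracking calls.
   Assumes `mediaHeartbeat` is an existing, configured Media Heartbeat instance
   and that trackSessionStart has already been called on media load. */
var video = document.getElementsByTagName("video")[0];

video.addEventListener("play", function() { mediaHeartbeat.trackPlay(); });      // media start / resume
video.addEventListener("pause", function() { mediaHeartbeat.trackPause(); });    // pause
video.addEventListener("seeking", function() { mediaHeartbeat.trackEvent(MediaHeartbeat.Event.SeekStart); });
video.addEventListener("seeked", function() { mediaHeartbeat.trackEvent(MediaHeartbeat.Event.SeekComplete); });
video.addEventListener("waiting", function() { mediaHeartbeat.trackEvent(MediaHeartbeat.Event.BufferStart); });   // buffering starts
video.addEventListener("playing", function() { mediaHeartbeat.trackEvent(MediaHeartbeat.Event.BufferComplete); }); // buffering ends (gate on a prior BufferStart in production)
video.addEventListener("ended", function() {
    mediaHeartbeat.trackComplete();    // content watched to the end
    mediaHeartbeat.trackSessionEnd();  // close the tracking session
});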
Implement
1. Initial tracking setup - Identify when the user triggers the intention of playback (the user clicks play and/or autoplay is on) and create a MediaObject instance using the media information for content name, content ID, content length, and stream type.

   MediaObject reference:

   | Variable Name | Description | Required |
   |---|---|---|
   | name | Content name | Yes |
   | mediaid | Content unique identifier | Yes |
   | length | Content length | Yes |
   | streamType | Stream type | Yes |
   | mediaType | Media type (audio or video content) | Yes |

   StreamType constants:

   | Constant Name | Description |
   |---|---|
   | VOD | Stream type for Video on Demand. |
   | LIVE | Stream type for Live content. |
   | LINEAR | Stream type for Linear content. |
   | AOD | Stream type for audio on demand. |
   | AUDIOBOOK | Stream type for audio book. |
   | PODCAST | Stream type for Podcast. |

   MediaType constants:

   | Constant Name | Description |
   |---|---|
   | Audio | Media type for Audio streams. |
   | Video | Media type for Video streams. |

   The general format for creating the MediaObject is:
   MediaHeartbeat.createMediaObject(<MEDIA_NAME>, <MEDIA_ID>, <MEDIA_LENGTH>, <STREAM_TYPE>, <MEDIA_TYPE>);
2. Attach metadata - Optionally attach standard and/or custom metadata objects to the tracking session through context data variables.

   - Standard metadata

     NOTE: Attaching the standard metadata object to the media object is optional.

     Instantiate a standard metadata object, populate the desired variables, and set the metadata object on the Media Heartbeat object. See the comprehensive list of metadata here: Audio and video parameters.

   - Custom metadata - Create a variable object for the custom variables and populate it with the data for this content.
3. Track the intention to start playback - To begin tracking a session, call trackSessionStart on the Media Heartbeat instance (see the audio sketch after this list).

   IMPORTANT: trackSessionStart tracks the user's intention of playback, not the beginning of playback. This API is used to load the data/metadata and to estimate the time-to-start QoS metric (the time duration between trackSessionStart and trackPlay).

   NOTE: If you are not using custom metadata, simply send an empty object for the data argument in trackSessionStart.
4. Track the actual start of playback - Identify the event from the media player for the beginning of playback, where the first frame of the content is rendered on the screen, and call trackPlay.
5. Track the completion of playback - Identify the event from the media player for the completion of playback, where the user has watched the content until the end, and call trackComplete.
6. Track the end of the session - Identify the event from the media player for the unloading/closing of playback, where the user closes the content and/or the content is completed and has been unloaded, and call trackSessionEnd.

   IMPORTANT: trackSessionEnd marks the end of a tracking session. If the session was successfully watched to completion, where the user watched the content until the end, ensure that trackComplete is called before trackSessionEnd. Any other track* API call is ignored after trackSessionEnd, except for trackSessionStart for a new tracking session.
7. Track all possible pause scenarios - Identify the event from the media player for pause and call trackPause.

   Pause scenarios - Identify any scenario in which the player will pause and make sure that trackPause is properly called. The following scenarios all require that your app call trackPause() (see the background-pause sketch after this list):

   - The user explicitly hits pause in the app.
   - The player puts itself into the Pause state.
   - (Mobile apps) The user puts the application into the background, but you want the app to keep the session open.
   - (Mobile apps) Any type of system interrupt occurs that causes the application to be backgrounded. For example, the user receives a call, or a pop-up from another application appears, but you want the application to keep the session alive to give the user the opportunity to resume the content from the point of interruption.
8. Identify the event from the player for play and/or resume from pause and call trackPlay.

   TIP: This may be the same event source that was used in Step 4. Ensure that each trackPause() API call is paired with a following trackPlay() API call when playback resumes.
9. Listen for playback seeking events from the media player. On the seek start event notification, track seeking using the SeekStart event.

10. On the seek complete notification from the media player, track the end of seeking using the SeekComplete event.

11. Listen for playback buffering events from the media player, and on the buffer start event notification, track buffering using the BufferStart event.

12. On the buffer complete notification from the media player, track the end of buffering using the BufferComplete event.
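As a minimal, hypothetical sketch of steps 1 through 3 for audio content, the snippet below creates a MediaObject for an audio-on-demand stream using the StreamType and MediaType constants from the tables above, then starts the session with an empty custom-metadata object (as permitted by the note in step 3). The media name, ID, and length values are placeholders, and mediaHeartbeat is assumed to be an already-configured Media Heartbeat instance.

/* Sketch for steps 1-3: audio-on-demand content, no custom metadata */
var audioInfo = MediaHeartbeat.createMediaObject(
    "Sample audio episode",          // <MEDIA_NAME> (placeholder)
    "audio-12345",                   // <MEDIA_ID> (placeholder)
    1800,                            // <MEDIA_LENGTH> in seconds (placeholder)
    MediaHeartbeat.StreamType.AOD,   // audio on demand
    MediaHeartbeat.MediaType.Audio);

// No custom metadata for this content, so pass an empty object as the data argument.
mediaHeartbeat.trackSessionStart(audioInfo, {});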
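For the background/interrupt pause scenarios in step 7, a browser-based player can use the Page Visibility API as one possible trigger; native mobile apps would hook the equivalent lifecycle events on their platform instead. The sketch below is only an illustration for the browser case and assumes the same mediaHeartbeat instance and video element used elsewhere on this page.

/* Sketch for step 7: pause tracking when a browser page is backgrounded */
document.addEventListener("visibilitychange", function() {
    if (document.hidden) {
        // Keep the tracking session open, but report playback as paused.
        mediaHeartbeat.trackPause();
    } else if (!video.paused) {
        // Pair the earlier trackPause() with trackPlay() once playback resumes.
        mediaHeartbeat.trackPlay();
    }
});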
See examples of each step in the following platform-specific topics, and look at the sample players included with your SDKs.
For a simple example of playback tracking, see this use of the JavaScript 2.x SDK in an HTML5 player:
/* Call on media start */
if (e.type == "play") {
// Check for start of media
if (!sessionStarted) {
/* Set media info */
/* MediaHeartbeat.createMediaObject(<MEDIA_NAME>,
<MEDIA_ID>,
<MEDIA_LENGTH>,
<MEDIA_STREAMTYPE>,
<MEDIA_MEDIATYPE>);*/
var mediaInfo = MediaHeartbeat.createMediaObject(
    document.getElementsByTagName('video')[0].getAttribute("name"),
    document.getElementsByTagName('video')[0].getAttribute("id"),
    video.duration,
    MediaHeartbeat.StreamType.VOD,
    MediaHeartbeat.MediaType.Video);
/* Set custom context data */
var customVideoMetadata = {
isUserLoggedIn: "false",
tvStation: "Sample TV station",
programmer: "Sample programmer"
};
/* Set standard video metadata */
var standardVideoMetadata = {};
standardVideoMetadata[MediaHeartbeat.VideoMetadataKeys.EPISODE] = "Sample Episode";
standardVideoMetadata[MediaHeartbeat.VideoMetadataKeys.SHOW] = "Sample Show";
mediaInfo.setValue(MediaHeartbeat.MediaObjectKey.StandardVideoMetadata,
standardVideoMetadata);
// Start Session
this.mediaHeartbeat.trackSessionStart(mediaInfo, customVideoMetadata);
// Track play
this.mediaHeartbeat.trackPlay();
sessionStarted = true;
} else {
// Track play for resuming playback
this.mediaHeartbeat.trackPlay();
}
};
/* Call on video complete */
if (e.type == "ended") {
console.log("video ended");
this.mediaHeartbeat.trackComplete();
this.mediaHeartbeat.trackSessionEnd();
sessionStarted = false;
};
/* Call on pause */
if (e.type == "pause") {
this.mediaHeartbeat.trackPause();
};
/* Call on scrub start */
if (e.type == "seeking") {
this.mediaHeartbeat.trackEvent(MediaHeartbeat.Event.SeekStart);
};
/* Call on scrub stop */
if (e.type == "seeked") {
this.mediaHeartbeat.trackEvent(MediaHeartbeat.Event.SeekComplete);
};
/* Call on buffer start */
if (e.type == "buffering") {
this.mediaHeartbeat.trackEvent(MediaHeartbeat.Event.BufferStart);
};
/* Call on buffer complete */
if (e.type == "buffered") {
this.mediaHeartbeat.trackEvent(MediaHeartbeat.Event.BufferComplete);
};
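The example above closes the session only on the ended event. For the media abort case (the user leaves before the content finishes), one possible approach, sketched below, is to call trackSessionEnd without trackComplete when the page unloads. It reuses the sessionStarted flag from the example and assumes mediaHeartbeat refers to the same Media Heartbeat instance stored on this.mediaHeartbeat above.

/* Call on media abort: the user leaves before playback completes */
window.addEventListener("beforeunload", function() {
    if (sessionStarted) {
        // No trackComplete() here, because the content was not watched to the end.
        mediaHeartbeat.trackSessionEnd();
        sessionStarted = false;
    }
});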
Validate
For information on validating your legacy implementation, see Legacy Validation.