Tracking Overview

This documentation covers tracking in version 2.x of the SDK.


If you are implementing a 1.x version of the SDK, you can download the 1.x Developer Guides here: Download SDKs.

Player Events

Tracking core playback includes tracking media load, media start, media pause, and media complete. Although not mandatory, tracking buffering and seeking is also a core component of content playback tracking. In your media player API, identify the player events that correspond to the Media SDK tracking calls, and code your event handlers to call the tracking APIs and to populate the required and optional variables.

On media load

  • Create the media object
  • Populate metadata
  • Call trackSessionStart, for example: trackSessionStart(mediaObject, contextData)

On media start

  • Call trackPlay

On pause/resume

  • Call trackPause
  • Call trackPlay when playback resumes

On media complete

  • Call trackComplete

On media abort

  • Call trackSessionEnd

When scrubbing starts

  • Call trackEvent(SeekStart)

When scrubbing ends

  • Call trackEvent(SeekComplete)

When buffering starts

  • Call trackEvent(BufferStart)

When buffering ends

  • Call trackEvent(BufferComplete)

The playhead position is set as part of the set-up and configuration code. For more information about getCurrentPlayheadTime, see Overview: General Implementation Guidelines.
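The SDK reads the playhead through a callback supplied during setup (referred to above as getCurrentPlayheadTime; in the JavaScript SDK this lives on a delegate object passed to the Media Heartbeat constructor). A minimal sketch, using a plain object in place of the SDK delegate and a simulated player:

```javascript
// Simulated player; in a real implementation this would be your HTML5 <video> element.
var player = { currentTime: 0 };

// Plain object standing in for the SDK's delegate. The method name follows the
// text above; check your platform's SDK reference for the exact signature.
var delegate = {
    getCurrentPlayheadTime: function () {
        // Return the current playhead position in seconds.
        return player.currentTime;
    }
};

player.currentTime = 42.5;
console.log(delegate.getCurrentPlayheadTime()); // 42.5
```

Because the SDK polls this callback throughout the session, it must always return the live player position rather than a cached value.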


  1. Initial tracking setup - Identify when the user triggers the intention to play (the user clicks play and/or autoplay is on) and create a MediaObject instance using the media information for content name, content ID, content length, stream type, and media type.

    MediaObject reference:

    Variable Name   Description                           Required
    name            Content name                          Yes
    mediaid         Content unique identifier             Yes
    length          Content length                        Yes
    streamType      Stream type                           Yes
    mediaType       Media type (audio or video content)   Yes

    StreamType constants:

    Constant Name   Description
    VOD             Stream type for Video on Demand.
    LIVE            Stream type for Live content.
    LINEAR          Stream type for Linear content.
    AOD             Stream type for Audio on Demand.
    AUDIOBOOK       Stream type for Audiobook.
    PODCAST         Stream type for Podcast.

    MediaType constants:

    Constant Name   Description
    Audio           Media type for Audio streams.
    Video           Media type for Video streams.

    The general format for creating the MediaObject is MediaHeartbeat.createMediaObject(<MEDIA_NAME>, <MEDIA_ID>, <MEDIA_LENGTH>, <STREAM_TYPE>, <MEDIA_TYPE>);
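Filling in the placeholders with sample values, the call might look like the sketch below. The stub factory only illustrates the shape of the call; in a real app the MediaHeartbeat object comes from the SDK, and the name, ID, and length are your own content values:

```javascript
// Stub standing in for the SDK's MediaHeartbeat factory and constants (illustration only).
var MediaHeartbeat = {
    StreamType: { VOD: "vod" },
    MediaType: { Video: "video" },
    createMediaObject: function (name, id, length, streamType, mediaType) {
        return { name: name, id: id, length: length,
                 streamType: streamType, mediaType: mediaType };
    }
};

// Sample values; length is the content duration in seconds.
var mediaObject = MediaHeartbeat.createMediaObject(
    "Sample Video Name",            // <MEDIA_NAME>
    "sample-video-id",              // <MEDIA_ID>
    600,                            // <MEDIA_LENGTH>
    MediaHeartbeat.StreamType.VOD,  // <STREAM_TYPE>
    MediaHeartbeat.MediaType.Video  // <MEDIA_TYPE>
);
```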

  2. Attach metadata - Optionally attach standard and/or custom metadata objects to the tracking session through context data variables.

    • Standard metadata -


      Attaching the standard metadata object to the media object is optional.

      Instantiate a standard metadata object, populate the desired variables, and set the metadata object on the Media Heartbeat object.

      See the comprehensive list of metadata here: Audio and video parameters.

    • Custom metadata - Create a variable object for the custom variables and populate with the data for this content.
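A sketch of both metadata types, assuming stubbed key constants (the real standard metadata keys are constants defined by the SDK, e.g. on MediaHeartbeat.VideoMetadataKeys in JavaScript; the custom variables are free-form):

```javascript
// Stubbed standard metadata key constants; the real values are defined by the SDK.
var VideoMetadataKeys = { EPISODE: "episode", SHOW: "show" };

// Standard metadata: populate only the keys you need.
var standardVideoMetadata = {};
standardVideoMetadata[VideoMetadataKeys.EPISODE] = "Sample Episode";
standardVideoMetadata[VideoMetadataKeys.SHOW] = "Sample Show";

// Custom metadata: a plain object holding your own context data variables.
var customVideoMetadata = {
    isUserLoggedIn: "false",
    tvStation: "Sample TV station"
};
```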

  3. Track the intention to start playback - To begin tracking a session, call trackSessionStart on the Media Heartbeat instance.


    trackSessionStart tracks the user's intention to play, not the beginning of playback. This API is used to load the data/metadata and to estimate the time-to-start QoS metric (the time between trackSessionStart and trackPlay).


    If you are not using custom metadata, simply send an empty object for the data argument in trackSessionStart.
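A sketch of the no-custom-metadata case, with a stub recording tracker standing in for the real Media Heartbeat instance:

```javascript
// Stub recording tracker, standing in for the SDK's Media Heartbeat instance.
var calls = [];
var mediaHeartbeat = {
    trackSessionStart: function (mediaObject, contextData) {
        calls.push({ api: "trackSessionStart", media: mediaObject, data: contextData });
    }
};

// Illustrative media object; see step 1 for how this is created.
var mediaObject = { name: "Sample Video", id: "sample-id", length: 600 };

// No custom metadata: send an empty object as the data argument.
mediaHeartbeat.trackSessionStart(mediaObject, {});
console.log(calls[0].api); // "trackSessionStart"
```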

  4. Track the actual start of playback - Identify the event from the media player for the beginning of the playback, where the first frame of the content is rendered on the screen, and call trackPlay.

  5. Track the completion of playback - Identify the event from the media player for the completion of the playback, where the user has watched the content until the end, and call trackComplete.

  6. Track the end of the session - Identify the event from the media player for the unloading/closing of the playback, where the user closes the content and/or the content is completed and has been unloaded, and call trackSessionEnd.


    trackSessionEnd marks the end of a tracking session. If the session was successfully watched to completion, where the user watched the content until the end, ensure that trackComplete is called before trackSessionEnd. Any other track* API call is ignored after trackSessionEnd, except for trackSessionStart for a new tracking session.
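The ordering rule above can be sketched with a stub tracker: on a watched-to-completion unload, trackComplete fires first, then trackSessionEnd, and later track* calls are dropped. The stub mimics this behavior for illustration; the real SDK enforces it internally:

```javascript
var calls = [];
var sessionActive = true;

// Stub standing in for the SDK instance, ignoring calls after trackSessionEnd.
var mediaHeartbeat = {
    trackComplete:   function () { if (sessionActive) calls.push("trackComplete"); },
    trackSessionEnd: function () {
        if (sessionActive) { calls.push("trackSessionEnd"); sessionActive = false; }
    },
    trackPlay:       function () { if (sessionActive) calls.push("trackPlay"); }
};

// User watched to the end: complete first, then end the session.
mediaHeartbeat.trackComplete();
mediaHeartbeat.trackSessionEnd();
mediaHeartbeat.trackPlay(); // ignored after trackSessionEnd

console.log(calls); // ["trackComplete", "trackSessionEnd"]
```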

  7. Track all possible pause scenarios - Identify the event from the media player for pause and call trackPause.

    Pause Scenarios - Identify any scenario in which the Player will pause and make sure that trackPause is properly called. The following scenarios all require that your app call trackPause():

    • The user explicitly hits pause in the app.
    • The player puts itself into the Pause state.
    • (Mobile Apps) - The user puts the application into the background, but you want the app to keep the session open.
    • (Mobile Apps) - Any type of system interrupt occurs that causes an application to be backgrounded. For example, the user receives a call, or a pop up from another application occurs, but you want the application to keep the session alive to give the user the opportunity to resume the content from the point of interruption.
  8. Identify the event from the player for play and/or resume from pause and call trackPlay.


    This may be the same event source that was used in Step 4. Ensure that each trackPause() API call is paired with a following trackPlay() API call when the playback resumes.
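One way to keep trackPause and trackPlay paired is to track the paused state explicitly in your handlers. A sketch, again with a stub recording tracker in place of the SDK instance:

```javascript
var calls = [];
var paused = false;

// Stub recording tracker (illustration only).
var mediaHeartbeat = {
    trackPause: function () { calls.push("trackPause"); },
    trackPlay:  function () { calls.push("trackPlay"); }
};

function onPause() {
    // Guard against duplicate pause events from the player.
    if (!paused) { paused = true; mediaHeartbeat.trackPause(); }
}

function onPlay() {
    // The same handler serves first play and resume-from-pause.
    paused = false;
    mediaHeartbeat.trackPlay();
}

onPause();
onPlay();
console.log(calls); // ["trackPause", "trackPlay"]
```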

  9. Listen for playback seeking events from the media player. On seek start event notification, track seeking using the SeekStart event.

  10. On seek complete notification from the media player, track the end of seeking using the SeekComplete event.

  11. Listen for the playback buffering events from media player, and on buffer start event notification, track buffering using the BufferStart event.

  12. On buffer complete notification from the media player, track the end of buffering using the BufferComplete event.
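Steps 9 through 12 map player notifications onto the four trackEvent constants. A sketch with stubbed event constants (the real constants are exposed by the SDK, e.g. on MediaHeartbeat.Event in JavaScript; the handler names are illustrative):

```javascript
// Stub of the SDK's event constants and tracker (illustration only).
var Event = { SeekStart: "SeekStart", SeekComplete: "SeekComplete",
              BufferStart: "BufferStart", BufferComplete: "BufferComplete" };
var tracked = [];
var mediaHeartbeat = { trackEvent: function (e) { tracked.push(e); } };

// Wire each player notification to the matching tracking event.
function onSeekStart()      { mediaHeartbeat.trackEvent(Event.SeekStart); }
function onSeekComplete()   { mediaHeartbeat.trackEvent(Event.SeekComplete); }
function onBufferStart()    { mediaHeartbeat.trackEvent(Event.BufferStart); }
function onBufferComplete() { mediaHeartbeat.trackEvent(Event.BufferComplete); }

onSeekStart();
onSeekComplete();
onBufferStart();
onBufferComplete();
console.log(tracked); // ["SeekStart", "SeekComplete", "BufferStart", "BufferComplete"]
```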

See examples of each step in the following platform-specific topics, and look at the sample players included with your SDKs.

For a simple example of playback tracking, see this use of the JavaScript 2.x SDK in an HTML5 player. The media name, ID, length, and metadata values are samples; the handlers assume a mediaHeartbeat instance and a sessionStarted flag created during setup:

/* Call on media start */
if (e.type == "play") {

    // Check for start of media
    if (!sessionStarted) {
        /* Set media info; sample values for name, ID, and length (in seconds) */
        var mediaInfo = MediaHeartbeat.createMediaObject(
            "Sample Video Name",
            "sample-video-id",
            600,
            MediaHeartbeat.StreamType.VOD,
            MediaHeartbeat.MediaType.Video);

        /* Set custom context data */
        var customVideoMetadata = {
            isUserLoggedIn: "false",
            tvStation: "Sample TV station",
            programmer: "Sample programmer"
        };

        /* Set standard video metadata */
        var standardVideoMetadata = {};
        standardVideoMetadata[MediaHeartbeat.VideoMetadataKeys.EPISODE] = "Sample Episode";
        standardVideoMetadata[MediaHeartbeat.VideoMetadataKeys.SHOW] = "Sample Show";
        mediaInfo.setValue(MediaHeartbeat.MediaObjectKey.StandardVideoMetadata,
                           standardVideoMetadata);

        // Start Session
        this.mediaHeartbeat.trackSessionStart(mediaInfo, customVideoMetadata);

        // Track play
        this.mediaHeartbeat.trackPlay();
        sessionStarted = true;

    } else {
        // Track play for resuming playback
        this.mediaHeartbeat.trackPlay();
    }
}

/* Call on video complete */
if (e.type == "ended") {
    console.log("video ended");
    this.mediaHeartbeat.trackComplete();
    this.mediaHeartbeat.trackSessionEnd();
    sessionStarted = false;
}

/* Call on pause */
if (e.type == "pause") {
    this.mediaHeartbeat.trackPause();
}

/* Call on scrub start */
if (e.type == "seeking") {
    this.mediaHeartbeat.trackEvent(MediaHeartbeat.Event.SeekStart);
}

/* Call on scrub stop */
if (e.type == "seeked") {
    this.mediaHeartbeat.trackEvent(MediaHeartbeat.Event.SeekComplete);
}

/* Call on buffer start */
if (e.type == "buffering") {
    this.mediaHeartbeat.trackEvent(MediaHeartbeat.Event.BufferStart);
}

/* Call on buffer complete */
if (e.type == "buffered") {
    this.mediaHeartbeat.trackEvent(MediaHeartbeat.Event.BufferComplete);
}

For information on validating your implementation, see Validation.
