Website Integration

Because every game exposes Genvid differently, every game requires a customized website. This includes both the skinning aspect (HTML and CSS) as well as the functional side (JavaScript). You also need to run a web server hosting the web application for the browser.

Since there are multiple ways of creating websites, this section only covers steps common to the Genvid SDK.

JavaScript API

Genvid provides two libraries:

  • genvid: Contains the logic necessary to connect to the leaf, decode its stream, create a video player, and synchronize data and video.
  • genvid-math: Provides several geometry computation utility functions that you may need, for example to create a WebGL overlay. You can find examples using this library in the Samples.

These two libraries each come in three different bundles:

  • NPM package archive: A gzipped tarball file that you can import as if it were a local NPM package. You can then import your dependencies using the ES6 syntax and bundle your project using the method of your choice. You will have access to the type definitions.
  • ES5 js module: A script you can include directly, which exposes the global variables genvid and genvidMath. You won’t have access to the type definitions. All our Cube Samples go through the Web Sample, which uses this method.
  • UMD js module: A UMD module that you can import in many ways, such as by exposing the global variables genvid and genvidMath. You won’t have access to the type definitions. Only our Twitch extension sample uses this method.


The preferred way to use the genvid and genvid-math dependencies is to import our ES5 js modules. All the Cube Samples use the Web Sample, which illustrates this method.

We provide older type definitions for backwards compatibility in the api/web/legacy_types folder. However, we recommend following the Web Sample method for newer non-Twitch Extension projects. (The Twitch Extension Development Sample uses the UMD js module.)

Frontend integration

The first step in integrating Genvid into your frontend is to instantiate a genvidClient.IGenvidClient() object using genvidClient.createGenvidClient():

let client = genvid.createGenvidClient(streamInfo, websocketURL, websocketToken, video_player_id);

The first parameter, streamInfo, corresponds to a genvidClient.IStreamInfo() structure, typically returned by the backend service from the POST /disco/stream/join call.

The next two parameters, websocketURL and websocketToken, specify the websocket address and security token, respectively. The backend also provides them through the POST /disco/stream/join call.

The last parameter, video_player_id, is a string referencing an HTML element that you want to use for the live streaming video player. When the IGenvidClient() creates the video player, it will replace this HTML element with the one from the live streaming service.
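
The steps above can be sketched as a single helper. The route name follows the POST /disco/stream/join call described above, and the response field names (info, uri, token) are assumptions to adapt to your own backend:

```javascript
// Join the stream through the backend, then create the Genvid client.
// The route and the { info, uri, token } field names are assumptions.
async function createClientFromJoin(videoPlayerId) {
    const response = await fetch("/disco/stream/join", { method: "POST" });
    const { info, uri, token } = await response.json();
    // info -> streamInfo, uri -> websocketURL, token -> websocketToken
    return genvid.createGenvidClient(info, uri, token, videoPlayerId);
}
```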

onVideoPlayerReady callback

When you create the player, IGenvidClient() calls the function specified with genvidClient.IGenvidClient.onVideoPlayerReady(). This is typically used to hook up the overlay.

client.onVideoPlayerReady( function(elem) { console.log('Create the overlay!'); } );

onAuthenticated callback

The onAuthenticated() callback tells you when the IGenvidClient() successfully connects to the Genvid Services:

client.onAuthenticated( function(succeeded) { if (succeeded) { console.log('Connected!'); } });

onStreamsReceived callback

The onStreamsReceived() callback is called as soon as the webclient receives new game data and annotations from the websocket connection to the leaf. Unlike with onDraw(), the data is not synchronized to the video being played.

The data given to that callback is the data the game sent using Genvid_SubmitGameData() and Genvid_SubmitAnnotation().

This callback is useful for decoding, analyzing, or any other costly processing you need to perform on the data before rendering. For each data frame, you can generally use the user field to store the transformed data, then access that field later in the onDraw() method. We advise against doing that processing at rendering time (in onDraw()), since it can add latency to your overlay.

client.onStreamsReceived((streams) => { myGameOverlay.onStreamsReceived(streams); });
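
The decode-once pattern described above can be sketched as follows, assuming every stream carries UTF-8 JSON (as in the Genvid samples):

```javascript
// Decode each frame's rawdata when it arrives and cache the result in the
// user field, so onDraw() only reads already-parsed data.
function installStreamDecoder(client) {
    client.onStreamsReceived((dataStreams) => {
        for (let stream of [...dataStreams.streams, ...dataStreams.annotations]) {
            for (let frame of stream.frames) {
                if (frame.user == null) {
                    frame.user = JSON.parse(genvid.UTF8ToString(frame.rawdata));
                }
            }
        }
    });
}
```

The null check skips frames already decoded, which matters when Genvid repeats data for low-framerate streams.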

onNotificationsReceived callback

This method registers a callback that receives notifications as soon as they arrive over the websocket connection to the leaf.

Notifications can be sent from the game or the Webgateway API.

Notifications are used to transfer information as fast as possible and are not synchronized to the video streams, contrary to game data and annotations.

They are sent from the game using the native SDK function Genvid_SubmitNotification().

client.onNotificationsReceived((notifications) => { myGameOverlay.onNotificationsReceived(notifications); });

onDraw callback

The onDraw() method registers a callback that is called regularly at a configurable framerate (defaulting to 30 times per second) and carries the game data and annotations synchronized to the video stream currently being played. You can usually use this to access the game data necessary to render your overlay, or any other data you need synchronized to the video stream:

client.onDraw((frame) => { myGameOverlay.onDraw(frame); });


If you use onStreamsReceived() to transform the data, you will be accessing the transformed data in the onDraw() callback.

When the onDraw() callback is invoked, it receives a genvidClient.IDataFrame() object as a parameter. It contains the timecode for this video frame, the data streams coming from every session connected, and information about the video composition. The frame data is organized in the following ways:

  • You can obtain data from the sessions member. That member contains a list of streams and a list of annotations, both of which are classified by their respective session IDs.
  • The genvidClient.IDataFrame() interface also gives direct access to its streams and annotations members. However, those are deprecated and you shouldn’t use them. Use the sessions member described above to stay compatible with multi-session support.
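
As a sketch of walking the sessions member, assuming it maps each session ID to an object carrying streams and annotations lists (verify the exact field names against genvidClient.IDataFrame() in your SDK version):

```javascript
// Collect the data streams of every connected session, keyed by session ID.
function collectSessionStreams(frame) {
    const result = {};
    for (const sessionId in frame.sessions) {
        result[sessionId] = frame.sessions[sessionId].streams;
    }
    return result;
}
```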

Composition data (an array of composition.ISourceCompositionData()) describes what transformations were applied to the different video streams. For example, you can use it to determine which pixel of the video overlay matches which video source, or to perform a clipping operation that prevents an overlay related to one video source from covering another video.

The current streams contain only the latest frame for each stream. Annotations, on the other hand, hold all of the previous frames accumulated so far and will not be repeated. Although streams and annotations behave differently, both are cast to the genvidClient.IDataStreamFrame() interface.

The Genvid SDK carries data in binary form using the rawdata field, which is a JavaScript ArrayBuffer(). You can interpret the data in any way you like, and the SDK provides a few utility routines to help with decoding.

For example, if the game sends JSON data encoded as UTF-8, the website code needs to decode the rawdata binary field as UTF-8 and parse the resulting string as JSON. You can also use the accessor that, when read, tries to decode the genvidClient.IDataStreamFrame.rawdata() field using UTF-8.


If the data you’re sending is not valid UTF-8, accessing the field causes an exception.

onDraw(frame) {
    let stream = frame.streams["position"];
    let datastr = genvid.UTF8ToString(stream.rawdata);
    stream.user = JSON.parse(datastr);
}

Because Genvid sometimes repeats data (for example, when a stream has a low framerate value), we include a mechanism for avoiding decoding identical data multiple times. The IGenvidClient() includes the onStreamsReceived() function, which passes a genvidClient.IDataStreams() collection upon reception of the data. The data streams contain a collection of video streams and their frames, which you can modify before they get integrated into the Genvid synchronization engine. For example, you can parse a collection to detect an upcoming event, or remove the collection entirely.

You could also modify the function to decode differently based on the streamId, as long as you’re consistent with the format used when sending data from the game process (see Game-Data Streaming).
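
A sketch of per-stream decoding, assuming each stream exposes its name in an id field, that the "position" stream carries UTF-8 JSON, and that a hypothetical "timings" stream carries raw 32-bit floats:

```javascript
// Pick a decoder based on the stream's id and store the result in user.
// The stream names and formats here are illustrative.
function decodeFrame(stream, frame) {
    if (stream.id === "position") {
        frame.user = JSON.parse(genvid.UTF8ToString(frame.rawdata));
    } else if (stream.id === "timings") {
        frame.user = new Float32Array(frame.rawdata);
    }
}
```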

Once everything is ready in the game’s website code, you can start the GenvidClient streaming:

client.start();

The IGenvidClient() automatically uses the configured callbacks and handles synchronization between the streaming video and the game data sent to onDraw().

onDisconnect callback

The onDisconnect() callback gets triggered upon a websocket closing. You can bind functionality to this callback and be alerted when a socket closes:

client.onDisconnect(() => {this.onDisconnectDetected();});

reconnect method

The reconnect() method establishes a new websocket connection. It receives new stream info, a new leaf URI, and a new token as parameters.

client.reconnect(info, uri, token);
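
Combining the two, a sketch of automatic rejoining, assuming the backend's join route returns the same { info, uri, token } fields used for the initial connection (the route and field names are assumptions to adapt to your backend):

```javascript
// On disconnect, re-join through the backend and reconnect the client.
function installReconnectHandler(client, joinUrl) {
    client.onDisconnect(async () => {
        const response = await fetch(joinUrl, { method: "POST" });
        const { info, uri, token } = await response.json();
        client.reconnect(info, uri, token);
    });
}
```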


While Genvid doesn’t provide a strict overlay API, it does expose everything necessary in the IGenvidClient() for an overlay to work. We also expose the WebGL utility routines we use in all of our samples.

The main entry point to the overlay is the callback set using onDraw(). At regular intervals, the specified callback is invoked with a frame of data for all streams existing in the game.

This callback receives the latest game-data frame for every stream. You have full control of what to do with that data: render some 3D highlights in a WebGL canvas, tweak HTML elements to display current game stats, adjust button visibility to allow new game events from the spectator, etc.

To facilitate WebGL rendering, Genvid provides a genvidWebGL.IWebGLContext() class which simplifies repetitive tasks in WebGL. You can create it with the genvidWebGL.createWebGLContext() method.
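
A minimal setup sketch; the canvas id "overlay_canvas" is an illustrative choice, not part of the API:

```javascript
// Create the Genvid WebGL helper context on top of an overlay canvas.
function initOverlayContext() {
    const canvas = document.getElementById("overlay_canvas");
    return genvidWebGL.createWebGLContext(canvas);
}
```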


You can send events back to the Genvid Services through the IGenvidClient() instance. See sendEvent() and sendEventObject() for more information.

Inside the Genvid services, for the sake of scalability, these direct events will be processed and reduced according to your own event configuration. An example of configuration can be seen here.

After that, the game will be able to subscribe to a reduced events stream.

For example, if 100,000 spectators send a “cheer” event in a short period of time, the game will receive a single object with a count value of 100,000 instead of receiving 100,000 messages in a similarly short period of time. This is the solution we provide in order to decouple the game machine load from the number of spectators watching a broadcast.
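
A sketch of sending such an event, with a client-side rate limit layered on top of the server-side reduction. sendEventObject() is the Genvid client call; the "cheer" key and the cooldown are illustrative choices:

```javascript
// Return a function that sends a cheer event, at most once per cooldown.
function makeCheerSender(client, cooldownMs) {
    let lastSent = 0;
    return (team) => {
        const now = Date.now();
        if (now - lastSent < cooldownMs) {
            return false; // dropped locally; the server reduces the rest
        }
        lastSent = now;
        client.sendEventObject({ cheer: team });
        return true;
    };
}
```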

Google Chrome Autoplay Support

Google Chrome has a special policy for handling video playback. In our API, all video players that support it have the autoplay tag set. This enables starting video playback automatically on browsers like Firefox, as well as in Chrome for users whose Media Engagement Index for the site is high enough.

If you want to enable automatic video playback on most browsers, including Chrome, you can mute the video. You also need to ensure the overlay is hidden while the video is paused, to avoid locking the user in front of a paused video. The following code is an example of how to do both.

client.onVideoPlayerReady( (elem) => {

    // Optional: Set to muted to autostart even on Chrome and also iOS.
    client.videoPlayer.setMuted(true);

    // Always safe to hide the overlay on startup.
    // The PLAYING event below will show it.
    // (`overlay` stands for your overlay root element.)
    overlay.style.visibility = "hidden";

    client.videoPlayer.addEventListener(genvid.PlayerEvents.PAUSE, () => {
        overlay.style.visibility = "hidden";
    });

    client.videoPlayer.addEventListener(genvid.PlayerEvents.PLAYING, () => {
        overlay.style.visibility = "visible";
    });
});