MCLS Asynchronous API

The MCLS Asynchronous API sends notifications to a custom integration when certain changes occur in MCLS, such as a stream URL update. This is very useful when building more complex applications on top of MCLS. For example, you can be notified when a stream is published and then trigger workflows such as updating a page or syncing your internal databases with content from our API.

Examples/Use cases

You can find an example repository here:

This application is written in JavaScript (Node.js) but can easily be adapted to any programming language.

How to use it

To let the MCLS Asynchronous API call your service, you first have to create an Integration. An integration tells our system which resource to send updates for and where to send them. Your service will need to handle the incoming requests.

To create an integration you will need to specify:

  • A target (Webhook or Google Cloud Pub/Sub)

  • A source (events, streams, etc.)

We currently support the following targets:

  • Webhooks

  • Google Cloud Pub/Sub

Configuring integrations via the API

The full integrations API documentation can be found here:

Creating an integration

The example below features a webhook integration; please refer to the other sections for details on specific integrations. To create an integration you will need to use the following API endpoint: POST

The payload looks like this:

{
  "source": "SOURCE_UNSPECIFIED",
  "target": {
    "webhook": {
      "headers": {
        "authorization": "Basic YWxhZGRpbjpvcGVuc2VzYW1l"
      },
      "params": {
        "src": "mcls"
      },
      "url": ""
    }
  }
}

For the source you have the following options available:

  • SOURCE_EVENTS: sends an update whenever an event is updated

  • SOURCE_STREAM: sends an update whenever a stream is updated

  • SOURCE_STREAM_URL: sends an update whenever a stream URL is updated

You can find the most up-to-date sources here.

The "url" field needs to contain the full URL of your webhook, including the http:// or https:// scheme.

The "headers" field can be used to send specific headers along with the request we make to your webhook. This can be useful when you want to send an authorization token, for example, to authenticate the request.

The "Params" field is used to send key/value arguments of your choice with the request. These will be sent as query parameters when we call your webhook.
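To illustrate, the creation payload above can be assembled programmatically. The sketch below is a minimal Node.js example; the `buildWebhookIntegration` helper and the URL placeholders are our own for illustration, not part of the MCLS API — use the endpoint from the integrations API documentation when actually posting.

```javascript
// Sketch: build the integration-creation payload described above.
function buildWebhookIntegration({ source, url, headers = {}, params = {} }) {
  return {
    source,
    target: {
      webhook: { headers, params, url },
    },
  };
}

const payload = buildWebhookIntegration({
  source: "SOURCE_STREAM",
  url: "https://example.com/mcls-webhook", // your webhook endpoint (placeholder)
  headers: { authorization: "Basic YWxhZGRpbjpvcGVuc2VzYW1l" },
  params: { src: "mcls" },
});

// POSTing it with fetch (built into Node 18+) would look roughly like:
// await fetch(MCLS_INTEGRATIONS_ENDPOINT, {   // placeholder endpoint
//   method: "POST",
//   headers: { "content-type": "application/json" },
//   body: JSON.stringify(payload),
// });
console.log(JSON.stringify(payload));
```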

After creation you will receive the following payload:

{
  "id": "1rZbNHRV2FsmsK8xG67pWkkSAyk",
  "source": "SOURCE_STREAM",
  "target": {
    "webhook": {
      "url": "",
      "headers": {
        "authorization": "Bearer YWxhZGRpbjpvcGVuc2VzYW1l"
      },
      "params": {
        "src": "mcls"
      }
    }
  },
  "create_time": "2021-04-23T14:01:57.151590Z"
}

This payload contains the ID of the integration. You will need to store this ID in your system so you can update or delete the integration later.

After the integration has been created, simply test it by performing an operation on a stream, such as creating a new stream, editing a stream, publishing a stream, or starting transcoding. Every time the state of a stream changes, the webhook will now be called.

The payload that's sent to the webhook looks like this:

{"id":"cknadtb8g000d0185607jxxxx","type":"stream"}

If you specified anything in the "params" field, it will not be sent in the payload but as query parameters. Any headers you provided will also be sent when the Asynchronous API makes the request.

Google Cloud Pub/Sub integration

In order to use the asynchronous API with Google Cloud Pub/Sub you will need the following:

  • A service account that has the role "Pub/Sub Publisher"

  • A Pub/Sub topic

There are two ways to authorize MCLS to publish to your topic:

A) You can pass your Google Cloud project ID and grant our service account // rights to publish there.

To do that:

  1. Open the Google Cloud Console.

  2. Select your project, and then click Pub/Sub in the left-hand navigation.

  3. Find your topic, and open the permissions details.

  4. Add the service account [email protected], and grant it the role of Pub/Sub Publisher.

  5. Click Save to complete the topic setup.

Note: If you go for option A, the project_id field in the integration creation payload must be set to the project ID of your Pub/Sub topic.
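With option A, the creation payload references your project and topic instead of embedding credentials. The fragment below is an inferred sketch (field layout based on the option B payload shown later on this page; verify the exact field names against the integrations API documentation):

```json
{
  "source": "SOURCE_STREAM",
  "target": {
    "google_cloud_pub_sub": {
      "project_id": "your-gcp-project-id",
      "topic_id": "your-topic-id"
    }
  }
}
```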

B) You can create a service account in your Google Cloud project where Pub/Sub is enabled and pass its JSON credentials for the integration to use.

Note: If you go for option B, the service_account_json field is required in the integration creation payload.

Please refer to the following guide on how to create a service account JSON:

To create the integration you will need to use the following payload:

{
  "source": "SOURCE_STREAM_URL",
  "target": {
    "google_cloud_pub_sub": {
      "topic_id": "<your-topic-id>",
      "service_account_json": "{\n \"type\": \"service_account\",\n \"project_id\": \"your-project-id\",\n \"private_key_id\": \"1234567890\",\n \"private_key\": \"-----BEGIN PRIVATE KEY-----\\NOTAREALKEY+Ng==\\n-----END PRIVATE KEY-----\\n\",\n \"client_email\": \"[email protected]\",\n \"client_id\": \"1234567\",\n \"auth_uri\": \"\",\n \"token_uri\": \"\",\n \"auth_provider_x509_cert_url\": \"\",\n \"client_x509_cert_url\": \"\"\n}\n"
    }
  }
}

Note that the service account JSON needs to be escaped. While this can be done manually, it is much easier with a command-line utility called jq. Please see the following page on how to install this utility for your operating system:

For example, on Linux you can escape your JSON with the following command after installing jq:

cat ~/Downloads/new-service-account.json | jq -aRs .

This will produce an escaped JSON string that can be used when configuring your Pub/Sub target. There are also online formatters that do this, but we don't recommend using them: this involves security credentials, and there is no guarantee that those websites don't store your credentials.

After the integration has been created, you can test it by streaming to MCLS and pulling messages from a subscription on your topic.
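Pulling those test messages can be done with Google's official Pub/Sub client for Node.js (@google-cloud/pubsub). The sketch below assumes the message body has the same { id, type } shape as the webhook payload shown earlier; the subscription name is a placeholder and the decoding helper is our own.

```javascript
// Sketch: decode an MCLS notification from a Pub/Sub message.
// message.data is a Buffer containing the JSON body.
function decodeNotification(data) {
  return JSON.parse(data.toString("utf8"));
}

// With the official client (npm install @google-cloud/pubsub):
// const { PubSub } = require("@google-cloud/pubsub");
// const subscription = new PubSub().subscription("your-subscription-name");
// subscription.on("message", (message) => {
//   const { id, type } = decodeNotification(message.data);
//   console.log(`resource ${type} ${id} changed`);
//   message.ack(); // acknowledge so the message isn't redelivered
// });
```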