Capabilities

Supported MIME Types

@startuml

autonumber

box "Container" #LightGreen
participant Cobalt
participant Starboard
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


Cobalt            ->  Starboard:        SbMediaIsSupported(video_codec, audio_codec, key_system)
note right: key_system ignored


Starboard         ->  rialtoClient:     createMediaPipelineCapabilities()
rialtoClient      ->  rialtoServer:     createMediaPipelineCapabilities()
rialtoServer      ->  GStreamer_server: Get decoder capabilities
GStreamer_server  ->  rialtoServer:     Decoder caps
rialtoServer      ->  rialtoServer:     Convert caps to list of supported MIME\ntypes: 'supportedMimeTypes'
note right
Rialto has a pre-defined list of known MIME types.
The list of supported MIME types is the subset of
MIME types that are both in the pre-defined list AND
supported by the underlying decoders.
end note
rialtoServer      ->  rialtoClient:     media_pipeline_caps
rialtoClient      ->  Starboard:        media_pipeline_caps

Starboard         ->  Starboard:        Convert video_codec to MIME type
Starboard         ->  rialtoClient:     media_pipeline_caps.isMimeTypeSupported(mimeType)
rialtoClient      ->  rialtoServer:     media_pipeline_caps.isMimeTypeSupported(mimeType)
rialtoServer      ->  rialtoServer:     Check mimeType against supportedMimeTypes
rialtoServer      ->  rialtoClient:     result
rialtoClient      ->  Starboard:        result

opt video_codec supported
Starboard         ->  Starboard:        Convert audio_codec to MIME type
Starboard         ->  rialtoClient:     media_pipeline_caps.isMimeTypeSupported(mimeType)
rialtoClient      ->  rialtoServer:     media_pipeline_caps.isMimeTypeSupported(mimeType)
rialtoServer      ->  rialtoServer:     Check mimeType against supportedMimeTypes
rialtoServer      ->  rialtoClient:     result
rialtoClient      ->  Starboard:        result

opt audio_codec supported
Starboard         --> Cobalt:           true
else
Starboard         --> Cobalt:           false
end

else
Starboard         --> Cobalt:           false
end



== Get Supported MIME Types ==
note across: An alternative implementation could use the getSupportedMimeTypes() API

Starboard         ->  rialtoClient:     media_pipeline_caps.getSupportedMimeTypes(sourceType)
note right: sourceType indicates audio or video
rialtoClient      ->  rialtoServer:     media_pipeline_caps.getSupportedMimeTypes(sourceType)
rialtoServer      ->  rialtoServer:     Extract relevant MIME types from\nsupportedMimeTypes based on sourceType
rialtoServer      ->  rialtoClient:     mimeTypes[]
rialtoClient      ->  Starboard:        mimeTypes[]

@enduml
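The "supportedMimeTypes" computation described above can be sketched as a set intersection of Rialto's pre-defined MIME list with the MIME types derived from the decoder caps. The MIME strings and function name below are illustrative, not Rialto's actual list or API:

```cpp
#include <algorithm>
#include <iterator>
#include <set>
#include <string>
#include <vector>

// Intersect the pre-defined MIME list with what the decoder caps map to.
// Both inputs are sorted sets, so std::set_intersection applies directly.
std::vector<std::string> computeSupportedMimeTypes(
    const std::set<std::string> &predefinedMimeTypes,
    const std::set<std::string> &decoderMimeTypes)
{
    std::vector<std::string> supported;
    std::set_intersection(predefinedMimeTypes.begin(), predefinedMimeTypes.end(),
                          decoderMimeTypes.begin(), decoderMimeTypes.end(),
                          std::back_inserter(supported));
    return supported;
}
```

isMimeTypeSupported() is then a simple membership check against the resulting list.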

Get Supported Properties

A user of rialto-gstreamer does not need to be concerned with this API; it is invoked automatically. It queries whether a specified set of properties is supported on the RialtoServer, which lets RialtoGstreamer determine the set of properties to present, and support, for its user. This happens upon creation of a pipeline (when a sink is created).

@startuml

autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoGstreamer
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoGstreamer:  a sink is requested
rialtoGstreamer   ->  rialtoClient:     getSupportedProperties(pipeline_session, media_type, property_names)
rialtoClient      ->  rialtoServer:     getSupportedProperties(pipeline_session, media_type, property_names)
rialtoServer      ->  GStreamer_server: g_object_class_find_property() (called once for each property name)
GStreamer_server  ->  rialtoServer:     property (or null if not supported) for each property name
rialtoServer      ->  rialtoClient:     list of supported properties
rialtoClient      ->  rialtoGstreamer:  list of supported properties
rialtoGstreamer   ->  GStreamer_client: a sink is created that supports the properties

@enduml
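The server-side filtering above can be sketched as follows; `findProperty` is a hypothetical stand-in for the per-name `g_object_class_find_property()` lookup, which returns non-null only for properties the element actually exposes:

```cpp
#include <functional>
#include <string>
#include <vector>

// Keep only the property names the sink's class actually supports.
// One lookup is performed per requested name, mirroring the diagram.
std::vector<std::string> getSupportedProperties(
    const std::vector<std::string> &propertyNames,
    const std::function<bool(const std::string &)> &findProperty)
{
    std::vector<std::string> supported;
    for (const auto &name : propertyNames)
    {
        if (findProperty(name)) // stand-in for g_object_class_find_property()
            supported.push_back(name);
    }
    return supported;
}
```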


Output Control

Set Video Window Size & Position

API for setting the position and size of the video output window for a pipeline session.


@startuml


autonumber

box "Container" #LightGreen
participant Cobalt
participant Starboard
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


Cobalt            ->  Starboard:        SetBounds(z, x, y, w, h)
Starboard         ->  GStreamer_client: set video sink geometry (pipeline, x, y, w, h)
GStreamer_client  ->  rialtoClient:     setVideoWindow(pipeline_session, x, y, w, h)
rialtoClient      ->  rialtoServer:     setVideoWindow(pipeline_session, x, y, w, h)
rialtoServer      ->  GStreamer_server: set video sink geometry (pipeline, x, y, w, h)
GStreamer_server  --> rialtoServer:     status
rialtoServer      --> rialtoClient:     status
rialtoClient      --> GStreamer_client: status
GStreamer_client  --> Starboard:
Starboard         --> Cobalt:

@enduml
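A hypothetical sanity check that a server could apply before forwarding the geometry to the video sink is sketched below; Rialto's actual validation, if any, may differ:

```cpp
#include <algorithm>
#include <cstdint>

// Video window geometry as passed through setVideoWindow().
struct VideoWindow { int32_t x; int32_t y; int32_t w; int32_t h; };

// Illustrative clamp: keep the window inside the display bounds so the
// sink never receives a rectangle that extends off-screen.
VideoWindow clampToDisplay(VideoWindow win, int32_t displayW, int32_t displayH)
{
    win.x = std::clamp<int32_t>(win.x, 0, displayW);
    win.y = std::clamp<int32_t>(win.y, 0, displayH);
    win.w = std::clamp<int32_t>(win.w, 0, displayW - win.x);
    win.h = std::clamp<int32_t>(win.h, 0, displayH - win.y);
    return win;
}
```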


Quality of Service

The notifyQos() callback is used to notify clients of dropped audio/video data in the server-side pipeline.


@startuml


autonumber

box "Container" #LightGreen
participant Cobalt
participant Starboard
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


== During pipeline creation ==

rialtoServer      ->  GStreamer_server: Subscribe for QOS event


== During playback ==

opt Frames dropped or element changed processing strategy

note across
This event is generated whenever GStreamer raises a QoS event,
which currently happens for these reasons:

 - An element dropped a buffer for QoS reasons.

 - An element changed its processing strategy for QoS reasons.
   This could be a decoder that decided to drop every B frame to
   increase its processing speed, or an effect element switching
   to a lower quality algorithm.
end note


GStreamer_server  -/  rialtoServer:     GstBusMessage(qos_event)
rialtoServer      ->  rialtoServer:     Extract required data from event
rialtoServer      -/  rialtoClient:     notifyQos(pipeline_session, source_id, qos_info)
rialtoClient      -/  GStreamer_client: notifyQos(pipeline_session, source_id, qos_info)
GStreamer_client  ->  GStreamer_client: Create GST_EVENT_QOS message from qos_info
GStreamer_client  -/  Starboard:        GST_MESSAGE_QOS(qos_event)

note right
Only the number of processed & dropped
frames/samples is sent by Rialto, so the
QOS event sent by GStreamer_client will have
fewer valid properties set than the original
server-side event.
end note

note over Cobalt, Starboard
QOS events are not propagated up to
Cobalt but can be queried anytime
by calling SbPlayerInfo2()
end note

end

@enduml

Volume

API for setting and getting the volume for a pipeline session.


@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     setVolume(pipeline_session, volume)
rialtoClient      ->  rialtoServer:     setVolume(pipeline_session, volume)
rialtoServer      ->  GStreamer_server: gst_stream_volume_set_volume(pipeline, GST_STREAM_VOLUME_FORMAT_LINEAR, volume)
GStreamer_server  --> rialtoServer:     status
rialtoServer      --> rialtoClient:     status
rialtoClient      --> GStreamer_client: status
@enduml

@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     getVolume(pipeline_session)
rialtoClient      ->  rialtoServer:     getVolume(pipeline_session)
rialtoServer      ->  GStreamer_server: gst_stream_volume_get_volume(pipeline, GST_STREAM_VOLUME_FORMAT_LINEAR)
GStreamer_server  --> rialtoServer:     volume
rialtoServer      --> rialtoClient:     volume
rialtoClient      --> GStreamer_client: volume
@enduml
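Rialto exchanges volume in GST_STREAM_VOLUME_FORMAT_LINEAR. A client working in GStreamer's cubic scale (commonly used for volume sliders) can convert with gst_stream_volume_convert_volume(), or directly, since cubic volume is the cube root of linear volume:

```cpp
#include <cmath>

// linear = cubic^3; cubic = cbrt(linear) (GStreamer's cubic volume scale).
double cubicToLinear(double cubic) { return cubic * cubic * cubic; }
double linearToCubic(double linear) { return std::cbrt(linear); }
```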

Stats

API for obtaining the number of "rendered frames" and "dropped frames".

@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     getStats(pipeline_session, source_id)
rialtoClient      ->  rialtoServer:     getStats(pipeline_session, source_id)
rialtoServer      ->  GStreamer_server: g_object_get(pipeline, "stats", &stats, nullptr)
GStreamer_server  --> rialtoServer:     GstStructure *stats; containing rendered_frames, dropped_frames
rialtoServer      --> rialtoClient:     rendered_frames, dropped_frames
rialtoClient      --> GStreamer_client: stats
@enduml

GstStructure *stats{nullptr};
g_object_get(sink, "stats", &stats, nullptr);
if (stats)
{
    guint64 renderedVideoFrames;
    guint64 droppedVideoFrames;
    if (gst_structure_get_uint64(stats, "rendered", &renderedVideoFrames) &&
        gst_structure_get_uint64(stats, "dropped", &droppedVideoFrames))
    {
        std::cout << "renderedVideoFrames " << renderedVideoFrames << std::endl;
        std::cout << "droppedVideoFrames " << droppedVideoFrames << std::endl;
    }
    gst_structure_free(stats);
}

Immediate Output

API for setting and getting the "immediate-output" property (used for low-latency output).

@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     getImmediateOutput(pipeline_session, source_id)
rialtoClient      ->  rialtoServer:     getImmediateOutput(pipeline_session, source_id)
rialtoServer      ->  GStreamer_server: g_object_get(pipeline, "immediate-output", &immediateOutput, nullptr)
GStreamer_server  --> rialtoServer:     immediateOutput
rialtoServer      --> rialtoClient:     immediateOutput
rialtoClient      --> GStreamer_client: immediateOutput
@enduml

gboolean immediateOutput{FALSE};
// NOTE: The "immediate-output" property will only be available if it's supported by the hardware
g_object_get(videoSink, "immediate-output", &immediateOutput, nullptr);
if (immediateOutput) ...


@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     setImmediateOutput(pipeline_session, source_id, immediateOutput)
rialtoClient      ->  rialtoServer:     setImmediateOutput(pipeline_session, source_id, immediateOutput)
rialtoServer      ->  GStreamer_server: g_object_set(pipeline, "immediate-output", immediateOutput, nullptr)
GStreamer_server  --> rialtoServer:     status
rialtoServer      --> rialtoClient:     status
rialtoClient      --> GStreamer_client: nothing returned
@enduml

gboolean immediateOutput{TRUE}; // The desired setting
// NOTE: The "immediate-output" property will only be available if it's supported by the hardware
g_object_set(videoSink, "immediate-output", immediateOutput, nullptr);

Low Latency

API for setting the "low-latency" property (used for low-latency audio output).

@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     setLowLatency(pipeline_session, lowLatency)
rialtoClient      ->  rialtoServer:     setLowLatency(pipeline_session, lowLatency)
rialtoServer      ->  GStreamer_server: getAudioSink(pipeline_session)
GStreamer_server  --> rialtoServer:     audioSink
rialtoServer      ->  GStreamer_server: g_object_set(audioSink, "low-latency", lowLatency, nullptr)
GStreamer_server  --> rialtoServer:     status
rialtoServer      --> rialtoClient:     status
rialtoClient      --> GStreamer_client: nothing returned
@enduml

gboolean lowLatency{TRUE}; // The desired setting
// NOTE: The "low-latency" property will only be available if it's supported by the hardware
g_object_set(audioSink, "low-latency", lowLatency, nullptr);

Sync

API for setting and getting the "sync" property.

@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     getSync(pipeline_session)
rialtoClient      ->  rialtoServer:     getSync(pipeline_session)
rialtoServer      ->  GStreamer_server: getAudioSink(pipeline_session)
GStreamer_server  --> rialtoServer:     audioSink
rialtoServer      ->  GStreamer_server: g_object_get(audioSink, "sync", &sync, nullptr)
GStreamer_server  --> rialtoServer:     sync
rialtoServer      --> rialtoClient:     sync
rialtoClient      --> GStreamer_client: sync
@enduml

gboolean sync{TRUE};
// NOTE: The "sync" property will only be available if it's supported by the hardware
g_object_get(audioSink, "sync", &sync, nullptr);


@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     setSync(pipeline_session, sync)
rialtoClient      ->  rialtoServer:     setSync(pipeline_session, sync)
rialtoServer      ->  GStreamer_server: getAudioSink(pipeline_session)
GStreamer_server  --> rialtoServer:     audioSink
rialtoServer      ->  GStreamer_server: g_object_set(audioSink, "sync", sync, nullptr)
GStreamer_server  --> rialtoServer:     status
rialtoServer      --> rialtoClient:     status
rialtoClient      --> GStreamer_client: nothing returned
@enduml

gboolean sync{TRUE}; // The desired setting
// NOTE: The "sync" property will only be available if it's supported by the hardware
g_object_set(audioSink, "sync", sync, nullptr);

Sync Off

API for setting the "sync-off" property.

@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     setSyncOff(pipeline_session, syncOff)
rialtoClient      ->  rialtoServer:     setSyncOff(pipeline_session, syncOff)
rialtoServer      ->  GStreamer_server: getAudioDecoder(pipeline_session)
GStreamer_server  --> rialtoServer:     audioDecoder
rialtoServer      ->  GStreamer_server: g_object_set(audioDecoder, "sync-off", syncOff, nullptr)
GStreamer_server  --> rialtoServer:     status
rialtoServer      --> rialtoClient:     status
rialtoClient      --> GStreamer_client: nothing returned
@enduml

gboolean syncOff{TRUE}; // The desired setting
// NOTE: The "sync-off" property will only be available if it's supported by the hardware
g_object_set(audioDecoder, "sync-off", syncOff, nullptr);

Stream Sync Mode

API for setting and getting the "stream-sync-mode" property.

@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     getStreamSyncMode(pipeline_session)
rialtoClient      ->  rialtoServer:     getStreamSyncMode(pipeline_session)
rialtoServer      ->  GStreamer_server: getAudioDecoder(pipeline_session)
GStreamer_server  --> rialtoServer:     audioDecoder
rialtoServer      ->  GStreamer_server: g_object_get(audioDecoder, "stream-sync-mode", &streamSyncMode, nullptr)
GStreamer_server  --> rialtoServer:     streamSyncMode
rialtoServer      --> rialtoClient:     streamSyncMode
rialtoClient      --> GStreamer_client: streamSyncMode
@enduml

gint streamSyncMode{0};
// NOTE: The "stream-sync-mode" property will only be available if it's supported by the hardware
g_object_get(audioDecoder, "stream-sync-mode", &streamSyncMode, nullptr);


@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     setStreamSyncMode(pipeline_session, streamSyncMode)
rialtoClient      ->  rialtoServer:     setStreamSyncMode(pipeline_session, streamSyncMode)
rialtoServer      ->  GStreamer_server: getAudioDecoder(pipeline_session)
GStreamer_server  --> rialtoServer:     audioDecoder
rialtoServer      ->  GStreamer_server: g_object_set(audioDecoder, "stream-sync-mode", streamSyncMode, nullptr)
GStreamer_server  --> rialtoServer:     status
rialtoServer      --> rialtoClient:     status
rialtoClient      --> GStreamer_client: nothing returned
@enduml

gint streamSyncMode{1}; // The desired setting
// NOTE: The "stream-sync-mode" property will only be available if it's supported by the hardware
g_object_set(audioDecoder, "stream-sync-mode", streamSyncMode, nullptr);

Mute

API for setting and getting the mute setting for a pipeline session.


@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     setMute(pipeline_session, mute)
rialtoClient      ->  rialtoServer:     setMute(pipeline_session, mute)
rialtoServer      ->  GStreamer_server: gst_stream_volume_set_mute(pipeline, mute)
GStreamer_server  --> rialtoServer:     status
rialtoServer      --> rialtoClient:     status
rialtoClient      --> GStreamer_client: status
@enduml



@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


GStreamer_client  ->  rialtoClient:     getMute(pipeline_session)
rialtoClient      ->  rialtoServer:     getMute(pipeline_session)
rialtoServer      ->  GStreamer_server: gst_stream_volume_get_mute(pipeline)
GStreamer_server  --> rialtoServer:     mute
rialtoServer      --> rialtoClient:     mute
rialtoClient      --> GStreamer_client: mute
@enduml


Process Audio Gap

API for processing an audio gap in a pipeline session; this functionality is used to avoid audible pops during transitions.

@startuml
autonumber

box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant rdk_gstreamer_utils
end box


GStreamer_client     ->  rialtoClient:     processAudioGap(pipeline_session, position, duration, level)
rialtoClient         ->  rialtoServer:     processAudioGap(pipeline_session, position, duration, level)
rialtoServer         ->  rdk_gstreamer_utils: processAudioGap(pipeline, position, duration, level, isAudioAac)
rdk_gstreamer_utils  --> rialtoServer:     status
rialtoServer         --> rialtoClient:     status
rialtoClient         --> GStreamer_client: status
@enduml