@startuml autonumber box "Container" #LightGreen participant Cobalt participant Starboard participant rialtoClient end box box "Platform" #LightBlue participant rialtoServer participant GStreamer_server end box Cobalt -> Starboard: SbIsMediaSupported(video_codec, audio_codec, key_system) note right: key_system ignored Starboard -> rialtoClient: createMediaPipelineCapabilities() rialtoClient -> rialtoServer: createMediaPipelineCapabilities() rialtoServer -> GStreamer_server: Get decoder capabilities GStreamer_server -> rialtoServer: Decoder caps rialtoServer -> rialtoServer: Convert caps to list of supported MIME\ntypes: 'supportedMimeTypes' note right Rialto has pre-defined list of its known MIME types. The list of supported MIME types is the subset of MIME types that are in the pre-defined list AND are supported by the underlying decoders. end note rialtoServer -> rialtoClient: media_pipeline_caps rialtoClient -> Starboard: media_pipeline_caps Starboard -> Starboard: Convert video_codec to MIME type Starboard -> rialtoClient: media_pipeline_caps.isMimeTypeSupported(mimeType) rialtoClient -> rialtoServer: media_pipeline_caps.isMimeTypeSupported(mimeType) rialtoServer -> rialtoServer: Check mimeType against supportedMimeTypes rialtoServer -> rialtoClient: result rialtoClient -> Starboard: result opt video_codec supported Starboard -> Starboard: Convert audio_codec to MIME type Starboard -> rialtoClient: media_pipeline_caps.isMimeTypeSupported(mimeType) rialtoClient -> rialtoServer: media_pipeline_caps.isMimeTypeSupported(mimeType) rialtoServer -> rialtoServer: Check mimeType against supportedMimeTypes rialtoServer -> rialtoClient: result rialtoClient -> Starboard: result opt audio_codec supported Starboard --> Cobalt: true else Starboard --> Cobalt: false end else Starboard -> Cobalt: false end == Get Supported Mime Types == note across: An alternative implementation could use the getsupporteMimeTypes() API Starboard -> rialtoClient: 
media_pipeline_caps.getSupportedMimeTypes(sourceType) note right: sourceType indicates audio or video rialtoClient -> rialtoServer: media_pipeline_caps.getSupportedMimeTypes(sourceType) rialtoServer -> rialtoServer: Extract relevant MIME types from\nsupportedMimeTypes based on sourceType rialtoServer -> rialtoClient: mimeTypes[] rialtoClient -> Starboard: mimeTypes[] @enduml |
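The codec-to-MIME-type conversion that Starboard performs before calling isMimeTypeSupported() can be sketched as a simple mapping. This is a sketch only: the enum below is a hypothetical subset of Starboard's SbMediaVideoCodec (the real definition lives in starboard/media.h), and the exact MIME strings in Rialto's pre-defined list are an assumption here.

```cpp
#include <string>

// Hypothetical subset of Starboard's SbMediaVideoCodec enum.
enum class SbMediaVideoCodec { kH264, kH265, kVp9, kAv1, kNone };

// Map a video codec to the MIME type queried via isMimeTypeSupported().
// The MIME strings are assumptions; the authoritative list is Rialto's
// pre-defined set of known MIME types.
std::string videoCodecToMimeType(SbMediaVideoCodec codec)
{
    switch (codec)
    {
    case SbMediaVideoCodec::kH264: return "video/h264";
    case SbMediaVideoCodec::kH265: return "video/h265";
    case SbMediaVideoCodec::kVp9:  return "video/x-vp9";
    case SbMediaVideoCodec::kAv1:  return "video/x-av1";
    default:                       return ""; // unsupported / none
    }
}
```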
A user of rialto-gstreamer does not need to be concerned with this API; it is invoked automatically. The API queries whether a specified set of properties is supported on the RialtoServer, which enables RialtoGstreamer to determine the set of properties to present, and support, for the user of rialto-gstreamer. This happens upon creation of a pipeline (when a sink is created).
@startuml
autonumber
box "Container" #LightGreen
participant GStreamer_client
participant rialtoGstreamer
participant rialtoClient
end box
box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box
GStreamer_client -> rialtoGstreamer: a sink is requested
rialtoGstreamer -> rialtoClient: getSupportedProperties(pipeline_session, media_type, property_names)
rialtoClient -> rialtoServer: getSupportedProperties(pipeline_session, media_type, property_names)
rialtoServer -> GStreamer_server: g_object_class_find_property()\n(called once for each property name)
GStreamer_server -> rialtoServer: property (or null if not supported)\nfor each property name
rialtoServer -> rialtoClient: list of supported properties
rialtoClient -> rialtoGstreamer: list of supported properties
rialtoGstreamer -> GStreamer_client: a sink is created that supports the properties
@enduml
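The server-side filtering shown above can be sketched as follows. This is a sketch of the logic only: the `findProperty` callback stands in for the real `g_object_class_find_property()` lookup on the sink's GObject class, which returns non-null when the class exposes the named property.

```cpp
#include <functional>
#include <string>
#include <vector>

// Return the subset of 'propertyNames' that the sink supports.
// 'findProperty' stands in for g_object_class_find_property(); it is
// invoked once per requested property name, mirroring the diagram.
std::vector<std::string>
getSupportedProperties(const std::vector<std::string> &propertyNames,
                       const std::function<bool(const std::string &)> &findProperty)
{
    std::vector<std::string> supported;
    for (const auto &name : propertyNames)
    {
        if (findProperty(name)) // property exists on the sink's class
            supported.push_back(name);
    }
    return supported;
}
```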
API for setting the position and size of the video output window for a pipeline session.
@startuml
autonumber
box "Container" #LightGreen
participant Cobalt
participant Starboard
participant GStreamer_client
participant rialtoClient
end box
box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box
Cobalt -> Starboard: SetBounds(z, x, y, w, h)
Starboard -> GStreamer_client: set video sink geometry (pipeline, x, y, w, h)
GStreamer_client -> rialtoClient: setVideoWindow(pipeline_session, x, y, w, h)
rialtoClient -> rialtoServer: setVideoWindow(pipeline_session, x, y, w, h)
rialtoServer -> GStreamer_server: set video sink geometry (pipeline, x, y, w, h)
GStreamer_server --> rialtoServer: status
rialtoServer --> rialtoClient: status
rialtoClient --> GStreamer_client: status
GStreamer_client --> Starboard:
Starboard --> Cobalt:
@enduml
The notifyQos() callback is used to notify clients of dropped audio/video data in the server-side pipeline.
@startuml
box "Container" #LightGreen
participant Cobalt
participant Starboard
participant rialtoClient
end box
box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box
rialtoServer -> GStreamer_server: Subscribe for QOS event
opt Frames dropped or element changed processing strategy
note across
This event is generated whenever GStreamer generates a QoS event,
which is currently for these reasons:
- The element dropped a buffer because of QoS reasons.
- An element changed its processing strategy because of QoS reasons.
This could include a decoder that decided to drop every B frame to
increase its processing speed or an effect element switching to a
lower quality algorithm.
end note
GStreamer_server -/ rialtoServer: GstBusMessage(qos_event)
end
@enduml
API for setting and getting the volume for a pipeline session.
API for obtaining the number of "rendered frames" and "dropped frames".
// Query the sink's "stats" property to obtain the number of rendered
// and dropped frames. 'sink' is the video sink element.
GstStructure *stats{nullptr};
g_object_get(sink, "stats", &stats, nullptr);
if (stats)
{
    guint64 renderedVideoFrames;
    guint64 droppedVideoFrames;
    if (gst_structure_get_uint64(stats, "rendered", &renderedVideoFrames) &&
        gst_structure_get_uint64(stats, "dropped", &droppedVideoFrames))
    {
        std::cout << "renderedVideoFrames " << renderedVideoFrames << std::endl;
        std::cout << "droppedVideoFrames " << droppedVideoFrames << std::endl;
    }
    gst_structure_free(stats);
}
API for setting and getting the "immediate-output" property (used for low-latency output).
gboolean immediateOutput;
// NOTE: The "immediate-output" property will only be available if it's supported by the hardware
g_object_get(videoSink, "immediate-output", &immediateOutput, nullptr);
if (immediateOutput)
    ...
gboolean immediateOutput{TRUE}; // The desired setting
// NOTE: The "immediate-output" property will only be available if it's supported by the hardware
g_object_set(videoSink, "immediate-output", immediateOutput, nullptr);
API for setting the "low-latency" property (used for low-latency audio output).
gboolean lowLatency{TRUE}; // The desired setting
// NOTE: The "low-latency" property will only be available if it's supported by the hardware
g_object_set(audioSink, "low-latency", lowLatency, nullptr);
API for setting and getting the "sync" property.
gboolean sync;
// NOTE: The "sync" property will only be available if it's supported by the hardware
g_object_get(audioSink, "sync", &sync, nullptr);
gboolean sync{TRUE}; // The desired setting
// NOTE: The "sync" property will only be available if it's supported by the hardware
g_object_set(audioSink, "sync", sync, nullptr);
API for setting the "sync-off" property.
gboolean syncOff{TRUE}; // The desired setting
// NOTE: The "sync-off" property will only be available if it's supported by the hardware
g_object_set(audioDecoder, "sync-off", syncOff, nullptr);
API for setting and getting the "stream-sync-mode" property.
gboolean streamSyncMode;
// NOTE: The "stream-sync-mode" property will only be available if it's supported by the hardware
g_object_get(audioDecoder, "stream-sync-mode", &streamSyncMode, nullptr);
gboolean streamSyncMode{TRUE}; // The desired setting
// NOTE: The "stream-sync-mode" property will only be available if it's supported by the hardware
g_object_set(audioDecoder, "stream-sync-mode", streamSyncMode, nullptr);
API for setting and getting the mute setting for a pipeline session.
API for processing an audio gap for a pipeline session; this functionality is used to avoid audio pops during transitions.
@startuml
autonumber
box "Container" #LightGreen
participant GStreamer_client
participant rialtoClient
end box
box "Platform" #LightBlue
participant rialtoServer
participant rdk_gstreamer_utils
end box
GStreamer_client -> rialtoClient: processAudioGap(pipeline_session, position, duration, level)
rialtoClient -> rialtoServer: processAudioGap(pipeline_session, position, duration, level)
rialtoServer -> rdk_gstreamer_utils: processAudioGap(pipeline, position, duration, level, isAudioAac)
rdk_gstreamer_utils --> rialtoServer: status
rialtoServer --> rialtoClient: status
rialtoClient --> GStreamer_client: status
@enduml