...
The shared memory buffer shall be split into two regions for each playback session: one for the video stream and one for audio. Only one concurrent audio track is supported; to select a different audio track the previous audio source must be removed first. Each region must be large enough to accommodate the largest possible frame of audio/video data plus the associated decryption parameters and metadata. The end of the buffer also contains a separate section for Web Audio regions, common to all playback sessions. The buffer shall initially be sized at 8MB per playback session, to allow some overhead, plus 10kB per Web Audio region. There can be 0 or more Web Audio regions per Rialto Client.
For apps that support more than one concurrent playback, the shared memory buffer shall be sized accordingly and partitioned into different logical areas for each playback session. The partitions need not be equally sized; for example, if an app supports one UHD and one HD playback, the 'HD' partition may be smaller.
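The sizing rule above can be sketched as a simple calculation. This is a minimal illustration only; the constant and function names are assumptions, not part of the Rialto API:

```cpp
#include <cassert>
#include <cstddef>

// Illustrative constants taken from the sizing rule above (assumed names).
constexpr std::size_t kSessionPartitionBytes = 8 * 1024 * 1024; // 8MB per playback session
constexpr std::size_t kWebAudioRegionBytes = 10 * 1024;         // 10kB per Web Audio region

// Total shared memory needed for a given number of playback sessions plus the
// Web Audio section (common to all sessions) that sits at the end of the buffer.
std::size_t sharedBufferSize(std::size_t playbackSessions, std::size_t webAudioRegions)
{
    return playbackSessions * kSessionPartitionBytes + webAudioRegions * kWebAudioRegionBytes;
}
```

For example, an app with two playback sessions and three Web Audio regions would need 16MB plus 30kB; partitions for mixed UHD/HD apps could then be tuned away from the uniform 8MB default.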
[draw.io Diagram]
...
[draw.io Diagram]
...
[draw.io Diagram]
...
[draw.io Diagram]
V2 metadata uses protobuf to serialise the frames' properties to the shared memory buffer. This use of protobuf aligns with the IPC protocol but also allows support for optional fields and for fields to be added and removed without causing backward/forward compatibility issues. It also supports variable length fields so the MKS ID, IV & sub-sample information can all be directly encoded in the metadata, avoiding the complexities of interleaving them with the media frames and referencing them with offsets/lengths as used in the V1 metadata format.
enum SegmentAlignment { ... }

enum CipherMode { ... }

message MediaSegmentMetadata {
    optional uint32 length = 1;                      /* Number of bytes in sample */
    optional sint64 time_position = 2;               /* Position in stream in nanoseconds */
    optional sint64 sample_duration = 3;             /* Frame/sample duration in nanoseconds */
    optional uint32 stream_id = 4;                   /* Stream id (unique ID for ES, as defined in attachSource()) */
    optional uint32 sample_rate = 5;                 /* Samples per second for audio segments */
    optional uint32 channels_num = 6;                /* Number of channels for audio segments */
    optional uint32 width = 7;                       /* Frame width in pixels for video segments */
    optional uint32 height = 8;                      /* Frame height in pixels for video segments */
    optional SegmentAlignment segment_alignment = 9; /* Segment alignment can be specified for H264/H265, will use NAL if not set */
    optional bytes extra_data = 10;                  /* Buffer containing extradata */
    optional bytes media_key_session_id = 11;        /* Buffer containing key session ID to use for decryption */
    optional bytes key_id = 12;                      /* Buffer containing Key ID to use for decryption */
    optional bytes init_vector = 13;                 /* Buffer containing the initialization vector for decryption */
    optional uint32 init_with_last_15 = 14;          /* initWithLast15 value for decryption */
    repeated SubsamplePair sub_sample_info = 15;     /* If present, use gather/scatter decryption based on this list of
                                                        clear/encrypted byte lengths. If not present and content is encrypted
                                                        then the entire media segment needs decryption (unless cipher_mode
                                                        indicates pattern encryption, in which case the crypt/skip byte block
                                                        values specify the encryption pattern). */
    optional bytes codec_data = 16;                  /* Buffer containing updated codec data for video segments */
    optional CipherMode cipher_mode = 17;            /* Block cipher mode of operation when common encryption used */
    optional uint32 crypt_byte_block = 18;           /* Crypt byte block value for CBCS cipher mode pattern */
    optional uint32 skip_byte_block = 19;            /* Skip byte block value for CBCS cipher mode pattern */
}

message SubsamplePair {
    optional uint32 num_clear_bytes = 1;             /* How many of the next bytes in sequence are clear */
    optional uint32 num_encrypted_bytes = 2;         /* How many of the next bytes in sequence are encrypted */
}
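The sub_sample_info list drives gather/scatter decryption: each pair describes a run of clear bytes followed by a run of encrypted bytes, repeating through the segment. As an illustration, the sketch below (a hypothetical helper, not Rialto code) expands such a list into the byte ranges of a segment that need decryption:

```cpp
#include <cassert>
#include <cstdint>
#include <utility>
#include <vector>

// Mirrors the SubsamplePair message above: how many of the next bytes in the
// segment are clear, then how many are encrypted.
struct SubsamplePair
{
    uint32_t numClearBytes;
    uint32_t numEncryptedBytes;
};

// Returns (offset, length) pairs for the encrypted byte runs within a segment.
std::vector<std::pair<uint32_t, uint32_t>> encryptedRanges(const std::vector<SubsamplePair> &pairs)
{
    std::vector<std::pair<uint32_t, uint32_t>> ranges;
    uint32_t offset = 0;
    for (const auto &p : pairs)
    {
        offset += p.numClearBytes; // skip over the clear run
        if (p.numEncryptedBytes > 0)
            ranges.emplace_back(offset, p.numEncryptedBytes);
        offset += p.numEncryptedBytes;
    }
    return ranges;
}
```

For example, a 16-byte clear NAL header followed by a 100-byte encrypted payload would be described by a single pair {16, 100}, yielding one encrypted range starting at offset 16.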
...
PlantUML Macro:
@startuml
autonumber
box "Container" #LightGreen
participant Cobalt
participant Starboard
participant GStreamer_client
participant rialtoClient
end box
Cobalt -> Starboard: SbPlayerSetPlaybackRate(player, rate)
note over GStreamer_client
For now we assume max 1 pipeline session in GStreamer_client and therefore store its
Rialto handle in a local variable. This will need to be fixed to support dual playback.
end note
opt rate == 0
Starboard -> GStreamer_client: Set pipeline state PAUSED
GStreamer_client -> rialtoClient: pause(pipeline_session)
else rate != 0
Starboard -> GStreamer_client: Set pipeline state PLAYING
GStreamer_client -> rialtoClient: play(pipeline_session)
end
rialtoClient --> GStreamer_client: status
GStreamer_client --> Starboard: status
Starboard -> GStreamer_client: Set pipeline playback rate
GStreamer_client -> rialtoClient: setPlaybackRate(rate)
rialtoClient --> GStreamer_client: status
GStreamer_client --> Starboard: status
Starboard --> Cobalt: status
opt Pause->play successful
rialtoClient -/ GStreamer_client: notifyPlaybackState(pipeline_session, PLAYBACK_STATE_PLAYING)
else Play->pause successful
rialtoClient -/ GStreamer_client: notifyPlaybackState(pipeline_session, PLAYBACK_STATE_PAUSED)
else Play<->pause state change failed
rialtoClient -/ GStreamer_client: notifyPlaybackState(pipeline_session, PLAYBACK_STATE_FAILURE)
end
@enduml
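The rate handling above reduces to a small decision: a rate of 0 maps to pause(), any other rate maps to play() followed by setPlaybackRate(). A minimal sketch of that mapping, with hypothetical names rather than the real Starboard or Rialto client API:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Records, in order, the Rialto client calls that would be made for a given
// SbPlayerSetPlaybackRate() rate. Names are illustrative only.
std::vector<std::string> callsForRate(double rate)
{
    std::vector<std::string> calls;
    if (rate == 0.0)
    {
        // Rate 0 means pause; no playback rate needs to be applied.
        calls.push_back("pause(pipeline_session)");
    }
    else
    {
        // Any non-zero rate means play, then apply the requested rate.
        calls.push_back("play(pipeline_session)");
        calls.push_back("setPlaybackRate(rate)");
    }
    return calls;
}
```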
renderFrame() may be called whilst playback is paused, either at the start of playback or immediately after a seek operation. The client must first wait for the readyToRenderFrame() callback.
PlantUML Macro:
@startuml
autonumber
box "Container" #LightGreen
participant Netflix
participant DPI
participant rialtoClient
end box
box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box
Netflix -> DPI: renderFrame()
DPI -> rialtoClient: renderFrame()
rialtoClient -> rialtoServer: renderFrame()
opt Frame renderable
rialtoServer -> GStreamer_server: Trigger rendering of frame
opt Frame rendered successfully
note across: It is a Netflix requirement to call updatePlaybackPosition() after renderFrame()
rialtoServer --> rialtoClient: notifyPosition(position)
rialtoClient --> DPI: notifyPosition(position)
DPI --> Netflix: updatePlaybackPosition(pts)
end
rialtoServer --> rialtoClient: status=true
else renderFrame() called in bad state
rialtoServer --> rialtoClient: status=false
end
rialtoClient --> DPI: status
DPI --> Netflix: status
@enduml
...
PlantUML Macro:
@startuml
autonumber
box "Container" #LightGreen
participant Cobalt
participant Starboard
participant GStreamer_client_appsrc
participant decrypter_element
participant ocdmProxy
end box
Cobalt -> Starboard: SbPlayerWriteSample2(player, sample[])
note right: Currently sample array size must be 1
Starboard -> Starboard: Create GstBuffer and add media data from sample to it
opt Sample encrypted
Starboard -> GStreamer_client_appsrc: gst_buffer_add_protection_meta(gst_buffer, decryption_params)
end
Starboard -> GStreamer_client_appsrc: gst_app_src_push_buffer(app_src, gst_buffer)
GStreamer_client_appsrc --> decrypter_element: data flows through client pipeline
decrypter_element -> decrypter_element: gst_buffer_get_protection_meta(gst_buffer)
opt Implementation before CBCS support added
decrypter_element -> ocdmProxy: opencdm_gstreamer_session_decrypt_ex(key_session, buffer, sub_samples, iv, kid, init_with_last_15, caps)
else Implementation after CBCS support added
decrypter_element -> ocdmProxy: opencdm_gstreamer_session_decrypt_buffer(key_session, buffer, caps)
end
ocdmProxy -> ocdmProxy: Create gst struct containing encryption data
ocdmProxy -> decrypter_element: gst_buffer_add_protection_meta(buffer, metadata)
note left
Decryption is deferred until the data is sent to Rialto, so the required decryption
parameters are attached to the media frame, ready to be passed to Rialto when it
requests more data.
end note
@enduml
...
PlantUML Macro:
@startuml
autonumber
box "Container" #LightGreen
participant rialtoClient
end box
box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
participant Ocdm
end box
GStreamer_server -> rialtoServer: decrypt(buffer)
rialtoServer -> GStreamer_server: gst_buffer_get_protection_meta(buffer)
opt Protection Meta exists (Frame encrypted)
rialtoServer -> rialtoServer: Extract frame's metadata
opt media_keys.key_system == "com.netflix.playready"
rialtoServer -> Ocdm: opencdm_select_key_id(ocdm_session, kid)
end
opt Implementation before CBCS support added
rialtoServer -> Ocdm: opencdm_gstreamer_session_decrypt_ex(ocdm_session, gst_buffer, subsample_info, iv, key, init_with_last_15, caps)
else Implementation after CBCS support added
rialtoServer -> Ocdm: opencdm_gstreamer_session_decrypt_buffer(ocdm_session, gst_buffer, caps)
end
end
@enduml
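The CBCS path referenced above relies on the crypt_byte_block / skip_byte_block pattern from the metadata: within an encrypted run, 16-byte blocks alternate between crypt_byte_block encrypted blocks and skip_byte_block clear blocks. The helper below is a sketch of that pattern rule (an illustration of CENC pattern encryption, not Rialto code):

```cpp
#include <cassert>
#include <cstdint>

// True if the given 16-byte block index within a pattern-encrypted run is
// encrypted, for the given crypt/skip block counts (e.g. 1:9 is the common
// CBCS pattern: 1 encrypted block followed by 9 clear blocks, repeating).
bool blockIsEncrypted(uint32_t blockIndex, uint32_t cryptByteBlock, uint32_t skipByteBlock)
{
    const uint32_t period = cryptByteBlock + skipByteBlock;
    if (period == 0)
        return true; // no pattern specified: treat the whole run as encrypted
    return (blockIndex % period) < cryptByteBlock;
}
```

With the typical 1:9 pattern, only every tenth 16-byte block is encrypted, which is what makes the crypt/skip values from the metadata necessary for correct decryption.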
...
PlantUML Macro:
@startuml
autonumber
box "Container" #LightGreen
participant Cobalt
participant Starboard
participant GStreamer_client
participant rialtoClient
end box
box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box
== Initialisation - register for callbacks ==
opt Video source attached
rialtoServer -> GStreamer_server: g_signal_connect(video_decoder, getVideoUnderflowSignalName_soc(), video_underflow_cb, user_data)
GStreamer_server --> rialtoServer: video_handler_id
end
opt Audio source attached
rialtoServer -> GStreamer_server: g_signal_connect(audio_decoder, getAudioUnderflowSignalName_soc(), audio_underflow_cb, user_data)
GStreamer_server --> rialtoServer: audio_handler_id
end
== Termination - unregister for callbacks ==
opt Video source removed
rialtoServer -> GStreamer_server: g_signal_handler_disconnect(video_decoder, video_handler_id)
GStreamer_server --> rialtoServer:
end
opt Audio source removed
rialtoServer -> GStreamer_server: g_signal_handler_disconnect(audio_decoder, audio_handler_id)
GStreamer_server --> rialtoServer:
end
== Underflow ==
opt Data starvation in server AV pipeline
GStreamer_server -/ rialtoServer: video_underflow_cb() or audio_underflow_cb()
rialtoServer -> rialtoServer: Set pipeline state to paused
rialtoServer -/ rialtoClient: notifyPlaybackState(pipeline_session, PLAYBACK_STATE_PAUSED)
rialtoClient -/ GStreamer_client: notifyPlaybackState(pipeline_session, PLAYBACK_STATE_PAUSED)
rialtoServer -/ rialtoClient: notifyNetworkState(pipeline_session, NETWORK_STATE_STALLED)
rialtoClient -/ GStreamer_client: notifyNetworkState(pipeline_session, NETWORK_STATE_STALLED)
note across
underflow_enabled: Underflow is enabled when we're in playing state and source is attached.
underflow_cancelled: Underflow may be cancelled when haveData is called between notification from GStreamer and Underflow task handling.
end note
opt underflow_enabled && !underflow_cancelled
rialtoServer -/ rialtoClient: notifyBufferUnderflow(source_id)
rialtoClient -/ GStreamer_client: notifyBufferUnderflow(source_id)
GStreamer_client -/ Starboard: emit video_underflow_cb() or audio_underflow_cb()
end
note over Starboard, GStreamer_client
Starboard does not have any support for underflow so the event can be ignored for this integration.
end note
note across
There will be one or more pending need data requests at this point which if serviced will allow playback to resume
end note
end
== Recovery ==
opt rialtoServer detects that any need media data requests pending at point of underflow are now serviced and pushed to GStreamer_server || EOS signalled for any underflowed sources
note across
It is likely that underflow is due to one source becoming starved whilst data is buffered for other sources, so waiting until pending request(s) are serviced should allow playback to resume. There are also some YT conformance tests that delay signalling EOS for an underflowed source whilst the other continues to stream, hence the EOS condition to allow streaming to resume for the valid source.
end note
rialtoServer -> rialtoServer: Set pipeline state to playing
rialtoServer -/ rialtoClient: notifyNetworkState(pipeline_session, NETWORK_STATE_BUFFERED)
rialtoClient -/ GStreamer_client: notifyNetworkState(pipeline_session, NETWORK_STATE_BUFFERED)
rialtoServer -/ rialtoClient: notifyPlaybackState(pipeline_session, PLAYBACK_STATE_PLAYING)
rialtoClient -/ GStreamer_client: notifyPlaybackState(pipeline_session, PLAYBACK_STATE_PLAYING)
end
@enduml
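The underflow_enabled / underflow_cancelled guard described above can be modelled as a small predicate. This is a hypothetical sketch of the decision logic, not the rialtoServer implementation:

```cpp
#include <cassert>

// Minimal model of the guard: underflow is reported only when the pipeline is
// playing with a source attached, and no haveData call has arrived between the
// GStreamer notification and the underflow task running.
struct UnderflowState
{
    bool playing = false;
    bool sourceAttached = false;
    bool haveDataReceived = false; // haveData raced in before the task ran
};

bool shouldNotifyBufferUnderflow(const UnderflowState &s)
{
    const bool underflowEnabled = s.playing && s.sourceAttached;
    const bool underflowCancelled = s.haveDataReceived;
    return underflowEnabled && !underflowCancelled;
}
```

The cancellation flag captures the race the note describes: if haveData services the starved source before the underflow task runs, no notification is emitted and playback continues.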
Decryption: any encrypted frame that fails to decrypt is dropped, and an error notification is propagated to rialto-gstreamer, at which point a decryption error is raised on the sink.
PlantUML Macro:
@startuml
autonumber
box "Container" #LightGreen
participant Application
participant rialtoGstreamer
participant rialtoClient
end box
box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box
== Decryption ==
GStreamer_server -> rialtoServer: decrypt(buffer)
rialtoServer -> GStreamer_server: MediaKeyErrorStatus::Fail
GStreamer_server -> GStreamer_server: GST_BASE_TRANSFORM_FLOW_DROPPED
note over GStreamer_server
Frame is dropped but playback is unaffected.
end note
GStreamer_server -> rialtoServer: GST_MESSAGE_WARNING(src, GST_STREAM_ERROR_DECRYPT)
rialtoServer -/ rialtoClient: notifyPlaybackError(MediaSourceType, PlaybackError::DECRYPTION)
rialtoClient -/ rialtoGstreamer: notifyPlaybackError(MediaSourceType, PlaybackError::DECRYPTION)
note over rialtoGstreamer
Posting an error message on the sink makes the sink unable to continue playing back content.
end note
rialtoGstreamer -/ Application: GST_MESSAGE_ERROR(sink, GST_STREAM_ERROR_DECRYPT)
@enduml