...

PlantUML Macro
formatSVG
titleRender Frame
@startuml

autonumber

box "Container" #LightGreen
participant Netflix
participant DPI
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box

Netflix            ->  DPI:              renderFrame()
DPI                ->  rialtoClient:     renderFrame()
rialtoClient       ->  rialtoServer:     renderFrame()
opt Frame renderable
rialtoServer       ->  GStreamer_server: Trigger rendering of frame

opt Frame rendered successfully

note across: It is a Netflix requirement to call updatePlaybackPosition() after a frame is rendered

rialtoServer       --> rialtoClient:     notifyPosition(position)
rialtoClient       --> DPI:              notifyPosition(position)
DPI                --> Netflix:          updatePlaybackPosition(pts)

rialtoServer       --> rialtoClient:     status=true
else
rialtoServer       --> rialtoClient:     status=false
end

else renderFrame() called in bad state
rialtoServer       --> rialtoClient:     status=false
end

rialtoClient       --> DPI:              status
DPI                --> Netflix:          status

@enduml


Media data pipeline

Note that the data pipelines for different data sources (e.g. audio & video) should operate entirely independently. Rialto should

  • attempt to keep the shm buffer as full as possible by requesting a refill for that source whenever the source's memory buffer is empty
  • attempt to push all available frames for a source to GStreamer, i.e. push until Gstreamer indicates that it can accept no more data
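The two policies above can be sketched as a small per-source loop. This is an illustrative model only: `SourcePipeline`, `kShmCapacity`, and the `gstreamerAccepts` callback are invented stand-ins, not the real Rialto API; the callback plays the role of the appsrc need-data/enough-data state.

```cpp
#include <cstddef>
#include <deque>

// Hypothetical sketch of the per-source buffering policy described above.
struct SourcePipeline {
    std::deque<std::size_t> shmFrames;   // frame sizes currently held in shm
    std::size_t shmBytesUsed = 0;
    static constexpr std::size_t kShmCapacity = 1024;

    bool refillRequested = false;

    // Request more data whenever the source's shm buffer drains,
    // i.e. the point at which notifyNeedMediaData() would be sent.
    void onFrameConsumed() {
        if (shmFrames.empty()) {
            refillRequested = true;
        }
    }

    // Push frames to GStreamer until it reports it can accept no more.
    std::size_t pushAll(bool (*gstreamerAccepts)(std::size_t pushedSoFar)) {
        std::size_t pushed = 0;
        while (!shmFrames.empty() && gstreamerAccepts(pushed)) {
            shmBytesUsed -= shmFrames.front();
            shmFrames.pop_front();
            ++pushed;
            onFrameConsumed();
        }
        return pushed;
    }
};
```

Each `SourcePipeline` instance (audio, video) runs this loop independently, which is the point of the requirement above.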

Cobalt to Gstreamer


PlantUML Macro
formatSVG
titleCobalt pushing media frames
@startuml

autonumber

box "Container" #LightGreen
participant Cobalt
participant Starboard
participant GStreamer_client_appsrc
participant decrypter_element
participant ocdmProxy
end box


Cobalt                        ->  Starboard:                    SbPlayerWriteSample2(player, sample[])
note right: Currently sample array size must be 1
Starboard                     ->  Starboard:                    Create GstBuffer and add media data from sample to it

opt Sample encrypted
Starboard                     ->  GStreamer_client_appsrc:      gst_buffer_add_protection_meta(gst_buffer, decryption_params)
end

Starboard                     ->  GStreamer_client_appsrc:      gst_app_src_push_buffer(app_src, gst_buffer)
GStreamer_client_appsrc       --> decrypter_element:            data flows through client pipeline
decrypter_element             ->  decrypter_element:            gst_buffer_get_protection_meta(buffer)

opt Implementation before CBCS support added
decrypter_element             ->  ocdmProxy:                    opencdm_gstreamer_session_decrypt_ex(key_session, buffer, sub_samples, iv, kid, init_with_last_15, caps)
else Implementation after CBCS support added
decrypter_element             ->  ocdmProxy:                    opencdm_gstreamer_session_decrypt_buffer(key_session, buffer, caps)
end

ocdmProxy                     ->  ocdmProxy:                    Create gst struct containing encryption data decryption_params
ocdmProxy                     ->  decrypter_element:            gst_buffer_add_protection_meta(buffer, metadata)
note left
Decryption is deferred until the data is sent to Rialto so
attach the required decryption parameters to the media frame
which are then ready to be passed to Rialto when it requests
more data.
end note

@enduml



Netflix to Rialto Client

PlantUML Macro
formatSVG
titleNetflix to Rialto Client
@startuml

autonumber

box "Container" #LightGreen
participant Netflix
participant DPI
participant rialtoClient
end box

rialtoClient      -/  DPI:              notifyNeedMediaData(pipeline_session, sourceId, frame_count, need_data_request_id, shm_info)
DPI               --> rialtoClient:

opt Cached segment stored from previous need data request
DPI               ->  rialtoClient:     addSegment(need_data_request_id, cached_media_segment)
rialtoClient      --> DPI:              status
note right: status!=OK should never happen here
end

loop While (frames_written < frame_count) && (addSegment() returns OK) && (get_next_media_sample_status == OK)

DPI               -/  Netflix:          getNextMediaSample(es_player, sample_writer)
Netflix           ->  DPI:              initSample(sample_writer, sample_attributes)
DPI               ->  DPI:              Cache sample_attributes
DPI               --> Netflix:          status
Netflix           ->  DPI:              write(sample_writer, data)
DPI               ->  DPI:              Create MediaSegment object from data and cached\nsample_attributes (including any decryption attributes)
DPI               ->  rialtoClient:     addSegment(need_data_request_id, media_segment)

opt Encrypted content && key session ID present in map (see Select Key ID)
rialtoClient      ->  rialtoClient:     Set key_id in media_segment to value\nfound in map for this key session ID
note left: MKS ID should only be found in map for Netflix content
end

rialtoClient      --> DPI:              status

opt status==NO_SPACE
DPI               ->  DPI:              Cache segment for next need data request
note right
This will require allocating temporary
buffer to store the media data but this
should happen very rarely in practise.

*TODO:* Consider adding canAddSegment()
Rialto API so that initSample() could
return NO_AVAILABLE_BUFFERS to cancel
this request and avoid the need for
the temporary media data cache.
end note
end

DPI               --> Netflix:          write_status
Netflix           --> DPI:              get_next_media_sample_status
end

opt get_next_media_sample_status == OK
DPI               ->  DPI:              have_data_status = OK
else get_next_media_sample_status == NO_AVAILABLE_SAMPLES
DPI               ->  DPI:              have_data_status = NO_AVAILABLE_SAMPLES
else get_next_media_sample_status == END_OF_STREAM
DPI               ->  DPI:              have_data_status = EOS
else
DPI               ->  DPI:              have_data_status = ERROR
end

DPI               ->  rialtoClient:     haveData(pipeline_session, have_data_status, need_data_request_id)

opt Data accepted

opt Frames pushed for all attached sources && buffered not sent
rialtoClient      -/  DPI:              notifyNetworkState(NETWORK_STATE_BUFFERED)
end

rialtoClient      --> DPI:              OK
else Error
rialtoClient      --> DPI:              ERROR
rialtoClient      -/  DPI:              notifyPlaybackState(PLAYBACK_STATE_FAILURE)
end

opt First video frame at start of playback or after seek ready for rendering
opt notifyFrameReady not currently implemented
rialtoClient      -/  DPI:              notifyFrameReady(time_position)
else
rialtoClient      -/  DPI:              notifyPlaybackState(PLAYBACK_STATE_PAUSED)
end
DPI               -/  Netflix:          readyToRenderFrame(pts=time_position)
end

@enduml


Cobalt to Rialto Client

PlantUML Macro
formatSVG
titleCobalt to Rialto Client
@startuml

autonumber

box "Container" #LightGreen
participant Cobalt
participant Starboard
participant rialtoClient
end box

== Initialisation ==

Cobalt            ->  Starboard:        SbPlayerGetMaximumNumberOfSamplesPerWrite(player, sample_type)
Starboard         --> Cobalt:           max_samples=1
note right: Specify that Cobalt only send 1 sample at a time


== Write samples ==

rialtoClient      -/  Starboard:        notifyNeedMediaData(pipeline_session, sourceId, frame_count, need_data_request_id, shm_info)
Starboard         --> rialtoClient:

opt Cached segment stored from previous need data request
Starboard         ->  rialtoClient:     addSegment(need_data_request_id, cached_media_segment)
rialtoClient      --> Starboard:        status
note right: status!=OK should\nnever happen here
end

loop While (frames_written < frame_count) && (addSegment() returns OK) && (not end of stream)

Starboard         ->  Starboard:        Convert sourceId to media type
Starboard         -/  Cobalt:           SbPlayerDecoderStatusFunc(player, media_type, kSbPlayerDecoderStateNeedsData, ticket)
note right: ticket should be set to ticket value in last call to SbPlayerSeek()
Cobalt            --> Starboard:

opt Not end of stream
Cobalt            ->  Starboard:        SbPlayerWriteSample2(player, sample_type, samples, num_samples)
note right: num_samples!=1 is an error
Starboard         ->  Starboard:        Construct media_segment from sample, including any decryption parameters (drm_info)
Starboard         ->  rialtoClient:     addSegment(need_data_request_id, media_segment)
rialtoClient      --> Starboard:        status
opt status==NO_SPACE
Starboard         ->  Starboard:        Cache segment for next need data request
note right
This will require allocating temporary
buffer to store the media data but this
should happen very rarely in practise.

*TODO:* Consider adding canAddSegment()
Rialto API so that initSample() could
return NO_AVAILABLE_BUFFERS to cancel
this request and avoid the need for 
the temporary media data cache.
end note
end

else End of stream
Cobalt            ->  Starboard:        SbPlayerWriteEndOfStream(player, stream_type)
end

Starboard         --> Cobalt:
end

opt Not end of stream
Starboard         ->  Starboard:        have_data_status = OK
else End of stream
Starboard         ->  Starboard:        have_data_status = EOS
end

Starboard         ->  rialtoClient:     haveData(pipeline_session, have_data_status, need_data_request_id)

@enduml
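The NO_SPACE path in the diagrams above (cache the rejected segment, resubmit it first on the next need-data request) can be modelled as below. This is a sketch under stated assumptions: `SegmentWriter`, `AddStatus`, and the `addSegment` callback are illustrative names, not the real Rialto client API.

```cpp
#include <cstdint>
#include <optional>
#include <vector>

// Illustrative status values mirroring the diagram.
enum class AddStatus { OK, NO_SPACE };

struct SegmentWriter {
    std::optional<std::vector<std::uint8_t>> cached;

    // Submit a segment; on NO_SPACE, cache it for the next need-data request.
    AddStatus submit(std::vector<std::uint8_t> segment,
                     AddStatus (*addSegment)(const std::vector<std::uint8_t>&)) {
        AddStatus st = addSegment(segment);
        if (st == AddStatus::NO_SPACE) {
            cached = std::move(segment);   // retried on next notifyNeedMediaData
        }
        return st;
    }

    // Called first when servicing a new need-data request.
    bool flushCached(AddStatus (*addSegment)(const std::vector<std::uint8_t>&)) {
        if (!cached) {
            return true;
        }
        if (addSegment(*cached) == AddStatus::OK) {
            cached.reset();
            return true;
        }
        return false;   // status!=OK "should never happen here"
    }
};
```

The temporary buffer cost this implies is the motivation for the `canAddSegment()` TODO in the diagrams.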

...

PlantUML Macro
formatSVG
titleServer only mode
@startuml

autonumber

box "Platform" #LightBlue
participant client
participant rialtoServer
end box

rialtoServer      -/  client:           notifyNeedMediaData(pipeline_session, sourceId, frameCount, maxBytes, needDataRequestId, shmInfo)
client            ->  client:           Ignore shmInfo
client            --> rialtoServer:

loop While (framesFetched < frameCount) && (addSegment() returns OK)
client            ->  client:           Get next frame & any decryption metadata
client            ->  rialtoServer:     addSegment(needDataRequestId, segment)

opt needDataRequestId is valid && enough space to write segment and its metadata to shm region && client not trying to send too many frames
rialtoServer      ->  rialtoServer:     Copy segment & metadata to shm buffer based on shmInfo for this request ID
rialtoServer      --> client:           OK
else Not enough space in shm region || client trying to send too many frames
rialtoServer      --> client:           NO_SPACE
else needDataRequestId not found
rialtoServer      --> client:           OK
note right: Silently ignore calls with invalid request ID as this is possible due to race conditions
end



end

note over client
Set status following same rules as in client/server mode
end note
client            ->  rialtoServer:     haveData(pipeline_session, status, needDataRequestId)
note across: From this point processing follows the same flow as shown in the client-server diagram.

@enduml
Note

The code for populating the shm buffer from the parameters to addSegment() will be common on the client & server side so this should be stored in a common location to be used by both implementations.


See also Rialto Client MSE Player Session Streaming State Machine for some additional clarity on how the Rialto client should manage the flow of data in particular regard to seek operations.

Rialto Server to Gstreamer server

This algorithm should be run for all attached sources. A haveData() call in the above sequence can restart the algorithm when it previously stopped due to data exhaustion.
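A rough model of that push loop for one source is shown here; `PushLoop` and its members are illustrative, with `appsrcNeedsData` standing in for the appsrc need-data/enough-data state and `onHaveData()` playing the role of the haveData() restart described above.

```cpp
#include <deque>

class PushLoop {
public:
    std::deque<int> shm;                 // frames waiting in shared memory
    bool appsrcNeedsData = true;
    bool needDataOutstanding = false;

    // Push while the appsrc wants data and frames remain in shm.
    // Returns the number of frames pushed this run.
    int run() {
        int pushed = 0;
        while (appsrcNeedsData && !shm.empty()) {
            shm.pop_front();             // gst_app_src_push_buffer + 'remove' from shm
            ++pushed;
        }
        if (shm.empty() && !needDataOutstanding) {
            needDataOutstanding = true;  // notifyNeedMediaData(...)
        }
        return pushed;
    }

    // haveData() from the client delivers frames and restarts the loop
    // after it previously stopped due to data exhaustion.
    int onHaveData(std::deque<int> newFrames) {
        needDataOutstanding = false;
        for (int f : newFrames) {
            shm.push_back(f);
        }
        return run();
    }
};
```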


PlantUML Macro
formatSVG
titlePushing data to Gstreamer server pipeline
@startuml

autonumber

box "Platform" #LightGreen
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
participant Ocdm
end box

note across
Gstreamer app source uses 2 signals, need-data and enough-data,
to notify its client whether it needs more data. Rialto server
should only push data when the appsrc indicates that it is in
the need-data state.
end note


loop While appsrc needs data && appsrc data available in shm buffer

rialtoServer       ->  rialtoServer:      Extract frame's metadata from shm

opt Frame encrypted
rialtoServer       ->  GStreamer_server:  gst_buffer_add_protection_meta(buffer, meta)
end

rialtoServer       ->  GStreamer_server:  Set width/height caps
opt new codec_data in frame
rialtoServer       ->  GStreamer_server:  Set codec_data caps
end

rialtoServer       ->  GStreamer_server:  gst_app_src_push_buffer(src, gst_buffer)
rialtoServer       ->  rialtoServer:      'Remove' frame from shm

opt First video frame pushed at start of playback / after seek
note across: Not currently implemented
rialtoServer       --/ rialtoClient:      notifyFrameReady(frame_timestamp)
end

opt Appsrc data exhausted from shm
opt (status == EOS) for this appsrc
rialtoServer       ->  GStreamer_server:  notify EOS
else Not EOS
rialtoServer       --/ rialtoClient:      notifyNeedMediaData(...)
end
end

end

@enduml


Frames are decrypted in the pipeline when they are pulled for playback.


PlantUML Macro
formatSVG
titleDecrypt Frames on Gstreamer server pipeline
@startuml

autonumber

box "Platform" #LightGreen
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
participant Ocdm
end box

GStreamer_server   ->  rialtoServer:      decrypt(buffer)

rialtoServer       ->  GStreamer_server:  gst_buffer_get_protection_meta(buffer)

opt Protection Meta exists (Frame encrypted)

rialtoServer       ->  rialtoServer:      Extract frame's metadata

opt media_keys.key_system == "com.netflix.playready"
rialtoServer       ->  Ocdm:              opencdm_select_key_id(ocdm_session, kid)
end

opt Implementation before CBCS support added
rialtoServer       ->  Ocdm:              opencdm_gstreamer_session_decrypt_ex(ocdm_session, gst_buffer, subsample_info, iv, key, init_with_last_15, caps)
else Implementation after CBCS support added
rialtoServer       ->  Ocdm:              opencdm_gstreamer_session_decrypt_buffer(ocdm_session, gst_buffer, caps)
end

end

@enduml

Playback State

Position Reporting

The position reporting timer should be started whenever the PLAYING state is entered and stopped whenever the session moves to another playback state, i.e. stop polling during IDLE, BUFFERING, SEEKING etc.
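The rule above reduces to a single state-transition hook. A minimal sketch, assuming invented names (`PositionReporter`, the state enum) and modelling the timer as a flag rather than a real periodic timer:

```cpp
// Playback states named in the text; only PLAYING keeps the poll timer alive.
enum class PlaybackState { IDLE, BUFFERING, SEEKING, PLAYING, PAUSED };

class PositionReporter {
public:
    bool timerRunning = false;

    void onStateChanged(PlaybackState state) {
        if (state == PlaybackState::PLAYING) {
            timerRunning = true;    // start periodic notifyPosition() polling
        } else {
            timerRunning = false;   // stop polling in IDLE, BUFFERING, SEEKING, ...
        }
    }
};
```

Centralising the start/stop decision in one transition handler avoids leaking a running timer across SEEKING or BUFFERING.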


PlantUML Macro
formatSVG
titlePosition updates
@startuml


autonumber

box "Container" #LightGreen
participant Cobalt
participant Starboard
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box

== Regular position update notifications ==

rialtoServer     ->  rialtoServer:     Position timer fired
rialtoServer     ->  GStreamer_server: Get position from pipeline
GStreamer_server --> rialtoServer:     Current position
rialtoServer     -/  rialtoClient:     notifyPosition(pipeline_session, position)
rialtoClient     -/  GStreamer_client: notifyPosition(pipeline_session, position)
note over GStreamer_client: Not used by Cobalt as some conformance\ntests require very high position accuracy


== Get position ==

Cobalt           ->  Starboard:        SbPlayerGetInfo2(player)
Starboard        ->  GStreamer_client: Get position
GStreamer_client ->  rialtoClient:     getPosition(pipeline_session)
rialtoClient     ->  rialtoServer:     getPosition(pipeline_session)
rialtoServer     ->  GStreamer_server: Get position from pipeline
GStreamer_server --> rialtoServer:     position
rialtoServer     --> rialtoClient:     position
rialtoClient     --> GStreamer_client: position
GStreamer_client --> Starboard:        position
Starboard        ->  Starboard:        Set player_info.pos = position
Starboard        --> Cobalt:           player_info

@enduml

End of stream

PlantUML Macro
formatSVG
titleEnd of stream
@startuml


autonumber

box "Container" #LightGreen
participant Cobalt
participant Starboard
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box

opt End of content reached
GStreamer_server   -/  rialtoServer:         GST_MESSAGE_EOS
rialtoServer       -/  rialtoClient:         notifyPlaybackState(pipeline_session, END_OF_STREAM)
rialtoClient       -/  GStreamer_client:     notifyPlaybackState(pipeline_session, END_OF_STREAM)
note left
This should notify all attached sinks of EOS
end note
GStreamer_client   -/  Starboard:            GST_MESSAGE_EOS
Starboard          -/  Cobalt:               PlayerStatus(player, kSbPlayerStateEndOfStream)
end

@enduml


Underflow

PlantUML Macro
formatSVG
titleUnderflow
@startuml

autonumber

box "Container" #LightGreen
participant Cobalt
participant Starboard
participant GStreamer_client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box

== Initialisation - register for callbacks ==

opt Video source attached
rialtoServer       ->  GStreamer_server:     g_signal_connect(video_decoder, getVideoUnderflowSignalName_soc(), video_underflow_cb, user_data);
GStreamer_server   --> rialtoServer:         video_handler_id
end

opt Audio source attached
rialtoServer       ->  GStreamer_server:     g_signal_connect(audio_decoder, getAudioUnderflowSignalName_soc(), audio_underflow_cb, user_data);
GStreamer_server   --> rialtoServer:         audio_handler_id
end


== Termination - unregister for callbacks ==

opt Video source removed
rialtoServer       ->  GStreamer_server:     g_signal_handler_disconnect(video_decoder, video_handler_id);
GStreamer_server   --> rialtoServer:
end

opt Audio source removed
rialtoServer       ->  GStreamer_server:     g_signal_handler_disconnect(audio_decoder, audio_handler_id);
GStreamer_server   --> rialtoServer:
end


== Underflow ==

opt Data starvation in server AV pipeline
GStreamer_server   -/  rialtoServer:         video_underflow_cb() or audio_underflow_cb()
note across
underflow_enabled: Underflow is enabled when we're in playing state and source is attached.
underflow_cancelled: Underflow may be cancelled when haveData is called between notification from GStreamer and Underflow task handling.
end note
opt underflow_enabled && !underflow_cancelled
rialtoServer       -/  rialtoClient:         notifyBufferUnderflow(source_id)
rialtoClient       -/  GStreamer_client:     notifyBufferUnderflow(source_id)
GStreamer_client   -/  Starboard:            emit video_underflow_cb() or audio_underflow_cb()

note over Starboard, GStreamer_client
Starboard does not have any support for underflow
so the event can be ignored for this integration.
end note

end

note across
There will be one or more pending need data requests at this point which if serviced will allow playback to resume
end note

end

@enduml


Non-fatal Playback Failures

Decryption: Any encrypted frames that fail to decrypt are dropped, and an error notification is propagated to rialto-gstreamer, at which point a decryption error is raised on the sink.
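The drop-and-notify behaviour can be sketched as follows. Names here (`DecryptStatus`, `FlowReturn`, `ErrorSink`, `handleDecrypted`) are illustrative stand-ins for the real element code; in the actual GStreamer element the drop is expressed as `GST_BASE_TRANSFORM_FLOW_DROPPED`.

```cpp
enum class DecryptStatus { Ok, Fail };
enum class FlowReturn { Ok, Dropped };

// Stand-in for the notification path towards rialto-gstreamer.
struct ErrorSink {
    int decryptionErrors = 0;
    void notifyPlaybackError() { ++decryptionErrors; }
};

// On failure the frame is dropped and an error notification is raised;
// the pipeline itself keeps running, so playback is unaffected.
inline FlowReturn handleDecrypted(DecryptStatus s, ErrorSink& sink) {
    if (s == DecryptStatus::Fail) {
        sink.notifyPlaybackError();
        return FlowReturn::Dropped;
    }
    return FlowReturn::Ok;
}
```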

PlantUML Macro
formatSVG
titleNon-fatal Errors
@startuml

autonumber

box "Container" #LightGreen
participant Application
participant rialtoGstreamer
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server 
end box

== Decryption ==
GStreamer_server   ->  rialtoServer:      decrypt(buffer)

rialtoServer       --> GStreamer_server:  MediaKeyErrorStatus::Fail
GStreamer_server   ->  GStreamer_server:  GST_BASE_TRANSFORM_FLOW_DROPPED

note over GStreamer_server
Frame is dropped but playback is unaffected.
end note 
GStreamer_server   ->  GStreamer_server:  GST_MESSAGE_WARNING(src, GST_STREAM_ERROR_DECRYPT)

rialtoServer       -/  rialtoClient:      notifyPlaybackError(MediaSourceType, PlaybackError::DECRYPTION)
rialtoClient       -/  rialtoGstreamer:   notifyPlaybackError(MediaSourceType, PlaybackError::DECRYPTION)

note over rialtoGstreamer
Posting an error message on the sink makes the
sink unable to continue playing back content.
end note

rialtoGstreamer    -/  Application:       GST_MESSAGE_ERROR(sink, GST_STREAM_ERROR_DECRYPT)

@enduml