...

  • Introduce new IWebAudioPlayer interface to allow the client to control PCM injection. This will be implemented in both Rialto Client & Server.
  • Data will be passed in existing shm buffer used for MSE
    • The resources parameter provided to Rialto Application Session Server when it is spawned will be extended to include a max_web_audio_playbacks parameter to determine how many Web Audio playbacks can be performed by this application
    • The shared memory buffer will have a suitably sized Web Audio region allocated if max_web_audio_playbacks > 0
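The region sizing described above can be sketched as follows. This is a minimal illustration, not the real Rialto implementation; in particular kWebAudioBytesPerPlayback is a hypothetical constant standing in for whatever per-playback budget the server actually uses.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Hypothetical per-playback budget; the real value would be tuned to the
// preferred frame count and PCM format.
constexpr std::size_t kWebAudioBytesPerPlayback = 64 * 1024;

// Size of the Web Audio region to reserve in the shared memory buffer.
// A result of zero means no Web Audio region is allocated, matching the
// "allocated if max_web_audio_playbacks > 0" rule above.
std::size_t webAudioRegionSize(std::uint32_t maxWebAudioPlaybacks)
{
    return static_cast<std::size_t>(maxWebAudioPlaybacks) * kWebAudioBytesPerPlayback;
}
```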

...

draw.io Diagram: Web Audio Shm Region


Sequence Diagrams

Initialisation & Termination
PlantUML Macro
formatSVG
titleInit/Term
@startuml

autonumber

box "Container" #LightGreen
participant Client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


== Create Web Audio Player session ==

Client            ->  rialtoClient:     createWebAudioPlayer(client, pcm_params)
note right
This should only be called when the app is in the ACTIVE
state. If the client porting layer requires a persistent
web audio player it will need to manage creation &
destruction of the rialto object on state change.
end note
rialtoClient      ->  rialtoServer:     createWebAudioPlayer(client, pcm_params)
rialtoServer      ->  rialtoServer:     Check resource permissions that were granted to app and currently\nallocated pipelines permit this object to be created
note left: This check is (num_current_web_audio_sessions < resources.max_supported_web_audio)
rialtoServer      ->  rialtoServer:     Check pcm_params valid
note left: sampleSize is valid as long as it is > CHAR_BIT (8 bits)
 

alt Permission & pcm_params checks passed
rialtoServer      ->  rialtoServer:     Store offset & size of media transfer buffer in shm
note left: This data comes from the getSharedMemory() API called when Rialto server entered ACTIVE state
rialtoServer      ->  GStreamer_server: Create rialto server PCM audio pipeline
GStreamer_server  --> rialtoServer:
rialtoServer      --> rialtoClient:     web_audio_session
rialtoClient      --> Client:           web_audio_session

rialtoServer      --/ rialtoClient:     notifyState(web_audio_session, IDLE)
rialtoClient      --/ Client:           notifyState(web_audio_session, IDLE)

else Checks failed

rialtoServer      --> rialtoClient:     nullptr
rialtoClient      --> Client:              nullptr

end


== Initialisation ==

Client            ->  rialtoClient:     getDeviceInfo()
rialtoClient      ->  rialtoServer:     getDeviceInfo()
rialtoServer      --> rialtoClient:     preferred_frames, max_frames, support_deferred_play
note right: preferred_frames=minimum of 640 or max_frames\nmax_frames=shm_region_size/(pcm_params.channels * pcm_params.sampleSize)\nsupport_deferred_play=true
rialtoClient      --> Client:           preferred_frames, max_frames, support_deferred_play


== Destroy Web Audio Player session ==

Client            ->  rialtoClient:     ~IWebAudioPlayer()
note right
This should be called when the app leaves the ACTIVE
state or destroys its web audio player.
end note
rialtoClient      ->  rialtoServer:     ~IWebAudioPlayer()
rialtoServer      ->  GStreamer_server: Destroy PCM audio pipeline
GStreamer_server  --> rialtoServer:
rialtoServer      --> rialtoClient:
rialtoClient      --> Client:


@enduml
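The getDeviceInfo() arithmetic from the note in the diagram above can be sketched as below. The struct and function names are illustrative rather than the real Rialto API, and sampleSize is assumed to be expressed in bits, as the CHAR_BIT validity check earlier suggests.

```cpp
#include <algorithm>
#include <cassert>
#include <climits>
#include <cstdint>

// Illustrative stand-ins for the real Rialto types.
struct PcmParams
{
    std::uint32_t channels;
    std::uint32_t sampleSize; // bits per sample, per the CHAR_BIT check
};

struct DeviceInfo
{
    std::uint32_t preferredFrames;
    std::uint32_t maximumFrames;
    bool supportDeferredPlay;
};

// max_frames            = shm_region_size / bytes-per-frame
// preferred_frames      = minimum of 640 or max_frames
// support_deferred_play = true
DeviceInfo getDeviceInfo(std::uint32_t shmRegionSize, const PcmParams &params)
{
    const std::uint32_t frameBytes = params.channels * (params.sampleSize / CHAR_BIT);
    const std::uint32_t maxFrames = shmRegionSize / frameBytes;
    return {std::min<std::uint32_t>(640u, maxFrames), maxFrames, true};
}
```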

...

PlantUML Macro
formatSVG
titlePush Data Algorithm
@startuml

autonumber

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box

note across
Gstreamer app source uses 2 signals, need-data and enough-data, to notify its
client whether it needs more data. Rialto server should only push data when
the appsrc indicates that it is in the need-data state.
end note


loop While appsrc needs data && data available in web audio shm region

rialtoServer       ->  GStreamer_server:  gst_app_src_get_current_level_bytes(src)
GStreamer_server   --> rialtoServer:      bytes_in_gst_queue
rialtoServer       ->  GStreamer_server:  gst_buffer_new_allocate(size)
note right: size of gst_buffer is either free_bytes in src or size of samples in shm
GStreamer_server   --> rialtoServer:      gst_buffer
rialtoServer       ->  rialtoServer:      Fill gst_buffer with next set of samples from shm
note right: This will need to perform endianness conversion if necessary
rialtoServer       ->  GStreamer_server:  gst_app_src_push_buffer(src, gst_buffer)
rialtoServer       ->  rialtoServer:      Update internal shm variables for consumed data

end

opt Appsrc data exhausted from shm && internal EOS flag set
rialtoServer       ->  GStreamer_server:  notify EOS
end

@enduml
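The per-iteration copy out of the shm region, including the endianness conversion the note mentions, might look like the sketch below. It assumes 16-bit samples purely for illustration; the function name and signature are not part of the design.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Reads sampleCount 16-bit samples from the web audio shm region into a
// buffer destined for gst_app_src_push_buffer(), byte-swapping when the
// source endianness differs from the host, then advances the read offset
// ("update internal shm variables for consumed data").
std::vector<std::uint16_t> readSamples(const std::uint8_t *shm, std::size_t &readOffset,
                                       std::size_t sampleCount, bool swapBytes)
{
    std::vector<std::uint16_t> out(sampleCount);
    for (std::size_t i = 0; i < sampleCount; ++i)
    {
        const std::uint8_t first = shm[readOffset + 2 * i];
        const std::uint8_t second = shm[readOffset + 2 * i + 1];
        out[i] = swapBytes ? static_cast<std::uint16_t>((first << 8) | second)
                           : static_cast<std::uint16_t>((second << 8) | first);
    }
    readOffset += 2 * sampleCount;
    return out;
}
```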


Play & Pause

PlantUML Macro
formatSVG
titlePlay & Pause
@startuml

autonumber

box "Container" #LightGreen
participant Client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


== Start/resume playback ==

Client            ->  rialtoClient:     play(web_audio_session)
rialtoClient      ->  rialtoServer:     play(web_audio_session)
rialtoServer      ->  GStreamer_server: Set pipeline state to PLAYING
GStreamer_server  --> rialtoServer:     status
rialtoServer      --> rialtoClient:     status
rialtoClient      --> Client:           status

alt status==true
rialtoServer      --/ rialtoClient:     notifyState(web_audio_session, PLAYING)
rialtoClient      --/ Client:           notifyState(web_audio_session, PLAYING)
else status==false
rialtoServer      --/ rialtoClient:     notifyState(web_audio_session, FAILURE)
rialtoClient      --/ Client:           notifyState(web_audio_session, FAILURE)
end


== Pause playback ==

Client            ->  rialtoClient:     pause(web_audio_session)
rialtoClient      ->  rialtoServer:     pause(web_audio_session)
rialtoServer      ->  GStreamer_server: Set pipeline state to PAUSED
GStreamer_server  --> rialtoServer:     status
rialtoServer      --> rialtoClient:     status
rialtoClient      --> Client:           status

alt status==true
rialtoServer      --/ rialtoClient:     notifyState(web_audio_session, PAUSED)
rialtoClient      --/ Client:           notifyState(web_audio_session, PAUSED)
else status==false
rialtoServer      --/ rialtoClient:     notifyState(web_audio_session, FAILURE)
rialtoClient      --/ Client:           notifyState(web_audio_session, FAILURE)
end


@enduml
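Both sequences above follow the same pattern: return the synchronous status, then asynchronously notify either the target state or FAILURE. A minimal sketch of that dispatch (names hypothetical, not the Rialto API):

```cpp
#include <cassert>
#include <functional>

enum class WebAudioState { PLAYING, PAUSED, FAILURE };

// After a play()/pause() request, emit the asynchronous notifyState():
// the requested state if the pipeline state change succeeded, FAILURE
// otherwise.
void notifyAfterStateChange(bool status, WebAudioState target,
                            const std::function<void(WebAudioState)> &notifyState)
{
    notifyState(status ? target : WebAudioState::FAILURE);
}
```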


Get Buffer Delay

PlantUML Macro
formatSVG
titleGet Delay Frames
@startuml

autonumber

box "Container" #LightGreen
participant Client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
end box


Client            ->  rialtoClient:     getBufferDelay(web_audio_session)
rialtoClient      ->  rialtoServer:     getBufferDelay(web_audio_session)
rialtoServer      ->  rialtoServer:     Calculate delay_frames
note right
Directly get the amount of buffered data:
  delay = gst_queued_frames() + shm_queued_frames();
end note
rialtoServer      --> rialtoClient:     delay_frames
rialtoClient      --> Client:           delay_frames


@enduml
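The delay calculation from the note can be sketched as below. The gst-side contribution is assumed to be derived from the appsrc byte level reported by gst_app_src_get_current_level_bytes(); the function name and the bits-per-sample frame-size derivation are illustrative.

```cpp
#include <cassert>
#include <cstdint>

// delay = gst_queued_frames() + shm_queued_frames()
// bytesInGstQueue is the appsrc level; it is converted to frames using
// the session's PCM format before the shm backlog is added.
std::uint64_t delayFrames(std::uint64_t bytesInGstQueue, std::uint64_t shmQueuedFrames,
                          std::uint32_t channels, std::uint32_t sampleSizeBits)
{
    const std::uint32_t frameBytes = channels * (sampleSizeBits / 8);
    return bytesInGstQueue / frameBytes + shmQueuedFrames;
}
```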

...

Set Volume

PlantUML Macro
formatSVG
titleSet Volume
@startuml

autonumber

box "Container" #LightGreen
participant Client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


Client            ->  rialtoClient:     setVolume(web_audio_session, volume)
rialtoClient      ->  rialtoServer:     setVolume(web_audio_session, volume)
rialtoServer      ->  GStreamer_server: gst_stream_volume_set_volume(pipeline, GST_STREAM_VOLUME_FORMAT_LINEAR, volume)
GStreamer_server  --> rialtoServer:     status
rialtoServer      --> rialtoClient:     status
rialtoClient      --> Client:           status
@enduml
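gst_stream_volume_set_volume() with GST_STREAM_VOLUME_FORMAT_LINEAR takes a linear scale factor where 1.0 is 100%, so the server would likely sanitise the client-supplied value first. The clamping policy below is an assumption, not something the design specifies:

```cpp
#include <algorithm>
#include <cassert>

// Clamps a client-supplied volume into the linear [0.0, 1.0] range before
// it is handed to gst_stream_volume_set_volume(). Assumes amplification
// above 1.0 is not wanted for web audio playback.
double clampLinearVolume(double volume)
{
    return std::clamp(volume, 0.0, 1.0);
}
```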

Get Volume

PlantUML Macro
formatSVG
titleGet Volume
@startuml
autonumber

box "Container" #LightGreen
participant Client
participant rialtoClient
end box

box "Platform" #LightBlue
participant rialtoServer
participant GStreamer_server
end box


Client  ->  rialtoClient:     getVolume(web_audio_session)
rialtoClient      ->  rialtoServer:     getVolume(web_audio_session)
rialtoServer      ->  GStreamer_server: gst_stream_volume_get_volume(pipeline, GST_STREAM_VOLUME_FORMAT_LINEAR)
GStreamer_server  --> rialtoServer:     volume
rialtoServer      --> rialtoClient:     volume
rialtoClient      --> Client:           volume

@enduml