
GStreamer Abstraction Layer & Plugin

AAMP has two modes of operation - GStreamer player (when the JS/Native APIs are used by the application) and GStreamer element (when the application manages the playbin, as in the case of HTML5 video playback).

AAMP as Gstreamer player

In this mode of operation, AAMP manages the GStreamer pipeline. AAMP creates one playbin for each stream (audio or video) and adds these playbins to the pipeline. AAMP manages the pipeline state and injects data into the playbins through the GStreamer appsrc element. AAMP sets the caps of the appsrc src pads based on the format of the injected data; buffers and events are sent to GStreamer through these appsrc objects.
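A minimal sketch of this mode of operation, assuming illustrative element names and H.264 elementary-stream caps (the real caps depend on the format AAMP injects):

```cpp
#include <gst/gst.h>

// Called by playbin when it instantiates the source for the "appsrc://" URI;
// AAMP-like code would keep a reference to the appsrc and configure its caps here.
static void on_source_setup(GstElement * /*playbin*/, GstElement *source, gpointer /*user_data*/)
{
    // H.264 elementary-stream caps are assumed here; the real caps depend on
    // the format of the data being injected (ES from the s/w demux, TS otherwise).
    GstCaps *caps = gst_caps_new_simple("video/x-h264",
                                        "stream-format", G_TYPE_STRING, "byte-stream",
                                        NULL);
    g_object_set(source, "caps", caps, "format", GST_FORMAT_TIME, NULL);
    gst_caps_unref(caps);
}

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    // One playbin per stream (audio or video); the "appsrc://" URI makes playbin
    // create an appsrc internally instead of a network source.
    GstElement *playbin = gst_element_factory_make("playbin", "video-playbin");
    g_object_set(playbin, "uri", "appsrc://", NULL);
    g_signal_connect(playbin, "source-setup", G_CALLBACK(on_source_setup), NULL);

    // The player manages the pipeline state and pushes downloaded buffers and
    // events into the appsrc as they become available.
    gst_element_set_state(playbin, GST_STATE_PLAYING);
    /* ... download fragments and push buffers into the appsrc ... */
    gst_element_set_state(playbin, GST_STATE_NULL);
    gst_object_unref(playbin);
    return 0;
}
```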

AAMP as Gstreamer element

In this mode of operation, AAMP is a GStreamer element. The application usually manages the lifecycle of the playbin, which contains aamp as one of its elements. A dummy element, aampsrc, handles the "aamp://" and "aamps://" protocols, which are replaced with http internally. The aamp GStreamer element manages the state of the AAMP engine, fetches media data and injects it into the next element. It exposes dynamic pads whose caps correspond to the format of the data injected on each srcpad.
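A small sketch of how the protocol resolution works: given an "aamp://" URI, playbin (or any code using the GStreamer registry) asks for a source element that claims that protocol, which is what aampsrc registers for. The URL below is only an illustrative placeholder:

```cpp
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    // Ask the registry for a source element that handles the "aamp://" protocol.
    // On a box with the AAMP plugins installed this resolves to aampsrc.
    GError *error = NULL;
    GstElement *src = gst_element_make_from_uri(GST_URI_SRC,
                                                "aamp://example.com/master.m3u8",
                                                "source", &error);
    if (src)
        g_print("protocol handled by: %s\n",
                GST_OBJECT_NAME(gst_element_get_factory(src)));
    else
        g_printerr("no handler for aamp:// (%s)\n",
                   error ? error->message : "unknown");

    if (error) g_error_free(error);
    if (src) gst_object_unref(src);
    return 0;
}
```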

AAMP provides four gstreamer plugins.

  1. aampsrc - Dummy protocol handler for the aamp:// and aamps:// URL prefixes. Used only for HTML5 video playback.
  2. aamp - Manages an AAMP instance and enables playback of DASH and HLS streams. Used only for HTML5 video playback. Instead of using AAMPGstPlayer, this plugin provides buffers on its src pads, which playbin uses to connect the required demux/decoder elements.
  3. aampwidevinedecryptor - Decrypts Widevine-encrypted samples.
  4. aampplayreadydecryptor - Decrypts PlayReady-encrypted samples.

Software demux vs hardware demux

For HLS, AAMP by default uses a software demux that is part of the AAMP code. If codec information is not available in the manifest, AAMP does not demux the TS fragments itself; instead, playbin plugs the platform's hardware demux GStreamer element to demux them. In summary, AAMP outputs elementary streams when the software demux is used, and MPEG-TS buffers otherwise, which results in the hardware demux being used.
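A sketch of the caps the player would put on its appsrc in the two cases; the concrete field values here are assumptions, not taken from AAMP:

```cpp
#include <gst/gst.h>

// Software demux in AAMP: elementary streams are injected, so the caps
// describe the codec directly and playbin only plugs a parser/decoder.
static GstCaps *caps_for_sw_demux(void)
{
    return gst_caps_new_simple("video/x-h264",
                               "stream-format", G_TYPE_STRING, "byte-stream",
                               "alignment",     G_TYPE_STRING, "au",
                               NULL);
}

// No software demux: whole MPEG-TS fragments are injected, so playbin
// plugs the platform's hardware TS demux element downstream of appsrc.
static GstCaps *caps_for_hw_demux(void)
{
    return gst_caps_new_simple("video/mpegts",
                               "systemstream", G_TYPE_BOOLEAN, TRUE,
                               NULL);
}

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);
    GstCaps *es = caps_for_sw_demux();
    GstCaps *ts = caps_for_hw_demux();
    gchar *es_str = gst_caps_to_string(es);
    gchar *ts_str = gst_caps_to_string(ts);
    g_print("s/w demux output: %s\nh/w demux input: %s\n", es_str, ts_str);
    g_free(es_str);
    g_free(ts_str);
    gst_caps_unref(es);
    gst_caps_unref(ts);
    return 0;
}
```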

HLS on Broadcom STB (UVE and S/W Demux)

AAMP manages the pipeline. Audio and video use separate playbins. AAMP provides audio/video elementary streams to the playbins using appsrc. Playbin connects the platform decoder and sink elements based on the caps of the appsrc src pads.

Pipeline graph


HLS on Broadcom STB (UVE, H/W Demux)

AAMP manages the pipeline. AAMP provides TS buffers to the playbins using appsrc. Playbin connects the platform ts-demux, decoder and sink elements.

Pipeline graph



DASH Widevine on Broadcom STB (UVE)

AAMP manages the pipeline. Audio and video use separate playbins. AAMP provides audio/video fragmented MP4 buffers to the playbins using appsrc. Playbin connects qtdemux, the platform decoder and sink elements based on the caps of the src pads.

Qtdemux Patches

For DASH streams, AAMP downloads and injects ISO-BMFF fragments into the GStreamer pipeline, and demuxing is done by the open-source qtdemux GStreamer plugin. To achieve PTS re-stamping for the desired rate and for continuity across period boundaries, AAMP adds custom patches to the qtdemux plugin. These patches add support for custom events from AAMP.

0009-qtdemux-aamp-tm.patch

https://gerrit.teamccp.com/plugins/gitiles/rdk/yocto_oe/layers/meta-rdk-ext/+/refs/heads/stable2/recipes-multimedia/gstreamer/files/0009-qtdemux-aamp-tm.patch

This patch adds support for PTS re-stamping of the buffers leaving qtdemux, based on a custom event, aamp_override. The application can override the base PTS used for re-stamping through the "basePTS" property of the event. If it is not provided, the PTS of the first fragment is used as the base PTS.

gst_qtdemux_decorate_and_push_buffer is the function in which qtdemux pushes buffers to its srcpad. If override is enabled by the event, the patch re-stamps the PTS of the ES packets based on the rate and basePTS.
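A hedged sketch of how such a custom downstream event could be sent from the injecting element towards qtdemux. The event and property names aamp_override and basePTS come from the patch description above; the "rate" field name and the value types are assumptions:

```cpp
#include <gst/gst.h>

// Send the aamp_override event downstream so that the patched
// gst_qtdemux_decorate_and_push_buffer re-stamps outgoing PTS values.
// "rate" and the GType choices are assumptions; "aamp_override" and
// "basePTS" are the names described above.
static gboolean send_aamp_override(GstElement *appsrc, gdouble rate, guint64 base_pts)
{
    GstStructure *s = gst_structure_new("aamp_override",
                                        "rate",    G_TYPE_DOUBLE, rate,
                                        "basePTS", G_TYPE_UINT64, base_pts,
                                        NULL);
    GstEvent *event = gst_event_new_custom(GST_EVENT_CUSTOM_DOWNSTREAM, s);
    // The event travels downstream with the data and reaches qtdemux's sink pad.
    return gst_element_send_event(appsrc, event);
}
```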

0021-qtdemux-aamp-tm-multiperiod.patch

https://gerrit.teamccp.com/plugins/gitiles/rdk/yocto_oe/layers/meta-rdk-ext/+/refs/heads/stable2/recipes-multimedia/gstreamer/files/0021-qtdemux-aamp-tm-multiperiod.patch

On a period transition during trick mode, the PTS of the new fragments changes. To ensure a smooth trick-play experience, PTS re-stamping should remain continuous, and this patch ensures that using the aamp-tm-disc event. Fragments after this event have PTS values very different from the previous fragments, so the base PTS has to be recalculated at this point. The PTS offset is calculated by adding the expected delay between I-frames to the last re-stamped PTS prior to the event, the base PTS is reset, and the offset is added to the re-stamped PTS of subsequent fragments.
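A rough sketch of the offset calculation described above, with assumed variable names; the exact formula in the patch may differ:

```cpp
#include <gst/gst.h>

// After an aamp-tm-disc event, the base PTS is reset and an offset is carried
// forward so re-stamped PTS values stay continuous across the period boundary
// during trick play. This only illustrates the idea described in the text.
static GstClockTime next_pts_offset(GstClockTime last_restamped_pts,
                                    GstClockTime expected_iframe_interval)
{
    // Subsequent fragments are re-stamped relative to the new base PTS and
    // then shifted by this offset.
    return last_restamped_pts + expected_iframe_interval;
}
```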

Decryptor gstreamer elements

If the content is encrypted with Widevine or PlayReady DRM, the AAMP decryptor elements (aampwidevinedecryptor or aampplayreadydecryptor) are automatically plugged in after qtdemux by playbin. These elements decrypt samples using the OpenCDM interface.

Pipeline graph

HTML5 Video Tag based playback on Broadcom STB

Here WebKit manages the pipeline. aampsrc is a dummy GStreamer source whose only purpose is to handle the aamp:// protocol. The aamp element fetches manifests/playlists and fragments, demuxes TS fragments in the case of HLS, and injects buffers into the downstream element through its srcpads. RDK WPE uses the Westeros sink GStreamer element as the video sink.

Pipeline graph


Progressive Support in Aamp

As the engine behind the Media Player component, the AAMP video engine recently started supporting playback of streaming audio/video in progressive format (raw mp4 or mp3 downloads), including mp3 streaming music.

To support progressive playback, AAMP by default uses souphttpsrc as the source element instead of the appsrc used for adaptive streams. AAMP decides from the URL extension that the stream is progressive; once this is confirmed, it sets the URL as the uri property of playbin and then manages the pipeline as usual.
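A small sketch of this decision, with illustrative helper names; the extension check below is an assumption about how a progressive URL is detected:

```cpp
#include <gst/gst.h>
#include <string>

// Illustrative helper: true if the URL ends with the given extension.
static bool has_suffix(const std::string &url, const std::string &ext)
{
    return url.size() >= ext.size() &&
           url.compare(url.size() - ext.size(), ext.size(), ext) == 0;
}

// Progressive content (raw mp4/mp3 download) is assumed to be identified by
// the URL extension, as described above.
static bool is_progressive(const std::string &url)
{
    return has_suffix(url, ".mp4") || has_suffix(url, ".mp3");
}

static void configure_playbin(GstElement *playbin, const std::string &url)
{
    if (is_progressive(url))
        g_object_set(playbin, "uri", url.c_str(), NULL);  // souphttpsrc downloads the file
    else
        g_object_set(playbin, "uri", "appsrc://", NULL);  // adaptive: AAMP injects buffers
}
```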

Pipeline Graph (used by default)



AAMP can, however, choose appsrc over souphttpsrc by toggling the runtime configuration switch "useAppSrcForProgressivePlayback". This switch is enabled when the keyword "appSrcForProgressivePlayback" is added to the box's aamp.cfg.

When "appSrcForProgressivePlayback" is enabled in aamp.cfg, aamp will choose appsrc as the source element and complete its pipeline.The pipeline graph then, will look like below.

Pipeline Graph

UVE playback on Intel STB

Playback is similar to the Broadcom platform, but Intel ismd GStreamer plugins are used to decode/render. For DASH normal playback, audio and video PTS are re-stamped, since the start time of segment events is not properly supported on the Intel platform.

Pipeline graph


HTML5 Video Tag based playback on Intel STB

Playback is similar to the Broadcom platform, but Intel ismd GStreamer plugins are used to decode/render.

Pipeline graph


Gstreamer SOC element properties used by AAMP

The AAMP GStreamer player uses custom properties of SOC elements to meet platform-specific requirements.

Broadcom SOC element properties used by AAMP

Element | Property | Description
brcmvideosink | rectangle | Set the video output rectangle
brcmvideosink | zoom-mode | Set the video zoom
brcmvideosink | show-video-window | Hide/show the video window
brcmvideosink | enable-reject-preroll | Not used anymore
brcmaudiodecoder | limit_buffering | Limit buffering to avoid audio drops during VG on/off
brcmaudiodecoder | limit_buffering_ms | Buffering duration in milliseconds; used along with the limit_buffering property
brcmaudiosink | mute | Mute/unmute audio
brcmaudiosink | volume | Set the audio volume
brcmvideodecoder | video-pts | PTS of the last decoded frame; used for EOS/stall detection
brcmvideodecoder | buffered_bytes | Get buffered bytes from the decoder; used for buffering logic
brcmvideodecoder | queued_frames | Get the number of queued frames from the decoder; used for buffering logic
brcmvideodecoder | videodecoder | Get the video decoder handle to be used by the CC module
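A hedged sketch of how a player might drive some of the Broadcom properties listed above; the value formats (e.g. the rectangle string) and property types are assumptions:

```cpp
#include <gst/gst.h>

// Illustrative use of the Broadcom sink/decoder properties listed above;
// the rectangle string format and the video-pts type are assumptions.
static void apply_soc_properties(GstElement *video_sink, GstElement *video_dec,
                                 GstElement *audio_sink)
{
    // Position/size the video window and make sure it is visible.
    g_object_set(video_sink, "rectangle", "0,0,1280,720", NULL);
    g_object_set(video_sink, "show-video-window", TRUE, NULL);

    // Mute audio (e.g. during trick play).
    g_object_set(audio_sink, "mute", TRUE, NULL);

    // Poll the last decoded PTS for EOS/stall detection.
    guint64 video_pts = 0;
    g_object_get(video_dec, "video-pts", &video_pts, NULL);
    g_print("last decoded PTS: %" G_GUINT64_FORMAT "\n", video_pts);
}
```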

Intel gstreamer element properties used by AAMP


Element | Property | Description
ismdgstaudiosink | sync | Ensure audio/video synchronisation with playbin playback
ismdgstaudiosink | input-gain | Mute/unmute audio
ismdgstvidsink | stop-keep-frame | Hide or continue to show the last frame after stop
ismdgstvidsink | rectangle | Set the video output rectangle
ismdgstvidsink | scale-mode | Set the video zoom
ismdgstvidsink | crop-lines | Disable cropping; enabled cropping results in glitches during ABR transitions
ismdgstvidsink | mute | Hide/show the video window
ismdgstvidsink | reuse-vidrend | Reuse the video renderer to avoid glitches during seek
ismdgsth264viddec | decode-handle | Get the video decoder handle to be used by the CC module
ismdgstaudiosink | volume | Set the audio volume


Intel SOC (PaceXG1V1) specific changes

The INTELCE macro is enabled for the Intel SOC. The key differences from Broadcom-based STBs are:

  1. MPEG DASH is disabled.
  2. For DASH playback (if manually enabled), aamp_override is enabled in qtdemux for each seek, because the start PTS of the segment event does not work as expected. The base PTS is also sent with the event, since audio and video should use the same base PTS.
  3. Intel ismd GStreamer elements are used, so element names and property names are different.
  4. For seek/trick play, the pipeline is deleted and recreated instead of pause → flush → play. The "stop-keep-frame" property is used to keep the last frame on screen during these transitions.
  5. H264 caps are more descriptive so that the open-source h264parse element is not added to the pipeline, which improves performance (see the sketch after this list).
  6. The first-frame notification is done on transitioning to the PLAYING state; Broadcom uses a callback for this.
  7. Internal buffering before transitioning to the PLAYING state is disabled.
  8. There are some changes in how segment events are sent, as Intel and Broadcom process segment events differently.
  9. Broadcom has some workarounds to handle missing EOS events; Intel does not need these, so they are disabled.
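As referenced in item 5 above, a sketch of what "more descriptive" H.264 caps could look like; the concrete field values are placeholders, not taken from AAMP:

```cpp
#include <gst/gst.h>

// When stream-format, alignment and resolution are already present on the
// appsrc caps, playbin does not need to insert h264parse to discover them.
// The values below are illustrative placeholders only.
static GstCaps *make_intel_h264_caps(void)
{
    return gst_caps_new_simple("video/x-h264",
                               "stream-format", G_TYPE_STRING, "avc",
                               "alignment",     G_TYPE_STRING, "au",
                               "width",         G_TYPE_INT,    1920,
                               "height",        G_TYPE_INT,    1080,
                               NULL);
}
```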