DASH Streaming Video from the Web

Video Streaming Notes: Online Video Options

  1. Video has real-time protocols as well, namely RTP, RTMP, RTSP, etc.
  2. HTTP streaming (HDS, HLS, Smooth Streaming, DASH, etc.)
  3. YouTube and Netflix use the DASH video format

  4. Most of these protocols rely on plugins, which is a drawback.

    What do browsers support?
  5. Unfortunately, progressive download is the only ubiquitously supported option
  6. Different browsers support different video codecs
  7. Safari (iOS and macOS only) natively supports HLS
  8. Media Source Extensions (MSE) are supported in Chrome, IE11, and Safari
    - Firefox support is in beta
  9. [NB] For the video to stream smoothly without buffering, it has to be stored on the server in segments/chunks. The first segment is an initialization segment (for example, Head.m4s), and the following segments vary in size. YouTube actually stores video and audio segments separately for really fast transport over HTTP. The client receives the data as bytes in an ArrayBuffer, wraps it as a Uint8Array(response), and appends the stream of bytes to a SourceBuffer.
  10. Each stream has a manifest that tells the player where the video files are located
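The byte flow in item 9 can be sketched with the Media Source Extensions API. This is a minimal illustration, not dash.js code; the segment name Head.m4s and the codec string are assumptions:

```javascript
// Sketch: wrap a downloaded segment's ArrayBuffer as a Uint8Array so it
// can be handed to SourceBuffer.appendBuffer().
function toSourceBufferChunk(arrayBuffer) {
  return new Uint8Array(arrayBuffer);
}

// Browser-only flow, guarded so the sketch is harmless outside a browser.
if (typeof MediaSource !== 'undefined') {
  const video = document.querySelector('video');
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener('sourceopen', async () => {
    const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
    const resp = await fetch('Head.m4s');          // initialization segment
    sb.appendBuffer(toSourceBufferChunk(await resp.arrayBuffer()));
  });
}
```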

    What is MPEG-DASH?
  11. DASH - Dynamic Adaptive Streaming over HTTP
  12. An international open standard developed and published by ISO
  13. Addresses both simple and advanced use cases
  14. Enables highest-quality multi-screen distribution and efficient dynamic adaptive switching
  15. Enables reuse of existing content, devices, and infrastructure
  16. Attempts to unify HTTP streaming under a single standard
  17. [NB] It is the only open standard for HTTP streaming today. HLS (owned by Apple), Smooth Streaming (owned by Microsoft), HDS (owned by Adobe), and others are proprietary protocols owned by their respective companies. The DASH specification was created more recently, with input from these companies and building on these proprietary protocols.

    Advantages of Using DASH
  18. Provides a clean separation between audio and video files (for multi-language video, it becomes easy to swap out the audio)
  19. The DASH specification is codec agnostic
  20. Any existing or future codec can work with DASH
  21. A single manifest can describe several different versions in different codecs (DASH doesn't care whether you are dealing with WebM, MP4, etc.; it simply describes what the content is and lets the player decide what it knows how to play)

  22. Understands what "now" is, for live videos

    Building a DASH Player
  23. DASH players are available as open source for different platforms: HTML5, Flash, and Android
  24. DASH.js is open source
  25. DASH.js is the reference player for the DASH Industry Forum (dashif.org)
  26. DASH.js includes the bit-rate (ABR) algorithms and the buffering logic

    How to Play a DASH Stream
  27. Download the manifest
  28. Parse the manifest
  29. Determine the optimal bandwidth for the client
  30. Initialize for that bandwidth
  31. Download a segment
  32. Hand the segment to MSE
  33. Check bandwidth to determine whether a change is necessary
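Steps 29-33 can be sketched as a small selection function. The representation shape ({id, bandwidth}) and the 0.9 safety factor are illustrative assumptions, not part of the DASH spec:

```javascript
// Sketch: pick the highest-bitrate representation the measured bandwidth
// (bits/s) can sustain, leaving some headroom via a safety factor.
function pickRepresentation(representations, measuredBps, safety = 0.9) {
  const affordable = representations
    .filter(r => r.bandwidth <= measuredBps * safety)
    .sort((a, b) => b.bandwidth - a.bandwidth);
  // If nothing fits, fall back to the lowest bitrate available.
  return affordable[0] ||
    representations.reduce((lo, r) => (r.bandwidth < lo.bandwidth ? r : lo));
}
```

A real player re-runs this decision after every segment (step 33), using fresh throughput measurements.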

    DASH Manifest
  34. A manifest has a Period section; this can be used for advertising
  35. An AdaptationSet section is used for audio or video (the manifest acts more like a web.config file for videos). The adaptive bit-rate algorithm determines which AdaptationSet/Representation it wants to render.
  36. When the client first downloads the manifest file, the player measures how fast the file was downloaded, which gives it the client's bandwidth.
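The bandwidth estimate in item 36 is simple arithmetic over the download; a minimal sketch (names and numbers are illustrative):

```javascript
// Sketch: estimate client bandwidth from how long a file took to download.
function estimateBandwidthBps(bytesDownloaded, millis) {
  if (millis <= 0) throw new RangeError('duration must be positive');
  return (bytesDownloaded * 8) / (millis / 1000); // bits per second
}

// Example: 125,000 bytes in 1 second is 1 Mbit/s.
```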

    Understanding the DASH Structure
  37. Three types of files
    - Manifest (.mpd) - XML file describing the segments
    - Initialization file - contains headers needed to decode the bytes in segments
    - Segment files - contain playable media, including 0 to many video tracks and 0 to many audio tracks

    [Manifest File]
    - SegmentBase - describes a stream with only a single segment per bit rate; can be used for byte-range requests
    - SegmentList - a list of SegmentURLs, individual HTTP requests for media data; can be used for byte-range requests
    - SegmentTemplate - defines a known URL for the fragment with wildcards, resolved at run time to request segments. Alternatively, can specify a list of segments based on duration

    Simple SegmentList
    - Representation id='...' mimeType='...' codecs='...' width='...' height='...' startWithSAP='1' bandwidth='7654'
        ...and so on

    SegmentTemplate with fixed segment duration
    - ContentComponent id='1' (the fixed segment duration defines how long before another segment starts loading. This keeps the video from buffering: the player looks ahead and loads the next segment before the current one ends.)
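A SegmentTemplate's wildcards can be resolved as in this simplified sketch. Real manifests also allow $Time$, $Bandwidth$, and printf-style padding such as $Number%05d$, which are omitted here:

```javascript
// Sketch: substitute the $RepresentationID$ and $Number$ wildcards of a
// SegmentTemplate media attribute to form a concrete segment URL.
function resolveTemplate(template, representationId, number) {
  return template
    .replace('$RepresentationID$', representationId)
    .replace('$Number$', String(number));
}
```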
  38. Class Structure
    - The player is divided into two main packages
    - streaming - contains the classes responsible for creating and populating the MediaSource buffers. These classes are intended to be abstract enough for use with any segmented stream (such as DASH, HLS, HDS, and MSS)
    - dash - contains the classes responsible for making decisions specifically related to DASH
  39. DASH API (MediaPlayer.js)
    - Exposes the top-level functions and properties to the developer (play, autoPlay, isLive, ABR quality, and metrics)
    - The manifest URL and the HTML video object are passed to the MediaPlayer

    Dependency Injection
    - Uses a class called Context from Context.js
    - Defines the dependency mapping for the streaming package
    - The context is passed into the MediaPlayer object, allowing different MediaPlayer instances to use different mappings

    - Loads/refreshes the manifest
    - Creates SourceBuffers from the MediaSource
    - Responds to events from the HTML video object
    - For a live stream, the live edge is calculated and passed to the BufferController instances

    Debug.js (class)
    - Convenience class for logging methods
    - The default implementation just uses console.log()
    - Extension point for tapping into logging messages

    - Responsible for loading fragments and pushing the bytes into the SourceBuffer
    - Once play() has been called, a timer is started to check the status of the bytes in the buffer
    - If the amount of time left to play is less than the manifest's minBufferTime, the next fragment is loaded
    - Records metrics related to playback
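The buffer check described above can be sketched as a pure function (the parameter names are assumptions; a real implementation reads the buffered ranges from the video element):

```javascript
// Sketch: decide whether the next fragment should be requested. If the
// seconds of media left in the buffer drop below the manifest's
// minBufferTime, it is time to load more.
function shouldLoadNextFragment(bufferedEndSec, currentTimeSec, minBufferTimeSec) {
  const remaining = bufferedEndSec - currentTimeSec;
  return remaining < minBufferTimeSec;
}
```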

    -Responsible for loading manifest files
    -Returns the parsed manifest object

    -Responsible for loading fragments
    -Loads requests sequentially

    - Responsible for deciding whether the current quality should be changed
    - The stream metrics are passed to a set of 'rules'
    - Methods: getPlaybackQuality(type, data), where type is the type of data (audio or video)

    - DownloadRatioRule.js: validates that fragments are being downloaded in a timely manner
    - Compares the time it takes to download a fragment to how long it takes to play a fragment out
    - If the download time is considered a bottleneck, the quality will be lowered
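The download-ratio idea can be sketched as a pure function. The "raise" threshold of 0.5 is an assumption for illustration, not the rule's actual constant:

```javascript
// Sketch: compare how long a fragment took to download with how long it
// plays out. A ratio >= 1 means downloads can't keep up with playback,
// so quality should be lowered; a low ratio leaves headroom to switch up.
function downloadRatioDecision(downloadSec, playoutSec) {
  const ratio = downloadSec / playoutSec;
  if (ratio >= 1) return 'lower';   // download is the bottleneck
  if (ratio < 0.5) return 'raise';  // plenty of headroom (assumed threshold)
  return 'hold';
}
```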

    - Validates that the buffer doesn't run dry during playback
    - If the buffer keeps running dry, it likely means the player has a processing bottleneck (video decode time is longer than playback time)

    DASH packages
    -DashContext.js - Defines dependency mapping specific to the dash package.
    -Parser, Index Handler and Manifest Extensions    

    - Converts the manifest to a JSON object
    - Converts duration and DateTime strings into number/date objects
    - Manages field inheritance - many fields are inherited from parent to child nodes in DASH
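Duration strings in a manifest (e.g. minBufferTime="PT1.5S") are ISO 8601 durations; a minimal parser sketch covering only the common PT...H...M...S forms:

```javascript
// Sketch: convert an ISO 8601 duration such as "PT1H2M3.5S" into seconds,
// as the manifest parser must do for fields like minBufferTime.
function parseDurationSeconds(str) {
  const m = /^PT(?:(\d+(?:\.\d+)?)H)?(?:(\d+(?:\.\d+)?)M)?(?:(\d+(?:\.\d+)?)S)?$/.exec(str);
  if (!m) throw new Error('unsupported duration: ' + str);
  const [, h = 0, min = 0, s = 0] = m;
  return Number(h) * 3600 + Number(min) * 60 + Number(s);
}
```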

    - Responsible for deciding which fragment URL should be loaded
    - Methods: getInitRequest(quality)
  40. Flow
    - Create the Context and MediaPlayer instances:
      var context = new Dash.di.DashContext();
      var player = new MediaPlayer(context);
    - Initialize the MediaPlayer and set the manifest URL
    - Attach the HTML video element and enable autoplay:
      var video = document.querySelector('.dash-video-player');
      player.autoPlay = true;

  41. References https://dashif.org/dash.js/
  42. Next Steps: https://www.instructables.com/id/Making-Your-Own-Simple-DASH-MPEG-Server-Windows-10/
  43. Documentation: BestLink AnotherLink AnotherLinkToMediaSourceAPI TestYourDashVideo Documentation


Mark said:

If you are working in .NET Core 2.2 and you have a separate video folder inside your wwwroot assets folder, you will have to add some code to your Startup class in order to serve the video folder through the ASP.NET Core 2.2 static asset pipeline.

Posted On: October 05, 2019 7:53:15 AM
