228 Cards in this Set

What browsers support the HTML5 video element?
1. Chrome 4+
2. Firefox 4+
3. Safari 4+
4. Opera 11+
5. Internet Explorer 9+
6. iOS Safari 6+
7. Android 3+
8. Opera Mini: no support
9. Opera Mobile 11+
What are the three file formats that publishers have to consider if they want to cover all browsers that support HTML5 <video>?
1. WebM
2. Ogg Theora
3. MPEG-4 H.264
What video codecs does Firefox support?
1. WebM
2. Ogg Theora
What video codecs does Safari support?
1. MPEG-4 H.264
What video codecs does Opera support?
1. WebM
2. Ogg Theora
What codecs does Google Chrome support?
1. WebM
2. Ogg Theora
3. MPEG-4 H.264
What codecs does IE support?
1. MPEG-4 H.264
Example of embedding Ogg Theora video.
<video src="HelloWorld.ogv"></video>
Example of embedding webm video.
<video src="HelloWorld.webm"></video>
Example of embedding MPEG-4 H.264 video.
<video src="HelloWorld.mp4"></video>
What are the two reasons the video tag has an opening and a closing tag?
1. There are other elements introduced as children of the <video> element — in particular the <source> and the <track>
2. Anything stated inside the <video> element that is not inside one of the specific child elements of the <video> element is regarded as “fallback content”.
Show an example of embedding MPEG-4 video in HTML5 with fallback content
<video src="HelloWorld.mp4">
Your browser does not support the HTML5 video element.
</video>
What are the video element attributes in HTML5.
1. src
2. crossorigin
3. poster
4. preload
5. autoplay
6. mediagroup
7. loop
8. muted
9. controls
10. width
11. height
Describe the crossorigin content attribute of the video element
The crossorigin content attribute on media elements is a CORS settings attribute.
What does IDL stand for in terms of the HTML5 spec?
IDL stands for Interface Definition Language. Web IDL is an IDL used to describe interfaces that are intended to be implemented in web browsers.
Describe the autoplay attribute of the video
The autoplay attribute makes the video start automatically.
What happens if you don't use the autoplay attribute?
Without @autoplay set, a browser will download only enough bytes from the beginning of a video resource to tell whether it is able to decode it, and to decode the header, so that the decoding pipeline for the video and audio data is set up.
What is the header being decoded in a video stream called?
Metadata
What happens when the autoplay attribute is provided?
When the @autoplay attribute is provided, the video will automatically request more audio and
video data after setting up the decode pipeline, buffer that data, and play back when sufficient data has
been provided and decoded so that the browser thinks it can play the video through at the given
buffering rate without rebuffering.
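A minimal markup sketch of the behavior described above, reusing the document's example file name:

```html
<video src="HelloWorld.mp4" autoplay></video>
```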
What browsers support the autoplay attribute?
The autoplay attribute is supported in all major browsers that support the video tag.
What happens if the download speed of the video data is not fast enough to provide a smooth playback or the browser's decoding speed is too slow?
The video playback will stall and allow for the playback buffers to be filled before continuing playback. The browser will give the user some notice of the stalling — e.g. a spinner or a “Loading…” message.
Describe the loop attribute
The loop attribute makes the video automatically restart after finishing playback. All browsers except Firefox support this attribute.
Describe the poster attribute?
It provides a representative image for the video. The choice of frame to display is up to the browser. Most browsers pick the first frame, since its data typically comes right after the headers in the video resource and is therefore easy to download; but there is no guarantee, and if the first frame is black, it is not the best frame to present.
Why should you explicitly set the width and height attributes?
Without explicit dimensions, the browser can hit a performance bottleneck and produce a disruptive display when the viewport suddenly changes size between a differently scaled poster image and the video.
Describe the controls attribute
The @controls attribute is a boolean attribute. If specified without @autoplay, the controls are
displayed either always (as in Safari and Chrome), or when you mouse over and out of the video (as in
Firefox), or only when you mouse over the video (as in Opera and IE).
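A minimal sketch of a video with native controls, reusing the document's example file name:

```html
<video src="HelloWorld.mp4" controls></video>
```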
Describe the preload attribute
The @preload attribute gives the web page author explicit means to control the download behavior of the Web browser on <video> elements. The @preload attribute can take on the values “none”, “metadata”, or “auto”.
describe the none parameter for the preload attribute
You would choose “none” in a situation where you do not expect the user to actually play back the
media resource and want to minimize bandwidth use. A typical example is a web page with many video
elements — something like a video gallery — where every video element has a @poster image and the
browser does not have to decode the first video frame to represent the video resource. On a video gallery,
the probability that a user chooses to play back all videos is fairly small. Thus, it is good practice to set
the @preload attribute to “none” in such a situation and avoid bandwidth wasting, but accept a delay
when a video is actually selected for playback.
Describe the metadata parameter of the preload attribute
You will choose “metadata” in a situation where you need the metadata and possibly the first video
frame, but do not want the browser to start progressive download. This again can be in a video gallery situation. For example, you may want to choose “none” if you are delivering your web page to a mobile device or a low-bandwidth connection, but choose “metadata” on high-bandwidth connections. Also, you may want to choose “metadata” if you are returning to a page with a single video that a user has
already visited previously, since you might not expect the user to view the video again, but you do want
the metadata to be displayed. The default preload mode is “metadata”.
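A sketch of the gallery-style setup described above, with a poster image and @preload set to "metadata" (file names are illustrative):

```html
<video src="HelloWorld.mp4" poster="HelloWorld.png" preload="metadata" controls></video>
```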
Describe the auto parameter of the preload attribute
You will choose “auto” to encourage the browser to actually start downloading the entire resource,
i.e. to do a progressive download even if the video resource is not set to @autoplay. The particular
browser may not want to do this, e.g. if it is on a mobile device, but you as a web developer signal in this way to the browser that your server will not have an issue with it and would prefer it in this way so as to
optimize the user experience with as little wait time as possible on playback.
What browsers support the preload attribute
All major browsers that support the video element also support @preload, except Internet Explorer.
Describe the source element
Both the <video> and the <audio> element lack a universally supported baseline codec. Therefore, the HTML5 specification provides a means to specify alternative source files through the <source> element. This allows a web developer to integrate all the required links to alternative media resources within the markup without having to test for browser support and use JavaScript to change the currently active resource.
show an example of the use of the source element with video tag
<video poster="HelloWorld.png" controls>
<source src="HelloWorld.mp4">
<source src="HelloWorld.webm">
<source src="HelloWorld.ogv">
</video>
Describe the type attribute of the source element
The <source> element has a @type attribute to specify the media type of the referenced media resource.
This attribute is a hint from the web developer and makes it easier for the browser to determine whether
it can play the referenced media resource. It can even make this decision without having to fetch any
media data. The @type attribute contains a MIME type with an optional codecs parameter.
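A sketch of the @type attribute in use; the codecs parameter values shown are common illustrative strings (H.264 Baseline plus AAC-LC, VP8 plus Vorbis, Theora plus Vorbis), not taken from the document:

```html
<video poster="HelloWorld.png" controls>
  <source src="HelloWorld.mp4"
          type='video/mp4; codecs="avc1.42E01E, mp4a.40.2"'>
  <source src="HelloWorld.webm"
          type='video/webm; codecs="vp8, vorbis"'>
  <source src="HelloWorld.ogv"
          type='video/ogg; codecs="theora, vorbis"'>
</video>
```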
Why was the type attribute created?
The @type attribute was created to remove the need for sniffing and speed up the resource selection and loading.
Describe the media attribute of the source element.
The @media attribute provides a means for associating so-called media queries with a resource.
It is not supported in any of the major desktop browsers
What is the browser support for the audio element
same as video
What audio codecs does Firefox support?
1. WAV
2. Ogg Vorbis
What audio codecs does safari support?
1. WAV
2. MP3
What audio codecs does chrome support?
1. MP3
2. WAV
3. Ogg Vorbis
What audio codecs does IE support?
1. MP3
What audio codecs does Opera support?
1. WAV
2. Ogg Vorbis
What are the MIME types for audio formats
1. MP3 = audio/mpeg
2. Ogg = audio/ogg
3. Wav = audio/wav
show an example of embedding a wav file using the audio tag
<audio src="HelloWorld.wav"></audio>
show an example of the use of the source element with audio tag
<audio controls>
<source src="HelloWorld.mp3">
<source src="HelloWorld.ogg">
<source src="HelloWorld.wav">
</audio>
What about the attributes for the audio tag?
@src, @autoplay, @loop, @controls, and @preload work the same for <audio> as they do for <video>. Since audio has no visual display, <audio> has no @poster, @width, or @height attributes.
What library do most open source tools use for encoding MPEG-4 H.264 Video?
Open-source tools for encoding MPEG-4 H.264 basically all use the x264 encoding library, which is published under the GNU GPL license. x264 is among the most feature-complete H.264 codecs and widely accepted as one of the fastest.
What is FFmpeg?
FFmpeg is a command line tool to convert multimedia files between formats. It has binaries for all major platforms.
What is transcoding and why should you avoid it?
Transcoding means decoding from one encoding format and re-encoding into another. All transcoding creates new artifacts, even when transcoding within the same codec. Artifacts are visible or audible effects in the audio or video data that are not present in the original material; they are introduced by the encoding process and reduce the quality of the material.
Encoding a video resource to the MPEG-4 H.264 format using presets
ffmpeg -i input-file \
-vcodec libx264 -vpre <preset> \
-acodec libfaac \
-threads 0 outfile.mp4
What does -vpre stand for?
video preset
what does setting the -threads parameter to 0 do
Setting the -threads parameter to 0 encourages FFmpeg to use as many threads as appropriate.
Typical encoding using the normal profile
ffmpeg -i HelloWorld.dv \
-vcodec libx264 -vpre normal -vb 3000k \
-acodec libfaac -ab 192k \
-threads 0 HelloWorld.mp4
How do you control the video and audio bitrate
If you want to control the bitrate, you can set it for the video with -vb <bitrate> and for the audio with -ab <bitrate>.
Describe two pass encoding.
A technique to improve video quality: a first pass analyzes the input and writes statistics to a log file, and the second pass uses those statistics to distribute bits more effectively.
How do you perform a two pass encoding in FFmpeg
For this, you run FFmpeg twice. The first time, include only the video (no audio), use the fastfirstpass preset, and add a -pass 1 parameter; write the output to a temporary file, since you are interested only in the log files that the second pass requires. Then, in the second pass, include a -pass 2 parameter and the audio.
show an example of two pass encoding in FFmpeg
ffmpeg -i Helloworld.dv -pass 1 \
-vcodec libx264 -vpre fastfirstpass \
-an -threads 0 tempfile.mp4

ffmpeg -i HelloWorld.dv -pass 2 \
-vcodec libx264 -vpre normal -vb 3000k \
-acodec libfaac -ab 192k \
-threads 0 HelloWorld.mp4
What library do most open source tools use for encoding Ogg Theora Video?
Open-source tools for encoding Ogg Theora basically use the libtheora encoding library, which is published under a BSD-style license by Xiph.org.
What are the most broadly used encoders for Ogg Theora Video?
There are several encoders written on top of libtheora,
of which the most broadly used are ffmpeg2theora and FFmpeg.
What is the main difference between ffmpeg2theora and FFmpeg.
The main difference between ffmpeg2theora and FFmpeg is that ffmpeg2theora is fixed to use the
Xiph libraries for encoding, while FFmpeg has a choice of codec libraries, including its own Vorbis implementation and its own packaging. ffmpeg2theora has far fewer options to worry about. The files created by these two tools differ.
What do you do if you want to use FFmpeg for encoding Ogg Theora?
If you want to use FFmpeg for encoding Ogg Theora, make sure to use the -acodec libvorbis and not -acodec vorbis;
otherwise your files may be suboptimal.
Encoding a video resource to the Ogg Theora/Vorbis format
ffmpeg2theora -o outfile.ogv infile
Adapting the bitrate for audio and video for ffmpeg2theora
ffmpeg2theora -o HelloWorld.ogv -p pro \
--videobitrate 3000 \
--audiobitrate 192 \
infile
Using two-pass encoding for Ogg Theora
ffmpeg2theora -o HelloWorld.ogv -p pro \
--two-pass infile
What library do most open source tools use for encoding WebM Video?
Open-source tools for encoding WebM basically use the libvpx encoding library, which is published
under a BSD style license by Google.
What are the most broadly used encoders for WebM?
There are several encoders written on top of libvpx, which
includes DirectShow filters, a VP8 SDK, GStreamer plug-ins, and patches for FFmpeg.
Encoding a video resource to the WebM format
ffmpeg -i infile outfile.webm
Adapting the bitrate for audio and video for FFmpeg
ffmpeg -i HelloWorld.dv \
-vb 3000k -ab 192k \
HelloWorld.webm
Encoding audio to MP3 using FFmpeg
ffmpeg -i audio -acodec libmp3lame -aq 0 audio.mp3
Describe the aq parameter in ffmpeg encoding
The aq parameter signifies the audio quality and goes from 0 to 255 with 0 being the best quality.
Encoding audio to Ogg Vorbis using FFmpeg
ffmpeg -i audio -f ogg -acodec libvorbis -ab 192k audio.ogg
What other library(other than FFmpeg) can you use to encode Ogg Vorbis
You can also use oggenc to encode to Ogg Vorbis, which is slightly easier to use and has some specific Ogg Vorbis functionality.
Encoding audio to Ogg Vorbis using oggenc
oggenc audio -b 192 -o audio.ogg
What support should a server have to serve HTML5 video and audio?
When making a choice between which server software to choose, make sure it supports HTTP 1.1 Byte Range requests.
What is the standard way in which browsers receive HTML5 media resources from web servers
HTTP Byte Range requests is the standard way in which browsers receive HTML5 media resources from web servers.
How does the browser generally enable playback of audio and video content during a progressive download?
When using progressive download, the browser first
retrieves the initial byte ranges of a media resource, which tells it what kind of media resource it is and
gives it enough information to set up the audio and video decoding pipelines. This enables it to play
back audio and video content.
How can a browser progressively download byte ranges and at the same time decode and start playing
back the audio-visual content already received.
Audio and video data in a media resource are typically provided in a multiplexed manner: a bit of video, then the related bit of audio, then the next bit of video, and so on. Thus, if the browser asks for a byte range on the resource, it retrieves the audio and video data that belong to the same time range together.
how is the best result of progressive downloading of a video achieved?
The best result is achieved when the network is fast enough to feed the decoding pipeline quicker than the decoder and graphics engine can play back the video in real time. This will give a smooth playback impression to users without making them wait until the whole video is downloaded.
What is the key to the usability of progressive download, given that users cannot seek to an offset until the download has reached those bytes?
Key to the usability of progressive download is the ability to seek into not-yet downloaded sections by asking the web server to stop downloading and start providing those future data chunks. This is implemented in browsers using the HTTP 1.1 Byte Range request mechanism.
What is the aim of http streaming
The aim here is to continue using normal web servers without any software extensions and to push the complexity of streaming into the client software. This avoids all the issues of creating new protocols, fighting firewalls, and installing new software on servers.
Why is this aim beneficial?
Because new functionality can be rolled out through client software updates, without having to update server software.
What is adaptive HTTP streaming?
Adaptive HTTP streaming is a process that adjusts the quality of a video delivered over HTTP, based on changing network and user software conditions to ensure the best possible viewer experience.
How does http streaming help the browser deal with varying bandwidth playback?
This is particularly targeted at varying bandwidth availability and aims at removing the buffering interruptions users on low-bandwidth networks can experience when watching video over HTTP progressive download.
What are the technologies implementing HTTP Streaming?
The different technologies in the market are:
• Apple HTTP Live Streaming.
• Microsoft Smooth Streaming.
• Adobe HTTP Dynamic Streaming.
Is there a standards-based implementation of HTTP streaming?
Yes, Dynamic Adaptive Streaming over HTTP (DASH), also known as MPEG-DASH, enables high quality streaming of media content over the Internet delivered from conventional HTTP web servers.
Describe the IDL attribute currentSrc.
The @currentSrc attribute stores the resource location currently in use by the media element (video or audio).
How would you dynamically change the resource location of the media element?
To dynamically change the resource location of the media element, you can always set the @src content attribute of the media element using JavaScript and call the load() method to reload the element's media resource.
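The steps above can be sketched as a small helper. `reloadSource` is a hypothetical name, and a plain stub object stands in for a real media element so the sketch runs outside a browser:

```javascript
// Hypothetical sketch: dynamically swap the media resource.
// On a real <video> element, setting src and calling load()
// re-runs the resource selection algorithm with the new URL.
function reloadSource(el, url) {
  el.src = url;  // set the @src content attribute
  el.load();     // reload the element's media resource
}

// Stub standing in for a <video> element (illustration only).
const stub = {
  src: "HelloWorld.mp4",
  loadCalled: false,
  load() { this.loadCalled = true; }
};

reloadSource(stub, "HelloWorld.webm");
console.log(stub.src, stub.loadCalled); // HelloWorld.webm true
```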
What is generally involved in selecting media resources.
queuing tasks, firing events, setting network states, ready states, and potentially error states.
When is the resource selection algorithm invoked?
The resource selection algorithm is invoked as the media element is loaded and asynchronously executes thereafter.
What are the IDL attributes that describe the general features of media resources?
1. currentSrc
2. startTime
3. duration
4. volume
5. muted
6. videoWidth
7. videoHeight
What initiates the resource fetch algorithm ?
The resource selection algorithm initiates the resource fetch algorithm which actually downloads the media data and decodes it.
Why can't you rely on @currentSrc being available?
You cannot rely on it to be available to JavaScript before the resource selection algorithm has started fetching media data, which is signified through firing a progress
event.
When can't you rely on the progress event
When you are dealing with a buffered video resource, since in this case, no progress event is fired.
If downloading a buffered video resource fires no progress event, how do you know when @currentSrc is available?
The event that indicates a media resource is now usable is the loadedmetadata event. You need to listen for the loadedmetadata event being fired before accessing @currentSrc.
Example of event listeners for the progress and loadedmetadata events
video.addEventListener("progress", callbackFunction, false);
video.addEventListener("loadedmetadata", callbackFunction, false);
Bottom line for currentSrc
You cannot rely on anything but the loadedmetadata event.
What is the startTime property?
After a media resource is loaded, @startTime gives the time-stamp of the earliest possible position that can be played back from the media resource.
Describe the duration property
When a media resource's metadata is loaded and before any media data is played back, you can find out about the duration of the resource. The read-only @duration IDL attribute returns the length of the media resource in seconds.
During loading time what will the duration property return
During loading time, @duration will return the NaN value (Not-a-Number).
What will duration return if it is a live or unbound stream?
If it is a live or an unbound stream, the duration is infinity.
If the stream ends on an unbound or live stream, what will the duration property return?
The duration changes to the value given by the last samples in the stream, relative to @startTime.
What happens on every update of the @duration property of the media resource
Every update of the @duration of the media resource causes a durationchange event to be fired, so
you can always retrieve the exact @duration value that the UA is working with. This will also happen if a
different resource is loaded.
Why is the duration property in Google Chrome more accurate than in other browsers?
The duration property in Google Chrome returns duration as a double value of seconds, making it more accurate than in the other browsers.
More recently, the HTML5 specification has slightly changed the meaning of the @duration attribute. What has it changed to?
More recently, the HTML5 specification has slightly changed the meaning of the @duration attribute. It now represents the end time of the media resource. This has not been implemented by any browser yet.
Describe the volume property of the media element
When reading the @volume IDL attribute of a media resource, the playback volume of the audio track
is returned in the range 0.0 (silent) to 1.0 (loudest).
What event is fired when the property's value changes?
Whenever the volume of the media resource is changed, either through user interaction or JavaScript, a volumechange event is fired.
Describe the muted property of the media element.
When reading the @muted IDL attribute of a media resource, it returns “true” if the audio channels are muted and “false” otherwise.
What event is fired when the muted property is changed
A volumechange event is fired.
What are the playback-related attributes of media resources?
1. currentTime
2. seeking
3. paused
4. ended
5. defaultPlaybackRate
6. playbackRate
Describe the currentTime attribute
The @currentTime IDL attribute returns the current playback position of the media resource in
seconds.
What happens if you set the currentTime through code?
It initiates a seek by the browser to the new playback position.
When can you seek?
Seeking can be undertaken only when the media element's metadata has been loaded, so don't try
changing the @currentTime before the readyState is at least HAVE_METADATA.
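The guard described above can be sketched as a hypothetical `safeSeek` helper; plain stubs stand in for real media elements so the sketch runs outside a browser:

```javascript
// HAVE_METADATA ready state value, as listed in the HTML5 spec.
const HAVE_METADATA = 1;

// Hypothetical guard: only assign currentTime (i.e. seek) once the
// element's metadata has been loaded.
function safeSeek(el, seconds) {
  if (el.readyState >= HAVE_METADATA) {
    el.currentTime = seconds; // triggers a seek on a real element
    return true;
  }
  return false; // too early: metadata not loaded yet
}

// Stubs standing in for media elements (illustration only).
const notReady = { readyState: 0, currentTime: 0 };
const ready = { readyState: 1, currentTime: 0 };
console.log(safeSeek(notReady, 30)); // false
console.log(safeSeek(ready, 30));    // true
```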
What event is fired on a successful seek
A timeupdate event will be fired upon a successful seek.
What happens if you start a new seeking action in the middle of one?
A browser will interrupt any current seeking activities if you start a new seeking action.
What happens if you seek to a time where the data is not available yet?
If you seek to a time where the data is not available yet, current playback (if any) will be stopped and you will have to wait until that data is available. A waiting event will be fired.
Describe the seeking attribute?
The read-only @seeking IDL attribute is set by the browser to “true” during times of seeking and is “false” at all other times.
Describe the paused attribute.
The read-only @paused IDL attribute is set by the browser to “true” if the media playback is paused.
Pausing can happen either through user interaction on the interface or through JavaScript.
Why can't you assume that the video is playing when paused is set to false?
Even when @paused is “false,” it is possible the media resource is in a state of buffering, in an error state, or has reached the end and is waiting for more media data to be appended.
How can you tell when the media playback is playing?
Because there is no explicit @playing IDL attribute,
you need to use the @paused value and some other hints to determine if the browser is currently playing
back a media resource. The combined hints are:
1. @paused is “false.”
2. @ended is “false.”
3. The readyState is HAVE_FUTURE_DATA or HAVE_ENOUGH_DATA.
4. @error is null.
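The combined hints above can be sketched as one check. `isPlaying` is a hypothetical name, and a plain stub object exposing the listed IDL attributes stands in for a real media element:

```javascript
// HAVE_FUTURE_DATA ready state value from the HTML5 spec.
const HAVE_FUTURE_DATA = 3;

// Combine the four hints into one heuristic check.
function isPlaying(el) {
  return el.paused === false &&
         el.ended === false &&
         el.readyState >= HAVE_FUTURE_DATA && // FUTURE or ENOUGH data
         el.error === null;
}

// Stubs standing in for media elements (illustration only).
const playingStub = { paused: false, ended: false, readyState: 4, error: null };
const pausedStub  = { paused: true,  ended: false, readyState: 4, error: null };
console.log(isPlaying(playingStub)); // true
console.log(isPlaying(pausedStub));  // false
```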
What events can you use to help you know that the media is playing?
There are also events that can help you ensure that playback continues working: the playing event is
fired when playback starts and as long as no waiting or ended or error event is fired and @paused is
“false,” you can safely assume that you are still playing.
Describe the ended attribute?
The read-only @ended IDL attribute is set by the browser to “true” if the media playback has ended and the direction of playback is forward (see @playbackRate); otherwise @ended is “false."
Describe the defaultPlaybackRate attribute
The @defaultPlaybackRate IDL attribute returns the speed at which the media resource is meant to be played back as a multiple of its intrinsic speed. Initially, it is set to 1.0, which is normal playback speed.
Describe the playbackRate.
The @playbackRate IDL attribute returns the speed at which the media resource actually plays back as a multiple of its intrinsic speed. You can implement fast forward and fast rewind with @playbackRate.
When the @defaultPlaybackRate or the @playbackRate attribute values are changed what event is fired.
When the @defaultPlaybackRate or the @playbackRate attribute values are changed, a ratechange event is fired.
What are the IDL attributes, which represent web browser managed states of a media element?
The following IDL attributes, which represent web browser managed states of a media element, are
explained in this section:
1. networkState
2. readyState
3. error
4. buffered (TimeRanges)
5. played (TimeRanges)
6. seekable (TimeRanges)
Describe the networkState attribute
The @networkState IDL attribute represents the current state of network activity of the media element.
What are the states of the networkState attribute.
1. NETWORK_EMPTY
2. NETWORK_IDLE
3. NETWORK_LOADING
4. NETWORK_NO_SOURCE
Describe the NETWORK_EMPTY state?
The NETWORK_EMPTY (0) state means no @currentSrc has been identified. This may be because the element has yet to be initialized, or because resource selection found no @src attribute or <source> elements and is waiting for a load() call to set one.
Describe the NETWORK_IDLE state?
The NETWORK_IDLE (1) state means a @currentSrc has been identified and resource fetching is possible, but the browser has suspended network activity while waiting for user activity.
What activities can cause the NETWORK_IDLE state?
1. The browser has downloaded the media element's metadata and the media resource is not set to autoplay.
2. The media resource has been partly downloaded and network buffering is suspended for some reason.
3. The resource is completely downloaded.
What are the reasons that network buffering can be suspended?
1. a connection interruption,
2. media resource file corruption,
3. a user abort
4. the browser has pre-buffered more than enough media data ahead of the playback position and is waiting for the user to catch up
Describe the NETWORK_LOADING state?
The NETWORK_LOADING or 2 state signifies that the browser is trying to download media resource data.
What events are connected to the NETWORK_LOADING state?
1. loadstart event is fired right before the state is NETWORK_LOADING
2. progress event is fired if the @networkState changes at a later stage back to NETWORK_LOADING
3. stalled event is fired if data is unexpectedly not arriving from the network while trying to load
Describe the NETWORK_NO_SOURCE state?
The NETWORK_NO_SOURCE (3) state means resource selection has identified a @currentSrc, but the resource failed to load, the URL couldn't be resolved, or no resource was provided at all (no @src or valid <source> children).
Describe the readyState IDL attribute?
The @readyState IDL attribute represents the current state of the media element in relation to its
playback position. The available states are the following:
1. HAVE_NOTHING (0)
2. HAVE_METADATA (1)
3. HAVE_CURRENT_DATA (2)
4. HAVE_FUTURE_DATA (3)
5. HAVE_ENOUGH_DATA (4)
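The numeric codes above can be captured in a small lookup; `readyStateName` is a hypothetical helper name, not part of the spec:

```javascript
// The five readyState values, indexed by their numeric codes 0-4.
const READY_STATES = [
  "HAVE_NOTHING",      // 0
  "HAVE_METADATA",     // 1
  "HAVE_CURRENT_DATA", // 2
  "HAVE_FUTURE_DATA",  // 3
  "HAVE_ENOUGH_DATA"   // 4
];

// Hypothetical helper: map a numeric readyState to its spec name.
function readyStateName(code) {
  return READY_STATES[code] || "unknown";
}

console.log(readyStateName(1)); // HAVE_METADATA
```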
Describe the HAVE_NOTHING (0) readyState?
HAVE_NOTHING (0): no information regarding the video resource is available, including nothing about
its playback position. This is typically the case before the media resource starts downloading.
What is the relationship between the NETWORK_EMPTY networkState attribute and the HAVE_NOTHING readyState
Media elements whose @networkState attribute is set to
NETWORK_EMPTY are always in the HAVE_NOTHING @readyState.
Describe the HAVE_METADATA readyState?
The setup information of the media resource has been received, such that:
1. the decoding pipeline is set up,
2. the width and height of a video resource are known,
3. and the duration of the resource (or a good approximation of it) is available.

Seeking and decoding are now possible, even though no actual media data is available yet for the current playback position.
What event is fired as the HAVE_METADATA state is reached?
As the HAVE_METADATA state is reached, a loadedmetadata event is fired.
Even though no actual media data is available for the current playback position when HAVE_METADATA is reached, what operations are possible?
Seeking and decoding are now possible.
Describe the HAVE_CURRENT_DATA readyState?
The HAVE_CURRENT_DATA readyState is reached when:
1. the decoded media data for the current playback position is available, but either not enough to start playing back continuously or the end of the playback direction has been reached; or
2. the browser is waiting for enough data to download for playback, e.g. after a seek or after the buffered data ran out.
What event is fired upon reaching the HAVE_CURRENT_DATA readyState for the first time?
If this state is reached for the first time, a loadeddata event is fired.
Can the HAVE_CURRENT_DATA state be skipped?
Yes. This state may not be taken at all; instead a HAVE_FUTURE_DATA or HAVE_ENOUGH_DATA state may be reached directly after HAVE_METADATA.
What event is fired as a result of HAVE_FUTURE_DATA or HAVE_ENOUGH_DATA state being reached for the first time
loadeddata event is fired upon reaching them for the first time.
What happens when the HAVE_CURRENT_DATA state is reached because the browser is waiting for enough data to download, e.g. after a seek or after the buffered data ran out?
A waiting and a timeupdate event are fired.
Describe the HAVE_FUTURE_DATA readyState.
It is reached when the decoded media data for the current playback position and the next position is available, e.g. the current video frame and the one following it.
What happens when the state is reached for the first time
If this state is reached for the first time, a canplay event is fired.
What happens if the element is not paused and
not seeking and HAVE_FUTURE_DATA is reached?
If the element is not paused and not seeking and HAVE_FUTURE_DATA is reached, a playing event is fired.
Describe the HAVE_ENOUGH_DATA readyState.
HAVE_ENOUGH_DATA (4) means that enough decoded media data is available for the current and next playback positions. The network download rate is fast enough that the browser estimates data will be fetched and decoded at the @defaultPlaybackRate sufficiently to allow continuous playback to the end of the media resource without stopping for further buffering.
What happens if this state is reached without going through HAVE_FUTURE_DATA
a canplay event is fired
What happens if the element is not paused and not seeking and this state is reached without going through HAVE_FUTURE_DATA
playing event is fired.
what happens if the HAVE_ENOUGH_DATA state is reached for the first time.
a canplaythrough event is fired.
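The numeric readyState values can be turned into readable names with a small lookup helper; READY_STATES and readyStateName below are illustrative names for this sketch, not part of the HTMLMediaElement API:

```javascript
// Readable names for the numeric readyState values 0-4.
var READY_STATES = [
  "HAVE_NOTHING",      // 0: no information yet
  "HAVE_METADATA",     // 1: duration and dimensions known
  "HAVE_CURRENT_DATA", // 2: current frame available
  "HAVE_FUTURE_DATA",  // 3: current and next frame available
  "HAVE_ENOUGH_DATA"   // 4: enough to play through
];

function readyStateName(state) {
  return READY_STATES[state] || "unknown";
}

// In a browser: readyStateName(video.readyState), e.g. inside a
// canplay or canplaythrough event handler.
```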
Describe the @error IDL attribute
The @error IDL attribute represents the latest error state of the media element as a MediaError object.
What are the error states for the error IDL attribute
1. MEDIA_ERR_ABORTED (1):
2. MEDIA_ERR_NETWORK (2):
3. MEDIA_ERR_DECODE (3):
4. MEDIA_ERR_SRC_NOT_SUPPORTED (4):
Describe the MEDIA_ERR_ABORTED error state:
this error is raised when the fetching process for the media resource was aborted by the browser at the user's request; e.g. when browsing to another web page.
What will the networkState be when the error state is MEDIA_ERR_ABORTED
The networkState will be either NETWORK_EMPTY or NETWORK_IDLE, depending on when the download was aborted.
What event is fired upon reaching the MEDIA_ERR_ABORTED state?
An abort event is fired
Describe the MEDIA_ERR_NETWORK error state
this error is raised when any kind of network error caused the browser to stop fetching the media resource after the resource was established to be usable; e.g. when the network connection is interrupted.
What will the networkState be when the error state is MEDIA_ERR_NETWORK
The @networkState will be either
NETWORK_EMPTY or NETWORK_IDLE, depending on when the download was aborted.
What event is fired upon reaching the MEDIA_ERR_NETWORK state?
An error event is fired.
Describe the MEDIA_ERR_DECODE error state
this error is raised when decoding of a retrieved media resource failed and video playback had to be aborted; e.g. because the media data was corrupted or the media resource used a feature that the browser does not support.
What will the networkState be when the error state is MEDIA_ERR_DECODE
The networkState will be either NETWORK_EMPTY or NETWORK_IDLE, depending on when the download was aborted.
What event is fired upon reaching the MEDIA_ERR_DECODE state?
An error event is fired.
Describe the MEDIA_ERR_SRC_NOT_SUPPORTED error state
This error is raised when the media resource in the @src attribute failed to load or the URL could not be resolved. The media resource may not load if the server or the network failed or because the format is not supported.
What will the networkState be when the error state is MEDIA_ERR_SRC_NOT_SUPPORTED
The @networkState will be NETWORK_NO_SOURCE.
What event is fired upon reaching the MEDIA_ERR_SRC_NOT_SUPPORTED state?
An error event is fired.
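The four error codes map naturally onto a lookup helper. A minimal sketch, assuming only the numeric code values from the spec; describeMediaError is an illustrative name, and the err argument only needs a code member, so a real MediaError object or a plain object both work:

```javascript
// Map a MediaError (or any object with a numeric "code" member)
// to a short description. Codes 1-4 are defined by the spec.
function describeMediaError(err) {
  if (err == null) return "no error";
  switch (err.code) {
    case 1: return "MEDIA_ERR_ABORTED: fetching aborted at the user's request";
    case 2: return "MEDIA_ERR_NETWORK: network error after the resource was found usable";
    case 3: return "MEDIA_ERR_DECODE: decoding failed";
    case 4: return "MEDIA_ERR_SRC_NOT_SUPPORTED: resource failed to load or is unsupported";
    default: return "unknown error code: " + err.code;
  }
}

// In a browser: describeMediaError(video.error) inside an
// "error" event handler.
```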
Describe the buffered IDL attribute
The @buffered IDL attribute retains the ranges of the media resource that the browser has buffered. The value is stored in a normalized TimeRanges object, which represents a list of ranges (intervals or periods) of time.
Why would the buffered IDL attribute TimeRanges object need a list of ranges
Typically, the @buffered IDL attribute contains a single time range that starts at the @startTime of
the media resource and grows as media data is downloaded until all of it has been received. However, for
a large resource where seeking is undertaken to later points in the resource, the browser may store
multiple byte ranges, thus creating multiple TimeRanges.
Describe the played IDL attribute
The @played IDL attribute retains the ranges of the media resource the browser has played. The value is stored in a normalized TimeRanges object (see @buffered attribute). The timeline of the @played IDL attribute is the timeline of the media resource.
Describe the seekable IDL attribute
The @seekable IDL attribute retains the ranges of the media resource to which the browser can seek. The value is stored in a normalized TimeRanges object (see @buffered attribute). The timeline of the @seekable IDL attribute is the timeline of the media resource.
show a function you could use to print out timeranges from the time ranges object
function printTimeRanges(tr) {
  if (tr == null) return "undefined";
  var s = tr.length + ": ";
  for (var i = 0; i < tr.length; i++) {
    s += tr.start(i) + " - " + tr.end(i) + "; ";
  }
  return s;
}
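The function can be exercised without a browser against a stand-in TimeRanges object; the mock below implements only the length, start(), and end() members that printTimeRanges touches, simulating two buffered ranges such as a seek might leave behind:

```javascript
function printTimeRanges(tr) {
  if (tr == null) return "undefined";
  var s = tr.length + ": ";
  for (var i = 0; i < tr.length; i++) {
    s += tr.start(i) + " - " + tr.end(i) + "; ";
  }
  return s;
}

// Stand-in for a TimeRanges object with two buffered ranges,
// e.g. 0-10s from the start plus 60-90s after a seek ahead.
var mockRanges = {
  length: 2,
  start: function (i) { return [0, 60][i]; },
  end:   function (i) { return [10, 90][i]; }
};

console.log(printTimeRanges(mockRanges)); // "2: 0 - 10; 60 - 90; "
```

In a browser the same call works on `video.buffered`, `video.played`, or `video.seekable`.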
what are the JavaScript control methods defined on media elements
1. load()
2. play()
3. pause()
4. canPlayType()
Describe the load control method
The load() control method, when executed on a media element, causes all activity on a media
resource to be suspended (including resource selection and loading, seeking and playback), all network
activity to be ceased, the element to be reset (including removal of pending callbacks and events), and
the resource selection and loading process to be restarted.
What are the phases of a successful load() call?
1. initialization
2. resource selection
3. resource fetching
4. playback start (if autoplay is set to true)
Describe the sequence of events in the initialization phase
The steps don't necessarily happen in this order:
1. networkState is set to NETWORK_EMPTY.
2. readyState is set to HAVE_NOTHING.
3. paused is set to “true.”
4. seeking is set to “false.”
5. ended is set to “false.”
6. currentTime is set to 0.
7. error is set to null.
8. buffered, played, and seekable are set to empty.
9. playbackRate is set to the value of defaultPlaybackRate.
Describe the sequence of events in the resource selection phase
The steps don't necessarily happen in this order:
1. currentSrc is set from the given @src value or the <source> elements.
2. networkState is set to NETWORK_LOADING.
3. the loadstart event is fired.
Describe the sequence of events in the resource fetching phase
1. begin downloading the media resource identified in the @currentSrc attribute.
2. progress event is fired roughly every 350ms or for every byte received (whichever is less frequent).
3. @preload and @autoplay values help determine how much to download.
4. the resource's metadata can be downloaded
5. the media can seek to the appropriate start time given in the media resource or the @currentSrc URI:
6. potentially more media data can be downloaded (and decoded):
7. @networkState is set to NETWORK_IDLE.
Describe the steps that happen during the resource fetching phase after the resource's metadata has been downloaded:
1. @startTime is determined.
2. @currentTime is set to @startTime.
3. @duration is determined.
4. the durationchange event is fired.
5. @videoWidth and @videoHeight are determined (if video element).
6. @seekable is determined.
7. @readyState is set to HAVE_METADATA.
8. the loadedmetadata event is fired.
Describe the steps that happen during the resource fetching phase after seeking to the appropriate start time given in the media resource or the @currentSrc URI:
1. @currentTime is set to this start time.
2. the timeupdate event is fired.
Describe the steps that happen during the resource fetching phase if potentially more media data is downloaded (and decoded)
1. @readyState changes to HAVE_CURRENT_DATA or higher.
2. the loadeddata event is fired.
3. the canplay event is fired for any @readyState higher than
HAVE_FUTURE_DATA.
4. @buffered is updated.
Describe the sequence of events that happen if the playback start phase happens
1. download more data until @readyState is HAVE_FUTURE_DATA or higher (preferably HAVE_ENOUGH_DATA so playback doesn't get stalled).
2. @paused is set to “false.”
3. the play event is fired.
4. the playing event is fired.
5. playback is started.
Describe the play() control method
The play() control method executed on a media element sets the @paused IDL attribute to “false” and starts playback of the media resource, downloading and buffering media data as required.
In a typical scenario for a successful play(), roughly what sequence of steps will happen:
1. if @networkState is NETWORK_EMPTY—i.e. no @currentSrc has been determined yet (e.g. because the @src of the element was empty as the element was set up, but now the attribute was set through JavaScript and the resource can be fetched)— “resource selection” and “resource fetching,” as described for load() are
executed.
2. if @ended is “true” and the playback direction is forward, the browser seeks to @startTime.
3. @currentTime is set to @startTime.
4. timeupdate event is fired.
5. “start playback” as described for load() above is executed.
Describe the pause() control method
The pause() control method, when executed on a media element, sets the @paused IDL attribute to
“true” and stops playback of the media resource.
In a typical scenario for a successful pause(), roughly the following sequence of steps will happen:
1. pause playback:
2. @paused is set to “true.”
3. timeupdate event is fired.
4. pause event is fired.
5. downloading of more media data is potentially suspended and a suspend event is fired if the browser is far ahead of the current playback position.
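play() and pause() are often wired to a single toggle button driven by the @paused IDL attribute. A minimal sketch, assuming only that elem exposes play(), pause(), and paused, so a plain object can stand in for a media element outside the browser; togglePlayback is an illustrative name:

```javascript
// Start playback if paused, pause otherwise; return the new
// @paused state. elem is any object with play(), pause(), paused.
function togglePlayback(elem) {
  if (elem.paused) {
    elem.play();
  } else {
    elem.pause();
  }
  return elem.paused;
}

// In a browser:
// button.onclick = function () { togglePlayback(video); };
```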
Describe the canPlayType() control method
The canPlayType(in DOMString type) control method for a media element takes a string as a
parameter that is of a MIME type and returns whether the browser is confident that it can play back that
media type.
What are the return values of the canPlayType method?
1. empty string
2. maybe
3. probably
Describe the empty string return value of the canPlayType method.
the browser is confident it cannot decode and render this type of media resource in a media element.
Describe the maybe return value of the canPlayType method.
the browser is not sure if it can or cannot render this type of media resource in a media element.
Describe the probably return value of the canPlayType method.
the browser is confident that it can decode and render this type of media resource in a media element. Because this implies knowledge about whether the codecs in a container format are supported by the browser, browsers are encouraged to only return “probably” for a MIME type that includes the codecs parameter.
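canPlayType() itself only runs in a browser, but its three possible answers can drive a simple source-selection helper. A sketch under that assumption; pickPlayableType is an illustrative name, and elem only needs a canPlayType() method, so it can be mocked for testing:

```javascript
// Return the first type the browser is confident about ("probably"),
// falling back to the first "maybe"; null if nothing is playable.
function pickPlayableType(elem, types) {
  var fallback = null;
  for (var i = 0; i < types.length; i++) {
    var answer = elem.canPlayType(types[i]);
    if (answer === "probably") return types[i];
    if (answer === "maybe" && fallback === null) fallback = types[i];
  }
  return fallback;
}

// In a browser:
// pickPlayableType(document.createElement("video"),
//   ['video/webm; codecs="vp8, vorbis"', "video/mp4"]);
```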
What are the available events for the Javascript Media Resources API
1. loadstart
2. progress
3. suspend
4. abort
5. error
6. emptied
7. stalled
8. play
9. pause
10. loadedmetadata
11. loadeddata
12. waiting
13. playing
14. canplay
15. canplaythrough
16. seeking
17. seeked
18. timeupdate
19. ended
20. ratechange
21. volumechange
22. durationchange
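One listener per event name, attached in a loop, makes it easy to watch this whole list fire during loading and playback. In a browser the target would be the <video> element; the sketch below substitutes a bare EventTarget (available in modern browsers and in Node 15+) so the wiring itself can be exercised:

```javascript
var MEDIA_EVENTS = [
  "loadstart", "progress", "suspend", "abort", "error", "emptied",
  "stalled", "play", "pause", "loadedmetadata", "loadeddata",
  "waiting", "playing", "canplay", "canplaythrough", "seeking",
  "seeked", "timeupdate", "ended", "ratechange", "volumechange",
  "durationchange"
];

var log = [];
// In a browser: var target = document.querySelector("video");
var target = new EventTarget();
MEDIA_EVENTS.forEach(function (name) {
  target.addEventListener(name, function (e) { log.push(e.type); });
});

// Simulate part of what the browser would dispatch during a load.
target.dispatchEvent(new Event("loadstart"));
target.dispatchEvent(new Event("loadedmetadata"));
target.dispatchEvent(new Event("canplay"));
```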
loadstart is dispatched when
the browser begins looking for media data, as part of the resource selection upon media element load, or load(), play(), or pause().
What is the precondition for loadstart
@networkState is NETWORK_LOADING for the first time.
progress is dispatched when
the browser is fetching media data.
What is the precondition for progress event
@networkState is NETWORK_LOADING.
suspend event is dispatched when
the browser has paused fetching media data,
but does not have the entire media resource
downloaded yet.
What is the precondition for suspend event
@networkState is NETWORK_IDLE.
abort event is dispatched when
the browser was stopped from fetching the media data before it was completely downloaded, not due to an error but due to a user action, such as browsing away.
What is the precondition for abort event
1. error is MEDIA_ERR_ABORTED.
2. networkState is either NETWORK_EMPTY or NETWORK_IDLE, depending on when the download was aborted.
error event is dispatched when
an error occurred while fetching the media
data.
What is the precondition for error event
@error is MEDIA_ERR_NETWORK or higher. @networkState is either NETWORK_EMPTY or NETWORK_IDLE, depending on when the download was aborted.
emptied event is dispatched when
a media element has just lost the network connection to a media resource, either because of a fatal error during load that's about to be reported or because the load() method was invoked while the selection algorithm was already running.
What is the precondition for emptied event
@networkState is NETWORK_EMPTY for the first time and all the IDL attributes are in their initial states.
stalled event is dispatched when
the browser tried to fetch media data, but data has not arrived for more than 3 seconds.
What is the precondition for stalled event
@networkState is NETWORK_LOADING.
play event is dispatched when
playback has begun, either upon media element load with an @autoplay attribute, through user interaction, or after the play() method has returned.
What is the precondition for play event
@paused is newly “false.”
pause event is dispatched when
playback has been paused either through user interaction, or after the pause() method has returned.
What is the precondition for pause event
@paused is newly “true”.
loadedmetadata event is dispatched when
the browser has just set up the decoding pipeline for the media resource and determined the @duration and dimensions.
What is the precondition for loadedmetadata event
@readyState is HAVE_METADATA or greater for the first time.
loadeddata event is dispatched when
the browser can render the media data at the
current playback position for the first time.
What is the precondition for loadeddata event
@readyState is HAVE_CURRENT_DATA or greater for the first time.
waiting event is dispatched when
playback has stopped because the next media data is not yet available from the network, but the browser expects that frame to become available in due course; i.e. less than 3 seconds. This can be after a seek or when the network is unexpectedly slow.
What is the precondition for waiting event
@readyState is newly equal to or less than HAVE_CURRENT_DATA, and @paused is “false.” Either @seeking is “true,” or the current playback position is not contained in any of the ranges in @buffered.
playing event is dispatched when
playback has started.
What is the precondition for playing event
@readyState is newly equal to or greater than HAVE_FUTURE_DATA, @paused is “false,” and either @seeking is “false” or the current playback position is contained in one of the ranges in @buffered.
canplay event is dispatched when
the browser can start or resume playback of the media resource, but without being certain of being able to play through at the given playback rate without a need for further
buffering.
What is the precondition for canplay event
@readyState newly increased to HAVE_FUTURE_DATA or greater.
timeupdate event is dispatched when
the current playback position changed as part
of normal playback every 15 to 250ms. It is also
fired when seeking, or when fetching a new
media resource. It is also fired when playback
has ended, is paused, is stopped due to an
error or because the media resource needs to
buffer from the network.
What is the precondition for timeupdate event
1. @seeking is newly “true” OR
2. @startTime is newly set OR
3. @ended is newly “true” OR
4. @paused is newly “true” OR
5. @readyState newly changed to a value lower than
HAVE_FUTURE_DATA without @ended being “true” OR
6. @error is newly nonnull with @readyState being HAVE_METADATA or more OR
7. @seeking is “false,” @paused is “false,” @ended is “false,” @readyState is at least HAVE_FUTURE_DATA, and the last timeupdate was fired more than 15-250ms ago
ended event is dispatched when
playback has stopped because the end of the
media resource was reached.
What is the precondition for ended event
@ended is newly “true” and @currentTime is equal to @startTime plus @duration.
ratechange event is dispatched when
either the default playback rate or the playback rate has just been updated.
What is the precondition for ratechange event
@defaultPlaybackRate or @playbackRate is newly changed.
durationchange event is dispatched when
the duration has just been changed upon media element load, or after an explicit change of @src and media resource fetching, or when the browser has a better estimate; e.g. during streaming.
What is the precondition for durationchange event
@duration has just changed its value.
volumechange event is dispatched when
either the volume or the muted state of the media element changed.
What is the precondition for volumechange event
@volume or @muted changed.