One of the biggest driving forces behind the growth of the internet has been the insatiable demand from users for ever more multimedia in the form of audio and video. Initially, bandwidth was so precious that there was no such thing as live streaming, and it could take minutes or even hours to download an audio track, let alone a video.
The high cost of bandwidth and limited availability of fast modems drove the development of faster and more efficient compression algorithms, such as MP3 audio and MPEG video, but even then the only way to download files in any reasonable length of time was to drastically reduce their quality.
One of my earlier internet projects, back in 1997, was the UK’s first online radio station licensed by the music authorities. Actually, it was more of a podcast (before the term was coined) because we made a daily half-hour show and then compressed it down to 8-bit, 11 kHz mono using an algorithm originally developed for telephony, and it sounded like phone quality, or worse. Still, we quickly gained thousands of listeners who would download the show and then listen to it as they surfed to the sites discussed in it by means of a pop-up browser window containing a plug-in.
Thankfully for us, and everyone publishing multimedia, it soon became possible to offer greater audio and video quality, but still only by asking the user to download and install a plug-in player. Flash became the most popular of these players, after beating rivals such as RealAudio, but it gained a bad reputation as the cause of many a browser crash, and constantly required upgrading when new versions were released.
So, it was generally agreed that the way ahead was to come up with some web standards for supporting multimedia directly within the browser. Of course, browser developers such as Microsoft and Google had differing visions of what these standards should look like, but when the dust settled, they had agreed on a subset of file types that all browsers should play natively, and these were introduced into the HTML5 specification.
Finally, it is possible (as long as you encode your audio and video in a few different formats) to upload multimedia to a web server, place a couple of HTML tags in a web page, and play the media on any major desktop browser, smartphone, or tablet device, without the user having to download a plug-in or make any other changes.
There are still a lot of older browsers out there, so Flash remains important to support them. In this chapter, I show you how to add code to use Flash as a backup to HTML5 audio and video, to cover as many hardware and software combinations as possible.
The term codec stands for encoder/decoder. It describes the functionality provided by software that encodes and decodes media such as audio and video. In HTML5 there are a number of different sets of codecs available, depending on the browser used.
One complication around audio and video, which rarely applies to graphics and other traditional web content, is the licensing carried by the formats and codecs. Many formats and codecs are provided for a fee, because they were developed by a single company or consortium of companies that chose a proprietary license. Some free and open source browsers don’t support the most popular formats and codecs because it is unfeasible to pay for them, or because the developers oppose proprietary licenses in principle. Because copyright laws vary from country to country and because licenses are hard to enforce, the codecs can usually be found on the web for no cost, but they might technically be illegal to use where you live.
Following are the codecs supported by the HTML5 <audio> tag (and also when audio is attached to HTML5 video):
AAC: audio/aac
MP3: audio/mpeg
PCM: audio/wav, but you may also see audio/wave
Vorbis: audio/ogg, or sometimes audio/oga
The following list summarizes the major operating systems and browsers, along with the audio types their latest versions support:
Apple iOS: AAC, MP3, PCM
Apple Safari: AAC, MP3, PCM
Google Android 2.3+: AAC, MP3, Vorbis
Google Chrome: AAC, MP3, Vorbis
Microsoft Internet Explorer: AAC, MP3, PCM
Microsoft Edge: AAC, MP3, PCM
Mozilla Firefox: MP3, PCM, Vorbis
Opera: PCM, Vorbis
The outcome of these different levels of codec support is that you always need at least two versions of each audio file to ensure it will play on all platforms. These could be PCM and AAC, PCM and MP3, PCM and Vorbis, AAC and Vorbis, or MP3 and Vorbis.
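If you want to check at runtime which of these formats the current browser can handle, you can query the standard canPlayType method of a media element, which returns 'probably', 'maybe', or an empty string. The following is a minimal sketch (not one of the chapter's numbered examples) that tests the MIME types listed previously:

<script>
// Create a detached audio element purely for capability testing
var testaudio = document.createElement('audio')

// canPlayType returns 'probably', 'maybe', or '' (cannot play)
document.write('AAC: '    + testaudio.canPlayType('audio/aac')  + '<br>')
document.write('MP3: '    + testaudio.canPlayType('audio/mpeg') + '<br>')
document.write('PCM: '    + testaudio.canPlayType('audio/wav')  + '<br>')
document.write('Vorbis: ' + testaudio.canPlayType('audio/ogg')  + '<br>')
</script>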
To cater to all platforms, you need to record or convert your content using multiple codecs and then list them all within <audio> and </audio> tags, as in Example 25-1. The nested <source> tags then contain the various media you wish to offer to a browser. Because the controls attribute is supplied, the result looks like Figure 25-1.
<audio controls>
  <source src='audio.m4a' type='audio/aac'>
  <source src='audio.mp3' type='audio/mp3'>
  <source src='audio.ogg' type='audio/ogg'>
</audio>
In this example I included three different audio types because that’s perfectly acceptable, and it can be a good idea if you wish to ensure that each browser can locate its preferred format rather than just one it knows how to handle. However, you can drop either the MP3 file or the AAC file (but not both) and still be confident that the example will play on all platforms.
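For instance, if you decide to keep just the MP3 and Vorbis files, a pared-down sketch of Example 25-1 (reusing the same hypothetical filenames) would look like this and should still cover all of the platforms listed earlier:

<audio controls>
  <source src='audio.mp3' type='audio/mp3'>
  <source src='audio.ogg' type='audio/ogg'>
</audio>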
The <audio> element and its partner <source> tag support several attributes:
autoplay: Starts playback as soon as the media is ready, without waiting for the user.
controls: Displays the browser’s built-in playback controls.
loop: Restarts the media from the beginning each time it finishes.
preload: Hints to the browser whether to start loading the media before playback is requested.
src: Specifies the location of the media file (an alternative to using <source> tags).
type: Specifies the MIME type of a <source> file.
If you don’t supply the controls attribute to the <audio> tag, and don’t use the autoplay attribute either, the sound will not play and there won’t be a Play button for the user to click to start playback. This would leave you no option other than to offer this functionality in JavaScript, as in Example 25-2 (with the additional code required highlighted in bold), which provides the ability to play and pause the audio, as shown in Figure 25-2.
<!DOCTYPE html>
<html>
<head>
<title>Playing Audio with JavaScript</title>
<script src='OSC.js'></script>
</head>
<body>
<audio id='myaudio'>
<source src='audio.m4a' type='audio/aac'>
<source src='audio.mp3' type='audio/mp3'>
<source src='audio.ogg' type='audio/ogg'>
</audio>
<button onclick='playaudio()'>Play Audio</button>
<button onclick='pauseaudio()'>Pause Audio</button>
<script>
function playaudio()
{
O('myaudio').play()
}
function pauseaudio()
{
O('myaudio').pause()
}
</script>
</body>
</html>
This works by calling the play or pause methods of the myaudio element when the buttons are clicked.
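If you would rather offer a single button that switches between the two states, you can test the element’s standard paused property. Here is a minimal sketch (not part of Example 25-2) that assumes the same myaudio element and the O helper from OSC.js:

<button onclick='toggleaudio()'>Play/Pause Audio</button>

<script>
function toggleaudio()
{
  // paused is true whenever playback is stopped or has not yet started
  if (O('myaudio').paused) O('myaudio').play()
  else                     O('myaudio').pause()
}
</script>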
In very rare cases (such as when developing web pages for in-house intranets that are using older browsers), it may be necessary for you to fall back to Flash for handling audio. Example 25-3 shows how you can do this using a Flash plug-in called audioplayer.swf that’s available, along with all the examples, in the archive file on the book’s companion website. The code to add is highlighted in bold.
<audio controls>
<object type="application/x-shockwave-flash"
data="audioplayer.swf" width="300" height="30">
<param name="FlashVars"
value="mp3=audio.mp3&showstop=1&showvolume=1">
</object>
<source src='audio.m4a' type='audio/aac'>
<source src='audio.mp3' type='audio/mp3'>
<source src='audio.ogg' type='audio/ogg'>
</audio>
Here we take advantage of the fact that non-HTML5 browsers read and act on anything inside the <audio> tag (other than the <source> elements, which are ignored). Therefore, by placing an <object> element there that calls up a Flash player, we ensure that any non-HTML5 browsers will at least have a chance of playing the audio, as long as they have Flash installed, as shown in Figure 25-3.
The particular audio player used in this example, audioplayer.swf, accepts the following arguments and values, passed in the value of the <param> element named FlashVars:
mp3: The URL of the MP3 file to play.
showstop: If 1, the display shows the Stop button; otherwise, it is not displayed.
showvolume: If 1, the display shows the volume bar; otherwise, it is not displayed.
As with many elements, you can easily resize the object to (for example) 300 × 30 pixels by providing these values to its width and height attributes.
Playing video in HTML5 is quite similar to audio; you just use the <video> tag and provide <source> elements for the media you are offering. Example 25-4 shows how to do this with three different video codec types, as displayed in Figure 25-4.
<video width='560' height='320' controls>
  <source src='movie.mp4' type='video/mp4'>
  <source src='movie.webm' type='video/webm'>
  <source src='movie.ogv' type='video/ogg'>
</video>
As with audio, there are a number of video codecs available, with differing support across multiple browsers. These codecs come in different containers, as follows:
MP4: video/mp4
Ogg: video/ogg, or sometimes video/ogv
WebM: video/webm
These may then contain one of the following video codecs:
H.264
Theora
VP8
VP9
The following list details the major operating systems and browsers, along with the video types their latest versions support:
Apple iOS: MP4/H.264
Apple Safari: MP4/H.264
Google Android: MP4, Ogg, WebM/H.264, Theora, VP8
Google Chrome: MP4, Ogg, WebM/H.264, Theora, VP8, VP9
Microsoft Internet Explorer: MP4/H.264
Microsoft Edge: MP4/H.264, WebM/H.264
Mozilla Firefox: MP4, Ogg, WebM/H.264, Theora, VP8, VP9
Opera: Ogg, WebM/Theora, VP8
Looking at this list, it’s clear that MP4/H.264 is almost universally supported, the exception being the Opera browser. So, to reach over 96% of users, you only need to supply your video using that one file type. But for maximum coverage, you really ought to encode in Ogg/Theora or WebM/VP8 as well.
Therefore, the movie.webm file in Example 25-4 isn’t strictly needed, but it shows how you can add all the different file types you like, to give browsers the opportunity to play back the formats they prefer.
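In fact, if you are happy to rely on MP4/H.264 alone, you don’t even need nested <source> tags. A minimal sketch (reusing the hypothetical movie.mp4 file from Example 25-4) could simply set the src attribute directly on the <video> element:

<video src='movie.mp4' width='560' height='320' controls></video>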
The <video> element and accompanying <source> tag support the following attributes:
autoplay: Starts playback as soon as the video is ready, without waiting for the user.
controls: Displays the browser’s built-in playback controls.
height: Specifies the display height of the video, in pixels.
loop: Restarts the video from the beginning each time it finishes.
muted: Mutes the audio track by default.
poster: Specifies an image to display before playback begins.
preload: Hints to the browser whether to start loading the video before playback is requested.
src: Specifies the location of the video file (an alternative to using <source> tags).
type: Specifies the MIME type of a <source> file.
width: Specifies the display width of the video, in pixels.
If you wish to control video playback from JavaScript, you can do so using code such as that in Example 25-5 (with the additional code required highlighted in bold), with the results shown in Figure 25-5.
<!DOCTYPE html>
<html>
<head>
<title>Playing Video with JavaScript</title>
<script src='OSC.js'></script>
</head>
<body>
<video id='myvideo' width='560' height='320'>
<source src='movie.mp4' type='video/mp4'>
<source src='movie.webm' type='video/webm'>
<source src='movie.ogv' type='video/ogg'>
</video><br>
<button onclick='playvideo()'>Play Video</button>
<button onclick='pausevideo()'>Pause Video</button>
<script>
function playvideo()
{
O('myvideo').play()
}
function pausevideo()
{
O('myvideo').pause()
}
</script>
</body>
</html>
This code is just like that for controlling audio from JavaScript. Simply call the play and/or pause methods of the myvideo object to play and pause the video.
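The same element exposes the other standard HTMLMediaElement properties too, so you can, for example, mute the soundtrack or rewind the clip. The following is a minimal sketch (not part of Example 25-5) that assumes the myvideo element and the O helper shown previously, and uses the standard muted and currentTime properties:

<button onclick='mutevideo()'>Mute/Unmute Video</button>
<button onclick='restartvideo()'>Restart Video</button>

<script>
function mutevideo()
{
  // Flip the muted property to silence or restore the soundtrack
  O('myvideo').muted = !O('myvideo').muted
}

function restartvideo()
{
  // Jump back to the start and resume playback
  O('myvideo').currentTime = 0
  O('myvideo').play()
}
</script>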
If you find yourself having to develop for non-HTML5–capable browsers, Example 25-6 shows how to fall back to Flash for video (with the additional code highlighted in bold) using the flowplayer.swf file (included with the example archive available on the book’s companion website). Figure 25-6 shows how it displays in a browser that doesn’t support HTML5 video.
<video width='560' height='320' controls>
<object width='560' height='320'
type='application/x-shockwave-flash'
data='flowplayer.swf'>
<param name='movie' value='flowplayer.swf'>
<param name='flashvars'
value='config={"clip": {
"url": "movie.mp4",
"autoPlay":false, "autoBuffering":true}}'>
</object>
<source src='movie.mp4' type='video/mp4'>
<source src='movie.webm' type='video/webm'>
<source src='movie.ogv' type='video/ogg'>
</video>
This Flash video player is particular about security; it won’t play videos from a local filesystem, only from a web server, so I have supplied a file on the book’s companion website for this example to play.
Here are the arguments to supply in the value of the <param> element named flashvars:
url: A URL of an .mp4 file to play (must be on a web server)
autoPlay: If true, plays automatically; otherwise, waits for the Play button to be clicked
autoBuffering: If true, preloads enough of the video for the available bandwidth before playback starts, to minimize buffering later on slow connections
For more information on the Flash flowplayer program (and an HTML5 version), check out flowplayer.org.
Using the information in this chapter, you will be able to embed any audio and video you like on almost all browsers and platforms without worrying about whether users may or may not be able to play it.
In the following chapter, I’ll demonstrate the use of a number of other HTML5 features, including geolocation and local storage.
Which two HTML element tags are used to insert audio and video into an HTML5 document?
How many audio codecs should you use to guarantee maximum playability on all platforms?
Which methods can you call to play and pause HTML5 media playback?
How can you support media playback in a non-HTML5 browser?
Which two video codecs should you use to guarantee maximum playability on all platforms?
See “Chapter 25 Answers” in Appendix A for the answers to these questions.