June 2012 Archives

June 29, 2012

Google I/O: Turning the Web Up to 11 Video

Google's Moog Doodle Source Code Now Available!

Moog doodle

On May 23, 2012, in honor of Bob Moog's birthday, Google placed an interactive doodle mimicking the original Minimoog on its main search page.  The doodle was the first large-scale use of the new Web Audio API and received much acclaim across the web.

Today, at the request of the HTML5 Audio blog, the code for the Moog doodle has been released under Apache License 2.0 and is available for developers worldwide to see how it was made and build upon its foundation.

June 28, 2012

Google I/O: Turning the Web Up to 11

Turning the web up to 11

For those who couldn't be at Chris Wilson's Google I/O talk about web audio today, his slides are available online.  They provide an outstanding array of examples, arguments for why the Web Audio API is relevant and vitally important, and tutorials on how to use the API to create interesting audio solutions for your projects.

The slides are chock full of amazing things one can implement using the Web Audio API, and no developer will want to miss any of it.  Some highlights I found particularly instructive and informative include:

You'll want to use Canary and have the Jazz MIDI Plugin installed (as per the instructions I gave recently for another of Chris' excellent examples).

Google I/O: EA Strike Fortress and PlayCanvas

Google I/O

Electronic Arts announced at Google I/O that their new game Strike Fortress was developed using HTML5, but few details are available so far about which technologies were used.

But there's more!  PlayCanvas was revealed, indicating further exciting movement in the HTML5 game development space.  PlayCanvas is a development toolset that relies heavily on HTML5 technologies, including the Web Audio API and WebGL, to implement games and other interactive media.  While PlayCanvas is currently in closed beta, a series of demos is available; most refused to run in Chrome 20 for me, but I was able to use them all in Canary.

My favorite bit?  One of the PlayCanvas demos is the old After Dark Flying Toasters screensaver, complete with Peter Drescher's music!

Thanks to Rory O'Neill for the contribution!

June 26, 2012

Chrome 20 Ships, Includes Oscillator Node

Google Chrome

Google shipped Chrome 20 today. Among a number of bug fixes and other improvements (additional info here), the new version includes the Oscillator node, which was previously only available in Canary.

Chrome's current capabilities are documented on the Web Platform Status page, and information about the Oscillator node's integration into WebKit is documented in Trac, too.

Update: Chris Wilson tells me that while the Oscillator node is indeed in Chrome 20, noteOn()/noteOff() were added a bit later and are slated for Chrome 21, so existing Oscillator implementations may not work properly in this release.
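
If you want to guard against this, a simple runtime check lets a page degrade gracefully when noteOn()/noteOff() are missing.  This is just a defensive sketch using the prefixed API Chrome shipped at the time, not code from Chris:

    // Defensive check: the Oscillator node exists in Chrome 20, but
    // noteOn()/noteOff() may not be available until Chrome 21.
    var context = new webkitAudioContext();
    var osc = context.createOscillator();

    if (typeof osc.noteOn === 'function') {
      osc.connect(context.destination);
      osc.frequency.value = 440;                // A4
      osc.noteOn(0);                            // start now
      osc.noteOff(context.currentTime + 1);     // stop after one second
    } else {
      // Fall back (e.g. to a pre-rendered buffer) or suggest Canary to the user.
      console.warn('Oscillator is present, but noteOn()/noteOff() are unavailable.');
    }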

June 25, 2012

Case Study: A Tale of An HTML5 Game with Web Audio

Fieldrunners

Over at HTML5 Rocks they have a number of excellent tutorials, which I've linked to in the past.  I just discovered there's also an outstanding post-mortem covering the Web Audio API implementation in Fieldrunners for the Chrome browser.  The article goes deep into how the audio engine was implemented, what kinds of mistakes they made, and how the whole thing came together.

Absolutely a must-read for anyone wanting to do interactive audio in a browser.

Controlling amplitude using the Gain Node

Stuart Memo has posted another of his great, simple tutorials.  This time it's a demonstration of how to adjust output amplitude using Web Audio API's Gain Node and a slider control.  And as before, he's posted a working source example in a fiddle.
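
To give a sense of what the tutorial covers, here is a minimal sketch of the same idea, written against the prefixed API Chrome shipped at the time (webkitAudioContext, createGainNode, noteOn).  The element ID and starting values are illustrative, not taken from Stuart's fiddle:

    // Minimal sketch: oscillator -> gain -> speakers, with a slider driving the gain.
    var context = new webkitAudioContext();        // prefixed constructor in Chrome
    var oscillator = context.createOscillator();   // the sound source
    var gainNode = context.createGainNode();       // later renamed createGain()

    oscillator.connect(gainNode);                  // source feeds the gain node
    gainNode.connect(context.destination);         // gain node feeds the speakers

    gainNode.gain.value = 0.5;                     // start at half volume
    oscillator.noteOn(0);                          // later renamed start()

    // Assumes a slider like <input type="range" id="volume" min="0" max="1" step="0.01">.
    document.getElementById('volume').addEventListener('change', function (e) {
      gainNode.gain.value = parseFloat(e.target.value);
    });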

June 23, 2012

Aurora and FLAC.js — audio codecs using the Web Audio API

Want to use FLAC and other unsupported codecs to provide audio data in your web application?  As we know, leaving codec support to the browser makers has led to fragmentation that makes what should be a straightforward feature unnecessarily complex.  Enter Aurora, a framework that lets programmers include whatever codec support they need within their apps.  In getting Aurora ready for public consumption, the folks at Official.fm Labs have built decoders including JSMad (MP3), ALAC, and FLAC.  And all this at less than 5% CPU usage!

If you’re interested in extending Aurora with your own demuxers and decoders, documentation for the framework is available.  Let's get to work!
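
For a feel of what using such a framework might look like, here is a rough illustration only.  The AV.Player API and file path shown are assumptions on my part, not taken from the announcement, so consult Aurora's own documentation for the real interface:

    // Rough illustration: decode and play a FLAC file in the browser via Aurora.
    // Method names here are assumptions; see Aurora's documentation for specifics.
    var player = AV.Player.fromURL('audio/example.flac');  // FLAC.js supplies the decoder
    player.play();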

June 22, 2012

Sympyrean — a generative audio sequencer built using Web Audio API

Sympyrean

Sympyrean, a generative audio sequencer, uses a stellar system you create to produce music. The motion of the planets and moons within the system, as well as their distance from the star, determines what notes you hear and when.

The app was inspired by a concept called Musica Universalis (also known as the music of the spheres) which dates all the way back to ancient Greece. The story begins with the Greek mathematician Pythagoras, who apparently loved to play with strings. What he discovered was that if he stretched a string between two points, placed his finger on that string, and plucked it, the pitch (or note) it produced was determined by the length of the segment he plucked. More importantly, he found that the notes which sounded the clearest occurred when the length of the plucked segment and the total length of the string formed simple whole-number ratios. This is, in fact, where we get the system of tuning used in music today. For instance, the interval of a perfect fifth has a ratio of 3:2. If we take the note E5, which has a frequency of about 659 Hz, and divide it by the frequency of the note a perfect fifth below, A4 at 440 Hz, we end up with about 1.5, which is three halves or, you guessed it: a ratio of 3:2.
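
The arithmetic is easy to check for yourself (the figures below are standard equal-temperament frequencies, not anything taken from the Sympyrean site):

    // Worked check of the 3:2 ratio described above (frequencies in Hz).
    var A4 = 440;            // concert-pitch A
    var E5 = 659.26;         // the E a perfect fifth above A4
    console.log(E5 / A4);    // ~1.498, i.e. very nearly 3/2
    console.log(A4 * 3 / 2); // 660, the "pure" perfect fifth above A4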

Quite interesting and fun, once you get the hang of the interface.  And the audio functions were built entirely using Web Audio API!

Faust includes support for Web Audio API

Faust

The team behind FAUST (Functional Audio Stream), a functional programming language specifically designed for real-time signal processing and synthesis, notes in its most recent blog post that support for the Web Audio API is now included:

The Faust2 branch has been updated with a new JavaScript backend that allows to generate code to be used in this context. New architecture files have been developed to embed this generated code and do Faust audio processing directly in a browser.

Seems like a pretty useful way to create JavaScript audio-processing widgets using a more "On Rails"-type approach.

June 19, 2012

How to get Web Audio API support in WebKit GTK+

Came across a blog article that explains how to get the Web Audio API working for GNOME users. I'm not sure if this is the first implementation of the Web Audio API on Linux. Does the Chrome beta for Linux support the Web Audio API?

<audio> Support and Troubleshooting Guide

Trying to build a web application or site that includes audio, but frustrated over how hard it is to get bugs sorted out in all browsers? Fear not! Game developer Chris Khoo has published an outstanding guide to support for the <audio> tag in the popular browsers. Unfortunately, there isn't a solution that works for all browsers (and his conclusion with regard to Android is to build native apps instead, using appMobi, PhoneGap, or Appcelerator), but his information is insightful nonetheless, and it's nice to see it all in one place.

June 18, 2012

Some basic Web Audio API tutorials

Waveforms

Stuart Memo has been posting some very nice, basic tutorials for web audio beginners.  One is a simple introduction to using the Web Audio API.  The second discusses oscillators, including a simple discussion of the different kinds of waves.  There is even a jsfiddle example you can run in Chrome Canary.  (Click "Run" at the top of the page to hear sound, and be sure to turn down your computer's speakers first!)
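
If you'd like to experiment beyond the fiddle, a bare-bones version of the oscillator idea looks something like the sketch below.  It uses the prefixed API as shipped in Chrome at the time; the waveform constant and frequency are just examples:

    // Bare-bones oscillator: pick a waveform, set a pitch, and play it briefly.
    var context = new webkitAudioContext();
    var osc = context.createOscillator();

    osc.type = osc.SQUARE;          // SINE, SQUARE, SAWTOOTH, or TRIANGLE in the early API
    osc.frequency.value = 220;      // A3, in hertz
    osc.connect(context.destination);

    osc.noteOn(0);                           // start immediately (later renamed start())
    osc.noteOff(context.currentTime + 0.5);  // stop after half a second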

One more thing worth taking a look at is Stuart's Question Park website, which contains a few audio toys that are fun to play with.  (Too bad they only work in browsers that support Ogg Vorbis audio; the code seems to run fine on the iPad, but the audio wouldn't play.)

Thanks to Stuart Memo for the contribution!

June 15, 2012

WebModular — a Web Audio API and Audio Data API synth

WebModular

Apparently it is modular synthesizer week.  From the folks at g200kg.com comes WebModular, an impressive and easy-to-use modular synthesizer built entirely with HTML5 and JavaScript.  There are a couple of neat features worth pointing out:

  • WebModular employs both the Web Audio API and Mozilla's Audio Data API, so it works great in both Chrome and Firefox (see the feature-detection sketch after this list).
  • Like Google's Moog Doodle, you can play WebModular with your QWERTY keyboard.
  • Playback can be scripted by the user, with instructions listed below the entry field.
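
Supporting both back ends generally comes down to feature detection.  The sketch below is not WebModular's actual code, just an illustration of how an app can pick between the two APIs at runtime:

    // Minimal sketch: choose an audio back end at runtime (not WebModular's code).
    function createAudioBackend() {
      if (window.webkitAudioContext || window.AudioContext) {
        // Chrome / WebKit: Web Audio API
        var ContextClass = window.AudioContext || window.webkitAudioContext;
        return { type: 'web-audio', context: new ContextClass() };
      } else if (typeof Audio !== 'undefined' && 'mozSetup' in new Audio()) {
        // Firefox: Audio Data API
        var output = new Audio();
        output.mozSetup(2, 44100);   // two channels at 44.1 kHz
        return { type: 'audio-data', output: output };
      }
      return null;                   // neither API is available
    }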

All in all, it's a great demonstration.  I hope they add MIDI support once the Web MIDI API is ready!

Thanks to Kevin Ryan for the contribution!

June 14, 2012

Web Audio at Google I/O and OpenWebCamp

Google I/O

Want to learn more about the Web Audio API and other new developments in HTML5 audio? Chris Wilson will be presenting a seminar entitled "Turning the Web Up to 11" at Google I/O 2012, and if you care about web audio you should be there! (Chris is someone worth listening to, especially since he co-authored the original Windows version of NCSA Mosaic back in 1993 and knows a thing or two about web browsers.) In the session, Chris will walk through the draft Web Audio API standard, discuss how the API enables game and music applications, and show some audio applications built using Chrome's implementation.

Google I/O is already sold out, but many of the sessions will be on I/O Live, and all sessions will be online after the event has concluded. You can keep an eye out for announcements about when the sessions are posted.

If you cannot make it to Google I/O, Chris will reprise his session at OpenWebCamp on July 14 in San Jose, CA, so there are plenty of ways to get your web audio on!

Basic MIDI synthesizer example using the Web Audio API

Chris Wilson over at Google is working on a web-based MIDI synthesizer that utilizes the Web Audio API. At the moment there is not much in the way of UI, but it can work with MIDI controllers, which is pretty nifty.

Here's what I did to get it set up and functional:

  1. Download and install Google Chrome Canary.
  2. Launch Canary and install the Jazz MIDI Plugin (since the Web MIDI API is not implemented yet).
  3. Launch the AK-7 Core MIDI controller app on the iPad.
  4. Launch Audio MIDI Setup on the Mac.
  5. Open the "MIDI Studio" window and double-click "Network".
  6. Add a new session under "My Sessions".
  7. Locate your iPad's name in the "Directory" list, select it, and click "Connect". (Ensure the iPad's name is listed under "Participants".)
  8. Go to Canary and load the web page.
  9. Give the Jazz MIDI Plugin permission to run on the site.
  10. Select the name of the session you created in Audio MIDI Setup.

That should be it! Give it a whirl!
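
Once MIDI messages reach the page (via the Jazz plugin callback in this setup), turning a note-on into sound is mostly a matter of converting the MIDI note number to a frequency and firing an oscillator.  The sketch below is a generic illustration, not Chris's actual synth code:

    // Generic sketch: play a MIDI note number with a Web Audio oscillator.
    var context = new webkitAudioContext();

    function midiNoteToFrequency(note) {
      // Equal temperament: A4 (MIDI note 69) = 440 Hz.
      return 440 * Math.pow(2, (note - 69) / 12);
    }

    function playNote(note, velocity) {
      var osc = context.createOscillator();
      var gain = context.createGainNode();

      osc.frequency.value = midiNoteToFrequency(note);
      gain.gain.value = velocity / 127;          // scale MIDI velocity (0-127) to 0-1

      osc.connect(gain);
      gain.connect(context.destination);

      osc.noteOn(0);
      osc.noteOff(context.currentTime + 0.5);    // a fixed half-second note, for simplicity
    }

    // Wherever your MIDI input callback fires, call e.g. playNote(60, 100) for middle C.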

June 13, 2012

iOS 6 and Safari 6.0 to support the Web Audio API

WWDC 2012 Apple

Good news from Apple on Monday! iOS 6 was announced at Apple's yearly Worldwide Developer Conference (WWDC), with developers receiving the first beta release. That beta release includes support for the Web Audio API, as noted in this screenshot.

Additionally, Apple announced Safari 6.0 (and released a developer preview for Lion) and is providing the Web Audio API there, too! It doesn't seem likely that Snow Leopard users will get Safari 6, though, given the developer preview is only for Lion users.

June 10, 2012

Are we playing yet?

The folks over at SoundCloud have created a website that tests your browser for <audio> compatibility, from functionality support to formats, properties, events, and attributes. I ran the test suite on my iPad 3 and a whole lot of red Xs came up, many of which I already knew about, and some I hadn't gotten around to testing myself yet. The SoundCloud folks would love help in developing the tools further, so if you wish to get involved, go fork their GitHub project and start coding!
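
To give a flavor of the kind of thing such a suite has to check, format support alone already varies widely between browsers.  The following is my own sketch of a basic capability probe, not SoundCloud's test code:

    // Sketch of a basic format-support check using canPlayType().
    var probe = document.createElement('audio');

    var formats = {
      mp3: 'audio/mpeg',
      ogg: 'audio/ogg; codecs="vorbis"',
      wav: 'audio/wav; codecs="1"',
      aac: 'audio/mp4; codecs="mp4a.40.2"'
    };

    for (var name in formats) {
      // canPlayType() returns "", "maybe", or "probably" (never a definite yes).
      console.log(name + ': ' + (probe.canPlayType(formats[name]) || 'no'));
    }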

Included on the GitHub page is a list of contact points for reporting issues to browser makers.

It would be great if users could submit their test results to AWPY so a searchable repository of support information could be built as a reference for developers. Also, AWPY should add tests for Web Audio API and Web MIDI API support, since we're likely to see more of both as the APIs mature.

June 8, 2012

Web MIDI API test suite released

Jussi Kalliokoski, a developer in the W3C's Audio Working Group, has just posted the first version of a new Web MIDI API test suite for those wishing to begin writing browser-based MIDI implementations.

Kalliokoski wrote in his post to the working group's listserv:

I have now released a test suite for the Web MIDI API as it's currently specified. It's a really basic test suite to see if the features are there, not if they work that correctly (as JS has no other means of accessing MIDI devices, proper functionality testing is almost impossible). The test obviously requires user input for getting the MIDIAccess object, otherwise it's quite automatic.

Well, what are you waiting for? Have at it!

June 6, 2012

Web Audio Editor — an experimental tool using the Web Audio API

Web audio editor

Jan Myler has created a great Web Audio Editor application as part of a bachelor's thesis project. (The code is available here.) Web Audio Editor can:

  • Open and display WAVE/MP3 audio files.
  • Zoom in and out of the audio waveform.
  • Work with audio tracks containing editable audio clips.
  • Move and resize audio clips, which can also be looped and split easily.
  • Use editing functions such as Cut, Copy, Paste, and Delete.
  • Use a multitrack selection tool.
  • and more...

There is also an ironically silent video tutorial that demonstrates the features of the app.

About the project, Jan had this to say on the W3C Audio Working Group's listserv:

It is more or less experimental and has only limited functionality (no audio exports, open/save functions).

I had problems with high memory usage. Opened audio files are decoded into audioBuffers using audio context. These buffers are used for displaying the waveforms and as playback sources. So they are kept in memory throughout the application run.

Unfortunately, I didn't find a better way to be able to display waveforms when I was in the early stage of development. Even now I don't know another way.

Feedback is welcome.

The project works great in the Chrome browser. I suspect it would also run in a WebKit nightly, but I haven't tried it yet.
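
For readers unfamiliar with the approach Jan describes, the decode-and-draw pattern looks roughly like the sketch below (my own illustration, not the editor's actual code).  Note that the entire decoded AudioBuffer stays resident for both drawing and playback, which is exactly the memory cost he mentions:

    // Minimal sketch of the decode-and-draw approach (not Web Audio Editor's code).
    var context = new webkitAudioContext();

    var request = new XMLHttpRequest();
    request.open('GET', 'clip.wav', true);        // file name is illustrative
    request.responseType = 'arraybuffer';
    request.onload = function () {
      context.decodeAudioData(request.response, function (buffer) {
        drawWaveform(buffer);                     // buffer also stays around for playback
        var source = context.createBufferSource();
        source.buffer = buffer;
        source.connect(context.destination);
        source.noteOn(0);                         // later renamed start()
      });
    };
    request.send();

    function drawWaveform(buffer) {
      var canvas = document.getElementById('waveform');  // assumes a <canvas id="waveform">
      var ctx2d = canvas.getContext('2d');
      var samples = buffer.getChannelData(0);             // Float32Array for channel 0
      var step = Math.floor(samples.length / canvas.width);

      ctx2d.beginPath();
      for (var x = 0; x < canvas.width; x++) {
        var y = (1 - samples[x * step]) * canvas.height / 2;  // map [-1, 1] to pixels
        ctx2d.lineTo(x, y);
      }
      ctx2d.stroke();
    }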

June 5, 2012

7.1% of Android users can play audio in the browser

Android

Google has released the latest stats on Android version adoption. Ice Cream Sandwich, the only version of Android that includes audio codecs for playback of sound using the <audio> tag, has now reached 7.1% adoption, up from 2.9% at the end of April. This is progress, but let's bear in mind that it took seven months to reach this point, and Jelly Bean, the next release of Android OS, is expected only a few months from now. Add to this the fact that Gingerbread is still adding usage share (now at 65%, up from 63.7% at the end of April, roughly a 2% relative increase) and the news is not very encouraging for audio developers who want to include browser-based sound on Android devices. (For comparison, iOS 5.1.1 took 19 hours to reach 7% adoption and 4 days to reach 30%, with a daily adoption rate of around 7.5%!)

One other parallel item worth noting is that Samsung's Galaxy Player 4.2 (an iPod touch competitor), which shipped about two weeks ago, comes with Gingerbread (Android 2.3.5). Clearly, audio on Android is going to remain a far-off dream for some time to come.