MUSIC, AMBIENCE AND SOUND ART > Music Gearheads Tech Talk

The THX sound


Julio Di Benedetto:
I had read somewhere that the THX sound was created on a 90-oscillator Serge, but that's not so. Here's the story behind it:

"I like to say that the THX sound is the most widely-recognized piece of computer-generated music in the world," says Andy Moorer. "This may or may not be true, but it sounds cool!"
>> It's called 'Deep Note'. 
>> It was made by Dr James 'Andy' Moorer in 1982, who has had a very cool career: Four patents, one Oscar. In the '60s he was working in Artificial Intelligence at Stanford. In the '70s he was at IRCAM in Paris, working on speech synthesis and ballet. In the '80s he worked at the LucasFilm DroidWorks, before joining Steve Jobs at NeXT. Today, he consults, repairs old tube radios and plays banjo.
>> At one point, the THX sound was being played 4,000 times a day at cinemas around the world (that's once every 20 seconds).
>> The Simpsons got permission for this [mpg movie] parody. Dr Dre was less lucky. He asked permission to sample 'Deep Note' but was turned down. He used it anyway, to open '2001', and LucasFilm sued.
>> Stanford student Jesse Fox tried to recreate 'Deep Note' for a course. His version sounds like a nasty accident in an organ factory. 
>> There are various theories on the web about how the THX sound was created - some people say it was a Yamaha CS-80, others that it was a Synclavier. I emailed Andy Moorer to ask how it was really made. The short answer was "On a big-ass mainframe computer at LucasFilm". But I thought I should give you the long answer here in full, just because it feels like Andy's writing his own history for the first time...
>> "I've never written the THX story down (nobody ever asked). So, here's the whole story:
>> "I was working in what was then called the "Lucasfilm Computer Division" that existed from roughly 1980 to 1987 or so. It spawned several companies, including Pixar and Sonic Solutions. I was head of the audio group. In about 1982, we built a large-scale audio processor. This was in the days before DSP chips, so it was quite a massive thing. We called it the ASP (Audio Signal Processor).
>> "At the same time Tom Holman was also working at Lucasfilm. He had developed what is now called the THX sound system. It was to premiere with Lucasfilm's "Return of the Jedi." They were making a logo to go before the film. I was asked by the producer of the logo piece to do the sound. He said he wanted "something that comes out of nowhere and gets really, really big!" I allowed as to how I figured I could do something like that.
>> "I set up some synthesis programs for the ASP that made it behave like a huge digital music synthesizer. I used the waveform from a digitized cello tone as the basis waveform for the oscillators. I recall that it had 12 harmonics. I could get about 30 oscillators running in real-time on the device. Then I wrote the "score" for the piece.
>> "The score consists of a C program of about 20,000 lines of code. The output of this program is not the sound itself, but is the sequence of parameters that drives the oscillators on the ASP. That 20,000 lines of code produce about 250,000 lines of statements of the form "set frequency of oscillator X to Y Hertz".
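As an aside: the "score as parameter stream" idea Moorer describes can be sketched in a few lines of Python. Only the command wording is quoted from his account; the function name, the oscillator count per statement, and the frequency values are invented stand-ins.

```python
import random

def write_score(n_osc=30, seed=None):
    """Sketch of a 'score' program in Moorer's sense: its output is a
    list of parameter statements for the synthesizer, not audio."""
    rng = random.Random(seed)
    statements = []
    for osc in range(n_osc):
        freq = rng.uniform(200.0, 400.0)  # opening-cluster range from the story
        statements.append(
            f"set frequency of oscillator {osc} to {freq:.1f} Hertz")
    return statements

score = write_score(n_osc=30, seed=42)
print(score[0])
```

The real score was about 20,000 lines of C emitting roughly 250,000 such statements; the point here is just the separation between the program that generates parameters and the hardware that turns them into sound.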
>> "The oscillators were not simple - they had 1-pole smoothers on both amplitude and frequency. At the beginning, they form a cluster from 200 to 400 Hz. I randomly assigned and poked the frequencies so they drifted up and down in that range. At a certain time (where the producer assured me that the THX logo would start to come into view), I jammed the frequencies of the final chord into the smoothers and set the smoothing time for the time that I was told it would take for the logo to completely materialize on the screen. At the time the logo was supposed to be in full view, I set the smoothing times down to very low values so the frequencies would converge to the frequencies of the big chord (which had been typed in by hand - based on a 150-Hz root), but not converge so precisely that I would lose all the beats between oscillators. All followed by the fade-out. It took about 4 days to program and debug the thing. The sound was produced entirely in real-time on the ASP.
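The cluster-then-converge behavior Moorer describes can be approximated in a short sketch. Hedges: his oscillators read a 12-harmonic cello wavetable and smoothed amplitude as well as frequency (a plain sine stands in here), the actual chord voicing and glide times are not given (the ratios and timings below are guesses, apart from the 150 Hz root and the 200-400 Hz cluster), and the sample rate is arbitrary.

```python
import math
import random

SR = 16000   # sample rate for this sketch; the ASP's real rate isn't stated
N_OSC = 30   # Moorer: about 30 oscillators ran in real time

class Voice:
    """One oscillator with a 1-pole smoother on its frequency."""
    def __init__(self, freq):
        self.freq = freq     # current, smoothed frequency (Hz)
        self.target = freq   # frequency the smoother drifts toward
        self.coeff = 0.0     # per-sample smoothing coefficient
        self.phase = 0.0

    def glide_to(self, freq, seconds):
        """Jam a new target into the smoother, with a given glide time."""
        self.target = freq
        self.coeff = 1.0 - math.exp(-1.0 / (seconds * SR))

    def tick(self):
        # 1-pole smoother: move a fixed fraction of the way to the target
        self.freq += self.coeff * (self.target - self.freq)
        self.phase += 2.0 * math.pi * self.freq / SR
        return math.sin(self.phase)  # stand-in for the cello wavetable

random.seed(1983)

# Opening cluster: frequencies scattered between 200 and 400 Hz.
voices = [Voice(random.uniform(200.0, 400.0)) for _ in range(N_OSC)]

# Final chord on the 150 Hz root (these ratios are invented), detuned
# slightly so the beats between oscillators survive the convergence.
ratios = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 4.0]
for v in voices:
    chord_tone = 150.0 * random.choice(ratios)
    v.glide_to(chord_tone * random.uniform(0.998, 1.002), seconds=1.0)

# Render 5 seconds; every voice glides onto its chord tone.
mix = [sum(v.tick() for v in voices) / N_OSC for _ in range(5 * SR)]
```

The trick Moorer describes maps onto the `coeff` value: a long smoothing time gives the slow sweep while the logo materializes, and dropping it to a small value at the right moment snaps the frequencies onto the chord.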
>> "When we went to sync up the sound with the video (which I hadn't seen yet), we discovered that the timings were all different. I readjusted the times, generated a new score, and in ten minutes, we had the sound synced up with the video perfectly.
>> "There are many, many random numbers involved in the score for the piece. Every time I ran the C program, it produced a new "performance" of the piece. The one we chose had that conspicuous descending tone that everybody liked. It just happened to end up real loud in that version.
>> "Some months after the piece was released (along with "Return of the Jedi") they lost the original recording. I recreated the piece for them, but they kept complaining that it didn't sound the same. Since my random-number generators were keyed on the time and date, I couldn't reproduce the score of the performance that they liked. I finally found the original version and everybody was happy.
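The reproducibility problem Moorer ran into is easy to illustrate. This is a hedged sketch, not his code: the function and values are invented, and only the seeding-on-the-clock idea comes from his account.

```python
import random
import time

def new_performance(seed):
    """One 'performance': the same score program run with a different
    random seed. The frequency values here are invented stand-ins."""
    rng = random.Random(seed)
    return [round(rng.uniform(200.0, 400.0), 1) for _ in range(30)]

# Keying the generator on the clock yields a fresh, unrepeatable run...
unrepeatable = new_performance(seed=time.time())

# ...whereas a recorded, fixed seed replays one performance exactly.
assert new_performance(seed=1983) == new_performance(seed=1983)
```

With the seed taken from the time and date, no two runs of the score produce the same parameter stream, which is exactly why the lost original could not be regenerated.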
>> "If you get permission from THX, I can supply you with the written "score" for the piece (in music notation - this was used to get the copyright) or even the original C program that produced the parameter lists. I can't supply you with a program that makes the sound itself.
>> "The ASP was decommissioned in 1986 and later sold for scrap."

Goosebumps every time. Haha!
Thanks for sharing.

It's funny, I always thought it was a Csound program. That's only part of the answer, it turns out.

