With over 25 years of work under his belt, Charles Maynes is a veteran of the film and TV world with well over 100 feature film credits to his name. His sound design work can be heard on iconic movies and TV shows such as Twister, The Pacific (for which he won an Emmy Award), Letters From Iwo Jima, U-571, and Spider-Man, to name a few. More recently, Charles has been sound designer and effects editor for the popular Paramount+ drama series SEAL Team.
As you’ll read in this interview, a good deal of Charles’s sound design work centres on weaponry, a specialisation that was inspired by the movie Terminator 2 in 1991 and has been honed through recording and/or designing weapons for numerous AAA video game titles and films.
So, why would someone who is clearly an expert at recording and designing his own sound effects call on BOOM Library for his work? In this interview, we find out…
How has the business of sound design developed since you got into it?
It’s complicated to talk about how the changes have specifically occurred but, in the general sense, I think sound design processes have not changed as much as it might seem.
“No matter what technology is being deployed, with sound we are basically telling stories, and in doing so we essentially make a set of images emotionally sensible/believable to our audience.”
The most radical shift was the move from working on film to digital workstations, with a brief interlude using timecode-based analog and digital multi-track tape recorders. No matter what technology is being deployed, with sound we are basically telling stories, and in doing so we essentially make a set of images emotionally sensible/believable to our audience. The principal objective has always been to create a reality that is not going to challenge their suspension of disbelief.
For one of your iconic movies, Twister, which was released in 1996, you had to recreate the sound of tornadoes, etc. Were you out in fields trying to capture wind noise? How did that work?
Twister had an all-star sound team headed by Academy Award-winning sound editor Stephen Hunter Flick (for RoboCop and Speed). The principal sound designer on the film was longtime Flick collaborator John Pospisil (who shared the Academy Award for Best Sound Editing with Flick). On that film I recorded some specific things for the work I was doing with the effects, though Stephen had sent out a number of sound effects recordists (Ken Johnson, Eric Potter and Patricio Libensen) to cover a lot of the production vehicles and specific location ambiences in the middle of America, where the film was supposed to be taking place. I don’t think they really recorded any natural, heavy storm stuff, but one of our recordists, Ken Johnson, developed a set of devices modeled after old radio play props for making hurricane sounds and stuff.
“One thing that seemed to be a recurring comment from survivors of them was that tornadoes sounded a lot like a freight train. If you’re really close to a freight train going by very quickly, it had that kind of power to it.”
I think some of the designs were used in The Wizard of Oz, so we had all that material. John Pospisil also developed a lot of great sounds through more traditional design techniques, in response to the story prompts, which were used in developing the effects sound. When Greg Hedgepath and I were assigned to work on the tornado sound design, we split the task up into two different branches of focus – Greg was handling the more “emotional” side of the sounds, utilising more abstract material, and I concentrated on more natural sounds, which our re-recording mixer Gregg Landaker balanced on the dub stage. Knowing that there weren’t many recordings of real tornadoes that could actually be used, I became very interested in the literary descriptions of what these storms would sound like. One thing that seemed to be a recurring comment from survivors of them was that tornadoes sounded a lot like a freight train. If you’re really close to a freight train going by very quickly, it had that kind of power to it.
The most difficult thing with the concept of something like a tornado is that it’s unrelenting. It’s just a constant big noise so you don’t have natural peaks and valleys where you can restart and build back up to a scary sound.
So, who put the cow in the tornado?!
The cow is an interesting one. Richard King, who has received a number of Academy Awards working with Christopher Nolan and of course Peter Weir, actually cut that sequence with the flying cow, and he asked me to see if I could do some magic to make the cow sound like it was flying by. I ended up using an Emulator IIIXP, which had a really nice Doppler effect that I processed the cow moos with. I just created a bunch of Doppler cows, gave them to him, and he made some selections as to what he thought would make sense inside the image and put it in the track. It was exciting to see that because it actually sounds pretty convincing – at least to me!
I’m guessing these days, a lot of that would be done in the box, wouldn’t it?
I think the process has evolved a bit. It was a DSP process that created the Doppler for the cow, so it gave the sense that it was going by. If we wanted to be really pedantic and we had a lot of money, we’d probably hire a cow and essentially hook it up with a microphone. You could have a cowhand with a cattle prod, and when they see us driving past, they’d prod the cow, the cow bellows, and we get an acoustic Doppler of that effect. But, like I said, it was done inside the Emulator sampler, which had a Doppler effect as a DSP process, and, for the most part, it was hit and miss – you just had to do a lot of passes until you got something that sounded like what you were looking for. Now there are software plugins and various spatialization tools we can use inside Pro Tools, but the biggest thing is setting in your mind what you think is the best way to approach the problem and then using your best judgement to determine what sound works best for the moment.
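For readers curious what that kind of Doppler pass-by amounts to under the hood, here is a minimal sketch in Python. It is not the Emulator’s algorithm or any particular plugin, just the same basic idea of reading a sound through a time-varying delay with a distance falloff; numpy and soundfile are assumed to be installed, and the input file name is made up.

```python
import numpy as np
import soundfile as sf  # assumed available: pip install soundfile

def doppler_flyby(x, sr, speed=30.0, closest=5.0, c=343.0):
    """Pitch the signal up on approach and down on departure by reading it
    through a time-varying delay, with simple inverse-distance level falloff."""
    n = len(x)
    t = np.arange(n) / sr
    pos = speed * (t - t[n // 2])          # source position along its path (m)
    dist = np.sqrt(pos**2 + closest**2)    # distance from source to listener (m)
    emit_t = t - dist / c                  # when the sample heard at time t was emitted
    y = np.interp(emit_t, t, x, left=0.0, right=0.0)
    return y * (closest / dist)            # loudest at the point of closest approach

# Hypothetical usage: "cow_moo.wav" is a stand-in recording.
moo, sr = sf.read("cow_moo.wav")
if moo.ndim > 1:
    moo = moo.mean(axis=1)                 # fold stereo to mono
sf.write("doppler_cow.wav", doppler_flyby(moo, sr), sr)
```

In practice you would still audition many passes with different speeds and distances, which mirrors the hit-and-miss process described above.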
You seem to have a lot of credits on weaponry stuff. Did that happen by accident or was it something you took a real interest in trying to perfect?
“Terminator 2 was the impetus for me wanting to go into sound for film. I was just so amazed at the quality of the effect of the sound of that film.”
Terminator 2 was the impetus for me wanting to go into sound for film. I was just so amazed at the quality of the effect of the sound of that film. Gary Rydstrom’s work in general is fantastically focused. It felt like I was being knocked out specifically by sound; the weapons sounded amazing, with a kind of quality to them that I was unaccustomed to hearing in other films, and that really set the bar for so much of what I did later. Doing film sound is pretty interesting in that there are a lot of spoken and unspoken rules about how you present the work. It is rare not to be part of a larger team on most TV or film projects, so we have some universal practices that most people work hard to follow so we can hand our work off to another person, sometimes mid-project. There is also the matter of providing the re-recording mixer with a set of tracks that is, first, logically sensible for them to mix and also allows for the easiest way to put all the pieces together in the mix. That was probably the hardest thing for me to really embrace and refine when I started in Hollywood in 1995. I was at Universal, where I did some comedy and non-action films, after which I joined Stephen Hunter Flick at Creative Café, and we did stuff that was a bit more action-driven, like Twister and Starship Troopers, both of which were very complicated sound editing projects.
In the course of refining my sound experience, I found I had a distinct contrarian streak, which led me to sort of stupidly take on tasks that were way beyond my experience level. I always volunteered for the most painful sorts of things to do, like doing all the weapons (well, most of them) in the film Starship Troopers, and all the horses in the film A Knight’s Tale. There’s a lot of jousting and a lot of story content that went with the horses.
It was a particularly dumb choice: horses are not easy to edit sound for in a way that makes them feel right, and at the time I was a relatively novice sound person. It basically forced me to work really hard and to pick up a lot of understanding of how sound works against image.
Applying that to any sort of action sequence, you have a lot of perspective changes, and the editing is usually fairly busy and sometimes choppy, to essentially impart excitement into the scene. I think that was where Gary [Rydstrom’s] (indirect) tutelage really came into play for me, because I focused on attempting to represent what I was seeing in the image, no matter how choppy it might be, and then essentially providing the underlying foundation so that you had a sense of continuity across those jump cuts and transitions.
Do you take a more comedic approach to sound design in a comedy action movie such as A Knight’s Tale, compared to a very serious film like Flags Of Our Fathers, or do you want to make the weaponry as authentic as possible?
On A Knight’s Tale, which was supervised by Jon Johnson (Academy Award winner for U-571), my point of departure was to try to be accurate, so that anybody who might have had any experience remotely close to what we’re seeing on screen wouldn’t be distracted by a bad execution of representing that action. But then Jon had this wonderful, yet sensible, mantra that he really imposed on all of us: ‘drama overshadows reality’.
“Depending on what the drama of the moment is, the (sound) accuracy might have to be foregone in order to allow for the dramatic requirement of the moment to be fulfilled.”
Depending on what the drama of the moment is, the accuracy might have to be foregone in order to allow for the dramatic requirement of the moment to be fulfilled. With gun sounds, for instance, you might have some crazy giant sound on a close-up or a slow-motion shot of somebody getting executed; it’s not going to be the same sound you would hear if the camera was depicting something 35 feet away from you. I did a talk with a podcast called Fighting on Film about this. In Saving Private Ryan we have this squad – Tom Hanks, Vin Diesel, Barry Pepper and the other guys – and they all had different weapons. I think it was Edward Burns’ character, Private Reiben, who had the most powerful gun, the BAR [Browning Automatic Rifle], which was basically a regular rifle that could fire on automatic. Then you had Tom Sizemore’s gun, which was a smaller rifle, and then Tom Hanks, who had a Thompson submachine gun – the gangster machine gun you’d see in The Untouchables or other gangster movies set in the 1920s or ’30s.
When Hanks would use his Thompson, it would sound like the most powerful gun ever (because historically, gangsters had killed lots of people with it since it fired fast, but the reality is that it’s firing a pistol round rather than a rifle round, which is significantly less powerful). So, in the context of Saving Private Ryan – and we also had this in Flags of Our Fathers – when somebody would be firing that gun, it needed to sound pretty powerful versus what it sounded like in real life. It doesn’t sound that impressive in real life; it was nicknamed “the Chicago typewriter” in popular war films. When I used it on Flags Of Our Fathers, we had one scene where a guy with a Tommy gun went behind a bunker and fired at a bunch of Japanese Army soldiers. Due to the “dramatic requirements” of both the history and the photography of the film, I ended up having to put a British Bren gun on top of it for dramatic punctuation. So I had the right recording of the Thompson, but then I layered this Bren gun, which was essentially a full-powered machine gun, in sync underneath, so that it would sound aesthetically consistent with the audience’s expectation of what that gun might sound like.
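Purely as an illustration of that layering idea, summing a heavier recording in sync under the “correct” one comes down to something like the sketch below. The file names, gain value and mono assumption are hypothetical, not anything from the film; in a real session the layering would of course be done in the DAW, not in code.

```python
import numpy as np
import soundfile as sf

# Hypothetical mono files, trimmed so their shot onsets line up.
thompson, sr = sf.read("thompson_burst.wav")
bren, sr_b = sf.read("bren_burst.wav")
assert sr == sr_b, "resample one file first if the sample rates differ"

n = min(len(thompson), len(bren))
layered = thompson[:n] + 0.7 * bren[:n]        # tuck the Bren under for weight
layered /= max(1.0, np.abs(layered).max())     # simple peak normalisation to avoid clipping
sf.write("thompson_layered.wav", layered, sr)
```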
We’re here because of BOOM Library, of which you’re a fan. How are they helping you in your work?
I certainly use their material often when I’m doing battle sequence stuff: their explosions, their gun recordings, and I also use some of their plugins, which I find very useful. They do great stuff, and they often have more processed versions of their recordings, which are quicker to drop into things. If I use my raw recordings, I usually have to use plugins and things to get them into a sound space that will work in a film or TV show. And they’re great people; I’ve known them for quite a number of years and I enjoy their work.
You mentioned the treatment, and obviously BOOM Library are very hot on the whole idea of making sure it’s recorded right in the first place. Does that help speed up the process for you?
It can. Basically, if you’re doing any sound design, it’s always a matter of selecting the material you think is going to be most appropriate for whatever the subject is, but that’s always measured against how much time you might have to be able to do that editorial work.
“I think this sort of active engagement with the way an editor or designer is making […] choices for their tracks is really something that AI assistance is going to have little impact in.”
Often, a pre-processed sound will give you the character you want, as opposed to taking a sound that might be a great recording but still needs work. I don’t tend to process my own sound effects recordings for my library to go back to on new projects, since I prefer sort of inventing new sounds for every project I work on, so they’re always starting from point zero and I’ll have to go and make design choices to make them more dramatic. If I want something that is unprocessed from their [BOOM Library’s] material, I have that available, and then I can basically go through their different iterations, and it’s a matter of getting it to what I feel happy with for that particular project. I think this sort of active engagement with the way an editor or designer is making these choices for their tracks is really something that AI assistance is going to have little impact in.
As to AI, I’ve had a bunch of conversations regarding its encroachment on our industry, or on creative arts in general, and the conclusion I’ve arrived at is that we’re focusing on how our workflows are threatened versus being able to steer the workflow into a satisfying outcome, in as quick a manner as we can. I think that drives a lot of this: we basically want to be able to finish a scene, in a manner we feel happy with, in the fastest possible way. In some parts of that process AI can be very helpful; in others, it might not be as helpful, especially if the aesthetic concept is abstract.
Talking of speed, do you find the Universal Category System (UCS) helpful in BOOM Library collections?
I think, ultimately, the UCS idea is most compelling for sound designers and editors entering new facilities, not knowing what those sound libraries might hold, and then being able to search that material in a manner they’re accustomed to, which is very handy. I record a lot of material, and I have a lot of back material from projects that I use, so all that UCS information is in my head. It does hold the kind of promise for sound libraries that the Dewey Decimal system of categorisation holds for print catalogues – for commercial publishers it is a near requirement, I think – but for those using their own systems it might have less utility, which is why I don’t prioritise UCS compatibility in my library as much: I have so much material that isn’t UCS encoded. I don’t know that there are any real metrics or studies on people not purchasing a library because it doesn’t have UCS, but at the same time I think a lot of people in games and film and video can tend to have a bit of an obsessive-compulsive demeanor. In terms of sound editorial librarian programs, I think the most commonly used are Soundminer and Soundly, and when people add material to those databases I think they like the comfort of knowing that all of the fields are filled out. They feel a higher degree of confidence, and if fields aren’t filled out, they feel they can’t easily access the material because that metadata is missing.
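As a rough illustration of why that naming consistency helps search tools, UCS file names follow a fixed underscore-delimited pattern (roughly CatID_FXName_CreatorID_SourceID; check the published UCS spec for the full rules), which makes them trivial to index even without a dedicated librarian program. The sketch below is a hypothetical Python parser with a made-up file name, not part of any real tool.

```python
from pathlib import Path

def parse_ucs_name(path):
    """Split a UCS-style file name (roughly CatID_FXName_CreatorID_SourceID)
    into its fields; returns None if the name doesn't fit the pattern."""
    parts = Path(path).stem.split("_")
    if len(parts) < 4:
        return None
    cat_id, fx_name, creator_id, source_id = parts[:4]
    return {"CatID": cat_id, "FXName": fx_name,
            "CreatorID": creator_id, "SourceID": source_id}

# Hypothetical file name, not from any real library:
print(parse_ucs_name("GUNAuto_Thompson Burst Close_JDoe_MyLib.wav"))
```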
So in terms of the BOOM Library plugins, are there any in particular you call on?
I’ve used quite a lot of them. I’ve used Turbine on SEAL Team, the TV show I’ve been working on for the last seven seasons. We have a recurring set that is like a C-17 military transport plane, so we have a lot of scenes where they’re flying someplace with dialogue happening, and I’ve used Turbine on multiple occasions to generate engine sounds for that which could vary and sound like a real airplane.
Onboard, aircraft for the most part don’t sound that interesting; they can be really noisy, but they tend not to have a lot of variation in their sound. I found Turbine wonderful for providing that kind of variation and performability when creating sounds for that particular thing.
I use Enforcer quite a lot, usually to add bass information to transient sounds and whatnot. You can even use it for transitional elements and stuff. Very handy. It adds a very full-spectrum kind of quality to a sound, and I find the way the interface is laid out very helpful. It’s not particularly arcane. There are some nice presets available and, again, for me it’s a matter of getting to an outcome as quickly as possible. I could use other tools for that, but that particular plugin is very nicely contained for adding special sauce to something in a quick manner.
I think it’s great to see how BOOM has expanded the scope of their product line, and I am always interested in anything they put out.