Postproduction Sound

After all that work designing and recording sound, you now need to put it together into a cohesive final product. Welcome to the audio postproduction phase! Basically, you must improve what you already have, add more where you need it, and stitch it all neatly together. In audio, however, “post” doesn’t refer only to editing and mixing. Rather, the post process also includes creation or acquisition of any additional elements not already in your possession. This means recording narration, if needed; using the ADR process for additional, revised, or redone dialogue (looping); acquiring and manipulating prerecorded or library music or effects; creating and recording Foley effects on a specially configured stage; and recording music.

Mixing Consoles


The mixing console, or mixing board, is the tool used to bring in different sound sources and, as the name implies, mix them together, controlling different channels and frequencies to produce your desired blend. Whether you have the resources and ability to use a powerful stand-alone mixing console or are doing the mix in Pro Tools or some other software platform, here are some basic ideas to understand about functionality:

  • The board will accept numerous input channels of audio.
  • Each channel is controlled with a channel strip that uses a fader (level controller) to adjust volume by moving the control up or down.
  • Channels can be combined into buses, which are essentially networks for uniting two or more channels and then sending them somewhere else. Two of these are the mix bus, which is the board’s main output, and the monitor bus, which is the signal that goes to the monitor speakers.
  • You will be able to send whichever channels you want to include to your master mix. There is usually a control called a master fader, which you will use to control the levels of all combined channels.
  • As you combine various channels into your final mix, you will be able to use controls on the board to adjust gain, equalization, panning from left channel to right and back again, muting, the fading of some channels in favor of others, the addition of audio through microphone input, and more.

Even if you cannot afford to use a high-end professional mixing board, you can learn the ins and outs of a basic mixing console through affordable software equivalents and online manuals.
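
To make the fader, pan, and bus ideas concrete, here is a minimal sketch of a software mixer in Python, assuming NumPy; the Channel and mix_bus names are illustrative and not drawn from any particular console or plug-in.

```python
import numpy as np

class Channel:
    """One channel strip: a fader (gain in dB) and a pan position."""
    def __init__(self, samples, gain_db=0.0, pan=0.0):
        self.samples = samples      # mono audio as a float array
        self.gain_db = gain_db      # fader position in decibels
        self.pan = pan              # -1.0 = hard left, +1.0 = hard right

    def to_stereo(self):
        gain = 10 ** (self.gain_db / 20)              # dB -> linear gain
        left = np.cos((self.pan + 1) * np.pi / 4)     # equal-power pan law
        right = np.sin((self.pan + 1) * np.pi / 4)
        return np.stack([self.samples * gain * left,
                         self.samples * gain * right], axis=1)

def mix_bus(channels, master_db=0.0):
    """Sum every channel strip onto a stereo mix bus, then apply the master fader."""
    stereo = sum(ch.to_stereo() for ch in channels)
    return stereo * 10 ** (master_db / 20)

# Two channels: dialogue panned center, an effect panned slightly right and pulled down 6 dB
dialogue = Channel(np.random.randn(48000) * 0.1, gain_db=0.0, pan=0.0)
effect = Channel(np.random.randn(48000) * 0.1, gain_db=-6.0, pan=0.3)
final = mix_bus([dialogue, effect], master_db=-3.0)
```

The equal-power pan law is used here so that a sound panned to the center does not sound louder than one panned hard to either side.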

Editing and the final mix will be your last audio tasks, once all elements and a locked picture are in hand. Essentially, the sound editing process can be thought of as finalizing individual audio elements in a coherent way and combining those elements in each category—dialogue, effects, and music. Mixing can be thought of as the process of doing the final manipulation, enhancement, balancing, and control of the assembled audio track and all its elements, down to characteristics of the audio signal itself, to create a finished product that can be matched to the picture during the final mastering process.

SOFTWARE OPTIONS

There is a plethora of editing and mixing tools ranging across the consumer and prosumer space—technologies like Apple’s GarageBand, Tascam’s Portastudio, Adobe’s Audition, and even various freeware and shareware.

Typically, the sound designer or sound supervisor will supervise editing and mastering on major projects, but the go-to person in post will be the sound editor, with technical help from sound engineers. On large projects, there is usually a separate dialogue editor, effects editor, and music editor. These individuals conduct spotting sessions—meetings intended to make final decisions about the needs and specific placements of sound effects and music, and to figure out what dialogue manipulation, if any, will be required. As students, however, you will likely learn and perform all these functions yourself.

For doing such work, you will need some kind of digital audio workstation (DAW). A DAW can range from an ultra-expensive professional mixing console to apps for your personal tablet. In fact, on smaller projects, the power of affordable nonlinear editing systems is now so impressive that some filmmakers do sound editing and mixing on the same editing platform they use to cut the picture—most frequently some version of Avid’s, Apple’s, or Adobe’s NLE platforms (see Chapters 11 and 12). When feasible, however, particularly if your mix will contain numerous complicated elements, you may prefer to use a stand-alone audio editing/mixing DAW of some type, with the industry standard at all levels being Avid’s Pro Tools, a powerful platform that allows you to edit, mix, and perform other functions as well (see Tech Talk: Mixing Consoles, above). As you segue into editing and mixing, no matter what tool you use, make sure you have addressed some basic considerations (see Action Steps: Prepping for Editing and Mixing, below). In the sections that follow, we cover the fundamentals behind these stages in the post chain.

ACTION STEPS

Prepping for Editing and Mixing

Before you actually begin editing and mixing your film, there are some important questions to ask yourself:

  1. How will you be transferring sound from the capture medium to your editing environment? Whether you have already imported it from your shooting medium to your Avid, or you will design another path for your data to travel into your editing system, do not process or manipulate the signal in any way as you make the transfer. Import what you originally recorded, back it up, and worry about manipulation once you begin editing and mixing.
  2. Do you have a proper environment in which to work, with quality speakers or headphones to monitor the work? A good listening environment is crucial to attaining a great final mix.
  3. Do you have some kind of robust mixing-console technology—a software system or stand-alone mixing board into which you can input or transfer other audio sources, either as imported files or through hardware connections?
  4. What method will you use for evaluating data to make sure it is compatible, at the right bit depths and formats, as mentioned in this chapter? (One simple approach is sketched just after this list.)
  5. What is your goal for the final product? How many audio channels do you want to end up with? In what format or viewing platform(s) will your movie be distributed? Its final destination will directly impact how you mix the movie.
  6. Likewise, what is your plan for organizing original elements so that they can be reaccessed and remixed if a new version of your film, such as a foreign-language version, needs to be created?
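
As one way to answer question 4, here is a minimal sketch that inspects WAV files with Python’s standard wave module; the 48 kHz / 24-bit targets and the audio_dailies folder name are assumptions to adjust for your own workflow.

```python
import wave
from pathlib import Path

TARGET_RATE = 48000   # assumed project sample rate in Hz
TARGET_BYTES = 3      # 3 bytes per sample = 24-bit (assumed project bit depth)

def check_wav(path):
    """Report whether a WAV file matches the project's sample rate and bit depth."""
    with wave.open(str(path), "rb") as wf:
        rate = wf.getframerate()
        width = wf.getsampwidth()
        channels = wf.getnchannels()
    ok = rate == TARGET_RATE and width == TARGET_BYTES
    return ok, f"{path.name}: {rate} Hz, {width * 8}-bit, {channels} channel(s)"

# "audio_dailies" is a hypothetical folder of production sound files
for wav_file in sorted(Path("audio_dailies").glob("*.wav")):
    ok, info = check_wav(wav_file)
    print(("OK   " if ok else "FIX  ") + info)
```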

Dialogue Editing

Dialogue is the most important aspect of a soundtrack for the simple reason that it’s the primary tool for telling the audience what characters are doing, thinking, and feeling. Without clear dialogue, everything you are trying to do creatively will be compromised. The goal is pristine dialogue, which leaves you with fewer problems and more options.

That said, the simple reality is that production recordings of dialogue frequently end up less than pristine for a variety of reasons. The job of the dialogue editor is to fix dialogue sections that need help. Thus, the process of editing dialogue tracks is nuanced; you need to clean up every piece of dialogue as much as it requires—removing extraneous noise, lowering or removing room tone, improving levels, or adjusting unclear words—but you need to do it in such a way that the quality of the performance is preserved or even enhanced. And, of course, you must make sure dialogue synchronizes with picture properly.
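
As one small illustration of the “improving levels” part of that cleanup, here is a minimal sketch of peak normalization on a dialogue clip, assuming NumPy, the third-party soundfile library, and hypothetical file names; real dialogue work is done clip by clip in the DAW, and this only shows the underlying idea.

```python
import numpy as np
import soundfile as sf  # third-party library for reading and writing audio files

def normalize_peak(in_path, out_path, target_dbfs=-6.0):
    """Raise or lower a clip so its loudest peak sits at target_dbfs."""
    audio, sr = sf.read(in_path)
    peak = np.max(np.abs(audio))
    if peak == 0:
        raise ValueError("Silent file; nothing to normalize.")
    target_linear = 10 ** (target_dbfs / 20)
    sf.write(out_path, audio * (target_linear / peak), sr)

# Hypothetical file names for illustration
normalize_peak("scene12_take3_dialogue.wav", "scene12_take3_dialogue_norm.wav")
```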

The ways to do this are numerous, depending on your technology, skill level, and creativity:

CONSISTENT MICS

Try to mic and record characters in the same conversation the same way—using the same type of mic positioned the same way, the same recording levels, and so on. Otherwise, you may need to put each actor’s voice on a different dialogue track, then put them back together after some cleanup. If basic levels and sound quality appear the same on your instruments, you can keep both voices on the same track, and the end result will likely sound more organic.

ADR

The automated dialogue replacement (ADR) process will be scheduled after you have cued what pieces of dialogue need to be replaced. ADR is frequently called looping because, in an earlier era, actors would speak lines to match how they moved in particular scenes that played repeatedly in a “loop” in the studio.

Today the process is less cumbersome, although usually actors still watch their images on a screen and listen to original production dialogue on headphones. The goal is to redo a line while matching, as much as possible, the original movement of the lips.

There can be problems with ADR, and so you should carefully evaluate when you really need it. Some filmmakers believe it’s better to live with a modest flaw in the recording quality of a clip if they have the performance they wanted and can advance their story agenda with the production track. When using ADR, keep in mind that some actors have a hard time synchronizing their lines with what they did originally. Indeed, legendary sound editor Randy Thom has suggested that the mere attempt to have an actor re-create the energy and ambiance of their on-set performance will sometimes come up short.7 Still, ADR is used often, and when it is, it is typically recorded in a specialized studio where clips are played back for the actor.

Frequently, the ADR team will also be responsible for recording background crowd chatter, known in the industry as walla (because it used to be common for extras playing crowd members to mutter the made-up word “walla” during recording). Because the goal is to record the murmur of crowd noise, no specific words are required.

In any case, the dialogue editor will make sure that any ADR or walla elements are properly recorded and timed to synchronize with their corresponding picture elements.

Sound Effects Editing

From a technical standpoint, the process of editing sound effects is similar to that of editing dialogue. You will be using many of the same tools and will be involved in splitting effects tracks, removing unwanted elements and combining others, scrubbing, summing, solving problems, and so on.

LOG SOUNDS

Make your daily life a recording adventure. Take your recording gear with you wherever you go—you may find interesting sounds you will use one day. Keep the sounds and log them in a meaningful database; some of them will come in handy when editing sound effects.

The difference is mainly creative and philosophical. With dialogue, you are generally trying to maintain, preserve, and improve an element generated by someone else—an actor—on-set. With sound effects, you are trying to create the final elements yourself, using raw elements you have already acquired or will find or create through libraries and the Foley process, among other methods.

Eventually, you will manipulate and process sound effects in different ways, with the express purpose of furthering the story. Even mundane sounds, when crafted properly and mixed into the final soundtrack, can have a profound impact on storytelling. “A bird, for example, can be made to make sounds in direct connection to something going on between two romantic characters,” Lon Bender suggests.

As such, the sound effects editor’s work may well begin before the dialogue editor’s work; based on the sound-design plan, he or she may have been collecting elements since the production’s earliest days to create specific effects, like a ship’s engine. Many top editors, in fact, have extensive databases of material they have been collecting for years. Part of the job involves knowing how to efficiently create or find something the track needs.

The job also requires smart data management. With effects, there can be reams of elements involved in every scene. To put it all together and hand it off for the final mix, you need protocols for knowing where everything is, including logical file-naming conventions. Give every sound element a proper name that is logical to search for. “Loud crash,” for instance, is not as specific, and thus not as helpful, as “loud crash when statue falls from roof.” You can then insert these sounds into your database in more broadly named folders. Thus, you might have an “explosion sounds” folder, a “zoo animals” folder, and so on.

For each file, log all corresponding information, or metadata, of note. Metadata can tell you how big the file is, where and at what time it was recorded, who recorded it, the machine that was used, what scene it was intended for, if it was from a library, if it needs to be licensed or if some other permission needs to be procured, and so on.
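
Here is a minimal sketch of what such a log might look like, assuming Python and a simple JSON file as the database; every field name and value is illustrative rather than an industry standard.

```python
import json
from pathlib import Path

# One metadata record per sound element (every field name here is illustrative)
record = {
    "file": "loud_crash_statue_falls_from_roof.wav",
    "folder": "explosion sounds",
    "recorded_by": "effects editor",
    "recorded_at": "2024-03-14T10:22:00",
    "recorder": "portable field recorder",
    "intended_scene": "Scene 12 - rooftop chase",
    "from_library": False,
    "license_required": False,
    "notes": "Recorded on location; trim the first two seconds of handling noise.",
}

# Append the record to a simple JSON database of every logged effect
db_path = Path("sfx_database.json")
db = json.loads(db_path.read_text()) if db_path.exists() else []
db.append(record)
db_path.write_text(json.dumps(db, indent=2))

# Search the database later, e.g. for every effect intended for Scene 12
hits = [r["file"] for r in db if "Scene 12" in r.get("intended_scene", "")]
print(hits)
```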

Foley

The Foley process refers to the artistic manufacturing of sound effects that directly match action within the picture. Unlike other areas of filmmaking, Foley has not radically changed in recent years. Yes, digital recorders are different from recorders of the analog era, but at its core, Foley is created as it has been since an artist named Jack Foley invented the technique in the early days of sound.

Foley does not take place until the film is locked, the point when no further changes in the imagery will be made and you know specifically which kinds of effects you need for which scenes. It involves recording trained artists, called Foley walkers, as they “perform” effects in a choreographed way with the moving-picture image, using special props on a special stage. A Foley mixer is usually the person recording the sounds.

A typical Foley stage can look like a junkyard and is often decidedly low tech outside of the recording equipment. This is because the most believable sounds are “real” sounds. Therefore, Foley artists connect real-world props with real-world surfaces in particular ways to create certain sounds. The stage floor may be divided into sections, each with a different surface or texture that the artist will use at various times. There could well be a wood-floor section, gravel or sand, carpeting, metal—even a pool of water. They may have coconuts to bang together to emulate the sound of horses, or boxes of Corn Flakes to dump on the floor to simulate the sound of a person walking over a crunching surface. They may have rubber and brick walls to throw things against.

When you hear a character in a movie walking in boots over crunchy snow, that sound was almost certainly created on a Foley stage. It’s a creative and fun technique, but quite difficult to perform because artists have to closely synchronize their movements with what is happening on-screen. If you do it right, though, it can add ultrarealistic sounds to your movie.

Music Editing

In essence, music is entirely a postproduction enterprise, even though the design, potential writing, recording, or collection of music may well begin in a project’s earliest phases. That’s because the creation and manipulation of music to fit the movie’s beats is done separately from filming and cannot fully come together until it can be synced to an edited picture. We have already discussed the design and planning approach to music. In Chapter 12, we delve further into how to use music creatively to drive your emotional agenda. For now, let’s take a look at how to record and edit music.

During the editorial process, you may have been trying out different kinds of temp music tracks to get a feel for what kind of music works with the material. Once you have some idea what you are looking for, decide whether you want music recorded live, manufactured using digital tools, or imported from preexisting sources. You will likely engage in a search for existing material or ways to manufacture music from digital sources, such as some of the music-creation software packages now available that offer simple melodies and sometimes allow you to purchase loops of prerecorded music bits. For anything that needs to be scored from scratch, the composer will take charge of writing it, working closely with the director, and the general direction of the score will develop from there.

During spotting sessions, cue sheets or timing sheets will be generated, showing every location where a musical cue will be required and how long it should be. The cues will be numbered for each reel of the film and defined as simple score material, transitions, bridges to take the audience out of one scene and into another, or stingers—short collections of musical notes designed to emphasize something specific going on. This will be a stage of experimentation, debates, and changing minds. Filmmakers will try different tempos, beats, volumes, and styles over the same sequence to see which one best fits the material. Eventually, a score emerges.
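
A cue sheet can be represented as nothing more than a structured list. Below is a minimal sketch in Python, using the cue categories described above; the reel numbers, timecodes, and notes are invented for illustration.

```python
# A hypothetical cue sheet: one entry per musical cue, numbered per reel
cue_sheet = [
    {"cue": "1M1", "reel": 1, "in": "00:01:12:05", "out": "00:02:40:10",
     "type": "score",   "note": "Main title; establishes tone"},
    {"cue": "1M2", "reel": 1, "in": "00:08:03:12", "out": "00:08:18:00",
     "type": "bridge",  "note": "Carries the audience from apartment to street"},
    {"cue": "2M1", "reel": 2, "in": "00:14:55:00", "out": "00:14:57:08",
     "type": "stinger", "note": "Hit when the letter is discovered"},
]

for cue in cue_sheet:
    print(f'{cue["cue"]} ({cue["type"]}): {cue["in"]}-{cue["out"]}  {cue["note"]}')
```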

USE YOUR SCHOOL

Make sure to check out your school’s music department. You will often find a wealth of resources, from orchestra to composer to facilities, with fellow students all too eager to help with your project.

Major projects then take the finest musicians available to a state-of-the-art scoring stage and record these pieces of music. However, assuming you can’t afford a full orchestra or studio scoring stage, you can record music any number of ways with whoever serves as your music editor and music mixer, and with as many mics strategically placed as resources and skill level allow. Generally, record multiple instruments onto multiple tracks, or discrete audio channels, to permit richer mixing options later.

Eventually, you will deliver cues to your music editor, preferably with timecode embedded in the track or with a separate log of the timecode, to more accurately help you sew pieces into their intended locations. Using Pro Tools or some other platform, the editor then cuts in the music according to your finely crafted road map. Like any good editor, he or she will make adjustments, such as volume level and cue positioning to better match the pacing of the cut; some material will get dropped, and other material will be manufactured and woven in; and so on.

During the process, you will have dozens of techniques available to you for bringing music in and out of sequences, including fading (gradually increasing or decreasing the sound of music over a scene), cross fading (transitioning out of one cue by bringing it down and into another by simultaneously bringing it up), filtering cues (increasing higher frequencies while limiting lower ones), and adding effects.
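
To show what a cross fade actually does to the audio data, here is a minimal sketch in Python with NumPy, assuming two mono cues at the same sample rate; your DAW performs the same arithmetic when you drag a cross-fade across two clips.

```python
import numpy as np

def cross_fade(cue_out, cue_in, sr, fade_seconds=2.0):
    """Bring cue_out down while simultaneously bringing cue_in up, then join them."""
    n = int(sr * fade_seconds)
    fade_down = np.linspace(1.0, 0.0, n)   # gain ramp applied to the outgoing cue
    fade_up = np.linspace(0.0, 1.0, n)     # gain ramp applied to the incoming cue
    overlap = cue_out[-n:] * fade_down + cue_in[:n] * fade_up
    return np.concatenate([cue_out[:-n], overlap, cue_in[n:]])

# Two hypothetical mono cues (silence stands in for real music here)
sr = 48000
outgoing = np.zeros(sr * 10)   # 10-second cue ending the scene
incoming = np.zeros(sr * 8)    # 8-second cue starting the next scene
joined = cross_fade(outgoing, incoming, sr)
```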

Art of the Mix

The industry’s digital evolution has provided the opportunity for you to be far along with the mix by the time you officially get to the final mix stage—the place where all audio elements will be locked and combined into a final soundtrack. Some professionals now combine editing and mixing chores themselves, since workflow and technology paradigms have changed. That is likely to be the case on your student project. However, remember that the mix is a final pass. Your operating agenda is to be as efficient as possible while achieving creative goals. After all, mixing is a unique process because it is at once a creative art and a technically complex endeavor.

At some point, you will need to split all tracks by category, sometimes referred to as stems, in preparation for the mix. Usually the editor generates a group of tracks for dialogue, a group of tracks for sound effects, and a group of tracks for music. He or she will lay the tracks out, or order them, in a visual pattern on a computer screen, which the mixer can then use as a map through Pro Tools or another system.

Once that is done, the mix begins. Generally speaking, this will be the final step—not only in the audio chain but also in the movie’s production. It could be either a long, laborious process, or the strategic tweaking of a soundtrack that’s nearly done. Either way, you need to understand the fundamentals of mixing.

At the core level, you must balance, or mix, everything together properly. This is where you make creative choices on what sounds to emphasize or deemphasize as you bring all sounds together with finished visuals. To do this, play back all edited tracks synchronized to your picture, and while you watch and listen, the sound mixer (or rerecording mixer) will adjust the signal, frequencies, and effects as needed. He or she will establish final levels in the mix, put the soundtrack through the EQ process (adjusting frequencies as needed), and filter the sound. Filtering is really a step down from the EQ process—limiting or cutting out some frequencies altogether if they introduce unwanted noise into the audio signal. Then, the mixer will finalize or process any special sound effects, such as reverb.
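
As a small illustration of filtering, here is a minimal sketch of a high-pass filter that removes low-frequency rumble, assuming Python with the third-party SciPy and soundfile libraries; the 80 Hz corner frequency and file names are assumptions, and in practice you would reach for your mixer’s EQ section or a plug-in.

```python
import soundfile as sf                      # third-party: reads and writes audio files
from scipy.signal import butter, sosfilt    # third-party: filter design and application

def high_pass(in_path, out_path, corner_hz=80, order=4):
    """Cut rumble below corner_hz from a track (e.g., traffic or air-conditioning noise)."""
    audio, sr = sf.read(in_path)
    sos = butter(order, corner_hz, btype="highpass", fs=sr, output="sos")
    sf.write(out_path, sosfilt(sos, audio, axis=0), sr)

# Hypothetical file names for illustration
high_pass("dialogue_premix.wav", "dialogue_premix_rumble_cut.wav")
```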

Because you may wind up with dozens or even hundreds of tracks, many projects include a premix. This allows filmmakers to combine groups of effects or dialogue tracks in advance, so that these larger, already combined groups are easier to unite with the music and remaining dialogue in the final mix.

At the end of the mix, you will generate what are called deliverables—finalized, locked pieces of the audio track that, when combined, become the complete soundtrack. On a studio film, there are often legal requirements regarding what the deliverables must be, and it is typically the responsibility of the postproduction supervisor to make sure these requirements are met. At the student level, those deliverables may be defined by your professor or department and can vary depending on where and how you will be exhibiting your film, what your resources are, and who will be helping you create the final master: the pristine final version of your movie—either a film negative or a secure digital file from which all other versions will be produced. At a minimum, they will include the following:

  1. The primary mix, which is the output from your mixer that can be attached directly to the locked picture. What format this will take depends on the workflow you are using and the mastering method you have chosen.
  2. Mix stems, which are the broken-down individual pieces of the mix—dialogue, effects, and music. These stems are all put together in the primary mix, but you also want to supply them as separate pieces in case there are mastering issues to resolve, and also to use for trailers and other promotional clips from the movie.
  3. Compressed files, just in case you will later produce a version of the mix for streaming or another presentation at a lower bandwidth than the original (one way to generate such a file is sketched after this list).
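
One common way to produce such a compressed file is to encode the final mix with the ffmpeg command-line tool; below is a minimal sketch that calls it from Python, assuming ffmpeg is installed and using hypothetical file names.

```python
import subprocess

def make_streaming_copy(master_wav, out_m4a, bitrate="192k"):
    """Encode a compressed AAC copy of the final mix for low-bandwidth delivery.

    Assumes the ffmpeg command-line tool is installed and on the PATH.
    """
    subprocess.run(
        ["ffmpeg", "-y", "-i", master_wav, "-c:a", "aac", "-b:a", bitrate, out_m4a],
        check=True,
    )

# Hypothetical file names for illustration
make_streaming_copy("final_mix_48k_24bit.wav", "final_mix_streaming.m4a")
```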

DOING THE FOLEY

Find a garage, basement, or quiet room and bring the best recording equipment you can find, along with a volume-off clip from any film you choose. Bring props that will help you make the sounds required in the clip you have selected.

If you have no big screen, play the clip over and over on your laptop or tablet, and practice making the sounds of the person or object in the clip. Experiment with recording yourself providing the required sound effect in sync with the picture. See how long it takes for you to master the concept, and make note of what you did right and what you did wrong.

There are other delivery possibilities as well. For example, if you are finishing the movie on film, you will need a separate soundtrack element, called the printmaster, to marry with a master image print. But with digital mastering and exhibition taking over, it’s unlikely you will deal with this process as a student.

Regardless, here are a few final tips to keep in mind as you dive into the mix:

  1. Understand your project’s intended destination—where you exhibit the movie will impact how many channels you mix. You will have more dynamic-range options if it will be a theatrical release; for broadcast or webcast, you will have far fewer. Dynamic range is essentially the ratio between the softest sound and the loudest sound—you normally can’t go too high or too low. But the better/louder your exhibition format, as in a theatrical presentation, the greater the range between the two that you can experiment with. You may also need different versions of the film for different destinations, and thus you may need to mix the movie more than once.
  2. Compressors and limiters are often used in the final mix, as they are in production recording. They are, in fact, one way to reduce dynamic range. If used correctly, the compressor will limit a signal’s peaks to avoid crossing certain aural thresholds. These are valuable tools and worth your while to learn (a minimal sketch of the idea follows this list).
  3. Background or room noise may continue to cause problems, but there are many software solutions and apps now available to help editors and mixers further reduce signal noise. In fact, a range of filters, reverb tools, plug-ins, and other sound-effect technologies are now readily available as apps. Learn how they can help your mix.
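
To make tip 2 concrete, here is a minimal sketch of a simple compressor and a hard limiter applied to an array of samples, assuming NumPy and ignoring the attack and release smoothing that real dynamics processors add; it only illustrates how gain reduction tames peaks.

```python
import numpy as np

def limit(samples, ceiling_db=-1.0):
    """Hard limiter: never let the signal cross the ceiling."""
    ceiling = 10 ** (ceiling_db / 20)
    return np.clip(samples, -ceiling, ceiling)

def compress(samples, threshold_db=-18.0, ratio=4.0):
    """Above the threshold, reduce gain by the given ratio (no attack/release smoothing)."""
    eps = 1e-12                                         # avoids log of zero on silent samples
    level_db = 20 * np.log10(np.abs(samples) + eps)
    over_db = np.maximum(level_db - threshold_db, 0.0)  # how far each sample is over the threshold
    gain_db = -over_db * (1.0 - 1.0 / ratio)            # gain reduction that tames the peaks
    return samples * 10 ** (gain_db / 20)

# A hypothetical over-hot mix bus: compress the peaks, then apply a safety limiter
mix = np.random.randn(48000) * 0.5
safe = limit(compress(mix), ceiling_db=-1.0)
```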

Sound Pro’s Emergency Kit


  • Basic toolbox, not only for sound equipment but also because you may need to fix or manipulate props. Therefore, pliers, side cutters, screwdrivers, hammer, gloves, and soldering iron are essential.
  • Electrical tape and hairpiece tape (available at beauty stores) for attaching lavalier mics to clothing
  • Walkie-talkies
  • Extra headphones, recording media, cables, adapters, batteries, and flashlight
  • Pillows, pads, and blankets for muffling sound