A Development Process Map of Game-Audio Implementation

Field and Foley recording should be done with the proper equipment, such as a portable DAT or digital recorder. (Digital audiotape is becoming slow and inefficient to use compared with a portable hard drive, which isn't affected by shaking or jarring and can record instantly.) You'll also need a good stereo microphone, a windscreen or two, and a large foam microphone cover. If you really want to go nuts, get yourself a parabolic dish to record far-off sounds more easily. Be sure to bring something to jot down your locations and the sounds you record. Slating (announcing) sounds helps too; otherwise you end up listening to your whole tape trying to decipher what was what. For more information on creating Foley sound effects, take a look at Practical Art of Motion Picture Sound, Second Edition (Focal Press, 2003).

At last, we're past preproduction! We're ready to begin hammering away at production and the nitty-gritty of the game itself.

Production

Production has fewer details to worry about than preproduction does, but that doesn't make it any easier. In the production phase, your nose is to the grindstone as you generate and implement content. Let's take another look at the DPM. We first observe that the three categories of sound--SFX, music, and voice-over--have different elements in their workflows (integration being the most common), and that the voice-over workflow looks very similar to that of a motion picture. Keep in mind that for each category of sound, we will cover the actual creation of the sound files before we move on to integration. This doesn't follow the DPM exactly; you'll see why when we reach integration. We begin with SFX.

SFX Production

First let's take a hypothetical example of creating a sound effect. The asset spreadsheet sits before you, boggling your mind with the thousands of sound effects that still need to be created for your game.
You decide that because you just saw a video file of a lightning bolt spell being cast, it'll be a good idea to tackle that sound effect next. One lightning bolt sounds the same as another, right? Well, maybe to a buffoon. You know better. Unique sounds can create emotion just as unique music can. You immediately go into your sound effects database, which has already been set up in an easy-to-use interface for you to find and grab files. You do a search for lightning and pile all the results into a raw materials directory. Thus you've started to create your palette. You decide to draw from other sources--say, sweetener sounds such as heavy wood and stone impacts.

Once you have your palette, it's time to work on a canvas. For some developers, this means turning to Sound Forge (the version from either Sony or Sonic Foundry), although managing dozens of files with this program can prove time-consuming. For others, it means using Steinberg's Wavelab; for still others, Bias Peak. Each of these editors has its own set of hallmarks.

An easier method is to employ a multitracker such as Digidesign Pro Tools or Steinberg's Cubase SX (see Figure 1.8) or Nuendo (see Figure 1.9). That way you can see all the elements of each sound (an element being an individual sound file in the multitrack project) in addition to the effects you can put on each element, all at once. Too much sweetener? Just take it down a few decibels instead of pressing Control-Z a lot and remixing in Sound Forge or Wavelab. Too much high end on one of the elements? Equalize it individually. Nothing could be simpler or more organized. If your boss is standing behind you saying, "Make it more badass," you can in mere seconds say, "Oh yeah? How's this?" I know you've wanted to do that for a while.

Check out Figure 1.8 to see Cubase SX's glory: You can perform multitrack editing, create sonic landscapes using MIDI, and control a great many parameters nondestructively.
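To make the nondestructive idea concrete, here is a minimal Python sketch of what a multitracker does under the hood: each element keeps its own decibel trim, and the mix is recomputed from the untouched source audio whenever a trim changes. Everything here is illustrative--the function names, the sine-tone stand-ins for the lightning crack and sweetener, and the trim values are all assumptions for the example, not any real DAW's API.

```python
import math

SAMPLE_RATE = 44100

def sine(freq_hz, seconds, amplitude=0.5):
    """Generate a mono test tone standing in for a sound element."""
    n = int(SAMPLE_RATE * seconds)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)
            for i in range(n)]

def db_to_gain(db):
    """Convert a decibel change to a linear gain factor."""
    return 10 ** (db / 20)

def mix(elements):
    """Sum elements -- (samples, trim_db) pairs -- applying each trim,
    then clip the result to the [-1.0, 1.0] sample range."""
    length = max(len(samples) for samples, _ in elements)
    out = [0.0] * length
    for samples, trim_db in elements:
        gain = db_to_gain(trim_db)
        for i, s in enumerate(samples):
            out[i] += s * gain
    return [max(-1.0, min(1.0, s)) for s in out]

# A lightning-crack stand-in plus a low sweetener, trimmed down 6 dB.
crack = sine(2200, 0.25)
sweetener = sine(80, 0.5)
mixed = mix([(crack, 0.0), (sweetener, -6.0)])

# "Too much sweetener?" Change one number and remix -- the source
# samples are never edited, so there's nothing to undo.
quieter = mix([(crack, 0.0), (sweetener, -12.0)])
```

The point of the sketch is the design choice, not the audio quality: because the trim is stored as metadata next to each element, backing off the sweetener is a one-value change rather than a destructive re-edit in a two-track editor.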