Here at Designing Music NOW, we like to bring you articles from a wide variety of perspectives: from seasoned veterans to up-and-comers, from those with long track records of success to students just embarking on promising careers as composers. Our new contributor, Larry Chang, is one such individual. He is about to graduate from the Entertainment Technology Center of Carnegie Mellon University, and he shares with us what he learned during his semester on the Emotionshop project.

As we embark on the 30th GDC, it's striking how drastically the video game landscape has changed over the years. I remember my first GDC over 20 years ago: large publishers dominated the show, and there was no Indie or VR pavilion. Indie companies like the ones I worked for lived and died by attaching themselves to those large publishers. There was no Steam, Kickstarter, iOS, or Android in sight. At DMN we embrace articles about all types of game music, from AAA to the explosive indie game scene.

If you are a student who has worked on exciting game projects, or you have an interesting experience from a game jam or indie project, we would love to hear from you! Please send an email to info@designingmusicnow.com and we will get back to you – probably after GDC!

Emotionshop is a semester-long project at Carnegie Mellon University’s Entertainment Technology Center with the goal of exploring how game mechanics evoke emotions. Our team of five created as many game prototypes as possible, with each person making one game in 7 to 9 days centered on an emotional theme. So far, we have made 28 games for 7 different emotions.

As the team’s sound designer and composer, I faced two main challenges. First, time pressure: I had to design sound for three to four games by the end of each emotional theme cycle. Second, project constraints: because the project experiments with how game mechanics evoke emotions, elaborate art assets, including audio, are not allowed. A well-designed piece of music can work like magic in guiding the audience toward an intended emotion, so I could only create audio that supports the experience without overpowering it. Here are some insights I learned about rapidly prototyping audio:

Lesson I: Design Critical Sound First

Due to the tight schedule, prioritizing work is important: tackle the sounds that are most critical to the game first. After all, rapid prototyping is about making things work in the first place. Critical sound files that directly affect gameplay (and these vary from game to game) should be done first, since they form the structure of the design and the experience won’t work without them.

In Forest Escape, a game that tries to evoke daring by putting players into a seemingly dangerous environment and pushing them to make bold decisions, I prioritized designing the monster roars, human screams, and timer ticks, since I considered these the core structural elements of the experience: players need to feel exposed in an unsafe environment where they could die if they don’t escape the forest in time. The soundscape immediately puts the audience into that situation. Even though I didn’t have time to polish every detail of the other sounds, the playtest results were still satisfying.


Lesson II: Regard Sound as Feedback for Games

Players need feedback for their actions in games, and that feedback can take many forms, including sound. Because the goal of Emotionshop is to use game mechanics to trigger emotions, approaching audio design as feedback reinforces the overall game experience rather than stealing its thunder.

Take Press SHIFT to taunt, a two-player typing game in which players share one keyboard and try to beat their opponent by typing more words and earning more points. At one moment in the game, the winning player can taunt the other by forcing them to type an embarrassing sentence. When designing the sound, I figured the core mechanics were competition and taunting. I therefore created two different gunshot sounds, one for each player, to define their in-game relationship: enemies fighting each other. Whenever a player completes a word, the feedback (a gunshot) makes them feel like they are landing an attack. For the taunt, when the winning player presses the taunt key, a parrot laugh plays, making the command even more humiliating. In both cases the sound feedback strengthens the game design, thereby enhancing the experience for the players.
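To make the idea concrete, here is a minimal sketch of this kind of event-to-sound mapping in Python. It is purely illustrative: the play_sound() placeholder, the event names, and the clip names are hypothetical stand-ins for whatever your engine or framework actually provides.

```python
def play_sound(clip_name: str) -> None:
    # Placeholder: route this call to your engine's audio playback.
    print(f"[audio] playing {clip_name}")

# Each player gets a distinct cue, so the sound itself says who just attacked.
FEEDBACK_CLIPS = {
    "player1_word_complete": "gunshot_revolver",
    "player2_word_complete": "gunshot_shotgun",
    "taunt_triggered": "parrot_laugh",
}

def on_game_event(event: str) -> None:
    """Map a gameplay event directly to its audio feedback."""
    clip = FEEDBACK_CLIPS.get(event)
    if clip:
        play_sound(clip)

# Example usage during a round:
on_game_event("player1_word_complete")  # attack feedback for player 1
on_game_event("taunt_triggered")        # humiliation cue for the taunt
```

The point of the structure is that the audio hangs directly off gameplay events, so every cue the player hears is tied to something they just did.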

Lesson III: Simplicity Helps

For rapid prototyping, efficiency matters. Writing a fully orchestrated piece for each game is impossible; it simply takes more time than a rapid prototyping schedule allows. But what makes the experience good is never how polished the music sounds; it is how effective the music is for the game. A simple musical structure, built on a clear understanding of the core design, takes far less time to produce and is just as effective.

In Mr. Bluehat and His Dreams, I designed the music as part of the game mechanics, fluctuating according to the player’s actions. The chord progression contains only two chords, which alternate. The design is really simple, but it works like magic: the two chords imitate human inhaling and exhaling, and according to our playtest observations, players synchronized themselves with the music and reached the anticipated emotion, in this case serenity.
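Here is a minimal sketch of that two-chord “breathing” loop, again in Python and purely illustrative. The play_clip() placeholder, the clip names, and the intensity-to-duration mapping are my own assumptions; in the actual game this logic lived in the engine’s audio system.

```python
import itertools
import time

def play_clip(clip_name: str) -> None:
    # Placeholder for the engine's audio playback call.
    print(f"[audio] {clip_name}")

def breathing_loop(get_player_intensity, beats: int = 8) -> None:
    """Alternate an 'inhale' chord and an 'exhale' chord.

    get_player_intensity: callable returning 0.0-1.0 (e.g. how frantically
    the player is acting); calmer play stretches each chord out longer.
    """
    chords = itertools.cycle(["chord_inhale", "chord_exhale"])
    for _ in range(beats):
        play_clip(next(chords))
        intensity = max(0.0, min(1.0, get_player_intensity()))
        # Hold each chord between 1.5 s (frantic) and 3.0 s (calm).
        time.sleep(3.0 - 1.5 * intensity)

# Example: a stand-in for real player input, representing calm play.
breathing_loop(get_player_intensity=lambda: 0.2)
```

Because only two chords alternate, the loop is trivial to produce, yet tying its pacing to the player’s behavior is enough to make the music feel responsive.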


Lesson IV: Leaving Silence Is Also Part of Audio Design

Some people might think audio design is only about creating sound, but I believe it is about looking at the whole experience and designing the overall listening experience, with or without audio. Sometimes the experience is better without any sound at all, and recognizing that is still part of audio design, because the audience’s listening experience has been considered.

For several of the prototypes, I left the audio empty and placed the audience in an environment where they needed to listen to themselves and take a moment to reflect while playing. Tan-Dot Jr., a game about school bullying and discrimination, is a good example. The experience asks players to listen to the voice deep in their hearts and think about what they encounter during gameplay. I believed any added sound would become noise that distracts from that experience.

Final Thoughts

In my experience, audio design for rapid prototyping is very different from audio design for longer projects. It has to be fast and precise. This philosophy improved my audio production process and enabled me to finish designing audio for 28 games within 16 weeks. Ultimately, audio designers should stick to the number-one principle: serve the core design and its values. After all, all we are trying to do is strengthen the message the design wants to convey to the audience. Be supportive, and make the experience shine.

About the Author

Yu-Cheng (Larry) Chang is a game and film composer. He began his career in 2012 scoring film, multimedia, and theatre productions in Taiwan. He is dedicated to combining Eastern and Western music and introducing a different listening experience to audiences. He is currently working on various projects at the Entertainment Technology Center of Carnegie Mellon University, exploring different ways of incorporating audio into interactive environments.
