Wilbert Roget, II is a veteran composer in the video game industry. In 2008 he joined LucasArts as a staff composer, where he scored several games in the Star Wars universe, and he has been composing for video games ever since. Recently, Designing Music Now was able to interview him about his score for Mortal Kombat 11 to gain some insight into his composing process and his career.
Wilbert’s previous games span a diverse array of projects, including Call of Duty: WWII, Star Wars: Vader Immortal, Lara Croft and the Temple of Osiris, Star Wars: The Old Republic, Destiny 2: Forsaken, Guild Wars 2: Path of Fire, Anew: The Distant Light, and other indie and AAA titles.
Wilbert is also the keynote speaker at this year’s GameSoundCon in Los Angeles, Oct. 29-30, 2019.
What was the calendar for Mortal Kombat 11 from when you first learned about the project, through to the hiring process, on to composition, recording, mixing, and to release?
I first contacted the audio director, Rich Carle, sometime in early March 2018. I was impressed by the audio in Mortal Kombat 9 and Mortal Kombat X, and wanted to have a chat at the upcoming Game Developers Conference. It turned out that not only was he in the midst of a search for a composer on their next game, but he was already quite familiar with my work and had even attended a lecture I gave at a previous GDC! So we quickly moved on to the audition process, wherein he gave me a PDF outlining the Mortal Kombat 11 storyline and explained what they were looking for in a spec demo. I wrote the track, and included a behind-the-scenes video showing some of the soloist musicians I hired to perform it – apparently, they liked it enough that it became note-for-note the main theme of the game!
Our first spotting session began at the beginning of August, wherein the audio director and the cinematics director (Marty Stoltz) walked me through the game’s first few cutscenes. We carefully planned the Story Mode score so that the cinematic music would segue seamlessly into the stages’ music, with transitions handled via Wwise playlists. I wrote and produced each Story Mode cutscene cue, recording live instrumental soloists as I went along rather than at the end of the project. I’d then send off stereo mixes for approval, make revisions where necessary, and deliver final versions in stems. This pattern continued for the rest of the cutscenes in our 3-hour Story Mode until our final deadline in February. Five other composers (Rich Carle, Dan Forden, Nathan Grigg, Armin Haas, and Mattias Wolf) collaborated to write interactive music suites for the in-game stages. And lastly, the team at Sweet Justice Sound handled sound design and final mix for the cutscenes, similarly working on each one continuously throughout the project.
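The cutscene-to-stage segue described above can be modeled as a simple playlist: a linear cutscene cue plays once and hands off to a looping stage cue. This is a toy sketch only — all class and cue names are hypothetical, and in the actual game this behavior lives in Wwise playlist containers, not custom code.

```python
class MusicCue:
    """A single piece of music; stage cues loop, cutscene cues don't."""
    def __init__(self, name, length_s, loop=False):
        self.name = name
        self.length_s = length_s
        self.loop = loop

class SeguePlaylist:
    """Plays cues in order; a looping cue holds until gameplay advances it."""
    def __init__(self, cues):
        self.cues = list(cues)
        self.index = 0

    @property
    def current(self):
        return self.cues[self.index]

    def on_cue_end(self):
        # Non-looping cues (cutscene music) segue into the next entry;
        # looping cues (stage music) simply repeat.
        if not self.current.loop and self.index + 1 < len(self.cues):
            self.index += 1
        return self.current

# Hypothetical chapter: one cinematic cue segues into looping stage music.
playlist = SeguePlaylist([
    MusicCue("chapter1_cutscene", 95.0),
    MusicCue("chapter1_stage_loop", 60.0, loop=True),
])
print(playlist.current.name)        # chapter1_cutscene
print(playlist.on_cue_end().name)   # chapter1_stage_loop
print(playlist.on_cue_end().name)   # chapter1_stage_loop (keeps looping)
```

In the real pipeline, delivering each cutscene cue in stems is what lets the middleware crossfade at these boundaries rather than hard-cutting.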
In MK11, the main campaign gives the fights more meaning by placing them in the context of a larger narrative structure. The roster of 25 playable characters is incorporated into the storyline with complex webs of relationships – there’s even a “time travel” component where certain characters come up against their older “evil” selves. What conversations took place in the development phase regarding the music supporting character development and the larger narrative across the game?
As soon as I read the script, I immediately knew that this would be a character-based score, and that the old operatic concept of leitmotif – wherein short musical statements represent individual characters, factions, or ideas – would be central to its construction. We took our time with the first half of the game, as those cutscenes established the majority of the characters’ themes and required extra attention whenever a new leitmotif needed to be developed.
Rich had worked on several Mortal Kombat games before and was closely familiar with the story, so he gave valuable insights on which characters would need particularly memorable, melodic leitmotifs. For example, he explained that Kitana was the central figure in building a “New Outworld” defined by unity rather than war and conquest, but her efforts would take several in-game chapters of personal growth and political dealings before coming to fruition. So I created a theme based on a rising scalar motif, first presenting itself early in the game as a more demure statement on orchestral strings, but eventually building to grand statements with heavy metal guitar solos by the final battle. I used her theme as well as Fire God Liu Kang’s theme in the ending credits song, “Rise”, a collaboration with EDM artist Super Square.
Marty and Rich also gave great insights about the antagonist, Kronika. They explained that in addition to her time-bending powers, she also had a unique ability to manipulate other characters, preying on the tragedies of their past and promising a new universe where these wouldn’t occur. I tailored Kronika’s theme to each of the people she’s manipulating, morphing it into a form that each character would find the most tempting. For example, the evil Revenants Liu Kang and Kitana felt powerless after being invaded by Raiden and the Special Forces, so Kronika seduces them by demonstrating an even greater power – I used epic orchestration and choir for her presentation in this case. Later on, she tempts Jax with promises of making him a General, keeping his daughter Jacqui out of military service, and having grandkids – I painted this scene with a quiet, heroic military solo horn and a wispy, virtuosic classical violin solo.
What’s your process when composing for the game in terms of preparation and research? For example, do you research different regional instruments or musical styles, or specific composers? Or find some session players fluent in those instruments? Similarly – what is your typical composition process – e.g. do you begin with melody / harmony / rhythm first, paper, sketches?
Research and “pre-composition” are vital to my writing process! I try to do some form of transcription before every project, and in the case of Mortal Kombat 11, I listened to taiko drum ensemble concerts and transcribed their performances note-for-note. This helped me to write more authentic and interesting percussion parts later on. I also make sure to hire players well-versed in world instruments, who can play both in a Western style and in a traditional way. Sometimes I have them improvise over sections of a piece, which I then can edit together for the same cue or even other pieces entirely.
Once composition has begun, I try to leave the production to the very end, making sure I know exactly what I want it to sound like before spending any time at the computer. Usually this means beginning with a very rough paper sketch, or even a list of vague ideas in a text file, which I then iterate on and flesh out until I feel confident enough to go to the PC. Sometimes entire melodies arrive in my mind all at once; other times it might be a rhythmic motif, harmonic twist, or textural gesture. Anything can be useful to set a groundwork and see where my imagination takes me!
For linear music like cutscenes or films, I’ll write timecode and sync points into my paper sketches, and read through them with a metronome while watching the visuals to make sure the music lines up properly. For interactive music, I’ll usually do a rough piano sketch within my DAW to make sure that the timing feels right – often I’ll find myself adding a measure here or there, or changing the time signature of a few bars, to make sure everything flows properly. In either case, once I’m confident enough to spend countless hours tweaking the details, I begin orchestrating and producing. My DAW of choice is Reaper, which is exceptionally powerful for mixing and audio editing in particular.
Can you describe some of the unique instruments you used for the MK11 score, and how you went about recording them, along with the orchestral material?
I was blessed to work with many fantastic world instrumentalists and vocalists! It was important to have a wide range of non-orchestral timbres in order to give each character leitmotif a unique voice. For example, the two instruments most prominently featured in the main theme (“A Matter of Time”) are the Chinese guzheng, played by Kathy Qianqian Jin and representing Raiden’s forces, and the Persian kemenche, played by Stelios Varveris and representing the antagonist Kronika. The guzheng is a large stringed zither with an impressive ability to bend pitches, and the kemenche is a simple bowed string instrument with a particularly nasal, reedy sound.
Some of the more unusual sounds in MK11 come from the bowed mandolin, and Scandinavian kulning vocals. I have an old cheap mandolin that, when played with a violin bow on either the G or E string, can give a particularly nasty, scratchy tone; I used a similar technique on an electric bass as well. And when I wanted a special vocal sound for the elder goddess Cetrion, I turned to Emma Sunbring for a kulning performance – it’s a brassy style of yodeling used by cattle herders in Sweden, which I thought was a perfect way to underscore a goddess attempting to lead mortals toward virtue.
What led you to found Impact Soundworks? Do you find an intersection between the games that you compose for and the company, which you founded with Andrew Aversa?
I founded Impact Soundworks back in 2006, when I had a few weeks of spare time between indie film scoring gigs. I had some large metal objects lying around that I wanted to use as percussion instruments in my scores, so I set up a home studio and sampled them extensively with different mallets, playing techniques, and multiple velocities and round-robin variations. Originally it was meant to be a private collection, but once I showed it to fellow composers on the VI-Control forums, they demanded that I release it commercially. So I set up a business storefront, added some additional content and converted to a few other sampler formats, and “Impact Steel” became the first Impact Soundworks library – as well as the world’s first deeply-sampled found-percussion library.
A few months later, Andrew Aversa came to me with the idea to do the world’s first deeply-sampled sitar library, so we found a player and released that as our first collaboration. Over the years his involvement with ISW increased, and mine greatly decreased as I was focusing on composition, so I stepped down and only rarely create new projects – for example, “Vocalisa”, which was the world’s first Slavic women’s chorus library. I still frequently create private sample libraries for myself, occasionally giving these away for free on my Facebook artist page, but I haven’t been interested in commercial sample development in a long time. By now there are many fantastic sample developers out there, and so I prefer not to develop libraries unless there’s a sound I need for my own work that doesn’t exist yet and requires extensive development.
From a career perspective, can you discuss the differences between your work as a contract composer versus your time working on staff for LucasArts?
Composing music for LucasArts was an incredible start to my career! I was hired in 2008 as a music assistant, mostly editing John Williams’ film scores to fit in the games, and later teaching myself scripting so that I could work on implementation as well. I eventually convinced the audio team to let me write original music, and my position evolved into both composition and music systems design. However, aside from brief stints working on the Monkey Island Special Edition series, most of my work at LucasArts was centered around the Star Wars franchise.
Though I still enjoy writing for Star Wars projects as a contract composer, I welcome the freedom and variety that my current career affords me. It’s wonderful being able to define the sound of a game, untethered by another composer’s famously iconic musical direction. In the past five years, I’ve been privileged to write drastically different soundtracks for several franchises – the bombastic and aggressive orchestra, world instruments and synths of Mortal Kombat 11, the gritty yet restrained and sincere sounds of Call of Duty WWII, the epic fantasies in Lara Croft and the Temple of Osiris as well as Guild Wars 2, and even a California rock score for the unfortunately-cancelled Dead Island 2. I’m currently working on an indie title, Anew: The Distant Light, whose modern art-music meets synth score likely has the most unique music I’ve ever written. And of course, I’ve also had the opportunity to return to my Star Wars roots with the VR title, Star Wars: Vader Immortal. So I’m profoundly grateful for having had so many varied experiences across several franchises!
After being in the industry for so long, what are some of the unique challenges that you face now?
You might think starting a new score gets easier each time, but every experienced game composer I speak to shares the same feeling – it absolutely doesn’t! We always start from square one, and each game is a new challenge. Sometimes this challenge is a complicated story with an overwhelming number of characters, like in Mortal Kombat 11; sometimes the setting demands a careful music direction to be respectful of history, like in Call of Duty: WWII; and sometimes the gameplay itself can require unique music systems and compositional forms, like in my soundtracks to Star Wars: First Assault and Dead Island 2.
In each case, I like to focus first on the game world and write some sort of piece that exemplifies the setting – it can be a brief sketch or even an entire main theme. From there, I examine the instrumentation and form, and experiment with concepts for a dynamic music system based on the piece’s structure and divisibility.
You’ve worked on linear games such as MK11 and Lara Croft: Temple of Osiris, along with more open world games such as Guild Wars 2. What are some of the challenges of working on an MMO versus a more linear game?
In the cases of Guild Wars 2 and Star Wars: The Old Republic, the work wasn’t very different from my linear game scores, since those soundtracks are more content-focused and don’t feature complex implementation; each cue was relatively self-contained and through-composed. They weren’t played back with stems or layers, there was no tempo-mapping, and instead they used classic loops; my ambient music for The Old Republic simply plays as one-shots, without even looping. Keeping things simple and allowing for musical silence is the strategy used by most MMORPG titles.
In the case of Destiny 2: Forsaken, however, we used a much more tightly-knit, complicated music implementation via Wwise. I only contributed additional music to that score, but it used an impressively detailed system with multiple stems, tempo-synchronization, and segments that could play in any order based on gameplay intensity. The goal was to bring the detail players expect from action games into the generalized and expansive world of an MMO, which necessitated this level of interactivity and a tremendous amount of content.
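A system like the one described for Destiny 2: Forsaken can be thought of as mapping a gameplay-intensity value to a pool of interchangeable segments. The sketch below is a loose, hypothetical illustration — the segment names, thresholds, and structure are invented, and the actual game implements this inside Wwise rather than in game code.

```python
import random

# Hypothetical segment pools; segments within a pool are composed so
# that they can play back in any order ("open form" writing).
SEGMENTS = {
    "low":    ["explore_a", "explore_b"],
    "medium": ["skirmish_a", "skirmish_b"],
    "high":   ["boss_a", "boss_b"],
}

def pick_segment(intensity, rng=random):
    """Map a 0..1 gameplay-intensity value to a pool, then pick any
    segment from that pool. Thresholds here are arbitrary examples."""
    if intensity < 0.33:
        pool = SEGMENTS["low"]
    elif intensity < 0.66:
        pool = SEGMENTS["medium"]
    else:
        pool = SEGMENTS["high"]
    return rng.choice(pool)

print(pick_segment(0.1))  # one of the "low" segments
print(pick_segment(0.9))  # one of the "high" segments
```

The compositional burden is in the content: every segment in a pool has to share tempo and harmonic language so the middleware can tempo-sync and shuffle them without audible seams.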
In the past, you’ve been an advocate for the DAW Reaper. How has it changed your workflow as a composer?
Reaper is well known for its customization, allowing users to easily build their own toolsets, functions, buttons, and layouts. I have a few custom actions, but for me, Reaper’s main advantages are its CPU efficiency, its open track architecture, and its audio editing features. Editing audio and creating musical sound design is a breeze, and the regions and batch-rendering functions have allowed me to do entire soundtracks in a single project file. I composed all of Call of Duty: WWII, Lara Croft, and Vader Immortal this way, which facilitated starting new pieces, maintaining a consistent mix, creating revisions, and reusing recordings and sound design across multiple cues.
How do audio middleware tools like FMOD, Wwise, and Elias impact your writing? Do they make it easier, harder, or both? What kinds of things can you do differently on projects where you know middleware will be used?
I’m mostly familiar with Wwise, which has definitely facilitated my score planning. It’s important to know the capabilities of the audio engine, and to be thoughtful of how gameplay could influence the music’s interactivity – especially when scoring dynamic games where the players’ situation can drastically change throughout a level. Composers can write with the implementation in mind, adjusting their orchestration, form, or use of melody accordingly.
When compared to proprietary audio engines’ efforts toward interactive music, middleware is usually a godsend. With the advantage of many more years of testing and UX considerations, middleware is generally more stable and user-friendly than homegrown solutions; it also has the advantage of familiarity for composers and implementers who’ve used the software on previous projects. Occasionally some proprietary solutions have certain features that are impossible or infeasible in middleware, but these cases are quite rare given the adaptability of tools like Elias, FMOD, and Wwise. In either case, it’s helpful for the composer to be aware of the abilities of whichever tool will be used, and to be mindful of this potential while writing.