Introduction

“What is Sibelius?” is a question I’m often asked at meetups and conferences. Composers are often asked what their workflow looks like. When I tell them, “I start by writing in Sibelius and export the MIDI file into Cubase to produce the track,” I often get a blank stare in return. After explaining that Sibelius is music notation software in which I do 100% of my writing and orchestrating (yes, even ambient, synth-heavy tracks!), it tends to click with them. It amazes me that a number of developers I’ve met associate composing solely with pressing the record button and playing into a MIDI keyboard. While that is definitely a standard procedure for composers well-versed in their digital audio workstations, I want to explain why I notate all of my music in separate software before even touching my DAW. I know most major DAWs these days include a “Score Layout”-like option, but I have yet to find one that matches Sibelius’s intuitive ease of use.

From Undertale to Hyper Light Drifter, indie games are receiving incredible amounts of press and have become the entryway for those determined to work in the video game industry. With that come hundreds, if not thousands, of upcoming titles that require someone in charge of creating the game’s music. Because small titles are often cranked out as fast as possible, those creating the soundtrack frequently skip over “Music Theory and Notation 101” and jump straight into their complimentary copy of GarageBand. This often sacrifices quality: it has become so easy to generate predetermined loops within a DAW without any musical knowledge. To be clear, I understand that digital audio workstations are an absolute necessity for high-quality productions (Cubase is my go-to DAW).

Orchestration / Score Study

A frequent conversation among composers on social media goes, “If [insert classical composer here] were alive, he would totally be scoring for film and video games.” Beethoven. Brahms. Rachmaninoff. The great Romantics of the 1800s led their music with orchestration that was new, lush, and emotionally invigorating for its time. So much inspiration still draws from that period; seriously, listen to the chorus of Eyes on Me from Final Fantasy VIII immediately followed by the opening line of the third movement of Rachmaninoff’s Second Symphony. You will undoubtedly hear a striking similarity in the melody. Have you ever wondered why people keep returning to that conversation of “Beethoven scoring video games”? It’s because their music works. Orchestras across the globe would not still be playing it nearly 200 years later if people had grown tired of it. Classical music is a piece of timeless history. Fortunately for you as a video game composer, music over 100 years old is all publicly accessible on IMSLP.org.

This is where using traditional notation for orchestrating your music trumps relying solely on the piano roll. Say you listen to Beethoven’s 5th Symphony and something strikes you as interesting. You can ask yourself, “What did he do at this point in the music?” Go to IMSLP.org, download the score, find that moment in the music, and investigate which instruments were playing and how. Again, this requires at least some basic knowledge of music theory, so maybe you don’t want to jump right into the score of a Wagner opera or a Mahler symphony. (Seriously, if you haven’t looked at an orchestral score before, avoid those composers at first. You’ll overwhelm yourself. Start small. The earlier the composer, the less dense the orchestration will be!) This concept doesn’t solely apply to “orchestral” music, either. Let’s say you’re writing a piano track for your game. Frédéric Chopin’s 15th Nocturne, for example, made its way into one of Halo 3’s trailers. How did Chopin write his beautiful set of Nocturnes? Why do they work so well?

A great exercise is to “model” 15–30 seconds of your own music on the orchestration of a great composer. In other words, write a melody and harmonize it how you see fit, then orchestrate it exactly how Tchaikovsky did in a memorable moment of one of his symphonies. It can be fun, and it is a great learning experience! I understand indie games do not often have live performers. But with the magnificent VSTs now accessible, and their ability to recreate live sound remarkably closely, I can guarantee that notating your music as if for live performers will greatly improve its quality. If it’s written to be played live, chances are it will have far more authenticity.

Practicality

Let’s be honest: who doesn’t want their music in front of an orchestra? I’m sure we’ve all attended or seen footage of “Video Games Live,” one of the biggest video game music concert series around. How incredible would it be to have your music as part of the tour a few years from now? Then there’s the idea of recording with an orchestra in a studio; I know that’s on my bucket list. Imagine landing a gig with the budget to record with an orchestra when you’ve never once thought about how traditional notation works. Like I said before, I know most DAWs include a “piano roll to notated score” layout option, but so many elements beyond the score itself come into play when prepping for an orchestral performance or recording session. A DAW never defaults the articulations, dynamics, or general layout to performance readability, and exporting individual parts for each instrument’s player is a whole other matter. Even in Sibelius, a great amount of fine-tuning is required before a conductor and performers will consider the score and parts readable and professional.

To put this into perspective, I’ve linked an image below. It’s a single page from Ashes, a game by OmniVoid that I’m currently scoring. The track, Chasms of Tombala, was written in Sibelius and produced using VST instruments in Cubase 8. You can listen here on SoundCloud:

 

The game is very early in development, but when it comes to practicality, I can never be too prepared. Much like the rest of the video game industry, the music side is filled with “what ifs.” What if this game gains traction in a few years and we have the chance to record some of the soundtrack live? Fortunately, the music is already scored in a way that can immediately be handed to a recording session’s conductor and performers. The same applies to you: having your music traditionally notated from the very start future-proofs you for any opportunity that arises. Even if it isn’t fully fleshed out like my track here (divisi markings, solo markings, etc.), you’d still save ample time down the line.

[Image: score page from Ashes (Earth Environment)]

The worst situation I can imagine is completing a 1–2 hour soundtrack, entirely written and produced through a DAW’s piano roll, then being offered a live performance of it. Again, this applies to a small string ensemble or solo piano, guitar, etc., not just a full orchestra. Having to re-notate all of your music, especially if you’re unfamiliar with the process, would be a giant chunk of time lost that could be applied elsewhere (like a new project!). If you learn now, before that situation arises, you could save yourself hours, days, or even months of trouble.

Separation Between “Writing” Time and “Producing” Time

Unlike the previous sections, this one is less logistical and more psychological. Everyone works very differently. My workflow, as I’m sure you know by now, consists of Sibelius notation exported as a MIDI file, then imported directly into Cubase. The reason this works for me, personally, is the clear distinction between when I’m writing the music and when I’m producing it. Have you ever worked a job that entails a certain activity for eight hours, and then when you come home, the last thing you want to do is that very activity? I don’t know what it’s like for chefs, but I imagine that after cooking eggs and bacon all morning and tossing salads for lunch, dinner at home could feel like a chore. “MORE cooking?” At the same time, I’m sure chefs love cooking (just as composers love writing music); but we all know how too much of a great thing stops being great. If I spend hours writing music in Cubase, when it comes time to mix and master, the last thing I want to look at is more Cubase.

Additionally, having designated software for the different creative stages of my music helps me maintain focus. Yes, production can be incredibly creative; in fact, many great pop-song hooks are “hooks” because of their production choices alone. But for me, doing interesting things with the latest reverb plugin requires a different type of creativity than formulating a new melody. Trying to write all of my melodic, harmonic, and orchestration elements in Cubase, right before diving into production, left me feeling scatterbrained. I found myself mixing a melody or trying some neat delay effect before I had finished actually writing the whole track. A productive day often requires an organized schedule and mindset. Using a single piece of software to accomplish 100% of your tasks can work; I simply find it hard to streamline my focus when the user interface looks exactly the same at both stages of the composition process.

I wanted to demonstrate my workflow in a timelapse. Linked below is a video from my participation in Ludum Dare 35, a 72-hour game jam. This timelapse covers my entire process for the second track you’ll hear in the video. Essentially, I write 100% of the music in Sibelius (dynamic markings, articulations, etc.) and export it as a MIDI file. Once it’s imported into Cubase 8, all I have to do is assign my VST instruments and smooth out a few CC (Continuous Controller) values for expressive purposes. The CC changes are few, as Sibelius does a great job of transferring overall CC expression. In addition, because some percussion patches like cymbals or triangle offer a plethora of options in a VST, I make minor edits to the notes themselves simply to find the right sound in the context of the track. For example, C5 in the cymbal patch sounds different from C3. This step also applies if you want pre-defined effects for certain instruments within your specific VST (e.g. packaged violin or flute pre-recorded melodic sweeps). For melodic material, the MIDI file transferred from Sibelius requires much less tweaking when using keyswitch instruments. Switches between articulations like “staccato” and “legato,” for example, are already accomplished within the Sibelius export. After these few, trivial edits, all that’s left is the actual production side of things (compression, EQ, mastering, etc.).
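If CC data sounds mysterious, it helps to know that under the hood a Control Change is just a three-byte MIDI event: a status byte (0xB0 plus the channel number), a controller number (CC1 is the mod wheel; CC11 is expression), and a value from 0 to 127. Here is a minimal sketch, not tied to Sibelius or Cubase, using hypothetical channel and value choices:

```python
def cc_message(channel: int, controller: int, value: int) -> bytes:
    """Build a raw 3-byte MIDI Control Change message.

    channel:    0-15 (MIDI channels are 0-indexed in the wire format)
    controller: 0-127 (e.g. 1 = mod wheel, 11 = expression)
    value:      0-127
    """
    assert 0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127
    # Status byte 0xB0 marks a Control Change; the low nibble is the channel.
    return bytes([0xB0 | channel, controller, value])


# Expression (CC11) on channel 0, set to a moderate level of 80
msg = cc_message(0, 11, 80)
print(msg.hex())  # → b00b50
```

When a notation program exports dynamics and hairpins, it is emitting streams of messages exactly like this one, which is why the expression curves survive the trip into the DAW.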

Below is a slowed-down excerpt from the timelapse video above. I simply wanted to emphasize how seamless the transition is from exporting the MIDI from Sibelius to importing the file directly into Cubase (Right Click + “Open with” Cubase). Pause the video on its final frame and take note of all the MIDI stems and events. All of the instruments are lined up as they were in the score. You’ll also see the lines, hills, and valleys in the events; these are the already-established CC value changes, providing the framework of expression. As stated above, the only steps necessary before mixing and mastering are assigning your VST instruments and making minor fine-tuning changes to the CC expression and certain note values. This allows 100% of focus, whether writing or producing, to be spent in its own designated software. Half the day in Sibelius and the other half in Cubase leaves me far less at risk of mentally burning out.
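Those “lines, hills, and valleys” are nothing more than sequences of interpolated CC values. Drawing a ramp in a DAW’s controller lane is conceptually the same as generating points between a start and end value; a hypothetical sketch (the values here are illustrative, not taken from my track):

```python
def cc_ramp(start: int, end: int, steps: int) -> list[int]:
    """Linearly interpolate a CC value (0-127) across a number of points,
    the way a drawn line in a DAW's controller lane is rendered."""
    if steps < 2:
        return [end]
    return [round(start + (end - start) * i / (steps - 1)) for i in range(steps)]


# A crescendo in expression (CC11): ramp from 40 up to 100 over 5 points
print(cc_ramp(40, 100, 5))  # → [40, 55, 70, 85, 100]
```

Fine-tuning after import usually means nudging a handful of these points, not redrawing every curve, since the notation software has already supplied the overall shape.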

 

Conclusion

Ultimately, using two drastically different pieces of software to compose your music has fantastic benefits. Knowing how to notate your music formally gives you a far stronger foundation for honing your orchestration ability. You will also be a great asset as a composer if the opportunity arises to have your music played live or recorded. Trying a new workflow may help your overall productivity and concentration as well.

Composing music for video games is a constant learning experience. It’s never too late to learn new things, as daunting as that may seem. With every new software update or plugin release, it’s tempting to shy away from watching tutorial videos or reading manuals. Learning can be uncomfortable; people tend to find their comfort zones and never leave. Much like writing your first chiptune track or trying a new EQ, taking a single step into a new musical approach can be scary! The benefits, however, almost always outweigh the apprehension. Your first steps into traditional notation will require reading, listening, analyzing, and, most importantly, writing. With the high expectations placed on composers in today’s industry, every skill learned boosts your value.

Go forth, music makers: Learn, listen, and compose.

About the Author

Tony Manfredonia is a contributor at Designing Music NOW. Tony studied Music Composition at Montclair State University and Temple University. Notable instructors include Dr. Robert Aldridge, Dr. Richard Brodhead, Hollywood composer/orchestrator Erik Lundborg, and Maurice Wright. His music has received a number of awards and performances. The Pittsburgh Symphony Orchestra read through his piece, Through the Tunnels and Back, as part of its 11th Annual Reading Session, led by composer-in-residence Mason Bates. The same piece won the Brazosport Symphony Orchestra’s Composer Competition for its 2015–2016 season. His most recent string quartet, Fear of Disappointment, was the inaugural winner of Texas A&M University’s Composition Competition and Symposium, chosen by the faculty and guest composer John Mackey. Tony has also composed for theatrical works, with performances including Oedipus: The Musical at Philadelphia’s 2014 Fringe Festival and 60×60’s 2015 Dance Louisiana Mix.

He worked as an assistant and co-composer for Two Brothers, a video game created by Ackk Studios whose soundtrack received critical acclaim upon release. He also scored the soundtrack for Ryan Campbell of www.battleofbrothers.com for his familial video game competition. Other games Tony is currently scoring include Alone: The Untold, Super Toaster X, and Ashes, a new FPS-RPG being developed by OmniVoid.
