In this article, composer Stephen Baysted reports on the challenges of producing the music for two high-profile racing games during the COVID-19 pandemic.

The score to Slightly Mad Studios’ racing game Fast and Furious Crossroads has its origins in January 2018, and, given its scale (some two hours of adaptive music and one hour of cutscenes), it was scheduled to be completed in the summer of 2019, with the game itself to be released later that year. As it transpired, the release was delayed and development extended, principally because of delays in the delivery of picture-locked cutscene animations. For a composer or sound designer, being in possession of picture-locked versions of the cutscenes (when the footage is fixed, and no further changes are expected or permitted) is essential for finalizing the music and audio synchronization. The belated completion of the cutscenes inevitably triggered an intense burst of compositional activity between August and December 2019. Rather fortuitously, as it would turn out, I had taken the decision to record all the adaptive music in June 2019, during three days of orchestral sessions in the UK.

Fast and Furious Crossroads is firmly rooted in the narrative universe of the Fast and Furious action film franchise and features all the characters, vehicles, and a fantastical plotline that would be instantly familiar to movie-goers. The musical language of the franchise can be summarized by its bombastic and dramatic form, dense hybrid orchestrations, and its frequent use of passages with radically different tempi and meter, juxtaposed in quick succession to reflect the onscreen histrionics. During the preproduction phase of the project, one central question arose: how to best reflect the linear musical language of the franchise in a nonlinear ludic context. In response, a proprietary music engine was devised that would organize and structure the music during gameplay and allow it to respond and adapt to player actions and progression through each phase of the game.

The design of this dynamic system flowed directly from the game’s structure. The game is split into individual missions, each requiring between five and eight minutes to complete and typically featuring three or four objectives. The game’s structure thus defined the musical structure: each mission would sound as a continuous stretch of music, but the linear sequence of objectives allowed the mission to be segmented into different musical movements—in this case, looped material with different musical “layers” for each section of the mission—to provide variation and musical growth in response to player action and progression. Unlike many such loop-and-layer systems, we “baked in” the layers, so the game switches between loops as the mission progresses rather than relying on a single loop for the whole mission. With the parameters of the gameplay in place, the music could be expected to sound for two or three minutes per objective, with the mission segmented into loops that are varied through the addition and removal of musical parts within those cues.
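In outline, such a loop-and-layer scheme can be sketched in a few lines of code. This is a minimal illustration only, with hypothetical names (`ObjectiveCue`, `MissionMusic`, the stem names) standing in for the studio’s proprietary engine:

```python
from dataclasses import dataclass

@dataclass
class ObjectiveCue:
    """One looped cue per mission objective, with its layers 'baked in'."""
    loop_name: str
    layers: list  # stem names, ordered from the base layer upward

@dataclass
class MissionMusic:
    cues: list            # one ObjectiveCue per objective, in order
    objective_index: int = 0
    intensity: int = 1    # how many layers are currently audible

    def active_stems(self):
        """The stems the engine would currently play."""
        cue = self.cues[self.objective_index]
        return [(cue.loop_name, stem) for stem in cue.layers[: self.intensity]]

    def advance_objective(self):
        """Switch to the next objective's loop; reset intensity so the
        new movement can grow again as the player progresses."""
        if self.objective_index < len(self.cues) - 1:
            self.objective_index += 1
            self.intensity = 1

mission = MissionMusic(cues=[
    ObjectiveCue("obj1_loop", ["strings", "brass", "perc"]),
    ObjectiveCue("obj2_loop", ["strings", "brass", "perc", "choir"]),
])
mission.intensity = 2
print(mission.active_stems())  # [('obj1_loop', 'strings'), ('obj1_loop', 'brass')]
mission.advance_objective()
print(mission.active_stems())  # [('obj2_loop', 'strings')]
```

The key point the sketch captures is that each objective owns its own loop, so progression changes the underlying material itself rather than merely stacking layers on a single loop.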

As I mentioned a moment ago, we recorded all the mission music and, if memory serves, four cutscene cues during the sessions in June 2019. The orchestra was suitably fortified in number to deliver the sonic fingerprint for which the film franchise is renowned: forty strings (twelve first violins, ten second violins, eight violas, six cellos, four basses); and sixteen brass (four trumpets, six French horns, two tenor trombones, two bass trombones, cimbasso, and tuba). In a rather unusual move, I added a contrabassoon and bass clarinet to the “brass” section for two purposes: to enhance the percussive attack in the lower octaves; and to help counterbalance the natural raspiness of the cimbasso and bass trombones in their lowest registers with a somewhat warmer, reedier timbre. As is customary, “prerecords” are brought to the session;1 these consist of synth parts, drums, percussion, guitars, piano, vocals—in fact, anything that is not being recorded on the day. In my score, I had a large number of prerecorded tracks.

By the time the cutscenes were approaching picture-lock status, the first news of an emergent virus was reaching the shores of the UK. The provisional dub took place in January 2020,2 but by early February, it appeared that my original plan to record a number of the key remaining cutscenes with orchestra, and to drop them into the game in early April, might well be scuppered. On March 23, the UK entered its first lockdown, and overnight the entire music recording industry closed its doors. Studios, concert halls, and recital rooms all went dark, and musicians had no venues in which to perform or record. Beyond the appalling devastation caused to populations across the globe by COVID-19, my most pressing musical issue was now one of attempting to match the quality of the sampled orchestral mock-ups of the cutscenes with the live recordings of the gameplay missions—and finding musicians with the requisite resources to record remotely. My tactic was to focus on a number of key cutscenes that were pivotal to the narrative and in which solo instruments were being deployed for dramatic emphasis. In one such cutscene, where Sebastian, a central character, is killed in a fiery lorry accident whilst being pursued by the game’s villains, I wrote and remotely recorded a doleful and plaintive solo cello line that soared above a sampled string ensemble. Later in the same cutscene, as his girlfriend Vee watches on in horror, I recorded a live solo flamenco guitar that references Sebastian’s Spanish heritage. These interventions, subtle and modest as they are, do help heighten the dramatic impact and distract the viewer from the inherent limitations of the orchestral samples. As we will see in a moment, there are other ways of combining live solo recordings with a sampled orchestra.

Alongside Fast and Furious Crossroads, another game had been taking shape concurrently at Slightly Mad Studios. Together with Fast and Furious, Project Cars 3 (2020) represented an important shift in focus for a games development company that had, from its inception, been synonymous with high-fidelity racing simulation games.3 The third iteration of the Project Cars franchise, published by Bandai Namco, saw a significant departure from a number of fundamental tenets of the simulation genre that had hitherto characterized the series: whilst the racing circuits and cars were authentically modeled, and the sonic representation of the cars in their environments was state of the art, there were, much to the chagrin of the “hardcore sim” community, no pit stops, no fuel depletion, and no tire degradation. The Project Cars franchise had apparently decamped into “simcade” territory for its latest instalment.

From a purely musical perspective, the aesthetic that had defined the two previous games had fundamentally changed too. In the first two iterations, the musical objective was to attempt to, for want of a more elegant expression, “get inside the head of the racing driver” in order to represent or convey the driver’s emotional, psychological, and physiological journey experienced in the cockpit during the heat of battle. What we are talking about here is the surfeit of, and superlative control over, adrenaline, fear, bravery, euphoria, determination, aggression, strength, honor, courage, and ego. I have been fortunate, over the past twenty years or so, to have met a large number of drivers from diverse motorsports disciplines whilst developing games, and there are common traits they all appear to share: they are all ruthlessly calculating, intelligent, and highly physically self-aware human beings. One simply does not drive a racing car at 320 kph toward Eau Rouge and Raidillon at Spa-Francorchamps without calculating the probability of failure and conceiving of the potential outcome of an error or an unforeseen mechanical issue. And yet, on the next lap they might approach the same corner at 321 kph alongside a rival. We mere mortals look on in awe at these courageous human beings, but they do not see it through the same lens. They are always in control or, to put it another way, always on the edge of being out of control.

As a composer, of course, one always relishes musical challenges, and in Project Cars 3 this challenge came in the form of a collaboration on a number of exciting fronts. We would be working with Bristol-based drum and bass label Ram Records and a number of their signed artists. The basic plan was to take eight drum and bass songs and reframe them, transforming each into three additional versions: two ambient, atmospheric versions, and a large-scale cinematic version. I would then compose two large-scale orchestral pieces (each given two ambient variations) that the drum and bass artists would reframe and remix. The original drum and bass versions, plus the three additional variations, would be heard in the “front end” of the game as players navigated the menu system. It was imperative that each variation was identical in length, tempo, and basic harmonic outline to the original so that, at any point, we could crossfade between the original and any or all of the variations, as the players moved through the menu system. As the player made their way toward the race, having first navigated various submenus (track selection, car selection and customization, etc.), the music would become increasingly intense and move seamlessly between the ambient and orchestral variations, and eventually into the original drum and bass version.
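The constraint that every variation match the original’s length, tempo, and harmonic outline is precisely what makes free crossfading possible at any point in the menus. As a minimal sketch (in Python, with audio as plain lists of samples; the function name is my own illustration, not the game’s audio code), an equal-power crossfade between any two such variations might look like this:

```python
import math

def equal_power_crossfade(a, b, pos):
    """Blend two same-length variations; pos in [0, 1] moves the mix
    from version a to version b. Because every variation shares the
    original's length, tempo, and harmonic outline, the blend stays
    musically coherent at any crossfade position."""
    assert len(a) == len(b), "variations must be identical in length"
    gain_a = math.cos(pos * math.pi / 2)  # outgoing version fades down
    gain_b = math.sin(pos * math.pi / 2)  # incoming version fades up
    return [gain_a * x + gain_b * y for x, y in zip(a, b)]

# Halfway through a menu transition, both versions sound at equal power,
# since gain_a**2 + gain_b**2 == 1 for every value of pos.
halfway = equal_power_crossfade([1.0, 0.0], [0.0, 1.0], 0.5)
```

An equal-power (rather than linear) curve is the conventional choice here because it avoids the audible dip in loudness midway through the fade.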

I began composing the score for Project Cars 3 in late February 2020 with the original plan being to record orchestra in early June. Of course, following the first lockdown, an alternative plan had to be drawn up. In London there are two principal scoring orchestras (the London Metropolitan Orchestra and Isobel Griffiths Ltd) that draw session players from the capital’s most prestigious ensembles: London Symphony Orchestra, Royal Philharmonic Orchestra, BBC Symphony Orchestra, and the London Philharmonic Orchestra. As a composer you work with the director of one of the scoring orchestras, who acts as a “fixer” and puts together the requisite number of players based on the project’s requirements and music budget. I have worked with Andrew Brown, director of the London Metropolitan Orchestra (LMO), on a number of projects (including Fast and Furious Crossroads and Project Cars 2), and we had been discussing possible musician lineups even before work had fully started on the score. More on this in a moment.

My friend and neighbor Guy Fletcher (Dire Straits) and I had talked a number of times about collaborating on a future project, and as the world moved inexorably toward lockdown, it seemed an opportune moment to work together. As well as being a multi-instrumentalist, Guy is also a sound engineer and producer, and since commercial studios were closed, he would also take on the task of mixing the score from his home studio. The score itself is, unsurprisingly, rather eclectic: a stylistically diverse fusion of multiple genres. Guy brought a rock sensibility to it, and, together with my cinematic language and the drum and bass material, we produced a unique and rather experimental score with a specific function for the game’s front end.

As a composer, one approaches and writes a little differently for sampled orchestral instruments than one does for a live orchestra. For example, divisi rules need not be observed or even considered with sampled instruments; complex ostinati can be rendered without concern for playability or ease of sight reading;4 and brass or woodwind writing can be continuous without the slightest regard for breathing or fatiguing the player’s embouchure. We had, however, fallen between two stools so to speak: to be sure, we were not able to record live orchestra, but neither would we be writing uniquely for samples, and thus a revised plan was hatched.

Our modified plan for Project Cars 3 was to attempt to record a number of soloists from the LMO remotely and then superimpose these remote recordings onto the sampled orchestral instruments. This technique, known as “hybrid orchestration,” is tried and tested, and the concept behind it is essentially quite simple: sampled instruments provide the “body” and “bulk” of the sound, whilst the live solo recordings enhance the expression, intrinsic musicality, and shaping of the sampled musical phrases, resulting in a more realistic rendering of the music. What is not quite so straightforward, however, is capturing comparable, and viable, results in suboptimal, acoustically compromised recording environments such as residential living rooms, studies, and bedrooms. And the complexity does not end there: only one of the musicians had prior experience of remote sessions; the remainder did not know how to operate a digital audio workstation (DAW) and did not possess a microphone, a microphone stand, or an audio interface.5
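Reduced to its essentials, the hybrid-orchestration idea is a weighted superimposition. The following sketch treats audio as plain lists of samples; the function name and gain value are illustrative assumptions, not the actual mix chain used on the score:

```python
def hybrid_blend(sampled_bed, live_solo, solo_gain=0.6):
    """Superimpose a live solo recording onto a sampled orchestral bed.
    The samples supply the 'body' and 'bulk' of the sound; the live solo
    carries the expressive shaping of the phrase. solo_gain sets how far
    forward the soloist sits in the blend (an illustrative default)."""
    n = max(len(sampled_bed), len(live_solo))
    # Zero-pad the shorter part so the two line up sample for sample.
    bed = sampled_bed + [0.0] * (n - len(sampled_bed))
    solo = live_solo + [0.0] * (n - len(live_solo))
    return [b + solo_gain * s for b, s in zip(bed, solo)]
```

In practice, of course, this blending happens in a DAW with level automation rather than a fixed gain, but the principle is the same: the live line rides on top of, and humanizes, the sampled ensemble.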

In order to prepare for the recording sessions in mid-May 2020, we advised the musicians which microphones to purchase, and, with our orchestrator Matthew Slater, we set each musician up with Logic and a Dropbox account. We also helped the musicians optimize their acoustic environments, advising them where to position microphones, duvets, curtains, and rugs—anything absorbent—so that room reflections and imperfections would be minimized. A master Logic project, complete with a recording template, was shared between Andrew Brown, Matthew Slater, and me, and we were able to coach each musician through the process of recording live over Zoom. On the orchestral side, the lineup was violin, viola, cello, trumpet, French horn, and trombone doubling tuba. On the non-orchestral side, we had remote recordings of drums, trumpet and flügelhorn, guitars, vocals, and piano. Fortuitously, the drummer and trumpeter already had professional recording equipment; Guy recorded the guitars, and I recorded the piano and vocal parts at my studio.

Permit me to rewind just a little bit. During the writing phase of the project, Guy Fletcher and I were working very closely together, as we had decided to divide the compositional tasks in such a way that we were frequently working on the same drum and bass track simultaneously. This typically involved sharing prerecords or Guy playing guitars on one of my variations. Invariably, I would drop off a memory stick through his letterbox in the morning, and he would send his parts over later that day on the same memory stick. At the time, my internet connection was antediluvian and glacially slow, so it was more efficient to use a physical drive. This was the process for at least a month and a half. Once we had “mocked up” the entire score, and the music had been approved, we were ready to record the remote sessions.

Once all the remote recordings had been completed, Guy began the mixing process. This involved taking all the prerecords,6 the orchestral sampled mock-ups, and the remote recordings and attempting to make them cohere in a convincing manner. Through necessity, we had “close-miked” each player to minimize the sound of the room,7 which added yet another layer of complexity at the mix stage: the mixer’s task in such cases is to make each remote recording sound as if it had been captured in the same acoustic space as the other remote recordings, while also matching the acoustic signature of the orchestral samples. The remote recordings were then superimposed and blended with the orchestral samples to create the final mixes, which were then mastered on two-inch tape.
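One standard way to place disparate close-miked recordings in a common space is to convolve each dry part with a single shared room impulse response, so every remote recording inherits the same virtual acoustic. The sketch below is illustrative only (a real mix would use a convolution reverb plug-in, not hand-rolled Python, and the names and wet level are my own assumptions):

```python
def place_in_room(dry, impulse_response, wet=0.3):
    """Naive convolution reverb: convolve a close-miked 'dry' recording
    with one shared room impulse response, then mix the resulting 'wet'
    tail back in with the dry signal. Applying the same impulse response
    to every remote part makes them sound as if recorded in one space."""
    # Direct-form convolution of the dry signal with the room response.
    tail = [0.0] * (len(dry) + len(impulse_response) - 1)
    for i, x in enumerate(dry):
        for j, h in enumerate(impulse_response):
            tail[i + j] += x * h
    # Keep the dry signal present so the solo retains its immediacy.
    padded_dry = dry + [0.0] * (len(impulse_response) - 1)
    return [d + wet * w for d, w in zip(padded_dry, tail)]
```

The wet/dry balance then becomes the mixer’s main lever for pushing a part “back” into the room or pulling it forward toward the listener.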

There are some conclusions that I can offer here. Clearly, no one would have chosen to produce a score in this fragmented, suboptimal fashion, but given the context and seriousness of the pandemic, there was little choice but to do so. Blending inherently compromised remote recordings with orchestral samples is problematic and complex, and the results fall short of studio-based recordings made using the same technique. Ultimately, it is, of course, preferable to record large-scale forces in a professional studio environment. On the other hand, the collaborative nature of the project was surprisingly exciting. Not only were we forced to find innovative solutions to problems and obstacles as they arose in order to see the project through to its conclusion, but the act of collaborating on such a diverse and eclectic range of material was also challenging and musically rewarding.


Literally everything that is not being recorded at a session.


A provisional dub is the final mix of music, Foley, sound effects, and dialogue.


GTR: FIA GT Racing Game (2005), GT Legends (2005), GTR 2 (2006), BMW M3 Challenge (2007), Test Drive: Ferrari Racing Legends (2012), Project Cars (2015), Project Cars 2 (2017).


In almost all cases, the players sight-read as they do not have access to the music before the recording session, and there is no time or budget for rehearsals.


A video of the performers recording the score for Project Cars 3 in lockdown is available on YouTube: Andrew Brown, “Project Cars 3 Score with Stephen Baysted,” YouTube, October 16, 2020, accessed June 14, 2022.


In this case, guitars, synthesizers, drums, jazz trumpet, and flügelhorn.


It is conventional, when recording orchestra, to use a setup that consists of a Decca Tree, outriggers, and surround microphones, plus spot mics on certain instruments. For more information on the Decca Tree see: Caroline Haigh, John Dunkerley, and Mark Rogers, Classical Recording: A Practical Guide in the Decca Tradition (New York: Routledge, 2020), 128–149.