technosoundz

The history of adaptive music in video games

Illustration: Nhung Lê

The popularity, scale, and complexity of adaptive music in video games have increased dramatically in recent years.

That said, although adaptive music practices are often only associated with the current era of gaming, the tools and techniques used to create music soundtracks that evolve with player interaction date back to as early as the late 1970s. In fact, the techniques developed in this early generation of digital gaming were what laid the foundations for the very principles used in today’s games. In this article, let’s take a look at video games from the ’70s to today that feature soundtracks shaped by adaptive music techniques that have truly left their mark in history.

The early techniques of adaptive music

Space Invaders (1978)

The Space Invaders arcade game, released back in 1978, has a simple premise: players must destroy a fleet of aliens creeping ever closer to their spaceship. As the player progresses through the game, the aliens move faster and faster.

Both the sound effects and music for this game are generated procedurally by sound circuitry built into the game’s physical hardware. These technological limitations didn’t stop designer Tomohiro Nishikado from creating a soundtrack that adapts to the player’s gameplay experience. As seen in the gameplay clip below, the ostinato-based musical loop speeds up and intensifies as the player destroys more aliens. This often coincides with the aliens nearing the bottom of the screen (which leaves the player in danger of being destroyed), greatly heightening the tension of the experience.
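The core mechanic is simple enough to sketch in a few lines. The function below is purely illustrative (not the original arcade code, and the bounds are assumed): as the number of remaining aliens falls, the interval between ostinato notes shrinks, so the loop plays faster.

```python
# Hypothetical sketch of Space Invaders-style tempo scaling:
# fewer aliens remaining -> shorter interval between notes -> faster loop.
def loop_interval_seconds(aliens_remaining: int, total_aliens: int = 55) -> float:
    """Interval between ostinato notes, interpolated from slowest to fastest."""
    slowest, fastest = 1.0, 0.07   # assumed bounds, in seconds per note
    progress = 1.0 - aliens_remaining / total_aliens
    return slowest - (slowest - fastest) * progress
```

A full wave of 55 aliens starts at the slowest interval and approaches the fastest as the screen empties, which is all the "adaptivity" the hardware needed.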

Super Mario Bros. (1985)

The 1980s brought the release of Super Mario Bros., and with it, one of the most iconic video game soundtracks of all time. The straightforward (yet highly effective) music implementation may seem simple by today’s standards, but its state-changing ability was quite advanced for its time.

Throughout the game, music will not only change based on the level that the player is on, but also in conjunction with player activity—such as when Mario gets powered up by Starman, or reaches the flag signaling the end of a level. These transitions, though sometimes abrupt, sync with the gameplay in a cohesive way to underscore moment-to-moment actions.

Monkey Island 2 (1991)

The release of LucasArts’ Monkey Island 2 in 1991 marked a revolutionary step in adaptive music technology. A team led by Michael Land, Peter McConnell, and Clint Bajakian pioneered a system they dubbed iMUSE (Interactive Music Streaming Engine), one of the first great leaps in adaptive music technology.

The goal of iMUSE was to reshape the compositional structure of the soundtrack according to events in the game, allowing different elements to enter and exit at pre-designated transition markers. The system also let soundtracks loop cleanly, which paved the way for the music to vamp on certain passages while waiting for player input. At the time, the system’s design was likened to a musical’s pit orchestra, where cues respond both to the timing of scene changes and to the actions of the actors.
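The marker-and-vamp idea can be illustrated with a small sketch. This is not the actual iMUSE API (class and method names are invented for illustration); it shows the principle that a jump requested by a game event only fires at the next pre-designated marker, and that the piece loops back on itself while waiting for input.

```python
# Illustrative marker-based branching in the spirit of iMUSE (not the real system).
class MarkerSequencer:
    def __init__(self, markers, loop_start):
        self.markers = sorted(markers)   # beat positions where jumps may occur
        self.loop_start = loop_start     # beat to vamp back to
        self.pending_jump = None         # target section queued by a game event

    def queue_jump(self, target_section):
        """A game event requests a transition; it waits for the next marker."""
        self.pending_jump = target_section

    def on_beat(self, beat, loop_end):
        """Called every beat; returns a jump destination, or None to keep playing."""
        if self.pending_jump is not None and beat in self.markers:
            target, self.pending_jump = self.pending_jump, None
            return target
        if beat >= loop_end:             # vamp: loop back while awaiting input
            return self.loop_start
        return None
```

The key design choice, then as now, is that the game never cuts the music mid-phrase: it only expresses intent, and the sequencer resolves that intent at a musically sensible moment.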

The video below features an interview with Michael Land and Clint Bajakian, where the two talk about developing the iMUSE system.

Banjo-Kazooie (1998)

Banjo-Kazooie was released on the Nintendo 64 in 1998. Its soundtrack brought another advancement to the world of adaptive music with its focus on vertical remixing, a technique where different layers of a piece of music weave in and out of the overall arrangement based on gameplay behavior. This can take the form of adding or subtracting individual layers of instrumentation, or even altering the arrangement entirely.

In the case of Banjo-Kazooie, the music for each level was composed in layers keyed to the player’s location. This way, the arrangement could instantly adapt to the player’s behavior, whether they were hopping back and forth from area to area or swimming underwater. Because each piece was composed with all layers running in sync, the source and destination of a vertical transition didn’t matter: with only the necessary layers audible, the music could crossfade smoothly no matter what happened in the gameplay. The result is a cohesive soundtrack that moves with the player, without ever feeling intrusive or overbearing.
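The technique reduces to volume automation over synchronized stems. The sketch below is a hedged illustration of the idea, not Rare’s implementation; the layer and area names are invented. Every layer is always playing, and a location change simply fades each layer toward that area’s target mix.

```python
# Sketch of vertical remixing: all layers run in sync; transitions are
# just per-layer volume fades toward the current area's target mix.
# Layer and area names below are illustrative, not taken from the game.
LAYER_TARGETS = {
    "overworld":  {"melody": 1.0, "percussion": 1.0, "underwater_pad": 0.0},
    "underwater": {"melody": 0.6, "percussion": 0.0, "underwater_pad": 1.0},
}

def fade_step(volumes, area, rate=0.1):
    """Move each layer's volume one small step toward the area's target mix."""
    targets = LAYER_TARGETS[area]
    return {layer: vol + (targets[layer] - vol) * rate
            for layer, vol in volumes.items()}
```

Calling `fade_step` once per audio frame produces the seamless crossfade described above: no layer ever restarts, so the transition is inaudible as a cut.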

The video below further breaks down the different techniques used in the Banjo-Kazooie soundtrack.

Each of these early examples of adaptive music would feel right at home in the current generation of video games. At their core, today’s games still rely on the same principles; it’s simply the context of their use that has become more sophisticated.

The modern techniques of adaptive music

Uncharted 2: Among Thieves (2009)

Today, vertical remixing is a commonplace technique used across a wide variety of gameplay contexts. One particularly popular use case in modern games is combat encounters: whether the player is moving from an ambient state into a combat state or the fight itself is escalating, vertical remixing acts as a smooth vehicle for building excitement during these sequences.

This was the approach that the developers at Naughty Dog took with their 2009 release, Uncharted 2: Among Thieves. As the player transitions across exploration, suspense, and combat states, the soundtrack builds in intensity accordingly—even though the compositional flow of the piece itself doesn’t change. The excerpt below from the Adaptive Music Adventures series demonstrates how music underscores this gameplay context.

Super Mario Odyssey (2017)

Vertical remixing is also heavily utilized in the music of Super Mario Odyssey, in some inventive ways. For example, easter eggs interspersed throughout the game’s levels let the player experience 2D gameplay that’s stylistically similar to the original Super Mario Bros. The aesthetics and mechanics aren’t the only things that change during these portions: when entering these areas, the music’s arrangement also transforms into an 8-bit style, giving a nod back to the original Super Mario Bros. soundtrack while stylistically complementing the gameplay.

This transition is executed without any gaps in the music or any changes to the composition. As seen in the video below, the music flows seamlessly between the changes in gameplay state.

FINAL FANTASY XV (2016)

The iMUSE system developed by LucasArts that we discussed earlier made heavy use of what we would now call horizontal sequencing. Horizontal re-sequencing is the modern evolution of that technique, in which pre-composed sections of music are dynamically stitched together based on the player’s actions. This can create music that flows in a cinematic fashion with gameplay, making an epic combat encounter feel that much more over-the-top.

This technique was notably used in Final Fantasy XV during boss encounters. In his GDC talk below, audio programmer Sho Iwamoto demonstrates how horizontal re-sequencing was used to not only increase the intensity of boss encounters as the player nears their end, but to also facilitate complex meter changes while allowing for cohesive transitions. This gives the composer freedom to create epic music that still stays synchronized with dynamic gameplay.
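The intensity-driven side of this idea can be shown with a deliberately simple sketch. This is not Square Enix’s implementation (the section names and thresholds are assumptions): the boss’s remaining health selects which pre-composed section should play next, and a real system would only perform the switch at a section boundary so the transition stays cohesive.

```python
# Illustrative horizontal re-sequencing driver (names/thresholds assumed):
# the fight's state picks the next section; switching happens only when
# the current section ends, keeping transitions musically clean.
SECTIONS = ["calm", "build", "climax"]

def next_section(boss_health_fraction: float) -> str:
    """Choose the section to queue once the current one finishes."""
    if boss_health_fraction > 0.6:
        return "calm"
    if boss_health_fraction > 0.25:
        return "build"
    return "climax"   # push intensity as the encounter nears its end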

Destiny 2 (2017)

Though Final Fantasy XV applies horizontal re-sequencing to fairly scripted sequences, the technique can be applied to more open-ended scenarios as well. This was the tactic used by the music team at Bungie for Destiny 2. In their GDC talk, they outline how they created a horizontally re-sequenced transition system for mission music that could be altered on the fly to match a given gameplay scenario. With this approach, each mission could feel as if it had its own purpose-built soundtrack, no matter its design.

Just Cause 4 (2018)

While the previous examples treat horizontal re-sequencing and vertical remixing in isolation, the principles pioneered by iMUSE and Banjo-Kazooie respectively can be combined to form a soundtrack that adapts to gameplay in multiple directions at once (i.e. both horizontally and vertically).

This was our approach to combat music in Just Cause 4. When the player enters a combat scenario, the music is driven both horizontally and vertically by different variables. The horizontal composition is driven by the overall ‘combat intensity,’ a combination of variables such as enemy type and the player’s heat level. With this approach, the composition of the music changes adaptively with gameplay: as combat intensity rises, more intense sections of music play, and as it falls, calmer sections take over.

The vertical layers of the music, on the other hand, are driven by the number of enemies engaged in the combat scenario. As that number fluctuates, so does the density of the orchestration. The video below outlines a combat scenario from Just Cause 4 that demonstrates the interaction between these implementation techniques.

The combination of these techniques allows a soundtrack to adapt to gameplay on a much deeper level. At the core of this matrix, however, are the same fundamental principles used in iMUSE and Banjo-Kazooie; the difference is that modern video game technology lets us take those techniques that much further.
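The two-axis approach described above can be sketched in a few lines. This is a hypothetical simplification rather than the shipped Just Cause 4 code (section names, thresholds, and layer counts are all assumed): one variable chooses the section (horizontal), the other chooses how many orchestration layers are audible (vertical).

```python
# Hypothetical two-axis adaptive music state (assumed names and thresholds):
# combat intensity drives the horizontal section choice, while the number
# of engaged enemies drives the vertical layer count.
def music_state(combat_intensity: float, engaged_enemies: int):
    sections = ["low", "mid", "high"]
    section = sections[min(2, int(combat_intensity * 3))]
    layers = min(4, 1 + engaged_enemies // 3)   # denser mix with more enemies
    return section, layers
```

Because the two axes are independent, a skirmish with many weak enemies can sound dense but calm, while a duel with one tough enemy can sound sparse but intense.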

Ape Out (2019)

As heavily reactive as the music of Space Invaders was, there are games still being released today whose soundtracks respond just as closely to what the player sees onscreen. Ape Out is a truly one-of-a-kind ‘beat ‘em up’ experience developed by Gabe Cuzzillo and published by Devolver Digital. The game features a frenetic, percussion-driven jazz soundtrack whose underlying beat fully responds to the player’s actions. As the player smashes objects and enemies, in-time drum fills and crash cymbals punctuate both the music and the gameplay, making the player feel as though they’re playing through a more mature version of Tom and Jerry.

The video below features some of the musical elements of Ape Out and showcases how they behave in tandem with the gameplay.

Killer Instinct (2013)

Though a slightly older title, Killer Instinct uses a very similar approach for the music underscoring the game’s ‘Ultra Combos.’ As the player goes in for their finishing move, samples from the soundtrack are triggered in time with the player’s animations, while also being quantized to the appropriate musical beat. Additionally, there’s a pool of samples for the system to choose from that progress logically toward the end of the musical passage, so the music remains fully adaptive no matter how long the combo takes. The result is a hyper-personalized soundtrack that ends both the match and the accompanying music with an incredibly well-choreographed, epic cadence.
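The beat-quantization step at the heart of both Ape Out and Killer Instinct can be sketched as follows. This is an illustrative helper, not either game’s code: a stinger triggered mid-animation is delayed until the next beat boundary so it lands in time with the music.

```python
import math

# Sketch of beat quantization: a musical stinger requested at an arbitrary
# time is scheduled for the next beat-aligned moment instead of firing raw.
def quantize_to_next_beat(trigger_time: float, bpm: float) -> float:
    """Earliest beat-aligned playback time at or after trigger_time (seconds)."""
    beat_len = 60.0 / bpm
    beats_elapsed = trigger_time / beat_len
    return math.ceil(beats_elapsed) * beat_len
```

At 120 BPM a hit requested at 1.1 s would play at 1.5 s, the next half-second beat boundary; a hit already on the beat plays immediately. The short delay is imperceptible as latency but transforms the hit into a musical event.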

Looking ahead

Although we’ve only scratched the surface of the fantastic adaptive music techniques developed throughout the history of video games, it’s easy to see the influence that early developers had on the techniques still used in the modern era. Whether it’s a system as straightforwardly elegant as the one in Super Mario Bros. or music as closely intertwined with gameplay systems as in Killer Instinct, technology breeds innovation through both its advances and its limitations.

What’s more, gaming as an art form is still in its relative infancy. In the coming years and decades, it’ll be exciting to see how the musical techniques we discussed here are taken to even greater heights.

November 1, 2021
