
GreenPoint Insights: Techno Robots

GreenPoint • Eric Boothe

August 2025

MIDI 1.0: The Standard That Birthed Techno

In 1983, Dave Smith (founder of Sequential Circuits, a pioneer in digital music synthesizers) and Ikutaro Kakehashi (founder of Roland, still one of the biggest names in musical equipment) banded together to create one of the most influential standards in digital music: MIDI. The Musical Instrument Digital Interface, or MIDI, is the standard way that synthesized music makes its way into a computer environment for ease of manipulation. Think of it this way: prior to MIDI, if I played a "C" note on an analog piano, my best representation of that note was a static recording of that sound. MIDI gave us a way to put a board on a piano synthesizer that would create a digital representation of that note. That digital representation opened the door to revolutionary manipulation: I can now move, tune, sync, or even fundamentally change the sound of a digitally captured sequence. What if I turned a sick piano jam into a drum beat with a mere click of a mouse?
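To make that concrete, here is a minimal Python sketch (my own illustration, not part of any official MIDI tooling) of what a MIDI 1.0 note actually looks like on the wire: three bytes that a machine can replay, transpose, or retime at will.

```python
# A MIDI 1.0 "Note On" event is just three bytes:
# status (event type + channel), note number, and velocity.
NOTE_ON = 0x90  # high nibble 0x9 = Note On; low nibble = channel

def note_on(note: int, velocity: int, channel: int = 0) -> bytes:
    """Build a 3-byte MIDI 1.0 Note On message."""
    assert 0 <= note <= 127 and 0 <= velocity <= 127 and 0 <= channel <= 15
    return bytes([NOTE_ON | channel, note, velocity])

middle_c = note_on(60, 100)   # note number 60 is middle C
print(middle_c.hex())         # -> "903c64"

# Manipulation becomes trivial: "move" the captured note up an octave
# by rewriting one byte, no re-recording required.
octave_up = bytes([middle_c[0], middle_c[1] + 12, middle_c[2]])
print(octave_up[1])           # -> 72
```

That second step is the whole revolution in miniature: the note is data, so changing the music is just changing a number.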

Enter: techno. I'm entering very dangerous territory by putting a definition on a music genre (I was scolded not two days ago for calling Lorna Shore "heavy metal" rather than "deathcore"). Techno is loop-based music that usually follows fairly repetitive patterns at fast tempos (120-135 BPM). There isn't a ton of melody, and in the 1980s most of the tones produced were driven by analog and digital synthesizers.
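To show what "loop-based at 120-135 BPM" means in practice, here is a toy Python step sequencer (purely illustrative; the four-on-the-floor pattern and 128 BPM tempo are my own picks from techno's typical range) that schedules kick drum hits over a 16-step grid.

```python
# Toy step sequencer: bars of four-on-the-floor at 128 BPM.
BPM = 128
STEPS_PER_BEAT = 4                        # 16th-note grid
STEP_SECONDS = 60 / BPM / STEPS_PER_BEAT  # duration of one step

# "x" = kick hit, "." = rest: hits land on every quarter note.
KICK_PATTERN = "x...x...x...x..."

def kick_times(pattern: str, bars: int = 1) -> list[float]:
    """Return the time (in seconds) of every hit, looping the pattern."""
    times = []
    for bar in range(bars):
        for i, step in enumerate(pattern):
            if step == "x":
                times.append((bar * len(pattern) + i) * STEP_SECONDS)
    return times

print(kick_times(KICK_PATTERN, bars=2))  # 8 evenly spaced kicks
```

The repetition is the point: one short pattern, looped and layered, is the skeleton of most techno tracks.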

Techno is the sound of machines making music. But until very recently, there has always been a human brain behind the music.

Key Events & Milestones

1983–1989
- MIDI standard released (1983), enabling synced drum machines and synths
- Detroit techno emerges (Juan Atkins, Derrick May, Kevin Saunderson)
- Roland TR-909, TB-303, and digital synths define the early sound
- Chicago house rises in parallel

1990–1999
- Techno spreads to Europe: Berlin, the UK, the Netherlands
- Rave culture explodes; illegal warehouse parties become global
- Rise of subgenres: acid techno, minimal, hardcore, trance, drum & bass
- DAWs like Cubase and Logic begin displacing hardware setups

2000–2009
- Digital production goes mainstream: Ableton Live, Reason, FL Studio
- Minimal techno (e.g., Richie Hawtin) and tech-house dominate clubs
- EDM boom begins in the U.S. late in the decade
- Music festivals (e.g., Movement, Sonar, Mutek) spotlight techno artists

2010–2019
- Global EDM explosion (though more commercial than techno): Skrillex, Calvin Harris, etc.
- Techno resurgence: Berlin (Berghain), Detroit (Underground Resistance)
- Rise of modular synths and hybrid analog-digital setups
- Streaming platforms and Boiler Room sets increase access

2020–Present
- COVID-19 lockdowns push virtual DJ sets and livestreaming
- Techno becomes more genre-fluid (blending with ambient, breakbeat, industrial)
- MIDI 2.0 ratified (2020), extending expressive possibilities
- Renewed interest in hardware and analog aesthetics in digital music


MIDI 2.0: Giving Rise to AI Music Creators

MIDI 1.0 in 1983 gave humans a unique pathway to push music synthesis into a digital environment, giving rise to completely new music genres and opening up new channels of creativity in existing ones. Several other major technologies accelerated the advancement of music in the digital sphere: the internet created a whole host of distribution channels, both illicit (LimeWire) and sanctioned (Spotify). The internet also created sharing mediums to socially collaborate and "riff" on one another's creations, like SoundCloud. Today, we have a thriving ecosystem of digitally represented sounds, rhythms, voices, tones, instruments, and creative brains.

In 2020, the MIDI 2.0 standard was adopted. In the 37 years since MIDI 1.0 was introduced, not only did the distribution and ecosystem of music radically change, compute did as well. Today, the average DJ deck (turntable-looking thingy) is millions of times more powerful than the computer used to guide the Apollo missions. The total digital library of music is represented by roughly 200 million tracks and 5 petabytes of data (5 x 1,024 terabytes, where one terabyte is probably your laptop's storage). These changes led to necessary evolutions of the MIDI standard: one can now manipulate individual notes for pitch, timbre, and articulation; musical equipment can now communicate bi-directionally (meaning your computer can give feedback to the instrument producing the music); and the precision of data and digital representation has increased manyfold. Once again, the standards have opened the door to yet another creative revolution:
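One of those precision gains is easy to see in code. The sketch below (a simple proportional scaling of my own, not the spec's exact translation rule) contrasts MIDI 1.0's 7-bit note velocity with MIDI 2.0's 16-bit velocity.

```python
# MIDI 1.0 stores note velocity in 7 bits (128 levels);
# MIDI 2.0 widens it to 16 bits (65,536 levels).
MIDI1_MAX = 127       # 2**7 - 1
MIDI2_MAX = 65535     # 2**16 - 1

def upscale_velocity(v7: int) -> int:
    """Map a 7-bit MIDI 1.0 velocity onto the 16-bit MIDI 2.0 range."""
    assert 0 <= v7 <= MIDI1_MAX
    return round(v7 * MIDI2_MAX / MIDI1_MAX)

print(upscale_velocity(127))  # -> 65535: full velocity maps to full velocity
print(2**16 // 2**7)          # -> 512: 16-bit velocity is 512x finer
```

That extra resolution is what "manipulate individual notes for pitch, timbre, and articulation" rests on: far more gradations of expression per note than the 1983 standard could carry.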

Techno can now be machines making music.

In May of 2020, OpenAI released its whitepaper on GPT-3, with developers gaining access to the wealth of OpenAI's training power a few months later in October of 2020. While GPT was trained on text data (like, all of it), it made one possibility apparent: any form of digital data could be used to create a foundation model. All of a sudden, the value of MIDI, the concrete representation of digital music down to the individual stroke of a piano key, became clear: this was data that could be used to train the ChatGPT equivalent of a master composer.

Startups were quick to train on and commercialize this opportunity. Founded in 2019 and soft-launched in April of 2022 (a full 8 months before ChatGPT), Soundful has quickly risen to become one of the dominant AI music creation platforms, with more than 150 genres that can export as a standard MP3 file or even as MIDI for music producers to further manipulate and bake into their own music. While the entirety of their secret sauce is not made public, it is almost certain that their foundation model is trained on a massive trove of publicly available data, including raw MIDI data used to learn individual musical sounds and tones. Today, Soundful and its competitors host impressive tracks created by generative AI that can easily be manipulated with chat prompts and then used across any platform for very modest licensing fees.

In the 42 years since MIDI was adopted, this elegant digital standard gave way to new digital representations of music, an ecosystem of new creation, and a data trove of this digital music world. And now, we have Techno Robots.

Techno Robots Are Sick: But What Does This All Mean?

Thanks for calling them "sick," they'll remember that. There are a few important foundational themes here that will give way to future parts of our series on robots: robots are effectively substitutes for humans that operate in sensory dimensions outside of a pure digital space, standards beget the data to train AI, and technology gives way to revolutionary enablements that let robots better exist in our human space.

Music is an excellent representation of a common sensory dimension of humans. I'm not sure if other species find our music interesting (except maybe this horse that prefers Pantera and Slipknot to country music; slay, horse), but music, some personal preferences aside, is a universally understood form of communication and expression for our species (unless you're reading this, horse). Similar sensory environments exist that have been difficult for robots to abstract and manipulate: taste, smell, physical touch, complex physical environments, and so on. These difficulties, at least today, create significant barriers to robots substituting for humans in these environments: you need to taste and smell to be a chef, you need to be able to touch with precision to be a physician, and you need to scale buildings to be an ironworker.

The MIDI standard gave rise to a means of connection between thousands of years of music created by human hands and a rapidly rising digital world. This standard was the conduit by which millions of songs are now not only digitally represented but originated entirely in a digital space. This data powers AI models that are not only capable of creating Robot Techno; they could arguably begin to understand it. Could music be what gives AI empathy?

Lastly, technology outside of MIDI has enabled and powers an entire ecosystem of creativity. The internet, peer-to-peer file sharing, dramatic uplifts in compute and data storage, social media, and the massive democratization of creation (SoundCloud, $80 keyboard synths, $180 FL Studio licenses) have all spun up network effects that reduce barriers to entry and increase the rate at which new ideas and data spawn.

We're in an era in which new means of engaging with human environments, data creation, and enabling technologies for robots are emerging at an unprecedented rate. The implications of this for all sectors, especially for real assets, which almost entirely rely on humans (both for their creation and consumption), are massive. More to come in future parts of this series.

More information

To receive our insights pieces, subscribe here.

For more on this story, please contact us here.