MUSIC AND TECHNOLOGY 1
The idea that music and science have anything to do with each other would have struck most people as ridiculous 50 years ago. After all, music is an art form, and the arts and sciences are often seen as opposites - or at any rate, as non-overlapping domains. Today, however, the gap between science and music may seem less of a gulf, thanks to the ubiquity of music and audio technology.
But technology is just the tip of the iceberg. It's the thing we're most aware of, but in fact, the relationship between science and music goes much deeper. One purpose of science is to pave the way to new technology, but it's also about improving our understanding of the world we live in. And, of course, music is an integral part of that world. Science can help to show us how music works, both in terms of the way it is created and the way we hear and respond to it. As surprising as it may seem, much of music is really quite mathematical in nature, from its basic scales and rhythms to the complex ways that different chords relate to each other. Musicians don't need a conscious understanding of these mathematical relationships - for most of them, it's a matter of intuition - but it can be interesting to look at them all the same. If nothing else, it shows that the domains of art and science aren't that far apart after all.
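To give one small, concrete example of those mathematical relationships (a sketch of my own, not drawn from any particular composer): on a modern keyboard tuned in equal temperament, each semitone step multiplies a note's frequency by the twelfth root of 2, so an octave of 12 semitones exactly doubles it, and a perfect fifth of seven semitones lands very close to the 'pure' 3:2 ratio.

```python
A4 = 440.0                  # concert pitch A, in hertz (a common modern standard)
SEMITONE = 2 ** (1 / 12)    # frequency ratio of one equal-tempered semitone

def note_freq(semitones_above_a4):
    """Frequency of the note a given number of semitones above A4."""
    return A4 * SEMITONE ** semitones_above_a4

octave = note_freq(12)      # A5, one octave up
fifth = note_freq(7)        # E5, a perfect fifth up

print(round(octave, 2))     # 880.0 - exactly double
print(round(fifth, 2))      # 659.26 - close to the pure 3:2 ratio, which would be 660
```

The tiny discrepancy in the fifth is the compromise that lets a keyboard play equally well in every key.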
This studio, used by electronic music pioneer Karlheinz Stockhausen, resembles a scientific laboratory.
Wikimedia user McNitefly, CC-BY-SA-3.0
Some of the connections between science and music have become more visible through the use of technology, such as synthesisers and digital audio workstations (DAWs), but the fact is that the connections have always been there. From a scientific perspective, all sounds, whether musical or otherwise, are vibrations in the air - or vibrations in any other medium, whether gas, liquid or solid. These vibrations spread out from the point of origin in the form of a wave, which is one of the most fundamental concepts in physics. Gravitational waves, radio waves, light, X-rays and even electron beams all propagate from A to B in the form of waves. In each case, something - some measurable quantity - is going up and down like a wave on the surface of the sea. In the case of a sound wave, the thing that goes up and down is pressure, so in contrast to an obviously undulating water wave, you have a moving pattern of compression and rarefaction. It's that pattern which, when it enters our ears, our cleverly evolved hearing system interprets as sound.
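The simplest case of such a pressure wave is a pure tone, where the pressure swings sinusoidally above and below atmospheric pressure. The short sketch below (the frequency and amplitude are illustrative numbers, not taken from any real instrument) samples one cycle of such a tone: positive values correspond to compression, negative ones to rarefaction.

```python
import math

frequency = 440.0     # oscillations per second (Hz) - concert A, for illustration
amplitude = 0.1       # peak pressure deviation in pascals (an illustrative value)
period = 1 / frequency

def pressure(t):
    """Pressure deviation from atmospheric at time t (in seconds)."""
    return amplitude * math.sin(2 * math.pi * frequency * t)

# Sample one full cycle: the pressure rises to a peak (compression),
# falls through zero to a trough (rarefaction), and returns.
for i in range(8):
    t = i * period / 8
    print(f"t = {t * 1000:.3f} ms   pressure = {pressure(t):+.3f} Pa")
```

A real musical note is a more complicated pattern than this, but it can always be described as a combination of such sinusoidal components.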
So how is a sound wave created in the first place? All sorts of natural processes can do that - anything that causes a disturbance in the surrounding air, from raindrops to a thunderclap. And, of course, human vocal cords. That's how music started - with simple a cappella singing - but to take it beyond that, we have to turn to technology. The Concise Oxford Dictionary defines technology as 'the application of scientific knowledge for practical purposes', so we're not necessarily talking about electronics here. We'll see in the next chapter how the ancient Greeks worked out how the design of a wind, string or percussion instrument determines the shape of the sound waves it produces. In that sense, even a drum, flute or lyre is 'music technology'.
Another big impact that technology has had on music is the ability to record a performance and play it back later. Today, that's inextricably linked to electronics, but it doesn't have to be. The first practical phonograph, invented by Thomas Edison in 1877, was purely mechanical. An ear trumpet-like horn funnelled incoming sounds onto a diaphragm, vibrating a stylus that scratched a copy of the audio waveform onto a tinfoil-covered cylinder rotated at a steady rate by a hand crank (wax cylinders and clockwork motors came a little later). Playing back the sound was literally the reverse of recording: the rotating cylinder made the stylus vibrate, and the horn amplified the sound to the point that it was audible.
A simple contraption like this is never going to produce the kind of high-fidelity sound you could sit and listen to with pleasure, but it's good enough to make some kind of permanent record of a performance. And that's where we come to the real irony - or maybe even tragedy - of the situation. Edison's phonograph was so simple it could have been made, if it had crossed anyone's mind to do so, centuries earlier. So just think of all the musical performances that could have been recorded but weren't. Both Bach and Beethoven were popular keyboard players as well as composers - and their real crowd-pleasers were virtuoso improvisations rather than written-out compositions. That, however, was in the days before phonographic recording, so all those performances are lost to history.
In fact, in the case of a keyboard player, such a recording doesn't even require a phonograph. Sometime after Edison's invention, someone belatedly worked out that, using a mechanism inside a specially modified piano, you can record a pianist's performance on a roll of perforated paper and then play it back automatically as often as you like. The 'player piano', as it was called, was very popular between its invention in 1896 and the advent of good (or at any rate acceptable) quality phonograph recordings in the 1920s, after which it fell into obscurity. But maybe not completely so. If you've ever used a DAW, you'll be familiar with its 'piano roll' editor - a name that harks back to the old perforated paper rolls of a player piano.
There's a rather silly trend in classical music these days towards 'authenticity' - getting as close as possible to the notated score, which is presumed to represent the composer's exact intention. But that might not be the case at all. When Chopin played one of his mazurkas - a Polish-style dance notated with three beats in a bar - for the conductor Charles Hallé, he played it with four beats in a bar. When Hallé pointed this out, Chopin explained that he was reflecting the 'national character of the dance'. If that performance had been captured on a piano roll or wax cylinder, Chopin's mazurkas might be played in a completely different way today.
As it happens, we do have a recording of one famous 19th-century composer playing the piano. Johannes Brahms died in 1897 - which happens to be the same year that a Cambridge professor named J.J. Thomson discovered the electron, so you know the quality of recordings made at this time is not going to be good. Made in 1889 using an Edison phonograph, it's a strong contender for the worst recording of a great musician ever made. You can hear it in a YouTube video made by a talented young pianist calling himself 'MusicJames'.* It's well worth watching the whole thing; with the aid of some clever detective work, James disentangles exactly what Brahms is playing - and it's not what any classical pianist today would expect.
The basic problem with the recording becomes apparent right at the start when Brahms speaks a few words in an unnaturally high-pitched voice. That's because the wax cylinder only preserved higher-frequency sounds, so that when he starts playing one of his Hungarian dances, you can only hear the notes his right hand plays, not the left. The trouble, from a musician's point of view, is that what the right hand is playing doesn't quite match the notated score. By inferring what must be going on in the left hand, James concludes that Brahms is playing in what might well have been the standard pianistic style of his time, full of 'rhythmic improvisation techniques which have gone extinct in classical music over the last century'.
Without technology, even in the crude form of an Edison phonograph, musical insights like this would be impossible. And without technology in the far more sophisticated form of digital workstations, electronic synthesisers and streaming services, a vast proportion of today's music simply wouldn't exist. In turn, the technology itself wouldn't exist if someone - generally the developers rather than the users - did not have a strong grasp of the underpinning science.
As an example, consider the brilliant, genre-hopping musician Jacob Collier, who is still in his twenties as this book is being written. There's no obvious science in his background or upbringing. His mother is a music teacher, violinist and conductor, and his grandfather was leader of the Bournemouth Symphony Orchestra. As a child, Collier sang the key role of Miles in three different productions of Benjamin Britten's The Turn of the Screw, one of the greatest of all 20th-century operas. In 2021, his song 'He Won't Hold You' won a Grammy award, and he narrowly missed getting Album of the Year too. Part of his popularity undoubtedly comes from a series of YouTube videos in which he shows exactly how he creates his music. And now we come to the point, because some of his explanations look more like science than art.
Like many musicians today, Collier puts his songs together almost entirely with the aid of software, in the form of a DAW. Apple users, whether they're musicians or not, probably have some knowledge of how this works through playing around with the GarageBand DAW that comes free (albeit in cut-down form) with Macs, iPads and iPhones. The DAW used by Collier is a more sophisticated Apple product called Logic Pro,...