The impact of technology on music

I think a steep learning curve can be ok for any design if it suits the circumstances. Lots of expert systems are hard to learn but work really well once you know them.

It’s all about the goals…

Which is to say that novel interface controls in software instruments can be awesome, but only if people have a reason to invest in learning them. For software instruments, if the creator wants to give people an entry point, using familiar elements is a good way to do that.

I think it could also be argued that software instruments, as opposed to hardware ones, require a different kind of learning because they can be less “explorable” … a piano or a guitar makes noise and has physical cues that help you figure out how to play it even if you’ve never seen one before. A lot of software doesn’t do anything unless you use it “correctly”, and the feedback mechanisms that promote learning through exploration and play have to be intentionally designed in or they won’t exist.


Heard this earlier today, and thought you all might appreciate it. Should we devote a thread to the history of electronic music, or will this one suffice? Maybe here: The Language of Electronic Music? Are they one and the same, or just tangentially related?

Regardless, I will try to go and report back. Looks fun.


Realise this is an older thread but I’ve been thinking about this subject today.

I’d been reading on the role of reverb and found this line:

Hope Bagenal, the senior acoustic consultant of the Royal Festival Hall, considered that the insertion of galleries in Lutheran churches, which reduced reverberation, “is the most important single fact in the history of music because it leads directly to the St Matthew Passion and the B Minor Mass”.

I’d been thinking on how the acoustics of a space shaped composers like Bach and how UAD effects users are now able to use modelled spaces like the Ocean Way and Capitol Chambers plug-ins in their own productions.

Given how those reverbs impart a famed character and can be used to connote an atmosphere, it seems like we’re getting back to writing music with specific ambiences in mind.


I don’t think people ever stopped. Consider Metallica’s Black Album: it likely moved towards longer, slower grooves and riffs in large part because of the influence of mostly playing stadiums. Similarly, I think the thump-thump of dance music over the last 30 years is at least partially a response to the move towards cavernous warehouses and clubs, where nimble disco beats get lost in a smear of space.

Having played loud electronic noise live in a cafe, I can say confidently that quiet, intimate music makes much more sense for the venue.


space and architecture are definitely part of the “technology”…

i’ve been reluctant to post in this thread so far, because whenever i see the title, i think of how obvious it is that music has always been influenced by technology. the piano is a technological improvement on the harpsichord, rock n roll wouldn’t have happened without guitars becoming electric, etc.

certainly going back centuries is still relevant, too, when we look at scales, keyboard interfaces, and when we assess musical gestures and musical interactions… for the most part, they all boil down to a strike, a pluck, a rub, a press, or a breath. the only difference is that the thing on which we enact them changes.


I tend to agree, but it can be fun and useful to try to understand the ways style has been affected by technology over time, because there are so many: the piano led to greater dynamic range, microphones let crooners sing softly over orchestras, and big amplifiers led guitarists to rely more on power chords. Then there are snowball effects: distortion led to less intricate guitar playing, so rock bands shifted intricacy over to intros and soloing. As rock generally moved towards less intricacy, folk/bluegrass moved towards more (think John Fahey, Robbie Basho). And as rock solos increased in intricacy, or more disparate parts were added to make up for the core simplicity, certain styles increased the complexity of their time/harmony/structure choices (e.g. prog), while other styles of rock shifted to dogmatic non-intricacy in response (punk/hardcore), etc.

It’s possible we’ve now moved much more towards the medium influencing style than the tools themselves. Like, what do streaming and playlists do to compositional choices, or how has the snippet/demo culture of Instagram changed certain styles of guitar or synth music? Given the discussion of interfaces above, I wonder whether they have had much influence on musical style, at least in the last 10 years?


I don’t think the title implies a question of whether technology impacts music - it’s about observing how it does &, since technology is constantly progressing, the nature of its impact is in an eternal state of flux.

I think of this every time I see Norns threads where people have created something new with it. Perhaps this continues a trend seen in Max/MSP, Reaktor, PureData etc but I still find it fascinating that people are coding their own musical tools. What Bram did with Mozaic on iOS feels akin to it in different ways too.


largely a result of recent developments in processing power, I think - there’s plenty of CPU now to run inefficient, easy-to-write code processing audio in real time.
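To make that concrete, here’s an illustrative sketch (my own example, not from any of the tools mentioned above): a one-pole low-pass filter written sample-by-sample in plain Python, with no vectorization or native extensions - exactly the kind of “inefficient” code the post describes. A second of audio at 48 kHz is only 48,000 loop iterations, which a modern CPU chews through with time to spare.

```python
import math

def one_pole_lowpass(samples, a=0.1):
    """Naive per-sample smoothing: y[n] = y[n-1] + a * (x[n] - y[n-1])."""
    out = []
    y = 0.0
    for x in samples:
        y += a * (x - y)  # one multiply-add per sample, no buffering tricks
        out.append(y)
    return out

# One second of a 440 Hz sine at 48 kHz - a realistically sized audio buffer.
signal = [math.sin(2 * math.pi * 440 * n / 48000) for n in range(48000)]
filtered = one_pole_lowpass(signal)
```

Even interpreted Python handles this comfortably; the same logic in a scripting layer on a device like Norns, or in a Mozaic script, has orders of magnitude of headroom on today’s hardware.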


Some amazing technology in this 102-key piano, and the reviewer speaks extensively about its implications for the music played on it.

Stuart & Sons 102-Key Piano Review