I’ve thought of making an acoustic guitar with a MIDI pickup, Pisound, contact microphone, and exciter speaker. Maybe I’ll get around to it someday; there should be some decent possibilities there.

I confirm!
Tom is a Usine Hollyhock user :wink:

I didn’t know about the Morton Subotnick ghost pieces, that’s really interesting stuff. Here’s a master’s thesis I found which goes into quite a lot of detail describing the historical, technical and artistic context of the works, as well as how they actually work: https://scholarworks.sjsu.edu/cgi/viewcontent.cgi?referer=https://www.google.com/&httpsredir=1&article=4861&context=etd_theses

The acoustic and electronic elements are exceptionally well balanced; listening to those pieces, I often can’t tell what is acoustic and what is processed. I see his approach of using pre-recorded gestural automation as a slightly different thing, a precursor perhaps, to a realtime-playable augmented instrument, but this idea of creating an uncanny atmosphere into which acoustic instruments are played is an interesting way of thinking about it, and it seems very characteristically psychedelic Subotnick :slight_smile:

Exciter speakers, and their potential to reduce or remove the need for a PA, have interested me for a while, but I’ve never experimented with them. I think @papernoise did an interesting piece a while back with an exciter-driven string quartet; they’re probably a good person to ask about how to achieve that technically. Placing a contact microphone and an exciter on the same acoustic body seems like a recipe for feedback, but that in itself could be an interesting thing to experiment with.

Yeah, 7-bit MIDI is a pain, although in my experimentation so far I’ve found that its steppiness is less noticeable when using it to process acoustic material than when controlling synthesized sound, as the variations and imperfections in the acoustic sound can smooth out and hide the steps. But of course it depends on what you’re using the data for.
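(As an aside: a minimal sketch of one common fix for CC steppiness is to slew the 128 discrete levels with a one-pole (exponential) ramp so downstream processing sees a continuous signal. The names and coefficient here are my own inventions, not any particular library’s API.)

```python
# Hedged sketch: smooth stepped 7-bit MIDI CC input with a one-pole
# (exponential) slew, so the control signal ramps between the 128 levels
# instead of jumping. Names and the coefficient are illustrative only.

def make_cc_smoother(coeff=0.1):
    """Return a smoother mapping raw CC values (0-127) onto a 0.0-1.0 signal.

    coeff is the fraction of the remaining distance covered per call
    (0 < coeff <= 1); smaller values smooth more but respond more slowly.
    """
    state = {"y": 0.0}

    def smooth(cc_value):
        target = cc_value / 127.0              # normalise the 7-bit value
        state["y"] += coeff * (target - state["y"])
        return state["y"]

    return smooth

smooth = make_cc_smoother(coeff=0.5)
# A sudden jump from CC 0 to CC 127 becomes a ramp toward 1.0:
steps = [round(smooth(127), 3) for _ in range(5)]
print(steps)  # [0.5, 0.75, 0.875, 0.938, 0.969]
```

In a real patch you’d call the smoother at control rate (or once per audio block) and pick the coefficient by ear against the material.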

The augmented trumpet thesis mentions that, if you’re using any sort of DSP for the audio processing, latency introduced by the controller system is likely to be dwarfed by the latency introduced by the interface, computer and DSP algorithms.

Hollyhock looks interesting! Between that, Bidule and Reaktor, I feel like my current approach of trying to implement all custom audio processing and control software in Pure Data might be a dead end :confused:

I’m sort of obsessed with the Vo-96, despite not having any hope of ever owning one, nor even being a guitar player.

This is a big area of interest for me, though these days less in terms of a controller/sensor type thing and more in terms of things driven by audio input and analysis. That, along with some high-level control (generally via monome and a Softstep), is how I approach “augmenting” an instrument.

That being said, I am working on trying to get more information out of a drum that can then be used to drive audio processing:

One thing I commented on in another thread is that it’s a shame to see really powerful sensor and/or synthesis technology locked behind software walled gardens, where all that’s available is “regular” MIDI.

Latency-wise, you’d be surprised what you can get away with. Even doing percussion-heavy music, using processes triggered by onsets detected in an incoming audio stream, with attack-based sounds that fade quickly, you can’t tell if I have my I/O vector size set to 64. Even 128 is passable, really.
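Those vector sizes translate to very small per-buffer delays. A quick sketch of the arithmetic, assuming a 44.1 kHz sample rate (the post doesn’t state one):

```python
# Per-buffer latency for a given I/O vector size. 44.1 kHz is an assumed
# sample rate; a full round trip is roughly input buffer + output buffer
# on top of converter/driver latency, so these figures are lower bounds.

def buffer_latency_ms(vector_size, sample_rate=44100):
    return 1000.0 * vector_size / sample_rate

for n in (64, 128, 256):
    print(f"{n:4d} samples -> {buffer_latency_ms(n):.2f} ms per buffer")
# 64 -> 1.45 ms, 128 -> 2.90 ms, 256 -> 5.80 ms
```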

For me the bigger issue is the resolution (and meaningfulness) of the data you have coming in. So 7-bit MIDI is shitty, as is uncalibrated/unsmoothed/noisy data.
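For what it’s worth, the calibration/smoothing step doesn’t have to be elaborate. A generic sketch (made-up names and ranges, not any specific sensor’s API) that maps a measured raw range onto 0–1 and low-passes it:

```python
# Hedged sketch of basic sensor conditioning: calibrate a raw reading
# against a measured min/max, clamp it, then smooth with an exponential
# moving average. All names and values here are illustrative.

def calibrate(raw, raw_min, raw_max):
    """Map a raw reading onto 0.0-1.0, clamped to the calibrated range."""
    x = (raw - raw_min) / (raw_max - raw_min)
    return min(1.0, max(0.0, x))

def ema(values, alpha=0.2):
    """Exponential moving average: higher alpha tracks faster, smooths less."""
    out, y = [], None
    for v in values:
        y = v if y is None else y + alpha * (v - y)
        out.append(y)
    return out

# Noisy readings from a sensor whose usable range was measured as 100-900:
raw = [100, 520, 480, 900, 110, 500]
norm = [calibrate(r, 100, 900) for r in raw]
smoothed = ema(norm, alpha=0.5)
```

The two knobs (calibration range and smoothing coefficient) are exactly the things that make otherwise-noisy data feel meaningful under the fingers.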

There’s tons of interesting stuff going on in this area, off the top of my head here are two super-modified trumpets:

http://www.electrumpet.nl/Site/Electrumpet.html

http://www.sarahbellereid.com

I assume a lot of lines folks may also be Ableton folks, but if not, there is a cool video from last year’s Loop conference that they just posted the other day…

https://www.ableton.com/en/blog/motors-magnets-and-motion-electronic-instruments-physical-world/

I could see this turning into a cool delay… or maybe it will just be really noisy

The technology behind the Vo-96 was released as an e-bow-like exciter called the Wond. Paul Vo is working on the next generation of it, called the empick. https://www.paulvo.com

RPI is a little beehive of electroacoustic research:
http://www.arts.rpi.edu/pl/research-s3

I thought this was how plate reverbs work?

Here is a piece of mine that is for double bass, ice in a glass, chopsticks, and compressor with amplification system. A fairly simple electro-acoustic arrangement using the instrument as a resonance source. Initially inspired by a @disquiet prompt.

I also frequently perform double bass + modular electroacoustic stuff and have an occasional podcast interviewing people who make and perform this kind of music.

Sarah Belle Reid’s stuff is fantastic!

How do I find the podcast?

I post them here:

https://gahlorddewald.com/tag/and-electronics/

I’ve been very fortunate to get interviews with a variety of people doing interesting things with instruments + electronics—Sarah Kirkland Snider (composer of chamber orchestra + electronics song cycle Penelope), Meerenai Shim (label owner and flute + electronics performer), David Vickerman (wind ensemble and electronics conductor), and a bunch more.

If there are people you’d like to hear interviewed definitely let me know. I don’t keep a regular schedule of it because I have many things to do and only so many hours in a day, but I love doing them.

Sounds very cool, are they available on Apple podcasts?

I don’t know how I missed this thread the first time around… One of the neater things I saw at NIME a few years ago was the Feedback Cello from Alice Eldridge and Chris Kiefer.

They do the “resonant transducer on the body of the instrument” thing, which is part of a feedback path to the strings and pickups… Some lovely sounds there.

unamplified cello excited by unspecified devices/objects, externally activated and controlled by low frequency resonance (via Max)

It isn’t yet, unfortunately @eblomquist. I keep meaning to do that but I haven’t yet reserved an afternoon to do all the dull fiddly bits of googling the process and submitting it. I will eventually though!

Understood, but you’d have at least one devoted listener here if you do!

Knowing this, I’ll get it into Podcasts much sooner! :slight_smile:

The idea of augmenting instruments through electronics is the focus of my composition work. I’m an oboist and use Max/MSP in my work. My teacher Mari Kimura always believed that these types of works are most effective when the laptop and performer are on equal footing: basically, write a program that acts as a second performer and supports the instrument rather than stealing the spotlight. I don’t use any prerecorded sounds in my work either; I like the idea that every piece will be different.

My two newest works for oboe and live interactive electronics are here:
https://soundcloud.com/brandonlabadieoboe/lotos-eaters

https://soundcloud.com/brandonlabadieoboe/still-untitled
