A colleague pointed me to these two papers by Reinhard Gupfinger and Martin Kaltenbrunner (Kaltenbrunner of Reactable fame), both in the journal Multimodal Technologies and Interaction. I’m posting their abstracts.

Animals Make Music: A Look at Non-Human Musical Expression, in volume 2, issue 3 (2018).

The use of musical instruments and interfaces that involve animals in the interaction process is an emerging, yet not widespread practice. The projects that have been implemented in this unusual field are raising questions concerning ethical principles, animal-centered design processes, and the possible benefits and risks for the animals involved. Animal–Computer Interaction is a novel field of research that offers a framework (ACI manifesto) for implementing interactive technology for animals. Based on this framework, we have examined several projects focusing on the interplay between animals and music technology in order to arrive at a better understanding of animal-based musical projects. Building on this, we will discuss how the implementation of new musical instruments and interfaces could provide new opportunities for improving the quality of life for grey parrots living in captivity.

The Design of Musical Instruments for Grey Parrots: An Artistic Contribution toward Auditory Enrichment in the Context of ACI, in volume 4, issue 2 (2020).

One particular approach in the context of Animal Computer Interaction (ACI) is auditory enrichment for captive wild animals. Here we describe our research and the methodology used to design musical instruments and interfaces aimed at providing auditory enrichment for grey parrots living in captivity. The paper is divided into three main phases: a project review and classification, sonic experiments at the parrot shelter and the design of musical instruments. The overview of recent projects that involve animals in the interaction and music-generation process highlights the costs and benefits of projects of this kind and provides insights into current technologies in this field and the musical talents of animals. Furthermore, we document a series of sonic experiments conducted at a parrot shelter to develop acoustically enriched environments through the use of musical instruments. These investigations were intended to provide a better understanding of how grey parrots communicate through sound, perceive and respond to auditory stimuli and possibly generate sound and music through the usage of technological devices. Based on the cognitive, physiological, and auditory abilities of grey parrots, and their intrinsic interest in sonic and physical interactions, we finally developed and tested various interactive instrument prototypes and here we present our design results for auditory enrichment in the context of ACI and artistic research.


Edit:

I’m reading the first (2018) paper, and it references the project metamusic:

[whose] target is to improve the animals’ quality of life, by designing an interactive musical environment that takes the specific needs and skills of the animals into consideration. metamusic, developed by the artists’ group alien productions in collaboration with the zoologists and animal keepers of the ARGE Papageienschutz centers its attention on grey parrots.

From the paper, pp. 1–2:

Recent research in the field of cognitive biology has focused on the role of animals listening to human music as a concept of enrichment [10–13]. Since most of the music is selected by humans, this can lead to anthropomorphic biases. Therefore, the music should be attuned to the animals’ auditory skills. Studies have shown that non-human species also have musical skills [14–16] and display entrainment to auditory stimuli [17,18]. Animal species such as grey parrots, cockatoos, elephants, primates, pigeons, and carps have been found to be able to discriminate between different composers or different genres, prefer music to silence, or move in rhythmic synchronicity to the musical beat [10, 14, 17–19].


A thought arrived in my brain: so, there is this notion of humanizing music; I think the basic idea is to add imperfection and “expression” through sprinkles of randomness, to render more human-like sounds which would otherwise maybe sound “mechanical”, “repetitive” and “motorik”.
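Just to make the idea concrete, a minimal “humanize” sketch might look something like this (Python; the note format, function name and jitter amounts are made-up placeholders for illustration, not anyone’s actual implementation):

```python
# A minimal sketch of "humanizing" a note sequence, assuming notes are
# plain (onset_seconds, velocity) pairs rather than any particular
# sequencer's or MIDI library's types. Small random offsets are added
# to timing and velocity so a rigidly quantized, "motorik" pattern
# sounds a little less mechanical.
import random

def humanize(notes, timing_jitter=0.01, velocity_jitter=6):
    """Return a copy of `notes` with slight random variation.

    notes: list of (onset_seconds, velocity) tuples on a rigid grid.
    timing_jitter: std deviation of the onset offset, in seconds.
    velocity_jitter: std deviation of the velocity offset (MIDI 0-127).
    """
    humanized = []
    for onset, velocity in notes:
        onset += random.gauss(0.0, timing_jitter)
        velocity += random.gauss(0.0, velocity_jitter)
        # keep velocity in the valid MIDI range
        velocity = max(1, min(127, round(velocity)))
        humanized.append((max(0.0, onset), velocity))
    return humanized

# Example: a perfectly even eighth-note pattern at 120 BPM (0.25 s grid)
grid = [(i * 0.25, 100) for i in range(8)]
print(humanize(grid))
```

Gaussian jitter is just one choice here; uniform or correlated noise would do the job too.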

That begs the question: what would “animalizing” do? If there were a Eurorack module (or norns script, wink wink) for “animalizing”, what should it do?
