Yeah, some cool ideas, and I like how they're presented, too. Reminds me of Jack Shaedler's posts (but smaller scale), which is where I came across it.
Have really enjoyed reading through this thread. Instrument interfaces have been on my mind lately because I'm struggling to find one I can commit to. It's possible I just can't commit and will always be reinventing ways to interact with my instrument. I've been working with a modular for the last two years, and I'm almost exclusively an improviser with very little formal training. That said, I've devoured as much music theory as I can over the years, so I think I've got a reasonable base to rely on. I play some guitar and understand the structure of a piano keyboard, but I've never committed to real technique with either.
All that's background to say: one of my favorite interfaces is the Make Noise Rene. I really love conductive pads, and the configurability of "it's a sequencer that you can play like an instrument" is great. The James Cigler Rene videos really opened my mind to some unique possibilities, especially his Rene 301 video.
I'm tempted to go all in and build a skiff around Rene and Pressure Points as my end-all, be-all interface, but I waffle constantly. There's an aspect of my personality that prefers breadth over depth, which is incompatible with the majority of musical interfaces throughout history.
Coming back to this thread since the topic is near and dear, and recently much on my mind as I've been playing with some live coding environments. (Thanks also for that nudge in particular @charlieroberts!) Anyway, curious if any of you are familiar with the Cognitive Dimensions framework as an approach for reasoning about usability. In my past life I was very preoccupied with the UX of programming languages and environments, and this framework basically blew my mind. A few papers apply it to music applications specifically (notably "Cognitive Dimensions and Musical Notation Systems" (2000) and, more recently, "The cognitive dimensions of music notations" (2015)), although I think the general gist applies to anything. Anyway, hope someone finds this useful!
This is super interesting to me, relevant to my professional life as well as my personal interests.
I frequently find myself in a position where I'm critiquing an API because I need it to behave a certain way to support a specific desirable user experience, but that argument can be so indirect. This gives me some tools for addressing API usability more directly.
Yes, these are great! It’s really nice to have a common vocabulary to refer to, and a method of evaluation that doesn’t necessarily rely solely on formal user studies, which I think are often imperfect for studying expressive interactive systems. Alan Blackwell is very active in the live-coding community (as are many of his students / former students), so a lot of live coding references the CDN.
My favorite “Intro to CDN” paper: https://www.researchgate.net/profile/Marian_Petre/publication/220578787_Usability_Analysis_of_Visual_Programming_Environments_A_‘Cognitive_Dimensions’_Framework/links/02bfe50fbf23476730000000.pdf
A seasonal bump here, I suppose.
I was considering making a whole new thread for some thoughts I was having this morning around touch-screen sliders and knobs.
But of course, the Vulf screenshot and Goodhertz quote from here is immediately relevant.
The main question I’m thinking about today is: What are the best practices for designing intuitive parameter controllers for touch screens?
Spurred from a more practical discussion elsewhere about iOS specific things like AutoLayout (apparently Apple’s announced that AUv3 plugin hosts on iOS will now be able to specify their plugin viewport size, which makes plugin UI design a bit more challenging)…
This is, I guess, an age-old question of digital skeuomorphic (and post-skeuomorphic) music UI… knobs or sliders? If knobs, rotation-based or linear?
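To make the distinction concrete, here's a minimal sketch of the two knob-mapping strategies as pure functions. All names and the specific sweep/sensitivity values are my own illustrative choices, not taken from any particular app:

```typescript
// Two common ways to map a touch drag onto a knob's 0..1 value.
// All names and constants here are illustrative assumptions.

// Linear mapping: vertical drag distance changes the value,
// regardless of where the finger sits relative to the knob.
function linearKnob(startValue: number, dragDy: number, pixelsPerFullTurn = 200): number {
  // Dragging up (negative dy in screen coords) increases the value.
  const next = startValue - dragDy / pixelsPerFullTurn;
  return Math.min(1, Math.max(0, next));
}

// Rotation mapping: the value follows the angle of the finger
// around the knob's center, like turning a physical knob.
function rotaryKnob(cx: number, cy: number, touchX: number, touchY: number): number {
  // atan2 gives the touch angle relative to the knob center (-PI..PI).
  const angle = Math.atan2(touchY - cy, touchX - cx);
  // Map a 270-degree sweep onto 0..1, starting at the down-left position.
  const sweepStart = (3 * Math.PI) / 4;
  let sweep = angle - sweepStart;
  while (sweep < 0) sweep += 2 * Math.PI;
  return Math.min(1, sweep / ((3 * Math.PI) / 2));
}
```

The trade-off in a nutshell: linear mapping lets sensitivity be tuned independently of knob size (small knobs stay usable), while rotation mapping matches the visual metaphor but forces your finger to orbit a control it's simultaneously covering up.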
One thing I've noticed that's really interesting: DAWs seem to regularly provide a generic, linear-slider-based parameter control view for VSTs. Ableton does this, and GarageBand on iPhone does this… I think others do as well.
Exploring GarageBand on iPhone a bit more, I discovered that its instrument UIs tend to use knobs, and do something really neat: in Automatic mode (you can choose between Automatic, Linear, and Rotation behavior in Settings), each time you touch a virtual knob, it seems to detect whether your initial movement is up/down or left/right, and then constrains that touch to only affect the parameter along that axis.
Contrast this with Korg Gadget, where the “Linear” mode is only up/down. Gadget does show the actual parameter value in a little popover, which GarageBand doesn’t, so I also wonder if that information is actually important, and how best to display it…
Borderlands, I think, does a great job with this, with the circle of orbiting sliders, since the values are usually above your finger, and the movement is always linear. Fugue Machine, on the other hand, as great as it is at everything else, is really frustrating with the rotary knobs, since the visualization is under your finger most of the time.
Is there any published research on this sort of musical UI? And do any of y’all have thoughts on this area of things? Apps that do a reasonable thing? Apps that don’t?
I'm not aware of published research on this topic, but it would be interesting research to do.
I think your instincts are pointing you in the right direction.
This is a great summary of various touch UIs. I wish more developers would take the cue from Apple.
It will be interesting to see how Apple adapts their own plugins to the autolayout.
These are great points.
I wish I had the drive to build within frameworks like TC-Data and Beatsurfer – I love the post-skeuomorphic world these inhabit, where gesture and planar relationships completely override traditional knobs and sliders. But you have to understand both worlds, it seems, to make best use of these tools.
Borderlands is closest to this type of design, as you mentioned and damn, I wish it was more universal in every developer’s approach.
Erik Sigth’s designs have always made sense to me. Maybe it’s the Teenage Engineering-esque visual language, but the humbletune apps strike a really nice real-world/digital balance.
(Though it certainly ain’t research, I wrote an “article” (read: listicle) for Reverb.com last year that touched a little bit on this: https://reverb.com/news/the-best-music-making-apps-for-ipad.)
Thanks for this direction… I was originally only intending to comment on specifically the sliders that come out when you tap on a cloud to edit it:
But your mention of this sort of hybrid of both worlds prompted me to experiment with moving the cloud and a slider at the same time, which makes for a really wonderful and interesting playing experience.
Unrelated to Borderlands, but on topic with the rotary controls: I wonder whether the left/right aspect of the interaction translates to cultures that read right-to-left, or if it should be inverted? Time to put my phone into Arabic and check it out…
Just in case it wasn’t already mentioned in this thread, this book may be of interest to some (no affiliation – just a backer myself).
Looks like the knobs behave the same in Arabic.
Really looking forward to the Push Turn Move book, too.
and all of that gesture/parameter shifting can be recorded and looped! it’s a wonderful example of responsive design. i’m so glad that opened up!
PUSH TURN MOVE is the right kind of nightmare… It keeps getting bigger. We are up to 360 pages now! I just received the first package of page layouts to edit, and we are on schedule to get the book out the door this fall. If anyone has any questions about it, feel free to ask me. If I can’t answer, I can get answers from the author, my good friend Kim Bjørn. I’m really excited to be editing this book!
The best instruments are more about the player than the instrument. It’s why I love the piano, it’s a vehicle for self-discovery.
If you want a research partner, I'll contribute! I've been exploring the UX of physical interfaces for a while, but I've never taken it anywhere.
Sure! One of the first challenges we'll encounter doing research without a client will be placing some constraints on the project so that it has measurable goals and can be completed within a set timeframe.
So we would need to define that.
First of all, what type of research? @rknLA had some pretty specific questions about the behavior of software knobs. Those types of questions could find answers in a software usability study.
A project like that represents a couple days of writing a screener and recruiting participants. The study itself takes a couple/few days, and you want a couple more days to publish.
Does that sound like something you have time to volunteer for?
In prior product projects where I've had to work without the guiding north star of a client or a technical end, it's been useful either to identify underlying principles that apply across several technical endpoints, or to answer one specific question. There are a lot of questions we could try to answer, but I think they should center on interrogating the most compelling answers people have put forward here, e.g. the interactive advantages and disadvantages of a skeuomorphic interface versus one built on material-design principles, the digital knobs question, etc. Another alternative is to produce findings with insights that would be marketable to device/instrument manufacturers.
Life has had some pretty big upsets for me recently so I’ll have the time to work on it, but not without a bit of coordination.
I recommend keeping the scope small, unless you can find a client for the study.