UX of Music Instruments & Tools


I’m going to have a lot more to say on this subject in the near future, but I wanted to start the conversation by just opening it up and listening.

What music production tools do you use that feel very usable to you? These are tools that make things easy that would otherwise be difficult or impossible. They don’t get in the way of your creativity, but instead expand the possibilities of what you might have otherwise created.

I’m interested in hearing why you think a particular interface works well. I’m interested in hearing stories about experiences you’ve had with attempting to achieve a goal, where the tool helped or hindered with that effort. I’m interested in hearing about things you have always wished a music production tool maker would do, but that never seems to make it into reality.

Screenshots, videos, or any visual accompaniment to your comments are especially appreciated.

In the coming weeks I might ask some of you to volunteer for various (quick) research activities that will help me dive deeper into such questions.


Bumping this for any evening folks.

If you want to totally ignore the topic and just tell me dumb jokes, that’s cool too. Anything but crickets! :wink:


I’ve found it a surprisingly difficult question to answer. If I consider software, the first things that come to mind (Audiomulch, Sound Forge) I’ve also been using for nearly 20 years, but anything used long enough becomes easy.

I think the on-screen, zoomable, bipolar waveform editor is a thoroughly intuitive interface. Outside of software, I think the piano keyboard is remarkable in the information it conveys, not just visually, but you get an immediate sense of how to effectively play the keys just by touching them.

Something I really appreciate in new interface design is a lack of fear to discard what has been assumed to be the accepted, proper way of doing things. Like the minimal mixer idea that comes up here often, I’d like to see designers question the need for things like a master volume control. Is it really necessary? I’ve had the same hardware mixer sitting right beside me for seven years, and never once have I used the master faders.


hmmm, well i guess the first two are more abstract than what you are looking for.

to me, the most important is a piano or rhodes or wurli. i can write on either instrument and hear the bass movement, harmony progression, and melody. and since the instruments are sensitive to performance, i can get a great idea of end timbre of each section.

second would be budget/time. the amount of times i wanted to try something that would have improved a part but didn’t have the time to execute it has been staggering.

i’m personally unsure of what i want a music maker to do, i don’t often feel restricted by the tools around me, though there’s always times where i needed a slightly different tool (e.g. a plexi and not a tube screamer), but it’s fun working around those challenges.

there are tools that i like to be surprised about, or that make me think differently about a simple task. for example, i just picked up this box from sknote called vastaso. it’s a compressor and distortion box rolled into one, all analog modeling of different types of circuits, with parallel processing on both the compression and distortion signals. the controls are super different than a compressor. attack isn’t a smooth selection between fast and slow. instead, there are presets that are types of attack times and slopes. for instance, if everything is set on A, you get the A setting. but you can adjust one parameter and get a different type of compression. it’s super weird and hard to understand. but when you sit down and play with it for a good amount of time, you really start to interact with it in a very different way than with a standard compressor. and consequently i get super interesting and exciting results.


I’ve used the same software for many years now (Logic and Max, and to a much lesser extent Sibelius), so it’s hard to really be objective about UI/UX with these things, but here are some thoughts/experiences.

Starting with an old one, there was a massive shift in Sibelius around Sibelius6 I think where they moved to the ‘ribbon’ paradigm of control, which is one of the worst pieces-of-shit things I’ve encountered. It’s like the worst of both worlds in terms of context and menu-driven UX (like you have to know which ribbon bit to enable, to THEN hunt around contextually for what you want to do…). Awful.

I think skeuomorph-y things tend to work well for audio UX stuff. Although I don’t have/use any of them, the Madronalabs instruments look wonderful, and seem to invite ‘play’. Similarly, Max is built around the paradigm of ‘objects’ and ‘cables’ (though in a much less playful manner!). Though this (play) is probably less important once you get your head around something. I guess something that you can get a clear overview of what’s what, and then what to play with within that.

In working on TPV2 for the last month or so, I’ve spent a lot of time playing with the MuBu package for Max (for rebuilding some of the Combine-type stuff) and holy-moly has that been an exercise in banging my head against a wall. The combination of similar-but-different naming/structure (a mubu containing tracks and buffers, which can refer to Max buffers~ (or not!)), a dense/informative (though unclear) UI, and general lack of structure has made it a real uphill battle.
I had the benefit of seeing one of the designers demo some of the stuff, so I knew it could (pretty much) do what I wanted to do, but even with that, I almost gave up several times.

I think this one is a case of smart people working on something that works well, but only if you have some kind of inside-track, because it’s near impenetrable for outsiders.

I find the Waves plugins to be pretty shitty, as they are a grab-bag of unrelated UI/UX, with seemingly arbitrary names too. Again, powerful, but an impenetrable mess.

To this day, I still find Live very confusing. I know I should learn it, especially with M4L and all, but I open it up and it just looks like a mess to me, especially since it’s a pretty mature (entrenched) piece of software now. (I should really bite the bullet on this one though)

Oh, a non-music, but related experience. The move from Final Cut Pro 7 to Final Cut Pro X. FCP7 felt like Live to me, so many things all over the place, with such a weird (coming from music) workflow, that made it difficult to do anything. I’d have to google/youtube just about everything I wanted to do, before I could do it. FCPX, I feel, is worlds better in terms of interface/usability (though I’m not a “pro”), but structurally it is really weird (Events/Projects hierarchy).

Ack, that got quite ramble-y, but I wanted to chime in with some thoughts.


I found myself nodding along with much of what you said.

Curious, how do you like Logic Pro X, especially given your FCP7/X experiences?

I agree that Live is “weird” and I found Logic to be much more (ahem) logical. But the funny thing is that now that I know both DAWs really well, I often find it faster to accomplish things in Live (but not always, it depends on the task).

I find that once my mental model of how a piece of software works starts to more accurately reflect the actuality of how the software works, the UI starts to matter a lot less. It almost disappears (as long as it doesn’t have major annoying usability bugs) and gets replaced by whatever is going on inside my head. So, I think UI design is really more for the novices than for the advanced users. A good UI clearly communicates a mental model that is going to give you good results. It tells you (or better yet, shows you) how to think about how the software works.

Introducing a monome grid to a musical workflow adds another wrinkle. As performing musicians we don’t necessarily want to look at a screen at all. The monome does a nice job of giving us a little feedback and taking in a lot of input, potentially making interaction with a screen largely unnecessary. Which leaves me thinking about the screen in monome apps a little differently. It becomes less about knobs and sliders as in most music apps, and more about simply displaying information. Ideally BIG info that is EASY TO SEE without squinting, so you can glance quickly, get what you need, and stay focused on the music.

In a complex UI there is always a tension between BIG display and BIG controls, with the opposing force being the wish to make functions accessible. Comparing Logic with Mainstage sort of emphasizes what I’m getting at here. Mainstage is optimized for live performance. Things are BIG and clear and stripped down in the performance view. But because they can’t predict which things will be important to you in a performance, they have to give you a complex authoring environment so you can customize what you see in the performance view. That authoring environment ends up looking a lot like Logic.

Kinda ramble-y on my part too, but I wasn’t expecting super clear/focused responses to this thread. Just kind of opening up a jam session on the topic so that we can get used to the sounds we make when we talk about UX.

Starting to formulate some thoughts about a user survey for The Party Van that might get at how folks are using it (and how they’d like to be using it). Would love to chat about it sometime.


was also chewing on this for a bit. it’s very hard to nail down this element.

honestly, these days, being able to have clear and clean integration with hardware is highest on my priority list. TPV is probably the best of both worlds for me, because it’s incredibly easy to remember the different on-screen sections while looking at my grid of lights and buttons. parc is also very easy for me to wrap my head around (as far as being able to understand cause and effect).

and I guess clarity around cause and effect is the most important thing for me to feel like I’m actually playing software.

now, I also have a pretty scant history with computer daws. I came to electronic production through iPad apps. THAT is a realm that made immediate sense to me, because (like all touchscreen applications/ports) the best instruments and environments were way more likely to be clear about cause and effect. the apps I’ve loved – Samplr, Nils, Borderlands – all ably root touch (and multi-touch) into the heart of the user experience. they provide immediate results and are built for touch rather than thought or much planning. now, these are sample-based tools or sound processors, but those are much more useful to me as instruments than a keyboard on a touchscreen.

thats also why monome has come to mean so much to me in such a short amount of time. it’s immediate and its best applications don’t require much screen feedback. not that I obviously mind working in front of a screen (obvi the iPad is just a screen, but the touch makes the difference), but I don’t want to have a passive relationship to a screen when I’m making music. I want it to respond to my input, as I respond to its output.

edit: realized @jasonw22 previously nailed the particular benefits/challenges of software in a monome environment. my sentiments merely echo his.


we were typing at the same time. I can’t click like on your post twice, so here’s another one: <3


thanks sir! if of interest, Erik Sigth is the designer and programmer behind nils and a few other instruments. really incredible work: http://humbletune.com


this is something i think a lot about.

one of my design goals has been about managing layers of complexity. there needs to be an immediately accessible, completely intuitive yet still useful layer at the top. ideally this would be sufficient for an entire “tool” itself, but then i’m much more interested in the layer below that-- which requires that you actually read or experiment or investigate possibilities. but upon practicing/using this somewhat more obtuse set of functionality there is not only a “i know all the hotkeys” sort of mastery, but a big reveal of the creative ideas and potential behind the original design. the more initially unseen opportunities a tool presents, the more interesting it is to me. because the tool then leaves less of its own imprint (ideally) on the work itself.

i know that drifted off topic a bit, perhaps more in the instrument vs. tool category.

closer to the original topic:

the apple trackpad is a truly amazing achievement. it’s tuned so perfectly. i feel like it anticipates my actions. this cannot be said of windows/linux/etc/etc/etc. it’s good software, made to accommodate a human/natural feel. on the other hand, the hotkey set in sublime text i couldn’t live without-- they’ve worked themselves into my subconscious to incredible benefit.

the look/feel of user interfaces drives me crazy. my pursuit of hyper-minimalism has gotten absurd. maybe we should start subthreads of good and bad UI opinions with just screenshots. oof. but also i’m that guy who seriously considers using a text-based web browser every other month.


No, this is a really crucial distinction, and I’m grateful you brought it up. Note that right in the subject line the use of the term “UX” brings with it rafts of assumptions, and much of them are around optimizing usability for tool users. I tend to think of myself as a tool maker, and it’s really critical to emphasize that an instrument is not a tool.

I need to give that one a nice long think. Because what I’m really trying to ask is “how do we make a really effective UI for a computer-based musical instrument?”


So, I’m going to go a bit tangential for a moment.

Isomorphic music keyboards vs. traditional piano keyboards. Sooooo much easier to learn how to play music on an isomorphic keyboard.

But every key signature is played using different fingerings on a piano. While that proves to be a significant obstacle for new piano players, is it possible that it actually helps advanced piano players in some way? Does the different feel of different hand shapes provide some muscle memory that reinforces the changes in sonority that accompany it? Are there other subtle mental benefits from the awkward arrangement of keys?

Or are isomorphic keyboards simply a better design, and now that they’re possible thanks to electronics, the piano style of arranging notes will gradually fade into history?
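To make the “isomorphic” property concrete, here’s a toy sketch in Python. The specific offsets are my assumption (a simplified rectangular take on a Wicki-Hayden-style layout; real controllers vary), but the point it illustrates is the defining feature of any isomorphic layout: the same finger shape produces the same chord quality anywhere on the grid.

```python
# A hypothetical isomorphic grid: moving one column right = +2 semitones,
# moving one row up = +5 semitones. These offsets are illustrative only.

def note(row, col, root=48):
    """MIDI note number at a grid position."""
    return root + col * 2 + row * 5

# One fixed finger shape, as (row, col) offsets from an origin cell:
SHAPE = [(0, 0), (0, 2), (1, 1)]

def chord(row, col):
    """The notes under that shape, placed anywhere on the grid."""
    return [note(row + dr, col + dc) for dr, dc in SHAPE]

# Because note() is linear in row and col, the interval structure of the
# shape never changes as you transpose it -- unlike piano fingerings.
c = chord(0, 0)
d = chord(0, 1)
print([n - c[0] for n in c])  # [0, 4, 7] -- a major triad
print([n - d[0] for n in d])  # [0, 4, 7] -- same shape, new key
```

On a piano, by contrast, transposing a chord up a whole step generally changes which fingers land on which colors of key, which is exactly the novice obstacle discussed above.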

We may not be able to answer these questions here, but what I’m trying to tease apart are a few things: the distinction between novice and advanced use, the degree to which mastery is simply familiarity, and whether an instrument (or a tool, perhaps) can do anything to help the novice achieve mastery (or is it all on the human to simply practice)?


I wanted to chime in and talk about how instruments can be made to have these easily traversed layers of deep functionality.

Take the piano for example: After a few seconds thrashing about, one very quickly discovers that things ‘sound good’ if you only play the white keys, or only play the black keys. You can experiment and play within the white key paradigm. You’ll instantly have the sense that you’re ‘making music’, and will likely be encouraged to play further. Certainly this beginner has no formal understanding of what they’re doing, but they were able to sit at the instrument, make enjoyable sounds, and start learning through practical music making.
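As an aside, there’s a neat bit of pitch-class arithmetic behind why the black-key version of this works so well: the five black keys form a major pentatonic scale, which has no semitone clashes. A quick sketch (treating F#/Gb as the root is my choice for illustration):

```python
# Pitch classes 0-11, where 0 = C. The black keys are C#, D#, F#, G#, A#.
BLACK = [1, 3, 6, 8, 10]

# Measure every black key's interval above F# (pitch class 6):
root = 6
intervals = sorted((pc - root) % 12 for pc in BLACK)
print(intervals)  # [0, 2, 4, 7, 9] -- the major pentatonic scale
```

So the beginner thrashing about on the black keys is, without knowing it, playing inside one of the most forgiving scales there is.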

As one learns more and more, or perhaps even having a knowledge of what more can be done, this simple messing about with the white keys is ridiculous and unsatisfying. Yet the functionality remains, and is always there as the user continues. Decades of experience don’t remove this basic use, and yet the player has traversed many plateaus beyond.

I think this is a great way to look at computer design: interfaces & interactions that can be understood at a basic level, with the top-level complexity easily ignored while the user learns the tool / instrument. Of course this is incredibly difficult to define, but there are likely some ways to handle it.

When @mfelix & I were designing mlrv2 we were largely operating under the assumption that folks already understood what mlr did, and in many cases would already have used its predecessor. While that means mlrv2 isn’t the most accessible, it presents the ‘advanced mode’ idea as a potential solution, i.e. have a base version that is more easily understood, then introduce a number of enhancements and details. This tiered approach isn’t the best of course, and it would seem better to have all the details be immediately available but attention drawn to the basic functionality at first.

Another approach I’ve put a great deal of thought into, is that of a self-learning interface. An instrument whose controls are restricted to ‘classic’ sounds but are gradually expanded with use and practice. As one spends more time exploring certain regions of sound, the parameters themselves open up more possibilities in those regards. This kind of mutating system makes a great deal of sense to me, encouraging a user to take ownership over their instrument, having to practice and explore to find new sounds, all the while learning how to interact with the device in the process. Implementation is obviously a whole different story though, and I don’t imagine we’ll see anything like it for at least a few years.
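Purely as a thought experiment, the self-learning idea might be sketched like this. Everything here (the class, the numbers, the unlock rule) is invented for illustration, not a real implementation: a parameter whose usable range starts narrow, in “classic” territory, and widens as the player accumulates time with it.

```python
# Hypothetical sketch: a control whose reachable range opens up with use.

class UnlockingParam:
    def __init__(self, lo, hi, start_fraction=0.25, unlock_rate=0.01):
        self.lo, self.hi = lo, hi
        self.opened = start_fraction   # fraction of full range available now
        self.unlock_rate = unlock_rate # how much each interaction unlocks

    def set(self, normalized):
        """Map a 0..1 control value into the currently unlocked range;
        each interaction nudges the range a little wider."""
        span = (self.hi - self.lo) * self.opened
        value = self.lo + max(0.0, min(1.0, normalized)) * span
        self.opened = min(1.0, self.opened + self.unlock_rate)
        return value

cutoff = UnlockingParam(200.0, 12000.0)  # e.g. a filter cutoff in Hz
print(cutoff.set(1.0))   # 3150.0: early on, the top of the range is capped
for _ in range(100):     # "practice"
    cutoff.set(0.5)
print(cutoff.set(1.0))   # 12000.0: the full range has opened up
```

A real instrument would want the unlocking tied to something more musical than a call counter (time spent in a region of the sound space, say), but even this crude version captures the feel: the device rewards attention with territory.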

In conclusion, I think my main desire is a tool that I can learn new ways to interact with over a years-long period of time. I feel the guitar is an instrument like this for me. I first learnt open chords and never moved above the 4th fret, then moved on to barre chords, and in recent years have directed my attention at broader fretboard knowledge - learning to play scales across different positions. I now interact with my instrument in a different way and see it more holistically, which I could never have imagined even a few years ago.

Simple instruments that reveal themselves over time.


In my experience, every instrument / tool with which I have a deep connection has required several shifts in my mentality toward them. Generally these shifts come about through a combination of non-playing thought & conversation, a great deal of practice, and even moreso use. Mastery can certainly be assisted, but the time it takes is highly correlated to the depth of the tool.


I like Logic Pro X. It obviously looks more FCPX-ish, but layout-wise, it’s not too different from Logic 9. Both of these apps clearly have a different design team than the rest of the Apple stuff though, as everything is uber shiny, and increasingly skeuomorphic. It’s like they weren’t there the day Apple decided to “go flat!!!”, and just kept going in that direction.

Bigtime this for me, especially given how I normally perform (an instrument + electronics), I have enough shit on my plate to deal with without looking at a screen too. Granted, something like Mainstage('s performance view) works well in context. A big motivation for the TPV2 UI revamp is that as TPV got bigger/busier, I rarely stood back and thought about general structure, and just kind of made room for new things where I could fit them (rather than where they should/could be).

Bigtime agree here too, though it’s probably worth pointing out that the potential and the intention of the original design are not necessarily the same thing. Which is itself very exciting, since as the tool/instrument makers, we can still peel back layers of nuance and understanding that we didn’t (and often couldn’t) even conceive of.

The traditional piano keyboard is a real bastard, and I’ve spent almost my entire life playing one. There’s lots of reasons why a different layout would be better, but ubiquity is a bitch. Same goes for qwerty vs dvorak etc… Until another paradigm comes along, those are pretty set I think. That being said, I don’t use a traditional keyboard in my music making at all, nor do most of the people I work with, so the idea of a layout that’s optimized around the considerations of a fixed musical system (12 tone equal tempered tonal music), isn’t especially attractive to me. Like pencils that only write on certain kinds of paper…

Also @tehn’s comments about depth/layers got me thinking about my ciat-lonbarde instruments. When I first got my Sidrassi and cocolase, I couldn’t make heads or tails of them. They are weird (in the utmost sense of the word) as shit, but interesting enough that you change yourself to understand them. You learn the completely unique language/layout/syntax/ideas/structure, and then start doing all the work beyond that to discover the layers.

For example, a drum machine like this. And this being the manual to a “synth”. Aaaah.


I’d really love to own a ciat-lonbarde instrument, but I have no obvious way of choosing one over another.

Totally fascinating though.

I’ve been loving the Monome Notes app, playing around with different ways of setting up an isomorphic keyboard. Been thinking about such things for many many years, and it’s really nice to finally have a way to very quickly and easily try all those ideas.


This has been one of the most satisfying elements of being an instrument creator, both on a personal discovery level, as well as seeing the creations or alt-utilisations of others.

Perhaps this sheds some light on UX design though: creating an interface where the outcome is not defined by the controls. The focus is on a toolset with a primary interaction method and a range of possible outcomes, rather than one specific outcome in mind.


About keyboards for a moment.

The conventional piano keyboard/action is the result of centuries of evolution, of several different kinds of instrument (fretted, plucked and struck), with a lot of dead ends along the way before we got what we have today. And the expressivity of the modern piano keyboard is not due solely to its layout (which reflects a simple tonal structure), but also to the complex mechanics underneath the keys.

Now isomorphic keyboards may be fine in their way, but can anyone play the 48 on them? or the Hammerklavier? And how could you possibly implement the extraordinary expression of the modern piano mechanism in an isomorphic layout? Plus - there are many generations of established musical practice using piano technique which simply would not apply to other layouts and mechanisms - so you’re talking about a whole culture of practice as well, handed down by generations of teachers and students.

You’re really talking oranges and apples when comparing these two approaches. Not that I’m against it on principle - heaven forbid! - but the isomorphic keyboard is really no closer to the piano keyboard than a guitar fretboard imo. It will certainly never replace the piano layout and mechanism.


Maybe I’m not the only one who occasionally fires up ‘mutt’, thinks “It would be really cool if I could do all my email from here” and then balks at the issues about attachments, images and weblinks, and goes back to my pleasant but not quite minimalist enough ‘Airmail’ for OSX.


[quote=“tehn, post:10, topic:2772”]
the apple trackpad is a truly amazing achievement
[/quote]

it’s truly an engineering and UX masterpiece

I would never be able to explain the functional differences that elevate it above similar touch surfaces and controllers. The sensitivity, feel, and responsiveness are perfect for whatever task I need to accomplish (daw edits, web browser tab hopping, app param tweaks, etc).