MIDI 14-bit CC decoding help

I’m trying to bring 14-bit MIDI CCs into Unity, but my MIDI library doesn’t natively support 14-bit CCs. I’d like to take the pair of 7-bit numbers and do whatever bitwise operation is needed to decode them back into the original 14-bit value, but I’m kinda terrible with bitwise operations and I’m also struggling to find a resource online that clearly shows how it works.

This seems like it ought to be easy for a seasoned bit shifter. Can anyone help me out?

Convert both to a type that’s 14 bits or larger, shift the high bits left by 7, and bitwise-OR in the low bits, e.g. `(int)high_bits << 7 | (int)low_bits` or similar.
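
For example, a quick C# sketch (the name `CombineMidi14` is just for illustration):

```csharp
// Combine the two 7-bit data bytes of a 14-bit CC pair.
// Masking with 0x7F guards against any stray high bits.
static int CombineMidi14(int msb, int lsb)
{
    return ((msb & 0x7F) << 7) | (lsb & 0x7F);
}
```

The result ranges from 0 to 16383.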


`(MSB << 7) + LSB`

You should have two seven-bit values: you’ll get the Most Significant Byte (MSB) from one CC number and the Least Significant Byte (LSB) from the other.
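
If it helps, here’s a rough sketch of caching the two halves as they arrive (`OnControlChange` is a hypothetical callback; wire it to whatever CC event your MIDI library exposes). In standard MIDI, controllers 0–31 carry the MSB and the matching LSB arrives on the controller number plus 32 (e.g. CC 1 pairs with CC 33):

```csharp
// Sketch: cache the MSB/LSB halves of each 14-bit CC pair and
// rebuild the combined value on demand.
class Cc14Decoder
{
    readonly int[] _msb = new int[32];
    readonly int[] _lsb = new int[32];

    // Hypothetical handler: call this from your MIDI library's CC event.
    public void OnControlChange(int controller, int value)
    {
        if (controller < 32)
            _msb[controller] = value & 0x7F;      // high 7 bits (CC 0-31)
        else if (controller < 64)
            _lsb[controller - 32] = value & 0x7F; // low 7 bits (CC 32-63)
    }

    // Combined value for controllers 0-31, range 0-16383.
    public int Read14Bit(int controller)
    {
        return (_msb[controller] << 7) | _lsb[controller];
    }
}
```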

Hope this is enough to help you progress?


Actually I realized that it was much simpler than I thought… just needed to multiply one number by 128 and add it to the other. It’s working! Thanks for the help though.


Just for posterity - these mean exactly the same thing 🙂
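
Shifting left by 7 multiplies by 128, and since the LSB is always under 128 the OR never collides with the shifted bits, so OR and addition give identical results:

```csharp
int msb = 100, lsb = 5;
// Both expressions evaluate to 12805.
bool same = ((msb << 7) | lsb) == (msb * 128 + lsb); // true
```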


Probably not applicable with today’s CPU architectures and compilers, but bitwise operations were traditionally significantly faster than multiplies, which is why you see them all over the place in graphics/DSP C/C++ code.

Just for kicks, I put these operations into godbolt, and it seems GCC is smart enough to compile them both down to the same arithmetic shift (sal) in x86-64 assembly.


Right, and I also think bitwise operations are more “idiomatic” for this kind of thing, although for this simple use case it’s six of one, half a dozen of the other.
