I’m trying to bring 14-bit MIDI CCs into Unity, but my MIDI library doesn’t natively support 14-bit CCs. I’d like to take the pair of 7-bit numbers and do whatever bitwise operation is necessary to decode them back into the original 14-bit value, but I am kinda terrible with bitwise operations and I’m also struggling to find a resource online that clearly shows how it works.

This seems like it ought to be easy for a seasoned bit shifter. Can anyone help me out?

Convert both to a type that’s 14 bits or wider, shift the high bits left by 7, and bitwise-OR the result with the low bits, e.g. (int)high_bits << 7 | (int)low_bits or similar.
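Here’s a minimal standalone C sketch of that, if it helps (the function name combine14 and the masking are my own; the mask just guards against stray high bits, since valid MIDI data bytes are already 0–127):

    #include <stdio.h>

    /* Rebuild a 14-bit value from its two 7-bit MIDI halves:
       msb = coarse CC value, lsb = fine CC value (each 0-127). */
    static int combine14(unsigned char msb, unsigned char lsb)
    {
        return ((msb & 0x7F) << 7) | (lsb & 0x7F);
    }

    int main(void)
    {
        /* msb 0x40 with lsb 0x00 is the conventional midpoint, 8192 */
        printf("%d\n", combine14(0x40, 0x00));
        return 0;
    }

The same expression ports straight into a Unity C# script; only the parameter types change.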

Actually I realized that it was much simpler than I thought… just needed to multiply the high (MSB) value by 128 and add the low one to it. It’s working! Thanks for the help though.
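Those are the same operation, for what it’s worth: shifting left by 7 is multiplying by 2^7 = 128, and since the low value never exceeds 127 the bitwise OR and the addition give identical results. A quick brute-force check, for anyone who wants to convince themselves (standalone C, names are mine):

    #include <assert.h>

    int main(void)
    {
        /* Exhaustively verify msb * 128 + lsb == (msb << 7) | lsb
           over every valid pair of 7-bit inputs. */
        for (int msb = 0; msb < 128; msb++)
            for (int lsb = 0; lsb < 128; lsb++)
                assert(msb * 128 + lsb == ((msb << 7) | lsb));
        return 0;
    }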

Probably not applicable with today’s CPU architectures and compilers, but bitwise operations were traditionally significantly faster than multiplies, which is why you see them all over the place in graphics/DSP C/C++ code.

Just for kicks, I put these operations into godbolt, and it seems gcc is smart enough to compile them both down to arithmetic shifts (sal) in x86-64 assembly.
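For reference, a pair of functions along these lines shows the effect (the names are my own, and I was compiling with gcc -O2):

    /* gcc strength-reduces the multiply by 128 to the same shift (sal);
       the only remaining difference is an or versus an add. */
    int combine_shift(int msb, int lsb) { return (msb << 7) | lsb; }
    int combine_mul(int msb, int lsb)   { return msb * 128 + lsb; }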