i’ve struggled to understand how different software could possibly perform summing differently. assuming it really is just summing and there isn’t, i dunno, some phase distortion going on in a… panning section? hard to imagine.
the only kind of theory i can come up with is, inverting the question, that any active analog circuit introduces some imperceptible amount of noisy time decorrelation. and analog summing is just “better” because each channel gets decorrelated differently, and as a result their signals are more independent and “real.”
this wouldn’t explain some particular DAW having “worse summing” than others unless it’s doing something kind of crazy.
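to be concrete about why i find it hard to imagine: digital summing is just sample-wise addition, and the only wiggle room i can see is floating-point rounding order. a toy numpy sketch (the track count, length, and float32 format are just assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical session: 8 tracks of noise, float32 like many DAW engines
tracks = rng.standard_normal((8, 48000)).astype(np.float32)

# "summing" is literally this: sample-wise addition of the buffers
mix = np.zeros(48000, dtype=np.float32)
for t in tracks:
    mix += t

# float addition isn't associative, so accumulating in a different
# order can flip the last mantissa bits, and that's about it
mix_other_order = np.zeros(48000, dtype=np.float32)
for t in tracks[::-1]:
    mix_other_order += t

err = np.abs(mix - mix_other_order).max()
print(err)  # at most a few float32 ulps, way below audibility
```

so two engines could disagree at the last-bit level just from summation order, but that’s ~140 dB down, not a “sound.”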
(but, of course i believe that something is going on, people aren’t making it up, i must be blissfully ignorant of whatever it is)
(isn’t dithering a kind of decorrelation algorithm?)
(as in, @madeofoak i wonder if dithering each track before summing in SW, would make a perceptual difference? thought experiment)
(or dammit, i want to try some form of dithering that consists of randomly delaying each sample value by a tiny fraction of a sample, then sinc interpolating back to the sample rate… i’ve nerdsniped myself)
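(something like this is what i mean — a rough numpy sketch of that thought experiment; `jitter_dither`, the jitter amount, and the hann-windowed sinc kernel are all just made up for illustration:)

```python
import numpy as np

def jitter_dither(x, max_frac=0.01, half_width=16, seed=0):
    """randomly nudge each sample's time position by up to +/- max_frac
    of a sample period, then reconstruct on the original sample grid
    with a hann-windowed sinc kernel."""
    rng = np.random.default_rng(seed)
    d = rng.uniform(-max_frac, max_frac, size=len(x))
    y = np.zeros(len(x), dtype=np.float64)
    n = np.arange(len(x))
    for k in range(len(x)):
        # sample k now lives at time k + d[k]; spread its value onto
        # the nearby output samples via the shifted sinc
        lo = max(0, k - half_width)
        hi = min(len(x), k + half_width + 1)
        t = n[lo:hi] - (k + d[k])
        w = 0.5 * (1 + np.cos(np.pi * t / (half_width + 1)))  # hann window
        y[lo:hi] += x[k] * np.sinc(t) * w
    return y
```

(with max_frac=0 this is the identity, since the sinc hits only integer offsets; with a tiny jitter it adds a signal-dependent error a few orders of magnitude down, which is the “decorrelation noise” i’m imagining each track would get)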