Please, I need some help figuring out the following: I need to delay gate signals by a minimum of 144µs, plus an as-yet-unknown amount of time I will determine later (please see the background information below), with the lowest amount of jitter possible.
In total I need to simultaneously delay four different gate signals by the same (ideally identical) amount. I would like to use two crows for this, so that each crow handles two gate delays. This means I can’t set the delay time via CV, but will need to enter it in the script, which is fine.
I have no coding experience with crow, so I don’t know
a) how to code this gate delay to minimize the delay time jitter,
b) whether crow’s delay time can be specified with µs resolution, and
c) how large the potential jitter of this delay time would be.
Thanks much in advance for any help!
I use the ER-101 sequencer, which I really like. It sends one gate signal and two control voltages per step, but it sends a step’s gate signal before sending that step’s two control voltages.
This can cause glitches when, e.g., using a step’s control voltage A to select a sample or change the synthesis model in a module: the module starts playing the step with the previous sample/synthesis model still selected, then switches to the new one.
To avoid these glitches I’d like to delay the gate signal, so that the module has time to load the new sample/synthesis model, before starting to actually play it.
I measured the time that elapses between the start of the gate signal and the start of the control voltages. It seems to vary between 44µs and 144µs (please see pics below). So I figure I should be fine if I delay the gate by a minimum of 144µs, plus the yet unknown amount of time the module needs to load the sample/synthesis model.
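For reference, here’s roughly what I imagine one channel of the script would look like. I have no crow experience, so I’ve pieced this together from the crow scripting docs — the API calls and parameter values are my guesses, and I don’t know whether `clock.sleep` can actually resolve delays this short or what its jitter is (that’s essentially my question):

```lua
-- hypothetical sketch: delay a gate from input 1 to output 1
-- delay time is hard-coded (no CV control needed)

delay_s = 0.000144  -- 144µs in seconds, plus module load time once I know it

function init()
  -- watch input 1 for rising edges (threshold 1V, 0.1V hysteresis)
  input[1].mode('change', 1.0, 0.1, 'rising')
  -- a 5ms, 5V pulse as the delayed gate (pulse length is a guess)
  output[1].action = pulse(0.005, 5)
end

input[1].change = function(state)
  clock.run(function()
    clock.sleep(delay_s)  -- is this precise enough at µs scale?
    output[1]()           -- fire the delayed gate pulse
  end)
end
```

A second input/output pair on the same crow would presumably just duplicate this with `input[2]`/`output[2]`, though I don’t know whether running both at once affects the timing.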