Looks very nice. I also felt that I didn’t want a whole rack of video gear but still wanted to play with analog video. I have the Discret11, FFG and the CBV001S. I have all of them in a row, and there is much hope that I’ll have time soon to really play with them. They are very nicely made, with clean 3D-printed cases and simple power supplies. Each one is a unique thing, and you can patch them however you like, like guitar stomp boxes. I had them shipped from France on 3 separate occasions, and each time they came quickly with perfect packaging and tracking that actually worked. I highly recommend Bastien’s creations - he also makes video Eurorack modules, and all his prices are pretty low compared to LZX and others.



Hi!
A few days ago I watched Alex Pelly’s workshop on Perfect Circuit and it blew my mind. I’m pretty familiar with eurorack and modular audio synthesis, and would like to learn more about analog video synthesis. I’m interested in basics like oscillators (and how they translate to video), sync, colors, basic modules, etc.

Does anyone have any resources they can point me to?
I’ve exhausted my googling skills without much luck. I’m going to start reading some video module manuals, but I’m not sure they would explain the basics…
Thanks!

I got a lot out of this workshop from Stromkult

https://www.stromkult.com/workshop-video-synthesis-stephane-lefrancois-lzx-industries-visual-cortex/

lots of basic things like what audio modules you can use with video and how to hook things up.


Thank you! Will check it out.

I’ll take a stab at this, but it will be general and quick, as seeing it would be better…

Video oscillators run at a much higher frequency than almost any audio oscillator goes, and if you listen to one it’s more like dialing up AOL or some weird digital garble. They also operate between 0-1V, as that is a video standard.

Basically, when you sync an oscillator you can sync it vertically or horizontally, and depending on its frequency you get anywhere from one bar to many bars, so your waveforms are sort of snapshots in each bar. An unsynced horizontal oscillator can do a nice scroll through the frame, but an unsynced vertical oscillator is very erratic, and a very slow oscillator looks like a slow full-screen fade up and down. As for color, it is just black to white in shades or gradients - you add color in other ways.
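If it helps to see the arithmetic: a horizontally-synced oscillator draws one vertical bar per cycle it completes within a scanline, and a vertically-synced one draws one horizontal bar per cycle it completes within a field. Here is a minimal Python sketch of that ratio, assuming NTSC timing (the numbers differ for PAL):

```python
# Rough bar-count arithmetic, assuming NTSC timing.
# (PAL would be a 15625 Hz line rate and 50 Hz field rate.)
LINE_RATE_HZ = 15734.26   # horizontal scan frequency
FIELD_RATE_HZ = 59.94     # vertical scan (field) frequency

def vertical_bars(osc_freq_hz):
    """A horizontally-synced oscillator completes this many cycles
    per scanline, i.e. draws this many vertical bars."""
    return osc_freq_hz / LINE_RATE_HZ

def horizontal_bars(osc_freq_hz):
    """A vertically-synced oscillator completes this many cycles
    per field, i.e. draws this many horizontal bars."""
    return osc_freq_hz / FIELD_RATE_HZ

print(vertical_bars(62937.0))    # ~4 -> four vertical bars
print(horizontal_bars(239.76))   # ~4 -> four horizontal bars
print(horizontal_bars(0.1))      # ~0.002 -> the slow full-screen fade
```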

I would watch all the videos on the LZX YouTube channel, especially the ones where they go through 3 patches and show you how they build them.


Thank you for the explanation.

I feel like I’m missing some basic piece - how does an oscillator get translated to an image?


you need a core module like a Visual Cortex or Chromagnon. They have many functions, but one is to take an oscillator output and visualize it through the module’s output (composite/component…)

so the Visual Cortex, for instance, has two channels, each with R, G, B inputs
so you plug one oscillator output into R, for instance
that would show you the pattern in B&W, because the same signal is normalled to all three RGB channels
if you plugged different oscillators into R, G and B then you would get some kind of color mix
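to make that concrete, here’s a toy numpy sketch (just an illustration, not LZX code) - treat each channel as its own grayscale image in 0-1, and stacking three of them gives the color frame:

```python
import numpy as np

# toy frame: each color channel is a grayscale image in 0..1,
# standing in for whatever signal you patch to that input
H, W = 240, 320
xx, yy = np.meshgrid(np.linspace(0, 1, W), np.linspace(0, 1, H))

r = 0.5 + 0.5 * np.sin(2 * np.pi * 4 * xx)  # "oscillator" -> 4 vertical bars
g = 0.5 + 0.5 * np.sin(2 * np.pi * 3 * yy)  # "oscillator" -> 3 horizontal bars
b = np.zeros((H, W))                        # nothing patched to B

bw = np.dstack([r, r, r])    # one signal normalled to R, G and B -> black & white
mix = np.dstack([r, g, b])   # different signals per channel -> color mix
# view with e.g. matplotlib: plt.imshow(mix)
```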

or use something like a TBC2 to take external video and translate it into LZX RGB standard outputs and then patch within a system
check out the LZX forum, lots of good info over there!

That helps, but I meant on an even more technical level.
A wave is an oscillating voltage, right? What are the mechanics of translating that voltage to an image? Does the voltage at a point in time translate to a pixel with a certain opacity at a certain location on screen? What is the mapping between time and location?

I feel like that kind of low-level understanding helps me be more purposeful with patching.

I’ll definitely check the LZX forum, that’s a great idea!

From a “theory” perspective, here’s a diagram that may be helpful (Principles of Television Engineering by Donald G. Fink, 1947).

For a monochrome image, you have a voltage signal corresponding to brightness. With the original technology used in analog TV, this voltage signal would directly modulate the intensity of an electron beam. The TV would contain two sawtooth oscillators which drive solenoids (electrically controlled magnets), the horizontal and vertical deflection solenoids. These solenoids would generate controlled magnetic fields that deflect the electron beam to hit a specific point on the (phosphorescently treated) screen.

The “horizontal” sawtooth oscillator controlling the horizontal deflection solenoid is set to a high frequency, scanning from left to right across the screen thousands of times per second – each such pass is called a “scanline”. The vertical deflection oscillator runs more slowly, at 50 or 60 Hz depending on what television standard is used in the country the equipment was made for. Each pass of the vertical deflection oscillator is called a “field”.

When you feed in some signal to modulate the intensity of the beam (the brightness), that signal needs to be synchronized with the cycles of the horizontal and vertical deflection oscillators. This is where sync comes in – horizontal and vertical sync pulses are transmitted along with the intensity modulation signal, and these sync pulses are used to reset the phase of the deflection oscillators, exactly as a sync input works on many oscillators used for audio synthesis.

When the signal you are using to modulate the beam intensity is also being generated by an electronic oscillator (LZX Prismatic Ray, Cadet IX, other “video oscillator” modules), you may wish to also reset the phase of that oscillator so that your Prismatic Ray or whatever gets its phase reset at the same time as the deflection oscillators do. If you don’t reset the phase of your “video synthesis oscillator” with the horizontal or vertical sync pulses being fed to the display, then your oscillator will not display as a still image: for instance if you don’t sync the video synthesis oscillator to the vertical sync pulses, the oscillator will appear to scroll vertically. Without horizontal sync, an oscillator running at horizontal scan frequencies (a few kHz up to 10 MHz) will generally have an unrecognizable waveform.
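To make the time-to-position mapping concrete, here is a toy Python/numpy simulation (my own sketch, with idealized timing: no blanking or retrace intervals, one progressive frame, made-up sample counts). The sample index plays the role of time, the modulo/division by the line length plays the role of the two deflection sawtooths, and the oscillator signal sets the brightness at each position:

```python
import numpy as np

# Toy raster: map a 1-D "beam intensity" signal onto a 2-D image the
# way the deflection oscillators do. Idealized: no blanking/retrace.
LINES, COLS = 240, 320            # lines per frame, samples per line
LINE_RATE = 15734.26              # horizontal deflection frequency (NTSC-ish)
SAMPLE_RATE = LINE_RATE * COLS    # samples per second along each line

n = np.arange(LINES * COLS)       # sample index = time
t = n / SAMPLE_RATE               # absolute time of each sample

# Deflection: position within the current line sets x, line count sets y.
x = n % COLS                      # horizontal sawtooth (resets every line)
y = n // COLS                     # vertical ramp (steps down one line at a time)

def render(osc_freq, h_synced):
    """Render a sine-wave 'video oscillator' into a frame.
    h_synced=True resets the oscillator's phase at the start of each
    line (like a sync input); h_synced=False lets it run free."""
    if h_synced:
        phase = 2 * np.pi * osc_freq * (x / SAMPLE_RATE)
    else:
        phase = 2 * np.pi * osc_freq * t
    intensity = 0.5 + 0.5 * np.sin(phase)   # brightness "voltage" in 0..1
    return intensity.reshape(LINES, COLS)   # sample n -> row y[n], column x[n]

stable = render(4.0 * LINE_RATE, h_synced=True)    # 4 clean vertical bars
skewed = render(4.1 * LINE_RATE, h_synced=False)   # bars lean and drift
```

With sync, the pattern depends only on the position within the line, so every scanline is identical and the bars stand still; free-running, the leftover 0.1 cycle per line makes the bars lean across the frame and drift from frame to frame.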

This is how it all basically works for monochrome images. For color images, “how color is encoded” depends some on the display format (composite vs YPbPr vs RGB) but video synthesis systems are designed to let you patch with the individual color channels as though they were independent monochrome signals. (In the “glitch” domain, there are devices which deliberately interfere with the color encoding and/or sync to produce interesting visual effects.) And of course non-analog displays like LCDs don’t work like this at all, but may include some circuitry to translate from a standard analog format like a composite input to a digital representation appropriate for driving the display.


@csboling’s explanation is much better than the one I just wrote :wink:

That’s a great explanation! Thank you for taking the time to write it.

I hope everyone is getting a chance to check out Vidicon, a free online video synthesis conference!

check out the agenda above - lots of stuff for the rest of the weekend if you missed yesterday

we are playing Saturday Nov 14th at 1:00 PM

here are some pictures of us getting ready for our talk/performance

and a video preview

We will be discussing combining physical objects with a video synthesis system. The flexibility of creating your own, already visually interesting content and then processing it further means much smaller systems are more than viable.

come join us!


Do you know if there are any recordings of yesterday? Definitely some talks I’d like to see!


things are being archived here - I’m not sure how quickly it will happen for all the talks, but it looks like a couple are already up

Here is a thread about the video synth patch book (WIP) we just put out.

enjoy :slight_smile:


wow well done, thanks!


I’ve been playing around with Cathodemer and really enjoying it. It’s a bit buggy and slow, and the documentation is terrible, but for $20 it’s a ton of fun. It can route audio amplitude to modulate any of its parameters, and it supports MIDI, which I haven’t experimented with much beyond confirming it could receive signals from my Digitakt, but it seems like it would be really powerful under MIDI control.

I put this little video together this weekend using it. It mainly uses Cathodemer’s distortion effects - the oscillators are just being used to create the screen flicker and scrolling bars.


i’ve been using cathodemer a lot over the last few months. it’s a lot of fun, and i’ve gotten some cool results, but you’re absolutely right that the documentation is terrible.

how do you route the audio to modulate stuff? the last time i clicked on anything that looked like it would do that, my ears got blasted with noise, so i’ve been hesitant to try again.

Oh, you must have clicked on the button that turns the video output itself into audio and makes a bunch of terrible popping sounds.

It should pick up audio from whatever your default audio capture device is, so I take my mic, point it at a speaker, and play music into Cathodemer that way. You can also set up internal routings using virtual audio cable programs if you can figure out how to use them, but I had trouble with that.

All you have to do is pick a parameter you want to modulate, right-click it, and select Arm to Audio Left or Right. Then go to the Extras tab and change the audio input gain, the base amount (which overrides the dedicated knob for whatever parameter you’re controlling), and the amount that the incoming audio will change that parameter. You can also set it to change based on the amplitude or frequency of the incoming audio, but so far I haven’t been able to get the frequency mode to do anything.


If you are on a Mac you can use this to route audio from any other app internally.
