Has anyone used csoundapi~ in Pure Data? I can’t find anything online that explains how to send multiple score statements as a single event. Referring to the screenshot: the score statements are currently split into separate events; can they be combined into one?
I’m trying to recreate the setup from this video: https://www.youtube.com/watch?v=JX1C3TqP_9Y&t=191s
where Pure Data sends values via OSC to a Csound instrument. However, instead of Pure Data I’m using a Python script.
I’ve uploaded three files:
- hello.py, a Python script from the grid studies examples that responds to button presses with LED feedback.
- simpleclient.py, which uses python-osc objects to send values to the “/kfreq” address, which links to the Csound file.
- recieveOSC.csd, containing a simple oscili instrument whose frequency should change according to the values received from Python.
I was hoping someone more experienced with Python could demonstrate how to take the values hello.py receives from the monome and adapt simpleclient.py to assign certain frequencies to buttons, so the monome can influence the Csound instrument.
I appreciate that I need more experience with Python, but I’m sure I could learn from an example.
I’d recommend using the aiosc module instead of pythonosc, just in case you ever need full-duplex communication. Then you can basically do something like the following in your script:
```python
import asyncio

import aiosc
import monome

CSOUND_ADDRESS = ('127.0.0.1', 9000)

class Hello(monome.App):
    def on_grid_key(self, x, y, s):
        # Mirror the key state on the LED.
        self.grid.led_set(x, y, s)
        if s == 1:
            # Derive a frequency from the button position and send it off.
            freq = (y * self.grid.width + x) * 10
            asyncio.ensure_future(aiosc.send(CSOUND_ADDRESS, '/kfreq', freq))

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    hello_app = Hello()
    asyncio.ensure_future(monome.SerialOsc.create(loop=loop, autoconnect_app=hello_app))
    loop.run_forever()
```

(Note: `asyncio.async` was deprecated and later removed, so `asyncio.ensure_future` is used throughout.)
This is great, it works very efficiently, thanks!
I’m trying to work out the protocol you’re using (if that’s the right word)?
For example, I’m guessing self is a class? Does it marry up with the monome serial protocol, which is how I’ve previously got values from the monome?
Where can I find out what other functions are available on self?
I currently need to be able to toggle buttons.
self is the instance, not the class (I suggest reading the Python tutorial before you go on). To get toggle functionality, create a data structure to keep the buttons’ state. There is no documentation for pymonome as such, but the code is pretty much self-describing, so make sure to have a look. Hope this helps!
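To make the data-structure suggestion concrete, here is one way to sketch toggle state in plain Python. The class name and the way it wires into `on_grid_key` are my own invention; pymonome itself doesn’t dictate this:

```python
# Sketch of toggle behavior: keep a set of lit buttons and flip membership
# on each key-down event.
class ToggleState:
    def __init__(self):
        self.lit = set()  # (x, y) pairs whose LED is currently on

    def press(self, x, y):
        """Toggle (x, y); return the new LED state (1 = on, 0 = off)."""
        if (x, y) in self.lit:
            self.lit.discard((x, y))
            return 0
        self.lit.add((x, y))
        return 1
```

In the Hello app from the earlier post, you would then react only to key-down events (`s == 1`) and pass the returned value to `self.grid.led_set(x, y, state.press(x, y))`, instead of mirroring `s` directly.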
Has anyone spent much time with Csound?
I need to play back long samples, so it’s preferable that they are streamed from disk. I also need to be able to hop to different parts of the sample and to change time and pitch independently. Are there suitable opcodes to achieve this?
diskin2 reads files from disk and has a kpitch control; it behaves like varying the speed of a tape.
sndwarp allows independent time and pitch manipulation but reads from tables. Depending on what a long sample is for you, and how much RAM you have, it might do exactly what you want.
I have 1 GB of RAM as I’m using the Raspberry Pi 3, so I guess… The limitation with diskin2 is that I can’t change the loop start and end points while the instrument is running, which I’d need to achieve this: https://vimeo.com/295006.
I’m trying lposcil, as it does accept performance-time values for the loop start and end as well as pitch, but it loads the audio into a table. I haven’t got round to trying multiple tables yet, but this might be where the Raspberry Pi 3 struggles.
Just thought I would throw the question out: has anyone built an mlr-type application in Csound?
My monome is connected to a Raspberry Pi 3 running Raspbian Lite.