Ok, tried a few more things - every voltage input on Ansible freezes it in Cycles, Levels and Kria.

And sometimes, not always, the Cycles voltages are stepped when it is powered on.

And CY.POS still has a range of 0-64, not 0-255. BUT reading the values gives a range from 0 to 255, which is a bit odd.

Using CY.POS 1 still produces LED glitches on all cycles. The glitches follow a pattern: it’s always one cycle that shows glitches, and they move along in a row, one after the other, from left to right…

At least it seems that i2c is working better even with metro scripts.

@tehn Since Teletype remote control seems to work better now but Ansible no longer works with the new firmware, I am about to reconnect Earthsea and Meadowphysics to Teletype to try remote control again - or do they need a new firmware update too to take advantage of the new Teletype version?

not entirely sure why (haven’t looked into how interrupts are handled) but i’m able to fix the issue with ansible freezing when something is plugged into any of the inputs by changing UI_IRQ_PRIORITY to 3:

this also matches the aleph version: https://github.com/monome/libavr32/blob/master/conf/aleph/conf_tc_irq.h
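
for reference, a minimal sketch of what that change looks like in conf_tc_irq.h - the comment is my read of it, and i'm assuming the usual AVR32 INTC convention of priority levels 0 (lowest) to 3 (highest):

// conf_tc_irq.h for ansible - UI_IRQ_PRIORITY is the only real change here
#define UI_IRQ_PRIORITY 3   // changed to 3, matching the aleph config linked above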

@Leverkusen could you give this a try? ansible.hex (205.5 KB)

it should fix freezing when using the inputs. tested it with both Cycles and Levels, both inputs seem to work (while also running a 10ms Metro script doing L 5 8 : CV I CV I). haven’t checked any of the other issues.

in regards to other modules - yes, since part of the fix is in libavr32, which is shared by all modules, their firmwares will need to be updated as well; probably makes sense to do that once we can confirm ansible is fully working.

1 Like

afraid it’s still a no go… seeing some weirdness (missed CV and trigger updates) and freezing when trying different variations of ansible CV and trigger updates from a Metro script. in some cases it’s just the ansible that freezes, in some cases both. so far i don’t see a pattern. have to stop for now, will continue investigating tomorrow.

1 Like

thanks for the reports. back to it tomorrow.

1 Like

Wow. Some serious improvement in the state of things. Excellent work thus far to all!

I’ve been testing this for an hour or so and here is what I’ve found.

  • I’m seeing the same problem as @Leverkusen with the Ansible IN jack; if I plug anything into input #1 the thing is a doorstop. Take it out and reboot and it is back in business. The behavior goes away with @scanner_darkly’s firmware from above.

  • Metro reads of Ansible or my expanders don’t immediately crash like before.

  • I can execute simple rapid reads from both Ansible and the TXi expander (up to the fastest Metro rate possible) without locking up the Teletype immediately.

  • Running simple reads for an extended period of time will sometimes result in a locked-up Teletype (with Ansible or TXi).

  • More complex combinations of reads and writes on the Metro script will still lock the Teletype (read from II input and write to another); I’ve been able to verify this with the expanders, Ansible and both together. The lockup happens nearly right away.

  • Multiple reads in one Metro event will lock the Teletype; this fails for Expanders and Ansible. For example: L 1 4 : CV I CV ADD I 4 will work for a bit (with manual sets of the Ansible CV values from the console) and then the Teletype will lock up.


I got my scope out of storage (forgot I’d put it there). I can wire it up - but I don’t have the capture software like with @scanner_darkly’s R2D2 unit. I’ll keep banging around on behaviors for a bit.

THANKS ALL!!!

2 Likes

Another interesting thing. While an edge case - it might point to trouble areas.

I’m able to run this metro script until the cows come home:

L 1 16 : TO.TR.PULSE I

The cows are also happy with some more blinking lights:

L 1 16 : TO.TR.PULSE I
L 1 4 : TR.PULSE I

But, if I add this into the mix:

L 1 16 : TO.TR.PULSE I
L 1 8 : TR.PULSE I

It will run for a while blinking a crap-ton of trigger lights - then the TT will lock up.

The cows, apparently, get angry when someone does that.

2 Likes

fixed:

TC was getting defined in the UI irq in the init code. fixed in TT as well.
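
roughly the shape of it, for anyone following along - this is a sketch only, the handler and macro names here are illustrative rather than the actual source:

// before: the timer/counter interrupt was registered at the UI priority level
// INTC_register_interrupt(&irq_tc, AVR32_TC_IRQ0, UI_IRQ_PRIORITY);
// after: register it at its own timer priority so ii traffic can't collide with it
INTC_register_interrupt(&irq_tc, AVR32_TC_IRQ0, APP_TC_IRQ_PRIORITY);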

i’ve been hitting ansible really hard with ii commands in a fast metro without a single crash.

here’s the newest tt:

i’ve been wrong before, though. but this is seeming good so far.

let’s hope!

4 Likes

Torture Test 1: Outputs

24 9ms Triggers
24 Ramping CV Values
M 50
Been Running for 30 Minutes

Holy cow! This is awesome!!

Moving on to read tests next…

6 Likes

Yeehaw! Can’t wait to try this with Just Type and see if it’ll run for more than 30 seconds! (Almost certainly won’t due to existing errors on my end heh…)

3 Likes

Torture Test 2: Reads

M 50
Write to CV 5
Human Centipede Reads to CV 6-8
Read Across to CV 1-4

Lasted a second or two:

Expanders exhibit similar behavior with multiple reads. Single reads also lock up randomly - but last longer.

It’s getting close - but not there yet with reads.

Test Patch:

A WRAP ADD A 1 0 10
CV 5 V A
L 6 8 : CV I CV SUB I 1
L 1 4 : CV I CV ADD I 4
1 Like

i did try changing IRQs in TT as well yesterday but of course forgot it has its own init!

will give this a try shortly. one more thing to try is increasing the number of rx buffers, i was seeing some behavior that indicated 8 was still not enough (with a loop of all 4 CVs it was only updating 1, 2 or 3). let me give that a try. have a feeling it might still be a good idea to consider protecting buffers that haven’t been processed yet, better to skip commands than to have them execute with corrupted data…
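
to illustrate the kind of buffer protection i mean, a minimal sketch (sizes and names are made up, not the actual teletype source) - the rx interrupt just skips an incoming command when every slot is still waiting to be processed, instead of overwriting one that might be mid-execution:

#include <stdint.h>

#define II_RX_BUF_COUNT 16   // assumed count; the current firmware uses 8
#define II_RX_BUF_SIZE  10

static volatile uint8_t rx_buf[II_RX_BUF_COUNT][II_RX_BUF_SIZE];
static volatile uint8_t rx_len[II_RX_BUF_COUNT];
static volatile uint8_t rx_write;   // slot the TWI interrupt fills next
static volatile uint8_t rx_read;    // slot the main loop drains next

// called from the TWI rx interrupt with a received ii command
static void ii_rx(const uint8_t *data, uint8_t len) {
    uint8_t next = (rx_write + 1) % II_RX_BUF_COUNT;
    if (next == rx_read)
        return;                     // all buffers still pending: skip the command
    if (len > II_RX_BUF_SIZE)
        len = II_RX_BUF_SIZE;
    for (uint8_t i = 0; i < len; i++)
        rx_buf[rx_write][i] = data[i];
    rx_len[rx_write] = len;
    rx_write = next;                // publish the slot only once it's fully filled
}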

1 Like

i’ve been running this now for 3 hours uninterrupted, with heavy reads and writes in a metro at 20ms.

@bpcmusic - i might be suspecting the expander code at this point. running your test patch now, and it's been working fine for several minutes.

off to fix ES/etc

2 Likes

TT scene still running just about uninterrupted, with ES hotswaps on the ii cable, additionally clocking es sequences at 20 ms!!!

2 Likes

Installed the 1.3.2 Release HEX on my TT and tried my same test above with Ansible. Froze in 2 seconds. Same behavior as I saw with the codebase merged into my TELEX branch. Hmm.

For this test I left my expanders connected to the II bus. All six of them (four output and two input). I’m going to try a test with them disconnected. Perhaps bus resistance with that many devices is a problem for reads? Perhaps the Teensys are doing something weird on the bus that affects reads (even though, in this test, the Teletype doesn’t even know they exist).

If it works - I’ll report back. Then I’ll break out the scope (after grabbing some grub).

1 Like

running the following script (the first line is to have visual feedback in the same script that reads/sets CV):

L 5 8 : CV I SUB 16000 CV I
L 5 8 : TR.PULSE I

and hitting both inputs with fast triggers (clock outs from WW&MP at max speed) i managed to get it locked up within a couple of minutes. going to try with more buffers now. M was 10 ms.

edit: still locks after a min or two. going to try buffer protection.

2 Likes

did you also update ansible?

1 Like

Yes; both are running the latest precompiled versions now.

I disconnected the other devices on the bus (6 expanders - leaving the Ansible and TT) and now the same Ansible read/write script I was testing above looks to be stable. I’ve been running it for a few minutes and it hasn’t locked.

In summary:

  • TELEX Branch and Latest Ansible w/ 8 Devices on Bus = Lockup after 2s
  • Latest Teletype and Ansible Branch w/ 8 Devices on Bus = Lockup after 2s
  • Latest Teletype and Ansible Branch w/ 2 Devices on Bus = No Lockup; Waiting for Cows
  • TELEX Branch and Latest Ansible w/ 2 Devices on Bus = No Lockup; Waiting for Cows

Based on those results, I’m thinking the options are:

  • TT’s Pullup Resistance not Sufficient for Reads with the Device Count
  • Teensys are Making the Bus an Unpleasant Place for Writes
  • Too Much Cable Between Devices Causing Issues

Out to grab a nibble now, but I’ll start playing around with the variables to see if I can more specifically identify the source.

2 Likes

does the teensy have pullups?

tt is the only device on the bus that should have them-- 10k. this value may need to change with more devices added
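
rough numbers on why more devices pushes that value down - the formula and limits are from the i2c spec; the 200pF is just a guess at what 8 modules plus cabling adds up to:

t_rise ≈ 0.847 × Rp × Cb
Rp = 10k, Cb ≈ 200pF  →  t_rise ≈ 1.7µs
standard-mode i2c allows 1µs max rise time (and 400pF max bus capacitance),
so at ~200pF the pullups would need to be roughly 1µs / (0.847 × 200pF) ≈ 5.9k or lower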

1 Like

The Teensys’ pull-ups are disabled in software. I was testing with an extreme number of modules. I’ll do a test with just a couple of expanders attached when I get back from lunch.