This sounds like the crux of the biscuit here

Also listing “requires Passersby” in the requirements of the Library posting would be helpful.

1 Like

I don’t understand adopting a procedure that will virtually guarantee unnecessary duplication of engine files. Engines and their supporting libraries are essential to most norns scripts.

1 Like

I believe they are doing this so that authors of the engines don’t need to make pull requests to monome just to make small changes to the engine.

if you didn’t write or mod the engine, don’t include it in your script’s repo; just point your script at the engine.

1 Like

concrete example: i write a script. there are four possibilities:

  • doesn’t use an engine. nothing to do.
  • uses an engine from we or from the default installation. nothing to do. (that’s you @carvingcode; i think pretty much all the engines that exist today will be in the default install.)
  • uses a custom engine. put it in my local lib; it gets installed with the script; done. if lots of other people want to write scripts around my new engine, maybe we ping monome to add it to the next update installer.
  • relies on an engine Foo that isn’t included in the default install for whatever reason. this is the one that sucks.
    i have two options:
    1. make it clear that people have to install Foo
    2. copy and rename Foo.sc to FooBar.sc and include it in my package.
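for option 1, a minimal startup check could look something like this (a sketch only, assuming the norns globals `engine.names` and the `tab.contains` helper; “Foo” and the message text are placeholders, not a real engine):

```lua
-- option 1 sketch: refuse to start, with a clear message, when the
-- Foo engine isn't installed. assumes the norns global `engine.names`
-- (list of installed engines) and the `tab.contains` table helper;
-- "Foo" is a placeholder engine name.
if tab.contains(engine.names, "Foo") then
  engine.name = "Foo"
else
  print("this script requires the Foo engine -- see its Library post")
end
```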

your suggestion to put all .sc files in a single, separate location doesn’t actually solve the problem, and it means projects with custom engines can no longer be easily installed, self-contained packages.

(for what it’s worth, i was a partisan of the “put all engines in we and require PRs” camp, voicing quite similar concerns to your own. this plan was rejected for exactly the reasons hypothesized by @rdfm above.)

i can think of no way to solve this problem other than to totally throw out SC, or maybe implement a full-blown package dependency manager.

i do realize this.

8 Likes

@zebra do you know of a way to git pull the whole code folder at once?
I know this might not make sense for dev branches and whatnot, but a quick update of everything at once would be pretty convenient

no, there’s no way. that’s the problematic aspect of the fully-decentralized plan.

but i agree. my hope is that most or all projects will be hosted on git and someone can maintain an aggregate meta-repo using submodules (or something.)

The issue today is that we’re still in a state of beta flux and the default installation has not yet been finalized or fully documented, which makes for a bit of a moving target for script writers and testers.

Apologies for causing confusion or any uproar; I’m pulling the dev branch every day to test changes, and that process is out of sync with script writers using the last beta release.

2 Likes

i don’t disagree. and @carvingcode i’m sympathetic to your concerns too. the system is confusing today and things are not where they’re expected to be.

most engines from the old dust repo are presently in the we repo.

old (don’t clone this) : https://github.com/monome/dust/tree/master/lib/sc
new (do clone this) : https://github.com/monome/we/tree/master/lib

the engines that are missing have migrated to their own repos
[https://github.com/tehn/awake/tree/master/lib]
[https://github.com/markwheeler/passersby/tree/master/lib]

(so @okyeron, to answer your question: i think you can just clone those three right now - we, awake, passersby - and be fine.)

and i think our next steps (ahead of release?) should perhaps be:

  1. encourage people to post in Library; as new projects are posted, prune duplicate engines from we
  2. set up a tracking repo that just includes submodules, for people who would rather use git than the forum; this makes it easier to clone ~/dust/code in one go.

does that seem reasonable?

5 Likes

Can’t the people who want to use git clone the individual repos directly?

1 Like

sure, but they aren’t mutually exclusive. right now there are 3 repos, but later there will be 10 or 20 or whatever.

release is Thursday.

the installer will include all of the engines in v1

the docs will be done. see the norns2 branch of monome/docs if you want a peek.

I apologize for this multi-day interim confusion. I wanted to kick off Library just ahead of release so people could immediately share/gather.

(apologies, I am on my phone. near computer again later this afternoon)

15 Likes

my $0.02 worth: this is going to run into all kinds of versioning issues (just based on years of experience).

For what it is worth my suggestion would be - if you use an engine you copy and rename it locally - and you are responsible for pulling fixes from the original author(s) - at least this way the user of an engine bears the pain. Currently I foresee a lot of “I downloaded this script and it doesn’t work” :frowning:

anyway - on to more positive things

this has all spurred me to get on with my code - proper releases of Kria and Islands coming very shortly in the library thread. Must say the stable wifi has led me to move out of screen/vim and into Maiden - great work on this update. Looking forward to playing with softcut too and seeing what it can add

5 Likes

Not if script authors can use a known path to engines.

When I did development not too long ago, a pretty common practice was to include libraries from external servers. (Thinking jQuery, etc.). Reason (among others): let the developer of those libraries maintain the code and updates, while we simply used them if they did what was needed.

Sure, if we made significant changes to some class/function/etc, we either created a special library, or overloaded a class, etc. I’m unsure why this (use of engines and their supporting libraries) is different.

so I don’t want to get into a great debate about this - most of us here have been developers for a very long time and know the pain of all this. There isn’t a good answer - otherwise the whole industry wouldn’t need to talk about it endlessly

however - if you start sharing code over time you quickly need to invent versioning. Someone needs to fork the code - upstream or downstream - unless everyone is happy to constantly update their code. And already that’s clearly not the case - I bet there are still scripts whose authors are thinking “I guess I should move it to 2.0” or aren’t even aware 2.0 is out yet

anyway - I’m not going to solve it and it’s not my pain so I should shut up

3 Likes

one last rant on this for now.

to me, your JS example illustrates almost the exact opposite point. if i build a web app that uses jquery, i will either serve the minified lib myself, or pull a specific version from their CDN - effectively the same thing as using a local copy. (regardless of whether it’s cached in the browser, on my server, on their server, on disk, whatever, the principle is the same: the app writer has complete control over the libraries that are served; they aren’t just pointing at “the latest version” of the lib, b/c that’s how stuff breaks. it’s why jquery &c have strict semantic versioning and multiple versions hosted all the time.)

just look at the panoply of versioning management and deployment tools built up in javascript world around managing library dependencies; it’s complex.

still, something like that would be fine here, and is totally fine for “engine libraries” - by which i guess you mean the lua ‘helpers’ that wrap parameters for Passersby, &c - or for any other libraries that are pure lua. the analogous strategy here would be for your script repo to pull in a submodule - which is effectively a copy of the source library frozen at a specific version - precisely analogous to pulling a specific lib version from a CDN, with or without a deployment/packaging tool.

(again: i kinda pushed for not having .sc classes in “third party” repos, precisely so that people could have more flexible deployment strategies for dependencies. but it’s a tradeoff b/c then you have potentially tightly coupled components in different locations! there is no clear way out of that.)

the only real “difference” is that we simply don’t have fine-grained and dynamic control over where SC looks for class definitions, so we just can’t do something like that for actual .sc classes. (without more work.) it is definitely a bummer. [*] i outlined two strategies for dealing with that above. there’s a third approach which is switching from requiring all SC code in .sc classes to providing “factory” classes that you customize at runtime instead of compile time. all three are on the table for the next major update. all three are heavy changes that we should undertake only with due consideration of actual issues - not hypothetical ones.

this update is mostly focused on 1) the jack client changes (which are huge), 2) getting people used to hosting their own scripts, and 3) some huge improvements to stability of infrastructure like networking, which makes everything easier and gives us the freedom to make more seamless file management tools in 2.x.

again, if script writers are only using existing engines, or only making their own engines, none of this is going to be any kind of issue whatsoever.

for now i would love to get on with making stuff. i think this will be quite usable in the short term; we have, i think, a fine understanding of the pitfalls.


[*] i want this to be crystal clear, so here’s another example: i write a script using engine Foo that someone else wrote. it’s defined in Foo.sc.

option A: Foo.sc lives in a central location/repository, like dust/code/we/lib. it’s in the SC path.

  • bonus: Foo is always available.
  • malus: monome has to manage versions of foo.
  • malus: i have to be aware if there are changes to the Foo API and update my script accordingly.

option B: Foo.sc lives in dust/code/fooguy/lib. since fooguy/lib is in the SC path too… hey! there is no difference to the script writer! the only difference is that the decision to include fooguy is up to the user, not monome. this is what we’re doing right now.

  • bonus: monome doesn’t have to manage versions of Foo.
  • malus: the user might not have Foo installed; you have to warn them. in practice, i don’t think this is gonna matter, b/c we’ll be rolling popular engines into ‘upstream’ channels.
  • malus: i have to be aware if there are changes to the Foo API and update my script accordingly.

with either option, you can always make FooBar.sc. it is deeply unfortunate that you can’t just “deploy” your own copy of Foo.sc into your project path, updating it with new versions as you please. this would solve all our complaints and be more “javascripty.” but for now, we can’t do that; it’s a limitation we have to work around, that’s life.
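to underline the “no difference to the script writer” point: under either option the script is identical, because the engine is referenced by class name and SC resolves Foo.sc from its class path (a sketch; `engine.hz` stands in for whatever commands the placeholder engine Foo actually defines):

```lua
-- identical under option A and option B: the script names the
-- engine, and SC finds Foo.sc wherever it lives in the class path.
-- "Foo" is a placeholder; `engine.hz` is a hypothetical command.
engine.name = "Foo"

function init()
  engine.hz(440)
end
```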


i want to make one important point (responding to @junklight): off the top of my head, i’m not aware of any actual changes that need to be made to engine code for 2.0. the SC engine API is presently entirely backwards-compatible. this was on purpose; if we wanted to break it we could do all kinds of fancy stuff. so if an engine author doesn’t want to self-host, we can just leave that engine in we and everything depending on it should continue to work.

(and indeed, there was a temptation to break it early, since there are only like 10-20 engines right now; but we wanted to get the update out sooner rather than even later, to take advantage of new features, stability, and performance.)

8 Likes

that’s the idea. no need to serve a local copy. it uses their server and may already be cached in the user’s browser. more efficient in many ways.

ok. I’ve spoiled a Tuesday full of fun for many going on about this. I’m off topic. Pulled my two small scripts so I don’t worry about breakage and will see what’s required for sharing down the road. Peace.

Let’s not make this more complicated than it has to be. The norns ecosystem is currently pretty small, I’m sure we can keep all the scripts and engines that currently exist and the new ones in the foreseeable future working just fine :slight_smile:

3 Likes

In terms of engine dependencies, some fairly informal conventions could help users.

if not tab.contains(engine.names, "Passersby") then
  print("Engine Passersby missing, download engine from [url]")
end

I can even imagine some little web app parsing the Library category, which a norns lua library could then use to pull engine dependencies automatically.

9 Likes

could anybody share a glut video, so i can make a library thread for it? :slight_smile:

3 Likes

May have found a bug. I’m working on a softcut script with no engine. I set a couple of polls for "amp_in_l" and "amp_in_r", and everything works fine. However, sleeping and restarting into the script causes an init error that persists until I either load another script (one that uses an engine) or remove the polls altogether.
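For reference, a minimal sketch of the setup described (a guess at the exact calls, assuming the norns `poll.set(name, callback)` API; the print callbacks are just illustrative):

```lua
-- minimal repro sketch for the report above: a script with no
-- engine that only sets input-amplitude polls. assumes the norns
-- poll.set(name, callback) API, which returns a poll object.
function init()
  amp_l = poll.set("amp_in_l", function(v) print("in L: "..v) end)
  amp_r = poll.set("amp_in_r", function(v) print("in R: "..v) end)
  amp_l:start()
  amp_r:start()
end
```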