OK, thanks for this (I still need to digest the economic theory in the first part of your post, as well as re-read Dryhurst’s article). At least I think I understand where you’re coming from, and I agree more or less in spirit.
No, of course one can’t transform the essence of technology simply by developing better tools – one must first probe that essence, which is basically commodification: the revealing of everything as a manipulable, exchangeable resource to be either stockpiled or recirculated. This revealing also claims us – as “human resources,” as infinitely replaceable – and AI has played an important role in this replaceability.
Again, the essence of technology is prior to actual technologies; this is why Philip K. Dick’s 1955 story “Autofac” perfectly predicts what will likely be the future of Amazon, and so on. This is also why James Ferraro or Jon Rafman can deliver potent critiques of the essence of AI without actually having to develop AI, and why Herndon’s gesture seems a bit superfluous here.
We don’t get our fundamental understanding from our tools; if we did, we could simply develop different ones and solve the problem then and there. This is the point Lanier constantly misses, and it is always annoying. I gather this is also how you read Dryhurst; for me the jury is still out, but I can see myself coming around to your position.
But there’s an interesting corollary which remains unaddressed in your critique. If the tools themselves are irrelevant to the understanding that forms the essence of technology (they only implement or reveal this understanding, but do not transform it, at least not by design), it also makes sense that existing tools such as AI and blockchain can and will be repurposed – they will come to reveal very different things about how we understand ourselves and the world. Since the tools themselves don’t really matter, why not speculate about the potential future understandings that might make use of them? You have to start somewhere, in other words – not just “burn it down” and leave no resources with which to build in light of a new understanding. This speculation does not posit the tools themselves as revolutionary. I think Herndon actually gets that AI is not revolutionary, and is looking towards how AI could mean something entirely different under a new understanding that is revolutionary.