Defining RSI

What is RSI (recursive self-improvement)? I take it to mean that humans exit the loop of AI development. But how much, exactly? It could be that humans remain cheaper than automated alternatives for a wide variety of physical labor for some time, while we attribute >90% of advances in AI R&D to AI systems, and 100% of the technical or conceptual advances. Is that RSI?

Specifically, the concrete form of this claim is something like: you could automate an entire AI company. The automated company could have a business plan, sell products, “earn its way”, and spend some of its profits on physical labor performed by humans. Is that RSI?

Under the stronger definition of RSI, which requires that humans are completely out of the loop of AI development, this is not RSI. But such a company would still move much, much faster: it would sit closer to the upper bound on the rate of progress than human-run AI companies do.

This post from Ajeya Cotra offers a response to this claim, I think/hope. TODO: review and add the answer if it exists.

Pure-software

We hold inputs roughly fixed, or capped below some humanly achievable threshold, and ask whether AI of various forms can improve the efficiency of all the different parts of the chain that it controls.

I think this gets at an interesting dynamic: the parts of AI R&D that get automated immediately compress to a small share of the total effort. As Tyler Cowen says, all the other parts become the bottlenecks; the human-run parts become the bottlenecks. The rate of progress jumps as a result of the automation and then quickly hits a ceiling set by the next thing that is not yet automated, as the sketch below makes concrete.
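A minimal way to see this is Amdahl's law (my framing, not anything from the sources above; the fractions and speedup factors below are invented for illustration): if automation makes some fraction of the pipeline arbitrarily fast, the overall speedup is capped by the fraction that stays human-run.

```python
# Toy Amdahl's-law model of automating parts of an AI R&D pipeline.
# All shares and speedup factors are invented for illustration.

def overall_speedup(automated_share: float, automation_factor: float) -> float:
    """Amdahl's law: overall speedup when `automated_share` of the work
    gets `automation_factor`x faster and the rest stays human-paced."""
    return 1.0 / ((1.0 - automated_share) + automated_share / automation_factor)

for share in (0.5, 0.9, 0.99):
    for factor in (10, 100, 1_000_000):
        print(f"automate {share:.0%} of the work {factor:>9,}x faster "
              f"-> overall {overall_speedup(share, factor):6.1f}x")

# Even an effectively infinite speedup on 90% of the work caps overall
# progress at 10x: the human-run 10% is the bottleneck.
```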

It seems pretty likely to me that we do not live in a world where the genuinely pure-software loop is feasible. That is, I don't think we live in a world where AI systems can make advances in AI large enough that the next set of AI systems, having made use of those advances, are able to find the next set of advances, and so on.

Part of the reason might be that good ideas are getting harder to find: the inputs required for each discovery of equivalent magnitude grow roughly exponentially, while the gains from each such discovery do not.

Most of the gains have contributed to the next generation not so much directly as indirectly: via the economy, via diffusion, via the companies' revenues growing massively. Notably, those revenues have been increasing exponentially. If they were increasing only linearly, you basically couldn't sustain progress at all, because of the “ideas are getting harder to find” power law. The toy model below illustrates the difference.
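Here is a toy model of that dynamic (my construction, with invented numbers, not derived from any source): suppose the t-th discovery costs exponentially more research input than the last, and compare a lab whose input budget grows exponentially (revenue doubling every year) against one whose budget grows linearly.

```python
# Toy "ideas are getting harder to find" model: the t-th discovery costs
# cost_growth**t units of research input. Numbers invented for illustration.

def progress_after(years: int, budget_at, cost_growth: float = 2.0) -> int:
    """Count discoveries made when each year's budget is budget_at(year)
    and unspent budget banks over to later years."""
    discoveries, banked = 0, 0.0
    for year in range(years):
        banked += budget_at(year)
        while banked >= cost_growth ** discoveries:
            banked -= cost_growth ** discoveries
            discoveries += 1
    return discoveries

exp_budget = lambda t: 2.0 ** t   # revenue doubling every year
lin_budget = lambda t: 1.0 + t    # revenue growing linearly

for years in (10, 20, 40):
    print(f"{years:>2} yrs: exponential budget -> "
          f"{progress_after(years, exp_budget):>3} discoveries, "
          f"linear budget -> {progress_after(years, lin_budget):>2}")
```

Under these assumptions the exponentially funded lab keeps making discoveries at a steady rate, roughly one per year, while the linearly funded lab's output grows only logarithmically and effectively stalls.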