Published on July 24, 2025 8:11 PM GMT
> "Lord, give me chastity and continence, but not yet"
>
> -- St. Augustine
It seems to me that the current rate of AI progress is largely fine when measured in absolute capability increments. We are not afraid of next month's progress; in fact, we as a society are mostly enthusiastic about it, and the benefits so far have clearly outweighed the downsides. Some clear downsides have started to appear: chatbots, especially GPT-4o, induce psychosis in some vulnerable people. But this seems manageable; maybe it could be fixed just by drafting a statement like "Mitigating the risk of psychosis from the free version of ChatGPT should be OpenAI's priority alongside other societal-scale risks such as helping create biological and chemical weapons" and promoting it widely online (especially on Twitter).
Current AI progress seems largely fine.
But it is exponential.
One day we might think "okay, this is getting too fast". It seems prudent to me to move from the "exponential growth" paradigm to a "linear growth" paradigm before then. That's why I propose the following governance idea:
1. Find out which economic indicators are most useful for tracking "AI's share in the economy".
2. Create an international organization akin to a global central bank: but instead of forecasting and limiting inflation, it would forecast and limit the rate of growth of AI's share in the global economy, as measured by the indicators selected in point 1.
The target could be to automate e.g. 0.4 - 0.6 percentage points of the global economy with AI every year. How could this "global central bank for AI" limit AI's growth? It would need to be researched, but I have in mind something like this (from least severe to most severe):
1. Taxing AI revenues: a fixed percentage on all AI revenues globally; the percentage would be decided monthly based on current and forecast statistics (akin to how interest rates are set by central banks).
2. Taxing AI hardware: a fixed dollar amount (the same globally) per theoretical FLOPS of each hardware piece, perhaps paid monthly. All hardware pieces would need to be registered in order to be used legally, similar to firearms.
3. Limiting or stopping global production of AI hardware (2 and 3 could be introduced together, as they are similar in severity).
4. (After AI hardware production is stopped) Removing existing AI hardware from its owners: by introducing taxes so aggressive that owners who can't afford them can stop paying by surrendering their AI hardware to internationally monitored storage, to be stored for future use or officially destroyed.
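As a toy illustration of mechanism 1, here is a minimal sketch of a monthly rate-decision rule, loosely analogous to an inflation-targeting central bank. The adjustment step, the band boundaries, and the rule itself are all my own illustrative assumptions, not specifics of the proposal:

```python
# Toy sketch of mechanism 1: a monthly AI-revenue tax-rate decision,
# loosely analogous to how a central bank sets interest rates.
# All numbers here are illustrative assumptions, not part of the proposal.

TARGET_LOW, TARGET_HIGH = 0.4, 0.6  # target band: pp of global economy automated/year

def next_tax_rate(current_rate, forecast_growth_pp):
    """Return next month's global AI-revenue tax rate (a fraction in [0, 1]).

    forecast_growth_pp: forecast annualized growth of AI's share of the
    global economy, in percentage points per year.
    """
    STEP = 0.01  # one-point adjustment per decision; an arbitrary choice
    if forecast_growth_pp > TARGET_HIGH:   # growing too fast -> raise the tax
        current_rate += STEP
    elif forecast_growth_pp < TARGET_LOW:  # growing too slowly -> lower the tax
        current_rate -= STEP
    # keep the rate within [0%, 100%]
    return round(min(max(current_rate, 0.0), 1.0), 4)

print(next_tax_rate(0.10, 0.9))  # above the band -> 0.11
print(next_tax_rate(0.10, 0.5))  # inside the band -> 0.10, unchanged
```

A real rule would presumably be smoother (e.g. proportional to the deviation from the band, like a Taylor rule), but even this bang-bang version shows the feedback-loop shape of the idea.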
The above is just a draft. I welcome feedback and, if you judge my idea to be worthwhile, I'm looking for collaborators to develop it further. My goal is either for my idea to be destroyed by the truth, or to be developed into a governance report & recommendation. You can contact me privately at zaborpoczta(at)gmail(dot)com.
Q: Why economic indicators and not e.g. some benchmarks?
A: Economic impacts seem like the ultimate measure: hardest to game and the most objective. Even AI sceptics such as Robin Hanson would accept them. Also, everyone understands what unemployment is.
Q: Why move from exponential to linear growth instead of just stopping?
A: Many (most?) people don't want to stop now. Even I don't want to stop completely now, and my P(doom) is in the double digits. Also, how do we solve alignment after shutting down all the GPUs? Invest in improving humans via synthetic biology enough to create people with IQs >15 points higher than John von Neumann's, then take out our pens and papers, discover the True Nature of Intelligence, and program a safe seed AI in <75 megabytes of Flare code? If that's realistic, then sure... My proposal could then serve as a meaningful step toward implementing an "off-switch" as proposed by MIRI. Either way, it seems like an improvement over the status quo: we could limit the chances of a rapid intelligence explosion by limiting hardware, and we could gain valuable time for alignment if we go from, say, a global economy that is 2% AI to one that is 50% AI over 80 years (80 * 0.6 percentage-point increments). Maybe just in time to prevent declining fertility from causing a new "dark ages" period?
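The back-of-envelope arithmetic above can be checked directly (the 2% starting share, 50% ending share, and 0.6 pp/year cap are the post's own illustrative numbers):

```python
# Check the back-of-envelope timeline: under a linear cap of
# 0.6 percentage points automated per year, going from a 2% AI
# economy to a 50% AI economy takes (50 - 2) / 0.6 years.
start_share_pct = 2.0
end_share_pct = 50.0
cap_pp_per_year = 0.6

years = (end_share_pct - start_share_pct) / cap_pp_per_year
print(years)  # -> 80.0
```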