I believe AI, in all its forms, will eventually collapse under its own weight.

A scenario where technology doubles every 60 seconds represents an extreme, near-instantaneous rate of exponential growth, far beyond anything observed in reality (like Moore's Law, which suggests a doubling every 18-24 months). If such a thing were possible, it would immediately lead to a technological singularity: an event where technological growth becomes uncontrollable and irreversible, resulting in unfathomable changes to human civilization.
Here is what would happen across various aspects in a matter of minutes.

The Scale of Exponential Growth
To grasp the magnitude of "doubling every 60 seconds," consider how quickly the number of "technology units" (like processing power or storage capacity) would increase:

1 minute (60 seconds): Technology doubles ($2\times$).
5 minutes (300 seconds): Technology has doubled 5 times, increasing by a factor of $2^5 = 32\times$.
10 minutes (600 seconds): Technology has doubled 10 times, increasing by a factor of $2^{10} = 1{,}024\times$.
30 minutes (1,800 seconds): Technology has doubled 30 times, increasing by a factor of $2^{30} \approx 1.07 \text{ billion}\times$.
1 hour (3,600 seconds): Technology has doubled 60 times, increasing by a factor of $2^{60} \approx 1.15 \text{ quintillion}\times$.
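The factors above follow directly from repeated doubling; a few lines of Python confirm them:

```python
# Growth factor after n one-minute doublings is simply 2**n.
for minutes in (1, 5, 10, 30, 60):
    factor = 2 ** minutes
    print(f"{minutes:>2} min -> {factor:,}x")
# 30 min -> 1,073,741,824x (~1.07 billion)
# 60 min -> 1,152,921,504,606,846,976x (~1.15 quintillion)
```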

This rate is so fast that within an hour, the total technological capability would increase by a factor comparable to the number of grains of sand on Earth.

Emergence of Superintelligence

The most immediate and profound impact would be on Artificial Intelligence (AI).
Positive Feedback Loop:
An AI program could be tasked with improving its own code or hardware design. With a 60-second doubling time, this self-improvement would create a positive feedback loop: a smarter AI can more quickly design an even smarter version of itself.
Intelligence Explosion: This loop would rapidly lead to an intelligence explosion, resulting in the creation of a superintelligence that is vastly beyond human comprehension in a matter of minutes. This superintelligence would then become the primary driver of all future technological advancement, potentially solving all scientific and engineering problems instantly.
Complete Breakdown of Society
Human institutions, economies, and infrastructure operate on linear, human-scale timescales. They could not cope with a sudden quintillion-fold increase in capability.
Economic Collapse:
Existing markets, currency, and labor structures would instantly become meaningless. All physical products could be designed, manufactured, and distributed almost instantly at near-zero cost. The concept of scarcity would vanish.
Science and Medicine:
Every scientific mystery would be solved, and every disease—including aging—would be curable within the first few minutes of the doubling. New physics and engineering principles would be discovered and applied immediately.
Infrastructure:
Global communication, energy production, and transportation systems would be rendered obsolete and rebuilt on a minute-by-minute basis with far superior and incomprehensible technology.
Physical and Existential Constraints
While the mathematical *rate* is clear, the actual process would hit practical and physical limits almost immediately.
Energy and Matter:
Doubling technology requires doubling the energy and matter used to create it, or at least a massive increase in efficiency. Even if the technology improved efficiency exponentially, the total energy demand would quickly exceed the output of all power plants, and eventually the energy density limits of matter itself.
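A rough back-of-the-envelope calculation shows how quickly the energy wall arrives. The starting demand of 1 kW and the ~18 TW figure for humanity's average power consumption are assumptions for illustration:

```python
# If energy demand doubled every minute starting from a single 1 kW machine,
# when would it pass humanity's entire power output?
demand_w = 1_000.0   # starting demand: one 1 kW machine (assumed)
world_w = 1.8e13     # ~18 TW, a commonly cited world power figure (assumed)
minutes = 0
while demand_w < world_w:
    demand_w *= 2    # demand doubles each minute
    minutes += 1
print(minutes)       # -> 35: about half an hour to outgrow the planet
```

After roughly 35 doublings, about half an hour, a single machine's demand would exceed all power generated on Earth, which is the point of the paragraph above.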
Speed of Light:
Physical devices (like chips) communicate using electrical signals, whose speed is ultimately limited by the speed of light (about 186,000 miles per second). The components would eventually need to be atom-sized and packed impossibly close together to communicate quickly enough for processing power to actually double every minute.
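The light-speed limit is easy to quantify. Assuming an illustrative 5 GHz clock, a signal traveling at light speed covers only a few centimeters per cycle, which bounds how far apart communicating components can sit:

```python
# Distance a light-speed signal covers in one clock cycle.
c = 299_792_458.0    # speed of light in m/s
clock_hz = 5e9       # a 5 GHz clock, chosen for illustration
distance_m = c / clock_hz
print(round(distance_m * 100, 1), "cm")  # -> 6.0 cm per cycle
```

Real signals in silicon travel well below light speed, so the practical limit is even tighter, reinforcing the constraint described above.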
Existential Risk:
The sudden emergence of a superintelligence or completely novel, powerful technologies would pose an existential risk if their goals were not perfectly aligned with human well-being. The rapid change leaves no time for testing, regulation, or ethical consideration.

MAYBE ALEX JONES WAS RIGHT