Rewind to 1965. Gordon Moore, who would go on to co-found Intel (and later serve as its CEO), observed that the number of transistors per integrated circuit was doubling every year. In 1975, he revised the pace to a doubling every two years, and the projection has been known as Moore's Law ever since. Technology has come a long way since then, with today's leading chips packing well over 5 billion transistors.
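Moore's Law is just exponential growth: doubling every two years multiplies the transistor count by 2^(years/2). A quick sketch of the arithmetic (the starting count and years below are illustrative ballpark figures, not from this article):

```python
import math

def projected_transistors(start_count, start_year, target_year, doubling_period=2):
    """Project a transistor count forward under Moore's Law:
    the count doubles once every `doubling_period` years."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Illustrative figures: start from 2,300 transistors in 1971
# (roughly the Intel 4004 era) and project 40 years of doubling.
print(f"{projected_transistors(2_300, 1971, 2011):,.0f}")  # about 2.4 billion

# How many doublings separate 2,300 transistors from 1 billion?
print(f"{math.log2(1_000_000_000 / 2_300):.1f}")  # roughly 18.7
```

Forty years is twenty doublings, so a factor of about a million; that is the whole trick behind the staggering numbers in this story.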
Safe to call it one of the biggest changes in tech to date#
In the 1970s, the TMS 1000, a microcontroller from Texas Instruments, had well over 5,000 transistors. In 2013, Apple's own silicon, the A7, packed over a billion. With this, we've gone from a chip that could do basic calculations (albeit with an estimated useful life of 240 years before failing) to chips that can do just about anything: make indie films, take stunning photos, play back high-fidelity audio, and even run some pretty impressive games (Breath of the Wild on the Switch being a prime example).
Time to put on the brakes, but not on hardware#
Every year, companies like Nvidia, AMD, Intel, and Apple ship new silicon for consumers. Nvidia came out with its 30-series GPUs, Intel with its 11th-generation CPUs, AMD with a new generation of Ryzen CPUs and APUs, and Apple with the introduction of Apple Silicon on the Mac, along with new chips for its mobile and home devices (e.g. iPad, iPhone, Apple TV).
However, most of these chips are playing catch-up with the onslaught of intensive games and the ever-growing complexity of renders for animated and CG films and shows. New World, a game from Amazon's game studio, made a notable impression during its closed beta by causing hardware failures: not crashes, not poor performance, but outright hardware failure. Adding insult to injury, this happened on the RTX 3090, Nvidia's most expensive and highest-end consumer card (at the time of writing).
The RTX 3090 has an MSRP of $1,500 US, but retailers have struggled to keep it in stock, leaving prices on platforms like eBay at double or triple the MSRP.
The law we need: Ephemeralization#
Coined nearly three decades before Moore's Law, "ephemeralization" was R. Buckminster Fuller's idea that technology would reach a point where we can do "more and more with less and less until eventually you can do everything with nothing." With the recent uptick in ARM computing, it seems reassuring that we may be starting to turn toward ephemeralization. But although we can now do so much with so little on our phones, we still have leagues to go on the desktop and server front.