So did RISC architecture really change everything or what?

There’s a moment in Hackers when Dade Murphy, aka Crash Override, aka Zero Cool, is poking around Kate Libby’s laptop. He notices the specs and starts rattling them off to prove he's in the know:

Dade: It has a killer refresh rate.

Kate: P6 chip. Triple the speed of the Pentium.

Dade: Yeah, it's not just the chip; it has a PCI bus. But you knew that.

Kate: Indeed. RISC architecture is going to change everything.

Now, that line might have meant something to a certain subset of '90s computer nerds, but I always thought they mentioned RISC because it sounded techy and gave Dade a chance to reply, "RISC is good." As I got older, I realized RISC (Reduced Instruction Set Computing) wasn't just a buzzword.

It was a real architectural philosophy with a bold promise: strip the instruction set down to simple, fixed-length operations that pipeline cleanly, and get more speed by doing less per instruction. None of that bloated C(omplex)ISC-style overhead. Just raw performance.
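
If you want to see that philosophy in one screenful, here's a minimal sketch: a trivial C function (the name add_from_memory is mine, purely illustrative) plus representative instruction sequences a compiler might emit for a CISC target (x86-64) and a RISC target (RISC-V). Exact output varies by compiler and flags; the point is the load/store split.

    /* One tiny C function: read an int from memory and add x to it. */
    int add_from_memory(const int *p, int x) {
        return *p + x;
    }

    /*
     * x86-64 (CISC): arithmetic instructions can take a memory
     * operand directly, so the load can fold into the add:
     *
     *     movl  (%rdi), %eax      # eax = *p
     *     addl  %esi,   %eax      # eax += x
     *
     * (a compiler may even emit a single "addl (%rdi), ..." here)
     *
     * RISC-V (RISC): a load/store architecture. Only loads and
     * stores touch memory; arithmetic is register-to-register:
     *
     *     lw    a0, 0(a0)         # load *p into a register first
     *     addw  a0, a0, a1        # then a plain register add
     *
     * Fixed-size encodings and fewer instruction forms keep the
     * pipeline simple and full: speed by doing less per instruction.
     */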

The future was RISC, and the thirty-year-olds playing high school kids knew it.

At least, that’s what Hackers wanted me to believe. Then again, it also wanted me to believe Marc Anthony could be a Secret Service agent, so, you know, grain of salt.


The Revolution That Wasn’t (Yet)

In the mid-90s, RISC architectures were everywhere: MIPS powering SGI workstations, SPARC keeping Sun servers alive, PowerPC at the heart of Apple Macs. Tech magazines (think print blogs) swore the Intel x86 dinosaur was on borrowed time. And for a while, it felt true—RISC chips were faster, cleaner, more elegant.

Oh, and in case you aren't a nerd like me:

  • MIPS: A RISC architecture (the name stands for Microprocessor without Interlocked Pipeline Stages, whatever the hell that means) that showed up in everything from early workstations to routers, and later lived a second life inside game consoles like the PlayStation 2.
  • SGI: Silicon Graphics, Inc., a company that made high-end 3D graphics workstations—those dinosaurs in Jurassic Park were rendered on SGI machines.
  • SPARC: Sun Microsystems’ in-house RISC chip (Scalable Processor ARChitecture), famous for powering big Unix servers that kept the Internet humming in the 90s.
  • PowerPC: A joint effort by IBM, Motorola, and Apple to dethrone Intel; PowerPC chips ran inside Macintosh computers from the mid-90s until Apple’s big Intel switch in 2006.

But then reality set in. Windows ran on x86. Intel had money. AMD kept pace. And as much as RISC promised to "change everything," it quietly retreated from the mainstream desktop. By the 2000s, most people’s first (and second, and third) PCs were running CISC-flavored Intel or AMD chips, not some futuristic workstation CPU.


The Revolution That Did Happen

Here’s the twist: Kate (aka Acid Burn) wasn’t wrong. She was just 30 years too early.

Because while RISC fizzled in the desktop wars, it absolutely dominated everywhere else. ARM (Advanced RISC Machines) became the beating heart of mobile computing. Your phone, your tablet, your smartwatch, your Nintendo Switch—they all run on RISC. Billions of chips ship every year, quietly proving that yes, RISC did change everything. Just not in the way Burn expected.

Even Intel’s venerable x86 eventually bent the knee. Modern x86 chips work by translating those ancient, messy instructions into RISC-like micro-ops under the hood. The world Hackers promised didn’t vanish; it just slipped in through the side door and took over while we were watching Netflix on our phones.
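
Here's a hedged sketch of what that translation looks like, reusing the same kind of trivial function as before. The µop names and the exact two-way split are invented for illustration; real decoders are proprietary and differ across microarchitectures.

    /* The C-level view: a classic "register += memory" step. */
    int accumulate(int acc, const int *p) {
        return acc + *p;
    }

    /*
     * The CISC surface: one x86-64 instruction with a memory operand:
     *
     *     addl  (%rsi), %edi            # edi += *p
     *
     * What the front end plausibly cracks it into (RISC-like
     * micro-ops; names and format here are made up):
     *
     *     uop 1:  load  tmp <- [rsi]         # a pure load
     *     uop 2:  add   edi <- edi + tmp     # register-to-register add
     *
     * That's exactly the load/store split a RISC ISA writes out
     * explicitly. The CISC skin stayed; the engine underneath went RISC.
     */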


Hack the Planet (With ARM)

Fast-forward to today: Apple’s M-series Macs (ARM-based) are embarrassing Intel on both performance and battery life. AWS has its own ARM server chips (Graviton). Microsoft is finally taking ARM Windows seriously with its Surface products. Suddenly, RISC isn’t just powering your pocket; it’s coming for your desktop, your datacenter, and your smart fridge.

So was Acid Burn right? Absolutely. It just took 30 years, a smartphone revolution, and one very smug fruit company to prove her point.

RISC did change everything. Just not when, or where, we thought.

Anyway, back to watching the Hackers reboot:

Dade: It has a killer resolution.

Kate: Apple M3 Max chip. 16 cores. GPU has 40.

Dade: Yeah, it's not just the chip; it has a dedicated neural engine, optimized for on-device AI. But you knew that.

Kate: Indeed. AI is going to change everything.

Dade: Yeah. Em dashes are good.