One of the big tech buzz phrases lately is "the Internet of Things". Moore's Law suggests that computing power and memory roughly double every 18 months. It's gotten to the point where your cell phone doesn't just have more computing power than the computers that guided the Moon landings - it has millions of times more power than that. The trajectory is smaller, faster, cheaper, less power draw.
I myself have been playing this tech game for so long that I remember the ROACH chip - the Router On A CHip. That's so pedestrian now as to be all quaint and retro. A smartphone is only possible because all the myriad components (and their connecting communications buses) have long since collapsed into a single flake of silicon.
And so to the Internet Of Things. What happens when computers appear in everything? Camp Borepatch has six (!) heating/cooling zones, with smart thermostats that talk to each other over internal wiring. The last owner put all that in, and the technology is likely nearly ten years old. Your car has dozens of computers. The IPv6 address space has 2^128 unique addresses - enough to give one to every device anyone is ever likely to build - so all of these can be Internet enabled, allowing them to talk to each other and work cooperatively to solve problems that nobody has thought of before, because positing a solution would have seemed absurd on its face.
Silicon Valley in general (and Cisco in particular) is all over this as the Next Big Thing.
The problem is that current Operating Systems stink. More specifically, they were designed for the Apollo era - even Linux traces back to Unix, which got its start in the late 1960s. The network is a marvel of redundancy and resiliency (as indeed DARPA designed it to be, again back in the '60s), but networks still go down, and we're quite a long way from applications that gracefully handle network outages. The trouble is that error handling lives at the application level, which means that you have to write it for each of the apps on the system. Every. Single. App. It's like having to handle network addressing in each app, rather than in the OS. Actually, it's worse.
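To see what "error handling at the application level" costs, here's a minimal sketch (in Python, with names I made up for illustration) of the retry boilerplate that, today, every networked app must reinvent for itself:

```python
import time

def with_retry(fn, attempts=3, backoff=0.01):
    """Generic retry logic -- the boilerplate that, in the current
    paradigm, every single networked app has to write on its own."""
    for attempt in range(attempts):
        try:
            return fn()
        except OSError:
            if attempt == attempts - 1:
                raise  # out of attempts; give up
            time.sleep(backoff * (2 ** attempt))  # exponential backoff

# A stand-in for a network call: fails twice, then succeeds.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise OSError("network unreachable")
    return "payload"

result = with_retry(flaky_fetch)  # succeeds on the third try
```

Multiply that little dance by every app on every device and you see the scale of the problem: the logic is always the same, yet nothing below the app ever provides it.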
The current computing paradigm is broken when you think of it scaling to billions of processors distributed randomly around the world. Too bad for the Internet Of Things.
Or is it? Clark at Popehat has a very interesting (and pretty technical) overview of Urbit, which shows the promise of shattering the data center into a billion shiny computing shards:
Nock programs are tree structures.
This is not unprecedented – Lisp ("The greatest single programming language ever designed.") does too.
And here – suddenly – the conceptual Legos start clicking together.
Because a Nock program is functional, it operates without caring what machine it's on, what time it is, what the phase of the moon is.
Every Nock program is a tree, or a pyramid. Every subsection of the
tree is also a tree. Meaning that each subsection of a Nock program is a
smaller Nock program that can operate on any machine in the world, at
any time, without caring what the phase of the moon is. Meaning that a
Nock program can be sliced up with a high carbon steel blade, tossed to
the winds, and the partial results reassembled when they arrive back
wafted on the wings of unreliable data transport.
Nock programs – and parts of programs – operate without side effects.
You can calculate something a thousand times without changing the
state of the world. Meaning that if you're unsure if you've got good
network connectivity, you can delegate this chunk of your program not just to one other machine, but to a thousand other machines and wait for any one of them to succeed.
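That side-effect-free property is the whole trick: if a computation can't change the state of the world, you can run it in several places at once and keep whichever answer arrives first. Here's a rough sketch of the idea in ordinary Python - threads standing in for remote machines, and all the function names mine, not Urbit's:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def pure_chunk(x):
    """A side-effect-free computation: same input, same answer,
    no matter which machine (or thread) happens to evaluate it."""
    return sum(i * i for i in range(x))

def delegate(fn, arg, copies=4):
    """Submit the same pure computation to several workers and take
    the first result that comes back. Because fn is pure, running
    duplicates is harmless: every copy returns the same value."""
    with ThreadPoolExecutor(max_workers=copies) as pool:
        futures = [pool.submit(fn, arg) for _ in range(copies)]
        for done in as_completed(futures):
            return done.result()  # first one home wins

answer = delegate(pure_chunk, 1000)
```

With side effects, duplicating work like this would corrupt state; without them, redundancy is free, which is exactly why flaky networks stop mattering.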
Moore's Law says that all of these billions of network node devices will be smarter in 18 months - twice as smart. As people replace (say) smart light bulbs every 5 years, that's roughly three generations of performance improvement, so there will be 8 times the computing power available in the Internet Of Things - and Urbit/Nock lets you harness that.
It actually lets anyone harness that:
Nock supports and assumes full encryption of data channels, so not
only can you spread computation across the three machines in your home
office, you can spread it across three thousand machines across the
world.
The list goes on and on.
Envisioning and defining Nock took a stroke of genius. Implementing it, and Hoon, and Urbit, will be a long road.
But once it's all done, it will function like an amazingly solid,
square, and robust foundation. All sorts of things that are hard now,
because we have built our modern computational civilization on a
foundation of sand, will become easy. We have vast industries based around doing really hard work fixing problems that modern computing has, but a Nock infrastructure would not – Akamai, for example, pulls in $1.6 billion per year by solving the problem that modern URLs don't work like BitTorrent / Urbit URLs.
When an idea, properly implemented, can destroy multiple different ten-billion-dollar-a-year industries as a side effect, it is, I assert, worth thinking about.
I imagine that some of you have been following the "Anarcho" part of all of this and wondering where the "Capitalism" part comes in. That's it, right there. With a billion networked computers, all more powerful than the computer you're reading this on right now, computing ceases to be a scarce commodity. This quite frankly turns the field of computer security on its head - I don't know that this solves the problem of Denial Of Service, but I don't know that it won't. After all, if your computer (whatever that means in an Urbit world) is DDoS'ed, why couldn't your Nock programs just run somewhere else?
You can see why Cisco is pushing this so hard - the network essentially becomes the computer (as the old Sun Microsystems advert put it). It makes Cisco's networking gear more valuable.
And now to the really subversive part. Clark again:
Back in the early days of the internet, when Usenet was cutting edge, there was a gent by the name of Timothy C. May who formed the cypherpunk mailing list.
His signature block at the time read
Timothy C. May, Crypto Anarchy: encryption, digital money,
anonymous networks, digital pseudonyms, zero knowledge, reputations,
information markets, black markets, collapse of government.
I bring up his sig block because in list form it functions like an
avalanche. The first few nouns are obvious and unimportant – a few
grains of snow sliding. The next few are derived from the first in a
strict syllogism-like fashion, and then the train / avalanche / whatever
gains speed, and finally we've got black markets, and soon after that
we've got the collapse of government. And it all started with a single
snowflake landing at the beginning of the sig block.
Timothy C. May saw Bitcoin. He saw Tor. He didn't know the name that Anonymous would take, and he didn't know that the Dread Pirate Roberts would run Silk Road, and he didn't know that Chelsea Manning would release those documents. …but he knew that something like that would happen. And, make no mistake, we're still only seeing small patches of hillside snow give way. Despite the ominous slippages of snowbanks, Timothy C. May's real avalanche hasn't even started.
I suggest that Urbit may very well have a similar trajectory. Functional programming language. Small core. Decentralization.
First someone will rewrite Tor in it – a trivial exercise. Then some
silly toy-like web browser and maybe a matching web server. They won't
get much traction. Then someone will write something cool – a
decentralized jukebox that leverages Urbit's privileges, delegation and
neo-feudalist access control lists to give permissions to one's own
friends and family and uses the built in cryptography to hide the files
from the MPAA. Or maybe someone will code a MMORPG that does amazingly
detailed rendering of algorithmically created dungeons by using spare
cycles on the machines of game players (actually delegating the gaming firms' core servers out onto customer hardware).
Probably it will be something I haven't imagined.
Will this happen? Who knows? But Silicon Valley is pushing this because it (rightly) sees a paradigm shift. The folks at the Fed.Gov are clueless, shambling dinosaurs (otherwise they'd work in Silicon Valley, duh - yes, that sounds arrogant; yes, it's true). And so, if this happens, the Fed.Gov won't realize it until it's already happened. Until the paradigm toothpaste has shifted out of the tube.
And the punch line? Imagine how much metadata the NSA will have to analyze with two orders of magnitude more computers, each making three orders of magnitude more encrypted, randomized network connections. That's a factor of 100,000: they will need 100,000 times the compute and storage capacity within a decade. And, more importantly, the imagination to know how to make this work. And they'll need a further 100,000 times that ten years further out.
Let us know how that works for ya, Ft. Meade. There's no way that the NSA has increased its computing power by a factor of ten billion in the last 20 years. They won't do it in the next 20, either.
The world is far less predictable, and far less controllable than anyone thinks. It's very probably less predictable and controllable than anyone can imagine - at the very time that Progressives think that they can lock down control over the populations and institute the New Jerusalem. Let us know how that works for ya, Progs.
It is our task, both in science and in society at large, to prove the
conventional wisdom wrong and to make our unpredictable dreams come true.
- Freeman Dyson
Bootnote: Who is the man behind Urbit? His name is Curtis Yarvin, and he works in Silicon Valley. He also goes by the nom de blog of Mencius Moldbug.
We've seen him before here. Clark addresses this obliquely in the comments to his post:
The neo-reactionary stuff on Urbit that seems to be decoration is not. It is the whole point.
If Yarvin (and Cisco, and Silicon Valley) can pull this off, this is Big, big stuff. RTWT, including the comments which are packed full of smart.