I wrote this a couple of years ago (7/14/2011) for another, more corporate site. I’m reblogging here as it’s as good a summary as any of my thoughts on the respective roles of machines and humans.
It’s officially summer blockbuster season, so it seems a good time to comment preemptively on the inevitable angst, film and ink that will be devoted to the next permutation of The Singularity.
Let’s be clear: modern society already literally cannot function without machine-based intelligence and automation. We are also creating a worldwide “hive mind,” enabled by cheap, fast connectivity; cheap, ubiquitous mobile devices; and various forms of cheap social media. Information technology is now a kind of intellectual exoskeleton, multiplying and conjoining the power of its individual operators, letting them share information rapidly across large distances and audiences, or complete massive projects that would once have taken years, if not generations. We are already The Borg, even if most of us have no neural implants yet.
Now, here’s a fun little infographic from Chuck Hollis’ blog that neatly puts into perspective what all this increasing cheapness and ubiquity of communications and computation is going to mean for the average IT person within the next decade.
Ouch. Feeling enslaved yet?
Here’s the problem: we’ve mentally gotten our relationship with our machines exactly backwards, and that’s led us to design ourselves into a corner in IT.
Western Renaissance thinkers attempted to demystify the workings of the human body by casting it as a self-contained, deterministic machine, and most descriptions of the brain still lean on computer analogies. So the standard narrative of the Industrial Revolution, predicated on that earlier mechanistic construct, is all about humans becoming cogs in assembly lines: briefly and sequentially operating on items presented by machines, and generally behaving as much as possible like machines themselves.
But a funny thing has happened on the way to human obsolescence: it turns out that we’re lousy cogs. We’re optimized for adaptability, not unvarying repeatability; our brains are highly plastic. Conversely, we’ve done a great job of developing machines that are ever faster and more precise, but no smarter. Design and policy decisions remain entirely the province of humans.
The real story of the Industrial Revolution is one of functionally separating design from execution and policy from process, and then optimizing and mechanically accelerating process execution.
So let’s go back to that slightly terrifying infographic. Right now, a lot of IT work is early-20th-century assembly-line stuff: applying the same set of procedures, hopefully in the same way, to each device that comes down the line. It works well enough most of the time. But what happens when we speed up the line in the face of the data explosion IDC projects? At a certain point, just a few years away, the only way for IT to keep pace will be to let machines take over most of the actual bolt-tightening.
Management and orchestration tools by themselves will not take care of this any more than virtualization alone does. Infrastructure itself needs to become far simpler and more able to share information automatically and laterally across devices, not just upwards to the management stack. Think of it as the difference between having two-year-old quadruplets who need you to do almost everything and referee all their interactions (and how long do they stay refereed?), and mostly self-regulating teens who need strategic direction, boundaries and periodic guidance.
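To make that architectural contrast concrete, here is a toy sketch in Python. All class and method names (`CentralManager`, `FabricDevice`, `gossip`, and so on) are invented for illustration and reflect no vendor’s actual API; the point is only the difference between every device reporting upward to one hub and devices sharing state laterally with their peers:

```python
# Toy illustration: hub-and-spoke management vs. lateral (peer-to-peer)
# state sharing. Names are hypothetical; this models the concept only.

class CentralManager:
    """Device-centric model: every device reports only upward, so the
    manager alone ever holds the whole picture and must drive every change."""
    def __init__(self):
        self.state = {}

    def poll(self, devices):
        for d in devices:
            self.state[d.name] = d.local_state  # N round-trips, one hub

class FabricDevice:
    """Lateral model: each device gossips its state to its peers, so the
    members converge on a shared view without a central hub."""
    def __init__(self, name):
        self.name = name
        self.local_state = {name: "up"}
        self.peers = []

    def gossip(self):
        for peer in self.peers:
            peer.local_state.update(self.local_state)

# Build a tiny three-device ring "fabric" and let state spread laterally.
a, b, c = FabricDevice("a"), FabricDevice("b"), FabricDevice("c")
a.peers, b.peers, c.peers = [b], [c], [a]

for _ in range(2):            # two gossip rounds suffice for a 3-node ring
    for dev in (a, b, c):
        dev.gossip()

print(sorted(a.local_state))  # → ['a', 'b', 'c']: each device knows all three
```

In the lateral model no single component has to mediate every interaction, which is exactly the shift from refereeing toddlers to guiding teens described above.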
This will mean a move away from device-centric management. The “logical chassis” concept proposed by Brocade (VCS technology), Juniper (QFabric) and Cisco (for servers, but ironically, not switches) is a very large first step in that direction. It will also require more graceful interactions between infrastructure domains, most especially where virtual machines are concerned. VMware’s CEO Paul Maritz recently commented that it takes VMware IT three days to vMotion a single machine because of all the handoffs required between infrastructure teams. The result is what Jon Hudson (@the_solutioneer) calls “VM veal” (virtual machines penned in place, never allowed to move), and a complete failure of the promise of virtualization.
We’ve spent the last few centuries trying to turn people into machines, with poor results. On the other hand, the manufacturing world has proven over the last 40 years that once we reach a point of technical sophistication at which it’s possible to design a process around a goal rather than around machines’ limitations, we can achieve astounding efficiencies. We’re at that point in IT now: we can choose to be technology-enabled superhumans, rather than slaves to our machines.