Dizzying but invisible depth
You just went to the Google home page.
Simple, isn't it?
What just actually happened?
Well, when you know a bit about how browsers work, it's not quite that simple. You've just put into play HTTP, HTML, CSS, ECMAScript, and more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.
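To make that top layer a little more tangible, here is a minimal C sketch, assuming a POSIX system, of the HTTP step alone: one hand-written GET request sent to www.google.com over a plain TCP socket on port 80. It is only an illustration; a real browser would negotiate TLS (Google actually redirects to HTTPS), follow redirects, cache, compress, handle cookies, and much more.

/* Minimal sketch: a raw HTTP/1.1 GET over plain TCP, port 80.
   Illustration only; real browsers use TLS and far more machinery. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <netdb.h>

int main(void) {
    struct addrinfo hints, *res;
    memset(&hints, 0, sizeof hints);
    hints.ai_socktype = SOCK_STREAM;                  /* TCP */
    if (getaddrinfo("www.google.com", "80", &hints, &res) != 0)
        return 1;

    int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) != 0)
        return 1;

    const char *req = "GET / HTTP/1.1\r\n"
                      "Host: www.google.com\r\n"
                      "Connection: close\r\n\r\n";
    send(fd, req, strlen(req), 0);

    char buf[4096];
    ssize_t n;
    while ((n = recv(fd, buf, sizeof buf, 0)) > 0)    /* raw headers + HTML */
        fwrite(buf, 1, (size_t)n, stdout);

    close(fd);
    freeaddrinfo(res);
    return 0;
}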
Let's simplify.
You just connected your computer to www.google.com.
Simple, isn't it?
What just actually happened?
Well, when you know a bit about how networks work, it's not quite that simple. You've just put into play DNS, TCP, UDP, IP, Wi-Fi, Ethernet, DOCSIS, OC, SONET, and more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.
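Here, hedged the same way (POSIX only, a sketch rather than how any real stack is written), is the very first network step in C: asking DNS what addresses hide behind the name www.google.com. Everything underneath this one library call, the UDP queries, the caching resolvers, the root and TLD servers, stays invisible.

/* Sketch of the DNS step alone: resolve www.google.com and print
   each address the resolver hands back. */
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <netdb.h>
#include <arpa/inet.h>

int main(void) {
    struct addrinfo hints, *res, *p;
    memset(&hints, 0, sizeof hints);
    hints.ai_socktype = SOCK_STREAM;

    if (getaddrinfo("www.google.com", "http", &hints, &res) != 0)
        return 1;

    for (p = res; p != NULL; p = p->ai_next) {
        char ip[INET6_ADDRSTRLEN];
        void *addr = (p->ai_family == AF_INET)
            ? (void *)&((struct sockaddr_in *)p->ai_addr)->sin_addr
            : (void *)&((struct sockaddr_in6 *)p->ai_addr)->sin6_addr;
        inet_ntop(p->ai_family, addr, ip, sizeof ip);
        printf("%s\n", ip);   /* one of the many addresses behind the name */
    }
    freeaddrinfo(res);
    return 0;
}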
Let's simplify.
You just typed www.google.com in the location bar of your browser.
Simple, isn't it?
What just actually happened?
Well, when you know a bit about how operating systems work, it's not quite that simple. You've just put into play a kernel, a USB host stack, an input dispatcher, an event handler, a font hinter, a sub-pixel rasterizer, a windowing system, a graphics driver, and more, all of those written in high-level languages that get processed by compilers, linkers, optimizers, interpreters, and more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.
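As an illustration of one slice of that operating-system stack, here is a Linux-specific C sketch that reads raw key events from the kernel's evdev interface, underneath the windowing system and the input dispatcher. The device path is a placeholder (the real one varies per machine), and opening it usually requires elevated privileges.

/* Linux-only sketch: read raw key events below the windowing system.
   /dev/input/event0 is a placeholder path; usually needs root. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <linux/input.h>

int main(void) {
    int fd = open("/dev/input/event0", O_RDONLY);  /* placeholder path */
    if (fd < 0) { perror("open"); return 1; }

    struct input_event ev;
    while (read(fd, &ev, sizeof ev) == sizeof ev) {
        if (ev.type == EV_KEY && ev.value == 1)    /* press, not release */
            printf("key code %d pressed\n", ev.code);
    }
    close(fd);
    return 0;
}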
Let's simplify.
You just pressed a key on your keyboard.
Simple, isn't it?
What just actually happened?
Well, when you know a bit about how input peripherals work, it's not quite that simple. You've just put into play a power regulator, a debouncer, an input multiplexer, a USB device stack, and a USB hub stack, all of it implemented in a single chip. That chip is built around thinly sliced wafers of a highly purified single-crystal silicon ingot, doped with minute quantities of other atoms that are blasted into the crystal structure, interconnected with multiple layers of aluminum or copper that are deposited according to patterns of high-energy ultraviolet light focused to a precision of a fraction of a micron, connected to the outside world via thin gold wires, all inside a package made of a dimensionally and thermally stable resin. The doping patterns and the interconnects implement transistors, which are grouped together to create logic gates. In some parts of the chip, logic gates are combined to create arithmetic and bitwise functions, which are combined to create an ALU. In another part of the chip, logic gates are combined into bistable loops, which are lined up into rows, which are combined with selectors to create a register bank. In another part of the chip, logic gates are combined into bus controllers and instruction decoders and microcode to create an execution scheduler. In another part of the chip, they're combined into address and data multiplexers and timing circuitry to create a memory controller. There's even more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.
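The gate-level part of that story can at least be sketched in software. The following C toy, a model of the logic rather than the physics, builds everything from a single NAND function: NOT, AND, OR, and XOR gates made only of NANDs, then a 1-bit full adder, the seed from which an ALU's adder grows.

/* Toy model: transistors give you NAND; NANDs give you everything else. */
#include <stdio.h>

static int nand(int a, int b) { return !(a && b); }

static int not_(int a)        { return nand(a, a); }
static int and_(int a, int b) { return not_(nand(a, b)); }
static int or_(int a, int b)  { return nand(not_(a), not_(b)); }
static int xor_(int a, int b) {
    int n = nand(a, b);                       /* classic 4-NAND XOR */
    return nand(nand(a, n), nand(b, n));
}

/* One bit of addition: sum and carry-out from a, b, and carry-in. */
static void full_adder(int a, int b, int cin, int *sum, int *cout) {
    int s1 = xor_(a, b);
    *sum  = xor_(s1, cin);
    *cout = or_(and_(a, b), and_(s1, cin));
}

int main(void) {
    int sum, cout;
    full_adder(1, 1, 1, &sum, &cout);
    printf("1 + 1 + 1 = carry %d, sum %d\n", cout, sum);  /* carry 1, sum 1 */
    return 0;
}

Chain such adders and you have the arithmetic core of an ALU; the real chip does the same composition in doped silicon instead of function calls.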
Can we simplify further?
In fact, very scarily, no, we can't. We can barely comprehend the complexity of a single chip in a computer keyboard, and yet there's no simpler level. The next step down takes us to the software used to design the chip's logic, and that software itself has a level of complexity that sends us back to the top of the loop.
Today's computers are so complex that they can only be designed and manufactured with slightly less complex computers. In turn, the computers used for that design and manufacture are so complex that they themselves can only be designed and manufactured with slightly less complex computers. You'd have to go through many such loops to get back to a level that could plausibly be rebuilt from scratch.
Once you start to understand how our modern devices work and how they're created, it's impossible not to be dizzied by the depth of everything involved, and not to be in awe that they work at all, when Murphy's law says they simply shouldn't.
For non-technologists, this is all a black box. That is a great success of technology: all those layers of complexity are entirely hidden, and people can use them without even knowing that they exist at all. It is also the reason why many people find computers so frustrating to use: so many things can possibly go wrong that some of them inevitably will, but the complexity runs so deep that it's impossible for most users to do anything about any given error.
That is also why it's so hard for technologists and non-technologists to communicate with each other: technologists know too much about too many layers, and non-technologists know too little about too few layers, for effective direct communication to be possible. The gap is so large that it's no longer even possible for a single person to act as an intermediary between the two groups, which is why, for example, we end up with those convoluted technical support call centers and their multiple tiers. Without such deep support structures, you end up with the frustrating situation we see when end users have access to a bug database that is used directly by engineers: neither the end users nor the engineers get the information they need to accomplish their goals.
That is why the mainstream press and the general population have talked so much about Steve Jobs' death and comparatively so little about Dennis Ritchie's: Steve's influence was at a layer that most people could see, while Dennis' was much deeper. On the one hand, I can imagine where the computing world would be without the work that Jobs did and the people he inspired: probably a bit less shiny, a bit more beige, a bit more square. Deep inside, though, our devices would still work the same way and do the same things. On the other hand, I literally can't imagine where the computing world would be without the work that Ritchie did and the people he inspired. By the mid-1980s, Ritchie's influence had taken over, and even then very little remained of the pre-Ritchie world.
Finally, that is why our patent system is broken: technology has done such an amazing job of hiding its complexity that the people regulating and running the patent system are barely even aware of the complexity of what they're regulating and running. It's the ultimate bikeshedding: just as the proverbial town-hall discussions about a nuclear power plant end up being about the paint color for the plant's bike shed, the patent discussions about modern computing systems end up being about screen sizes and icon ordering, because in both cases those are the only aspects that the people involved are capable of discussing, even though they are irrelevant to the actual function of the overall system being discussed.
CC BY 3.0
There's nothing I can say, just +1. (Nov 2, 2011)
I could just as easily oversimplify Ritchie's role as you have Jobs's role. Without Ritchie we wouldn't have C or Unix. As if that's a bad thing. The systems that came before Unix were so much better. Unix is the best of a bad lot, but that doesn't mean it's still not a piece of crap.
Without Jobs, we may not have personal computers at all. The Apple II was the first real personal computer that people could do things on without programming it with switches and lights on the front (yes, Woz invented it, but without Jobs it would never have been sold into the mainstream). Not to mention that the Mac was the first usable GUI machine to go mainstream (the Xerox Star was released a few years before, but its interface was completely different: it was modal, the mouse was used to select things, all actions were performed with the keyboard, and there were cursor keys). Not to mention that we'd all still be using phones that flip open with keypads or look like Blackberries.
While I'm aware that Apple's current OS is based around Unix and C, it only became that way because NeXT was targeting the University market, and the Universities were all hooked on AT&T UNIX at that point, and they requested a machine that was a pretty GUI on top of Unix, and that's what NeXT built (and Apple bought NeXT, and OS X was heavily based on NeXTSTEP).
So, without Ritchie we wouldn't have a crummy PDP-11 assembler that thinks it's a language, and we wouldn't have a crummy operating system that took 60 years to stabilize. Yeah, I'd sure miss those things. I really would miss my Mac and iPhone, though.
But hey, this is Google+, so go ahead and write me off as some Apple fanboy, even though that couldn't be farther from the truth. Apple just happens to be the best at "hiding those layers of technology." Unix and C definitely are not. (Nov 4, 2011)
Brilliant and a real eye-opener for a nerd like me. (Nov 11, 2011)
+Brian Prentice Your Gartner "Marketecture" genes are really showing here! Unix as a mere computing widget? The relationship between Jobs and Ritchie is in many ways irrelevant, but in some really important ways Jobs was standing on the shoulders of giants with respect to C and Unix. Pascal wasn't cutting it in so many ways when C showed up, and Smalltalk was too captive of its commercial implementers. (Nov 15, 2011)
This post is great, but "conspicuously complicated" (C). The simple truth is that most people like flashy things and don't like to think unless they really need to, and even when they do, the outcomes are usually so awful that I'd rather see somebody else, a better thinker, do that thinking job for them.
I think Steve Jobs understood and exploited that simple truth well, and beat the hell out of other, more technical thinkers, like the author of this article, to create that flashy mindless stuff that the rest of us could easily enjoy without thinking much.
The only bad thing was that he also took away the last illusion of freedom, which caused a revolt among Android's aficionados, and now I'm sitting and thinking how I could limit that excessive and sometimes harmful creativity to make the "ordinary" user's life safer and simpler :) (Nov 26, 2011)
It was actually "perpetually complicated" and "physically conspicuous". Anyway, after so many years, I can still enjoy this ad:
http://www.macmothership.com/nucleus/index.php?itemid=455 (Nov 28, 2011)