Dizzying but invisible depth

You just went to the Google home page.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how browsers work, it's not quite that simple. You've just put into play HTTP, HTML, CSS, ECMAscript, and more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.

Let's simplify.

You just connected your computer to www.google.com.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how networks work, it's not quite that simple. You've just put into play DNS, TCP, UDP, IP, Wifi, Ethernet, DOCSIS, OC, SONET, and more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.
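To make just one of those layers concrete: a DNS lookup typically starts with a small query packet, usually sent over UDP (the wire format comes from RFC 1035). Here's a hedged sketch in Python of how such a query for www.google.com could be built by hand; it's illustrative, not a full resolver:

```python
import struct

def build_dns_query(hostname: str, query_id: int = 0x1234) -> bytes:
    """Build a minimal DNS query for an A record (RFC 1035 wire format)."""
    # Header: ID, flags (0x0100 = recursion desired), QDCOUNT=1, other counts 0.
    header = struct.pack(">HHHHHH", query_id, 0x0100, 1, 0, 0, 0)
    # QNAME: each label prefixed by its length, terminated by a zero byte.
    qname = b"".join(bytes([len(label)]) + label.encode("ascii")
                     for label in hostname.split(".")) + b"\x00"
    # Question footer: QTYPE=1 (A record), QCLASS=1 (IN).
    return header + qname + struct.pack(">HH", 1, 1)

query = build_dns_query("www.google.com")
assert len(query) == 32  # 12-byte header + 16-byte QNAME + 4-byte footer
```

A stub resolver sends those bytes to port 53 and parses the answer; when a response doesn't fit in a UDP datagram, it retries over TCP, which is one reason both protocols show up in the list above.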

Let's simplify.

You just typed www.google.com in the location bar of your browser.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how operating systems work, it's not quite that simple. You've just put into play a kernel, a USB host stack, an input dispatcher, an event handler, a font hinter, a sub-pixel rasterizer, a windowing system, a graphics driver, and more, all of those written in high-level languages that get processed by compilers, linkers, optimizers, interpreters, and more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.

Let's simplify.

You just pressed a key on your keyboard.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how input peripherals work, it's not quite that simple. You've just put into play a power regulator, a debouncer, an input multiplexer, a USB device stack, a USB hub stack, all of that implemented in a single chip. That chip is built around thinly sliced wafers of highly purified single-crystal silicon, doped with minute quantities of other atoms that are blasted into the crystal structure, interconnected with multiple layers of aluminum or copper, that are deposited according to patterns of high-energy ultraviolet light that are focused to a precision of a fraction of a micron, connected to the outside world via thin gold wires, all inside a packaging made of a dimensionally and thermally stable resin. The doping patterns and the interconnects implement transistors, which are grouped together to create logic gates. In some parts of the chip, logic gates are combined to create arithmetic and bitwise functions, which are combined to create an ALU. In another part of the chip, logic gates are combined into bistable loops, which are lined up into rows, which are combined with selectors to create a register bank. In another part of the chip, logic gates are combined into bus controllers and instruction decoders and microcode to create an execution scheduler. In another part of the chip, they're combined into address and data multiplexers and timing circuitry to create a memory controller. There's even more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.
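To give a hedged taste of just one of those steps (a Python illustration, not how any real keyboard controller is laid out): NAND gates can be wired into the other basic gates, those into a one-bit full adder, and full adders chained into a ripple-carry adder, the kind of building block an ALU is made of:

```python
def NAND(a, b):
    """The universal gate: everything below is built from this."""
    return 1 - (a & b)

def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))

def XOR(a, b):
    # Classic 4-NAND exclusive-or.
    n = NAND(a, b)
    return NAND(NAND(a, n), NAND(b, n))

def full_adder(a, b, carry_in):
    """One bit of addition, built only from the gates above."""
    partial = XOR(a, b)
    total = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, partial))
    return total, carry_out

def ripple_add(x, y, width=8):
    """Chain full adders to add two small integers, as a simple ALU would."""
    carry, result = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

assert ripple_add(5, 3) == 8
assert ripple_add(100, 55) == 155
```

Every operation here ultimately bottoms out in nothing but NAND, which is the point: each layer only makes sense in terms of the one below it.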

Can we simplify further?

In fact, very scarily, no, we can't. We can barely comprehend the complexity of a single chip in a computer keyboard, and yet there's no simpler level. The next step takes us to the software that is used to design the chip's logic, and that software itself has a level of complexity that requires going back to the top of the loop.

Today's computers are so complex that they can only be designed and manufactured with slightly less complex computers. In turn the computers used for the design and manufacture are so complex that they themselves can only be designed and manufactured with slightly less complex computers. You'd have to go through many such loops to get back to a level that could possibly be re-built from scratch.

Once you start to understand how our modern devices work and how they're created, it's impossible not to be dizzied by the depth of everything involved, and not to be in awe of the fact that they work at all, when Murphy's law says that they simply couldn't possibly work.

For non-technologists, this is all a black box. That is a great success of technology: all those layers of complexity are entirely hidden and people can use them without even knowing that they exist at all. That is also the reason why many people find computers so frustrating to use: there are so many things that can possibly go wrong that some of them inevitably will, but the complexity goes so deep that it's impossible for most users to do anything about any error.

That is also why it's so hard for technologists and non-technologists to communicate with each other: technologists know too much about too many layers and non-technologists know too little about too few layers to be able to establish effective direct communication. The gap is so large that it's not even possible any more for a single person to be an intermediary between those two groups, and that's why e.g. we end up with those convoluted technical support call centers and their multiple tiers. Without such deep support structures, you end up with the frustrating situation that we see when end users have access to a bug database that is directly used by engineers: neither the end users nor the engineers get the information that they need to accomplish their goals.

That is why the mainstream press and the general population have talked so much about Steve Jobs' death and comparatively so little about Dennis Ritchie's: Steve's influence was at a layer that most people could see, while Dennis' was much deeper. On the one hand, I can imagine where the computing world would be without the work that Jobs did and the people he inspired: probably a bit less shiny, a bit more beige, a bit more square. Deep inside, though, our devices would still work the same way and do the same things. On the other hand, I literally can't imagine where the computing world would be without the work that Ritchie did and the people he inspired. By the mid-80s, Ritchie's influence had taken over, and even back then very little remained of the pre-Ritchie world.

Finally, that is why our patent system is broken: technology has done such an amazing job at hiding its complexity that the people regulating and running the patent system are barely even aware of the complexity of what they're regulating and running. That's the ultimate bikeshedding: just like the proverbial discussions in the town hall about a nuclear power plant end up being about the paint color for the plant's bike shed, the patent discussions about modern computing systems end up being about screen sizes and icon ordering, because in both cases those are the only aspects that the people involved in the discussion are capable of discussing, even though they are irrelevant to the actual function of the overall system being discussed.

CC:BY 3.0
There's an extra level of complexity on the side: that of businesses, myriad standards, and agendas competing with each other.

For every layer or technology you mention, there are multiple ways to achieve the same or a similar result, some better than others. How you implement one of the layers depends greatly on the ones underneath, which in turn influences how to implement the layers above. The number of combinations is astounding, yet most of the choices seem obvious from a technical standpoint, whilst other chosen solutions are absurd but exist for mere historical or agenda reasons.

Makes me consider the separation between user space and machine space in the current crop of operating systems.

The illusory separation between software and hardware is fading fast.

When app=appliance a good bit of this post will seem very early 21st century.

Jobs deserves more credit than that. He was able to explain extremely technical concepts to non-techies. He was incredibly tech-minded and spent his teenage years coding on a timeshare machine at NASA / Ames. His drive to create personal computing was more than altruistic. It was selfish, he wanted a personal computer, so he built one.
Simply put. WOW! Amazing read. Thanks for posting that. 
WHOA, WHOA, WHOA... You lost me at browsers! Now, what do I type where?
This is precisely why I love the Google + community. Thank you for blowing my mind. It was far too early on a Saturday for this.
This should be recited to the US Congress, who frankly at this point shouldn't be allowed to talk about anything with electricity running through it.
+1 for the complexity/simplicity and length of the post alone. Another +1 for the content. I agree with the others here. This is worth printing and handing to all those family members I support, in person and remotely, who say "I'm afraid I'm going to screw something up." A bow to you, sir.
A great relaxing read after a couple of hours of trying to get the printer to work and then another 30 minutes trying to explain why I am unable to answer such a simple question as "What is the actual error?" in less than another hour or so. :-)
Beautifully put.

Would you like to come to the UK and discuss online privacy with our government?
Related: http://abstrusegoose.com/98

That's one of the reasons I love technology, and one of the reasons I chose to work on consumer electronics: at least normal people know what a browser is and how useful it is. Not the same can be said of all industry fields.
I just don't understand how our culture comes to hold in contempt the things that are truly interesting, while standing in line for pedestrian feel-good marketing.
You forgot to talk about electricity and/or batteries. I still can't get my head around electricity, and also radio waves and how they go through walls.

To follow up on your Ritchie/Jobs comments. Computers are an almost unique invention as in themselves they don't really do anything. They are nothing more than a tool ... a bit like a Swiss Army knife but one that you can add attachments to.

We all use them differently. From games to porn, creating a novel or making a film .. criminality to keeping in contact ... expressing our thoughts etc.

So for me as a developer, whilst yes the first time I played with an iPhone it did feel really nice, sleek etc .. after a minute of awe I was then off seeing how my sites looked in the browser.

My laptop of choice is 11 years old, big and clunky; I can leave it in the pub when I have a cigarette or go to the toilet. The batteries last well over 5 hours. It is unappealing, with tape around it, and worth just a few pounds, but that is its beauty. It does exactly what I want. It runs my websites locally, it is fast, and I can punch out HTML5, edit images etc, and if I had the skill I could create a killer site with it.

It has little or no style, but bags of substance. That, I think, may be a bit like the Ritchie/Jobs contributions. Plus, guess what? When I am coding Javascript and PHP ... it doesn't half feel like C :)
Thanks everyone for the comments.

Like several people noted, my examples don't cover the big picture. I didn't go into full depth at any of those levels. More importantly, I explicitly avoided any of the breadth at any of the levels, even though that breadth adds a lot of complexity.

Without mentioning the breadth of things, my examples really only dug deep on one aspect of one task, so the details I've written about in my post don't even take into account the extra complexity of creating a system that can run multiple tasks. If you follow what I've written, you really only end up with a machine that can display the Google home page and nothing else.

Once you start to think about all the possibilities, you end up with an exponential explosion of incredibly many potential combinations, which is another aspect that many people tend to ignore: versatility and specialization both have advantages, and it's essentially impossible to get the best of both worlds.
Um, long rant - and I am with you on this. So, anyone want to talk solutions? The dizzying complexity makes me go the "Steve Jobs" way, in simplifying to the core and saying No to patents altogether. Yet, clearly, some control is needed, especially now that we outsource most of our manufacturing, which means we must teach "the whole world" how We created such complex machinery and how it all works (and why). So, anyone have a solution? Any opinion on the recent Obama Patent reform?
Wow... Just, wow... And no, I don't mean the game... I mean, wow, this is great post!
+Adi Rabinovich - the huge paradox is that the complexity of what happens under the covers is actually necessary to be able to achieve the apparent simplicity that ends up surfacing: a lot of the complexity exists for the sole purpose of hiding complexity.

Computers aren't the only such domain. You could look at cars as well: they've become more complex so that they're easier to use and so that people (literally) don't need to look under the hood. Electronic fuel injection combined with a mass air flow sensor and an oxygen sensor. Vacuum-assisted brakes. Power steering. Automatic gearboxes. Electric windows. All-season tires. Airbags. Those have made cars much more complex than they used to be, so much that it's essentially impossible for anyone to know everything well, but at the same time they also made the cars more reliable overall, safer, and much easier to use.
+Jean-Baptiste Queru But how does this translate into the domain of software? Many of us built major Infrastructure code components that are designed to hide complexity, and yet, inevitably they translate into computing overhead which we all hate. Of course, given the benefits of overall shorter development time and a more stable product, this is definitely worth it. But I keep wondering, is there a better way...
+Jean-Baptiste Queru Taking your car analogy and being a devil's advocate: they still have 4 wheels, they just get you from A to B, normally work with an internal combustion engine, and while they may be more comfortable and comparatively safer, not much has fundamentally changed for nearly 100 years. That is, apart from the complexity now meaning that the average person cannot fix the mechanics, or a blacksmith just make a new part. That said, when I was a kid I seem to remember a lot of broken-down cars on the roadside.
+Mark Warner Luckily, once Electric gets here, I think we'll be able to simplify Cars A Lot. I can foresee where you'd be choosing a car based on most Elegant user interface, versus anything else :)
+Adi Rabinovich - I think that the overhead is necessary, not just to hide complexity from end-users, but also to hide complexity from engineers themselves (though we like to call it abstraction). In any system, no layer can be more complex than a single engineer can manage. Adding layers makes it possible to add extra end-to-end complexity, but the cost is indeed that within each layer some of the mental capacity of the engineers involved is used to maintain the layering and not just the actual valuable logic.

That's why we explicitly try to build software in layers, so that no area needs to communicate with more than two other areas. Similarly, that's why so much effort goes into the "true" APIs (i.e. the ones that are used by applications, not between the intermediate layers): that interface is the one that has the most fan-out, i.e. where one implementation of a common function is used by many thousands of applications. It's got to be just right: if it's too high-level it's too constraining and prevents some applications from being created; if it's too low-level it forces the applications themselves to be built in layers, which wastes a lot of time that could have been used for creativity instead.
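As a toy illustration of that layering (hypothetical names, not any real protocol or API): each layer talks only to its immediate neighbor, and applications only ever see the top one:

```python
class Transport:
    """Bottom layer: moves raw bytes (here it just pretends to)."""
    def send_bytes(self, data: bytes) -> int:
        return len(data)  # report how many bytes were "delivered"

class Session:
    """Middle layer: frames messages; knows only about Transport."""
    def __init__(self, transport: Transport):
        self.transport = transport
    def send_message(self, text: str) -> int:
        # Prefix each message with a 4-byte big-endian length.
        frame = len(text).to_bytes(4, "big") + text.encode("utf-8")
        return self.transport.send_bytes(frame)

class Api:
    """Top layer: the one interface applications actually call."""
    def __init__(self, session: Session):
        self.session = session
    def post(self, text: str) -> int:
        return self.session.send_message(text)

api = Api(Session(Transport()))
assert api.post("hello") == 4 + 5  # 4-byte length prefix + 5 payload bytes
```

Changing how Transport works doesn't touch Api, which is exactly the property that lets thousands of applications share one implementation underneath.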
In anthropology/sociology, there is a famous, possibly notional tribesman who thinks that anyone who can't make (/repair) whatever he uses as tools is a fool or worse.

It got complicated once we moved a good distance beyond hunting and gathering, and (modulo some sort of Singularity/Borg "union with our machines") it's going to stay complicated (or mysterious, or both) unless we go back close to hunting and gathering.

Are we okay with techno-mystery? What is to be done?
+Mark Warner - but then I could say that computers are still made of MOS transistors and software written mostly in derivatives of C, so the computer analogy can be broken down the exact same way ;-)
+Adi Rabinovich Yes, it will still have 4 wheels and create 1000s of fatalities and really not be too different. I am not putting my point very clearly. What I'm trying to say is that a lot of 'technology' today is really about style.

In no way am I dissing Steve Jobs what he achieved was phenomenal and the Stanford address was the most inspirational speech I have ever heard BUT we all fall for the reality distortion field.

Although it's quite cool to see a scrolled window or a menu bob a bit before it settles down, I'd personally prefer to save a bit of money and processing power. Having something that has a longer battery life, that isn't flat after a month in a drawer, that makes the best of a bad signal etc.

As an example I have been looking at tablets ... and what I would like to see is one that is really cheap and folds like a clam shell. Halving the size and the back protecting the screen. That would mean when folded out there would be a pixel-sized line across the screen. I would live with that, but I know I'm in the minority.
+Mark Warner I am going to have to argue with you regarding style relevance. It's really not a distortion - it is essential to how we view and interact with things. As humans - we need feedback that things "move" and "act", so most animations are designed to help us understand and remember the behavior of an interface, to make it ultimately As Easy as you say - when it's eventually automatic and you don't need to "wait for battery wasting animation". A lot of the other side of design is the elegance of simplicity - managing to start off with a semi-empty screen which hides a lot of functionality, which is then gradually exposed as parts become relevant.
Regarding Tablets - I personally think we'll get there with Flexible screens, but that will be a while. For now, couple of companies attempted dual-screen folding approach - I think including Sony. It never worked out for now, but search the web and you will find them out there, perhaps on clearance due to lack of interest :)
+Michael M. Butler - As I see it, there are multiple steps:

-At the first step, groups of people work together to be more effective, each specializing in specific tasks that anyone in the group was aware of and could still understand. This is how you end up with millers, bakers, blacksmiths, lumberjacks.

-At the second step, people specialize further, so that people couldn't understand all the possible tasks any more, but such that they were still aware that the tasks existed. That's essentially the industrial age.

-Finally, as a third step, there's so much complexity that people don't even know that the tasks exist. Paradoxically, that's our information age.

Personally, I'm not fundamentally worried about the mystery that surrounds the most complex technologies, because I think that's really the only way that such technologies can have a broad reach.

However, I am worried about the mystery that surrounds the simpler aspects, and specifically food. I feel that too many people trust a mysterious black box ("supermarket") to supply food. Most people have no idea how that food is prepared. Going further, people have no idea about how food could be prepared anyway, so even if they looked in their supermarket's supply chain they wouldn't be able to actually judge whether there's anything wrong with it.
+Adi Rabinovich Hey! I am not disagreeing with you at all! (At the moment I am working out whether I can eke out a living using Adsense compared with Jobs: Apple, Pixar, Disney, itunes etc etc etc)

It's just my personal preference ... plus I don't really get tablets ... for a bit of surfing .. Wikipedia reading some techie stuff etc they seem OK .. but as a main or even secondary machine I think I'd pass. Prefer a mobile with a big screen that is touch, but also has a flip down keyboard.
+Jean-Baptiste Queru I think the Food discussion deserves a whole separate post of its own - maybe we should have one next weekend. And we'll throw in Organic versus non-Organic, for good measure :)
The SF writer Charles Stross ran a sort of thought-experiment on his blog a while back. Against a background of hypothetical space colonization, he was wondering about how many people it would take to replicate and sustain modern technological society.

The order-of-magnitude handwavy sort of conclusion he drew was around a hundred million people. Once you start unpacking the actual production and skill requirements of everything we see around us -- cars? smartphones? lettuce in February? -- it's staggering just how many people's collective work gets combined into invisible ubiquity.
The end-user of any industry has no idea about the complexity of what goes into the product. Layers upon layers of abstraction is what our society is about... Turtles all the way down !
+Michael M. Butler - Actually, you got me digging deeper, thanks for the thought-provoking comment.

The gap between technologists and non-technologists is fundamentally that technologists typically live in that second stage: I don't know how to do the work at each of the levels mentioned above (though I have worked at every software level), but I know that they exist. Non-technologists are stuck in the third stage, though: they simply can't know what goes on under the covers.

I'm gonna go out on a limb here and extrapolate to politics.

Living at the first stage, at the scale of a village, everyone sees everyone. It's quite clear how our actions directly affect the lives of others that we literally know by name. People are within our field of vision.

In the second stage, at the scale of a city, it's not quite possible to see every single person any more or to know everyone, but it's possible to know that people are there: we see them in the streets, we talk to them in stores, we see where they work. People are still within the horizon.

In the third stage, at the scale of a country (or of the whole world, it doesn't matter), it's not even possible to get real awareness of everyone. We're just abstracting other people behind numbers. 300 million people in the US, 7 billion around the world. Most people are now beyond the horizon.

Given that our actions influence the lives of others beyond our individual horizons, I believe that it's unrealistic to trust individuals to make decisions that are beneficial for everyone they can affect, simply because individuals can't even be really aware of the people they affect. I really think that our third-stage society fundamentally requires that we entrust some of the decisions to people who manage to see beyond the horizon, so that they can guide us toward doing things that are beneficial to people we can't see and so that those people we can't see can do things that are beneficial to us.
An incredible insight into the depth of complexity involved in typing this comment, by 'sliding' over the keys, on a device with a processor twice the speed of the desktop I bought only 15 years ago, but fits in my pocket. Thank you.
+Tim Bray - I've got a gut feeling that DNS can happen over either UDP or TCP, but I might very well be wrong, and I'm too lazy to double-check. Luckily, this doesn't change my point :)
Thanks for posting this; it is mind-boggling to me the depth and complexity inside something that from the outside looks so simple. Then again, I think that is also what attracts me to it.

... At least Bjarne Stroustrup is still alive.
And if I had to pick a legend, then I'd go for Jon Postel ... not Steve Jobs
Amazing writing. Saying "it's not that simple" is a bit of an understatement. 
Nice post.

BTW: Typo near end: "how out modern"
This is probably the best post I've read in a while. I'm a CS student so I could get a slight grip on the complexities of tech. I just got out of a debate with a dmr fanatic who bashed the hell out of Mr. Jobs because he was overly praised after his death unlike Mr Ritchie. Well put Mr. +Jean-Baptiste Queru. And the mention of the patents, man, I couldn't have said that any better. Thank you for this.
Was reading till I got to the part about Dennis Ritchie. I remembered the name, so I Googled him. Realized he was the creator of the C language and Unix. Found out he passed away on the 12th of October.
To realize his impact, think where Google, Apple, even Microsoft would be today without his input.
I just found this from Hacker News... very well written.
Thank you for the post.

Can't help but think if indeed we're turning the technology into a black box, are we shielding those who deserve the credit from receiving it?

The industry is infested with 'experts' who are more-or-less running the show. How can we, the technologists, take back the industry? How can we show everyone The Depth, while still protecting them from falling in?
So, when will we lose the knowledge required to build a film camera?
fascinating. i'll have to wikipedia Dennis Ritchie myself.
+Jean-Baptiste Queru Thank you for posting this! It gladdens me to see that at least some people still respect the history and spirit of computing. RIP DMR. You sir, are going into my "Following circle".

On a related side note, to the readers not so familiar with the history of computing: take a look at the "Mother of All Demos": http://en.wikipedia.org/wiki/The_Mother_of_All_Demos

The Mac and all of today's devices were built by standing on the shoulders of giants like Doug Engelbart. The modern computer is still essentially the same as the computer demonstrated in the demo in 1968. Still we like to credit Steve Jobs and Bill Gates as the inventors and forget the real heroes. Capitalism likes to remember its own heroes, I guess.
Very well written, especially about Steve Jobs and Dennis Ritchie.
Excellent post as well as followups, and +Jean-Baptiste Queru - UDP is indeed involved in some aspects of DNS - and I did assume that was what you meant while I read the post, nice it see it clarified in the comments.
While I agree on a technical ground, I don't like the notion that complexity can be used as an excuse for not being able to deal with other people.

Some programmers are extremely focused on detail. So much that they are not able to work with others (techies or not). That is not cute, adorable or something to strive for. It's just tragic. The problem, as I see it, with such persons is they lack the ability to form abstractions when talking to others. It may also be that they have very low self-esteem and have to assert themselves in the eyes of others by bringing up pointless details. It only leads to miscommunication, poor collaboration and in the end a bad result. Asperger's and other syndromes may also be the root cause, of course.

You may call this "the gap", but the real problem is a lack of communication skills. Yes, there you have it, there is your gap. And you know what, the gap exists everywhere and it pops up when people with different backgrounds communicate. It is just normal.

Consider a hedge fund broker and a surgeon talking. They both have very complex jobs, and yet they have to help each other understand. They have to help each other bridge the gap if they are going to work together. And they do it, all the time, all over the world. But if one of them were a programmer...

That's a funny thought.

Agreed, there is huge complexity in computers, but remember: Complexity is everywhere and computers are not special. Consider a tree, a car, a contact lens, a sprained ankle, a plastic toy or whatever. They are just as complex as the stuff you describe once you dig into the details. We are not "special". Computer tech is not more complex than anything else. The only thing that's special is the egos of programmers who think that.

Anyway, as programmers, we don't have to bring up every tiny detail all the time. Instead we must learn to generalize by forming abstractions. This applies to the programs we write, but more importantly, to how we manage the world around us. In the end, using abstractions and simplifying things helps create common ground when talking to other persons.

This is the basis for all human communication. Without it, there would be no society and no computers in the first place. Thinking that we, the nerds, should be exempt from that is just silly.

Ritchie and Jobs: RIP. You both did great!
Dude, the people "running and regulating the patent system" are technologists. Patent examiners, patent agents and patent lawyers are all required to have a technical background. (Patent lawyers and agents must have a bachelor's degree in a technical field.) The politicians have very little say in the technical aspects of the patent system, and so they can only regulate policies around it, such as those we got in the latest "reform" bill, and even then they are based on input from people who are part of the patent system. The judges that make patent-related decisions at the highest level may not be trained as technologists but they sure are up-to-speed on whatever technology they have to deal with. If you have doubts about that, read through some of the lengthy opinions they write explaining their decisions.

I guess you are basing your "patent system is broken" theory purely on the basis of design patents that Apple is using against Samsung. Design patents are actually the narrowest form of IP protection. But I can't blame you, because you've probably been misinformed by media and online discussions around patents, which are typically terribly ignorant of the facts.
Another interesting aspect of this is the continuous moving of complexity across layers, usually downwards. One easy example is floating-point maths, implemented as a software library in the old times but today mostly done by CPUs. There are tons of other examples like network stack layers, or graphics applications (current GPUs do in hardware basically all rasterization, drivers and standard OS-level APIs like GL/D3D do most of the rest, so apps/games only need to worry about very high-level stuff from the P.O.V. of any gfx/game app written up to the 90's...).
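A hedged sketch of that floating-point example (illustrative only; a real soft-float library also handles rounding modes, infinities and NaNs): a multiply can be decomposed into an exact integer multiply of the mantissas plus an addition of the exponents, which is roughly what those old software libraries did on CPUs without an FPU:

```python
import math

def soft_float_mul(a: float, b: float) -> float:
    """Multiply two floats using only integer mantissa arithmetic."""
    ma, ea = math.frexp(a)    # a == ma * 2**ea, with 0.5 <= |ma| < 1 (or ma == 0)
    mb, eb = math.frexp(b)
    ia = int(ma * (1 << 53))  # scaling by 2**53 is exact, yielding integer mantissas
    ib = int(mb * (1 << 53))
    product = ia * ib         # exact arbitrary-precision integer multiply
    return math.ldexp(product, ea + eb - 106)  # undo the two 2**53 scalings

assert soft_float_mul(3.0, 2.5) == 7.5
assert soft_float_mul(-2.0, 4.0) == -8.0
```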

You mention keyboard debouncing... I remember writing debouncing logic for old 8-bit micros from the 80's, because the keyboard controller wouldn't do it. Debouncing was implemented in the ROM, so if you wanted something fancy and read keyboard events straight from I/O ports, you had to rewrite that stuff [for the curious: pressing a key doesn't always make a clean transition of the signal like ...0000001111111..., it's more likely something like ...000000001101011111111111... so you have to eliminate those short "bounces"] - NOT at all nostalgic for this kind of work :)
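For the curious, the counter-based version of that logic is tiny. Here's a sketch in Python rather than 8-bit assembly (the principle is the same): only accept a new level once it has been seen for several consecutive polls:

```python
def debounce(samples, stable=3):
    """Return the debounced signal: a new level is accepted only after
    it has been seen for `stable` consecutive samples."""
    state, count, out = 0, 0, []
    for raw in samples:
        if raw == state:
            count = 0          # back at the accepted level: reset the counter
        else:
            count += 1
            if count >= stable:
                state = raw    # held long enough: accept the new level
                count = 0
        out.append(state)
    return out

# A bouncy key-press like the one described above:
bouncy = [0, 0, 0, 1, 1, 0, 1, 0, 1, 1, 1, 1, 1, 1]
clean = debounce(bouncy)
# The output makes exactly one clean 0 -> 1 transition.
assert sum(a != b for a, b in zip(clean, clean[1:])) == 1
```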
+Kunal Kandekar - Actually, I'm basing my comments about the patent system on my own personal observations, which have allowed me to see the patent system from several different angles. From my own personal experience, the patent system as applied to the software industry is deeply skewed toward putting restrictions on trivial shallowly visible aspects and doesn't even consider most of the complexity that goes in the depth of a software stack. I'm sure that you'll understand why the subtleties about handling patents prevent me from giving you any specific details.
This reminds me that I still have your book.
+Jonathan Park - I think that such a statement oversimplifies things. While I clearly claim that design isn't everything, from my point of view it'd be misguided to claim (essentially) that design is nothing. Design matters, just like everything else that goes into making such products. It's one of many layers.
A great post Jean-Baptiste - I hope this means you're handling your stress better now.

While I agree that Jobs achieved the greater fame and recognition by providing something the majority understand and are thus able to appreciate, I disagree that you can use this as a basis for arguing the faults in patent litigation.

Having said that, I agree wholeheartedly that we have a problem. With patents acting as chess pieces and the insatiable thirst for P&L growth, we do indeed have a sad outcome: one where ingenuity is impeded by competitive blocking.
+Jean-Baptiste Queru Could you explain in abstract what kind of observations you've made? I've read dozens, maybe hundreds of patents, and my observations don't match yours; the patents I've seen, software-related or otherwise, range all over the place quality-wise (for various metrics of "quality", including complexity, technical depth, obviousness, detail and so on). I wouldn't say they're skewed one way or the other.

In fact, enter any of the technologies you've mentioned above into Google Patents search and you'll find tons of related patents for each one, of which at least some will be software-based in one way or another. I find no evidence whatsoever that software-related patents are skewed towards "shallow" aspects.
I found this shared earlier today, and I simply marveled at it. It was beautiful in discussing the sheer volume of complexity involved with the topic, and in such a way that it was an utter joy to read. I can't help but tip my hat to this in awe, and hope that this small bit of praise finds its way to share the bit of joy you brought to me today. Thank you.
Great post! Found this by way of a re-post. It would be interesting to see an analogous illustration of how social organizations such as governments and large corporations operate.

For example, Obama calls for combat operations to cease in Iraq. The secretaries of state and defense task their undersecretaries to work out the logistics and practicalities of such a withdrawal, who in turn task their assistant undersecretaries with conducting a more granular set of operations ... finally, after many more layers, individual troops in these places undertake very granular orders to implement the withdrawal.

At each level an individual or social group engages with another individual or social group at the level of "interface", i.e., giving orders or receiving reports, which, importantly, is abstracted from "implementation," i.e., the details of what the subordinate group is tasked with doing. Managers don't have to (and indeed shouldn't have to) know all the details of what their subordinates are doing.

Perhaps the interface/implementation division within technology could be seen as an example of the larger tendency for social organizations (and therefore social organizations of engineers) to manage a division of labor through the interface/implementation trope. We use interfaces in software just as we use them in managing other kinds of human divisions of labor.
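The interface/implementation split described above maps directly onto everyday code. A toy Python sketch, where the names (`Withdrawal`, `LogisticsPlan`) are invented purely to mirror the analogy and are not from any real system:

```python
from abc import ABC, abstractmethod

class Withdrawal(ABC):
    """The interface: what the superior asks for, with no knowledge of how."""
    @abstractmethod
    def execute(self) -> str: ...

class LogisticsPlan(Withdrawal):
    """One implementation: the granular details the layer below worries about."""
    def execute(self) -> str:
        return "convoys scheduled, bases handed over"

def give_order(op: Withdrawal) -> str:
    # The caller depends only on the interface, never on the implementation.
    return op.execute()
```

Just as a manager cannot act on "an order" without a subordinate to carry it out, the abstract `Withdrawal` cannot be instantiated directly; only a concrete implementation can be.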
Your post is absolutely worthy of being printed in every newspaper and general interest magazine in the world.

As a "borderline-technologist" I sometimes fail even to explain this much to someone asking "How complex can it be?". I will always make sure to have the link to your post handy, as well as a printed copy of it. Thank you +Jean-Baptiste Queru
It's very interesting, because on one side you get perfectly why Dennis Ritchie is critical for today's world, and on the other side ("probably a bit less shiny, a bit more beige, a bit more square") you're totally not getting how critical Steve Jobs was.

We should try to move on from comparing apples and oranges, and understand that they can be both excellent fruits. ;)
First of all, thank you for this post +Jean-Baptiste Queru. Very thought-provoking and well-written.

I think what you said about technology is representative of the entirety of human knowledge today, not just technology.

Yes - there are layers to technology that the "common man" can't get, but this is true of every area of human knowledge, skill, or expertise. We have to rely on experts. What they do may look simple, but once you peel the onion, it really isn't! From a "simple" pencil, as +Michael M. Butler points out, to, say, a chest of drawers, to a newspaper, to a bottle of ketchup, to a stain-resistant painted wall, there are technologies, skills, and knowledge being employed that would dizzy any one mind!

Here's the thing that gets me though: even if a "common man" tried, they couldn't really understand all there is to know about these everyday things. Let alone something like E = mc^2 and Quantum Mechanics. Aristotle, back in the day, was trying to sum up human knowledge in a way. In this day and age though, attempting something like that is foolhardy if not impossible. Slightly depressing, that! :)
Great essay. Programming inspires me because you have to keep learning and exploring. I take slight issue with the point about there being a separation between technologists and non-technologists. I think anyone can learn anything if they make it a priority. However, the way in which labor is often divided means people rarely have an incentive to learn something outside of their domain.
I think that this 'hidden complexity' is now ingrained in our whole society. As a similar example, we just go to the supermarket and buy a product off the shelf. The chain behind that is just as complex as what you describe: from getting the product on the shelf, to manufacture, to growing/making the raw ingredients, to the packaging, to the advertising that got you to buy it in the first place, etc.

We are now at a level in society that means most people cannot function on a daily basis without a vast number of support people behind them.

Makes you worried about what would happen if that system ever collapses.
+Jean-Baptiste Queru, This is an excellent and thought-provoking post.

I'm currently accessing the internet and this page on Google+ through UDP (User Datagram Protocol) via a VPN (Virtual Private Network), so for me there is yet another layer on the onion you so lucidly described - and I can verify for you that DNS (Domain Name System) references can indeed be done over UDP.
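For the curious, "DNS over UDP" really is that direct: the resolver packs the entire query into a single datagram. A minimal sketch of that packet layout per RFC 1035 (the transaction ID is an arbitrary constant chosen for illustration; no network I/O is performed here):

```python
import struct

def build_dns_query(name: str, txid: int = 0x1234) -> bytes:
    """Build a minimal DNS query (A record, IN class) -- the exact payload
    a stub resolver would send in one UDP datagram to port 53."""
    header = struct.pack(">HHHHHH",
                         txid,     # transaction ID, echoed in the response
                         0x0100,   # flags: standard query, recursion desired
                         1,        # QDCOUNT: one question
                         0, 0, 0)  # no answer/authority/additional records
    # QNAME: each label is length-prefixed, terminated by a zero byte.
    qname = b"".join(bytes([len(label)]) + label.encode()
                     for label in name.split(".")) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question
```

Sending `build_dns_query("www.google.com")` to a resolver on UDP port 53 would normally produce a single response datagram carrying the same transaction ID, which is exactly why DNS fits UDP so naturally.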

Your post brought to mind a few problems I have been playing with off and on since 1995 or so, based on the assumption that one day all digital technology just goes away - all chips stop working, connected in circuitry or not. How this happens is not dealt with - it could be a huge EMP from the sun, or subtle changes in fundamental constants as the solar system traverses the accretion disk around that monstrous frame-dragging space-time-metric-distorting Kerr black hole in the middle of the Milky Way Galaxy as it rotates with relativistic angular velocity at its event horizon; it doesn't matter - it is taken as a "given" provided by Murphy's Law for the purpose of thinking about the consequent problems, which are:

1] Coming up with a "road-map" charting the course from primitive dark-ages technology to the current contemporary level of technology that is comprehensive enough to avoid having gaps that would make it too difficult to progress from one level to the next, and formatted in a way that would allow people without personal experience with anything more technologically advanced than a horse pulling a cart to begin climbing that ladder.

2] Storing this information in a form that the ravages of time will not erode or the people of the dark age themselves will not destroy. I started thinking about this when I saw my young niece wearing a CD-ROM around the base of her ponytail - she was quite pleased with this hair-do. I asked her why the CD-ROM and she said, "Because the rainbows are so pretty!" She was referring, of course, to the diffraction of light caused by the pits encoding the data in the surface of the data layer in the CD-ROM.

I realized that CD-ROMs during a dark age would be cut up and used for things like mosaic artworks. Encoding the data within the lattices of crystals places that data in similar jeopardy. Inscribing the data onto the surfaces of stone slabs would require hectares of stone, would have to be inscribed deep enough into the stone to resist weather erosion, and in any case the stone would probably end up being broken up and used for building materials - look what happened to the Pyramids at Giza.

The solution I finally settled upon was to encode the data into the DNA of a life-form, data that would be carried from generation to generation in the "Junk DNA" bracketed between "stop codons" to prevent its expression in the carrier organism. DNA can, after all, be seen as either a Binary or Quaternary coding mechanism, depending on whether or not you can reliably discriminate between the two edges of the double helix. The information density thus provided is satisfactorily high, so you would not need a lot of room. This led me to wonder, just why do Ophioglossum reticulatum ferns have so many chromosomes (630 pairs)?
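The quaternary coding idea is easy to sketch: two bits per base gives four bases per byte. The A/C/G/T-to-bit assignment below is an arbitrary choice for illustration, not any biological or archival standard:

```python
BASES = "ACGT"  # arbitrary mapping: A=00, C=01, G=10, T=11

def to_dna(data: bytes) -> str:
    """Encode each byte as four bases, two bits per base, MSB first."""
    return "".join(BASES[(b >> shift) & 0b11]
                   for b in data
                   for shift in (6, 4, 2, 0))

def from_dna(seq: str) -> bytes:
    """Invert the encoding: fold each group of four bases back into a byte."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)
```

At two bits per base pair, even a modest genome-sized sequence holds hundreds of megabytes, which is why the information density the comment mentions is plausible (error correction and synthesis constraints, of course, are a whole extra problem).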
That leads to the third problem:

3] Having come up with a way to store the information in an entropy-resistant form having an information density high enough to be useful, and being sufficiently unattractive to Human beings to make the probability that it will be destroyed by them during a dark age for aesthetic reasons, how do you provide the instructions for building a reader for that information? The technology of reading the encoded information depends in large part upon the very information that has been encoded. It is a very pretty problem.

Perhaps we should be taking a long, hard, high tech look at fern DNA or our own "Junk DNA", on the off chance that it contains a "technological road-map" left behind by one of the previous cycles of civilization...

Thank you for posting something so insightful and such a good trigger for hours of deep thought along so many branches. This post is food for thought on so many fronts - the perils of compartmentalization and specialization, the fact that we may well have reached the point where our technology cannot be understood without the aid of that same technology, and so on. I appreciate it.

"For it is the doom of man, that he forgets." - Merlin in Excalibur (1981)
Shouldn't the actual message be one of empowerment: i.e., that abstraction enables us to build and use complex systems without the need to understand how or why the subsystems work?
+Robert Scoble But many of the things that you mention are in part due to the layers they've influenced.

There are, of course, some other unrelated reasons, but I think that the main reason people relate much more to Jobs' death than to Ritchie's is precisely because Jobs' work is visible to the layman, and Ritchie's isn't.
Jean-Baptiste, I'm a first-time reader of this blog but I am a huge fan of how elegantly you showed the insane complexity of modern technology that we rely on. You mention that computers are so complex that they can only be built by other computers, which in turn have to be built by other computers, and so on.

This kind of echoes a question I've always wondered about, which is: if the human population were reduced to maybe 10% and all electronic equipment, capital equipment, foundries, etc. were destroyed by some sort of apocalypse, how long would it take a team of scientists and engineers to build a computer that resembles something we use today (one that can run Windows XP, for example)?

I've posted this on Quora (http://b.qr.ae/mQbtK3) but I was wondering if you or anyone else who reads this has an opinion on this.

I've subscribed to your blog and would love to read what you blog next!
Great post. And I look forward to the day when we understand the human brain and the levels of biology and chemistry that it builds on well enough so that we could do a good comparison from a design perspective :)
A must read. One of the best posts I've read all year regarding hyper-appreciation for what you see and hypo-appreciation for what you don't. Underneath the "simplicity" of every Apple product is years of blood, sweat and tears provided by thousands of talented engineers.

I'm reminded of that old adage of cinema special effects: they're only successful when you don't notice they're not real. Ergo, you can only appreciate them when you don't appreciate them. Sad, but true.
I will have to read this later on. It is too much content to digest right now...
"a USB device stack, a USB hub stack, all of that implemented in a single chip" -- This made me think of Intel's ad glorifying one of the inventors of USB, Ajay Bhatt, as a rock star. Sponsors of Tomorrow-Rockstar
nothing == nothing.....
a brilliant text about nothing at all...
+Brian Prentice The reason C is so important is because Unix was written mostly in it. This made porting an operating system to another computer actually feasible, which in turn changed computer hardware architecture. The computing world is in fact very indebted to C and Unix. For example, OS X is descended from NeXTSTEP, a Unix derivative, and Objective-C is descended from C. So Jobs isn't quite the messiah you're looking for. There is indeed a big difference.
I think that the comparison between Ritchie and Jobs is quite telling. Says a lot about why you work on android and not the iphone.
Excellent, excellent article. Thanks for the summary of the industry.
Words of incredible insight presented in a readable and entertaining way. You certainly hit bullseye with this.
I could just as easily oversimplify Ritchie's role as you have Jobs's role. Without Ritchie we wouldn't have C or Unix. As if that's a bad thing. The systems that came before Unix were so much better. Unix is the best of a bad lot, but that doesn't mean it's still not a piece of crap.

Without Jobs, we may not have personal computers at all. The Apple II was the first real personal computer that people could do things on that didn't involve programming it with switches and lights on the front (yes, Woz invented it, but without Jobs it would never have been sold into the mainstream). Not to mention the Mac was the first usable GUI machine that went mainstream (the Xerox Star was released a few years before, but the interface was completely different: it was modal, the mouse was used only to select things, all actions were performed with the keyboard, and there were cursor keys). Not to mention that we'd all still be using phones that flip open with keypads or look like Blackberries.

While I'm aware that Apple's current OS is based around Unix and C, it only became that way because NeXT was targeting the University market, and the Universities were all hooked on AT&T UNIX at that point, and they requested a machine that was a pretty GUI on top of Unix, and that's what NeXT built (and Apple bought NeXT, and OS X was heavily based on NeXTSTEP).

So, without Ritchie we wouldn't have a crummy PDP-11 assembler that thinks it's a language, and we wouldn't have a crummy operating system that took 60 years to stabilize. Yeah, I'd sure miss those things. I really would miss my Mac and iPhone, though.

But hey, this is Google+, so go ahead and write me off as some Apple fanboy, even though that couldn't be farther from the truth. Apple just happens to be the best at "hiding those layers of technology." Unix and C definitely are not.
+Brian Prentice Your Gartner "Marketecture" genes are really showing here! Unix as a mere computing widget? The relationship between Jobs and Ritchie is in many ways irrelevant, but in some really important ways Jobs was standing on the shoulders of giants with respect to C and Unix. Pascal wasn't cutting it in so many ways when C showed up. And Smalltalk was too captive of its commercial implementers.
This post is great, but "conspicuously complicated" (C). The simple truth is that most people like flashy things and don't like to think unless they really need to, and even when they need to, the outcomes are usually so awful that I'd rather see somebody else, a better thinker, do that thinking job for them.

I think Steve Jobs understood and exploited that simple truth well and beat the hell out of other, more technical thinkers, like the author of this article, to create that flashy mindless stuff that the rest of us can easily enjoy without thinking much.

The only bad thing was that he also took away the last illusion of freedom, which caused a revolt among Android aficionados, and now I'm sitting and thinking about how I could limit that excessive and sometimes harmful creativity to make the "ordinary" user's life safer and simpler :)