This is great stuff - thanks for posting. Minute 45 is HUGE. Does the decreasing cost of death/destruction represent a fundamental limit to progress?
The idea is obviously not new; I like Martin Shubik's treatment in Terrorism, Technology and the Socioeconomics of Death.* (Shattuck's Forbidden Knowledge** is also good, as is Bill Joy's Why the Future Doesn't Need Us.***)
Taken to an extreme, every person would have their finger on the button - not just of a nuke, but of total annihilation of Earth, of all existence... take your pick. Clearly this would not be a world suited for human beings with all our flaws: from simple errors in judgment to malice and madness and all the rest.
All of this is a potential answer to the Fermi Paradox - why aren't we finding signs of advanced life in the universe? Because they keep destroying themselves. The development of technology is a curse of sorts, and none of them can pass through this Great Filter.
It takes little imagination to see how this could play out in our own world.
One tendency is to want to limit freedom, to concentrate power. But this has obvious risks.
One of the core challenges of digital technologies is the low cost of replication: if you create a technology that enables high degrees of anonymity, privacy, and secrecy, it's impossibly costly to limit those tools to just the "good people." Everyone seems to agree that some people need those protections, but we lack mechanisms for limiting access. Post-Snowden, we're going to see a wave of new tech; it looks like we'll finally have usable, low-cost, high-grade crypto tools - and they'll be available to everyone...
What happens then? I think new systems of restraint will emerge (primarily pseudonymous reputation laid on top of existing law and norms)...
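To make the "pseudonymous reputation" idea concrete, here's a minimal sketch of what such a system's core might look like - all names and structure are hypothetical illustrations (pseudonyms derived as key fingerprints, reputation as a tally of endorsements), not any real or proposed protocol:

```python
import hashlib
from collections import defaultdict

def pseudonym(public_key: bytes) -> str:
    """Derive a stable pseudonym from a public key (a fingerprint).
    Reputation attaches to the key, never to a real-world identity."""
    return hashlib.sha256(public_key).hexdigest()[:16]

class ReputationLedger:
    """Hypothetical ledger: accumulates endorsements against pseudonyms.
    A real system would verify signatures on each endorsement; this
    sketch only shows the bookkeeping."""
    def __init__(self):
        self.scores = defaultdict(int)

    def endorse(self, nym: str, weight: int = 1):
        self.scores[nym] += weight

    def score(self, nym: str) -> int:
        return self.scores[nym]

# Usage: the same key always maps to the same pseudonym, so a
# reputation can persist across interactions without unmasking anyone.
alice = pseudonym(b"alice-public-key")
ledger = ReputationLedger()
ledger.endorse(alice)
ledger.endorse(alice, 2)
print(ledger.score(alice))  # -> 3
```

The point of the design is that accountability (a score others can check) coexists with anonymity (no link back to a legal identity) - restraint layered on top of, rather than replacing, law and norms.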
it'll be interesting...
*Shubik - http://cowles.econ.yale.edu/P/cp/p09b/p0952.pdf
**Shattuck - http://www.amazon.com/Forbidden-Knowledge-From-Prometheus-Pornography/dp/0156005514
***Joy - http://www.wired.com/wired/archive/8.04/joy.html