In a sense, this is right. In another, it's not answering the right question. Why would the NSA demand encryption backdoors when it knows full well that no such thing is practical? The answer eventually comes down to institutional design.
Some thoughts on how we went wrong:

(1) In a sense, the NSA is like a large tech company, containing both engineers and less-technical project managers. The engineers are generally mathematics and computer science Ph.D.s from specialized fields. The project managers are all military officers. Employees who span both domains are extremely uncommon: most managerial positions are filled by people who started in military signals intelligence or at the CIA, then transferred to the NSA.
The head of the NSA is, by law, always a military officer. By custom, the deputy director of the NSA is a mathematician or engineer, but there's no mistaking who's in charge: officers with often-marginal technical ability. Insofar as the NSA believes that key escrow can be done safely, that belief is held by nontechnical management, not by the cryptographers who would be called upon to implement the program.

(2) In another sense, the NSA is like the Air Force.
The USAF is basically a logistics organization. Its job is to put men and materiel precisely where they belong, as quickly as possible, using planes. That's what most of the Air Force does: only 2% of its personnel are combat pilots, and less than a third of its planes are armed. By comparison, the Navy -- which is not primarily a logistics organization -- has more combat pilots and aircraft.
Nonetheless, most of the USAF's leadership comes out of the combat pilot ranks. Why? Because combat aircraft are the flashiest and most iconic symbols of what the USAF does.
Since the idea of cyberwarfare came on the scene, the NSA has been selecting its leadership from officers with a background in offensive cyberwarfare rather than defensive cryptography. To the average congressperson, offense is both more exciting and more comprehensible. This leads the NSA to systematically overestimate the value of offensive operations and underrate the critical importance of solid civilian cryptography.

(3) Compartmentalization makes it difficult for the NSA to evaluate the systemic risk of its espionage programs.
Inside the NSA, functional segregation makes it difficult to understand the risk characteristics of programs outside an individual employee's reporting chain or tech stack. Only high-level managers have comprehensive need-to-know across most of the agency's programs, but their attention is limited, so the people actually in charge find it difficult to assess technical risk.
Which they wouldn't be particularly good at anyway. Because they aren't technical.

(4) Secrecy is corrosive to accountability. When you don't have to justify your actions to anyone -- when, in fact, it's a crime to justify your actions to anyone but the person who ordered them -- you stand a greater risk of doing things which are unjustified.
This means, ultimately, that the more secrecy your program needs, the more hostile oversight needs to be.
(I originally posted this as commentary on a reshare, but current events have made it relevant again, and I was asked for a reshareable version. Bonus for technical readers: a primer from +Lea Kissner about how the DUAL_EC_DRBG backdoor works.)
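For readers who don't follow the link, here is a toy sketch of the trick behind a DUAL_EC_DRBG-style backdoor, transplanted into ordinary modular arithmetic so it runs without an elliptic-curve library. The real backdoor lives on the NIST P-256 curve and truncates point coordinates; every constant below (p, g, e, the seed) is an invented toy value, not anything from the actual standard:

```python
# Toy analogue of the Dual_EC_DRBG backdoor in plain modular arithmetic.
# The real DRBG uses two curve points P and Q; here we use two group
# elements g and h = g^e mod p, where e is the designer's secret.
# All parameters are made-up toy values, NOT the real NIST constants.

p = 1019            # small prime modulus (toy value)
g = 2               # public "P" analogue
e = 3               # designer's secret relating the two elements
h = pow(g, e, p)    # public "Q" analogue; users can't tell that h hides e

def drbg_outputs(seed, n):
    """Generate n outputs: the user sees h^s, the next state is g^s (mod p)."""
    s, outs = seed, []
    for _ in range(n):
        outs.append(pow(h, s, p))   # visible output
        s = pow(g, s, p)            # hidden state update
    return outs

def backdoor_predict(first_output, n):
    """Knowing d = e^-1 mod (p-1), recover the hidden state from a single
    output and predict every subsequent output."""
    d = pow(e, -1, p - 1)           # works because gcd(e, p-1) == 1
    s = pow(first_output, d, p)     # (h^s)^d = g^(e*s*d) = g^s = next state
    preds = []
    for _ in range(n):
        preds.append(pow(h, s, p))
        s = pow(g, s, p)
    return preds

outs = drbg_outputs(seed=123, n=4)
assert backdoor_predict(outs[0], 3) == outs[1:]   # attacker predicts the stream
```

The structural point matches the EC version: anyone can verify that g and h are valid public parameters, but only whoever chose e (and kept d) can turn one observed output back into the generator's internal state.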