I had been meaning to publish this for a long time, after getting permission. http://the-edge.taht.net/post/gilmores_list/
- Now that Intel has announced the end of Moore's Law, maybe we have to start getting serious about item #1 (routing scalability). :P (Mar 14, 2016)
- The Economist announced the end of Moore's law, too. Front cover. It must be true, then.
There is so much untapped potential in all the new chips, software-wise. But I'd like to see interpreted languages go away. (Mar 14, 2016)
- Why do you want to see interpreted languages go away? Performance? Safety? Something else? (Mar 15, 2016)
- Performance. A couple of compiled languages are emerging with added safety features, and debugging someone else's .jar files is no fun.
That said, the .jar abstraction has made the wildly diverse and rapidly evolving Android infrastructure possible. You can view that as a positive (underlying hardware evolving rapidly), or as a negative (underlying hardware evolving TOO damn rapidly), but still, iOS devices tend to "feel" faster than Android and do more complex/lower-latency stuff.
I do not much care for how most interpreters work; it is difficult to apply massive parallelism or out-of-order (OoO) CPU techniques to one. (Mar 15, 2016)
- I think that the traditional trade-off of abstraction-level versus performance still applies. You know, the one where the people developing FORTRAN argued that not having to learn a new assembly language for each new computer was worth the performance penalty of using a compiled language. ;)
For whatever reason, interpreters tend to target a higher level of abstraction and thus tend to be more efficient in terms of developer time. (IMHO the Java example is an anomaly... you have all the disadvantages of a compiled language and few of the advantages.)
Yes, the most popular interpreted languages suck for parallelism and don't get close to thinking about the actual underlying CPU architecture... however that's not necessarily related to the interpreted vs. compiled trade-off. Certainly functional languages (even ancient ones like LISP) kick butt when it comes to parallelism due to their (almost) side-effect free model, and interpreted languages like Erlang are all about parallelism. (Mar 16, 2016)
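(Editor's aside: the point about side-effect-free code and parallelism can be illustrated in any language. Here is a minimal Python sketch, not from the thread; the function name and sizes are my own. Because a pure function has no shared mutable state, a sequential map and a parallel map over the same data are interchangeable with no locks or coordination.)

```python
# A pure (side-effect-free) function can be mapped over data in any
# order, so parallelizing it needs no locks or shared-state coordination.
from concurrent.futures import ThreadPoolExecutor

def collatz_steps(n: int) -> int:
    """Pure function: the same input always yields the same output."""
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

data = list(range(1, 1000))

# Sequential and parallel maps of a pure function give identical results.
sequential = [collatz_steps(n) for n in data]
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(collatz_steps, data))

assert parallel == sequential
```

(Note that in CPython the GIL limits the speedup threads give CPU-bound work, which is exactly the "popular interpreted languages suck for parallelism" complaint above; the purity argument itself is independent of that implementation detail.)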
- re: #22 - diminishing pinch-points like Google and Facebook. Dave Winer (www.scripting.com) writes a lot about these services (and Twitter) with similar notions...
... many more...
PS I am aware of the irony of using G+ to diminish the power of their pinch-point :-) (Jul 30, 2016)