The rise of the web comic
It is not surprising that the popularity of web comics is taking off. The downside (time invested) is extremely small, whereas the potential upside (information gained) can be quite large. It took me several months to gather the information contained herein, by reading the various Tesla biographies. There are a few nuances in the biographies that put it in perspective, e.g. he did not have a photographic memory, and the "man" who pulled the funding of his Wardenclyffe tower was JP Morgan. Also, a lot of semantic context is missing, making this sound like a hagiography, but one feels like one is learning a lot, with very little investment (time).
I love little moments of synchronicity! I just published 2 Tesla posts, one fiction and the other... snarky comments from my friends!
So many good meme pictures are hiding in here.
+Peter DO Smith Likely true, regarding kids. Mother Theresa? Bold choice. I'm no expert, but from my reading, Hitchens didn't have many kind things to say about her. What's your perspective?
+Pascal Wallisch : Of course the utility of comics in education has been a topic of interest since the early days, but I think you hit on a key point for the present day: information overload makes a palatable, quickly assimilated form especially attractive.

Relatedly, on two counts, I could have sworn I saw an article on this very topic come through my spew in the past few days, but now I can't find it. I liked the Tesla thing for its density of information, and I can forgive its effusive, hagiographic tone as comedic conceit, but in the comment streams I read of the many repeated postings of the comic I have encountered, I noticed a disturbingly high proportion of uncritical Tesla-worship/Edison-bashing.

How would you feel about a comic rendering of science writing?

+Peter DO Smith : "We extract, sift, evaluate and sort until we have a reduced summary that fits onto our world view. It is the exercise of this evaluative process which is so valuable, not the result."

Indeed, we have no choice. I think the result is important, too, however. Many months ago (I am too hurried to look for it), you and Pascal mused about the value of fiction. You said it offered great riches for understanding reality; Pascal felt it did violence to the understanding of reality by altering readers' priors in unrealistic ways. I demurred from comment, thinking I would reflect at length and return with something pretty damn devastating. Instead I simply disappeared. I would like to return to this, from the perspective of how anecdote, narrative, and drama affect the communication of facts.

One place to start would be the ongoing phenomenon of the "neuronovel": novels with narratives concerning neuroscience, or with characters transfigured in some way by neuroscience. Are they useful for understanding neuroscience? Do they feed into the easy tendency to use neuroscience to explain all sorts of human foible?

Lastly, as a fan of Richard Feynman, I feel obligated to point to this comic biography of that physicist:
+Thomas Tucker Quite topical. In this day and age, given so much available information and so many parallel channels, few people have the time, discipline or energy to read (walls of) text. Reading is rather linear (at least in the way we tend to do the writing). Comics bring the massively parallel visual system into play. This might be the trend of the future, allowing people to do things in parallel, whether they actually can or not (attention deployment is iffy).
+Pascal Wallisch : That got edited several times; I posted it prematurely by mistake. You may want to read the later version. I acknowledge your gentle dig at my recent exploits in the "wall of text" format in other venues. Clumsy. "Not helpful", as you would say.
+Thomas Tucker Lol. I would never dare to take a dig at you. My concern is that people will literally not bother. This is a shame, as you make great points. I'm not sure as to a solution.
+Thomas Tucker By the way (still digesting your edited post), most people can only handle one point per post. The rest is basically lost. I found this out the hard way. So the saint-like format might be more palatable to the masses.
Edit: Cool Saint. Didn't know about St Dymphna yet!
+Pascal Wallisch : Well aware am I that most people can only handle one point per post, and that chunking is not just an affected strategy, it's a natural sensory/cognitive process.

Perhaps unfortunately, I seldom see points as isolated logical entities -- everything is connected to many things, and I find it hard to limit context. For me, every text is illuminated and illuminates a whole library of other texts. Intertextuality cannot be avoided, and the challenge for me is to marshal the talent to make the intertextuality valid and informative.
+Thomas Tucker I actually share your predicament. One concept invoked triggers the next, and so the avalanche is started. As it happens in a subjective, highly dimensional space, most people cannot (and will not) follow. I don't have a ready solution.
Violating my own maxim, in response to your edited post above:
*Narratives are dangerous, particularly if fictional.
*Similarly, "Neuronovels" - methinks we don't know enough about neuroscience. The danger of neuroscience fiction beckons.
*Feynman: Why do so many people think Feynman is their favorite saint? He came up with some amazing stuff, but didn't behave very saintly, did he?
+Pascal Wallisch : Of course I am obligated to send that oatmeal to my many correspondents recently deluged by text.
+Peter DO Smith : Ooh. I would not say "well-intentioned" so much as "unintentional". That doesn't make it any less disruptive, I realize. I honestly had no idea.
+Peter DO Smith : Actually, "unintentional" also is not wholly correct; I intended to suggest tangents. What was unintentional was the (apparent) disruptive nature of the format. I feel a little sad that cogent, linear -- call it "coherent" -- argument is the mode of choice, either because it seems to me to be unnecessarily penurious, or because I often neglect to aim for it. Nonetheless, the virtue of logical coherence is undeniable.
+Peter DO Smith : I call that "InterCaps"; calling it "CamelCase" may be more of a tip-off to your programmer nature, but since my programmer nature is long-neglected and amateurish at the best of times, I'll defer to your view.
+Peter DO Smith : I concur wholeheartedly with you about narrative. However, a corollary of your point is that, as +Pascal Wallisch notes, narrative -- unreasonably effective -- is a dangerous tool in the hands of a neonate or weakwick, and precautions are essential. That said, I like nothing more than to wave narrative around like a firehose. This admission might take on new and possibly sinister meaning were I to tell you about my first experience with a 9-mm semiautomatic pistol.
+Peter DO Smith : I link to that image not because it's inflammatory or because on Facebook I received many "likes" for posting it, but instead because it seems to me to pack a whole bunch of red herring into a tiny parcel. I do not think it rises to the level of incisive insight.
+Peter DO Smith But they do. Most of the systems they study are ergodic, they are usually amenable to study by elements (in a linear fashion) and so on. If these things are not given, physics can't help (e.g. the 3 body problem). I will never understand "physics envy". The methods have to suit the complexity and suitable level of analysis of the problem under study.
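(The three-body problem mentioned above is a good concrete illustration of the point: Newtonian gravity for three bodies has no general closed-form solution, so physics falls back on numerical integration, and nearby initial conditions drift apart over time. A minimal sketch, not from this thread; the masses, initial conditions, units, and step size below are all illustrative choices, not anything discussed here.)

```python
# Minimal three-body sketch: no general closed-form solution exists,
# so we integrate the equations of motion numerically (leapfrog).
import numpy as np

G = 1.0  # gravitational constant in scaled, illustrative units

def accelerations(pos, masses):
    """Pairwise Newtonian gravitational accelerations for n bodies."""
    n = len(masses)
    acc = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += G * masses[j] * r / np.linalg.norm(r) ** 3
    return acc

def integrate(pos, vel, masses, dt=1e-3, steps=1000):
    """Velocity-Verlet (leapfrog) integration of the n-body system."""
    pos, vel = pos.copy(), vel.copy()
    acc = accelerations(pos, masses)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, masses)
        vel += 0.5 * dt * acc
    return pos, vel

masses = np.array([1.0, 1.0, 1.0])
pos = np.array([[-1.0, 0.0], [1.0, 0.0], [0.0, 0.5]])
vel = np.array([[0.0, -0.5], [0.0, 0.5], [0.5, 0.0]])

p1, _ = integrate(pos, vel, masses)

# Perturb one coordinate slightly and re-integrate: the trajectories
# separate, which is why long-range prediction requires computation
# rather than a formula.
pos2 = pos.copy()
pos2[2, 1] += 1e-6
p2, _ = integrate(pos2, vel, masses)
print(np.abs(p1 - p2).max())
```

This is the sense in which "physics can't help" analytically here: the method has to match the problem, and for chaotic n-body dynamics that method is numerical.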
+Peter DO Smith +Pascal Wallisch : "The mind/brain is the most complex organism in the universe and physics can tell us nothing about it."

Trained as a physicist I am instantly motivated to say, "Well, that's not entirely true . . .", because it isn't entirely true, but I readily agree, having studied neurobiology from a computational perspective in grad school, that it's as near true as makes no real difference.

I've mentioned before, possibly not here, that in the late 1980s and early 1990s, many physicists were leaving physics because of several related and co-dependent facts about funding, big science, and the growing tendency of physicists on the cutting edge to spiral off into string-land, never therefrom to return.

At the same time, it was the eve of The Decade of the Brain ("You're giving us a whole decade to figure out the brain? How generous!"), and so physicists, thinking themselves clever, began to pile on the bandwagon of computational neuroscience, arrogant as ever.

Watching the neurobiologists reach for their wastebaskets to throw up into was truly amusing -- not least because I was doing exactly what they found so objectionable (and rightly so).

Notably, the interesting physics is no longer so simple (and, if truth be known, the "simple" physics was always computationally quite hellaciously complex -- one reason Feynman gets props, as his best-known contribution, those loopy-wiggly diagrams for QED, was little more than a bookkeeping mechanism to make it remotely possible for physicists to keep straight all the cascading sums of integrals over all possible paths through spacetime of all possible particles involved in the simplest of high-energy interactions).
+Pascal Wallisch +Peter DO Smith : Admitting for the most part that "physicists seem smart because they deal with simple stuff" is true in a way that calling the brain a three-pound soft French cheese is not, I'm compelled to note that string theory is so complex (even if you admit that the five competing string theories are really just pairs of duals of limits at high and low energies of one overarching theory of N-dimensional "branes", for N <= 11) that the "single" theory in fact has, I kid you not, 10^500 possible distinct formulations, each with different implications. It's far from simple (and far from empirically verifiable, as is now popular knowledge) -- so far that a substantial portion of the physics community, for good reason, considers it to be little better than bunkum and hooha.
+Thomas Tucker String theory is religion without the aesthetics. Or perhaps the aesthetics without the religion. I don't see what it has to do with science, and I was saying that before it was cool.
I mean, I really like number theory. That's all fine and good, and it makes for a nice pastime. But it is not science.
+Peter DO Smith : Actually, no; that's my point. Physics no longer is simple enough for exact mathematical modeling, and in fact it has not been for quite a while. Just the intricacies of "simple" inviscid fluid flow in the turbulent regime tax our computational resources, and it's still true that quantum chromodynamics does not -- fundamentally does not -- admit even computational solution without hugely wacky approximations. I commend to you the exercise of trying to solve Einstein's equations of general relativity for any real situation -- again, it's not clear it can be done, even in principle (coupled systems of ten nonlinear partial differential equations in four dimensions will do that to you). Of course, some would claim that once you're into, say, the magnetohydrodynamics of a fusion plasma, you're no longer dealing with physics, but they would be wrong.

Physics has this reputation of dealing with simple problems, which is why it's so successful. This is a grave misunderstanding. Physics has the option of starting with very simple cases that allow creation of very powerful methods of provable generality that then can be applied to complex problems with confidence that the methods are sound, even if the application is difficult. The physics most people learned in school deals with, yeah, two-body problems, and the hydrogen atom, but not because that's all physics can handle. It's all the students can handle. Physics laughs at that.

This is not to say that the brain isn't fundamentally more complex than just about any system physics would have the bad taste to try to tackle -- it is fundamentally more complex, in a very real way that transcends even emergent dynamic phenomena. But claiming that physics can't tackle nonergodic systems or whatever is about as true as saying that the brain is a three-pound soft French cheese.
+Thomas Tucker Fair enough. Discussing this in detail would require an entirely new post, though. In the meantime, I will use PasCal case exclusively.
+Pascal Wallisch : So this is what writing a book about using Matlab in the neurosciences (note the plural) leads one to do in one's spare time . . . it is not a satire. No paper that references Frege twice can be a satire.
+Peter DO Smith : I do not think +Pascal Wallisch is a mathematician, at all. That label does him a disservice. Pascal is a scientist, through and through. I think it remains to be seen whether he may be a polymath, though. Sorry for talking about you in front of your front, Pascal. You reveal new facets continually. Fascinating.
+Thomas Tucker I actually stopped saying "I am a x". Narrow and exclusive ontological statements about being are not helpful. But that also deserves a post of its own.
+Pascal Wallisch : In regard of your ontic essence: I am now formally permitted, then, to write a pair of parodies, called "Pascal Was a Neuroscientist", which I shall take the time to make brief, and "Jonah Was A Neuroscientist", which I will allow to grow to enormous size while continually confusing the concept signified by "whale" with the concept signified by "great fish".
I have enjoyed this thread greatly. This is how a debate should be: an intellectually stimulating exercise. That being said, here is my comment. I cheered for The Oatmeal's commentary for its attempt to right what I have seen as a historical deletion since I wrote that report on Thomas Edison in 8th grade. It was my father who told me about Nikola Tesla, and that was the first I heard of him. I personally researched his work back then, as I often did when my interest was piqued, and while I still considered Edison a great man, I was astonished at how Edison was praised while Tesla was completely ignored. That is the context in which I took the comic's hyperbole, and how I revelled in the exaggeration that barely covers the disservice history has done to Tesla.
+Peter DO Smith +Thomas Tucker Write it together - the antics of ontics of some once or former neuroscientists. The whale/fish comment is amazingly insightful. +Andrea Kuszewski would appreciate it, I think. +Mince Walsh I second your comment, and thanks for getting us back on "track". Having read the biographies, I had the sneaking suspicion that Tesla was in fact forgotten, and basically re-"discovered" almost by chance, due to a push by Serbian nationalists. Do you get the same sense, or no?
+Pascal Wallisch : Eagleman thing. You rascal, Pascal, you got it to within three sigma experimental. I can only imagine that intellectual peregrination, where by "only" I mean "vividly", and by "peregrination" I mean "wankery". ;)
+Pascal Wallisch +Peter DO Smith Yes, it's a distinct possibility that nationality may have played a part, and what we consider history is written by the winners. I had some great teachers in high school who drove that second point home. I'm more of the "complex multiple influences" kind of person, and I would theorise that Tesla was ignored for a fairly mundane reason such as: history has a serious macular degeneration problem. Only a certain number of "great ones" can be seen at a time, and even that vision fades in time.
+Mince Walsh No question about that. There is even the effect that some quotes are misattributed to even greater people over time. Regarding Tesla himself: He simply ran out of funds after JP cut him off. He did fairly well until then, but he was basically feeding pigeons and not paying hotel bills for the second half of his life. Also, a few strokes of bad luck (e.g. lab burning down) should be considered.
+Peter DO Smith Which brings us full circle. Movies are another example. Maybe I should title my next book "TLDR"...
+Pascal Wallisch : I humbly await enlightenment.

+Mince Walsh : Your exchange with Pascal and +Peter DO Smith on the relationship between Edison and Tesla has diverted me into reading about the sociologist Robert K. Merton and what he saw in The Gospel According to Matthew (25:29, KJV): "For unto every one that hath shall be given, and he shall have abundance: but from him that hath not shall be taken even that which he hath", an insight about the sociology of science not unrelated to Stigler's Law of Eponymy, of which I am certain Pascal knows, for reasons that remain obscure, obliterated by incorporation into a larger narrative of knowledge.

I think I'll scarf some pills and sleep for a few hours now.
I am glad the Oatmeal recognized Tesla, but its claim that Tesla was the greatest geek of all time kind of bothered me. Don't get me wrong, Tesla was amazing, but it overlooked a lot of geeks who never get attention, ever. At least some people have a clue who Tesla is.

Also the conversation following this post is amazing. I feel more intelligent just for reading through it.
Wow +Thomas Tucker There is so much meta in those links for me. I became interested in Robert K. Merton and his theories of social constructs BTW3 in my old HS library and ran into Stephen Stigler's concepts in much the same way. It all merged into the gestalt of my view of the world, and I had all but forgotten those greats who enhanced my view of social history.
Sure would love to take part in this conversation, but when a third of it is missing, it's completely impossible.
Oh, no, sorry, +Mince Walsh. It's a user block issue: a social problem of my making, rather than a technological one of Google's.
+Earl Hollar They can - hopefully - be fixed. We are all adults here. Is my hope in the moral maturity of man misplaced?
+Pascal Wallisch It's always my sincerest wish that all our hopes can match reality, that optimism can equal pragmatism. If we all work toward that end, I've no doubt we can achieve our goals, or at the very least, better what corners of the world we can reach.
+Earl Hollar That's great. Man is in an interesting dilemma. Coming from evolution, he is equipped with all kinds of instincts that facilitated survival and reproduction in the old days, including anger. The old way of resolving this conflict is of course no longer socially acceptable. This leads to all kinds of problems. The key challenge man faces these days is how he can transcend these base animal (literally, primate) instincts. There are few incentives to do so. Indeed, pretty much all of business (think advertising) encourages you to give in to them. But these "higher" powers need to be cultivated. It is the rare man who is able to do so, and do so consistently. This is the potential - but also the tragedy - of man.
+Peter DO Smith I've always found it interesting to look at the behaviors of the Great Apes, particularly our little outgroup [us, bonobos, and chimpanzees], and consider what led to the evolved behaviors of each. Bonobos are great lovers, solving social problems with sex, while chimpanzees are frequently surprisingly warlike, constantly battling neighbors. Humans are too "domesticated" for me to get a clear glimpse of what we were like before culture made the view so much more complex - and I'm probably "too close to the problem" to see clearly anyway - but our cousins teach us a lot about the evolution of social behaviors.
+Peter DO Smith Interesting idea. Testable, too. This implies fundamental differences in this regard (emotional apparatus) between those who stayed and those who left. The machinery is perhaps even more ancient than that. But it is a testable hypothesis.
+Earl Hollar I recommend the works of Dario Maestripieri on this issue. He is a real-deal academic scholar, and looks at human behavior by studying primates. He shares some astonishing insights that are hard to unsee.
+Peter DO Smith Do you suppose that's an artifact of culture, which allows us to essentially band much larger groups together than the other primates can manage?

Chimpanzees would have a difficult time organizing a group of a thousand warriors, not only because they couldn't manage the organization, but because they couldn't produce the requisite population in the first place. [And bonobos...well, I'm not sure you can wage group sex on the scale we wage mass war!]
+Peter DO Smith Do you not feel chimpanzee conflicts are extreme, or do you believe they, too, have been strongly selected for such a killing nature?
+Peter DO Smith Most interesting. In what ways? Scale and scope, or degree?

I'm thinking, for example, about the flip-side, cooperation: chimpanzees cooperate, too, just as we do, but in a much less extreme way than we do, with much less scope and scale, without the degrees of sacrifice humans are willing to make. We wage peace as much more extremely as we wage war: is this because culture allows greater organization of humans in many fields, or because mass cooperation was necessary for some extreme event in our recent past?
As far as I know, there is only one thing Tesla did not invent:
