I'm not sure when my hosting situation will be figured out so here's the full text of my previously mentioned post:

"In the early days programming was considered a subdiscipline of mathematics. In fact, the very first person to write an algorithm was renowned as a mathematical genius. However, somewhere along the way we forgot. We began to think of ourselves as something different, a profession not beholden to rigor or deep understanding of the models we create.

It’s easy to see how this would happen in an industry that puts so much more weight on knowing the API of the year than on understanding base principles or on the desire to dig deep. People can make huge amounts of money pushing methodologies that have little to no evidence of effectiveness. We work in an environment where hearsay and taste drive change instead of studies and models. We are stumbling in the dark.

I have come to attribute our sorry state to a combination of factors. The first is the lack of formal training for most programmers. This isn’t in itself a bad thing, but when combined with a lack of exposure to mathematics beyond arithmetic, due primarily to an inadequate school system, we are left with a huge number of people who think programming and math are unrelated. They see every day how their world is filled with repeating patterns but they miss the beautiful truth that math is really about modeling patterns and that numbers are just a small part of that.

The relationship between math and programming extends far beyond SQL’s foundation in set theory or bits being governed by information theory. Math is intertwined within the code you write every day. The relationships between the different constructs you create and the patterns you use to create them are math too. This is why typed functional programming is so important: it’s only when you formalize these relationships into a model that their inconsistencies become apparent.
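To make the claim concrete, here is a minimal sketch (in TypeScript; the names are mine, not the post's) of how formalizing a relationship as a type makes an inconsistency apparent:

```typescript
// A connection modeled as one bag of optional fields: this type admits
// inconsistent values, e.g. state "disconnected" alongside a live socketId.
type LooseConnection = {
  state: "disconnected" | "connected";
  socketId?: number;
};

// Formalizing the relationship as a tagged union makes the inconsistency
// unrepresentable: a disconnected value has no place to put a socketId.
type Connection =
  | { state: "disconnected" }
  | { state: "connected"; socketId: number };

function describe(c: Connection): string {
  // The compiler forces both cases to be handled, and only lets us read
  // socketId in the branch where it is guaranteed to exist.
  switch (c.state) {
    case "disconnected":
      return "not connected";
    case "connected":
      return `connected on socket ${c.socketId}`;
  }
}

console.log(describe({ state: "connected", socketId: 7 }));
```

The loose version type-checks every inconsistent state; the modeled version rejects them before the program ever runs.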

Most recently it seems to have become a trend to think of testing as a reasonable substitute for models. Imagine if physics were done this way. What if the only way we could predict how an event would turn out were to actually measure it each time? We wouldn’t be able to generalize our findings to even a slightly different set of circumstances. But it gets even worse: how would you even know what your measurement is of without a model to measure it within? To move back into the context of programming: how useful is a test that doesn’t capture all of the necessary input state?
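The closing question can be sketched in a few lines (TypeScript, with hypothetical names of my choosing): a test whose verdict depends on input state it never mentions.

```typescript
// Hidden input state: a module-level flag that the function's
// signature says nothing about.
let strictMode = false;

function parsePort(s: string): number {
  const n = Number(s);
  // The behavior depends on strictMode, an input that the type
  // (string) => number does not capture.
  if (strictMode && (!Number.isInteger(n) || n < 1 || n > 65535)) {
    throw new Error(`invalid port: ${s}`);
  }
  return n;
}

// A "passing" test of this call measures nothing stable: flip
// strictMode to true and the very same call throws.
console.log(parsePort("99999"));
```

Without a model of all the inputs, the test is a measurement of unknown circumstances.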

This is exactly why dynamic programs with mutation become such a mess. Object-oriented structure only makes things worse by hiding information and state. Implicit relationships, complex hidden structure and mutation each impair reasoning on their own, but when combined they create an explosion of complexity. Together they conspire to create a system which defies comprehensive modeling by even the brightest minds.

This is also why programmers who use typed functional languages frequently brag about their low bug count yet eschew methodologies like test driven development: tests pale in comparison to the power of actual models.

Many thanks to my friends on Twitter for the discussion leading up to this post: @seanschade, @craigstuntz, @sswistun, @joakinen, @copumpkin, @dibblego, @taylodl, @TheColonial, @tomasekeli, @tim_g_robinson and @CarstKoenig. Special thanks to @danfinch for taking the time to proofread."

Once my hosting issues are settled, the post will be available here once again:
http://richardminerich.com/2012/01/why-do-most-programmers-work-so-hard-at-pretending-that-theyre-not-doing-math/
18 comments
 
Contrary to what you seem to believe, subtyping is not at odds with math. See chapters 15 to 19 in TAPL.
 
For me programming is not necessarily mathematics (only if the specific domain is about it); it feels mainly like being an architect: building a nice foundation, giving the floor a nice mosaic, putting beautiful ornaments in every corner... but also, maybe I don't understand many functional concepts because I'm not that strong in mathematics, especially with all those weird scientific mathematical syntax things...
 
+Rahul Goma Phulore Anything can be modeled in math, but some models are much nicer than others. Row polymorphism is much simpler to type properly than subtyping, while giving most of what we get from subtyping.
 
Disclaiming the connection between programming and mathematical logic is about as smart as rejecting the physical laws of gravity. It indicates a lack of scientific understanding. The result is a bloody nose, or worse.
 
+Daniel Yokomizo, "Anything can be modeled in math." How about runtime metaprogramming? It cannot be modeled in math. It's still a very useful tool IMO.

Re: row polymorphism, agree. I should have said OOP is not at odds with math.
 
+Rahul Goma Phulore Which kind of metaprogramming? Many forms of advanced metaprogramming can be expressed using dependent types and/or multi-stage programming. Many others are nicely expressed using first-class labels in polymorphic rows. Give me an example and we can figure out a principled version of it (but I think we should move this discussion to another post).
 
Your conflation of rigour with mathematics is not very rigorous.
 
+Daniel Yokomizo I think it's quite artificial to distinguish row polymorphism from subtyping. Yes, it's a restricted form of subtyping (a very useful restriction, I agree), but it's subtyping nonetheless.
 
Parametricity yields much stronger theorems for row polymorphism than for subtyping; that's the primary reason I try to make a clear distinction between them. Subtyping is also entangled with many other features (e.g. the self fixpoint in OO languages), which makes it harder to discuss; I find myself always confused about which kind of subtyping is being argued about. OTOH row polymorphism is better defined. Other than these issues I agree with you that it's an artificial distinction, given some definitions of subtyping.
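The distinction can be illustrated in TypeScript (which has structural subtyping rather than true rows, so a constrained generic is only an approximation, and the names are mine):

```typescript
// Subtyping: the argument is upcast to { name: string }, so the result
// type forgets everything else the record contained.
function idSub(r: { name: string }): { name: string } {
  return r;
}

// Row-style: a constrained generic threads the whole record type R
// through, so the extra fields survive in the result type, and
// parametricity says the function can only touch `name`.
function idRow<R extends { name: string }>(r: R): R {
  return r;
}

const ada = { name: "Ada", born: 1815 };

const viaSub = idSub(ada);
const viaRow = idRow(ada);

console.log(viaRow.born);    // fine: the rest of the row was preserved
// console.log(viaSub.born); // type error: `born` was forgotten by subtyping
```

The runtime values are identical; only the generic version keeps the knowledge of the extra fields in the type.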
 
I'm sure a lot of math goes into designing a good wrench, though you need to know none of it to use one.
 
+Ademar Gonzalez Type systems are that good wrench! You'd be hard pressed to use a wrench if you didn't understand the idea of a threaded bolt and nut though.
 
OO != state and mutation. I like the premise, but the OO hate is based on common practice and not its ideal. In 10 years, when less skilled developers mangle the functional ideal, and whatever paradigm shifting methodology is touted to solve those problems -- remember this. My completely ignorant prediction -- monad abuse.
 
+Lou Franco Ahh, but that's why I listed mutation, object orientation and dynamism each separately. They are separate in how each creates a different form of complexity.
 
Not sure how OO can add complexity on its own. Abstractions lower complexity. Bad ones don't, granted, but bad functional abstractions are also complex. For example, OO in Clojure is immutable, flexible, and feels like the opposite of complex to me. There are Java artifacts, but if you don't need interoperability you don't have to use them.
 
+Lou Franco Information hiding is the problem. Instead of data structures being exposed and then having algorithms that operate on them, you have invisible internal structure. Admittedly, it's not nearly as big of a problem when you don't have mutation, but how often is that?
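A tiny sketch of the contrast being drawn (TypeScript, names mine, not from the thread):

```typescript
// Data exposed: any function can operate on it, and the structure is
// visible, structural, and stable under reasoning.
type Point = { x: number; y: number };

const translate = (p: Point, dx: number, dy: number): Point =>
  ({ x: p.x + dx, y: p.y + dy });

// Information hiding plus mutation: the structure is invisible from
// outside, and every method call may silently invalidate an answer
// you already computed.
class HiddenPoint {
  constructor(private x: number, private y: number) {}
  translate(dx: number, dy: number): void {
    this.x += dx;
    this.y += dy;
  }
  norm(): number {
    return Math.hypot(this.x, this.y);
  }
}
```

With the exposed record, `translate` returns a new value and old ones stay valid; with the hidden, mutable class, `norm()` can change between two calls that look identical at the call site.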
 
Never in C#, all of the time in Clojure. I resist the conflation of OO the concept and OO as implemented in C#, Java, etc. Information hiding makes interoperability and independence possible. You are right that common practice is bad -- no argument there. Also, that the language and all examples promote that practice -- and alternatives are basically impossible -- no argument there.

Still, with all of its faults, OO has begotten more interesting programs than any of the alternatives so far (in sheer number and impact), with structured C an arguable peer.
 
My favorite book on computers and mathematics is "Computer Mathematics" by Cooke and Bez (Cambridge University Press, 1984). It was this book that first opened my eyes to computer programming as a mathematical discipline in and of itself, as opposed to merely a way to manipulate numbers. It also started me thinking about the notion of programs as a series of functional transformations and operations on sets of data -- in the set theoretic sense of the word "set". (I remember I was thinking about computer chess at the time, and I realized that the available moves for each piece could not only be specified in terms of set theory, but that this also simplified the implementation of chess moves in computer code.)

Anyway, good post.