Sean Reilly

This is another one of those generic ideas that I think translates really well to programming. Knowing how to use a tool is only the first step in the process, not the final goal. Using a tool creatively, knowing when to use it, and knowing when not to use it are far more important than "I can use X".

I think this is why I pay almost no attention to the "Technologies" sections of most CVs, especially when they are just a bulleted list of technology names.

While I don't think that jobs are obsolete just yet, this article made me think a bit about what employment is going to be like in the future.

Technology is making people more productive faster than the population is increasing. At the same time, an increasing population makes economies of scale more powerful with existing technology. This is most obvious in agriculture, where the percentage of the population needed to produce the food supply has fallen from nearly 100% to a very small minority in well under 200 years. As the article notes, the post office has it coming as well.

This trend isn't going to change; automation techniques never get worse, they only get better. In fact, it's probably going to accelerate due to the network effect of technology (the more technological advances you have, the easier it is to invent new ones). So it's entirely plausible that there will come a time when it's not even practical for everyone to have a job.

Not today, maybe not tomorrow, but sometime, and probably sooner than you think. I think it's possible that mass unemployment due to technology will happen within the lifespan of people alive today.

So here's my question: does thinking about unemployment with this prediction in mind change your mindset about unemployment policy?

Just found (and fixed) an acceptance test that would fail every Friday after 4pm. It's like the code was leaving early for the weekend.

I saw this story and thought of +Rob Maguire. It seems like the kind of thing he would do if he were mayor of somewhere.

Interesting take on Google vs. Apple. I hadn't thought about it this way, and I think it makes a lot of sense.

How to (legally, from the sound of it) make your own bootable OS X Lion USB key. I think I'll be trying this on the weekend.

Totally cool chair that I wish was around when I went to school:

So, one of the things I thought I might like about Google+ (as opposed to Twitter) is that I can write quick rebuttals to blog posts I find on the internet, without the effort of maintaining a full-blown blog. This is my first stab at it. Enjoy:

The article in question was posted by Bob Warfield.

My attempt at a rebuttal:

First of all, I should say that I agree that most small to medium size projects probably* won't have enough traffic and scale to need a NoSQL solution for that reason right out of the gate. However, that alone is not enough to make NoSQL a premature optimization. The reason is the general exception built into "premature optimization is the root of all evil": it only applies to "optimizations" that make the codebase less readable, less modular, and less well-designed. A lot of the time, using a NoSQL database can actually make domain modelling easier.

As the original article is divided into three main points, let's tackle them one by one:

1. "NoSQL technologies require more investment than Relational to get going with."

Obviously, I disagree with this statement. A well-designed NoSQL persistence layer can often be cheaper to implement than an RDBMS, and the reason is often the object-relational impedance mismatch**:

Put simply, mapping certain concepts that make a ton of sense in the object-oriented world to a relational database can be a huge pain. If you're not using a relational database to persist your data, that pain might just up and vanish. Less pain equals faster development, sometimes dramatically so. How much faster it can be is probably best summed up by my colleague +Chris Turner in his recent post:

I've had experiences like this as well, and so have others. Here's a specific example: the current project I am working on features a JSON REST API. We have found that using MongoDB (which persists JSON documents) as our persistence layer means we can use one consistent model the whole way through the app, never having to shoehorn a concept into a weird construction in a different paradigm. As a result, the development cost of persistence has been very cheap. Even though we don't have enough data to require NoSQL, it has been a positive choice for us: very far from a premature optimization.
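To make the mismatch concrete, here's a minimal sketch in plain Python (the names and shapes are made up for illustration, not our actual project code). With a document store, the API payload *is* the stored record; with a relational store, the same payload has to be shredded across tables on write and reassembled on read:

```python
# A hypothetical API payload: one order with embedded line items.
order = {
    "id": 1,
    "customer": "Acme",
    "items": [
        {"sku": "A-100", "qty": 2},
        {"sku": "B-200", "qty": 1},
    ],
}

# Document store: the payload is the stored document. No mapping layer.
document = order

# Relational store: the same payload must be split across two tables...
def shred(order):
    order_row = {"id": order["id"], "customer": order["customer"]}
    item_rows = [
        {"order_id": order["id"], "sku": i["sku"], "qty": i["qty"]}
        for i in order["items"]
    ]
    return order_row, item_rows

# ...and stitched back together on every read.
def reassemble(order_row, item_rows):
    return {
        "id": order_row["id"],
        "customer": order_row["customer"],
        "items": [{"sku": r["sku"], "qty": r["qty"]}
                  for r in item_rows if r["order_id"] == order_row["id"]],
    }

order_row, item_rows = shred(order)
assert reassemble(order_row, item_rows) == order  # round-trips, but costs code
```

The round trip works, but all of that mapping code (and this is the trivial case, before foreign keys, migrations, and an ORM get involved) simply doesn't exist in the document version.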

The post mentions two other points: "a learning curve" and "an operational overhead". The learning curve is fair, but in my experience part of that curve is unlearning tricks that years of relational work have made second nature (like introducing a join entity to represent a many-to-many relationship, or master-detail relationships), and realizing that "the dumb approach" in a relational system might not be that dumb after all.
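The join-entity habit is a good example of what has to be unlearned. A quick sketch (hypothetical schema, using Python's built-in sqlite3 for the relational side) contrasts the two instincts:

```python
import sqlite3

# Relational habit: a many-to-many (students <-> courses) needs a
# junction table and a join to answer even the simplest question.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE student   (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE course    (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE enrolment (student_id INTEGER, course_id INTEGER);
""")
db.execute("INSERT INTO student VALUES (1, 'Ada')")
db.executemany("INSERT INTO course VALUES (?, ?)",
               [(10, "Databases"), (11, "Compilers")])
db.executemany("INSERT INTO enrolment VALUES (?, ?)", [(1, 10), (1, 11)])

titles = [row[0] for row in db.execute(
    """SELECT c.title FROM course c
       JOIN enrolment e ON e.course_id = c.id
       WHERE e.student_id = 1 ORDER BY c.id""")]

# Document habit: the relationship is just a list on the document.
student_doc = {"name": "Ada", "courses": ["Databases", "Compilers"]}

assert titles == student_doc["courses"]
```

The "dumb" embedded list looks naive to relationally trained eyes, but for read patterns shaped like the document, it's exactly right.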

The operational overhead concern is a fair one, as many ops people are a lot more reluctant to adopt NoSQL than we developers are. But not all of them; I've met a few ops people who are incredibly open to the idea, and not just because they have visions of turning the budget for Oracle support contracts into more salary ;-) At any rate, consider that this might largely be an issue for medium-sized companies, because a lot of the time small companies don't do proper operational monitoring and maintenance anyway: they just install MySQL on a box, put it in a closet, and pray that nothing goes wrong.

2. "There is no particular advantage to NoSQL until you reach scales that require it....."

Well, obviously there is. I've been ranting about why for a few paragraphs already, so I'll leave this point alone for now.

3. "If you are fortunate enough to need the scaling, you will have the time to migrate to NoSQL and it isn’t that expensive or painful to do so when the time comes"

This is the point at which my opinion of the article descended into incredulity. Not because point #3 is outlandish (it isn't), but because it directly contradicts point #1! How can moving an app from relational storage to NoSQL storage be easy, but developing that same app to persist to NoSQL storage from the beginning be prohibitively hard? Something doesn't feel right here, and looking back at the article, I think I've found it.

The gist of the problem is in this quote: "as Sid Anand says, 'How do you translate relational concepts, where there is an entire industry built up on an understanding of those concepts, to NoSQL?'" Why would one have to translate those concepts if you aren't using them? The big reason is if you've gotten into the habit of designing your applications around those concepts. This leads me to suspect that Mr Warfield may partake of a bad practice that is depressingly common in the "enterprise computing" world -- modelling applications in terms of their (almost always relational) persistence instead of designing a persistence strategy in terms of the application.

This is a really, really bad practice. Modelling an enterprise application in terms of its persistence is like designing a video game around the file format of its save function. Persistence should be an entirely internal concern. Nonetheless, many people think about the schema design first, and it leads to incredibly nasty problems: like a database schema that can't be changed without breaking an external application, because it was "cheaper" for another app to just connect to a big common database.
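What "persistence as an internal concern" looks like in practice can be sketched in a few lines (a minimal illustration with made-up names, using the common repository pattern rather than anything from the article): the domain model knows nothing about storage, and the storage strategy hides behind a small interface that could be backed by an RDBMS, MongoDB, or anything else without the rest of the app noticing.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Optional

@dataclass
class Customer:
    """Domain model: no schema, no persistence annotations."""
    id: int
    name: str

class CustomerRepository(ABC):
    """The only persistence contract the application sees."""
    @abstractmethod
    def save(self, customer: Customer) -> None: ...
    @abstractmethod
    def find(self, customer_id: int) -> Optional[Customer]: ...

class InMemoryCustomerRepository(CustomerRepository):
    """One possible backing store; callers can't tell which one is in use."""
    def __init__(self):
        self._rows = {}
    def save(self, customer):
        self._rows[customer.id] = customer
    def find(self, customer_id):
        return self._rows.get(customer_id)

repo = InMemoryCustomerRepository()
repo.save(Customer(1, "Acme"))
assert repo.find(1).name == "Acme"
```

Swap the in-memory implementation for a Mongo-backed or SQL-backed one and nothing upstream changes, which is precisely why no external app should ever be allowed to reach around the interface and talk to the schema directly.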

For whatever reason, I haven't seen the same kind of thinking in the NoSQL world. It might be because the different NoSQL technologies are less similar to each other than RDBMSes are, or because the lack of an impedance mismatch makes it easier to extend OO/DDD concepts everywhere, or because everyone is using a different product. I don't really know why, but I do know that I don't want it to show up.

So there's my first rebuttal to a random post that showed up in my Twitter feed. It's longer than I thought it was going to be, and it has footnotes -- sorry about that. The point of this idea is not to expend the effort on editing and polish that a full-blown blog would demand, so I'm not going to edit this post to death, OCD be damned.

* There are probably cases where the right NoSQL solution could turn a very large project into a medium project (or, perhaps the other way around), but let's leave that alone for today.
