How techno-supremacists kill off indies

A good number of folks are excited by the iPad 3. It doubles the visual resolution and quadruples the size of the art assets. The games created for this new technological marvel will certainly be bigger, and we are told they will also be better. This is to be expected, since there is a long-standing techno-supremacist narrative in our industry that improved technology results in superior games.

The belief goes something like this:
- Better hardware means better graphics and simulation capabilities.
- This in turn leads to more player immersion and richer worlds.
- At some critical threshold, technology empowers game makers to break through, at which point games become undeniably relevant, both culturally and artistically.

This tale has driven many computer game developers for the past few decades. The push towards omnipresent 3D, the rote inclusion of complex physics simulation, the literati's constant promotion of extravagantly visualized narrative 'experiences'...all these trends in some manner rest upon the belief that more technology inherently makes games better.

Technologists love this story because it puts them at the center of all progress.

Futurists love the story because they get to convincingly say 'and then magic happens' without ever understanding why games work.

The press loves the story since pretty pictures and lush experiences sell like porn.

Businesses love the story because it creates a constant cycle of new stuff to sell.

Customers have been sold this story via billions of dollars in marketing. Do you actually love 3D, or were you a young and impressionable consumer who was fed decades of very expensive propaganda? (I'll give you a hint...the only populations that love modern 3D graphics are those where 3D graphics have been heavily marketed as a positive feature.)

I personally believe that this tale is one of the more damaging visions we can hold for the future of games.

Technology doesn't inherently yield better gameplay. It is just one of many tools to be used in service of design. However, it does have a distinct cost. History has shown over and over again that when new technology is applied to games, the immediate impact is to erect surprisingly massive barriers to entry that prevent innovative smaller developers from competing with more established companies. Technology may not make games better, but it almost always makes them harder to make.

A few people may remember the move to 256-color graphics. It wasn't merely a matter of adding a few more colors. The entire art pipeline shifted towards richer imagery. Per-title costs didn't merely increase 10-20%; they doubled or quadrupled, and development times also increased substantially. Companies that couldn't make the jump went out of business.

The same thing happened again with the move to 3D, and again with each subsequent console generation. There was a point at the end of the '90s, just as 3D came online, when smaller developers hit a breaking point. They could no longer raise the capital to make games that competed visually with the current market standards.

With the first death of the indie movement at the end of the Shareware days, the graphics became richer, but the pace of innovation noticeably dropped off. Consider: we are essentially left with two dominant genres in the retail AAA console market: 1st-person shooters and 3rd-person shooters. There are the occasional saurian kings left over from previous genre heydays (I'm looking at you, Madden), but simply hitting the quality bar costs too much to take on any additional risk. Our fixation on technology as a product differentiator yielded a homogeneous marketplace.

Are the games actually better? Is this cost worth paying?

The advent of new platforms such as mobile and social networks, and the blossoming of digital distribution, created a small space for tiny, non-technology-based games to grab a foothold. And when they did, they multiplied like worms in nightsoil. Will Wright's Cambrian explosion is happening. Low-cost development combined with cheap customer acquisition has yielded an astounding number of new genres, new forms of play and the fulfillment of unmet customer needs.

Is it a surprise that the majority of the innovation occurs using 2D and relatively unsophisticated engines? Lowest common denominator tech means that small independent teams can focus on making great games instead of paying enormous costs simply to get a pixel on the screen.

These upstart markets, populated by decrepit technology and innovative games, are growing at double- or triple-digit rates. The markets that attempt to differentiate with technology and visuals are either growing slowly or declining (depending on which numbers you select).

At the very least, people seem to still love games even if they aren't sporting the latest whizbang 3D engine packed with tens of millions of dollars' worth of hyper-detailed content.

Second verse, same as the first
Apple has released the iPad 3 because they want to sell more hardware. In turn, massive amounts of money are being spent to convince players that improved graphics will make their gaming experience vastly better. The story is the same. The technologists gush. The businesses count their coins. The customers are worked into a buying frenzy.

Will there be better games? I'm not so sure. I do know that:
- Costs will rise.
- Smaller companies will find themselves at a financial disadvantage.
- Gameplay innovation will slowly falter as teams become larger and more risk averse.
- We'll see a rise in the same old marketing messages of 'visuals, immersion, experience and narrative'. Gameplay will be treated as a solved problem.

I distinctly remember a keynote by Big Fish, just as the casual downloadable market was imploding, in which the CEO pimped their leading vision for the industry's future. It was surprisingly similar to the last point: he was so proud of how much they had spent on a recent hidden object game in order to craft the world's 'first cinematic gameplay'.

The game industry has a history now.

It repeats.

And will do so again.

What to do
How can indies keep their costs low and avoid being drowned by the constantly rising tech tides?

Avoid the mature markets dominated by techno-supremacists.
Yeah, that means no longer developing for consoles. It means going for odd markets like PC downloadable games or web games that are heavily fragmented and have a wide range of poorly formed community standards. Some may see these markets as difficult. However, they have niches that smaller, risk-loving teams can live within.

Counter marketing
Counter the techno-supremacist marketing message with your own that focuses on the inherent delights of gameplay. Enough indies promoting a counterculture keeps a space open for low-cost, innovation-friendly development. 'Games for people that love gameplay', not 'bloated experiences for techno-wankers'.

Focus on non-technology dependent art styles
We use pixel art in Realm of the Mad God. It will look the same on the iPad 3 as it does in Flash running in a web browser. The art says up front, "No, I'm not playing your silly game of bigger numbers. But I still look good if you like my aesthetic."

Use broadly available lowest common denominator tech
This future-proofs your engine. Pick something that can run everywhere and is used by everyone. Chances are that there is enough communal value in the broad technology choice that the hive mind will figure out a way to create future compatibility even as the technologists push forward arbitrary changes.

Use off-the-shelf tools
Avoid getting caught up in writing your own technology. The 200% increase in cost is not worth the 10% increase in some minor aspect of the experience that most users will ignore. Take your 'engine' out back and shoot it in the head. Instead use existing tools and game engines that let you focus 90% of your efforts on making a great game. Use them to reduce your costs so that you can iterate more quickly and spend less time reinventing content pipelines.

Compete on gameplay, not tech
If your game is average to mediocre looking and missing a half dozen cool buzzwords, and people still play it because they love the game, you've got a product that can ride out the inevitable tech bubbles. It is no surprise that Tetris and Nethack are still going strong despite their lack of GPU support.

take care,
Great article, and I agree that 'Technology doesn't inherently yield better gameplay', but that doesn't mean it can't. Your post equates better technology with better graphics, but there's a lot more than just graphics that can be explored with upgraded tech: deeper physics and AI simulations, more connected experiences as networking capabilities increase. As an indie developer I'm excited about the possibilities of a powerful tablet - while I have no intention of trying to compete in the graphics war with AAA studios, there are still vast unclaimed landscapes in gameplay that are totally ignored by AAA studios and that are just now becoming possible with this new technology.

You mention Realm of the Mad God as an example of eschewing technology in favor of gameplay, but I would argue it is a great example of using technology (networking technology) to build something great, creating a kind of gameplay that AAA studios have overlooked. Tech != Graphics!
As a "journeyman" game designer and someone who is creating a game that is inspired by lots of things (including Triple Town), I appreciate this article. And, I'm glad Danc that you didn't just present a problem, but thoughtfully talk about what to do. I guess we'll see what happens, but I think the advice of using off the shelf tools is key.

I also think it's ironic that the main reason I bought my iPad 3 was because of what it does for text. Reading email, news, using Safari ... the iPad 3 makes this an entirely new experience.
The iPad 3 is particularly egregious. It adds additional visual fidelity without the processing power and RAM to trivially support that fidelity. Indies can't simply say, "export the art at higher res," and have it work. Rather, developers must invest additional development resources to support the new capabilities. This is atypical in the history of iOS: the iPhone 4 was so much more powerful than the 3GS that the increase in resolution was insignificant for existing apps.

It's not simply a 1-year problem, since the iPad 3 will be a concern for at least 2 more years.
Pretty good article! Actually, our tiny development team faces the same issue: will the high-resolution new iPad help us earn extra coin, or will we just spend extra coin enlarging the image sizes without improving the gameplay at all?
This has been a similar issue in console game development, hasn't it?
Speaking of using existing toolsets, why not save yourselves eons of headaches and use a technology like Adobe AIR to port your web-based games to mobile platforms vs. spending time and resources writing in a native language? A carefully designed application can use the same codebase and easily transfer to any screen, performing well regardless of screen size or pixel density. Best advice on graphics? Use vectors whenever possible and dynamically cache them to bitmaps appropriately sized for the target device at runtime. They'll look as sharp as can be and perform exceptionally well. See example here: Hero Mages Multiscreen Demo
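For the curious, here is a minimal sketch of that vector-to-bitmap caching idea. It uses TypeScript and the plain HTML canvas API rather than AIR (in ActionScript you'd reach for BitmapData.draw() instead), and the drawing code and names are purely illustrative:

```typescript
// Rasterize vector art once at the device's native pixel density, then
// reuse the cached bitmap every frame. All names here are illustrative.
function cacheVectorAsBitmap(
  logicalW: number,
  logicalH: number,
  drawVector: (ctx: CanvasRenderingContext2D) => void
): HTMLCanvasElement {
  const scale = window.devicePixelRatio || 1; // e.g. 2 on a Retina iPad
  const cache = document.createElement("canvas");
  cache.width = Math.ceil(logicalW * scale);
  cache.height = Math.ceil(logicalH * scale);
  const ctx = cache.getContext("2d")!;
  ctx.scale(scale, scale); // keep the drawing code in logical units
  drawVector(ctx);         // vector commands are rasterized exactly once
  return cache;
}

// Usage: pay the vector rasterization cost up front, not per frame.
const heroIcon = cacheVectorAsBitmap(64, 64, (ctx) => {
  ctx.fillStyle = "#c33";
  ctx.beginPath();
  ctx.arc(32, 32, 30, 0, Math.PI * 2);
  ctx.fill();
});

function render(screen: CanvasRenderingContext2D, x: number, y: number) {
  // Drawing back at logical size stays sharp because the cache's backing
  // store was sized for the device's actual pixel density.
  screen.drawImage(heroIcon, x, y, 64, 64);
}
```

The same principle applies in AIR: keep the source art as resolution-independent vectors, but hand the renderer pre-rasterized bitmaps sized for the target screen.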
A good game is a good game regardless of the hardware it's running on. However, a game like Journey (thatgamecompany) wouldn't be as awesome as it is without top-notch hardware underneath it - stunning audio and visuals can add something to a game that gameplay alone can't. Good hardware doesn't make a good game but it can certainly make a good game better.

And let's be honest, the indie game scene needs a good spring clean now, the market has been flooded with cheap crap games because it's far too easy for Random Joe to develop and release a game at the moment.
Excellent post, Dan. I've been ranting about this for years.

There is a specific strain of techno-supremacist thinking that is especially seductive for creative people working in AAA games that goes something like this:
- Games will only attain the legitimacy of film once they have attained the holy grail of graphics, "photorealistic human characters".
- Technological power, more so than artistry, is what keeps us from attaining this grail: we need to be able to push more polygons, with more complex shaders, higher-resolution textures and more bones in facial rigs.
- Keeping technological "progress" marching onward is the only way we will attain this grail.

David Cage is probably the most media-visible adherent of this idea.

I believe this idea does tremendous harm to the medium of games.

Every new advancement in graphics since around 2005-6 has brought us further past a point of diminishing returns - development costs have risen far out of proportion with the overall quality increases they bring.

I believe the quest for "photorealistic humans" is a false grail, unattainable until the far-future point where we can simulate human physiology and psychology(!) at a very low level.

It is not just a matter of more polygons, bones, pixels, etc - Team Fortress 2's stylized characters look great and they were possible with tech that's now a decade old. It's not even a matter of artistry, ie modeler and animator skill - more than half the time when a character that's trying to be photorealistic looks uncanny, it's essentially because of bad AI. LA Noire characters look decent in screenshots but they fall down hard in motion, particularly with full-body acting and anything that requires them to respond to player input.

The techno-fetishist approach treats it like a problem of Data (and the horsepower needed to generate and process it), when in fact it's a problem of Process - the simulations and algorithms underpinning the behaviors we seek to replicate. This latter approach is so bottom-up that it would rightly be considered highly experimental, ie not something you can roll into a shipping game 2-3 years down the road.

Programmers continue to be seduced by the former approach because they don't understand the subjective side of the problem clearly enough, and treat it like rendering a 3D world (a problem we have more or less cracked). Artists and designers fail for the opposite reason - they treat technology like a black box that can materialize their mind's eye. Common to both is the inability to spot the underlying intractability of the problem.

Even if all this weren't the case, and it were simply a matter of throwing enormous wads of money and time at an already-solved problem, haven't we just spent millions attaining something that the most humble indie filmmaker can attain virtually for free just by pointing a camera at two actual humans? Why do we aspire to such an inferiority complex?

The Uncanny Valley is a bottomless asymptote of intellectual laziness that we will continue trying to landfill with money. This money translates into man-centuries of very talented peoples' time that could have been spent making games that further our medium and entertain or even enlighten billions.
+Si Robertson "And let's be honest, the indie game scene needs a good spring clean now, the market has been flooded with cheap crap games because it's far too easy for Random Joe to develop and release a game at the moment."

What are you suggesting exactly - that games would be better if they were harder to develop?
+Jean-Paul LeBreton
No, not at all. I'm not saying games would be better if they were harder to develop; I'm saying the market wouldn't be flooded with as much crap as it is now if games were harder to develop (or at least if game development required more investment in one form or another, be it financial or time, etc.). The ease with which people can make games at the moment will be the thing that bursts the current indie gaming bubble - it has happened in the past - the market gets flooded to the point where it can no longer support itself.
It's a good thing that technology advances. It has more applications than just indie game developers, and I'm personally glad I'm not programming in ASM trying to fit a game into 8k. And I like that my computer has a color screen.

The internet gives us an open market, and there will always be niches to fill. Aesthetic, not fidelity, is what matters in good design. All the pixels in the world don't help crap art. Make an innovative game with a purpose and forget about the technology altogether. It's a wasted thought.
There can never be an indie bubble because 95+% of all indies do it as a hobby. That is like saying that there is a 'writer bubble' or a 'photographer bubble' because of the cursed influence of cheap keyboards and camera phones. Imagine saying "The only people that can play basketball are those that start out at NBA caliber." Oddly enough, the number of NBA-caliber players starts to decline.

'Cheap crap games' are people learning to make games. After 9 or 10 crappy games, maybe that human debris that you look down upon will make something interesting and wonderful. This is how the creative process works. There are not hard limits between geniuses and imbeciles. Instead there is a hard journey from student to master. Seed it with many seekers at the start to maximize the number of great works at the end.

Higher barriers to entry do not result in better games. What they result in is kicking a bunch of students out of school so that they never have the chance to fulfill their potential.

We've been down this path before. Remember the phrase "breaking into the game industry"? That phrase emerged when barriers rose so high that hobbyists were effectively blocked from making games; it was one of the darker periods of our practice. Such barriers were not a strong reality early in the game industry, nor are they a reality now: anyone with a computer can make a game and release it commercially. Let's not repeat that mistake in some skewed pursuit of 'quality'.
+Daniel Cook
I think we will have to agree to disagree on this one, obviously we are both looking at things from completely different angles :)
I appreciate the argument, but when you say "Technology may not make games better, it almost always makes them harder to make" you seem to be suggesting that these advances in technology somehow force game makers to engage in the graphical arms race, that the power on offer has to be tapped. Yet it seems to me that technology is making game development easier, more accessible, and the real commercial as well as critical opportunities lie far from the world of multi-million dollar budgets and photo-realistic art assets. I don't think the higher resolution screen of the iPad 3 is going to pull the ladder up from the 'indie' development scene, but I do suspect it will help sell a lot more iPads, and afford those developers an even more diverse and attractive market to experiment with...
+Matthew Franklin - you say "[iPad 3] adds additional visual fidelity, without the processing power and RAM to trivially support that fidelity," but that's not the case if you consider the evolution from the first-generation iPad. The original iPad has 256MB, the iPad 2 512MB, and the iPad 3 1GB of RAM (not all of which is available to a single application). The GPU is between 1.5x and 2x as fast as the iPad 2's (source:,3156-6.html), which in turn was far faster than the original - about 8x according to most sources I've seen.

It's true that you might not be able to blindly double the resolution of all your sprites/textures on the 3rd-gen iPad if you were previously pushing the iPad 2 to the limit, but you should be able to do the ones that really count. If your game was running well on the original iPad, then I wouldn't be surprised if you could trivially double everything for the iPad 3.
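To make that "do the ones that really count" approach concrete, here is a rough TypeScript sketch of one way to budget which sprites get @2x art on a high-density screen. The sprite names, priorities and the texture budget are made-up numbers for illustration, not anyone's real pipeline:

```typescript
// Decide which sprites get @2x art on a Retina device, under a fixed
// texture-memory budget. All names and numbers here are hypothetical.
interface SpriteDef {
  name: string;
  baseBytes: number; // decoded texture size at @1x
  priority: number;  // higher = more visually important
}

function chooseAssetScales(
  sprites: SpriteDef[],
  budgetBytes: number,
  pixelRatio: number
): Map<string, 1 | 2> {
  const scales = new Map<string, 1 | 2>();
  // Everyone starts at @1x; that's the lowest-common-denominator baseline.
  let used = 0;
  for (const s of sprites) {
    scales.set(s.name, 1);
    used += s.baseBytes;
  }
  if (pixelRatio < 2) return scales; // non-Retina screen: nothing to upgrade

  // Upgrade the most important sprites first. Doubling linear resolution
  // quadruples texture memory, so each upgrade adds 3x the base cost.
  const byPriority = [...sprites].sort((a, b) => b.priority - a.priority);
  for (const s of byPriority) {
    const extra = s.baseBytes * 3;
    if (used + extra > budgetBytes) continue;
    scales.set(s.name, 2);
    used += extra;
  }
  return scales;
}

// Usage: the hero and UI font get crisp @2x art; background tiles stay @1x.
const plan = chooseAssetScales(
  [
    { name: "hero", baseBytes: 1 << 20, priority: 10 },
    { name: "ui-font", baseBytes: 512 << 10, priority: 9 },
    { name: "bg-tiles", baseBytes: 8 << 20, priority: 2 },
  ],
  24 << 20, // hypothetical 24 MB texture budget
  window.devicePixelRatio
);
```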
Fantastic piece, great conversation. For me it's never been a struggle of "technology good" vs. "technology bad", but rather a tension over who is steering the ship. Technologists (as well as designers) often make the mistake of assuming "more is better" - be it polygons or mechanics - failing all the while to ask the question "What is this game really about?" and "Does this feature directly serve that end?" If it serves the game, great. If not, it often only makes things harder - how many RPGs denied themselves the ability to iterate due to a pre-slated voice-over recording schedule? How many shooters shipped with 7 enemies instead of 27 because the art pipeline & production bandwidth couldn't support that many "heavy" assets? That said, I for one am happy to have a mix of AAA technology-to-the-wall games out there as well as niche / innovative / hedgehog-concept indie games. Both drive the industry forward in ways that are valuable to the whole. For me, however, graphics (and other tech) has always been a means to an end, and nothing more; and in most cases, my brain fills in whatever details are missing (and there are always details missing). 5 minutes into a really good game, all I care about is "that monster around the corner that's trying to kill me" rather than the number of polygons on the monster. Then again, maybe I played too many fantastic Infocom / MUD / NetHack games to be able to value presentation over substance.