+Steve Bertolacci The primary reason PC gaming hasn't completely superseded console gaming is the relative lack of fragmentation in the console market.

A developer creating games for a console only has to worry about a single set of specs.

A consumer with a console doesn't have to worry about upgrading their video card to support DirectX 50 and OpenGL with Hyperdimensional Shading Technology, or their processor to support the latest in hyperthreaded predictive technology, requiring at least 100 cores. Once the consumer buys the console, (s)he knows that the console will last as-is for (at least a few) years.

If a game is released for a console, developers and consumers know without a doubt that the game will run optimally on that console, without any unforeseen circumstances.

Don't get me wrong. I think the nature of the PC gaming market has been largely responsible for the rapid advancement of PC hardware technology (and rapid decline in its price) as well as some semblance of standardization in APIs.

However, for a console, fragmentation is just an all-around bad idea. If it's going to be possible to buy a game for a console and have the console be incompatible with the game, then what's the point of having the peace of mind of owning the console in the first place? At the very least, they should find a way to let us upgrade old +OUYA consoles or trade them in for a significant discount.

TL;DR: The PC gaming market is a constantly moving target. The console market, not so much.
Hmm, this could be OUYA's problem: it might make people annoyed or frustrated that new ones come out too often. It's like with iPads; a new one comes out right after they buy one.
+Christopher Parker, the reason isn't lack of fragmentation; it's just that consoles are simple and cheap. Gaming PCs require a certain level of expertise to tweak and money to burn. One could argue that the lack of fragmentation is what keeps the cost low, but that would just be BS.
It might not be a bad idea. You may want to stop thinking inside the box and get away from the mentality of Sony and Microsoft. If OUYA can bring out new ideas, and if software once again pushes the hardware, this is the way it needs to be. Having the same console for 7-8 years, and buying a new accessory just to play a game (a single game, in some cases), is last year's model. And if the consoles stay at that $100 price point, it's not much different anyway.
Though I agree with increasing the frequency of hardware releases, annually is a bit much. It may be counterintuitive, but many people like myself will not buy each iteration, and skipping OUYA consoles might end up like skipping movie sequels: I'm not going to see part 3 if I didn't see part 2 or part 1.

I'm tired of the fragmentation argument. That's the FUD that Apple and Microsoft like to feed people to corner them into their monolithic business models. If fragmentation is such a concern, and releasing new hardware causes fragmentation, why don't you naysayers just stick to your Pentium Windows 95 machines and shut up?
Yes, but look at what they've done for years: each one comes out with a new accessory that costs in the ballpark of $100-150 a year. I think as long as the games drive the upgrades, this will work. If the games run just as well on last year's model, then you're correct: why upgrade? But if the price point stays at $100, it won't be much different. I mean, I upgrade my phone every year and the kids get the old one. This may work the same way: new one in the living room, kids get last year's model.

And on fragmentation, I totally agree it is a non-issue for users. It is an issue on the development side, but only if the developer wants it to be.
New hardware options are a good thing, people. Sitting happily with a box of aging hardware to "avoid fragmentation" is silly. You've been brainwashed.