That's about right :D

Via +Nancy A. McCoy :D
 
Uh? Linux is not a PC? Am I missing something?
 
Linux is NOT the wolf. PC is.
 
They're talking about operating systems. It should be Mac/Linux/Windows.
 
Yeah, that's Mac OS / Windows / Linux, because nowadays they're all PCs. The order is right.
 
Yawn. Cute but misinformed... For me, Windows is a toy used by your average non-techies world-wide. I love my Linux and have used it full-time since 1993, but Mac is also hard to beat. How often have you seen a Windows system come with a full-fledged C, C++ and Objective-C development environment? Macs do. Windows even stopped shipping with BASIC back around Windows 95. Need a full-fledged command line? Bash, Perl, Python, etc. are all standard on Macs, whereas Windows users typically resort to a DOS command prompt reminiscent of 1985. Give me a Mac over a Windows box any day of the week and I'll get real work done instead of just playing games.
 
Don't expect any help for Linux? Are you kidding? Who writes this stuff?!
 
Sorry +Jim Buzbee, but I can do damn near anything a Mac can do using my Windows-based PC, EXCEPT for those programs that are MAC ONLY like Soundtrack Pro or Final Cut Pro... Oh wait, I can do all that with Adobe software. Macs are expensive pieces of shit. I can build a PC that has damn near the specs of a Mac for half the cost. Python, Perl, and Bash can be downloaded onto a PC.
 
+Andrew Head Yawn.. rolls eyes... Here's a quarter, go get yourself a real computer

;-)
 
I have a real computer, dumbass. I'm usually civil until shits like you do the whole rolling-eyes-and-quarter bit. At least I can upgrade my computer for less than $1,000, while you have to buy an entire new fucking Mac every 5 years. +A. Fonzarelli Yes, I do. Course I get my Windows at a reduced cost. I get 7 Ultimate for $30 thanks to PURDUE.
 
+Andrew Head Methinks thou dost protest too much.

And realistically what's running on your desktop is getting less and less relevant as much of what an average consumer wants to run is available on the cloud where servers are already running a real operating system: Linux
 
I can buy a faster, higher performance PC for half the price of a mac, dual boot it to Windows and Linux, and do everything on it.

Macs look good, they work well, and the OS upgrades are cheap, but the hardware is, essentially, an overpriced PC in a custom box. Hell, if I go AMD rather than Intel, I can build a computer that is faster and more stable than my mac for a fraction of the cost.

I have a mac, and I know how to use it, and I know what its limitations are. I also have a few Windows machines, and some Linux systems. As an Operating System enthusiast, I can see the points made by the graphic above, and, most importantly, the humour in it.

Anyone who takes a graphic like that seriously needs to take five, and realise that it's a bloody joke.
 
+Jim Buzbee I could care less what you think. You have already pissed me off, thanks to the Quarter comment.
 
+Jim Buzbee - Operating systems are each good for the purpose they're designed for. OS X is king of the UI, Windows is king of the flexible desktop, and Linux is the king of servers... Of course, there are other OSes out there, and other opinions that might disagree with mine, but hey, we're all entitled to our opinions :)
 
+Andrew Head Are you new to Google+? Insults don't have any value here. You must show your arguments. That's the way.
 
+Ayoub Khote Yep. No argument here :-) I use them all but have my definite preferences due to my Unix background. Luckily with iOS, OS X, Android and Linux all having Unix roots, I'm a happy guy these days!
 
This only confirms it: Macs are wimps. Maybe it needs to use Linux's mirrors to look bigger than the ankle-nibbler that it is.

Not that I'm degrading the people that use Macs; it's a pre-programmed response your Mac-loving teachers gave you, because they had free Mac computers given to them when they could not even afford an electric typewriter. But my Atari did not do well either; they invested in Japan, where Nintendo was king.
 
+Ayoub Khote Maybe in the distant fog of computing past, DOS can be traced back into the evolution of the great Unix tree, but I can't forgive the fact that they got the directory delimiters backward and used letters to designate drives :-)
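That backward-delimiter-and-drive-letter split is still visible today. A quick Python sketch (the paths here are made up for illustration):

```python
from pathlib import PurePosixPath, PureWindowsPath

# Unix-style path: forward slashes, one rooted tree, no drive letters
posix = PurePosixPath("/home/jim/docs/readme.txt")

# DOS/Windows-style path: backslashes plus a drive-letter anchor
win = PureWindowsPath(r"C:\Users\jim\docs\readme.txt")

print(posix.parts)  # ('/', 'home', 'jim', 'docs', 'readme.txt')
print(win.parts)    # ('C:\\', 'Users', 'jim', 'docs', 'readme.txt')
print(win.drive)    # 'C:'
print(posix.drive)  # '' (empty: Unix has no drive letters)
```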
 
Atari introduced the bit-mapped icons and the bump registers. As for the Amiga (the "Amiba," as I called it, that single-celled life form), it did things more in the line of video processing. But they were about 6 years apart in human age, and an 8-bit against a 16-, almost 32-bit age in CPUs: the 6502 against the 68K, which the Mac also used. And the Amiga was more Euro, Atari more American.
 
+Qim virtua, really now? How much did that run you for a home computer, and what are your specs? I'm always looking to upgrade. Course considering I can't find anything on fiber-optic home PCs, I'm pretty sure you're joking, but if you aren't, then please send me the info.
 
The way I read this... Anyone can own a Mac. A skilled handler can get a PC to do what they want. Nobody should be allowed to have Linux in their home..
 
+Dean Holyer, That takes me back. I recall the old Atari bit-mapped icons. I would use the vertical-blank interrupt to change the definition of the icon slightly, thus getting real-time animation on a 1 MHz, 8-bit computer. We've come a long way...
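That vblank trick can be sketched loosely in modern Python (the icon frames and the 60 Hz loop are illustrative stand-ins for the real 6502 interrupt handler; the screen keeps pointing at the SAME character slot while the bitmap behind it is swapped every frame):

```python
# Three hypothetical 8x8 icon frames, one byte per row as on the Atari
FRAMES = [
    [0b00011000, 0b00111100, 0b01111110, 0b11111111,
     0b11111111, 0b01111110, 0b00111100, 0b00011000],
    [0b00000000, 0b00011000, 0b00111100, 0b01111110,
     0b01111110, 0b00111100, 0b00011000, 0b00000000],
    [0b00000000, 0b00000000, 0b00011000, 0b00111100,
     0b00111100, 0b00011000, 0b00000000, 0b00000000],
]

charset = {ord("@"): FRAMES[0]}  # the character slot the screen displays
frame_counter = 0

def vblank_handler():
    """Runs once per video frame (60 Hz on NTSC): swap in the next bitmap."""
    global frame_counter
    frame_counter += 1
    charset[ord("@")] = FRAMES[frame_counter % len(FRAMES)]

# Simulate one second of NTSC video
for _ in range(60):
    vblank_handler()

print(frame_counter)  # 60 vblanks -> 60 potential animation steps
```

The on-screen data never changes; only the character-set definition does, which is why it was cheap enough to run on a 1 MHz CPU.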
 
Hey +John Wright, I've had Linux in the house since 1993! Now my Sony TV has an embedded Linux system, my cable set-top box runs Linux, all the cell phones in the house have a Linux kernel, and even my WiFi access point runs Linux. Not to mention the fact that I raised my son to use Linux ever since he could use a keyboard. Now he works for Google and will support me when it comes time for me to retire :-)
 
Ok +Qim virtua My laptop is an Asus G17 gaming laptop used for graphics and animation for video game design. It's got 8 GB of RAM and a GeForce GTX 260M (CUDA) with an additional 1 GB of RAM on it. It's not bad for what I use in school.

My desktop, which is in need of an upgrade, has an AMD Phenom II Black Edition with 16 GB of RAM and an Nvidia GTX 400-series card. Can't remember the exact one, but it did its thing for a good while and is still good; I'm just ready for the new GTX 680, which is a beast. I plan on going with 32 GB of RAM, or an Intel Core i-series processor, and water cooling. Eventually I will have 3 monitors, and it will be used for game design, which is my major.
 
The odds are the graphics card is an embedded feature. They need to make laptops a plug-and-play, hardware-optional system: want a better CPU or GPU or even BIOS, you unplug the old and replace it with a new and better Lego-brick piece. I wrote of this back in 1996 to PC Mag, PC World, and a few others; no one liked it, but both PC mags within 6 months had topics on this. I'm not saying copied, more like influenced.
 
+Dean Holyer, that is why I'm rebuilding the desktop. The laptop I use at school, but I have the same programs on both. That way, when I get back to the apartment, I can just keep working on the desktop. Still, being an Asus laptop, there's a good chance it's actually a separate card. Asus is good about that.
 
+Qim virtua, I wasn't trying to get into an e-peen battle with you. I'm going into a field that needs a good top-notch computer to do the work on, so I'm genuinely interested in any kind of new PC tech.
 
I want a PC that looks at running 3 open windows of 3ds Max, 2 windows of Maya, 2 windows of Poser, and a window of Mudbox, and just laughs. I want a PC that can turn a normally 30-hour render into 3 hours or less.
 
You mean Asus uses an oriental brain to control American-made I/O hardware. Almost seems like what happened to the Mars Polar Lander my brother helped build: some of the orbital software was written in metric and the American hardware thought the numbers it got were in English measurements, so the sensors said the ground was kilometers away when it was only feet away. I made that kind of error in 9th-grade Earth Science, as we had to use the metric system in the mid-70s. It's the same sort of mismatch as mixing the 5 V TTL level with IC chip levels of 1.3 volts, and now some go as low as 0.795 volts.
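That kind of unit mix-up is easy to reproduce. A minimal Python sketch (the function names and the 100 N*s figure are made up for illustration; only the N*s-to-lbf*s conversion factor is real):

```python
# Illustrative only: how a metric/imperial mismatch corrupts a reading.
# One side emits newton-seconds; the other assumes pound-force-seconds.

LBF_S_PER_N_S = 1 / 4.44822  # 1 newton-second ~= 0.2248 lbf-s

def ground_software_emits_impulse_n_s() -> float:
    return 100.0  # metric sender: 100 N*s

def spacecraft_expects_lbf_s(value: float) -> float:
    # The receiver treats the raw number as lbf-s: no conversion applied.
    return value

raw = ground_software_emits_impulse_n_s()
misread = spacecraft_expects_lbf_s(raw)    # interpreted as 100 lbf-s
actual_in_lbf_s = raw * LBF_S_PER_N_S      # what it really was: ~22.5 lbf-s

print(f"misread: {misread:.1f} lbf-s, actual: {actual_in_lbf_s:.1f} lbf-s")
# The receiver overestimates the quantity by a factor of ~4.45
```

The bug lives in neither function alone; it is the unstated unit assumption at the interface between them.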
 
+Qim virtua nice. I'm wanting something like that. My field of study is graphics and animation for video game design, so you can see why I need a top-notch computer.
 
What you need is a supercomputer liquid-cooled card that can render 1 billion pixels a second, and that's a slow card. Some AMD cards can do 380 million a second on a fan-cooled card, but the card may suck as much juice as your big-screen HD TV. In other terms, it eats batteries faster than you can feed them to it.
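For a rough sanity check on those fill-rate figures, the pixel throughput a display actually needs is simple arithmetic (Python sketch; the 1-billion-pixels/s number is the one quoted above, and the resolution is just 1080p as an example):

```python
# Pixels per second needed just to refresh a display, versus a card's
# quoted fill rate. Numbers here are illustrative.

def display_pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

hd_60 = display_pixels_per_second(1920, 1080, 60)
print(f"1080p @ 60 fps: {hd_60:,} pixels/s")  # 124,416,000

# A card quoted at 1 billion pixels/s has roughly 8x headroom over bare
# 1080p60 refresh, but per-pixel rendering cost (shading, overdraw)
# eats that headroom quickly.
print(f"headroom at 1e9 px/s: {1e9 / hd_60:.1f}x")
```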
 
Well right now I make do with the upgrades I can. Still, I'm wondering if what I need is a better processor or a better video card for render speed?
 
Umm yeah, I know how to install my boards and cards and processor, and I know how to set up and install software. Etching stuff is a bit out of my league.
 
I'm not sure what you mean by iterations, but seriously, at this stage in the game, I'm looking to make a full render (figures, motions, sound, all that jazz) that would take say 6 to 8 hours, and at least cut it in half... for now. Later on, when I actually have my job, then I will do the serious upgrade. As for iterations, maybe we haven't reached those yet in my classes. Mostly we use the words polygons and vertices. Maybe one of those has the same meaning.
 
It's supposedly a 4-year degree, but that's at 15 credit hours each semester. I'm doing 12. Umm, so another 2 years, maybe 3 max, even though I'm already 2.5 years in so far, but that's due to certain classes not being available at certain times, and all that college crap.
 
To the teachers that crap is money in their pockets, and things tend to be structured for them, not for you, the ones who give them the money they need. But they make the frames you have to fit into, and the odd hours are there to ensure you are devoted to the education given. If it was easy, anyone could do the work you have to.
 
Well, the good news is we just got certified with ATMAE, and we are getting a new degree, one I can switch to, and that may cut down my hours by cutting out parts of the degree I don't need, like the web-based PHP class and the database class that's on the CIT side of things.
 
It's nice to be the first; you can set records, or embarrassments to be remembered through history. The bad thing is you get no royalties from your actions. Just ask those of Animal House. Which I have to move from my Dish to my DVD burner to make a full-screen HD version of it to replace my old 4:3 one.
 
+Qim virtua, I know the real world is worse. I don't think the hours are bad in that they are too long; I mean I can't get the damn classes I want when I want or when I need them, so my degree is behind.
 
CGI as employment is best thought of as a profitable hobby, because you'll work 25/8 and they try to pay you like you're a temp. But the IRS thinks you're a movie producer raking in the billions. That is possible, but not likely.
 
That's one of the funniest explanations I've heard about it yet.
 
And I left out the part that if you have talent everyone will think GOD helped you make that beauty.
 
Considering I'm Catholic, maybe GOD will help me with it, I'll certainly be praying every damn minute I'm having to work overtime. ;)
 
That's not called overtime; it's called getting it to look right.
 
My career right now is finishing up college so I can get to the actual career. :D
 
It underestimates the Mac by a little, the PC by a lot, and overstates all aspects of Linux. Its heart is in the right place though.
 
I liked MyDOS on my Atari; it was much like 4DOS on the PC, and it could read PC disks. And all hardware connected through a high-speed serial port. The joysticks were just two 8-bit parallel I/O ports that you could configure for TTL voltage in or out. I used them to control stepper motors to move mirrors to aim the lasers in laser light shows. But the updates were limited to 60 fps of video, since Atari clocked everything based on the video clock speed, at the time 3.58 MHz, while Apple was 2.0 MHz. But Apple had a 44-pin card slot.

Atari had a 40-pin memory card slot, so that one day those four slots could hold 256 MB memory cards that page-flipped 4 MB blocks as a game cartridge, or a huge 1 GB RAM disk. And the video chip could read that as video RAM, so you could play hours of 256 colors out of 65,535 colors in a 360x228 video image in 1981, years before the VGA card came out, and that was a 65k palette out of 24-bit color at 320x240, now called MCGA. And today you can do 2560x1600 in 32-bit color per pixel.
 
We have come so far and achieved so little.

It seems for all this technology it's only ever used to 'display mondo titties in like mad resolution'
 
Just remember: the more friends you know in that job, the smoother it is for you to get the job. The more people you know who have done and can do the job, the better odds you will have of doing it, because when you do not know how, your friends will help you get the work done.