Code said:
Five years ago, Apple did great on floating point.
Nowadays, PC chips perform as well as Apple's.
I use Macs because that's what I've always used.
I mean I own PCs, but only for video games.
My fiancée's mother and stepfather own a Mac. It's a shiny iMac - looks a bit like an art deco lamp had sex with a snow globe and their offspring happens to have a keyboard.
They sort of typify the Mac users I have to deal with a lot. They'll call me over, convinced their machine is destroyed; I take a look and see that they haven't even turned it on.
They don't understand the difference between sleep and off.
For people like that, I'm just glad they can move about the house without killing themselves, so if they want a shiny computer for morons, that's fine by me.
That's an entirely different argument from claiming they are somehow superior in any way.
Each system has its own merits for whatever you plan on doing. You need to place a monetary value on the assets each one offers, and then see how that value compares to the price of the machine you're paying for.
For me, at this point, I only need a laptop with FreeBSD or Linux on it - but unfortunately, I have to admin a Windows network, so instead I have a laptop with WinXP Home on it.
While I do enjoy the shininess of a Mac, in the end, they aren't cost effective for anything I do.
If I had a really nice digital video camera and really needed to make videos quickly and easily, I would probably at least consider a Mac.
Other than that, they just aren't for me.
If my math code for some reason starts to need a 64-bit processor, then I'll start comparing it against the AMD/Intel versions - but at this point, nearly everything I do is with money, which is really just integer math in the end - fast stuff on cheap processors.
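To illustrate the "money is just integer math" point, here's a minimal sketch (my own example, not from the post above): binary floating point can't represent most decimal fractions exactly, so currency code typically keeps amounts as whole cents, where addition is exact and fast on any cheap CPU.

```python
# Floating point can't represent 0.1 or 0.2 exactly in IEEE-754 binary:
print(0.1 + 0.2 == 0.3)  # False - the sum is 0.30000000000000004

# Storing amounts as integer cents keeps every sum exact:
prices_cents = [1099, 250, 4999]  # $10.99, $2.50, $49.99
total_cents = sum(prices_cents)
dollars, cents = divmod(total_cents, 100)
print(f"${dollars}.{cents:02d}")  # $63.48
```

That's why money code rarely touches the FPU at all - it's plain integer adds, which run at full speed on commodity hardware.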