Not the only place. I have found the German site NotebookCheck.net to be more thorough than Anandtech when it comes to laptop reviews.
Not only do they measure color space and color accuracy, but they also measure backlighting levels across the screen, comment on backlight bleed, test for viewing angles, check the reflectiveness of screens, etc.
They also test the decibel levels of the fan, the temperature of the laptop surface, etc. at various loads. They point out situations where a certain laptop may ship in two configurations. And they point out the pros/cons of the nearest products from competing vendors, which have also been tested with equal thoroughness.
In other words, they actually test the machines against a checklist, to assess the performance of each element. In contrast, when an Anandtech reviewer claims that a machine is quiet or that the screen is matte, you don't actually get all the information. Maybe the ventilation system was masking some of the sound. Maybe the screen is only semi-matte.
The only problem is that NotebookCheck is based in Germany, so some of the machines they review are not available in the United States. Still, I'm happy restricting my choices a bit to avoid being surprised with a laptop purchase.
However, according to this review, the new Mac Pro doesn't work with the new Dell 4K monitors (I don't consider 30Hz refresh as 'working'), and even with the 4K display that Apple sells, it only works at its native 3840 x 2160 at 60Hz. When choosing a 'Scaled' resolution, it renders blurry junk.
That is pretty disappointing (although I imagine it will be fixed at some point).
And that's probably generous - from the GPU analysis, it appears that the tested unit has D700s, which bumps the price to $8299 - a configuration that isn't mentioned anywhere in the article. About the only thing left to upgrade on the test unit is the RAM, to 64GB.
Since the article calls itself a review, it would be better if the review unit were accurately described. It seems to me there's a bit of a bait and switch, because the performance numbers presented are not for the $3000 or $4000 configurations presented in the article's lead.
OTOH, it's a little surprising that Apple gave out 12-core review units, since the 8-core seems to benchmark better.
Given that much less expensive machines often performed better, the review's favorable conclusion ought to make some rational case for itself.
It's a bold bet on a possible trend, which I like a lot, since it would mean that non-gamers would profit from GPUs that would otherwise bore themselves to death on their machines. It would also give AMD a better position, maybe averting x86 becoming a complete Intel monopoly.
The line of video professionals happy to pay $10k/box to get their renders done faster screaming "TAKE MY MONEEY NAOW" might have something to do with it...
Also, since you missed all the charts showing the MP demolishing everything else by 2.5x+ on multithreaded workloads (ya know, the thing that people buy MPs for) you may want to verify your consumption of what the kids call "h8rade".
The MacOS market in general might be described as niche, but it seems to have sustained itself over the years.
three-year Mon-Fri 8-5 next business day, parts, labor and 24x7 phone support,
They come to me.
Edit: to all the naysayers: I'm in the UK. HP here is pretty good. We have over 200 machines on next day and we've had only two (!) problems, and both related to part supply, resulting in a quick purchase on Misco that arrived the next day.
I ended up torrenting an HP restore ISO and thankfully the BIOS's licensing key matched the ISO.
On desktops though, I'm willing to put more effort into settling software update issues/device conflicts, since I probably have to do that anyway to write performance-optimized code (depends on the exact purpose of the desktop though, but I do a lot of scientific computing). So a Linux/Windows split boot on a generic PC usually wins out. I used to do a lot of PC gaming, but that's really less of a factor now.
As the AnandTech review points out, you're looking at $2700ish if you opt for hardware that makes an apples-to-apples comparison... regardless of whether you go with a hackintosh, Linux, or Windows.
I code with a focus on TDD and small unit tests, so even though I do mostly statistical computing work, my development tests use small amounts of data and low computing power. When I actually need to run big production stuff, it goes off to the Linux box or a Linux server cluster.
My beef with the iMac is that it is not really that much faster than my MBPr, but it has terrible heat management. When I run big jobs on it, it gets within a few degrees of a toaster and heats up my office. A $2.5k Linux box would not have this problem and would be way faster.
Apple, your stuff is mostly nice. Fix your product warranty and maybe I'll open my wallet.
When you're spending $3k on a computer, I don't see how $250 will really affect the bottom line...
payoff * P(payoff) = EV
3000 * P = 250
(3000 * P) / 250 = 250 / 250
(3000 / 250) * P = 1
12 * P = 1
∴ P = 1/12
For the gamble to make sense, you need 8.3% of these computers to fail in years 2 and 3, each failure to avoid $3000 worth of 'damage', and the failure to be something covered under the warranty terms.
Judging by the fact that they offer the warranty, I would imagine the failure rate is somewhat lower than this, or the failures are vastly less expensive than that - otherwise they'd be losing money.
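The break-even arithmetic above reduces to one division; here's a quick sketch (the $3000 payout and $250 premium come from this thread, and real failure rates are unknown, so this is a threshold, not a prediction):

```python
def break_even_probability(payout, premium):
    """Failure probability at which the warranty's expected value
    equals its cost: payout * p = premium  =>  p = premium / payout."""
    return premium / payout

# Numbers from the thread: $3000 machine, $250 extended warranty.
p = break_even_probability(payout=3000, premium=250)
print(f"break-even failure probability: {p:.1%}")  # prints "break-even failure probability: 8.3%"
```

If you think the covered-failure rate over years 2-3 is below that threshold (or typical repairs cost far less than a full replacement), skipping the warranty is the positive-EV choice.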
Older workstations these days are a lot more affordable and can easily be upgraded. With most Macs you're stuck with what you get.
Case in point: I just purchased an HP 8400 workstation for a friend. $320 for a dual-proc 2.6GHz quad-core Xeon, 16GB RAM, two 320GB SAS drives in a RAID config, and an ATI FireGL V7350 1GB video card. Sure, it's a pig and isn't the quietest PC in the room, but it completely shreds anything I could find in a retail setting.
If this was true, it would really explain HP's financial situation /s
EDIT: that graphics card, despite being old, still sells for $300 on Amazon: http://www.amazon.com/ATI-100-505143-FireGL-512-bit-Express/...
Note that the D700's specs closely match the R9 280, 280X, or 290, cards which sell retail for roughly $349 to $449 each (the Mac Pro has two of these in the 2x D700 config). See https://en.wikipedia.org/wiki/AMD_Radeon_Rx_200_Series . The 2048 shaders would match the 280X's 2048, I think (if I'm reading the chart right).
The W7000 is the much more expensive "pro" version, which has ECC RAM on the card and much lower sales volume.
It's funny though, I remember as a kid, thinking how cool it was that you could have a backpack-able computer (the original Macintosh could be ordered with a padded backpack). Now at 11 lbs and not very large, you could almost do the same again!
He's basically dumbfounded by the current situation :-)
I haven't found any since the 10.9.1 update. They fixed the Quick Look slowness bug, which was my only complaint. Mavericks is one of the most polished OS X releases I have used, behind only Snow Leopard.
One has to wonder what all their engineers are actually working on.
Anybody running Linux on Macbooks or Mac Pro? Does it work well?
Techcrunch interview here: http://techcrunch.com/2012/04/19/an-interview-with-millenium...
*edited to add link
Which is complicated when you are trying to build hardware and realize "oh crap, the standard thermal layout leaves no room for us to actively cool <insert-x-part>, how are we gonna keep this thing from overheating?"
The way we lay out our computers today - vertical PCIe cards, CPU sockets with either the AMD snap-brace or the Intel screw-in backplate heatsink - is entirely arbitrary, but it persists because nobody wants to be the guy who throws out 10+ years of expansion-card compatibility.
My PC cost the same three years ago, but it has a 5GHz CPU, 32GB RAM, 1TB on SSDs, and 5TB on spinning HDDs.
I remember when they said people wouldn't need ADB or SCSI. Or floppies. Or serial ports. It worked out.
Is your 5 GHz CPU a Xeon with ECC? The RAM can be upgraded (those are options), the SSD can be upgraded (1 TB is an option) and it's faster than SATA, and you can attach as many disks as you want in Thunderbolt arrays. Did you have dual GPUs for that price?
It's a nice looking machine, very quiet, and very innovative. Maybe it will be a misstep, but I'm glad Apple is trying something interesting. I want to see what happens with this.
I am more inclined to say Apple doesn't want people to think they need them.
But I don't want two high-end compute cards, and I suspect that many who are trying to convince themselves that they'll benefit from it will gain no value from it.
For many, many workloads, GPU compute still represents an iffy proposition (at the price levels being talked about, the Xeon Phi would almost certainly represent a better proposition). With unified memory things might get more workable, but as is it remains a relatively fringe benefit, and it seems odd that the entire value proposition of the machine relies upon it.
In which case, do you want a Xeon workstation of any sort? As mentioned later on in the review, you make significant sacrifices for Xeon (startlingly expensive, last-gen cores), and, besides the option for more cores than you can get on an iX, the main thing you get is extra PCIe lanes, which are not actually that useful for most things; one of the few things they _are_ useful for is dual hefty GPUs.
ECC memory is a big one. A Xeon workstation usually comes with SMP, though not the new Mac Pro. Big memory support. Lots of PCIe lanes. Usually lots of space to drop in extra storage, and a couple of 10GbE ports (the Mac Pro has only 1Gbps ports, which is another oddity).
There are a lot of traditional reasons a so-called workstation features a Xeon.
Worth noting that there are a couple of Haswell (therefore AVX 2.0 supporting) Xeons -- the E3 v3s. Unfortunately they're the baby ones, so they have ridiculously low max memory, no SMP, and max out at 4 cores. Hopefully the E5 v3s are out soon.
I honestly don't get how the Mac Pro hasn't gotten more mainstream criticism. It solves a problem that I don't remember anyone ever having (honestly a garbage can form factor seems like more of a nuisance than the flexible cubes we're all used to), while bringing a ton of problems to the table, and being a massive sunk cost for fixed hardware that is going to be outdated very, very quickly.
The Xeon Phi is significantly more expensive, draws significantly more power, and is significantly less useful for creative software loads.
Which is what? It's just anecdotal, but most of the people I know with Pros got one as a high-end development workstation, building iOS apps, etc. Final Cut Pro is pretty much the only app that benefits from the dual GPUs, and even then the gain is relatively marginal over a machine four years old. And that's paying for very high-priced "workstation" GPUs (I called them compute cards because that is what they are geared for, though as with all compute cards they are derivatives of GPUs; they're really price-ineffective as GPUs), when you can get almost all of the same advantages on a basic ATI card for a couple hundred dollars.
Virtually every review of the Pro seems to be giving it a very soft glove approach for some reason. It is an enormously expensive monument to the dual compute GPU, for marginal gains in most apps.