Are they intended for everyday computing purposes that might require graphics acceleration, such as high end display managers?
I'd certainly love to see them enter the graphics card arena and compete with ATI/Nvidia by having phenomenal open source drivers. I'd vote for that with my wallet.
They don't even have DRAM on the chip yet - normal graphics cards use monstrously high-bandwidth memory connections (10x the bandwidth of DDR3) to stream in textures. The HD 4000 et al. just access main memory, competing with the CPU for bandwidth.
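A rough back-of-envelope, assuming dual-channel DDR3-1600 on the CPU side and a 256-bit GDDR5 bus at 6 GT/s on a midrange discrete card: 2 channels x 64 bits x 1600 MT/s ≈ 25.6 GB/s, versus 256 bits x 6 GT/s ≈ 192 GB/s. That's roughly 7-8x, and on top of that the integrated GPU has to share its 25.6 GB/s with the CPU.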
They might be unimpressive now, but improving them is a focus for Intel, and that improvement will come significantly faster than Moore's Law.
Also, it's wrong to compare them to discrete graphics cards. They're cheap and low-power, and they're what the MacBook Air uses. They replace the much inferior older Intel integrated graphics and the nVidia chipset graphics (used in the original Air).
They're now nipping at the heels of low end discrete graphics chips (especially on laptops). That's a great thing.
As a game developer, I'm excited by them. My games will run badly, but at least now they'll run on even the cheapest computers.
As a side note, John Carmack was reasonably enthusiastic about the latest generation of Intel integrated graphics at QuakeCon 2012 (you can find the whole 3.5-hour talk on YouTube).
From http://pcper.com/reviews/Editorial/John-Carmacks-QuakeCon-20...:
"Several factors have pushed iD in this direction. First off the hardware is now good enough overall for gaming. The latest Intel processors have a graphics portion that is entirely able to run games at decent resolutions and quality settings."
Personally, I started playing Counter-Strike: Global Offensive last week on my 2012 MacBook Air, and it performs well (medium settings) on the integrated Intel HD Graphics 4000.
They've always been equivalent to low-end, previous-generation parts from ATi and NVIDIA. They'd never had anything even remotely comparable to current parts until the HD 4000, and even that is a bad joke.
You're right - Intel (like nVidia's chipset graphics) gives you a GPU that's a rung below the cheapest current cards. In the bad old days it was an order of magnitude worse if you were lucky, and in the worst case wouldn't run your game at all.
Moving forward, they're giving these chips enough power to handle video playback, to play games such as League of Legends casually at low resolutions (think 1366x768, not 1080p) and medium settings, and so on - enough to satisfy 90% of user needs.
Maybe in a few years Intel will make graphics hardware that can compare in power to current Nvidia and ATI cards. Until then, these chips serve as the cheap baseline found in nearly every computer.
For some applications this is an enormous benefit. For an office computer, which doesn't require high-end 3D to start with, the HD4000 will be more than good enough.
They're meant for gaming, but not performance gaming. You can play something like Starcraft 2 on it without much trouble if you tune it down to "Low" settings. It just doesn't look anywhere near as amazing as it would on "Ultra".
Now, I might be a bit bitter because I had two different nvidia chips in two different laptops die on me a grand total of four times, all due to faults on nvidia's part, accentuated by less-than-ideal thermal design in both laptops. That led to my decision that no matter what, my next notebook (whenever/whatever that will be...) will not have a discrete GPU, but will use chipset graphics. At least there, only one thermal source has to be dealt with, and Intel will hopefully produce reasonable specs for it.
With the HD 4000, I've heard you can play some relatively recent games at lower resolutions and low detail settings.
I remember playing CS and UT years ago on a Centrino notebook with Intel graphics, and it was all I needed. Those chips can do a lot more now, so I'm looking forward to getting away from the horrible dual-graphics setup in my notebook in the future :)
This performance would have been nice if the Retina display didn't use 4x the pixels on screen. Things are just never fast enough.
A better performance/energy consumption ratio for lightweight laptops?
>It seems that performance-wise they're quite far below discrete graphics cards, so I'd guess that they're not really meant for gaming.
No, they are not. But then again, very few GPUs are actually used for gaming. Most people over 25 use their computers for other tasks (and that includes all enterprise and work computers).
>Are they intended for everyday computing purposes that might require graphics acceleration, such as high end display managers?
Most computer use today requires graphics acceleration, from browsing the web (canvas) to watching an HD movie to talking on Skype/FaceTime.
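As an aside, a page can actually ask the browser which GPU it ended up on. A minimal sketch in TypeScript using standard WebGL calls (the WEBGL_debug_renderer_info extension is real, but browsers aren't required to expose it):

    // Probe which renderer the browser handed to WebGL.
    const canvas = document.createElement('canvas');
    const gl = canvas.getContext('webgl');
    if (gl) {
      // Where permitted, this extension unmasks the real renderer string,
      // e.g. "Intel HD Graphics 4000".
      const ext = gl.getExtension('WEBGL_debug_renderer_info');
      const renderer = ext
        ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL)
        : gl.getParameter(gl.RENDERER); // often a generic, masked string
      console.log('WebGL renderer:', renderer);
    } else {
      console.log('No WebGL context; likely software rendering only.');
    }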
The question I want answered is how these chipsets compare with other offerings. The context I would like to see is the same tests run on low/mid/high-end AMD and nVidia parts using both the open-source and proprietary drivers.