Makes it great for selling old laptops for the same price as the new one.
Bravo Nvidia LOL
I had to check whether my Hacker News app had simply failed to update its post list since April Fools.
"Here's the new iPad. It's newer than the old one, but good luck trying to tell the difference. So to clear things up, we've named it the "iPad". You can thank us later.".
https://en.wikipedia.org/wiki/GeForce_10_series#GeForce_10_....
edit: The Tesla K20 is also in competition in my view (despite the much higher cost) due to its focus on higher double-precision performance.
General thoughts: Don't expect to get _any_ information out of Nvidia unless you are running everything on their hardware compatibility lists (i.e. a certified server case). Do not mix and match consumer-rated gear with 'professional' gear: if you put a K80 in a system with a GTX 1080, the Nvidia drivers restrict the number of available processing cores to 2 per device.
Air-flow: The Teslas run HOT even with a blower attached and/or installed in the recommended case.
NVENC: the Pascal-based cards are dramatically faster AND higher quality than the Kepler-based cards.
For anybody else doing video encoding work: grab an Nvidia TK1/Jetson dev kit. This little board is a MONSTER and can handle everything we throw at it without breaking a sweat.
Huh? Not sure what exactly you mean by "number of processing cores"?
I use two development boxes on a regular basis with Teslas side-by-side with GeForce cards and they all work just fine.
>We have reached out to Nvidia for a statement about compatibility down the line with lesser 10-series cards, and I’m happy to report that Nvidia states that all Pascal-based GPUs will be Mac-enabled via upcoming drivers. This means that you will be able to use a GTX 1080, for instance, on a Mac system via an eGPU setup, or with a Hackintosh build.
https://9to5mac.com/2017/04/06/nvidia-titan-xp-beta-pascal-d...
This is one of my biggest feature requests for Apple. I want a tiny little laptop with an integrated GPU when I'm on the road. But when I'm home I also want to be able to run simulations, play games on a big screen & do VR. And for that I want a desktop class GPU when I'm at home. And I want that GPU to be upgradable - CPU speed isn't improving anywhere near as fast as GPU speed, so it makes sense to keep the rest of my system across multiple GPU generations.
The laptops are already there. The RAM fiasco aside, the current laptops are fine little machines, and with Thunderbolt 3 they should have no problem supporting external GPUs.
All that's missing is an official Apple eGPU enclosure and software support! People on the internet have already gotten them working by injecting kexts into the kernel, but first-party support would make the whole thing way better, and way more stable. C'mon Apple! We're so close! Take my money!
*Still rocking a Mid 2010 Mac Pro since the 2013 model was underwhelming and nothing has been released since.
If your workload can be split efficiently between multiple cards, then a $1400 pair of 1080 Tis will vastly outperform a $1200 Titan Xp: ~17% more money for nearly double the throughput.
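The arithmetic behind that comparison, as a quick sanity check (the prices are from the comment; the near-2x scaling figure is the commenter's claim, so the 95% efficiency below is an illustrative assumption, not a benchmark):

```python
# Rough cost/throughput comparison from the comment above.
# Prices and scaling efficiency are assumptions, not measured data.
pair_1080ti_price = 2 * 700      # $1400 for two 1080 Tis
titan_xp_price = 1200            # $1200 for one Titan Xp

price_premium = (pair_1080ti_price - titan_xp_price) / titan_xp_price
print(f"price premium: {price_premium:.0%}")   # ~17% more money

# If the workload scales near-linearly across two cards:
relative_throughput = 2 * 0.95   # assume ~95% multi-GPU scaling efficiency
perf_per_dollar = relative_throughput / (pair_1080ti_price / titan_xp_price)
print(f"throughput per dollar vs Titan Xp: {perf_per_dollar:.2f}x")
```

So even with imperfect scaling, the pair wins on throughput per dollar by a wide margin.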
I was more curious if Titans had some lower precision data type or better dataset packing than vanilla Pascals, or something similar that would help with ML.
In case you are just checking on your comment thread, do check the other threads, as there are interesting performance comparisons being discussed.
I was about to ask, "What do we expect from GPU companies in the next ten years? Why and how will they dominate computing, or innovate in ways we care about?"
You answered my question. I think now is the time to invest in Nvidia and any other GPU manufacturer, as the ML/DL/AI field is poised to explode computing over the next 15 years. (15 years happens much faster than you might think.)
That said, NVDA's stock just dropped 7% due to an analyst's downgrade (which in the long run is relatively meaningless), so if you want to buy NVDA stock and hold it long term, now might be a good time.
Disclosure: I am long NVDA, and my stock picking track record is atrocious.
The stock price reflects the market's overall expectation of all future cash flows discounted to the present value - all the way out to infinity.
When you buy a stock, you are implicitly betting that reality will exceed those expectations.
So, you just said that massive growth is priced in, and the fact you hold the stock implies that you are betting on even more massive growth - relative to expectations.
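The "expectations are priced in" point is just discounted cash flow. A minimal sketch of the idea, with entirely made-up numbers (the growth rates, discount rate, and cash flows are illustrative, not NVDA figures):

```python
# Toy discounted-cash-flow model: a price today reflects the sum of
# expected future cash flows, each discounted back to present value.
# All figures below are illustrative, not real NVDA data.
def present_value(cash_flows, discount_rate):
    """Discount a list of yearly cash flows to today's value."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

# Two scenarios with the same starting cash flow but different growth:
modest = [100 * 1.05 ** t for t in range(10)]   # 5% annual growth
massive = [100 * 1.25 ** t for t in range(10)]  # 25% growth "priced in"

print(present_value(modest, 0.08))
print(present_value(massive, 0.08))
# A price set by the "massive" scenario only pays off for the buyer
# if reality beats that 25% growth assumption.
```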
I think brain-space is the safer investment if we're not talking about getting rich from the stock, but about enriching the cyber-sphere with what can be developed in the ML/DL/AI space...
https://www.newegg.com/Product/Product.aspx?Item=N82E1681413...
If you're doing memory intensive stuff, NVIDIA wants you to spend a whole lot more.
Comparable on paper perhaps, but Nvidia architectures tend to get more actual gaming performance per FLOP than AMD architectures do.
E.g., the AMD RX 480 (5.8 TFLOPS) is ~32% faster than the NV GTX 1060 (4.4 TFLOPS) on paper, but in practice they perform more or less the same.
Even in titles where AMD performs especially well the advantage is around 10-15% in favor of the RX480 - still less than the specs would suggest.
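Checking those numbers quickly (TFLOPS figures from the comment; "roughly equal" real-world performance is normalized to 1.0 for both cards, which is the comment's claim, not a benchmark):

```python
# Paper FLOPS vs delivered gaming performance, per the comment above.
rx480_tflops, gtx1060_tflops = 5.8, 4.4

paper_advantage = rx480_tflops / gtx1060_tflops - 1
print(f"RX 480 paper advantage: {paper_advantage:.0%}")   # ~32%

# If real-world frame rates are roughly equal (call both 1.0),
# then efficiency = delivered performance per TFLOP:
rx480_eff = 1.0 / rx480_tflops
gtx1060_eff = 1.0 / gtx1060_tflops
print(f"NV perf-per-TFLOP edge: {gtx1060_eff / rx480_eff - 1:.0%}")  # ~32%
```

In other words, the entire paper advantage is eaten by the per-FLOP efficiency gap.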
Of course, since my foremost interest is computation, NVIDIA it is. But if you just want to game, AMD gives better bang for the buck.
This is a very good explanation/speculation. It deals with NV's DX11 driver optimization, where they break up draw calls between threads; they can do this because their scheduler is software-based, while AMD's is hardware-based and can't do the same. In DX12 this isn't needed, so AMD's hardware scheduler can be better utilized.
I mean, it's certainly better than the Titan X (Maxwell), which could only address less than half its memory while running at 60Hz.
It just seems like an effort to inflate the price of the product without adding much value.
You're in the game and look at a house; then you turn around and look at a tree, so you need the geometry and texture of the tree but no longer of the house. Then you look down and a chicken walks into frame, so you now need that; you kill the chicken and suddenly need the dying-chicken animation, etc.
You almost never need all the data in a single frame. That would be way too much work for the render pipeline anyway.
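The idea above (load only what's visible this frame, evict what isn't) is the core of asset streaming. A minimal sketch, where `load_asset` is a hypothetical stand-in for the real work of a disk read plus GPU upload:

```python
# Minimal visibility-driven asset cache: only assets needed by the
# current frame stay resident; everything else is evicted.
def load_asset(name):
    """Hypothetical loader standing in for disk read + GPU upload."""
    return f"<data:{name}>"

class AssetCache:
    def __init__(self):
        self.resident = {}    # name -> loaded data

    def update(self, visible):
        """Sync residency with the set of assets visible this frame."""
        for name in set(self.resident) - visible:
            del self.resident[name]                  # left the frustum: evict
        for name in visible - set(self.resident):
            self.resident[name] = load_asset(name)   # entered the frame: load

cache = AssetCache()
cache.update({"house"})             # looking at the house
cache.update({"tree"})              # turned around: house evicted, tree loaded
cache.update({"tree", "chicken"})   # chicken walks into frame
print(sorted(cache.resident))       # ['chicken', 'tree']
```

Real engines add prefetching and mip/LOD levels on top, but the residency logic is the same shape.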
(I'll believe the new Mac Pro when I see it, but it's likely to have AMD cards.)
http://www.nvidia.co.uk/download/driverResults.aspx/117771/e...
I have the original Titan X Pascal, and TensorFlow works great with some old driver version; I've even forgotten which.
The GTX 1060 is also a good card if you're looking to game at 1080p (which most people still are). It can be found for around $200.
You might be able to find some 2013 Titans for $400.