Leaked specs show that Nvidia’s GTX 1080 Ti has the Titan X factor

Want to know how powerful Nvidia’s GeForce GTX 1080 Ti will be? Well, handily enough the spec of the juiced-up 1080 has been leaked.

According to the spilled details, spotted by OC3D thanks to a denizen of its forums, the GTX 1080 Ti will allegedly run with a base clock of 1,503MHz (boosting to 1,623MHz) and 3,328 CUDA cores – close to the Titan X's 3,584 cores, and well above the vanilla 1080's 2,560.

The card will also boast 12GB of GDDR5 memory, equaling the Titan X's capacity – though that's slower memory than the GDDR5X used on the Titan, and indeed on the plain GTX 1080 – on a 384-bit memory bus. That works out to 384GB/s of memory bandwidth, still considerably chunkier than the 1080's 320GB/s despite the latter's faster GDDR5X video RAM.
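The quoted bandwidth figures follow from bus width and per-pin data rate. A rough sketch of that arithmetic, assuming the commonly cited effective rates of 8Gbps for this GDDR5 configuration and 10Gbps for the GTX 1080's GDDR5X (the leak itself doesn't state the memory clock):

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers/s."""
    return bus_width_bits / 8 * data_rate_gbps

# Leaked 1080 Ti: 384-bit bus, GDDR5 at an assumed 8 Gbps effective
print(bandwidth_gbs(384, 8))   # 384.0 GB/s, matching the leak

# GTX 1080: 256-bit bus, GDDR5X at 10 Gbps effective
print(bandwidth_gbs(256, 10))  # 320.0 GB/s
```

This is why a wider bus lets plain GDDR5 out-run the 1080's GDDR5X overall, a point several commenters below also make.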

TDP will run to 250W, just like the… yes, you guessed it, Titan X, sucking up considerably more power than the 180W demands of the GTX 1080.

Titanic performance

The 1080 Ti offers compute power of 10.8 TFLOPS, a sliver away from the Titan X's 11 TFLOPS. The major differences between these two cards appear to be the slower video memory of the Ti, which will probably be the main disappointment for most high-end card addicts, and the 256-core drop in CUDA count.
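Those TFLOPS figures line up with the usual back-of-the-envelope estimate of cores × 2 FP32 ops per clock (one fused multiply-add) × boost clock. A quick check using the leaked 1080 Ti numbers and the Titan X's published 1,531MHz boost clock:

```python
def fp32_tflops(cuda_cores, boost_clock_mhz):
    """Peak FP32 throughput estimate: cores x 2 ops/clock (FMA) x clock."""
    return cuda_cores * 2 * boost_clock_mhz / 1e6

print(round(fp32_tflops(3328, 1623), 1))  # 10.8 - the leaked 1080 Ti figure
print(round(fp32_tflops(3584, 1531), 1))  # 11.0 - the Titan X
```

The leaked spec is at least internally consistent: the quoted clocks and core count reproduce the quoted compute figure.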

Some folks are pointing to the supposed use of GDDR5 memory as an avenue to discredit this leak, arguing that it doesn’t seem likely Nvidia would drop down to this after using GDDR5X with the vanilla GTX 1080.

And that’s a fair observation, but obviously cuts have to be made somewhere, and indeed this card – if these purported specs are on the money – is already looking a very close runner to the Titan X. Something has to give somewhere – but as ever, we’ll just have to wait and see when the final spec is revealed.

The current Pascal-based Titan X is 60% faster than the previous-gen incarnation, and back when it was revealed we called it irresponsibly overpowered.

Source: techradar.com





12 Comments
  1. Reply Luz Bauch IV September 15, 2016 at 2:15 pm

    It makes little sense that the power envelope of 250W would be the same considering the differences in core count, memory and other stuff. I'd say 200-220W would be the actual power envelope under those changes.

    So I'll take this with a grain of salt. Seems more like someone trying to click bait into getting views with speculation rather than actual leaked information.

    It doesn't make sense that they would use GDDR5 rather than GDDR5X, especially considering it will probably be in the $800 to $1000 range.

    The benefits of GDDR5X are kind of what Pascal needs to shine, because regardless of total bandwidth, it's what the memory can do in X milliseconds that matters most. And considering the extra memory size along with the core performance, you really need GDDR5X to handle that.

    I'll even go as far as to say it's a bad Photoshop of the spec sheet; the numbers aren't even lined up properly. Where it says core count 3328, the 28 is noticeably lower. It looks like they cut and pasted the numbers from other parts of the spec sheet.

  2. Reply Jordane Hamill September 15, 2016 at 4:38 pm

    No, that doesn't explain the TDP, especially compared to the Titan X. You can't have two cards use the same amount of RAM, and have the card with a noticeably lower core count use the same amount of power. The differences between GDDR5 and GDDR5X simply wouldn't explain that.

  3. Reply Ashleigh Schmeler September 15, 2016 at 4:40 pm

    Man, it sounds bad, but I'm kind of glad to see these cuts made in the 1080 Ti (less memory and fewer CUDA cores). I'm glad because I just bought two Titan X Pascals. I don't care about the money, but I want to know that my graphics cards are the best on the market.

  4. Reply Dr. Hulda Jakubowski September 15, 2016 at 6:50 pm

    I could see GDDR5. It's on a wider bus, so it has higher memory bandwidth than the GTX 1080. That is really what is important.

  5. Reply Thaddeus Ankunding September 15, 2016 at 7:42 pm

    I think AMD gave up on being superior, at least for now. Even if AMD comes out on top, Volta is rumored to launch at the end of 2017 with HBM and async compute.

  6. Reply Jennie Mills September 15, 2016 at 10:00 pm

    And you would be right to be surprised. Vega is already promising 16GB HBM2 memory, and compare that to 12GB GDDR5? Along with all of the AMD benefits (Async compute, better DX12 support, better 4K, etc.).

    And of course the bang-for-buck factor favors AMD dearly. Add a bunch of FreeSync monitors (I think they're at about 100 right now?), and they could steal the market right under NVIDIA's nose.

    But… NVIDIA is known for their incredibly formidable marketing tactics (they even worked on me with my Titan X Pascals), so I doubt that they will let AMD have a superior product for long.

  7. Reply Mrs. Reba West III September 15, 2016 at 11:51 pm

    I would be surprised if they release this before vega.

  8. Reply Dr. Marlee Wilderman Sr. September 16, 2016 at 4:05 am

    /shrug
    Why does Intel have a bunch of different CPUs with the same TDP?

  9. Reply Dr. Griffin Hegmann September 16, 2016 at 4:53 am

    History says you're right

  10. Reply Owen Marquardt September 16, 2016 at 5:48 am

    The TDP is probably like that due to the binning process. Even though it's GDDR5, it still has higher bandwidth than the 1080 due to bus width.

  11. Reply Delphia Sauer September 16, 2016 at 7:07 am

    Whilst I agree with everything you said to a degree, the misalignment in the last two digits of 3328 can only be seen from the distance it's set at; once you zoom in, it's perfectly fine 😀

  12. Reply Dr. Deondre Brekke September 16, 2016 at 7:20 am

    Is it really a noticeable difference? It's got 93% of the core count.
