Cyberpunk 2.0 is BRUTAL.

I really enjoyed the game after several patches improved a lot of things. I was partway through a second run early this year, but when I heard about the 2.0 patch overhauling just about everything, I held off.

Fast forward to last night, playing it anew once again: it is vastly different to the OG game and better in every respect. It does come with a warning, though, and it is not a joke. In an article, one of the CDPR guys suggests doing a Cinebench R23 run to see how your thermals hold up before trying it out. At first I thought it was a bit of satire, but after playing for a while last night, I can tell you they really meant it. The new update makes the game utilise up to 8 cores, which is pretty intense as you can imagine.
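If you don't have Cinebench to hand, you can approximate an all-core load with a few lines of Python. This is a crude sketch, nowhere near a real benchmark, but it's enough to see whether your temps run away when every logical core is pinned (DURATION is just an arbitrary knob I picked, not anything from CDPR):

```python
# Crude all-core load: spin a busy loop on every logical core
# for a few seconds while you watch temps in HWiNFO or similar.
# NOT a real benchmark -- just a quick thermal sanity check.
import multiprocessing as mp
import os
import time

DURATION = 5  # seconds of load; raise this for a longer soak

def burn(seconds):
    """Busy-loop for `seconds` to keep one logical core pinned at 100%."""
    end = time.monotonic() + seconds
    n = 0
    while time.monotonic() < end:
        n += 1  # meaningless work, just to keep the core busy
    return n

if __name__ == "__main__":
    cores = os.cpu_count() or 1
    workers = [mp.Process(target=burn, args=(DURATION,)) for _ in range(cores)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(f"Loaded {cores} logical cores for {DURATION}s - check your temps")
```

If temps climb fast and the clocks drop within a few seconds, a proper Cinebench run (and Cyberpunk 2.0) will likely throttle too.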

I can play Destiny 2 at QHD 120 Hz on my A15 (6800H, RTX 3070, reworked with Conductonaut Extreme and U6 Pro, sat on a GT500) and usually spike at most to 65°C on both CPU and GPU; average temps are much lower, usually around 55°C. Last night my CPU saw 80°C spikes and sat in the 70°C range, and the GPU saw 74°C. The workload is real, and that was with the game limited to 70 fps to try to limit some of the draw. Oh, and I used MSI Afterburner to undervolt and OC the GPU so that it runs cooler than stock settings.

What am I getting at, you say?
If your machine throttles in Cinebench, you may have a real hard time running Cyberpunk 2.0. If you don't have a good cooler like the GT500, it may be worth investing in one. Stock TIMs may also not be enough, depending on your machine. Before I reworked my girl's Gen 7 Legion 5 (6800H, RTX 3060), her machine could hit 80°C on the CPU and mid-70s on the GPU when playing Destiny at QHD 60 Hz on the TV. Rare spikes and all, but you see what I'm getting at: D2 is not a demanding game and it can still draw some heat. Her machine sits on a homemade cooler with 2x Arctic P12s at 12 V, too.

If more developers end up taking a similar approach and hammering 8 cores at once, laptop thermals may become an issue without devices like the GT500, and budget-friendly machines may also need liquid metal to actually run without throttling.

  • What about the gameplay changes? Is 2.0 worth it to someone who never played Cyberpunk? Is this a redemption story similar to No Man's Sky?

  • I haven't paid much attention to the thermals on my Gen 8 Slim 7 yet; I should run HWiNFO in the background next time I play Lies of P and see what happens. It's not been screaming at me, though, so I'm guessing it's alright.

    I'm all for game developers finally making proper use of multi-core architectures, even if it does push the hardware in the process. It's always felt wrong having games leverage only 2 cores out of a possible 16, depending on the CPU. Couple this with DirectStorage and load screens should be vastly sped up or even eliminated, with the same speed benefits for shader loading/pre-caching.
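    On the HWiNFO idea: it can log its sensors to a CSV file while you play, and a short script can then pull out the max and average temps afterwards. A rough sketch follows — the sensor column names vary per machine and HWiNFO version, so the names in SENSORS below are assumptions; check your own log's header row and adjust to match.

```python
# Summarise max/average temps from a HWiNFO sensor CSV log.
# The column names below are ASSUMPTIONS -- they differ per
# machine and HWiNFO version, so edit SENSORS to match the
# header row of your own log file.
import csv

SENSORS = ["CPU (Tctl/Tdie) [°C]", "GPU Temperature [°C]"]  # assumed names

def summarise(path, sensors=SENSORS):
    """Return {sensor: (max_temp, avg_temp)} for each requested column."""
    readings = {s: [] for s in sensors}
    # HWiNFO logs aren't always UTF-8; latin-1 reads any byte stream
    with open(path, newline="", encoding="latin-1") as f:
        for row in csv.DictReader(f):
            for s in sensors:
                try:
                    readings[s].append(float(row[s]))
                except (KeyError, TypeError, ValueError):
                    pass  # missing column or non-numeric footer row
    return {s: (max(v), sum(v) / len(v)) for s, v in readings.items() if v}
```

    Start HWiNFO's logging before a session, stop it afterwards, then point `summarise()` at the file to see whether anything spiked while you weren't watching the overlay.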

  • I'm SOOO looking forward to playing this, but I'm waiting for the Phantom Liberty DLC to come out next week before I try the 2.0 update, as I'm going to have to start a new character anyway, so I may as well wait for the full experience. I normally hate restarting/replaying games, but I'll let CP2077 off on this occasion; given the total revamp and the fundamental changes to the game mechanics, it only seems fair to give it a fresh start rather than respeccing and adapting an old character (although you can, it's just a me thing).

    I'll be interested to try out the new visuals. Gamers Nexus did an interesting benchmark comparing the difference the new DLSS 3.5 Ray Reconstruction makes to CP. Seems it's mostly positive, but only just: it fixes a lot of the problems of ray tracing but also introduces some new artefacts. Plus you need RT Overdrive on for RR to work, so that pretty much rules out everything below an RTX 3080 Ti, though that might just be a CP2077 thing. FPS did increase a little with RR on, but that's after your FPS has already been tanked by RT Overdrive, so it's a weird trade-off. Looks like DLSS 3.5 performance gains and losses might be heavily game dependent.

    I'm happy games are beginning to use more cores; that always felt like an under-utilisation of resources to me. I'm pretty sure desktops with semi-decent cooling will be fine, fingers crossed, but I can imagine some laptops beginning to thermal throttle earlier when pushed. There will probably be a few years yet before games fully utilising extra cores get released, so there is time for manufacturers to adapt their cooling.

  • I would say so. I played it last year when I picked it up in a sale, and despite it not being perfect, I still really enjoyed the game and story.

    So far I prefer what they have done; they have added more to the game, which is just improving on an already good game. You can have combat in cars now, and cars can have weapons. The specs you can build into your character and the cyberware are also vastly improved. If you're interested, I would say it is worth giving it a go.

  • Yeah, PT and RT Overdrive are gimmicks. At this moment in time, even top-of-the-line hardware cannot deal with them and keep things smooth. Another few generations of GPUs and we may have a good RT card to try to play it on.

  • I would say it's good that they're making an effort to have the game use more of the available performance; if you buy a good PC, you want games to use 100% of it, don't you?

    Laptops are really in a tough spot though, even new ones! CPU coolers are getting smaller and smaller, which for combined maximum workloads is going to push temps up, and the VRAM limits up until this generation, excluding edge cases (the 16 GB 3080 and the 4090), are very restrictive for game developers IMHO. Hopefully they release some banging new designs with updated RTX cards and coolers (fingers crossed AirJet is coming soon) at CES in Jan.

  • Interesting that this is the first game to properly use 8 cores. I bought a 6-core CPU thinking it would be enough for a long time, but that may have been a bit too optimistic.

  • 75°C on the Ryzen 7840HS and 64°C on the RTX 4060; that's fairly good. I didn't have the fans on the Targus Chill Mat running, either.

  • Oh God, my newest laptop is only a 4-core, so it really needs an update before taking on those kinds of games. But I like that they optimise for what your machine should be able to handle, and maybe a new driver update is in the works now that game developers are taking a step up in hardware usage.

  • That's not bad at all.

    Wish I could afford the jump to a 4080 personally. The 12 GB of VRAM would help out massively.

    Don't the newer Legion 7s use liquid metal from the factory?