Opinion: Frame Generation is bad for gaming.

I know a lot of you may disagree here, but the way I see it, FG is pointless.

Mostly because you actually need a good base level of performance for it to be worthwhile; otherwise you introduce a lot of latency, since the generated "fake" frames don't have an input window of their own. Nvidia muppets are trying to push the idea that FG is the future and raw rasterisation is pointless. Pretty much their literal words. Then comes the other side of it: devs are so *** lazy these days that a lot of games launch as an unoptimized sack of frogs, with terrible FPS and a whole host of other issues. Yet the go-to response, Todd Howard style, is blatant denial of fault: you're told you need a better PC, or you get a rushed implementation of DLSS 3 and now, likely, FSR 3.
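
To put a rough number on that "no input window" point, here's a tiny back-of-the-envelope sketch in Python (my own simplified model with made-up overhead figures, not anything Nvidia publishes): interpolation has to hold back a real frame before it can show anything, so the motion gets smoother but the latency stays pinned to the real frame rate.

    # Simplified model of frame generation: displayed FPS doubles, but input
    # latency is still set by the real frames. The one-extra-real-frame of
    # buffering and the 3 ms generation overhead are assumptions, not measurements.

    def fg_estimate(base_fps: float, fg_multiplier: int = 2, gen_overhead_ms: float = 3.0):
        base_frame_ms = 1000.0 / base_fps          # time per real frame
        displayed_fps = base_fps * fg_multiplier   # generated frames add smoothness only
        # Roughly one held-back real frame plus the current one, plus generation time
        latency_ms = 2 * base_frame_ms + gen_overhead_ms
        return displayed_fps, latency_ms

    for base in (30, 60, 120):
        shown, lag = fg_estimate(base)
        print(f"{base:>3} real fps -> ~{shown:.0f} fps shown, ~{lag:.0f} ms input-to-photon delay")

The exact figures don't matter; the point is that 30 real fps dressed up as "60" still feels like 30 fps under the mouse, which is exactly the low-end situation FG gets marketed at.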

So you don't get the performance the hardware you have should deliver, because they can't be bothered, but you need newer hardware to support FG, which makes the overall experience worse. Then there's the "get a better PC" remark. Yes, spend 1,600 quid on a GPU alone to get 60fps at 1080p because the game is just trash.

Even as this tech matures, I do not see it as anything other than a crutch for poor optimization and a cover for dev laziness. I would sooner have a game's launch delayed a few months so it actually runs properly than need better hardware and still have to lean on these AI techs.

Not every game launches a mess, but a lot of them do these days, and it's a terrible trend.

  • Well worded, I completely agree. Intel's XeSS, by the way, can be leveraged by NVIDIA hardware, and I recently saw it outperform DLSS on one stream, with an RTX 4070 desktop GPU being used to play Assassin's Creed Mirage.

  • I think games that are released poorly optimized would be released poorly optimized regardless of whether frame gen was a thing or not; it's not about laziness, it's cost cutting. It would be great if all devs took the time to make games run well, but unfortunately there are economics to consider. If it costs them an extra million dollars in dev time, or they have contracts with publishers that say the game must be ready by x date, it's easy to see why optimization is one of the first things they cut and make excuses for later; the person putting the project together looks at the big picture and makes a business decision. Sucks for us, because PC is often the platform looked at the least due to its complexity: no standard hardware, plus people expect higher fidelity than consoles, which often leads to bad results on older hardware (but that's been the case since my first ever GPU, the Radeon 9200 Pro lol).

    Anyway, I think both upscaling and frame generation will be the norm going forward. Next up will probably be a way to decouple rendering from mouse input; there are already demos of this kind of thing, and I'd bet companies are working on it, which would solve the input lag issues:

    https://www.youtube.com/watch?v=f8piCZz0p-Y

    Eventually I think there will be a complete package of technologies that essentially recreates the entire experience of the game. I think that's what we're moving towards, but it's still early days and the pieces aren't all together yet. I like the idea in theory because it's more efficient than brute-forcing graphics, but time will tell how it plays out.

  • I don't think it's useless for people with high-refresh monitors who play less demanding games and who don't prioritize the lowest possible input lag. Refresh rates are getting higher and more affordable. If you have a 240, 360 or 540 Hz monitor, then FG gives you roughly the input lag of a 120, 180 or 270 Hz monitor, which is pretty good.

    The issue is that this is a very small niche of gamers. Many gamers want high-quality graphics and only lower settings until they reach an acceptable FPS range; using FG to get there will usually mean the input lag is quite high. Gamers also typically choose 1440p or 4K over maximizing their frame rates.

    And as you note, game developers don't tend to target 240, 360 or 540 Hz gaming on somewhat affordable hardware, or even on any currently available hardware for that matter. So the way to actually benefit from FG is to go into hibernation and then play the current FG-enabled games in the future, on far faster hardware. But most people want to play current games.

    PS. You can vote with your wallet by simply not buying these poorly running games right now and going back to older games or those that do run well. Money talks.

  • I also think we must understand that game devs have gotten used to a decent price/performance improvement in GPUs every 2 years, so they plan each new game around having that extra power to use. With price/performance suddenly stagnating, they got into trouble, because they can't exactly go to their bosses and say: "we thought we could optimize it properly for mainstream GPUs in X time, but now doing that would take twice as long." Time is money, after all.

    So FG is used as a crutch. In fact, Nvidia is intentionally pretending that the crutch is the same as faster hardware.

    Ultimately, the market decides, though. If people simply refuse to turn on FG, return games that don't run well on their hardware, or leave scathing reviews, then FG will die. If people actually use it, then it will succeed.

    Just like you encourage Nvidia's behavior if you buy this generation of cards rather than going for AMD or waiting until they correct course.

  • AMD are pulling a similar stunt with FSR 3 and Fluid Motion, so the better option is to skip this generation completely. Intel are supposedly releasing a new gen of Arc cards too, which are said to be much more potent than the current Arc line-up. But even Intel have XeSS, which is a similar crutch to DLSS and FSR.

  • Tbh those monitor specs are ridiculous and, in my personal opinion, a waste.

    I have a 240Hz G7, but I run it at 120Hz. Firstly, since my main laptop died, a 3060 machine ain't doing QHD at 240Hz; secondly, unless you're obsessed with FPS, anything over 240Hz is overkill for 99% of people.

    As for games, I do agree. That said, I shouldn't have to wait 12 months after a game is released for it to be polished enough to be worth playing. It shouldn't be a case of being a glorified beta tester, and there is no justification for that. If you applied the same logic to any other product it would be called out immediately. Granted, it's not a safety concern, but letting them release broken games and get away with "you need a better machine" is plain wrong.

  • RTX in some ways forced that change by increasing the amount of power (graphical and electrical) required for games to be playable.

    Even though we've reached the point where 4K 60fps is possible, RTX is the reason it's still not fully achievable (unless you own the most expensive GPU).

    I agree that the gap between development and release is getting thinner.

    Unfortunately, poor optimisation and bugs are a direct consequence of tight deadlines and the pressure demanded by the market.

  • It's an interesting argument you've made, and I agree: if devs use it without improving things or doing it the right way, then we will see a definite decline in quality. The segmentation of the market, with top-end cards costing quite a lot and everyone else on mid-range to budget GPUs, will only exacerbate the issue further.

  • In reply to AR:

    PC gaming and lazy devs are making the ecosystem pretty toxic. I hope more devs actually make the effort to clean up their act, and I hope GPU OEMs stop being so *** greedy too.

  • This Nvidia gen is terrible regardless of any new features, good or bad. IMO it should fail spectacularly just because of the price-gouging stunt they've pulled, but I don't decide the market haha.

    It's kinda normal to have these issues during a transition period where new consoles get released and game requirements jump up, though it does feel worse than expected.

    Anyway, I don't think which GPU vendor you choose matters; this is where the industry is going, and Nvidia is just first to market because they lead the industry.