I saw a meme about something called “fake frames”, but I don’t know what it refers to.

  • givesomefucks@lemmy.world · 1 day ago

    Fake frames are “frame generation”; Nvidia’s version is called DLSS.

    Rather than having the graphics card render 120 frames, you can crank the settings up to where you only get 60, then the AI “guesses” what the next frame would show, doubling it to 120 while keeping the higher settings.

    This can make things blurry because the AI may guess wrong. So every odd frame is real, every even frame is just a guess.

    Frame 1: real

    Frame 2: guess

    Frame 3: real

    If the guess for #2 is accurate, everything is cool; if #2 guessed a target moved left when it actually moved right, then #3 corrects it, and that “blink” is the problem.
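
    Here’s a rough sketch of that interleaving idea in Python. It’s only a toy: the “frames” are just lists of numbers and the “guess” is a simple average of neighbouring real frames, not NVIDIA’s actual method, but it shows how slotting generated frames between real ones roughly doubles the output rate.

    ```python
    # Toy sketch of how frame generation interleaves real and guessed frames.
    # "Frames" here are just lists of brightness values; the guessed frame is
    # a plain average of its neighbours. This only illustrates the idea of
    # inserting generated frames between real ones, not NVIDIA's algorithm.

    def render(t):
        # Stand-in for the expensive part: the game rendering a real frame.
        return [t * 10 + x for x in range(4)]

    def generate_between(a, b):
        # Stand-in for the cheap AI step: guess the frame between two real ones.
        return [(x + y) / 2 for x, y in zip(a, b)]

    real_frames = [render(t) for t in range(3)]      # e.g. the 60 "real" FPS

    output = []
    for prev, curr in zip(real_frames, real_frames[1:]):
        output.append(prev)                          # odd frame: real
        output.append(generate_between(prev, curr))  # even frame: generated
    output.append(real_frames[-1])

    print(len(real_frames), "real frames ->", len(output), "displayed frames")
    ```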

    The bigger issue is developers relying on that tech so they don’t have to optimize code. So rather than DLSS being an extra oomph, it’s going to be required for “acceptable” performance.

    • Coelacanth@feddit.nu · 1 day ago

      Not to be nitpicky but DLSS is a different technology than frame generation, though it also involves AI guessing - just in a different way. DLSS (Deep Learning Super Sampling) means rendering the game at a lower resolution than your screen’s output, then having it upscaled to the correct resolution via AI. This is much more performance friendly than native rendering and can often lead to a better looking visual end product than turning graphics features off and rendering natively - though it will depend on the game, genre and personal preference.
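
      To put some rough numbers on the upscaling part: the GPU only shades the internal resolution and the upscaler fills in the rest. A quick back-of-the-envelope sketch (the preset scale factors below are the commonly cited approximate ones, not exact figures):

      ```python
      # Back-of-the-envelope: how much shading work each DLSS-style preset saves
      # at a 4K output. Scale factors are approximate, commonly cited values.

      output_w, output_h = 3840, 2160   # 4K target

      presets = {
          "Native":            1.0,
          "Quality (~67%)":    0.667,
          "Balanced (~58%)":   0.58,
          "Performance (50%)": 0.5,
      }

      for name, scale in presets.items():
          w, h = int(output_w * scale), int(output_h * scale)
          pixels = w * h
          print(f"{name:<20} renders {w}x{h} = {pixels / 1e6:.1f} M pixels "
                f"({pixels / (output_w * output_h):.0%} of native shading work)")
      ```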

      Frame generation is as you described. Worth noting is that DLSS without frame generation doesn’t suffer issues like artifacts and input lag in the same manner as FG turned on. Frame generation also works better the higher your base frame rate is, so it’s a bit of a “win-more”. Using FG to go from 30 to 60 FPS will feel much worse than using it to go from 60 to 120.

      The fake frames memes I believe stem from the updated frame generation technology in the 50 series guessing three frames at a time instead of one. So in effect you’ll end up with a majority of the frames you see being “fake”.
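
      Some quick numbers on why it’s a “win-more” and where the “mostly fake frames” meme comes from (assuming 2x means one generated frame per real frame, and 4x means three, as with the 50 series multi frame generation):

      ```python
      # The game still only samples input once per *real* frame, so the base
      # frame rate sets how responsive it feels, no matter how many frames are
      # shown. 2x = one generated frame per real frame, 4x = three per real.

      def summarize(base_fps, multiplier):
          displayed_fps = base_fps * multiplier
          input_interval_ms = 1000 / base_fps         # input read on real frames only
          fake_share = (multiplier - 1) / multiplier  # share of shown frames generated
          print(f"base {base_fps:>3} FPS x{multiplier}: {displayed_fps:>3} FPS shown, "
                f"~{input_interval_ms:.1f} ms between real frames, "
                f"{fake_share:.0%} of shown frames generated")

      for base_fps in (30, 60):
          for multiplier in (2, 4):
              summarize(base_fps, multiplier)
      ```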

      • NekuSoul@lemmy.nekusoul.de · 1 day ago

        On the other hand, NVIDIA started consolidating all of these technologies under the NVIDIA DLSS suite a few months ago, for some reason.

        So it’s DLSS Super Resolution, DLSS Frame Generation, DLSS Ray Reconstruction and so on, with the exception of DLAA. Probably because that one would get too stupid even for them.

      • givesomefucks@lemmy.world · 1 day ago

        DLSS is a different technology than frame generation

        Thanks! Got them mixed up

        Three fake frames instead of just one I hadn’t heard about either. I already leave it off on my 4070 Super because 1:1 is already bad enough.

    • stankmut@lemmy.world · 1 day ago

      To add on to this, the 5000 series now generates 3 fake frames per real frame instead of just 1.

          • NewNewAccount@lemmy.world · 22 hours ago

            Yeah not sure if there’s a better word to use without coming across as pedantic.

            Fake certainly implies these are worse (which they of course are), but I’m not sure if they’re that much worse. I think in many scenarios the proverbial juice would absolutely be worth the squeeze, but naysayers seem to disagree with that sentiment.

    • ch00f@lemmy.world · 1 day ago

      Can someone explain how AI can generate a frame faster than the conventional method?

      • MrPoopbutt@lemmy.world · 24 hours ago

        It’s image processing with statistics rather than traditional rendering. It’s a completely separate process. Also, NVIDIA GPUs (and the new upcoming AMD ones too) have hardware built into the chip specifically for this.
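
        A toy way to see the cost difference (purely a made-up cost model, not NVIDIA’s algorithm): a real frame re-runs the whole rendering pipeline for every pixel, while a generated frame mostly shifts and blends pixels that already exist.

        ```python
        # Made-up cost model only: a real frame pays full shading cost per pixel,
        # a generated frame pays a small "look up, shift, blend" cost per pixel.

        def render_real_frame(width, height, shading_cost_per_pixel=200):
            return width * height * shading_cost_per_pixel

        def generate_frame(width, height, warp_cost_per_pixel=5):
            return width * height * warp_cost_per_pixel

        w, h = 3840, 2160
        real_cost = render_real_frame(w, h)
        generated_cost = generate_frame(w, h)
        print(f"generated frame ~{real_cost / generated_cost:.0f}x cheaper in this toy model")
        ```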

        • ch00f@lemmy.world · 1 day ago

          Which part? I mean even if it isn’t generating the frames well, it’s still doing the work. So that capability is there. What’s the grift?

          • RubberElectrons@lemmy.world · 23 hours ago

            That it’s reliable. The key point they’re selling is that devs don’t need to optimize their engines as much, of course obfuscated under a lot of other value-adds.

            I’d go further than this and say part of our problem is that optimization of code generally isn’t a focus anymore. Apps which merely interface with web APIs are sometimes more than 90 MB. That’s embarrassing.

            That an AI can step in as a savior for poor coding practices is really a bandage stuck over the root cause.

  • BougieBirdie@lemmy.blahaj.zone · 1 day ago

    I saw a graphic the other day comparing the number of frames generated between the 40 and 50 series, and people in the comments were saying that the 50 series uses AI frame generation to speed things up.

    People in the know would know that AI is largely hype, and the generated frames probably don’t look as good as if they had been properly rendered

    • MDCCCLV@lemmy.ca · 23 hours ago

      Yeah, but if you have a high refresh rate monitor and you want 4K at 240 Hz, then you probably need this.
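
      Rough arithmetic on what that target asks of the GPU (just pixel counts, nothing vendor-specific):

      ```python
      # 4K at 240 Hz rendered natively vs. rendering 60 real frames per second
      # and letting frame generation fill in the rest.

      w, h, hz = 3840, 2160, 240
      native_pixels_per_s = w * h * hz
      print(f"native 4K@240: {native_pixels_per_s / 1e9:.1f} billion shaded pixels/s")

      base_fps = 60                       # real frames; the rest are generated
      rendered_pixels_per_s = w * h * base_fps
      print(f"60 real FPS + frame gen to 240: {rendered_pixels_per_s / 1e9:.1f} "
            f"billion shaded pixels/s "
            f"({rendered_pixels_per_s / native_pixels_per_s:.0%} of the native workload)")
      ```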