Like I’m not one of THOSE. I know higher = better with framerates.

BUT. I’m also old. And depending on when you ask me, I’ll name The Legend of Zelda: Majora’s Mask as my favourite game of all time.

The original release of that game ran at a glorious twenty frames per second. No, not thirty. No, not even twenty-four like cinema. Twenty. And sometimes it’d choke on those too!

… And yet. It never felt bad to play. Sure, it’s better at 30FPS on the 3DS remake. Or at 60FPS in the fanmade recomp port. But the 20FPS original is still absolutely playable.

Yet like.

I was playing Fallout 4, right? And when I got to Boston it started lagging in places, because, well, it’s Fallout 4. It always lags in places. The lag felt awful, like it really messed with the gamefeel. But checking the FPS counter it was at… 45.

And I’m like – Why does THIS game, at forty-five frames a second, FEEL so much more stuttery and choked up than ye olde video games felt at twenty?

  • RobotZap10000@feddit.nl
    25 days ago

    FPS counters in games usually display an average across multiple frames. That makes the number legible when the FPS fluctuates, but if it fluctuates hard on a frame-by-frame basis, the number can be misleading. If a few frames here were output at 20 FPS, and a few there at 70 FPS, the average works out to 45 FPS. You could still very much feel each individual frame as either very slow or very fast, though, which is perceived as stutter. Your aforementioned old games were probably frame-capped at 20 while still having lots of processing headroom to spare for more intensive scenes.
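The averaging effect described above is easy to demonstrate with a few lines of Python. This is a minimal sketch with made-up numbers, not anything from Fallout 4 itself: frames alternate between a 50 ms stutter frame (20 FPS) and a fast 14 ms frame (70 FPS), and a counter that averages the per-frame instantaneous FPS happily reports 45.

```python
# Hypothetical frame times in seconds: alternating slow (20 FPS)
# and fast (70 FPS) frames, to mimic bad frame pacing.
frame_times = [1 / 20, 1 / 70] * 30  # 60 frames total

# Instantaneous FPS of each individual frame.
inst_fps = [1 / t for t in frame_times]

# Many in-game counters show something like this running average.
avg_fps = sum(inst_fps) / len(inst_fps)
print(f"averaged counter reads: {avg_fps:.0f} FPS")  # reads 45 FPS

# But the actual pacing swings between these two extremes every frame:
worst = max(frame_times)
best = min(frame_times)
print(f"worst frame: {worst * 1000:.0f} ms ({1 / worst:.0f} FPS)")
print(f"best frame:  {best * 1000:.0f} ms ({1 / best:.0f} FPS)")
```

So the counter is not lying, exactly; it is just answering a different question than "how smooth does this feel?", which depends on the worst frames, not the mean.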