What Does Hz Mean In Gaming? (Revealed!)


Hz stands for “Hertz”, and refers to the refresh rate of your PC’s monitor, TV, or other display. This means how many times per second your monitor is able to generate a new image. High refresh rate monitors, at 144Hz, for example, refresh the image 144 times every second.

There is some confusion about the difference between Hertz and your system’s frame rate.

Hz concerns your monitor itself, and how often it’s able to refresh the image.

The frame rate, on the other hand, is about how many frames your graphics card can send to the monitor in the first place.

Let’s find out more.

What Does Hz Mean In Gaming?

What does Hz stand for in gaming?

Hz stands for Hertz in gaming.

This is the rate at which your monitor or display is able to refresh a new image onto the screen per second.

When you play a video game, the image is constantly refreshing to update based on your movement, the movement of any element in the background, and basically any change whatsoever.

That needs to be refreshed constantly to keep up the illusion of a moving image.

Your Hz rate is a number, and can vary quite a lot.

Modern Hz rates on computer monitors generally range from 60 to 144Hz, though they can go even higher.

It’s pretty uncommon for any modern display to go anywhere below 60. Modern TVs, for example, usually have a refresh rate of 60 or 120Hz.

60 is the industry standard now for just about any display.

Obviously, the same basic principle applies no matter what is being displayed on the PC.

The image must constantly change in order to present a moving picture.

However, for games, this is far more important as there is a great deal more change to keep up with.

It’s very important to understand the distinction between your monitor’s refresh rate and your system’s overall frames per second, or FPS.

Your FPS is the maximum number of frames per second that your graphics card, whether from your gaming PC or your home console, can send to the monitor to begin with.

But if the monitor’s refresh rate is not high enough, not all of those frames can be displayed; the extra frames are dropped, or show up as screen tearing.

At the same time, if you have a 144Hz monitor but your graphics card cannot put out more than 60 FPS, then 60 frames per second is the most you will see displayed.
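The relationship above can be sketched in a couple of lines of Python (a simple illustration of the principle, not how any driver actually works): the rate you actually see is roughly the lower of the two numbers.

```python
def displayed_fps(gpu_fps: float, monitor_hz: float) -> float:
    """The frames you actually see are capped by whichever is lower:
    what the graphics card renders, or what the monitor can refresh."""
    return min(gpu_fps, monitor_hz)

# A 144Hz monitor fed only 60 FPS still shows just 60 new images a second:
print(displayed_fps(gpu_fps=60, monitor_hz=144))   # 60
# And a card pushing 200 FPS into a 60Hz monitor is capped at 60:
print(displayed_fps(gpu_fps=200, monitor_hz=60))   # 60
```

In other words, upgrading either component only helps up to the limit set by the other one.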

Where do we get this unit of measurement?

 

Where does hz come from?

As you might have guessed, the abbreviation Hz comes from the surname of a figure from history.

Heinrich Rudolf Hertz lived in the 19th Century, and was the first person who proved, conclusively, that electromagnetic waves existed.

The unit was named in the early 20th Century in honor of the scientist whose work allowed technologies like cinema and TV to be built.

Though it all naturally relies on very different technology now than it did a century ago, the basic principle remains the same.

Moving images are made up of countless still images, and the quality of the picture is going to depend on the number being refreshed per second.

Now, in film and television, a higher refresh rate isn’t automatically better—but we can, by and large, say it is in video games.

So what’s the best Hz for gaming?

 

What is the best Hz for gaming?

It does depend a little bit on you.

Generally speaking, modern video games are designed to run at 60FPS, so you would want a monitor with at least a 60Hz refresh rate.

However, as I said, monitors below that refresh rate are exceedingly rare nowadays. 60 is the industry standard.

With that in mind, most people find 60Hz to be perfectly adequate for any kind of gaming.

For casual, single-player gaming, 60Hz does the job just fine. It looks buttery smooth, performs well, and allows you to play the game as it was intended.

It’s important to note that drops in frame rate can have a serious adverse effect on your ability to play the game properly.

But as long as your system maintains a steady 60 frames per second, you shouldn’t have any issues.

That said, for some of the more serious gamers, 144Hz gives a real advantage over the competition, because motion appears smoother and new information reaches the screen sooner.

But it’s all about your personal preference—the average person, with the naked eye, really cannot tell the difference between 60Hz and 120 or even 144Hz.
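One concrete way to compare those numbers is frame time: how long the screen waits between new images. A quick sketch (the helper name here is just for illustration):

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Milliseconds between screen refreshes at a given refresh rate."""
    return 1000.0 / refresh_hz

print(round(frame_time_ms(60), 1))    # 16.7 ms between frames at 60Hz
print(round(frame_time_ms(144), 1))   # 6.9 ms between frames at 144Hz
```

So 144Hz delivers each new frame roughly 10 milliseconds sooner than 60Hz does—a gap that matters in competitive play, but is small enough that many players never notice it.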

 

Is higher refresh rate or higher graphical fidelity better?

So, the question is whether you want your game to look visually better in terms of fidelity at the expense of your frame rate, or the other way around.

Higher resolution and graphical settings can cause the frame rate to drop below 60FPS depending on your system, and this is unacceptable for some players.

On the other hand, if you want the game to be played with the highest graphical settings, you may be willing to sacrifice some frame rate for the sake of added pixels or higher fidelity overall.

Experiment—it’s entirely up to you!

So, it can be somewhat confusing, but for the average person, 60Hz is usually a perfectly adequate refresh rate.

The eye often struggles to tell the difference beyond this point, so most people won’t be missing anything at 60Hz.

However, you may get an advantage out of 144Hz if you are playing fast-paced, competitive multiplayer games like Call of Duty.

It’s all about personal preference.

 


  • Polly Webster

    Founder - @PollyWebster

    Polly Webster is the founder of Foreign Lingo and a seasoned traveler with a decade of exploration under her belt.

    Over the past 10 years, she has journeyed to numerous countries around the globe, immersing herself in diverse cultures, traditions, and languages.

    Drawing from her rich experiences, Polly now writes insightful articles about travel, languages, traditions, and cultures, sharing her unique perspectives and invaluable tips with her readers.
