The Lag Tester: A New Standard


Best HDTVs for Gaming: Part 2


When I made my inaugural post for this website, I mentioned how the established standard for testing input lag (using a clone display and a software program) was prone to a variety of variables, casting doubt on its legitimacy. As it is a cumbersome process and not something that can be done with ease, it was about time someone came up with something better. That man was Leo Bodnar, who developed the Lag Tester to eliminate these inaccuracies and fight the phenomenon known as input lag.


What is the Lag Tester?

[dropcap]T[/dropcap]he Lag Tester, quite simply, is an external box powered by two AA batteries. The box sends a 1080p signal to the display over an HDMI cable and presents three flashing white bars on a black screen. It comes with a sensor that you place directly over one of the flashing bars, and the box then shows a reading: the combined input lag and response time calculated from the sensor.


Wait! Why are there three bars instead of one?

Because there are different display technologies! A plasma screen has a near-instantaneous response time, and the whole image is rendered in effectively one pass, so each frame appears as a single piece. An LCD (or LED-backlit LCD) screen, by contrast, draws the image from top to bottom gradually, which means there is a delay before the whole image has been rendered on the screen. The difference between the top and the bottom of an LCD/LED screen is nearly 16ms (one frame at 60Hz). Due to this difference in display technologies, the Lag Tester takes three separate measurements at different parts of the screen.
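The scanout arithmetic above can be sketched in a few lines. This is an illustrative model only, not the Lag Tester's internals, and the function name is my own:

```python
# Approximate extra scanout delay at a given vertical screen position
# for a 60Hz display that draws each frame from top to bottom.
# Illustrative model only; not how the Lag Tester itself works.

FRAME_TIME_MS = 1000 / 60  # one frame at 60Hz is ~16.7ms

def scanout_delay_ms(position: float) -> float:
    """position: 0.0 = top of screen, 1.0 = bottom of screen."""
    return position * FRAME_TIME_MS

print(round(scanout_delay_ms(0.0), 1))  # top:    0.0
print(round(scanout_delay_ms(0.5), 1))  # middle: 8.3
print(round(scanout_delay_ms(1.0), 1))  # bottom: 16.7
```

This is why the Lag Tester's three bars read differently on an LCD: the bottom bar simply isn't drawn until nearly a full frame after the top one.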


So how are the measurements on Display Lag calculated?

[tabs] [tab title="Editor's Note"]As of June 24th, 2013, Display Lag, in accordance with sites like CNET, HDTVTest, Sound+Vision, and Anandtech, has shifted to using averages as the standard of grading using the Leo Bodnar Lag Tester, instead of the bottom measurement. As a result, some information in this article is outdated. You can visit the Testing Method page for more information. The original text of this article is preserved for reference.[/tab] [/tabs]

In the interest of fairness, all LCD/LED displays on Display Lag are graded using the bottom flashing bar, the area with the most lag. If you are wondering why the results on Display Lag seem higher than results you have seen elsewhere, this is why. It makes more sense to measure the completed frame, after the image has been fully processed and before a new frame arrives. A typical TN monitor tends to produce readings similar to this:


[list style="arrow-right"] [li]Top: 5ms[/li] [li]Middle: 9ms[/li] [li]Bottom: 18ms[/li] [/list]


It's easy to assume that the middle section is the correct reading and a stand-in for an "average", but the end result is what matters. It makes more sense to record the highest possible lag, and the fastest displays by that measure are the ones you should be purchasing. For example, the ASUS VH236H is used by highly competitive fighting game players worldwide and has a history of great input and response times. Even though 18ms seems like a lot, this number factors in the display's response time as well as its input lag, measured at the bottom of the frame. With that in mind, it's actually one of the fastest monitors tested on Display Lag.
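To make the two grading conventions concrete, here is a small sketch comparing the bottom-bar figure used in this article with the three-reading average mentioned in the editor's note, using the TN monitor readings listed above:

```python
# Compare the two grading conventions using the TN monitor example above.
readings = {"top": 5, "middle": 9, "bottom": 18}  # Lag Tester readings in ms

worst_case = readings["bottom"]                   # convention used in this article
average = sum(readings.values()) / len(readings)  # convention the site later adopted

print(worst_case)          # 18
print(round(average, 1))   # 10.7
```

The spread between the two figures (18ms vs. roughly 10.7ms) is why numbers graded under one convention can't be compared directly against numbers graded under the other.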





As a highly competitive fighting game player myself, I consider it very important that a standard be established for measuring input lag, so that we can show manufacturers that we care about this phenomenon. Input lag is not a specification that manufacturers advertise; it has to be hand-tested. Until now, that was a slow and cumbersome process. Thanks to the simplicity and ease of use of the Lag Tester, however, being at the forefront of fighting display input lag is now a reality.


About the author

Adeel Soomro


Adeel Soomro, also known as "Four Wude", has been a competitive Street Fighter 4 player since 2008. Using his extensive gaming experience on a casual and professional level, he aims to spread the awareness of input lag existing in today's displays. Having tested over 300 displays for input lag, he hopes that DisplayLag will aid gamers around the world when purchasing the best HDTV or monitor for gaming.


  • Hey Four Wude!

    Could you run a test on the AOC i2757 Fm/Fh for me? It’s a monitor rather than an HDTV, but can function as the same thing due to HDMI. I only just got it after trawling around input lag reviews, looking for a good gaming TV on the net for some time before coming across this site.

    I’m very satisfied with the responsiveness. It cut my twitch reaction by around a frame over anything I was previously using. Traditional testing rated its input lag at 5.1 ms, so I’d like to see where it rates under this method; and your database immediately goes from around 20 ms to the mid 30s range, so it might help bridge that gap.

    • DevilKnight:

      I know the numbers look high, but don’t worry. The only reason it’s rating high is because it’s using a new standard for grading, which is going to give different numbers, as it calculates response time as well as input lag. For example, the very popular ASUS VH236H rates around 8.6ms using the old method, but is rated at 18ms using the new method on my website. It’s an excellent monitor for gaming, even on a competitive level. As far as your request is concerned, it’s a bit hard for me to track specific displays down at this time, but I will grade anything I can get my hands on (even if it’s AOC). I hope you like the site so far, it will be updated very soon!

  • Actually, that might be kinda lazy of me. If it’s possible to buy the lag tester somewhere cheaply, I might be able to contribute the result to your database myself.

  • Why do you test the monitors in ‘PC-mode’?

    You should run the test after setting them in gaming mode, that makes a huge difference for many of the monitors.

    These results seem kind of arbitrary because a lot of the monitors are significantly better in ‘gaming mode’, with a lot less input lag.

    • Everything is tested in Game Mode. The only displays that are not are the ones labeled “PC” in the database. PC mode usually brings lower lag than Game Mode on some displays.

      • This is a good point that is not well understood (due to inconsistency among models).

        I recently realized my Samsung LN-T4069F has this behavior. This display has both “Game Mode” and a stealth “PC mode” (rename the input to PC). I don’t have a tester yet, but I can tell subjectively from a 1080p rhythm game that “PC mode” is several frames faster. Interestingly, “Game Mode” can be enabled while in “PC mode”, and this slows it down. Similarly, while in “PC mode” there is an option “Home Theater PC”, which also causes the same slowdown. (Both of these lag-inducing options make color settings available — see below.)

        On the LN-T4069F, “PC Mode” disables the ability to tweak any basic or detailed color options. Only Contrast, Brightness, Backlight and color-temperature can be modified. This is a significant drawback, and I’m curious if you always find this limitation with the “PC Mode” TVs here. It seems worth noting somewhere if it is at all common.

        • Unless stated in my database, all HDTVs/monitors are graded under “Game Mode” or any mode that reduces image processing. Due to some PC modes being better than Game Mode, the ones tested under PC mode are labeled “PC” in my database. This does not mean that other TVs don’t have a good PC mode, but rather I couldn’t test PC mode at the time, or the PC mode was equivalent to Game Mode.

  • This is a great project. We have desperately needed an objective measurement of input lag to drive the issue as a product discriminator. Your site is an excellent way to ensure it becomes a selling point.

  • One thing I am concerned about is your presentation of a single number when the device reports 3. It seems like an effort should be made to ensure the same choice is selected as the baseline for all reviewers. For example, I see one review site that has started including results from the device, but it’s not clear which value they report. (I couldn’t find any overlap with your models to see if they match.)

    I also wonder if measuring the bottom value overstates the difference between small and large TVs. Do the top and bottom values vary more based on the size? Some examples would be an interesting post if you get a chance. 🙂

    Do you actually measure & record all 3 numbers? It seems ideal to retain them all. Perhaps some day the standard will be to report all the values.

    • I will be recording all 3 numbers in my next batch of tests, but I will still be using the bottom number for the database unless someone requests otherwise. Unfortunately, I do not have 3 sets of numbers recorded for the displays currently in the database. My standard was actually derived from AVForums! They grade using the bottom measurement with the same reasoning, so in order to create a standard I also use the bottom measurements.

      • FYI, I measured a random Samsung TV with the lagtester device today and the numbers were backwards — the lowest time was at the bottom. Soooo … I think the standard for using only one value is misguided. (Or at the very least should be the middle value.)

        By the way, have you seen the lagtester reporting steadily incrementing values that reset back to the floor value after ~16 seconds?

        • I noticed this happening with a handful of TVs that I’ve graded. Unfortunately, the majority of my database was only graded with the bottom bar readings, however I started recording all 3 readings as of my latest database update. I’m still going to publish bottom bar readings in order to stay consistent with places such as AVForums, but if someone has a question regarding the measurement, I can supply them the other readings if I have them recorded.

          To answer your second question, yes I’ve had incrementing values that reset back to the floor position. Although if it takes around ~16 seconds to do it, I assume the reading is fairly accurate and publish the most consistent value I get. Some TVs have fluctuating readings the minute I test them, and for those displays I just list the range of values I got into my database.

  • Is there any chance that you will run a test on a simple CRT system so that we can compare the ‘old standard’ to the ‘new standard’?

    • Sorry, I do not have the means to test using the old standard. There are other sites on the web that do use the old standard though, so I would recommend cross-referencing if testing has been done with something in my database.

  • When using the Input Lag Tester on my computer flat screen I got really weird results.
    It didn’t matter where on the screen I pointed the optical sensor. The input lag always showed 0 or 1 ms.
    Has this happened to you?

    • It happened on one Sony HDTV I graded, an older model. I’m fairly certain it’s an error in the processing, maybe due to some circuitry in that specific HDTV. So I wouldn’t use it as a true reading. Out of all the HDTVs I’ve graded, I’ve only had maybe 1 or 2 exhibit that kind of behavior.

  • Thanks for this excellent website.

    Are your lag results for the Samsung UN55F6400 on game mode or PC mode?


    • Both Game and PC mode exhibited identical readings, which is why I didn’t put the (PC) symbol. I only use the symbol if there is a clear difference between Game and PC mode.

      • We, at the Blur Busters Blog, believe that the middle is a more honest measurement, because this equals the average input lag of the whole image. It also neutralizes differences in scanout (e.g. top-to-bottom, sideways, bottom-to-top, all-at-once, etc). We have encountered a few displays that exhibit unusual image-presentation behaviours that are not traditional top-to-bottom scanout.

        — Sony HX950 configured to “Motionflow Impulse” (strobe backlight mode)
        — ASUS VG248QE configured with LightBoost enabled
        — Samsung S27A950D configured with 3D mode enabled (it strobes in 3D mode)

        As the middle of the screen equals the average input lag, and most people stare at the center of the screen (e.g. videogame crosshairs, pointing at an enemy), we believe that the centre of the screen is a more honest and fair comparison of input lag between different displays utilizing different scanout/presentation mechanisms.

        • One thing you should note about your comment is you’re basing your testing on current gen gaming. For the rest of us, I’ll tell you what… play a few games of Super Mario Bros or better yet, Mega Man. Once you fall into a pit or get nailed because you couldn’t avoid some random shot enough times, you’ll have the excuse you need to re-evaluate your testing methods.

          Retro gamers would really appreciate your consideration in representing our best interests, as opposed to being shunned in favor of the bigger crowd as we have been as of late. After all, we’re still part of the big picture.

          • I understand your concern Randy. For anyone in the retro gaming scene that is interested in knowing 3 separate measurements instead of the average, please feel free to email me using the “Contact Us” form on the top of the site. I can provide them for any display that is listed as AVG in my database. Hope this helps!

  • There is a flaw with this specific input lag measurement method when we test the brand new strobe backlight displays (e.g. “LightBoost” displays, and “Sony Motionflow Impulse”). They refresh the screen in total darkness, then the backlight is flashed all at once.

    We believe that average input lag measurement is more realistic and comparable because:
    (1) People usually stare at the center of the screen
    (2) It neutralizes differences in scanout (top-to-bottom versus all-at-once presentation)

    120Hz display without LightBoost strobe flashes:
    Top: Xms
    Middle: Xms + 4ms
    Bottom: Xms + 8ms

    120Hz display with LightBoost strobe flashes:
    Top: Xms + 8ms
    Middle: Xms + 8ms
    Bottom: Xms + 8ms

  • Since the tester only supports 1080p, does this mean that all 4K TV tests are actually done through upscaling? This would introduce additional lag, wouldn’t it?

    • There is a chance that it can, however there is no way for me to currently do input lag testing natively at 4K. All of my tests on 4K HDTVs are done using a 1080p signal. I figured that it’s better to provide some numbers, rather than none. However, I’ve found that scaling hasn’t drastically affected input lag, as much as general image processing has. It’s possible that 4K HDTVs invoke more processing when scaling 1080p sources to 4K. Once a method is available for me to test input lag at 4K, I will definitely incorporate it into the site.

  • Who gives a shit about 60Hz pattern generators, how do you get 144Hz results? Why does all your data look skewed compared to Tom’s Hardware, PCMonitors, and Battle(non)sense at higher refresh rates?

Leave a Comment