I would LOVE to see how this panel compares to consumer-grade stuff. It would give us a good benchmark to compare the rest of the market against.
I second the need for an unbiased test of the monitor, to know how it really stacks up against the competition.
I sold my original panel, so I can't provide results for that... But I will look into getting test results for the new screen.
 
Well then... it's up to Derick and Jassin to test the LCDs, or we can try to put a donation drive together and send TFT Central a unit for testing. But if the test comes out outstanding, Taito will be out of stock and these things will be on eBay for big $$.
 
For testing overall display lag you'll want one of these: leobodnar.com/shop/?main_page=product_info&products_id=212
Well, I have both screens available so I can test them. I just need to get a tester; let me see what I can find on Amazon.
You're not going to find that tester on Amazon; the guy who designed and built it is, AFAIK, the only person who sells them. I don't believe there are any other lag testers on the market, at least not any that cheap. The Leo Bodnar tester is quickly becoming the de facto standard for testing lag, so unless you borrow or buy that specific model, any comparisons would be pointless.

The next best lag tester is Rock Band (2 or newer); the official guitar has a lag tester built in for calibration purposes, but it's generally considered less accurate than the Leo Bodnar tester. And again, unless you're using the same tester everyone else is using, there are enough inherent differences in how they work that you can't really compare the numbers.
 
I have Rock Band 3 on 360, and that version of the guitar.
I also have both an HDMI and a VGA cable for output options (lag should be less on VGA).

I will get some numbers up tonight.
 
Excellent... it's important to note that lag is generally a few ms different between the top of the screen and the bottom, so identify the sensor location on the guitar and run the test a few times at the top of the screen and a few times at the bottom.

The Leo Bodnar tester produces 3 regions on the screen (top, middle, bottom) to test the difference. DisplayLag.com used to report the bottom reading (the one with the most lag), but I believe they now average the readings... from their testing-method document:

The tester presents 3 flashing bars on the top, middle, and bottom of the screen. Displays marked with “AVG” are calculated using the average of all 3 bars. Displays marked with “BTM” are calculated using the bottom flashing bar (typically the area with the most lag). As of June 24th, 2013, all newer measurements will be reported as “AVG” on Display Lag, as it is the grading standard agreed upon between Display Lag, CNET, Sound+Vision, HDTVTest, and Anandtech. Displays marked with “BTM” measurements can be updated to “AVG” measurements at any time without notice. When comparing displays, please compare “AVG” to “AVG”, and “BTM” to “BTM” only.
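To make the AVG vs. BTM distinction concrete, here's a minimal sketch with made-up bar readings (the three numbers are placeholders, not measurements from this thread):

```python
# Hypothetical readings (ms) from the tester's three flashing bars; the
# bottom bar reads highest because the panel is drawn top to bottom.
top, middle, bottom = 5.4, 13.7, 22.1

avg = (top + middle + bottom) / 3  # what DisplayLag reports as "AVG"
btm = bottom                       # the older worst-case "BTM" figure

print(f"AVG: {avg:.1f} ms, BTM: {btm:.1f} ms")
```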

I'm also very curious about the difference in lag between VGA and HDMI, which is something the Leo Bodnar can't test (it's HDMI only).

Also important to note: the Guitar Hero method will read higher than a Leo Bodnar tester, since it also includes input lag from the controller to the console. That makes sense for the game, but it's partly why you can't directly compare GH readings to LB readings.
 
Damn it, just looked it up. So do I get the 1080p or the 720p, or do I need both? The old screen is 720p, the new one is 1080p.
 
The standard test runs at 1080p, even on 720p panels... does the old screen support down-sampling from that resolution?
 
Hmm... I believe so. I am going to order the 1080p version, since I am replacing the screens anyway and that's what the new one is meant for.

Ok, order placed, should have it by end of the week.
 
So the first round of Rock Band 3 Xbox 360 auto-calibration tests is in!

For this test I used only the HDMI port; I do have the VGA cable... someplace.
I will redo/update this test once I locate it. Now, on to the results!

BenQ RL2755HM "Gaming Monitor" 27in 1080p monitor
[photo: IMG_20160321_190609.jpg]


Taito N94R0442A "VEWLIX LCD kit" 32in 1080p monitor
[photo: IMG_20160321_192055.jpg]


Samsung UN32EH4003FXZA 32in 720p TV
[photo: IMG_20160321_195358.jpg]


OK, so this wasn't what I was expecting at all! :D
No, I did not "cherry pick" the best results; I ran the test twice on each monitor and it was the same both times (I only ran it once on the TV).

So the Taito LCD is 1ms behind one of the best gaming monitors you can buy?
Color me impressed! 8o

I think it's important to note: the BenQ is 27", the Taito is 32", and with size usually comes lag.
 
Nice! That's damn impressive.

For others who don't know: DisplayLag.com rates the BenQ at 10ms and the Samsung at 26ms.

So it looks like the GH test measures 14-15ms higher than the Leo Bodnar.
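As a rough back-calculation (assuming that 14-15ms controller-plus-console offset is constant across displays), a sketch like this turns the thread's readings into an LB-equivalent number for the Taito; the 1ms gap is the BenQ-to-Taito difference reported above:

```python
# All figures in milliseconds; 14.5 is the midpoint of the 14-15 ms
# controller + console offset estimated above.
lb_benq = 10.0       # DisplayLag.com's Leo Bodnar figure for the BenQ
chain_offset = 14.5  # assumed controller + console portion

rb_benq = lb_benq + chain_offset  # implied Rock Band reading for the BenQ
rb_taito = rb_benq + 1.0          # the Taito read 1 ms behind the BenQ

print(f"Estimated LB-equivalent lag for the Taito: ~{rb_taito - chain_offset:.0f} ms")
```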
 
Cool stuff. So it seems that once you hit 32" there's like 3 times as much lag as a 27" monitor at DisplayLag.com.

@twistedsymphony - when this thing says 4ms... are we saying 4ms input lag when G-Sync is on with a compatible G-Sync card? Or is it 4ms input lag with or without G-Sync, like if you connect a console?
http://www.newegg.com/Product/Product.aspx?Item=9SIA0AJ3YZ9174&cm_re=gsync-_-24-009-893-_-Product
It says 4ms "response time"... "response time" is NOT "lag"; it's a completely different measurement, and it's only a very small contributing factor in the total lag you experience while playing.

When you're talking about lag you have:
1. Input lag - how long it takes for your button push to be registered by the game code. Generally this is affected by the controller design (how long it takes to encode the button push into a data packet for wireless transmittal, a USB signal, etc.) as well as how long it takes the internal architecture of the PC or console to interpret that signal and actually register the button push in the game code. Really bad keyboard encoders on MAME setups (including early 1st-gen JPACs) actually had bad lag here; most wired setups have very small amounts of lag even when being encoded for USB or JVS (a polling sketch at the end of this post puts rough numbers on this).

2. Networking lag - if you're playing online you may need to wait for a signal to be sent or received before the game code can decide what to do next. If you are playing online this is the largest lag contributor, typically driven by "ping time", which is how long it takes to send and receive a response from the server you're connected to. It typically has little to do with how fast your connection is and everything to do with how far away you are from the server (or the other players), as well as how many "hops" from internet server to internet server it takes to reach the actual game server and back (a crude way to measure this is sketched at the end of this post).

3. Video processing lag - how long it takes for your console or PC to generate the next frame after it decides what to do next. If you enable v-sync or buffering to improve the graphics, this can add as much as ~16.7ms (one whole frame at 60Hz) of lag, because you're telling the PC to hold the frame that long before releasing it (see the frame-time sketch at the end of this post). A really underpowered PC also adds its lag at this step. At the end of this step you have a signal leaving the console or PC in the form of RGB, VGA or HDMI.

----- everything before this is determined by your console or PC, everything after this is determined by your monitor -----

4. Display input lag - not to be confused with the controller input lag from step 1, this is the amount of time it takes your display to decode the signal sent to it by your PC or console. In general it's assumed that VGA is better than HDMI, since a VGA signal is streamed to the display pixel by pixel, while HDMI generally needs the whole frame of data to arrive before the display can start interpreting it (sketched at the end of this post). Many devices also encrypt the digital signal, which means your display has to decrypt it before it can use it. ALL digital display technologies have some amount of lag from this step, even digital CRTs. However, this is the step that the G-Sync monitors claim to greatly reduce, and it's what REALLY separates the low-lag displays from the high-lag ones.

5. Scaling lag - LCDs and other modern displays are "fixed pixel" displays, meaning they can only ever display images at one resolution, their "native resolution". If the signal you're sending the display is NOT its native resolution, the display has to scale the image to match, which takes time and adds lag. If you run your console or PC at the native resolution of your display you can eliminate this step (though it may push the lag back into step 3). It's important to note that most "720P" displays actually have a 1366x768 native resolution, while the 720P spec calls for 1280x720, so sending a 720P signal to a "720P" display might still require some scaling (see the sketch at the end of this post). Higher-end displays are very good at scaling; they use hardware scaling chips with sub-1ms processing times, while most lower-end displays perform software-based scaling, which is cheaper but more time-consuming.

6. Post-processing lag - many modern HDTVs perform other image adjustments to improve picture quality; these might make movies and TV look better, but they generally add a lot of lag. If your TV has a "game mode", it essentially disables this step, though some TVs don't completely disable all post-processing in game mode, and other displays don't offer a game mode at all. Technically scaling is part of this step, but I'm listing it separately since the two can be controlled separately. Computer monitors and industrial monitors (like what arcades use) are generally better at this than TVs because they don't bother with any of the post-processing crap at all.

7. Response time - once the video signal has been decoded and any scaling or processing is done, the electronics tell the actual LCD part of the LCD to change from one color pattern to the next. Response time is how long it takes a single pixel to change from one shade of gray to another. Technically this isn't considered lag at all, but it's obviously still some amount of time before the change in color is perceivable, so when you're talking about "real world" lag it's definitely part of the overall picture. It's important to note that technically only a portion of the response time contributes to lag, because the lag-testing device will perceive the change before the display has completed the transition from one color to the next. This is also part of the reason you can't compare readings from two different lag-testing methods even if they "work the same way": differences in hardware design can lead to vastly different lag readings depending on how much or how little each is affected by the display's response time.

----- everything before this has to do with your display ------

8. Human reaction time - technically this has nothing to do with the console, PC or display, but it does take some time for you to see an image, process what happened, and push a button in response; for most humans this takes roughly 200ms. Mentioned just to put everything in perspective and close the loop on the whole lag cycle.

When we're talking about the Guitar Hero testing method, that's measuring the lag from step 1 through step 7: everything but the human part of the equation. When we're talking about the Leo Bodnar lag test, that's testing from step 4 through step 7: everything having to do with just the display (which is perfect for comparing displays against each other). When you see a display claiming "4ms response time", that's ONLY looking at step 7 and nothing else.
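A few quick sketches to put rough numbers on some of the steps above; all values are typical or assumed, not measurements. First, step 1: a USB host only polls the controller at a fixed rate, so a press can wait up to one full polling interval before it's even read (the rates below are common USB HID polling rates, not those of any particular pad):

```python
# Worst-case delay from USB polling alone: a button press can sit in
# the controller for up to one polling interval before the host reads it.
for rate_hz in (125, 250, 500, 1000):  # common USB HID polling rates
    print(f"{rate_hz:4d} Hz polling -> up to {1000 / rate_hz:.1f} ms before the press is seen")
```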
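For step 2, a crude way to gauge ping is to time a TCP connect, which takes roughly one round trip; HOST and PORT here are placeholders, not any real game server:

```python
import socket
import time

HOST, PORT = "example.com", 80  # placeholders; substitute your actual server

start = time.perf_counter()
with socket.create_connection((HOST, PORT), timeout=5):
    rtt_ms = (time.perf_counter() - start) * 1000  # ~one round trip
print(f"Rough round-trip time: {rtt_ms:.0f} ms")
```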
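For step 3, the "one whole frame" cost of v-sync falls straight out of the refresh rate:

```python
# Worst case added by v-sync: a finished frame can be held for up to
# one full refresh interval before it's released.
refresh_hz = 60
print(f"One frame at {refresh_hz} Hz = {1000 / refresh_hz:.1f} ms")
```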
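For step 4, if a display buffers the whole HDMI frame before processing it, the bottom of the image has already spent most of a frame just arriving over the cable, which is also why the tester's top/middle/bottom bars read differently. A sketch using standard 1080p60 timing:

```python
# Standard 1080p60 timing: 1080 visible lines out of 1125 total per frame.
active_lines, total_lines = 1080, 1125
frame_ms = 1000 / 60

last_line_ms = frame_ms * active_lines / total_lines
print(f"The bottom visible line arrives ~{last_line_ms:.1f} ms after the top one")
```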
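For step 5, here's the 720p mismatch spelled out:

```python
# A "720p" panel is usually natively 1366x768, so even a proper
# 1280x720 signal still has to be rescaled to fill it.
native_w, native_h = 1366, 768
signal_w, signal_h = 1280, 720

if (signal_w, signal_h) != (native_w, native_h):
    print(f"Scaling needed: x{native_w / signal_w:.3f} horizontally, "
          f"x{native_h / signal_h:.3f} vertically")
```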
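And the measuring-method summary as a small lookup table:

```python
# Which steps each method covers, per the list above.
measures = {
    "Rock Band / Guitar Hero calibration": (1, 7),  # whole chain minus the human
    "Leo Bodnar tester": (4, 7),                    # display only
    "spec-sheet 'response time'": (7, 7),           # a single pixel transition
}
for method, (first, last) in measures.items():
    print(f"{method}: steps {first}-{last}")
```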
 
Thanks. I did forget one though... between steps 3 and 4, if you're using an external video converter or scaler, that's another lag-adding step.

I should also note that, IMO, unless someone actually tested to determine the numbers I wouldn't consider them reliable; there is so much guessing, and so many people presenting their personal perception as fact: "My LCD has no lag bro..." or "I didn't notice any lag from my converter", etc.

Input lag is actually pretty huge; on a PC there are differences between Raw Input devices, DirectInput devices and PS/2 keyboard devices. There was some interesting testing done on BYOAC that put some "rumors" to rest about the differences. It's impossible to say how much lag is caused by each device alone, since the testing method they used was again covering steps 1 through 7, but for comparing the devices to each other it lets you see the difference between them.

I'd suspect in the PC world that JVS is slower than native JAMMA, and that newer "FAST I/O" inputs are replacing JVS on games where low latency is critical.
 
I agree with this statement 100%!

Nothing is ever going to be as fast as JAMMA; it's direct-wired, instant input.
Next is "Fast I/O", because of the way modern motherboards are designed.
Finally JVS, because it's an encoded signal.

This was the problem with JAMMA-to-JVS adapters: the gameplay lag was worse than the video lag.
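To put a hedged number on the JVS point: JVS runs over a 115,200-baud RS-485 serial link, so just moving a small poll packet takes on the order of a millisecond (the 16-byte exchange below is an assumed size for illustration, not the actual protocol overhead):

```python
# Rough serialization cost of one JVS poll over its 115,200-baud link.
baud = 115_200
bits_per_byte = 10  # 8 data bits plus start/stop framing
packet_bytes = 16   # assumed request + response size, for illustration

transfer_ms = packet_bytes * bits_per_byte / baud * 1000
print(f"~{transfer_ms:.2f} ms on the wire before any decoding; JAMMA is a direct wire")
```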
 
About input lag: a guy ran some tests; in case you're interested, check it out:

http://www.teyah.net/sticklag/results.html

I was looking to update my cab with the Brook fighting board, so this was useful for me.
That's an excellent resource. Interesting to see how differently the PS360+ ranks on the PS3 vs. the 360. I'd be curious how all of those options compare on PC, especially against the JPAC/IPAC/KADE/etc.
 