
Here's an example of what's going on:

1. The game decides what happened since the last frame: an opponent appeared on the screen

2. Game -> GPU (3D geometry)

3. GPU -> Frame buffer (geometry is rasterized into pixels)

4. Another part of the GPU handles the DisplayPort protocol... Frame buffer -> DisplayPort -> Monitor

5. This is where G-sync comes in. Based on the images on NVIDIA's site, the $100 G-sync board is a drop-in _replacement_ for the controller board in your monitor, so it's only guaranteed to work with the ASUS VG248QE. Monitor's DisplayPort input -> Frame buffer

6. Where G-sync shines: a vanilla controller board in your monitor shouldn't buffer much of the pixel data, but some buffer waaaay more than necessary. To oversimplify, the controller could receive one row of pixels over DisplayPort and send it out as analog signals to the transistors in the LCD panel while the buffer fills up again with the next line. NVIDIA's board uses the vertical-blank signaling to tell the board that a new frame is coming _now_.

7. The transistors react to the change in voltage and either block light or allow it to pass through.

8. You see the change and shoot your opponent.
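The steps above can be sketched as a toy model. Everything here is illustrative (the names and the tiny 8x4 "screen" are made up, not a real graphics API); the point is the shape of the pipeline: game -> GPU -> frame buffer -> line-by-line scan-out to the panel.

```python
WIDTH, HEIGHT = 8, 4  # tiny "screen" for illustration

def game_update(state):
    """Step 1: game logic decides what happened since the last frame."""
    state = dict(state)
    state["opponent_visible"] = True
    return state

def rasterize(state):
    """Steps 2-3: turn scene geometry into pixels in a frame buffer."""
    pixel = 1 if state["opponent_visible"] else 0
    return [[pixel] * WIDTH for _ in range(HEIGHT)]

def scan_out(frame_buffer):
    """Steps 4-6: stream the frame buffer one row at a time, the way a
    minimally-buffered controller pushes lines to the panel."""
    for row in frame_buffer:
        yield row  # one line of pixels on the wire at a time

state = {"opponent_visible": False}
state = game_update(state)
frame = rasterize(state)
rows_sent = sum(1 for _ in scan_out(frame))
print(rows_sent)  # all HEIGHT rows streamed, line by line
```

The generator in `scan_out` is the key bit: nothing forces the controller to hold a whole frame before the first line goes out.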

Because of the way the liquid crystals respond to voltage, the analog signals to the panel are anything but simple [1]. The display can't just be left "on" and can't be expected to instantly react to changes. So the G-sync board has to be panel-specific for the Asus display, but is smarter than your average display controller.
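One reason the drive is "anything but simple": liquid crystals must be driven with alternating polarity so the time-averaged DC component across the cell is near zero (a sustained DC bias degrades the crystal). Here is a minimal sketch of that idea; the voltage numbers are invented for illustration.

```python
target = 3.0  # desired drive level, in arbitrary volts

# Frame-inversion drive: flip the sign of the cell voltage every frame,
# so the crystal sees the same magnitude but no net DC over time.
frames = 10
waveform = [target if f % 2 == 0 else -target for f in range(frames)]

dc_component = sum(waveform) / frames
print(dc_component)  # averages to 0.0: no net DC across the crystal
```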

The oversimplified example above breaks down because the transistors embedded in the panel need a varying signal, so the controller is actually driving the panel at rates much higher than 144Hz per full frame. This way, each transistor experiences an average voltage (something like the waveform in [1]) produced by the rapid updates coming from the controller. Separate LCD driver chips disappeared over 10 years ago; now all LCD panels are designed to be driven by a high-frequency digital signal that averages out to the voltage the panel needs. In car audio it's called a "1-bit DAC," but inside an LCD it's called a "wire." :)
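The "1-bit DAC" idea can be sketched as simple duty-cycle averaging: a fast two-level signal whose on/off ratio averages out to the analog level the pixel needs. All the voltages and cycle counts below are made up for illustration.

```python
V_HIGH = 5.0   # the only two levels available: V_HIGH and 0
TARGET = 3.2   # analog voltage the pixel effectively needs
CYCLES = 1000  # the controller toggles far faster than the frame rate

duty = TARGET / V_HIGH             # fraction of time the line is high
high_cycles = round(duty * CYCLES)
samples = [V_HIGH] * high_cycles + [0.0] * (CYCLES - high_cycles)

# The crystal's slow response acts as the low-pass filter that does
# the averaging; here we just compute the mean directly.
average = sum(samples) / CYCLES
print(average)  # ~3.2, the target level
```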

Interestingly, G-Sync may actually be capable of shortening the panel's life. Since it is capped at 144Hz, the panel's maximum rated speed, it may be perfectly safe. But any time you change the analog signals driving the panel, there can be harmful effects: slow drift in color calibration, ghosting, or even dead pixels.

[1] Image only: http://commons.wikimedia.org/wiki/File:LCD_Panel_drive_%28Ac...

I looked for a good explanation of the physics behind LCD drive voltage, but Google apparently can't find any good articles out there...


