Topic on Glossary talk:Graphics card

Cables, interfaces, and you

Mirh (talk | contribs)

We don't have an article about display interfaces, so here's a roundup of problems:

  • Despite DE-15 being technically compatible with SCART, component, composite and whatnot, passive adapters (and even a lot of "normal" active ones, which are only meant to go from a modern digital connection to PC VGA) won't work with most older TVs, because the sync signals, scan rates, or color spaces are just too different.
  • Not only may monitors expose different "overt features" depending on the protocol and version of the port they are currently hooked up to, this factor can even affect input lag (let alone the absolute madness that is the TV world).
  • The final capabilities and bandwidth of a video connection are determined by the weakest link among the GPU display engine block, the physical link encoder, the cables (and/or any adapters in between), and the display's receiver:
      • Data rate bounds are set by the RAMDAC frequency for VGA, and by the TMDS clock for DVI and HDMI. These can often lead to lower supported resolutions than the official maximum of the protocol, especially on laptops (I have yet to hear about similar unexpected shortcomings in DP though). See the bandwidth sketch after this list.
      • These limitations, which can get even more paradoxical with adapters in between, can sometimes be worked around (even beyond the official specification), particularly on Linux; the modeline sketch below shows the usual arithmetic.
      • As far as the features themselves are concerned, display controllers are relatively programmable (as seen many times), so a given feature is almost never theoretically unachievable.
      • It goes without saying, though, that GPU drivers are paramount to doing anything. And when it's not the flip-flopping bugs (best to draw a veil over those), they still impose arbitrary blocks, from VRR to chroma subsampling. Sometimes even the specification itself can be too stupid for its own good.
  • "Embedded" applications (i.e. about anywhere that isn't a desktop PC) have less stricter (and uniform) requirements on the transmitter chip bridges (this is most notable in the TV world with the most disparate subsets of HDMI 2.1 limitations)
  • Seriously, adapters are a can of worms.
  • Strictly speaking, cables don't have "versions" (well, except HDMI with Ethernet, which is separate by design I guess). You only have a minimum guaranteed bandwidth certification, which the actual cord could easily exceed in practice (see the certification sketch below).
  • Of course, there are all kinds of similar problems on the screen's end too.
  • Not least, since HDMI for some whimsical reason has no way to signal the YCbCr 4:2:2 bit depth (the data apparently always being carried in a fixed 12-bit container), some displays' diagnostics menus report "8 bit" as a placeholder value.
  • Tinkering around the bandwidth limitations of the interface with custom resolutions is even more of a nightmare when the usual troubles meet HDR (the 10 bpc line in the bandwidth sketch below shows why).
  • Many TVs may outright refuse any non-standard resolution (as in: anything that isn't commonly found in a "home theatre" setting), even when forced.
  • This is at least still better than the past, when they might have set the color space based on the resolution, or not have supported non-subsampled 4:4:4 colours at all (HDMI technically requires accepting an RGB color format, but that says nothing about processing and final presentation).
  • "Just Scan" has always been important to avoid overscan at least on certain trashy displays (caveats), but despite being found under the aspect ratio settings it technically entails much more than just that. Some newer features may require or disavow it
  • All your base are belong to EDID (which can always be fixed with CRU, but a slightly different one could even prevent the GPU from ever going idle); a sketch of what such editors read closes this post.
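
Since a few of the points above come down to raw arithmetic, here is a back-of-the-envelope sketch (Python, nothing authoritative) of how the required TMDS character rate follows from the mode, bit depth and pixel format. It assumes HDMI/DVI-style TMDS links with the well-known single-link clock limits (165/340/600 MHz); DisplayPort packetizes data differently and isn't modelled. The CTA-861 4K60 timing (4400x2250 total, 594 MHz pixel clock) is the worked example:

  LINK_LIMITS_MHZ = {
      "DVI single-link": 165,
      "HDMI 1.4": 340,
      "HDMI 2.0": 600,
  }

  def tmds_clock_mhz(h_total, v_total, refresh_hz, bpc=8, fmt="RGB"):
      """Estimate the TMDS character rate a mode needs, in MHz."""
      pixel_clock = h_total * v_total * refresh_hz / 1e6
      if fmt in ("RGB", "YCbCr444"):
          return pixel_clock * bpc / 8
      if fmt == "YCbCr422":
          # HDMI carries 4:2:2 in a fixed 12-bit container, so the rate
          # doesn't change with bit depth (hence the "8 bit" placeholder
          # some diagnostics menus show).
          return pixel_clock
      if fmt == "YCbCr420":
          return pixel_clock / 2 * bpc / 8
      raise ValueError(fmt)

  # CTA-861 4K60: 4400 x 2250 total -> 594 MHz pixel clock.
  for bpc, fmt in [(8, "RGB"), (10, "RGB"), (12, "YCbCr422"), (10, "YCbCr420")]:
      clk = tmds_clock_mhz(4400, 2250, 60, bpc, fmt)
      fits = [n for n, lim in LINK_LIMITS_MHZ.items() if clk <= lim] or ["nothing"]
      print(f"{fmt:9} {bpc:2} bpc -> {clk:6.1f} MHz, fits: {', '.join(fits)}")

The 10 bpc RGB line is exactly the HDR nightmare from above: 742.5 MHz fits no single link here, which is why 4K60 HDR over HDMI 2.0 ends up subsampled.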
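
On the certification point: the tiers guarantee a minimum total link rate, nothing more. A toy lookup, assuming the commonly quoted figures for the three main HDMI cable certifications (illustrative, not a substitute for the actual compliance documents):

  HDMI_CABLE_CERTS_GBPS = {  # minimum guaranteed total link rate
      "High Speed": 10.2,
      "Premium High Speed": 18.0,
      "Ultra High Speed": 48.0,
  }

  def cheapest_cert_for(required_gbps):
      """First certification tier whose guaranteed rate covers the need."""
      for name, rate in HDMI_CABLE_CERTS_GBPS.items():
          if required_gbps <= rate:
              return name
      return None  # nothing certified that high

  # 4K60 8 bpc RGB needs 594 MHz x 30 bits ~= 17.8 Gbps on the wire:
  print(cheapest_cert_for(594e6 * 30 / 1e9))  # -> Premium High Speed

An uncertified bargain-bin cord may well carry that too; the certification only tells you what it must carry.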
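
As for the Linux workarounds: the usual route is cvt -r plus xrandr. A minimal sketch of the arithmetic those tools perform, using the timing numbers that cvt -r 2560 1440 60 prints (the output name HDMI-1 is only an example; yours will differ):

  import math

  def modeline(name, h, hss, hse, htot, v, vss, vse, vtot, refresh):
      """Build an xrandr --newmode line; the pixel clock is derived from
      the total timings and rounded down to 0.25 MHz steps, as CVT does."""
      clock = math.floor(htot * vtot * refresh / 250_000) * 0.25  # MHz
      return (f'xrandr --newmode "{name}" {clock:.2f} '
              f"{h} {hss} {hse} {htot} {v} {vss} {vse} {vtot} +hsync -vsync")

  # Reduced-blanking 2560x1440@60, values as printed by cvt -r:
  print(modeline("2560x1440R", 2560, 2608, 2640, 2720,
                 1440, 1443, 1448, 1481, 60))
  print('xrandr --addmode HDMI-1 "2560x1440R"')  # then --output ... --mode ...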
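
And since all your base really are belong to EDID: a small sketch of what editors like CRU read, validating the 128-byte base block and pulling the native mode out of the first detailed timing descriptor. The sample block is synthetic (the "GMA" vendor is made up), built only to exercise the parser:

  import struct

  def parse_edid(edid: bytes):
      assert edid[:8] == bytes.fromhex("00ffffffffffff00"), "bad header"
      assert sum(edid[:128]) % 256 == 0, "bad checksum"
      # Manufacturer ID: three 5-bit letters packed into bytes 8-9.
      raw = struct.unpack(">H", edid[8:10])[0]
      vendor = "".join(chr(((raw >> s) & 0x1F) + 64) for s in (10, 5, 0))
      d = edid[54:72]  # first detailed timing descriptor = preferred mode
      clock_mhz = (d[0] | d[1] << 8) / 100  # stored in 10 kHz units
      h_active = d[2] | (d[4] >> 4) << 8
      v_active = d[5] | (d[7] >> 4) << 8
      return vendor, clock_mhz, h_active, v_active

  # Synthetic block: header, made-up vendor, a 2560x1440 preferred mode.
  blk = bytearray(128)
  blk[:8] = bytes.fromhex("00ffffffffffff00")
  blk[8:10] = struct.pack(">H", (7 << 10) | (13 << 5) | 1)  # "GMA"
  blk[54:56] = struct.pack("<H", 24150)  # 241.50 MHz in 10 kHz units
  blk[56], blk[58] = 2560 & 0xFF, (2560 >> 8) << 4
  blk[59], blk[61] = 1440 & 0xFF, (1440 >> 8) << 4
  blk[127] = (-sum(blk)) % 256  # fix up the checksum byte
  print(parse_edid(bytes(blk)))  # ('GMA', 241.5, 2560, 1440)

CRU handles the checksum fix-up automatically when you edit these fields; a hand-rolled override that forgets it will typically just be ignored.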