Graphics and video
- Field of view (FOV)
- Windowed / borderless fullscreen
- Anisotropic filtering (AF)
- Anti-aliasing (AA)
- High-fidelity upscaling
- Vertical sync (Vsync)
- Frame rate (FPS)
- High dynamic range (HDR)
- Ray tracing (RT)
- Color blind mode
Computers have always had to display their output in some way, and for a long time that was done through a text-only interface; only with later systems did the default interface become graphical. At that time, systems had built-in dedicated hardware for drawing the interface and graphics. Once DirectX and 3D gaming started to become popular, more powerful hardware was needed to render games at decent speeds, if at all. This is where graphics cards from 3dfx, Nvidia, and ATI (later bought by AMD) come into play: these were add-on cards housing powerful graphics processing units and their own memory, which added graphics capability to systems that didn't have it previously.
Dedicated versus integrated graphics
Graphics processors come in two forms: dedicated and integrated. Dedicated graphics have a more powerful processor and their own memory (VRAM) separate from main system memory, which offers better performance. On desktops they are usually separate cards added to the computer, while laptops and other form factors may have dedicated graphics built in. Integrated graphics are part of the CPU or motherboard, removing the need for an additional card at the expense of performance. Memory is usually shared with the computer's RAM, sometimes alongside a small amount of dedicated video memory, which reduces the total memory available to the rest of the system.
Both AMD and Intel offer CPUs with integrated graphics, and AMD also sells a line of "APUs" (Accelerated Processing Units) with increased graphics performance, making them more suitable for gaming.
Gaming/workstation laptops come with a dedicated graphics processor that is typically used when running an intensive workload or game. This isn't always reliable, since the operating system has to decide whether a task is intensive enough to warrant using the dedicated processor, which can lead some tasks or games to run poorly. Systems can rely on several mechanisms to decide which processor to use, such as using the dedicated GPU when plugged in and the integrated GPU when on battery. Nvidia's Optimus technology is one such example of switchable graphics.
Some graphics cards can be used in tandem with up to three other graphics cards to boost overall output. With Nvidia cards this technique is called SLI; with AMD/ATI cards it is called CrossFire. Nvidia's technology requires video cards of exactly the same model (e.g. a GTX 760 paired with another GTX 760), while AMD/ATI's technology only requires cards from the same series (e.g. an HD 7970 can be combined with an HD 7950). The benefit of this feature has always been variable and rarely delivers a perfect 2x performance boost, if any boost at all. Over time the feature was prioritized less and less until being dropped entirely from Nvidia cards starting with the RTX 4000 series.
Identifying the graphics card
For Windows systems, the graphics card can be identified by opening the DirectX Diagnostic Tool (run dxdiag and check the Display tab) or by expanding Display adapters in Device Manager.
Overclocking is running the GPU at speeds beyond what the manufacturer specifies. Overclocking can damage the GPU if performed improperly, but when done correctly it can provide a notable increase in performance. The gain further depends on the type of GPU and the type of workload or game; it is recommended to look up information online to find the "sweet spot" for overclocking a given GPU.
Overclocking can be done through software like EVGA's Precision X, MSI's Afterburner, or AMD's Radeon Software.
Please note that overclocking will usually void the warranty and increases the chance of a GPU malfunctioning. Overclocking is done at the user's own risk and is not recommended for novices.
GPU scaling allows the GPU to determine how non-native resolutions are displayed. If scaling is instead configured to be performed on the display, the monitor's built-in video scaler handles it. Some TVs and other non-monitor displays may show black borders around widescreen resolutions; GPU scaling does not affect this. See Overscan for solutions, and the glossary page for more information.
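As an illustration of what an aspect-ratio-preserving scaling mode does, the sketch below computes the scaled image size and the resulting black borders. The function name and the 5:4-on-16:9 example are ours for illustration, not part of any driver API:

```python
def aspect_scale(src_w, src_h, disp_w, disp_h):
    """Fit a non-native resolution onto a display while preserving
    aspect ratio (the 'maintain aspect ratio' scaling mode).
    Returns (scaled width, scaled height, side border, top border)."""
    # Scale by the tighter of the two axes so nothing is cropped.
    scale = min(disp_w / src_w, disp_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    # Remaining space is split evenly into black borders.
    return out_w, out_h, (disp_w - out_w) // 2, (disp_h - out_h) // 2

# A 1280x1024 (5:4) image on a 1920x1080 (16:9) panel is pillarboxed:
print(aspect_scale(1280, 1024, 1920, 1080))  # (1350, 1080, 285, 0)
```

The "full panel" (stretch) mode would instead scale each axis independently, distorting the aspect ratio but leaving no borders.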
Overscan and underscan refer to the behavior of certain television sets and displays that show the image incorrectly, typically as a result of mismatched configurations or expectations between the TV/display and the graphics card sending the video signal. The issue originates from how early analogue televisions had quite loose manufacturing standards, and from the solutions TV producers came up with to counteract these differences. While the overscan article on Wikipedia covers the subject in more detail, the important part is that the solution TV producers settled on involved adding black borders around the actual image being sent: instead of the video signal containing only the intended image and nothing else, it would also include black borders around that image.
- Underscan refers to when a TV/display shows the black borders that the source device added around the image. The receiving display ends up showing these borders because it did not expect them to be part of the video signal, or expected them to be smaller than they actually are.
- Overscan refers to when a TV/display crops parts of the actual image. The receiving display does this because it expected additional black borders around the image to be part of the video signal, when in fact the signal either included no borders at all or borders smaller than expected.
The solution to both scenarios is to tweak either or both devices so that their configurations match. For the crispest result, disable or tweak the display's own settings first, if at all possible, before applying overscan correction on the graphics card end.
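To make the cropping concrete, the short sketch below computes how much of a source frame actually remains visible when a display overscans; the 5% figure is an assumed example, chosen because it is a commonly cited per-axis overscan amount:

```python
def visible_area(width, height, overscan_pct):
    """Portion of the source image still shown when a display
    overscans (crops) by overscan_pct along each axis."""
    scale = 1 - overscan_pct / 100
    return round(width * scale), round(height * scale)

# A TV overscanning by 5% shows only the central portion of a
# 1920x1080 frame; the outer edges (HUD elements, subtitles) are lost.
print(visible_area(1920, 1080, 5))  # (1824, 1026)
```

This is why uncorrected overscan often cuts off on-screen HUDs and taskbars: the lost pixels are exactly the ones at the edges of the frame.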
The 3dfx Voodoo card was the first true 3D-accelerated video card; prior cards simply increased the number and size of available display modes and/or increased the available color depth. It used its own unique API known as Glide, which was essentially a subset of the OpenGL API. Unfortunately, after 3dfx's acquisition by Nvidia, the Glide API was abandoned and Nvidia did little to add support for it. Luckily it can be emulated through various tools: