> explain/link what a color space, gamut, and whatnot are
Ughhhhhhhh.... That's one topic I haven't wanted to delve into at all.
Regarding hardware requirements, it's at the end of the day all relative to the color depth, refresh rate, resolution, etc. selected, as the resulting bandwidth/pixel clock becomes the limiting factor.
This is the limitation that the Asus ROG Swift PG27UQ and Acer Predator X27 ran into -- the DisplayPort pixel clock (and the bandwidth it translates into) forced compromises to keep the display signal below the bandwidth limit.
Here's an overview for a 4K resolution on the Asus ROG Swift PG27UQ (though the exact same limitations apply to the Acer display as well): https://i.imgur.com/Hrr4apw.png
A user wanting to run 4K at 12-bit RGB (or without chroma subsampling, i.e. YCbCr 4:4:4) with HDR enabled would be limited to 82 Hz. Above that, Nvidia's GPUs would automatically enable chroma subsampling to keep the bandwidth below the limit (effectively resulting in a worse visual experience whenever the frame rate got too high, lol).
And while 4K at the normal 8-bit and RGB/YCbCr 4:4:4 is possible at 120 Hz, you won't get "proper" HDR at 8-bit -- HDR is generally said to require at least 10-bit color depth.
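To put rough numbers on where those caps come from: DisplayPort 1.4 tops out at about 25.92 Gbit/s of effective video bandwidth (32.4 Gbit/s raw, minus 8b/10b encoding overhead), and the signal needs total pixels per frame (including blanking) × refresh rate × bits per pixel. Here's a back-of-envelope sketch -- the blanking figures are my own rough reduced-blanking assumptions, not the monitor's exact timings:

```python
# Back-of-envelope DisplayPort 1.4 bandwidth check for 4K.
# Blanking values are rough reduced-blanking assumptions, not exact timings.

DP14_EFFECTIVE_BPS = 25.92e9  # 32.4 Gbit/s raw minus 8b/10b encoding overhead

def max_refresh_hz(width, height, bits_per_pixel, h_blank=80, v_blank=80):
    """Highest whole refresh rate that fits in DP 1.4's effective bandwidth."""
    pixels_per_frame = (width + h_blank) * (height + v_blank)
    return int(DP14_EFFECTIVE_BPS // (pixels_per_frame * bits_per_pixel))

# 4K at 8-bit RGB (24 bpp) vs 12-bit RGB (36 bpp):
print(max_refresh_hz(3840, 2160, 24))  # lands in ~120 Hz territory
print(max_refresh_hz(3840, 2160, 36))  # lands right around the ~82 Hz cap
```

Exact numbers shift a bit depending on the actual blanking intervals used, but it shows why 12-bit RGB at 4K simply can't hit 120 Hz over DP 1.4.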
Nowadays the hardware requirements are pretty simple, all things considered -- get a proper HDR display (not one of those HDR400 "hardly-even-entry-level" displays), a semi-modern GPU (GTX 10 series or later), and an updated copy of Windows 10, and you're basically at the finish line.
The major limitations nowadays are primarily on the software side of things. HDR media playback in Windows, for example, is still in its infancy: while players like VLC and (recently) MPV support it out of the box, others like MPC do not unless paired with madVR (and even then users can have different experiences, such as Rose had, where HDR passthrough didn't kick in at all). Other products such as Plex don't even support HDR playback in their Windows clients yet, and instead opt to tonemap it down to SDR, which Windows then rebalances the whitepoint of back up to the HDR range (gah?!).
In games, it becomes a question of whether a game makes use of DXGI's native functionality to enable HDR (which Special K itself makes use of), or whether it uses proprietary NVAPI or AGS calls to enable HDR on that vendor's particular hardware. Suffice it to say, DXGI is cross-vendor and plays fine with Windows' own HDR settings, whereas NVAPI/AGS can result in weirdness where e.g. the DWM isn't properly restored to the previous color space after an Alt+Tab and whatnot.