Because HDMI 2.1 uses a proprietary protocol that's not implemented in any free OS[0]. If you want to use HDMI 2.1 features right now, your only option is to use a non-free OS like Windows or macOS.
From a purely technical point of view, I do wish HDMI 2.1 were able to gain traction. On a couple of devices I own that actually use it, it's a noticeable improvement, and I feel it does a better job than DisplayPort.
Granted, I strongly suspect the next wave of consolidation will continue the trend toward USB-C, since the spec should have the bandwidth to handle any video/audio protocol for quite some time. It's only a matter of time IMO.
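Back-of-the-envelope math backs this up. A rough Python sketch (the blanking overhead and the payload figures in the comments are my approximations, not spec-exact VESA/CTA numbers):

```python
# Back-of-envelope bitrate check. Blanking overhead and the payload figures
# below are rough assumptions, not official VESA/CTA numbers.

def raw_gbps(width, height, hz, bpc=10, blanking=1.08):
    """Uncompressed bitrate in Gbit/s: 3 color channels x bpc bits each,
    plus ~8% for blanking intervals (a CVT-R2-ish guess)."""
    return width * height * hz * 3 * bpc * blanking / 1e9

# Approximate usable payload: DP 1.4 HBR3 ~25.9, HDMI 2.1 FRL ~42.7,
# DP 2.x UHBR20 over USB-C ~77.4 Gbit/s.
for name, (w, h, hz) in {
    "4K 120 Hz 10-bit": (3840, 2160, 120),
    "8K 60 Hz 10-bit": (7680, 4320, 60),
}.items():
    print(f"{name}: ~{raw_gbps(w, h, hz):.1f} Gbit/s uncompressed")
```

That prints ~32 Gbit/s for 4K120 and ~65 Gbit/s for 8K60: the latter doesn't fit uncompressed in HDMI 2.1's FRL payload, but it does fit in UHBR20 over USB-C, which is exactly the kind of headroom I mean.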
It also lets you have a single cord that could theoretically be your power cord and your A/V cord.
From a purely technical standpoint, DisplayPort is the better standard. HDMI couldn't get their shit together to do anything with USB-C, so all USB-C to HDMI converter cables run DisplayPort internally.
DisplayPort already allows multiple video streams, audio streams, and more. Why do we need a closed standard to do the same thing?
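For a rough sense of what the multi-stream (MST) budget buys you (ignoring MST framing overhead, and assuming ~5% blanking):

```python
# How many 1440p60 8-bit streams fit on one DP 1.2 HBR2 link over MST?
# Payload rate and overheads are approximations, not spec-exact numbers.
stream = 2560 * 1440 * 60 * 3 * 8 * 1.05 / 1e9  # per-stream Gbit/s, ~5% blanking
hbr2 = 17.28                                    # HBR2 usable payload, Gbit/s
print(f"~{stream:.1f} Gbit/s per stream -> {int(hbr2 // stream)} streams on HBR2")
```

That works out to roughly three 1440p60 monitors daisy-chained off a single decade-old DP 1.2 port, with no HDMI involved.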
Not really. That same link explains that Intel and Nvidia drivers can provide HDMI 2.1 on Linux, but only via their non-free firmware blobs.
AMD doesn't (can't? won't?) do the same but there is a workaround: a DisplayPort to HDMI adapter using a particular chip running hacked firmware. That'll get you 4K 120 Hz with working FreeSync VRR.
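If you want to verify what the adapter actually advertises, something like this works on X11/XWayland (a best-effort sketch that just parses xrandr output; the mode-line format can vary between versions):

```python
# List xrandr mode lines at ~120 Hz per output, e.g. to confirm a
# DP-to-HDMI adapter really exposes 4K 120. Parsing is best-effort.
import re
import subprocess

out = subprocess.run(["xrandr"], capture_output=True, text=True).stdout
output_name = None
for line in out.splitlines():
    if line and not line.startswith(" "):      # "HDMI-1 connected ..." header
        output_name = line.split()[0]
    elif re.search(r"\b(119|120)\.\d+", line): # refresh column ~119-120 Hz
        print(f"{output_name}: {line.strip()}")
```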
I don't remember where, but somebody explained that the adapters also have some kind of limitation. I can't remember what it was, but they went into deep detail, and the whole thing is revolting. Governments should protect open source.
[0]: This came up recently with Valve: https://news.ycombinator.com/item?id=46220488