Does it though? Does it really?
I don't understand this move from HDMI Forum. They're handing a win to DisplayPort.
Of course not. It's just protectionism and rent-seeking.
> I don't understand this move from HDMI Forum. They're handing a win to DisplayPort.
I don't think so, at least at this point. Most people don't have hardware that requires HDMI 2.1 in order to get full use out of it, and of those who do, not all of them use Linux and/or care about open source drivers.
Sure, that situation may change, and the HDMI Forum may walk back these requirements.
At any rate, for some reason DisplayPort has just not caught on all that much. You very rarely see them on TVs, and a good number of mid-/lower-end monitors don't have them either.
It's bizarre, really.
DisplayPort won everything except becoming the physical connector for home cinema. Heck, even inside those HDMI-exposing devices, DP won.
The vast majority of display drivers speak eDP. Few things actually implement HDMI, and instead rely on DisplayPort to HDMI converters - that's true whether you're looking at a Nintendo Switch or your laptop. Heck, there is no support for HDMI over USB-C - every USB-C to HDMI cable/adapter embeds a DisplayPort-to-HDMI converter chip, as HDMI Alt Mode was abandoned early on.
The only devices I know of with "native" HDMI are the specialized TV and AV receiver SoCs. The rest is DP because no one cares about HDMI.
However, seeing that home cinema is pretty much purely an enthusiast thing these days (the casual user won't plug anything into their smart TV), I wonder if there's a chance of salvation here. The only real thing holding DisplayPort back is eARC and some minor CEC features for AV receiver/soundbar use. Introducing a dedicated audio port would not only be a huge upgrade (some successor to TOSLINK with more bandwidth and remote-control support), but would also remove the pressure to use HDMI.
With that out of the way, the strongest market force there is - profitability - would automatically drive DisplayPort adoption in home cinema, as manufacturers could save not only converter chips, but HDMI royalties too.
Arguably true, but I think that is changing all the time: there is a push towards open-source drivers regardless of whether the average user knows or cares what that is, and resolutions and refresh rates keep increasing.
I was affected by the HDMI Forum's decision: I bought an off-the-shelf 4K 120Hz monitor which refused to work at that resolution/refresh rate over an HDMI cable.
I was not expecting an arbitrary decision affecting software to be the cause rather than a hardware problem, which took me a while to figure out.
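For anyone hitting the same wall, one quick sanity check is to look at what the kernel's display driver actually exposes for each connector. A minimal sketch, assuming the standard Linux sysfs layout (connector names like HDMI-A-1 vary per machine, and the modes file only lists resolutions, so pair it with xrandr or modetest if you need refresh rates):

    # Minimal sketch: list each DRM connector and the modes the driver offers for it.
    # Assumes the standard Linux sysfs layout; connector names (HDMI-A-1, DP-1, ...)
    # differ per machine, and the "modes" file only lists resolutions, not refresh rates.
    from pathlib import Path

    for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
        status = (conn / "status").read_text().strip()   # "connected" / "disconnected"
        modes = (conn / "modes").read_text().split()     # e.g. ["3840x2160", "2560x1440", ...]
        print(f"{conn.name}: {status} {modes}")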
Now I know that if I want to use my hardware to its full capacity, I need DisplayPort in the future.
I do, but this hardware doesn't have DisplayPort. I switched from Nvidia to AMD specifically for the open source Linux drivers, so I'm quite mad at the HDMI forum for this.
On the other hand, my next TV likely won't have DisplayPort, either, because almost none of them do, so it is indeed questionable whether this is going to lose them any mind share.
Don't know why you're being downvoted, but it's true. Especially when you see that the HDMI standard was developed by the cartel of TV manufacturers and major movie studios[1] when DVI already existed (and DisplayPort soon followed), but those didn't generate royalties or have DRM.
Despicable standard. There wasn't even a standards "war" like VHS vs Betamax, SD vs Memory Stick, or USB vs FireWire that would let you say HDMI won over DisplayPort; it was simply shoved down consumers' throats, since every TV, media player and games console shipped with that port alone, all manufactured by the same cartel that developed the HDMI standard.
So much for the so called "free market".
Maybe that's still a tiny amount, but it's likely the most common 'need'.
Maybe not most people, but a simple 4K TV that can do more than 60 FPS fits that criterion, and those aren't that rare anymore.
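For a rough sense of scale, here's a back-of-the-envelope data-rate calculation (assuming roughly 20% blanking overhead and uncompressed RGB, so ballpark numbers only) showing why 4K above 60 Hz is exactly where HDMI 2.0's 18 Gbps runs out and HDMI 2.1 becomes mandatory:

    # Back-of-the-envelope uncompressed video data rates.
    # Assumes ~20% blanking overhead and uncompressed RGB; real timings (CVT-RB etc.)
    # differ somewhat, so treat these as ballpark figures.
    # HDMI 2.0 tops out at 18 Gbps (TMDS); HDMI 2.1 FRL goes up to 48 Gbps.
    def gbps(width, height, refresh_hz, bits_per_channel, blanking=1.20):
        return width * height * refresh_hz * 3 * bits_per_channel * blanking / 1e9

    print(f"4K  60 Hz,  8-bit: {gbps(3840, 2160, 60, 8):5.1f} Gbps")   # ~14.3, fits HDMI 2.0
    print(f"4K 120 Hz,  8-bit: {gbps(3840, 2160, 120, 8):5.1f} Gbps")  # ~28.7, needs HDMI 2.1
    print(f"4K 120 Hz, 10-bit: {gbps(3840, 2160, 120, 10):5.1f} Gbps") # ~35.8, needs HDMI 2.1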
I suspect all the nice features that make DisplayPort a better standard are harder to implement cheaply, e.g. daisy-chaining.
And that would be bad how? DP is an excellent standard and royalty free.
It's protecting your standard from being used by others, when wide adoption is the only thing that differentiates your standard from the others.
i.e. they're shooting themselves in the foot
Is this useful, if all the relevant devices only have HDMI ports and not DP?
(It actually works in both directions, e.g. if you have a portable display with only Type-C connectors like this https://www.hp.com/us-en/shop/pdp/hp-e14-g4-portable-monitor, BUT it can't power the display; you need to use another connection on the display for that.)
There is no HDMI over Type-C (there was an attempt at it, but it died, probably for the better, sparing us even more Type-C confusion and interoperability issues).
Using Type C for DisplayPort instead of the good full-size DisplayPort connectors is less reliable (easy to disconnect accidentally) and it permits only shorter video cables.
More importantly, this blocks the Type-C connector, which I need for other purposes, e.g. an external SSD. I do not want to carry a Type-C dock, so I end up using HDMI, even though I do not need HDMI and I do not want HDMI, and even though in almost all cases the devices have enough free space for a full-size DisplayPort connector.
Even replacing the HDMI connector with a DisplayPort connector (so that the devices would have only full-size and Type C DisplayPort) is always a better solution, because there are a lot of cheap adapters from DisplayPort to HDMI, which do not need a separate power supply and they can even be incorporated inside the video cable. The reverse adapters, from HDMI to DisplayPort, are much more expensive and much bulkier, so usually they are not acceptable.
For me the experience is not so good, given that HDMI signals always require at least 2 very long seconds to be recognized by a monitor, often even more.
DisplayPort is superior in every other way imaginable. Except for the fact that almost no TV supports it.
Low-end monitors also don't usually have them, but as far as computer monitors go, I'm not interested in the low-end ones.
As for TVs - just give me a dumb screen with ports. I'm going to attach Apple TV to it anyway.
No device outputs DP, no device accepts it, so there's no pressure on devices to accept/output it. I guess the license price is low enough.
On computers, it sort of took over where DVI was: you get more ports, you get a better feature set, it's just superior.
But in the non-tech market I think the "real" fight will end up being HDMI vs USB-C; both of them are evolving to the point where they carry everything, Ethernet included. HDMI has ARC and waayyyy simpler cable and port compatibility (one version to check), USB-C has power output and every single pocket device and laptop/tablet/...
HDMI Ethernet and HDMI eARC use the same pins. eARC won; HDMI Ethernet is pretty much dead.
Yeah, if we exclude basically every half decent GPU and ~70% of laptop USB-C ports in existence.
So frustrating. I'm using a 42" LG OLED TV as a monitor right now. Very nice monitor at half the price of the same panel in a "real" monitor. I'm driving it with an AMD card at 60 Hz for exceedingly stupid reasons.
To make things even worse, this monitor supports sending back the ARC audio over DisplayPort, but only in stereo. If I use HDMI between the monitor and the computer, I get all of audio channels.
So, perhaps people should favour DP instead of HDMI and gradually switch?
Neither my monitor nor my GPU supports DP 2.0, which does have enough bandwidth. So until I upgrade both, I'm using HDMI. My computer is not outdated either; there's just nothing to upgrade to. None of Nvidia's consumer GPUs support DP 2.0, and I can only find 2 monitors that support DP 2.0. Anyone getting new hardware now will be in a similar situation, using HDMI 2.1 over DP 1.4 until their next upgrade.
ARC could also be considered as a bug, a hindrance, or both.
ARC and its various implementations would not exist if the HDMI Forum were not so fanatical about forcing copy protection onto everything. The whole problem (or feature) that ARC is, or is not, would disappear without the insistence on protecting every stream. The alternative would be a full, decoded data stream going back to the device in question. The prerequisite would be to remove the shitshow that is HDCP and allow full-blown HDMI inputs and outputs, which is the exact opposite of what the Forum wants.
HDMI in its current implementation hinders technological progress in the audio segment by forcing everyone to either output analogue signals after the decoding stage or not allow decoding at all.
However, devices have a lifecycle, and a lot of this hardware will still be in use in 2-3 years, by which point this will have moved into the center of the bell curve. Higher resolutions and HDR (which may push 10-bit) will trip this much more than a 240Hz display [which ain't ever gonna' be mainline, really, considering we went down to 60Hz from CRTs with faster refresh rates].
CEC can be done over the DisplayPort AUX channel. I think there were attempts at an ARC equivalent but they floundered.
Another interesting question, though, is how much A/V connectivity in general will still be used in the "TV world" down the line… with everything moving towards more integrated networked appliances instead. E.g. streaming-service "HDMI sticks" are now apps on the TV instead…
In 2002 there was XBMC (later renamed to Kodi). Microsoft even had Windows XP Media Center Edition in 2005. At that time it was perfectly possible to set up a media centre that could do everything. No need for shitty TV remotes and CEC. You would use a much higher quality remote of your choice. Oh how far we've come in 20 years...
Most monitors don't ship with DP 2.0, because it's just not necessary. All modern GPUs support DSC, so monitor OEMs take that free 3x bandwidth reduction.
Nonetheless, Nvidia shipping the RTX 4000 series without DP 2.0 is baffling.
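To put numbers on that "free 3x": DSC (VESA Display Stream Compression) is usually described as visually lossless at around 3:1. Using the same rough assumption of ~20% blanking overhead (the actual DSC configuration is negotiated per display), it's easy to see why a DP 1.4 link, with about 25.92 Gbps of usable bandwidth, stops being the bottleneck:

    # Rough sketch of the effect of DSC on required link bandwidth.
    # Assumes a 3:1 compression target (e.g. 30 bpp -> 10 bpp) and ~20% blanking
    # overhead; the actual DSC configuration is negotiated per display.
    DP14_USABLE_GBPS = 25.92   # HBR3 x4 lanes after 8b/10b encoding

    def gbps(width, height, refresh_hz, bits_per_pixel, blanking=1.20):
        return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

    uncompressed = gbps(3840, 2160, 144, 30)   # 4K 144 Hz, 10-bit HDR
    with_dsc = gbps(3840, 2160, 144, 10)       # same stream at ~3:1 DSC

    print(f"uncompressed: {uncompressed:.1f} Gbps (over DP 1.4's {DP14_USABLE_GBPS} Gbps)")
    print(f"with DSC:     {with_dsc:.1f} Gbps (comfortably under)")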
(290 points, 6 months ago, 164 comments) https://news.ycombinator.com/item?id=39559318
(394 points, 6 months ago, 237 comments) https://news.ycombinator.com/item?id=39543291
I know HDMI is used in some AV production setups, but that feels like a very small niche to justify having 2 HDMI ports on a display like this?
[I'd rather have 2 DP ports and only 1 HDMI… or no HDMI at all]
I don't expect that to ever happen, of course. But I can dream...
The solution is to not buy into proprietary standards[0]; in this case, I'm looking for DisplayPort when I buy... and a big + to AMD for trying.
Hey Intel! Come back!
[0] Pile of comments here pretending it's sooo difficult.
The government created the playing field. The only entity that can fix the situation is the government: they created a bad playing field, and they need to fix it.
That's not to say the government shouldn't get involved. I think the bigger thing here is that if an industry group is specifically setting things up so that Linux is shut out of having high-end video support, then it looks an awful lot like cartel behavior -- industry incumbents are working together to lock out a competitor. Maybe it could be the basis of an anti-trust lawsuit?
Presumably Apple and Microsoft would have the most to gain. Microsoft is a member of the forum. Apple doesn't appear to be, but an Apple guy is on their board of directors.
I'm not a lawyer and I don't know how such a lawsuit would work. Who represents Linux in this case, since it's not owned by any one company? Linus Torvalds? AMD? And would all the companies involved in the HDMI Forum be liable for the behavior of the forum (which would include AMD)? Does intentionality matter? I.e., what if Linux was excluded accidentally rather than deliberately?
https://hdmiforum.org/about/hdmi-forum-board-directors/ https://hdmiforum.org/members/
On the other hand, a signed SHA3-256 digest along with the original[0] file before YT re-encoded it (and stripped its metadata) is unobtainium for the plebs. It is the _most_ important data for the host. It's the first thing they back up. As far as I know, they (YT/Rumble/Tora3) never talk about it. Some would love to only serve hallucinated (when convenient) upscaled derivatives.
Power is threatened by persistent lossless public memory.
[0]: https://news.ycombinator.com/item?id=20427179
[1]: (Mr. Bean, 2024) https://www.youtube.com/watch?v=xUezfuy8Qpc
Related phoronix threads - 6 months ago - 394 points
I got 8K/60 working in Linux using an Nvidia card and a DP-to-HDMI adapter cable, but I have a feeling it's not meant to be supported (the same cable does not work in Windows).
HDMI rejected them, DisplayPort isn't ubiquitous enough, and Thunderbolt (USB-C) is owned by Intel.
USB-C is distinct from Thunderbolt. And Thunderbolt itself got rolled into USB4 which is now an industry standard rather than one controlled by Intel.
Even so, Intel and AMD aren't so hostile to each other as to avoid cross-licensing when it's mutually beneficial. A lot of the newest generation of AM5 motherboards actually include an Intel controller for handling Thunderbolt/USB4.
And it did.
That said, those guys usually play a "back and forth" game in the long run... so stay tuned.
You should keep an eye on MPEG too, because those are the same "type" of people (and the ARM ISA is not far behind...).
Even though I despise big tech on nearly all fronts, sometimes we can agree, and here that means AV1 and DisplayPort.
And this type of behaviour, namely not shipping a DisplayPort port, could be a perfect regulatory (anti-competitive) project for the EU, like they did with Apple...
Oh, okay. Fuck the HDMI forum, then.
It seems a common enough issue that that model is specifically called out sometimes: https://forums.tomshardware.com/threads/how-to-connect-to-a-...