But still, I like it. :-)
For some reason the bandwidth is too low?
So at 60 Hz there is interop, but at 75 Hz you need a thicker (dual-link) DVI cable.
However, compatibility is hit and miss in general. You may have had luck replacing one or more cables or adapters, but switching to displayport is likely to get the best results, anyway.
Had the same issue with a 1920x1200@60 display on an old HDMI connection and with a 1920x1080@60 TV on DVI-HDMI.
Lowering the refresh rate slightly (to 59.9 Hz) made it work.
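For the curious, here's a rough back-of-the-envelope check of where the single-link TMDS limit bites; the blanking figures below are assumptions in the style of reduced blanking, not exact CVT-RB timings:

    # Approximate single-link DVI/TMDS pixel-clock check.
    # Blanking overhead is an assumed, reduced-blanking-style figure;
    # real timings come from the monitor's EDID and will differ somewhat.
    SINGLE_LINK_MHZ = 165.0  # single-link DVI TMDS clock limit

    def approx_pixel_clock_mhz(h_active, v_active, refresh_hz,
                               h_blank=160, v_blank=35):
        return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

    for mode in [(1920, 1200, 60), (1920, 1200, 59.9),
                 (1920, 1200, 75), (1920, 1080, 60)]:
        clk = approx_pixel_clock_mhz(*mode)
        verdict = "fits single link" if clk <= SINGLE_LINK_MHZ else "needs dual link"
        print(f"{mode[0]}x{mode[1]}@{mode[2]}: ~{clk:.0f} MHz -> {verdict}")

1920x1200@60 lands around 154 MHz, uncomfortably close to the 165 MHz ceiling (which is why marginal cables struggle there), while 75 Hz blows well past it.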
And assuming they do, they all have HDMI anyway.
As for everything else most people would connect to their TV, they all have HDMI and precisely none have DisplayPort.
Given that, supporting DisplayPort is an unnecessary expenditure on bill of materials and labor for TV manufacturers.
Just throw one in, and if the ecosystem grows (like it did on PCs) you keep replacing HDMI ports with DP ports.
And outside of gaming consoles it is becoming increasingly rare to connect anything to the TV outside of the luxury segment (even built-in sound is often better than any external sound up to the price region where the lower luxury segment starts, so buying a slightly better TV without an external sound system is often better than a cheaper TV plus external sound).
And many of the "killer features" of HDMI (like network over HDMI) are semi-dead.
And DP is royalty free, HDMI isn't. So gaming consoles probably would love going DP only.
So as far as I can tell, the only reason physical HDMI interfaces are still everywhere is the network effect of them always having been everywhere.
I.e. if there is a huge disruption HDMI might go from everywhere to dying off.
And USB-C is perfectly suited for such disruption (long term).
I mean everything from server centers to high-speed interconnects in consumer PCs is currently moving to various forms of "PCIe internally", with USB4 and NVMe being just two examples. So it might just be a matter of time until it reaches the console/TV space, and then USB-C with USB4+ or Thunderbolt would be the way to go.
Thing is, DP → HDMI adapters all suck when you’re using them to send anything but a basic 1080p picture. They nearly all fail or struggle with uncompressed 4K. I tried several different cables and adapters and despite marketing claims, they all had trouble. The best was one that was externally powered via USB, but even it exhibited quirks.
I no longer have the Rift hooked up to that machine which freed its HDMI port up, but I too wish TVs and receivers had even just one DisplayPort.
In my testing (several years ago), at least half a dozen other similarly-priced DP → HDMI 2.0 adapters purchased from Amazon were limited to chroma subsampled output (4:2:2 or 4:2:0) at 4kp60, which is obviously unacceptable for desktop use, so I do see your point.
I've used the linked adapters successfully now for several years with both a 2013 Mac Pro and a pair of DP-only AMD GPUs (WX 3200 and WX 4100), all connected to a 2019 LG TV, and, while testing, confirmed all claimed signal outputs using an HDFury Vertex.
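For anyone wondering why so many cheap adapters fall back to 4:2:0 at 4Kp60, a very rough payload comparison makes it clear (back-of-the-envelope numbers, 8 bits per component, the standard CEA 4K60 total timing of 4400x2250 assumed; real TMDS packing has more nuance):

    # Rough link-rate comparison for 3840x2160@60 at 8 bits per component.
    pixels_per_sec = 4400 * 2250 * 60        # ~594 Mpx/s including blanking

    def link_gbps(bits_per_pixel):
        # 8b/10b TMDS encoding: 10 link bits carried per 8 data bits
        return pixels_per_sec * bits_per_pixel * 10 / 8 / 1e9

    print(f"4:4:4 (24 bpp): ~{link_gbps(24):.1f} Gbit/s -> needs an HDMI 2.0-class link (18 Gbit/s)")
    print(f"4:2:0 (12 bpp): ~{link_gbps(12):.1f} Gbit/s -> fits an HDMI 1.4-class link (10.2 Gbit/s)")

So an adapter with only HDMI 1.4-class bandwidth on its output side can still advertise "4K60"; it just can't do it at full chroma.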
DP++ adapters tell the GPU to output an HDMI signal instead (DP is an entirely different protocol), and then just level-shift the signal. Type 1 adapters are limited to a 165 MHz TMDS clock, which means no more than 1080p60. Type 2 adapters contain a tiny 256-byte ROM that tells the GPU the adapter's maximum supported bandwidth.
Other adapters are active, they convert the signal and thus add latency, and often need external power so can get quite hot.
I picked up a 55" curved 4K about 5 years ago for use as a monitor. Now I keep my desk in the living room, so I use it as a TV too (have to move the chair). Curved TV is kind of dumb, but huge curved "monitors" are awesome. You can't find curved TVs any more, and wide curved monitors don't have the height to double as a TV.
Only if you define "recently popular" as "More than yesterday, but still a rounding error."
My guess is that it's not a technical or even a user experience issue. It's probably a money issue with the deals tv manufacturers make with the media industry.
i.e. Netflix won't allow their app on a device that can circumvent HDCP.
DVI connectors offered analog VGA for years. This meant that graphics card vendors could put one port on their card that did both, huzzah, and a passive adapter got you VGA out of DVI.
DVI predates DisplayPort by 8 years. The DMCA is passed and HDCP becomes a Thing. Many card vendors do put DisplayPort on their cards, since it's the "professional" standard for video, but that isn't until 2008 or so. DisplayPort would not be widely adopted until 2012-ish.
Fast forward, VGA dies. DisplayPort and DVI-Dual link are there. DVI Dual is forwards-ish compatible with upcoming HDMI displays for TVs, as pushed by the MPEG-LA and DVD makers. In 2009, less than 5% of devices shipped had DisplayPort on them. DisplayPort at this time also cannot handle the two most popular color spaces in use: sRGB and Adobe RGB 1998.
Part of the issue was a perception thing: Displayport was widely adopted by Apple early on, and consumer understanding of Mini DisplayPort was that it was an Apple standard, rather than an open standard, and this further pushed the port out of the limelight.
This poses an interesting question; maybe some of the hobby lawyers on HN would like to chime in and post their theories :o)
Let's assume they print that Logo on their box, call it HDMI in their Windows drivers, but don't do so in their Linux drivers, while it's still a spec-compliant implementation. Would that pose a potential legal problem, and if so why?
If the issue is that they have access to the official HDMI 2.1 spec, implemented it, but call it something else (which I could imagine they forbid in some contract), would things change if some random hacker with too much time on their hands reverse-engineered the protocol by sniffing it and implemented it in the AMD driver (again without calling it HDMI)?
Too bad the HDMI forum doesn't feature an email address on their home page, I'd have loved to tell them what I think of them.
At best a random hacker will reverse engineer the binary driver enough to make something work in some capacity.
They actually do at the bottom of [1].
So a different term would be needed (like "WLAN" or "BT" is used by non-members)
Maybe that's what you already meant by "calling it different than".
> Select brands and logos are offered license free and intended to be used widely throughout the Wi-Fi ecosystem by Wi-Fi Alliance members, non-members, industry partners, media, and analysts to describe products, technology, network deployments, and operating system support.
It sounds to me like anyone could use the trademark "Wi-Fi", since it's listed in the freely available ones.
It's not just that, it's about playing nice so they can have access to the next version of the spec if one comes along.
>Video Out port: DVI-D signal in 640×480 px, 60 Hz. The port supports a well-known video standard that we can't name due to copyright limitations. The first letter is H, and the last one is I.
https://blog.flipper.net/introducing-video-game-module-power...
Of course, they don't do HDMI 2.1 or anything advanced like that, but I guess the reason for the name not appearing anywhere is the same as you're discussing here.
Not nice. Neither worms, nor HDCP forum.
On a serious note, I don't believe that these technologies prevent a determined person from copying anything. However, I'd rather have all the features on board than not be able to use them when required.
Aren't we able to create and use open standards?
DisplayPort is not proprietary; it is a VESA standard.
> Lightning
If you owned any iPhone between 2012 and the present, you had to use Lightning, without choice. And for some inexplicable reason, Americans love iPhones.
I have never heard of such a thing. Was it a common problem for you?
In theory, sure. In practice you'll have to construct a financially sustainable organization that is able to motivate all interested parties to chip in and is also able to certify the implementation and at the same time also doesn't fall victim to internal corruption (e.g. high C-level compensation making it unsustainable). I think there are few-to-no precedents for that in the open source space in general, and even less when it comes to standards body organizations for maintaining a standard at that level of complexity.
In most domains proprietary specifications form the backbone of everything. A lot of governments refer to ISO standards, which by default are not open access.
Here's a list of just the best known ones [0]. There are literally hundreds of thousands of open standards for everything from communications to mechanical engineering, to packaging to chemical formulas....
They make the world go round.
No piece of technology you use today, especially the Internet would function without open standards and standards bodies.
For some reason bits of the digital tech industry, in particular media and entertainments, have a parochial disconnection with the rest of reality and forget that they stand on the shoulders of giants and operate with the assent of everyone else in the world giving them the standards space within which to work.
[0] https://en.wikipedia.org/wiki/List_of_technical_standard_org...
As a fun experiment, I went through the first 10 entries of the Wikipedia list. Only one of them[0] produces _open_ standards, which they have available for free download. For the rest of them the "standards" link on their website either directs to a webstore to purchase individual standards or to a membership signup.
I very much recognize that the world we live in is driven by standards. But while those standards drive the world forward, I think it's also important for industries (and governments) to recognize that the way their standards bodies operate is almost fundamentally incompatible with the forces that drive innovation in the software world (which they often proclaim they also want in their industry).
Building a standards body as you point out isn't difficult and has been done many times over. What's difficult is building an _open_ standards body, which as of today looks like a mostly unsolved problem (same as open source funding).
The moment you want to control what people watch and how they watch it, you lose any hope of having an open standard.
Translation by obsolete sandwich-fed LLM:
In practice, zealously litigious organisations will assemble corporate lawyers in a room to compute profits and to define access constraints for consumers.
Standards documents being behind a paywall is not at all the same thing as something being proprietary or needing to be licensed. ISO charges for standards documents to pay their administrative costs, you can implement those standards without paying any extra money. And if you happen to have an alternative way of implementing the standard without reading its document, that is fine too. If you implement JPEG XL by studying its open source reference implementation, that is A-OK.
In almost all cases of standards you can implement those standards without reading the document, from an IP standpoint. From a practical standpoint it is often just not feasible to reverse-engineer everything without the original documentation, or worth it if you can't slap the trademarked name of the technology on it.
It may be possible that AMD could even implement an open source driver stack for HDMI and be legally in the clear. What they fear is more souring the relationship and losing access, so they don't risk it when they were told not to do that.
In the meantime, keep paying taxes, rent, subscriptions, and utility bills.
I'm not even sure it's capitalism that needs to get modified? Maybe it's something about how private/public property works that is clearly off the mark and needs updates.
I'd argue that on the internet specifically, open-source implementations of the protocols are the backbone of everything, not closed proprietary specs; aren't most internet specifications open?
But in many spaces where you interact with the "real world" you very quickly make contact with proprietary ISO standards (e.g. CAD, architecture). I'd argue that this is one of the big contributing factors to why there isn't more open source penetration in CAD, as central standards like STEP[2] would require contributors to purchase a number of ISO standards.
There are also some spaces where proprietary standards exist (usually when open implementations precede the standardized ones) like SQL[0], but the proprietary nature is ignored, as most people don't need their SQL implementations certified. AMD can't do that as they need to keep a friendly relation with the HDMI Forum for official certification.
There are also some ISO standards (associated with JTC1) that are open access[1], which seems like a decent model. I'm not sure who usually foots the bill for the whole standardization process here though.
[0]: https://www.iso.org/standard/76583.html
[1]: https://standards.iso.org/ittf/PubliclyAvailableStandards/in...
You can, but you need monitor/graphics-chip companies to use it. These are mostly the members/creators of the HDMI standard, and they are also probably the best placed to create a standard that others will use.
Once you have a libre OS you can dump the contents of the bus or fake whatever HDMI hardware is out there to get pristine audio and video frames. Also, current Hollywood movies are very subpar compared to what we had in the 90s, so who cares.
My SO has an Amazon Prime account, and yet they want to show adverts in the middle of media she already paid for, which in theory should be ad-free. So you are paying them twice. Thus, I don't consider BitTorrent piracy when you have legally paid for a service but the streamers can break the rules anytime.
The average power user won't be able to run SkyShowTime on Linux. The idea is locking everyone onto Windows, OS X, or Linux with Secure Boot and a verifiable boot chain if you want to watch movies or TV shows.
No-one, in Hollywood's view, should be able to watch protected content on a VM. Plus for me it's choppy. Waiting for better virtio graphics drivers for Windows (bugs exist in Visual Studio with HW accel on and there is no 3D support).
Their first sentence is "Any Linux user trying to send the highest-resolution images to a display at the fastest frame rate is out of luck for the foreseeable future, at least when it comes to an HDMI connection", but that's plainly not true. Hardware with closed-source drivers, such as the standard Nvidia ones, does support those modes, because those drivers don't have this legal limitation. Then they even end with the possibility of a closed-source AMD driver, without asking themselves whether anyone else has already done this.
https://hdmiforum.org/about/hdmi-forum-board-directors/
I see people from Apple, Panasonic, Sony, Nvidia, Samsung, etc.
Hardware companies. Maybe you have to buy your way in the club.
Almost anti-competitive that the HDMI 2.1 spec people won't allow an open implementation.
That they don't even allow an open implementation should have been a red flag to all of us that HDMI 2.1 has not been subjected to sufficient review.
Have any of you sufficiently reviewed an actual implementation of this spec? Only with black-box testing because it's closed source?
https://www.amazon.com/AmazonBasics-Aluminum-USB-C-DisplayPo...
They also have an adapter: https://www.amazon.com/AmazonBasics-Aluminium-DisplayPort-Ad...
I don't understand. Fuck whatever committee said whatever crap. Open source is open source. Just make the damn driver and give the suits 2 middle fingers.
Also, "HDMI" itself is a trademark with usage only allowed to its Members (like "Wi-Fi"), so even if a non-Member would do an open-source HDMI-implementation against the will of the HDMI-Forum, he would likely not be allowed to call it "HDMI" (like "WLAN" is used by companies without Wi-Fi Alliance Membership)
WiFi is pretty much there already
This is the HDMI Group being idiots in this case, because the Content Mafia is afraid someone might use this to steal content. Anyway, back to torrenting stuff...
A cheap DP-HDMI dongle makes all this go away. As long as VESA doesn't behave the same way, anyway.
55" monitors with DisplayPort do exist but they are only a selected few and seem to cost 4 times the price of a 4K TV.
So 4K is still pretty much a luxury that I will just ignore for now.
You can even get cables that are DP on the input and HDMI on the output with minimal bulk.
Not for the past few years. Although DP 2.0 has been 'released', there are no products actually shipping with it, and in practice DP 1.4 is the latest standard. DP 1.4 can't transmit 3840×2160 at 4:4:4 144 Hz without Display Stream Compression (DSC). HDMI 2.1 can.
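Rough numbers behind that, treating it as a simple payload calculation (active pixels only, ignoring blanking, which only makes things worse; 8 bits per component assumed):

    # Can DP 1.4 carry 3840x2160 @ 144 Hz, 4:4:4, 8 bpc without DSC?
    DP14_PAYLOAD_GBPS = 4 * 8.1 * 8 / 10     # HBR3 x4 lanes, 8b/10b -> 25.92 Gbit/s
    HDMI21_PAYLOAD_GBPS = 48 * 16 / 18       # FRL 48 Gbit/s, 16b/18b -> ~42.7 Gbit/s

    needed_gbps = 3840 * 2160 * 144 * 24 / 1e9   # ~28.7 Gbit/s, before blanking
    print(f"needed:   >= {needed_gbps:.1f} Gbit/s")
    print(f"DP 1.4:   {DP14_PAYLOAD_GBPS:.2f} Gbit/s -> DSC required")
    print(f"HDMI 2.1: {HDMI21_PAYLOAD_GBPS:.1f} Gbit/s -> fits uncompressed")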
I normally run it on 120Hz over DP, but it will work fine over HDMI 1.1 at 60Hz. My (5+ years old) TV runs at 1080p/120Hz just fine too.
https://en.wikipedia.org/wiki/Mini-DVI
Which is cool, but not quite as cool as micro-DVI, which they used for exactly one MacBook Air generation before Mini DisplayPort arrived on the scene.
https://en.wikipedia.org/wiki/Micro-DVI
Edit: you are right though. I just pulled up pictures on early G4 variants, and there is indeed full-size DVI. I completely forgot that.
Even DVI had HDCP though, so it too was flawed :-( https://en.wikipedia.org/wiki/High-bandwidth_Digital_Content...