It's not the connector, it's the communication protocol.
It's super lame though. It will be great to watch the downfall of HDMI Forum when their artificial dam against DisplayPort in the living room finally breaks.
What is the dam against DisplayPort anyway? I never see it on TVs for whatever reason.
Actually it’s a bit odd: in my mind DisplayPort is highly associated with quality. But I don’t actually know if it is the superior connector or if it just seems that way because monitors are usually better than TVs in every metric other than size and brightness.
It's already difficult to find TVs with four fully-compliant HDMI ports; often you'll get a TV with one HDMI 2.1 port and three HDMI 2.0 ports, and sometimes the 2.1 port will also be the only eARC port so you have to choose between high framerates/resolutions and using a sound bar. In other words, even with just HDMI getting a decent set of ports is difficult.
The idea of TV manufacturers also adding DisplayPort ports seems ludicrous to me - not because it's a bad idea, but because I can't imagine them going to the trouble if there's no tangible demand. At best I could see them replacing HDMI ports with DP ports because there's limited space on the motherboard, but that would still require the board to have both HDMI and DP circuitry/chipsets and HDMI/DP certification/testing.
Then you have a TV with, say, two HDMI ports and two DP ports - which, for most users, means "two ports" since 99% of people don't have any hardware they want to connect to their TV that supports DP anyway.
So basically unless we start seeing game consoles, AppleTVs, and Rokus supporting DisplayPort we won't see TVs supporting DisplayPort, and we won't see any of those devices supporting DP because they don't need to - HDMI works fine for them and it's sufficiently universal.
Maybe China's new HDMI replacement will take off over there and make its way into devices over here, but I'm not holding out hope.
My understanding is that the HDMI 2.1 port situation on TVs is, weirdly enough, a SoC limitation from a single vendor.
Almost everyone (apart from Samsung and LG, IIRC) is using a MediaTek SoC as the brains of their TVs, and MediaTek just seems to have been unable to make one with enough bandwidth for 4x HDMI 2.1.
AFAIK LG and Samsung still handle theirs in-house (and that's why LG was the very first "big" vendor to ship 2.1 at all, and they rolled it out to all four ports even on their midrange TVs in _2019_!), and it's common to see those brands offer more 2.1 ports.
This should be getting better in 2025/2026 model years, since it seems MediaTek has finally managed to ship a SoC that does it; but it's ridiculous how long it's taken.
The supported version of DisplayPort on that TV is roughly on par with HDMI 2.0, and not enough for HDR 4K120, which is one of the selling points of HDMI 2.1.
Here's a stupid question: per the site, "any entity wishing to make an active and material contribution to the development of future HDMI Specifications" can join the HDMI Forum for $15,000 p.a., and the Board of Directors is elected by majority vote of the members.
Is there anything other than the money and desire to do so stopping 100 well-heeled Linux users from joining up and packing the board with open source-friendly directors who would as their first official act grant AMD permission to release its driver?
This sounds like what Microsoft did to get their Office formats standardised by ISO: paid for memberships for a bunch of folks and had them vote in favour of approving the standard. (I'm summarising *a lot*, but that's the general gist of it.)
Did that change in a more recent version? According to the (admittedly old) source linked from the Wikipedia article, integrators are allowed to skip HDCP but incentivized with reduced royalties if they do support it.
> For each end-user Licensed Product, fifteen cents (US$0.15) per unit sold.
> If the Adopter reasonably uses the HDMI logo on the product and promotional materials, then the rate drops to five cents (US$0.05) per unit sold.
> If the Adopter implements HDCP content protection as set forth in the HDMI Specification, then the royalty rate is further reduced by one cent (US$0.01) per unit sold, for a lowest rate of four cents (US$0.04) per unit.
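Just to make those tiers concrete, here's roughly how they compose per unit. This is a sketch in Kotlin based only on the quoted (older) terms; the real adopter agreement surely has more conditions than this:

    // Per-unit HDMI adopter royalty, in US cents, as described in the quoted terms.
    // Purely illustrative; based only on the three bullet points above.
    fun royaltyCentsPerUnit(usesHdmiLogo: Boolean, implementsHdcp: Boolean): Int {
        var cents = 15                   // base rate: US$0.15 per unit sold
        if (usesHdmiLogo) cents = 5      // HDMI logo reasonably used: drops to US$0.05
        if (implementsHdcp) cents -= 1   // HDCP implemented per spec: further minus US$0.01
        return cents                     // lowest combination: 4 cents per unit
    }

So, per that document at least, HDCP is incentivized rather than required, which matches the reading above.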
> It's not the connector, it's the communication protocol
In particular the link training procedures needed to reliably push 48 Gbit/s over copper are probably very non-trivial, and could be considered "secret sauce".
GPMI isn't an open standard and it doesn't support HDCP. It might end up being very popular in China but it will be a hard sell in markets that aren't primarily consuming Chinese media.
It's cheaper to implement than HDMI. So if DisplayPort ports are common on displays, devices will start using it (cheapo devices first). If DisplayPort ports are common on devices, displays won't need HDMI anymore. Plus, industry-wide, it's wildly inefficient to have one high-bandwidth video connector for monitors and a different one for TVs when the technical distinction between those is pretty much non-existent and we could scale our engineering effort across a much wider set of devices.
So, after a transition period, cost-saving will eventually lead to DisplayPort taking over.
Because the manufacturers don't have to pay a license fee: once someone starts using it, everyone will follow and then drop HDMI. However, so far nobody has cared enough to be first.
Well, if this were a free market, because there would be demand for it? I want a more standardized protocol so I need less cabling and fewer connectors, and I want features like 4K120 that HDMI effectively (see TFA) does not support.
I would vote with my wallet … if I could.
Like, why do we need two connectors, for the same thing? DP is clearly technically superior.
Of course, there's a wide range of issues: there are a number of comments on this article describing how the HDMI Forum manipulates the market (e.g., suppressing competing connectors on the board, offering royalty discounts for logo use and HDCP, withholding specifications), and then there's DP simply getting out-competed thanks to the mass of consumers who have no idea, and don't care to know, what they're buying, plus marketplaces like Amazon that promote mystery-meat wares.
Instead, its strength tends to be continued improvement over the long term, in a way that commercial software just can't sustain because it needs to show a return on investment.
I'm pretty sure that the majority of shops that aren't worrying about Android have moved on from Java 8. The JVM team only keep Java 8 working for customers paying them lots of money for extended support contracts. And that's only because they have this long-term extended support system for all LTS JVM releases (they are also still supporting 11 in a similar manner).
On the other hand, Android doesn't even support Java 8. It supports the long-dead Java 7 plus a subset of Java 8 features. Android essentially froze their core application runtime in amber over ten years ago and have just been adding layer upon layer of compiler-level sugar ever since. The effect is an increasing loss of the benefit of being on the Java platform, in terms of code sharing.
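To make "compiler-level sugar" concrete: java.time shipped with Java 8 back in 2014, but Android only added it natively in API 26, and on older devices it works only because the build toolchain backports it via core library desugaring, i.e. the code gets rewritten at compile time rather than the runtime catching up. Roughly (Kotlin, illustrative):

    import java.time.LocalDate
    import java.time.temporal.ChronoUnit

    // java.time is a Java 8 (2014) API. On Android it exists natively only from
    // API 26 onward; on older devices this compiles and runs only if the app
    // enables coreLibraryDesugaring, which backports it at build time.
    fun daysUntil(deadline: LocalDate): Long =
        ChronoUnit.DAYS.between(LocalDate.now(), deadline)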
I never understood why they do not track the OpenJDK versions. I don't work on Android apps.. but it seems mildly insane to basically have a weird almost-Java where you aren't even sure if you can use a given Java lib.
> I never understood why they do not track the OpenJDK versions. I don't work on Android apps.. but it seems mildly insane to basically have a weird almost-Java where you aren't even sure if you can use a given Java lib.
NIH syndrome
> (and not a Dart chat app.. but something actually performant that uses the hardware to the full extent)
I used to work on Android; I quit two years ago and have used Flutter since, and it's a breath of fresh air. It does use the hardware to the full extent, and imo it's significantly more performant: it does an end-run around all the ossified Android nonsense.
Hmm, so if you wanted to make an AR app, or some audio processing app, would you do that in Flutter? All the projects I have in mind involve using the camera/microphone/gps etc. Looking at Dart sample projects it just seemed to be quite different from what they're aiming at
My caffeinated instinct is to say basically "yes, I'd do anything in Flutter"; I honestly would rather stop coding than go back to anything I've done before (ObjC/Swift/Java/Kotlin, with side journeys in C++). It boggles my mind how much of a different job dev is with true hot reload.
More carefully, and dealing with what you're indicating more directly:
There's stuff that we just need every millisecond of performance from.
Generally, Dart's great; I don't notice any difference versus the standard iOS/Android UI platforms.
But, for example, Flutter's image decoding actually uses "native" code behind the scenes, i.e. it calls into C, OS-level APIs, or browser APIs as needed on each platform. And there's a Flutter package called "image" that's pure Dart, which I abhor because I know it's going to be higher latency than going through lower-level code. (Now I'm wondering how Java does this... I wonder if it's JNI...)
Let's do a scenario: I've been contracted to build a bus route app for the local gov't. They want an AR feature. What happens if I choose to build on Flutter, build out the basic features, then get to the AR, and I'm getting 5 fps?
This might feel convoluted at first (it did to me), but really, all that's going on is: when things are slow, we write a Dart interface, then for each platform where we want to use native code, we provide an implementation of that interface in native code.
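For what it's worth, one common shape for the native half of that on Android is a MethodChannel handler; the channel and method names below are made up, it's just a sketch of the pattern:

    import io.flutter.embedding.android.FlutterActivity
    import io.flutter.embedding.engine.FlutterEngine
    import io.flutter.plugin.common.MethodChannel

    class MainActivity : FlutterActivity() {
        // Hypothetical channel name; the Dart side opens a MethodChannel with the
        // same string and calls invokeMethod("processFrame", args) behind the
        // interface it defines.
        private val channelName = "app/ar_frames"

        override fun configureFlutterEngine(flutterEngine: FlutterEngine) {
            super.configureFlutterEngine(flutterEngine)
            MethodChannel(flutterEngine.dartExecutor.binaryMessenger, channelName)
                .setMethodCallHandler { call, result ->
                    when (call.method) {
                        // Heavy lifting happens in platform/native code; a plain
                        // value goes back to Dart.
                        "processFrame" -> result.success(processFrameNatively(call.arguments))
                        else -> result.notImplemented()
                    }
                }
        }

        private fun processFrameNatively(args: Any?): Int {
            // Placeholder for whatever platform API or C++ call actually does the work.
            return 0
        }
    }

The Dart side is then just an abstract class plus a MethodChannel-backed implementation, so the rest of the app never needs to know the call crossed into native code.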
Yeah, I'm currently developing a Flutter app and also using flutter_rust_bridge to separate out the business logic, and I can hardly believe how enjoyable it is.
Other than the initial project setup, which is a me-and-Nix-flakes problem, it all comes together pretty smoothly.
Subject to the accuracy of the translation, you are guaranteed to answer that one correctly by not selecting any options. That is because the instructions do not constrain what you may do about incorrect statements. Thus, in a purely logical sense, you must not select any correct statements but you may select or not select incorrect statements.
It's not standard mini-ITX. But since the physical form factors for their laptop boards are published publicly and are somewhat stable, there are "desktop" cases for them.
They were ejected because they submitted some seriously broken patches, which triggered kind of a panic re-review of their previous patches, which concluded that they had been careless the whole time (Chesterton's Fence type stuff).
I keep hearing this, but nobody has ever linked to the supposedly broken patches. With how political RedHat is, I would not be surprised one bit if it was completely overblown and exaggerated just so they could use it as an excuse to kick out someone whose politics they didn't like. The same thing has happened multiple times in the last few years (Hyprland is just one notable example).