HDMI 2.2 will require new “Ultra96” cables, whenever we have 8K TVs and content
The physical connector is, confusingly but expectedly, the same.
HDMI is a proprietary monopoly scam, added to devices by the consortium's owning members. DisplayPort is the open, royalty-free equivalent standard that the educated consumer goes looking for.
Problem is, very few things output or input DP.
HDMI, for better or worse, seems to be ubiquitous.
A lot of laptops nowadays output video over USB-C; in most cases they use DisplayPort Alt Mode, which, as the name implies, is just DisplayPort.
That doesn't solve the issue of every X-in-1 dongle only having HDMI, or of TVs only having HDMI inputs.
So here's what we do to fix that situation: don't buy those things
🧠🤯
Thank you. I did find myself thinking there's a reason I have DP cables for my PC monitors, which don't seem to have any issue running high resolutions... but then I'm not running 8K on anything, so I wasn't really sure.
Got any of that juicy 8k content?
No? Because no one does and no one cares.
There are 8K cameras, but the only reason to use them is to create stabilized 4K content.
They are also important for VR content. You need a lot of pixels to fill someone’s Field of View.
There are a few broadcast 8K channels in Japan and South Korea. There are some 8K videos on YouTube, and 16K is being worked on. 8K is pretty awesome, though I really just want 8K screens for large PC monitors. I currently use a 43" 4K and 8K would be even better at that distance. Both Samsung and Sony have 8K screens for sale right now, and they're not really that crazy expensive for cutting edge (a 75" Samsung 8K QLED for $3k).
My next TV purchase will be based on which models have Display Port.
...And which don't have smart features, but that's a given.
That's going to be harder and harder to find.
The transformation into a crotchety old man is complete. This AI being shoehorned into everything can get off my damn lawn too.
Sounds like a growing market to me...
I got a new Android TV for offline use. Most people say you get an OK experience if you don't connect the TV to a network.
The biggest remaining annoyance is that it takes 45 seconds to cold-start. Almost as if it's booting an OS designed for a phone or something.
Dang. No way to jump straight to input mode? Probably not without a soldering iron.
That's just a commercial display. Most commercial displays don't have an OS and require a separate device for showing video like an Nvidia Shield, PC, etc.
The keyword is digital signage display/television.
Is DVI completely out of the picture? I hate the connector, but I've had a lot of issues with DP, mainly around Linux support and multi-monitor setups.
I was kinda hoping USB-C/4/Thunderbolt would step into this space and normalize chaining and small connectors, but all of those monitors are stupidly expensive.
The only problems I've had with DP are when I have to put it through an adapter to turn it into HDMI for a display that didn't have DP input.
Video over USB-C just ends up being DisplayPort, doesn't it? I guess it depends on the subtype of USB.
DVI isn't capable of the bandwidth needed for higher resolutions. Even dual-link maxes out at about 8 Gbps, good for 2560x1600 @ 60Hz; this new HDMI spec is 96 Gbps, for reference (rough math at the end of this comment).
Ironically, though, HDMI is pin-compatible with DVI, so you could output HDMI to a DVI monitor with just a simple passive HDMI-to-DVI cable, or vice versa. I know a lot of people who like DP, but in order to convert you need active circuitry, and that can impact quality if you don't have native DP on both ends.
I also find USB to be limiting when it comes to range. I can go about 50 feet with a nice thick copper HDMI cable, but anything past 20 feet on USB necessitates fiber optics. Not an issue for everyone, but something I've been running into.
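To put rough numbers on that bandwidth gap, here's a quick sketch of uncompressed video data rates (the `raw_gbps` helper and the ~20% blanking allowance are my own assumptions; real CVT/CVT-RB timings vary):

```python
def raw_gbps(w, h, hz, bpp=24, blanking=1.2):
    """Rough uncompressed video data rate in Gbit/s.
    blanking=1.2 is an assumed ~20% overhead for blanking intervals."""
    return w * h * hz * bpp * blanking / 1e9

for name, mode in [("2560x1600@60 (dual-link DVI max)", (2560, 1600, 60)),
                   ("4K@60", (3840, 2160, 60)),
                   ("8K@60", (7680, 4320, 60))]:
    print(f"{name}: ~{raw_gbps(*mode):.0f} Gbit/s")
# ~7, ~14, and ~57 Gbit/s respectively: 2560x1600@60 just squeezes into
# dual-link DVI's ~8 Gbps, while uncompressed 8K@60 already exceeds
# HDMI 2.1's 48 Gbps, which is where the 96 Gbps figure comes in.
```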
Mine would be as well, but tbh I don't love the Kodi UI. At least I didn't a few years ago when I tried it.
Maybe Nvidia will drop a new Shield with DP support, but I'm not going to hold my breath on that.
No TVs have DP, and the largest monitors you can find now are below 55". I wish you luck.
I haven't even gotten on the 4k bandwagon yet. I fully expected to by now, but then again, my eyes aren't getting any better and 1080p content still looks... fine.
I have a 4k TV. I don't think I've ever actually watched something on it in 4k because finding the content isn't worth the effort.
I have to filter out all the 4K feeds I get on Kodi because I can't play them. I sure haven't seen a shortage of them. Now whether they play at an actual 4K would be the question, but they've been there for years.
A few weeks ago I watched Ladyhawke on a 13" TV with a built-in VHS player. I realized that my brain didn't care about the quality as soon as I started paying attention to the content. I still like my 1080p, but there are definitely massively diminishing returns after that.
"I realized that my brain didn't care about the quality as soon as I started paying attention to the content."
You are a genius! At least compared to everyone seriously discussing how important it is to replace one barely (if at all) visible pixel with 4, or better 9, or 16 more pixels. More of everything. As if a movie that takes up 4 times more space must contain 4 times more content. There's such a nice word, "content", as if food for one's brain and soul, which is art, could be factory-produced on a schedule.
1080p is fine, but I really like the colors of HDR. I am NOT a fan of the higher refresh rate for movies though.
Higher refresh rates for movies are meh at best. VRR, OTOH, is a godsend, since 24Hz just won't fit evenly into 60Hz. Gaming, too, is much nicer when you have VRR; figures that delayed frames are quite a bit less noticeable than dropped frames.
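To make the 24-into-60 mismatch concrete, here's a tiny sketch (my own illustration, not anything from a spec): each film frame has to be held for a whole number of 1/60 s refreshes, so a fixed-rate panel alternates 2- and 3-refresh holds, which is the judder; with VRR the panel simply refreshes every 1/24 s instead.

```python
# How long (in 60 Hz refreshes) each 24 fps film frame gets held on screen
cadence = [int((i + 1) * 60 / 24) - int(i * 60 / 24) for i in range(8)]
print(cadence)  # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven frame durations (judder)
```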
1440p at 120Hz+ is superior to 4k 60Hz and is much more achievable for most hardware anyway. That's the sweet spot in my opinion.
For media 4k is a pretty big upgrade from 1080p though.
Yes, but 1080p content looks like dogshit on a 1440p display
I couldn't care less about 8k since I can't even see streaming 4k content without using a platform infested with DRM.
Allow me to introduce you to Stremio + Torrentio + Real-Debrid.
Just for the record, the HDMI consortium can place their mouths on my genitals and consume my waste
With what bandwidth though?
I don't want Digital Restrictions Management in my cables.
Until Elon can install it into your occipital cortex, this will have to do.
Is there something that makes you think there is? I didn't see there was a chip in the cable.
The HDMI standard needs to declare cable bankruptcy and start over with a new port design. We all have way too many HDMI cables supporting 23 years of standards, and there is nothing in the specification to clearly label, across brands, which HDMI version a given cable or port supports.
Also, the DRM baked into the specification is such bullshit.
"Also, the DRM baked into the specification is such bullshit."
That's the one thing they have absolutely no interest in getting rid of. They'll change everything about the spec, including the connector, but that part's staying in.
That's why I added it as an addendum. Even sourcing HDMI cables without HDCP is getting very very rare.
They need to switch to fiber optics or a simple coaxial cable (like SDI) with BNC connectors. That would end this madness of outrageously priced long cables and flimsy connectors.
They also need to separate audio back out into its own thing.
"...whenever we have 8K TVs and content."
The TVs exist, but there won't be content for years and years. Companies barely stream usable 4K right now.
Because the bitrate over streaming is garbage. Get physical media if you want good 4k.
This is a genuine question but—what physical media? Blu-ray players are no longer being produced by name brands, and DVDs certainly aren't capable of storing the data.
Frankly, get physical media as a fuck-you to the parasites.
Paying for streaming is a fool's errand; you are funding the enemy.
Physical media at least gives you some property rights.
Doesn't make much sense anyway. Anything beyond 5K is just wasted computing power/bandwidth.
I think there are fewer 8K TVs now than 4 years ago. Some lessons were learned.
Japan has had an 8K TV channel since 2018; they really thought adoption would happen a lot quicker. https://en.wikipedia.org/wiki/NHK_BS8K
A buddy of mine worked in a theatre and told me the films were all 1080p. I called bullshit; those screens were huge, they were clearly 4K. He showed me the reel, and yup, he was right.
If theatres don't even bother with 4K, your TV doesn't need 8K.
Actual film doesn't work like that (35mm or 70mm IMAX for example), but you are correct that most cinemas these days are digital and they use "1080p" (more accurately DCI 2K which is 2048×1080 when the aspect ratio is 1.90:1). There are a few that do 4K, but overall not that many.
The main reason that's enough for cinema, though, is that those "1080p" films are something like 500GB with very little compression, displayed through a DLP projector, so they look a heck of a lot better than a Blu-ray blown up on a massive TV with palm-sized pixels (rough bitrate math below).
Also you're quite far away from the screen so even if it's bigger you don't need as much resolution.
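For a rough sense of scale, taking the ~500 GB figure above at face value (the two-hour runtime and the full dual-layer Blu-ray are my assumptions):

```python
# Average video bitrate implied by a ~500 GB DCP vs. a ~40 GB Blu-ray disc
def avg_mbps(size_gb, runtime_s):
    return size_gb * 8e9 / runtime_s / 1e6

two_hours = 2 * 3600
print(f"DCP:     ~{avg_mbps(500, two_hours):.0f} Mbit/s")  # ~556
print(f"Blu-ray: ~{avg_mbps(40, two_hours):.0f} Mbit/s")   # ~44
# An order of magnitude more bits per frame, which is why DCI 2K in a cinema
# can look better than consumer "4K" at streaming bitrates.
```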
Optical has built-in upscaling. No AI bullshit needed.
Yeah but you’ll want full gold plating and nitrogen-infused insulation for the best picture.
It's liquid helium for me. I won't settle for less.
Good thing the word Premium® is there to let me know it’s a high quality product!
No, you're just paying a premium.
"Ultra96" sounds like it could have been a codename for the Nintendo 64.
Or the GameCube...or an add-on to the N64.
The N64's codename was the Ultra 64, after all!
At what point do we just declare that the screens they try and sell are pushing for higher resolution than real life?
I believe 4K is already basically there. I have a 50" 4K (2160p) screen that I sit 9 feet away from; based on the Nvidia PPD calculator, that works out to 168 ppd, and according to that page 150 ppd is around the upper limit of human vision. Apple's "retina" displays target around 50-60 ppd (it varies with the assumed viewing distance), which is what most people seem to consider average visual acuity. Imo 4K / 150 ppd is more than enough.
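Back-of-the-envelope version of that calculation, in case anyone wants to plug in their own setup (the `ppd` helper is my own sketch of the standard pixels-per-degree formula, not the Nvidia calculator itself):

```python
import math

def ppd(diag_in, res_h, res_v, dist_in):
    """Approximate pixels per degree at the center of a flat screen."""
    aspect = res_h / res_v
    width = diag_in * aspect / math.hypot(aspect, 1)  # screen width in inches
    fov_deg = 2 * math.degrees(math.atan(width / 2 / dist_in))
    return res_h / fov_deg

print(round(ppd(50, 3840, 2160, 9 * 12)))  # 50" 4K at 9 feet -> ~168 ppd
```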
According to this calculator, my 65" 4k setup is around 100ppd.
I find that anything with a higher density than that (e.g. sitting further away, or replacing it with an 8K screen of the same size) requires scaling up text and wastes a lot of pixels when rendering everything else.
So yeah, I think 8K is a total waste if you're not targeting a much higher FOV, at which point a curved screen would probably be better.
Maybe some applications like these could need that kind of density just because of their sheer size, but then again you're not likely to be looking at those from living-room distance either. Or things like VR, where you're looking from very close up.
My biggest screen is a 55" 4K and I just don't get why you would need much more unless it was a full on theater setup.
Them: 8K!
Me: Whatever.
One day I might care about 4k, but it hasn't happened yet. So I really can't muster a shit to give about 8k.
I doubt the general public cares about or can even tell the difference from 4k to 8k. Not to mention the amount of bandwidth that will be required.
This exact comment could have been made about 1080p to 4K. That said, 4K has had a lot less fanfare than HD did.
Not really.
Sorry, but there's no way native 8K goes mainstream any time soon. 4K is a genuine improvement over 1080p, but 8K over 4K is not much of a difference for the relative increase in requirements.
I really can't name a movie whose cinematographic virtues came through better in 4K than in 1080p. Color quality, refresh rates, maybe, but the sheer number of pixels? You're looking at a rectangle 1080 pixels tall and 1920 pixels wide. How the hell are you going to discern a group of 2 or 3 pixels in that, let alone a single pixel (I'm aware it's in fact a group of 3 colored subpixels)? Even if you could, what would it change in your perception of the movie, which detail?
Apple users sometimes boast about how fonts look nicer on their screens, but IRL I haven't seen much difference compared with a glossy screen of lower resolution.
I expect the content delivery companies to do something stupid again with 8k (like when they rolled out 4K), and totally nerf the bitrate and encoding quality, making it look worse than a properly encoded high bitrate 720p/1080p file.
Yeah but there will always be freaks like me who need to sit less than 2m away from a massive 75" screen so that they can browse the desktop at 4K without increasing display scaling beyond 100%.
Needless to say, I could benefit from 8K.
Great! Now they can sell my grandma an HDMI cable in 50 installments!
Meh. Wake me up when the HDMI consortium requires vibranium cables. Ending forever audio lag AND frame skip.
"premium"? That's what they decided on? That's sure to age well and not be confusing at all...
To be clear, it will "require" a new cable only to push that maximum (8K/120fps?).
It's not like you need a new cable just because a new TV supports it.
You kinda do, though. It's like the difference between Cat5/Cat5e/Cat6: physically they all have the same pinouts, but the tolerances vary greatly. Could you get 10Gbps over Cat5? Possibly, but only over short, clean runs. You can do it much more easily on Cat6, even though they all plug in interchangeably. HDMI cables are the same way.
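For reference, the certified cable tiers map out roughly like this (ratings from the published cable certifications; the "Ultra96" figure is from the HDMI 2.2 announcement, and the era notes are approximate):

```python
# Certified HDMI cable classes and their rated bandwidth in Gbit/s
hdmi_cable_gbps = {
    "Standard":           4.95,  # early HDMI 1.x
    "High Speed":         10.2,  # HDMI 1.3/1.4
    "Premium High Speed": 18.0,  # HDMI 2.0
    "Ultra High Speed":   48.0,  # HDMI 2.1
    "Ultra96":            96.0,  # HDMI 2.2
}
```

As with Cat5 vs. Cat6, an older cable can still carry a lower rate just fine; it simply isn't certified for the higher one.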
Ehh, can I get the Basic Low Speed HDMI Cable with Wifi?
Somebody - probably those guys figuring out how to sell more stuff - is operating in a reality where "you can, but you shouldn't" can't be expressed in any human language.
I've been reminiscing about parts of my childhood where I'd watch a lot of karate matches and try to repeat the moves (I know it sounds stupid). The fights, the lights, the room, my grandma, the summer evening outside: those are what matter in these memories. Not how many logical dots there were on that goddamn screen.
One of the most depressing things about now is how, even compared to 10 years ago, people are thinking not about new things to do with tech, but about doing old things with more resources wasted, because that's apparently better.
1920x1080 sometimes seems like overkill already. 8K - why the hell? And with more expensive, shorter, less reliable cables, all other things being equal.
I guess hoping for cheap thin LPD displays is useless.
Resolutions higher than full HD are useful for more than just the TV in front of your sofa. You need 8K (and even higher) for VR headsets. Also, there are already cinemas with giant displays instead of projectors. Don't be a retrograde.
"Don't be a retrograde."
I already know that launching stuff into orbit this way is more expensive, but there are situations where a retrograde launch makes sense, mainly of a military nature.
I refuse to accept other meanings of the word.
"Also, there are already cinemas with giant displays instead of projectors."
Tool for the job. Instinctively, a projector and a screen seem like a better solution to me than a humongous LCD display. But if that's cheaper (how in the world, though?), then let them do as they want.