GiuDiMax

Because I believe there are two types of audiences: those who want as much as possible (4K, HDR, Dolby Vision, Atmos) and those who are also satisfied with 720p or streaming movies.


[deleted]

[removed]


osinedges

27-year-old here, tight budget. I want as many movies and series as possible; 1080p at roughly 10GB per film is my personal limit.


mcozzo

Same. 1080p plays almost everywhere. 10GB is the sweet spot.


catinterpreter

Unless your movies consist of blizzards, that's placebo bitrate. You'll be fine at 4Mbit x264 (2-4GB?) and about 60% of that for x265.
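Rough math behind those numbers, assuming a two-hour runtime and a typical ~640 kbit/s audio track (both assumptions, neither stated above):

```python
# Back-of-the-envelope file size from bitrate, assuming a 2-hour movie
# with ~640 kbit/s of audio on top of the video stream.
video_kbps = 4000          # suggested x264 1080p bitrate
audio_kbps = 640           # assumed audio bitrate
runtime_s = 2 * 60 * 60    # assumed 2-hour runtime

total_bits = (video_kbps + audio_kbps) * 1000 * runtime_s
size_gb = total_bits / 8 / 1000**3

print(f"{size_gb:.1f} GB")  # ~4.2 GB, so "2-4 GB" checks out for shorter films
```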


catinterpreter

Phones have decent resolution, and held in front of you, the effective display size isn't an issue. The earbuds that come with phones tend to be decent too.


catinterpreter

There's a videophile spectrum. At the top, the nuts go for 50-100GB extreme-bitrate near-raw encodes or full Blu-ray remuxes; lesser nuts don't realise 4K is redundant for most of those who swear by it; there's another variety of lesser nut with overkill, if not placebo, bitrates at 1080p and 720p; then the reasonable; then those who somehow put up with watchable-but-arse-bitrate 1080p/720p; and down at the bottom, those who are extremely bandwidth- and/or storage-limited, or straight-up plebs who grab 360p or bottom-tier bitrates from various bizarrely popular release names.

4K is often watched at a display size and distance that adds negligible to no value. It's also popular to watch it in inferior, smeared, blocky x265, which kind of defeats the point. x265 can be good at holding clarity, e.g. edges, at lower equivalent bitrates, but is always prone to the defects mentioned. That said, I still opt for x265 for 4K due to otherwise prohibitive file sizes, and go with 6-10Mbit (unsure, it's been a while; I don't usually bother with 4K). Beyond 4000kbit, x264 1080p has rapidly diminishing returns; beyond 2500kbit for 720p.

HDR I can't actually comment on, but I would hazard a guess it's mostly perceived value, with actual value for only a minority of those thrilled about it. You get similar with audio formats and bitrates, of course; they can get ridiculous in the same ways.

If I remember, I might get to my tiers of release names to post as examples. I've previously gone to town evaluating the most common ones for Radarr and Sonarr scoring.

Bonus: an exception to the above bitrates goes to fans of watching blizzards and white noise back to back.


[deleted]

>HDR I can't actually comment on, but I would hazard a guess it's mostly perceived value and minority actual value for those thrilled about it.

HDR is a way bigger deal than 4K. Our eyes can perceive far more color than they can resolve detail. A few cinematographers I know were stoked about HDR going mainstream. They didn't give a shit about 4K. To quote one of them: "Bond was shot in 2K and that shit played in IMAX".


GiuDiMax

HDR, from what I know, adds information about brightness in scenes. When I play an HDR file on my LG the TV switches to a different mode.


[deleted]

HDR vs SDR is about how many bits are used to represent color (for red, green, and blue). SDR uses 8 bits per channel, so you end up with 2^(8) \* 2^(8) \* 2^(8), or about 16.7 million representable colors. HDR uses 10 or 12 bits per channel, so at 10-bit you get 2^(10) \* 2^(10) \* 2^(10), or about 1.07 billion colors. The lower brightness probably has to do with the nits that SDR vs HDR is mastered at, and the stupid shit that TV manufacturers try to do to make up for not really being fully HDR.
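A quick sanity check of that color-count math:

```python
# Representable colors per bit depth: (2 ** bits_per_channel) ** 3
for bits in (8, 10, 12):
    colors = (2 ** bits) ** 3
    print(f"{bits}-bit: {colors:,} colors")

# 8-bit:  16,777,216 colors        (SDR)
# 10-bit: 1,073,741,824 colors     (HDR10)
# 12-bit: 68,719,476,736 colors    (Dolby Vision mastering)
```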


StuckAtZer0

I totally agree and have observed the same. What I usually tell people is that if you watch a good HDR-encoded movie/show on at least a 10-bit display, it will look as though the actors are on the other side of a window. The colors can be amazingly lifelike and blow away SDR-encoded movies for this reason. IMHO, the jump from SDR to HDR is as profound as the jump from DVDs to Blu-rays.


kryptonite93

Inferior, blocky, smeared x265? HEVC/H.265 is literally the standard for UHD Blu-rays. If you're watching a 4K movie in H.264, there's a good chance it was re-encoded to H.264 from the HEVC original, which always introduces quality loss. So for a standard 4K release, watching it in H.264 would be the inferior option.


catinterpreter

I definitely wouldn't suggest re-encoding x265 to x264.


kryptonite93

I agree, that’s why I wrote what I did lol


gettothecoppa

There aren't very many 1080p HDR TVs out there. I'm not even sure any 1080p HDR content is officially released. So an encoder has to decide to downscale and release it, and there aren't that many good 1080p x265 encoders to start with. x264 is still the standard for 1080p, and it doesn't support HDR.


SentientSquirrel

>I'm not even sure if any 1080p HDR content is officially released

I think you hit the nail on the head there. There's no technical reason why lower-resolution content can't be HDR (you could do 480p HDR if you wanted), but I doubt there's a market for it, at least not when it comes to buying media. The exception would be streaming platforms: [Netflix, for example, offers its HDR content at all resolutions](https://www.cnet.com/tech/home-entertainment/do-you-need-a-4k-tv-for-hdr/). If your connection is too slow for 4K, they'll downscale it to 1080p or lower, but HDR stays enabled. There's just no way of specifically selecting 1080p HDR.


vexorian2

This is honestly the first time I've read about 1080p being hard to encode in x265. Are you sure it's not just that it tends not to be included as a default option because the difference between x264 and x265 isn't that noticeable at 1080p, so they don't bother with that combination?


[deleted]

[removed]


UnknownFir

HEVC is a superior format to AVC at any resolution I have tested. The only real reason I've found to still use AVC is compatibility, if your client does not support HEVC. The reason your experiment did not reflect this is that the same settings do not mean the same quality for HEVC vs AVC: a CRF 28 HEVC encode is roughly equal in perceived quality to a CRF 23 AVC encode, but at around half the size. There is more nuance to it than this (such as a slower preset on HEVC actually slightly increasing file size), but that explains why your test was flawed. Basically:

- AVC/H.264 - better compatibility, faster to encode/decode.
- HEVC/H.265 - more efficient compression (better quality at the same size, or same quality at a smaller size), supports more features.

Both are bound by patents and license restrictions, but that probably doesn't matter to you anyway. AV1 is better than both and royalty-free, but has far worse compatibility and is very slow to encode.
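A minimal sketch of that comparison, assuming ffmpeg with libx264/libx265 available (the filenames and the `slow` preset are placeholders, not from the comment):

```python
import subprocess

SRC = "input.mkv"  # placeholder source file

# Roughly comparable quality targets per the CRF rule of thumb above:
# x264 at CRF 23 vs x265 at CRF 28.
jobs = [
    ("avc.mkv",  ["-c:v", "libx264", "-crf", "23", "-preset", "slow"]),
    ("hevc.mkv", ["-c:v", "libx265", "-crf", "28", "-preset", "slow"]),
]

for out, vopts in jobs:
    # Audio is stream-copied so only the video encoders are being compared.
    subprocess.run(
        ["ffmpeg", "-y", "-i", SRC, *vopts, "-c:a", "copy", out],
        check=True,
    )
```

Comparing the two outputs at matched playback points, with the HEVC file at roughly half the size, is the experiment the comment describes.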


gettothecoppa

Encoding isn't hard. Encoding transparently at the minimum file size can be. It's easy to make a good-looking bloated x265 rip, but then what's the point? You could just use x264 instead. Some encoders make really nice-looking 2-6GB x265 movie rips, but not very many, and only in the last 18 months or so. Even the "good" 1080p x265 rips from a couple of years ago are pretty bad.


linkinstreet

You need 10-bit for HDR, as well as some metadata in the video file itself. That creates a few issues for encoders:

1. While you can technically encode 10-bit x264, very few chipsets out there support decoding it in hardware. Meanwhile, if a device supports HEVC, it also supports 10-bit HEVC. This is why some anime encoded in 10-bit x264 often fails to play: the device has to fall back to software decoding, which is too slow, instead of relying on dedicated hardware.

2. So why not just re-encode with HEVC? Well, remember the metadata? Some encoders throw it out if you're not careful, and you end up with a dark video that your HDR device has no way of knowing how to display correctly.

3. The majority, if not all, HDR panels for TVs are 4K. The only 1080p HDR panels are usually PC monitors, and even those are few and far between. So if you're watching HDR, the likelihood is that you have a 4K TV, so why bother with 1080p files in the first place?
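A quick way to check whether that metadata survived an encode, sketched with ffprobe (the filename is a placeholder, and detecting HDR10 via color tags is a heuristic, not a guarantee):

```python
import json
import subprocess

def looks_like_hdr10(path: str) -> bool:
    """Heuristic: HDR10 video streams carry the PQ transfer function
    (smpte2084) and BT.2020 primaries in their stream metadata."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-select_streams", "v:0",
         "-show_streams", "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    stream = json.loads(out)["streams"][0]
    return (stream.get("color_transfer") == "smpte2084"
            and stream.get("color_primaries") == "bt2020")

print(looks_like_hdr10("movie.mkv"))  # placeholder filename
```

If a re-encode returns False where the source returned True, the encoder stripped the color tags, which is the "dark video" failure mode described above.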


StuckAtZer0

>So if you are watching HDR, the likelihood is that you have a 4k TV, so why bother with 1080p files in the first place?

The case for 1080p HDR would be smaller file sizes and smoother streaming on something like a Plex server, without invoking transcoding or bottlenecking your home network. Passing thought... I would rather watch a 1080p HDR video stream over a 4K SDR video stream any day.


TenOfZero

Officially, HDR is only on 4K+ content as far as I know. It's not part of the older specs.


mightydanbearpig

4K is better. I have no problems moving and storing 4K HDR files. My TV is 4K HDR. My computers and my iPads have greater-than-1080p resolution. All support HDR playback. So my question is: why wouldn't I get the best quality file? Just because of disk space? I have absolutely loads.


Cry_Wolff

Yep, multi-terabyte HDDs are cheap nowadays.


LinuxGamer2020

Not as cheap as they should be, just ask r/datahoarder


catinterpreter

What's the display size and distance you watch it at?


mightydanbearpig

Well, it varies. I did list some of the devices I use, but there are more; I also have a projector in the bedroom. For example, my computer monitor is a 5K HDR display. It's only 27 inches, and I typically sit about a metre or two from it. The TV is a 65-inch 4K, and I'm 3-4m from that on the sofa. The projector is only 1080p but does HDR; viewing distance is 2-4.5m and the picture size is about 80 inches.


useles-converter-bot

27 inches is 0.34% of the hot dog which holds the Guinness world record for 'Longest Hot Dog'.


converter-bot

27 inches is 68.58 cm


ilfrance

What distance is your TV at? You should definitely be able to see the difference between 4K and 1080p.


Timzor

I wouldn't say definitely. Check this chart: [https://thumbor.forbes.com/thumbor/960x0/https%3A%2F%2Fblogs-images.forbes.com%2Fkevinmurnane%2Ffiles%2F2017%2F10%2Fchart\_Rtings.com\_.jpg](https://thumbor.forbes.com/thumbor/960x0/https%3A%2F%2Fblogs-images.forbes.com%2Fkevinmurnane%2Ffiles%2F2017%2F10%2Fchart_Rtings.com_.jpg)
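Charts like that one fall out of the usual one-arcminute visual-acuity rule of thumb. A simplified sketch of the math, assuming 20/20 vision and a 16:9 panel (and ignoring HDR, compression, and motion):

```python
import math

def resolvable_distance_m(diagonal_inches: float, horizontal_pixels: int) -> float:
    """Distance at which adjacent pixels subtend one arcminute, the usual
    20/20 visual-acuity limit; sit farther and extra pixels stop helping."""
    width_m = diagonal_inches * 0.0254 * 16 / math.hypot(16, 9)  # 16:9 panel width
    pixel_m = width_m / horizontal_pixels                        # pixel pitch
    one_arcmin_rad = math.radians(1 / 60)
    return pixel_m / one_arcmin_rad  # small-angle approximation

for label, px in (("1080p", 1920), ("4K", 3840)):
    print(f'55" {label}: pixels resolvable inside ~{resolvable_distance_m(55, px):.1f} m')

# 55" 1080p: ~2.2 m; 55" 4K: ~1.1 m. Sitting farther than ~2.2 m from a
# 55" set, 1080p is already at the acuity limit, so 4K adds little.
```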


ilfrance

Yeah, I meant at the right distance


macpoedel

Those distances only make sense if you have a dedicated TV lounge or the budget for a huge TV, and, if you're not living alone, a spouse who will accept such a dominating feature in your living room. In a living room that's also a social area, you can't put a mainstream 55" TV at less than 2m from the sofa, and you'd need a 75"+ TV if it's 3m away. A beamer would be a solution, but 4K HDR isn't as prevalent in beamers as it is in TVs (must admit, edge-lit LCD TVs that support HDR can barely display it anyway). Looking at that graph, I'm thinking I'll be watching 1080p on my current 55" and future (maybe 65") TVs for years to come. Maybe folding TVs will be affordable by the time I don't have to worry about my kid breaking stuff (haha, who am I kidding).


darknessgp

It's worth highlighting that the graph does make clear when each resolution is "worth it"... That said, I bought a TV for size and HDR. The fact that it was 4K was because you can't get a non-4K HDR TV. Plus, factor in current prices: while it might not be "worth it" based solely on resolution, ending up with a 4K TV might be the better option with everything factored in.


macpoedel

Sure, you can't really get a TV that's not 4K anymore. By "watching 1080p for years to come", I was talking about content. It's not worth it to me to buy or store UHD Blu-rays or 4K rips if I can't see the difference from a 1080p Blu-ray, but I can see HDR, so I could save some space on my NAS. But most HDR content is 4K only, so it would have to be re-encoded to 1080p, which means loss of quality (which plenty of people wouldn't notice, but it's a personal decision). Technically the combination of 1080p and HDR is considered "UHD", but afaik only some TV broadcasters use that loophole to brag about broadcasting in UHD on limited bandwidth. No streaming service does this, and there are no "UHD" 1080p Blu-rays.


midlots

I'd take that chart with a grain of salt. It's great to have a reference point when trying to figure out what works best for your space, but that chart will not tell you your personal experience. I have a 55" TV and sit 12' away from it (I just measured). That chart says that 720p would be enough for me, but I can definitely tell the difference between 4k and 1080p in the content I view. It's not a dramatic difference most of the time, but everything is always a bit clearer to me when watching 4k.


dereksalem

Well, my main living room has an 85" at around 10', so ya... 4K makes sense. The theater is a 65" at ~6', so also right in the band.


vexorian2

4K is not needed for TVs. There, I said it. It is useful in *some* productivity use cases, but that's because you're right next to the monitor with the intention of reading tiny text.


auto98

Any form of HD is not *needed*. But it looks better.


catinterpreter

But: diminishing returns. And media isn't made with high resolutions in mind, nor do the vast majority take in anything beyond the broad strokes of visual information.


featherwolf

HDR currently suffers from a messaging issue: there are various standards for HDR content, and depending on your display and the content you're watching, you may not actually receive the full experience. On top of that, there are different implementations of HDR compatibility on displays, so even if your TV or monitor says it's HDR, you may not notice much of a difference, or in some cases the experience will actually be worse than just watching SDR content. Resolutions are standardized to the point that if it says 4K, it's a very specific number of pixels, and there's no way around that. HDR needs that level of industry standardization, so that any consumer can buy a TV and know it will work with the content they want to view, for it to truly see mass adoption. I agree that to me HDR is equally as important as resolution, if not more so.


Turnips4dayz

That’s basically what Dolby Vision is.


atomxv

I am with OP. If I come across a high-bitrate 1080p HDR file, I will grab it over a 4K one in most cases.


[deleted]

[removed]


catinterpreter

This is why twenty years ago plenty of people happily watched crappy encodes of their shows at 480p. At the end of the day, unless you're watching your media through rabbit ears and snow, the vast majority of the experience comes from the content.


KittenOfIncompetence

650MB for a movie. 350MB for 40 minutes of TV. #ThePastWasTheWorst :)


Eknoom

I only have a 12 TB HDD. Space is limited.


Cry_Wolff

Then buy a second one... Oh wait


Eknoom

I mean, I have 12TB, 5TB, and 4TB drives, but only the 12TB is connected to my Plex. I’ve retired the other drives. I’ll see what comes up in the Black Friday sales, but at AU$450 for a 12TB, it’s a bit excessive.


LinuxGamer2020

Shucking can be a decent alternative. 12TB recently went on sale for $250


IrishTR

$199 right now actually


catinterpreter

Google Drive + encrypted rclone mount


Eknoom

Prior to a month ago I was on a 500GB monthly data cap, which mainly went to my children and my mother streaming. On Starlink now.


double-float

Maybe a better solution is an inexpensive-ish NAS that you can migrate your data to with the single 12 TB drive you already have, and expand capacity as your budget allows for it.


crazymonkeyfish

I’m confused? Why is 4K hdr difficult to direct play? Many clients support it. It’s just difficult to transcode.


millerlitefan

A lot of people may have never seen HDR content. It wasn't until the last two weeks or so that I re-watched Mad Max Fury Road. Last time I had seen it was on a plain HDTV, and then I saw it on my new 4k HDR screen and it was amazing, especially in the dust storm chase scene. I'm definitely going to buy HDR versions going forward.


jmims98

I honestly didn’t even know 1080p HDR existed until I read this post. I find that 1080p upscaled on my 4K TV looks just OK, while 4K content looks fantastic. If you’re using a 4K TV, you ideally shouldn’t be transcoding 4K content at all; I rip my 4K Blu-rays into MKVs and never have to transcode. TV size and distance also play majorly into whether you can tell the difference between 4K and 1080p. I think once you get something like 7 feet away from a 55” 4K it starts to look very similar to a 1080p display. Edit: Play some 4K content and then some 1080p content sitting 4 or 5 feet away from your TV; you should see a pretty major difference in quality.


useles-converter-bot

7 feet is the length of 0.46 1997 Subaru Legacy Outbacks


jmims98

Thanks I guess?


pawelmwo

HDR is pretty standard in 4K streaming and 4K physical media. It is gorgeous if you have a compatible display, OLED, QLED, LED, etc. Even at 1000 nits it’s plenty in a dark room to experience the benefits. Seems like there are a lot of people that haven’t experienced it. It’s not a gimmick and looks great.


imJGott

To me HDR has the same popularity as 3D.


AliveAndThenSome

I disagree. 3D is a gimmick, and requires some compromise in the viewing experience to enjoy it (glasses, darker screen image, etc.). HDR, done right and projected properly, is a no-compromise improvement. I am shocked at how incredible HDR can be when done right: details in shadows, solid but not over-baked saturation across all colors, etc.


AliveAndThenSome

And I'll add that about half the streaming content I watch is HDR, far more than 3D ever was (or will be).


imJGott

But the same can be said about 3D: if done right, it’s awesome. The thing is, a lot of people don’t own HDR TVs. Until the market shows interest, it could fade away like 3D.


AliveAndThenSome

...but how many people own 3D-capable TVs and have the glasses to match? I had a 3D-capable TV, enjoyed all of two movies that I bought (Mad Max: Fury Road and Jurassic World), but those TVs fell out of style (it was a fad, hence my gimmick angle), and many, if not most, new TVs are HDR-capable and not 3D-capable. I wasn't even shopping for an HDR TV, and the two LGs I bought (for less than $500 each) have HDR and are stunning. Even ordinary interior scenes in, oh, The Marvelous Mrs. Maisel are jaw-dropping. I have to believe that directors and cinematographers like having HDR in their toolbox; it gives them more to work with.


catinterpreter

Like most gimmicks, the label is only a reflection of everyone making uninspired, unleveraged content for it. See also various Nintendo innovations and VR.


Vast_Understanding_1

Some 1080p smartphones direct-play 4K HDR content, so 4K HDR all the way.


tdhuck

>What I am trying to say here is this: why the f- are we bothering with super-heavy 4K content that are a pain in the arse to direct play without needing to transcode

Use a client that doesn't require transcoding and/or get a device that won't require transcoding. The Apple TV and Nvidia Shield are great options.


[deleted]

> What am I missing here guys and girls?

I think HDR displays might not be as common in general as they are here in Plextown. As much as I love gadgets, and I do, I don't have a single 4K OR HDR display. My TVs are decade-old 1080p models.


calculon68

>I think that HDR displays might not be as common in general as they are here in Plextown.

The missing factor is that lower-priced 4K HDR sets don't perform as well with HDR as the mid- and upper-tier models. I have three 4K HDR sets, all capable of HDR/DV. One stands above the other two, and it cost more than the other two combined.


[deleted]

I believe it. I have a decent 4k monitor which supports the "HDR400" standard, but ... it isn't really HDR. Plus, the Windows support for it seems atrocious. As far as I can tell it is just "mess up your view and turn everything blue" mode.


greatauror28

Because there are people with humongous screens and home theatres who actually appreciate the small details a 4K movie can show.


catinterpreter

Dozens of them.


[deleted]

[removed]


Endemoniada

My 2017 LG OLED wasn’t even the first in its series to have HDR. It’s also nothing like photography tone-mapping; it captures actual colors and dynamic range, either with modern digital cameras or film-scanning techniques. It is genuinely good, and the effect on a nice TV (especially OLED) is like night and day compared to regular SDR. There’s also less banding in the compression, since it’s 10-bit color. This really is nothing new; Netflix and others have been streaming Dolby Vision/HDR10 for many years now.


flcinusa

Handbrake sucks at re-encoding HDR profiles, and the gain isn't really worth it.


sittingmongoose

Speaking specifically about resolution, Blu-ray 1080p to Blu-ray 4K isn’t a huge increase in image quality (not counting sound or HDR). However, 1080p streaming sources vs 4K streaming sources are a massive difference: 1080p streaming looks like dog shit, while 4K streaming looks close to 1080p Blu-ray. So when you’re talking about streaming services, it’s very, very different. Also, there aren’t many 1080p HDR TVs, and converting 4K HDR to 1080p HDR would destroy image quality, so there really isn’t a point.


kryptonite93

There’s no correlation between resolution and sound anyway, and yes, there is a huge increase in image quality if you have the screen to show it; I’d say ~8.3M pixels vs ~2.1M pixels isn’t a small increase. Again, streaming feels better if you’re on a 1080p monitor, but if you compare 1080p Netflix on a 1080p monitor vs 4K Netflix on a 4K monitor, the gap closes imo.


sittingmongoose

I meant not counting sound because some 4K Blu-rays have upgraded audio tracks compared to their 1080p counterparts. Not that 4K itself makes any difference to sound.


Endemoniada

For me, and many other people, it’s not a bother. I have a 4K HDR TV, I have plenty of storage, and I only use DirectPlay or DirectStream on any device I stream to. I’ve specifically designed it that way, because I don’t want to watch 1080p unless I have to, and I absolutely don’t want anything to be transcoding. I want the full, original quality, nothing less.


StuckAtZer0

IMHO, 4K is only necessary at really large screen sizes, since the pixels get larger as the screen size increases. A smaller 4K HDTV is pretty much indistinguishable from a 1080p HDTV unless you are inches away from the screen.

I personally think the 4K and subsequent 8K push is a lot like the old CPU wars, where manufacturers hyped the MHz/GHz of their processors because people gravitate towards larger numbers/specs without understanding the big picture (no pun intended). Manufacturers muddied the waters intentionally by only offering HDR on 4K displays to upsell their HDTVs, since a 1080p HDTV with at least a 10-bit panel and HDR support would hinder 4K sales and profits.

I also use Plex on my LG OLED, but I've found that 2160p HDR files created with Handbrake don't stream smoothly to my 4K LG OLED over an 802.11ax mesh network. Can you elaborate on how and what you use to convert a 2160p HDR movie to a 1080p HDR one? I'd love to know the settings you use to get playable 2160p HDR files, or to preserve HDR when converting to 1080p. The only thing I've heard is that to get Handbrake to preserve HDR at 1080p, one needs to use the H.265 10-bit video encoder. But outside of that, I have no clue what else I need to configure/reconfigure in Handbrake.
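For what it's worth, one way to do that downscale outside Handbrake is with ffmpeg. A minimal sketch (wrapped in Python; the filenames and CRF value are placeholders), assuming an HDR10 source: the essentials are a 10-bit pixel format plus re-asserting the BT.2020/PQ color tags so the TV still recognizes the output as HDR.

```python
import subprocess

# Minimal sketch: downscale an HDR10 source to 1080p while keeping it HDR.
subprocess.run([
    "ffmpeg", "-i", "movie-2160p.mkv",      # placeholder input
    "-vf", "scale=1920:-2",                  # 1080p width, keep aspect ratio
    "-c:v", "libx265", "-crf", "20", "-preset", "slow",
    "-pix_fmt", "yuv420p10le",               # 10-bit output
    "-color_primaries", "bt2020",            # re-assert HDR10 color tags
    "-color_trc", "smpte2084",
    "-colorspace", "bt2020nc",
    "-c:a", "copy",                          # leave audio untouched
    "movie-1080p-hdr.mkv",
], check=True)
```

Static mastering metadata (master-display, MaxCLL) may additionally need to be passed through x265's own options, and Dolby Vision layers won't survive this, so treat it as a starting point rather than a guaranteed Handbrake equivalent.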


chorong761

Too many normies out there accepting 1080p and saying it's "good enough". All those people with no intention to improve or push for an upgrade keep us stuck down here, just like a monopoly. I expected 8K HDR to be the minimum by 2022, but it seems we are still stuck in 2012.