Closed coreygreenberg closed 4 years ago
@coreygreenberg Thank you for the kind words and for using my software! Sorry I took so long to reply.
I'm sorry this happened but it's a known problem. Transcoding from 10-bit HEVC to 8-bit H.264 not only loses the High Dynamic Range (HDR) color information, but those HDR colors are not mapped correctly into the new limited color range. Thus, you get the washed out look.
This is a limitation of both FFmpeg and the various encoders. It's not something I can fix or even work around in other-transcode.
Worse, even transcoding 10-bit HEVC to 10-bit HEVC will have a similar problem because new HDR information is not generated by FFmpeg and the various encoders.
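For anyone landing here later: ffmpeg does ship `zscale` and `tonemap` filters that can do a *manual* HDR10-to-SDR conversion, which is one reason the "washed out" look isn't inevitable if you're willing to build the filter chain yourself. This is emphatically not what other-transcode does, and it requires an ffmpeg built with zimg support; the sketch below only constructs the command (file names are hypothetical) so you can see the shape of a commonly used chain.

```python
# Sketch: build (not run) a manual HDR10 -> SDR tone-mapping ffmpeg command.
# This is NOT other-transcode's behaviour; it assumes an ffmpeg build with
# --enable-libzimg. Input/output file names are hypothetical.

def tonemap_command(src, dst):
    vf = ",".join([
        "zscale=t=linear:npl=100",      # convert transfer to linear light
        "format=gbrpf32le",             # high-precision RGB for the tonemap filter
        "zscale=p=bt709",               # BT.2020 primaries -> BT.709
        "tonemap=tonemap=hable:desat=0",# compress HDR highlights into SDR range
        "zscale=t=bt709:m=bt709:r=tv",  # BT.709 transfer/matrix, limited range
        "format=yuv420p",               # 8-bit 4:2:0 for H.264
    ])
    return ["ffmpeg", "-i", src, "-vf", vf,
            "-c:v", "libx264", "-crf", "18", "-c:a", "copy", dst]

print(" ".join(tonemap_command("movie-hdr.mkv", "movie-sdr.mkv")))
```

The Hable operator is only one of several `tonemap` choices (`reinhard`, `mobius`, etc.), and results vary a lot per title, which is presumably why no tool automates this by default yet.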
Thanks Don, I was afraid you might tell me that.
HEVC! HUH!
WHAT IS IT GOOD FOR?
ABSOLUTELY NOTHING!
ESPECIALLY IF YOU'RE STICKING WITH 1080p FOR THE FORESEEABLE FUTURE AND JUST WANT TO SHRINK A 4K BLURAY DOWN TO SIZE SO PLEX DOESN'T PUKE TRYING TO TRANSCODE IT ON THE FLY!
So what are you doing now to transcode 4K BR rips? Or are you just leaving them as rips because FFmpeg can't properly transcode them yet?
Maybe it's just me and I'm wrong for sticking with 1080p because my beautiful plasma's the last one Panasonic will ever make and 4K does nothing for me on a 50" in my living room. But it sure does seem to me that HEVC is a step backward in most ways, particularly in how it makes life so much harder for hardware as well as software.
I hope FFmpeg gets updated to deal with these issues, because I find transcoding HD rips with your scripts oddly satisfying, and I want to continue shrinking movies and throwing them onto the Plex server. What can I say, I'm an Old and I like hosting my own library. The market just wants me to buy a 4K Samsung and subscribe to $800 worth of streaming services and let God sort it out. Those of us still trying to transcode physical media in 2020 so we can keep it on a private server are the back-of-the-trainers in "Snowpiercer" eating roach bars.
The issue you're facing isn't because the input is HEVC, it's because the input is HDR (High Dynamic Range). Among other things, this means it has a much wider range of colours, and it's this extra colour information that's lost during transcode.
Most 4K Blu-rays will be HDR (but I don't think they're required to be), so until ffmpeg is updated to handle HDR content correctly, you either need to stick to transcoding 1080p inputs (from normal Blu-rays), or leave the 4K rips as they are, untranscoded.
Ah, thanks @samhutchins, that makes sense. I still loathe HEVC though. The extra burden it places on playback silicon is regressive and inelegant, especially since it looks no better than x264 while making it a lot harder for hardware to deal with.
HDR SHMAITCHDR. You kids with your 10-bit video and your hoverboards. Feh. Don't need it. What did HDR ever give us except DeNiro with a 40 year old face and a 70 year-old mouth like Clutch Cargo gone horribly wrong?
Look, I don't automatically reject the new. I flipped to CDs the day they came out, even though it took awhile for CD mastering to get things right. I couldn't be happier with $30 Chinese eBay Class D amps replacing my old Class AB space heaters. "Goon" is a superior film to "Slapshot" despite its lack of Paul Newman and Hansons.
But when Catalina's a trainwreck, I stay on Mojave. When iPhones got so big they started crushing my balls I downsized to an SE, which is still in my front jeans pocket. And I'll be honest, I have yet to see a single demo that makes me yearn for 4K/HDR/Dolby Vision over 1080p on my Viera plasma. DVD was an unmistakable step up from VHS, as was HD (even 720p) over DVD. But 4K/HDR to my eyes is like 24/96 audio. I know why it's better on paper but my ears just don't hear it. I wish they did. I like the idea of 24/96. I wish I could hear the difference but I don't. That's how I feel about 4K/HDR. At least in terms of a 50" TV in a living room. Maybe if I was projecting at 120" I'd feel differently.
But where were we? I hope they lick this issue and get FFmpeg to handle transcoding HDR to reg'lar without borking color. I've seen the improvements in sharpness and detail that downconverting 4K to 1080p brings when importing my own 4K camera video into a 1080p FCPX project, and I'd love to see the same benefits when downconverting 4K BluRay.
Hi @coreygreenberg, just some background having read the above. Whilst it's getting easier and cheaper to buy 4K TVs and content (UHD/HDR), the state of the core transcoding tools (like ffmpeg, HandBrake etc.) is in catch-up mode.
Transcoding at 4K resolution and leaving it at 4K dimensions is today possible with hardware encoding or multi-core software encoding. It's also slow. But 4K raw content also came with the even bigger (and more important) advance: HDR. This is the real win for consumers/pro-sumers. But adding this to transcodes has not been solved yet so you get washed out colours for now. In addition, UHD audio content adds spatial/object sound codecs so if you have an Atmos setup, then you get spatial object sound. In a nutshell, transcoding "4K" content properly just got a lot more difficult with the toolset available because a lot of improvements were made to the spec.
Believe it or not, when used correctly, HEVC is better than AVC/H.264. Yes, it sometimes has a bad reputation, but it has better compression and, as a format, is needed to distribute the massive increase of data on 4K UHD Blu-ray discs.
A small group of people helped @donmelton throughout 2019 in testing other-transcode, and we can be even more critical than Don at times! Suffice to say, after significant effort, 10-bit HEVC transcodes of 1080p are by far the best solution. The tweaks and defaults provided by Don have taken months of effort and testing alone. Yes, it is more silicon intensive, but today's client-side hardware (FireTV 4K, AppleTV 4K, NVIDIA Shields, Rokus) is hardware accelerated specifically for this. They work brilliantly and I strongly recommend trying one (for Plex playback). They also bring two other massive advantages (excluding the Rokus): they direct play embedded PGS subtitles, which means there is no load on either the server-side Plex server or the client side. And secondly, the FireTV 4K and Shield handle most audio without transcoding too.
I suppose in summary, @donmelton's new tools have been privately tested for over nine months, give fantastic results for 1080p transcoded content and 10-bit HEVC is the best solution. I hope you give it a chance because you'll really benefit.
As for 4K/HDR, nobody in the open source community has solved this so far as HDR transcoding is super-complex. Someday it will be possible but for now, @samhutchins' advice for 4K content is correct: stream it in its raw format to keep all the 4K/HDR goodness. Use the traditional BR disks which are also included to transcode the 1080p content ... and give 10-bit HEVC another try - you won't be disappointed.
Also, one final thing: Plex plays raw 4K HDR without transcoding direct to 4K TVs if it helps. But your setup has to be correct end-to-end.
Once all of this is in place, not only can I Direct Play raw/ripped UHD disks directly to my LG OLED via ATV4K and/or FTV4K running the Plex client, but the Plex server CPU load is no more than 6-8%. Also, I can just turn on any embedded subtitle and it is also direct play. All HDR data is preserved and used on screen.
So it's worth taking a look at the setup for 4K if you're seeing server-side transcoding for 4K too. You will definitely see it for 2160 -> 1080p if basing things solely from the UHD content but that's where the previous advice stands: 4K raw + 1080p-sourced transcoded is the solution as it stands today.
@rhapsodians,
Thanks for the info, I really appreciate your taking the time to explain things.
I guess it's all down to nomenclature. What I think you mean when you say HEVC is better than H.264 is that it has greater compression efficiency, i.e. it delivers equivalent visual quality with smaller file size. On that we agree 100%.
What I mean when I say HEVC is no better than H.264 and in crucial ways worse is I don't really care whether a transcoded movie file is 4GB or 6GB, but I do care about HEVC's greater demands on encoding/decoding hardware which seem to me to be a step backward at a time when everything else is moving in the direction of lower power and greener footprint. Even if the difference amounts to a handful of watts per playback, it's still regressive and inelegant, and doesn't seem worth the slightly smaller file size. Maybe I shouldn't care about a few watts more but I confess that I do. It's a step backwards, and it's kind of a hoax -- smaller files and lower bandwidth for content providers, but end users pay the price on every playback. Whether it's software or hardware decoding, your playback system sweats more with HEVC than H.264. It's not supposed to go that way with successive gens. If Class D amplification was less power efficient than Class A, B, or AB, we wouldn't be talking about Class D because it would have been shelved until they figured out how to do a Class E that did have greater power efficiency than older designs. That's how I see HEVC. It was sold on the free lunch of smaller file sizes for the same or better video quality, and only neckbeards would ever actually read the fine print and notice the increased toll it placed on hardware fore and aft, and the higher power demands passed onto consumers.
I do own an Apple TV, the 4th gen 1080p version. I use it almost solely as a Plex client, except for when I'm in the YouTube app watching Limmy's descent into madness on Twitch. I'm going to do a couple of test transcodes tonight with the BluRays of "Ad Astra" and "Once Upon a Time in Hollywood" and see how the 10-bit HEVC you guys like compares with the H.264 transcodes I did a few weeks ago. I'll report back on the file size differences, quality differences if I notice any, and Apple TV Kilowatt readings when playing HEVC vs. H.264. I will also update my Bumble profile with news of this project, because this kind of stuff is goddamned CATNIP to the wimmens.
Hi @coreygreenberg, thanks for the explanation and believe it or not, I mostly agree with what you say especially the green footprint side of things, especially tech.
But there's one very important consideration here which you may not have considered: your 4th gen ATV uses the A8 compared to the more efficient A10 in the ATV4K. This is important because the A8 will do software decoding (high energy, maxing the CPU) whereas the A10 will be h/w accelerated (lower energy). When using HEVC on the older model, you'll see random stuttering as the CPU fails to keep pace - doesn't happen on the A10-based ATV4K so I'd guess it's lower power too as a result.
As for file size, yes HEVC gives me 30% smaller sizes and that's important as I have a large collection of transcodes on NAS, internal and offsite backups. The overall smaller footprint means I don't have to buy more h/w to run and backup my collection.
But that aside, 10-bit HEVC, especially using Nvidia NVEnc gives significantly better output as it virtually eliminates colour banding. Intel QSV also gets close albeit more slowly and not as fine-tuned as Nvidia. Nothing in the H.264 space gets close though. And transcoding with something low power like a GTX 1660 dGPU uses just 36W, runs at 55C and the fans on the card don't even spin up. As a one off, it's pretty efficient for a transcode process.
I have to be honest in that I haven't looked at the power draw per play but as h/w decoding/encoding is being built into more and more silicon, the power per play is dropping drastically too. The FireTV 4K needs 5.2V and whilst it needs to plug into the mains, it's just marginally above the USB bus power so a very low power device. And whilst the ATV4K's UI is better, I also suspect it's lower power too as custom silicon to enable HEVC in h/w is present too.
I guess what I'm saying is that the most modern Plex client h/w (in 2020) will give the advantages of HEVC playback with minimal usage per Watt because they are designed to do so by custom silicon. Once you transcode to 10-bit HEVC (Windows only, not possible on a Mac as Apple's VT only produces 8-bit from a Blu-ray raw source), then you benefit by a better quality transcode (punchier colour, virtually no banding) and super smooth playback. If run on new client-side h/w, then serverside transcoding of 1080p can also be virtually eliminated at home (saving energy here) as Direct Play is the standard even with embedded PGS subs.
It's worth considering a h/w upgrade to avail of the benefits, both quality and energy usage. And also then get to enjoy the better results.
Hi @coreygreenberg,
If reducing power consumption during playback is your primary concern, I recommend buying a new TV. I do not know the exact model you have, but you mentioned it was one of the last plasma TVs Panasonic made. Assuming it is the TC-P50ST60 (released in 2013), then the average power consumption when playing content is 136W (https://shop.panasonic.com/support-only/TC-P50ST60.html). A current generation OLED TV, such as the LG C9, consumes 89W (https://www.rtings.com/tv/reviews/lg/c9-oled) and a current generation QLED-LCD TV, such as the Samsung Q70, consumes 58W (https://www.rtings.com/tv/reviews/samsung/q70-q70r-qled). This reduction in power consumption would dwarf any reduction from the player. For example, an Apple TV 4K playing 4K HDR content consumes between 5.58W and 6.07W depending on voltage (https://www.apple.com/environment/pdf/products/appletv/Apple_TV_4K_PER_sept2017.pdf), so the maximum saving you could achieve by using a more power efficient video codec would be ~6W. Whereas replacing your TV with the Samsung mentioned above would result in a 78W saving. That is a power saving 13 times larger than anything you could gain by selecting a different video codec.
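To make the comparison above concrete, here is the arithmetic spelled out, using the figures as quoted in the comment (not independently re-measured):

```python
# Sanity-check of the power figures quoted above (values as cited, in watts).
plasma_w = 136   # Panasonic TC-P50ST60, average during playback
qled_w   = 58    # Samsung Q70
atv4k_w  = 6     # Apple TV 4K playing 4K HDR, ~6 W maximum

tv_saving = plasma_w - qled_w   # saving from replacing the TV
# Best conceivable codec saving: the player somehow drew 0 W instead of ~6 W.
codec_saving_max = atv4k_w

print(tv_saving)                        # watts saved by a new TV
print(tv_saving // codec_saving_max)    # how many times larger that saving is
```

In other words, even an impossibly perfect codec could only ever recover the player's entire ~6W draw, while the TV swap recovers 78W.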
While far less significant than changing your TV, there are power consumption advantages to HEVC over H.264. One of these advantages is related to the reduction in file size that can be achieved with HEVC. If you were able to realize a 25% reduction in video file size compared to H.264 (which Don's script does), then you could (hypothetically) use a 6TB HDD rather than an 8TB HDD to store your content. This would result in a reduction in the power consumed by that HDD when reading from it of at least 3.5W (https://documents.westerndigital.com/content/dam/doc-library/en_us/assets/public/western-digital/product/internal-drives/wd-red-hdd/data-sheet-western-digital-wd-red-hdd-2879-800002.pdf). There would be further power savings when the HDD is idle but not when in standby mode.
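The storage argument works out as follows (the 25% figure and the ~3.5W drive delta are as cited above; the capacities are illustrative):

```python
# Illustrative: a ~25% smaller library can drop a tier in drive capacity,
# which the cited WD Red data sheet pegs at roughly a 3.5 W difference
# when reading. Capacities here are the example from the comment.
library_tb = 8.0                              # H.264 library needing an 8 TB drive
hevc_library_tb = library_tb * (1 - 0.25)     # 25% smaller with HEVC

print(hevc_library_tb)                        # fits on a 6 TB drive instead
```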
This is a long-winded way of saying: 1) If you care about energy consumption while playing content on your TV, buy a new TV. 2) HEVC is not as bad as you suggested.
@rhapsodians The information you provide here is great. Maybe we can convince @donmelton to let you add it to the wiki for future users to have readily available, since most people don't tend to dig through all the issue threads like some of us :)
@coreygreenberg ... you mentioned the following:
"...I have yet to see a single demo that makes me yearn for 4K/HDR/Dolby Vision over 1080p on my Viera plasma. ... That's how I feel about 4K/HDR. At least in terms of a 50" TV in a living room. Maybe if I was projecting at 120" I'd feel differently."
Are you seriously saying you cannot see the difference between a 1080p SDR movie and a 4K/HDR movie on a 50" screen?? It's like going from DVD -> Blu-Ray!! As @martinpickett says, I think you need a new TV.
Also, Dolby Vision is one of the HDR profiles, not a separate thing. Today we have HDR10, HDR10+, Dolby Vision and HLG (used for broadcasts) with HDR10/Dolby Vision being the prevalent two.
Last night I tested other-transcode with three different encodes of the original VC-1 1080p BluRay of "Once Upon a Time in Hollywood". I followed Don's usual protocol of prepping with MakeMKV to remove unwanted audio and subtitle tracks and encode DTS-HD to FLAC, and then did three encodes:
Intel hardware-encoded H.264 using the terminal command other-transcode --eac3 --crop 140:140:0:0 /Volumes/Media/Plex/Once\ Upon\ a\ Time\ in\ Hollywood_t00.mkv
Intel hardware-encoded HEVC using the terminal command other-transcode --hevc --eac3 --crop 140:140:0:0 /Volumes/Media/Plex/Once\ Upon\ a\ Time\ in\ Hollywood_t00.mkv
Intel hardware-encoded HEVC "10-bit" using the terminal command other-transcode --hevc --10-bit --eac3 --crop 140:140:0:0 /Volumes/Media/Plex/Once\ Upon\ a\ Time\ in\ Hollywood_t00.mkv
I know, regular BluRay is only 8-bit. I did the third encode because I was curious if adding the --10-bit option would do anything different than just using --hevc.
Once I had the three files, I plugged my Gen 4 Apple TV into a Kill-A-Watt to see if there was any difference in terms of power draw between HEVC and H.264 playback. There was no difference back at the server, as Plex Media Server merely passed the files over as-is and let the Apple TV handle decoding.
The server running Plex Media Server, transcoding with Don's wonderful scripts and hosting all my media is a Hackintosh on macOS Mojave. CPU is an Intel Skylake i3-6100 with onboard HD-530 GPU and no external video card.
Right off the bat, we need to toss out the "HEVC 10-bit" encode because for whatever reason, the --10-bit option produced ridiculous Venetian blind artifacts. I remember discussing H.265 encoding with Don on Twitter a few years back and complaining that to my eyes it wasn't near ready for prime time because of this exact artifacting, and he agreed that x264 was still the gold standard. I guess he's come to like H.265 in its current iteration, but jeez, I mean, look at those red extras behind Steve McQueen:
HEVC "10-bit"
H.264 "Ahh, that's better"
Click on the images to embiggen them so you can see the damage done to the HEVC encode. The HEVC encode without the --10-bit option was free of these artifacts:
HEVC "8-bit"
Oddly, only the terrible looking 10-bit HEVC encode was able to pack the film into a smaller file, at 8.06GB vs H.264's 10.46GB. The 8-bit HEVC was actually a bit larger, at 10.47GB! That was a surprise. I'd been led to expect a smaller file size with HEVC encoding vs. H.264. Visually I couldn't tell the difference between them, whether on my 1080p plasma or pixel peeping on my Apple 27" monitor.
In terms of the Apple TV's power draw, using the Kill-A-Watt I measured 4.5W at idle, 5-5.5W with the H.264 file, and 7-7.5W with the HEVC.
With the Skylake i3-6100, other-transcode was able to encode the film to H.264 at 6.3X. The HEVC encoded at 1.75X.
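Those speed multipliers translate into very different wall-clock times. Assuming the film's roughly 161-minute runtime (my approximation, not stated in the thread):

```python
# What 6.3X vs 1.75X means in wall-clock encode time,
# assuming a ~161-minute runtime (approximate; not stated above).
runtime_min = 161

h264_speed = 6.3    # hardware H.264 via other-transcode's default
hevc_speed = 1.75   # hardware HEVC on the same Skylake i3-6100

print(round(runtime_min / h264_speed))   # minutes for the H.264 encode
print(round(runtime_min / hevc_speed))   # minutes for the HEVC encode
```

So on this hardware the HEVC encode takes well over an hour longer for the same film.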
So I think I'll stick with 1080p and H.264 for now. I fully understand that my priorities may differ from some or even all of you. Like I said, I love my last-one-off-the-line 1080p Panasonic plasma (P50ST50) to death and haven't seen an OLED or other 4K TV that looks overall better to me. If that means missing out on HDR and 10-bit, Dolby Vision and whatever else comes with the new sets, I'm okay with that. If you think I need a new TV, I'm okay with that too.
I can see the difference between 1080p SDR and 4K HDR on a 50" screen -- I just don't think the latter looks "better" than what I've got, and definitely not from ten feet away. I know it's antithetical to CE but I'm beyond happy with what I've got. And beyond 1080p vs. 4K and SDR vs HDR and all the rest of it, the simple fact is I strongly prefer the plasma's picture to any OLED or LCD I've looked at. What can I say? Billie Eilish just won an armful of Grammys. There's no accounting for taste.
I do appreciate all the time you guys took to contribute to the thread. Even if I'm the only one here happy to carry on transcoding VC-1 BluRays to H.264, I'm still grateful to Don for all the work that he's done and continues to do to drill down on superior transcoding, as well as all of you who help him with your feedback. And when 8K hits and some kid's on here telling you your problem is you need to ditch your 4K and buy a new TV FFS, try to remember that you were young once, too.
Hi @coreygreenberg
I owe you an apology, my comment yesterday was snide and unwelcoming, I am sorry. You deserve better from me and so I will attempt to make up for it by providing a little information about the file size differences you saw.
other-transcode essentially uses an average bitrate control mechanism. This means that the whole video's average bitrate is controlled, but various scenes are allowed to go above or below this average depending on their complexity. Don offers a better explanation here. The other-transcode default average bitrate for 1080p output is 8000 kbps. The default rate is used for both H.264 and 8-bit HEVC encodes. This is why your H.264 and 8-bit HEVC encodes have virtually the same file size. An exception is made for 10-bit HEVC, where the bitrate is reduced by 25% to 6000 kbps for 1080p output. Hence, your 10-bit HEVC encode has a smaller file size.
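Those defaults predict the file sizes reported earlier almost exactly. A rough back-of-the-envelope, assuming the film's ~161-minute runtime (my approximation) and the 640 kb/s EAC3 audio used in the tests:

```python
# Rough file-size prediction from average bitrate, assuming a ~161-minute
# runtime and 640 kb/s EAC3 audio (both assumptions, not from the thread).
def size_gb(video_kbps, audio_kbps=640, minutes=161):
    seconds = minutes * 60
    bits = (video_kbps + audio_kbps) * 1000 * seconds
    return bits / 8 / 1e9   # decimal GB, ignoring container overhead

print(round(size_gb(8000), 2))  # default H.264 / 8-bit HEVC rate -> ~10.43 GB
print(round(size_gb(6000), 2))  # 10-bit HEVC rate                -> ~8.02 GB
```

Those line up with the observed ~10.46 GB and ~8.06 GB; the small gap is container overhead and subtitle tracks.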
One other bit of information. Skylake processors officially do not support hardware accelerated 10-bit HEVC encoding. I know Wikipedia cannot always be trusted but there is a good table detailing which processor generations support which codecs for encode and decode here. I cannot explain why your system is capable of hardware accelerated 10-bit HEVC transcoding, but the lack of official support may explain the relatively poor visual performance you are getting.
Hi @coreygreenberg,
Whilst this thread started out as a 4K UHD -> 1080p washed-out colour question, it's gone down quite a surprising HEVC route, whether 4K or 1080p, energy usage and basically disagreeing with the defaults provided by @donmelton in other-transcode. Everyone is entitled to their opinion and we listen to and respect that, because if you prefer H.264 1080p on your plasma, then cool.
But I suspect many others will be reading this conversation and may start doubting the defaults too. So I'd like to address some of the items mentioned above, partly to explain some differences but also to ensure that the detail and context is available for everyone else now and in future. If I get any of the following wrong, I'm happy to correct it too.
Apple & VideoToolbox and hardware transcoding: The VideoToolbox (VT) framework is Apple's abstraction layer so that applications (open source, commercial) can interface with the underlying hardware and software. This is the API used to access the appropriate hardware encoders/decoders. It has to cover a multitude of scenarios, from old to new hardware to multiple encoder/decoder options. Apple makes (externally undocumented) choices which essentially lock us out, so we can only summarise what we think is happening with VT.
If you had an older Mac just using Intel's integrated GPU, then the encoder element of that iGPU may be available for h/w encoding. Each iteration of Intel's CPUs brings changes (https://en.wikipedia.org/wiki/Intel_Graphics_Technology), both from a marketing and capability perspective. So what you can do with a 2015 Skylake will be different from a Kaby Lake Refresh, and the same with Xe coming later this year. Apple makes choices on what they want the API to do in each s/w release and it is not customisable. So you take what you get ... ffmpeg, HandBrake, Final Cut and Premiere are all constrained in this part.
You then complicate it with AMD graphics cards (mobile and/or externals via eGPUs/Mac Pro options) and their internal encoders. These are very different beasts and choices are made by Apple again without exposing them or having the ability to change/tweak them. Add in T2 chips and more encoding functionality is being offset from the GPUs to the T2 chips too in the newest Macs.
So VideoToolbox is an abstracted framework which can't be externally tweaked, and the choices picked by Apple offer a generic API, which leads to compromises in HEVC encoding quality when targeting VT. We've tested it. I don't know if there are separate issues with Hackintosh implementations or not.
Non-Apple implementations - Intel QSV, Nvidia NVEnc etc.
When you move onto Windows (and to a certain extent Linux), the configurability/tweaking capability changes. ffmpeg (and HandBrake) take advantage of more knobs and dials on Windows by accessing published QSV and NVEnc controls, thereby giving potentially better encode results. I say "potentially" for a reason, as we've discovered soooo many inaccurate and bad recommendations online over the past nine months or so, covering both H.264 and HEVC. Many come from a game streaming requirement as opposed to one-off encodes where quality is the main driver.
Over months of testing, Don and a number of us have discovered some major tweaks. For example, we were surprised at the quality of Intel native QSV HEVC encoding, but it is slow-ish and not very configurable relative to Nvidia's separate encoder chip(s). Adding 10-bit made QSV quite impressive too. And then Nvidia's NVEnc: 8-bit NVEnc transcoding with months of tuning still gave quite a lot of banding and artefacts, especially in dark or similarly coloured segments (e.g. sky, candlelight etc.). In this case, on many occasions, H.264/AVC software encoding (with Don's previous and current magic formula, e.g. other-transcode --x264-avbr) worked better. Not always, but enough to question nvenc HEVC quality even at comparable bitrates.
But add 10-bit to the equation, then colours popped and banding virtually disappeared. BUT it needs to be encoded by a Pascal or Turing Nvidia GPU. Things like spatial and temporal adaptive quantization (latter in Turing) just added to the quality. This is why @donmelton recommends Windows with Nvidia in his ReadMe (https://github.com/donmelton/other_video_transcoding/wiki/Platforms). When side-by-side testing the most challenging movies and segments with the best H.264 v. 10-bit HEVC NVEnc, 10-bit HEVC won every time. Downside, unless you have an eGPU with an Nvidia on a Mac booted into BootCamp, you can't avail of 10-bit HEVC Blu-ray transcodes on Macs today.
So the quality curve is Win10/Nvidia NVEnc 10-bit HEVC, Win10/Intel QSV/10-bit HEVC followed by 8-bit VideoToolbox HEVC ... but the latter can be of variable quality and at times, the older H.264/AVC output is better.
Back to this topic
When you transcode your tests, you're actually using the worst of the new hardware encoding options (VideoToolbox), and that's sad coming from a Mac user of 30+ years. So when many people here say "10-bit HEVC", we actually mean "Nvidia NVEnc 10-bit HEVC". I guarantee you there's a very noticeable difference between the Apple-encoded output v. the Nvidia one. And the Nvidia option has to be the 10-bit variant with spatial (and, if available via Turing, temporal) AQ. [For reference: other-transcode --nvenc --hevc --nvenc-temporal-aq ... spatial AQ is on by default for Pascal- and Turing-series Nvidia.]
I've transcoded over 1500 movies/TV shows with the older software-based video-transcode (which uses HandBrakeCLI) on my Mac and got very good results (albeit annoyed with the last 1-2% where banding or blockiness was an issue). I then repeated the massive re-transcode with 8-bit NVEnc via Win10 and was mostly happy, but maybe 5% were better with software H.264 encoding. But the revelation in testing was 10-bit HEVC via NVEnc - it was a massive step up in quality. That triggered another massive re-transcode.
So I think we need to compare like-with-like, because you may be seeing the disadvantages of 8-bit HEVC compared to the output from 10-bit NVEnc HEVC.
I've been using @donmelton's scripts since they were bash-based ... when QSV and NVEnc quality was realised in private other-transcode testing, I built a custom Windows 10-based transcoding rig. 10-bit HEVC made me realise that the investment was so worth it in comparison. And I've looked at the content on multiple screens and technologies, from 5-8 year old backlit LCDs to brand new LG C9 and Sony OLEDs (and many in between): NVEnc 10-bit HEVC (with 640 kbit EAC3) is as close to the original raw BluRay playback as I've ever seen ... and I'm picky.
Hope this added some background but the most important thing is to enjoy the content on screen :-)
Oh, I forgot to mention: if someone showed me those three pictures, I too would throw out the 10-bit HEVC one. I have no idea why you're seeing this - is it Hackintosh-related? I genuinely don't know, other than to suggest that your Skylake (specifically QSV) doesn't support 10-bit HEVC in h/w (see the table linked to by @martinpickett earlier), so ffmpeg is falling back to something else ...
@coreygreenberg
One final thing ... did you use VLC to capture these images? If yes, that's a problem - VLC is awful in this scenario. Please use MPV instead. (https://mpv.io)
I went and looked at the movie from my own transcode (other-transcode --nvenc --hevc --nvenc-temporal-aq) and it's crystal clear.
But when I played it back via VLC, that blockiness appeared (see below).
VLC
MPV
Also, FYI ... did you know that the lead developer of mpv now works for Plex? The 'experimental' player in Plex is actually mpv ... and it's good.
@martinpickett No need for any apologies but thank you. I mean, I get it. I'm inordinately passionate about this stuff too. When I read some fellow old crowing about how his 1940's Klipschorns still beat the pants off any speakers today I think good for you gramps, back in your Rascal now so you don't hurt yourself. But maybe he isn't just some ignorant old. Maybe he was hardcore into this stuff when he was my age, chasing that demon like Chuck Yeager, churning through Bozaks and Dynacos and even building his own speakers from surplus parts. Maybe those truly godawful Klipshorns finally gave him that sound he remembered from the PA at the local dance hall when he first danced with that girl that he wound up raising a family with for forty years before she passed away, and now when he listens to those speakers they take him back to that first dance. The CE industry used to value guys like him before it turned into an eternal insecurity generator that sees longterm customer satisfaction as the enemy of quarterly profits and does everything it can do to make you feel out-of-the-loop if you're still using a two year-old phone, god forbid a five year old CPU. Five year-old CPU?! Was that from the Union side or Johnny Reb?
@rhapsodians Yes, I'm aware that the capabilities and quality of HEVC hardware encoding vary as you move across Intel gens, as well as Nvidia's and AMD's own iterations. It's a fast moving target that essentially forces you to replace your encoding platform every year at the very least. And now you're saying to get the best results you need Windows and an Nvidia card? Yeah, no thanks. I'm sufficiently techy and obsessive to drill down and know who Don Melton is, but I'm not going to replace my motherboard/CPU/GPU every year and go back to Windows just to get the latest HEVC ballz. I will certainly try whatever new tricks Don comes up with, but the honest truth is my other-transcode H.264 files look indistinguishable from the originals to me. If there are artifacts I'm not noticing, or if my playback chain isn't sufficiently hep to resolve them, I am at peace with this. After years of tweaking Handbrake and never being completely thrilled with what I was seeing, Don's scripts were the clouds parting. Even "old busted" transcode-video delivers files I'm completely happy with. Now that other-transcode gives me equivalent H.264 with vastly quicker encoding, I'm beyond happy. I know, there's that word again. "If you're happy you're not paying attention!" Sarge barked as he slapped the canteen from my lips. I am glad you guys are chasing the demon and buying new hardware whenever the chipsets refresh. It's got to be better for you than Fortnite. Me, I'm good with what I've got now. The only reason I even opened this issue in the first place was because I saw how good my 4K camera footage looks when FCPX downconverts it to 1080p, and I wondered if the same trick was possible with 4K BluRays. Now that I know it isn't, even with the hardware some of you guys have, it's frankly a relief.
So you think other-transcode --x264-avbr is better than just straight other-transcode, hmm? Argh, okay, I'll bite. Will I see any difference given my clearly rheumy eyes, outdated TV and Victorian-era Apple TV? Only another test will tell. Okay, I see that it's churning along at .72X, about what I got on this server with transcode-video. Am I just giving FFmpeg the same orders but in a different accent?
Re: VLC, yeah, it's the pits, even I know that. No, I'm an IINA man through and through, which I believe is really MPV under the hood. And I think I did read about MPV in the latest Plex experimental player, but I'll have to wait, as I let my Plex Pass lapse years ago. In answer to your question, I used IINA for the screen grabs, and I see the same artifacts in the HEVC 10-bit encode on my TV and on my Apple monitor as I do in IINA.
Re: Hackintosh, I'm pretty confident there's no inherent difference between a vanilla Hackintosh like mine and an equivalent Apple Mac when it comes to video encoding. Although Apple never made an i3-6100 Mac, it did do Skylake iMacs like the 5K 27", so the chipset is supported in macOS and my Skylake rig is 100% functional; everything Just Works and it benches just like an equivalent Mac. But I agree that 10-bit HEVC and my Skylake did not get along. Fortunately for me, I am perfectly happy with H.264 and x264, so it's not really an issue.
Still waiting on the new "Once Upon a Time in Hollywood" x264 other-transcode, puttering along at .72X. Already I miss the 6X encoding speed of the default, but I'll let you know if I see anything interesting. Thanks for all your info, I've learned a ton from this thread.
@coreygreenberg
If you compare software-based transcoding (other-transcode --x264-avbr) with VideoToolbox-based hardware encoding (other-transcode), then your mileage will vary. Yes, h/w transcoding will be faster, as you've seen, and the quality will be similar in most cases, but there will be scenarios where the VT-based h/w transcode may look worse due to more noticeable colour banding or blockiness. That's mainly because the h/w encoder can't be tuned, due to the closed nature of what's available. So give both a shot: use other-transcode for the majority, and then if something doesn't look quite right via h/w, drop back to other-transcode --x264-avbr
Also, I suspect we'll find that IINA is another player which decodes HEVC incorrectly ... which is why you're seeing the blockiness. Native MPV and Plex's Enhanced Player correctly decode 8- and 10-bit HEVC.
And one point to note ... we're not constantly upgrading at every hardware refresh. It doesn't work that way at all. Perhaps if you're into Fortnite, yeah sure, but transcoding is a different beast. The only time we upgrade is when there's a massive benefit.
So it's important to realise that hardware transcoding isn't about chasing the latest new thing; it's about whether the latest new thing actually brings any transcoding benefits. Most don't, until there's a major leap in things like NVEnc, Intel Xe or AMD's new stuff later this year.
Anyhow, enjoy "Once Upon a Time ... in Hollywood" ... it's a good movie :-)
@rhapsodians
So I compared the x264 encode with what I got from other-transcode's default H.264, and despite it taking approx 6X as long to encode, the image quality was indistinguishable to me. So as you said, I’ll stick with VT for now, and if I see something jicky I’ll try it with the x264 option.
I’m pretty sure IINA is just MPV with a GUI, at least that’s what the dev claims and what everything I Googled seems to confirm. And I first noticed the weird 10-bit HEVC artifacts watching “Once Upon” on my TV via Plex on Apple TV, not on a desktop media player. I merely confirmed that the artifacts were there by taking a second look on the desktop with IINA. I’ll try native command line MPV though and report back if I see any difference playing that problematic 10-bit HEVC encode.
Maybe my comments about needing to upgrade hardware every year to keep up with the Alpha Don were a bit overblown, but if getting the very best that other-transcode can deliver means building a Windows 10 PC with a newish Nvidia GPU, I’ll settle for the lower tier I’m getting on a Mac with VT. I’ve got enough spare parts to throw together a Windows box, and a 1050 off eBay will cost me $100 or so, but I swore off Windows decades ago, and every time I’m asked to sort something on a friend or colleague’s Windows PC, it’s an awful experience. I don’t wish to keep a Windows box running on my network just for transcoding. My hope is that FFmpeg catches the Mac and Intel/AMD GPUs up to where Windows and Nvidia are with Don’s scripts now, but if that never happens, I’m happy with what I’m seeing from my current setup. I look at this like I do Catalina: if this is really where Apple is going with macOS, then I never thought I’d be that guy, but I guess I’m standing pat with Mojave. If future apps won’t run on my OS, I’ll use whatever still runs under Mojave until Apple gets macOS back to where it was before this deal-breaking debacle.
Agreed on “Once Upon”. Easily my favorite of his films since the “Kill Bills”. I could watch Brad Pitt’s stunt double throw fake Bruce Lee into the side of that beautiful car on endless loop all day. And I’d watch a whole movie of the fake Steve McQueen. I never realized how much Damian Lewis looks like him. He needs to star in the remake of “The Sand Pebbles”.
So I have a question for you and the other Meltonians. Don at one point listed some specific movies and scenes that made life particularly hard for transcoding and thus were good benchmarks for testing. Do you have any specific discs and/or scenes you lean on when comparing different transcoding schemes? Other than that weird glitching when my Skylake puked on 10-bit HEVC, I have yet to notice anything awry when transcoding any of my discs since even the first rev of other-transcode.
Any torture tests you recommend would be welcomed and definitely explored here.
@coreygreenberg The Blu-ray version of "Blade Runner 2049" is an excellent test case rife with many scenes prone to color banding. And they're hard to miss. I would recommend trying that first.
My thanks to everyone adding content, but I think we've strayed from the original topic here. While the color change problem you noticed is definitely undesirable, it is, as I mentioned before, a limitation of both FFmpeg and the various encoders. It's not something I can fix or even work around in other-transcode.
I'll close this now but feel free to continue commenting. Thanks.
Hi Don,
I've been using other-transcode since you released it and I like it a lot, thank you for your efforts in making lickety-split hardware encoding look as good as software.
Something odd happened today though and I wanted to flag it for you. When I transcoded a 4K HEVC BD rip of "Casino Royale" to 1080p H.264, I was surprised to find that the transcoded 1080p had very different colors than the original. I've used other-transcode to transcode VC-1 BluRay rips to 1080p H.264 at least a dozen times and never noticed this issue until now.
Here's a screenshot of the original 4K HEVC BluRay playing in IINA:
And here's a screenshot of the 1080p file after other-transcode, also playing in IINA:
It seems that transcoding HEVC to H.264 changes the gamma somehow, or at least it does with my setup and this particular BluRay. Did I miss something in terms of options when transcoding 4K HEVC to 1080p H.264? I tried transcoding again with the --no-filters option, but it did not fix the colors.
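For what it's worth, that gamma hunch points at the likely mechanism: 4K HDR discs encode brightness with the SMPTE ST 2084 "PQ" transfer curve, and when those code values end up interpreted through an ordinary SDR gamma curve, shadows come out lifted and highlights come out flat, which reads as "washed out". Here is a rough, illustrative Python sketch of the mismatch. The PQ constants come from the ST 2084 spec; the gamma-2.2, 100-nit SDR display model is a simplifying assumption, not what any particular player or encoder actually does:

```python
# Illustration of why PQ-encoded (HDR) video looks washed out when its
# code values are interpreted with an ordinary SDR gamma curve.
# PQ constants are from SMPTE ST 2084; the gamma-2.2 / 100-nit SDR
# model below is a simplifying assumption for demonstration only.

def pq_eotf(v, peak=10000.0):
    """SMPTE ST 2084 (PQ) EOTF: code value in 0..1 -> luminance in nits."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = v ** (1 / m2)
    return peak * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def gamma_eotf(v, peak=100.0):
    """Naive SDR display model: gamma 2.2 with a 100-nit peak."""
    return peak * v ** 2.2

for v in (0.10, 0.50, 0.75):
    print(f"code {v:.2f}: intended {pq_eotf(v):8.2f} nits, "
          f"shown as SDR {gamma_eotf(v):6.2f} nits")
```

Under these assumptions, a shadow code value of 0.10 that PQ intends as roughly 0.3 nits gets shown at about 0.6 nits (blacks lifted), while a highlight code of 0.75 that PQ intends as nearly 1000 nits gets shown at about 53 nits (highlights crushed). Lifted shadows plus dulled highlights is exactly the flat look in those screenshots.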
It's worth mentioning that playing the BluRay rip with IINA shows the true color balance, but if I use IINA to generate a screenshot, the PNG shows the same color anomalies as the transcoded 1080p file. If I use macOS's native Screenshot utility instead, the PNG shows the proper colors.
Anyway, not life or death, but definitely weird enough to report back to you in case anyone else is seeing this issue.
FWIW my system is a Skylake i3-6100 running macOS Mojave, with an AMD 280X GPU (although other-transcode seems to ignore the AMD card and just use the Intel HD 530 integrated GPU).
Any ideas? Thank you again for all your amazing work!
Corey