SICKSAUCE opened 2 months ago
I'm also trying to reverse engineer the spreadsheet.
Looking at the example diffs (here and here), it seems like the general algorithm can be reduced to something like:

1. Take your input hex code and convert it from hex to binary.
2. Determine which of the modes your TV supports (the options listed in cells AB18:AB21 of the spreadsheet) and set the relevant bits accordingly.
3. Convert your new bits back into hex.

I don't currently understand enough about the EDID spec encoded in the hex format to know for certain, but it seems like the bits we need to manipulate are always in the 5th and 6th "characters" of the hex code. Specifically, they are the last 2 bits in the binary representation of that 5:6 hex range.
Is this all correct?
If so, is it possible to reduce this to a short python script that does the above?
Something like:
```python
import re


class DolbyModes:
    # the last 2 bits of the 3rd hex pair
    LLDV = int('00', 2)
    LLDV_and_LLDV_HDMI = int('01', 2)
    STD_and_LLDV = int('10', 2)
    STD_and_LLDV_and_LLDV_HDMI = int('11', 2)


def hex_to_int(hex: str) -> int:
    return int(hex, 16)


def int_to_hex(num: int) -> str:
    # keep the leading zero so the pair stays 2 characters wide
    return f'{num:02x}'


def ones(bits: int) -> int:
    return (1 << bits) - 1


def enable_dolby(hex: str, dolby_mode: int) -> str:
    assert len(hex) == 14
    hex_chunks_index = 2  # the index containing the dolby values
    # split into chunks of 2 characters
    # https://stackoverflow.com/a/9477447
    hex_chunks = re.findall('..', hex)
    # clear the last 2 bits, then set them to `dolby_mode`
    dolby_bits = hex_to_int(hex_chunks[hex_chunks_index])
    new_dolby_bits = (dolby_bits & ~ones(2)) | dolby_mode
    hex_chunks[hex_chunks_index] = int_to_hex(new_dolby_bits)
    return ''.join(hex_chunks)


def run_tests():
    samples = (
        ('480376825e6d95', '480377825e6d95', DolbyModes.STD_and_LLDV_and_LLDV_HDMI),  # https://github.com/balu100/dolby-vision-for-windows/blob/e393cbb47571e9053db4273e1cd62bb6ce162a21/README.md?plain=1#L23
        ('4403609248458f', '4403619248458f', DolbyModes.LLDV_and_LLDV_HDMI),  # https://github.com/balu100/dolby-vision-for-windows/issues/1#issuecomment-2188170968
    )
    for hex_input, expected_hex_output, dolby_mode in samples:
        hex_output = enable_dolby(hex_input, dolby_mode)
        assert hex_output == expected_hex_output
```
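For example, plugging in the first sample from the README linked above as a quick smoke test:

```python
print(enable_dolby('480376825e6d95', DolbyModes.STD_and_LLDV_and_LLDV_HDMI))
# -> '480377825e6d95'

run_tests()  # raises AssertionError if either sample stops matching
```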
I've been trying to figure it out too. In my case I'm working with an LG C2 (480a7e86607694).

I read around on https://discourse.coreelec.org/t/edid-override-injecting-a-dolby-vsvdb-block/51510?page=1, which is the domain @djnice linked the file from in the other thread. Digging around, I found that the exported CRU .bin file can be uploaded here: https://people.freedesktop.org/~imirkin/edid-decode/. This gives us the EDID in a readable format.
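If you'd rather poke at the raw bytes locally, here's a rough sketch (my guess at the layout, not anything official): it searches the exported .bin for Dolby's IEEE OUI 00-D0-46, which EDID data blocks store little-endian, and prints the 7 bytes that follow, which should match the hex code CRU shows. The file name is just a placeholder.

```python
# Rough sketch, not a verified parser: locate the Dolby Vendor-Specific
# Video Data Block in an exported EDID and print its 7-byte payload.
# Assumes Dolby's IEEE OUI (00-D0-46) appears little-endian (46 d0 00)
# and that the payload is the 7 bytes right after it -- double-check
# the result against edid-decode's output.
DOLBY_OUI_LE = bytes.fromhex('46d000')

def find_dolby_payload(path: str) -> str:
    with open(path, 'rb') as f:
        edid = f.read()
    i = edid.find(DOLBY_OUI_LE)
    if i == -1:
        raise ValueError('no Dolby OUI found in this EDID')
    return edid[i + 3 : i + 10].hex()

print(find_dolby_payload('exported_edid.bin'))  # placeholder file name
```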
Under Vendor-Specific Video Data Block I found some information. I can see `Interface: Standard + Low-Latency`, which tells me my TV supports Std + LLDV. But some of it is confusing to me, e.g. `Supports 10b 12b 444: Not supported`. Does this mean 10b is supported but 12b is not, or that neither is supported? Then there are other values for which I do not know what they are named in the EDID, like Backlight Ctrl, etc.
Thanks for the link to edid-decode; I had not noticed it earlier.
And yeah, I'm struggling to interpret all these values as an EDID newb.
My hex code is 48039e5898aa5c (Sony A95L), and edid-decode also shows `Interface: Standard + Low-Latency`.

I suspect we should be updating the EDID to be Std + LLDV + LLDV-HDMI.

I started wondering about the above after noticing that using the previously posted `enable_dolby` function with `DolbyModes.STD_and_LLDV` has no effect on the "enabled" hex code. This...
```python
def run_attempts():
    hex_codes = (
        '4d4e4a725a7776',  # https://github.com/balu100/dolby-vision-for-windows/issues/2#issue-2506429409
        '480a7e86607694',  # https://github.com/balu100/dolby-vision-for-windows/issues/2#issuecomment-2330407252
        '48039e5898aa5c',  # my Sony A95L
    )
    for dolby_mode_name in (
        'STD_and_LLDV',
        'STD_and_LLDV_and_LLDV_HDMI',
    ):
        for hex_code in hex_codes:
            dolby_mode = getattr(DolbyModes, dolby_mode_name)
            new_hex_code = enable_dolby(hex_code, dolby_mode)
            if new_hex_code == hex_code:
                print(f"No difference calculated with `enable_dolby('{hex_code}', DolbyModes.{dolby_mode_name})`")
            else:
                print(f"Modify hex code from '{hex_code}' to '{new_hex_code}' to enable DolbyModes.{dolby_mode_name}")
```
... outputs this...
```
No difference calculated with `enable_dolby('4d4e4a725a7776', DolbyModes.STD_and_LLDV)`
No difference calculated with `enable_dolby('480a7e86607694', DolbyModes.STD_and_LLDV)`
No difference calculated with `enable_dolby('48039e5898aa5c', DolbyModes.STD_and_LLDV)`
Modify hex code from '4d4e4a725a7776' to '4d4e4b725a7776' to enable DolbyModes.STD_and_LLDV_and_LLDV_HDMI
Modify hex code from '480a7e86607694' to '480a7f86607694' to enable DolbyModes.STD_and_LLDV_and_LLDV_HDMI
Modify hex code from '48039e5898aa5c' to '48039f5898aa5c' to enable DolbyModes.STD_and_LLDV_and_LLDV_HDMI
```
I opened a PR to hopefully address this issue: https://github.com/balu100/dolby-vision-for-windows/pull/3

The `enable_dolby_vision_hdmi.py` script in that PR is working for my Sony A95L.

@SICKSAUCE and @robb213 -- can you consult the updated README and confirm if the pre-computed values work for your TV models?
Separately, I have another question: is it possible for this EDID hack to support other media players? I naively thought that it would work across the OS, but it seems to only work for Windows Media Player. I've tried Jellyfin Media Player and VLC, but neither is able to play DV.
> Separately, I have another question: is it possible for this EDID hack to support other media players? I naively thought that it would work across the OS, but it seems to only work for Windows Media Player. I've tried Jellyfin Media Player and VLC, but neither is able to play DV.
I have the same results as you. Additionally, I installed the Dolby Vision app from the MS Store (via AdGuard). @sam-6174 Have you installed the app (latest `.appxbundle`) and noticed any changes?

EDIT: I have doubts this will ever work system-wide, presumably because the hardware lacks the embedded license. My test case was a DV-supported game, Guardians of the Galaxy. DV does not kick on.
I think Guardians of the Galaxy doesn't have Dolby Vision on PC. Also, I used the Python script on my PUS7304 (Philips TV) and I get this: "Warning: `video_hex` of '459f4ba43b26bb' is already enabled with LLDV-HDMI". Does this mean it only supports low-latency Dolby Vision? I want to enable TV-led ("real Dolby Vision" mode); is this possible with my TV?

EDIT: The link robb213 posted (https://people.freedesktop.org/~imirkin/edid-decode/) says my TV supports `Interface: Standard + Low-Latency + Low-Latency-HDMI`.
Hi @sam-6174, thanks a lot for your reply!

I tried the value you got for me, and it seems to work, at least partially. The HDR certification in Windows settings shows "Dolby Vision", and I get a "Dolby Vision" watermark when launching a DV movie in Energy Media Player, which wasn't happening before.

However, it is the only player that shows that watermark, although I have the HEVC and Dolby Vision extensions. Also, this watermark seems to be drawn by the player rather than displayed directly by my TV, unlike when I play a DV file inside a TV app, and all 3 DV extension presets look the same.

Finally, I normally get special picture settings on my TV while it is playing DV content, but here I still simply have the HDR10 settings, as usual.
> I think Guardians of the Galaxy doesn't have Dolby Vision on PC. Also, I used the Python script on my PUS7304 (Philips TV) and I get this: "Warning: `video_hex` of '459f4ba43b26bb' is already enabled with LLDV-HDMI". Does this mean it only supports low-latency Dolby Vision? I want to enable TV-led ("real Dolby Vision" mode); is this possible with my TV?
>
> EDIT: The link robb213 posted (https://people.freedesktop.org/~imirkin/edid-decode/) says my TV supports `Interface: Standard + Low-Latency + Low-Latency-HDMI`.
Based on my understanding of the hex code (from reverse engineering the spreadsheet), your hex code is already enabled for Standard + Low-Latency + Low-Latency-HDMI.

The explanation is...

The `4b` is the part of the hex that contains the Dolby Vision info:

```
459f4ba43b26bb
    ^^
```

(It's always the "3rd pair.")

Convert `4b` from hex to binary and we get `1001011`.

The last 2 bits are what define the values for Standard and/or Low-Latency and/or Low-Latency-HDMI:

```
1001011
     ^^
```

The last 2 bits correspond to the values of `DolbyModes` mentioned above.

Given that this repo only cares about toggling the HDMI piece of the equation, we only need to set the last bit, and we can ignore the second-to-last bit.
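As a quick illustration of that layout (variable names here are mine, not the repo's):

```python
# Check the two mode bits in the 3rd pair of '459f4ba43b26bb'.
video_hex = '459f4ba43b26bb'
byte = int(video_hex[4:6], 16)   # 3rd pair: 0x4b == 0b1001011
std = bool(byte & 0b10)          # second-to-last bit: Standard
lldv_hdmi = bool(byte & 0b01)    # last bit: Low-Latency-HDMI
print(std, lldv_hdmi)            # True True -> hence the warning

# Setting only the last bit leaves everything else alone; here it's a no-op:
patched = video_hex[:4] + f'{byte | 0b01:02x}' + video_hex[6:]
print(patched == video_hex)      # True -- already enabled
```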
> I think Guardians of the Galaxy doesn't have Dolby Vision on PC. […]

Yes, my oversight with the game. The Dolby gaming page I was on was a landing page for Dolby in games in general. While it seems easier to find games that support Atmos, I can't really find any lists that show DV support for PC games. Battlefield 1 may have it, and Metro Exodus may as well, but only for XSX. I will try both and edit this post with an update.
The only games that have Dolby Vision on PC are Mass Effect: Andromeda, Need for Speed Heat, Battlefield 1, and Anthem.
I tried out BF1. Switching to DV in-game appears to do nothing for me. The TV does not seem to reflect the mode either.
> The only games that have Dolby Vision on PC are Mass Effect: Andromeda, Need for Speed Heat, Battlefield 1, and Anthem.

Anthem does not, and for the others Nvidia removed support.
Hi!

I'm really sorry to bother you, but I don't understand how to use the Excel sheet at all... Could you explain it to me, or help me get the correct hex string for my case?

My Vendor-Specific Video code is 4d4e4a725a7776. It is a TCL C825.

Thanks!