iHimiko closed this issue 7 years ago
I don't think the library itself needs to be built as Unicode, because only the TextOverlay needs support for it.
The TextOverlay backend itself only supports 7-bit characters, but could support 8-bit with a bit of modification. Unicode/16-bit needs a big change, because at the moment the backend creates one large character sheet for each font that is created (no sharing). I don't think that would work for 16-bit, because of the RAM usage and the time it takes to generate. For 16-bit support I would suggest implementing a shared bitmap per character, so only the characters that are actually needed get created. I think this would work, because you never use every Unicode character. (@agrippa1994 I think this should be good for overall performance)
For the second problem I would like to see an example (screenshots and the code to reproduce it) so I can understand the problem.
See #30 also for the Unicode issue.
tbh, we just need support for showing Unicode strings. If it can be done with a custom TTF font file extracted from the game resources (not installed system-wide), that would be best. As for (2), it was an issue in the calling program, already fixed.
@Gl0 I'll look into it.
Edit: I created a prototype in the last few hours, but I think there are some major bugs with the formatting. I will look into that over the next few days.
Here's the current progress:
@iHimiko @Gl0 Here are the binaries for Unicode support: https://github.com/shadowlif/DX9-Overlay-API/tree/unicode/bin
Please note that you need to use TextCreateUnicode, TextSetStringUnicode and TextUpdateUnicode for full support. See here: https://github.com/shadowlif/DX9-Overlay-API/blob/unicode/include/c%23/Overlay.cs
At the moment there is no texture sharing, so performance might be comparatively low. I would like to hear some feedback, so that Unicode support can be implemented in this repo.
Seems to be working, thanks. But I'm facing another problem: sometimes GetScreenSpecs() returns a zero point (I use it to check whether the hook was successful and I can draw anything). Only a game restart helps.
Maybe there's a fake window of the game? GetScreenSpecs returns the size of the viewport, and the viewport comes directly from a DirectX9 call, so it should be reliable.
Maybe it shows a dummy window while loading and that window gets hooked. How can I force a window reselect? For now I'm just doing this every second:
// Re-apply the target process and check whether the hook reports a
// plausible viewport; a width below 200 is treated as "not hooked yet",
// so the caller keeps retrying while this returns true.
private static bool CheckForDevice()
{
    DxOverlay.SetParam("process", "TERA.exe");
    return DxOverlay.GetScreenSpecs().X < 200;
}
atm I don't think you can force the API to use another window after the first API call, because the API has just one IPC channel per process, so only the first opened process stays connected.
You could maybe try the window parameter (with use_window set to 1) instead of the process name. Maybe the "invisible window" (if it exists) doesn't have the name of the real game window.
Maybe we can add a parameter to set the pid. Then you could check by yourself whether it's the real process and attach the API to it.
The process should be the correct one; there was always only one instance. But I'll try it with window, maybe it will be more reliable. At least the first 10 attempts of starting the game worked fine.
Hey, guys! I liked your implementation, but there are some issues with rendering text and other overlay elements. (1) No Unicode support. (2) When drawing multiple elements, the one that should be in the background gets drawn in front of the element that should cover it, and that covering element should be the visible one.
I could build the library with Unicode support myself, but some files are missing from the source, such as boost/filesystem.hpp and the rest of Boost. I hope for your help.