dcvz opened this issue 1 year ago
Hi @dcvz, nice to meet you!
LPWStr
Isn't W Utf16 instead of Utf8? And shouldn't it be WChar instead of Char?
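For context: the Windows "A" entry points take ANSI strings while the "W" entry points take UTF-16, so a W-variant binding would use Pointer<Utf16> from package:ffi rather than Pointer<Utf8>. A minimal sketch with illustrative names, not flutter_storm's actual bindings:

import 'dart:ffi';
import 'package:ffi/ffi.dart';

// Illustrative native signatures for the two variants of one function:
//   int OpenA(const char* name);     // ANSI / narrow
//   int OpenW(const wchar_t* name);  // wide, UTF-16 on Windows
typedef OpenANative = Int32 Function(Pointer<Utf8> name);
typedef OpenWNative = Int32 Function(Pointer<Utf16> name);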
Nice to meet you @dcharkes! Great stuff here in ffi!
I have also tried WChar, to no avail, although I don’t have a branch to play with for that. Should I share one?
Any tips on debugging what ffi does internally to check out the pointer contents?
I have not completely figured things out, but using Utf16 gets me further than before. It seems there's some difference between ffi.WChar and Utf16.
Also, I only need this type on Windows. Are there compile-time defines? So I can set, for example, the TCHAR typedef to one thing on Windows and another on everything else?
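On the debugging question: one low-tech way to see what FFI actually passes is to dump the raw bytes behind the pointer; UTF-8 "abc" shows up as 61 62 63, while UTF-16 shows 61 00 62 00 63 00. A small helper sketch (the name is hypothetical):

import 'dart:ffi';

/// Debugging helper: prints the first [count] bytes behind a pointer as
/// hex, to check whether the data is UTF-8, UTF-16, or something else.
void dumpBytes(Pointer<NativeType> p, int count) {
  final bytes = p.cast<Uint8>().asTypedList(count);
  print(bytes.map((b) => b.toRadixString(16).padLeft(2, '0')).join(' '));
}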
Also, I only need this type on Windows. Are there compile-time defines?
For this use case you can use an AbiSpecificInteger: https://api.dart.dev/stable/2.16.0/dart-ffi/AbiSpecificInteger-class.html
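For reference, a hedged sketch of how a TCHAR-like typedef could be expressed with such a mapping; the type name and chosen widths are illustrative, and every ABI the program runs on needs an entry:

import 'dart:ffi';

/// Hypothetical TCHAR-like type: 16 bits on Windows, 8 bits elsewhere.
@AbiSpecificIntegerMapping({
  Abi.windowsArm64: Uint16(),
  Abi.windowsIA32: Uint16(),
  Abi.windowsX64: Uint16(),
  Abi.androidArm: Uint8(),
  Abi.androidArm64: Uint8(),
  Abi.androidIA32: Uint8(),
  Abi.androidX64: Uint8(),
  Abi.fuchsiaArm64: Uint8(),
  Abi.fuchsiaX64: Uint8(),
  Abi.iosArm: Uint8(),
  Abi.iosArm64: Uint8(),
  Abi.iosX64: Uint8(),
  Abi.linuxArm: Uint8(),
  Abi.linuxArm64: Uint8(),
  Abi.linuxIA32: Uint8(),
  Abi.linuxX64: Uint8(),
  Abi.macosArm64: Uint8(),
  Abi.macosX64: Uint8(),
})
class TChar extends AbiSpecificInteger {
  const TChar();
}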
It seems there's some difference between ffi.WChar and Utf16.
Yes, look at the definitions for both types:
/// The C `wchar_t` type.
///
/// The signedness of `wchar_t` is undefined in C. Here, it is exposed as the
/// defaults on the tested [Abi]s.
///
/// The [WChar] type is a native type, and should not be constructed in
/// Dart code.
/// It occurs only in native type signatures and as annotation on [Struct] and
/// [Union] fields.
@Since('2.17')
@AbiSpecificIntegerMapping({
  Abi.androidArm: Uint32(),
  Abi.androidArm64: Uint32(),
  Abi.androidIA32: Uint32(),
  Abi.androidX64: Uint32(),
  Abi.fuchsiaArm64: Uint32(),
  Abi.fuchsiaX64: Int32(),
  Abi.iosArm: Int32(),
  Abi.iosArm64: Int32(),
  Abi.iosX64: Int32(),
  Abi.linuxArm: Uint32(),
  Abi.linuxArm64: Uint32(),
  Abi.linuxIA32: Int32(),
  Abi.linuxX64: Int32(),
  Abi.linuxRiscv32: Int32(),
  Abi.linuxRiscv64: Int32(),
  Abi.macosArm64: Int32(),
  Abi.macosX64: Int32(),
  Abi.windowsArm64: Uint16(),
  Abi.windowsIA32: Uint16(),
  Abi.windowsX64: Uint16(),
})
class WChar extends AbiSpecificInteger {
  const WChar();
}
WChar is defined as an integer with differing sizes on different platforms. Utf16 is defined as an opaque type (Pointer<Utf16> is similar to Pointer<Void> or void*; you can't look into it).
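In practice, on Windows (where wchar_t is 16 bits) you can bridge the two by casting; a minimal sketch, assuming package:ffi is available and with a hypothetical helper name:

import 'dart:ffi';
import 'package:ffi/ffi.dart';

/// Reads a native wide string into a Dart String.
/// Only valid where wchar_t is 16 bits (e.g. Windows); on Linux or
/// Android wchar_t is 32 bits and this cast would read garbage.
String wcharToDartString(Pointer<WChar> p) {
  return p.cast<Utf16>().toDartString();
}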
Hello team,
Excuse the confusing title, but this is an issue where I'm not quite sure what it stems from and I only have guesses. I'm working on migrating flutter_storm, a bridge for StormLib (https://github.com/ladislav-zezula/StormLib), from using native bindings to FFI. However, I'm running into an issue (only on Windows).
If you try the example project in flutter_storm and try to open a test archive (Test.mpq.zip), it'll fail on Windows. After some research I found a similar issue on their issue tracker: https://github.com/ladislav-zezula/StormLib/issues/260, which ended up being an issue between using LPStr and LPWStr, and which led me to explore how FFI is doing string conversions. ffigen generated uses of Pointer<ffi.Char> for me -- I've also tried using a Pointer from toNativeUtf8().cast<Char>().

Here's the branch of flutter_storm where I'm doing the migration to ffi: https://github.com/HarbourMasters64/flutter_storm/tree/feature/ffi

Here's the branch where I've attempted to use Pointer<Utf8>, to no avail: https://github.com/HarbourMasters64/flutter_storm/tree/feature/ffi-utf8
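For what it's worth, a sketch of what the Windows call site might look like once the string parameter is treated as UTF-16; the library and symbol names here are hypothetical, not flutter_storm's actual bindings:

import 'dart:ffi';
import 'package:ffi/ffi.dart';

typedef OpenArchiveNative = Int32 Function(Pointer<Utf16> name);
typedef OpenArchiveDart = int Function(Pointer<Utf16> name);

void main() {
  // Hypothetical library and symbol names, for illustration only.
  final lib = DynamicLibrary.open('storm.dll');
  final openArchive =
      lib.lookupFunction<OpenArchiveNative, OpenArchiveDart>('OpenArchive');

  // toNativeUtf16 (package:ffi) allocates a NUL-terminated UTF-16 copy.
  final name = 'Test.mpq'.toNativeUtf16();
  try {
    final rc = openArchive(name);
    print('OpenArchive returned $rc');
  } finally {
    malloc.free(name); // toNativeUtf16 allocates with malloc by default
  }
}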