The OS X toolchain prepends an underscore to symbols by default. For example, the symbol for `printf` is actually `_printf`, and if we emit a symbol `_D3std…` in LLVM, it becomes `__D3std…`.

This seems to make sense, since GDB also expects the underscore and displays the symbol as `_D3std…`. However, DMD emits one underscore less for `extern(D)` functions, so this needs fixing either on our side or on DMD's. I'm inclined to say that the DMD behavior should change, because it causes symbols in GDB to display as, e.g., `D3std…`, whereas the D ABI documentation explicitly specifies the underscore as part of the symbol name.
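The prefixing chain can be sketched in a small Python model (a simplification, not what the toolchain literally does; the mangled name `_D3std3fooFZv` is hypothetical):

```python
MACHO_PREFIX = "_"

def object_symbol(source_symbol: str) -> str:
    # Mach-O toolchains prepend an underscore to every global symbol.
    return MACHO_PREFIX + source_symbol

def gdb_display(object_symbol_name: str) -> str:
    # GDB on OS X strips the platform underscore again for display.
    return object_symbol_name.removeprefix(MACHO_PREFIX)

# Per the D ABI, the leading underscore is part of the mangled name,
# so emitting it verbatim round-trips cleanly through GDB:
abi_name = "_D3std3fooFZv"  # hypothetical mangled name
assert object_symbol(abi_name) == "__D3std3fooFZv"
assert gdb_display(object_symbol(abi_name)) == abi_name

# DMD's current behavior drops one underscore before emission,
# so GDB ends up displaying "D3std3fooFZv" instead:
dmd_emitted = abi_name[1:]
assert gdb_display(object_symbol(dmd_emitted)) == "D3std3fooFZv"
```

The assertions show why keeping the ABI underscore matters: with it, what GDB displays matches the mangled name the ABI documents; without it, the displayed name loses its `_D` prefix.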