tgross35 opened 11 months ago
Actually, maybe we could use the rustdoc JSON output? It is unstable, but it includes the following:
"0:3:1735": {
"id": "0:3:1735",
"crate_id": 0,
"name": "I_AM_A_CONST",
"span": {
"filename": "rcrypto/src/lib.rs",
"begin": [
1,
0
],
"end": [
1,
38
]
},
"visibility": "public",
"docs": null,
"links": {},
"attrs": [],
"deprecation": null,
"inner": {
"constant": {
"type": {
"primitive": "usize"
},
"expr": "_",
"value": "63_737usize",
"is_literal": false
}
}
},
from this:

```rust
pub const I_AM_A_CONST: usize = foo();

/// export thing
pub const fn foo() -> usize {
    63737
}
```
Yes, rustdoc can do it! I needed to move bindings generation to a new binary crate (Cargo deadlocks if you call `cargo rustdoc` from within `build.rs`...), and this only works for integers, but the recipe is below.
My consts are a mess, generated in macros:

```rust
/// Length of the nonce (initialization vector) for
#[doc = $name]
pub const $noncebytes: usize = <$alg as aead::AeadCore>::NonceSize::USIZE;

/// Length of the key for
#[doc = $name]
pub const $keybytes: usize = <$alg as aead::KeySizeUser>::KeySize::USIZE;

/// Length of the MAC (tag) for
#[doc = $name]
pub const $macbytes: usize = <$alg as aead::AeadCore>::TagSize::USIZE;
```
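For anyone wondering why consts like these defeat a purely textual translator: the numeric value lives behind an associated const on a trait, so nothing short of the compiler (or rustdoc's evaluated JSON) can see the number. A minimal, std-only sketch of the pattern (the trait, type, and macro names here are made up for illustration, not the real `aead` API):

```rust
// Illustrative stand-in for traits like `aead::KeySizeUser`: the value
// is an associated const, invisible to any tool that only parses text.
trait KeySized {
    const KEY_BYTES: usize;
}

struct Aes128;
impl KeySized for Aes128 {
    const KEY_BYTES: usize = 16;
}

// Simplified version of the macro pattern above: the exported const's
// value is an associated-const expression, not a literal.
macro_rules! export_const {
    ($name:ident, $alg:ty) => {
        pub const $name: usize = <$alg as KeySized>::KEY_BYTES;
    };
}

export_const!(RC_AEAD_AES128GCM_KEYBYTES, Aes128);

fn main() {
    // Only after compiler evaluation does the number exist.
    println!("{}", RC_AEAD_AES128GCM_KEYBYTES);
}
```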
But using this strategy I get the full, correct output:

```c
#define RC_AEAD_AES128GCM_NONCEBYTES 12
#define RC_AEAD_AES128GCM_KEYBYTES 16
#define RC_AEAD_AES128GCM_MACBYTES 16
#define RC_AEAD_AES256GCM_NONCEBYTES 12
#define RC_AEAD_AES256GCM_KEYBYTES 32
#define RC_AEAD_AES256GCM_MACBYTES 16
#define RC_AEAD_CHACHA20POLY1305_NONCEBYTES 12
#define RC_AEAD_CHACHA20POLY1305_KEYBYTES 32
#define RC_AEAD_CHACHA20POLY1305_MACBYTES 16
#define RC_AEAD_XCHACHA20POLY1305_NONCEBYTES 24
#define RC_AEAD_XCHACHA20POLY1305_KEYBYTES 32
#define RC_AEAD_XCHACHA20POLY1305_MACBYTES 16
#define RC_SECRETBOX_KEYBYTES 24
#define RC_SECRETBOX_NONCEBYTES 32
#define RC_SECRETBOX_MACBYTES 16
```
Here is `main.rs` for the crate that does this (requires `serde_json`, `rustdoc_json`, and `cbindgen`):
```rust
use serde_json::Value;
use std::fmt::Write;
use std::fs::File;
use std::{env, path::Path};

const HEADER_BASE: &str = "/*
 * This file is automatically generated upon running
 * `cargo +nightly build`. Do not modify by hand.
 */";

fn main() {
    let this_dir = env::var("CARGO_MANIFEST_DIR").unwrap();
    let target_lib_dir = Path::new(&this_dir).parent().unwrap().join("rcrypto-lib");

    let json_path = rustdoc_json::Builder::default()
        .toolchain("nightly")
        .manifest_path(target_lib_dir.join("Cargo.toml"))
        .build()
        .unwrap();

    let reader = File::open(json_path).unwrap();
    let json: Value = serde_json::from_reader(reader).unwrap();
    let index = json.as_object().unwrap()["index"].as_object().unwrap();

    let consts = index
        .values()
        .filter(|v| v.get("visibility").map(|v| v.as_str().unwrap()) == Some("public"))
        .filter_map(|v| {
            // Messy, but we first check if `entry.inner.constant` exists
            // (indicating a constant), then pair that constant's value with its name
            v.get("inner")
                .and_then(|v| v.as_object().unwrap().get("constant"))
                .map(|c| {
                    (
                        v.get("name").unwrap().as_str().unwrap(),
                        c.get("value").unwrap().as_str().unwrap(),
                    )
                })
        });

    let mut header = HEADER_BASE.to_owned();
    header.push('\n');
    for (cname, cval) in consts {
        // We only focus on integers, so trim the suffix from e.g. `0usize`
        let new_cval = cval.trim_end_matches(char::is_alphabetic);
        writeln!(header, "#define {cname} {new_cval}").unwrap();
    }

    cbindgen::Builder::new()
        .with_crate(target_lib_dir)
        .with_no_includes()
        .with_sys_include("stdint.h")
        .with_parse_expand(&["rcrypto-lib"])
        .with_language(cbindgen::Language::C)
        .with_cpp_compat(true)
        .with_header(header)
        .generate()
        .expect("failed to generate C bindings")
        .write_to_file("../rcrypto.h");
}
```
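One caveat worth flagging on the trimming step: as the JSON above shows, rustdoc pretty-prints larger values with `_` digit separators (`63_737usize`), and the suffix trim alone would pass the underscore through into the header, where it is not valid C. A small std-only sketch of a more defensive version (the `define_line` helper is my naming, not part of the recipe above):

```rust
// Suffix-trimming with one extra fix: rustdoc "value" strings look like
// `16usize` or `63_737usize`. We drop the alphabetic type suffix and the
// `_` digit separators, which C does not accept in integer literals.
fn define_line(name: &str, rustdoc_value: &str) -> String {
    let trimmed = rustdoc_value.trim_end_matches(char::is_alphabetic);
    format!("#define {} {}", name, trimmed.replace('_', ""))
}

fn main() {
    println!("{}", define_line("RC_AEAD_AES128GCM_KEYBYTES", "16usize"));
    // → #define RC_AEAD_AES128GCM_KEYBYTES 16
    println!("{}", define_line("I_AM_A_CONST", "63_737usize"));
    // → #define I_AM_A_CONST 63737
}
```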
It seems like cbindgen unfortunately hits a wall converting `const`s to `#define`s when macro expansion or function evaluation is involved. But would it be possible for bindgen to compile a shim and extract those values? For example, I have a crate with this:

It seems like bindgen could turn this into:

compile it, and read the result. It's a bit tricky, but seems not impossible?
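To make the shim idea concrete, here is a hedged sketch of what such a generated probe program might look like (the `NAME VALUE` output format and the inlined const are illustrative only; nothing like this exists in bindgen today). The generator would emit one print line per exported const, compile the shim against the target crate, run it, and parse the output back:

```rust
// Hypothetical generated shim. In the real setup this const would be
// imported from the target crate; it is inlined here so the sketch is
// self-contained and compilable on its own.
pub const RC_AEAD_AES128GCM_KEYBYTES: usize = 16;

fn main() {
    // Emit a machine-readable `NAME VALUE` line for the generator to parse.
    println!("RC_AEAD_AES128GCM_KEYBYTES {}", RC_AEAD_AES128GCM_KEYBYTES);
}
```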
There is of course a bootstrapping problem when using the library, i.e. the build script can't complete without already having a built library. But it would be usable via CLI or from a separate crate.