mehcode opened this issue 4 years ago
With #181 closed and available on the master branch, has there been any interest in this? It would be nice to avoid serde_json::Value.
Confirmation: I would love this so I can directly export to CSV without parsing, i.e. dumping all of the rows' values.
If you're using Postgres, you can use the COPY command with .copy_out_raw():
let mut stream: BoxStream<sqlx::Result<Bytes>> = pg_pool.copy_out_raw(
"COPY (SELECT * FROM my_table) TO STDOUT (FORMAT CSV)"
).await?;
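The stream yields raw byte chunks of the CSV output. Assembling them into one string can be sketched synchronously, with Vec<u8> standing in for Bytes and a boxed error type for brevity (no database required; collect_csv is an illustrative helper, not part of sqlx):

```rust
// Assemble CSV byte chunks (like those produced by `copy_out_raw`)
// into a single String. `Vec<u8>` stands in for `bytes::Bytes`.
fn collect_csv<I>(chunks: I) -> Result<String, Box<dyn std::error::Error>>
where
    I: IntoIterator<Item = Result<Vec<u8>, Box<dyn std::error::Error>>>,
{
    let mut buf = Vec::new();
    for chunk in chunks {
        buf.extend_from_slice(&chunk?); // propagate any stream error
    }
    // COPY ... (FORMAT CSV) output is UTF-8 text
    Ok(String::from_utf8(buf)?)
}
```

In the real async version you would pull chunks with try_next().await? and usually write each one straight to a file instead of buffering everything in memory.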
Unfortunately CockroachDB doesn't implement this :) thank you though!
Heh, CockroachDB doesn't implement a lot of things.
If possible, an example showing how to iterate through all columns in a row of type PgRow / SqlxRow would also be more than sufficient.
Is it currently possible to implement this behavior on the consumer side?
When simply trying to decode a column to serde_json::Value I sometimes get errors, like this on VARCHAR columns:
error occurred while decoding column 0: trailing characters at line 1 column 3
and this on INT columns:
error occurred while decoding column 3: mismatched types; Rust type core::option::Option<serde_json::value::Value> (as SQL type BINARY) is not compatible with SQL type INT
Using some PostgreSQL-specific trick is sadly not an option, as we're stuck on MySQL.
Yes, that would be a great addition! Here is how I work around the problem in the meantime:
use serde::ser::{Serialize, SerializeMap, Serializer};
use sqlx::{Column, Database, Decode, Row, TypeInfo, ValueRef};

struct SerializeRow<R: Row>(R);

impl<'r, R: Row> Serialize for &'r SerializeRow<R>
where
    usize: sqlx::ColumnIndex<R>,
    &'r str: sqlx::Decode<'r, <R as Row>::Database>,
    f64: sqlx::Decode<'r, <R as Row>::Database>,
    i64: sqlx::Decode<'r, <R as Row>::Database>,
    bool: sqlx::Decode<'r, <R as Row>::Database>,
{
    fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
    where
        S: Serializer,
    {
        let columns = self.0.columns();
        let mut map = serializer.serialize_map(Some(columns.len()))?;
        for col in columns {
            let key = col.name();
            match self.0.try_get_raw(col.ordinal()) {
                Ok(raw_value) if !raw_value.is_null() => match raw_value.type_info().name() {
                    "REAL" | "FLOAT" | "NUMERIC" | "FLOAT4" | "FLOAT8" | "DOUBLE" => {
                        map_serialize::<_, _, f64>(&mut map, key, raw_value)
                    }
                    "INT" | "INTEGER" | "INT8" | "INT2" | "INT4" | "TINYINT" | "SMALLINT"
                    | "BIGINT" => map_serialize::<_, _, i64>(&mut map, key, raw_value),
                    "BOOL" | "BOOLEAN" => map_serialize::<_, _, bool>(&mut map, key, raw_value),
                    // Decode as a string by default
                    _ => map_serialize::<_, _, &str>(&mut map, key, raw_value),
                },
                // NULL value (or an error fetching the raw value): serialize as null
                _ => map.serialize_entry(key, &()),
            }?
        }
        map.end()
    }
}

fn map_serialize<'r, M: SerializeMap, DB: Database, T: Decode<'r, DB> + Serialize>(
    map: &mut M,
    key: &str,
    raw_value: <DB as sqlx::database::HasValueRef<'r>>::ValueRef,
) -> Result<(), M::Error> {
    let val = T::decode(raw_value).map_err(serde::ser::Error::custom)?;
    map.serialize_entry(key, &val)
}
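The core trick in the snippet is dispatching on the database's reported type name and decoding into a matching Rust primitive. Stripped of the sqlx traits, the same pattern can be illustrated with plain strings (everything here is a toy stand-in, not sqlx API):

```rust
// Toy version of the type-name dispatch: given a column's declared SQL
// type and its raw textual value, produce a JSON fragment. In the real
// snippet, `map_serialize::<_, _, T>` plays the role of each match arm.
fn to_json_fragment(type_name: &str, raw: &str) -> String {
    match type_name {
        "REAL" | "FLOAT" | "NUMERIC" | "FLOAT4" | "FLOAT8" | "DOUBLE" => raw
            .parse::<f64>()
            .map(|f| f.to_string())
            .unwrap_or_else(|_| "null".into()),
        "INT" | "INTEGER" | "INT8" | "INT2" | "INT4" | "TINYINT" | "SMALLINT" | "BIGINT" => raw
            .parse::<i64>()
            .map(|i| i.to_string())
            .unwrap_or_else(|_| "null".into()),
        "BOOL" | "BOOLEAN" => match raw {
            "t" | "true" | "1" => "true".into(),
            _ => "false".into(),
        },
        // Fall back to a (naively escaped) string, mirroring the `&str` arm.
        _ => format!("\"{}\"", raw.replace('"', "\\\"")),
    }
}
```

The fallthrough string arm is also where the real snippet can fail: not every database value has a valid UTF-8 string representation, which is what the next comment runs into.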
For those hitting this page...
The above SerializeRow kinda/sorta works. It's a great start.
We're still missing more than a few type mappings, at least for MySQL. Any type not explicitly called out above falls through to the default (&str) mapping for the value. This fails in some cases; not all values are valid strings.
It is possible to split up map_serialize and its calling function (decode). When I did that, it seemed like we really only have access to a few basic primitives (e.g. f64, i64, bool). I'm not sure what the full list is, but unsigned integers aren't included. Specifically, for DATETIME and TIMESTAMP, I attempted to map to an i64 and then use chrono's from_timestamp to build a NaiveDateTime. However, I got values ranging all over the place (from year 2049 all the way back to 1970, and some just failed), so I'm pretty sure I'm missing some transformation step in the middle.
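For what it's worth, the wild years are likely expected: MySQL's DATETIME/TIMESTAMP value on the wire is not a Unix timestamp, so decoding it as an i64 and feeding that to from_timestamp reinterprets unrelated data as seconds. One workaround is decoding the column as text and parsing the `YYYY-MM-DD HH:MM:SS` form. A minimal std-only sketch (in practice you'd reach for chrono's parse_from_str; this does no range validation and ignores fractional seconds):

```rust
// Parse MySQL's textual `YYYY-MM-DD HH:MM:SS` form into
// ((year, month, day), (hour, minute, second)) tuples.
fn parse_mysql_datetime(s: &str) -> Option<((i32, u32, u32), (u32, u32, u32))> {
    let (date, time) = s.split_once(' ')?;
    let mut d = date.splitn(3, '-');
    let (y, mo, da) = (
        d.next()?.parse().ok()?,
        d.next()?.parse().ok()?,
        d.next()?.parse().ok()?,
    );
    let mut t = time.splitn(3, ':');
    let (h, mi, se) = (
        t.next()?.parse().ok()?,
        t.next()?.parse().ok()?,
        t.next()?.parse().ok()?,
    );
    Some(((y, mo, da), (h, mi, se)))
}
```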
I believe it's possible to create a struct, derive serde on it, and then use query_as to convert each row into an instance of that struct. At that point, building a JSON string with serde_json is well documented. But this requires updating the code base each time a query changes, and it requires that all responses be strictly typed (through serialized structs). For a few different applications, that probably isn't desired.
Any plan to release this feature?
I created a small library for this (postgres only):
@brianbruggeman the query! macro returns anonymous records. I want a simple way to serialize them; would the snippet posted by @lovasoa work on SQLite?
I have a pet project to learn Rust: a micro BaaS built with clap, actix-web, sqlx and all the popular crates. This project is based on pocketdb.
Let's say I have a collections API, each collection is a table in SQLite.
This would be the use case
POST /collections -> CREATE TABLE {name} -> 201 Created -> JSON { message: "Success" }
GET /collections/name -> SELECT FROM {name} -> 200 Ok -> JSON { name: [ { ...record1 }, { ...record2 }, { ...record3 } ] }
GET /collections/name/id -> SELECT FROM {name} WHERE id = {id} -> 200 Ok -> { ...record1 }
I'm having trouble wrapping my head around how to stringify a query result without having a struct at build time.
I want a simple way to serialize them, would the snippet posted by @lovasoa work on SQLite?
Not sufficiently, but maybe for your use case? You'll have to decide that.
I have a pet project to learn Rust: a micro baas built with clap, actix-web, sqlx and all the popular crates.
This probably isn't a great project for just starting out in Rust. I don't know your background, but there's a ton of learning just in building a simple clap app. Building a microservice-based BaaS as a first project probably isn't a good learning experience. But good luck!
I'm having trouble wrapping my head around how to stringify a query result without having a struct at build time.
The preferred usage of sqlx is to build a new serde-compatible struct (with Debug) for every query. If you need to, you can always use println!("{:?}", result) to display the debug version, but it's not pretty and it's not something you can effectively use outside of debugging.
This doesn't work, but it is what I had hoped sqlx had implemented. You could, of course, fork or add this locally, and it would probably satisfy about 90% of the requests around serialization. serde_json::Value doesn't have FromRow implemented, and that can only be added within sqlx.
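The reason this "can only be added within sqlx" is Rust's orphan rule: you may not implement a foreign trait (FromRow, from sqlx) for a foreign type (Value, from serde_json). The usual escape hatch is a local newtype wrapper, which the compiler does accept. Illustrated here with std's From trait standing in for sqlx::FromRow (all names are illustrative):

```rust
// Orphan-rule demo: `impl From<i32> for String` in our own crate would be
// rejected (both trait and type are foreign), but wrapping one side in a
// local newtype is allowed. A real workaround for the thread's problem
// would be `struct JsonRow(serde_json::Value)` with a FromRow impl.
struct JsonRow(String); // stand-in for a serde_json::Value wrapper

impl From<i32> for JsonRow {
    fn from(n: i32) -> Self {
        JsonRow(format!("{{\"value\":{}}}", n))
    }
}
```

The cost of the newtype route is that every call site has to unwrap the wrapper, which is why having the impl live inside sqlx itself would be nicer.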
#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let url = "sqlite::memory:";
    let sql = r#"SELECT * FROM foo"#;
    let pool = sqlx::sqlite::SqlitePoolOptions::new()
        .max_connections(5)
        .connect(url)
        .await?;
    // This is the line that doesn't compile today: query_as would need
    // a FromRow impl for serde_json::Value inside sqlx.
    let rows: Vec<serde_json::Value> = sqlx::query_as(sql).fetch_all(&pool).await?;
    // JSONL output
    for row in rows {
        let json_string = serde_json::to_string(&row)?;
        println!("{json_string}");
    }
    Ok(())
}
I would really like to have this feature. At the moment I am doing this for a MySQL DB:
match type_info.name() {
    "FLOAT" => {
        match row.try_get::<Option<f32>, usize>(key.ordinal()) {
            Ok(val) => { map.insert(key.name().to_string(), json!(val)); },
            Err(e) => { return ResponseEnum::Error { tag: tag.clone(), error: Box::new(e) }; }
        }
    },
    "DOUBLE" => {
        match row.try_get::<Option<f64>, usize>(key.ordinal()) {
            Ok(val) => { map.insert(key.name().to_string(), json!(val)); },
            Err(e) => { return ResponseEnum::Error { tag: tag.clone(), error: Box::new(e) }; }
        }
    },
    "DATE" | "DATETIME" | "TIMESTAMP" => {
        return ResponseEnum::Error { tag: tag.clone(), error: Box::from("Time fields DATE, DATETIME, TIMESTAMP etc. cannot be returned currently.") };
    },
    _ => {
        match row.try_get::<Option<String>, usize>(key.ordinal()) {
            Ok(val) => { map.insert(key.name().to_string(), json!(val)); },
            Err(e) => { return ResponseEnum::Error { tag: tag.clone(), error: Box::new(e) }; }
        }
    }
    ...
I can't figure out how to get dates to work. My use case is JSON -> SQL query -> JSON, so I have arbitrary queries and responses.
This is what my version in SQLPage currently looks like: https://github.com/lovasoa/SQLpage/blob/main/src/webserver/database/sql_to_json.rs#L43-L91
@lovasoa Thanks so much for that, it works!
Idea: It would be interesting to select rows from the database and instantly and easily serialize to a data format.
Blocked on #181