oasp / oasp4j

The Open Application Standard Platform for Java
Apache License 2.0

Add CSV mapper for REST server #606

Open mathieu-lavigne opened 6 years ago

mathieu-lavigne commented 6 years ago

A default JSON mapper is already bundled with OASP4J; it could also include a default CSV mapper.

This CSV mapper should:

CSV format

HTTP Headers

What we send

Content-Type: text/csv

We are sending CSV.

Content-Columns: col1, col2

Order and column names (as seen by Jackson). They should correspond to Eto fields; a column name can differ from the field name via the @JsonProperty annotation.

If not specified, the client should include the column names on the first line of its body.

Content-Column-Separator: ;

The default CSV separator is , but the client may use another one.

If not specified, the server should use the , separator.

Content-Quote-Char: "

The default CSV quote char is " but the client may use another one.

If not specified, the server should use the " quote char.
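
For illustration, here is a minimal sketch of such a request using the standard JAX-RS client API. The endpoint URL, the column names and the payload are invented for the example; only the header names come from this proposal:

```java
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.client.Entity;
import javax.ws.rs.core.Response;

public class CsvPostExample {

  public static void main(String[] args) {

    // Two rows for the columns announced in Content-Columns, separated by ';'
    String csv = "FR;Produit livré\nEN;Delivered product\n";

    Client client = ClientBuilder.newClient();
    Response response = client
        .target("http://localhost:8081/services/rest/example/v1/items") // hypothetical endpoint
        .request()
        .header("Content-Columns", "code, comment")
        .header("Content-Column-Separator", ";")
        .header("Content-Quote-Char", "\"")
        // Entity.entity(...) sets Content-Type: text/csv on the request
        .post(Entity.entity(csv, "text/csv"));

    System.out.println("HTTP " + response.getStatus());
    response.close();
    client.close();
  }
}
```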

What we want to receive

Accept: text/csv

We want to receive CSV.

Accept-Columns: col1, col3

Order and column names we want to receive (see Content-Columns). The Content-Columns header of the server response must exactly match this client header.

If not specified, the server may include whatever columns it wants in its response.

Accept-Column-Separator: ;

Same as Content-Column-Separator, when we want the server to use a specific CSV separator.

If not specified, the server should use the , separator, or include a correct Content-Column-Separator header to specify which separator is used.

Accept-Quote-Char: "

Same as Content-Quote-Char, when we want the server to use a specific CSV quote character.

If not specified, the server should use the " quote character, or include a correct Content-Quote-Char header to specify which quote character is used.
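
For the response side, a similar hedged sketch (again with a hypothetical endpoint URL) shows a JAX-RS client asking for CSV with the proposed Accept-* headers and then inspecting the Content-* headers the server is expected to echo:

```java
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.Response;

public class CsvGetExample {

  public static void main(String[] args) {

    Client client = ClientBuilder.newClient();
    Response response = client
        .target("http://localhost:8081/services/rest/example/v1/items") // hypothetical endpoint
        .request("text/csv")                     // sends Accept: text/csv
        .header("Accept-Columns", "col1, col3")
        .header("Accept-Column-Separator", ";")
        .header("Accept-Quote-Char", "\"")
        .get();

    // The server is expected to echo the effective format in its Content-* headers
    System.out.println("Columns:   " + response.getHeaderString("Content-Columns"));
    System.out.println("Separator: " + response.getHeaderString("Content-Column-Separator"));
    System.out.println(response.readEntity(String.class));

    response.close();
    client.close();
  }
}
```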

CsvFormat Object

Here is what can be configured:
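
The concrete option list is not reproduced here, but judging from the headers above and the usage below, a CsvFormat along these lines would cover it. Only FILTER, withColumns, withNullValue, readValue and writeValue appear in the usage below; every other name in this sketch is an assumption, and the wiring of the @JsonFilter during serialization is left out for brevity:

```java
import java.io.File;
import java.io.IOException;

import com.fasterxml.jackson.dataformat.csv.CsvMapper;
import com.fasterxml.jackson.dataformat.csv.CsvSchema;

public class CsvFormat {

  /** Id of the @JsonFilter expected on Etos (the constant name is taken from the usage below, the value is invented). */
  public static final String FILTER = "CsvFormatFilter";

  private final CsvMapper mapper = new CsvMapper();

  private String[] columns = {};      // ordered column names, maps to Content-Columns
  private char columnSeparator = ','; // maps to Content-Column-Separator
  private char quoteChar = '"';       // maps to Content-Quote-Char
  private String nullValue = "";      // textual representation of null values

  /** Each with* method returns a derived copy, so a base format stays untouched. */
  public CsvFormat withColumns(String commaSeparatedColumns) {
    CsvFormat copy = copy();
    copy.columns = commaSeparatedColumns.split("\\s*,\\s*");
    return copy;
  }

  public CsvFormat withColumnSeparator(char separator) {
    CsvFormat copy = copy();
    copy.columnSeparator = separator;
    return copy;
  }

  public CsvFormat withQuoteChar(char quote) {
    CsvFormat copy = copy();
    copy.quoteChar = quote;
    return copy;
  }

  public CsvFormat withNullValue(String nil) {
    CsvFormat copy = copy();
    copy.nullValue = nil;
    return copy;
  }

  public <T> T readValue(File file, Class<T> type) throws IOException {
    return this.mapper.readerFor(type).with(schema()).readValue(file);
  }

  public void writeValue(File file, Object value) throws IOException {
    this.mapper.writer(schema()).writeValue(file, value);
  }

  private CsvFormat copy() {
    CsvFormat copy = new CsvFormat();
    copy.columns = this.columns;
    copy.columnSeparator = this.columnSeparator;
    copy.quoteChar = this.quoteChar;
    copy.nullValue = this.nullValue;
    return copy;
  }

  private CsvSchema schema() {
    CsvSchema.Builder builder = CsvSchema.builder();
    for (String column : this.columns) {
      builder.addColumn(column);
    }
    return builder.build()
        .withColumnSeparator(this.columnSeparator)
        .withQuoteChar(this.quoteChar)
        .withNullValue(this.nullValue);
  }
}
```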

Usage

You can create a default CSV format:

```java
final CsvFormat base = new CsvFormat();
```

And create another one based on it:

```java
final CsvFormat format = base.withColumns("code,comment").withNullValue("");
```

Then we may read some CSV content:

```java
final File file = new File("...");
final Eto readEto = format.readValue(file, Eto.class); // Eto class annotated with @JsonFilter(CsvFormat.FILTER)
```

Or write an Eto to a CSV file:

```java
final File out = new File("...");
format.writeValue(out, eto);
```
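
As a hedged illustration (class and field names are invented), an Eto used with this format would carry the @JsonFilter annotation mentioned above and could rename a column with @JsonProperty, as described for Content-Columns:

```java
import com.fasterxml.jackson.annotation.JsonFilter;
import com.fasterxml.jackson.annotation.JsonProperty;

// Hypothetical Eto: the CSV column "comment" maps to the field "description".
@JsonFilter(CsvFormat.FILTER)
public class ItemEto {

  private String code;

  @JsonProperty("comment")
  private String description;

  public String getCode() { return this.code; }

  public void setCode(String code) { this.code = code; }

  public String getDescription() { return this.description; }

  public void setDescription(String description) { this.description = description; }
}
```

With such a class, format.readValue(file, ItemEto.class) would map the code and comment columns onto these fields.
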
hohwille commented 6 years ago

Just stumbled over your work here. Nice feature enhancement. I wanted to suggest that you put the code into a separate module, but it seems you already did all that (csv module), so it fits perfectly with the new opt-in strategy we are following. Happily awaiting your PR when you are done 👍

hohwille commented 6 years ago

BTW: Are you also proposing this for large arrays to reduce overhead? We discovered some bandwidth issues for larger array datasets with JSON, as all the property names are redundantly repeated. Since your solution is based on Jackson, it is perfectly integrated. Excited to see this coming...

mathieu-lavigne commented 6 years ago

Hi Jörg, and thanks for your encouraging comments!

This feature was intended to filter out columns we don't want to see in a CSV file. In fact, Jackson parses every property, and this may throw exceptions if a property is a composite object; CSV values must always be simple objects like Strings.

Indeed, it seems it could be used to reduce overhead for large arrays, but it has not been tested this way.
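
For illustration, here is a self-contained sketch using plain jackson-dataformat-csv (the bean, the values and the filter id are invented, this is not the oasp4j module code) showing why composite properties have to be filtered out before writing CSV:

```java
import com.fasterxml.jackson.annotation.JsonFilter;
import com.fasterxml.jackson.databind.ser.impl.SimpleBeanPropertyFilter;
import com.fasterxml.jackson.databind.ser.impl.SimpleFilterProvider;
import com.fasterxml.jackson.dataformat.csv.CsvMapper;
import com.fasterxml.jackson.dataformat.csv.CsvSchema;

public class CompositeColumnExample {

  // Hypothetical bean: 'details' is a composite property that cannot be
  // rendered as a single CSV value.
  @JsonFilter("csvFilter")
  public static class Item {
    public String code = "FR";
    public String comment = "Produit livré";
    public Details details = new Details();
  }

  public static class Details {
    public String author = "mathieu";
  }

  public static void main(String[] args) throws Exception {

    CsvMapper mapper = new CsvMapper();
    CsvSchema schema = CsvSchema.builder().addColumn("code").addColumn("comment").build();

    // Filtering the bean down to simple columns avoids the error Jackson
    // would raise when asked to write the nested 'details' object to CSV.
    SimpleFilterProvider filters = new SimpleFilterProvider()
        .addFilter("csvFilter", SimpleBeanPropertyFilter.filterOutAllExcept("code", "comment"));

    System.out.println(mapper.writer(filters).with(schema).writeValueAsString(new Item()));
    // prints something like: FR,Produit livré   ('details' is skipped)
  }
}
```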

mathieu-lavigne commented 6 years ago

The PR has been created here

Still, I do not understand why the build does not work...

Edit: resolved by ignoring the not-yet-implemented test CsvProviderTest.