abatista75 opened this issue 7 years ago
Internally, we have a Thrift IDL-based system that has not been open sourced yet. JSON Schema and Swagger are far from a complete solution.
In our case, JSON is used to exchange data between systems (not through a REST API, but through a queue-based system).
Currently, we use JSON-schema validation (in Java) and it works fine. However, each JSON buffer must be parsed twice (schema validator + json-iterator), which impacts performance.
From a functional point of view, JSON Schema is simple and powerful (even if it is not as complete as XML-based solutions). We defined our own JSON types (hexadecimal-only content, custom date formats, optional/required fields, fields that can contain several types...) in addition to the default ones.
All the format-based requirements are now checked by the JSON schema validator, allowing the app itself to be simpler to maintain and evolve, and to focus only on business content analysis.
We don't like Thrift because any change in the content rules requires a Java code update (e.g., extending the max length of a field).
Moreover, json-iterator allows binding JSON to Java classes (making Thrift irrelevant), and Swagger focuses mainly on REST APIs (which do not apply to us).
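As an illustration, the custom format checks described above could be expressed with standard JSON Schema keywords roughly like this (the field names and rules here are invented for the example, not our actual schema):

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "required": ["id", "payload"],
  "properties": {
    "id":      { "type": "string", "pattern": "^[0-9A-Fa-f]+$" },
    "created": { "type": "string", "format": "date-time" },
    "payload": { "type": ["string", "integer"] }
  },
  "additionalProperties": false
}
```

`pattern` covers the hexadecimal-only content, `required` covers optional/required fields, and a type array like `["string", "integer"]` covers fields that can contain several types.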
My wish would be to have json-iterator check the JSON schema:
Any other proposal?
Use the json-schema to generate the class definition, then bind the JSON input to the class, which should cover most of the validation requirements. From the bound object, further business-level validation can be carried out. Doing the validation and the binding in the same pass will not save any CPU resources.
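The bind-first approach described above might look roughly like this. The class, its fields, and the business rule are all hypothetical (the kind of class a schema-to-Java generator might emit); the json-iterator binding call is shown only in a comment, since the point here is the split between format validation (schema) and business validation (code):

```java
// Hypothetical class that schema-to-Java generation might emit; field names are invented.
class PartnerMessage {
    public String id;       // hex-only content would already be enforced by the schema's "pattern"
    public String created;  // custom date format would already be enforced by the schema's "format"

    // With json-iterator, the binding step would be roughly:
    //   PartnerMessage msg = JsonIterator.deserialize(json, PartnerMessage.class);

    /** Business-level validation, run after binding; format rules are assumed already checked. */
    void validateBusinessRules() {
        if (id == null || id.isEmpty()) {
            throw new IllegalStateException("id is required");
        }
        // Example business rule (invented): partner-issued ids start with "A0".
        if (!id.startsWith("A0")) {
            throw new IllegalStateException("unknown partner id prefix: " + id);
        }
    }
}
```

With this split, a rule change such as "extend the max length of a field" lives in the schema, while only genuinely application-specific rules stay in Java.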
I raise this feature request again. The point of a JSON schema is to split the business logic (which depends on the application) from the format validation, as it allows quite powerful format checking without any code.
Let's take our use case: we receive several messages coming from our partners. Of course, they must follow our specifications, but we cannot trust them. Therefore, any input data must be checked before processing.
We have 3 options:
1. Implement all the rules in the Java application => too costly, can only be done by a developer, and any change requires a new application deployment and validation.
2. Use a JSON validator library => that's our current approach, but the JSON file is parsed twice.
3. json-iterator implements a JSON-schema validation feature, allowing the JSON to be checked while loading it => that's the current change request.
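Option 2, the two-pass flow we run today, can be sketched as below. Everything here is illustrative, not our actual application code: the regex stands in for the schema-validation pass (which everit-org/json-schema performs in the real setup, parsing the buffer once), and the second pass is where json-iterator would parse the same buffer again to bind it:

```java
import java.util.regex.Pattern;

class TwoPassCheck {
    // Stand-in for the schema validator: in reality the validator library parses
    // the whole JSON buffer and checks it against the schema (first full parse).
    private static final Pattern HEX_ID =
            Pattern.compile("\"id\"\\s*:\\s*\"[0-9A-Fa-f]+\"");

    static boolean schemaPass(String json) {
        return HEX_ID.matcher(json).find();
    }

    static String process(String json) {
        // Pass 1: format validation.
        if (!schemaPass(json)) {
            throw new IllegalArgumentException("schema validation failed");
        }
        // Pass 2: in reality json-iterator parses the same buffer again to bind
        // it to a Java class, e.g. JsonIterator.deserialize(json, Msg.class).
        return json; // placeholder for the bound object
    }
}
```

The feature request (option 3) is precisely to collapse these two passes into json-iterator's single parse.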
I understand the last scenario will not save CPU resources, but json-iterator is much better optimized than other JSON libraries. So I assume the JSON-validation feature could also be faster ;-)
Here is what I plan to do to support it:
btw: What is your current JSON-schema library?
Sounds great.
We use the everit-org library (https://github.com/everit-org/json-schema). However, networknt seems faster for big JSON files (https://github.com/networknt/json-schema-validator), but it is based on Jackson and we did not want to use two JSON libraries.
I am the author of networknt/json-schema-validator and am thinking about switching from Jackson to either DSL-JSON or JsonIter. Not only that, I have several microservices frameworks that all depend on Jackson now and would need to be switched. I am still in the process of evaluating both DSL-JSON and JsonIter. I like the Any in JsonIter but haven't found an example of serializing an object to a ByteBuffer yet. Also, do we have something like this? https://github.com/ngs-doo/dsl-json/blob/master/java8/src/main/java/com/dslplatform/json/ResultSetConverter.java
Thanks for the great work.
I recently discovered json-iterator and it rocks: simple, efficient, quite powerful binding.
However, the competition is hard and the business needs are increasing day after day. So a very great upgrade would be the capability to manage JSON Schema directly within the lib (http://json-schema.org).
It would allow consistency checks to be performed outside the Java source code in a simple way. We already use JSON Schema and it simplifies our developments a lot (as many of our JSON files come from external partners and we must be sure of their consistency before inserting them into our databases).
PS: This post is a change request, not an issue ;-)