SteadBytes opened this issue 4 years ago
I'm curious about the "High Level Goals". With the exception of defining models using JSON Schema, fastapi https://fastapi.tiangolo.com was created to meet the high level goals. Is RESTX aiming to become a synchronous version of fastapi?
@vdwees There are many frameworks working toward those high level goals. marshmallow has a setup built on Flask already. There are similar frameworks out there on top of Django, Bottle, Hug, etc. (actually, Hug might be compliant out of the box... it's been a little while since I used it). Some more (ha) examples include marshmallow-code/flask-smorest, connexion, pyramid w/ the pyramid_openapi3 plugin, etc.
So, no, I don't think we're a synchronous version of fastapi. If anything, we're closest to flask-smorest, especially if we decide to use marshmallow for generating models. That's part of what makes the Python community so great and frustrating all at once. See a problem? Make a framework or library, and put it out there! Did someone else do something similar? Probably! But that's ok, cause maybe yours does it a little bit differently!
On topic now, we should probably split up doing some research on different modeling libraries, coming up with good pros/cons of each, and then coming to a decision. Some things we should consider:
We should also consider the "write it ourselves" approach, taking into account the time it'd take to do well, and if we can personally maintain it to the level of other libraries.
Agreed with @j5awry on the fastapi point, these are two different projects and I'm fairly certain flask-restplus existed before fastapi, so if anything it's the other way round 😉
I agree with your dissection there @j5awry. I'm hoping to take a first step in looking at 5. and/or the write-it-ourselves approach this weekend (I was planning to last week but got dragged into the failing CI issues instead 🤷♀️)
Like I said already, I trust you folks on the swagger/openapi part. I believe you know about this subject way more than me.
Anyway, in my view, we will soon be facing a dilemma:
So I'd suggest we release v1.0.0 with the removal of python <= 3.5 support and a clear deprecation notice for the reqparse usage, but we should keep maintaining a 1.x branch for, let's say, 1 year where we would ship bugfixes and compatible improvements.
And in parallel, we can go ahead and release a v2.0.0 where we simply drop support for reqparse and we start working on an alternative.
My personal opinion is that we should use an external package like Marshmallow for model validation and schema generation. The community around Marshmallow is really good and you can see day-to-day activity on that project.
I've followed flask-restx since it was forked from the original flask-restplus. You also have a really good community, but I think there are not enough people to cover creating a "new marshmallow"; integration with the existing Marshmallow should be enough for now.
I'm open to discussion and I also have a little bit of free time, so I can help you with the integration if you choose to go that way.
I've been experimenting a little bit with what API we could provide. This is a modified version of the todo_blueprint.py example aiming to explore a possible user-facing API for the models/validation with Marshmallow (not tied to Marshmallow, just where I've decided to start).
The idea is to keep the existing @api.marshal (for responses) and @api.expect (for requests) API but use a Marshmallow Schema. Behind the scenes, the validation, marshalling and OpenAPI spec generation will take place.
Disclaimer: This is not meant to be a good example of implementing a REST API, just exploring the model definition/marshalling/request parsing API.
Any thoughts on the API? (not so much the underlying implementation)
from uuid import uuid4
from flask import Blueprint, Flask
from marshmallow import Schema, fields
from flask_restx import Api, Resource
api_v1 = Blueprint("api", __name__, url_prefix="/api/1")
api = Api(api_v1, version="1.0", title="Todo API", description="A simple TODO API")
ns = api.namespace("todos", description="TODO operations")
# Class-style schema
class Task(Schema):
description = fields.String(required=True, description="The task details")
class Todo(Schema):
id = fields.String(required=True, description="The todo ID")
    task = fields.Nested(Task, required=True)
    completed = fields.Boolean(required=True)
created_at = fields.DateTime(required=True, format="iso")
completed_at = fields.DateTime(format="iso")
# Dictionary-style schema (unchanged from user API perspective)
Todo = api.model(
"Todo",
    {
        "id": fields.String(required=True, description="The todo ID"),
        "task": fields.Nested(Task, required=True),
"complete": fields.Boolean(required=True),
"created_at": fields.DateTime(required=True, format="iso"),
"completed_at": fields.DateTime(format="iso",),
},
)
# Dummy definitions for demonstration purposes
def db_get_todo_or_abort(todo_id: str) -> dict:
""" Fetch Todo from DB by id, abort with `404` if not found. """
pass
@ns.route("/<string:todo_id>")
@api.doc(responses={404: "Todo not found"}, params={"todo_id": "The Todo ID"})
class Todo(Resource):
"""Show a single todo item and lets you delete them"""
@api.marshal_with(Todo)
def get(self, todo_id):
"""Fetch a given resource"""
# Dict returned here is passed through Marshmallow schema for validation
# and marshalling to JSON in the response
data = db_get_todo_or_abort(todo_id)
return data
@api.expect(
Task, location="form",
)
# Could also define inline as a Dict
# @api.expect(
# {"task": fields.String(...)}, location="form",
# )
@api.marshal_with(Todo)
def put(self, todo_id, task):
"""Update the task of a Todo"""
# args automatically validated and passed in as an instance of Task -
# no parser.parse_args
data = db_update_todo(todo_id, task) # Use the Marshmallow schema directly
        # Todo Marshmallow schema returned is automatically marshalled into JSON
# in the response
return Todo(**data)
@ns.route("/")
class TodoList(Resource):
"""Shows a list of all todos, and lets you POST to add new tasks"""
# Define some query params for filtering
@api.expect(
{"completed": fields.Boolean(), "completed_at": fields.DateTime(format="iso")},
location="query",
)
# Marshal a list of a given schema
@api.marshal_list_with(Todo)
def get(self, args):
"""List all todos"""
# Imagine this uses the filters in args properly to filter the result set
# from the DB...
return db_get_all("todo", filters=args)
    # Validate args with the Task schema then dump to a dict before passing
    # into the handler
@api.expect(Task, location="json", as_dict=True)
@api.marshal_with(Todo, code=201)
def post(self, args):
"""Create a todo"""
# Just an example, I don't want to start a flame war around which values
# should be used for IDs in a DB!
todo_id = uuid4()
# Do some work with args
task = sanitise_input(args["task"])
data = db_create_todo(str(todo_id), task)
return data, 201
if __name__ == "__main__":
app = Flask(__name__)
app.register_blueprint(api_v1)
app.run(debug=True)
Any comments on that initial API example? I've got some ideas about the implementation, however it's going to be a significant amount of work to do properly, so I don't want to get started unless we're generally happy with the user-facing API 😊
The core Namespace/Api/Swagger classes are tightly coupled to the Model implementation at the moment, so I imagine it's not going to be a simple process 😅 Current extensions to integrate Marshmallow (e.g. flask-accepts) generally convert the Marshmallow schema to flask-restx models/reqparse objects. However, this work will be largely replacing models/reqparse so that we can fix a whole host of issues! 😁
I like the idea though I would need more details on this part:
# args automatically validated and passed in as an instance of Task -
# no parser.parse_args
Do you have an idea of how we can split this work to help you out?
I also have a concern about one specificity of restplus/restx: the Wildcard field. It is documented here. @sloria do you know if there is an equivalent in marshmallow?
In marshmallow, the handling of unknown fields is configurable. https://marshmallow.readthedocs.io/en/stable/quickstart.html#handling-unknown-fields
There isn't a built-in way to do the globbing feature of wildcard, but using the above setting you can accept unknown fields.
Also, there's fields.Dict for nested, unstructured data: https://marshmallow.readthedocs.io/en/stable/api_reference.html#marshmallow.fields.Dict, but I'm not sure that meets the same use case.
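To make those two options concrete, here's a minimal sketch (marshmallow 3.x; the schema and field names are illustrative, not from flask-restx):

```python
from marshmallow import INCLUDE, Schema, fields

class FlexibleSchema(Schema):
    class Meta:
        # Accept unknown fields and keep them in the loaded data instead of
        # raising a ValidationError (the default) or silently dropping them.
        unknown = INCLUDE

    name = fields.String(required=True)

class ConfigSchema(Schema):
    # Unstructured nested data: arbitrary string keys mapping to integers.
    counters = fields.Dict(keys=fields.String(), values=fields.Integer())
```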
Thanks for the quick reply.
My understanding is that unknown fields don't get parsed/validated.
The fields.Dict looks interesting, but like you said, I don't think it is what we need either. I'll run some tests though.
So, one thing that's bothered me about "wildcard" is that it's not really what it is in JSON Schema (which is where restx goes before OpenAPI). It's "technically" a regex, right? So I think we need to look at how JSON Schema + OpenAPI deal with regexed values. I'm fairly certain what we'll see is that OpenAPI doesn't actually handle regex values for object names (which is to say, in my quick search, I'm not seeing that...)
I need to give more time to reading this and looking at things. One thing that might help us ensure we're capturing the correct things is filling out some user stories, then ensuring the interface matches the expectations there.
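For reference, a rough illustration (a plain dict, not flask-restx code) of what a Wildcard maps to conceptually in JSON Schema terms: regex-named properties are expressed with patternProperties, which OpenAPI 2.0/3.0 does not support but OpenAPI 3.1 (aligned with JSON Schema) does.

```python
# Hypothetical schema: any property whose name matches ^metric_.* must be a number.
wildcard_like_schema = {
    "type": "object",
    "patternProperties": {
        "^metric_.*$": {"type": "number"},
    },
    "additionalProperties": False,
}
```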
I don't think such an equivalent exists in the specs either, and that's what led to #57.
But I know of several use cases for the Wildcard field. Now, since I'm the original author of this feature I can't argue that those use cases are bad, so if you think we shouldn't support them, then I'll deal with it.
Here was the original feature request: https://github.com/noirbizarre/flask-restplus/issues/172
Yep @sloria, Schema.from_dict was exactly what I was thinking of for that (I hacked together something using that whilst experimenting) 👍
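For anyone following along, a minimal sketch of what that looks like (marshmallow 3.x; the field names are just illustrative):

```python
from marshmallow import Schema, fields

# Build a Schema class from a dict of fields, mirroring the existing
# dict-style api.model() usage.
TodoSchema = Schema.from_dict(
    {
        "id": fields.String(required=True),
        "task": fields.String(required=True),
    },
    name="Todo",
)

data = TodoSchema().load({"id": "1", "task": "write docs"})
```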
@ziirish I'm not sure at the moment r.e. if/how the work could be split up. I'm going to take some more time over the weekend to look deeper into implementation (as opposed to just API design). Hopefully then I'll have a better idea 👍
@ziirish the use case of wildcard is fine. I think it's an odd implementation issue, and a mismatch with what folks expect to see. I commented on the issue with what I would expect to happen. I think it's just shifting things a bit; a wildcard is nothing but the most open-ended regex possible.
I need more dedicated time to think/look at things. Unfortunately I may not have that time soon. Just woke up early at my company "sprint" and it was too rainy to walk for coffee.
It seems like there's already an awful lot of activity around this area (especially in the marshmallow ecosystem). Specifically, flask-smorest seems to be almost identical to how flask-restx might work when using marshmallow. It uses the same existing libraries that we've been considering to do the 'heavy lifting' of request parsing/OpenAPI schema generation, e.g. webargs, apispec and marshmallow (along with the extensions each of these provide).
I'm not necessarily suggesting we don't go ahead here, but (personally at least) I'm struggling to justify the (fairly large) effort to properly replace the existing models, request parsing etc. - which would certainly introduce breaking API changes anyway - when such similar projects already exist :thinking: Does anyone else have a perspective on this? Like I say, I'm still happy to try and do this, but I wanted to get people's thoughts in general on whether it's worth it :sweat_smile:
Let's separate this into a few larger main points:
- Do we merge it into models but separate from the autodoc? Meaning create a model how we would normally, but don't register it to the API or Namespace.

Some initial answers from my recent notes made on the topic, @j5awry 😊
Interface:
Should we keep the interface(s) stable?
Ideally, yes, but I don't think this is a hard requirement - especially if it makes the implementation much more complex.
Should we expose the underlying implementation more directly (or allow it)?
My ideal goal here is to provide an abstraction for which other serde/validation libraries can implement adapters. E.g. a flask-restx model represents a contract that can be fulfilled by Marshmallow, pydantic etc. We would likely provide first-class support for one of these, but users can plug in as necessary.
Backend:
Keep the same, find another modeling/parsing/marshalling library, or re-write from the ground-up?
Definitely not keep the same. As I mentioned previously, there's a lot of existing work in this area of validation/parsing and it makes sense to utilise this ecosystem where possible.
What input and output formats do we want / are required? Meaning, what format(s) does OpenAPI expect? What options/mappings do we have from input to output?
From the perspective of OpenAPI, all that matters is that we produce a valid OpenAPI schema according to the current JSON schema specification. From the end user perspective, this would be represented as JSON or YAML.
From the perspective of Flask-RESTX (as it currently stands at least), the input to this schema is Resources, Models and various @api.doc decorators, all of which fundamentally modify the __schema__ property of the objects in question.
This is currently quite convoluted and tightly coupled to the implementation of pretty much everything. One of my desires for this effort is to decouple this.
What libraries do it for us already? Do they operate in the way we want?
Some I have already mentioned:
Whether they operate how we want depends on defining exactly what we want 🤣
Is there a way to support OpenAPI 2 and 3?
In theory, I don't see why not. However, IMO, OpenAPI 2 ought to be considered legacy and supported primarily for backwards compatibility.
Requests:
What is the main use of the requests parser? How are people using it now? What do they want out of it? (I'm going to be honest, I never used it)
Difficult to answer this in the general case - is there a way we could get some feedback from users here? Anecdotally, I've primarily used it in the past for validating and parsing filter parameters.
Do we merge it into models but separate from the autodoc? Meaning create a model how we would normally, but don't register it to the API or Namespace.
I'm not quite sure what you mean here, sorry. If a model exists to define what a request should expect as input, then it should be part of the OpenAPI schema.
Hi guys, I'm new here. I would like to give some comments as I use this library.
What is the main use of the requests parser? How are people using it now? What do they want out of it? (I'm going to be honest, I never used it)
Funny to say that I use reqparse only!!! And I'm wondering how you manage to avoid it when you need to filter your data based on user requests. All public APIs work this way. The scenario is simple: when you want to expose your data from a data warehouse, you don't need CRUD... just filtering data based on your custom logic. So I come up with a generic set of parameters needed for each resource, and depending on the context I extend/remove parameters for specific resources.
So, to me reqparse is very important. It can be replaced by something else internally, but the main features need to stay (validation, location, order, add/remove per resource), as in the sketch below.
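A minimal sketch of that usage pattern with flask-restx's existing reqparse API (parameter names are illustrative only):

```python
from flask_restx import reqparse

# A generic set of filter parameters shared by every resource.
base_parser = reqparse.RequestParser()
base_parser.add_argument("limit", type=int, default=50, location="args")
base_parser.add_argument("offset", type=int, default=0, location="args")

# Extend the generic parser for a specific resource...
orders_parser = base_parser.copy()
orders_parser.add_argument("status", type=str, choices=("open", "closed"), location="args")
# ...or drop a parameter that does not apply to it.
orders_parser.remove_argument("offset")

# Inside a Resource method: args = orders_parser.parse_args()
```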
Here I give a few ideas for your consideration. These are mostly related to the fact that in the future openapi will be aligned with json schema, see https://apisyouwonthate.com/blog/openapi-v31-and-json-schema-2019-09.
Currently it is possible to use json schemas to define models both for requests and responses. I mean something like the following:
from flask import Flask
from flask_restx import Api, Resource
app = Flask(__name__)
api = Api(app)
request_schema = get_myendpoint_request_json_schema()    # placeholder for however the schema is loaded
request_model = api.schema_model('myendpoint_request', request_schema)
response_schema = get_myendpoint_response_json_schema()  # placeholder
response_model = api.schema_model('myendpoint_response', response_schema)
@api.route('/myendpoint')
class MyEndpoint(Resource):
@api.expect(request_model, validate=True)
@api.doc(model=response_model)
def get(self):
return {}
app.run()
This is very useful in my opinion because JSON Schema is a standard, so there is no need to implement anything to define models. Furthermore, data that follows some JSON schema could be received from some (potentially non-python) source and a flask-restx endpoint could include this as part of its response. In this case, extending the JSON schema from the original data source makes total sense instead of having to define the response model from scratch.
I have looked at alternatives such as marshmallow and fastapi, but none really allow me to do what I want with simple json objects and flask-restx. So I surely hope that this feature of flask-restx to define models from json schemas is preserved and even improved.
The json schemas could also be used for filtering when marshalling, by defining a subset schema that defines only the information that should be included in the response. It would be implementing something like https://github.com/uber/json-schema-filter but for python.
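A rough sketch of that filtering idea (this is not an existing flask-restx feature, just an illustration of filtering a response dict through a subset schema):

```python
def filter_by_schema(data: dict, schema: dict) -> dict:
    """Keep only the keys declared in the schema's 'properties', recursively."""
    properties = schema.get("properties", {})
    result = {}
    for key, subschema in properties.items():
        if key not in data:
            continue
        value = data[key]
        if subschema.get("type") == "object" and isinstance(value, dict):
            result[key] = filter_by_schema(value, subschema)
        else:
            result[key] = value
    return result

# Only "id" and "task" survive; "internal_notes" is dropped.
subset_schema = {
    "type": "object",
    "properties": {"id": {"type": "string"}, "task": {"type": "string"}},
}
filtered = filter_by_schema(
    {"id": "1", "task": "write docs", "internal_notes": "secret"}, subset_schema
)
```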
I was wondering, given the TodoMVC example, whether there were meant to be 2 models: one for the request payload and one for the response.
A lot of the time there are certain properties that are synthetically generated by the application and shown in the response, but they are not properties that you set during the request.
Therefore, are they meant to be 2 different models? Or is there a way to have 1 model and declare which properties are needed during marshal in and which are needed during marshal out?
Oh, I now see that readOnly is meant to toggle properties between GET/POST, etc. How come properties that are readOnly still show up for the POST method?
DW, I misread; it was meant to be readonly, not readOnly.
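For anyone else who hits this, a minimal sketch of the lowercase readonly keyword on flask-restx's native fields (the model and field names are illustrative):

```python
from flask import Flask
from flask_restx import Api, fields

app = Flask(__name__)
api = Api(app)

# readonly=True marks the field as readOnly in the generated Swagger document:
# it shows up in responses but is not expected in request payloads validated
# against the model.
todo = api.model("Todo", {
    "id": fields.String(readonly=True, description="Assigned by the server"),
    "task": fields.String(required=True, description="The task details"),
})
```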
Hi! I'm not sure if my inputs will make a difference but here they are: 1) Keep the API less verbose; the marshmallow, webargs and apispec setup used to be great, but with all those decorators it's less readable. 2) Can we draw from FastAPI's beautiful API? I propose we use something less verbose like pydantic, Faust or even python dataclasses, if anything at all. 3) A dependency injection feature really helps, even if that's a single layer. By that I mean dependency injection that only works for endpoint methods; that's a great feature to have.
I have been trying to come up with something similar, FlaskEase. It is far from perfect, with zero test coverage as of now. I want something similar but built with community effort, that way it's easy to maintain.
Are you guys planning to handle the host parameter in another way? https://github.com/python-restx/flask-restx/blob/master/flask_restx/swagger.py#L270
It's difficult when you want to generate openAPI specification and put this into the repository as an individual file, or just test this at https://editor.swagger.io/.
An option to support backwards compatibility is to use python reflection with the adapter pattern, so you adapt pydantic or any other framework into the current model. I wrote an adapter that works with the last version of flask_restplus, so it should be compatible with Flask-Restx as long as there were no major changes to the Api models or fields interface.
It's incomplete with respect to all the properties on fields, but it can be extended for full support.
This modular approach could allow a pydantic adapter to live in a different pip package. Developers would need a base class like SchemaAdapter and a way to register the adapter into the instance to override the base class.
""" Flask Adapter
"""
import datetime
import decimal
import re
from typing import Any, List, Union
from flask_restplus import Api, Model, fields
from pydantic import BaseModel
class SchemaAdapter:
""" Example
"""
api: Api
def __init__(self, api: Api):
self.api = api
@staticmethod
def python_to_flask(python_type: type) -> str:
""" Converts python types to flask types
:param type python_type: type that is to be converted into flask type
:return: flask type represented as a string
:rtype: str
"""
if python_type is int:
return 'Integer'
if python_type in [float, decimal.Decimal]:
return 'Float'
if python_type is bool:
return 'Boolean'
if python_type is datetime.datetime:
return 'DateTime'
if python_type is datetime.date:
return 'Date'
return 'String'
    def adapt(self, base_model: Any) -> Model:
""" Base implementation just returns the base_model
"""
return base_model
class FlaskRestPlusPydanticAdapter(SchemaAdapter):
"""
Adapter for flask rest plus pydantic
:param api flask_restplus.Api: flask_restplus Api instance to which is needed for \
api.model function call
"""
def adapt(self, base_model: BaseModel) -> Model:
"""
converts Pydantic model to flask rest plus model
:param base_model pydantic.BaseModel: Pydantic base model the will be
converted to use flask restplus Model
:return: Model instance
:rtype: flask_restplus.Model
"""
result = {}
entity_name = base_model.__model__ if hasattr(base_model, '__model__') else \
re.sub(r'(?<!^)(?=[A-Z])', '_', base_model.__name__).lower()
for name, python_type in base_model.__annotations__.items():
if '__' in name: #skip the python methods
continue
regex = None
description = ""
required = True
field_data = dict(base_model.__fields__.items())[name]
if field_data is not None and hasattr(field_data, 'field_info'):
regex = field_data.field_info.regex
description = field_data.field_info.description
required = field_data.required
            # TODO implement all field attributes: idea - make a dict of attributes
            # and pass them down using **attributes vs making variables for each
# union type includes Optional which is Union[type,None]
if hasattr(python_type, '__origin__') and python_type.__origin__ == Union:
args = list(python_type.__args__)
if type(None) in args:
required = False
args.remove(type(None))
python_type = args[0]
# List logic
if hasattr(python_type, '__origin__') and python_type.__origin__ in [List, list]:
args = list(python_type.__args__)
current_type = self.python_to_flask(args[0])
result[name] = fields.List(getattr(fields, current_type)(
readOnly=False, description=description, required=required, pattern=regex))
continue
            # Nested classes
            if hasattr(python_type, '__bases__') and BaseModel in python_type.__bases__:
                # Recurse into nested pydantic models (the original referenced a
                # non-existent `pydantic_model` helper; `adapt` is assumed here)
                result[name] = fields.Nested(self.adapt(python_type))
                continue
            current_type = self.python_to_flask(python_type)
            result[name] = getattr(fields, current_type)(
                readOnly=False, description=description, required=required, pattern=regex)
return self.api.model(entity_name, result)
Hi @SteadBytes, I find your proposal very interesting. I think new features would be antifragile if they are:
The third point is important so as not to reinvent the wheel. That point has been used by successful modules like PIL (coupled with numpy) and Pandas (with numpy, sqlalchemy, xlsx modules...).
In fact, one of the most important features of restx for me is that I can add restx as a blueprint and have it as an extra module of my flask app.
Today I discovered a module, forked from restplus, that allows the integration of marshmallow and webargs (also hosted by marshmallow); it would be great for it to be integrated in a future version of restx.
Hi everyone, does anyone have a way of adding pydantic validation and doc generation to restx? Does @bluemner's adapter work with flask-restx? Does anyone have an example?
Thanks!
@conradogarciaberrotaran not to say pydantic is incompatible with flask-restx, but it represents a totally different approach (as @bluemner shows above). Work on this specific redesign has halted recently as the current set of maintainers moved into roles where flask-restx has not been part of their day to day work.
One thing, if you're looking for a platform now, there are several frameworks and plugins geared specifically toward pydantic. There's the Flask-Pydantic framework which seems geared toward the validation side. There is also a fairly young project called flask-pydantic-spec specifically for generating OpenAPI specs
Hopefully we'll be able to bring on more people, and help move this design work forward.
What is the main use of the requests parser? How are people using it now? What do they want out of it? (I'm going to be honest, I never used it)
I have to say that I use that a lot and it is a critical feature of flask-restx for me. I am not interested in models etc.
The Reqparser allows you to quickly specify a no-nonsense input spec and handle the validation/conversion.
You can find an example usage in tshistory_rest.
Guys, I think we have to accept it now: the flask community loves Marshmallow, and hence we have to choose one such flask extension. So, I think APIFairy it is; miguel built it, so it has to be stable. We can start using it and create PRs for features that we think we need. IMO
I like @zero-shubham's proposal. I agree with Marshmallow as the most loved.
I think the deprecation warning has damaged the image of the project and we should offer an alternative as soon as possible. It discourages use of the parser and perhaps use of the entire module.
I'm back with another option :) flask-smorest
Any updates on this? Is flask-restx compatible with marshmallow?
I have been using the combination flask-restx + Marshmallow + flask-accepts for a couple of years now, and it does work pretty smoothly. I like flask-accepts' seamless handling of models and query parameters via decorators and its built-in integration with the swagger API docs. I'm obviously digging through the proposals of the contributors to this thread: just offering my 2 cents for consideration.
It seems to me also a very backward-compatibility friendly option, as it would still leave the possibility to use the old flask-restplus schema-handling and the hideous reqparse. Happy to contribute to the integration if the community decides on the Marshmallow + flask-accepts option :)
I experimented today with dropping in pydantic models to replace flask_restx native models, and I actually had some success. There are definitely problems, but it feels very close to "just working."
My approach was to leverage pydantic's ability to produce OpenAPI schema from models, eg.:
namespace.schema_model(my_model.__name__, my_model.schema())
Some models worked without any issue! I was pleasantly surprised. I also discovered I hadn't updated flask_restx in a while, so this was working on 0.2.0.
I did get an issue where some models would not get rendered to the "definitions" object in the OpenAPI schema, which would then cause problems for any models that depend on them. I think this is actually specific to 0.2.0, though, and should be fixed with the new RESTX_INCLUDE_ALL_MODELS config option if I'm reading the code right. If I remember, I'll edit with the results on 0.5.0 later.
Fwiw, I would recommend against Marshmallow. Not anything against it--honestly, I haven't used it--but it uses essentially the same approach of forcing us to feed long arg lists to a field function or class for every single field. It's tedious enough having to do that for sqlalchemy; it's irritating to have to do it again, particularly for models that exactly match an ORM class. pydantic or even dataclasses alleviate a lot of that by requiring only a type annotation for most fields.
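To illustrate the verbosity difference being described, a rough side-by-side (assuming marshmallow and pydantic are installed; the model is purely illustrative):

```python
from marshmallow import Schema, fields
from pydantic import BaseModel

class UserSchema(Schema):
    # marshmallow: one field constructor call per attribute
    name = fields.String(required=True)
    age = fields.Integer(required=True)

class User(BaseModel):
    # pydantic: plain type annotations are enough
    name: str
    age: int
```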
Update: Decided I couldn't wait until Monday and tested with 0.5.0. The pydantic models I used earlier are so far working perfectly! No changes to my code at all.
So, Pydantic can serve as a drop-in replacement! 😄
I'll go through these a little bit, and offer thoughts (as a maintainer):

re: APIFairy -- Not 100% convinced of it, nor how to make it drop into the same workflow in restx.

re: flask-accepts -- it's a great addition on top of flask-restx. We've had good conversations with apryor6, and they've helped out at various points on the project. Taking inspiration from it (or finding more ways to meld it in) would be a win.

re: pydantic -- it's really exciting that it generally drops in as a replacement! I don't see pydantic as an OR but as an AND. It'd be great to have some examples of working pydantic integration, what's working and not working (it looks like nested schemas aren't working, for instance, which may be tied to restx using jsonschema v4). If we go the route of officially documenting pydantic as a possibility though, we'll want to ensure that at least one maintainer is well versed in pydantic.
Some assertions (as a maintainer):

- Models should remain as close as possible to the same. We shouldn't break api.Model for users whose codebase worked on X version until we hit a major revision. And, even then, we'll have to do a level of change management and announcements we'd never done before. Otherwise, we'll break someone's production system.
- If we adopt Marshmallow, then we can have api.Model be similar, but turned into Marshmallow, and also allow api to take Marshmallow directly. Then we can worry less about our types over time (just using the Marshmallow type system, which is more robust and inclusive).
- The api.schema controller is doing a lot of heavy lifting to massage the data.
- I am for having less repetition in code and brevity where possible, but models must also be robust enough to be effective. Something like [marshmallow-sqlalchemy](https://pypi.org/project/marshmallow-sqlalchemy/) is an approach where at least a single modeling style is used. I think that's probably more important -- consistency.
- dataclasses intrigue me greatly as builtin ways of handling this issue, and they deserve attention. However, as they're not supported across all the Python versions we support (and even though we don't list 3.4, somehow it's the most downloads, so something, somewhere, is pulling a lot of 3.4), we need to be careful about it.

Based on your thoughts and assertions, I have the feeling that expanding the documentation and capability of schema_model()
would be the path of least resistance. With that, schema_model
can be used with Pydantic, or any library that can take a dataclass
or attrs
class and generate a schema, or just any library that can produce a schema from some input. Then, it's easy for users to "bring your own serializer library" without having to abandon or back-seat efforts to improve or replace the native API, nor to maintain some compatibility layer for whatever such library happens to be popular this month.
It'd be great to have some examples of working pydantic integration

Here's a sampling based pretty closely on my actual code. Nested models work for me so far 😄 (I haven't stress-tested, though, ymmv. Update: I've been using this pattern in production for months with no issues except those I create myself. Update 2: Actually, one issue I just found: schema models seem to not play nicely with marshal_list_with.) One thing that would make it even easier would be the option to return a JSON string rather than a dict. You'll see why towards the bottom, and the link to the relevant issue on Pydantic's github.
~~This won't run as written, but it should at least be pretty close (should just need a real function instead of get_stuff_from_backend).~~ This should run as written, but I haven't tested it. The goal was just to show the pattern I found to work with flask_restx v0.5.0 and pydantic 1.8.2.
import json
import random
from datetime import datetime
from typing import List, Union
import flask
from flask_restx import Api, Namespace, Resource
from pydantic import BaseModel, Field
class Error(BaseModel):
error: str
code: int
class Config:
@staticmethod
def schema_extra(schema: dict, model):
schema["properties"].pop("code")
# This just hides the "code" key from the generated schema.
class Network(BaseModel):
id: str
state: str
created: str
title: str
owner: str
description: str
node_count: int
link_count: int
class AssignedHost(BaseModel):
uid: int
hostname: str
pop: str
class Reservation(BaseModel):
uid: int
status: str
owner: str
created_at: datetime
networks: Union[List[Union[Network, Error]], Error] = Field(default_factory=list)
host: AssignedHost
app = flask.Flask(__name__)
app.config["RESTX_INCLUDE_ALL_MODELS"] = True
api_blueprint = flask.Blueprint("api", __name__, url_prefix="/api")
api = Api(api_blueprint, title="with pydantic")
ns = Namespace("ns", path="/namespace")
ns.schema_model(Error.__name__, Error.schema())
ns.schema_model(Network.__name__, Network.schema())
ns.schema_model(AssignedHost.__name__, AssignedHost.schema())
ns.schema_model(Reservation.__name__, Reservation.schema())
def get_stuff_from_backend(id: int):
return random.choice((
Reservation(
uid=id,
status="golden",
owner=1,
created_at=datetime.now(),
networks=[],
host=AssignedHost(uid=1, hostname="example.com", pop="LHR"),
),
Error(error="oh noes!", code=500),
))
@ns.route("/<int:id>")
class Sample(Resource):
@ns.response(200, "Success", ns.models["Reservation"])
@ns.response("40x,50x", "Error", ns.models["Error"])
def get(self, id):
result = get_stuff_from_backend(id)
if isinstance(result, Error):
            json_ = json.loads(result.json())  # This is a pydantic limitation with an open issue:
# https://github.com/samuelcolvin/pydantic/issues/1409
code = json_.pop("code")
return json_, code
else:
return json.loads(result.json())
api.add_namespace(ns)
if __name__ == "__main__":
app.run()
edit: Added the RESTX_INCLUDE_ALL_MODELS setting. Thanks to Lukas' comment below for reminding me. Also made a definition for "get_stuff_from_backend" so one can (theoretically) run this snippet as-is.
Hi,
I did a marshmallow schema to restx model converter; it's built on the existing restx models as a helper function mostly.
It's not a model re-design indeed, but feel free to integrate it.
Basic Usage:
from flask import Flask, request
import flask_restx as restx
from flask_restx import Resource, Api
import marshmallow as ma
from flask_restx import marshmallow_to_restx_model  # the converter function defined below (proposed; not part of flask_restx yet)
app = Flask(__name__)
api = Api(app)
class SimpleNestedSchema(ma.Schema):
simple_nested_field = ma.fields.String(required=False, metadata={'description': 'the description of simple_nested_field'})
class SimpleSchema(ma.Schema):
simple_field1 = ma.fields.String(required=True, metadata={'description': 'the description of simple_field1'})
simple_nest = ma.fields.Nested(SimpleNestedSchema)
# Give it as parameters the flask-restx `Api` or `Namespace` instance and the Marshmallow schema
simple_nest_from_schema = marshmallow_to_restx_model(api, SimpleSchema)
@api.route('/marshmallow-simple-nest')
class MaSimpleNest(Resource):
# Place it where you need a restx model
@api.expect(simple_nest_from_schema, validate=True)
def post(self):
return request.json
if __name__ == '__main__':
app.run(debug=True)
No more duplicate schemas! :)
The code:
import marshmallow as ma
import flask_restx as restx
from flask_restx import Api
from typing import Callable, Union
__all__ = [
"restx_fields",
"marshmallow_to_restx_model"
]
# Flask-RestX replacements for marshmallow fields
restx_fields_mapper = {
"Str": "String",
"Bool": "Boolean",
"Int": "Integer",
"Email": "String",
"Mapping": "Raw",
"Dict": "Raw",
"Tuple": "List",
"UUID": "String",
"Number": "Integer",
"Decimal": "Float",
"NaiveDateTime": "DateTime",
"AwareDateTime": "DateTime",
"Time": "DateTime",
"Date": "DateTime",
"TimeDelta": "DateTime",
"URL": "String",
"Url": "String",
"IP": "String",
"IPv4": "String",
"IPv6": "String",
"IPInterface": "String",
"IPv4Interface": "String",
"IPv6Interface": "String",
"Constant": "String"
}
def restx_fields(
description: str = None,
enum: str = None,
discriminator: str = None,
min_length: int = None,
max_length: int = None,
pattern: str = None,
attribute: str = None,
default: Union[int, float, str, bool, dict, list] = None,
title: str = None,
required: bool = True,
readonly: bool = False,
example: str = None,
mask: dict = None
):
"""
To be used in marshmallow field `metadata` if there are conflicting keys.
Let's say you need `description` field from metadata in other place than for restx field.
Ex:
```py
class MaSchema(ma.Schema):
name = ma.fields(
required=True,
metadata={
**restx_fields(description="The username"),
'description': 'needed for something else'
}
)
If you don't use `metadata` parameter for other operations you can just specify the fields in the dict
No need to use `restx_fields` function
Ex:
```py
class MaSchema(ma.Schema):
name = ma.fields(required=True, metadata={'description': 'The username'})
```
"""
return {'restx_params': {
'description': description,
'enum': enum,
'discriminator': discriminator,
'min_length': min_length,
'max_length': max_length,
'pattern': pattern,
'attribute': attribute,
'default': default,
'title': title,
'required': required,
'readonly': readonly,
'example': example,
'mask': mask
}}
def get_marshmallow_field_type(ma_field: Callable) -> Union[str, None]:
    """ Get string name for field type
    :param ma_field: marshmallow field
    :return: string name of the field
    """
    # the field's class name (e.g. "Str", "Nested"), mapped to a restx field name if needed
    attr_name = type(ma_field).__name__
    if attr_name in restx_fields_mapper:
        return restx_fields_mapper[attr_name]
    return attr_name
def get_restx_params(ma_params: dict):
    """
    If the `restx_params` key is present in the marshmallow field's `metadata`,
    it is used for the restx field kwargs; otherwise all keys from `metadata`
    are used as kwargs for the flask-restx field.
    :param ma_params: vars from marshmallow field
    :return: flask restx field kwargs
    """
restx_params = ma_params['metadata'].get('restx_params') or ma_params['metadata']
return {
'required': ma_params['required'],
**restx_params,
}
def get_field_data(ma_field):
    """ Get data required to create restx model
    :param ma_field: marshmallow field
    :return: dict with info needed to create restx model
    """
    return {
        "params": get_restx_params(vars(ma_field)),
        "type": get_marshmallow_field_type(ma_field),
        "nested": None,
        "raw": ma_field
    }
def get_marshmallow_metadata(schema: Callable):
    """ Returns from marshmallow schema the following dict:
{
"schema1": {
"field_name1": {
"params": {},
"type": "String",
"nested": None,
'inner': field data
"raw": marshmallow_field,
},
"field_name2": {
"params": {},
"type": "String",
'inner': field data
"raw": marshmallow_field,
"nested": {
"schema2": {
"field_name1": {
"params": {},
"type": "String",
"nested": None,
'inner': field data
"raw": marshmallow_field
}
}
}
}
}
}
"""
marshmallow_metadata = {schema.__name__: {}}
# Simple fields
for field_name, ma_field in schema().declared_fields.items():
marshmallow_metadata[schema.__name__][field_name] = get_field_data(ma_field)
# Added recursion for nested fields
for field_name, field_data in marshmallow_metadata[schema.__name__].items():
if field_data['nested'] is None:
if isinstance(field_data['raw'], ma.fields.Nested):
marshmallow_metadata[schema.__name__][field_name]['nested'] = get_marshmallow_metadata(
field_data['raw'].nested)
if isinstance(field_data['raw'], ma.fields.List):
if hasattr(field_data['raw'].inner, 'nested'):
marshmallow_metadata[schema.__name__][field_name]['nested'] = get_marshmallow_metadata(
field_data['raw'].inner.nested)
else:
# ex: ma.fields.List(ma.fields.String)
marshmallow_metadata[schema.__name__][field_name]['inner'] = get_field_data(field_data['raw'].inner)
return marshmallow_metadata
def get_restx_field(api: Api, ma_field_meta: dict, *, nested: bool = False):
    if nested:
        return restx.fields.Nested(
            api.model,
            **ma_field_meta['params']
        )
if ma_field_meta['type'] == "List" and "inner" in ma_field_meta:
return restx.fields.List(
getattr(restx.fields, ma_field_meta['inner']['type'])(**ma_field_meta['inner']['params']),
**ma_field_meta['params']
)
restx_field = getattr(restx.fields, ma_field_meta['type'])
restx_field_instance = restx_field(api.model, **ma_field_meta['params'])
restx_field_instance.default = None
return restx_field_instance
def ma_metadata_to_restx_model(api: Api, ma_metadata: dict):
    restx_model = {}
for schema_name, mameta in ma_metadata.items():
for field_name, ma_field_meta in mameta.items():
if ma_field_meta['nested'] is None:
restx_model[field_name] = get_restx_field(api, ma_field_meta)
else:
restx_model[field_name] = ma_metadata[schema_name][field_name]
# Added recursion for nested fields
for field_name, field_instance in restx_model.items():
if isinstance(field_instance, dict):
if 'inner' in field_instance:
restx_model[field_name] = get_restx_field(api, ma_field_meta)
if field_instance['type'] == 'Nested':
restx_model[field_name] = get_restx_field(api, field_instance, nested=True)
restx_model[field_name].model = ma_metadata_to_restx_model(api, field_instance['nested'])
if field_instance['type'] == 'List' and field_instance['nested'] is not None:
restx_model[field_name] = restx.fields.List(
restx.fields.Nested(ma_metadata_to_restx_model(api, field_instance['nested'])),
**ma_field_meta['params']
)
return api.model(schema_name, restx_model)
def marshmallow_to_restx_model(api: restx.Api, schema: Callable):
    """ Convert a marshmallow schema to a Flask-Restx model
    :param api: Restx Api instance or Namespace instance
    :param schema: Marshmallow schema
    :return: Restx model from marshmallow schema
    """
    ma_metadata = get_marshmallow_metadata(schema)
    restx_model = ma_metadata_to_restx_model(api, ma_metadata)
    return restx_model
Hello there, I was playing around with pydantic as in https://github.com/python-restx/flask-restx/issues/59#issuecomment-899790061, and even though you register your models they will get deleted somewhere; I could not find the exact spot. However, the docs mention RESTX_INCLUDE_ALL_MODELS, which can be set to True so that all registered models stay in the final result schema.
Please react if I should provide an adjusted working example.
No more duplicate schemas! :)
@ClimenteA Nice solution! How do I use it with query parameters, or how do you solve this? It seems that the parser from reqparse is still necessary in this case?
The result from the marshmallow_to_restx_model function can be used anywhere a model is needed. The cases of query parameters and file upload are not covered.
The marshmallow_to_restx_model function is just a mapper between marshmallow fields and the existing restx model fields.
I'd like to just declare the request model once, validate it, and convert it into a python object without having to create a custom model => python object transformer. It'd certainly improve DX, and it'd be a useful feature. Is this currently a feature of restx? I haven't seen it. I apologise if I don't have the complete picture here, but that shouldn't be so hard to pull off?
@abeiertz flask_restx models don't currently provide that (unless I've missed some release notes?). I'll plug Pydantic once more. The models you create work very much like dataclasses or attrs outside of the schema generation + validation context that flask_restx models serve. They perfectly do double-duty as your input/output models and as useful Python objects that you can pass around, add methods to, etc. See my earlier comment.
The only real drawback I've come across is that any kind of schema_model needs to be used with the @response decorator. I haven't tested a lot with @marshal_with, but @marshal_list_with definitely can't take a schema_model which describes an object. (You get an AttributeError: 'SchemaModel' object has no attribute 'items' from Six.) This hasn't been much of an issue for me since I found that @marshal_with is actually really inflexible (https://github.com/python-restx/flask-restx/issues/347) and I have been using the @response decorator and calling marshal directly anyway, but if you already use @marshal_with a lot it could be problematic.
I believe Marshmallow models are also plain-old python objects like Pydantic models and so can be used in similar ways. I honestly don't know the differences as I've never used Marshmallow.
Is there any progress in Flask-RESTX on supporting webargs and marshmallow? From the discussion, it seems it's still being discussed?
I don't see an automated way to turn a SchemaModel or BaseModel into the fields mapping that an invocation of @marshal_with expects to see as its first argument.
Forgive my ignorance, and please correct me where I went wrong. Do I need to create an object with flask_restx.fields.***() values for every field of an object that I want to marshal?
It would be a more favorable result if I could create a SchemaModel from the schema_model function and pass that Model to the @marshal_with decorator like so:
import json

from flask import Flask, request
from flask_restx import Resource, Api
if not dir().count('app'):
app = Flask(__name__)
api = Api(app, doc='/swaggerui/', version="1.0.0", title="API",
description="API")
update_service_ns = api.namespace("Update Service",
description="Software update service",
path=f'{ROOT}/UpdateService')
def load_schema():
with open('./models/UpdateService_schema.json') as f:
return update_service_ns.schema_model('UpdateService', json.load(f))
@update_service_ns.route("/")
class UpdateServiceRoot(Resource):
""" Represents the UpdateServiceRoot
"""
@api.marshal_with(load_schema())
def get(self, **kwargs):
""" Return the UpdateServiceRoot json.
"""
If I run the code above, I get:
Traceback snip...
File "/usr/local/lib/python3.8/dist-packages/flask_restx-0.5.1-py3.8.egg/flask_restx/marshalling.py", line 181, in _marshal
for k, v in iteritems(fields)
File "/usr/lib/python3/dist-packages/six.py", line 589, in iteritems
return iter(d.items(**kw))
AttributeError: 'SchemaModel' object has no attribute 'items'
This is obviously because I haven't created the fields mapping that the @marshal_with decorator expects to see. Is there an easy way to translate one into the other?
Edit: Well, I think I understand that I can't use marshal_with, but I have to use the @response decorator instead. That will probably work just as well as the @api.doc decorator, right? Thanks for the help. I hope there can be some kind of update to this thread with a minimal example of how to marshal a response using a result from schema_model. Cheers
Any updates here?
Dataclasses, Marshmallow and/or Pydantic would be great, and a more generic way of declaring models would be useful to decouple things.
I have to use the @response decorator instead. That will probably work just as well as the @api.doc decorator, right?
Yep; @response basically just takes its arguments and shapes them into a format consumable by @doc and then calls @doc. If you dislike having a half-dozen decorators on your methods, you can actually replace almost all of them with a single @doc and a big dictionary :smiley:
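A minimal sketch of that consolidation (the route, model and parameter names are illustrative; entries in responses can be either a description string or a (description, model) tuple):

```python
from flask import Flask
from flask_restx import Api, Resource, fields

app = Flask(__name__)
api = Api(app)
ns = api.namespace("todos")

todo_model = ns.model("Todo", {"task": fields.String(required=True)})

@ns.route("/<string:todo_id>")
class TodoItem(Resource):
    # One @doc call carrying what would otherwise be several stacked
    # @response/@doc decorators on the method.
    @ns.doc(
        params={"todo_id": "The Todo ID"},
        responses={200: ("Success", todo_model), 404: "Todo not found"},
    )
    def get(self, todo_id):
        return {"task": "example"}
```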
I would like to be able to declare a model with a python dataclass, then use marshmallow-dataclass to generate a serializer/deserializer, and then have some utility to produce a flask-restx model from that serializer/deserializer.
import marshmallow_dataclass
from datetime import datetime, date, time
from typing import List
from dataclasses import dataclass, field
@dataclass
class WeekTimeSlotDto:
DayOfWeek: int
StartTime: time
Duration: int
WeekTimeSlotId: str = None
@dataclass
class ScheduleGenerateDto:
TeacherId: int
ScheduleType: str
StartDate: date
Number: int
EndDate: date = None
WeekTimeSlots: List[WeekTimeSlotDto] = field(default_factory=list)
ScheduleGenerateSchema = marshmallow_dataclass.class_schema(ScheduleGenerateDto)()
def load_ScheduleGenerateDto(payload) -> ScheduleGenerateDto:
return ScheduleGenerateSchema.load(payload)
schedule_model = make_restx_model(ScheduleGenerateSchema)  # the utility I would like to have (doesn't exist yet)
@api.route('/generate_schedule_entry')
class ScheduleGenerate(Resource):
@api.expect(schedule_model, validate=True)
def post(self):
schedule = load_ScheduleGenerateDto(api.payload)
return {}
This seems to be more of an integration between flask-restx models and marshmallow. Let me know what you think!
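One possible shape for the make_restx_model utility wished for above, assuming the third-party marshmallow-jsonschema package is used to dump the marshmallow schema to JSON Schema; this is only a sketch, not an existing flask-restx helper:

```python
from marshmallow_jsonschema import JSONSchema  # third-party package: marshmallow-jsonschema

def make_restx_model(api, schema_instance, name=None):
    """Register a marshmallow schema instance with an Api/Namespace as a schema model."""
    json_schema = JSONSchema().dump(schema_instance)
    name = name or type(schema_instance).__name__
    return api.schema_model(name, json_schema)

# Usage with the dataclass-derived schema above:
# schedule_model = make_restx_model(api, ScheduleGenerateSchema)
```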
For quite some time there have been significant issues around data models, request parsing and response marshalling in flask-restx (carried over from flask-restplus). The most obvious of these is the deprecation warning about the reqparse module in the documentation, which has been in place for far too long. These changes have been put off for various reasons which I won't discuss here, however now that the new fork is steadily underway I (and no doubt others) would like to start addressing this.

Since this digs quite deep into the architecture of flask-restx there will be significant (and likely breaking) changes required. As such, this issue is to serve as a discussion around the API we would like to provide and some initial ideas of how to best proceed. This is not intended to be the starting point of hacking something together which makes things worse!

I will set out my current thoughts on the topic; please contribute by adding more points and expanding on mine with more discussion.

High Level Goals:
- Replace reqparse and models

General Issues/Discussion Points:
- Keep the existing api.marshal, api.doc decorator style?
- Deprecation/transition path from reqparse and the existing models interface?

Resources/Notable Libraries