litestar-org / litestar

Production-ready, Light, Flexible and Extensible ASGI API framework | Effortlessly Build Performant APIs
https://litestar.dev/

Enhancement: Harmonize example generation #3181

Open tuukkamustonen opened 7 months ago

tuukkamustonen commented 7 months ago

Summary

The current OpenAPI example generation is a bit of a mixed bag. This issue collects the various separate issues and documents some additional needs.


  1. ~Support ResponseSpec(examples=...) for custom response examples~

    ~This is https://github.com/litestar-org/litestar/issues/3068 (merged)~


Example generation is turned off by default, but can be enabled via Litestar(openapi_config=OpenAPIConfig(create_examples=True)).
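
For reference, a minimal sketch of opting in globally (the title/version values and the route are purely illustrative):

from litestar import Litestar, get
from litestar.openapi import OpenAPIConfig

@get("/items/{item_id:int}")
async def get_item(item_id: int) -> dict[str, int]:
    return {"id": item_id}

# create_examples=True opts in to auto-generated examples across the schema
app = Litestar(
    route_handlers=[get_item],
    openapi_config=OpenAPIConfig(title="My API", version="1.0.0", create_examples=True),
)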

Yet, with example generation off:

  1. Examples are generated for error responses:

                      "examples": {
                        "NotAuthorized": {
                          "value": {
                            "status_code": 401,
                            "detail": "Missing or invalid credentials",
                            "extra": {}
                          }
                        }
                      }

    This is not tunable; the error examples are always present. I would expect them not to exist by default (with example generation turned off). A minimal sketch of a handler that ends up with such a documented error response follows after this list.

  2. ~Examples are generated for response models:~

    ~Even with Litestar(openapi_config=OpenAPIConfig(create_examples=False)) (default), OpenAPI $.components.schemas models get examples.~

    ~Turning ResponseSpec(..., generate_examples=False) disables this, but you'd expect the global flag to do the same first.~

    This was an invalid report; it actually works fine. Ignore this bullet.
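
For item 1 above, a minimal sketch of how such a documented error response can come about; declaring the exception via raises= is one way to get the 401 into the schema, and the "NotAuthorized" example shown above is then generated for it even with create_examples=False (handler and path are illustrative):

from litestar import Litestar, get
from litestar.exceptions import NotAuthorizedException

# The 401 response documented for NotAuthorizedException receives an
# auto-generated example regardless of the create_examples setting.
@get("/protected", raises=[NotAuthorizedException])
async def protected() -> str:
    return "ok"

app = Litestar([protected])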


Once you actually set Litestar(openapi_config=OpenAPIConfig(create_examples=True)), examples are generated, and then the following issues apply:

  1. It's possible to locally turn example generation off per ResponseSpec, but not per Parameter

    Allow Parameter(generate_examples=False), mirroring the existing ResponseSpec support.


  1. The generated examples are inserted as both OpenAPI and JSON schema examples, i.e.:

                        {
                            "name": "path_arg",
                            "in": "path",
                            "schema": {
                                "type": "string",
                                "examples": {  <------- HERE
                                    "path_arg-example-1": {
                                        "value": "EXAMPLE_VALUE"
                                    }
                                }
                            },
                            "required": true,
                            "deprecated": false,
                            "allowEmptyValue": false,
                            "allowReserved": false,
                            "examples": { <------- HERE
                                "path_arg-example-1": {
                                    "value": "EXAMPLE_VALUE"
                                }
                            }
                        }

    It feels a bit unnecessary to do both, when the OpenAPI examples alone should suffice.

    Example generation could be a toggle instead of a flag. It could support 4 modes:

    1. OpenAPI schema only
    2. JSON schema only
    3. Both JSON + OpenAPI
    4. None

    Litestar currently does (3), generating examples in both places. Even if this toggle doesn't get implemented, I believe it's good to identify the case here.

    This is https://github.com/litestar-org/litestar/issues/3057.


  1. ~The order of OpenAPI schema changes between executions~

    ~This is https://github.com/litestar-org/litestar/issues/3059 (merged).~


  1. Define examples in Pydantic (and Msgspec, dataclass, TypedDict) models

    This already works for Pydantic's Field(examples=[...]), but not for ConfigDict(json_schema_extra=...). There should also be a mechanism to define examples for the model itself, within the model (not only per field). (It was decided somewhere that Pydantic's json_schema_extra wouldn't be supported, but there needs to be some mechanism for this, just as for msgspec, dataclasses, et al.) See the sketch after this list.

    Support for declaring query/path/header/cookie arguments as models is tracked in https://github.com/litestar-org/litestar/issues/2015. This is probably related: once those are fully supported, defining examples in the models would presumably also be supported.


  1. ~dict[str, Any] and None trigger example generation always~

    ~This is https://github.com/litestar-org/litestar/issues/3069 (merged)~
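
To make the Pydantic part of the model-examples item concrete, a minimal sketch of the two places examples could be declared on a model; Field(examples=[...]) already flows into the generated schema, while the model-level ConfigDict(json_schema_extra=...) form is what this issue asks to also support (model and handler are illustrative):

from pydantic import BaseModel, ConfigDict, Field

from litestar import Litestar, get

class User(BaseModel):
    # Field-level examples: already picked up for the OpenAPI schema
    name: str = Field(examples=["Alice"])
    age: int = Field(examples=[42])

    # Model-level example via json_schema_extra: currently not reflected
    # in the generated schema; supporting something like this is the request here.
    model_config = ConfigDict(
        json_schema_extra={"examples": [{"name": "Alice", "age": 42}]}
    )

@get("/user")
async def get_user() -> User:
    return User(name="Alice", age=42)

app = Litestar([get_user])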

Basic Example

No response

Drawbacks and Impact

No response

Unresolved questions

No response



guacs commented 7 months ago

@tuukkamustonen thanks for the writeup!

Let me know what you think about all this. I think we could create separate issues for each of these so we can track them individually.

JacobCoffee commented 7 months ago

@guacs

making that change currently may be considered a breaking change

could we use experimental features to backport?

guacs commented 7 months ago

@guacs

making that change currently may be considered a breaking change

could we use experimental features to backport?

We could, but I'm not sure if the feature brings enough of a benefit to have to use a feature flag (and the associated complexity) for it.

guacs commented 7 months ago

Examples are generated for response models

@tuukkamustonen I wasn't able to reproduce this with the following:

from __future__ import annotations
from dataclasses import dataclass

from litestar import get
from litestar.app import Litestar

@dataclass
class Foo:
    foo: int

@get("/")
async def get_foo() -> Foo:
    ...

@get("/favicon.ico")
async def you_cant_have_it() -> None:
    return

app = Litestar([get_foo, you_cant_have_it], debug=True)

tuukkamustonen commented 7 months ago

  • For the defining of examples in Pydantic models themselves, I agree that the plugin should be checking the config dictionary. Though, I'm not sure about msgspec, dataclasses and TypedDict since, as far as I know, there's no official way to do that. For those cases, I think users will have to rely on using Example instead.

That's fair. I'm using Pydantic, so for me supporting only Pydantic would suffice 😄 Though, I think in general it's a good feature to be able to define the examples within the models (less clutter in the handler declarations). Yeah, maybe not for TypedDicts or dataclasses, but it looks like at least msgspec supports generating JSON schema, so maybe for that, too.

I assume there's a technical blocker for that, but couldn't you just use the JSON schema that those libraries generate, when such support is available, and retain custom support only for those that don't have it, i.e. TypedDicts and dataclasses?
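
(For concreteness, the library-provided schema generation referred to here looks roughly like this; the models are illustrative:)

import msgspec
from pydantic import BaseModel

class FooStruct(msgspec.Struct):
    foo: int

class FooModel(BaseModel):
    foo: int

# Each library can emit its own JSON schema for a model
print(msgspec.json.schema(FooStruct))
print(FooModel.model_json_schema())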

I wasn't able to reproduce this with the following:

Okay, I'll check (slowly; we just moved to a new address and the apartment is full of boxes...)

provinzkraut commented 7 months ago

I assume there's a technical blocker for that, but couldn't you just use the JSON schema that those libraries generate, when such support is available, and retain custom support only for those that don't have it, i.e. TypedDicts and dataclasses?

One reason to not use them is consistency. Being able to swap from e.g. a dataclass to a Pydantic model to a msgspec Struct without the OpenAPI schema changing is a valuable feature to have, especially if you're not only using the schema for documentation but e.g. to generate clients or downstream typing from.

Another reason would be the richness of the generated schema. Because Litestar has more context information, e.g. examples, we can generate richer schemas.

And a last reason is that we often generate more specific schemas than the ones provided by the libraries themselves.

tuukkamustonen commented 7 months ago

Another reason would be the richness of the generated schema. Because Litestar has more context information, e.g. examples, we can generate richer schemas.

Anything other than examples?

more specific schemas

And an example of this...?

tuukkamustonen commented 7 months ago

@guacs I think you're right. Examples are not generated for the response models as I argued. My bad. I'll strike it off the list.