adriangb / xpresso

A composable Python ASGI web framework
https://xpresso-api.dev/
MIT License
177 stars 4 forks

feat: support combining multiple security schemes #22

Open adriangb opened 2 years ago

adriangb commented 2 years ago

See https://swagger.io/docs/specification/authentication/

Currently we only support OR for security models: if you specify multiple security models, auth passes if any one of them does. OpenAPI supports combining OR and AND. Technically you can achieve this by setting auto_error=False, but this isn't reflected in OpenAPI.
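For reference, the linked Swagger docs describe the shape OpenAPI uses: the security field is a list of requirement objects, where list items are ORed and the schemes inside a single object are ANDed. A minimal sketch of that evaluation logic (scheme names are hypothetical):

```python
# The outer list is OR: satisfying any one requirement object is enough.
# Keys within a single object are AND: all of those schemes must pass.
security = [
    {"apiKey1": [], "apiKey2": []},  # apiKey1 AND apiKey2
    {"oauth2": ["read"]},            # OR oauth2 with the "read" scope
]

def is_authorized(satisfied_schemes: set) -> bool:
    """Return True if the schemes the client satisfied meet any requirement."""
    return any(satisfied_schemes.issuperset(req.keys()) for req in security)
```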

A better approach would be something like:

from dataclasses import dataclass
from typing import Annotated

from xpresso import AnyOfSecurity, Security

@dataclass
class SecurityModels:
    key1: Annotated[str, Security(...)]
    key2: Annotated[str, Security(...)]

async def endpoint(security: AnyOfSecurity[SecurityModels]) -> None:
    ...

AnyOfSecurity would just be a Dependent subclass with no effect at runtime, but one that we can use when generating the OpenAPI docs. This has a nice parallel with how we handle forms.

adriangb commented 2 years ago

I dug into this a bit; there are several tricky aspects.

One complicating factor is that users might want to declare their security models in different spots, for re-usability. For example, imagine a get_user(...) -> User dependency that requires a "me" scope and a get_user_items() -> List[Item] dependency that requires the "read" scope. If those are completely separate dependencies, how are these models joined? Are they ANDed or ORed? From a dependency injection standpoint, if they each raise an HTTPException, making this an AND (from an OpenAPI perspective) makes sense. So how would users declare an OR? Maybe some sort of composite security object that dispatches to its children and handles the exceptions? But OpenAPI lets us OR ANDed schemes, not the other way around: you can't say key1 AND (key2 OR key3).

adriangb commented 2 years ago

One option is to decouple execution from OpenAPI here. We can have a security param to Operation and App: Operation(security=(APIKeyHeader(...) & APIKeyQuery(...)) | OAuth2Bearer(...)). I guess if the user doesn't pass in this parameter, we default to ANDing all of the models as far as OpenAPI is concerned.

Separately, users would control execution / validation by hand, using auto_error and such.
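As a sketch of what that operator overloading could look like (all names hypothetical, not actual Xpresso API): `&` builds an AND group, `|` builds the OR of AND groups, and OR groups deliberately have no `__and__`, since OpenAPI can't represent ANDing ORed schemes:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Scheme:
    name: str

    def __and__(self, other: "Scheme") -> "AndSchemes":
        return AndSchemes((self, other))

    def __or__(self, other: "Scheme | AndSchemes") -> "OrSchemes":
        return OrSchemes((AndSchemes((self,)), _as_and(other)))

@dataclass(frozen=True)
class AndSchemes:
    schemes: "tuple[Scheme, ...]"

    def __and__(self, other: Scheme) -> "AndSchemes":
        return AndSchemes((*self.schemes, other))

    def __or__(self, other: "Scheme | AndSchemes") -> "OrSchemes":
        return OrSchemes((self, _as_and(other)))

@dataclass(frozen=True)
class OrSchemes:
    alternatives: "tuple[AndSchemes, ...]"

    def __or__(self, other: "Scheme | AndSchemes") -> "OrSchemes":
        return OrSchemes((*self.alternatives, _as_and(other)))

    # no __and__ on purpose: OpenAPI can OR ANDed schemes but not AND
    # ORed ones, so (a | b) & c is a type error and a runtime failure

def _as_and(obj: "Scheme | AndSchemes") -> AndSchemes:
    return obj if isinstance(obj, AndSchemes) else AndSchemes((obj,))

def to_openapi(sec: OrSchemes) -> "list[dict[str, list[str]]]":
    # each AND group becomes one security requirement object;
    # the list of objects expresses the OR
    return [{s.name: [] for s in alt.schemes} for alt in sec.alternatives]
```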

adriangb commented 2 years ago

Here's an API I came up with for this:

from typing import Annotated, Optional

apikey1 = APIKey("apikey1")
apikey2 = APIKey("apikey2")
oauth_model = OAuth("oauth", scopes=["read", "write"])

user_read = oauth_model.require_scopes("read")

async def get_user_id(
    key1: Annotated[Optional[str], Security(apikey1)],
    key2: Annotated[Optional[str], Security(apikey2)],
    token: Annotated[Optional[str], Security(user_read)],
) -> int:
    if key1 and key2:
        validate_keys(key1, key2)
        return int(key1)
    val_token = validate_token(token)
    user_read.validate_scopes(val_token["scopes"])
    return val_token["user_id"]

async def get_items(user: Annotated[int, Depends(get_user_id)]) -> list[str]:
    return ["sandwich"]

app = App(
    routes=[
        Path(
            "/items/",
            get=Operation(get_items, security=(apikey1 & apikey2) | oauth_model)
        )
    ]
)

Pros

  1. I think that using apikey1 & apikey2 is a pretty nice way to express "both keys must be present".
  2. It type checks, so you cannot do (apikey1 | apikey2) & oauth_model (mypy error and runtime failure)
  3. All of the use cases for OpenAPI are satisfied, and we can add a security parameter to only Operation and App to get a 1:1 mapping with OpenAPI.
  4. We get rid of the scopes parameter to Security and, more importantly, the SecurityScopes dependency that FastAPI injects just so that you can check the security scopes that are expected. Instead, the required scopes live on the scheme object itself (e.g. user_read), which can validate them directly.

Cons

  1. There's a disconnect between OpenAPI logic and execution flow. This is evidenced by (1) having to implement get_user_id when we already declared the logic for applying the security layers in Operation(..., security=...) and (2) having to declare user_read = oauth_model.require_scopes("read") and then call user_read.validate_scopes(val_token["scopes"])
  2. The implementation is pretty clunky, needing to do data model stuff with __or__ and __and__.

Alternatives

For convenience, we could make the result of security: Annotated[??, Security((apikey1 & apikey2) | oauth_model)] be usable as a dependency that enforces the dependencies and their boolean logic. The question is: what does this thing return? I don't think returning a list of lists of unnamed values is sensible.

There's a bit of massaging that could be done around here, like OAuth2 could have a validate=... parameter that accepts a callback validate(str) -> T where T is the type of a validated token and is what ends up being returned. Maybe OAuth2 could then enforce the scopes itself.
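A sketch of what that validate=... parameter could look like (hypothetical names, not an actual Xpresso API): the callback turns the raw token string into a typed value, and the scheme could layer scope enforcement on top:

```python
from typing import Callable, Generic, Set, TypeVar

T = TypeVar("T")

class OAuth2Scheme(Generic[T]):
    def __init__(self, name: str, scopes: Set[str], validate: Callable[[str], T]) -> None:
        self.name = name
        self.scopes = scopes      # the scopes this scheme requires
        self.validate = validate  # e.g. decodes and verifies a JWT, returns T

    def __call__(self, token: str) -> T:
        # the scheme could also enforce self.scopes here, if the
        # validated value exposes the granted scopes somehow
        return self.validate(token)

# usage: the callback decides what a "validated token" is
def decode_token(token: str) -> dict:
    # stand-in for real JWT validation
    return {"user_id": int(token), "scopes": ["read"]}

oauth = OAuth2Scheme("oauth", scopes={"read"}, validate=decode_token)
```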

Or we could even just make an OAuth client that gets the JWKs from the .well-known URL. The pro of this is that it's not trivial to implement w.r.t. caching and such, and most providers either don't have clients or have terrible ones. But it would bring in a lot of dependencies and complexity.
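To illustrate why the caching part is non-trivial, here's a minimal TTL-cache sketch; `fetch` stands in for the HTTP GET against the provider's .well-known JWKS URL (all names hypothetical):

```python
import time
from typing import Any, Callable, Dict, Optional

class JWKSCache:
    def __init__(self, fetch: Callable[[], Dict[str, Any]], ttl_seconds: float = 300.0) -> None:
        self._fetch = fetch  # e.g. GET https://issuer/.well-known/jwks.json
        self._ttl = ttl_seconds
        self._cached: Optional[Dict[str, Any]] = None
        self._fetched_at = 0.0

    def get(self) -> Dict[str, Any]:
        # refresh only when the cache is empty or stale, so we don't
        # hit the provider on every request that carries a JWT
        now = time.monotonic()
        if self._cached is None or now - self._fetched_at > self._ttl:
            self._cached = self._fetch()
            self._fetched_at = now
        return self._cached
```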

adriangb commented 2 years ago

@yezz123 curious if you have any thoughts on this given your experience with the FastAPI auth ecosystem?

ksamuel commented 2 years ago

Is it possible to use classes to define the security parameters? If yes, a possible solution would be:

from typing import overload

class ApiKeySecurity(SecurityModel):
    key1: Annotated[str, Security(apikey1)]  
    key2: Annotated[str, Security(apikey2)]

class TokenSecurity(SecurityModel):
    token: Annotated[str, Security(user_read)]

@overload
async def get_user_id(security: ApiKeySecurity) -> int:
    validate_keys(security.key1, security.key2)
    return int(security.key1)

@overload
async def get_user_id(security: TokenSecurity) -> int:
    val_token = validate_token(security.token)
    user_read.validate_scopes(val_token["scopes"])
    return val_token["user_id"]

# Annotated[int, Depends(get_user_id)] => shouldn't Depends just return
# whatever get_user_id returns to save an annotation? Maybe I'm missing something
async def get_items(user: Annotated[int, Depends(get_user_id)]) -> list[str]:
    return ["sandwich"]

DualSecurity = ApiKeySecurity | TokenSecurity
app = App(
    routes=[
        Path(
            "/items/",
            get=Operation(get_items, security=DualSecurity)
        )
    ]
)

But given that this boilerplate is going to be repeated a lot, providing helpers would be nice:


from typing import overload

apikey1 = APIKey("apikey1")
apikey2 = APIKey("apikey2")
oauth_model = OAuth("oauth", scopes="rw")

class ApiKeySecurity(ApiKeySecurityModel):
    key1: apikey1.security
    key2: apikey2.security

class TokenSecurity(TokenSecurityModel):
    token: oauth_model.scopes.read.security

# ApiKeySecurity is validated beforehand for you, like forms or api params.
# You can override validation on ApiKeySecurity like on any pydantic model
# if you want to customize it.
# Do we need to provide custom response formats depending on the validation?
# If yes, an additional mechanism must be put in place, but I don't
# know xpresso enough
@overload
async def get_user_id(security: ApiKeySecurity) -> int:
    return int(security.key1)

# TokenSecurity is validated beforehand for you like forms or api params
@overload
async def get_user_id(security: TokenSecurity) -> int:
    return security.token["user_id"]

async def get_items(user: Annotated[int, Depends(get_user_id)]) -> list[str]:
    return ["sandwich"]

app = App(
    routes=[
        # Security has a default value
        # that automatically extracts all securities from deps
        # and expects at least one of them. One generic value can also
        # be passed to disable any security, e.g. for tests.
        Path( "/items/", get=Operation(get_items) )
    ]
)

adriangb commented 2 years ago

Is it possible to use classes to define the security parameters?

Interesting thought, I think using classes in a declarative manner looks a lot nicer than my boolean operator overloading proposal.

Given that we currently do something very similar for Forms, that looks nice.

Security has a default value that automatically extracts all securities from deps and expects at least one of them.

Yep, I think this is what we'll have to do: if you have a single security dependency, we'll use that directly. If you have multiple security dependencies (e.g. a dependency that requires a token that gets re-used in multiple operations, while your operations also require some other security), then you must pass in an explicit security model (a class like the ones you described) that is used to build the relationship between the security models and display it in OpenAPI.

This makes the simple case easy and the complex case possible. It's also unambiguous: we can simply raise an exception and refuse to start the application if the user uses multiple security models and does not explicitly give us their relationship.
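A sketch of that startup check (hypothetical names): given the security scheme names discovered in the dependency graph and the optional explicit Operation(security=...) argument, refuse to guess:

```python
from typing import Optional, Sequence

class AmbiguousSecurityError(Exception):
    pass

def resolve_security(
    scheme_names: Sequence[str],
    explicit: Optional[str] = None,
) -> Optional[str]:
    """Hypothetical startup-time check: never guess how multiple schemes combine."""
    if explicit is not None:
        return explicit
    if len(scheme_names) > 1:
        # we have all the info to build a helpful message with the user's class names
        raise AmbiguousSecurityError(
            f"Multiple security schemes found ({', '.join(scheme_names)}) but no "
            "explicit security model was given; pass Operation(security=...) to "
            "declare how they combine."
        )
    return scheme_names[0] if scheme_names else None
```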

@overload

Is there any specific reason you used overload in your examples? My understanding is that at runtime only the last definition of the function is preserved and all of the @overload information is lost, so as far as Xpresso is concerned it would look like this:

async def get_user_id(security: Union[ApiKeySecurity, TokenSecurity]) -> int:
    if isinstance(security, ApiKeySecurity):
        return int(security.key1)
    return security.token["user_id"]

async def get_items(user: Annotated[int, Depends(get_user_id)]) -> list[str]:
    return ["sandwich"]

Which I think makes using @overload a bit redundant, unless users want to do it just for their own documentation purposes.

I think that if I summarize your ideas, it would look something like this:

  1. If you pass in a single security model (even with multiple security models as its fields, or as a union of security models) we'll automatically document it for you in OpenAPI and do some basic validation (like checking that both API keys exist) without actually validating the API keys themselves.
  2. If you have multiple models that are not connected into one root model, you must create the root model and pass it into Operation(..., security=RootModel) so that we know how to build the OpenAPI.

A single token

from typing import Annotated

from xpresso.security import OAuth, Security, SecurityModel

oauth = OAuth("oauth2", scopes=["scope1", "scope2"])

class GetUserSecurity(SecurityModel):
    token: Annotated[str, oauth]

async def get_user(security: Security[GetUserSecurity]) -> None:
    ...

This would result in:

security:
  - oauth2: [scope1, scope2]

A token or an API key

from typing import Annotated

from xpresso.security import APIKey, OAuth, Security, SecurityModel

oauth = OAuth("oauth2", scopes=["scope1", "scope2"])
apikey = APIKey("apikey")   # the security scheme name is apikey

class GetUserTokenSecurity(SecurityModel):
    token: Annotated[str, oauth]

class GetUserApiKeySecurity(SecurityModel):
    api_key: Annotated[str, apikey]

class GetUserSecurity(SecurityModel):
    token: GetUserTokenSecurity | None
    api_key: GetUserApiKeySecurity | None

async def get_user(security: Security[GetUserSecurity]) -> None:
    ...

OpenAPI:

security:
  - oauth2: [scope1, scope2]
  - apikey: []

Two API keys OR a token

from typing import Annotated

from xpresso.security import APIKey, OAuth, Security, SecurityModel

oauth = OAuth("oauth2", scopes=["scope1", "scope2"])
apikey1 = APIKey("apikey1")
apikey2 = APIKey("apikey2")

class GetUserTokenSecurity(SecurityModel):
    token: Annotated[str, oauth]

class GetUserApiKeySecurity(SecurityModel):
    api_key_1: Annotated[str, apikey1]
    api_key_2: Annotated[str, apikey2]

class GetUserSecurity(SecurityModel):
    token: GetUserTokenSecurity | None
    api_key: GetUserApiKeySecurity | None

async def get_user(security: Security[GetUserSecurity]) -> None:
    ...

OpenAPI:

security:
  - oauth2: [scope1, scope2]
  - apikey1: []
    apikey2: []

Annotated[int, Depends(get_user_id)]) => shoudn't Depends just return whatever get_user_id returns to save an annotation ?

That would be nice! But Annotated/PEP 593 does not work like that. There is no way to declare "look for the first annotation, which is a callable returning T and make the type T".

In this specific scenario though, we could do something a tad bit wonky that should work. Instead of returning the token itself to the user as a string, we can return an instance of the class they declared as their security scheme, with the token as an attribute on that object. That object can also include the scopes they declared as class variables, so everything they would need to verify the JWT and compare its scopes to the required scopes is contained nicely in a single object.

Xpresso code
```python
# in xpresso/security.py or something
from typing import AbstractSet, ClassVar, Literal

from pydantic import BaseModel

from xpresso import FromHeader

class XpressoSecurityScheme:
    type: ClassVar[Literal["oauth2", "http"]]
    scheme_name: ClassVar[str]

class OAuth2Base(XpressoSecurityScheme):
    type: ClassVar[Literal["oauth2"]] = "oauth2"
    scopes: ClassVar[AbstractSet[str]]
    token: str

    def __init__(self, token: str) -> None:
        self.token = token

class OAuth2(OAuth2Base):
    @classmethod
    def extract(cls, authorization: FromHeader[str]) -> "OAuth2":
        assert authorization.startswith("Bearer ")
        return cls(token=authorization[len("Bearer "):])

class SecurityModel(BaseModel):
    def __init_subclass__(cls) -> None:
        super().__init_subclass__()
        # verify that all fields are either SecurityModel types
        # or XpressoSecurityScheme types
        ...
```
# see above
from xpresso.security import OAuth2, SecurityModel

class MyOAuth2Model(OAuth2):
    scheme_name = "oauth2"
    scopes = {"scope1", "scope2"}

class GetUserSecurity(SecurityModel):
    oauth2: MyOAuth2Model

async def get_user(security: Security[GetUserSecurity]) -> None:
    assert security.oauth2.token == "ey.."  # user logic to validate JWT

Then we'd get a pretty nice feature set:

  1. Xpresso would know that MyOAuth2Model is a security dependency because it inherits from XpressoSecurityScheme ✅
  2. The type system (mypy, Pylance) would know that security.oauth2.token is a string ✅
  3. This is composable as per above, so models can be combined ✅
  4. SecurityModel can use __init_subclass__ to enforce at import time that all of its fields' types inherit from XpressoSecurityScheme or SecurityModel ✅
  5. Users don't have to add an extra Annotated and duplicate typing information ✅
  6. Users could use Pydantic's validation to validate the JWT (as long as they don't need to do async stuff) ✅
  7. It is easy to write custom security dependencies that do validation for you (example below) ✅
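The import-time enforcement in point 4 could look roughly like this (a sketch, ignoring string annotations and Optional/union fields, which would need unwrapping):

```python
class XpressoSecurityScheme:
    pass

class SecurityModel:
    def __init_subclass__(cls) -> None:
        super().__init_subclass__()
        # fail at import time if a field's type is not itself a
        # SecurityModel or XpressoSecurityScheme subclass
        for field, annotation in cls.__dict__.get("__annotations__", {}).items():
            ok = isinstance(annotation, type) and issubclass(
                annotation, (SecurityModel, XpressoSecurityScheme)
            )
            if not ok:
                raise TypeError(
                    f"{cls.__name__}.{field} must be a SecurityModel or a "
                    f"security scheme, got {annotation!r}"
                )

class MyOAuth2Model(XpressoSecurityScheme):
    pass

class GetUserSecurity(SecurityModel):  # OK: the field's type is a scheme
    oauth2: MyOAuth2Model
```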

Combining models as above would work:

from xpresso.security import HeaderAPIKey, OAuth2, SecurityModel

class MyOAuth2Model(OAuth2):
    scheme_name = "oauth2"
    scopes = {"scope1", "scope2"}

class MyAPIKey(HeaderAPIKey):
    name = "X-Some-Header"  # if unset it gets inferred from the field name wherever it is used

class GetUserAPIKeySecurity(SecurityModel):
    apikey: MyAPIKey

class GetUserTokenSecurity(SecurityModel):
    token: MyOAuth2Model

class GetUserSecurity(SecurityModel):
    token: GetUserTokenSecurity | None
    api_key: GetUserAPIKeySecurity | None

async def get_user(security: Security[GetUserSecurity]) -> None:
    # assuming that `get_scopes_from_token` is some user function that knows how to validate the JWT
    assert security.token.scopes.issubset(get_scopes_from_token(security.token))

This is still kinda kludgy, maybe there's a better way.

If a 3rd party package wanted to provide an OAuth2 model that actually verifies the JWT for you:

3rd party extension code
```python
import json
from typing import ClassVar, Generic, Type, TypeVar

from pydantic import BaseModel

from xpresso import FromHeader

UserModelType = TypeVar("UserModelType", bound=BaseModel)

class EnforcedOAuth2Model(OAuth2Base, Generic[UserModelType]):
    user_model: ClassVar[Type[UserModelType]]  # type: ignore[misc]
    user: UserModelType

    def __init__(self, user: UserModelType, token: str) -> None:
        super().__init__(token)
        self.user = user

    @classmethod
    async def extract(cls, authorization: FromHeader[str]) -> "EnforcedOAuth2Model[UserModelType]":
        assert authorization.startswith("Bearer ")
        token = authorization[len("Bearer "):]
        payload = json.loads(token)  # or validate JWT, etc.
        scopes = set(payload["scopes"])
        assert scopes.issuperset(cls.scopes)
        # catch Pydantic ValidationError and raise an HTTPException
        return cls(user=cls.user_model(**payload), token=token)
```
class MyUserModel(BaseModel):
    name: str

class UserEnforcedOAuth2Model(EnforcedOAuth2Model[MyUserModel]):
    user_model = MyUserModel
    scopes = {"scope1", "scope2"}

@yezz123 I'm especially curious about this last part, would this provide the right extensibility points to create a 3rd party auth library that does the whole song and dance?

yezz123 commented 2 years ago

@yezz123 I'm especially curious about this last part, would this provide the right extensibility points to create a 3rd party auth library that does the whole song and dance?

Yes, the last part could just be a way to use OAuth2 and give developers the choice of building their own 3rd party integration directly, or building an authentication package on top of it. That's what I did with AuthX: I built my own functionality on top of what FastAPI and HTTPX provide, and created a customized serializer that AuthX users pass as a parameter to create their own 3rd party instance. I provide Google and Facebook out of the box, but the same mechanism could be used for other providers, e.g. GitHub:

from authx.services import SocialService
from authx.api import UsersRepo
from authx.core.jwt import JWTBackend

SocialService.setup(
    repo=UsersRepo,
    auth_backend=JWTBackend,
    base_url="http://localhost:8000",
    option="social",
)

You could check out https://github.com/frankie567/httpx-oauth, built on top of HTTPX and used with FastAPI by defining an instance and environment variables. You can also see how FastAPI's own implementation is built: https://github.com/tiangolo/fastapi/blob/master/fastapi/security/oauth2.py. That could help in building a native one, since OAuth2 is similar across libraries and frameworks: we instantiate the scopes and give developers the choice of building on top of them.

I used this before; it helped me understand how the flow of scopes and OAuth2 could be built on top of an architecture: florentcpt/poc_oauth2_native_app

adriangb commented 2 years ago

I managed to cook up a tracer bullet implementation in case any of you are interested in taking a look: https://github.com/adriangb/xpresso/pull/68/files#diff-098decea0c8483a402cf0f191aa5d61c04a692f9b844491876437870c0e8aff0

There are some rough edges to smooth out. One thing that is a shame is that you don't get any auto-complete for adding class vars like this (you do get type checking, insofar as you can't do something like override name: str with name = 123).

ksamuel commented 2 years ago

My understanding is that at runtime only the last definition of the function is preserved and all of the @overload information is lost,

Indeed, I hadn't thought about that. Mypy understands it because it parses the overloads, but at runtime we only get the residual metadata.

If you have multiple models that are not connected into one root model, you must create the root model and pass it into Operation(..., security=RootModel) so that we know how to build the OpenAPI.

Should it raise an exception if it detects multiple models without one root and no explicit security parameter? This would prevent the user from wrongly assuming security is applied, and if the error message is nicely written, it can provide the steps to fix the code.

Security[GetUserSecurity]

Is the Security wrapper needed? We don't need it for forms, after all. SecurityModel could provide the interface we need. Or if we want to decouple it, create a SecurityPolicy that most users would use instead of SecurityModel, which automatically validates and has the interface of both Model and Security?

This is still kinda kludgy, maybe there's a better way.

You are defining complex authentication logic, with 2 possible factors, in 10 lines of code. I think that's not bad. Especially since most users will not even see it, and probably just want to define a DI with get_user() and be done with it.

In fact, the framework should probably provide some default Security policies ready to be used for get_user.

For the User model, it's not as clear. Having a default user model linked to the db, with username and email, has been a problem for Django in the long run. Indeed, a user may not have a username or an email. A user could even not be human.

On the other hand, asking xpresso users to define a User model themselves every time is probably going to get annoying very fast. But maybe that's more the role of something built on top of the framework: something that provides common User models for the most common use cases. It's not an easy decision.

Also, to simplify custom auth, some simple generic templates to override would be nice. EnforcedOAuth2Model is probably a rare case where you want full control over attribute names. Most of the time, people with a custom auth scheme will be OK with the default choice.

# I haven't checked the edge cases, but since dataclasses do a lot, it might be possible to use
# class decorators to avoid the explicit Generic[UserModelType] inheritance, and
# to generate the user_model attribute and __init__ method
@security_class 
class EnforcedOAuth2Model(OAuth2Base):

    user: UserModelType

    @classmethod
    async def extract(cls, authorization: FromHeader[str]) -> "EnforcedOAuth2Model[UserModelType]":
        assert authorization.startswith("Bearer ")  # don't put this in the doc, or people will use it in prod, and it will break with -O
        token = authorization[len("Bearer "):]
        payload = json.loads(token)  # or validate JWT, etc.
        scopes = set(payload["scopes"])
        assert scopes.issuperset(cls.scopes)
        return cls(user=cls.user_model(**payload), token=token)  # catch Pydantic ValidationError and raise an HTTPException

token: GetUserTokenSecurity | None

Unrelated, but I love this syntax so much more than Optional[]. I hope Annotated[] will be written with a shortcut one day. & would be nice, it's close enough, but it's not exactly the same so I don't know...

adriangb commented 2 years ago

Should it raise an exception if it detects multiple models without one root and no explicit security parameter? This would prevent the user from wrongly assuming security is applied, and if the error message is nicely written, it can provide the steps to fix the code.

Yep exactly there will be only 2 cases supported:

  1. You provide a single SecurityScheme as a dependency
  2. You provide an explicit Operation(security=...) argument

So in the case you mention an exception would be raised and the app wouldn't start. I think the error message can be pretty straightforward, we have all of the info so we can build a nice template including some of the user's class names, etc.

Is the Security wrapper needed? We don't need it for forms, after all. SecurityModel could provide the interface we need.

So for forms we do need a marker (FromFormData or FromMultipart). If there is no explicit marker, di falls back to inspect.signature to figure out how to construct a class. This returns the parameters for __init__, but here we want the classes to be constructed using async def __call__() so that they can do stuff like make an HTTP request to get a JWK needed to validate a JWT. What the Security marker does is tell di to call SecurityScheme.__call__ in order to construct it. Here's a clearer example of what's going on:

import asyncio
from dataclasses import dataclass
from typing import Any

from di import Container, Dependant, AsyncExecutor

@dataclass
class SecurityScheme:
    foo: str = "foo"
    @classmethod
    async def __call__(cls) -> "SecurityScheme":
        return SecurityScheme("bar")

class Security(Dependant[Any]):
    def __init__(self, scheme: SecurityScheme) -> None:
        super().__init__(scheme.__call__)

async def main() -> None:
    container = Container(scopes=(None,))
    executor = AsyncExecutor()
    without_wrapper = container.solve(Dependant(SecurityScheme))
    with_wrapper = container.solve(Security(SecurityScheme()))
    async with container.enter_scope(None):
        res1 = await container.execute_async(without_wrapper, executor=executor)
        res2 = await container.execute_async(with_wrapper, executor=executor)
    assert res1.foo == "foo"  # di called __new__/__init__
    assert res2.foo == "bar"  # this is what we want

asyncio.run(main())

Maybe we could make a special case for a class implementing an @classmethod __call__ (this would have to go in di)? I could see that being useful beyond this single use case, so that users can asynchronously construct classes. If we did that, we could also move the Binder interfaces around a bit, for example by creating Form and MultiPartForm classes that would be used like:

class MyForm(Form):
    field: str
    other: int

async def endpoint(form: MyForm) -> None:
    ...

You are defining a complex authentication logic, with 2 possible factors, using 10 lines of code. I think it's not bad. Especially since most users will not even see it, and probably want to just define a DI with get_user() and be done with it.

👍 yeah I realize this is an advanced use case, I just want to make sure we don't throw a bunch of boilerplate at this just for the sake of a clean design / api

For the User model, it's not as clear. Having a default user model linked to the db, with username and email, has been a problem for Django in the long run. Indeed, a user may not have a username or an email. A user could even not be human.

On the other hand, asking xpresso users to define a User model themselves every time is probably going to get annoying very fast. But maybe that's more the role of something built on top of the framework: something that provides common User models for the most common use cases. It's not an easy decision.

I feel strongly that we should not have a default user model of any kind. User models are very dependent on the final implementer's systems and domain. I think this is best left for 3rd party packages.

That said, we should make sure that these 3rd party packages are possible and even easy to write 😃 . I wouldn't be opposed to having some toy implementation in our tests / docs, just like we have a half-baked implementation of using MsgPack.

it might be possible to use class decorators to avoid the explicit Generic[UserModelType] inheritance

Indeed, that's a good idea! Luckily with this design we can punt that into the future / 3rd party packages so it's not something we'd have to worry about in Xpresso (for now).

Unrelated, but I love this syntax so much more than Optional[]. I hope Annotated[] will be written with a shortcut one day. & would be nice, it's close enough, but it's not exactly the same so I don't know...

`foo: int & Depends(scope="app")` would be interesting indeed! Or maybe `/`? `foo: int / Depends(...) / Field(...)`

The one big issue with Annotated is that (1) it requires an import and (2) it's very verbose. This is actually called out in PEP 593; the reasoning was "let's put this out there, see how it's used, and we can always come back later to make it less verbose".

adriangb commented 2 years ago

Maybe we could make a special case for a class implementing an @classmethod __call__ (this would have to go in di)?

With a minor change in di (PR) I was able to get rid of a decent chunk of code and complexity here! Thanks for the pushback @ksamuel

ksamuel commented 2 years ago

Thanks for all the work you put in this.

adriangb commented 2 years ago

Ran into a couple of hiccups after trying to implement this. I'm going to think about it a bit, but I wanted to share here:

Lack of autocomplete

When you subclass a SecurityScheme and start typing, you don't know which parameters are required, which are optional, or what parameters are even available:

class MyAPIKey(APIKey):
    # how does a user know what they need to fill in here
    # without digging into the docs?
    ...

This may not be a big deal if there are only 1-2 parameters, but I remember getting pretty frustrated by this when using Pydantic's Config, which is similar:

class MyModel(BaseModel):
    class Config(BaseModel.Config):
        # is the parameter allow_extra or extra_allow?
        ...

What about just using instances?

The issue with instances is that they are not types, which would make SecurityModel very verbose. Contrast these two options:

from xpresso.security import SecurityModel, APIKeyHeader

# classes

class APIKey(APIKeyHeader):
    name = "key"

class UsersSecurity(SecurityModel):
    key: APIKey

async def endpoint(security: UsersSecurity) -> None:
    ...

# instances
from typing import Annotated
from xpresso import Security

apikey = APIKeyHeader(name="key")

class UsersSecurity(SecurityModel):
    key: Annotated[str, Security(apikey)]  # verbose

async def endpoint(security: UsersSecurity) -> None:
    ...

No way to pass in non-field options for Forms

This is tangential to this topic but somewhat relevant because I'd like for there to be a single logical way to do things across forms, security models, etc.

If we were to change forms to:

from xpresso.bodies import Form

class MyForm(Form):
    age: int
    name: str

async def endpoint(form: MyForm) -> None:
    ...

It is true that users save the boilerplate of a FromFormData marker, but what if they want to add a description to their request body? Now they'd have to do:

from xpresso.bodies import Form

class MyForm(Form):
    age: int
    name: str

async def endpoint(form: Annotated[MyForm, Form(description=...)]) -> None:
    ...

Or do we go the Pydantic route and add a Config class?

Another option would be using __init_subclass__:

from xpresso.bodies import Form

class MyForm(Form, description=...):
    age: int
    name: str

async def endpoint(form: MyForm) -> None:
    ...

ksamuel commented 2 years ago

It is true that users save the boilerplate of a FromFormData marker, but what if they want to add a description to their request body? Now they'd have to do:

I can see the value of having various degrees of granularity for passing config: one on the class, one on the endpoint signature, etc.

So yes, for putting things on the class, __init_subclass__ makes sense: it's better than a config class, which people always find weird, and less verbose than a decorator.
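A sketch of the __init_subclass__ route (hypothetical, not Xpresso's actual Form): keyword arguments on the class statement become class-level config:

```python
from typing import Any, Optional

class Form:
    description: Optional[str] = None

    def __init_subclass__(cls, *, description: Optional[str] = None, **kwargs: Any) -> None:
        super().__init_subclass__(**kwargs)
        # keywords passed in the class statement land here instead of
        # being treated as base classes or metaclass arguments
        if description is not None:
            cls.description = description

class MyForm(Form, description="User signup form"):
    age: int
    name: str
```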

But we still need to find a way to pass things at the signature level. Annotated[MyForm, Form(description=...)] is indeed very verbose, so let's explore alternatives.

One possible approach would be to provide the equivalent of Django CBVs' as_view(), with something like:

async def endpoint(form: MyForm.override(description=...)) -> None:
    ...

I think it's preferable to Annotated[MyForm, Form(description=...)] for several reasons:

adriangb commented 2 years ago

Unfortunately the param: Foo.bar(...) syntax is a non-starter: it's technically not a valid type annotation and MyPy/Pyright will yell at you. We could do Annotated[MyForm, MyForm.override(...)] but I fear that's not much better than the import.

ksamuel commented 2 years ago

Indeed, I should have tested it before writing such a comment.

I tried various versions of it, but the dynamism mypy allows is way more limited than what annotations accept.

I think the local override with Annotated[MyForm, Form(description=...)] will then be the best route, knowing that most people will use the class parameters in most cases, so it should be rare enough.

adriangb commented 2 years ago

How about:

from xpresso import Form

class MyForm(Form):
    __config__ = Form.Config(description=...)

Or something along those lines? __config__ (or whatever we call it) would be typed in the base class so that assigning something other than Form.Config would give an IDE error, and Form.Config would be typed as to accept the right parameters, so you'd get auto-complete and type checking.
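A rough sketch of what that typed config could look like (Form.Config, __config__, and the media_type field are hypothetical names chosen for illustration):

```python
from dataclasses import dataclass
from typing import ClassVar, Optional


class Form:
    # Assigning anything other than a Form.Config to __config__ would be
    # flagged by a type checker, and Form.Config's fields give auto-complete.
    @dataclass(frozen=True)
    class Config:
        description: Optional[str] = None
        media_type: str = "application/x-www-form-urlencoded"

    __config__: ClassVar["Form.Config"] = Config()


class MyForm(Form):
    __config__ = Form.Config(description="User signup form")
```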

ksamuel commented 2 years ago

I do prefer the option of doing class MyForm(Form, description=...) if you can pull it off with __init_subclass__ or a metaclass. If not, a dunder attribute is still a better choice than an inner class.

adriangb commented 2 years ago

It's doable to do it that way. The main issues I worry about are readability and autocomplete; it's not common to see this:

class MyForm(Form, description="Lorem ipsum dolor sit amet, consectetur adipiscing elit. Praesent facilisis vel sapien eget lacinia. Nulla malesuada rutrum sodales. Curabitur at lobortis lacus. Cras metus ipsum, laoreet ut maximus nec, euismod ullamcorper felis. Morbi erat tellus, scelerisque in accumsan in, rutrum vitae massa. Pellentesque in viverra ex. Cras consequat massa neque, et dignissim risus commodo in. Proin iaculis justo mollis feugiat varius."):
    pass

And I'm not sure what formatters would even do with this.

Of course you could do:

description = "Lorem ipsum dolor sit amet, consectetur adipiscing elit. Praesent facilisis vel sapien eget lacinia. Nulla malesuada rutrum sodales. Curabitur at lobortis lacus. Cras metus ipsum, laoreet ut maximus nec, euismod ullamcorper felis. Morbi erat tellus, scelerisque in accumsan in, rutrum vitae massa. Pellentesque in viverra ex. Cras consequat massa neque, et dignissim risus commodo in. Proin iaculis justo mollis feugiat varius."

class MyForm(Form, description=description):
    pass

But if you have a lot of parameters, I think it'd still either be a really long line or spill over the class bases into multiple lines, which would be awkward:

class MyForm(
    Form,
    description=...,
    ...,
    param100=...,
):
    pass

This is not realistic for Form at least (we only have 4 parameters or so), but I'm just trying to push the ideas to see where they break down.

As for autocomplete, an instance (assigned to a dunder attribute) gives you autocomplete so you don't have to look at the docs to remember the parameter names. At least on VSCode with Pylance you don't get autocomplete for __init_subclass__ parameters (you get errors, but not suggestions).

General note

I think with some of these things (another example being the instance vs. class question for defining security models) it may be worth supporting multiple options at first, just to get user feedback and explore edge cases (like the above). Then once the landscape is clearer, we can choose 1-2 winners and remove the other ways of doing things. I think as long as we're clear about what we're doing, that's okay for an early stage project like this. All of this holds as long as actually implementing multiple ways to do things is easyish (in this case I think it would be).

ksamuel commented 2 years ago

It's version 0.21.0, so for now there is no stability guarantee; that's the point.

adriangb commented 2 years ago

So I take it you like the plan of, in some cases, offering multiple ways to do something with the idea of keeping only some of them long term?

ksamuel commented 2 years ago

Depends on your definition of "like".

When I do this, I know I have to implement migration facilities later, which is more work: either some adapter, some scripts giving warnings when running the dev server, etc. It's one thing to break the API; it's another to leave your users high and dry.

So on my projects, I try to avoid it because I know it will add a maintenance burden to my shoulders. I'd rather postpone the feature, code a few things on my own with a private alpha offering both solutions, see which one I use the most myself, and settle on one.

But I think it's an acceptable solution to the problem, provided you know what it implies.

adriangb commented 2 years ago

Gotcha. I'll mull over it a bit. Thank you for all of the great input.