IdentityModel / IdentityModel.AspNetCore.OAuth2Introspection

ASP.NET Core authentication handler for OAuth 2.0 token introspection
Apache License 2.0

Invalidate cache after grant revoked #45

Closed: cleftheris closed this issue 5 years ago

cleftheris commented 5 years ago

We are using reference tokens. Is there a way to keep caching the results of the introspection endpoint (i.e., not disable the cache) and at the same time have an incoming callback fire immediately when the revocation endpoint is hit?

leastprivilege commented 5 years ago

Please elaborate.

cleftheris commented 5 years ago

A scenario: an API resource is protected with reference tokens issued by IdentityServer4. The API uses the AccessTokenValidation middleware, which under the hood uses this library and is configured to leverage the distributed cache feature (see the sketch after the list below).

  1. A grant is given to a client by IdSrv after a user consents.
  2. The user goes to the IdSrv Grants page and revokes the access.
  3. The client continues to access the API resource on behalf of the user until the cache entry for the reference token is invalidated.
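A minimal sketch of the API-side setup described above, written against this library's introspection handler directly (the AccessTokenValidation middleware wires up the same handler under the hood). The authority, client id, secret, and cache duration are placeholder values:

```csharp
using System;
using IdentityModel.AspNetCore.OAuth2Introspection;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Introspection results are cached via IDistributedCache;
        // swap the in-memory cache for Redis/SQL Server etc. in production.
        services.AddDistributedMemoryCache();

        services.AddAuthentication(OAuth2IntrospectionDefaults.AuthenticationScheme)
            .AddOAuth2Introspection(options =>
            {
                options.Authority = "https://idsrv.example.com"; // placeholder
                options.ClientId = "api1";                       // API resource name
                options.ClientSecret = "secret";

                options.EnableCaching = true;                    // requires an IDistributedCache
                options.CacheDuration = TimeSpan.FromMinutes(5);
                // While an entry is cached, a token revoked at IdentityServer
                // still passes validation here until the entry expires (step 3 above).
            });
    }
}
```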

What I am suggesting is to expose a webhook endpoint inside the AccessTokenValidation middleware that IdentityServer can call over the back channel, so the API can purge the cache entry immediately (see the sketch at the end of this comment).

Is the above far-fetched? Right now I have dropped the cache duration to one minute, which is quite OK, and I can always disable the cache entirely, but I wanted to share my thoughts.
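To make the suggestion concrete, here is a hypothetical back-channel endpoint hosted by the API; it is not part of this library, and the cache-key convention shown is an assumption that would have to match however the introspection handler keys its entries. IdentityServer would POST the revoked reference token here so the cached introspection result is purged before CacheDuration elapses:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Distributed;

// Hypothetical back-channel revocation callback (NOT part of this library).
[ApiController]
[Route("backchannel/token-revoked")]
public class RevocationCallbackController : ControllerBase
{
    private readonly IDistributedCache _cache;

    public RevocationCallbackController(IDistributedCache cache) => _cache = cache;

    [HttpPost]
    public async Task<IActionResult> Post([FromForm] string token)
    {
        // ASSUMPTION: the key must mirror whatever key the introspection handler
        // uses for its cache entries; adjust the prefix to your setup. A real
        // endpoint would also need to authenticate the calling token service.
        await _cache.RemoveAsync("introspection:" + token);
        return NoContent();
    }
}
```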

leastprivilege commented 5 years ago

I see.

This would be useful, but it is out of scope for this library. The cache is meant to be really short-lived, somewhere between 5 seconds and 1 minute.
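For context, in the configuration sketched earlier that recommendation would look roughly like this (the exact value is a trade-off between introspection traffic and revocation lag, and is only illustrative):

```csharp
// Inside the AddOAuth2Introspection options lambda shown above:
options.EnableCaching = true;
options.CacheDuration = TimeSpan.FromSeconds(30); // somewhere between 5 s and 1 min
```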

I heard there might be an upcoming spec that details a revocation push mechanism from the token service to the API. Once that is available, we can revisit.

cleftheris commented 5 years ago

Thanks @leastprivilege

github-actions[bot] commented 3 years ago

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue.