Closed xcollantes closed 1 year ago
Use https://next-auth.js.org for basic functionality; NextAuth.js is not maintained by NextJS/Vercel but is mentioned in their official docs. NextAuth.js organizes and implements OAuth 2.0 for NextJS.
To protect my privacy and reduce my digital footprint, I need to limit open access to my personal portfolio, which could otherwise be used for OSINT against my personal information.
This raises two questions:

1. How do we prove users are who they say they are (authentication)? Usage is easy since the dependency relies on middleware used in the API directory of the NextJS app.
2. How do we let only specific users in (authorization)? Since NextAuth.js handles authentication, a simple solution will suffice for authorization: restrict access to a list of allowed people.
There is code in the NextAuth options which checks a Google Sheet containing a single column of emails, which serves as the allow list.
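A minimal sketch of how such a check could hook into the NextAuth `signIn` callback. The `fetchAllowlist` helper, the placeholder email, and the callback shape shown here are illustrative assumptions, not the actual implementation; in practice `fetchAllowlist` would call the Google Sheets API.

```typescript
// Hypothetical allowlist check wired into NextAuth's signIn callback.
type SignInParams = { user: { email?: string | null } }

// Pure helper: case-insensitive membership test against the allowlist.
export function isAllowed(
  email: string | null | undefined,
  allowlist: string[]
): boolean {
  if (!email) return false
  const normalized = email.trim().toLowerCase()
  return allowlist.some((e) => e.trim().toLowerCase() === normalized)
}

// Sketch of the NextAuth options callbacks (shape only; next-auth itself
// is not imported here so the decision logic stays self-contained).
export const callbacks = {
  async signIn({ user }: SignInParams): Promise<boolean> {
    const allowlist = await fetchAllowlist() // assumed helper reading the Sheet
    return isAllowed(user.email, allowlist)
  },
}

// Assumed helper; a real version would read the single-column Google Sheet.
async function fetchAllowlist(): Promise<string[]> {
  return ["friend@example.com"] // placeholder data
}
```

Returning `false` from `signIn` rejects the login, so users not on the sheet never get a session.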
The blog section containing the work experience and articles is not protected by the privacy scheme described here. For example: https://xaviercollantes.dev/blogs/**.
Issue: There is no NextAuth session check when the blogs render, since the blogs are rendered at build time, not at run time when the user requests the page.
Content can be secured behind a "login wall" using the middleware.js file, which can restrict access. Another option is to use getServerSideProps to build each page on user request. This will slow down page loads for the blogs, but the faster static generation will not secure the pages.
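The getServerSideProps option could look roughly like the sketch below. The `gateBlogRequest` helper is an assumption introduced for illustration; it isolates the redirect decision so it can be shown without importing next-auth.

```typescript
// Sketch: per-request gating for a blog page. A real page would obtain the
// session via next-auth (commented at the bottom); here the session is a stub
// type so the gating logic stands alone.
type Session = { user?: { email?: string } } | null

// Pure decision: redirect anonymous visitors to the NextAuth sign-in page,
// otherwise let the page render.
export function gateBlogRequest(
  session: Session
): { redirect: { destination: string; permanent: false } } | { props: {} } {
  if (!session) {
    return { redirect: { destination: "/api/auth/signin", permanent: false } }
  }
  return { props: {} }
}

// In the actual page this would be wired up as (sketch, not verified):
// export async function getServerSideProps(ctx) {
//   const session = await getServerSession(ctx.req, ctx.res, authOptions)
//   return gateBlogRequest(session)
// }
```

Because getServerSideProps runs on every request, this trades away static generation speed for the session check that build-time rendering cannot do.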
Pitfall: using a regex pattern for the path in the matcher will work:
```js
// middleware.js
export { default } from "next-auth/middleware"

export const config = { matcher: ["/blogs/(.*)"] }
```
Add some filtering of users to minimize open-source intelligence (OSINT) scraping.