yakupbeyoglu / ForgeAPI

ForgeAPI is a powerful and automated C++ tool (compiler) designed to streamline the process of building RESTful APIs from structured JSON inputs.
MIT License

Lexical Analyzer & JSON Structure #2

Open yakupbeyoglu opened 6 hours ago

yakupbeyoglu commented 6 hours ago

Basically, we should define a JSON structure based on the database relations and authorization groups. You can find an example JSON below.

{
  "table": "posts",
  "fields": [
    { "name": "id", "type": "bigIncrements" },
    { "name": "title", "type": "string", "length": 255 },
    { "name": "content", "type": "text" },
    { "name": "user_id", "type": "unsignedBigInteger", "foreign": { "table": "users", "references": "id", "onDelete": "cascade" } }
  ],
  "relations": [
    { "type": "belongsTo", "model": "User", "foreignKey": "user_id" }
  ],
  "routes": [
    {
      "method": "GET",
      "uri": "/posts",
      "action": "index",
      "custom_sql": "SELECT * FROM posts WHERE user_id = :user_id",
      "authorization": "public"
    },
    {
      "method": "GET",
      "uri": "/posts/{id}",
      "action": "show",
      "authorization": "public"
    },
    {
      "method": "POST",
      "uri": "/posts",
      "action": "store",
      "authorization_groups": ["admin", "editor", "manager"]
    },
    {
      "method": "PUT",
      "uri": "/posts/{id}",
      "action": "update",
      "authorization_groups": ["admin", "editor", "manager"]
    },
    {
      "method": "PATCH",
      "uri": "/posts/{id}",
      "action": "updatePartial",
      "authorization_groups": ["admin", "editor", "manager"]
    },
    {
      "method": "DELETE",
      "uri": "/posts/{id}",
      "action": "destroy",
      "authorization_groups": ["admin"]
    },
    {
      "method": "DELETE",
      "uri": "/posts",
      "action": "destroyAll",
      "authorization_groups": ["admin"]
    }
  ],
  "middleware": [
    { "name": "auth", "routes": ["/posts", "/posts/{id}"] }
    // It can also be { "name": "auth", "routes": "/posts/*" }, meaning all routes need auth.
  ],
  "authorization_groups": [
    { "role": "admin", "description": "Administrator with full access" },
    { "role": "editor", "description": "User with editing privileges" },
    { "role": "manager", "description": "User with managerial access" },
    { "role": "client", "description": "Regular client user" }
  ]
}

Token Definitions:

Tasks

1. Read Input:

Read the input JSON string that you want to analyze.

2. Initialize Tokenizer:

Set up a tokenizer, with a state machine if necessary, to manage different parsing contexts (e.g., inside an object vs. inside an array).

Define Token Patterns:

Write regex patterns or other matching logic to recognize the different token types.

3. Scan Input:

Use the tokenizer to scan the input and generate tokens according to the defined patterns.

4. Handle Special Cases:

Handle cases such as nested objects/arrays and escape sequences in strings.

5. Output Tokens:

Store or output the tokens in a form that can be consumed by a parser or other compiler components. We need to store the tokens and output values in a graph.

Don't forget that the phase after lexical analysis is syntax analysis. For each analyzed object, we can also check the syntax as we go. In this case, we can speed up the process and reduce complexity by avoiding repeated iterations.

yakupbeyoglu commented 5 hours ago

The https://github.com/nlohmann/json library should be added as a git submodule in the third_party/ directory.
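Assuming the usual submodule workflow and a `third_party/json` path (the path is a guess from the comment), the setup might look like:

```shell
# Add nlohmann/json as a submodule and fetch its contents.
git submodule add https://github.com/nlohmann/json third_party/json
git submodule update --init --recursive
```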