singnet / das-poc

Distributed Atomspace.

Define a couple of possible architectures to provide DAS services #194

Open andre-senna opened 1 year ago

levisingularity commented 1 year ago

Aug 9, 2023: Yesterday, Terraform code was implemented for provisioning resources on Vultr, just for testing since I still don't have the access credentials. Throughout the day I followed the provisioning that is temporarily being done manually, wrote a Python script to connect to the Redis and MongoDB databases for connection tests and data capture, designed some adaptations to the application architecture, and estimated AWS costs with serverless databases for Redis and MongoDB.
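
A minimal sketch of that kind of connection test, assuming hypothetical hosts and credentials (the real endpoints are not the ones shown here):

import redis
from pymongo import MongoClient

# Hypothetical endpoints/credentials, only to illustrate the connection test.
REDIS_HOST = "redis.example.internal"
MONGO_URI = "mongodb://user:pass@mongo.example.internal:27017"

r = redis.Redis(host=REDIS_HOST, port=6379, password="redis-pass", decode_responses=True)
print("Redis ping:", r.ping())

client = MongoClient(MONGO_URI)
print("MongoDB ping:", client.admin.command("ping"))
print("Databases:", client.list_database_names())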

levisingularity commented 1 year ago

Image
mongodb - AWS Pricing Calculator.pdf
My Estimate - AWS Pricing Calculator.pdf
Image

levisingularity commented 1 year ago

Image

Image

levisingularity commented 1 year ago

Aug 9, 2023: Vultr provider documentation for Terraform: https://registry.terraform.io/providers/vultr/vultr/latest/docs

levisingularity commented 1 year ago

Aug 9, 2023: MongoDB and Redis response times for a query over 100k records:

Image
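
A rough sketch of how such a timing comparison can be taken with Python's perf_counter, assuming the same hypothetical connections as above (the measured numbers themselves are in the image):

import time
import redis
from pymongo import MongoClient

r = redis.Redis(host="redis.example.internal", port=6379, decode_responses=True)
db = MongoClient("mongodb://user:pass@mongo.example.internal:27017")["das"]

start = time.perf_counter()
keys = r.keys("node:*")  # illustrative key pattern, not the real layout
redis_elapsed = time.perf_counter() - start

start = time.perf_counter()
docs = list(db["nodes"].find({}).limit(100_000))  # illustrative collection name
mongo_elapsed = time.perf_counter() - start

print(f"Redis: {len(keys)} keys in {redis_elapsed:.3f}s")
print(f"MongoDB: {len(docs)} docs in {mongo_elapsed:.3f}s")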

levisingularity commented 1 year ago

Aug 9, 2023: Repositories created for testing:
https://github.com/levisingularity/DAS-infra-stack-servless
https://github.com/levisingularity/DAS-function
https://github.com/levisingularity/DAS-infra-stack-aws

levisingularity commented 1 year ago

Aug 10, 2023: Between yesterday and today I worked on the Vultr servers, creating them dynamically with Terraform (see project https://github.com/levisingularity/DAS-infra-stack-vultr), and I created a structure to also create the Lambda function resources in AWS (see project: https://github.com/levisingularity/DAS-infra-stack-aws).

levisingularity commented 1 year ago

Aug 10, 2023: I was able to create a Lambda function, use S3 to store the source code zip (https://s3.console.aws.amazon.com/s3/buckets/das.singularitynet.io?region=us-east-1&prefix=production/&showversions=false), and run the test code in AWS Lambda:

Image
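
For reference, a minimal sketch of the kind of test handler that can be zipped, stored in S3, and wired to the Lambda function (the payload and message are illustrative, not the actual DAS code):

import json

def lambda_handler(event, context):
    # Minimal test handler: echo the incoming event so the invocation
    # can be verified from the Lambda console.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "hello from DAS test", "event": event}),
    }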

levisingularity commented 1 year ago

Aug 10, 2023 Terraform code running on my machine:

Image

levisingularity commented 1 year ago

Aug 10, 2023: Test script response time on Vultr's bare metal server:

Image

levisingularity commented 1 year ago

Aug 11, 2023: Today I managed to set up a server with OpenFaaS simulating a Python function running serverless; we can already create as many functions as we want and invoke them:

Image

Image
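
For context, the OpenFaaS python3 template expects a handle(req) entry point in handler.py; the smallest possible test function just echoes its input:

# handler.py -- minimal echo function for the OpenFaaS python3 template
def handle(req):
    # req is the raw request body; the return value becomes the HTTP response.
    return "Received: " + str(req)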

levisingularity commented 1 year ago

Aug 12, 2023:

Detailed architecture drawing:

Image

levisingularity commented 1 year ago

Aug 15, 2023: Yesterday I was able to integrate OpenFaaS with AWS ECR for the Docker image registry; now we can build and push the function images in OpenFaaS using the AWS ECR registry.

Image

levisingularity commented 1 year ago

Aug 16, 2023: Here is the latest evolution of the architecture, made yesterday:

Image

levisingularity commented 1 year ago

https://github.com/singnet/das-infra-stack-vultr/pull/1
https://github.com/singnet/das-pre-infra-aws/pull/1
https://github.com/singnet/das-pre-infra-vultr/pull/1
https://github.com/singnet/das-infra-stack-aws/pull/1

levisingularity commented 1 year ago

Aug 16, 2023: Yesterday the Terraform script for API Gateway was created, and today I configured it and opened the PRs to the new repositories.

levisingularity commented 1 year ago

Aug 17, 2023: Today I managed to make available in AWS an instance with OpenFaaS, already with the test functions deployed. If you want to test it, follow the link and access it: http://3.92.235.170:8080/ui/ user: admin pass: 66U6dPbkQgBtDTIfIhZSK03Ox9BVQLV6BOlj5sSCdCAaMXTBN0aMk5IXlnaikfq
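
Besides the UI, a deployed function can be invoked directly through the gateway's /function/<name> route; a minimal sketch with Python's requests (the function name here is hypothetical):

import requests

GATEWAY = "http://3.92.235.170:8080"
FUNCTION = "hello-test"  # hypothetical name, replace with one of the deployed functions

resp = requests.post(f"{GATEWAY}/function/{FUNCTION}", json={"ping": "pong"})
print(resp.status_code, resp.text)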

levisingularity commented 1 year ago

Image

levisingularity commented 1 year ago

Built the pipeline script to validate the Terraform code on GitHub Actions:

Image https://github.com/singnet/das-infra-stack-aws/actions/runs/5897267840

Today I will apply this pipeline to all DAS infra projects and add a step to create the release and tag following semantic versioning.

levisingularity commented 1 year ago

https://github.com/singnet/das-scripts-pipeline/pull/1

levisingularity commented 1 year ago

I managed to make the Lambda connect with Redis and MongoDB and insert the data into both, but when I make the call to Redis it gives an error related to the slot. I tried to use rediscluster but it does not work; I will now try redis-py-cluster to see if that works.
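
The slot error usually means the Redis endpoint is running in cluster mode, so a cluster-aware client that follows the slot/MOVED redirections is needed; a minimal sketch with redis-py-cluster (host and password are hypothetical):

from rediscluster import RedisCluster

# Hypothetical cluster endpoint; the client discovers the other nodes and
# routes each key to the correct hash slot, avoiding the slot errors.
startup_nodes = [{"host": "redis-cluster.example.internal", "port": "6379"}]

rc = RedisCluster(
    startup_nodes=startup_nodes,
    password="redis-pass",
    decode_responses=True,
    skip_full_coverage_check=True,
)

rc.set("das:test-key", "hello")
print(rc.get("das:test-key"))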

levisingularity commented 1 year ago

Image

levisingularity commented 1 year ago

I managed to complete the integration; I just couldn't find the reason why Redis was taking longer than MongoDB to respond... I'm looking at how to port the semantic versioning pipeline code from GitLab to GitHub.

levisingularity commented 1 year ago

List of natively supported technologies for creating functions in OpenFaaS: https://github.com/openfaas/templates

For AWS Lambda functions: https://docs.aws.amazon.com/lambda/latest/dg/lambda-runtimes.html?icmpid=docs_lambda_help

levisingularity commented 1 year ago

Created Docker images on the Docker Hub registry:

Image

levisingularity commented 1 year ago

Created documentation of DAS system infrastructure architecture: https://docs.google.com/document/d/1kQhM62T3TIb3ECoqBxmqPulMciJq7Vev3-f_Gnhes_s/edit

levisingularity commented 1 year ago

Created a repository on GitHub for pipeline scripts and created a pipeline to generate the release tag for all projects: https://github.com/singnet/das-scripts-pipeline

levisingularity commented 1 year ago

Configuring OpenFaaS to deploy the functions and test the connection with Redis and MongoDB:

levisingularity commented 1 year ago

I made 3 functions in Vultr's OpenFaaS, connected to the Redis and MongoDB databases:

You can access and see the functions within OpenFaaS by accessing the link below: http://149.28.222.79:8080/ui/ user: admin (ask for the password)

To see the code for each one, just look at the repos:

You can see that the adaptations I made to DAS were few. Basically I built a file to deploy the function in OpenFaaS (das-function.yml), changed the name of the main function from main to handler (service/server.py → handler.py), changed the reference to this main file in service/client.py, and removed the gRPC call to use the OpenFaaS HTTP standard (provisionally, because it was simpler to keep HTTP).
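
In rough outline, the renamed entry point looks something like the sketch below (the dispatch is illustrative, not the actual DAS code):

# handler.py (formerly the gRPC entry point in service/server.py)
import json

def handle(req):
    # OpenFaaS passes the raw HTTP body; parse it and read the "action" field.
    payload = json.loads(req)
    action = payload.get("action")
    # The real function dispatches into the existing DAS service code from here;
    # this sketch only echoes the request back so it stays self-contained.
    return json.dumps({"action": action, "handled": True})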

levisingularity commented 1 year ago

Working Redis and MongoDB listing scripts on the Vultr infra:

Image

Image

levisingularity commented 1 year ago

Instructions for testing DAS already working in OpenFaaS:
1- http://149.28.222.79:8080/ui/
2- Log in with your credentials: user: admin password: ask for the password
3- Use the DAS function (image attached)

body example:

{ "action":"get_node", "database_name": "das", "node_type": "node-example", "node_name": "node-name" }

Image
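
The same body can be sent without the UI by posting to the gateway's function route with Python's requests (the function name is assumed to be das-function, matching das-function.yml):

import requests

body = {
    "action": "get_node",
    "database_name": "das",
    "node_type": "node-example",
    "node_name": "node-name",
}

# Function name assumed from das-function.yml; adjust if it differs.
resp = requests.post("http://149.28.222.79:8080/function/das-function", json=body)
print(resp.status_code, resp.text)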

levisingularity commented 1 year ago

The query endpoint is already completed, but I'm still trying to resolve the timeout problem.

The following json is an example of a query:

{
  "action": "query",
  "database_name": "das",
  "query": {
    "And": [
      {
        "Link": {
          "link_type": "Evaluation",
          "ordered": true,
          "targets": [
            {
              "Variable": {
                "variable_name": "000000214999369af91fb563b4e0eadb"
              }
            },
            {
              "Variable": {
                "variable_name": "1a0738cb1a6b6b8ce7bae84c4296c0ce"
              }
            }
          ]
        }
      }
    ]
  }
}
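
Since the timeout is the open issue, when invoking this query over HTTP it helps to set an explicit client-side timeout so slow responses fail fast instead of hanging (same assumed function name as above):

import json
import requests

with open("query.json") as f:  # the query body shown above, saved to a file
    query_body = json.load(f)

try:
    resp = requests.post(
        "http://149.28.222.79:8080/function/das-function",
        json=query_body,
        timeout=60,  # fail fast instead of waiting indefinitely
    )
    print(resp.status_code, resp.text)
except requests.exceptions.Timeout:
    print("Query timed out after 60s")
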
levisingularity commented 1 year ago

I've resolved the timeout problem, and now the query function is working properly alongside the other ones (image attached).

levisingularity commented 1 year ago

I'm creating the pipelines to deploy the functions to AWS

levisingularity commented 12 months ago

I've configured the DAS client to call AWS functions, and the database is still being populated on AWS. Here's the log as of now:

2023-10-17 11:34:14 INFO Parsed 134205645/141269154 (95%)
2023-10-17 11:35:36 INFO Parsed 135618336/141269154 (96%)
2023-10-17 11:37:03 INFO Parsed 137031027/141269154 (97%)
2023-10-17 11:38:24 INFO Parsed 138443718/141269154 (98%)
2023-10-17 11:40:01 INFO Parsed 139856409/141269154 (99%)
2023-10-17 11:42:13 INFO Parsed 141269100/141269154 (100%)
2023-10-17 11:42:13 INFO Finished parsing file.
2023-10-17 11:42:13 INFO Populating MongoDB link tables
2023-10-17 13:46:59 INFO Populating Redis
2023-10-17 13:46:59 INFO Building key-value file

I'm setting up the versioning pipeline in DAS.

DasClient being called from the terminal

https://github.com/singnet/das/assets/141682181/71ae1375-60b0-4c3a-a69c-9200a4c951c6

levisingularity commented 12 months ago

I fixed the function call problem in AWS and am configuring a new machine to run the script that populates the database.

levisingularity commented 12 months ago

About my update, I'm working on:

Tool for performance tests: https://github.com/wg/wrk
Documentation in progress: https://docs.google.com/document/d/1njmP_oXw_0FLwoXY5ttGBMFGV2n60-ugAltWIuoQO10/edit?usp=sharing