Closed audrey-roe closed 9 months ago
NB: if you need prompt help, I have taken the liberty of engineering some prompts that you can reference; just look for the preprompt folder in the repo.
All this should be done in TypeScript or Python.
TypeScript, please.
Oops, I have never used TypeScript before.
- [x] Read the documentation. Build 3 functions:
- [x] i) A function that takes an Express route file, sends it to the assistant or agent, and gets back a list of controllers (view functions). It should be in list format, like so: [loginController, signUp, dashboardInfo, ...]. The format is important because the function it will be handed off to, which finds the controller modules in the codebase, expects it in that format.
- [ ] ii) A function that takes the Django view and cleans it up. The view may be incomplete, it may be the entire file where the code resides, or it may be a complete view function without any issues. Write logic that anticipates all 3 scenarios and handles them accordingly.
- [ ] iii) Lastly, a function that can take either the Django view or the Express controllers and generate documentation.
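For task (i), whichever model ends up being used, the assistant's reply still has to be coerced into the exact [loginController, signUp, ...] shape the downstream module-finder expects. A minimal sketch of that post-processing step (the helper name and the fallback rules are my own, not from the repo):

```typescript
// Hypothetical helper: normalize an assistant reply into the
// [loginController, signUp, ...] list format described in task (i).
// The model may answer with strict JSON, a bare bracketed list, or prose,
// so try the strict formats first and fall back to scraping identifiers.
function parseControllerList(reply: string): string[] {
  // 1) Strict JSON, e.g. ["loginController", "signUp"]
  try {
    const parsed = JSON.parse(reply);
    if (Array.isArray(parsed) && parsed.every((x) => typeof x === "string")) {
      return parsed;
    }
  } catch {
    // not JSON, fall through
  }
  // 2) A bracketed, comma-separated list, e.g. [loginController, signUp]
  const match = reply.match(/\[([^\]]*)\]/);
  const body = match ? match[1] : reply;
  return body
    .split(",")
    .map((s) => s.trim().replace(/^['"]|['"]$/g, ""))
    .filter((s) => /^[A-Za-z_$][\w$]*$/.test(s));
}
```

The identifier filter at the end also drops any stray prose the model wraps around the list.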
I think we should use gpt-3.5-turbo instead of Anthropic, since Anthropic has a wait time before access can be granted, and with good prompt engineering we can get a result good enough to deliver an MVP on time.
The documentation is in Python, but I'm already working on a TypeScript version. I can't do the LangChain thing; currently working on using GPT-3.5 to run the stuff. I'll give feedback as soon as I have something. PS: I also don't know TypeScript, but I'll figure it out.
I made a pull request some hours ago solving this issue the best way I know, using prompt engineering and GPT-3.5. I am familiar with LangChain for prompt chaining but not yet proficient at using it. We can adopt it later when we have more time on our hands, because it is a useful framework to use alongside your LLM models.
I've been able to extract routes and functions from any JS file and return them in a list. @audrey-roe, could you give a sample file I could use to further test my TypeScript function? I was also able to do this without using AI: I analysed the AST of the file and extracted the functions and routes automatically.
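For anyone following along, here is a much-simplified stand-in for this idea. The comment above describes walking the file's AST; this sketch only approximates that with a regex pass over app.<method>(path, ...handlers) registrations, and all names in it are illustrative, not from the actual PR.

```typescript
// Simplified, regex-based stand-in for AST-driven route extraction.
// It only recognizes app.<method>(path, handler, ...) style registrations;
// inline arrow-function handlers are deliberately filtered out.
interface ExtractedRoute {
  method: string;
  path: string;
  handlers: string[];
}

function extractRoutes(source: string): ExtractedRoute[] {
  const routes: ExtractedRoute[] = [];
  // Matches e.g. app.post('/api/login', createOrUpdateSession, loginUserHandler)
  const pattern =
    /app\.(get|post|put|delete|patch)\(\s*['"]([^'"]+)['"]\s*,([^;]*?)\)/g;
  for (const m of source.matchAll(pattern)) {
    const [, method, path, rest] = m;
    // Keep only bare identifiers (named middleware/controllers).
    const handlers = rest
      .split(",")
      .map((s) => s.trim())
      .filter((s) => /^[A-Za-z_$][\w$.]*$/.test(s));
    routes.push({ method, path, handlers });
  }
  return routes;
}
```

A real AST walk (e.g. with the TypeScript compiler API) would be more robust against inline handlers and multi-line calls, which is presumably why the AST route was taken.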
Hi! @Gigatronhertz here is a sample:
route.ts example 1

import { Express, Request, Response } from "express";
import { loginUserHandler, createUserHandler, deleteUserHandler, revokeSession } from './controller/user.controller';
import { getFileHandler, streamFileController, uploadFileHandler, handleCreateFolder, markAndDeleteUnsafeFileController, getFileHistoryController, reviewFile } from "./controller/files.controller";
import { verifyAccessToken, createOrUpdateSession } from "./middleware/requrieUser";
import isAdmin from "./middleware/isAdmin";
import multer from 'multer';

const storage = multer.memoryStorage();
const upload = multer({ storage });

function routes(app: Express){
    app.get("/", (req: Request, res: Response) => res.status(200).send('Welcome To Cloudguardian! Check out the Postman collection for the available endpoints'));
    app.get("/healthcheck", (req: Request, res: Response) => res.sendStatus(200));
    app.post('/api/login', createOrUpdateSession, loginUserHandler);
    app.post('/api/user', createOrUpdateSession, createUserHandler);
    app.delete('/api/user', verifyAccessToken, deleteUserHandler);
    app.post('/api/revokeSession', verifyAccessToken, revokeSession);
    app.put('/api/file/upload', verifyAccessToken, upload.single('file'), uploadFileHandler);
    app.get('/api/file/download/:fileId', verifyAccessToken, getFileHandler); //done
    app.get('/api/file/stream', verifyAccessToken, streamFileController);
    app.post('/api/create-folder', verifyAccessToken, handleCreateFolder);
    app.post('/api/file/mark-unsafe', verifyAccessToken, isAdmin, markAndDeleteUnsafeFileController);
    app.get('/api/file-history/:fileId', verifyAccessToken, getFileHistoryController);
    app.put('/api/file/review/:fileId', verifyAccessToken, isAdmin, reviewFile);
}

export default routes;
routes.ts example 2

import { FileController } from "./controller/file.controller"
import { UserController, AdminController } from "./controller/user.controller"
import requireAdmin from "./middleware/requireAdmin"
import requireUser from "./middleware/requireUser"
export const Routes = [
//User routes
{
method: "post",
route: "/api/users",
middleware: [],
controller: UserController,
action: "create"
}, {
method: "post",
route: "/api/login",
middleware: [],
controller: UserController,
action: "login"
},
//Admin Routes
{
method: "put",
route: "/admin/mark-image-unsafe/:fileId",
middleware: [requireUser, requireAdmin],
controller: AdminController,
action: "markImageUnsafe"
}, {
method: "put",
route: "/admin/mark-video-unsafe/:fileId",
middleware: [requireUser, requireAdmin],
controller: AdminController,
action: "markVideoUnsafe"
},
//Files and folder Routes
{
method: "post",
route: "/file/upload",
middleware: [requireUser],
controller: FileController,
action: "uploadFile"
}, {
method: "post",
route: "/file/download",
middleware: [requireUser],
controller: FileController,
action: "downloadFile"
}, {
method: "post",
route: "/file/create-folder",
middleware: [requireUser],
controller: FileController,
action: "createFolder"
}
]
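Worth noting: for this second style, no model call is needed at all, since the file already exports a structured array, so the controller/action pairs can be collected with a plain map. A sketch of that (the stand-in types below are mine, not the project's real ones):

```typescript
// Minimal stand-in types for the Routes-array style above.
type Handler = (req: unknown, res: unknown) => void;

interface RouteEntry {
  method: string;
  route: string;
  middleware: Handler[];
  controller: { name: string }; // a class reference works: classes expose .name
  action: string;
}

// Collect "Controller.action" pairs directly from the exported array,
// e.g. "UserController.create", "FileController.uploadFile".
function listControllerActions(routes: RouteEntry[]): string[] {
  return routes.map((r) => `${r.controller.name}.${r.action}`);
}
```

The resulting strings can then be fed to the same codebase-search step that task (i) describes for the app.get/app.post style.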
@Chinemelu4 Hi! We are using Anthropic wrapped with LangChain. That is why I shared the documentation in the first comment.
@Chinemelu4 the second documentation link is for JS, and you can use it for TS.
Ok. So, I'm guessing you have access to an Anthropic model API for testing?
This comment was from @Gigatronhertz
LangChain... we are using LangChain, please, @Chinemelu4. LangChain offers an experimental wrapper around Anthropic that gives it the same API as OpenAI Functions.
I don't know if you got to this point in the documentation: https://js.langchain.com/docs/integrations/chat/anthropic_functions
@Joggyjagz7 any clarification you can provide at this point is very appreciated
I've been having trouble pulling off the TypeScript implementation for the Anthropic AI.
@Gigatronhertz don't worry, I'll handle it.
OpenAI's ChatGPT is not available to us at the moment, so we have to find other ways of completing this project. So, in substitution for GPT-4's Assistant, we have to use LangChain.
Please check out the documentation below and see how you can get that to work:
https://api.python.langchain.com/en/latest/llms/langchain.llms.anthropic.Anthropic.html and https://js.langchain.com/docs/integrations/platforms/anthropic