decodingml / llm-twin-course

🤖 Learn for free how to build an end-to-end production-ready LLM & RAG system using LLMOps best practices: source code + 12 hands-on lessons

Running Individual Components #36

Open RishiRavula opened 2 months ago

RishiRavula commented 2 months ago

Since we are employing a microservice architecture, is it possible to run each Docker container fully self-contained in its own ecosystem, so we can see it function on its own without depending on the other components?

(i.e., I just want to run the data/web crawler by itself, without anything else, so I can analyze how it works.)

This would be great as we progress through the Medium articles!

iusztinpaul commented 1 month ago

Hello @RishiRavula ,

Yes, you can do that.
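
For example, here is a minimal sketch of how a single service could be started in isolation with Docker Compose, assuming the components are declared as services in the repo's docker-compose.yml (the service name `data-crawler` below is illustrative; check the compose file for the actual name):

```bash
# Build and start only the crawler service, skipping any services it
# declares under depends_on, so it runs without the other components.
docker compose up --build --no-deps data-crawler

# Alternatively, run it as a one-off container and remove it when it exits.
docker compose run --rm --no-deps data-crawler
```

Note that if the component genuinely needs another service at runtime (e.g., a database it writes to), you will have to either start that dependency as well or point the component at an external instance via its environment variables.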

Let me know if you have any other questions.