etherlabsio / ai-engine

Core AI services and functions powering the ETHER Platform
MIT License

test #192

Closed vdpappu closed 4 years ago

vdpappu commented 4 years ago

A few clarifications:

Are we going to write to the S3 objects once per summary generation? Or would it be continuously written?

S3 artifacts (located in ././{context_id}/{mind_id}) are updated after every summary generation.

We'd have to create a new mind_id in artifacts/{env}/minds to avoid clashes.

@reaganrewop can you confirm this? Per our discussion, context_id/mind_id is the unique path, so we need not create new mind_ids unless we have a new domain.
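To make the path convention concrete, here's a minimal sketch of composing the unique artifact key from context_id and mind_id. Combining the artifacts/{env}/minds prefix with the {context_id}/{mind_id} path, and the trailing filename component, are assumptions for illustration, not confirmed layout:

```python
def mind_artifact_key(env: str, context_id: str, mind_id: str, filename: str) -> str:
    """Compose a unique S3 key for a mind's artifacts.

    Assumes the artifacts/{env}/minds prefix mentioned above is joined
    with the context_id/mind_id pair; the filename part is hypothetical.
    """
    return f"artifacts/{env}/minds/{context_id}/{mind_id}/{filename}"


# Example: mind_artifact_key("staging", "ctx-123", "mind-42", "embeddings.pkl")
# -> "artifacts/staging/minds/ctx-123/mind-42/embeddings.pkl"
```

Because context_id/mind_id is unique, two contexts sharing the same domain mind still write to distinct keys and cannot clash.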

If a new entity is created, should it be linked with the parent domain mind?

New entities are specific to the context and shouldn't be propagated back to the parent domain. For example, a new entity DeLorean from the Ether group shouldn't be propagated back to the Software Engineering Mind.

I think this information might be needed to make useful connections in the graph in the future?

Regarding new entities, nothing changes in the current Dgraph population approach. @shashankpr can confirm.

What is the artifact_updater exactly doing here?

We could name it mind_updater to be more specific. This service updates artifacts in the {context_id}/{mind_id} path for continuous learning after every Ether Call.
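The continuous-learning step above could look like the following sketch: merge the latest call's summary into the stored mind artifact, then write the result back to the {context_id}/{mind_id} path. The artifact schema (a "summaries" list plus a version counter) is entirely hypothetical:

```python
def update_mind_artifact(artifact: dict, summary: dict) -> dict:
    """Fold one call's summary into a mind artifact (hypothetical schema).

    Returns a new dict rather than mutating the input, so a failed S3
    write can't leave a half-updated artifact in memory.
    """
    updated = dict(artifact)
    updated["summaries"] = list(artifact.get("summaries", [])) + [summary]
    updated["version"] = artifact.get("version", 0) + 1
    return updated
```

In the real service the returned dict would be serialized and written back to S3 under {context_id}/{mind_id}, so the artifact reflects every Ether Call to date.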

We'll be dynamically creating lambda configurations with the new mind_id as the identifier and passing that mind_id along. Is that acceptable?

Is this about the feature extractor lambda? If so: per the current approach, we don't update the models, i.e. the model associated with each context is just a copy of the domain model and will not be updated. We are considering model updates for later stages. If we want to keep that option ready, we can have one lambda configuration per mind_id created.
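A one-configuration-per-mind_id setup could be sketched as a small builder like the one below. The function name pattern and environment variable names are assumptions for illustration; the thread doesn't specify them:

```python
def feature_extractor_config(mind_id: str, env: str = "staging") -> dict:
    """Build a per-mind lambda configuration (hypothetical naming scheme).

    Keying the function name and environment on mind_id keeps the door
    open for per-context model updates later, even though today each
    context's model is just a copy of the domain model.
    """
    return {
        "FunctionName": f"feature-extractor-{mind_id}",
        "Environment": {
            "Variables": {"MIND_ID": mind_id, "ENV": env},
        },
    }
```

The resulting dict matches the shape boto3's Lambda client expects for function creation/configuration calls, so it could be passed along as keyword arguments when the new mind_id is registered.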

@reaganrewop @shashankpr please add if I have missed any.