Closed: bruffridge closed this issue 3 years ago.
Once the new model tarball is in S3 (https://github.com/nasa-petal/PeTaL-labeller/issues/11), the CloudFormation template needs the following filled in to deploy the model behind an inference endpoint:

- Specify the container that will run inference against the model, along with the model tarball's name and its S3 location. https://github.com/nasa-petal/PeTaL-db/blob/main/dynamodb-cf-template.yaml#L352
- How much compute power does the deployed endpoint need to generate inferences? https://github.com/nasa-petal/PeTaL-db/blob/main/dynamodb-cf-template.yaml#L375
- What input format does the model expect? https://github.com/nasa-petal/PeTaL-db/blob/main/dynamodb-cf-template.yaml#L485

Rough sketches of how these pieces could fit together follow the list.
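The first two questions (container image plus tarball location, and instance sizing) map onto a small set of fields on the deployment side. Below is a minimal boto3 sketch, assuming the endpoint is a SageMaker endpoint, which the container/tarball/instance parameters in the template suggest; the role ARN, image URI, bucket/key, names, and instance type are all placeholders, not values from the actual template.

```python
import boto3

sm = boto3.client("sagemaker")

# Placeholder values: the real image URI, tarball path, execution role, and
# instance type would come from the CloudFormation template and issue #11.
ROLE_ARN = "arn:aws:iam::123456789012:role/petal-sagemaker-execution"  # hypothetical
IMAGE_URI = "763104351884.dkr.ecr.us-east-1.amazonaws.com/pytorch-inference:1.8.1-cpu-py3"  # assumed serving container
MODEL_DATA = "s3://petal-labeller-models/model.tar.gz"  # hypothetical bucket/key

# Container to run inference, plus the model tarball name and S3 location
# (the question at template line 352).
sm.create_model(
    ModelName="petal-labeller",
    ExecutionRoleArn=ROLE_ARN,
    PrimaryContainer={"Image": IMAGE_URI, "ModelDataUrl": MODEL_DATA},
)

# Compute power for the deployed endpoint (the question at template line 375);
# ml.m5.large with a single instance is only a starting guess.
sm.create_endpoint_config(
    EndpointConfigName="petal-labeller-config",
    ProductionVariants=[{
        "VariantName": "AllTraffic",
        "ModelName": "petal-labeller",
        "InitialInstanceCount": 1,
        "InstanceType": "ml.m5.large",
    }],
)

sm.create_endpoint(
    EndpointName="petal-labeller-endpoint",
    EndpointConfigName="petal-labeller-config",
)
```

The same three values (image, model data URL, instance type) are what a CloudFormation `AWS::SageMaker::Model` / `AWS::SageMaker::EndpointConfig` pair would declare, so answering the questions above is enough to fill in the template.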
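For the input-format question (template line 485), nothing in this issue pins down the request schema, so the sketch below only shows how a caller would invoke the endpoint; the JSON field names (`title`, `abstract`) and the endpoint name are assumptions, not the model's documented contract.

```python
import json

import boto3

runtime = boto3.client("sagemaker-runtime")

# Hypothetical payload: the real field names depend on what the labeller's
# inference container actually expects.
payload = {
    "title": "Example paper title",
    "abstract": "Example abstract text to be labelled.",
}

response = runtime.invoke_endpoint(
    EndpointName="petal-labeller-endpoint",  # matches the deployment sketch above
    ContentType="application/json",
    Body=json.dumps(payload),
)

# The response body is a stream; decode it to see the predicted labels.
print(response["Body"].read().decode("utf-8"))
```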