Open ForceConstant opened 4 months ago
Is there a way to get the output image out of Obico that shows the squares for the issue spots?
Can you explain how the API to the ML service works? I see that you send the image URL via `/img` — what data comes back, and in what format? @nberktumer
I have disabled pause for now, because I get many warning notifications depending on what the model looks like. For example, see the screenshot below. The print looks fine, but the level has crossed into the warning zone five or six times. This is a stock P1S.
The threshold values are hardcoded into the aggregation functions in the automation. It is possible to create inputs for the thresholds by rewriting the aggregation functions: https://github.com/nberktumer/ha-bambu-lab-p1-spaghetti-detection/issues/2#issuecomment-1937766963
> Is there a way to get the output image out of Obico that shows the squares for the issue spots?

The API returns the squares, but they are not shown in HA. However, it is possible to create a new image sensor from the detected failures.
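For anyone who wants to try that: a minimal sketch (not part of the integration; the function name and the 0.2 cutoff are my own choices) that converts the API's `[x, y, width, height]` boxes into corner rectangles you could draw onto the camera snapshot, e.g. with Pillow's `ImageDraw.rectangle`. It assumes `x, y` is the box's top-left corner; if the model actually reports box centers, shift by half the width/height first.

```python
def boxes_to_rectangles(detections, min_p=0.2):
    """Convert Obico-style detections to (x1, y1, x2, y2) rectangles.

    `detections` is the list from the ML API response:
    [["failure", probability, [x, y, width, height]], ...]
    Only detections at or above `min_p` (an illustrative cutoff) are kept.
    """
    rects = []
    for _label, p, (x, y, w, h) in detections:
        if p >= min_p:
            rects.append((x, y, x + w, y + h))
    return rects


# Example: two detections, only the confident one is kept
sample = [
    ["failure", 0.85, [10, 20, 30, 40]],
    ["failure", 0.05, [50, 60, 5, 5]],
]
print(boxes_to_rectangles(sample))  # [(10, 20, 40, 60)]
```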
> Can you explain how the API to the ML service works? I see that you send the image URL via `/img` — what data comes back? @nberktumer

The response is in the following format:

```json
{
  "detections": [
    ["failure", <probability1>, [box1.x, box1.y, box1.width, box1.height]],
    ["failure", <probability2>, [box2.x, box2.y, box2.width, box2.height]],
    ...
  ]
}
```
The box values are the detected failure regions, shown as squares in the original Obico service.
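As a concrete illustration (the payload below is made up; only its shape follows the format above), here is how a client could parse the response and pull out the strongest detection:

```python
import json

# Hypothetical response matching the documented shape
raw = """
{
  "detections": [
    ["failure", 0.91, [120, 80, 40, 35]],
    ["failure", 0.33, [300, 210, 25, 20]]
  ]
}
"""

data = json.loads(raw)
detections = data["detections"]

# Highest single-frame failure probability across all boxes
top_p = max(p for _label, p, _box in detections)
print(len(detections), top_p)  # 2 0.91
```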
@nberktumer how about the values like `ewm`, etc.? How do they get populated?
The HA automation calculates them: https://github.com/nberktumer/ha-bambu-lab-p1-spaghetti-detection/blob/2709ff644e9802b115699f09cbf68742485b1009/blueprints/spaghetti_detection.yaml#L268
These math formulas in the automation are taken from: https://github.com/TheSpaghettiDetective/obico-server/blob/496605a62fcb790097c510c151994e9b2bf020c1/backend/lib/prediction.py
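For intuition, the core of that smoothing is an exponentially weighted mean over the per-frame failure probabilities. The sketch below uses a made-up `alpha` of 0.5 just to show the shape of the update; the real constants and the additional rolling statistics live in `prediction.py` and the blueprint linked above.

```python
def ewm_update(prev, p, alpha=0.5):
    """One step of an exponentially weighted mean.

    `prev` is the running value, `p` the latest per-frame failure
    probability. alpha=0.5 is illustrative, not Obico's real constant.
    """
    return alpha * p + (1 - alpha) * prev


ewm = 0.0
for p in [1.0, 1.0, 1.0]:  # three consecutive "certain failure" frames
    ewm = ewm_update(ewm, p)
print(ewm)  # 0.875
```

The effect is that a single noisy frame barely moves the value, while sustained high probabilities push it toward the warning threshold.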
Ok, I understand how things work a lot better now, and I think making a card that shows the boxes wouldn't be too hard. My only request: in the next drop of this code, could you save the actual JSON result into an entity, so we can use it for the card?