Closed: idhamhalim closed this issue 6 years ago
When you run, for example,
flow --imgdir sampleimg/ --model cfg/tiny-yolo.cfg --load bin/tiny-yolo.weights --json
it returns JSON files. However, if you instead run
flow --imgdir sampleimg/ --model cfg/tiny-yolo.cfg --load bin/tiny-yolo.weights
it outputs images with bounding boxes drawn on them.
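For anyone who wants to consume the `--json` output afterwards: darkflow writes one JSON file per image into the output directory, each containing a list of prediction objects (the same shape as the dicts shown later in this thread). A minimal sketch of reading one back, assuming a path like `sampleimg/out/sample.json`:

```python
import json

def load_predictions(json_path):
    """Read a darkflow --json output file into a list of prediction dicts.

    Each entry looks like:
    {"label": "bottle", "confidence": 0.35,
     "topleft": {"x": 570, "y": 439}, "bottomright": {"x": 705, "y": 719}}
    """
    with open(json_path) as f:
        return json.load(f)
```

The helper name and path are illustrative, not part of darkflow itself.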
@idhamhalim Hello friends, have you solved this problem?
It's already solved.
@idhamhalim So can you tell me how you did it? I can only get the JSON output, but I want to get the predicted image with bounding boxes by running the py file directly (that is, the same effect as running from the cmd). Your help is very important to me, thank you!
@lijiaze2018 check @ssusie's comment.
@idhamhalim were you able to do what @ssusie said from inside a py script or did you want it just from the terminal?
Hi guys,
You can try looking at my repo and see the predict-idham.py script. That script is configured to save the frames of a video in which objects labelled "car" are detected. You can change the label to another value, but you do need to check which labels are available.
Repo link below. https://github.com/idhamhalim/image-recognition-detection/blob/master/predict-idham.py
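The label filtering that script does can be sketched in a few lines. This is a simplified stand-in, not the actual code from predict-idham.py; the function name and the `min_conf` threshold are illustrative:

```python
def filter_by_label(predictions, wanted="car", min_conf=0.3):
    """Keep only darkflow predictions whose label matches `wanted`
    and whose confidence clears the threshold."""
    return [
        p for p in predictions
        if p["label"] == wanted and p["confidence"] >= min_conf
    ]
```

You would feed this the list returned by `tfnet.return_predict(frame)` for each video frame and save the frame whenever the filtered list is non-empty.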
So I tried the code for using darkflow from another Python application and modified it, because I wanted it to return multiple images with bounding boxes. However, the output only contains the predictions, returned in a JSON-like format. How do I get images with the bounding boxes drawn on them?
Some of the results:
[{'topleft': {'x': 469, 'y': 379}, 'label': 'bottle', 'confidence': 0.18183012, 'bottomright': {'x': 615, 'y': 687}}, {'topleft': {'x': 570, 'y': 439}, 'label': 'bottle', 'confidence': 0.35135984, 'bottomright': {'x': 705, 'y': 719}}, {'topleft': {'x': 664, 'y': 458}, 'label': 'bottle', 'confidence': 0.22756532, 'bottomright': {'x': 825, 'y': 715}}, {'topleft': {'x': 503, 'y': 568}, 'label': 'bottle', 'confidence': 0.18111153, 'bottomright': {'x': 581, 'y': 711}}, {'topleft': {'x': 588, 'y': 559}, 'label': 'bottle', 'confidence': 0.14755477, 'bottomright': {'x': 674, 'y': 719}}, {'topleft': {'x': 689, 'y': 563}, 'label': 'bottle', 'confidence': 0.13069168, 'bottomright': {'x': 788, 'y': 717}}, {'topleft': {'x': 213, 'y': 563}, 'label': 'diningtable', 'confidence': 0.13041878, 'bottomright': {'x': 467, 'y': 704}}]
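`return_predict()` gives you the dicts above rather than an annotated image, so you have to draw the boxes yourself. A minimal sketch, assuming OpenCV is available for the drawing step (the helper name and confidence threshold are my own choices):

```python
def to_boxes(predictions, min_conf=0.2):
    """Convert darkflow return_predict() dicts into
    (label, (x1, y1), (x2, y2)) tuples, dropping low-confidence hits."""
    boxes = []
    for p in predictions:
        if p["confidence"] < min_conf:
            continue
        tl = (p["topleft"]["x"], p["topleft"]["y"])
        br = (p["bottomright"]["x"], p["bottomright"]["y"])
        boxes.append((p["label"], tl, br))
    return boxes

# With OpenCV installed, each box can then be drawn onto the original image:
#
#   for label, tl, br in to_boxes(result):
#       cv2.rectangle(img, tl, br, (0, 255, 0), 2)
#       cv2.putText(img, label, tl, cv2.FONT_HERSHEY_SIMPLEX,
#                   0.8, (0, 255, 0), 2)
#   cv2.imwrite("annotated.jpg", img)
```

This mirrors what darkflow's own `--imgdir` mode does when `--json` is omitted, just done by hand on the `return_predict` output.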