One way might be using autopilot; docs for the DLC transform are here.
that would look something like:
from autopilot import transform as t
from autopilot.hardware.gpio import Digital_Out

# LED wired to GPIO pin 7 on the Pi
led = Digital_Out(pin=7)

# build the transform chain: DLC inference -> slice out the tracked points you care about -> boolean comparison
dlc = t.image.DLC(model="my_model")
dlc += t.selection.DLCSlice(
    select=['tracked_point_1', 'tracked_point_2'],
    min_probability=0.5
)
dlc += t.logical.Compare(
    lambda x: ...  # some comparison you want to make with the tracked points, returning True/False
)

result = False
while True:
    # get some frame from some camera, eg with an autopilot camera one might do
    timestamp, frame = cam.q.get()
    _result = dlc.process(frame)
    # only toggle the LED when the comparison changes state
    if _result != result:
        led.turn(_result)
        result = _result
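in that sketch, cam is assumed to be a camera object that puts (timestamp, frame) tuples on a queue, like the line above suggests. if you're grabbing frames some other way, any frame source works; as an assumption (OpenCV isn't part of autopilot), a minimal stand-in could be:

import cv2

# hypothetical stand-in for the camera queue: grab frames from a webcam with OpenCV
cap = cv2.VideoCapture(0)   # device index 0 is just an example
ok, frame = cap.read()      # use this frame wherever cam.q.get() appears above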
also see: https://github.com/wehr-lab/autopilot/blob/main/examples/transforms/example_transformation_dlc.ipynb
happy to help out if you run into trouble, feel free to open an issue there or in the discussion board :)
haha oh hi yes you have been in the discussion board, sorry if you were looking for something different. i have been trying to get some dissertation work done and haven't been able to keep up with everything
closing this for now, feel free to reopen if this wasn't resolved
Hi there,
In the example of turning on an LED upon detecting a dog's rearing movement, you are using processor objects that communicate with Teensy microcontrollers. Can they be replaced with a Raspberry Pi 4? If yes, could I have some details about this please?
Thanks.
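A minimal sketch of the LED side running directly off a Raspberry Pi 4's GPIO header, without a Teensy, using the standard RPi.GPIO library (the BCM pin number and delay are just example values, not anything from the thread):

import time
import RPi.GPIO as GPIO

# example only: BCM pin 17, adjust to wherever the LED is wired
GPIO.setmode(GPIO.BCM)
GPIO.setup(17, GPIO.OUT)

GPIO.output(17, GPIO.HIGH)   # LED on
time.sleep(0.5)
GPIO.output(17, GPIO.LOW)    # LED off
GPIO.cleanup()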