Open akshaydhame2001 opened 7 months ago
Running on a Jetson versus an Intel CPU should not affect the functionality or behaviour of the code. Here are a few things to check:
Velocity Function: Ensure you're using the correct frame for your velocity commands. You might want to switch from MAV_FRAME_BODY_OFFSET_NED to MAV_FRAME_LOCAL_NED to see if that improves the response (a sketch of that change follows this list).
Condition for Movement: The transitions between moving forward and backward may be too abrupt. If the distance fluctuates around the thresholds (100-120 cm), the drone could be rapidly switching between velocities. Adding a deadband around these checks could help stabilize the movement (see the loop skeleton after this list).
Yaw Control: If you're adjusting yaw at the same time as changing X velocity, it might lead to circular motion. Make sure yaw adjustments don’t conflict with velocity commands.
Error Calculation: Review how you're calculating errorx and errory. Large, unadjusted errors could result in significant changes in velocity commands.
Loop Consistency: Consider introducing a small delay (time.sleep()) in your loop to control command frequency, which might help stabilize behavior.
Tracking Initialization: Ensure that the tracker is initialized correctly. An unstable bounding box can lead to erratic movements.
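For the frame suggestion, here is a minimal sketch of what the swap could look like. It reuses the same set_position_target_local_ned_encode call and type mask as your send_local_ned_velocity; the function name is just illustrative, not from your code.

```python
# Sketch only: same message as send_local_ned_velocity(), but with the velocity
# expressed in the earth-fixed local NED frame instead of the body-offset frame.
def send_global_ned_velocity(vx, vy, vz):
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0, 0, 0,                               # time_boot_ms, target system, target component
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,   # earth-fixed NED frame (north, east, down)
        0b0000111111000111,                    # type_mask: only velocities enabled
        0, 0, 0,                               # x, y, z positions (not used)
        vx, vy, vz,                            # velocities in m/s
        0, 0, 0,                               # accelerations (not supported)
        0, 0)                                  # yaw, yaw_rate (not used)
    vehicle.send_mavlink(msg)
    vehicle.flush()
```

Note that in MAV_FRAME_LOCAL_NED the velocities are relative to the EKF origin's north/east/down axes rather than the vehicle's current heading, so you would need to account for heading yourself if you keep yawing toward the target.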
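And for the deadband and loop-delay points, a rough skeleton of how the forward/backward decision could be restructured. `estimate_distance()` is a stand-in for your existing bounding-box distance estimate, and the threshold and speed constants are placeholders to tune, not recommended values.

```python
import time

TARGET_DIST = 110      # desired following distance in cm (midpoint of the 100-120 band)
DEADBAND = 15          # tolerance in cm around the target before commanding any motion
FOLLOW_SPEED = 0.5     # forward/backward speed in m/s

while True:
    distance = estimate_distance()     # placeholder for your tracker-based distance estimate

    error = distance - TARGET_DIST
    if abs(error) <= DEADBAND:
        vx = 0                         # inside the deadband: hold, no rapid switching
    elif error > 0:
        vx = FOLLOW_SPEED              # target too far: move forward
    else:
        vx = -FOLLOW_SPEED             # target too close: move backward

    send_local_ned_velocity(vx, 0, 0)
    time.sleep(0.1)                    # ~10 Hz command rate keeps the stream steady
```

Inside the deadband the X velocity stays at zero, so small fluctuations in the distance estimate no longer flip the command sign on every frame.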
I'm implementing a follow-me drone. condition_yaw works fine, but a negative X velocity in the NED frame is getting yaw added to it, which produces circular movement in the Gazebo simulation on a Jetson Nano. On an Intel CPU it works fine. I checked that the dronekit and pymavlink library versions are the same on both.
from __future__ import with_statement
from __future__ import division
from __future__ import absolute_import
from os import sys, path
from io import open
sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))
-- Dependencies for video processing
import time
import math
import argparse
import cv2
import numpy as np
from imutils.video import FPS
-- Dependencies for commanding the drone
from dronekit import connect, VehicleMode
from pymavlink import mavutil
-- Function to control velocity of the drone
def send_local_ned_velocity(x, y, z):
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0, 0, 0,  # time_boot_ms, target_system, target_component
        mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,  # frame relative to current heading
        0b0000111111000111,  # type_mask: only velocities enabled
        0, 0, 0,  # x, y, z positions (not used)
        x, y, z,  # x, y, z velocities in m/s
        0, 0, 0,  # accelerations (not supported)
        0, 0)     # yaw, yaw_rate (not used)
    vehicle.send_mavlink(msg)
    vehicle.flush()
def condition_yaw(heading):
    msg = vehicle.message_factory.command_long_encode(
        0, 0,     # target_system, target_component
        mavutil.mavlink.MAV_CMD_CONDITION_YAW,  # command
        0,        # confirmation
        1,        # param 1, yaw in degrees
        0,        # param 2, yaw speed deg/s
        heading,  # param 3, direction -1 ccw, 1 cw
        1,        # param 4, relative offset 1, absolute angle 0
        0, 0, 0)  # param 5 ~ 7 not used
    # send command to vehicle
    vehicle.send_mavlink(msg)
-- Connect to the drone
vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)
cap = cv2.VideoCapture(0)
width = cap.get(cv2.CAP_PROP_FRAME_WIDTH)
height = cap.get(cv2.CAP_PROP_FRAME_HEIGHT)
tracker = cv2.legacy.TrackerMOSSE_create()
def drawBox(img, box, distance):
    x, y, w, h = int(box[0]), int(box[1]), int(box[2]), int(box[3])
    cv2.rectangle(img, (x, y), (x + w, y + h), (255, 0, 255), 2, cv2.LINE_AA)
    cv2.putText(img, "Estimated distance: {}cm".format(int(distance)), (75, 75),
                cv2.FONT_HERSHEY_SIMPLEX, 0.7, (255, 0, 255), 2)
    cv2.line(img, (int(width / 2), int(height / 2)), (int(x + w / 2), int(y + h / 2)),
             (255, 255, 255), 1)
while True:
    _, img = cap.read()
    # Convert BGR to HSV
    # ... (rest of the tracking and velocity-control loop was cut off in the paste)
cap.release()
cv2.destroyAllWindows()