import numpy as np
import pyproj
from scipy.spatial.transform import Rotation as R

def calculate_real_world_position(drone_lat, drone_lon, drone_alt,
                                  drone_yaw, drone_pitch, drone_roll,
                                  object_x, object_y, object_size):
    # Project the drone's geodetic position into a planar UTM frame.
    # NOTE: the UTM zone is hardcoded to 33 here; it must match the
    # drone's actual location (the sample coordinates below fall in zone 18).
    utm = "+proj=utm +zone=33 +ellps=WGS84"
    to_utm = pyproj.Transformer.from_crs("EPSG:4326", utm, always_xy=True)
    x, y = to_utm.transform(drone_lon, drone_lat)
    drone_pos = np.array([x, y, drone_alt])

    # Rotate the object's camera-frame offset into the world frame
    r = R.from_euler('zyx', [drone_yaw, drone_pitch, drone_roll], degrees=True)
    rotation_matrix = r.as_matrix()
    object_offset = rotation_matrix @ np.array([object_x * object_size,
                                                object_y * object_size,
                                                object_size])
    object_pos = drone_pos + object_offset

    # Convert the result back to geodetic coordinates
    to_wgs84 = pyproj.Transformer.from_crs(utm, "EPSG:4326", always_xy=True)
    lon, lat = to_wgs84.transform(object_pos[0], object_pos[1])
    return lat, lon, object_pos[2]

drone_lat = 38.1472
drone_lon = -76.4268
drone_alt = 10
drone_yaw = 0
drone_pitch = 0
drone_roll = 0
object_x = 0
object_y = 0
object_size = 1

object_pos = calculate_real_world_position(drone_lat, drone_lon, drone_alt,
                                           drone_yaw, drone_pitch, drone_roll,
                                           object_x, object_y, object_size)
print("Object position in Lat, Long, Alt:", object_pos)
The above code can be incorporated into autonomous missions. I will be creating a fork of this for autonomous mission planning across any number of drones.
Maybe you also need this: https://github.com/The1only/rosettadrone/issues/132 ?
I think I would be interested in implementing similar functionality.
How could I easily obtain data with required telemetry (location, gimbal angles etc.) in real time?
See https://github.com/kripper/mavlink-camera-simulator/blob/main/sim.py
msg = master.recv_match(type='GLOBAL_POSITION_INT', blocking=True)
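For example, a minimal pymavlink receive loop along those lines (the connection string and port are assumptions; depending on the firmware, gimbal attitude may arrive as MOUNT_ORIENTATION or GIMBAL_DEVICE_ATTITUDE_STATUS instead):

from pymavlink import mavutil

# Listen on the UDP port the drone (e.g. RosettaDrone) streams MAVLink to
master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
master.wait_heartbeat()

while True:
    msg = master.recv_match(
        type=['GLOBAL_POSITION_INT', 'ATTITUDE', 'MOUNT_ORIENTATION'],
        blocking=True)
    if msg.get_type() == 'GLOBAL_POSITION_INT':
        # lat/lon are degrees * 1e7; altitudes are millimeters
        print(msg.lat / 1e7, msg.lon / 1e7, msg.relative_alt / 1000.0)
    elif msg.get_type() == 'MOUNT_ORIENTATION':
        print('gimbal:', msg.pitch, msg.roll, msg.yaw)  # degrees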
You could start implementing the gimbal movement so the drone points the gimbal to the POI defined in a mission using QGC (search the code for lookAt).
Also search the issues for "gimbal" since there is a lot of useful information.
You could also look at this fork where this gimbal movement was implemented: https://github.com/m4xw/rosettadrone_mini2/
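As a rough sketch of the lookAt math (not the actual implementation from that fork: look_at_poi is a hypothetical helper, the geometry is a flat-earth approximation, and it assumes the gimbal honors MAV_CMD_DO_MOUNT_CONTROL):

import math
from pymavlink import mavutil

def look_at_poi(master, drone_lat, drone_lon, drone_alt,
                poi_lat, poi_lon, poi_alt=0.0):
    # Equirectangular approximation of the horizontal offset in meters
    d_north = (poi_lat - drone_lat) * 111320.0
    d_east = (poi_lon - drone_lon) * 111320.0 * math.cos(math.radians(drone_lat))
    dist = math.hypot(d_north, d_east)

    yaw = math.degrees(math.atan2(d_east, d_north))               # bearing from north
    pitch = -math.degrees(math.atan2(drone_alt - poi_alt, dist))  # negative = down

    master.mav.command_long_send(
        master.target_system, master.target_component,
        mavutil.mavlink.MAV_CMD_DO_MOUNT_CONTROL, 0,
        pitch, 0.0, yaw, 0.0, 0.0, 0.0,
        mavutil.mavlink.MAV_MOUNT_MODE_MAVLINK_TARGETING)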
Any progress? Closing for now.
Any progress?
No. I had to switch my priorities. My company decided to not invest time into this app.
We just went with DJI Cloud API for now.
I have had to cease development all summer due to my work and school schedule. I will have from September 1st to January to work on the project 4 hours a day, MWF.
Although my attempt at implementing this was crappy, I did create a .NET Blazor webpage to control multiple simulated and physical drones. It connects to a cell phone hosting a Unity mobile app, which hosts a voice bot and mapping software that allows for augmented-reality waypoints and mission assignments, both vocally and manually. Autonomous agent-to-agent mission planning was tested in a Unity simulated environment. Once I graduate, I imagine I will have more time to devote to the project. Till then, I am stuck working two jobs and taking a full course load. Among other limitations to my testing, I only own a DJI Mavic Air, and it's the older model.
Augmented Reality waypoints
How do they work? Do you mean flying autonomously following "fiducial markers"?
Yeah. Based on the type of mission that's assigned to them, they can create geotagged waypoints that show up in the Unity application as augmented-reality geotagged waypoints. This all uses RabbitMQ as a message bus for communication. For the testing, the mission was for a drone to find a person (object) in a field and then display its geotagged location on the map and on screen as a waypoint. Once this waypoint is created, human operators using ground control software or the mobile Unity app, or other agents, can select agents/operators to complete a task. I had the virtual drone (which is also an augmented-reality object that is geotagged) go out and circle the person (object); after 30 seconds the agents returned home and landed, though not accurately.
This isn't an issue with the main branch, but with something I'm trying to build and contribute back to the community. I'm not that great at the coding part, but I'm not going to stop learning.
Problem instance: An RC airplane equipped with a gimbal camera is at V1 (vector one), with gimbal angle G1 and azimuth/direction A1 (N, S, E, W). A person in the field is at V2 (vector two). Assume the gimbal points directly at the person (V2). Given V1 = (x, y, z), R, A1, and G1, find the location of the person, V2 = (x1, y1, z1).

V1 = (39.375346, 10, -84.208137)
G1 = 45 degrees
A1 = pi/4, or 45 degrees
R = 10 (radius/distance to target)

Azimuth convention: 0 degrees = east, 90 degrees = south, 180 degrees = west, 270 degrees = north
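The flat-ground geometry I'm aiming for (my interpretation, assuming G1 is measured down from the horizontal and the person stands at ground level): the horizontal distance is

d = y / tan(G1) = 10 / tan(45 deg) = 10

which is consistent with R = 10, and the target is then

V2 = (x + d * cos(A1), 0, z + d * sin(A1))

with the slant range y / sin(G1) = 14.14 available as a sanity check. The example below follows this.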
Python example:

# -*- coding: utf-8 -*-
# code by adam dabdoub
import math

# Drone position: x = latitude, y = altitude (m), z = longitude
x = 39.375346
y = 10.0
z = -84.208137
V1 = {'X': x, 'Y': y, 'Z': z}
G1 = math.radians(45)  # gimbal depression angle below the horizontal
A1 = math.radians(45)  # azimuth: 0 = east, 90 = south, 180 = west, 270 = north
R = 10  # distance to target

# Horizontal ground distance implied by the altitude and gimbal angle
c = y / math.tan(G1)
# Slant range (hypotenuse), kept for reference
b = math.sqrt(c**2 + y**2)

# Project the horizontal distance along the azimuth; target is on the ground.
# NOTE: x and z are degrees, so adding meter offsets to them is only a
# placeholder; see the question and sketch below.
x1 = c * math.cos(A1) + x
y1 = 0.0
z1 = c * math.sin(A1) + z
V2 = (x1, y1, z1)
print("UAV Coordinates:\n", V1)
print("Person Coordinates:\n", V2)
The output I was getting:

old vector: {'X': 39.375346, 'Y': 10.0, 'Z': -84.208137}
new vector: (30.130935697561846, 4.0, -71.61683531351122)

I feel this is wildly inaccurate on my part because I'm dog$&!+ at coding. I'm looking to improve this. Do I need to do some unit conversion to get a better answer? Am I not taking into account that the math is Euclidean while the coordinates are spherical?
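One fix I'm considering (a minimal sketch, assuming pyproj is acceptable and the target is at ground level): the main error is treating raw latitude/longitude degrees as planar meters, so even a 10 m offset gets applied as roughly 10 degrees. Instead, compute the ground distance in meters and offset the geodetic position along the azimuth with pyproj.Geod:

import math
from pyproj import Geod

geod = Geod(ellps='WGS84')

lat, alt, lon = 39.375346, 10.0, -84.208137
G1 = math.radians(45)       # gimbal depression angle
az = (90.0 + 45.0) % 360.0  # convert the 0=east/90=south convention to a compass bearing
d = alt / math.tan(G1)      # horizontal ground distance in meters

# Move d meters from (lon, lat) along the bearing on the WGS84 ellipsoid
lon2, lat2, _ = geod.fwd(lon, lat, az, d)
print("Person coordinates:", lat2, lon2)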