Closed: The1only closed this issue 1 year ago.
I'd like to help where I can. For now I'm busy trying to cure a pandemic. You can imagine what it's like to have the answers but lack the resources to prove it. Annoying, to say the least.
When all this is over I'd love to do the testing. I use Python, so MAVLink is about as far as my programming skills will take us. If resources allow, I'll collect more drones and we'll get to test them out.
Keep up the good work. I'll hold the fort until the pandemic has passed. The pandemic is on its way back for a second round around here.
I can offer testing with a DJI Mavic Pro and QGC on Linux. I've been looking forward to being able to control the Mavic outside of the DJI ecosystem for a long time, especially for mission planning etc. I will try to build the latest master again and see how far I can get.
Perfect, let me know. I believe I know what will fail and can point you in the right direction. It's the transcoding again.
Best regard Terje Nilsen 9Tek AS Norway
Hello, I am on the project. I have a Mavic Air; how can I test/help?
Let me check in the latest code that will support Mavic Air.
The Mavic Air is working, both on the Android device and on an external unit like QGC.
Okay I am on standby
You can start by checking out the main code and see if you get live video in the app and in QGC... if you do, we are flying.
If you run into issues, just ask.
Okay will do!
Hi, I can work on the MAVLink interface.
@The1only The project is great and would help many users. Feel free to count me in. I am testing on a Mavic 2 Zoom and a Mini with QGC on Ubuntu and Google Pixel 2/3/4, and will work on interfacing MAVROS and web applications.
It would be great to get an initial guide from you so we can understand the code structure well (even a simple flow diagram would help). For example, after main/java/sq.rogue.rosettadrone/MainActivity.java, which is the next important .java file to look at? I would think sq.rogue.rosettadrone/video contains the core video-streaming handling, and com.MAVLink contains the core translation between MAVLink and the DJI language. But maybe I'm missing the global picture of the code.
Best, -- Luke
Let's see:
Focus on: DroneModel, MainActivity and MavlinkReceiver, and the layout and GUI stuff.
Everything related to drone communication and control, except for video, happens in "DroneModel", so any new actions are supposed to go there. I have done quite a few new things lately and will push them soon.
"MainActivity" handles the main screen and the video-related stuff, except for the low-level parts. I have noticed that when I use Inspire 1 or Mavic Air (and probably Phantom 3) drones, we need quite a powerful Android device to avoid video errors, so we need to look at performance and threading. Another thing is that some threads seem to hang after an exit: sometimes I can run video just fine right after rebooting the Android device, but after restarting the app the video turns to garbage...
In the "video" folder are very low-level H.264, RTSP and RTP files. Only edit these files if you are familiar with these formats at the bit level. DJI does not deliver an H.264 stream we can transmit directly over UDP; we need to add NAL units and other framing.
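To make the point about the missing framing concrete, here is a minimal sketch (not the actual RosettaDrone code) of what "adding NALs" means in Annex-B terms: prepending the 0x00000001 start code to each NAL unit, and replaying cached SPS/PPS parameter sets before every IDR frame so a decoder can join the stream mid-flight. The function names and the caching policy are illustrative assumptions.

```python
# Hedged sketch: assume the DJI decoder callback hands us raw H.264 NAL unit
# payloads without Annex-B framing. Before streaming over UDP we prepend the
# 0x00000001 start code, and cache SPS/PPS to resend before each IDR frame.
ANNEXB_START = b"\x00\x00\x00\x01"

NAL_IDR, NAL_SPS, NAL_PPS = 5, 7, 8

def nal_type(nal: bytes) -> int:
    """The low 5 bits of the first payload byte give the H.264 NAL unit type."""
    return nal[0] & 0x1F

def frame_for_udp(nal: bytes, param_cache: dict) -> bytes:
    """Return an Annex-B framed chunk ready to hand to an RTP packetizer.

    Caches SPS/PPS and replays them in front of every IDR frame so a decoder
    joining mid-stream still gets the parameter sets it needs.
    """
    t = nal_type(nal)
    if t in (NAL_SPS, NAL_PPS):
        param_cache[t] = nal
    out = b""
    if t == NAL_IDR:
        for p in (NAL_SPS, NAL_PPS):
            if p in param_cache:
                out += ANNEXB_START + param_cache[p]
    return out + ANNEXB_START + nal
```

The real pipeline also has to deal with RTP packetization (fragmenting large NAL units into FU-A packets), which this sketch leaves out.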
"MavlinkReceiver" handles all MAVLink-related stuff, so when a new MAVLink command is received it is parsed here. The actions are then executed in DroneModel. Unfortunately there is also some waypoint code in this file that is quite confusing and not working at the moment. If someone could fix the waypoints, that would be great. Remember that DJI has some strange demands for minimum and maximum distance, minimum number of waypoints, etc.
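As a concrete illustration of those DJI demands, a mission could be pre-validated before being handed to the SDK. This is a sketch rather than RosettaDrone code, and the numeric limits used (2-99 waypoints, adjacent waypoints between 0.5 m and 2000 m apart) are my reading of the DJI Mobile SDK documentation; double-check them against the SDK version in use.

```python
import math

# Hedged sketch: pre-validate a waypoint mission before handing it to the DJI
# SDK. The limits below are assumptions based on the DJI Mobile SDK docs.
MIN_WAYPOINTS, MAX_WAYPOINTS = 2, 99
MIN_STEP_M, MAX_STEP_M = 0.5, 2000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def validate_mission(waypoints):
    """waypoints: list of (lat, lon) tuples. Returns a list of problem strings."""
    problems = []
    if not MIN_WAYPOINTS <= len(waypoints) <= MAX_WAYPOINTS:
        problems.append("need %d-%d waypoints, got %d"
                        % (MIN_WAYPOINTS, MAX_WAYPOINTS, len(waypoints)))
    for i in range(1, len(waypoints)):
        d = haversine_m(*waypoints[i - 1], *waypoints[i])
        if not MIN_STEP_M <= d <= MAX_STEP_M:
            problems.append("leg %d: %.1f m outside [%.1f, %.1f] m"
                            % (i, d, MIN_STEP_M, MAX_STEP_M))
    return problems
```

Rejecting a bad mission with a readable message in MavlinkReceiver, instead of letting the SDK fail silently, would make the waypoint path much easier to debug.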
The rest of the files are helpers doing small, dedicated jobs, as can be seen. I am about to add a generic AI screen that everyone can modify to their own liking. I use this mostly for AI research, using Python on a Linux host computer to control the drone from live video.
The layout is currently tuned for the DJI Smart Controller screen. A more dynamic layout would be nice; on the CrystalSky it looks a bit odd.
I have added some experimental DJI Timeline usage lately, to test it out. At the moment I use this to take off to x meters, as DJI does not support that directly. In the current code I have implemented that function using very low-level virtual stick movements to climb while holding position. It works well, but the Timeline might make it simpler and hopefully better in wind. I will add all this code later today.
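The virtual-stick climb described above boils down to a proportional controller on altitude error. A minimal sketch of that idea follows; `read_altitude` and `send_stick` are hypothetical stand-ins for the real DJI SDK calls, and the toy plant exists only to make the loop testable (a real loop would also sleep between iterations).

```python
# Hedged sketch of "take off to X metres" via virtual sticks: a proportional
# controller drives the vertical-velocity stick command until the target
# altitude is reached. The callbacks are hypothetical, not DJI SDK names.
def climb_to(target_m, read_altitude, send_stick, kp=0.8, vmax=2.0,
             tol=0.2, max_steps=1000):
    """Command vertical velocity proportional to altitude error, clamped to vmax."""
    for _ in range(max_steps):
        err = target_m - read_altitude()
        if abs(err) < tol:
            send_stick(0.0)           # close enough: hold position
            return True
        v = max(-vmax, min(vmax, kp * err))
        send_stick(v)                 # m/s, positive = up
    return False                      # gave up before reaching target

# Toy plant for testing: altitude integrates the commanded velocity at 10 Hz.
class ToyDrone:
    def __init__(self):
        self.alt = 0.0
        self.v = 0.0
    def read_altitude(self):
        self.alt += self.v * 0.1      # integrate last command over 0.1 s
        return self.alt
    def send_stick(self, v):
        self.v = v
```

Clamping the commanded velocity keeps the climb gentle near the target, which should also behave better in wind than a bang-bang stick command.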
If anyone knows how to make a plugin interface for the software, please let me know. We need a plugin for the AI, so that everyone can make their own AI stuff without getting involved in RosettaDrone 2 itself...
A plugin is now added for the AI function block. This allows users to make totally independent Activities that can be licensed outside Rosettadrone. It opens a new world of possibilities.
@The1only, thanks for your guide. We found that validateTranscodingMethod in MainActivity.java doesn't include the case of MAVIC_MINI. Should we add it?
Hope to hear your expertise.
Best, -- Luke
Hi @The1only and team,
I have good news.
1.- I just finished testing, reviewing and merging all forks, added support for native MAVLink waypoints and many other features, fixed hundreds of bugs, implemented feature requests, cleaned a lot of code, etc.
It's time to upload all my changes to the master branch and respond and close the issues.
Since this project has been dead for over a year and I know you have no time, please give me full access to this repo, since I will be the active maintainer for some time. If that's not possible, I will create a "Rosetta 3" forked project, but I prefer to keep development centralized.
In the future, let's start committing atomic changes immediately and apply good Git practices.
For example, the merge of the CommandManager was a good move, but I noticed you had to revert it because we are not up to date. The codebase still needs a lot of refactoring and cleaning, but we first have to get up to date.
2.- We are connecting MAVLink with DJI, the best flying drones considering features/price. After cleaning up, I will contact the other open-source communities (ArduPilot, Dronecode, etc.) and ask them to include "Rosetta 3" in their list of "member" projects.
Please reach out to me, as I want to help maintain this project. I would consider myself a mid-level software engineer and would like to help out where possible. I'd like to be a scrum master for this project if given some guidance!! Adam Dabdoub, Software Developer, c: 513.886.0301
Perfect, you are now the only maintainer except for me!!! I hope you have time and can do a good job.
@The1only I still have no access to close issues or merge pull requests. My user is "kripper"
@kripper once you have been added, please start assigning issues/bugs/tests to me and I will start working on them. Adam Dabdoub, Software Developer, c: 513.886.0301
@danammeansbear sure
@The1only you got confused and added Adam instead of me :-)
I can fix it once you give me access.
Oops, hmm. Send me your GitHub name and I'll fix it.
Fixed!!! I believe :-)
@The1only yes, thanks. Master branch is up to date now. Please everybody switch back to master and report issues.
++ Thanks for keeping this project alive and moving it forward. Being able to use QGroundControl over MAVLink with DJI hardware is <3
SUBJECT: Precision landing
Hi team,
WayPoint missions are working fine now with QGC using the new MissionManager class (using VirtualSticks). Next, I will implement precision landing using Python and OpenCV. There are a lot of Python scripts around on GitHub, but some of you have probably also played with precision landing in the past. Does anybody want to share their existing precision-landing implementation with Rosetta?
I'm available for reviewing, cleaning and integrating the code into master so we can all maintain and improve it afterwards. My recommendation is to always share your code and stay up to date on the same codebase. Otherwise personal efforts and implementations become obsolete and start to conflict with upstream implementations, which sooner or later will be superior and well maintained by others.
I got object seek/detection and following/tracking working using OpenCV, a trained «resdet» AI, and Rosetta, with some Kalman filters etc. I am considering releasing it. However, I cannot release the trained AI, but it's not that hard to make something for landing.
I'm thinking of integrating the precision-landing algorithm using ArUco from this repo: https://github.com/tizianofiorenzani/how_do_drones_work
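For reference, the core of that ArUco approach, as I understand the linked repo rather than its literal code, is mapping the marker's pixel offset from the image centre into a body-frame velocity command while descending. A sketch with assumed camera parameters (the field of view, image size, and gains below are illustrative, not values from the repo):

```python
import math

# Hedged sketch of the descend-over-marker step in ArUco precision landing:
# convert the detected marker centre's pixel offset into an angular error,
# then into a body-frame velocity command. FOV/gains are assumptions.
H_FOV_DEG, V_FOV_DEG = 62.2, 48.8      # assumed camera field of view
IMG_W, IMG_H = 640, 480                # assumed image size

def marker_to_velocity(mx, my, altitude_m, kp=1.0, v_down=0.3):
    """(mx, my): marker centre in pixels. Returns (vx_fwd, vy_right, vz_down) m/s."""
    ang_x = (mx - IMG_W / 2) / IMG_W * math.radians(H_FOV_DEG)
    ang_y = (my - IMG_H / 2) / IMG_H * math.radians(V_FOV_DEG)
    # Small-angle ground offset at the current altitude; fly toward the marker.
    vy_right = kp * altitude_m * math.tan(ang_x)
    vx_fwd = -kp * altitude_m * math.tan(ang_y)   # image y grows downward
    return (vx_fwd, vy_right, v_down)
```

The actual detection side would come from `cv2.aruco`, and the resulting velocity would go out through something like Rosetta's set_position_target path; only the geometry is shown here because it is the part that tends to get the signs wrong.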
BTW, have you tried MAVSDK? I'm running it locally on the smartphone listening on a UDP port. Rosetta connects to the port (MAVSDK acknowledges the connection), but somehow the mavlink commands sent by MAVSDK are not received by Rosetta.
Yes, I used it with Python all the time. There was a trick; I'll check my code.
No sorry, I used mavutil:

```python
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Thu Jul 30 16:12:11
@.*: smartdrone
"""
# #########DEPENDENCIES#############
import time
import threading
import math
import numpy as np
from contextlib import contextmanager

from dronekit import (Command, connect, VehicleMode, APIException,
                      LocationGlobalRelative, LocationGlobal)
from pymavlink import mavutil
from nose.tools import assert_equal
from AI_module import Modes

# ########GLOBALES#################

# ########FUNCTIONS#################
class drone_controller:
    def __init__(self):
        self.g_elev = 1500
        self.g_yaw = 1500
        self.g_pich = 1500
        self.g_roll = 1500
        self.g_sw1 = 1000
        self.g_sw2 = 1000
        self.g_sw3 = 1000
        self.g_AImode = Modes.NONE
        self.modeoverride = False
        self.takeoffheight = 10
        self.t_stop = False
        self.airborn = False
        self.takeoffheading = -1
        self.just_seen = []  # Place holder...
        self.sitl = None
        self.command_OK = False
        print('Controller initialized...')

    # We go ahead and call our observer once at startup to get an initial value...
    def addObserverAndInit(self, name, cb):
        self.vehicle.add_attribute_listener(name, cb)

    # ----------------------------------------------------
    def message(self, name, message):
        self.flightstatus = message
        self.flightstatusname = name
        self.flightstatusready = True

    # Create a message listener for all messages.
    def receivecontrol(self, name, message):
        self.g_elev = message.chan1_raw
        self.g_yaw = message.chan2_raw
        self.g_pich = message.chan3_raw
        self.g_roll = message.chan4_raw
        self.g_sw1 = message.chan5_raw
        self.g_sw2 = message.chan6_raw
        self.g_sw3 = message.chan7_raw
        if self.modeoverride == False:
            if message.chan8_raw == 1000:
                self.g_AImode = Modes.NONE
            elif message.chan8_raw == 1100:
                self.g_AImode = Modes.LINETRACKING
            elif message.chan8_raw == 1200:
                self.g_AImode = Modes.POLETRACKING
            elif message.chan8_raw == 1300:
                self.g_AImode = Modes.LINELOOKING

    # Complete status...
    def completefunc(self, name, message):
        print("1) Received: %s -> %s" % (name, message))
        if message.command == mavutil.mavlink.MAV_CMD_NAV_TAKEOFF and message.result == 0:
            self.command_OK = True
        elif message.command == 84 and message.result == 0:  # Set position local NED...
            self.command_OK = True
        elif message.command == 86 and message.result == 0:  # Position Conditional YAW...
            self.command_OK = True
        elif message.command == mavutil.mavlink.MAV_CMD_DO_SET_SERVO and message.result == 0:
            self.command_OK = True

    # ----------------------------------------------------
    # @contextmanager
    # def assert_command_ack(self, command_type, ack_result=mavutil.mavlink.MAV_RESULT_ACCEPTED, timeout=10):
    #     """Context manager to assert that:
    #     1) exactly one COMMAND_ACK is received from a Vehicle;
    #     2) for a specific command type;
    #     3) with the given result;
    #     4) within a timeout (in seconds).
    #     For example:
    #     .. code-block:: python
    #         with assert_command_ack(vehicle, mavutil.mavlink.MAV_CMD_PREFLIGHT_CALIBRATION, timeout=30):
    #             vehicle.calibrate_gyro()
    #     """
    #     acks = []
    #     def on_ack(self, name, message):
    #         if message.command == command_type:
    #             acks.append(message)
    #     self.vehicle.add_message_listener('COMMAND_ACK', on_ack)
    #     yield
    #     start_time = time.time()
    #     while not acks and time.time() - start_time < timeout:
    #         time.sleep(0.1)
    #     self.vehicle.remove_message_listener('COMMAND_ACK', on_ack)
    #     assert_equal(1, len(acks))                   # one and only one ACK
    #     assert_equal(command_type, acks[0].command)  # for the correct command
    #     assert_equal(ack_result, acks[0].result)     # the result must be successful

    # Define vehicle listener, and forward to controller...
    def connect(self, connection_string):
        print("Connecting...")
        if not connection_string:
            from dronekit_sitl import SITL, start_default
            self.sitl = SITL()
            # self.sitl.download('solo', '2.0.20', verbose=True)
            # sitl_args = ['-I0', '--model', 'quad', 'param load ./solo.param', '--home=60.4094,10.4911,0,45']
            self.sitl.download('copter', '3.3', verbose=True)
            sitl_args = ['-I0', '--model', 'quad', '--home=60.4094,10.4911,0,0']
            self.sitl.launch(sitl_args, await_ready=True, restart=True)
            # connection_string = 'tcp:127.0.0.1:5760'
            # sitl = start_default()
            connection_string = self.sitl.connection_string()
            print("Simulate mode!!!!")
        self.vehicle = connect(connection_string, wait_ready=True)
        self.vehicle.wait_ready('autopilot_version')
        # We need a position for the Pole tracker...
        self.just_seen = self.vehicle.location.global_relative_frame
        # self.vehicle.add_attribute_listener('GLOBAL_POSITION_INT', self.message)
        self.printstatus()
        self.setCameraYaw(0)
        return self.vehicle

    def printstatus(self):
        time.sleep(1)
        # Get Vehicle Home location - will be None until first set by autopilot
        while not self.vehicle.home_location:
            cmds = self.vehicle.commands
            cmds.download()
            cmds.wait_ready()
            if not self.vehicle.home_location:
                print(" Waiting for home location ...")
                time.sleep(0.5)
        time.sleep(2)
        # Get all vehicle attributes (state)
        print("\nGet all vehicle attribute values:")
        print(" Autopilot Firmware version: %s" % self.vehicle.version)
        print(" Major version number: %s" % self.vehicle.version.major)
        print(" Minor version number: %s" % self.vehicle.version.minor)
        print(" Patch version number: %s" % self.vehicle.version.patch)
        print(" Release type: %s" % self.vehicle.version.release_type())
        print(" Release version: %s" % self.vehicle.version.release_version())
        print(" Stable release?: %s" % self.vehicle.version.is_stable())
        # print(" Autopilot capabilities")
        # print(" Supports MISSION_FLOAT message type: %s" % self.vehicle.capabilities.mission_float)
        # print(" Supports PARAM_FLOAT message type: %s" % self.vehicle.capabilities.param_float)
        # print(" Supports MISSION_INT message type: %s" % self.vehicle.capabilities.mission_int)
        # print(" Supports COMMAND_INT message type: %s" % self.vehicle.capabilities.command_int)
        # print(" Supports PARAM_UNION message type: %s" % self.vehicle.capabilities.param_union)
        # print(" Supports ftp for file transfers: %s" % self.vehicle.capabilities.ftp)
        # print(" Supports commanding attitude offboard: %s" % self.vehicle.capabilities.set_attitude_target)
        # print(" Supports commanding position and velocity targets in local NED frame: %s" % self.vehicle.capabilities.set_attitude_target_local_ned)
        # print(" Supports set position + velocity targets in global scaled integers: %s" % self.vehicle.capabilities.set_altitude_target_global_int)
        # print(" Supports terrain protocol / data handling: %s" % self.vehicle.capabilities.terrain)
        # print(" Supports direct actuator control: %s" % self.vehicle.capabilities.set_actuator_target)
        # print(" Supports the flight termination command: %s" % self.vehicle.capabilities.flight_termination)
        # print(" Supports mission_float message type: %s" % self.vehicle.capabilities.mission_float)
        # print(" Supports onboard compass calibration: %s" % self.vehicle.capabilities.compass_calibration)
        print(" Global Location: %s" % self.vehicle.location.global_frame)
        print(" Global Location (relative altitude): %s" % self.vehicle.location.global_relative_frame)
        print(" Local Location: %s" % self.vehicle.location.local_frame)
        print(" Home Location: %s" % self.vehicle.home_location)
        print(" Attitude: %s" % self.vehicle.attitude)
        print(" Velocity: %s" % self.vehicle.velocity)
        print(" GPS: %s" % self.vehicle.gps_0)
        print(" Gimbal status: %s" % self.vehicle.gimbal)
        print(" Battery: %s" % self.vehicle.battery)
        print(" EKF OK?: %s" % self.vehicle.ekf_ok)
        print(" Last Heartbeat: %s" % self.vehicle.last_heartbeat)
        print(" Rangefinder: %s" % self.vehicle.rangefinder)
        print(" Rangefinder distance: %s" % self.vehicle.rangefinder.distance)
        print(" Rangefinder voltage: %s" % self.vehicle.rangefinder.voltage)
        print(" Heading: %s" % self.vehicle.heading)
        print(" Is Armable?: %s" % self.vehicle.is_armable)
        print(" System status: %s" % self.vehicle.system_status.state)
        print(" Groundspeed: %s" % self.vehicle.groundspeed)  # settable
        print(" Airspeed: %s" % self.vehicle.airspeed)        # settable
        print(" Mode: %s" % self.vehicle.mode.name)           # settable
        print(" Armed: %s" % self.vehicle.armed)              # settable
        # print(self.vehicle.message_factory)
        self.last = self.vehicle.location.global_relative_frame

    def takeoff(self):
        self.threadTakeoff = threading.Thread(target=self.arm_and_takeoff,
                                              args=([self.takeoffheight]))
        self.threadTakeoff.start()

    def arm_and_takeoff(self, targetHeight):
        print("Takeoff started to: %f" % targetHeight)
        if self.vehicle.location.global_relative_frame.alt < 1.5:
            print("Checking armable...")
            timeout = 0
            while self.vehicle.is_armable is not True and timeout < 10:
                timeout = timeout + 1
                if timeout == 5:
                    print("Timeout waiting for armable... ")
                    # return None
                time.sleep(0.5)
                if self.t_stop is True:
                    return False
            print("Waiting for drone to enter GUIDED flight mode")
            self.vehicle.mode = VehicleMode("GUIDED")
            while not self.vehicle.mode.name == 'GUIDED':
                time.sleep(0.5)
                if self.t_stop is True:
                    return False
            print("Waiting for vehicle to become armed.")
            self.vehicle.armed = True
            timeout = 0
            while self.vehicle.armed is False and timeout < 5:
                timeout = timeout + 1
                if timeout == 9:
                    print("Timeout waiting for armed... ")
                    # return False
                time.sleep(0.5)
                if self.t_stop is True:
                    return False
            print("Look out! Virtual/Real props are spinning!!")
            time.sleep(4)
        self.Position = self.vehicle.location.global_relative_frame
        self.takeoffheading = self.vehicle.heading
        self.command_OK = False  # Only used with Rosetta Drone; any other target will not respond.
        self.vehicle.simple_takeoff(targetHeight)  # meters
        while self.command_OK == False or self.vehicle.location.global_relative_frame.alt < (targetHeight * 0.98):
            print("Current Altitude: %f" % self.vehicle.location.global_relative_frame.alt)
            time.sleep(0.5)
            if self.t_stop is True:
                return False
        time.sleep(1)
        self.condition_yaw(self.takeoffheading, relative=False)
        print("Altitude: %f" % self.vehicle.location.global_relative_frame.alt)
        self.just_seen = self.vehicle.location.global_relative_frame
        # Airborne, set camera angle...
        self.setCameraYaw(0)
        time.sleep(3)
        self.setCameraPitch(30)
        time.sleep(3)
        self.airborn = True
        print("Target altitude reached!!")
        return True

    def return_home_and_land(self):
        # Send drone home...
        print("Returning home...")
        time.sleep(1.0)
        # Stop the AI ...
        self.airborn = False
        print("Waiting for drone to enter RTL flight mode")
        self.vehicle.mode = VehicleMode("RTL")
        time.sleep(5)
        while not self.vehicle.mode.name == 'RTL':
            time.sleep(0.5)
            if self.t_stop is True:
                return False
        print("Waiting for drone to return to landing")
        # while self.vehicle.mode.name == 'RTL' and self.vehicle.armed is True:
        while self.vehicle.armed is True:
            print("Current Altitude: %f" % self.vehicle.location.global_relative_frame.alt)
            time.sleep(1.0)
            if self.t_stop is True:
                return False
        print("Mode:", self.vehicle.mode.name)
        print("Landed...")

    def land(self):
        print("Now let's land")
        self.vehicle.mode = VehicleMode("LAND")
        time.sleep(5)
        print("Waiting for landing")
        while self.vehicle.mode == 'LAND' and self.vehicle.armed is True:
            print("Current Altitude: %f" % self.vehicle.location.global_relative_frame.alt)
            time.sleep(1)
        print("Landed...")

    # Send a velocity command with +x being the heading of the drone.
    # For RosettaDrone/DJI we could use yaw_rate... but then we break ArduPilot compatibility...
    def send_local_velocity(self, vx, vy, vz, yaw):
        msg = self.vehicle.message_factory.set_position_target_local_ned_encode(
            0, 0, 0,
            mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,
            0b0000011111000111,  # Following (1 masks OUT): yaw vel, yaw pos, force-not-accel, az,ay,ax, vz,vy,vx, pz,py,px
            0, 0, 0,
            vx, vy, vz,          # speed forward, right, down...
            0, 0, 0,
            0, float(yaw))       # yaw rate in radians/s...
        self.vehicle.send_mavlink(msg)
        self.vehicle.flush()

    # Send a position command with +x being ahead of the drone.
    def send_local_position(self, px, py, pz, yaw):
        msg = self.vehicle.message_factory.set_position_target_local_ned_encode(
            0, 0, 0,
            mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,
            0b0000101111111000,
            px, py, pz,
            0, 0, 0,
            0, 0, 0,
            float(yaw), 0)
        self.vehicle.send_mavlink(msg)
        self.vehicle.flush()

    # Send a velocity command with +x being NORTH of the drone.
    def send_global_ned_velocity(self, vx, vy, vz, yaw):
        msg = self.vehicle.message_factory.set_position_target_global_int_encode(
            0,           # time_boot_ms (not used)
            0, 0,        # target system, target component
            mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT_INT,  # frame
            0b0000011111000111,  # type_mask (only speeds enabled)
            0, 0, 0,     # x, y, z positions (not used)
            vx, vy, vz,  # N, E, D velocity in m/s
            0, 0, 0,     # x, y, z acceleration (not supported yet, ignored in GCS_Mavlink)
            0, float(yaw))  # yaw, yaw_rate (not supported yet, ignored in GCS_Mavlink)
        self.vehicle.send_mavlink(msg)
        self.vehicle.flush()

    # Send a position command with yaw being the relative heading of the drone (yaw=0 is no change).
    def send_global_ned_position_relative_yaw(self, x, y, z, yaw, speed):
        abs_yaw = self.angular_difference(math.radians(self.vehicle.heading), yaw, False)
        self.send_global_ned_position(x, y, z, abs_yaw, speed)

    # Send a position command with +x being the heading of the drone.
    def send_global_ned_position(self, x, y, z, yaw, speed):
        # print("Goto Yaw: %f" % (yaw * 180 / math.pi))
        msg = self.vehicle.message_factory.set_position_target_global_int_encode(
            0,           # time_boot_ms (not used)
            0, 0,        # target system, target component
            mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT_INT,  # frame
            0b0000111111111000,  # type_mask (only position enabled), ignore yaw for now...
            # 0b0000101111111000,  # type_mask (only position enabled)
            int(x * 10000000), int(y * 10000000), float(z),  # x, y, z positions
            speed, speed, 1.0,   # x, y, z velocity in m/s
            0, 0, 0,             # x, y, z acceleration (not supported yet, ignored in GCS_Mavlink)
            float(yaw), 0)       # yaw, yaw_rate (not supported yet, ignored in GCS_Mavlink)
        self.vehicle.send_mavlink(msg)
        self.vehicle.flush()

    def set_yaw(self, heading, relative=True):
        if relative:
            is_relative = 1  # yaw relative to direction of travel
        else:
            is_relative = 0  # yaw is an absolute angle
        if heading > 360:
            heading = heading - 360
        if heading < 0:
            heading = heading + 360
        if heading > 0:
            direction = 1
        else:
            direction = -1
        # create the CONDITION_YAW command using command_long_encode()
        msg = self.vehicle.message_factory.command_long_encode(
            0, 0,          # target system, target component
            mavutil.mavlink.MAV_CMD_CONDITION_YAW,  # command
            0,             # confirmation
            abs(heading),  # param 1, yaw in degrees
            5,             # param 2, yaw speed deg/s
            direction,     # param 3, direction -1 ccw, 1 cw
            is_relative,   # param 4, relative offset 1, absolute angle 0
            0, 0, 0)       # param 5 ~ 7 not used
        # send command to vehicle
        self.vehicle.send_mavlink(msg)
        self.vehicle.flush()

    def take_picture(self):
        msg = self.vehicle.message_factory.command_long_encode(
            0, 0,
            mavutil.mavlink.MAV_CMD_DO_DIGICAM_CONTROL,
            0,
            1, 0, 0, 0, 1, 0, 0)
        self.vehicle.send_mavlink(msg)
        self.vehicle.flush()
        print('Picture taken!')
        time.sleep(2)

    # Set servo output, values are in degrees (+-90)
    def set_servo(self, servo, val):
        # print("servo: %d Val: %f" % (servo, val))
        self.command_OK = False
        msg = self.vehicle.message_factory.command_long_encode(
            0,                   # time_boot_ms (not used)
            0,                   # target system, target component
            mavutil.mavlink.MAV_CMD_DO_SET_SERVO,
            0,
            servo,               # RC channel...
            1500 + (val * 5.5),  # RC value
            0, 0, 0, 0, 0)
        # with self.assert_command_ack(mavutil.mavlink.MAV_CMD_DO_SET_SERVO):
        self.vehicle.send_mavlink(msg)
        self.vehicle.flush()
        timeout = 0
        while self.command_OK == False and timeout < (5 * 5):
            time.sleep(0.2)
            timeout = timeout + 1

    # Camera control, Pitch -20..+90 deg... Position.
    def setCameraPitch(self, angle):
        self.set_servo(9, -angle)

    # Camera control, Yaw +-90 deg...
    def setCameraYaw(self, angle):
        self.set_servo(8, angle)

    def condition_yaw(self, head, relative=True):
        print('Wait for position...')
        errorcounter = 100
        self.command_OK = False
        # Store location to be used after rotation (the drone may drift during rotation...)
        aLocation = self.vehicle.location.global_relative_frame
        if relative is True:
            heading = self.angular_difference(self.vehicle.heading, head, True)
        else:
            heading = head
        error = abs(self.angular_distance(self.vehicle.heading, heading))
        # If official ArduPilot, the mode will stay GUIDED all the time, thus 1.5 deg is the limit.
        # If Rosettadrone, the mode will be AUTO until the position is reached... < 1.25 deg error...
        while self.command_OK == False and error > 1.5:
            error = abs(self.angular_distance(self.vehicle.heading, heading))
            print("Current error: %f" % error)
            # Try / retry...
            if errorcounter > 40:
                errorcounter = 0
                self.set_yaw(heading, False)
            errorcounter = errorcounter + 1
            time.sleep(0.25)
        # Go back to the location before rotation, this is an issue on the DJI drones...
        self.goto_point(aLocation, heading, 0.4, True)
        print("At Position...")

    def print_state(self):
        print("Global Location (relative altitude): %s" % self.vehicle.location.global_relative_frame)
        print("Velocity: %s" % self.vehicle.velocity)
        print("Heading: %s" % self.vehicle.heading)

    def goto_relative(self, forward, right, up, yaw, speed):
        current_yaw = self.vehicle.heading
        aLocation = self.vehicle.location.global_relative_frame
        point1 = self.get_location_metres(aLocation, forward, right, current_yaw)
        point1.alt = point1.alt + up
        abs_yaw = self.angular_difference(current_yaw, yaw, True)
        # print("abs %f %f %f " % (abs_yaw, yaw, current_yaw))
        self.goto_point(point1, abs_yaw, speed, False)

    # Move around relative the origo...
    def goto_point(self, aLocation, yaw, speed, wait):
        if yaw < 0:
            yaw = yaw + 360
        retrycount = 100
        self.command_OK = False
        # We got no way to know when it is done... so we have to do it manually...
```
t= 0 while self.command_OK == False and t < 4: dist = self.get_distance_metres(aLocation,self.vehicle.location.global_relative_frame) distalt = abs(aLocation.alt-self.vehicle.location.global_relative_frame.alt) error = abs(self.angular_distance(self.vehicle.heading, yaw)) if wait == True: print('Distance to position %f-%f-%f' % (dist, distalt, error)) print('Head: %f' % self.vehicle.heading) # Some times we do not whant to wait... if wait == False: # Get global position from local fram parameters... self.send_global_ned_position(aLocation.lat, aLocation.lon, aLocation.alt, math.radians(yaw), speed) return if dist < 0.250 and distalt < 0.2: # and error < 1.5: t = t + 1 else: t = 0 # Try / Retry... every 10 sec. if retrycount > 40: retrycount = 0 # Get global position from local fram parameters... print("Set Alt: %f"%aLocation.alt) self.send_global_ned_position(aLocation.lat, aLocation.lon, aLocation.alt, math.radians(yaw), speed) retrycount=retrycount+1 time.sleep(0.25) print("At Position...") # Do a full crossbar inspection, this functio nwill be run as a async process... # aSize = cm to move... def adds_pole_mission(self,aSize): # Stop the drone movement if any... self.send_local_velocity(0,0,0,0) time.sleep(2) yaw = self.vehicle.heading aLocation = self.vehicle.location.global_relative_frame print(aLocation) print('Head: %f' % yaw) point1 = self.get_location_metres(aLocation, 0, aSize/100, yaw) # point1.alt = point1.alt - 0.5 print(point1) point2 = self.get_location_metres(aLocation, 0, -aSize/100, yaw) # point2.alt = point2.alt - 0.5 print(point2) # Get global position from local fram parameters... self.take_picture() time.sleep(1) # Move to right side, and take picture... 
self.goto_point(point1, yaw, 0.4, True) time.sleep(2) # self.goto_point(point1, self.angular_difference(yaw, -20, True),0.4) self.condition_yaw(self.angular_difference(yaw, -20, True), relative=False) time.sleep(2) self.take_picture() time.sleep(1) self.condition_yaw(yaw, relative=False) time.sleep(2) # Move back to center (and up 1 meeter)# self.goto_point(aLocation, yaw, 0.4, True) # Move to left side, and take picture... self.goto_point(point2, yaw, 0.4, True) time.sleep(2) self.condition_yaw(self.angular_difference(yaw, +20, True), relative=False) time.sleep(2) self.take_picture() time.sleep(1) self.condition_yaw(yaw, relative=False) time.sleep(2) # Move back to center and up... self.goto_point(aLocation, yaw, 0.4, True) # Now wait for the tracker to lock... time.sleep(7) def get_location_metres(self,original_location, dForward, dRight, deg): """ Returns a LocationGlobal object containing the latitude/longitude dNorth
and dEast
metres from the specified original_location
. The returned Location has the same alt
value as original_location
. The function is useful when you want to move the vehicle around specifying locations relative to the current vehicle position. The algorithm is relatively accurate over small distances (10m within 1km) except close to the poles. For more information see: http://gis.stackexchange.com/questions/2951/algorithm-for-offsetting-a-latitude-longitude-by-some-amount-of-meters """ ro = math.radians(deg) dNorth = dForward math.cos(ro) - dRightmath.sin(ro) dEast = dForward math.sin(ro) + dRightmath.cos(ro) earth_radius=6378137.0 #Radius of "spherical" earth #Coordinate offsets in radians dLat = dNorth/earth_radius dLon = dEast/(earth_radiusmath.cos(math.pioriginal_location.lat/180)) #New position in decimal degrees newlat = original_location.lat + (dLat 180/math.pi) newlon = original_location.lon + (dLon 180/math.pi) return LocationGlobal(newlat, newlon,original_location.alt) def get_distance_metres(self,aLocation1, aLocation2): """ Returns the ground distance in metres between two LocationGlobal objects. This method is an approximation, and will not be accurate over large distances and close to the earth's poles. It comes from the ArduPilot test code: https://github.com/diydrones/ardupilot/blob/master/Tools/autotest/common.py """ R=6371000 o1 = aLocation1.lat math.pi/180.0 o2 = aLocation2.lat math.pi/180.0 do = (aLocation1.lat-aLocation2.lat) math.pi/180 dp = (aLocation1.lon-aLocation2.lon) math.pi/180 a = math.sin(do/2) math.sin(do/2) + \ math.cos(o1) math.cos(o2) \ math.sin(dp/2) math.sin(dp/2) c = 2 math.atan2(math.sqrt(a),math.sqrt(1-a)) return Rc def distance_to_current_waypoint(self): """ Gets distance in metres to the current waypoint. It returns None for the first waypoint (Home location). 
""" nextwaypoint = self.vehicle.commands.next if nextwaypoint==0: return None missionitem=self.vehicle.commands[nextwaypoint-1] #commands are zero indexed lat = missionitem.x lon = missionitem.y alt = missionitem.z targetWaypointLocation = LocationGlobalRelative(lat,lon,alt) distancetopoint = self.get_distance_metres(self.vehicle.location.global_frame, targetWaypointLocation) return distancetopoint def angular_distance(self, alpha, beta): phi = abs(beta - alpha) % 360 # // This is either the distance or 360 - distance if phi > 180: distance = 360 - phi else: distance = phi if (alpha - beta >= 0 and alpha - beta <= 180) or (alpha - beta <=-180 and alpha- beta>= -360): return distance else: return distance -1 # Add an offset to a angle, ether in deg or rad. The input and output will be in 0-360 or 0-2pi. def angular_difference(self, inn, ang, deg): newang = inn+ang if deg == True: if newang > 360: newang = newang - 360 if newang < 0: newang = newang + 360 else: if newang > (math.pi2): newang = newang - (math.pi2) if newang < 0: newang = newang + (math.pi2) return newang def get_bearing(self,lat1,lon1,lat2,lon2): dLon = lon2 - lon1 y = math.sin(dLon) math.cos(lat2) x = math.cos(lat1)math.sin(lat2) - math.sin(lat1)math.cos(lat2)math.cos(dLon) brng = np.rad2deg(math.atan2(y, x)) if brng < 0: brng+= 360 return brngBest regardTerje Nilsen9Tek ASNorwayOn 18 Feb 2023, at 20:12, Terje Nilsen *@.> wrote:Yes used it with python all the time. There was a trick, I’ll check my code.Best regardTerje Nilsen9Tek ASNorwayOn 18 Feb 2023, at 18:49, Christopher Pereira @.***> wrote:
BTW, have you tried MAVSDK?
I'm running it locally on the smartphone listening on a UDP port.
Rosetta connects to the port (MAVSDK acknowledges the connection), but somehow the MAVLink commands sent by MAVSDK are not received by Rosetta.
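One quick thing to check when commands are acknowledged at the socket level but never acted on is which protocol version the sender is actually emitting, since Rosetta only spoke MAVLink 1 for a long time. A minimal stdlib sketch (the port 14550 is an assumption; use whatever port Rosetta listens on) that classifies incoming datagrams by their MAVLink magic byte:

```python
import socket

# The first byte of every MAVLink frame identifies the protocol version:
# 0xFE for MAVLink 1, 0xFD for MAVLink 2 (per the MAVLink framing spec).
MAGIC = {0xFE: "MAVLink 1", 0xFD: "MAVLink 2"}

def classify_mavlink(datagram: bytes) -> str:
    """Return the MAVLink version of a raw UDP datagram, or 'unknown'."""
    if not datagram:
        return "unknown"
    return MAGIC.get(datagram[0], "unknown")

def sniff(port: int = 14550) -> None:
    """Print the protocol version of each datagram arriving on `port`."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, addr = sock.recvfrom(4096)
        print(addr, classify_mavlink(data), len(data), "bytes")

# Usage: run sniff() while MAVSDK is sending commands.
```

If MAVSDK is emitting 0xFD frames while the receiver only parses MAVLink 1, the commands would be silently dropped, which matches the symptom described above.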
Got a new development machine; give me time to set everything up on the Mac. Will let you know.
@The1only can we rename the project to "Rosetta Drone" (without the version number) or just "Rosetta"?
I just found out that Rosetta does not support MAVLink 2. I'm going to implement it now.
We can, if you feel like it. Be aware that the author of the original Rosetta Drone 1 (named "rosettadrone") gave up and handed it to me years ago. We do not want to confuse people, as that one is totally different.

PS: you do a great job by the way, I support you 100%.

Best regard
Terje Nilsen
9Tek AS
Norway
I would advise against using just "Rosetta": that will conflict with the macOS x86 emulation system that's also called Rosetta and create unnecessary noise when searching for issues in the future. So I'd suggest keeping RosettaDrone or coming up with a new unique name.
I'm contacting the other Rosetta Drone repo owners on GitHub to link to this repo. It would be good to transfer ownership of this repo to an official "RosettaDrone" user.
I've added MAVLink 2 support on the master branch and tested it with QGC. The version number is now hardcoded to 2.
The difference from the other Rosetta Drone repo is so huge that we would still need to call this Rosetta Drone V2.x, or better V3.x, to avoid confusion, as there are again so many changes. I do not quite see the problem.

Best regard
Terje
The other repos on GitHub are now officially unmaintained and link to us.
I need help cleaning the video code.
I wonder why we are not just sending the raw data received from the SDK to a destination UDP port and processing it outside Rosetta with ffmpeg (instead of decoding it inside Rosetta with the native helper). Would that be possible? I tried it briefly with a Mini SE, but wasn't able to decode the video on the endpoint with ffmpeg, even though ffprobe recognized it as h264 and detected its size. Maybe ffmpeg needs special options?

When trying with the original RTP stream "packetized" by Rosetta, gstreamer works, but the video looks as if it had encoding/decoding errors. I couldn't decode or detect the stream with ffmpeg or ffprobe. It could also be a performance problem, so I will try to just decode single frames. I want it for computer vision.
@The1only could you maybe confirm if your latest video code is in master branch?
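For the "process it outside Rosetta" experiment, a sketch along these lines might help isolate the problem: read the raw UDP datagrams and pipe them into ffplay as an h264 elementary stream. The port (5600) and the assumption that the payload is Annex-B byte-stream data (start codes 00 00 01 / 00 00 00 01) are guesses on my part; if the SDK prepends extra framing, the start-code scan below at least shows where the real NAL units begin.

```python
import socket
import subprocess

def nal_start_positions(buf: bytes) -> list:
    """Offsets of Annex-B start codes (00 00 00 01 or 00 00 01) in buf."""
    positions = []
    i = 0
    while True:
        j = buf.find(b"\x00\x00\x01", i)
        if j < 0:
            break
        # Report the 4-byte form if a leading zero precedes the 3-byte code.
        positions.append(j - 1 if j > 0 and buf[j - 1] == 0 else j)
        i = j + 3
    return positions

def pipe_to_ffplay(port: int = 5600) -> None:
    """Forward raw UDP payloads into ffplay as an h264 elementary stream."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    player = subprocess.Popen(
        ["ffplay", "-f", "h264", "-fflags", "nobuffer", "-i", "-"],
        stdin=subprocess.PIPE)
    while True:
        data, _ = sock.recvfrom(65535)
        player.stdin.write(data)

# Usage: pipe_to_ffplay(5600)
```

If `nal_start_positions` reports start codes at nonzero offsets in every datagram, that would support the theory below that DJI prepends data that needs to be stripped before ffmpeg can sync to the stream.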
I see. You got UDP packets that were too long when sending them out without decoding first. Here is your old post: https://github.com/The1only/rosettadrone/issues/8
I'm just worried about the decoding and encoding (in Rosetta) and then decoding again on the AI endpoint. Maybe Rosetta could send already-decoded frames as raw data to the AI script at a given resolution and frequency, especially when the AI script is running locally on Android.
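A raw decoded frame is far larger than one UDP datagram, so the "send decoded frames" idea needs a trivial chunking protocol on top. A hypothetical sketch (the 6-byte header layout with frame id, chunk index and chunk count is invented for illustration, not anything Rosetta implements; real code would also carry width/height/pixel format):

```python
import struct

# Hypothetical chunk header: frame_id, chunk_index, chunk_count (uint16 each),
# followed by the payload bytes of that chunk.
HEADER = struct.Struct("!HHH")
CHUNK_PAYLOAD = 1400 - HEADER.size  # stay under a typical Ethernet MTU

def split_frame(frame_id: int, frame: bytes) -> list:
    """Split one raw frame into MTU-sized datagrams with sequence headers."""
    chunks = [frame[i:i + CHUNK_PAYLOAD] for i in range(0, len(frame), CHUNK_PAYLOAD)]
    return [HEADER.pack(frame_id, idx, len(chunks)) + c
            for idx, c in enumerate(chunks)]

def join_frame(datagrams: list) -> bytes:
    """Reassemble a frame from datagrams (any arrival order); None if incomplete."""
    parts = {}
    count = None
    for dg in datagrams:
        frame_id, idx, count = HEADER.unpack(dg[:HEADER.size])
        parts[idx] = dg[HEADER.size:]
    if count is None or len(parts) != count:
        return None
    return b"".join(parts[i] for i in range(count))
```

The receiver drops frames whose chunk set never completes, which is the usual trade-off for raw frames over UDP: you skip a frame instead of stalling the AI loop.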
I created this issue to discuss the idea of doing the decoding outside Rosetta: https://github.com/The1only/rosettadrone/issues/122
I believe DJI adds some frames/data that need to be stripped out, as I remember it.

Best regard
Terje Nilsen
9Tek AS
Norway
Let me check; there were no issues with video on the Mavic Pro 2... However, there were some performance issues with gstreamer. This is why we used Nvidia GPUs when we compiled gstreamer. I believe that sometimes there are a lot of I-frames that make the decoder struggle... Then again, this was years ago. Everything has changed.

Best regard
Terje Nilsen
9Tek AS
Norway
If you look at the video I made, the overall latency was surprisingly short (at least to me), so no effort was made to improve it further. However, flying fully autonomously with AI and video ran into issues with fine navigation/tracking once we passed approx. 5 m/s. This is why I had to add a Kalman filter in the Python AI code.

Best regard
Terje Nilsen
9Tek AS
Norway
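For reference, the kind of filter meant here can be as small as a constant-velocity Kalman filter over the tracked position, one instance per coordinate. A minimal 1D sketch (the noise constants `q` and `r` are placeholders to tune, not values from the original AI code):

```python
class Kalman1D:
    """Constant-velocity Kalman filter for one tracked coordinate."""

    def __init__(self, dt, q=0.1, r=1.0):
        self.dt = dt
        self.x = [0.0, 0.0]                # state: [position, velocity]
        self.P = [[1.0, 0.0], [0.0, 1.0]]  # state covariance
        self.q = q                         # process noise
        self.r = r                         # measurement noise

    def update(self, z):
        dt = self.dt
        # Predict: x' = F x with F = [[1, dt], [0, 1]], P' = F P F^T + Q
        x0 = self.x[0] + dt * self.x[1]
        x1 = self.x[1]
        P00 = self.P[0][0] + dt * (self.P[0][1] + self.P[1][0]) + dt * dt * self.P[1][1] + self.q
        P01 = self.P[0][1] + dt * self.P[1][1]
        P10 = self.P[1][0] + dt * self.P[1][1]
        P11 = self.P[1][1] + self.q
        # Correct with position measurement z (H = [1, 0])
        s = P00 + self.r
        k0, k1 = P00 / s, P10 / s
        y = z - x0
        self.x = [x0 + k0 * y, x1 + k1 * y]
        self.P = [[(1 - k0) * P00, (1 - k0) * P01],
                  [P10 - k1 * P00, P11 - k1 * P01]]
        return self.x[0]                   # filtered position
```

Feeding it the per-frame position from the tracker smooths out the jitter the drone would otherwise react to at speed, and the velocity state gives a prediction to bridge dropped frames.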
If anyone knows how to make a plugin interface for the software, please let me know. We need a plugin for the AI, so that everyone can make their own AI features without getting involved in RosettaDrone2 itself...
Done. https://github.com/The1only/rosettadrone/commit/93cf9efd57a7f31b14e01bcb7f92cb9441b80cef
You can now move your custom code to a class that extends the Plugin class. This way we can keep the MainActivity clean.
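Rosetta's actual Plugin class lives in the Android (Java) code, but the shape of the pattern is easy to sketch. A hypothetical Python analogue (the hook names `on_start`/`on_video_frame` are invented for illustration; see the linked commit for the real interface):

```python
class Plugin:
    """Base class: the host calls these hooks; plugins override what they need."""

    def on_start(self, host):
        pass

    def on_video_frame(self, frame):
        pass


class PluginHost:
    """Minimal dispatcher: registers plugins and fans events out to them."""

    def __init__(self):
        self.plugins = []

    def register(self, plugin):
        self.plugins.append(plugin)
        plugin.on_start(self)

    def dispatch_frame(self, frame):
        for plugin in self.plugins:
            plugin.on_video_frame(frame)


# Example AI plugin that never touches the host's internals:
class FrameCounter(Plugin):
    def on_start(self, host):
        self.count = 0

    def on_video_frame(self, frame):
        self.count += 1
```

The point of the design is the same as in the commit: custom AI code lives entirely in a Plugin subclass, so MainActivity (the host) stays clean and plugins can be added or removed without touching it.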
I have an event-based architecture I am working on, with a desktop and mobile app, using RabbitMQ, Blazor and Unity. Once complete, I will upload it to my repo for testing as a demo project.

Adam Dabdoub
Software Developer
c: 513.886.0301
With some Kalman filters.
I believe I need to implement EKF.
EKF to do what? I normally try to stay with a KF.
Moved here: https://github.com/The1only/rosettadrone/issues/132
We need people to work on the upgrade to the latest SDK; I ended up with no video at all.
We need people to work on upgrading Gradle; it makes the code fail...
We need people to work on the Android library updates.
We need more people on the MAVLink interface; there is a lot of work to be done, implementing one feature at a time.
We need people on the Google Maps flight planner page; it does not work.
We need people to test on all the different DJI platforms.
We need a heads-up when new things are coming our way, as DJI changes things without telling us...