roboflow / dji-aerial-georeferencing

Detect objects in drone videos and plot them on a map
https://blog.roboflow.com/georeferencing-drone-videos/
Apache License 2.0

handle cases where the pitch value is different from -90 #18

Open ghost opened 8 months ago

ghost commented 8 months ago
        # Assumes at module level: import math; from geojson import Point;
        # from turfpy.measurement import rhumb_destination
        center = Point([self.observation["longitude"], self.observation["latitude"]])  # GeoJSON Point at the drone's position
        altitude = float(self.observation["ascent(feet)"]) * 0.3048  # convert ascent from feet to meters
        fov_tan = math.tan(self.fov)  # tangent of the camera's diagonal field of view (self.fov in radians)
        diagonal_distance = altitude * fov_tan  # ground distance spanned by the frame's diagonal, in meters
        distance = diagonal_distance / 2  # distance from the frame center to a corner
        bearing = (float(self.observation["compass_heading(degrees)"]) - 90) % 360  # bearing used for the corner offsets (compass heading rotated by 90 degrees)
        offset = math.atan(self.video_height / self.video_width) * 180 / math.pi  # half-diagonal angle from the aspect ratio, in degrees

        # GPS coordinates of the video frame's four corners (flat ground, camera pointing straight down)
        options = {'units': 'm'}
        self.top_left = rhumb_destination(center, distance, (bearing - offset + 180) % 360 - 180, options).geometry.coordinates
        self.top_left = self.top_left[::-1]  # [lon, lat] -> [lat, lon]

        self.top_right = rhumb_destination(center, distance, (bearing + offset + 180) % 360 - 180, options).geometry.coordinates
        self.top_right = self.top_right[::-1]

        self.bottom_right = rhumb_destination(center, distance, (bearing - offset) % 360 - 180, options).geometry.coordinates
        self.bottom_right = self.bottom_right[::-1]

        self.bottom_left = rhumb_destination(center, distance, (bearing + offset) % 360 - 180, options).geometry.coordinates
        self.bottom_left = self.bottom_left[::-1]

        # Current drone position as [lat, lon]
        self.current_position = [self.observation["latitude"], self.observation["longitude"]]

        # Map a detection at pixel (self.dx, self.dy) to a GPS point
        normalized = [self.dy - self.video_height / 2, self.dx - self.video_width / 2]
        distance_from_center_px = math.sqrt((self.video_width / 2 - self.dx) ** 2 + (self.video_height / 2 - self.dy) ** 2)
        diagonal_distance_px = math.sqrt(self.video_width ** 2 + self.video_height ** 2)
        percent_of_diagonal = distance_from_center_px / diagonal_distance_px
        distance = percent_of_diagonal * diagonal_distance  # meters from the frame center
        angle = math.atan(normalized[0] / (normalized[1] or 0.000001)) * 180 / math.pi  # guard against division by zero

        # if the detection is in the right half of the frame, rotate the angle by 180 degrees
        if normalized[1] >= 0:
            angle += 180

        # Project the detection outward from the frame center along its bearing
        point = rhumb_destination(center, distance, (bearing + angle) % 360, options).geometry.coordinates
        point = point[::-1]  # [lon, lat] -> [lat, lon]

I have successfully rewritten the code in Python, and it works well when the gimbal pitch is set to -90. What should I do to extend it to handle cases where the pitch value is different from -90?

yeldarby commented 8 months ago

Sounds hard to me because, in addition to much harder math, you'd also have to account for the topography of the terrain (imagine a mountain in front of you; the georeferencing would be very different from flat land or a valley).
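
As a rough flat-plane illustration of that terrain dependence (made-up numbers, nothing from this repo): the horizontal distance from the drone to the point where the camera's optical axis meets the ground shrinks as the terrain rises toward the look direction.

    import math

    def forward_ground_distance(drone_altitude_m, terrain_elevation_m, pitch_deg):
        """Horizontal distance from the drone to where the optical axis hits
        the ground, for a DJI-style gimbal pitch (-90 = straight down)."""
        depression_deg = abs(pitch_deg)  # angle below the horizon
        height_above_ground_m = drone_altitude_m - terrain_elevation_m
        return height_above_ground_m / math.tan(math.radians(depression_deg))

    print(forward_ground_distance(120, 0, -45))   # flat ground under the view: ~120 m ahead
    print(forward_ground_distance(120, 60, -45))  # a 60 m rise under the view: ~60 m ahead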

ghost commented 8 months ago

I am considering a flat surface for my example. In the previously mentioned code, the assumption was made that the drone is positioned at the center of the image. However, when the gimbal's pitch angle differs from -90, an offset is introduced that shifts the drone's apparent location within the frame. I cannot figure out the mathematics behind it.
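
For the flat-surface case, here is a minimal sketch of the center-shift part of that geometry (it reuses the turfpy rhumb_destination helper from the code above and assumes a DJI-style gimbal_pitch(degrees) value where -90 means straight down; the function name and parameters are only illustrative, not from the repo):

    import math
    from geojson import Point
    from turfpy.measurement import rhumb_destination

    def shifted_footprint_center(longitude, latitude, altitude_m, heading_deg, pitch_deg):
        """Where the camera's optical axis meets flat ground for a non-nadir pitch.

        heading_deg: compass heading the camera faces, in degrees from north.
        pitch_deg:   gimbal pitch, -90 = straight down, 0 = horizontal.
        Returns [lat, lon] of the footprint center on a flat ground plane
        (only meaningful for pitch values between -90 and 0, exclusive of 0).
        """
        drone_position = Point([longitude, latitude])
        tilt_from_nadir_deg = 90 + pitch_deg        # 0 when pointing straight down
        if tilt_from_nadir_deg <= 0:                # at or past nadir: treat as no shift in this sketch
            return [latitude, longitude]
        # Horizontal distance from the point directly below the drone to the
        # optical-axis/ground intersection, assuming level terrain.
        forward_m = altitude_m * math.tan(math.radians(tilt_from_nadir_deg))
        shifted = rhumb_destination(drone_position, forward_m, heading_deg, {'units': 'm'})
        lon, lat = shifted.geometry.coordinates
        return [lat, lon]

This only moves the footprint center; with a tilted camera the rectangular frame no longer maps to a rectangle on the ground (it becomes a trapezoid), so each corner and each detection really needs its own ray-to-ground projection, which is the harder math mentioned above.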

peakyquest commented 3 months ago

I am working on a similar project where my gimbal isn't looking straight downward. Is there any way to handle that?