Deep learning is currently a hot research topic, enabling the automation of complex tasks, and UAVs can now reliably access hard-to-reach places. We are researching how to combine the two: have a camera-equipped UAV fly near a piece of infrastructure and run a deep learning algorithm to inspect it, detecting cracks in concrete, rust on metal, and so on.

As a preliminary test, we flew a Phantom 4 near one of the floodlight posts of the Okutama baseball field. This gave us an idea of the kind of footage we can capture from a UAV and of how close we can fly to the infrastructure.

In the context of our lab’s deep learning research, an interesting problem came up: given the geographic coordinates (“GPS” for convenience) and altitude of the drone, its heading, the 3D orientation of its camera, and the intrinsic parameters of the camera (resolution, field of view, etc.), can we compute the GPS coordinates of a given pixel in the camera’s image? This is a photogrammetry problem, and solving it would let us pinpoint the real-world location of objects detected by a deep learning algorithm in image space (i.e., at a certain pixel in the image).
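As a sketch of one way to attack this problem, assuming flat ground at the take-off altitude, zero camera roll, and an ideal pinhole camera with no lens distortion (all names below are illustrative, not from our codebase):

```python
import math

EARTH_RADIUS = 6_378_137.0  # WGS-84 equatorial radius in metres

def pixel_to_gps(lat, lon, alt_m, yaw_deg, pitch_deg,
                 u, v, img_w, img_h, hfov_deg):
    """Estimate the GPS coordinates of the ground point seen at
    pixel (u, v).  Flat-ground, zero-roll, pinhole-camera sketch."""
    # Focal length in pixels, from the horizontal field of view.
    f = (img_w / 2) / math.tan(math.radians(hfov_deg) / 2)

    # Ray through the pixel, in the camera frame
    # (x right, y down, z forward along the optical axis).
    xc = (u - img_w / 2) / f
    yc = (v - img_h / 2) / f
    zc = 1.0

    # Rotate by camera pitch (0 = level, -90 = pointing straight down)
    # into a local north/east/down frame, then by the drone's heading.
    p = math.radians(pitch_deg)
    n = zc * math.cos(p) + yc * math.sin(p)
    d = -zc * math.sin(p) + yc * math.cos(p)
    e = xc
    psi = math.radians(yaw_deg)  # heading, clockwise from north
    north = n * math.cos(psi) - e * math.sin(psi)
    east = n * math.sin(psi) + e * math.cos(psi)

    if d <= 0:
        return None  # ray points at or above the horizon, never hits the ground

    # Intersect the ray with the flat ground alt_m metres below.
    scale = alt_m / d
    dn, de = north * scale, east * scale

    # Convert the metre offset to degrees (small-offset approximation).
    dlat = math.degrees(dn / EARTH_RADIUS)
    dlon = math.degrees(de / (EARTH_RADIUS * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon
```

A quick sanity check: for the centre pixel with the camera pointing straight down, this returns (approximately) the drone’s own coordinates.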

Today we flew our new DJI Matrice 100 for the first time. The Matrice 100 is a developer-oriented drone compatible with the DJI Onboard SDK, which means we can use an onboard computer instead of interacting with the drone through a smartphone connected to the radio controller, as we did with the Phantom 4 and the Mobile SDK in the past. In the future, this will allow us to perform heavy computations on the drone itself, for example running deep learning on an onboard superchip in real time.

Today, for the first time, we tested multiple (two) drones using our Supervisory Control Station (SCS), and the test was successful. Furthermore, we tested dynamic re-allocation while a given allocation was being executed.

We logged GPS positions to assess accuracy and round-trip times (RTT) to assess delays. We observed values similar to last time’s, with the exception of one large delay (7 seconds).

The purpose of today’s flight was to test our Drone Automatic Flight Controller over a wider area. The GCS was set up at the baseball field, and the drone started at the Hikawa elementary school. The drone flew at 5 m/s at an altitude of 105 meters. The SCS running on the GCS started the drone, which then automatically flew to the target area at the baseball field.

The purpose of today’s flights was to test our current UTM infrastructure, consisting of a first version of the Supervisory Control System (SCS), hosted on a Ground Control Station (GCS), and the Drone Automated Controller (DAC). The SCS was used to manually create task allocations, i.e., GPS locations the drone should visit; the DAC then translates each allocation into flight instructions for the drone. As this was our first field test, the tests were performed at the baseball field in Okutama. We were happy to see that our UTM infrastructure is functional.
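As a rough illustration of the allocation-to-instruction step (the names and fields below are hypothetical, not the actual SCS/DAC interfaces), a task allocation can be thought of as an ordered list of GPS waypoints that the DAC expands into flight commands:

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    lat: float   # degrees
    lon: float   # degrees
    alt: float   # metres above the take-off point

def allocation_to_instructions(allocation):
    """Translate a task allocation (an ordered list of Waypoints)
    into simple textual flight instructions, as a stand-in for the
    commands the DAC would send to the drone."""
    instructions = ["takeoff"]
    for wp in allocation:
        instructions.append(f"goto {wp.lat:.6f} {wp.lon:.6f} alt {wp.alt:.0f}")
    instructions.append("land")
    return instructions
```

For example, an allocation with a single waypoint expands to a takeoff, one goto command, and a landing.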

The purpose of today’s flights was to collect aerial footage of several places in Okutama, including the Hikawa elementary school, Nemoto Shrine, and the Okutama baseball field. Later, we will pixel-wise label selected frames from the footage and thereby improve the accuracy of our deep learning model, which performs real-time semantic segmentation (pixel-wise labeling) of the aerial video stream. We used the 4K camera of our DJI Phantom 4 quadcopter. All of today’s flights were piloted manually.
