News from SAME


Drone demonstration

Image processing for the autonomous navigation of micro drones

 

Jean-Luc Dugelay, Eurecom
Ludovic Apvrille, Telecom ParisTech

Micro drones are now affordable for the general public (around 300 euros) and can be remotely controlled from tablets or phones. Our expectation, however, is that they can also be used, at low cost but effectively, in a variety of civil protection and surveillance applications as mobile video surveillance sensors.

However, navigating in a complex and sometimes unknown environment is a task that is difficult even for human beings. To drive a car, for example, we are assisted by visual signs: lines on the road, warning signs, traffic lights, and so on. To achieve efficient autonomous navigation for drones based on image processing, we defined a collection of visual signs (or markers) that the drone can recognize in order to navigate indoors as well as outdoors.

For the purpose of the demonstration, we placed inside the SAME conference building a set of signs that indicate to the drone which path to follow from a starting point to an ending point. In particular, the drone can follow corridors, go through open doors, and make turns. The demonstration is based on a Parrot drone.

The drone uses one frontal camera and one bottom camera to identify the markers. Images output by the two cameras are first sent to a remote computer over a WiFi connection. The images are then analyzed, i.e. the markers are extracted from them. Finally, based on the recognized markers and on the drone's current position and state, navigation instructions are sent back to the drone.
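
As a rough illustration of this loop (not the demonstration's actual code), the sketch below pulls frames from a hypothetical video stream, hands them to a placeholder marker detector, and maps recognized markers to navigation commands. The stream address, the detect_markers and send_command functions, and the marker-to-command table are all illustrative assumptions.

```python
# Minimal sketch of the off-board processing loop; all names, addresses and
# the marker-to-command table are illustrative placeholders.
import cv2  # OpenCV, used here for frame capture and grayscale conversion

# Hypothetical mapping from a recognized marker label to a navigation command.
MARKER_ACTIONS = {
    "forward": "MOVE_FORWARD",
    "turn_left": "TURN_LEFT",
    "turn_right": "TURN_RIGHT",
    "land": "LAND",
}

def detect_markers(gray_frame):
    """Placeholder for the marker extraction step.

    The actual demonstration extracts the custom visual signs (corridor lines,
    door markers, turn signs) from the image; here we return an empty list so
    the sketch runs end to end.
    """
    return []

def send_command(command):
    """Placeholder for the WiFi command channel back to the drone."""
    print("would send:", command)

def control_loop(stream_url="tcp://192.168.1.1:5555"):  # address is illustrative
    capture = cv2.VideoCapture(stream_url)
    while capture.isOpened():
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # detect markers on grayscale
        for label in detect_markers(gray):
            action = MARKER_ACTIONS.get(label)
            if action is not None:
                send_command(action)
    capture.release()
```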

During the demonstration, the techniques used to implement the autonomous navigation will be presented in an amphitheater: the overall system architecture, the visual markers, and the image processing techniques. While the demonstration is running, the images before and after processing are displayed together in a single mosaic.
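
Building such a mosaic can be as simple as tiling the raw and processed frames from both cameras into one window; a minimal OpenCV/NumPy sketch (frame names are illustrative, and the images are assumed to share the same size and format) might look like this.

```python
import cv2
import numpy as np

def show_mosaic(front_raw, front_processed, bottom_raw, bottom_processed):
    """Tile raw and processed frames from both cameras into one window.

    All four images are assumed to share the same size and colour format.
    """
    top = np.hstack((front_raw, front_processed))
    bottom = np.hstack((bottom_raw, bottom_processed))
    mosaic = np.vstack((top, bottom))
    cv2.imshow("before / after", mosaic)
    cv2.waitKey(1)  # refresh the window without blocking the processing loop
```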

 

Title 1: Introduction to Sensor-based Control for Aerial Robots

 

Tarek Hamel and Robert Mahony

Abstract -- Aerial robotics has been an active area of research over the last decade. It has been steadily maturing, leading to sophisticated auto-pilot systems for fully autonomous flight and navigation systems for a range of military and civil applications. However, many of the practical challenges associated with real-time implementation of control and estimation algorithms for aerial robotic vehicles have yet to be satisfactorily resolved. Aerial robotic vehicles have complex and poorly known dynamic models. The sensor systems used can be noisy and poorly characterized. The applications considered may require them to be flown closer to the vehicle's performance limits than manned vehicles. They are often flown in close proximity to an unknown, or only partially known, and dynamically changing physical environment. They might be designed to fly indoors or in environments where GPS signals are not available. These practical requirements and constraints lead to a field that will benefit tremendously from the application of sophisticated control and estimation techniques. In this talk we present an overview of several methods of feedback control (including image-based visual servo control) developed for the class of Vertical Take-off and Landing (VTOL) vehicles, with special attention to practical issues in the context of visual servo control (stabilization, terrain following, landing, etc.).
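
As general background for the talk, and not as a summary of the specific controllers it presents, VTOL vehicles of this class are commonly controlled with a cascade in which a position/velocity loop produces a desired thrust vector whose norm sets the thrust and whose direction sets the attitude reference for an inner attitude loop. A minimal numerical sketch of that outer loop, with purely illustrative gains and parameters, is shown below.

```python
import numpy as np

# Illustrative gains and parameters; not taken from the talk.
MASS = 0.5                              # vehicle mass in kg (assumed)
G = np.array([0.0, 0.0, -9.81])         # gravity in the inertial frame
KP_POS, KD_POS = 2.0, 1.5               # position and velocity gains (assumed)

def outer_loop(pos, vel, pos_ref):
    """Position/velocity loop: return total thrust and desired body z-axis.

    The desired thrust vector must cancel gravity and drive the position
    error to zero; its norm is the thrust command and its direction is the
    axis the inner attitude loop should align the body z-axis with.
    """
    acc_des = KP_POS * (pos_ref - pos) - KD_POS * vel - G
    thrust_vec = MASS * acc_des
    thrust = np.linalg.norm(thrust_vec)
    z_body_des = thrust_vec / max(thrust, 1e-6)
    return thrust, z_body_des
```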

 

Title 2: Implementation of Nonlinear Attitude Estimators for Aerial Robotic Vehicles

 

Minh-Duc Hua, Guillaume Ducard, Tarek Hamel, Robert Mahony and Konrad Rudin

Abstract -- Attitude estimation is a key component of the avionics suite of any aerial robotic vehicle. This paper details theoretical and practical solutions for obtaining a robust nonlinear attitude estimator for flying vehicles equipped with low-cost sensors. The attitude estimator is based on a nonlinear explicit complementary filter, which has been significantly enhanced with effective gyro-bias compensation via the design of anti-windup nonlinear integrators. A measurement decoupling strategy is also proposed, which makes the overall attitude estimation more robust against inaccurate yaw angle estimation caused, for example, by magnetic disturbances. In addition, the paper discusses the fixed-point numerical implementation of the algorithm. Finally, real experimental results confirm the superiority of the described method compared to other similar methods.
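
For context, the baseline explicit complementary filter on SO(3) that this estimator builds on can be sketched in a few lines. The code below is a generic discrete-time version of that baseline, with illustrative gains and a first-order integration step; it is not the enhanced estimator with anti-windup integrators and measurement decoupling described in the paper.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix such that skew(w) @ v == np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def ecf_step(R_hat, b_hat, gyro, meas, refs, weights, dt, k_P=1.0, k_I=0.1):
    """One discrete step of the baseline explicit complementary filter.

    R_hat   : current attitude estimate (3x3 rotation, body to inertial)
    b_hat   : current gyro-bias estimate (3,)
    gyro    : gyro measurement (3,)
    meas    : measured unit directions in the body frame (accelerometer,
              magnetometer, ...)
    refs    : corresponding known inertial reference directions (unit vectors)
    weights : per-measurement weights k_i (illustrative)
    """
    # Innovation built from the vector measurements.
    omega_mes = np.zeros(3)
    for v_body, v0, k_i in zip(meas, refs, weights):
        v_hat = R_hat.T @ v0                  # predicted body-frame direction
        omega_mes += k_i * np.cross(v_body, v_hat)

    # Bias integrator (the paper adds anti-windup saturation to this step).
    b_hat = b_hat - k_I * omega_mes * dt

    # Attitude propagation with the corrected angular velocity.
    omega = gyro - b_hat + k_P * omega_mes
    R_hat = R_hat @ (np.eye(3) + skew(omega) * dt)   # first-order integration

    # Re-orthonormalize to stay on SO(3) despite the first-order step.
    u, _, vt = np.linalg.svd(R_hat)
    return u @ vt, b_hat
```

In this structure the gyro drives the high-frequency attitude dynamics while the vector measurements correct the low-frequency drift, which is the complementary-filter behaviour the paper's enhancements start from.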
