Technical Blog

Category : AR.Drone

Control the AR.Drone LEDs

The AR.Drone has four LEDs, one per rotor.

Currently, we can only use predefined animations (around 20), but user-defined sequences may become possible in a future version of the SDK.

Which function to use?

We use the ardrone_at_set_led_animation function to control the LEDs. The prototype is defined in ARDroneLib/Soft/Common/ardrone_api.h:

void ardrone_at_set_led_animation (
    LED_ANIMATION_IDS anim_id,
    float32_t freq,
    uint32_t duration_sec);
  • anim_id: one of the animations defined by the SDK. It is detailed further down in this article.
  • freq: the frequency (in hertz) of the animation, i.e. the inverse of the period (in seconds). For example, if it is set to 0.25, the period is 4 seconds: the animation completes in 4 seconds. The higher the value, the faster the animation.
  • duration_sec: the duration in seconds. If it is greater than the execution time of the animation, the animation will loop. If it is set to 0, the animation loops indefinitely.

The different animations

The anim_id is one of the values defined in ARDroneLib/Soft/Common/led_animation.h. There are several kinds of animations:

  • BLINK_GREEN_RED
  • BLINK_GREEN
  • BLINK_RED
  • BLINK_ORANGE
  • SNAKE_GREEN_RED
  • FIRE
  • STANDARD
  • RED
  • GREEN
  • RED_SNAKE
  • BLANK
  • RIGHT_MISSILE
  • LEFT_MISSILE
  • DOUBLE_MISSILE
  • FRONT_LEFT_GREEN_OTHERS_RED
  • FRONT_RIGHT_GREEN_OTHERS_RED
  • REAR_RIGHT_GREEN_OTHERS_RED
  • REAR_LEFT_GREEN_OTHERS_RED
  • LEFT_GREEN_RIGHT_RED
  • LEFT_RED_RIGHT_GREEN
  • BLINK_STANDARD

Understanding the led_animation.h file

The file ARDroneLib/Soft/Common/led_animation.h contains the declaration of the different animations. It works like a kind of enum: each identifier refers to an ID that is transmitted to the drone, and the drone then plays the animation settings associated with this ID. Therefore, we cannot define our own sequences.
For example:

LED_ANIMATION(BLINK_STANDARD, {0,2,{{0x00,500},{0xA5,500}}})

The LED_ANIMATION macro is explained in the file:

LED_ANIMATION(#name, {#nb_cycle,#nb_state,{{#led_pattern1,#delay1},{#led_pattern2,#delay2},{...,...}}})
#name = name, example : BLINK
#nb_cycle = number of times the animation is played (0 means infinite), example : 3
#nb_state = number of led patterns in the animation, example : 2
#led_pattern = led bitfield (G1 | R1 | G2 | R2 | G3 | R3 | G4 | R4), example : 0xAA all green led turned on
#delay = delay in ms for the associated led pattern, example : 500

In our example, the animation’s ID is BLINK_STANDARD. It loops infinitely and contains two states: 0x00 for 0.5 seconds, and 0xA5 for 0.5 seconds.
0xA5 is a hexadecimal code; its binary form is 1010 0101. It is mapped this way:

 1  0  1  0  0  1  0  1
G1 R1 G2 R2 G3 R3 G4 R4

Therefore, we can see that:

  • LEDs 1 and 2 are green
  • LEDs 3 and 4 are red


In the same way, in the other state, 0x00, all the LEDs are turned off.

To use this animation, we use the following code:

ardrone_at_set_led_animation(BLINK_STANDARD, 0.25, 6);

In this example, the LEDs will switch between the two states every two seconds (a period of 4 seconds), for a total of 6 seconds.

Appendix: list of the animations

The list can be found in the following file: ARDroneLib/Soft/Common/led_animation.h

LED_ANIMATION(BLINK_GREEN_RED,               { 0,2, { {0x55,500},{0xAA,500} } } )
LED_ANIMATION(BLINK_GREEN,                   { 0,2, { {0x00,500},{0xAA,500} } } )
LED_ANIMATION(BLINK_RED,                     { 0,2, { {0x55,500},{0x00,500} } } )
LED_ANIMATION(BLINK_ORANGE,                  { 0,2, { {0xFF,500},{0x00,500} } } )
LED_ANIMATION(SNAKE_GREEN_RED,               { 0,8, { {0x90,200},{0x48,200},{0x24,200},{0x12,200},{0x9,200},{0x84,200},{0x42,200},{0x21,200}}})
LED_ANIMATION(FIRE,                          { 0,2, { {0x35,50},{0xC5,50} } } )
LED_ANIMATION(STANDARD,                      { 1,1, { {0xA5,100} } } )
LED_ANIMATION(RED,                           { 1,1, { {0x55,100} } } )
LED_ANIMATION(GREEN,                         { 1,1, { {0xAA,100} } } )
LED_ANIMATION(RED_SNAKE,                     { 0,4, { {0x40,500},{0x10,500},{0x04,500},{0x01,500}}})
LED_ANIMATION(BLANK,                         { 1,1, { {0x00,100} } } )
LED_ANIMATION(RIGHT_MISSILE,                 { 1,5, { {0x00,500},{0x04,300},{0x1C,100},{0x30,300},{0x00,500}}})
LED_ANIMATION(LEFT_MISSILE,                  { 1,5, { {0x00,500},{0x01,300},{0x43,100},{0xC0,300},{0x00,500}}})
LED_ANIMATION(DOUBLE_MISSILE,                { 1,5, { {0x00,500},{0x05,300},{0x5F,100},{0xF0,300},{0x00,500}}})
LED_ANIMATION(FRONT_LEFT_GREEN_OTHERS_RED,   { 1,1, { {0x95,100} } } )
LED_ANIMATION(FRONT_RIGHT_GREEN_OTHERS_RED,  { 1,1, { {0x65,100} } } )
LED_ANIMATION(REAR_RIGHT_GREEN_OTHERS_RED,   { 1,1, { {0x59,100} } } )
LED_ANIMATION(REAR_LEFT_GREEN_OTHERS_RED,    { 1,1, { {0x56,100} } } )
LED_ANIMATION(LEFT_GREEN_RIGHT_RED,          { 1,1, { {0x96,100} } } )
LED_ANIMATION(LEFT_RED_RIGHT_GREEN,          { 1,1, { {0x69,100} } } )
LED_ANIMATION(BLINK_STANDARD,                { 0,2, { {0x00,500},{0xA5,500} } } )

Create a video with the AR.Drone

The AR.Drone is an efficient source of images: it can fly, it can be remotely controlled, etc. In this article, we will see how to create AVI files from its cameras. We will use OpenCV, so you will probably want to take a look at Use OpenCV with the AR.Drone SDK first.

OpenCV code

We will use the CvVideoWriter structure to build our AVI file.
First, we need a function to initialize it:

CvVideoWriter *init_video_writer(char *fname)
{
  int isColor = 1;
  int fps     = 30;
  int frameW = 320;
  int frameH = 240;
  return cvCreateVideoWriter(fname, // with avi extension
                             CV_FOURCC('D', 'I', 'V', 'X'), //MPEG4
                             fps,
                             cvSize(frameW,frameH),
                             isColor);
}

This feature is handled in my project with a button. I added two functions that are called by the button’s callback:

static CvVideoWriter *video_writer = 0;
void init_video(void)
{
  video_writer = init_video_writer("video.avi"); /* any filename with the avi extension */
}

void stop_video(void)
{
  // Necessary to have a valid avi file
  cvReleaseVideoWriter(&video_writer);
}

I added a function to add a frame to the video:

inline void add_frame(IplImage *img)
{
  if (video_writer)
    cvWriteFrame(video_writer, img);
}

Finally, we need to call the add_frame function every time we receive a new frame from the drone. I added it in the output_gtk_stage_transform function in Video/video_stage.c.
Underneath the code creating the OpenCV image, I added:

if (/* video saving is enabled by the user */)
    add_frame(img);

Handling different frame rates

The AR.Drone has two cameras, using two different frame rates:

  • The frontal camera has 15 FPS
  • The vertical camera has 60 FPS

In the previous code, the video was created at 30 FPS. Therefore, one camera would look too slow and the other too fast. We can update the function this way:

CvVideoWriter *init_video_writer(char *fname, int fps)
{
  int isColor = 1;
  int frameW  = 320;
  int frameH  = 240;
  return cvCreateVideoWriter(fname, // with avi extension
                             CV_FOURCC('D', 'I', 'V', 'X'), // MPEG4
                             fps,
                             cvSize(frameW,frameH),
                             isColor);
}

Then, we can call it in two ways:

video_writer = init_video_writer("out_horizontal.avi", 15);

or

video_writer = init_video_writer("out_vertical.avi", 60);

Refer to this page for more information about possible codecs.

Use OpenCV with the AR.Drone SDK

OpenCV (Open Source Computer Vision Library) is a powerful image processing library. I will detail in this post how to use it with the AR.Drone’s C SDK.

Compiling the AR.Drone SDK with OpenCV

The first step is to install OpenCV. If you’re using Ubuntu, you may refer to this page.

Once the library is installed, we need to modify the Makefile to add the correct flags. We will edit sdk_demo/Build/Makefile.

To add the correct cflags, find the line:

GENERIC_INCLUDES:=$(addprefix -I,$(GENERIC_INCLUDES))

and add underneath:

GENERIC_INCLUDES += `pkg-config --cflags opencv` 

To add the correct libraries, change the following line:

GENERIC_LIBS=-lpc_ardrone -lgtk-x11-2.0 -lrt

to

GENERIC_LIBS=-lpc_ardrone -lgtk-x11-2.0 -lrt `pkg-config --libs opencv`

Creating an OpenCV image from the drone’s image

We need to update the output_gtk_stage_transform function in Video/video_stage.c to convert the data received from the drone into an IplImage, the OpenCV image structure. First, we need to add some includes:

#include "cv.h"
#include "highgui.h" // if you want to display images with OpenCV functions

We will use a method close to what we did to create a GdkPixbuf:

IplImage *ipl_image_from_data(uint8_t* data)
{
  IplImage *currframe;
  IplImage *dst;

  // Create a header only: the pixel data still belongs to the SDK's buffer
  currframe = cvCreateImageHeader(cvSize(320,240), IPL_DEPTH_8U, 3);
  cvSetData(currframe, data, 320 * 3);
  dst = cvCreateImage(cvSize(320,240), IPL_DEPTH_8U, 3);

  cvCvtColor(currframe, dst, CV_BGR2RGB);
  cvReleaseImageHeader(&currframe); // releases the header, not the SDK's buffer
  return dst;
}

We call it from output_gtk_stage_transform in Video/video_stage.c:

IplImage *img = ipl_image_from_data((uint8_t*)in->buffers[0]);

Vertical camera handling

As detailed in a previous article, the images captured with the vertical camera have a lower resolution than those of the horizontal camera. The transmitted data has the same size in both cases, but with empty pixels. I updated ipl_image_from_data accordingly:

IplImage *ipl_image_from_data(uint8_t* data, int reduced_image)
{
  IplImage *currframe;
  IplImage *dst;

  if (!reduced_image)
  {
    currframe = cvCreateImageHeader(cvSize(320,240), IPL_DEPTH_8U, 3);
    dst = cvCreateImage(cvSize(320,240), IPL_DEPTH_8U, 3);
  }
  else
  {
    // The vertical camera's 176x144 image sits in a 320-pixel-wide buffer
    currframe = cvCreateImageHeader(cvSize(176,144), IPL_DEPTH_8U, 3);
    dst = cvCreateImage(cvSize(176,144), IPL_DEPTH_8U, 3);
  }

  cvSetData(currframe, data, 320 * 3); // each row starts 320*3 bytes after the previous one
  cvCvtColor(currframe, dst, CV_BGR2RGB);
  cvReleaseImageHeader(&currframe); // releases the header, not the SDK's buffer
  return dst;
}

The trick is the same as detailed in the previous article: we indicate that each new row starts every 320*3 bytes, while only 176*3 bytes per row are actually used.

Converting OpenCV images to GdkPixbuf

If you’re using a GTK interface as detailed in previous articles, you may want to display the OpenCV image inside your GTK window. To do this, I use the following function to create a GdkPixbuf structure that can be displayed by GTK:

GdkPixbuf* pixbuf_from_opencv(IplImage *img, int resize)
{
  IplImage* converted = cvCreateImage(cvSize(img->width, img->height), IPL_DEPTH_8U, 3);
  cvCvtColor(img, converted, CV_BGR2RGB);

  // Note: gdk_pixbuf_new_from_data does not copy the pixels, so
  // converted must stay alive as long as the pixbuf is in use.
  GdkPixbuf* res = gdk_pixbuf_new_from_data(converted->imageData,
                                           GDK_COLORSPACE_RGB,
                                           FALSE,
                                           8,
                                           converted->width,
                                           converted->height,
                                           converted->widthStep,
                                           NULL,
                                           NULL);
  if (resize)
    {
      GdkPixbuf *scaled = gdk_pixbuf_scale_simple(res, 320, 240, GDK_INTERP_BILINEAR);
      g_object_unref(res); // avoid leaking the unscaled pixbuf
      res = scaled;
    }

  return res;
}

Switch camera with the AR.Drone SDK

The AR.Drone has two cameras, but we can only use one at a time. Therefore, we may need to change which camera’s images are sent by the drone.

The different channels

The different channels are declared in ARDroneLib/Soft/Common/ardrone_api.h with the following enum:

enum ZAP_VIDEO_CHANNEL {
  ZAP_CHANNEL_FIRST = 0,
  ZAP_CHANNEL_HORI = ZAP_CHANNEL_FIRST,
  ZAP_CHANNEL_VERT,
  ZAP_CHANNEL_LARGE_HORI_SMALL_VERT,
  ZAP_CHANNEL_LARGE_VERT_SMALL_HORI,
  ZAP_CHANNEL_LAST = ZAP_CHANNEL_LARGE_VERT_SMALL_HORI,
  ZAP_CHANNEL_NEXT
};

There are 4 main values to select the video stream:

  • ZAP_CHANNEL_HORI: the horizontal (frontal) camera
  • ZAP_CHANNEL_VERT: the vertical camera
  • ZAP_CHANNEL_LARGE_HORI_SMALL_VERT: the frontal camera, with the vertical camera inset
  • ZAP_CHANNEL_LARGE_VERT_SMALL_HORI: the vertical camera, with the frontal camera inset

Changing the camera

We use the following code to select the camera:

ZAP_VIDEO_CHANNEL channel = /* some channel, ex: ZAP_CHANNEL_LARGE_HORI_SMALL_VERT*/;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (video_channel, &channel, cb);

The cb parameter

cb is a pointer to a function called when the command is executed. It is defined in the Developer’s Guide:

The callback function type is void (*callBack)(unsigned int success). The configuration tool will call the callback function after any attempt to set the configuration, with zero as the parameter in case of failure, and one in case of success. In case of failure, the tool will automatically retry after an amount of time.

Handling the different resolutions

The vertical camera has a smaller resolution: its width and height are only half those of the horizontal (frontal) camera. Nevertheless, the data transmitted by the drone has the same size. Therefore, if you display the image directly, only a quarter of it will be the camera’s image, and the rest will be green pixels. To create a Pixbuf structure containing only the valid pixels, use the following code:

buf = gdk_pixbuf_new_from_data(pixbuf_data,
                               GDK_COLORSPACE_RGB,
                               FALSE,
                               8,
                               176,
                               144,
                               320 * 3,
                               NULL,
                               NULL);

The trick is to give as parameter the correct reduced width of 176 pixels, but to indicate that each new row starts 320 * 3 bytes after the previous one, so the unused pixels are skipped. The height parameter is also reduced to 144, so the bottom of the image, filled with green pixels, is ignored.

Next, you can enlarge the image to have the same size as the horizontal camera:

pixbuf = gdk_pixbuf_scale_simple (pixbuf,
                                  320,
                                  240,
                                  GDK_INTERP_BILINEAR);

Create an AR.Drone graphical application

This article is a tutorial to learn the basics of the AR.Drone SDK.

It shows the steps to create a minimalist graphical application, with the following features:

  • Display of the drone’s video
  • Buttons to take off / land

If you’re not familiar with the AR.Drone SDK, you should read first my previous Introduction to the AR.Drone SDK.

The AR.Drone Tool

The AR.Drone Tool is the easiest way to create an AR.Drone application on the Linux platform. Parrot provides a basic project skeleton, including all the code needed to initialize the system and gather the video stream and the other kinds of information transmitted by the drone.

Those files are located in the directory Examples/Linux/sdk_demo/Sources. The files are the following:

  • ardrone_testing_tool.[ch]: the initialization code.
  • Navdata/navdata.[ch]: the navigation data (gyroscope and altimeter information, battery level, etc.)
  • Video/video_stage.[ch]: handling of the video stream.
  • UI/gamepad.[ch]: handling of the joystick.
  • UI/ui.[ch]: unknown! (almost empty file).

To create an application, the main task is to customize several functions:

  • ardrone_testing_tool.c: ardrone_tool_init_custom() is called at the initialization of the application, and ardrone_tool_shutdown_custom() when leaving.
  • Navdata/navdata.c: demo_navdata_client_process() is called every time the drone transmits navigation data.
  • Video/video_stage.c: output_gtk_stage_transform() is called every time an image acquired by the drone is received and decoded.

Step 1: Creating the user interface

In this article, I will use GTK. The build system provided by Parrot is designed to use GTK, and the required flags are already present.

In this first step, we will simply create a basic interface with three main widgets:

  • An image widget to display the camera’s video
  • A button to take off
  • A button to land

I created two new files: UI/gui.h and UI/gui.c.

#ifndef GUI_H_
# define GUI_H_

# include <gtk/gtk.h>
typedef struct gui
{
  GtkWidget *window;
  GtkWidget *start;
  GtkWidget *stop;
  GtkWidget *box;
  GtkWidget *cam;
} gui_t;

gui_t *get_gui();

void init_gui(int argc, char **argv);

#endif
And the implementation file, UI/gui.c:

#include <stdlib.h>
#include "gui.h"

gui_t *gui = NULL;

gui_t *get_gui()
{
  return gui;
}

/* If the drone is landed, only start is clickable,
   if the drone is in the air, only stop is clickable
*/
static void toggleButtonsState(void)
{
  gboolean start_state = gtk_widget_get_sensitive(gui->start);

  gtk_widget_set_sensitive(gui->start, !start_state);
  gtk_widget_set_sensitive(gui->stop, start_state);
}

static void buttons_callback( GtkWidget *widget,
			      gpointer   data )
{
    // FIXME: make the drone start
}

static void on_destroy(GtkWidget *widget, gpointer data)
{
  vp_os_free(gui);
  gtk_main_quit();
}

void init_gui(int argc, char **argv)
{
  gui = vp_os_malloc(sizeof (gui_t));

  g_thread_init(NULL);
  gdk_threads_init();
  gtk_init(&argc, &argv);

  gui->window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
  g_signal_connect(G_OBJECT(gui->window),
		   "destroy",
		   G_CALLBACK(on_destroy),
		   NULL);
  gui->box = gtk_vbox_new(FALSE, 10);
  gtk_container_add(GTK_CONTAINER(gui->window),
		    gui->box);
  gui->cam = gtk_image_new();
  gtk_box_pack_start(GTK_BOX(gui->box), gui->cam, FALSE, TRUE, 0);

  gui->start = gtk_button_new_with_label("Start");
  g_signal_connect (gui->start, "clicked",
		      G_CALLBACK (buttons_callback), NULL);
  gui->stop = gtk_button_new_with_label("Stop");
  g_signal_connect (gui->stop, "clicked",
		      G_CALLBACK (buttons_callback), NULL);
	gtk_widget_set_sensitive(gui->start, TRUE);
  gtk_widget_set_sensitive(gui->stop, FALSE);

  gtk_box_pack_start(GTK_BOX(gui->box), gui->start, TRUE, TRUE, 0);
  gtk_box_pack_start(GTK_BOX(gui->box), gui->stop, TRUE, TRUE, 0);

  gtk_widget_show_all(gui->window);
}

You may have noticed that I am using vp_os_malloc() and vp_os_free() instead of the usual standard library functions. This is required by the SDK. If you don’t, you will get error messages like this one:

In function `init_gui’:
gui.c:(.text+0x21): undefined reference to `please_use_vp_os_malloc’

We need to update the Makefile to compile our file with the rest of the SDK. Edit sdk_demo/Build/Makefile and find these lines:

GENERIC_BINARIES_COMMON_SOURCE_FILES+=			\
   UI/ui.c  \
   UI/gamepad.c \
   Navdata/navdata.c    \
   Video/video_stage.c

Add a reference to our file, so that the lines now look like this:

GENERIC_BINARIES_COMMON_SOURCE_FILES+=			\
   UI/ui.c  \
   UI/gui.c \
   UI/gamepad.c \
   Navdata/navdata.c    \
   Video/video_stage.c

Step 2: creating the GUI thread

Now that we have the code to create the GUI, we need to call it. Parrot’s SDK includes a way to create threads with macros. We will use it to create a thread dedicated to our user interface. This work is done in ardrone_testing_tool.c.

The first modification is to include the GUI header. Add the following include:

#include "UI/gui.h"

Then we have to define our thread function, using specific macros. After the include, add this code:

DEFINE_THREAD_ROUTINE(gui, data) /* gui is the routine's name */
{
  gdk_threads_enter();
  gtk_main();
  gdk_threads_leave();
}

Then, we need to customize the ardrone_tool_init_custom function. We add the following lines:

  init_gui(argc, argv); /* Creating the GUI */
  START_THREAD(gui, NULL); /* Starting the GUI thread */

We also customize the ardrone_tool_shutdown_custom function. We need to add:

  JOIN_THREAD(gui);

Finally, we have to add our thread in the Thread Table.
We need to add

THREAD_TABLE_ENTRY(gui, 20)

in the following block:

BEGIN_THREAD_TABLE
  THREAD_TABLE_ENTRY( ardrone_control, 20 )
  THREAD_TABLE_ENTRY( navdata_update, 20 )
  THREAD_TABLE_ENTRY( video_stage, 20 )
END_THREAD_TABLE

More information about the thread management can be found in the Developer’s Guide.

Displaying the camera images

Every time the drone transmits an image to the computer, the output_gtk_stage_transform function in Video/video_stage.c is called.

The function I developed is the following one:

C_RESULT output_gtk_stage_transform( void *cfg, vp_api_io_data_t *in, vp_api_io_data_t *out)
{
  vp_os_mutex_lock(&video_update_lock);
  // Get a reference to the last decoded picture
  pixbuf_data      = (uint8_t*)in->buffers[0];
  vp_os_mutex_unlock(&video_update_lock);

  gdk_threads_enter();
  // GdkPixbuf structure to store the displayed picture
  static GdkPixbuf *pixbuf = NULL;

  if(pixbuf!=NULL)
    {
      g_object_unref(pixbuf);
      pixbuf=NULL;
    }

  // Creating the GdkPixbuf from the transmitted data
  pixbuf = gdk_pixbuf_new_from_data(pixbuf_data,
                                    GDK_COLORSPACE_RGB,
                                    FALSE,   // No alpha channel
                                    8,       // 8 bits per pixel
                                    320,     // Image width
                                    288,     // Image height
                                    320 * 3, // New row every 320*3 bytes (3 bytes per pixel)
                                    NULL,    // Function pointers
                                    NULL);

  gui_t *gui = get_gui();
  if (gui && gui->cam) // Displaying the image
    gtk_image_set_from_pixbuf(GTK_IMAGE(gui->cam), pixbuf);
  gdk_threads_leave();

  return (SUCCESS);
}

Drone’s take off and landing

The two buttons in the graphical interface are currently bound to empty functions. We need to go back to UI/gui.c and edit the buttons_callback function. We use the following code:

static void buttons_callback(GtkWidget *widget,
                             gpointer   data )
{
    static int value = 1;
    ardrone_tool_set_ui_pad_start(value);
    if (value)
      g_print("Taking off");
    else
      g_print("Landing");
    value = (value + 1) % 2;
    toggleButtonsState(); // We want only one button to be clickable
}

We need to include the file containing the ardrone_tool_set_ui_pad_start prototype:

#include <ardrone_tool/UI/ardrone_input.h>

ardrone_tool_set_ui_pad_start(value) is the AR.Drone function used to make the drone take off or land: if value is 1, the drone takes off; if value is 0, it lands.

Now that all these modifications have been made, we have an application with buttons to take off and land, able to display the drone’s video.

Introduction to the AR.Drone SDK

What is the AR.Drone ?

The AR.Drone is a quadricopter created by Parrot. It can be purchased in stores like Amazon.com at a price of around 300 dollars. It is mainly used as a flying video game, and several games have been released using augmented reality.

The drone has two cameras: one frontal and one vertical. The technical specifications can be found here.
The drone can be controlled using any Wi-Fi device. When the drone is turned on, it automatically creates an ad-hoc Wi-Fi network. The controlling device connects to this network and communicates with the drone. Parrot developed an application for the iPhone and Android phones, named AR.FreeFlight. More applications created by other developers can be found on the App Store, etc.

The SDK

Parrot released an SDK to help developers create innovative applications using the drone. The SDK is available for iOS, Android, Linux and Windows. The Linux and Windows SDKs use the C programming language. In this article, I will focus on the Linux SDK. The specification of the communication protocol used by the drone is also available; therefore, it is possible to create a new SDK instead of using Parrot’s.

The SDK and the Developer’s Guide can be found on the AR.Drone Open API Platform. In this article, I am using version 1.7. Several ways exist to create an application using the AR.Drone. In the Developer’s Guide, Parrot recommends using the AR.Drone Tool, a framework designed to make creating applications easy.

SDK structure

The root of the archive contains several files and directories:

  • ARDroneAPI.dox: doxygen file, used to generate the documentation.
  • ARDroneLib: AR.Drone library (communication with the drone, video codecs etc.)
  • ControlEngine: files specific to the iPhone.
  • Docs: folder where the documentation is generated
  • Examples: folder containing demonstration code for each platform and the ARDrone Tool.

The archive contains more files than we need. We can remove the ControlEngine directory, and all the directories in Examples except the Linux one.

Building the examples

The examples are useful for several reasons:

  • To check that the system has all the needed libraries.
  • To check that the computer can communicate with the drone

The first step is to install the missing packages. If you’re using Ubuntu, a script was developed to install them automatically: ARDroneLib/Soft/Build/check_dependencies.sh must be executed with root privileges. OK is displayed when all the packages are installed.
When all the packages are installed, we are ready to compile the libraries and the examples:

cd Examples/Linux
make

If you get the following error message:

In file included from ..//VP_Com/linux/vp_com_serial.c:33:
..//VP_Com/vp_com_serial.h:22: error: expected ‘)’ before ‘*’ token

you need to edit the file ARDroneLib/Soft/Build/custom.makefile and change the line

USE_LINUX=no

to

USE_LINUX=yes

We need to connect to the Wi-Fi network created by the drone when it is turned on. Its name is ardrone_XXXXX, with XXXXX the ID of the drone.
By default, only two IP addresses are used:

  • 192.168.1.1 is the drone.
  • 192.168.1.2 is the device that communicates with it.

Therefore, when we are connected to the Wi-Fi network, we need to set our IP address. We can use the following command:

sudo ifconfig wlan0 192.168.1.2 netmask 255.255.255.0

If the computer address is not 192.168.1.2, the drone won’t be able to communicate with the computer.
Several examples have been compiled. The most useful one is Build/Release/ardrone_navigation. This graphical application is able to display all the information transmitted by the drone:

  • Camera’s images
  • Battery status
  • Gyroscopic data

The application is able to control the drone’s movement using a joystick. Refer to the Developer’s Guide for more information.

Common troubleshooting

Error:

In file included from ..//VP_Com/linux/vp_com_serial.c:33:
..//VP_Com/vp_com_serial.h:22: error: expected ‘)’ before ‘*’ token

Solution:
Edit the file ARDroneLib/Soft/Build/custom.makefile and change the line

USE_LINUX=no

to

USE_LINUX=yes

Error:

undefined reference to symbol ‘some_gtk_function’

Solution:

Some flags are probably missing in the Makefile. A way to fix it is to edit the file Examples/Linux/sdk_demo/Build/Makefile. To add the correct flags, change the following line:

GENERIC_LIBS=-lpc_ardrone -lgtk-x11-2.0 -lrt

to

GENERIC_LIBS=-lpc_ardrone -lgtk-x11-2.0 -lrt `pkg-config --libs gtk+-2.0`

Error:

Timeout when reading navdatas – resending a navdata request on port 5554

Solution:
One possibility is that your computer’s IP is not 192.168.1.2. You can set it with the following command line:

sudo ifconfig wlan0 192.168.1.2 netmask 255.255.255.0

Error:
The navigation data always contains zeros.
Solution:
This is an error I had with VMWare. I changed my IP (in this case, the eth0 IP) to 192.168.1.3 and since then, I have not had this error again. I use bridged mode to connect to the drone via AirPort.
Error:
The check_dependencies.sh file is not installing the packages.
Solution:
Several problems can exist:

  • You are not using Ubuntu, and therefore the script cannot work. You need to manually install the required packages.
  • You are not running the script with root privileges. Prefix your command line with sudo.
  • The file temporary_file, located in the same folder, exists and indicates to the script that all the packages are installed. This can happen if you got the archive from another computer instead of the official SDK. You simply need to remove this file.