
Student project harnesses brain power to fly AR Drone

Before graduating from their Development Accelerator, students complete and present a final project. Robert, a JavaScript student, brought in an idea that involved an AR Drone 2.0, Mindflex Duel headsets, and an Internet connection. He teamed up with two other students and a TA to get his project—quite literally—off the ground.

This is a project that you’ve been working on for a while, correct?

Yes, that’s right. I started thinking about integrating brain control into a dance performance in 2010. The original idea was to have a group of dancers collectively control a robotic ball with their EEG signals. However, the problem with a ball was that it was tied to the ground, so much of the action would be hidden from the audience. If only the ball could fly!

In 2012, when I learned that the (relatively) affordable AR Drone 2.0 was programmable with Node.js, I got to work: I hacked Mindflex Duel headsets and bought a refurbished AR Drone 2.0 and a book about Node. With the occasional but essential help of a few hardware and software experts, I had a three-headset prototype flying by spring of 2013, which I demoed at Robothon 2013 and Seattle Mini Maker Faire 2014.

That's where the project stood at the beginning of Code Fellows: it only worked on a local machine. The goal of the final project was to get it working online, so that instead of controlling the drone from within WiFi range, you could control it from your machine anywhere in the world.

Above: JavaScript instructor Ivan Storck (far left) powers the drone via the Mindflex Duel headset.

How did you come up with the idea?

I really wanted to have telekinesis growing up. Like a lot of kids, I moved around a lot, and something about that superpower, in particular, appealed to me. It gives you direct control over your environment. This project definitely stems from the childlike desire to have superpowers.

Where did you get the parts for the drone?

The drone is an AR Drone 2.0, out of the box. The only modifications that we had to make were on the embedded software.

Can you explain how it works?

Sure! So, the final project works like this: Think hard here, launch a drone over there. Anywhere in the world. It directly links the user's EEG signals to commands sent to a quadcopter over the Internet in real time. We pipe brain data from a hacked Mindflex headset through an Arduino Uno into the user's USB serial port. We then use a Chrome App to initialize a Socket.io connection with the website and stream the data, which shows up in real-time graphs. By default, the drone transmits a WiFi signal that the user connects to; we reconfigured it to connect to a 4G hotspot instead. The website uses the ar-drone node module to create a drone client at the IP address of that router, which allows you to send commands to it. Then, we have logic on the server that links the attention value of the user to various commands to the drone. Its current configuration is this: think hard to launch, think harder to rotate, think like crazy to flip.
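To make that last step concrete, here's a minimal sketch of what the server-side mapping could look like with the ar-drone node module. The IP address and attention thresholds are illustrative stand-ins, not the team's exact values.

```javascript
var arDrone = require('ar-drone');

// Point the client at the drone's address on the 4G hotspot.
// (Illustrative IP; the real one depends on the hotspot's settings.)
var client = arDrone.createClient({ ip: '192.168.1.101' });

var flying = false;

// Called whenever a new attention value (0-100) arrives from a headset.
function handleAttention(attention) {
  if (attention > 90) {
    client.animate('flipAhead', 1000); // think like crazy: flip
  } else if (attention > 70) {
    client.clockwise(0.5);             // think harder: rotate
  } else if (attention > 50 && !flying) {
    client.takeoff();                  // think hard: launch
    flying = true;
  } else {
    client.stop();                     // relax: hover in place
  }
}
```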

Above: the AR Drone 2.0.

What did your team work on during project week?

The team at Code Fellows (Zach Bryan, Anna Louisa Patino West, and Dale Corns, the TA) helped get this to work over the web. Prior to Code Fellows, the project only worked on a local machine, so we had to flip it inside-out to achieve this. Zach set up an EC2 server to coordinate the inputs and outputs, and the whole team helped connect the drone and headsets to the web. Dale was especially helpful with the IP configuration needed to connect the server to the drone, and Anna Louisa was essential in configuring Socket.io-client to make our Chrome App stream brain data to the server. Zach also put in a lot of time in the days before the demo, running field tests and getting all of our controls, the video stream, and the real-time graphs on the same page. In the end, we had a web-based, real-time, brain-controlled quadcopter.
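As a rough sketch of how those pieces fit together on the EC2 server, the Socket.io relay could look something like this; the event name, port, and payload shape are hypothetical, and handleAttention is the mapping function from the earlier sketch.

```javascript
// Relay sketch: receive brain data from the Chrome App over Socket.io
// and fan it out to the dashboard graphs and the drone command logic.
var http = require('http');
var server = http.createServer().listen(8080);
var io = require('socket.io')(server);

io.on('connection', function (socket) {
  socket.on('brainwave', function (data) {
    // Rebroadcast the raw values to every connected browser,
    // which is what drives the real-time graphs.
    io.emit('brainwave', data);
    // Hand the attention value to the drone command logic.
    handleAttention(data.attention);
  });
});
```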

What were your biggest challenges?

1) Configuring the drone.

This was our first and biggest challenge; if it didn't work, we wouldn't have a project. While there is no shortage of information about how to get started connecting your drone to a WiFi router, it still presented challenges, and we didn't even know if or how we could connect it to the web. We ran down a few dead ends. For instance, I needlessly cross-compiled Node for ar-drone with Vagrant using Felix Geisendörfer's instructions. It was a learning experience, but it took up time and turned out to be totally unnecessary for what we were trying to do. Dealing with hardware is also time-consuming, and we had to be persistent. Every time we wanted to test a drone configuration, we had to power the drone down, turn it back on, telnet into it, reconfigure it with a shell script, connect it to the 4G hotspot, telnet BACK into the drone and run another shell script, then get on the hotspot ourselves and try out the new settings. Believe me, seeing that first ping response from our website was a huge moment for us.
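For anyone attempting the same thing, a quick reachability check saves a lot of guesswork after each reconfiguration cycle. Here's a minimal Node sketch (with an illustrative IP address) that tests whether the drone's telnet port answers through the hotspot:

```javascript
// Check whether the drone's telnet port (23) is reachable from the server.
var net = require('net');

var socket = net.connect({ host: '192.168.1.101', port: 23, timeout: 3000 });

socket.on('connect', function () {
  console.log('Drone is reachable through the hotspot.');
  socket.end();
});

socket.on('timeout', function () {
  console.log('No response; recheck the drone and hotspot settings.');
  socket.destroy();
});

socket.on('error', function (err) {
  console.log('Connection failed:', err.message);
});
```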

2) Accessing a user’s serial port (Browserifying Socket.io for the Chrome App)

The web, thankfully, is designed to restrict a website's access to a user's peripherals. Who wants random websites controlling all of their home automation? However, when you DO want to access these things, you have to jump through some hoops. After significant research, we found that a Chrome App was the most robust way to handle this. Google provides rich serial and USB APIs that make streaming real-time data to your website, well, possible.
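Here's a rough sketch of what the Chrome App side could look like, assuming the Arduino prints the common Arduino Brain-library CSV format (signal strength, attention, meditation, then eight power bands). The device path, baud rate, server URL, and event name are placeholders, not the team's exact setup.

```javascript
// Chrome App sketch: read headset data from the Arduino's serial port
// and stream it to the server with a Browserified socket.io-client.
var socket = io('http://example-ec2-server.com:8080'); // bundled via Browserify

var buffer = '';

chrome.serial.connect('/dev/ttyUSB0', { bitrate: 9600 }, function (connectionInfo) {
  chrome.serial.onReceive.addListener(function (info) {
    if (info.connectionId !== connectionInfo.connectionId) return;
    // Decode the incoming ArrayBuffer and accumulate until a full line arrives.
    buffer += String.fromCharCode.apply(null, new Uint8Array(info.data));
    var lines = buffer.split('\n');
    buffer = lines.pop(); // keep any partial line for next time
    lines.forEach(function (line) {
      // Brain-library CSV: signal, attention, meditation, 8 power bands.
      var fields = line.trim().split(',');
      socket.emit('brainwave', { attention: Number(fields[1]) });
    });
  });
});
```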

What would you like to do with the project now that the course is over?

Our vision is to allow users to control drones over the web using a checkout system. The user goes to the website, connects their EEG device, selects a drone, and away they go! We'd like to have drones around the globe that users can explore with.

Above: the graduating JavaScript class.

Join the conversation on Reddit.
