September 29, 2010

In my junior year of high school, I remember watching a television documentary about the 2005 DARPA Grand Challenge, an unmanned vehicle race across the desert in which robots built by universities and private companies competed for a $2 million prize. The documentary was, of course, sensationalized. The whole thing had the tone of one of those “rise of the underdog” sports films, with Carnegie Mellon playing the crowd favorite and Stanford the underdog, winning out in the end against all odds. I was completely captivated by the race and immediately decided that I wanted a career in robotics (even though I knew nothing about robots), so I applied to both CMU and Stanford and ended up attending the former. Since then, I’ve learned quite a lot about robots.

In the summer of 2010, before my junior year at CMU, I took up an internship at a small robotics company in Pittsburgh called RE2, Inc. The only thing I had heard about RE2 at the time was their interchangeable robotic end effectors for IED-disposal robots. I had no idea exactly what sort of project I would be working on, only that it involved “a manipulation program for the government,” as was hinted at in my opening interview. At that interview, I was shown a stock Barrett arm with a three-fingered hand sitting on a table next to a BumbleBee2 stereoscopic camera in a small office in the back of RE2. Apparently, at the time of that initial interview in the previous winter, this was all that the project was envisioned to be.

When I came in for my first day of work, I was greeted by the seven-foot-tall mechanical marvel that this initial prototype had become. Previously unbeknownst to me, I had stumbled into the final stages of development leading up to the DARPA ARM program, a four-year DARPA robotics program in which six teams from academia and industry are funded to solve some of the biggest challenges in robotics using a humanoid, two-armed robot built by RE2, Inc. under DARPA funding.

Teams will be asked to autonomously pick up and manipulate a number of arbitrary objects, from tools to toys, without any human assistance. The intention is that by the end of the program, the teams will have developed algorithms robust enough to handle autonomous robotic manipulation on the battlefield.

The robot had two Barrett WAM arms, each with seven degrees of freedom; two three-fingered Barrett hands with pressure and torque sensors; a BumbleBee2 stereo camera; a Prosilica high-resolution camera; an SR4000 Swiss Ranger infrared camera; and a four-degree-of-freedom neck. It was an amazingly capable, absurdly expensive robot. In the words of Dr. Robert Mandelbaum, the former program manager for the DARPA ARM program, the government is going to give teams “more than they need” in terms of sensors and manipulation capability.

My task, as an intern, was to develop a simple routine, utilizing all of the robot’s capabilities, that it could perform at its unveiling and at the AUVSI conference in Denver that year. In partnership with another intern, Michael Brooks, I programmed the robot to complete this task with only high-level supervision.

I cannot stress how shocking this was to me at the time. I had never even seen, let alone programmed, a real humanoid robot in my life – and here I was being told that I had practically free rein over one of the most advanced robots available!

The robot’s routine would be absurdly simple for a human: with the left hand, pick up a block bearing a black-and-white 3×3 geometric pattern. Then, with the right hand, arrange the blocks on the table to reproduce the pattern you saw, using a minimal number of steps. Give this task to a five-year-old child, and they will learn and perform it almost immediately. Simple, right?
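Looking back, the pattern-matching half of the task can be sketched in a few lines of Python. The 0/1 grid representation and the helper below are purely my own illustration, not the code we actually wrote for the robot:

```python
# Hypothetical sketch: represent each 3x3 block pattern as a grid of
# 0 (black) / 1 (white) cells and find which cells the right hand must
# change to reproduce the observed pattern -- each differing cell is at
# least one placement step.

def cells_to_change(observed, current):
    """Return the grid cells that differ between two 3x3 patterns."""
    return [(r, c)
            for r in range(3)
            for c in range(3)
            if observed[r][c] != current[r][c]]

observed = ((1, 0, 1),
            (0, 1, 0),
            (1, 0, 1))
current  = ((1, 1, 1),
            (0, 1, 0),
            (0, 0, 1))

steps = cells_to_change(observed, current)
# -> [(0, 1), (2, 0)]: two cells differ, so at least two steps are needed
```

Of course, the trivial part is deciding *what* to change; the hard part, as described below, is seeing the pattern and moving the blocks at all.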

However, this task is incredibly, unbelievably hard for a robot to accomplish without very complex algorithms, accurate sensors, and quite a few heuristic assumptions about the environment. Locating blocks is a difficult procedure of statistically analyzing the differences between the images from the right and left eyes. Understanding the pattern on a block is a challenge of statistically fitting rectangles to noisy data sets. Picking up the blocks is a mathematically challenging ballet of inverting matrices and intelligently selecting among possibilities – and all of it had to be done quickly, accurately, and reliably enough not to damage the robot or crash its software.
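To give a flavor of the left-eye/right-eye analysis: the classic textbook approach is block matching, where for each patch in the left image you search along the same row of the right image for the best match, and the horizontal offset (the disparity) tells you depth. This toy version (the window size, search range, and SAD cost are my own illustrative choices, not our actual pipeline) ignores rectification, occlusion, and all the statistical filtering a real system needs:

```python
# Toy stereo block matching (illustrative only): for a pixel in the left
# image, slide a window along the same row of the right image and pick
# the shift (disparity) with the smallest sum of absolute differences.
# Depth is then proportional to 1/disparity.

def sad(left, right, row, col, d, w):
    """Sum of absolute differences between a (2w+1)^2 window in the
    left image and the same window shifted d pixels in the right."""
    total = 0
    for r in range(row - w, row + w + 1):
        for c in range(col - w, col + w + 1):
            total += abs(left[r][c] - right[r][c - d])
    return total

def disparity(left, right, row, col, max_d=4, w=1):
    """Disparity with the lowest matching cost at (row, col)."""
    return min(range(max_d + 1),
               key=lambda d: sad(left, right, row, col, d, w))

# Synthetic example: a bright "block" that appears shifted 2 pixels
# to the left in the right image.
rows, cols, true_d = 5, 12, 2
left = [[0] * cols for _ in range(rows)]
for r in (1, 2, 3):
    for c in (6, 7):
        left[r][c] = 9
right = [[left[r][c + true_d] if c + true_d < cols else 0
          for c in range(cols)] for r in range(rows)]

d = disparity(left, right, row=2, col=6)   # recovers d == 2
```

Multiply this by millions of pixels, lens distortion, and specular highlights, and you get a sense of why “locating blocks” was a project in itself.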

Now take that problem, multiply it by a few orders of magnitude in difficulty, and you will have some understanding of the problem the DARPA ARM-S software program is attempting to solve. These teams are not being asked to pick up simple yellow wooden blocks on blue tables, but hammers, drills, duffel bags, or anything else the government decides to throw at them!

Nevertheless, developing the routine took up the five most exciting weeks of my career thus far. It stretched all of my math, robotics, and programming skills to the extreme. Computer vision, robotic manipulation, computational statistics, and AI all came together for this project in a way that was at times quite difficult to keep track of. In every class covering these algorithms, I had only fully understood the special, controlled, easy cases. In computer vision, for example, all of my stereo-processing experience had been on laboratory-controlled data sets with no distortion or specularity. Here, however, I was dealing with imperfect cameras, noisy data sets, and imperfect lighting. In manipulation and controls, I had only studied simple two- to four-degree-of-freedom arms with simple grippers. Here, I was dealing with an enormously complex seven-degree-of-freedom arm with conformal hands and sub-milliradian encoder accuracy.
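The “inverting matrices” in manipulation refers to inverse kinematics: mapping a desired hand motion back to joint motions through the arm’s Jacobian. A standard textbook illustration (not necessarily the method we used on the WAM, whose seven joints make the Jacobian non-square and the problem much harder) is Newton iteration on a planar two-link arm, where the 2×2 Jacobian can be inverted directly:

```python
import math

# Minimal illustration of Jacobian-based inverse kinematics on a planar
# two-link arm -- far simpler than a 7-DOF WAM, but the same principle.
# One Newton step: dq = J^(-1) * (target - current position).

L1, L2 = 1.0, 1.0   # link lengths (illustrative)

def forward(q1, q2):
    """End-effector (x, y) for joint angles q1, q2."""
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return x, y

def jacobian(q1, q2):
    """2x2 Jacobian d(x, y)/d(q1, q2)."""
    s1, c1 = math.sin(q1), math.cos(q1)
    s12, c12 = math.sin(q1 + q2), math.cos(q1 + q2)
    return [[-L1 * s1 - L2 * s12, -L2 * s12],
            [ L1 * c1 + L2 * c12,  L2 * c12]]

def ik_step(q1, q2, target):
    """One Newton iteration toward the target via the inverted Jacobian."""
    x, y = forward(q1, q2)
    dx, dy = target[0] - x, target[1] - y
    (a, b), (c, d) = jacobian(q1, q2)
    det = a * d - b * c          # zero when the arm is fully extended (singular)
    dq1 = ( d * dx - b * dy) / det
    dq2 = (-c * dx + a * dy) / det
    return q1 + dq1, q2 + dq2

# Iterating the step converges to joint angles that reach the target.
q1, q2 = 0.3, 0.8
for _ in range(20):
    q1, q2 = ik_step(q1, q2, target=(1.2, 0.8))
```

With seven joints there are infinitely many solutions for each hand pose, which is exactly where the “intelligent selection of possibilities” comes in.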

Since our software ran on the ROS framework, I also had to learn quite a bit about distributed systems, and about software development in a real company. I learned what it means to do code reviews, adhere to documentation standards, test my software methodically, and communicate with a team of mixed software, electrical, and mechanical engineers. All in all, I learned more about robotics in this internship than I have in years of classes. I’m incredibly grateful for being involved in this project at all, let alone being challenged to program the robot myself with complete decision-making authority. It was a privilege to work on this project, nothing less. I never thought that I would end up so closely involved in one of the challenges that had driven me to study robotics in the first place.

Matthew Klingensmith
RE2 Software Engineering Intern, 2010
Carnegie Mellon, School of Computer Science, Class of 2012