I studied hard for a year and now I have an AI job building robots

Written by TomPJacobs | Published 2017/12/14


Jobless, aimless, eating noodles for dinner. How did I get here?

I’ve always been a developer: writing code for computers to eat up and spit out as useful things for people to use.

But over the last few years, my passion for software development, that is, sitting down for eight hours a day and writing code, has diminished a little. I’ve always greatly enjoyed it: ever since I wrote a game on my Commodore 64 as a kid, I could see that if you could write code, you could make anything happen in the world of the screen.

So while it’s still fun setting up websites and writing iPhone apps, these things are now well understood, and they’re reaching the limits of what they can do on the screens on our desks and in our pockets. They’re not as fun as they used to be, because they’re no longer expanding what code can make happen in the world.

So in 2015, while working at my job, I started thinking again about the potential code has to transform what exists in the world. Starting out as just pages on screens, the web, apps, and phones have reached out into the real world in small ways and transformed how a lot of people spend their time and make their living. This was good progress from the pages of the early web, but exciting new development in this area seems to be hitting the tail end of its S curve.

In other words, code has reached out as far as it can into the real world from its screen and browser world.

That’s what I thought, until I read about a small group of dudes in Melbourne who were racing drones they had built themselves from parts. I remember reading the article late on a Friday night at the office, where I was just sitting and updating some website files, and thinking: I want to go to there.

Code flies free.

This is where code has the chance to break free of the screen, and into the real world.

So over the next two years, 2016 and 2017, I became obsessed with building and flying drones: the idea that you could buy a few $10 motors, bolt them onto a small frame, and, with the magic of some code to balance and control the craft in the air, create something that let you become a bird, opening up a perspective on the world that humans had never had before. With the help of local maker spaces, and seeing the never-ending mischief being reported alongside some genuinely promising and useful applications, it seemed clear that something was happening here: a chance for code to interact with the physical world in a way it hadn’t been able to before, expanding the environment it runs in from the screens to the skies.
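
That “magic of some code” is mostly a handful of feedback loops. Here is a minimal, hypothetical sketch of the PID control idea a flight controller uses to keep a craft level; the class, gains, and function names are made up for illustration, not taken from any real flight controller firmware.

```python
# A rough sketch of the control loop that keeps a drone level: one PID
# controller per axis, nudging the measured angle toward a setpoint.
# Gains and names are illustrative, not flight-ready.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One controller per axis; the gains would be tuned on the real airframe.
roll_pid = PID(kp=1.2, ki=0.05, kd=0.3)

def stabilise_roll(target_roll_deg, imu_roll_deg, dt=0.002):
    """Return a motor correction that pushes the measured roll toward the target."""
    return roll_pid.update(target_roll_deg, imu_roll_deg, dt)
```

Run fast enough (hundreds of times a second, against an IMU reading), that small loop is the difference between a pile of motors and something that hovers.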

At the same time, I kept hearing about a crazy fringe group of developers talking up things with funny names like Torch, Caffe, CNNs, and AlexNet. They were quite keen on these technologies as very effective solutions to problems I had studied way back in the AI course I took at university, where we solved Wumpus World, which didn’t seem like the most pressing issue in the world at the time and, honestly, still doesn’t. This was of course well before TensorFlow, Keras, ResNet, and YOLO (both the cultural idea to live your life by, and the better-than-human object recognition neural network). Okay, computers are now better than humans (remember, that’s us) at finding where Wally is in an image. That’s interesting, somewhat scary, but still: how does that change things?

Dining room. Understood.

With these two pieces coming together at the same time, a new S curve is ready to begin. Physical actuation meets physical scene perception: a whole new, bigger screen for code to run on, the human world (and beyond).

So I decided to spend the entire year of 2017 not working, but studying self-driving cars, robotics, computer vision, and machine learning.

(During this time I also heard an awful lot about Bitcoin and VR, but these haven’t really helped people much, other than a few fun games and reallocating dollars to and from a few lucky people.)

At the end of 2016, I happened to be hanging out at the Starbucks near the Google campus when I saw on Twitter that George Hotz was speaking at a nearby event about self-driving cars. So I walked over and found it was a welcome event for the first term of students enrolled in the Udacity Self Driving Car Nanodegree program. Lots of people asked me how I was enjoying the program so far; I had looked at it before and it seemed interesting, so I decided to enrol.

Completing the three-term, year-long Udacity Self Driving Car Nanodegree was pretty eye-opening, and a great way to start enjoying coding again. Three terms and many projects later, including learning how to control steering and speed, use simulators, process images, detect lane lines, and make sense of sensor data, I’m onto the last group project, where we run our code on a real car in California and hope it obeys traffic lights and doesn’t hit anything.
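
To give a taste of what those projects involved, here is a rough, hypothetical sketch of the classic lane-line pipeline from early in the course: greyscale, edge detection, a region-of-interest mask, then a Hough transform to pull out line segments. It is a simplification of the general approach, not the actual project code.

```python
import cv2
import numpy as np

def find_lane_lines(bgr_frame):
    """Very rough lane-line pipeline: edges -> region of interest -> Hough lines."""
    gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)

    # Keep only a trapezoid roughly covering the road ahead of the car.
    h, w = edges.shape
    roi = np.zeros_like(edges)
    polygon = np.array([[(0, h), (w // 2 - 50, h // 2 + 50),
                         (w // 2 + 50, h // 2 + 50), (w, h)]], dtype=np.int32)
    cv2.fillPoly(roi, polygon, 255)
    masked = cv2.bitwise_and(edges, roi)

    # Probabilistic Hough transform returns candidate line segments.
    lines = cv2.HoughLinesP(masked, rho=2, theta=np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=100)
    return lines  # each entry is [x1, y1, x2, y2]
```

The later projects replace hand-tuned pipelines like this with learned models, but the shape of the problem stays the same: pixels in, driving decisions out.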

The Udacity projects were the only clearly defined checklist item to complete for the year, but of course there was a lot more to do than that. So for 2017 I sat in an apartment in Melbourne city on most days of the year, starting in January with an empty apartment and a laptop; by December my robot lab ended up looking something like this:

At the same time, watching the DIYRobocars movement grow in the U.S., I started writing self-driving car code and helped organise Melbourne’s own self-driving RC car events, getting people together to write software to have rovers and RC cars navigate tracks by themselves, and definitely not just crash into each other all the time:

The slow and steady rover on the left that gets wiped out at the end is the only autonomous vehicle in this video; the rest are old-fashioned human-driven entries in this first event. Clearly expert drivers all round.

By the end of 2017, I had explored a lot of different ways of getting code to explore its new environment, with cameras, lidars, depth cameras, motors, arms, hands, wheels, legs, and even microphones.

But it did start to feel like I was just playing with some toys for a year. In October I was jobless, aimless, and eating noodles for dinner. How did I get here?

Most good things have to come to an end, especially when the money runs out. So come October, I started looking for areas and companies where I could put my interest in autonomy and real-world-interacting code to the test.

While looking at robots online, I kept seeing the ever more startlingly large and capable Boston Dynamics robots making headlines. Cassie was interesting in its ability, but still quite large, expensive, and delicate looking. There were a lot of hobbyists, a lot of interesting wheeled robot manufacturers, and a lot of neat uses for robotic arms, including serving coffee at CafeX and assembling electronics with Carbon Robotics and ModBot, and even some people taking the idea of a drone and upsizing it to a vehicle with KittyHawk. All very interesting uses of code interacting with the real world to be useful in people’s lives.

But the robot that took my interest the most was Minitaur, by Ghost Robotics.

The silent direct-drive, low-geared motors and simplified leg design of their upcoming-generation robot made it look like they were onto something special, and their approach is flexible enough to eventually be affordable for consumers. It looked great, so I developed my own mini version of it back in July:

I reached out to the CEO to show what I had built and my interest in the company, and was invited to interview. The small team there put together some questions about electrical and mechanical engineering, which was interesting because my background is in computer science, not electrical engineering. I spent a few days putting together answers as best I could, while researching the areas I hadn’t delved into that deeply before.

Along with this was a challenge project, which involved working directly from two Udacity Self Driving Car Nanodegree projects I had completed earlier in the year, so those really came in handy. One project took a camera image stream and steered the car from that image (pretty important for self-driving cars) using a neural network; the challenge extended that to detect obstacles in the road and drive around them as well (you always have to be on the lookout for giant cubes on the highway blocking your path). Combining that with using a neural network to update waypoints on the planned path for a robot, using TensorFlow, Keras, and a Unity simulator, the learnings of the past year all came together.
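
To give a feel for the “camera image in, steering angle out” idea, here is a hypothetical Keras sketch roughly in the spirit of that kind of behavioural-cloning network; the layer sizes follow the well-known NVIDIA end-to-end style, but they are illustrative guesses, not the network from the challenge or the course.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, Dense, Flatten, Lambda

def build_steering_model(input_shape=(66, 200, 3)):
    """Small CNN that maps a camera frame to a single steering angle."""
    model = Sequential([
        Lambda(lambda x: x / 255.0 - 0.5, input_shape=input_shape),  # normalise pixels
        Conv2D(24, (5, 5), strides=(2, 2), activation='relu'),
        Conv2D(36, (5, 5), strides=(2, 2), activation='relu'),
        Conv2D(48, (5, 5), strides=(2, 2), activation='relu'),
        Conv2D(64, (3, 3), activation='relu'),
        Flatten(),
        Dense(100, activation='relu'),
        Dense(50, activation='relu'),
        Dense(1),  # predicted steering angle
    ])
    model.compile(optimizer='adam', loss='mse')
    return model

# Training pairs recorded camera frames with the human steering angles logged
# in the simulator, then fits the network to imitate that driving, e.g.:
# model.fit(frames, steering_angles, epochs=5, validation_split=0.2)
```

Swap the single steering output for obstacle detections or waypoint updates and you have the rough shape of the challenge: the same perception front end feeding a different decision at the back.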

I flew out to Philadelphia to meet them and see their office. After hanging out for a few days and seeing how they work, it seemed like a great fit, and they gave me a job offer. Pretty damn great.

So now I’ve flown back to Melbourne, and it’s time to apply for my work visa, which usually takes about a month to get, and then fly back out in January to start putting together robots that walk, and use code to move about in the real world.

I can’t wait to see where the legged robots will be deployed first, how people will interact with them, and how eventually they’ll start being used in places to improve people’s everyday lives, by co-existing with them in the real physical world, rather than just being software bots trapped behind a screen.

Welcome to the real world.

They say that software is eating the world; I think it hasn’t even started yet. I think you’ll see more cases of code intuiting human intentions and assisting in areas that we never thought it could reach. We’re starting to see it with the Echo and Google Now; just imagine if your Google Home could move around your house like a Roomba.

Now that’s an amazing thing that code can do.

