Interview: StreetDrone announces open-source autonomous car platform
Self-driving cars are a fascinating problem, a colossal future market and a bit of a glamor topic right now, so there are plenty of people very keen to get into the game, from young minds in universities to established companies that see autonomy as an innovate-or-die lynchpin of their future.
Unless you're a major automotive manufacturer, though, a ton of your startup time and cost is going to be burned buying and modifying a car to test your software on. It can't just be any car. It needs to be fully drivable by wire, including gas, brakes and steering. And it needs to have a bunch of onboard cameras and sensors, as well as the processing power to deal with all that input.
Not to mention, not everyone wants to start teaching a car to self-drive from scratch. Autonomy is such a gigantic and complex problem that there's plenty of hay to be made chewing away at small, digestible chunks of it.
So one UK startup is making it as easy as possible. StreetDrone is selling a fully integrated autonomous car platform, ready to roll, built on the tiny Renault Twizy and pre-loaded with open-source self-driving software, so developers are free to take or leave or modify whichever bits they wish.
"Basically, our car is like the Raspberry Pi of autonomous cars," StreetDrone CEO Mark Preston told us over the phone from his Oxford office. A 20-year veteran of the Formula 1 paddock, having been chief of R&D for the Arrow team and also working for McLaren and Super Aguri, Preston spotted an opportunity in the R&D space as self-driving vehicles start to take their first wobbly steps.
"Our basic idea is that there'll be an R&D market for the next 5-7 years maybe," he told us. "Nobody's sure right now how long it'll be until everything's fully commercialized. It's unclear how the first vehicles will start to go onto the road in a commercial sense. So at the moment we're supplying to that R&D market."
Why the Renault Twizy?
The Renault Twizy is a tiny, single-seat electric oddity, of which Renault has sold somewhere between 20,000 and 30,000 units since its launch in 2012. It drives the rear wheels with a 17-horsepower electric motor, has a maximum range of around 100 km (62 miles), and its footprint on the road is so small you'd barely fit two motorcycles side by side in the same space.
"Funnily enough," says Preston, "the Twizy is actually designed by the guys at Renault Sport. So we knew those guys from racing. And it's designed a bit like a little race car. Four wheels, independent suspension, it's a spaceframe chassis and the bodywork basically clips on … It's not a unibody. So you can modify it quite easily.
"We take the rear bodywork off it and put a different set of bodywork on, to mount LiDARs and cameras and put the big computer brain in the back. It's just a very modifiable vehicle. And it's a very simple car. You don't have to worry about all the CAN bus stuff you'd have to in a bigger car, with all the air conditioning and windscreen wipers, and everything that's controllable by the CAN bus in a modern car. We don't have to worry about all that while we're developing autonomous software. So we've simplified a lot of that with Renault, so that it's way easier to control than let's say, hacking into a Prius or one of the other big SUVs that a lot of the folks you might have seen online are starting with.
"Basically, we take the base vehicle. We add a power steering system and a power braking system, and then our XCU which controls those elements. We tap into the drivetrain and can give the car signals for acceleration, and regeneration through the electric motors. Our drive by wire elements can control the steering and brakes.
"And one of the big things in autonomous driving is to have a failsafe system. If our system doesn't hear from the AI system in a very short timeframe, it assumes the AI's had a problem, and it basically brakes the car to a stop. We're calling it a fail-stop system rather than failsafe, which is really another level where you have double redundancy on all systems like an aircraft. We might get to that point in the future, but at the moment we assume there's a safety driver on board, so the car will stop and the safety driver can take over and decide what's the problem, reboot the system and start again. It's a true R&D vehicle with a number of elements of safety that we're adding."
StreetDrone's basic platform starts at about UK£70,000 (~US$93,000), kitted out with everything a research team needs to get started, as well as custom livery to make sure it's very presentable.
How many sensors are enough for a self-driving car?
One tends to imagine the autonomous cars of the future will be stuffed to the gills with sensor technology, bristling with all the LiDARs and radars and cameras and ultrasound units of the rainbow. Preston says that won't necessarily be the case. "The first customer we supplied to is using just cameras. Some customers are using LiDAR as well. I think for the moment, LiDAR is reasonably important.
"But our chief scientist Adrian Bedford, who's an ex-missile designer, his view is that a human driver gets by with two cameras, two eyes, mounted on a swivel, and your ears are stereo audio inputs as well. So in the endgame, theoretically, we should be able to mimic a person. A person doesn't need rangefinding systems, they use previous knowledge, judgement, those kind of things to make decisions about people and cars, and bad weather and good. Theoretically, we should be able to do the same as a human.
"On our vehicle we're developing the open source self driving software on, we have seven cameras for a full 360-degree view, and one LiDAR on top. That LiDAR can see pretty much the whole world, just because of the size of the vehicle. So it's one LiDAR and 7 cameras. You can use GPS to an extent, but you have to be careful. GPS can be spoofed, and sometimes it doesn't work accurately in some areas. But GPS is a check as well."
Creating an open-source autonomous driving software platform
Not everyone who wants to work on autonomous cars wants to build the entire self-driving capability from scratch. Indeed, for some, the fact that the car self-drives might even be almost irrelevant. One StreetDrone customer, for example, is working on vehicle-to-infrastructure and vehicle-to-vehicle communication. Others might want to tackle a specific problem. To give these folks the best leg-up possible, StreetDrone has started a division focused on building open-source self-driving software, which it calls OpenSD.
"The open source software we're developing is based on something called ROS, the Robot Operating System," says Preston. "It's been around for a reasonable amount of time. We're upgrading to the second version of ROS, which is real time. Initially ROS was used for R&D. Now that it's going into cars, it needs to have a real time element that can be constantly checking everything in parallel.
"ROS is quite common around the world in Robotics departments. If you've done a Robotics degree, you will most likely have programmed something using that language and toolset. That's what we're basing it on.
"So let's say there's a university in Australia, or anywhere around the world, and they want to develop just one element of the software. Let's say it's the kangaroo problem that Volvo was talking about recently. That university can just work on the kangaroo question. And they put that piece of software back into the pool, because it's open source, and now the software can deal with kangaroos.
"Then somebody in another country, say Canada, maybe they do something just to deal with moose, maybe they behave differently to anything else. You're breaking down the problem and spreading it around the world, and over time the software becomes more and more capable, and more diverse in its capabilities."
Who's using StreetDrone?
The company's first customers have already hit the headlines. Wayve produced a cute video last week, in which a kitted-out Twizy taught itself how to drive down a lane, from scratch, in 20 minutes, using nothing but machine learning, trial and error.
Another group is working in the mining space, using the StreetDrone as a quick R&D test bed that may lead to large-scale autonomous mining machinery. A third is working on vehicle-to-infrastructure and vehicle-to-vehicle communications. A fourth is putting together an autonomy test center, which will run a number of cars that researchers can rent time with to validate their software and test ideas in situations where buying a whole car wouldn't make sense.
"Universities is one of the main target audiences for us," says Preston. "Startup companies is another. After that, we think it'll be more fleets. Let's say somebody's developing a self driving bus and they want to do some testing and validation before they build the prototype. Hopefully they'll come to us, we'd sell them a small test vehicle, maybe help them with the open source software. And the software can be tricked, you can pretend our little Twizy is the size of a bus, and operates with bus-like dynamics. That's fairly simple in the software.
"As time goes on, maybe we help them build the prototype, we put the XCU on a bus, and we connect into the open source software for that bus and they can start developing a full size bus.
"If you're somebody doing special vision systems or AI, you don't really want to have to worry about brakes and control systems and that sort of stuff. You just want to say "turn left. Turn right. Slow down. Speed up." You want to spend your time doing your work on the other problems. That's what we're looking for. People who have some R&D or research they want to get going really quickly. They can use any elements of the puzzle, or they can use their own software. But they don't generally want to do the car bits."
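Preston's "turn left, turn right, slow down, speed up" interface, combined with the earlier point about tricking the software into bus-like dynamics, suggests a thin command layer over the drive-by-wire hardware with swappable vehicle profiles. The sketch below is a hypothetical illustration of that idea; the class, method names, profiles and limits are all assumptions, not StreetDrone's actual API.

```python
class DriveByWire:
    """Hypothetical high-level command layer of the kind Preston describes:
    researchers issue simple intents, and the platform maps them onto
    steering/brake/throttle. Names, limits and the 'vehicle profile'
    trick are illustrative, not StreetDrone's actual API."""

    PROFILES = {
        # The same software can be told it's driving a bigger, slower vehicle.
        "twizy": {"max_speed_kph": 80, "max_steer_deg": 40},
        "bus":   {"max_speed_kph": 60, "max_steer_deg": 25},
    }

    def __init__(self, profile="twizy"):
        self.limits = self.PROFILES[profile]
        self.speed_kph = 0.0
        self.steer_deg = 0.0

    def speed_up(self, delta=5.0):
        self.speed_kph = min(self.speed_kph + delta, self.limits["max_speed_kph"])

    def slow_down(self, delta=5.0):
        self.speed_kph = max(self.speed_kph - delta, 0.0)

    def turn_left(self, deg=5.0):
        self.steer_deg = max(self.steer_deg - deg, -self.limits["max_steer_deg"])

    def turn_right(self, deg=5.0):
        self.steer_deg = min(self.steer_deg + deg, self.limits["max_steer_deg"])

# Identical high-level calls, different vehicle dynamics: the researcher's
# code never touches brakes or control systems directly.
car = DriveByWire("twizy")
for _ in range(20):
    car.speed_up()
assert car.speed_kph == 80      # capped at the Twizy profile's limit

bus = DriveByWire("bus")
for _ in range(20):
    bus.speed_up()
assert bus.speed_kph == 60      # same code, bus-like limits
```

The point of the pattern is the one Preston makes: vision and AI teams write against the simple intent interface and never touch the actuation layer beneath it.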
On the one hand, some people might find the idea of an open-source self-driving car disquieting, fraught with potential for security vulnerabilities and misuse. On the other, StreetDrone represents an incredible leg-up for a vast range of companies wishing to take their first steps in autonomy. We look forward to seeing where people take this tech.