Autopilot for your Honda: comma.ai goes open source with self-driving car kit

[Image: comma.ai's aftermarket self-driving kit in action. The company says it performs as well as the current Tesla Autopilot and better than any other manufacturer's effort]

GeoHot couldn't sell his US$1000 self-driving car kit, so he's giving it away. Faced with the threat of huge fines from the US Department of Transportation, self-driving car startup comma.ai has open sourced its autonomous driving kit, which anyone can now install using about US$700 worth of hardware.

Comma.ai was a small company with a big promise: the team was working on a US$1000 kit that would give your Honda or Acura a set of self-driving capabilities roughly equal to Tesla's Autopilot.

Led by a wild-eyed, impish software wizard called George Hotz (also known as GeoHot, the guy who made the first iPhone jailbreak and hacked the Sony PlayStation 3), comma.ai sought to simplify the hardware and programming side of autonomous driving by taking advantage of recent huge leaps in machine learning technology.

Early tests suggested it was working, and well ahead of schedule, too. After a few months of watching Hotz and his team drive, the system seemed to be figuring things out. It was far from perfect when it took the wheel, needing plenty of human supervision, but it was improving rapidly and showing signs that it could handle irregular road markings (such as the lack of painted lines around Las Vegas freeways) with some intelligence. The learning machine was picking up things that programmers might not think to code for, like the shiny, smooth tracks car tires can wear into an old piece of road.

The main thing it needed was data: thousands upon thousands of hours of driving data from a broad base of different drivers and road conditions that could feed its learning engine, slowly ironing out problems. And that was where consumers came in – the comma one system was due to launch by the end of 2016 as a sub-thousand-dollar kit.

And then it all came screeching to a halt, when the US Department of Transportation sent comma.ai a special order demanding that the company prove the device was safe, and threatening fines to the tune of US$21,000 per day. Hotz and the team had assumed that they didn't need to meet Motor Vehicle Safety Standards, as the comma one system required a driver to be supervising at all times.

In response, Hotz sent out a flurry of tweets saying he'd "much rather spend [his] life building amazing tech than dealing with regulators and lawyers," that the comma one was cancelled, and hinting that the team might be taking their talents to China.

But in a surprise move today, comma.ai released the self-driving system anyway, open sourcing the software on GitHub and encouraging tinkerers to build their own hardware units from a set of instructions.

[Image: The Neo hardware box for comma.ai's open source OpenPilot autonomous driving software, coming soon to a Honda or Acura near you]

Physical parts cost around US$700, and include a OnePlus 3 mobile phone as the heart and display of the unit, as well as a custom-ordered, 3D-printed case. Hotz describes the assembly process as no harder than putting together a piece of IKEA furniture – but there's a decent amount of soldering involved, so some electronics knowledge would certainly be handy.

The system will operate basically like an adaptive cruise control with lane keeping assist. It doesn't have the Tesla system's ability to change lanes, but other than that, the team says it's "about on par with Autopilot at launch, and better than all other manufacturers." Mind you, Tesla is just about to push out a major Autopilot update that may push the game and the goalposts further ahead.
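
For a rough sense of what that means in practice, the behavior described above – holding a set speed and a safe gap to the car ahead while nudging the steering back toward lane center – boils down to a simple perception-and-control loop. The Python sketch below is purely illustrative: the names, gains and limits are hypothetical, and it is not OpenPilot's actual code, just the general shape of the technique.

# Illustrative sketch only, NOT OpenPilot code: a toy lane-keep +
# adaptive-cruise step. All names, gains and limits are hypothetical,
# chosen to show the control pattern described in the article.

from dataclasses import dataclass

@dataclass
class Perception:
    lane_offset_m: float    # lateral offset from lane center (left negative)
    lead_distance_m: float  # gap to the vehicle ahead, in meters
    ego_speed_ms: float     # current speed, in m/s

def control_step(p: Perception, set_speed_ms: float, follow_time_s: float = 2.0):
    """Return (steering_deg, accel_ms2) for one control tick."""
    # Lane keeping: proportional steering correction back toward lane center.
    steer_gain = 2.5                              # deg of steering per meter of offset
    steering_deg = -steer_gain * p.lane_offset_m

    # Adaptive cruise: track the slower of the set speed and a safe-gap speed.
    desired_gap = follow_time_s * p.ego_speed_ms
    gap_error = p.lead_distance_m - desired_gap
    target_speed = min(set_speed_ms, p.ego_speed_ms + 0.3 * gap_error)
    accel_ms2 = 0.5 * (target_speed - p.ego_speed_ms)

    # Clamp to gentle limits; the human driver stays responsible for everything else.
    steering_deg = max(-5.0, min(5.0, steering_deg))
    accel_ms2 = max(-3.0, min(1.5, accel_ms2))
    return steering_deg, accel_ms2

# Example: drifting 0.3 m right of center, lead car 40 m ahead, doing 25 m/s (~90 km/h)
print(control_step(Perception(0.3, 40.0, 25.0), set_speed_ms=29.0))

Roughly speaking, in the real system the perception inputs come from the phone's camera, the loop runs many times a second, and the outputs go out to the car's own steering and cruise control hardware, with the driver expected to supervise throughout.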

The OpenPilot system will work on the 2016 Acura ILX and Honda Civic Touring Edition out of the box. Some minor tweaking will add support for the CR-V Touring, and other cars will be "more of an undertaking." You can see the system in action in the video below (warning - there's NSFW language playing on the car stereo).

[Video: Driving up the 280]

By open sourcing the system, Hotz and the comma.ai team get their work out onto the road without the liability issues of selling a fully fledged product. If they can assemble enough users and feed enough driving data into the learning system, before long they may be able to prove the system's robustness for commercial use.

The question now is ... Who's game to give it a go?

The OpenPilot software can be downloaded here. Hardware instructions are here.

Source: comma.ai

3 comments
Bob Flint
Great... now anybody can create non-functional hardware that will end up killing or maiming more people, when the original goal was to create safer vehicles...
Idiots teaching morons how not to drive, while thinking life is just a game and they can hit the reset button...
MichaelDeKort
Lockheed Engineer/Whistleblower - BE VERY WARY OF THIS PRODUCT. I do not believe it is remotely possible that engineering due diligence has been done here.
https://www.linkedin.com/pulse/nhtsa-should-shut-down-all-auto-piloted-self-driving-cars-dekort?trk=mp-author-card
NHTSA should shut down all auto piloted and self-driving cars until proper exception handling testing is conducted.
While I support this technology, it has to be done right. I believe we are nowhere close to that at this time and are putting people's lives at risk. NHTSA should shut down all auto-piloted and self-driving cars until proper exception handling testing is done. Especially when the companies who make them come from Commercial IT. This is because those companies, the Googles, the Teslas, the Ubers, have little actual best-practice experience in designing large and complex systems, especially when massive exception handling is needed. The perturbations of environmental and automobile system error conditions are immense. Commercial IT is not even remotely capable of doing this relying on itself and its incredibly poor practices. Thinking that one can make a cool app or website is NOT like what is being done here.
To work through the requirements and scenario perturbations, design integrated systems to deal with them, and test all of this takes folks with experience in doing exactly that. And they need the proper tools. Most of this is non-existent in Commercial IT. Far more engineering, code and testing should be going into these systems than into the "happy" or normal path. The places to find these folks, methods and tools would be NASA, DoD and the airline industry. Couple them with people with automobile system and traffic engineering experience and you would be on the way to something that will work.
Using real people, cars and the public to gather most of the exception or accident data is reckless, will result in needless deaths and will take decades for the information to be gathered. There are far safer and faster ways to do this. The public is being used as guinea pigs based on a massive sense of false confidence. They are being used to help create exception handling or accident scenarios because these companies are too ignorant, inexperienced or cheap to do it right. The first company that gets this and has the patience to do it right will win out. The others will eventually face so much litigation and potentially criminal charges, and will have wasted so much time ignoring this path, that they will no longer be players in the space.
1) Using text-based scope docs that do not build into a full system view. Use Cases and Stories are extremely poor ways to elicit and explain scope. What is needed is diagrams. These facilitate the visual flow where exception handling points would exist. This step is the most important. If you cannot see all of the combinations you cannot design or test for them.
2) Using BAs for scope and QA for testing. DoD uses a systems engineer for both. That way there is continuity. To make sure they don't have a fox in the hen house, QC is also performed. (BTW, testing is QC. Auditing and improving process is QA. Commercial IT can't even get the titles right.) This will result in missing and incomplete scope and testing.
3) There is very little system design going on. Too much serial-discovery Agile going on. Little object-oriented or UML design going on. Most of it is web based. Much of this comes from Agilists who ignore what they can know up front and use Use Cases and Stories rather than diagrams. Most of Commercial IT's design process is not based on a full systems design approach. They build one step at a time, purposefully ignoring systems. For complex systems, especially with massive exception handling, this alone would keep the product from ever working correctly or the project from ever finishing.
4) They lack proper tools that facilitate scope decomposition through design, code and testing. Something like DOORS. Commercial IT rarely has separate tools, let alone an integrated one. Most won't even use a proper RTVM in Excel. This will result in missing and incomplete scope and testing.
5) They rarely have chief architects who look across the whole system. They have the same stovepiped little kingdoms I just mentioned above for software. This will result in missing and incomplete scope and testing.
6) Full system testing is rarely done. Especially when there are third party interfaces. Simulators are rarely built to replace those parties if they are not connected in the test environment. Exception handling testing is rarely done.
7) There are rarely any coding standards. Especially built from in depth testing and exception handling. Examples - http://caxapa.ru/thumbs/468328/misra-c-2004.pdf, http://lars-lab.jpl.nasa.gov/JPL_Coding_Standard_C.pdf, http://www.stroustrup.com/JSF-AV-rules.pdf
8) Software configuration management - Commercial IT rarely creates a product-wide integrated SWCM system. They have dozens or even hundreds of little teams who have their own CM. And they use tools that rely on best-practice use. Something that doesn't exist. Having Jira and Git isn't nearly enough. There is a reason IBM's ClearCase is not free. This will result in the wrong software versions being used, which will lead to defects. It will also lead to laying patches on top of patches, which will result in defects.
9) There is no Earned Value Management (EVM) or proper estimation tools or productivity data. (Like rework, defect density and proper root cause data). This means they will have major schedule and budget issues. (Given the deep pockets of Google, Uber and Tesla this one might not matter)
(A clear example: the gentleman in charge of Uber autopilot here in Pittsburgh, the hub city, has 8 years of Twitter experience. I am telling you these guys are in way over their heads. Raffi Krikorian, look him up on LinkedIn. Based on what I see he is not remotely qualified. When Elon Musk took the first set of code for SpaceX to NASA it was rejected because they didn't come close to handling exceptions. Additionally, Elon's stubbornness about using the term "autopilot" is shortsighted and reckless. Elon has gone from amazing to his own worst enemy.)
In addition to being a systems engineer, engineering manager and program manager at Lockheed Martin on NORAD, the Aegis Weapon System and Aircraft Simulation programs, I was a whistle-blower who stopped Lockheed and Northrop from doing things they should not have post 9/11 - http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=4468728
Update 11/28/2016
Tesla has a software update coming out soon. From this article - http://www.thecountrycaller.com/63406-tesla-motors-inc-tsla-enhanced-autopilot-is-coming-by-mid-december/
"Here’s how Tesla describe the improvement in Autopilot features:
“Enhanced Autopilot adds these new capabilities to the Tesla Autopilot driving experience. Your Tesla will match speed to traffic conditions, keep within a lane, automatically change lanes without requiring driver input, transition from one freeway to another, exit the freeway when your destination is near, self-park when near a parking spot and be summoned to and from your garage.”
This software should NOT be on the road. It should not be called "Autopilot". Drivers should not be guinea pigs, misled into a false sense of confidence. It is only just now getting to be able to maintain lanes, keep up with traffic and transition from one freeway to another? If it cannot do this now, it is nowhere near ready to be on the road. This means that massive amounts of primary or happy-path engineering is not done, let alone the even larger work needed for the exception handling or accident scenarios that augment the associated happy path. And they say they will have full autonomy in a year??? There is NO WAY that is possible - let alone in 5 more years. NHTSA etc. have GOT to make a list of detailed scenarios these cars have to prove they can handle, and the variations of them. Possibly thousands of scenarios.
Gabriel Jones
I like the idea. It's not like anyone is going to go to sleep while driving with this installed.