25

Hackers are smart. Could they hack a self-driving car through its CD drive? From what I understand, malicious code could be uploaded to the driverless car via CD, which could give them access to the brakes, windscreen wipers, sensors, etc. (all of which could potentially be used to commit murder or hold the car ransom).

TRiG
  • 609
  • 5
  • 14
Rob Dawson
  • 1,186
  • 1
  • 12
  • 16
  • 15
    Any system could theoretically be hacked from any means of input. But that does not mean that the system could not be implemented in a reasonably secure way. – Anders Oct 11 '16 at 13:23
  • 59
    Or, y'know, just build the system in a way where the CD player is completely separate from anything critical? – James Hyde Oct 11 '16 at 13:31
  • 11
    Most systems that I have seen have direct access via the SD card slot, usually to update maps. I would be more concerned with that than with a CD. In most systems in today's environment, the CD player will skip anything that isn't either .WAV (raw) or a coded format that it understands. That doesn't mean it cannot be used for a DoS: I had a CD player in an older Mazda, we created a disc with a couple hundred thousand filenames, and it would get so hung up that it wouldn't even let you eject the disc. I had to take the drive out and use the manual eject pin method. – Kevin B Burns Oct 11 '16 at 14:36
  • 12
    An attack via CD player requires physical access. I suppose the saying about physical access to computers can be extended to cars (driverless or not) – Hagen von Eitzen Oct 11 '16 at 15:20
  • 3
    What is your threat model? You cannot gauge security risks without a threat model. To your last sentence about not being convinced they are safe, consider that it's entirely possible that you will walk out the door tomorrow morning and get hit by a rogue garbage truck. Does that mean you never leave your house? Also, an unscrupulous individual could cut your brake lines, with or without a CD player in the car. – Cort Ammon Oct 11 '16 at 16:41
  • 13
    @CortAmmon But, with a CD player in the car, he could listen to his favourite music while cutting the brake lines. That would be more enjoyable, so he would be more likely to do it. Therefore, CD players are a security risk. – David Richerby Oct 11 '16 at 17:19
  • 3
    Why would you put a CD player in a driverless car? There would be no one to listen to it. – emory Oct 12 '16 at 10:13
  • 1
    @emory The passengers may listen to it. – vascowhite Oct 12 '16 at 10:29
  • 1
    Isn't this question about 5 years late? CD, really? – pipe Oct 12 '16 at 11:20
  • I would be more worried about access-less threats, such as the wifi hotspot in the car, or the bluetooth system. – njzk2 Oct 12 '16 at 15:17
  • Is "Hackers are smart" the entire basis for your concern? Wouldn't a verified attack route be a better reason for a question? –  Oct 12 '16 at 16:58
  • See [Executable space protection - Wikipedia](https://en.wikipedia.org/wiki/Executable_space_protection). That description is not totally accurate but the general concept applies. Modern processors and operating systems don't allow user-level applications to execute data that is not designated as executable. Cars are not special; the concepts are the same. Things like Google Play are critical since they serve data that is later executed. – Sam Hobbs Oct 12 '16 at 22:24
  • How do you know the CD drive isn't a completely separate computer with its own power? If you can hack that particular system... then you can either jump across devices (which would be amazing in its own right) or you can somehow hack through power (also incredible) – user64742 Oct 12 '16 at 22:44
  • @VirtualAnomaly Back when CD players were still standard, they *had* to be integrated into everything as system or navigation updates would have to be done via CD. Decoupling the system wasn't an option then and it probably won't be now since people will presumably want to be able to control it from their car's on-board system. – Lilienthal Oct 13 '16 at 09:19

6 Answers

85

Not on a well-designed car

The CD player is part of the media system. It's likely that the media system has a number of security vulnerabilities, and a malicious CD can probably take control of it. It would be difficult to fix this without either greatly increasing the cost or restricting the functionality of the media system.

The car control systems - the CAN bus - should be strongly separated from the media systems. In previous attacks, like the Jeep hack, attackers have been able to break across from the media system to the CAN bus. However, this reflects poor design and implementation. The two systems should be kept separate - or at least connected through a highly restricted interface - and it is possible to do that at reasonable cost.
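
To make "highly restricted interface" concrete, here is a minimal sketch of such a gateway, assuming an invented frame layout and made-up message IDs (a real ECU would use SocketCAN, AUTOSAR or a vendor stack rather than this toy structure). It forwards only a short whitelist of read-only signals from the drivetrain bus towards the media side, and nothing at all in the other direction:

```c
/* Hypothetical gateway between the drivetrain CAN bus and the media
 * (infotainment) bus. Frame layout and message IDs are invented for
 * illustration only. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

typedef struct {
    uint32_t id;       /* 11-bit CAN identifier */
    uint8_t  len;      /* 0..8 data bytes */
    uint8_t  data[8];
} can_frame_t;

/* Only these IDs may cross to the media side (e.g. speed for the
 * navigation display). Nothing is ever forwarded the other way. */
static const uint32_t kReadOnlyIds[] = { 0x1A0 /* speed */, 0x2C4 /* gear */ };

static bool id_is_whitelisted(uint32_t id)
{
    for (size_t i = 0; i < sizeof kReadOnlyIds / sizeof kReadOnlyIds[0]; i++) {
        if (kReadOnlyIds[i] == id)
            return true;
    }
    return false;
}

/* Called for every frame seen on the drivetrain bus. */
bool gateway_forward_to_media(const can_frame_t *in, can_frame_t *out)
{
    if (!id_is_whitelisted(in->id) || in->len > 8)
        return false;              /* drop everything else */
    memcpy(out, in, sizeof *out);
    return true;
}

/* Frames arriving from the media bus are never forwarded at all. */
bool gateway_forward_to_drivetrain(const can_frame_t *in)
{
    (void)in;
    return false;
}
```

The design point is that the media side can display data such as vehicle speed, but nothing it sends can ever reach the control bus.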

Whether any future driverless cars will be well designed remains to be seen.

paj28
  • 32,736
  • 8
  • 92
  • 130
  • 65
    *"Whether any future driverless cars will be well designed remains to be seen."* – 700 Software Oct 11 '16 at 13:53
  • 8
    You should amend your answer to: "Yes, but it shouldn't on a well designed car". Time and time again we see this kind of hack take place. I'm not familiar with these systems, but given the current trends I doubt that they're using an "air gap" between the systems. – David Oct 11 '16 at 14:57
  • 1
    @David - That was kind of my point in the final paragraph. By the way, you probably don't want a complete air gap, there are a few reasons to interconnect, like parking sensors sounding through the stereo speakers. But the interface should be heavily restricted. – paj28 Oct 11 '16 at 15:05
  • 17
    While I think this is a true answer, it's a bit of a tautology. If I may paraphrase, "the CD player is safe as long as the car is designed such that the CD player is safe." Any cars which are connected in a way which permits a hacker to take over are automatically marked as "not well designed," so it's a bit cheating. – Cort Ammon Oct 11 '16 at 16:43
  • 1
    @CortAmmon - Touché! In fairness, I do explain how to make it safe, so I hope my answer is useful to someone. – paj28 Oct 11 '16 at 16:52
  • 5
    In the Chrysler hack, the two systems _did_ have a highly restricted interface. But not highly restricted enough. See [this presentation from DEF CON 23](https://media.defcon.org/DEF%20CON%2023/DEF%20CON%2023%20video%20and%20slides/DEF%20CON%2023%20Conference%20-%20Charlie%20Miller%20-%20Remote%20exploitation%20of%20an%20unaltered%20passenger%20vehicle%20-%20Video%20and%20Slides.mp4). – Michael Hampton Oct 11 '16 at 21:09
  • 2
    @MichaelHampton: The appropriate kind of restriction on the interface would be dataflow in one direction only (at physical not only logical level, i.e. no sending read requests or flow control). Although there's a great deal of spying that might be enabled by sending car information to the media center, it won't ever allow taking control of the car. – Ben Voigt Oct 11 '16 at 21:30
  • 2
    @BenVoigt - Over-the-air updates are one reason for dataflow from media to car control. I know this has been targeted in previous hacks. But it is possible to do signed updates well, and the benefits of over-the-air updates are significant. Also, a driverless car would need route information to go from the media centre to the autopilot. – paj28 Oct 11 '16 at 22:37
  • 1
    Even if it's completely isolated in software, you still need to get the hardware right, or that will be exploited as well. E.g. by pulsing nearby media-related wires to induce a rogue signal in your CAN bus or something – Thomas Oct 12 '16 at 06:23
  • My car has two CD drives - one for audio CDs and the other for navigation maps. – el.pescado - нет войне Oct 12 '16 at 08:24
  • @el.pescado Is that the factory configuration? I wonder why they don't just add a physical switch to toggle the connector between the audio & nav system. – Martheen Oct 13 '16 at 03:48
  • Yes, that's factory configuration (it's a 2007 Opel Vectra). I think the reason is to allow to use navigation while listening to music - so usability rather than security. – el.pescado - нет войне Oct 13 '16 at 09:40
26

Yes, it would.

Researchers from UC San Diego actually implemented an attack through this vector:

“We found a flaw in a CD player in our car,” he said. “You could pick a song and code it in a way that if you played on your PC it’ll play fine, but if you play it in your car, it’ll take it over.”

http://www.sandiegouniontribune.com/news/education/sdut-ucsd-professor-cyber-hacking-2015aug28-story.html

Most probably this works through a memory-corruption vulnerability in the parsing of the metadata tags in the audio file. From there they were presumably able to send commands to the CAN bus that controls the car.
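
As a purely hypothetical illustration (the UCSD team did not publish their exploit, so this is not their bug), the classic pattern is a tag parser that trusts a length field taken from the disc itself:

```c
/* Toy ID3v2-style tag parser showing the classic mistake: trusting an
 * attacker-controlled length field. Invented for illustration, not taken
 * from any real head-unit firmware. */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define TITLE_MAX 64

/* title_out must point to a buffer of at least TITLE_MAX bytes. */
int parse_title_frame(const uint8_t *frame, size_t frame_len, char *title_out)
{
    if (frame_len < 4)
        return -1;

    /* Length of the title as claimed by the file itself. */
    uint32_t claimed = (uint32_t)frame[0] << 24 | (uint32_t)frame[1] << 16 |
                       (uint32_t)frame[2] << 8  | (uint32_t)frame[3];

    /* VULNERABLE version:
     *     memcpy(title_out, frame + 4, claimed);
     * copies up to 4 GiB into a 64-byte buffer if the disc lies about the
     * length, smashing the stack or heap of the media system.
     *
     * Safe version: bound the copy by the destination size and by the
     * data actually present in the frame. */
    if (claimed > TITLE_MAX - 1 || claimed > frame_len - 4)
        return -1;
    memcpy(title_out, frame + 4, claimed);
    title_out[claimed] = '\0';
    return 0;
}
```

On a head unit without stack protection or executable-space protection, overrunning such a buffer is often enough to get code execution in the media system.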

But you don't even need a CD; in the worst case it can happen remotely through mobile networks.

J.A.K.
  • 4,793
  • 13
  • 30
  • 1
    Suppose this would be an inconvenience to the attacker to say the least... Guess we're safe *claps* – Rob Dawson Oct 11 '16 at 12:27
  • Except for the part about hacking remotely through mobile networks ;( – Rob Dawson Oct 11 '16 at 12:29
  • "Would" is too strong. "Could" would be more correct. There's no guarantee that a CD player either would have vulnerabilities, or would be connected to other subsystems, so it's incorrect to say that it would definitely pose a security risk. – Xander Oct 11 '16 at 14:00
  • Distributing the media on an online file sharing service would also accomplish this in an untargeted attack. – J.A.K. Oct 11 '16 at 14:02
  • 2
    It bugs me that C compiler writers don't recognize as a commonplace requirement the notion that programs which are given invalid input data be allowed to produce arbitrary output data, but the output format and other behaviors must remain defined. If programmers don't care what pixel or audio samples are generated from an invalid file, subject to the above constraints, requiring that programmers rigidly handle all corner cases will not only create security holes whenever programmers fail to do so, but it will hurt performance when they do (vs. letting compilers have more freedom). – supercat Oct 11 '16 at 15:33
  • @supercat - Check out [Rust](https://www.rust-lang.org/en-US/) – paj28 Oct 11 '16 at 15:40
  • @paj28: Is there yet any ARM cross-compiler support? Last I checked, both D and Rust seemed like interesting languages, but neither is at all useful to me without support for ARM cross-compilation. – supercat Oct 11 '16 at 15:42
  • @supercat - I don't know about cross-compiling, but ARM [is supported](https://github.com/warricksothr/RustBuild). I guess you can stretch to at least a Raspberry Pi to do your compiling? – paj28 Oct 11 '16 at 16:00
  • @paj28: Cool. The page you linked earlier at first glance just seemed to have PC-related downloads available; I'll have to see if Rust looks feasible for ARM. – supercat Oct 11 '16 at 16:04
  • @paj28 "threads without data races"? So Rust disallows shared variables? – JAB Oct 11 '16 at 16:52
  • @JAB - I don't know the details, but [this blog](http://manishearth.github.io/blog/2015/05/30/how-rust-achieves-thread-safety/) looks interesting – paj28 Oct 11 '16 at 16:57
  • @paj28 Okay, so Rust doesn't prevent you from introducing race conditions if you really try, it just provides better tools to avoid them. – JAB Oct 11 '16 at 17:00
  • 4
    @supercat: A C compiler couldn't possibly take on the burden of maintaining "output format and other behaviors" for buggy programs. Even figuring out what the correct output format is would require the compiler to read the programmer's mind. Even far safer languages like Rust or Haskell can't make that kind of guarantee. – user2357112 Oct 11 '16 at 23:59
  • 2
    @supercat The compiler might well compile a function so that it produces invalid output data given invalid input data, with no other ill side effects. But then the output from that function is used as an index into a table of function pointers, which causes the `send_command_to_engine` function to be called instead of the `play_music` function, for example. – user253751 Oct 12 '16 at 04:05
  • @immibis: In many kinds of programming, it's necessary to keep track of what data has been sanitized and what data has not. Compilers had traditionally allowed many operations to be safely performed upon unsanitized data without sanitizing it first, provided the results of those operations were treated as unsanitized. What's killer is having compilers observe that since one part of code has computed x< – supercat Oct 12 '16 at 14:57
  • @user2357112: The requirements for many real-world program include some parts that need to be met for all inputs, and some that only need to be met for valid inputs. If a function is supposed to evaluate `int32a*int32b – supercat Oct 12 '16 at 15:11
  • ...the computation either by computing a 32-bit product and sign-extending it, computing a 64-bit product and using it directly, or doing anything else which will work in cases where the result fits within a 32-bit "int" and will yield a 0 or 1 in all other cases? My point is that programmers should be allowed to use bounds checks in cases where they should affect program's actions and omit them in cases where they shouldn't need to, without compilers treating the omission of bounds checks in some cases as an invitation to omit them from all cases. – supercat Oct 12 '16 at 15:15
10

Never mind the CD player, your tires are conspiring against you

"Security and Privacy Vulnerabilities of In-Car Wireless Networks: A Tire Pressure Monitoring System Case Study"

We also found out that current implementations do not appear to follow basic security practices. Messages are not authenticated and the vehicle ECU also does not appear to use input validation. We were able to inject spoofed messages and illuminate the low tire pressure warning lights on a car traveling at highway speeds from another nearby car, and managed to disable the TPMS ECU by leveraging packet spoofing to repeatedly turn on and off warning lights.
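
To illustrate what even basic input validation would look like on the receiving side, here is a hypothetical sketch. The packet layout, sensor IDs and limits are invented, and a real fix would also need message authentication (a MAC), which this toy check does not provide:

```c
/* Hypothetical TPMS receive handler showing basic input validation.
 * Packet format, IDs and limits are invented for illustration only. */
#include <stdbool.h>
#include <stdint.h>

typedef struct {
    uint32_t sensor_id;     /* factory-paired sensor identifier */
    uint16_t pressure_kpa;
    int8_t   temperature_c;
} tpms_packet_t;

/* IDs of the four sensors paired to this vehicle at the factory. */
static const uint32_t kPairedSensors[4] = { 0xA1B2C301, 0xA1B2C302,
                                            0xA1B2C303, 0xA1B2C304 };

bool tpms_accept_packet(const tpms_packet_t *p)
{
    bool paired = false;
    for (int i = 0; i < 4; i++) {
        if (kPairedSensors[i] == p->sensor_id)
            paired = true;
    }
    if (!paired)
        return false;                 /* ignore unknown/nearby sensors */

    /* Reject physically implausible readings instead of acting on them. */
    if (p->pressure_kpa > 450 || p->temperature_c < -40 || p->temperature_c > 125)
        return false;

    return true;                      /* hand off to the warning-lamp logic */
}
```

Filtering unpaired IDs and implausible values raises the bar, though without authentication an attacker who has sniffed the real sensor IDs can still spoof them, which is exactly the paper's point.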

pjc50
  • 2,986
  • 12
  • 17
4

Speaking from personal experience here, not a snowball's chance in hell.

I was part of a team that wrote a fully new device stack for an automotive infotainment system back in 2008. Quite a while ago, but even then we understood the critical need to protect our software stack.

Our problem was made worse because the system ran (and runs) on Linux. And we fully complied with the GPL 2 terms, which means that you could put in self-developed code and the car would accept it.

However, this was specifically not a security risk because the car used a digital signature system. Your own code would run, but the car simply refused to talk to your software. And it didn't listen anyway - the infotainment system at best had read-only access to a small set of enumerated data items such as the car speed.
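
As an illustration of the general idea (not the specific scheme described above), verifying a detached Ed25519 signature with libsodium before accepting an update image could look like the sketch below; the key and the surrounding function are hypothetical, only the libsodium calls are real:

```c
/* Sketch: refuse to install an update image unless its detached Ed25519
 * signature verifies against a public key baked into the ECU. */
#include <sodium.h>
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Manufacturer public key burned into read-only storage at the factory. */
static const unsigned char kUpdatePubKey[crypto_sign_PUBLICKEYBYTES] = {
    0 /* replace with the real 32-byte manufacturer public key */
};

bool update_image_is_trusted(const uint8_t *image, size_t image_len,
                             const uint8_t sig[crypto_sign_BYTES])
{
    if (sodium_init() < 0)
        return false;                 /* library failed to initialise */

    /* Returns 0 only if the signature is valid for this exact image. */
    return crypto_sign_verify_detached(sig, image, image_len,
                                       kUpdatePubKey) == 0;
}
```

With the public key in read-only storage and the private key kept offline by the manufacturer, running your own code on the infotainment side doesn't let you forge updates for the rest of the car.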

I know that our system was at the cutting edge of automotive engineering at the time, and the already-mentioned Jeep hack happened later. That's not really surprising. There's quite a bit of legacy going on; clean-sheet redesigns aren't that common. Jeep is of course a minor brand of a struggling company, so it doesn't come as a big surprise that they are lagging. But that wouldn't be a brand you'd expect to produce a driverless car first - the chief suspects would be healthier companies (could be Mercedes, could be Toyota, and of course Tesla).

MSalters
  • 2,699
  • 1
  • 15
  • 16
  • 5
    Re: "not a snowball's chance in hell". The OP didn't ask about "your CD system", I think they meant any system. You then seem to disprove your comment by listing cases that did have issues and dismissing them as though there won't be those kinds of companies in the driverless car industry. Additionally, while a system may be secure, it is only secure against the things the developers were able to think of protecting against. I hope your processor wasn't built in China, otherwise who knows what back doors are waiting to be utilized. – Dunk Oct 11 '16 at 22:03
  • 2
    Was your security model to link the car control and infotainment systems, and harden the infotainment system? Sounds like you took security seriously, but I still think that's a risky design. The perimeter attack surface is massive, and presumably includes things like MP3 decoders that need to be high performance. – paj28 Oct 11 '16 at 22:40
  • 1
    I'd agree with @paj28 - "not a snowball's chance in hell" is a pretty hard claim. Digital signatures depend on cryptography implementations, and crypto algorithms themselves are found weak and exploitable with time, not to mention that their implementations often have bugs too. Then there are all the side-channel (like timing) attacks etc. Read-only access can also perhaps be exploited for writing via bugs (for example, in the access controls themselves - like kernels, hypervisors) or in the hardware itself (remember rowhammer?). – Matija Nalis Oct 12 '16 at 01:13
  • 1
    @Dunk: Considering that we needed the processor to boot only signed kernels, and it's an automotive product, you can assume it's not some random Chinese bit. Yes, there's a "hidden from the OS" security module in there - that's the whole reason we can enforce the digital signing. – MSalters Oct 12 '16 at 07:10
  • 2
    @paj28: The model wasn't to "harden" the infotainment system. The model was to consider it compromised by default - who knows what sort of unsigned code it might be running? Down to the drivers, the whole kernel source was available. This greatly narrows down the attack surface. – MSalters Oct 12 '16 at 07:13
  • @MatijaNalis: Actually, the code isn't even running when the signatures are checked, and the whole signature checking process is unobservable (deeply embedded). That is not security by itself, but anyone with the required level of physical access could simply replace or add extra hardware. – MSalters Oct 12 '16 at 07:15
  • @MSalters-The US Government doesn't even allow certain highly critical kinds of products to be built for them that have parts which have been manufactured in China because of this concern. Digital signing can't help with detecting that the manufacturer added extra transistors in the silicon which could be used to trigger mechanisms that can be used to exploit the system. – Dunk Oct 13 '16 at 13:58
  • @Dunk: The good thing is, when you buy a million chips annually for a fairly low-risk application, the vendor has a good reason not to screw you over. The US government problem is that they buy 1000 chips for high-risk applications. – MSalters Oct 14 '16 at 15:30
0

Security on self-driving cars is becoming a trending topic, as cars get more and more software.

The more code and hardware there is, the more exposed the system becomes, because the attack surface is bigger. That said, I wouldn't worry too much about the CD drive. Most recent self-driving cars will be connected to the internet to get various data (weather, traffic, music streaming, calendar sync, etc.). If a car were to be targeted, a CD wouldn't be a wise choice; like you said, hackers are smart, so they would probably target more modern and open doors to the outside world.

That said, let's pretend there is a flaw in your CD drive: the hacker would have to make you download a song, make you burn it to a CD, and then hope you'll play it in your self-driving car. So if you don't download dodgy files, it's basically impossible for them and definitely not worth the effort...

One last thing to add is that the song itself could give some voice commands to the car if it is compatible (like what has been done with phones). Again, you would have to get the song from a dodgy source, and this doesn't allow anything that the voice interface wasn't designed to do. So it's pretty unlikely that a song will tell your car to brake...

From a developer's point of view, I think that self-driving cars won't be 100% bulletproof, but they will be (and already are) much, much safer than human-operated cars. This is simply because a computer has a shorter response time, it is never drunk, sleepy or distracted, and it has far more senses: you rely on a 200-220° optical field of view, while the computer can rely on a 360° camera system coupled with long-range radars, proximity sensors, etc.

Let's be honest: when we launch a rocket, it's operated by a computer, not a human. There is a reason for that.

I hope this helped you better understand the risks and be less scared of self-driving cars.

0x1gene
  • 783
  • 1
  • 6
  • 10
  • 1
    I'd be pretty afraid of a driverless car controlled by a computer with a **slower** response time. You either mean faster or shorter. – Anthony Grist Oct 12 '16 at 15:48
  • @AnthonyGrist ahah true that ! I meant shorter thanks :) – 0x1gene Oct 12 '16 at 16:08
  • When we fly a plane, it's run by a computer, for a reason. When we land a plane, it isn't. –  Oct 12 '16 at 19:47
  • @WilliamKappler both Boeing and Airbus have been piloting (pun intended) computers that land planes, as well as completely computerized flight. Flight or boating might be simpler than ground transportation for a computer. – MikeP Oct 12 '16 at 19:58
0

If it is connected to the systems that run the car, then anything is possible.
If it is not connected, e.g. it is a Discman, then no.

MikeP
  • 1,159
  • 7
  • 12