Are driverless cars illegal?

by Tyler Cowen on March 25, 2013 at 3:15 am in Law

Here is a long, 99-page article (pdf) by Bryant Walker Smith suggesting the answer might be “yes.”

My argument is less subtle than those in the footnotes of the paper.  Try running a driverless car in Fairfax City, or Alabama for that matter, while sleeping in the back seat with your feet up.  See what happens when you drive by an alert policeman.  (By the way, if you are asleep will your driverless car respond to the police siren and pull over?)

Let’s say you sit at the wheel while the software drives; you are still pulled over and given a ticket for “reckless driving.”  You show up in court and the judge asks you what regulatory inspection or safety process your equipment has been through.  I am not saying you will always lose the case or indeed always be pulled over, but your vehicle is no longer a reliable source of hassle-free transportation, no matter what statutory arguments you may make on your behalf.

There are different notions of the word “legal,” but from a practical point of view what the police will let you get away with is surely relevant.  It seems to me that your protected sphere here is quite small.

For the pointer I thank Jerry Brito.

Alex Weiner March 25, 2013 at 3:30 am

Presumably, no large company capable of doing a large-scale retail rollout of this technology would do so without the regulatory/legal green light beforehand.

jk March 25, 2013 at 4:13 am

What about auto-pilot on planes? Of course the pilot is paying some attention to the console, just not actively managing it (I didn’t read the article yet, so I don’t know if he addresses this).

Vernunft March 25, 2013 at 6:18 am

What about it?

jk March 25, 2013 at 9:49 am

There should be case law or guidance to show where the culpability of the pilot begins versus errors in the autopilot code. Policy makers can use this analogy when drafting laws regarding driverless cars. We are entrusting a lot to pilotless jets, you know…

milprof March 25, 2013 at 10:13 am

Very different situation. In an airplane, at least one of the pilots is expected to be 100% attentive at all times — monitoring the autopilot, visually watching for traffic (insofar as the weather allows), listening to air traffic control, etc. — and ready to instantly take over from the autopilot. There is much more programming of the autopilot than in the driverless-car example (i.e., clear specification of route, altitude, etc., not just “fly to Peoria, please”). And planes are under the active guidance of ground controllers.

For an accident to be blamed primarily on the a/p rather than the pilots, the fault pretty much has to take place such that the pilots could not have noticed it (vs. “did not” notice) and/or could not have manually taken control back from the a/p once the problem started. It’s been very, very rare for an accident to have primary fault placed on the a/p rather than on the pilots.

milprof March 25, 2013 at 10:23 am

Just to follow up: in a quick search I’m not finding any cases where a major airline crash was pinned on the autopilot. Autopilot as a contributing factor, yes, but ultimately more due to pilot error, as crews could have and should have seen trouble coming and/or did the wrong thing in trying to recover from it.

Two things worth noting: airplane autopilots are *not* designed to handle all situations. When something gets weird enough — unusual attitude, equipment failure, whatever — the autopilot will pop off and dump control back to the pilot. Not necessarily totally “off”, but with a reversion to a much more basic level of computer assistance where the pilot is expected to help figure things out.

Autopilots also do not handle collision avoidance to the same degree that automobile ones would have to. The skies are much less crowded, and primary responsibility for not bashing into another plane lies with ATC on the ground.

Bottom line is the airplane autopilot liability looks a lot more like cruise control in today’s autos than like the “asleep in the back seat” hypothetical from Tyler.

Doug M March 25, 2013 at 4:46 pm

The Air France flight out of Brazil was partially a fault of the autopilot. The pitot tube clogged. The autopilot recognised that the airspeed indicator was not providing consistent results. The autopilot throttled back the engines before passing control back to the pilots. From then to the water it was pilot error.

Captain Sullenberger’s landing on the Hudson River was, in part, an autopilot error. The engine ingested a bird. The autopilot shut down the engine and would not allow the pilots to restart it. Had the autopilot not been in the way, they could likely have turned that plane around.

Both of those were Airbus planes. Airbus has a more aggressive autopilot than Boeing.

John Schilling March 25, 2013 at 10:44 am

The culpability of the pilot, roughly speaking, begins when he starts the engine and ends when he turns it off. The pilot is absolutely responsible for the safe operation of the airplane. The pilot is always expected to be fully devoted to the safe operation of the airplane, unless he temporarily delegates that responsibility to a competent, human copilot. If, on rare occasion, a pilot is deemed not culpable for a mishap on the grounds of faulty hardware, this will be on account of the fault being not reasonably knowable to the pilot AND of such a nature as to make it impossible for him to safely operate the airplane. Since the pilot can pretty much always turn off the autopilot and go back to flying the airplane the old-fashioned way, it is almost inconceivable that any “error of the code” would render it impossible for the pilot to safely operate the airplane.

Autopilots are extremely useful devices, but they do have their limits. We have better than half a century of experience in working within those limits – and in assessing liability when someone tries to go beyond them. This experience is not favorable to people who imagine they can simply tell their next car to drive them to work while they read the morning paper. Neither, for that matter, is our experience with “driverless” cars, almost all of which in fact require full-time professional drivers.

Rahul March 26, 2013 at 1:11 am

Key difference is that autopilots were mainly intended to make planes safer and not to make pilots redundant.

With driverless cars the design assumes a driver will be absent.

John Schilling March 26, 2013 at 12:49 pm

Every actual “driverless car”, and I believe every design that has actual financing attached, either includes a driver’s seat and a requirement that it always be properly filled while the vehicle is in motion, or a restriction that the vehicle only be used in what are essentially sandboxes.

The common wish assumes that the driver will be absent, or at least optional. The wish is unlikely to be granted any time in this generation, because what we can actually design are “auto-drivers” that might make cars safer but aren’t likely to make drivers redundant.

8 March 25, 2013 at 4:23 am

What if the car is clearly marked as being driverless with gaudy colors and logos? I could see a rental company employing the technology first.

Steve March 25, 2013 at 6:03 am

Um, if it’s illegal, change the law? This would seem to be a problem that yields to a solution.

Andrew' March 25, 2013 at 6:24 am

So, were you really ‘reckless driving’? It used to be that not doing a crime was a defense.

Andrew' March 25, 2013 at 6:26 am

(I guess that was before it took 100 page legal papers to show something is illegal)

londenio March 25, 2013 at 6:42 am

It is like 35 pages long if you don’t mind skipping the footnotes. Tyler, of course, refers to the footnotes. Tyler, in fact, is cited in the footnotes.
I agree with the commenters: the law will have to change.

Statspotting March 25, 2013 at 6:57 am

Most of Google’s ‘moonshots’ have to be carefully analyzed from a legal as well as a privacy standpoint. By definition, they push the edge on these issues – for instance, Google Glass – why is nobody talking about the social implications of this very intrusive product?
http://statspotting.com/why-is-nobody-talking-about-the-social-implications-of-google-glass/

JWatts March 25, 2013 at 11:44 am

You must not have been reading the news much. All kinds of people are talking about the social implications of Google Glass.

Link: http://lmgtfy.com/?q=Google+Glass+ban

Master of None March 25, 2013 at 7:25 am

This is a thin argument. The same could be said for any number of innovations throughout modern times.

You can argue the same for Google Glass, btw (walking into banks and other places where photo/video isn’t allowed)

This strikes me as the same kind of post we get from Paul Krugman about what is “politically realistic.”

Society decides what is legal, what is realistic. It is the burden of the journalist to assume that everything that is human is possible.

Claudia March 25, 2013 at 7:33 am

You’d think, with all the special-interest capture kicked around on this blog, you’d be more optimistic. Who is going to love driverless cars? Not just gadget-crazed hipsters…but people who can’t drive now or won’t be able to soon, and a good chunk of them hold AARP cards. Making driverless cars an accepted reality is not trivial, but it will happen because the demand is so well concentrated. In many parts of the US, once they take your license away, your life is a small sphere. I am certain my grandmother’s husband would hop in a driverless car to go attend all the social stuff he has had to cut way back on without a license. These cars will also address a growing safety problem on the roads: with no alternatives, people often drive longer than they should. Of course laws and norms will need to change … and some people making a show in their driverless car will get pulled over, maybe not just for reckless driving…but there is a powerful lobby and an important bloc of voters that ought to eventually get behind this technology. It may just take some time.

Bill Harshaw March 25, 2013 at 8:35 am

Absolutely, says the 72-year old whose reflexes are fading, and maybe his mind. While I may be failing, I can still do a good act as a dummy behind the steering wheel.

Steven Kopits March 25, 2013 at 9:03 am

It’s clear that driverless cars need an appropriate legal framework, one which does not exist today. But like Claudia, I see a huge potential market–about $25 bn per year.

Driverless cars will also be necessary to provide sufficient mobility for society as the oil supply peaks out, and as the natural gas supply comes under greater stress after the mid-2020s.

Here are some quick oil facts for you:

- All of the increase in non-OECD oil consumption came from OECD consumers–not oil producers–in the three months ending Feb. 2013, compared to a year earlier (EIA STEO data). That’s the first time we’ve ever seen that. What does that say about the oil supply outlook?

- Upstream spend (oil and gas exploration and production expenditure) has doubled in the last five years, to $644 bn in 2012. Global crude oil production (excluding NGLs, condensate and biofuels) has been essentially flat since 2005. What does that say about the global oil supply?

- Exxon, Shell, BP, Total and Hess are all reducing upstream spend. What does that suggest about the oil supply?

- Upstream costs have been rising at 11-13% (Petrobras and Exxon, respectively), even as oil prices have declined modestly. What does that suggest about the oil supply?

Personally, I would be fast-tracking self-driving technology and regulation.

Cyrus March 25, 2013 at 9:20 am

Up through the first decade of the 20th century, nearly all vehicles had autonomous control systems that sometimes responded to the wishes of a human operator. The legal framework that supported that (the human operator remains liable for whatever the vehicle’s control systems might do) is still on the books in some places.

Steven Kopits March 25, 2013 at 10:47 am

Is the passenger liable for a bus crash?

Andrew' March 25, 2013 at 12:21 pm

Drivers are systemically flawed, and yet driving persists. We accept that, just as we accept (implicitly for the most part, due to the average voter’s fundamental attribution bias) that government creates roads and enforcement systems that are sub-optimal. Driverless has the obvious advantage that an individual flaw can be patched, whereas if we identify a systematic flaw in human drivers you could never hope to change it with an awareness campaign. It would be a government fail if driverless were on net safer and yet the provider companies were bankrupted.

JWatts March 25, 2013 at 11:47 am

“Driverless cars will also be necessary to provide sufficient mobility for society as the oil supply peaks out.”

Driverless cars are, at best, tangentially related to scarce oil.

Finch March 25, 2013 at 1:28 pm

Driverless cars will likely increase the amount of driving going on. That’s kind of the point of them, isn’t it?

JWatts March 25, 2013 at 1:44 pm

Yes, probably so and a good point.

Steven Kopits March 25, 2013 at 2:03 pm

JWatts -

I am making the case that the oil supply looks to be in pretty bad shape, and I wouldn’t have to work too hard to give you a scenario where it peaks out in the short to medium term.

What then? Well, we can migrate to natural gas, a step that is now at least three years overdue. (We ran our workshop on this at Columbia University’s SIPA back in 2010.)

But keep in mind that China is perfectly capable of adding a US equivalent of oil consumption in only seven years, most notably in the 2018-2025 period. Of course, the oil supply will most likely be in decline; therefore, nat gas will have to do the heavy lifting. Nat gas could run for a long time, but I would guess that we’ll be seeing global pressures on the supply by sometime in the mid to late 2020s.

Then what? Well, with self-driving electric cars, you can augment the oil and nat gas supply with electric vehicles that are cost-effective. So the move away from oil leads to a diversity in the transportation fuel supply that makes it look more like the power sector.

That’s energy policy for the US to 2030 in a nutshell, at least as far as transportation goes.

JWatts March 25, 2013 at 5:06 pm

“Well, with self-driving electric cars, you can augment the oil and nat gas supply with electric vehicles that are cost-effective.”

I fail to see how self-driving cars affect the equation, at least not to any significant degree.

I firmly agree that electric cars (or more likely plug-in hybrids) are a useful idea that will see increased usage as battery technology gets better. Battery technology improves on average about 3% per year, so even without major breakthroughs, you can expect a battery-powered vehicle to be 34% ‘better’ a decade from now. It won’t take too many decades for the cost curve to favor electric cars.
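The “34% better” figure is just 3% annual improvement compounded over ten years; a minimal sketch of the arithmetic, using only the two numbers in the comment above:

# Compound a 3% annual battery improvement over a decade.
annual_gain = 0.03
years = 10
factor = (1 + annual_gain) ** years
print(f"Improvement after {years} years: {(factor - 1) * 100:.0f}%")  # ~34%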

JonF March 25, 2013 at 8:12 pm

While well-programmed cars may save some gas through efficient driving, it still will take a certain amount of energy to move a car plus cargo plus occupants from point A to point B. I doubt the efficiency savings will be more than a fairly minor fraction of that.

Steven Kopits March 26, 2013 at 8:05 am

You solve the problem with vehicle utilization rates, not better technology. The self-driving component is what allows utilization rates to rise.

I’ve commented on this any number of times.

dead serious March 25, 2013 at 9:00 am

I don’t have time to read a 99-page dissertation so my comment might in fact be addressed therein.

I would think that one obstacle would be convincing an insurer to cover a driverless car. Certainly not an insurmountable problem, but in many (all?) states it’s illegal to drive without insurance so this condition would have to be met.

PD Shaw March 25, 2013 at 9:46 am

A driverless car suggests to me that insurance coverage would need to be acquired by the manufacturer, or maintenance provider.

The sphere of negligence that a driver would have might be quite small, i.e. the failure to take the car in for regular maintenance, or operating the vehicle with obvious defects.

Where a driverless car fails to react properly to a given situation, liability is more likely to rest on the manufacturer under a form of “products liability,” which makes me think such a car will not be sold on the market unless the manufacturer can lobby to get its liability adjusted.

dead serious March 25, 2013 at 11:09 am

I don’t think that a state would bless that construct.

If your tire suffers a blowout and you crash into other cars, the other drivers don’t submit claims against the tire manufacturer.

PD Shaw March 25, 2013 at 12:05 pm

Sure, drivers are supposed to monitor the condition of their vehicle, so a blown tire could be the driver’s fault; the insurer might pay and seek reimbursement from the tire manufacturer if a defect is revealed.

I guess the point I was trying to make, though, is that with driverless cars, potential liabilities shift from the driver to the vehicle. The driver has less opportunity to make mistakes, and the mistakes that will be made are more likely an omission in design. I would think it would make personal coverage less expensive, but create new issues for the manufacturer.

JWatts March 25, 2013 at 11:53 am

“I would think that one obstacle would be convincing an insurer to cover a driverless car.”

I expect that to be a problem for maybe the first 10 years. And then the accident rates will have time to sink in. At that point, you probably won’t be able to afford insurance unless you are using an automated system.

I would reasonably expect driverless cars to be 50% less likely to be in an accident than a human-driven car, and 40% of the remaining accidents will involve them being hit by a human-driven car. There will be a huge incentive to switch over to automated cars: $150+ billion per year in incentives, just in accident avoidance alone.
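A back-of-the-envelope sketch of where a figure like $150+ billion could come from. The 50% and 40% figures are the ones in this comment; the roughly $300 billion annual US crash-cost baseline is an assumption added here for illustration, not something stated in the thread:

# Rough incentive arithmetic; the $300B baseline is an assumed figure.
us_annual_crash_cost = 300e9   # assumed total yearly cost of US crashes, in dollars
accident_reduction = 0.50      # driverless cars assumed 50% less likely to crash
hit_by_humans_share = 0.40     # share of remaining accidents caused by human-driven cars

savings = us_annual_crash_cost * accident_reduction
print(f"Accident-avoidance savings: ${savings / 1e9:.0f}B per year")  # ~$150B

remaining = us_annual_crash_cost - savings
caused_by_humans = remaining * hit_by_humans_share
print(f"Remaining crash costs attributable to human drivers: ${caused_by_humans / 1e9:.0f}B")  # ~$60B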

dead serious March 25, 2013 at 12:42 pm

I agree with you – once it’s a proven technology, insurers will jump on board and this won’t be an issue.

The issue is convincing the first insurance company to make the leap and so on until the technology is proven and there is an established market.

gwern March 25, 2013 at 10:21 am

Smith is hardly the only one to have qualms about how legal autonomous cars are under current regimes (and no, the Nevada and California style laws do not resolve the major issues): http://lesswrong.com/lw/gfv/notes_on_autonomous_cars/

Willitts March 25, 2013 at 10:35 am

Don’t conflate sense or logic with law.

A person who is intoxicated above the legal limit who operates any of the controls of a car, including turning on the engine in a snowstorm to keep warm, is violating the law. I usually accepted plea bargains when it was clear to me that the perp had no intent to drive, but the law is the law and case law is the law.

The driverless system might be flawless. It might exceed the ability of humans to avoid accidents. But engaging that driverless system is a willful act, and it does not absolve you from your responsibilities as a driver. It is reckless driving for you to abdicate control of the vehicle by taking your hands off the steering wheel, falling asleep, or sitting in the back seat. Putting your car on cruise control is not an excuse for accidents, and neither is it an excuse for falling asleep or otherwise relinquishing control.

Pilots and airlines are responsible for planes that are on autopilot.

Most of the time, automatic controls (and reckless driving and drunk driving) are only a problem when they become a problem. But they are ALWAYS illegal.

The bottom line is that you can proffer all of the pencil-necked arguments that you want, and personally I would agree with you, but as a prosecutor I would treat you as you are – a lawbreaker. On a first offense I would probably let it go as a minor moving violation, but if it involved death, injury or damage to property I would examine the facts and apply liability in accordance with established law of the state. Most likely this would shift the burden to you to prove that your system played no part in causing the accident and did everything possible to avoid it. This is something you could not likely prove as an affirmative defense.

Therapsid March 25, 2013 at 11:03 am

“A person who is intoxicated above the legal limit who operates any of the controls of a car, including turning on the engine in a snowstorm to keep warm is violating the law… It is reckless driving for you to abdicate control of the vehicle by taking your hands off the steering wheel, falling asleep, or sitting in the back seat.”

This kind of thinking will be utterly antiquated by the middle of this century at the very latest. If the system is “flawless” or appreciably akin to flawless in comparison to human-operated cars, the laws you prosecutors enforce will change.

JWatts March 25, 2013 at 11:55 am

“the laws you prosecutors enforce will change.”

I expect it to be a 180 degree change. A human driving a large ground vehicle on a major restricted access road will probably be considered reckless driving.

Andrew' March 25, 2013 at 12:49 pm

I suspect it will be a many-year slog fighting tooth and nail to get it legal. The next day it will be mandated.

John Schilling March 25, 2013 at 1:14 pm

Almost certainly so. But the important question is, when do you expect this 180 degree change?

If we accept the idea that aeronautical precedent may be useful, here’s where we stand. Between roughly 29,000 feet and 41,000 feet – the “restricted access road” used by e.g. airliners – all aircraft are required to be flown by specially approved autopilots. This has been the law in the United States since 2005, as it allows higher traffic density while maintaining existing safety levels. As the higher traffic density is established fact, hand-flying a typical aircraft at those altitudes would in most cases be reckless. The current plan, I believe, is for 2015 to see the first autopilot-only aircraft operating in this airspace; the present requirement is for both an autopilot and a human pilot. There is some doubt as to whether the 2015 schedule will be met.

Autopilots have been around since the 1930s, though the more relevant benchmark may be the first UAVs capable of more or less autonomous operation, ca. 1980. So, twenty-five years to implement the requirement that the automatic control system be used in high-density airspace, thirty-five to forty years before the human pilot can be even optionally dispensed with.

Even assuming “driverless” car technology develops at twice the pace of autopilot/UAV technology, which is plausible on purely technological grounds but unlikely at the social and political level, don’t expect a requirement for e.g. smart cruise control on the highways any time before 2025, or dispensation to take a nap in the back seat until 2030. Probably closer to 2040 on account of the social and political factors, and longer still in the slower but more complex environment of city streets.
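A rough reconstruction of the timeline math here. The autopilot/UAV milestones (ca. 1980 benchmark, 25 years to mandated use, 35–40 years before the human pilot can be dispensed with) and the “twice the pace” assumption come from the comment; the ~2010 start date for the driverless-car clock is an assumption added for illustration, and the result is the technology-only lower bound, before the social and political slippage mentioned above:

# Extrapolation sketch; the 2010 start year is an assumed figure.
years_to_mandate = 25                 # UAV/autopilot: benchmark (ca. 1980) to mandated use (2005)
years_to_optional_human = (35, 40)    # benchmark to optionally dispensing with the human pilot

car_benchmark = 2010                  # assumed start of the driverless-car clock
pace_factor = 2                       # "twice the pace" of autopilot/UAV development

mandate_year = car_benchmark + years_to_mandate / pace_factor
optional_years = [car_benchmark + y / pace_factor for y in years_to_optional_human]
print(f"Smart cruise control required: ~{mandate_year:.0f}")                          # ~2023
print(f"Napping in the back seat: ~{optional_years[0]:.0f}-{optional_years[1]:.0f}")  # ~2028-2030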

Finch March 25, 2013 at 1:38 pm

I more or less agree that people are downplaying some of the technical challenges (or are perhaps unaware of how difficult it is to get such complex devices working near-flawlessly – the difference between 99.99% and 99.9999% matters here). But much of the 35-40 years you cite for UAVs was spent waiting around for things like sufficiently reliable and otherwise suitable electronic hardware. We don’t need to start that clock from zero.

JWatts March 25, 2013 at 1:50 pm

“smart cruise control on the highways any time before 2025”

Maybe my time horizons are different than most posters, but yes I agree and I’d be surprised to see it much before then. But on the other hand that doesn’t strike me as a particularly long time.

Dan Weber March 25, 2013 at 2:55 pm

99.99% sounds good, if measured in terms of trips. That’s one trip in 10,000 having an accident, or once every ~10 years per car. Pretty good if we are there already.
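For what it’s worth, the “once every ~10 years per car” reading works out if a car makes roughly two to three trips a day (an assumed figure; the 1-in-10,000 rate is the one in the comment):

# 99.99% of trips accident-free -> one accident per 10,000 trips.
failure_rate_per_trip = 1e-4
trips_per_day = 2.5            # assumed average trips per car per day

trips_between_accidents = 1 / failure_rate_per_trip
years_between_accidents = trips_between_accidents / (trips_per_day * 365)
print(f"Roughly one accident every {years_between_accidents:.0f} years per car")  # ~11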

John Schilling March 25, 2013 at 4:24 pm

Actually, the improvements in avionics hardware reliability were almost all done in the 1950s through 1970s; hardware failure rates have been almost flat since 1980. UAV reliability has been driven almost entirely by software reliability and capability, and mostly by mission-specific software reliability. By which I mean, not so much “the software crashed/didn’t do what the specification required” as “we never thought to program it for that scenario; reality is too weird”.

And in that respect, UAVs had a major advantage in that they spent their first twenty years in a real-world military sandbox where crashing and burning once every hundred sorties was acceptable. Which made for a huge database of bug reports to clue the developers into what they had been missing, bringing us to the present level of one crash in a thousand sorties and maybe within reach of the 1:10,000 level we’d need before we consider letting them play in the same airspace with airliners (though not actually carrying passengers).

I find myself wondering where self-driving cars are going to build up that level of operational experience. Roughly speaking, it’s going to take 10,000 crashes to get us enough data for the self-driving car that only crashes once in 10,000 trips, and the first few hundred crashes at least will be high-profile media events if they happen on public streets. I’m thinking that after a hundred Dead Pretty White Girls, “trust us, once we’ve killed ten thousand more people these things will be safer than you idiot drivers and the Dead Pretty White Girl count will start to decline” will not be the winning argument, even if it happens to be true.

Finch March 25, 2013 at 5:28 pm

Thank you for your reply. I hedged with “otherwise suitable” on purpose. Performance was surely a constraint in 1980. I’m surprised to hear microcontrollers and boards were that reliable back then, at least without money-is-no-object engineering.

Willitts March 26, 2013 at 12:45 am

The question wasn’t “will they someday” be legal. The question was “are they (today)” legal. The answer is no.

Now you could probably drive thousands of miles on autopilot without incident, and if a cop pulled you over for sleeping, you could give him your best David Hasselhoff impression of a kink in your neck while KITT is driving. If and when you have an accident, the cops aren’t going to ask if you were driverless. They’ll just make a judgment based on the situation.

But if you advertise that you are not controlling the car, you can and will get a Terry stop. Each state has different statutes, and the only vehicle laws that reached into my arena were reckless or drunk driving resulting in death or grievous bodily harm. One would think that if there is no law prohibiting driverless cars, then they are legal. Think again. States have very broad police powers, and there is a catch-all statute for every perceived threat of harm. On the flip side, the traffic court judge would likely dismiss your ticket if there was no harm and if a statute isn’t clearly violated. But after you show up to court to schedule arraignment, go to arraignment, and then face the judge (if the cop shows up), you will realize it wasn’t worth your time. It would be better to get state permission first.

Will it someday be legal? Yeah, after all of the liability issues are settled.

Rahul March 26, 2013 at 1:19 am

Isn’t primary liability always with the owner of the equipment? Say a roller coaster at Disney kills a kid, or people die in an air crash. Disney or Delta gets sued.

Whether Boeing / the pilot / etc. also get sued is another story.

But is there any case where the owner was absolved legally just because he was the mere owner of “automatic equipment”? So, I fail to see why people don’t want to be responsible for driverless cars they own.

Memnon March 25, 2013 at 11:45 am

A. Why would you let the car drive itself and sleep in the back seat if you did not perceive that to be reasonably safe behavior?
B. Why would you think sleeping in your self-driving car was safe if you had no evidence thereof?
C. Who would arrest or prosecute you for such behavior if there was firm evidence of its safety?

I grant that C could be answered “jerks who refuse to accept clear evidence” but we are still at B and should not get ahead of ourselves.

Prove that the technology works, and all legal issues will with [hopefully not too much] time be considered quaint. Assume it works without testing it, and it is the rightful duty of law enforcement to keep your reckless ass off the public roads!

lemmy caution March 25, 2013 at 1:18 pm

There are no driverless cars so it is premature to develop a regulatory framework for driverless cars. The government will be perfectly willing to regulate driverless cars in a reasonable manner once they actually exist.

Andrew' March 25, 2013 at 1:30 pm

Wait a second. Deja vu all over again. What if we had a driverless car developer guy say he was concerned about regulations? Would you call him a liar?

If we have no driverless cars, then why are they illegal? Sorry if you are being funny.

lemmy caution March 25, 2013 at 2:12 pm

“If we have no driverless cars, then why are they illegal?”

Why are regular cars legal? They are legal because, although they kill people in large numbers, society has determined that they are useful and has developed regulations to control their use.

The fact that driverless cars don’t exist makes it pretty hard to regulate them. Are they going to be safer than regular cars? Less safe? Will equipment malfunctions be an issue? Who knows? They don’t exist.

“What if we had a driverless car developer guy say he was concerned about regulations.”

I would be concerned too. So what?

MikeW March 25, 2013 at 1:40 pm

For aircraft, the “pilot in command” is always responsible for the aircraft. It doesn’t matter that the autopilot can control flight from takeoff roll to stopped on the arrival runway, it doesn’t matter that air traffic controllers have provided instructions … the PIC is the guy on the hook.

Aircraft manufacturers have always built systems such that the pilot initiates any actions and the pilot can override any actions. To maintain the PIC’s control over the aircraft & minimize their liability?

Finch March 25, 2013 at 2:09 pm

This is a funny use of the PIC acronym.

In certain circles, “PIC” is a particular type of embedded computer. http://en.wikipedia.org/wiki/PIC_microcontroller

Dan Weber March 25, 2013 at 2:27 pm

The law and the cars will evolve with each other.

The law right now should not allow completely autonomous vehicles — where you can sit in the back and go to sleep — and that’s fine.

We need laws that let the first generation of automated cars pass. Say someone has to be in the driver’s seat, and conscious. Not necessarily paying full attention to everything going on, but aware enough that he would notice flashing sirens.

After we adapt to that, we start getting more aggressive.

Larry March 25, 2013 at 3:50 pm

I suspect that until it is made clear that a driverless vehicle on the roads is lawful we will see lots of reckless driving tickets written by the police. They love offenses like reckless driving because these offenses are rather vague and can therefore be applied to many driving actions.

Willitts March 26, 2013 at 12:52 am

I see a dozen reckless drivers a day. Cops have better things to do than look for driverless cars. They don’t even like enforcing cell phone restrictions. It was tough to get them to enforce seatbelt laws and motorcycle helmet laws. It is very much a no harm, no foul system. Unfortunately cops will abuse their authority and make illegal stops based on the perception of wrongdoing.

I often drive with my hands in the lower quadrant of the steering wheel, and I’ve never been pulled over for that. But I’ve learned to avoid left turns as much as possible.

John March 25, 2013 at 6:45 pm

Legality is local for a number of things: http://www.technewsworld.com/story/76254.html

Of course, in this type of situation I’m not sure the law will always apply anyhow. I was recently at a talk by Radley Balko, who pointed out that the exemption for police to present the search warrant to someone at the location being searched was rescinded by Congress some time ago (I forget exactly when he said). This means that all these DEA and SWAT teams busting into people’s homes have no underlying legal basis any longer — unless we accept that this has become a common-law behavior, or that judges, who have failed to reject such behavior when presented in court, have the authority to make law outside of the common behavior of the local community. So, even if the smart car were legal under some general right of transportation (we don’t license bikes, wheelchairs, automated wheelchairs, or a number of other mechanisms that are allowed — or that people customarily use — on the streets), and that is a question, it still might not keep you from getting the ticket and having the court support that fine.

Willitts March 26, 2013 at 1:08 am

http://www.ilga.gov/legislation/ilcs/fulltext.asp?DocName=062500050K11-503

(a) A person commits reckless driving if he or she:

(1) drives any vehicle with a willful or wanton disregard for the safety of persons or property;

Operating a driverless car is clearly willful. The law is very subjective, and you can be ticketed for any perceived risk to life or property. Not maintaining control of the vehicle would likely be reckless per se. That shifts the burden onto you to prove an affirmative defense – that the risk to life and property was no higher than that of a regular driver. It’s not a law you want to test. If someone dies, even if it wasn’t the driverless car passenger’s fault, there could be enough evidence for a vehicular homicide conviction.

An appeals court could set precedent by declaring that there is nothing illegal per se about driverless cars. But then the next day the state legislature would vote in a law to regulate it.

I have no dog in this fight. I’m just telling you the legal realities.

JonF March 25, 2013 at 8:15 pm

As I have said on these threads before, we will certainly have (some) cars with fairly sophisticated auto-drive features, but they will still require an attentive and competent (and licensed and sober) “attendant” in the driver’s seat, able to take over if something goes wrong – and yes, things will go wrong. That’s technology, after all. Auto-drive will be the roadway equivalent of autopilot in airplanes.

Jonathan Pak March 25, 2013 at 8:40 pm

I think it’s important also to gather as many relevant statistics as possible. Many large companies have begun hoarding statisticians to help inform themselves of whether a particular decision by the company will be well received by the public. The normative statement that “software-driven cars should be illegal” could be an extremely vacuous statement if the public had a dominantly negative disposition about such cars.

If the public were to receive such a good, then the first question that probably gets posed is: if there is an accident, who is at fault? Are car producers willing to take on such costs and responsibilities? Perhaps private costs may be raised to help offset such things.

I wonder what the public thought about airplanes when they first became available commercially.

ChrisA March 25, 2013 at 10:13 pm

The US isn’t the only country in the world you know. I am sure autonomous vehicle manufacturers could get pretty decent terms in other countries if they agreed to build their cars there.

Willitts March 26, 2013 at 1:11 am

I might add that we already have cars that park themselves. If there is a death during one of these automated events, I wouldn’t want to be the driver of that car. I frequently get numbskulls who walk between me and another car while parallel parking. A driver who is automatically parking has to stay alert.

BiModal Glideway March 27, 2013 at 3:28 am

The autonomous car won’t be mainstream because of the cost to produce these cars. So take a look at our BiModal Glideway Dual Mode Transportation System on Facebook, Twitter and by visiting our website. Email us with any comments or questions you may have.
