I kept honking at a sleeping Tesla driver on the freeway, but she didn’t wake up (mercurynews.com)
103 points by yumraj on Aug 24, 2021 | hide | past | favorite | 116 comments


Literally as written, this sounds like a medical emergency.

From the article: "It had its right turn indicator on, but it drifted slowly to the left, into my lane, before slowly going back into its own lane."

For this to have happened, the vehicle could not have had autosteer engaged. At most, it would have had regular traffic aware cruise control on, which would not have been monitoring for wheel input at all anyway. This strongly implies to me that the operator was at least operating the vehicle intentionally to a significant degree.

In any event, calling 911 is an appropriate response to an unresponsive driver.


What happens with Teslas on autosteer when an emergency vehicle approaches from behind? Will it safely get out of the way for flashing blue lights (US police), red lights (fire, ambulance). Amber lights - tow trucks? If a police car with lights on pulls in behind and waits, will autosteer pull you over? I am wondering how to safely stop a car with autosteer ON and driver OFF.


Apparently, it'll pull over and turn on the emergency blinking lights.

https://www.tesmanian.com/blogs/tesmanian-blog/tesla-autopil...


With just regular autosteer, absolutely nothing. Generally speaking, I'd expect the same response out of Nav on Autopilot with the "Enhanced AutoPilot" or "Full Self Driving" features.


Is it not possible for AP to lose the lane lines temporarily and then self correct?


When AP loses the lane lines and then recalibrates (usually due to very poor markings), it starts beeping rather insistently for you to take over immediately.

If you have lane detection on, a Tesla will steer you back into the lane if you start drifting off, even if Autopilot is off. (This alone is a huge safety feature.)


Dead people often can’t hear beeping


Deaf people neither. (I am Deaf.)


Thank you. Always glad to see people on HN properly citing their sources. Who knows what to believe on the internet these days.


Sure, but there are two issues here. The turn signal was on in the other direction. Activating the turn signal in base autosteer will disable autosteer. Activating it in more advanced modes (Nav on Autopilot) will trigger a lane change in the direction of the signal.

There might be a way to trigger a scenario like this in an FSD equipped car with all FSD related features turned off, and a really careful sequence of events with the turn signal, I guess? I'd have to try it to be sure, but I think it'd actually take some work to do.

But that last part does have me thinking it might be possible. The weird slowdown that he reported at the end could be consistent with AP finally giving up on alerting the driver and trying to "make safe" by slowing aggressively, probably while going crazy with audible alerts. Maybe then the driver perked up and took the off-ramp?

But tbh, it all reminds me of a driver that I ran into (thankfully not literally) a while back. They were in an older car, so obviously not on ADAS... just driving that badly and probably under the influence. It happens a lot.


Sure, if there's a blizzard or heavy rainstorm that blocks the cameras. Happens in Florida all the time. Rain so heavy the cameras can't see anything but a gray wall of water. In that case AP will emit an annoying audible alarm and disable itself, at which point you are steering the car manually.

In these conditions, human drivers will "follow the taillights" but I don't think Tesla allows for that yet. If AP can't see outlines of objects due to heavy rain/snow, it cuts out.


Right. It's situations like this that I point to when people say "AP is safer than human drivers" - because AP is disabled entirely when it is unable to work effectively. Humans don't have that luxury.


I suppose you count those as wins for yourself, but it seems smart and safe to insist on turning over control in that situation. Humans don't have a safe failure mode like that where they will stop driving or surrender control to an automated system when intoxicated, falling asleep, and so on.


I'm not arguing the merit of that.

I'm arguing that you can't point to numbers and say "Look, AP is safer than humans, because in the optimal subset of conditions where AP is 'able' to drive, it does better than humans in all conditions, even ones where AP would refuse to engage".


Sometimes it can make sense even for humans to stop driving and wait for better conditions. Occasionally humans are even forced to give up driving (for example, when stuck in snow).

Viewed in this light the statement that humans don't have the luxury to delegate is not always applicable.


Sure. There absolutely are.

But there is a large disparity between "conditions where _ALL_ drivers should stop driving and wait", and "conditions where AP will forcibly disengage, but an average human driver should have no issue carrying on with relative safety".


I've used AP in some situations where I was having a hard time clearly seeing the lines due to the rain. It held up pretty well, with me only very rarely intervening (from memory, 2 times in 100 miles). It probably helped that I slowed down pretty significantly due to the weather. I've also heard that the newer radarless vehicles are a little more picky.

In general, I'm surprised at how many things it can handle well, but it does still need a hand on the wheel for the few times that it will fail.


Exceedingly unlikely unless the lane demarcation is extremely bad. Lane-holding tech is really standard and reliable now (and even if you don't think AP is great, it's not stupider than what ships in mass-market cars).


>>unlikely unless the lane demarcation is extremely bad

Which is the case.....a lot of the time, actually? At least here in the UK the lane marking is frequently so bad that my Volvo Pilot Assist disengages because it can't see the lanes anymore - and I'm like, yeah, duh, I can't see them either.


The lane demarcation on the freeway south from San Francisco is so bad that it frequently confuses me, and I've been driving on it for 20 years.


Which one? I always found both the 101 and 280 pretty clear!


Most recently, northbound 101 under the 380 overpass. But, also just before the 92, and through Redwood city.


Agreed, though I'll add that on the very rare occasion that my Model 3 does lose the lane lines a bit, it tends to very quickly jerk back into the lane once it picks up the lines again. "it drifted slowly to the left, into my lane, before slowly going back into its own lane" does not sound like plausible autopilot behavior based on my experience; even if it was on autopilot and accidentally left the lane, it wouldn't slowly go back to its lane, it would sharply correct itself as soon as it found out what happened.


Exactly... and turning on a signal completely turns off autosteer if in base autopilot.

Not sure if it could be coerced into automatically coming back on if the car had FSD and the signal was turned back off, though?


With EAP or FSD, turning on the signal will trigger a lane change


Well, there is one thing that's stupider. There's no cruise control. It's AP or nothing. If I just want the car to do nothing but hold the current speed, like my 1986 Chevrolet Camaro did, it doesn't have that feature. If the cameras don't work for some reason such as a heavy rainstorm, there's no "dumb mode" cruise control as a backup. I find that annoying.


I wouldn't be using cruise control during a heavy rainstorm. Standing water and hydroplaning plus cruise control = not a fun time.


Yeah this sounds crazy unsafe. Cruise control is for long boring drives down straight highways in high visibility.


If you pull down once on the stalk, it should activate cruise control, right? It's pulling down twice that activates autosteer.


That still does TACC. He wants cruise control without TACC. That is not possible in any modern Tesla, AFAIK. I believe the most recent to offer it was the Model 3 SR, which is no longer available. The SR+ and the Y SR both included AP with mandatory TACC.

It's weird, as other cars at least offer an option to disable TACC.


I have lane assist in my 2019 Subaru Outback, and it works well maybe 60% of the time. Most definitely not something I would call “extremely reliable”.


It’s possible that she wasn’t sleeping and instead may have been experiencing a medical emergency like heart attack or stroke.


I remember a story in Germany the other way around, where a Tesla driver spotted a car on the highway with an unconscious driver. He pulled in front of the other car, braked, and slowed gradually until both cars came to a halt. Neither insurance wanted to pay for the damage (voluntarily induced damage and such crap), so Tesla paid the bill for the Tesla driver's repair. [1]

[1 German]: https://www.waz-online.de/Nachrichten/Panorama/Tesla-ueberni...


Happened on the Golden Gate Bridge in 2007. No Teslas involved, naturally, but people sometimes do honorable things to stop an otherwise runaway vehicle.

> Beatty took bold and immediate action. He drove his Ford F-350 Super Duty utility truck in front of the Jeep and allowed it to essentially crash into the back of his vehicle so it would latch on, according to bridge officials. He then "slowly and safely" guided the Jeep across the bridge's southbound lanes and brought it to rest in a safe area, away from the flow of traffic.

edit: This was pre-divider, so a runaway vehicle on the GGB could have made quite a mess by crossing into oncoming traffic.


This sounds nice but you should NOT stop on a highway. This could've killed both + whoever slams into them.

People die all the time even when stopped on the emergency lane (you should really step over the barrier to be safe).


So you suggest doing nothing? Letting the car continue until it leaves the road at the first corner? Potentially causing a head on if there is no dividing barrier?

Tesla driver did the right thing, and is a hero in my books.


Call the police, stay nearby with warning lights perhaps? You could even take the middle option and slow them down to a safe speed.

It's an extraordinary situation, there are no well-proven answers here.

It is, however, well proven that stopping on a highway kills people; if they had caused a pile-up, this would be a very different story.

It's baffling to me that you and others (downvoting) are so convinced that this was a safe action because the outcome was good this time.


It's not a safe action. But it's also not safe to do nothing. I think the driver made the right call (and with the benefit of hindsight it was in this case.)

Doing nothing would have resulted in a high speed accident with certainty of 1. This could have involved a head on or left wreckage on the highway. The driver took a chance on a risky intervention and it paid off.


Sometimes it can still be the safest thing to do. It sounds like that was one of those times.


That was mentioned in the article.


In 2007, a Piper Seneca plane crashed into a condo building in Richmond, BC, Canada. Right over a fairly busy intersection, too. The elderly pilot had a heart attack or something.

https://www.cbc.ca/news/canada/british-columbia/1-dead-2-inj...


Literally my first thought


> It had its right turn indicator on, but it drifted slowly to the left, into my lane, before slowly going back into its own lane. Then it started to slow down considerably.

Autopilot won't let you signal in one direction while changing lanes in the opposite direction. The only way to do that is by driving manually.

My guess is the Tesla was under manual control and the driver was intoxicated, extremely sleep deprived, or having a medical issue.


Something sounds fishy here. The car changing speeds without anything prompting it and swerving in and out of lanes is not behavior that would be seen if it was driving on Autopilot.


Being asleep at the wheel could cause some issues.


The torque required to cause it to swerve out of its lane would also be enough to disable Autopilot. This story isn't newsworthy if Autopilot isn't on.


This exact same thing happened to me in LA. I was driving behind a really slow Tesla (like 35 mph on the freeway). I switched lanes to pass him and noticed he was just asleep at the wheel. Maybe he also had a medical emergency, but it seemed like he was just sleeping. Maybe I should have called 911.


"Just" sleeping behind the wheel? I can't imagine not calling emergency services if I ever saw something like that.


Maybe?


It's pretty crazy how often people fall asleep behind the wheel.


When I was younger I would get road hypnosis, particularly at night. I didn't even have to be tired; I'd just start "zoning out" completely despite doing things to combat it (rolling down the windows for fresh air in my face and blasting music).

I feel like this lightened up a bit in my late 20s. In any case, I love that I now live in a city where I don't have to drive and do not own a car.


A few years ago I nodded off on a 50mph thoroughfare. I was jet lagged from a transpacific flight a day earlier; I knew I was falling asleep, but there was nothing I could do to stay awake, and there was no exit I could take. Eventually I drifted into the lane on the left, hit a cargo truck, bounced off and hit another car on the right. Thankfully no one was hurt, but it could have turned out really ugly.

Don’t drive when you’re jet lagged — or fatigued, generally speaking.


> but there was nothing I could do to stay awake, and there was no exit I could take.

Surely there was a shoulder for use during emergencies. Why on earth didn't you pull over and turn on your hazards?


Probably extreme sleep deprivation? It can be hard to evaluate the width/safety of a shoulder in that state, or even ascertain whether you're looking at a shoulder or another lane. Scary stuff.


I wouldn't even try to use AutoPilot for this reason. I'd fall deep asleep if I wasn't actually steering the car.

I get active cruise control. I don't get lane assist. Doesn't everyone else's mind just perform steering pretty much without thought anyway?


I thought I was a badass and would never start falling asleep while driving, but as I got older I noticed it getting worse, so now I take breaks after 3 hours of driving.

I thought you couldn't get sleepy on a motorbike, because it is loud and you get constant airflow. But after enough road at night, I also had to stop every 2 hours, do sit-ups, get some water, and only then hit the road again.

It is important to take breaks.


One of the biggest fears that stops me from buying a four wheeler. I have fucking slept on my two wheeler :(

(Yes, it’s a concern and I’ll be a danger to others as well. I feel so comfortable and sleepy on moving vehicles)


He could be dead, actually. I imagine a modern adaptation of "The Headless Horseman": a murdered driver rides on in his electric vehicle, charging itself at charging stations and automatically driving between the locations frequented by its deceased owner. ...


See the short story "Road Stop" by David Mason, published 1963.

    The car called the Traveler, rolling at the stately thirty miles an hour it
    always held, was coming down the road now, and the two men stood, watching. The
    woman, a little behind them, watched too, her face growing whiter. No one said
    anything as the old fashioned car rolled by, straight and steady down the
    highway, holding the center of the lane as sharply as it always did.

    There was a film of dust inside the windows, though the Traveler was clean and
    shining outside. But the film did hide the white bone faces, the despairing
    hands that had long ago stopped trying to break through those closed windows.
You can read the whole thing here: https://archive.org/details/1963-01_IF/page/n91/mode/2up?vie...


And on the next page, there's that classic Berserker story where the pilot trains his dog to operate a MENACE:

https://en.wikipedia.org/wiki/Matchbox_Educable_Noughts_and_...


The dead driver, his slumped body having fallen toward an "Accept" button on the car's touch screen just as an offer to drive for Uber appeared, now makes $200,000 / year going non-stop without bathroom or sleep breaks.


Oh what a beautiful idea for a short story. Would have been a great writing prompt over in the redditverse probably.

[Edit:]

Could be a crime novel. A classic whodunit, but with the need to trace back the route, with things happening along the way and the car's data having been tampered with.


There's a powerful sci fi short story I read in an anthology quite a while ago (I believe called The Flying Dutchman) describing a crewless bomber being repaired, fueled, taking off, bombing its target, landing, and then doing the same thing again. As it describes this repetition, you slowly realize that humanity has long since been wiped out and the bombing runs are automated systems repeating themselves. (That's my memory of the story, anyhow, I read it in the 80's :) ).

(Edit: found it, "Flying Dutchman" by Ward Moore, it's from the 50's and you can find its text online with those terms)


I like that the name refers to a legendary ghost ship itself [1].

[1] https://en.wikipedia.org/wiki/Flying_Dutchman


There’s a short film that has the same plot - I assume based on that story. I don’t remember the name.


Do you mean these two short films by Dima Fedotov? https://youtu.be/pyMNIFZTQkg https://youtu.be/IjJmTeBSEzU


Thanks for linking these! They indeed capture that same concept and are quite beautiful, in a haunting sense. In this case automated technology seems to inhabit the same space as ghosts in a non-sci-fi setting. At least the type of ghost that re-enacts some event over and over. (The title of the short story I mentioned is an overt reference to one of those stories). I wonder what added meaning the technological element conveys, e.g. the difference between a story about ghostly sailors re-entering naval combat ad infinitum vs. a story about robotic pilots continuing a physical war patterned by their long dead designers. Both are a form of haunting.


For the story to work well, there need to be no logs, or misleading ones. I assume a Tesla records its travels in at least one log?


It could work if the car had gone on enough trips that it wasn't clear which one was the important one. Maybe someone is sending the car out every day to hide their tracks.


This actually happens more often than people realize. It makes sense when you stop and consider what percentage of their lives Americans spend in their cars (like the old rule that "most accidents happen in the home"): with an average daily one-way commute of 27.6 minutes, the average American spends roughly 2.9% of their life in the car, so all else being equal, about 2.9% of medical emergencies can be expected to happen in cars.

I don't have high-quality numbers, but a low-quality Google search tosses out that 20% of accidents are caused by a participant having a medical emergency and losing control of their vehicle (grain-of-salt: the number is from an injury law firm, so obvious incentive to bias the number).
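For what it's worth, the arithmetic can be sketched like this (assuming two trips per day, five commuting days a week, and no other driving; all back-of-envelope, which lands in the same ballpark as the 2.9% figure):

```python
# Back-of-envelope: fraction of life spent behind the wheel.
# Assumptions: 27.6-minute one-way commute, two trips per weekday,
# averaged over a 7-day week.
ONE_WAY_MIN = 27.6
daily_avg = ONE_WAY_MIN * 2 * 5 / 7   # average minutes driven per calendar day
fraction = daily_avg / (24 * 60)      # share of a 24-hour day
print(f"{fraction:.1%}")              # prints roughly 2.7%
```

Change the weekly-trips assumption and you shift the number a few tenths of a percent either way, which is presumably where the small gap to 2.9% comes from.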


I would expect fewer medical emergencies in a car than in some other settings. But people who've had a medical emergency are likely to get into a car to head to the hospital, and can then run into serious trouble.

In particular, it is common for people to suspect that they may be having a heart attack, try to drive to the hospital, and then lose control on the way.

This happens often enough that standard advice from medical professionals for potential heart attack victims includes tips like "Survive. Don't drive!" If you've got worrying symptoms, call 9-1-1 and do NOT attempt to drive. See https://www.valleyhealthlink.com/blog/2017/february/survive-... for a random example.

That said, the average ambulance ride in California is close to $600. So people have a strong incentive to try to avoid the ambulance, no matter how bad an idea that may be.


It would be interesting to know what portion of medical events happen in a car. So many things are less likely (e.g. while driving you aren't using a knife to cut up dinner, or cleaning leaves out of the gutter), but so many are worse (e.g. if you are going 100 km/h and you faint).


Only if "has medical emergencies" is independent from "drives to work." My guess is more medical emergencies happen in hospitals than on the road, even though the average person spends a lot less time in the hospital than on the road.


Reminds me of this short video[1] about an automated house that outlives its owners.

[1]https://www.youtube.com/watch?v=5LNHYz89sNc


See also "There Will Come Soft Rains" by Ray Bradbury.

All you need is a solar powered self driving car, and perhaps a programmed routine. (If occupied, to work in the morning and home at night).


https://lightyear.one/ has you covered.


I'm not up to date on solar power, but my bullshit detector is going off quite a bit. My understanding is that solar charges very slowly per unit of panel area. Intuition says you can maybe drive (not super fast) for 30 minutes after a full sunny day of charging. I'd like to be wrong.
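A quick sanity check, with every number a rough assumption rather than Lightyear's actual specs:

```python
# Back-of-envelope solar driving range. All parameters are assumptions:
# ~5 m^2 of panels on a car roof/hood, ~20% cell efficiency,
# ~1000 W/m^2 of peak sun, ~150 Wh/km for a frugal EV.
panel_area_m2 = 5.0
efficiency = 0.20
irradiance_w_m2 = 1000.0
consumption_wh_km = 150.0

charge_power_w = panel_area_m2 * efficiency * irradiance_w_m2  # ~1000 W
km_per_sunny_hour = charge_power_w / consumption_wh_km          # ~6.7 km
print(f"~{km_per_sunny_hour:.1f} km of range per hour of peak sun")
```

So an hour of peak sun buys only a handful of kilometers; a full sunny day of charging gets you a modest drive, which roughly matches your intuition. "Self-charging" only works if the car is driven far less than the panels collect.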


I've actually read it, about 30 years ago. Good memories.


Steven Spielberg's first movie 'Duel' is visually dated but transfers surprisingly easily to a world of automated driving.


That could be an interesting premise for a Neo-Noir Sci-Fi novel.


In the Bolo universe ( https://en.wikipedia.org/wiki/Bolo_universe ), there were multiple cases where the commander was dead but the AI kept executing its plan.


Or a Twilight Zone episode.


So the sound dampening works well too?


I refuse to drive because I'm terrified of the responsibility and risk anyway, never mind trusting software enough to sleep while still being the one responsible.

Please take care driving, said with love and maybe understanding.

Cycling in Dublin is both gorgeous and scary, what with fellow drivers, cyclists, and pedestrians. Walking is great.

The odds of killing somebody while walking are small.


Maybe AutoPilot/self-driving status should be obviously indicated by some light, sort of like an on-duty taxicab.

This could alert other drivers to be wary: there may be no human driver at the wheel.


I've always thought this about even cruise control.

I've quite often been sat at 75mph with CC engaged, started an overtake, and had the car I'm passing speed up, so I just tuck in behind... then they slow down and I go to overtake again...

I think a purple (??) CC light, or autopilot light would help people avoid getting aggro about an algorithm.


It should be possible to overtake a self-driving car and carefully slow down in front of it to a full stop. Normally they detect that and stop too.


That in itself can also be dangerous. There are in fact traffic laws in various places against driving too slow or otherwise impeding traffic flow because doing so can cause accidents.


You are always allowed to stop in an emergency. Or if your engine stops working for example ;)


You're supposed to turn on your blinkers and pull over ASAP. Stopping cold on a highway lane is a huge hazard.


And your hazard warnings can't be seen from the front (well, they can be on, but they'll be invisible to traffic behind the car you're stopping).

So unless the self-driving vehicle has a routine to activate its own hazards on a weird slowdown, it may be very hard for the traffic coming up behind to notice, further increasing the danger.


Tesla autopilot can overtake slow cars. You might have to swerve and block it. Certainly possible, as it is not aggressive, but it wouldn't be as straightforward to stop it as with regular traffic aware cruise control.


That's possible, but the article said that it changed lanes to the left while signaling to the right. That rules out having NoA enabled with "confirmationless" lane changes.


That would rule out autopilot entirely, wouldn't it? If true then this is just a case of someone falling asleep behind the wheel that could happen in any car. But with modern safety features like lane departure avoidance and automatic emergency braking. Tesla's active safety features can even brake for stop signs and traffic lights in addition to cars and pedestrians. And regenerative braking will bring the car to a stop much more quickly than a gas car when the accelerator is not pressed.


I'm trying to think of weird edge cases, but yeah it does seem like the odds are against AP being on in this case. Maybe TACC was on, but even that seems a little inconsistent with the report.


I, for one, am not getting into a car fight with our robot overlords.


On a highway that could cause an accident as traffic behind the self driving car would have to screech to a stop or swerve around. There needs to be some way to get it to pull over onto the shoulder.


This highlights a meta-problem of highways in general: they're very high-speed conduits not designed for traffic to whiplash to 0 mph over a short distance, yet accidents and emergencies can definitely create roadblocks that force traffic to do exactly that.

It'd be nice to have a solution to that. There are no inexpensive ones (railways handle this with corridor control: every N miles, a signal indicates whether it's safe to proceed to the next segment. Some sections of highway have signage allowing for this; most don't).


I've always wanted to run a traffic simulation to see if congestion is eliminated when the highway is partitioned (5-10 rows of cars per partition) into chunks moving at a precise speed, with cars required to maintain some distance from the partition behind them. This could be accomplished with a tracking laser following the cars along the wall. Of course this would be ruined as soon as drivers wanted to hop into the partition in front of them, change lanes, etc.

Another slightly relevant idea is to mark intersections with a large red no-go area placed at the calculated point where, if you were going the speed limit when the signal changed to yellow, you could not safely pass the stop line before red. I hate having to guess how long a yellow light is (3 to 6 seconds is a large range).
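That no-go box is essentially the classic "dilemma zone" from traffic engineering. A sketch of the calculation, where the reaction time and braking rate are illustrative assumptions rather than any official standard:

```python
# Hypothetical "no-go box" length for an intersection (the dilemma zone):
# if the light turns yellow while you're inside it, you can neither stop
# comfortably before the line nor clear it before red. Assumptions:
# 1 s driver reaction time, 3 m/s^2 comfortable deceleration.
def no_go_zone_m(speed_kmh: float, yellow_s: float,
                 reaction_s: float = 1.0, decel_ms2: float = 3.0) -> float:
    v = speed_kmh / 3.6                                    # speed in m/s
    stop_dist = v * reaction_s + v ** 2 / (2 * decel_ms2)  # closer than this: can't stop
    clear_dist = v * yellow_s                              # farther than this: can't clear
    return max(0.0, stop_dist - clear_dist)

# At 60 km/h, a 3-second yellow leaves a ~13 m dilemma zone;
# a 6-second yellow eliminates it under these assumptions.
print(no_go_zone_m(60, 3.0), no_go_zone_m(60, 6.0))
```

Which is exactly why yellow durations are supposed to be timed to the approach speed; painting the residual zone on the road would just make the mistimed ones visible.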


If you can't slow down in time to avoid hitting the car in front of you when it slows down, you're tailgating, and are a danger to yourself and the cars around you.


The problem isn't when you're tailgating, the problem is when you're comfortably cruising at 80 mph and all of a sudden you realize there's a stopped car in your lane. It can be surprisingly hard to notice the one car that isn't going with the flow of traffic. There are videos of cars getting totaled this way. Not sure I want to know what happened to the passengers.

On busy highways, if your car breaks down and you need to slow, one of your biggest concerns should be to get out of the way asap before someone rear-ends you.

Whether you consider it a result of tailgating or not, pile-ups are a common result of unexpected action on a highway.


What do you do if there is a traffic jam in front of you? You always hit the last car?


I haven't hit anyone so far, but I've been rear-ended myself. Twice. Thankfully I've never been stranded on a highway like this.. https://www.youtube.com/watch?v=2lnYOlUnsWI


What do you do if a tree or something falls on the road?


That’s why you should stop slowly in such a case. Create a traffic jam behind you.

A traffic jam is a far less dangerous situation than an intoxicated driver.


You shouldn't have to screech to a halt if the car in front of you slowly decelerates.



Thanks. I'm getting a "please disable your ad blocker" message when I don't even use one.


> My questions are these: At what point does the self-driving car stop driving since the person’s hands were still on the wheel? Was there anything else I could have or should have done?

"Steering wheel torque-based driver monitoring" is not good enough for gauging the driver's attention.

It is a "poor surrogate" for driver attention. Not my words, but the NTSB's [0].

[0] https://arstechnica.com/cars/2019/09/feds-scold-tesla-for-sl...


Well, I know you're not supposed to fall asleep while "driving" a Tesla right now but... isn't that the ultimate end goal? Cars that self-drive so well that you can take a nap while you're alone in them?


It likely was not using any driver monitoring in the provided scenario.


If given the time, location, and description of the vehicle, could Tesla track them down?


[flagged]


Saved is a generous way to put it.


Seizures and strokes behind the wheel are far more prevalent than people think. I can't speak for the entire USA, but having had a family member who suffered from seizures for years, the second someone has one, doctors usually work to revoke their license immediately.

So yes, Autopilot does save lives. We know how many people have died with Autopilot on, and I suspect it's a fraction of those who die with it off. Autopilot does some things poorly and some things far better than a human can. It's a technology with its strengths and weaknesses. In my case, I was on Autopilot driving through Denver when the car suddenly swerved out of the lane into an adjacent one. I freaked out, thinking the system was going wonky.

Turns out that some drunk "street driving enthusiasts" had decided to have a race on I-70, and one of them missed that the lane he was trying to merge into from my rear right was occupied by a truck. It was in my blind spot, so while I was paying attention to the front of the car, Autopilot was monitoring where my eyes could not.

In California, I was driving a Tesla in Cupertino on a fairly busy highway, late at night. The car started doing its "pay attention now" beeps a second before I realized that someone was driving the wrong way on my side of the freeway.

Would either of these have been a fatality? Probably not, but possibly so.

People fall asleep behind the wheel all the time. They plow into parked cars. They drive on the wrong side of the road. They plow into emergency vehicles.

Anything we can do that makes things safer is nothing but good.

I bought an early Model 3, and I will tell you that Autopilot is night-and-day better now than it was. The evolution wasn't in hardware but in software. As more people use it, the software gets better.


Odds are it's about 50/50 regarding lives saved vs. lives cut short.


Could you walk us through the math here?



