Distractions

Tauvic Ritter edited this page Dec 2, 2020 · 9 revisions

We can define a distraction as anything that does not support safe driving, or that actively makes it worse.

Question: Should we monitor for specific distractions or just for hands on the wheel?

Three approaches:

  • Classification
    • Monitor for a list of known negative situations (eating, drinking, phone usage)
    • Monitor for a list of known positive situations (hands on the wheel)
  • Anomaly approach: monitor driver behaviour as a whole and assume most is positive
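The difference between the two strategies can be sketched in a few lines. This is a minimal illustration, not a real detector: the class names, the "likelihood under a normal model", and the threshold are all assumptions introduced here for clarity.

```python
# Illustrative sketch of the two monitoring strategies.
# Class names and thresholds are assumptions, not part of any real API.

NEGATIVE_CLASSES = {"eating", "drinking", "phone_usage"}   # known distractions
POSITIVE_CLASSES = {"hands_on_wheel"}                      # known safe behaviour

def classify(predicted_label: str) -> str:
    """Classification approach: match the prediction against known situations."""
    if predicted_label in NEGATIVE_CLASSES:
        return "distracted"
    if predicted_label in POSITIVE_CLASSES:
        return "safe"
    return "unknown"  # a classifier cannot label what it was never trained on

def anomaly_flag(likelihood_under_normal_model: float,
                 threshold: float = 0.05) -> bool:
    """Anomaly approach: flag behaviour that a model trained on 'normal'
    (mostly safe) driving considers unlikely."""
    return likelihood_under_normal_model < threshold
```

The `"unknown"` branch shows the weakness of pure classification: behaviours outside the training list fall through, which is exactly the gap the anomaly approach tries to cover.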

There are so many things people can do to get distracted; will we ever be able to identify them all? Which method provides the best information for feedback and further analysis?

Pose detection

Pose estimation refers to computer vision techniques that detect human figures in images and videos, so that one could determine, for example, where someone’s elbow shows up in an image. Note that pose estimation merely estimates where key body joints are; it does not recognize who is in an image or video.

The PoseNet model takes a processed camera image as the input and outputs information about keypoints. The keypoints detected are indexed by a part ID, with a confidence score between 0.0 and 1.0. The confidence score indicates the probability that a keypoint exists in that position.
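The keypoint output described above can be handled with a small helper. The exact field names (`part`, `score`, `position`) vary between PoseNet ports and are an assumption here; the structure below mirrors the TensorFlow.js PoseNet output shape.

```python
# Hedged sketch of working with PoseNet-style output: a list of keypoints,
# each with a part name, an (x, y) position, and a confidence score in [0, 1].
# Field names are assumptions; check the PoseNet port you actually use.

pose = {
    "keypoints": [
        {"part": "leftWrist",  "score": 0.92, "position": {"x": 210, "y": 340}},
        {"part": "rightWrist", "score": 0.18, "position": {"x": 480, "y": 360}},
        {"part": "nose",       "score": 0.99, "position": {"x": 330, "y": 120}},
    ]
}

def confident_keypoints(pose: dict, min_score: float = 0.5) -> dict:
    """Keep only keypoints whose confidence score meets the threshold,
    returned as a part-name -> (x, y) mapping."""
    return {
        kp["part"]: (kp["position"]["x"], kp["position"]["y"])
        for kp in pose["keypoints"]
        if kp["score"] >= min_score
    }
```

Filtering on the confidence score first avoids building logic on keypoints the model was effectively guessing about, such as the low-confidence right wrist above.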

We could use body pose estimation to determine whether the driver's body is in an active driving position, with hands placed on the steering wheel.

List of positive/negative situations (StateFarm Distracted Driver dataset)

Four years ago, State Farm insurance set up a $65,000 competition on Kaggle for detecting distracted driving.

We've all been there: a light turns green and the car in front of you doesn't budge. Or, a previously unremarkable vehicle suddenly slows and starts swerving from side-to-side. When you pass the offending driver, what do you expect to see? You certainly aren't surprised when you spot a driver who is texting, seemingly enraptured by social media, or in a lively hand-held conversation on their phone. According to the CDC motor vehicle safety division, one in five car accidents is caused by a distracted driver. Sadly, this translates to 425,000 people injured and 3,000 people killed by distracted driving every year. State Farm hopes to improve these alarming statistics, and better insure their customers, by testing whether dashboard cameras can automatically detect drivers engaging in distracted behaviors. Given a dataset of 2D dashboard camera images, State Farm is challenging Kagglers to classify each driver's behavior. Are they driving attentively, wearing their seatbelt, or taking a selfie with their friends in the backseat?

The 10 classes to predict are:

  • c0: safe driving
  • c1: texting - right
  • c2: talking on the phone - right
  • c3: texting - left
  • c4: talking on the phone - left
  • c5: operating the radio
  • c6: drinking
  • c7: reaching behind
  • c8: hair and makeup
  • c9: talking to passenger
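The class list above maps naturally onto the positive/negative split discussed earlier: `c0` is the single "positive" class and the other nine are known distractions. A small lookup table makes that usable in code:

```python
# The State Farm class list as a lookup table. Only c0 ("safe driving") is a
# positive class; the other nine are known distractions.

CLASSES = {
    "c0": "safe driving",
    "c1": "texting - right",
    "c2": "talking on the phone - right",
    "c3": "texting - left",
    "c4": "talking on the phone - left",
    "c5": "operating the radio",
    "c6": "drinking",
    "c7": "reaching behind",
    "c8": "hair and makeup",
    "c9": "talking to passenger",
}

def is_distracted(class_id: str) -> bool:
    """True for the nine distraction classes, False for safe driving
    and for unknown class IDs."""
    return class_id in CLASSES and class_id != "c0"
```

A classifier trained on this dataset would output one of these ten IDs per frame; the helper then reduces that to the binary safe/distracted signal needed for driver feedback.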