According to a 2008 study by the Swedish National Road and Transport Research Institute, about 20 percent of all road traffic accidents are caused by driver fatigue. Tired motorists are also eight times more likely than rested motorists to be involved in an accident, and their driving performance resembles that of an intoxicated driver.
Fatigue and microsleep at the wheel are often the cause of serious accidents, especially with large vehicles. However, the initial signs of fatigue can be detected before a critical situation arises.
So, in 2015, US carmakers pledged to make several types of Advanced Driver Assistance Systems (ADAS) standard equipment by 2022.
One of the first drowsy-driving monitoring systems to appear in truckers’ cabs was a driver-facing camera that alerted the driver when it registered drooping eyelids or a nodding head.
Naturally, privacy concerns kept this technology from going far. So, where are we today?
Driver cameras: love them or hate them?
When it comes to trucking insurance, litigation, and driver safety, there is no question that in-cab driver-facing cameras are essential to an effective safety program for carriers.
Face- and gaze-tracking algorithms monitor the driver and send audio alarms, vibrate the driver’s seat, and notify the monitoring station if safety parameters are not met.
In fact, some devices try to predict a driver’s drowsiness.
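As a rough illustration of how that alert escalation might work (the class name, thresholds, and alert names here are hypothetical, not taken from any vendor’s product), consider a minimal Python sketch:

```python
from dataclasses import dataclass


@dataclass
class AttentionMonitor:
    # Thresholds (in seconds) are illustrative, not from any real product.
    audio_after_s: float = 3.0    # sound an in-cab audio alarm
    vibrate_after_s: float = 5.0  # also vibrate the driver's seat
    notify_after_s: float = 8.0   # also notify the monitoring station
    _off_road_s: float = 0.0      # running eyes-off-road timer

    def update(self, eyes_on_road: bool, dt: float) -> list[str]:
        """Feed one camera frame's gaze result; return the alerts to fire."""
        if eyes_on_road:
            self._off_road_s = 0.0  # driver looked back: reset the timer
            return []
        self._off_road_s += dt
        alerts = []
        if self._off_road_s >= self.audio_after_s:
            alerts.append("audio_alarm")
        if self._off_road_s >= self.vibrate_after_s:
            alerts.append("vibrate_seat")
        if self._off_road_s >= self.notify_after_s:
            alerts.append("notify_station")
        return alerts
```

In practice the gaze classification itself would come from a computer-vision model; this sketch only shows the escalation layer that sits on top of it.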
While carriers like Shipley Energy in Central Pennsylvania run a rewards program with an extra bonus for drivers whose camera performance stays above a set threshold, not all carriers manage to build a culture that balances a soft touch with teachable moments.
For example, on Reddit last month, @Shesnotintothistrack posted this on r/Trucker about in-cab cameras at Waste Connections:
**** these things. I can’t change music, take a drink, eat, talk on the phone, scratch my nose, or anything without getting written up.
They say it’s for “accident mitigation” and for “safety”
You can see what happened by monitoring outward facing video and data recording controls at the time of the incident. If someone is on their phone or whatever you’d see that they aren’t responding in an appropriate time frame to avoid an accident.
**** is about control. I’ve been driving for **** near ten years. I don’t need to be babysat to do my job. Seriously feels like an invasion of privacy. I don’t need big brother freaking the **** out every time I take a drink of water.
We need to ban these things. Or at least limit use. Putting them in a truck with a driver who is a repeat offender? Sure. That makes sense. But someone who has no incidents has to be watched? Yeah **** that.
Looking at leaving my spot currently because I’m just fed up. I work too many hours for too little pay to be watched like a hawk.
With the rise of AI software, what can we learn from Tesla’s driver-facing camera system and its large amounts of data, and how can we apply that to the future of cameras in trucking?
Benefits of in-cab driver-facing cameras
Before we talk about Tesla’s Full Self-Driving system, let’s quickly understand why these cameras are not going away for medium and large fleets.
First, they can help lower insurance premiums.
No matter the size of your fleet, the most important factor affecting your insurance premiums is your claim history.
According to Andrew Walker, Safety and Environmental Manager at Shipley Energy, “If claims dollars go down, your premiums will go down, eventually. Not having accidents is what gets your insurance premiums reduced.”
Accidents will inevitably happen, but proactive safety is vital in proving you are taking safety precautions to prevent accidents from happening.
The challenge for carriers is figuring out which tools to implement to reduce claims.
These tools include:
- Inward- and outward-facing cameras
- A data-driven driver training program
- ELD data for driver safety reports
- A third-party proactive DOT compliance program
Automated driver coaching, along with customizable dashboards, alerts, and analytics, allows managers to implement “high-touch” safety programs without inflating overhead or consuming all their time.
Second, driver-facing cameras can reduce litigation risk.
An estimated 70 percent of accidents involving a large truck are caused not by the truck driver but by the non-professional driver.
Driver-facing cameras are touted as beneficial for proving a driver’s innocence in the event of an accident.
- Driver-facing cameras reveal what the driver was doing behind the wheel
- Video footage can pinpoint roadside and in-cab conditions at the scene of the accident
- Should a case go to court, footage can establish responsibility or help exonerate the driver and trucking company, which can reduce the amount paid in a settlement and insurance premiums
So, now that we understand a bit more about the benefits of driver-facing cameras, what can we learn from Tesla?
AI-based camera systems can learn risky behaviors but rely on large amounts of data
Okay, hear me out.
Tesla is the data king when it comes to AI driving tech.
Today, there are approximately 400,000 vehicles in North America with Tesla’s FSD system, and over 150 million miles have been driven since the FSD Beta program launched in October 2020. This is a huge jump from the 160,000 users announced in September 2021. Tesla’s earlier Autopilot system, meanwhile, has logged over 3 billion miles.
Artificial intelligence learns through data, but certain techniques can expedite that learning. Tesla builds similar expedited-learning functions into its self-driving systems, which is what allows them to improve so quickly.
To do this safely, Tesla installed driver-facing cameras (first in the Model 3 in 2017 and the Model Y in 2021) to make sure drivers are paying attention to the road when they use Autopilot, the company’s driver assistance system.
In April 2021, Tesla computer wiz @greentheonly on Twitter posted several videos of what the in-cabin camera records along with the data showing what the system “sees” when analyzing the driver’s face.
The goal is simple: make sure the driver is not distracted behind the steering wheel. But how does the system prove this?
Remember those annoying nagging notifications the trucker was complaining about? Well, it’s the same in Tesla vehicles.
To limit this nagging, Tesla is trying to understand what constitutes, and what leads to, drowsy driving.
Tesla’s operating system runs a driver-attentiveness subroutine that uses the cabin-facing camera. The data being tracked includes:
- Phone use
- Driver eyes nominal, closed, up or down
- Driver looking left or right
- Driver head down
- And more
Many FSD beta testers describe the “1:7 Rule”: if your eyes look away from the road for longer than seven seconds, you will eventually receive a warning to pay attention.
It is important to note that the Tesla owner manual does state that the “cabin camera does not perform facial recognition or any other method of identity verification. To protect your privacy, cabin camera data is not associated with your vehicle identification number.”
This data provides not only accountability but also security. By bringing a camera-based driver-attention system online, Tesla can improve it over time, and we have just learned that this is happening.
More recently, in a @greentheonly tweet thread about a software update, Green said: “They are now tracking additional things like how many yawns the driver had recently, how many blinks and how long they were, leaning. All this is to calculate how drowsy the driver is.”
Today, these cameras will start tracking additional data, such as:
- How many yawns the driver had recently
- How many blinks and how long they were
Then, the software will compare it to how centered the driving is, how many lane assist warnings and corrections happened lately, and add support to recognize when drivers are looking at mirrors, car controls, etc.
However, it is easy to drown in a sea of data, so driver-monitoring systems must pick out the important events to report.
Tesla’s goal is to use its AI software to calculate how drowsy the driver is and help reduce those driver warnings.
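As a back-of-the-envelope sketch of how signals like these might be blended into a single drowsiness score (the weights, ceilings, and function name below are my assumptions; Tesla has not published its actual formula):

```python
def drowsiness_score(yawns: int, blinks: int, mean_blink_s: float,
                     lane_corrections: int) -> float:
    """Blend in-cab and driving signals into a 0-1 drowsiness score.

    The signals mirror those mentioned above (recent yawns, blink count
    and duration, lane-assist corrections); the weights and "very drowsy"
    ceilings are made up for illustration only.
    """
    normalized = [
        min(yawns / 5.0, 1.0),             # yawns over the last few minutes
        min(blinks / 60.0, 1.0),           # elevated blink rate
        min(mean_blink_s / 0.5, 1.0),      # long, slow blinks
        min(lane_corrections / 4.0, 1.0),  # recent lane-keeping interventions
    ]
    weights = [0.30, 0.15, 0.35, 0.20]     # must sum to 1.0
    return sum(w * s for w, s in zip(weights, normalized))
```

A fully alert driver scores 0.0, a saturated set of signals scores 1.0, and a fleet system could alert once the score crosses a tuned threshold. The real value of a large fleet’s data is in learning those weights rather than hand-picking them.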
Elon Musk has also commented that the cabin-facing camera is used to identify the driver’s and front passenger’s body positions during an accident and optimize airbag deployment in real time.
Whichever AI camera-tracking system cracks the drowsy-driving code can pass this information to NHTSA or apply it to other systems in the trucking industry, such as Netradyne’s camera AI system, Motive, Omnitracs, and more.
Ultimately, this can lead to major improvements in fleet safety: automakers can improve their ADAS technology so that all vehicles on the road can be alerted to drowsy driving, and commercial insurance claims can decrease, improving insurance premiums.
The transportation industry is on the cusp of dramatic safety improvements due to the large amounts of data being tracked. It is just a matter of time.
Proactive driver training lowers distracted driving and insurance risk
The second step is regular driver training.
Underwriters look at many different risk factors and data points to establish potential risk levels and determine your insurance score, one of the primary factors in how much monthly premium the consumer will pay.
After this analysis, underwriters can provide a monthly premium estimate.
Teaching a driver through regular online training, such as “the hazards of distracted driving” or “how to do a pre-trip and post-trip inspection”, can help you keep your safety risk low.
Some of the easiest things to catch during a driver inspection are also the most common violations written up on a roadside inspection.
The more this information stays top-of-mind for drivers, the better the chance they will drive safely.
Our DOT trainers offer a variety of in-person or online training courses tailored to the specific needs or weaknesses of your company and drivers.