A new system tracks drivers' elbows and wrists to estimate how quickly they could take control of an autonomous vehicle in an emergency.
Researchers have developed a technology that tracks the hand movements of a distracted driver and estimates how long it would take that driver to retake control of the car in an emergency.
If manufacturers can clear the remaining legal hurdles, Level 3 (L3) self-driving cars may one day carry sleeping or texting passengers safely to their destinations. But such cars need a way to gauge how quickly the driver could respond if asked to retake control of the vehicle in an emergency.
To meet this need, Kevan Yuen and Mohan Trivedi of the University of California, San Diego developed a new hand-tracking system, described in a study published November 22 in the IEEE Transactions on Intelligent Vehicles.
Tracking someone's hands may sound simple, but it is difficult to do inside a cramped car, where there are few good vantage points for a camera. The driver's hands can also be occluded by other objects, and strong sunlight falling on the driver's arms can wash out the camera's view.
In their new approach, Yuen and Trivedi took an existing program that tracks full-body movement and adapted it to track the wrist and elbow movements of drivers and passengers, distinguishing the left and right joints of the two front-row occupants. The researchers then developed and applied machine learning algorithms, training the system on 8,500 annotated images, so that it could support L3 autonomous driving technology.
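The adaptation described above — keeping only the wrist and elbow joints from a full-body pose estimate and assigning them to the two front-row occupants — can be sketched roughly as follows. The pose-estimator output format, the seat-assignment rule, and all names here are illustrative assumptions, not the authors' actual implementation.

```python
# Sketch: reduce a full-body pose estimate to the eight front-row joints
# (driver and passenger, left/right elbow and wrist). The input format is
# a hypothetical pose-estimator output; the paper's pipeline differs.

JOINTS_OF_INTEREST = {"left_elbow", "right_elbow", "left_wrist", "right_wrist"}

def assign_front_row_joints(people, image_width):
    """Keep only elbow/wrist keypoints and label each detected person as
    'driver' or 'passenger' by which half of the frame they occupy
    (assumes a left-hand-drive car and a roughly centered cabin camera)."""
    labeled = {}
    for person in people:
        joints = {name: xy for name, xy in person.items()
                  if name in JOINTS_OF_INTEREST}
        if not joints:
            continue
        # The mean x-position of the kept joints decides the seat.
        mean_x = sum(x for x, _ in joints.values()) / len(joints)
        seat = "driver" if mean_x < image_width / 2 else "passenger"
        labeled[seat] = joints
    return labeled

# Example: two detected people, joint name -> (x, y) pixel coordinates.
people = [
    {"nose": (200, 80), "left_elbow": (150, 220), "left_wrist": (140, 300),
     "right_elbow": (260, 230), "right_wrist": (270, 310)},
    {"nose": (520, 85), "left_elbow": (470, 225), "left_wrist": (460, 305),
     "right_elbow": (580, 235), "right_wrist": (590, 315)},
]
front_row = assign_front_row_joints(people, image_width=640)
```

The seat-assignment heuristic (splitting the frame down the middle) is only one plausible choice; the published system distinguishes occupants with learned cues rather than a fixed threshold.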
"This method can be used across a very wide range of real-world driving environments," Trivedi said. "Regardless of the number of passengers or the vehicle, it can detect hands, localize them, and analyze their activity with high accuracy and efficiency."
Their analysis showed that the system could identify the positions of all eight joints (the left and right elbows and wrists of the driver and front passenger) with 95 percent accuracy, with a localization error of about 10 percent of a person's average arm length.
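The 10 percent figure is a normalized localization error: predicted joint positions are compared against ground truth, and the distance is scaled by arm length. A minimal sketch of that kind of metric, assuming 2-D pixel coordinates (the paper's exact evaluation protocol may differ):

```python
import math

def normalized_localization_error(predicted, ground_truth, arm_length):
    """Mean Euclidean distance between predicted and true joint positions,
    expressed as a fraction of arm length (0.10 means a 10% error).
    Both inputs map joint name -> (x, y); arm_length is in the same units."""
    dists = [math.dist(predicted[j], ground_truth[j]) for j in ground_truth]
    return (sum(dists) / len(dists)) / arm_length

# Example with two joints and an assumed arm length of 100 pixels.
truth = {"left_wrist": (140, 300), "left_elbow": (150, 220)}
pred = {"left_wrist": (146, 308), "left_elbow": (150, 214)}
err = normalized_localization_error(pred, truth, arm_length=100.0)
```

Normalizing by arm length makes the error comparable across people of different sizes and across cameras at different distances, which is the usual motivation for this style of pose-estimation metric.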
But in some cases the tracking system fails: for example, when a driver wears unusual clothing or heavy tattoos not represented in the training data, or when one of the driver's arms blocks the camera's view of the other.
The researchers said they encountered some of these problems in testing, but that they could be addressed by repositioning the camera, using multiple camera views, and expanding the training data set to cover a wider variety of clothing.
"This project is part of our research efforts in developing safe autonomous driving," Trivedi said, adding that the team is discussing the possibility of commercializing the technology with at least one potential customer.