The vehicles of the future must not only be able to 'see' but also 'feel'

That’s the view of Boaz Mizrachi, founder and CTO of Tactile Mobility, a sensing and data company for AVs, who believes that driverless vehicles must not only “see” the road via intelligent vision systems, but also “feel” the grip, bumps, curvature, hazards and inclement weather beneath their tyres.

“By 2021, many are predicting that we’ll see the launch of level 3/4 AVs that are capable of highway speeds. I don’t see it happening so quickly, and particularly not just with vision systems. Although these capabilities are required to realise an AV, these alone will not be sufficient,” he explained.

Road conditions and speed present two major challenges for AVs, according to Mizrachi. He believes that right now, AVs are not advanced enough to react to certain hazards – aquaplaning, for example. To manage this, he points to a form of predictive risk assessment: crowd-mapped ‘tactile maps’ of road conditions.

Tactile Mobility believes it has a solution in the form of its Surface-DNA technology, currently deployed in a fleet of ‘ordinary’ (that is, not autonomous) vehicles. The software gathers and analyses data about the road surface and uses it to generate these ‘tactile maps’.

“The software sits inside the electronic control unit (ECU) and connects to the existing sensors found inside a vehicle. We apply physical modelling and machine learning in real-time to create meaningful information,” explains Mizrachi. “The system receives approximately 3000 messages per second, which are processed in the car, and the results are transmitted to the cloud.”
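A rough sketch of that in-vehicle pipeline is shown below. The message fields, the grip model and the upload step are hypothetical placeholders rather than Tactile Mobility’s actual interfaces; the point is simply that thousands of raw messages per second are reduced to a small result inside the car before anything is sent to the cloud.

```python
import random

MESSAGES_PER_SECOND = 3000   # approximate ECU message rate quoted above

def read_sensor_message():
    """Stand-in for reading one message off the vehicle bus (e.g. CAN)."""
    return {"wheel_speed_kph": 50 + random.uniform(-1, 1),
            "engine_torque_nm": 95 + random.uniform(-5, 5),
            "accel_long_g": random.uniform(-0.3, 0.3)}

def estimate_grip(msg):
    """Stand-in for the physical-model / machine-learning step that turns
    raw signals into a grip estimate between 0 (no grip) and 1 (full grip)."""
    return max(0.0, min(1.0, 1.0 - abs(msg["accel_long_g"]) * 0.5))

def upload_to_cloud(summary):
    """Stand-in for transmitting only the distilled result off the vehicle."""
    print(f"uploading summary: {summary}")

def process_one_second():
    grips = []
    for _ in range(MESSAGES_PER_SECOND):      # messages processed in the car
        grips.append(estimate_grip(read_sensor_message()))
    upload_to_cloud({"mean_grip": sum(grips) / len(grips),
                     "min_grip": min(grips)})

process_one_second()
```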

At this stage, the company provides aftermarket smart devices that are installed in fleets to give drivers and fleet managers meaningful insights about the vehicle and the surface it drives on.

The company is now looking to the AV market, proposing a constantly updating worldwide map that transmits road surface conditions to the vehicle. The idea is that the system will intelligently assess whether there is a risk and, if so, how severe it is, so the vehicle can react appropriately, for example by slowing down, without human intervention.
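How a vehicle might act on such a map is easiest to see in a toy example. The map lookup, the grip scale and the speed heuristic below are assumptions for illustration only, not the company’s published logic.

```python
def lookup_segment_grip(tactile_map, segment_id):
    """Fetch the crowd-sourced grip estimate (0..1) for a road segment,
    falling back to a conservative default when no data exists yet."""
    return tactile_map.get(segment_id, 0.5)

def safe_speed_kph(grip, base_limit_kph):
    """Very crude heuristic: scale the legal limit by the available grip."""
    return base_limit_kph * grip

def plan_speed(tactile_map, segment_id, current_speed_kph, base_limit_kph):
    grip = lookup_segment_grip(tactile_map, segment_id)
    target = safe_speed_kph(grip, base_limit_kph)
    if current_speed_kph > target:
        return ("slow_down", target)       # react without human intervention
    return ("maintain", current_speed_kph)

# Example: the map reports a slippery stretch ahead (grip 0.4) on a 100 km/h road.
tactile_map = {"A1-segment-042": 0.4}
print(plan_speed(tactile_map, "A1-segment-042", 100, 100))  # ('slow_down', 40.0)
```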

So far, the project has been a seven-year venture, but Mizrachi hopes it will become a commercial reality within another two years. The company is therefore seeking to collaborate with OEMs to realise that goal.

Over the last few years, the company has also overcome some tough challenges, such as developing a ‘language’ that can represent the tactile sensing of the road. “This language can be used to describe what one car experienced on a specific road segment and share it with the other cars. In turn, the other cars can drive on the same road and know what to expect,” he explains.
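One way to picture such a shared ‘language’ is as a compact record per road segment that any car can publish or read. The fields below are a guess at the kind of attributes involved, not the actual format:

```python
from dataclasses import dataclass, field

@dataclass
class SegmentTactileReport:
    """A shareable description of what one car 'felt' on a road segment."""
    segment_id: str          # identifier of the road segment
    grip: float              # estimated friction, 0 (ice-like) to 1 (dry tarmac)
    roughness: float         # bumpiness of the surface, 0 (smooth) to 1 (severe)
    hazards: list = field(default_factory=list)   # e.g. ["standing_water", "pothole"]
    sample_count: int = 1    # how many measurements back this report

report = SegmentTactileReport("A1-segment-042", grip=0.45, roughness=0.2,
                              hazards=["standing_water"])
# Another vehicle approaching A1-segment-042 can read this report and
# know to expect reduced grip before its own sensors detect anything.
```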

But there are challenges still to be faced. “In order to share tactile information, we need to normalise the sensed data with the parameters of the sensing car. This way we have a nominal formalised way to describe the world’s road surfaces,” he explained.

“This information is being taken from a variety of vehicles and will not always be consistent with one another. For example, one vehicle’s sensors may indicate a slippery road because its tyres are worn, rather than the road actually being wet.”
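A minimal sketch of that normalisation step, assuming (purely hypothetically) that the sensing car’s condition can be folded into a single correction factor, might look like this:

```python
def vehicle_correction_factor(tyre_wear, suspension_health):
    """Hypothetical factor (<1 when the car itself degrades grip) used to
    separate the vehicle's contribution from the road's contribution."""
    return (1.0 - 0.3 * tyre_wear) * (1.0 - 0.1 * (1.0 - suspension_health))

def normalised_road_grip(raw_grip, tyre_wear, suspension_health):
    """Remove the sensing car's bias so reports from different vehicles
    describe the same road surface on a common scale."""
    factor = vehicle_correction_factor(tyre_wear, suspension_health)
    return min(1.0, raw_grip / factor)

# A car with badly worn tyres measures low grip (0.5) on a road that is
# actually dry; normalising attributes part of that loss to the vehicle.
print(normalised_road_grip(raw_grip=0.5, tyre_wear=0.8, suspension_health=1.0))
```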

He adds that along with “surface DNA”, i.e. road condition, the Tactile Mobility system will also provide “vehicle DNA”: data that describes the condition of the car and is tailored to that specific brand. He foresees this becoming a useful maintenance tool for the automotive industry of the future, as well as for the smart vehicles of today.

As for fully autonomous vehicles, he says it’s hard to predict when these will materialise. He envisions strict regulations emerging around AVs’ capabilities, for example restricting them to certain speeds, certain roads, or certain times of day or journey lengths, which he says will gradually relax as the technology’s safety and reliability improve.