A new behavioural artificial intelligence (AI) system is claimed to prevent more than 90% of crashes by predicting the actions of pedestrians and other road users and alerting the driver to a potential incident.

Behaviour AI is available as a dashcam after-fit solution, but the company behind the system, Humanising Autonomy, hopes it will be offered as a standard fit, linked to advanced driver-assistance systems (ADAS), within the next two to three years. It is already in talks with several vehicle manufacturers.

Co-founder and chief product officer Leslie Nooteboom said the system helps autonomous systems “to understand people”. It predicts what other road users will do and alerts the driver. It also provides the fleet with data about any near miss, as well as any incident, which helps with driver training.

The system has been installed on Arriva buses in London for more than a year, while trials have just started with a van-based fleet operator through Humanising Autonomy’s partnership with FreightLab, an innovation challenge between Transport for London (TfL) and 10 industry partners, including Royal Mail, DPD, UPS and the John Lewis Partnership.

Behaviour AI scans the road for vulnerable road users and predicts their next movement. Nooteboom claims it can detect objects up to 80 metres ahead at 40mph, twice the distance of the industry standard, with 99% accuracy.

“It’s not just recognition; it predicts if they are going to cross the road or where they are moving to on the road,” he said. “Our data shows that the system can predict two seconds before the driver realises what is happening and send them an alert.

“It is also continuously learning and not just from one driver, from the entire fleet because all the data sits within our management system. We believe it can prevent around 92% of accidents.”
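
To put those figures in context, a rough back-of-the-envelope calculation (illustrative only, not supplied by Humanising Autonomy) shows how detection range translates into warning time at the quoted speed:

```python
# Illustrative only: converts a detection range and vehicle speed into an
# approximate warning time. The 80m / 40mph figures are from the article; the
# 40m comparison follows from the "twice the industry standard" claim.
MPH_TO_MS = 0.44704  # metres per second in one mile per hour

def warning_time(detection_range_m: float, speed_mph: float) -> float:
    """Seconds before a vehicle reaches an object first detected at detection_range_m."""
    return detection_range_m / (speed_mph * MPH_TO_MS)

print(f"Behaviour AI (80m at 40mph): {warning_time(80, 40):.1f}s")          # ~4.5s
print(f"Typical ADAS range (40m at 40mph): {warning_time(40, 40):.1f}s")    # ~2.2s
```

On those assumptions, doubling the detection range roughly doubles the time available to warn the driver, which is consistent with the two-second advantage Nooteboom describes.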

The system is frequently updated to take account of new forms of mobility, such as e-scooters.

“They were a problem because they (riders) weren’t moving their limbs, but they were moving fast. So, we had to adapt the model to detect e-scooters,” Nooteboom explained.

“This is where we add intelligence to the system so it machine learns and applies that to the real world. It doesn’t just learn from the data we give it; it’s an interpretable AI approach. We can see how it’s making decisions, so we know how sure it is about making those decisions. Then we know what we need to do to make it even better, which we do with over-the-air updates.”
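
The sketch below illustrates the general idea of a confidence-gated alert in the spirit of that description. It is a hypothetical example: the class, field names and threshold are assumptions for illustration, not Humanising Autonomy’s implementation.

```python
# Hypothetical sketch of a confidence-gated alert decision. None of these names,
# fields or thresholds come from Humanising Autonomy; they are illustrative only.
from dataclasses import dataclass

@dataclass
class Prediction:
    road_user: str      # e.g. "pedestrian", "cyclist", "e-scooter rider"
    action: str         # e.g. "will cross the road"
    confidence: float   # model's certainty in the predicted action, 0..1

ALERT_THRESHOLD = 0.7   # assumed value: alert the driver only on confident predictions

def should_alert(pred: Prediction) -> bool:
    """Alert only when the model is sufficiently confident, so decisions stay auditable."""
    return pred.confidence >= ALERT_THRESHOLD

print(should_alert(Prediction("e-scooter rider", "will cross the road", 0.86)))   # True
print(should_alert(Prediction("pedestrian", "will stay on pavement", 0.40)))      # False
```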

The venture capital-backed company has its own data to support its claims about accident reductions; within the next six months it will have analysed a large volume of real fleet data to provide more robust proof.

This includes using near-miss data for driver training purposes.

“We take the video of the near miss and extract information on the risk level for driver education, even where there was no accident,” said Nooteboom.

“We also provide the information for the insurance company so they can understand the lead up to the accident.”

The pricing structure has yet to be set, but simply preventing one crash would justify the investment, according to Nooteboom.

He added: “We believe we are first to market with this solution. It uses small chips so it can be used now. The competition is the predictive AI in the full autonomous vehicles.”

SOCIAL DISTANCING SOLUTIONS

Humanising Autonomy is playing a part in improving the safety of passengers and staff at public transport interchanges through a partnership with Transport for Greater Manchester (TfGM).

The company’s behavioural video analytics software was deployed last year to help TfGM understand social distancing issues, assess how passenger behaviour is affected by the Covid-19 pandemic and make temporary infrastructure changes.

By analysing CCTV footage, the system enabled TfGM to re-route public transport to hot spots where large numbers of people were waiting for a bus.
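
At its simplest, that kind of decision comes down to counting people in each monitored area and flagging locations above a threshold. The sketch below is an assumption about the approach, not TfGM’s or Humanising Autonomy’s actual pipeline.

```python
# Illustrative only: flag "hot spot" locations where CCTV-derived person counts
# exceed a threshold. The counts, names and threshold are invented for the example.
from collections import Counter

waiting_counts = Counter({"Interchange A": 42, "Interchange B": 9, "Interchange C": 27})
HOT_SPOT_THRESHOLD = 25  # assumed limit for maintaining social distancing

hot_spots = [place for place, count in waiting_counts.items() if count > HOT_SPOT_THRESHOLD]
print(hot_spots)  # ['Interchange A', 'Interchange C']
```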

The data will also be used to better understand the capacity for walking and cycling, and how space is used at transport interchanges.