Auction and inspection companies are coupling traditional hands-on vehicle inspections with imaging technology to create condition reports that give customers better visibility into, and more confidence about, a vehicle’s condition.

Those companies are also using or testing artificial intelligence and monitoring advances in machine learning.

Having more transparency in condition reports is important because buyers and sellers are increasingly making financial decisions about vehicles by opting to click tires remotely rather than kick them at a physical location.

Consider this: Just before the first wave of COVID hit in 2020, forcing Manheim to go digital-only, more than half of its sales were already to digital buyers, said Brad Burns, Manheim associate vice president of vehicle information/digital.

But more telling: even though Manheim’s sites are now fully open to in-lane buyers, digital sales still hover around 80%, Burns said.

“It’s a big change and the fact that the percentage of digital sales settled there tells us that clients have realized there are some great advantages to transacting digitally,” he said.

“We are trying to provide the same amount or more confidence for clients to transact digitally, as it would be if they were standing next to the car.”

Trust and transparency

Doug Hadden, vice president of field initiatives at ACV Auctions, a digital wholesale marketplace, agrees.

“Condition reports today have to have true trust and transparency; they have to be a representation of what that car truly is,” Hadden said. “So when I make that buying decision based on what I’ve read in that condition report — the pictures, the data, now more than ever before, we do OBDII scans. I need to know as a dealer what I’m getting myself into.

“What I don’t want as a dealer? I don’t want a surprise when that car shows up.”
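
Hadden’s mention of OBDII scans refers to reading fault data from the vehicle’s onboard-diagnostics port. ACV does not publish its tooling, so the following is only a minimal sketch of what even a basic scan surfaces, written in Python against the open-source python-obd library; the adapter and workflow are illustrative assumptions, not ACV’s implementation.

    import obd  # open-source python-obd library for ELM327-style OBD-II adapters

    # Connect to the first available OBD-II adapter (USB or Bluetooth serial port).
    connection = obd.OBD()

    # Mode 03 request: stored diagnostic trouble codes (DTCs).
    response = connection.query(obd.commands.GET_DTC)

    if not response.is_null():
        # response.value is a list of (code, description) tuples.
        for code, description in response.value:
            print(f"{code}: {description}")

A stored code such as P0420 (catalyst system efficiency below threshold) is exactly the kind of surprise a buyer would rather see in the condition report than discover when the car shows up.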

One effort to keep such surprises to a minimum, adopted by many auction companies, is imaging technology that takes multiple rapid-fire photos of a vehicle’s undercarriage.

The technology then combines those photos into a single image that reveals the undercarriage’s true condition, Hadden said.

“You can see underneath that car and you can see it in digital clarity,” he said. “You can see that there is a hole in the muffler or those bushings don’t look too good, or wow, that thing looks brand new. You can actually see that sitting in your chair and that car was never on a lift.

“That’s the kind of stuff that excites me about the future.”
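
Neither Hadden nor the auction companies describe the compositing step, but fusing overlapping rapid-fire frames into one scan is a standard computer-vision task. Here is a minimal Python sketch using OpenCV’s off-the-shelf stitcher; the frame count and file names are hypothetical.

    import cv2

    # Load the rapid-fire frames captured as the vehicle passes over the camera
    # (frame count and file names are hypothetical).
    frames = [cv2.imread(f"undercarriage_{i:03d}.jpg") for i in range(40)]

    # SCANS mode suits a flat subject photographed by a translating camera,
    # unlike panoramas shot from a single rotating viewpoint.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, composite = stitcher.stitch(frames)

    if status == cv2.Stitcher_OK:
        cv2.imwrite("undercarriage_composite.jpg", composite)
    else:
        print(f"Stitching failed with status {status}")

The output is the single, liftless undercarriage view Hadden describes: one image a buyer can inspect from a chair.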

Imaging gantries

Manheim also has technology that provides images of vehicle undercarriages and is testing the use of imaging gantries at a “handful of locations,” Burns said.

To conceptualize an imaging gantry, think of a vehicle driving through a tunnel equipped with consistent lighting and 58 cameras set at exacting elevations and angles, so the vehicle is captured consistently no matter where it is in the tunnel, Burns said.

As a vehicle drives through, the cameras capture high-resolution images, and the system then “puts together an image-set like no other,” Burns said.

“You’ll be able to see everything from a half-inch scratch to reading manufacturer’s data on the side wall of the tire. These cameras are shooting five frames per second. Think about how much data that is,” he said.

“It really helps clients make confident decisions about what they think the condition of the vehicle is.”
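
Burns’s figures invite a back-of-envelope check. In the Python sketch below, the camera count and frame rate come from his description; the pass duration and per-frame file size are illustrative assumptions.

    # Rough data-volume estimate for one gantry pass.
    # 58 cameras and 5 frames/sec are Burns's figures; the rest are assumptions.
    CAMERAS = 58
    FPS = 5
    PASS_SECONDS = 10   # assumed time for a vehicle to drive through the tunnel
    MB_PER_FRAME = 4    # assumed size of one high-resolution image

    frames = CAMERAS * FPS * PASS_SECONDS
    print(f"{frames} frames per pass")                         # 2900 frames
    print(f"~{frames * MB_PER_FRAME / 1024:.1f} GB per pass")  # ~11.3 GB

Even under conservative assumptions, one pass yields thousands of high-resolution images and gigabytes of data, which is what makes detail like a half-inch scratch recoverable.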

The pilot started at Manheim Minneapolis at the end of 2021 and has expanded to Manheim Atlanta, Manheim Tampa, Manheim Phoenix and Manheim Ohio. A larger-scale rollout to other Manheim locations is scheduled for 2023, Burns said.

Human touch

Though technology is making it easier for buyers to examine vehicles up close from their computers or smartphones, images tell just part of a vehicle’s story, said Eric Widmer, senior vice president of sales and marketing at Alliance Inspection Management, an independent vehicle inspection provider.

“You still need a human in the loop to clarify what’s being picked up by the AI and the algorithms,” he said, referencing his company’s 300 to 350 inspectors who physically examine and inspect vehicles for clients such as rental car companies, dealers and financial institutions.

“Our guys have been trained to understand if a car has been hit and what’s been repaired. We have ongoing training. We’re looking at all the solutions that have been created in terms of AI and damage detection to make the inspector’s job easier, but we haven’t found one yet that you can use instead of an inspector.”

The inspection itself takes about 50% more time now than it did 20 years ago because vehicles are more complex and have more equipment that has to be checked and recorded, Widmer said. It now takes about 45 minutes, depending on the vehicle’s condition and age, he said.

How electric vehicles will impact inspections and condition reports is unknown, but change is a certainty, Widmer said.

“You won’t look at oil sludge, you’ll look at battery life,” he said.

“We’re trying to figure out what kind of information is going to be available and what’s important to the buyer and seller in terms of the transaction and what kind of value are you going to put on it.”

AI: consistent and objective

Jim O’Brien is general manager of the North American region and global remarketing head at Ravin AI, which uses artificial intelligence and imagery from mobile phones and CCTV cameras to inspect vehicles.

He is watching how AI and machine learning will impact inspections and condition reports in the not-too-distant future.

As AI continues to evolve, it will eliminate human subjectivity, add consistency and may eventually allow people who are not professional inspectors, including consumers, to conduct vehicle inspections by using a device to scan each part of the vehicle, he said.

“I think sellers are very interested in a more consistent and objective process,” he said.

He said it may take a while for users to fully adopt the technology because it will need to be trained, similar to the way robot vacuum cleaners “learn” to clean a home’s floors.

“The first few times that robot is going to get stuck on the rug, it’s going to want to run over a cord and not get around a chair,” O’Brien said. “You’ve got to tell the robot, ‘here’s how I want you to clean my rug’. You’ve got to train the robot to work in your environment and that could take six to 12 months.

“The (companies) that don’t want to go through the training process are the ones that are going to be behind.”