Quick Transport Solutions Inc.

What Technologies Underpin Autonomous Trucks? Part I

Autonomous trucking technology has come a long way. A lot of work goes into making sure a large Class 8 vehicle can safely drive itself. A human can learn to do it in about two weeks, but turning a big rig over to a computer takes millions of miles and billions of dollars before we put a machine behind the wheel. Many struggle with the question: Is driving a big rig just that complex, or are computers simply not up to the task? The answer is a combination of both.

Humans are special because we can quickly adapt to situations on the fly. When something unexpected happens, the computers in our heads make split-second decisions based on millions of outside factors. Getting a machine to that point has proven a much harder task for machine learning and artificial intelligence professionals. A computer must be instructed on how to proceed through written lines of code. But can written lines of code account for every contingency?

How Do You Teach Computers Autonomous Driving?

Take the example of a car passing a big rig just before the big rig initiates a lane change. Managing this scenario does not take a lot of thought for a human, but a computer needs specific written instructions to safely complete the maneuver.

Writing down the code that drives those instructions is quite difficult. You have to cover specifics like how many miles per hour or meters per second each vehicle is traveling, and every change in speed and position must be accounted for before any big move is made. The computer must be able to determine overall interaction and intent without any human input.
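
To make that concrete, here is a minimal sketch in Python of the kind of kinematic check a planner might run before committing to a lane change. Every name, number, and threshold here is an illustrative assumption, not code from any real autonomous driving stack:

```python
# Illustrative sketch only: all function names, units, and thresholds
# are assumptions for this example, not production autonomy code.

def seconds_to_close_gap(gap_m: float, truck_speed_mps: float,
                         car_speed_mps: float) -> float:
    """Time until a faster car behind the truck closes the current gap."""
    closing_speed = car_speed_mps - truck_speed_mps
    if closing_speed <= 0:
        return float("inf")  # the car is not gaining on the truck
    return gap_m / closing_speed

def lane_change_is_safe(gap_m: float, truck_speed_mps: float,
                        car_speed_mps: float, maneuver_time_s: float = 6.0,
                        safety_margin_s: float = 2.0) -> bool:
    """Only attempt the maneuver if the gap outlasts it, plus a margin."""
    time_available = seconds_to_close_gap(gap_m, truck_speed_mps, car_speed_mps)
    return time_available > maneuver_time_s + safety_margin_s

# A car 40 m back doing 31 m/s against the truck's 25 m/s closes the gap
# in about 6.7 s -- too tight for a 6 s maneuver with a 2 s margin.
print(lane_change_is_safe(40.0, 25.0, 31.0))  # False
```

A real planner re-evaluates this continuously as speeds and positions change rather than checking a single snapshot, which is exactly the bookkeeping described above.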

Some companies are gaining insight from open-source information already on the web. For instance, Google publishes information about how its machine learning hardware interacts with the environment. Google Nest can control many different aspects of the home, but how does it do it? Studying how Google does it allows trucking and autonomous OEMs to adapt the same principles for use in their own autonomous programs.

Comparing Human and Machine Thinking in the Moment

Human truck drivers hardly think twice when making a maneuver. Still, even though we consider ourselves high-functioning thinkers, our reasoning is based on flawed data. In the best circumstances, a human driver can estimate the closing speed of a vehicle coming up from behind and gauge whether there is enough space for a maneuver. These estimates are nothing more than that: conjecture based on past experience.

A computer, on the other hand, comes with some built-in advantages. A computer driver knows precisely how fast the car behind it is traveling and how many seconds it will take for that car to close the gap. As a result, the computer can make split-second decisions based on precise calculations. And it isn't that the computer is overthinking the situation. Programmers are equipping the computer with data-based intuition, which differs from pure logic. This way they try to get the computer to behave like a human, but without the possibility of human error.
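
To illustrate that distinction, here is a hedged sketch contrasting a purely logical rule with a learned, "intuition-like" score. The features, weights, and threshold are invented for this example; in a real system they would be fitted from vast amounts of driving data:

```python
import math

# Pure logic: one hard-coded rule. (Threshold is an invented example value.)
def rule_based_decision(time_to_close_s: float) -> bool:
    return time_to_close_s > 8.0  # change lanes only if the gap holds 8+ seconds

# Data-based "intuition": a learned score over several features.
# These weights are illustrative placeholders, not real fitted values.
WEIGHTS = {"time_to_close_s": 0.9, "rain": -2.5, "night": -1.5}
BIAS = -5.0

def learned_decision(features: dict) -> bool:
    score = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    probability_safe = 1.0 / (1.0 + math.exp(-score))  # squash to 0..1
    return probability_safe > 0.5

# The hard rule says "go" either way; the learned score hesitates
# when it is raining at night, much like an experienced driver would.
print(rule_based_decision(9.0))                                            # True
print(learned_decision({"time_to_close_s": 9.0, "rain": 0, "night": 0}))  # True
print(learned_decision({"time_to_close_s": 9.0, "rain": 1, "night": 1}))  # False
```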

One of the companies at the forefront of this evolution is Waymo, which offers an advanced lidar and camera system. Their system is designed to provide high-resolution imaging to enable fully autonomous driving and to make split-second, educated guesses about pedestrian intent based on which way a pedestrian happens to be leaning or looking. Their system can also see inside another car and quickly anticipate what the other driver is going to do based on the movement of the driver's limbs or the direction their face is pointing.

Using Key Points to Ascertain Human Intent

Waymo's systems, like those of many other companies working in this space, rely on what they call "key point" studies. Their autonomous system, built on a series of sensors, lidar beams, and cameras, can see which direction a pedestrian is facing and whether or not they are leaning toward the path of the vehicle. Are they leaning forward, or are they leaned back and relaxed?

Key points are defined by body joints and specific limb movements, and Waymo has even created its own pose estimation algorithm. The autonomous driving industry has been working on these pose estimation algorithms in an open-source fashion for many years. Developers animate realistic video-game-style characters and program them to make lifelike movements.
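
As a rough illustration of the idea, the sketch below reads a pedestrian's "lean" off a handful of 2D pose key points. The joint names, coordinates, and threshold are assumptions for this example; real pose estimation models track dozens of joints with confidence scores:

```python
# Illustrative sketch: inferring pedestrian lean from 2D pose key points.
# Joint names, pixel coordinates, and the threshold are example assumptions.

def lean_direction(keypoints: dict, threshold_px: float = 10.0) -> str:
    """Compare the shoulder midpoint to the hip midpoint: if the upper
    body is offset horizontally from the hips, the person is leaning."""
    shoulder_x = (keypoints["left_shoulder"][0] + keypoints["right_shoulder"][0]) / 2
    hip_x = (keypoints["left_hip"][0] + keypoints["right_hip"][0]) / 2
    offset = shoulder_x - hip_x
    if offset > threshold_px:
        return "leaning right"   # possibly toward the roadway
    if offset < -threshold_px:
        return "leaning left"
    return "upright"

pose = {  # joint -> (x, y) in image pixels
    "left_shoulder": (118, 80), "right_shoulder": (142, 80),
    "left_hip": (100, 160), "right_hip": (124, 160),
}
print(lean_direction(pose))  # shoulders sit ~18 px right of the hips -> "leaning right"
```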

As they program the computer driver, the autonomous machine draws on the movements and actions of the virtual human to decide how it should react. Sufficiently advanced autonomous systems will eventually be able to recognize partially occluded objects, whether a person hidden between two vehicles or behind a sign. The goal is for the computerized driver to correctly predict a person's moves based on what it has been taught in the virtual world.

What is Visual Perception Technology?

No matter how impressive all this sounds, nothing created through machine learning has yet come close to a human's visual perception. The situational analytics capabilities of the human mind, aided by the visual spectrum, are a wonder of nature.

Our eyes can quickly focus on a point whether it is mere feet away or miles away. And thanks to lifelong learning and evolutionary adaptation, humans can interpret visual inputs and make accurate determinations remarkably quickly. We make decisions based on visual input without giving them a second thought.

But how will automated systems make the same determinations? Without eyes and a supercomputing brain, autonomous technologies collect pertinent information using other systems. These include cameras, sensors, radar, and lidar (light detection and ranging), rounded out by thermal imaging and infrared sensors. With so many systems combined to create an all-around picture, the hope is that they will come close to human perception.
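
A toy sketch of what "combining systems" can look like appears below: each sensor reports a distance with its own confidence, and the fused estimate weights them together. The sensor weights and numbers are invented for illustration; production fusion pipelines are far more elaborate:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "camera", "radar", "lidar", or "thermal"
    distance_m: float
    confidence: float  # 0.0 .. 1.0

# How much each modality is trusted -- made-up example weights.
SENSOR_WEIGHT = {"camera": 0.9, "radar": 0.7, "lidar": 1.0, "thermal": 0.6}

def fuse(detections):
    """Confidence-weighted average distance, plus an agreement score."""
    total = sum(SENSOR_WEIGHT[d.sensor] * d.confidence for d in detections)
    distance = sum(SENSOR_WEIGHT[d.sensor] * d.confidence * d.distance_m
                   for d in detections) / total
    agreement = total / sum(SENSOR_WEIGHT[d.sensor] for d in detections)
    return distance, agreement

# Foggy night: the camera is unsure, but lidar and thermal still see clearly.
obstacle = [
    Detection("lidar", 62.0, 0.95),
    Detection("camera", 60.5, 0.40),
    Detection("thermal", 61.0, 0.90),
]
distance, agreement = fuse(obstacle)
print(f"fused distance ~{distance:.1f} m, agreement {agreement:.2f}")
```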

Advances in Visual Perception Technology

Companies have been making huge strides in advancing the capabilities of visual perception technology. And thanks to these advances, in some ways visual perception technology is even better than our eyes. Why? Imagine a system that combines infrared and thermal imaging. Human eyes cannot see through fog or smoke, and it is very difficult for truck drivers to spot a person, animal, or other obstruction when visibility is very low. A perception system equipped with the right sensors can do this without a problem.

In fact, thanks to recent advances in the field, visual perception technologies can provide highly accurate visual information from just a few feet away out to 500 yards. And this type of visual acuity can be achieved in bright daylight or in the dark of night. They do this through a clever combination of visual systems set up and controlled by an A.I.-driven central processing unit.

Lidar specifically has been a popular choice among autonomous vehicle and technology OEMs. By sending out constant pulses of light, lidar systems can work at ranges of 500 feet and beyond, giving an accurate window into what's happening down the road and leaving adequate time to adjust, whether by slowing down or changing direction.
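
The ranging itself comes down to a simple time-of-flight calculation: light travels at a known speed, so the round-trip time of each pulse gives the distance to whatever it bounced off. A minimal worked example (illustrative values only):

```python
SPEED_OF_LIGHT_MPS = 299_792_458  # meters per second

def lidar_distance_m(round_trip_s: float) -> float:
    """Distance = (speed of light * round-trip time) / 2."""
    return SPEED_OF_LIGHT_MPS * round_trip_s / 2

# A pulse that returns after ~1 microsecond hit something ~150 m away.
print(f"{lidar_distance_m(1e-6):.1f} m")  # ~149.9 m
```

Repeating that measurement millions of times per second across a sweep of angles is what builds that window into the road ahead.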

Cameras and Radar Are Still in Use

Camera resolution has been improving and costs have been dropping dramatically. While lidar is the primary forward-facing vision source on commercial motor vehicles, cameras and radar have made big strides. Smaller and more sensitive cameras provide capabilities trucks could not rely on even a decade ago.

By combining systems like lidar, cameras, sensors, and machine learning, autonomous commercial motor vehicle technologies might almost be ready for prime time. Still, you can bring all the technology you want and install it into big rig CMVs, but will they be ready?

Truck drivers shouldn't worry. Unpredictable interactions with human drivers and other road obstructions happen almost daily, and human truck drivers are still better at that kind of predictive analysis. We still haven't reached a point where computers can completely displace human operators. No matter how close we get to an autonomous driving world, it's going to be a long time before truck drivers should worry about being put out of a job.
