AI Expert Explains: How Does Artificial Intelligence Work To Improve Logistics and Delivery?

What exactly has AI’s impact on logistics been, and where will AI continue to improve the last mile? And what does this innovation mean for a company like yours, involved in delivery?

Luckily, we have our resident AI expert, Jim Barnebee, who is also our VP of Infrastructure. He’s been digging into components of AI for over 15 years so that we all don’t have to.

It couldn’t be a more important topic. 98 percent of logistics companies have said that data-driven decision-making is essential to their future success, and Big Data and Artificial Intelligence go hand in hand. AI in logistics is also where companies can get a lot of bang for their investment—over half the cost of all delivery logistics is contained within the first and last mile of the logistics supply chain.

Recently, Jim appeared on two podcasts, where he answered lots of the big questions on our minds related to AI’s impact on logistics. Here’s a summary of the questions he was asked:

First, what does every company that wants to up its logistics game, especially a smaller company, need to know?

First, the fundamentals. Perhaps the biggest leap forward that’s already taken place for logistics companies is the cloud. It takes an enormous amount of computing power to constantly calculate and recalculate the best route from Point A to Point B as conditions change. Fortunately, small companies do not have to invest in that cloud infrastructure or AI expertise themselves, because they can use a Platform as a Service (PaaS) or Software as a Service (SaaS) provider such as GetSwift to supply that power for them. That alone has been a major step forward for logistics and delivery companies.

Logistics companies have also invested big in robotics automation, especially in the crucial and costly first mile of delivery. Think of warehouses, autonomous drones, and self-driving vehicles, all of which have grown in importance during the Covid-19 pandemic. Investing in these areas can push margins higher, and so there have been big strides here.

What are the areas where AI can give you the most bang for your buck in last mile delivery?

Over half the cost of all delivery logistics is contained within the first and last mile of the logistics supply chain. Each company will want to look at ROI. They should ask, “Which areas or components of our last mile delivery journey, if improved, would yield the biggest ROI for our customers?” It turns out the answer is usually in improving Dispatch, Tracking, Route Optimization, and Customer Notifications.

Can you go through each of these areas, one by one, and talk about AI’s impact? 

Dispatching is being optimized with AI to determine the best way to make sure the right person, vehicle, or even company gets an order for delivery. Many factors can play into these decisions beyond the obvious one of which available person is closest to the pickup and drop-off.

Who has the capacity to move the item, who has a more fuel-efficient vehicle, who has to pay tolls, and even pay or benefits differentials or partial-load optimization (such as sharing space) can all play into these decisions.
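To make the idea concrete, dispatch can be framed as scoring every candidate across weighted cost factors and picking the lowest-cost one. The factor names and weights below are invented for illustration; they are not GetSwift’s actual model.

```python
# Hypothetical weighted-scoring sketch for AI-assisted dispatch.
# Factor names and weights are illustrative, not any real system's model.

def dispatch_score(driver, weights):
    """Lower score = cheaper candidate; every factor is treated as a cost."""
    return sum(weight * driver[factor] for factor, weight in weights.items())

weights = {
    "distance_to_pickup_km": 1.0,  # proximity still matters most
    "fuel_cost_per_km": 5.0,       # less efficient vehicles cost more
    "toll_cost": 2.0,              # route-specific toll exposure
    "capacity_penalty": 3.0,       # 0 if the item fits easily
}

drivers = [
    {"name": "A", "distance_to_pickup_km": 1.2, "fuel_cost_per_km": 0.25,
     "toll_cost": 1.50, "capacity_penalty": 0.0},
    {"name": "B", "distance_to_pickup_km": 2.0, "fuel_cost_per_km": 0.12,
     "toll_cost": 0.00, "capacity_penalty": 0.0},
]

best = min(drivers, key=lambda d: dispatch_score(d, weights))
print(best["name"])   # B wins despite being farther from the pickup
```

The same shape extends to any factor the business cares about; the hard part is choosing the weights, which is where learned models replace hand-tuning.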

In Tracking, optimizations such as how often to request a location or status update, and what kind of information to track, can all play into decision making. Some companies improve fuel efficiency or driver alertness by tracking how many miles, and at what speed, drivers log between rest breaks. Or, if GPS directs the shortest route but that route has tolls, optimized tracking might show it’s cheaper to take a little longer but pay less in tolls using more efficient vehicles.

A lot of work has gone into Route Optimization. In last mile delivery, a company has to move people or goods not just from Point A to Point B, but from Point A to Point F, with stops and changing conditions along the way. Because of AI, there is a way to pipe real-time data (traffic, weather conditions, construction, and more), along with dynamic changes such as added stops, into an algorithm that continuously updates the route to keep it as efficient as possible. There may actually be 300 or 400 changing conditions along the way, and there is a race in the logistics and delivery industry to integrate those changing conditions with your systems to come up with the most efficient routes.
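A minimal sketch of that re-planning loop, using a simple nearest-neighbor heuristic and a hypothetical travel-time table standing in for real traffic feeds (production solvers are far more sophisticated):

```python
# Minimal sketch of continuous route re-optimization: a greedy
# nearest-neighbor pass, re-run whenever live conditions change or a
# stop is added. Real solvers handle hundreds of signals; this only
# shows the re-planning loop. All times below are hypothetical.

def plan_route(start, stops, travel_time):
    """Order the remaining stops greedily by current travel-time estimates."""
    route, here, remaining = [], start, set(stops)
    while remaining:
        nxt = min(remaining, key=lambda s: travel_time(here, s))
        route.append(nxt)
        remaining.discard(nxt)
        here = nxt
    return route

# Live travel-time estimates in minutes; in practice these would come
# from traffic, weather, and construction feeds.
minutes = {("A", "B"): 10, ("A", "C"): 25, ("B", "C"): 5,
           ("B", "A"): 10, ("C", "B"): 5, ("C", "A"): 25}
eta = lambda a, b: minutes[(a, b)]

print(plan_route("A", ["B", "C"], eta))   # ['B', 'C']

minutes[("A", "B")] = 40                  # congestion reported on A -> B
print(plan_route("A", ["B", "C"], eta))   # ['C', 'B'] after re-planning
```

The key point is not the heuristic itself but that the plan is cheap enough to recompute whenever the inputs change.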

Now, you can also think about how business analytics can improve your routes. Let’s take a big trucking company that delivers 10,000 packages a day in New York City. How many packages get delivered quickly, by how many trucks, to how many places, crossing how many bridges, and at what toll cost? Consider something as simple as this: all of your trucks take a different bridge, which increases your routing time, but every toll is 50 cents less. You’ve taken an extra 30 minutes to deliver the packages, which may not cost you anything in customer satisfaction, but it has saved you $5,000 a day in toll fees. That’s the kind of thing you can do when you have access to big data: it shows you the business impact of making these kinds of decisions.
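The arithmetic behind that example, in round numbers (these figures simply restate the hypothetical above; how tolls allocate across trucks versus packages is deliberately glossed over):

```python
# Back-of-the-envelope version of the bridge example: assume 10,000
# tolled crossings a day, each 50 cents cheaper on the alternate bridge.
savings_per_crossing = 0.50   # dollars saved per crossing
crossings_per_day = 10_000
extra_minutes = 30            # slower route, acceptable to customers

daily_savings = savings_per_crossing * crossings_per_day
yearly_savings = daily_savings * 365
print(daily_savings, yearly_savings)   # 5000.0 per day, 1825000.0 per year
```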

Customers now expect to know what’s happening every step of the way with their orders. Providing package tracking has become standard for large companies in most circumstances, and cloud platform providers have made this capability available to smaller businesses that could never supply it on their own.

Wrapping Up

That’s just a snapshot of what Jim’s team works on. To hear more about AI’s impact on the logistics and delivery industry, and how Jim sees AI shaping the present and future of logistics, check out Jim’s appearances on these two podcasts:

Photo: cottonbro via Pexels

Making AI Smarter by Eliminating Flawed Data

Jim Barnebee, GetSwift’s VP for AI and Infrastructure, was granted a US Patent for his pioneering work with knowledge graphs.

Artificial intelligence algorithms require gobs of data. With each introduction of new and varied data sets, machine learning applications mature. The better and more comprehensive the data, the better the outcome.

But what happens when the training set that informs AI is corrupted with flawed data?

Not great things, as it turns out. That’s why Jim Barnebee, GetSwift’s Vice President of Artificial Intelligence, and his former colleagues from IBM’s Watson Group invented a system to allow machine-learning applications to interpret the quality of new data through a process called veracity. The US Patent and Trademark Office granted Jim and his three colleagues from IBM a patent this past September.

The patent solved several of the most persistent problems in the world of knowledge graphs, big data, and AI, such as the need to continually retrain machine-learning systems to distinguish between unique stores of data.

Not all data is equal, because of its location or time of creation, and for systems that consume data like rocket fuel, that’s problematic. That’s why Jim and the team created the concept of veracity, which allows a machine-learning system to vet the truthfulness of datasets and, in turn, determine when knowledge graphs should be updated with that more reliable data.
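Without claiming to reproduce the patented method, the general idea can be sketched as a veracity score that gates which datasets are allowed to update the knowledge graph. Every factor, dataset, and the threshold below is invented for illustration:

```python
# Illustrative sketch only: a "veracity" score gating knowledge-graph
# updates. All factors and the threshold are invented; this does not
# reproduce the actual patented method.

def veracity(dataset):
    """Combine simple trust signals into a rough 0..1 score."""
    score = dataset["source_trust"]                      # origin reliability
    score *= max(0.0, 1.0 - dataset["age_days"] / 365)   # stale data decays
    if dataset["schema_errors"] > 0:                     # malformed records erode trust
        score *= 0.5
    return score

THRESHOLD = 0.6
datasets = [
    {"name": "pos_sales",   "source_trust": 0.95, "age_days": 10,  "schema_errors": 0},
    {"name": "scraped_web", "source_trust": 0.70, "age_days": 200, "schema_errors": 3},
]

graph_updates = []
for ds in datasets:
    if veracity(ds) >= THRESHOLD:        # only vetted data reaches the graph
        graph_updates.append(ds["name"])

print(graph_updates)   # ['pos_sales']
```

The design choice worth noting is that low-veracity data is held back rather than deleted: the graph simply waits for a more reliable source before updating.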

For businesses, the applications of veracity are enormously important.

Consider: if your company sells shoes around the globe, you have massive amounts of data about those sales and the supply chain that feeds them in every location.

Aside from the language and cultural barriers, you have strong and weak markets and varying levels of data quality. Yet you desperately want your data to tell you which models of shoes are flying off the shelves, where the data anomalies are, and where there’s possible theft or fraud. AI can be trained to search for these and other anomalies once the data is made available and truth versus non-truth can be defined and replicated.
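As a toy version of that anomaly search, here is a simple z-score check over per-store sales; the figures and the 1.5 threshold are made up for illustration:

```python
# Toy anomaly detection over per-store shoe sales using z-scores.
# All figures and the threshold are hypothetical.
import statistics

weekly_units = {"berlin": 210, "tokyo": 195, "lagos": 205, "lima": 620}

mean = statistics.mean(weekly_units.values())
spread = statistics.pstdev(weekly_units.values())  # population std deviation

# Flag stores whose sales sit unusually far from the fleet-wide average.
anomalies = [store for store, units in weekly_units.items()
             if abs(units - mean) / spread > 1.5]
print(anomalies)   # a spike like this could mean a hit model, or fraud
```

A flagged store isn’t proof of theft; it’s a prompt for a human, or a more specialized model, to investigate.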

And to make this pattern completely auditable, the patent also includes the use of blockchain to create a transparent and accountable record. Blockchain is a better audit path than human memory.

This is much like the way humans learn: absorbing information as we grow and experience the world, replacing earlier constructs with better ones that form our understanding going forward. This process of constantly improving knowledge graphs has eluded AI researchers for quite a long time.

You can view the full patent here.

Jim’s training in both ontology (which, in computer science, is essentially the representation of the knowledge from a set of concepts) and blockchain helped the team technically solve the twin problems of veracity and memory. A former DARPA developer with groundbreaking work on Java as a language and encryption tool for the US Navy, Jim worked on ontologies and knowledge graphs for years before joining IBM Watson. Previously he served as the ontology evangelist at Unisys, led custom ontology work at Orbis, and also founded and moderates one of LinkedIn’s largest ontology special interest groups.

Jim continues to use his expertise in AI as a VP at GetSwift, alongside a uniquely talented (and growing) team assembled by CTO Dennis Noto. Jim and the team are using AI to transform cloud-based, real-time delivery management software into the next generation of computing. To stay up to date with the latest developments at GetSwift and apply for future opportunities, follow GetSwift on LinkedIn.