Goodyear’s AI tire concept can read the road and adapt on the fly

Goodyear is thinking ahead to how tires – yes, tires – might change as autonomous driving technology alters vehicle design, and as available technologies like in-vehicle and embedded machine learning and AI make it possible to do more with parts of the car that were previously pretty static, like its wheels.

Its new Eagle 360 Urban tire concept design builds on the work it revealed last year with its Eagle 360 concept tire, which re-imagined the tire as a sphere, capable of letting autonomous cars move in any direction at a moment’s notice without having to worry about turning arcs or altering the angle of the wheels.

This new Urban version, unveiled at the Geneva Motor Show going on this week, adds a “bionic skin” to the mix, which includes embedded sensors throughout to detect things like a change in surface type for the road itself, say from asphalt to packed dirt, or to take note of things like snow and rain.

Video: https://www.youtube.com/watch?v=KAdw09M-F-g

Using data collected by these sensors, the tire could then activate built-in actuators to change the shape of its surface and tread. It’s a bit like how electrical signals tell your muscles to change shape when you flex or grip, but here it’s onboard artificial intelligence telling the tire what shape will best maintain traction and control given the current state of the road beneath it.
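To make the sense-decide-actuate loop concrete, here is a minimal sketch of how such logic might look in software. Everything in it is hypothetical: the names (`SurfaceReading`, `TREAD_PROFILES`, `choose_tread`), the surface categories, and the actuator interface are illustrative assumptions, not Goodyear's actual design.

```python
# Hypothetical sketch of the tire's sense -> decide -> actuate loop.
# All names and profile values are illustrative, not Goodyear's API.

from dataclasses import dataclass


@dataclass
class SurfaceReading:
    """One snapshot from the embedded 'bionic skin' sensors (assumed schema)."""
    surface: str   # e.g. "asphalt", "packed_dirt", "snow"
    wet: bool      # rain or standing water detected


# Each (surface, wet) pair maps to an actuator profile that reshapes the tread.
TREAD_PROFILES = {
    ("asphalt", False): "low_rolling_resistance",
    ("asphalt", True): "deep_groove_wet",
    ("packed_dirt", False): "open_block",
    ("packed_dirt", True): "open_block",
    ("snow", False): "siped_winter",
    ("snow", True): "siped_winter",
}


def choose_tread(reading: SurfaceReading) -> str:
    """Pick the tread shape best suited to the sensed road state."""
    # Fall back to a safe wet-weather tread for unrecognized surfaces.
    return TREAD_PROFILES.get((reading.surface, reading.wet), "deep_groove_wet")


def control_step(reading: SurfaceReading, actuators) -> str:
    """One iteration of the loop: sense, decide, then reshape the tread."""
    profile = choose_tread(reading)
    actuators.set_profile(profile)  # actuators deform the tire surface
    return profile
```

In a real system the lookup table would be replaced by a learned model running on the tire's embedded processor, but the overall shape of the loop — read sensors, classify conditions, command actuators — would be the same.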

Imagine a future where you don’t need winter tires even if you live in Minnesota, or where your Tesla automatically adapts to racing slicks when you take it out for a track day. Intelligence and adaptability built into a tire is another way autonomous vehicle makers can take risk out of the equation for the most unpredictable parts of driving, including how weather affects the road.


Localized AI applications like this are made possible by a growing category of embedded processors and GPUs from companies including Nvidia, which make it possible to run machine learning “at the edge” — where sensing actually happens — instead of shuttling that information back to big data centers and out again.
