
Fluid Dynamics Powers Faster, Smarter AI in Traffic, Climate and Drone Systems

June 2, 2025

From early gamer to Biden-honored AI innovator, Rose Yu uses fluid ideas to tame urban traffic, climate chaos and drones.


Rose Yu has applied fluid‐dynamics ideas to advance neural‐network methods that forecast traffic patterns, improve climate simulations and keep drones steady in flight.

When Yu was ten, she received a computer from her uncle, a rare gift in China at that time. At first she mostly played games; later she turned to creative projects. By middle school she had earned an award for web design, the first in a series of honors in computing.

Yu went on to study computer science at Zhejiang University, where she captured a prize for inventive research. She then moved to the University of Southern California (USC) for graduate work, encouraged by the fact that her uncle was at the Jet Propulsion Laboratory in nearby Pasadena. In 2017 she earned her PhD with top‐dissertation recognition. In January, in one of his final acts as president, Joe Biden honored her with a Presidential Early Career Award.

Now an associate professor at the University of California, San Diego (UCSD), Yu heads efforts in “physics‐guided deep learning.” She embeds known physical laws into artificial neural networks, yielding new methods for constructing and training these models. Her team has tapped fluid‐flow concepts to sharpen traffic forecasts, accelerated turbulence simulations to deepen hurricane insights and built tools for tracking Covid-19 spread.

Yu aims to roll out a suite of AI research assistants she calls AI Scientist. She sees a collaboration in which human investigators and physics‐driven AI agents join forces to uncover fresh scientific results. She believes gathering ideas from a group of such digital assistants could raise the pace of discovery.

Quanta spoke with Yu about turbulence in its various forms, ways to boost AI’s power and how smart models might ease congestion. The conversation has been condensed and edited for clarity.

When did you first explore combining physics with deep learning?
It began with traffic. Back in grad school at USC, I lived near the junction of I-10 and I-110 in Los Angeles. Getting around meant crawling through jams, which got me thinking in 2016 about whether I could make a difference. Deep learning was all the rage then—everyone was applying multilayer neural networks to static images, for example. I asked myself if the same approach could tackle phenomena that evolve over time. My colleagues and I devised a fresh way to set up the problem.

What did you try first?
We pictured traffic flow as diffusion, analogizing cars on roads to fluid moving across a surface, governed by fluid‐dynamics laws. Our key twist was to map this flow onto a graph—a structure from graph theory. Each sensor that records highway speed became a node, and the roads (with their distances) formed the edges. A graph at a single moment shows average vehicle speed at each location. Stitching together snapshots taken every five minutes spells out how traffic changes. That setup let us forecast future conditions.
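The graph setup described above can be sketched in a few lines. This is a toy illustration, not Yu's actual model: the sensor count, distances, speeds and the simple distance-weighted smoothing rule are all invented here to show how sensors become nodes, roads become weighted edges, and five-minute snapshots become a sequence of signals on the graph.

```python
import numpy as np

# Hypothetical toy setup: 4 highway sensors (nodes) connected by roads (edges).
# Edge weights are road distances; speeds are 5-minute averages.
num_nodes = 4
edges = {(0, 1): 2.0, (1, 2): 3.5, (2, 3): 1.5}  # (sensor_a, sensor_b): distance

# Weighted adjacency matrix of the road graph.
adj = np.zeros((num_nodes, num_nodes))
for (a, b), dist in edges.items():
    adj[a, b] = adj[b, a] = 1.0 / dist  # closer sensors influence each other more

# A sequence of graph "snapshots": rows are 5-minute intervals,
# columns are average speeds at each sensor.
snapshots = np.array([
    [65.0, 60.0, 55.0, 62.0],   # t = 0 min
    [63.0, 58.0, 50.0, 61.0],   # t = 5 min
    [60.0, 52.0, 45.0, 59.0],   # t = 10 min
])

# One step of graph diffusion: each sensor's speed drifts toward a
# distance-weighted average of its neighbors, mimicking flow spreading
# along the road network.
deg = adj.sum(axis=1)
smoothed = 0.5 * snapshots[-1] + 0.5 * (adj @ snapshots[-1]) / deg
print(smoothed.round(1))
```

A real model stacks many such graph-diffusion steps inside a recurrent network so the forecast accounts for both road topology and recent history.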

How did you train your network?
Big data is vital. Fortunately, one of my advisers, Cyrus Shahabi, had collected an extensive repository of Los Angeles traffic records over many years. I tapped into that rich dataset for training.

How accurate were your forecasts?
Previous models gave reliable traffic predictions for about 15 minutes. Ours extended that window to an hour, a major leap. Google Maps incorporated our code in 2018 and soon invited me as a visiting researcher.

Is that also when you started work on climate models?
Yes. In late 2018 I spoke at Lawrence Berkeley National Laboratory. Some scientists there and I decided to test physics‐guided deep learning on turbulence forecasting—a crucial yet uncertain aspect of climate models.

Turbulence shows up in the swirling patterns you see when stirring milk into coffee. In the ocean, such eddies can span thousands of miles. Conventional approaches solve the Navier-Stokes equations, which describe fluid flow. Those calculations yield high-quality predictions, but at a pace too slow for forecasting hurricanes or tropical cyclones in real time.
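For reference, the incompressible form of the equations mentioned here reads:

```latex
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^2 \mathbf{u},
\qquad
\nabla\cdot\mathbf{u} = 0,
```

where $\mathbf{u}$ is the fluid velocity field, $p$ the pressure, $\rho$ the density and $\nu$ the kinematic viscosity. The nonlinear term $(\mathbf{u}\cdot\nabla)\,\mathbf{u}$ is what makes turbulence so expensive to simulate directly.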

Could neural nets speed things up?
That was our hope. We trained deep networks on top‐tier numerical simulations so they could learn to “emulate” those calculations. Without grinding through long computations, the networks pick up hidden patterns in the data. In two‐dimensional tests our method was 20 times faster. In three dimensions it ran about 1,000 times faster. A module like this could plug into larger climate frameworks to improve severe‐weather forecasts.
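The emulation idea can be sketched with stand-ins: here a cheap 1-D diffusion step plays the role of the expensive simulator, and a linear least-squares fit plays the role of the deep network. Everything in this block is illustrative; Yu's work uses deep networks trained on full turbulence simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "expensive" solver: one explicit step of 1-D heat diffusion.
def solver_step(u, alpha=0.1):
    return u + alpha * (np.roll(u, 1) - 2 * u + np.roll(u, -1))

# Generate training pairs (state at t -> state at t+1) from the slow solver.
states = [rng.standard_normal(32)]
for _ in range(200):
    states.append(solver_step(states[-1]))
X = np.array(states[:-1])   # inputs:  u(t)
Y = np.array(states[1:])    # targets: u(t+1)

# Fit a linear surrogate W so that u(t+1) ~ u(t) @ W.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# The surrogate now advances the state with one matrix multiply
# instead of re-running the solver.
u = states[0]
pred = u @ W
err = np.max(np.abs(pred - solver_step(u)))
print(f"max one-step error: {err:.2e}")
```

The payoff in the real setting comes from the surrogate being orders of magnitude cheaper per step than the numerical solver it imitates, at the cost of some accuracy outside the training distribution.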

Where else does turbulence matter?
Almost everywhere. In blood vessels, turbulent flow can trigger strokes or heart attacks. During my Caltech postdoc, I coauthored a study on stabilizing drones. Propellers stirring air near the ground create unpredictable swirls. We built a neural net that models those air currents and feeds into the drone’s control system, cutting wobble on takeoff and landing.

Now I’m collaborating with UCSD and General Atomics researchers on fusion reactors. Keeping plasma stable at hundreds of millions of degrees relies on controlling small‐scale turbulence. Traditional physics‐based simulators run slowly. We’re crafting a deep‐learning tool to predict plasma behavior almost instantly, though that work is ongoing.

What inspired your AI Scientist concept?
Recently my group built algorithms that discover symmetries straight from data. Our tools rediscovered Lorentz symmetry—the rule behind light’s constant speed—and rotational symmetry, which explains why a perfect sphere looks the same when spun. We trained nothing specifically on those laws, yet our system pulled them out of raw numbers. That success made me think: if AI can uncover known physical principles, why not push it further to generate new research ideas or hypotheses? That notion grew into AI Scientist.
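The kind of invariance being discovered can be checked numerically in a toy case. This sketch is only a verification of a known rotational symmetry, not the discovery algorithm Yu describes: we take a quantity that depends only on distance from the origin and confirm it is unchanged by random rotations.

```python
import numpy as np

# Candidate law: an "energy" that depends only on a 2-D point's distance
# from the origin, so it should be rotation-invariant.
def energy(p):
    return float(p @ p)

def rotate(p, theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]]) @ p

rng = np.random.default_rng(1)
points = rng.standard_normal((100, 2))
angles = rng.uniform(0, 2 * np.pi, 100)

# Empirically test invariance: energy unchanged under every sampled rotation.
gaps = [abs(energy(p) - energy(rotate(p, t))) for p, t in zip(points, angles)]
print(f"largest symmetry violation: {max(gaps):.2e}")
```

A discovery system runs this logic in reverse: instead of verifying a given transformation, it searches over candidate transformations and keeps those under which the data's statistics do not change.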

Is AI Scientist just one big neural network?
No. Think of it as a collection of specialized programs that guide researchers through various stages of inquiry. Our team has built models for tasks such as short‐term weather forecasting, assessing drivers of global temperature rise and teasing out causal links—like how vaccination rates influence disease spread.

We’re now creating a more general “foundation” model capable of handling diverse inputs—numerical records, text, images, video—and tackling multiple research challenges. A prototype exists, though we’re still training it and broadening its knowledge. We hope to have a robust release in a couple of years.

What roles could it play?
I view AI Scientist as an assistant across the research process. Gathering and organizing thousands of papers for a literature review, for instance, can take weeks. A large language model can read and summarize that volume overnight. AI is excellent at hypothesis generation and data analysis, too. Where it falls short is judging experimental validity; it can’t replace an experienced researcher’s intuition.

How far do you think AI Scientist can go?
My vision is to strip away routine tasks so experts can focus on creative work—the spark of insight that machines don’t bring. I don’t see AI replacing humans or stifling our inventiveness. On the contrary, I expect close partnerships between scientists and AI tools to open up research paths that neither could explore on their own.
