By Katrina Dix
It’s only been two years since the vibrant scientific community at Old Dominion University and in Hampton Roads lured Dr. Park away from the tenured position he already held in North Carolina. But his interest in the intersection of physics, urban planning and engineering goes back much further — to the (mostly) friendly arguments he and his father had over what routes to take while the elder Park taught the younger to drive. Dr. Park liked using navigation apps; his dad, not so much.
On a recent trip to visit his hometown of Seoul, South Korea, now armed with degrees in all three fields as well as four patents, he talked his father into using his method — only to hit traffic anyway when many other drivers made the same decision.
“He got really upset and told me my patent doesn’t work,” Dr. Park said, smiling. “But I could have picked the best way if I had better data!”
Here, Dr. Park, who teaches in the Batten College of Engineering and Technology, answers a few questions about his research.
What is your current research focus?
For the last several years, I’ve been developing artificial intelligence that has very sophisticated navigational decision-making ability but that is light enough — both in weight and in power resource needs — to be carried in smaller, less expensive unmanned aerial systems (UAS) and other mobile platforms. I’ve worked with hurricane-tracking drones, nanorobots that measure billionths of a meter and can navigate the human body, the Mars rovers, first responder drones in urban areas and, most recently, a project with the Air Force Research Laboratory to improve military drones’ ability to defend against or capture enemy drones.
They sound different, but it’s all the same idea — that the robot is doing something a human cannot do, and it makes the decisions itself. It becomes adaptive, so when its environment changes, it can learn by itself and make safe decisions. That’s called edge learning. It’s a subset of AI that processes new information on the device itself, at the “edge” between what the AI already knows and the new data it’s taking in.
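To make that concrete, here is a minimal sketch of the edge-learning loop in Python: a tiny model on board a device updates itself from each new sensor reading as it arrives, rather than shipping data back for offline retraining. Everything in it (the feature names, the numbers, the EdgeLearner class) is a hypothetical illustration, not Dr. Park’s actual framework.

```python
import random

class EdgeLearner:
    """Online linear model trained one observation at a time (SGD)."""

    def __init__(self, n_features: int, learning_rate: float = 0.1):
        self.weights = [0.0] * n_features
        self.bias = 0.0
        self.lr = learning_rate

    def predict(self, x: list[float]) -> float:
        return sum(w * xi for w, xi in zip(self.weights, x)) + self.bias

    def update(self, x: list[float], target: float) -> float:
        """Learn from a single new reading; returns the prediction error."""
        error = self.predict(x) - target
        # One gradient step on squared error, done in place on the device.
        self.weights = [w - self.lr * error * xi
                        for w, xi in zip(self.weights, x)]
        self.bias -= self.lr * error
        return error

# Simulated sensor stream: the device adapts as each reading arrives
# and never has to store the whole dataset.
model = EdgeLearner(n_features=2)
for _ in range(2000):
    gust, pressure = random.uniform(0, 1), random.uniform(0, 1)
    true_correction = 0.8 * gust - 0.3 * pressure + 0.1  # unknown to the model
    model.update([gust, pressure], true_correction)

print(model.weights, model.bias)  # converges toward [0.8, -0.3] and 0.1
```

The shape of the loop is the point: the model never stores the full data stream, so it fits the tight memory and power budgets of a small drone.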
Could you share some examples of how you apply edge learning?
One of the ways the National Oceanic and Atmospheric Administration tracks hurricanes is by using expensive hunter aircraft to drop drones into the eye of the storm. The drones gather information about wind speed and other conditions from the inside, but they can’t adjust their flight based on the conditions they encounter, which limits the accuracy of their data. By designing AI that both gives the drones more navigational autonomy and requires fewer onboard resources, we can build cheaper, expendable drones that react to changing wind conditions and keep up with the hurricane longer and from more angles. That means much more accurate predictions of the hurricane’s critical path, and greater public trust in those predictions, which makes a big difference in protecting public safety.
In another case, on Mars, you need high autonomy because you can only communicate once a day. NASA gives the Mars rovers a day’s worth of navigation direction based on images recorded with aerial systems, but those images aren’t 100% accurate. Maybe a rover thinks it sees a rock blocking the way when there really isn’t one. Until the next contact, it’s stuck when it didn’t have to be. So I designed an innovation that finds similarities between whole areas, giving the rovers a better sense of what traversable terrain looks like versus terrain they can’t cross. I recently received approval for a patent for that.
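The details of the patented method aren’t spelled out here, but the general idea of judging new terrain by its similarity to terrain already labeled traversable can be sketched with a toy nearest-neighbor example in Python. The two features, the distance metric and the data below are all hypothetical placeholders.

```python
import math

# Each patch of ground is summarized by two made-up features:
# (average roughness, average slope), both on a 0-to-1 scale.
labeled_patches = [
    ((0.10, 0.05), True),   # smooth and flat: the rover crossed it fine
    ((0.15, 0.10), True),
    ((0.85, 0.70), False),  # rocky and steep: the rover had to detour
    ((0.90, 0.60), False),
]

def distance(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Plain Euclidean distance between two feature vectors."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def looks_traversable(patch: tuple[float, float]) -> bool:
    """Label a new patch the same way as its most similar labeled patch."""
    nearest = min(labeled_patches, key=lambda p: distance(p[0], patch))
    return nearest[1]

# A patch the rover has never seen: mildly rough, nearly flat.
print(looks_traversable((0.20, 0.08)))  # True: similar to known-good ground
```

A real system would work from imagery and far richer features, but the principle is the same: the rover generalizes from areas it already understands instead of waiting a day for the next contact with Earth.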
This summer, I’ve been working with Dr. Von Moll (Alexander Von Moll, Ph.D.) at the Air Force Research Laboratory in Dayton, Ohio. Up until now, the drones have had the ability to move through weapon engagement zones and attempt to avoid, neutralize or capture other threats, which already requires very complex calculations, but they couldn’t account for wind or precipitation. They were developed for perfect weather conditions.
How have your research interests developed over your career?
As an undergraduate, I started in physics, and I was interested in the way everything is in a state of motion. I studied the transport of particles in engineering systems, such as how fluid moves through pipes. Then I was introduced to the traffic theory that when cars are moving, the road is like a pipe. It has a certain capacity, and the vehicles on that road are just like particles — but even more interesting, because the way they move is guided by human behavior. That makes things crazy unpredictable, and there are even more uncertainties than a lot of other physics problems. I got more interested in the engineering side of that, so I double-majored in physics and urban planning and then started working as a transportation engineer for a consulting firm.
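That pipe analogy has a precise textbook form, the fundamental relation of traffic flow; this is classical traffic theory, not something specific to Dr. Park’s patents:

\[ q = k \, v \]

where \(q\) is flow in vehicles per hour, \(k\) is density in vehicles per mile and \(v\) is speed in miles per hour. For example, 40 vehicles per mile moving at 50 miles per hour gives a flow of 2,000 vehicles per hour, just as flux through a pipe is concentration times velocity. The human element shows up in how \(v\) drops as \(k\) rises, which is what makes the problem so unpredictable.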
Pretty quickly, I realized that I was doing a lot of the same tasks the same way for each project, from road measurements to feasibility studies, and I became interested in automating some of them with robots. But I wasn’t interested in typical robots. I was more interested in how robots make navigational decisions, and how they can be made safer, to protect robots and the people, objects and spaces around them. Some of my mother’s colleagues at the university where she teaches English talked to me about the research I would need to do to pursue something like that, and that’s when I decided to go back to academia for a Ph.D. in civil engineering.
Overall, my research has evolved from modeling physical transportation processes to designing AI-driven systems that can dynamically gather information, learn and adapt for better predictions and better control in uncertain, real-world settings.
How would you describe your work’s impact on the world?
My four-year-old son would say, “My dad flies the drone inside the hurricane, gets the information and destroys the hurricane.” That’s not exactly what’s happening, but my framework helps guide autonomous sensing platforms, like small unmanned aircraft systems (sUAS), to collect the most informative data in real time and navigate through the most vicious weather conditions.
A few decades ago, our storm prediction accuracy was about 10%; now it’s around 35%. We’re still very reliant on passive storm-induced flood forecasting, which in turn relies heavily on storm forecast accuracy. So, when it comes to rapidly intensifying storms, we can predict about one out of three accurately, and the other two we cannot. With the recent horrific events in Texas, the need for improved prediction and public trust is on everyone’s mind, especially here on the coast.
The city of Norfolk already provides real-time flood alerts, and improved storm forecasts will further enhance this service. I’m also working with the U.S. Army Corps of Engineers on the Hampton Roads storm surge vulnerability study, which relies on accurate storm predictions. Ultimately, we plan to demonstrate this system through a tool like the Waze mobile app, where users will be able to see simulated flood depths and impacted roads in real time, improving situational awareness and safety for millions living in coastal areas.
There are a lot of civic applications for the work I’ve recently done with the Air Force Research Laboratory, too. We need to solve many of the same problems for urban air mobility and advanced air mobility, allowing for things such as drone first response — which could help with everything from finding the exact location of someone in distress to video assessment that helps deploy the right resources. It’s also the necessary foundation for drone delivery of everything from pizza to medication. We imagine numerous manned and unmanned systems sharing the same airspace, under various wind conditions, navigating efficiently to their destinations while remaining free from collision. The sensors and AI that help with this could help prevent manned aircraft collisions as well.
Ultimately, all of my current National Science Foundation, Office of Naval Research, Air Force Research Laboratory and Virginia Innovation Partnership Corporation-funded projects aim to apply these results to real-world applications. Through these projects, I use physics and engineering to keep people safer by making AI navigation smarter.
This interview was edited for length and clarity.