Early last year, students at George Mason University were joined by 25 somewhat unusual new residents. Measuring just under 2 feet tall, the boxy wheeled robots in Starship Technologies’ fleet were on campus to deliver anything from coffee to sushi.
English major Kendal Denny immediately placed an order through Starship’s app, which is paired with the university’s meal plan.
“They were this new technology that no one on campus had ever experienced before,” she said.
George Mason’s executive director of campus retail operations, Mark Kraner, had been struggling with competition from other food delivery services – but managing the university’s own human delivery force didn’t seem viable.
“It’s difficult to make sure you have the right number of people in the right places at the right times,” he said.
Rolling around at 4 mph, typically delivering orders within 15-30 minutes, Starship’s robots have quickly adapted to the campus, and the students have adapted to them, too.
“I used it a lot during exam periods when you don’t have time to go to the dining hall and stand in line,” said recent graduate Sofya Vetrova. “It’s much easier to just order from your phone. You get notified when the delivery is downstairs, so it’s very convenient and less time-consuming.”
Tens of thousands of food orders have been delivered so far across campuses nationwide, including at The University of Texas at Dallas. At George Mason, Mark is looking into expanding the service to deliver mail, groceries and bookstore orders.
"Cars are really difficult on campus because parking spaces are rare," he said. “But the robots don’t need them, and they can weave easily around students, so they’re just like anyone else walking along a sidewalk."
The challenge of the last mile
For George Mason students, the robots simply represent convenient food delivery, but automated delivery could mean much more on a global scale.
According to the Logistics Research Centre of Heriot-Watt University, the last mile – the final stage of delivery from a transportation hub to the customer’s home – contributes an average of 181 grams of CO2 to the air per delivery.1 And, with the majority of deliveries taking place in densely populated urban areas, congestion is a major concern. Increasing urbanization combined with the growth of e-commerce is compounding the problem, with urban freight projected to grow by 40% by 2050.2
“The last mile of delivery is responsible for many of the problems we see with trucks polluting the air and blocking traffic lanes," said Matt Chevrier, a robotics expert with our company. "If we could replace these with smaller robots, which contribute significantly less pollution to the streets and can insert themselves into ordinary traffic, it could have a significant impact on urban air quality and on urban quality of life in general."
Robots in the wild
Unlocking the potential of automated delivery is not without its challenges. The first wave of robotics unfolded in factories and laboratories, taking the form of fixed robotic arms that precisely repeat pre-programmed routines, safely inside fenced-off zones.
As robots are released into the real world and expected to navigate both the diverse obstacles of the urban environment and the unpredictable behavior of their human co-inhabitants, they need to independently perceive, understand and learn from their surroundings.
Fundamentally, that requires a few things: precise, accurate sensors; fast connection systems analogous to the human nervous system; rapid data processing, often enabled by artificial intelligence; and quick reactions. Starship appears to be achieving all of these feats.
Robots use multiple sensing technologies depending on their size and speed: LIDAR, ultrasonic sensors, cameras, radar, or a combination of these. LIDAR is often used in high-speed autonomous vehicles that require long braking distances. Starship doesn’t use LIDAR, relying instead on fusing its other sensors for navigation and obstacle detection.
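The idea behind sensor fusion can be sketched in a few lines: each sensor gives a distance estimate with some confidence, and the robot blends them so that no single noisy reading triggers a stop on its own. The function names, weights and thresholds below are purely illustrative assumptions, not Starship’s actual implementation.

```python
# Illustrative sensor fusion: confidence-weighted blending of distance
# estimates from several sensors (e.g. ultrasonic, camera, radar).

def fuse_distances(readings, confidences):
    """Confidence-weighted average of per-sensor distance estimates (meters)."""
    total_weight = sum(confidences)
    return sum(d * w for d, w in zip(readings, confidences)) / total_weight

def obstacle_ahead(readings, confidences, stop_distance=1.0):
    """True if the fused estimate puts an object inside the stop distance."""
    return fuse_distances(readings, confidences) < stop_distance

# Three sensors each estimate distance to the nearest object ahead.
distances = [0.8, 1.1, 0.9]   # meters
weights   = [0.5, 0.9, 0.8]   # how much each sensor is trusted right now

print(obstacle_ahead(distances, weights))  # fused distance ≈ 0.96 m → True
```

A real robot would update the confidence weights continuously – for example, trusting cameras less in darkness, which is exactly where radar-style sensing takes over.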
Our company’s TI mmWave sensors operate at a wavelength shorter than typical radio waves but longer than that of lasers. This allows the sensor to see in challenging environmental conditions – such as darkness, extremely bright light, dust, rain, snow and extreme temperatures.
TI mmWave sensors also enable accurate detection of transparent objects, such as glass. “TI mmWave brings a lot of advantages and new opportunities by ensuring that if something needs to be detected, it can reliably be detected,” Matt said.
Detection is only half the story, however. Wheeled robots also need to identify what’s in front of them and then judge how best to respond. In dynamic environments, it isn’t feasible to await a decision while the raw data is sent to the cloud for processing, which means the machine learning algorithms need to run on the robot itself.
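A back-of-the-envelope calculation shows why on-robot inference matters. At the robots’ roughly 4 mph pace, the distance covered before a reaction is simply latency times speed. The latency figures below are illustrative assumptions, not measured values for Starship or any cloud provider.

```python
# How far does the robot travel before it can react to an obstacle?
SPEED_MPS = 1.8   # ~4 mph, the speed quoted for the campus robots

def reaction_distance(latency_s, speed_mps=SPEED_MPS):
    """Distance traveled (meters) before the robot can act on a detection."""
    return latency_s * speed_mps

cloud_round_trip = 0.5    # assumed: uplink + cloud inference + downlink
on_device       = 0.05    # assumed: local inference on an embedded processor

print(round(reaction_distance(cloud_round_trip), 2))  # 0.9 m before reacting
print(round(reaction_distance(on_device), 2))         # 0.09 m
```

Even under these generous assumptions, a cloud round trip costs nearly a meter of blind travel per decision – an order of magnitude worse than running the model locally, and worse still anywhere wireless coverage drops out.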
Our company’s Sitara™ processors are specifically optimized for running machine learning at low power on the robot itself, enabling mmWave sensor data to be used for accurate categorization in real time. Longer term, this data can also be uploaded to stationary computer systems – where time constraints and power demands aren’t an issue – and used to further train identification algorithms, while also building up a detailed map of the robot’s typical routes.
“We’ve all had situations where the GPS fails on us, or doesn’t give us an accurate enough location," Matt said. “Supplementing this with the robot’s own map can make navigation much more reliable."
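Supplementing GPS with the robot’s own map can be sketched as a simple blend of two position estimates: the noisy GPS fix and the position predicted from the stored map and wheel odometry. The weighting scheme and values below are illustrative assumptions, not how Starship’s navigation actually works.

```python
# Illustrative position fusion: blend a GPS fix with a map-based estimate,
# trusting GPS less when its signal quality is poor.

def fuse_position(gps_xy, map_xy, gps_trust=0.3):
    """Weighted blend of GPS and map-derived (x, y) estimates, in meters."""
    gx, gy = gps_xy
    mx, my = map_xy
    return (gps_trust * gx + (1 - gps_trust) * mx,
            gps_trust * gy + (1 - gps_trust) * my)

gps_fix  = (12.0, 34.5)   # meters in a local campus frame; tends to jump
map_pred = (12.6, 34.1)   # dead-reckoned against the robot's stored map

print(fuse_position(gps_fix, map_pred))
```

Production systems typically use a Kalman filter for this, which adjusts the trust weights automatically from each source’s estimated noise – but the core idea is the same: neither source alone is reliable enough, while the blend is.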
Back at George Mason, the robots have been quickly accepted as part of the student community. “People like to take pictures with them and just watch them, because they are cute," Kendal Denny said. “They’re kind of our new mascot."