Prof. Julie Shah in MIT Spectrum: How will we make things?

MIT Spectrum asked three MIT researchers how they’re defining the next generation of manufacturing.

Julie Shah ’04, SM ’06, PhD ’11, the H. N. Slater Professor in Aeronautics and Astronautics, brings a human-centric philosophy to her work in robotics.

The traditional difference between people and robots on a factory floor is flexibility, according to Shah, who leads the Interactive Robotics Group in the Computer Science and Artificial Intelligence Laboratory (CSAIL). Human workers coordinate organically, reassigning tasks as needed. But introduce a robot, and that portion of the manufacturing process tends to become more rigid. Shah also serves as faculty director of the MIT Industrial Performance Center and co-leader of its Work of the Future Initiative, an applied research and educational program that examines how organizations make new technologies work in practice.

“If you want to know if a robot can do a task,” she says, “you have to ask yourself if you can do it with oven mitts on.” But much of the work we do, in manufacturing and elsewhere, requires dexterity and adaptability, and getting a robot to help with nimbler tasks is costly. According to Shah, that is one reason only 1 in 10 manufacturers in the United States has a robot, and why those that do tend not to use them much.

Shah is trying to change that by designing robots and automation systems that are safe, smart, and flexible. Done properly, she says, human-robot integration has in some cases led to better-paying jobs, workers learning new skills, and higher profits and productivity.

Part of her work involves programming robots to model how people move and operate so that the machines can integrate into an industrial environment alongside their human companions. Crucially, she’s also designing robots that workers on the factory floor can program and test without special expertise. That means affordable, more agile machines (think no oven mitts) that are as easy to teach and train as a human being: “a system that you can program with low-code or no-code interfaces,” she says.
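
To make that idea concrete, here is a minimal, hypothetical sketch of what a no-code interface could look like under the hood: a library of pre-built robot “skills” that a worker composes by name, with no control code to write. The skill names and classes below are illustrative assumptions, not a description of Shah’s actual system.

```python
# Hypothetical sketch of a "no-code" robot skill library: a worker
# composes a job from named, pre-built skills instead of writing
# control code. All names here are illustrative, not a real system.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Skill:
    """A pre-built robot behavior the worker can invoke by name."""
    name: str
    run: Callable[[], None]


class SkillLibrary:
    def __init__(self) -> None:
        self._skills: Dict[str, Skill] = {}

    def register(self, name: str, run: Callable[[], None]) -> None:
        self._skills[name] = Skill(name, run)

    def compose(self, step_names: List[str]) -> List[Skill]:
        """Turn an ordered list of step names into a runnable job."""
        return [self._skills[n] for n in step_names]


# Usage: the worker's "program" is just an ordered list of step names.
library = SkillLibrary()
library.register("pick_part", lambda: print("picking part from bin"))
library.register("insert_part", lambda: print("inserting part into fixture"))
library.register("inspect", lambda: print("checking the insertion"))

job = library.compose(["pick_part", "insert_part", "inspect"])
for skill in job:
    skill.run()
```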

Shah conjures an example of someone wearing a mixed-reality headset while they walk and talk through the steps of an industrial process. That visual and verbal information would be automatically compiled into a control program “and then the robot just does it,” she says. The goal is to put frontline workers in the driver’s seat for testing, deploying, and reprogramming robots as the work changes. “That’s the dream of the future.”
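
One way to picture that compilation step, continuing the hypothetical sketch above: transcribed narration is matched against known skills and emitted as an executable sequence. The transcript, the keyword matching, and the skill names are all invented for illustration.

```python
# Toy sketch of compiling a walk-and-talk demonstration into a robot
# program: each transcribed utterance is matched to a known skill by
# keyword. The transcript, keywords, and skills are all hypothetical.
from typing import Dict, List

# Maps a keyword the worker might say to a named robot skill.
KEYWORD_TO_SKILL: Dict[str, str] = {
    "pick": "pick_part",
    "insert": "insert_part",
    "check": "inspect",
}


def compile_transcript(utterances: List[str]) -> List[str]:
    """Turn transcribed narration into an ordered list of skill names."""
    program: List[str] = []
    for utterance in utterances:
        for keyword, skill in KEYWORD_TO_SKILL.items():
            if keyword in utterance.lower():
                program.append(skill)
                break  # one skill per narrated step
    return program


# Transcript captured while the worker walks through the process.
transcript = [
    "First I pick the part up from the bin",
    "Then I insert it into the fixture",
    "Finally I check that it seated correctly",
]

print(compile_transcript(transcript))
# -> ['pick_part', 'insert_part', 'inspect']
```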