The panelists assembled for TDK Ventures’ Digital Transformation Week (DX Week) session on Industry 5.0 envision a manufacturing paradigm in which robots, artificial intelligence, and extended reality are indispensable. Far beyond simple robotic process automation, in which machines take over complicated, repetitive, or dangerous activities so humans can concentrate on strategic, creative, value-adding tasks, Industry 5.0 will synergize the contributions of machines and people, creating a workspace in which robots work shoulder-to-shoulder with their human colleagues.
Geetha Dholakia, TDK Ventures portfolio manager, introduced the Day 4 discussion by describing Industry 5.0 as the application of sustainability and society-centric goals to Industry 4.0’s transformation to smart manufacturing, which uses both physical and digital tools to create an efficient manufacturing ecosystem and emphasizes interconnectivity, automation, machine learning, and real-time monitoring and data analysis.
Many of Industry 4.0’s innovations center on the tools — robots, sensors, gauges, computers, machine learning — that expedite prototyping, continuous improvement, and other manufacturing processes. Industry 5.0, on the other hand, foregrounds environmental stewardship, dignity, respect, and other human sensibilities. It makes sense, then, that the discussion opened with an exploration of human-machine integration.
Anthony Jules is co-founder and CEO of Robust.AI, a company stressing collaborative mobility in robotics. He said industry now has access to plentiful data thanks to the industrial internet of things and production process monitoring.
“The next step is being able to put all that data in context. The key to leveraging that is being able to have people’s decisions executed by robots or automation as quickly as possible,” he said. “To pull that off we need a different level of integration between humans and robotics.”
Jules explained that humans must gain transparent understanding of their context and their role in the process.
“They need to be able to synthesize that information and make decisions,” he said. “Those decisions then need to be executed by whatever automation systems exist as part of the process. That tie-in will release a huge amount of value over the next five years.”
Dholakia noted that “if you want to have a collaborative environment where you have humans and machines working together, you have to start from the design stage.”
Christian Ramsauer, professor in the Department of Innovation and Industrial Management at Austria’s Graz University of Technology, mentioned that numerous industrial accidents occur when extensive use of robotics in manufacturing does not take into consideration the presence of humans.
Over the next five years, he wants to see more “human-centric production to make sure that we are not short on workers in the factory and that we benefit from the technology where robots can help and at the same time seamlessly work with humans.”
He said that design challenges are plentiful. Robots with these abilities have been developed on a small scale, but full-function, industrial-scale models will only become viable over the next few years.
Yasser Alsaied, Amazon Web Services’ vice president of IoT and former Qualcomm vice president, agreed that safe, efficient robot-human interaction represents a giant leap forward. He said that currently, robotics in manufacturing is limited to automation.
“You set up the programming and the robots do precise tasks some degree better than a human, repetitively and with limited errors, so your yield becomes higher, and your production quality is consistent,” he explained.
The next step, he said, is greater use of sensing cameras, computer vision, and “decision-making on the robotic side itself — on the edge, pushed out to the robotic machine so it can do a better job of what it’s doing but also to avoid accidents.”
The technology is being used in self-driving cars, satellites, industrial robots, and even home vacuum cleaners to avoid collisions, impediments, and improper human interactions.
“The more you can push decision-making and smartness into the robot, the more advanced use cases will emerge,” Alsaied said.
Xiaolin Lu is a fellow and director of Texas Instruments’ Kilby Labs Sensing & Processing Group. She said her company’s work in wireless connectivity has furthered the functionality and availability of sensing technologies in the factory setting.
“Robust wireless connectivity allows us to put sensors anywhere we want,” she said. “The challenge now, and one we didn’t put enough thought into at the very beginning, is energy efficiency and energy budgeting for the robot. A robot is making decisions based on a lot of distributed sensors. These sensors do not necessarily have the luxury of a DC source for charging.”
She said 10 to 20 tiny sensors — each smaller than a penny — may be installed on a single machine. They may exhaust their power within a year or even a few months.
“So, how do we get all these things synchronized? We need to think through the power efficiency and budget that we’re living with.”
She also echoed the need to analyze and mobilize all the data the sensors are capable of collecting, and raised the question of what to do about physical data security and cybersecurity.
“Security sometimes contradicts the embedded work. In order to do security, I have to add a certain amount of overhead,” she explained. “My communication bandwidth is already very limited, but I have to include a lot of other security-related traffic. So, how we feed all this into a holistic decision-making process at the very early stage is very critical.”
Sonita Lontoh, who sits on the boards of Sunrun and TrueBlue, said incorporating edge decision-making and security protocols will require more than evolutionary technology.
“Business model innovation and change management is also very important because every company is at a different stage of transformation maturity,” she said.
Alsaied said change management and adaptable business models will become more critical as more vendors provide different robotic capabilities and innovations.
“Suddenly, you have different types of operating modes, robotic operations,” he said. “Then, you need to centralize again. That’s where the strength of the cloud (comes into play). You have to have an operating system for the robotics so solutions from different vendors operate under a common denominator, whether the robots will operate in a factory or enterprise or on the street.”
The panel also discussed the impact of augmented reality, artificial intelligence, simulation, and applications for quantum computing and the metaverse. We will explore these topics in subsequent articles. The first part of TDK’s Industry 5.0 panel conversation spotlighted ways humans can leverage robotics to impact the world for good. It fulfilled the conference’s promise to use provocative discussion and expert insights to inspire inventors, entrepreneurs, researchers, and investors to incorporate sustainability and social justice into their digital transformation pursuits. DX Week reinforces TDK’s commitment to supporting initiatives that scale global impact through the advancement of technology and marketable products.