Data Democratization to Usher in Industry 5.0

NEWS

Data Key to Sensing, Simulation, Security, Say DX Week Experts

In the coming decade, innovative business models will disrupt the way the world’s economy runs, the supply chain operates, and consumers obtain the manufactured products they need. The key is the application of artificial intelligence, 3D printing, and other technologies powered by data. That is the view put forward by Sunrun and TrueBlue Board Member Sonita Lontoh during TDK Ventures’ Industry 5.0 panel discussion, held as part of Digital Transformation Week (DX Week). She said this paradigm shift will be governed by how equitably and conscientiously data can be shared with impact scalers who can put it to use for the good of humanity.

Lontoh’s theory kicked off the third and final segment of the Industry 5.0 discussion and was quickly adopted by her colleagues on the panel.

Yasser Alsaied, Amazon Web Services’ vice president of IoT and former Qualcomm vice president, said the democratization of data cannot come to fruition until data scientists effectively categorize the type and nature of the information they are collecting.

“As technologists and people who want to see technology advance quickly, we believe the use of technology is beneficial. We go across the board to try to share. That’s why we have open-source software,” Alsaied noted. But as artificial intelligence becomes more sophisticated, “data is becoming the new oil and the new currency. You have an ambitious crowd that wants to fuse everything globally. But the people invested in collecting the data, storing it, and maintaining it feel the right to regulate how it is used.”

He said governments may cite national security and businesses may claim the data represents trade secrets in order to refuse access to other organizations. Data that can be used to improve the human or environmental condition should be shared, Alsaied argued. To ensure that happens, it is important that researchers categorize their data, he explained. Personal, geopolitical, and strategic data is rightfully proprietary, he acknowledged, but environmental, health, and sustainability data can be shared at no risk to the controlling organization.

“Data scientists need to start categorizing data, not just creating it as a metafile containing a picture of everything, everywhere,” he said. “Categorizing the collection of data helps regulators and enterprises understand what they should allow [to be shared and what they can legitimately keep confidential].”
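To make Alsaied’s idea concrete, a data catalog can tag each dataset with a sensitivity category and release only the categories deemed shareable. Below is a minimal, illustrative Python sketch; the category names and dataset labels are hypothetical and loosely follow the distinction he draws, not anything the panel specified.

    from dataclasses import dataclass
    from enum import Enum

    # Hypothetical sensitivity categories, loosely following the panel's distinction
    # between rightfully proprietary data and data that can be shared openly.
    class Category(Enum):
        PERSONAL = "personal"              # keep confidential
        STRATEGIC = "strategic"            # keep confidential
        ENVIRONMENTAL = "environmental"    # candidate for open sharing
        SUSTAINABILITY = "sustainability"  # candidate for open sharing

    @dataclass
    class Dataset:
        name: str
        category: Category

    SHAREABLE = {Category.ENVIRONMENTAL, Category.SUSTAINABILITY}

    def shareable(catalog):
        """Return only the datasets whose category permits open sharing."""
        return [d for d in catalog if d.category in SHAREABLE]

    catalog = [
        Dataset("wind_farm_output_readings", Category.ENVIRONMENTAL),
        Dataset("employee_records", Category.PERSONAL),
    ]
    print([d.name for d in shareable(catalog)])  # ['wind_farm_output_readings']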

Geetha Dholakia, TDK Ventures Portfolio program manager, summarized the point.

“There is a need to disassociate the data we collect from sensors for an industrial or application mode from the data we collect from people,” she said. “There has to be a different governance for the former, which can enable all of society by propelling it toward Industry 5.0, as opposed to the latter, which needs more safeguarding.”

“There is a part of democratizing AI that bears a strong resemblance to open-source software,” observed Anthony Jules, co-founder and CEO of Robust.AI, a company stressing collaborative mobility in robotics. “While there is going to be a lot of government regulation and protection by corporations because data is a valuable asset, I think that in parallel there will also be a strong movement to build data sets that are generally available. We see this in academia already.”

He said what’s exciting about AI is not that deep learning is new and fantastic.

“It has been a fantastic tool for almost a decade,” he said. “What’s different is that a small team or an individual can use publicly available data, and the resources and computation they can access, to build world-class classifiers, recognizers, or arbitration systems. AI has gotten to the ‘garage stage,’ where computers and chips were in the early ’80s. Small teams can now have a really big impact on the frontier of what’s possible. That is kind of unstoppable.”
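Jules’s “garage stage” is easy to appreciate with today’s open-source tooling: an individual can train a credible classifier on a publicly available dataset in a few lines. The sketch below uses scikit-learn and its bundled digits dataset purely as an illustration; neither the library nor the dataset was named by the panel.

    # Minimal sketch: one person, public data, open-source tools.
    # Requires scikit-learn (pip install scikit-learn).
    from sklearn.datasets import load_digits
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = load_digits(return_X_y=True)  # small, freely available image dataset
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0
    )

    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)

    print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))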

The conversation shifted to other technologies and capabilities that will further manufacturing and distribution over the coming decade.

Xiaolin Lu, fellow and director of Texas Instruments’ Kilby Labs Sensing & Processing Group, said quantum computing may comprise only a part of the next generation of digital brainpower. She said conventional computing is reaching its limit in terms of energy efficiency, even as the continued growth of AI and extended reality places ever greater demands on cloud capacity and computing power. By the year 2032, industry will require computing power measured in billions, not millions, of instructions per second.

“We have to change the way we think about the serial nature of computing: memory — input — output, versus the brain, where a tremendous number of neurons process in parallel,” Lu said. “It is estimated that when you perceive something, the effective bit rate is very slow — roughly a 200,000-fold reduction from the signal you receive to the signal actually used to make a decision.”

She suggested it is time to move from today’s conventional, commercial computing perspective to a fundamentally different computation model, one incorporating not only quantum technology but also energy efficiency, parallelism, and brain-inspired neuromorphic and in-memory computing.

“All these bring a huge dimension of compression to how we generate conclusions even as we continue to process what we’re seeing from the outside world. Quantum will never replace the classical computer; it will be a complement to that model. It is limited to certain types of computations, and you still have technical challenges like temperature and the vacuum environment. Likewise, neuromorphic computing will start as a complement to classical computing models. Incrementally, people will see its advantages, and it has the potential to replace classical computing.”

Jules advocated for quantum computing. While agreeing that it is just one computational paradigm that will lay the groundwork for Industry 5.0, he noted that it allows users to solve complicated problems, including arriving at “multi-dimensional decisions based on a lot of complex inputs into every process delivered by multiple sensor types, whether it’s in manufacturing or even at a policy level. It’s really easy to create optimization problems, like routing lots of vehicles on streets or lots of robots in warehouses, that become computationally intractable. As humans, we look at it and think, ‘it should be obvious where the [robot] wants to go.’ But anyone who’s actually worked on the software realizes these problems grow in complexity geometrically. Paradigms like quantum computing give us solutions in a finite amount of time.”
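The intractability Jules describes follows from simple arithmetic: the number of possible visit orders for n stops (or n robot destinations) is n factorial, so brute-force search stops being feasible after only a couple of dozen stops. A small illustrative calculation:

    # Illustration of how routing problems explode combinatorially:
    # the number of distinct visit orders for n stops is n! (n factorial).
    from math import factorial

    for n in (5, 10, 15, 20):
        print(f"{n:>2} stops -> {factorial(n):,} possible routes")

    #  5 stops -> 120 possible routes
    # 10 stops -> 3,628,800 possible routes
    # 15 stops -> 1,307,674,368,000 possible routes
    # 20 stops -> 2,432,902,008,176,640,000 possible routes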

The new computational paradigms will also help Industry 5.0 achieve its potential by facilitating the use of simulations, Alsaied said. The more industry brings simulation technology together with data, AI, 3D sensing, and quantum computing, the more realistic the models will become for project planning, change management, and even equipment maintenance.

“There will be much more advancement in digital twins that will allow more efficiency in how we deal with the environment, for example, harvesting wind and wave energy from oceans. You have to have digital twins and simulation to see the impact on sea life. This is an area where the organizations investing right now will have a huge advantage.”

Christian Ramsauer, professor in the department of Innovation and Industrial Management at Austria’s Graz University of Technology, noted that half the projects industry presents to his students focus on smart sensing — machine vibration, pipeline maintenance, and other solutions that minimize the need for human intervention.

He said he is amazed at how deeply Siemens is moving into digital technologies in an effort to move up the value chain, and at how the company engages students to bring it new ideas.

Lontoh is not surprised by industry’s interest in smart sensing, given the customer and business benefits that can accrue from successful deployments.

“What’s important for a lot of legacy businesses that are mostly hardware-focused — companies like Siemens, GE, Schneider — is the potential to move up the value chain. They want to offer new business models like technology-enabled services and predictive-maintenance subscriptions.”

Each company, she said, depending on who it is and where it is in its journey, can leverage technology either to shift its mission or to vertically integrate and capture more market share.

TDK’s Industry 5.0 panel discussion provided insightful and provocative perspectives on how data sensing, collection, and sharing can unleash the power of computing and simulation to make manufacturing and commerce more efficient and connected. Data availability will influence the rate and completeness of our transformation to full automation of every aspect of business, from production and maintenance to labor, logistics, and distribution. As with all DX Week activities, the panelists’ enthusiasm and vision will inspire inventors, entrepreneurs, researchers, and investors to incorporate sustainability and equity into the development and commercialization of their breakthroughs, true to TDK’s commitment to supporting initiatives that scale globally.


"*" indicates required fields