Photorealism, Platform Competition, More AI on the Horizon


DX Week 2023 Panelists Predict the Future of Computing and Connectivity


The hottest technology on the planet, artificial intelligence, promises to dominate innovation through the end of the decade and beyond. That’s the conclusion reached by the three gurus participating in TDK Ventures’ computing and technology panel during day one of Digital Transformation Week 2023. AI has attracted the fancy of science-fiction fans and dedicated researchers for at least a generation. Only recently have breakthroughs moved the needle dramatically closer to singularity and even cognition, opening a range of potential use cases and exposing a raft of challenges.

“I remember the first time I got up on stage to talk about artificial intelligence,” reminisced Diane Bryant, NovaSignal CEO and board chair. “You couldn’t even call it ‘AI’; no one would know what you were talking about. That was 2013. Here we are a decade later, and the adoption is real.”

She said the most impactful innovation in the computing and connectivity space over the next five years will be the response to the computing demands AI presents and the platforms it occupies.

“Right now,” she added, “Nvidia is the platform, and the price of the GPU (graphics processing unit) platform has gone up and up. Nvidia’s stock in 2019 was $30. It’s now $300, so I think the innovation in the next five years has got to be a competitive platform. It’s hard because (Nvidia) is a very sticky platform. But you can see the investments that Google has been making in TPUs (tensor processing units). AWS (Amazon Web Services) has come out with its own AI processing matrix multiplication solution. But they’re not at the competitive performance level of Nvidia.”

Playground Global General Partner Laurie Yoler agreed that new computing architectures will carry the day.

“We first saw AI happening with regard to actuation, the digital world meeting the physical world,” she explained. “In autonomous vehicles, we saw a lot happening in computer vision. And we saw that helping, through actuation, the whole robotics industry, and mobility coming out of that.”

Now, as neural network models continue to improve, entrepreneurs are finding ways to combine large language models with advances in natural language processing, shifting business models toward the tech space, she said. ChatGPT’s conversational interface makes it possible, but “the compute, the architecture, and the energy required for all that data is tremendous,” she said.

There is massive demand over the next five years for a more robust architecture that can support faster inferencing. Companies that want to build their own models and train on their own data are going to need a training platform. Investors are excited about companies that are developing in-memory compute technology, combining compute and memory in SRAM to generate speed and cost savings while building customized, low-cost, and low-energy-consumption training tools.

“For training, these huge language models are going to be useful, and large corporations could build these very expensive models – I’m sure people have seen how much OpenAI is spending these days,” Yoler said. “But many (smaller) companies want domain-specific models requiring not trillions of parameters, but a sweet spot of maybe 3 billion to 50 billion, trained on data any enterprise might already have.”

The insatiable demand for digital brainpower is motivating companies to devote significant resources to joint processing, noted Paul Jacobs, CEO and chair of XCOM Labs.

He said that in the next five years, his and other wireless telecom companies will use joint processing and move processing to the edge.

“We’ve gone through a set of different innovations to try to get the efficiency of the networks up and be able to support all of these cool things that now we’re actually moving the processing out of the end device and into the edge of the network,” Jacobs said. “That lets you do all sorts of cool things. With XR (extended reality), we’re doing split rendering, so you can get photorealistic images on the devices on your headset, which you can’t do because the headset itself is too constrained. If you do it at the edge, you can get photorealism.”

Bryant said 10 years is a magic number for the best ideas to mature.

“It seems to take a decade for everything to go from introduction to mainstream. Cloud computing is a great example. It was 2007 when AWS knocked on Intel’s door and asked us if we wanted to partner in this renting out of hardware infrastructure. We said, ‘No, that sounds silly.’ (But in) 2016 or 2017 real enterprises were deploying real apps into the cloud. So, the obvious answer is that anything that is nascent today will in 10 years become prolific. So, what’s nascent today? Generative AI. That’s what everyone is talking about. There are real applications. If you can make something more efficient, you’re going to be successful.”

Jacobs agreed and projected that augmented and virtual reality and a truly interactive and environmentally inclusive metaverse will coalesce around photorealism. “Ten years is about the timeframe to put all those pieces together to get to a…realistic representation of reality that you’re interacting with in both real life and virtually, then it’s going to be important and life changing. There are going to be kids in the future that are going to live primarily in a very digitally mediated world.”

He also sees wearable computing devices becoming more prevalent over the next decade, perhaps altering the competitive landscape for handheld units.

“The most valuable company in the world is based on selling smartphones,” he reminded the audience. “In 10 years, maybe we will get to the point where the smartphone will be challenged by other devices that are more body-worn. Right now (wearables) are big and dorky looking. There are not a lot of applications, they don’t work with each other, and they’re not very realistic.”

Tailoring technology to specific tasks and uses will be the hallmark of innovation by 2030 and beyond, Bryant said.

“You’re going to see a lot more custom silicon coming out of other large companies like Facebook and cloud-service providers,” she explained. “It’s inevitable because (currently) it’s incredibly inefficient due to the price you’re having to pay for the compute power. Investing in silicon is expensive, and investors have to have a lot of fortitude, but there’s going to be a real appetite for acquiring companies that have innovations around the hardware space in AI. The big guys are going to grab them, and they’re going to scale the platform and create some real competition.”

Some of that customization will take aim at quantum computing, Yoler predicted. Startups are seeking to leverage silicon wafers, ion-trap technology, superconducting circuits, and photonics to gain the inside track toward building the one-million-qubit quantum computers they believe tomorrow’s businesses will need to achieve the desired levels of reliability, scalability, and error correction.

“We are a huge believer in the different math-intensive workloads that quantum computing will solve that are quite different from the data-intensive AI workloads,” she said.

Panelists also discussed the security and ethical issues surrounding AI, current disruptive technologies in computing, and ways startups can compete with incumbent technology companies. We will report on that conversation in future reports from TDK Ventures’ Digital Transformation Week 2023.

The second annual TDK Ventures DX Week brought together some of the world’s brightest talent in the digital space. Held over three days in three global technology hubs – Silicon Valley, Tokyo, and Bengaluru – the panels, interviews, and lectures centered on the nexus between the analog and digital environments. The most prestigious thought-leadership event of the year, DX Week highlighted the insights, best practices, and visions that will guide digital technologies toward creating a more productive, inclusive, and sustainable planet.
