During the opening keynote of the Game Developers Conference, Nvidia founder and CEO Jensen Huang announced new systems, software, and partnerships aimed at keeping the company at the forefront of AI-driven innovation in the industry. One particular focus is the Omniverse platform, a development environment where teams can collaborate in real time on creative and entertainment projects spanning virtual reality and beyond. The announcement also nods, of course, to the metaverse trend, which in one way or another is shaping the plans of the major companies in the global digital landscape.
“Omniverse provides a powerful development pipeline that addresses the challenges of today’s world,” said Huang. “Its ability to unify creators, art, tools and applications under a single platform can inspire collaboration among even the most disparate game development organisations”. With Omniverse’s real-time collaboration and simulation platform, companies will be able to use AI and RTX hardware and build custom environments to simplify, accelerate and improve their development workflows. New features include updates to Omniverse Audio2Face, Omniverse Nucleus Cloud and Omniverse DeepSearch, as well as the introduction of the Unreal Engine 5 Omniverse Connector.
According to Nvidia, global teams work to build huge libraries of 3D content, a time-consuming process made even more difficult when designs incorporate more detailed and realistic elements. It is painstaking work, where the slightest change can undo hours of effort: the wrong lighting or a tweak to an object’s physics, and the result is not what was expected. Huang argued that Omniverse addresses these challenges, helping developers build photorealistic, physically accurate games seamlessly by connecting artists, their assets and their software tools in a single platform. “The collaborative aspect of Omniverse can dramatically reduce iteration time on critical design decisions, accelerating project completion”.
What is Omniverse
Omniverse is an open, multi-GPU platform that runs on any RTX-capable device, from a laptop to a server, and is designed to make complex 3D production workflows less labor-intensive while raising their quality. Omniverse is based on Pixar’s Universal Scene Description (USD), an easily extensible, open-source 3D scene description and file format for content creation and interchange between popular game development tools. Nvidia has extended USD by developing new tools, integrating technologies, and providing examples and tutorials.
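To give a sense of what the format looks like, here is a minimal, hypothetical USD scene in the human-readable .usda syntax; the prim names and values are invented for illustration and are not from Nvidia’s materials:

```usda
#usda 1.0
(
    defaultPrim = "World"
    metersPerUnit = 1
)

def Xform "World"
{
    # A simple sphere prim with a radius and a display colour
    def Sphere "Ball"
    {
        double radius = 2.0
        color3f[] primvars:displayColor = [(0.8, 0.1, 0.1)]
    }
}
```

Because USD is a plain-text, layered scene description rather than an opaque binary, different tools can read, compose and non-destructively override the same scene, which is what makes it a natural interchange layer for a platform like Omniverse.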
Game creators, designers, artists, and developers can pool their resources, libraries, software applications and game engines within Omniverse, freely iterate on design concepts in real time, build productivity-enhancing tools and instantly share high-fidelity models. “Game developers can use the ready-made Omniverse Apps, and many others have been built by third parties,” Jensen Huang continued. Among the platform’s components is Omniverse Audio2Face, an AI-powered Nvidia application that lets artists generate high-quality facial animation from a simple audio file.
Next is Omniverse Nucleus Cloud, in early access, which enables one-click sharing of Omniverse scenes and removes the need to deploy Nucleus locally or in a private cloud. With Nucleus Cloud, game developers can collaborate on 3D assets in real time across internal and external development teams. Omniverse DeepSearch, an AI-enabled service available to Omniverse Enterprise subscribers, lets them use natural-language and image input to instantly search the entire catalogue of 3D assets for objects and characters. Finally, Omniverse Connectors are plug-ins that enable collaborative “live sync” workflows between third-party design tools and Omniverse. With the launch of the Unreal Engine 5 Omniverse Connector, creators can exchange USD and Material Definition Language (MDL) data between the game engine and Omniverse.
Scalability and digital twins
Towards the end of the keynote, Nvidia also launched further innovations to complement Omniverse, announcing OVX, a computing system designed to power large-scale digital twins by running complex digital-twin simulations within Omniverse.
The OVX system combines high-performance GPU-accelerated computation, graphics and artificial intelligence with high-speed storage access, low-latency networking, and precision timing to provide the performance needed to create digital twins with real-world precision. OVX will be used to simulate complex digital twins to model buildings, factories, cities and even the entire world. “Physically accurate digital twins are the future of how we design and build,” said Bob Pette, vice president of Professional Visualization at Nvidia.
“Digital twins will change so many industries, in the long run all of society. The OVX portfolio of systems will be able to power true real-time, always-on digital twins on an industrial scale”. OVX will enable designers, engineers, and architects to build physically accurate digital twins of buildings and create massive, true-to-life simulated environments with precise time synchronisation between the physical and virtual worlds.
An example of where Nvidia is going with OVX is DB Netze, part of the German national railway holding company Deutsche Bahn. DB Netze is building a digital twin of the railway network in Omniverse to train systems for automatic train operation and to enable AI-powered predictive analytics for unforeseen situations in railway operations. “Using a photorealistic digital twin to train and test AI-enabled trains will help us develop more accurate perception systems to optimally detect and react to incidents. In our current project, Nvidia OVX will provide the scale, performance, and computing capabilities we need to generate data for intensive machine learning development and to handle these highly complex simulations and scenarios,” said Annika Hundertmark, Head of Railway Digitization at DB Netze.