Sponsored

Universal Scene Description: The foundation of the entire metaverse (not to mention a whole lot of other 3D applications)

Image Credit: Getty Images

Presented by Nvidia


The metaverse is coming and it’s bringing many technical questions with it. How will we build these virtual worlds? How will we share the plans? How will we cooperate and compete to design the locations, scenes and environments?

The answer to all of these questions involves, at one level or another, Universal Scene Description, or “USD” for short. The technology was invented at Pixar to help its artists assemble and interchange the scenes in its movies, and it follows a long line of file formats that architects, product designers and scientists have used to represent three-dimensional objects.

Now it’s one of the leading interchange formats competing to define the metaverse by bringing consistency and interoperability to the virtual world. Pixar open-sourced the project in 2016 in the hope of nurturing a larger community that could collaborate on tools and rendering infrastructure. The studio built it to make movies, but it has found traction in other industries.

Today, a number of companies use the format to support their three-dimensional virtual worlds. Autodesk Inventor, for instance, stores mechanical designs for manufacturing in the format. Blender and Cinema 4D support it for creating and rendering movies. Nvidia is making it the foundation of Omniverse, its suite of tools for building and inhabiting the metaverse.

“It really started with Pixar, when they put it out there in the open. We recognized it early and we imagined a future where by default all 3D tools and engines and simulation stacks will end up using it,” explained Rev Lebaredian, vice president of simulation at Nvidia. “We’d like to take USD all the way to that point where it could be used everywhere for anything that’s 3D-related.”

Under the USD hood

Each USD layer stack can include a number of digital assets that assemble the scene. Some describe objects, their shapes and appearance. Others describe their locations and how they interact. Together, they build out what the user will see and interact with in the metaverse.

The composed result of each USD layer stack is called the Stage. Objects are defined and placed within the Stage using a hierarchical collection of data structures called Prims. Each prim carries a collection of Properties that spell out the details of an object, such as its radius, opacity or orientation. Much of the work of building a scene involves populating the big hierarchical tree of Prims with the right properties.

Here’s an example of a very simple Stage with a ball in the middle of it:

    def Sphere "ball"
    {
        float3[] extent = [(-2, -2, -2), (2, 2, 2)]
        color3f[] primvars:displayColor = [(0, 0, 1)]
        double radius = 2
    }
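
For developers, the same stage can be built programmatically. Here is a minimal sketch using USD’s open source Python bindings (the pxr module); the file name is an arbitrary choice:

    from pxr import Usd, UsdGeom, Gf

    # Create a new stage backed by a .usda file; the name is arbitrary
    stage = Usd.Stage.CreateNew("ball.usda")

    # Define a Sphere prim at /ball and fill in the same properties
    ball = UsdGeom.Sphere.Define(stage, "/ball")
    ball.GetRadiusAttr().Set(2.0)
    ball.GetExtentAttr().Set([Gf.Vec3f(-2, -2, -2), Gf.Vec3f(2, 2, 2)])
    ball.GetDisplayColorAttr().Set([Gf.Vec3f(0, 0, 1)])  # blue

    stage.GetRootLayer().Save()

Opening the resulting file in any USD-aware tool shows the same blue ball as the text above.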

USD schemas support a wide variety of properties that define an object and how it’s rendered on screen. The simplest definitions may just set a color, but richer schemas capture some of the physical properties of natural materials. Nvidia, for instance, added the Material Definition Language (MDL), which defines how light behaves when it bounces off a fabric or a solid. This lets the hardware produce a realistic view under a variety of lighting conditions.
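
MDL itself is Nvidia technology, so as a neutral sketch, here is how core USD’s own UsdShade schema describes a surface with the built-in UsdPreviewSurface shader and binds it to the ball; all of the prim names are illustrative:

    from pxr import Usd, UsdGeom, UsdShade, Sdf, Gf

    stage = Usd.Stage.CreateInMemory()
    ball = UsdGeom.Sphere.Define(stage, "/ball")

    # A Material prim with a UsdPreviewSurface shader describing appearance
    material = UsdShade.Material.Define(stage, "/ballMaterial")
    shader = UsdShade.Shader.Define(stage, "/ballMaterial/surface")
    shader.CreateIdAttr("UsdPreviewSurface")
    shader.CreateInput("diffuseColor", Sdf.ValueTypeNames.Color3f).Set(Gf.Vec3f(0, 0, 1))
    shader.CreateInput("roughness", Sdf.ValueTypeNames.Float).Set(0.4)

    # Connect the shader to the material's surface output and bind it
    material.CreateSurfaceOutput().ConnectToSource(shader.ConnectableAPI(), "surface")
    UsdShade.MaterialBindingAPI.Apply(ball.GetPrim()).Bind(material)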

USD’s composition engine is designed to make editing more efficient through sparse, non-destructive authoring. For example, a content creator can lean on hierarchical structures familiar from other areas like object-oriented programming. Objects on the stage can carry their own individual properties or derive details from a more general model or template. USD provides a rich set of composition features, of which this object-oriented approach is only one.
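
As a minimal sketch of that sparse, non-destructive style, assuming the ball.usda file from the earlier example, a second layer can pull the original in as a sublayer and override only the color, leaving the source file untouched:

    from pxr import Usd, UsdGeom, Gf

    # A new stage whose root layer pulls in the original as a sublayer;
    # edits land in the new layer, and ball.usda is never modified
    stage = Usd.Stage.CreateNew("ball_red.usda")
    stage.GetRootLayer().subLayerPaths.append("ball.usda")

    # A sparse override: only the color is authored here, nothing else
    ball = UsdGeom.Sphere(stage.GetPrimAtPath("/ball"))
    ball.GetDisplayColorAttr().Set([Gf.Vec3f(1, 0, 0)])

    stage.GetRootLayer().Save()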

A class prim may define values for an entire collection of objects, say, a flock of birds. The individual items may derive their values from the original, or they may override them with values specific to a scene. This produces an efficient file structure that minimizes repetition while simplifying edits to the prims as they evolve over time.
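
A sketch of that flock-of-birds pattern, with hypothetical names: a class prim carries the shared color, each bird inherits it and one bird overrides it for this scene:

    from pxr import Usd, UsdGeom, Sdf, Gf

    stage = Usd.Stage.CreateInMemory()

    # A class prim holding values shared by the whole flock
    birdClass = stage.CreateClassPrim("/_class_Bird")
    birdClass.CreateAttribute("primvars:displayColor",
                              Sdf.ValueTypeNames.Color3fArray).Set([Gf.Vec3f(0.3, 0.3, 0.3)])

    # Concrete birds inherit the shared values from the class
    for i in range(3):
        bird = UsdGeom.Sphere.Define(stage, f"/Flock/bird_{i}")
        bird.GetPrim().GetInherits().AddInherit("/_class_Bird")

    # One bird overrides the shared color for this scene only
    red = UsdGeom.Sphere(stage.GetPrimAtPath("/Flock/bird_0"))
    red.GetDisplayColorAttr().Set([Gf.Vec3f(1, 0, 0)])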

“It’s more than a file format, it’s a whole interchange paradigm,” said Aaron Luk, Sr. Engineering Manager for the USD Ecosystem in Nvidia Omniverse. “Other interchange formats can participate in USD as first-class citizens.”

For instance, the composition process can aggregate scene elements that are stored in some other common formats like Alembic. USD can reference Alembic assets natively and compose sparse overrides on top. It’s often said to be non-destructive because the referenced content can be used in different ways with entirely different overrides without modifying the source data directly.
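
As a rough sketch of that workflow, assuming a USD build that includes the Alembic file-format plugin and a hypothetical xwing.abc cache, an Alembic asset can be referenced directly and adjusted with a sparse override:

    from pxr import Usd, UsdGeom, Gf

    stage = Usd.Stage.CreateNew("shot.usda")

    # Reference an Alembic cache directly; USD reads .abc files through
    # its Alembic file-format plugin. "xwing.abc" is a hypothetical asset.
    ship = UsdGeom.Xform.Define(stage, "/xwing")
    ship.GetPrim().GetReferences().AddReference("./xwing.abc")

    # A sparse override composed on top; the .abc source is untouched
    UsdGeom.XformCommonAPI(ship.GetPrim()).SetTranslate(Gf.Vec3d(0, 0, 100))

    stage.GetRootLayer().Save()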

“Industrial Light & Magic was a big user and developer of Alembic, and they still are,” recounted Luk. “We loved the idea that they could work on Star Wars and basically generate their assets in Alembic, but then stitch them together in USD for shots and sequences.”

The interchange format also offers a number of other options for building elaborate scenes. The objects can be grouped together into sets and layers that can be manipulated as one. A person, for instance, may have two arms and two legs that can move independently along set paths but the entire person can also be moved as one unit.
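
A minimal sketch of that idea, with hypothetical names: the limbs are children of a root transform, each can be posed independently, and a single edit to the root moves the whole person:

    from pxr import Usd, UsdGeom, Gf

    stage = Usd.Stage.CreateInMemory()

    # A transform hierarchy: limbs are children of a root Xform
    person = UsdGeom.Xform.Define(stage, "/Person")
    leftArm = UsdGeom.Xform.Define(stage, "/Person/LeftArm")
    rightArm = UsdGeom.Xform.Define(stage, "/Person/RightArm")

    # Each limb can be posed independently along its own path...
    UsdGeom.XformCommonAPI(leftArm.GetPrim()).SetRotate(Gf.Vec3f(0, 0, 45))
    UsdGeom.XformCommonAPI(rightArm.GetPrim()).SetRotate(Gf.Vec3f(0, 0, -45))

    # ...while a single edit to the root moves the whole person as one unit
    UsdGeom.XformCommonAPI(person.GetPrim()).SetTranslate(Gf.Vec3d(10, 0, 0))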

A powerful part of USD is the language for modifying values. A collection of operations is designed to be applied to the entire stage in a predictable process. Operations that add, change or reorder the properties of prims compose, so a few basic operations can morph an entire scene. Artists designing an effect don’t need to spend hours applying many small changes to each individual prim if they can write a simple set of operations that describes the effect at the highest levels of the scene.
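
One small illustration of editing at the top of the scene rather than prim by prim (the file and prim paths here are hypothetical): a single opinion authored on an ancestor prim cascades down to everything beneath it, and writing it to the session layer keeps it non-destructive:

    from pxr import Usd, UsdGeom

    # "shot.usda" and "/World" are hypothetical placeholders
    stage = Usd.Stage.Open("shot.usda")
    world = UsdGeom.Imageable(stage.GetPrimAtPath("/World"))

    # One opinion authored high in the hierarchy cascades downward:
    # hiding /World hides every prim beneath it in a single edit.
    # The session layer keeps the change out of the saved scene files.
    with Usd.EditContext(stage, stage.GetSessionLayer()):
        world.MakeInvisible()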

Below you can catch up with Aaron Luk’s session from Nvidia’s latest GTC event.

Evolving to greater heights

USD is also designed to evolve over time. Basic subsystems, like the schemas for data modeling or the asset resolvers for resource location, are extensible. A plugin system opens the door for users who need especially elaborate changes.

This flexibility also extends to the software that implements USD-compliant renderers. The data structures are designed to simplify parallel execution by multi-threaded code running on multiple cores and processors, especially the high-powered GPUs that are becoming increasingly standard. 

USD is poised to expand to fill the needs of everyone who wants to build out the metaverse. The current version is rooted in the original Pixar release, but others are now extending it to tackle more complex challenges. Nvidia, Pixar and Apple, for example, have contributed schemas that define many properties of materials so that software can do more than just render objects. Simulators can build out scenes with working, moving objects by using properties like the bounciness or strength of materials.
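
Those simulation properties live in the UsdPhysics schemas. A minimal sketch, assuming a USD build that includes them: the ball becomes a rigid body with a collider, and a physics material supplies its bounciness, known as restitution:

    from pxr import Usd, UsdGeom, UsdPhysics

    stage = Usd.Stage.CreateInMemory()

    # A physics scene plus a ball that a simulator can drop and bounce
    UsdPhysics.Scene.Define(stage, "/physicsScene")
    ball = UsdGeom.Sphere.Define(stage, "/ball")

    # Mark the ball as a dynamic rigid body with a collider
    UsdPhysics.RigidBodyAPI.Apply(ball.GetPrim())
    UsdPhysics.CollisionAPI.Apply(ball.GetPrim())

    # A physics material: restitution is the surface's "bounciness"
    matPrim = stage.DefinePrim("/physicsMaterial", "Material")
    physMat = UsdPhysics.MaterialAPI.Apply(matPrim)
    physMat.CreateRestitutionAttr(0.8)
    physMat.CreateStaticFrictionAttr(0.5)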

All of these changes are growing more formal as developers recognize the advantages of joining together and pulling in the same direction. They’re setting up committees, mailing lists and governance processes so that the standard evolves to handle everything the metaverse demands.

“Every single 3D tools maker, every engine maker, virtually all of them have USD on their radar,” said Lebaredian. “They’ve either integrated it into their tools or they’re planning on integrating it into their tools, one way or another. We’ve hit a milestone in this journey. We’ve hit a big one where we have critical mass of acceptance and now we have to go to the next stage.”


Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. For more information, contact [email protected].

