March 29, 2024

Nvidia shows an AI model that turns a few dozen shots into a 3D scene

Nvidia’s latest AI rendering demo is impressive: a tool that quickly converts “a few dozen” 2D snapshots into a rendered 3D scene. In the video below, you can see the method in action, with a model dressed as Andy Warhol holding an old-fashioned Polaroid camera. (Don’t overthink the Warhol connection: it’s just a bit of PR scene-setting.)

The tool is called Instant NeRF, referring to “neural radiance fields,” a technique developed by researchers from UC Berkeley, Google Research, and UCSD in 2020. If you want a detailed explanation of neural radiance fields, you can read one here, but in a nutshell, the method maps the color and intensity of light captured in different 2D shots, then correlates that data across the different viewpoints to render a final 3D scene. In addition to the photos, the system needs data about the camera’s position for each shot.
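
To make the idea concrete, here is a minimal, purely illustrative sketch of how a NeRF-style renderer works. This is not Nvidia’s Instant NeRF code: the “network” below is just random weights, and the function names (radiance_field, render_ray) are made up for the example. The point is the data flow, where a field maps a 3D sample point plus a viewing direction to a color and a density, and a pixel is rendered by compositing many such samples along a camera ray.

```python
import numpy as np

# Toy stand-in for a NeRF-style network: maps a 3D sample point plus a
# viewing direction to an RGB color and a volume density (sigma).
# A real NeRF uses a trained MLP with positional encoding; here the
# weights are random, purely to illustrate the data flow.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(6, 64))   # input: xyz (3) + view direction (3)
W2 = rng.normal(size=(64, 4))   # output: rgb (3) + density (1)

def radiance_field(points, view_dir):
    """Query the toy field at sample points along a ray."""
    inp = np.concatenate([points, np.broadcast_to(view_dir, points.shape)], axis=-1)
    h = np.tanh(inp @ W1)
    out = h @ W2
    rgb = 1.0 / (1.0 + np.exp(-out[:, :3]))   # colors squashed into [0, 1]
    sigma = np.log1p(np.exp(out[:, 3]))       # non-negative density
    return rgb, sigma

def render_ray(origin, direction, near=0.0, far=4.0, n_samples=64):
    """Volume-render one pixel by compositing samples along a camera ray."""
    t = np.linspace(near, far, n_samples)
    points = origin + t[:, None] * direction
    rgb, sigma = radiance_field(points, direction)

    delta = np.diff(t, append=far)             # spacing between samples
    alpha = 1.0 - np.exp(-sigma * delta)       # opacity of each segment
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))  # transmittance
    weights = trans * alpha
    return (weights[:, None] * rgb).sum(axis=0)  # final pixel color

pixel = render_ray(origin=np.array([0.0, 0.0, -3.0]),
                   direction=np.array([0.0, 0.0, 1.0]))
print("rendered pixel RGB:", pixel)
```

Training such a model amounts to comparing rendered pixels against the supplied 2D photos (with their known camera positions) and adjusting the network until the renders match, which is why both the images and the camera poses are required.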

Researchers have been improving this kind of 2D-to-3D conversion for a few years now, adding more detail to the final renders and increasing rendering speed. Nvidia says its Instant NeRF model is one of the fastest developed to date, cutting rendering time from a few minutes to a process that finishes “almost instantly.”

As the technology gets faster and easier to implement, it could be used for all kinds of tasks, Nvidia says in a blog post describing the work.

“Instant NeRF can be used to create avatars or scenes of virtual worlds, to capture video conference participants and their 3D environments, or to reconstruct scenes for digital 3D maps,” writes Nvidia’s Isha Salian. “This technology can be used to train robots and self-driving cars to understand the size and shape of objects in the real world by taking 2D images or video footage of them. It can also be used in architecture and entertainment to quickly create digital representations of real environments that creators can modify and build upon.” (The metaverse, it seems, beckons.)

Unfortunately, Nvidia hasn’t shared details about its method, so we don’t know exactly how many 2D images are needed or how long it will take to render the final 3D scene (which will also depend on the power of the computer doing the rendering). However, the technology appears to be advancing rapidly and could start to make a real impact in the coming years.