
NVIDIA announcements at Siggraph 2017

Here is an overview of the NVIDIA announcements at Siggraph 2017:

  • New SDKs
    • OptiX 5.0 with AI-based denoising
    • Stereo 360 video, with integration into the Z CAM solution
  • New products
    • DGX Station, a personal supercomputer for developing AI for graphics applications
    • External Quadro and TITAN Xp graphics cards (eGPU)
  • New simulation of AI in VR (playing dominoes against the Isaac robot, trained with neural networks, in the Project Holodeck collaborative virtual environment)
  • New developments related to AI, augmented reality and VR

Speed up rendering with AI

NVIDIA announced the use of artificial intelligence (AI) to speed up rendering, introducing the NVIDIA OptiX™ 5.0 SDK with new ray tracing capabilities.

The OptiX 5.0 SDK, running on the NVIDIA DGX Station™, the newly introduced AI workstation, will give designers, artists and other digital content professionals rendering capability comparable to that of 150 standard CPU servers. This GPU acceleration lets them iterate and create new versions of high-fidelity content at a much lower cost.


The new ray tracing capabilities in OptiX 5.0 will speed up the visualization of projects and significantly improve how developers interact with their content. They also accelerate the removal of image grain with AI-based denoising and add GPU acceleration for motion blur in animation.
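As a rough illustration of how an AI denoiser fits into an iterative renderer, the hypothetical sketch below keeps accumulating noisy samples and periodically runs a "denoiser" over the partial result so a clean preview appears long before the image converges. None of the names below belong to the OptiX API, and the Gaussian blur only stands in for a trained denoising network.

```python
# Hypothetical sketch: progressive rendering with periodic AI-style denoising.
import numpy as np
from scipy.ndimage import gaussian_filter

def render_one_sample(rng, height, width):
    # Stand-in for one path-tracing pass: the "true" image plus heavy noise.
    truth = np.linspace(0.0, 1.0, width)[None, :].repeat(height, axis=0)
    return truth + rng.normal(scale=0.5, size=(height, width))

def denoise(image):
    # Stand-in for the AI denoiser; a real one would be a trained network.
    return gaussian_filter(image, sigma=2.0)

def progressive_render(height=64, width=64, samples=32, preview_every=8):
    rng = np.random.default_rng(0)
    truth = np.linspace(0.0, 1.0, width)
    accum = np.zeros((height, width), dtype=np.float32)
    for s in range(1, samples + 1):
        accum += render_one_sample(rng, height, width)
        if s % preview_every == 0:
            preview = denoise(accum / s)          # clean-looking early preview
            err = np.abs(preview - truth).mean()  # distance of the preview from truth
            print(f"after {s} samples: mean preview error {err:.3f}")
    return accum / samples

if __name__ == "__main__":
    progressive_render()
```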

OptiX 5.0 will be available free of charge to registered developers in November.

Running NVIDIA OptiX 5.0 on a DGX Station, content creators can noticeably speed up training, inference and rendering. The nearly silent NVIDIA DGX Station, which fits easily under a desk, is built on new-generation NVIDIA Volta GPUs and is the most powerful rendering system using artificial intelligence. To achieve performance equivalent to a DGX Station, developers would need to build a farm of more than 150 servers consuming about 200 kW, compared with 1.5 kW for the DGX Station. The cost of purchasing and maintaining such a farm reaches $4 million over three years, compared with $75,000 for the DGX Station.
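The power and cost figures quoted above work out to roughly the following ratios:

```python
# Ratios implied by the figures quoted above.
farm_power_kw, dgx_power_kw = 200.0, 1.5          # ~150-server farm vs. DGX Station
farm_cost_usd, dgx_cost_usd = 4_000_000, 75_000   # three-year farm cost vs. DGX Station

print(f"power ratio: ~{farm_power_kw / dgx_power_kw:.0f}x")   # ~133x
print(f"cost ratio:  ~{farm_cost_usd / dgx_cost_usd:.0f}x")   # ~53x
```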


Streaming 360 video in stereo becomes a reality

 

NVIDIA introduced the VRWorks 360 Video SDK, which makes it possible to broadcast high-quality stereo 360 video live.

Stereo stitching of 360 video is a step forward for the live broadcast industry. NVIDIA VRWorks accelerates stereo stitching while maintaining high image quality.

The VRWorks SDK allows production studios, camera manufacturers and application developers to integrate 360-degree stereo stitching into their existing live-production and post-production workflows. Z CAM's new V1 Pro is the first professional 360-degree VR camera with the VRWorks SDK fully integrated.


Z CAM, one of the first companies to release a mainstream professional VR camera for live broadcast, is also integrating the NVIDIA VRWorks SDK into its image stitching applications WonderStitch and WonderLive.

The VRWorks 360 Video SDK will be available for download on August 7 and will allow developers to capture, stitch and stream 360-degree video content. VRWorks supports real-time streaming and post-processing of mono and stereo 360-degree video, namely (see the sketch after the list below):

  • merging video streams from 4K cameras in real time and offline;
  • GPU-accelerated decoding, calibration, stitching and encoding of video;
  • 360 projection onto a cube map or panorama;
  • support for GPUDirect for Video for low video latency;
  • support for up to 32 video streams.
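A hypothetical sketch of how those stages might chain together in a per-frame loop is shown below. None of the class or function names are part of the actual VRWorks API; they only illustrate the decode, calibrate, stitch and encode flow and the limits listed above.

```python
# Hypothetical 360 stitching pipeline outline (not the VRWorks API).
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    camera_id: int
    pixels: bytes                                  # decoded pixels (placeholder)

def decode(raw: bytes, camera_id: int) -> Frame:
    return Frame(camera_id, raw)                   # GPU-accelerated decode in VRWorks

def calibrate(frames: List[Frame]) -> List[Frame]:
    return frames                                  # lens / rig alignment step

def stitch_stereo(frames: List[Frame], projection: str = "equirect") -> bytes:
    assert projection in ("equirect", "cubemap")   # both projections are listed above
    return b"".join(f.pixels for f in frames)      # placeholder for the real stitcher

def encode(panorama: bytes) -> bytes:
    return panorama                                # encode for streaming / post-production

def process_frame(raw_inputs: List[bytes]) -> bytes:
    assert len(raw_inputs) <= 32                   # up to 32 streams, per the list above
    frames = [decode(raw, i) for i, raw in enumerate(raw_inputs)]
    return encode(stitch_stereo(calibrate(frames)))

if __name__ == "__main__":
    out = process_frame([b"cam%d" % i for i in range(4)])   # e.g. four 4K cameras
    print(len(out), "bytes of stitched output")
```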


External NVIDIA GPUs for laptops

Interactive rendering, VR content creation and AI development are now possible on laptops. Professional designers with underpowered graphics can now add the power of an NVIDIA TITAN Xp or NVIDIA Quadro® card in an external GPU (eGPU) chassis to significantly improve the performance of their applications.


For everyone who works with content creation applications for animation, color grading and rendering, as well as CAD and modeling applications, external Quadro graphics will be available starting in September through partners certified under the eGPU program (Bizon, One Stop Systems/Magma and Sonnet, with others to follow).

To ensure high performance for professional users working with Autodesk® Maya® and Adobe® Premiere® Pro on systems with an external TITAN Xp, NVIDIA is releasing a new driver.


Shall we play dominoes? The Isaac robot in Project Holodeck

Our new demo will introduce you to two technologies that we announced in May at the GTC conference.

NVIDIA Project Holodeck is a multi-user, physically based, near-real-life environment that allows people to interact with robots in virtual reality just as they would in real life.

NVIDIA Isaac is an artificial intelligence (AI) robot that was trained using Isaac Lab's powerful simulation environment.

Visitors to Siggraph will see how these two technologies work together, interacting with the Isaac robot in two ways.


 

First, you can interact with the robot at the exhibition in the physical world. Second, you can put on a VR headset and enter the simulation environment through Project Holodeck.

The combined use of deep learning and computer vision makes it possible to teach the robot to sense and react to the presence of people, follow the course of the game, recognize which moves are legal, and determine which domino to choose and how to position it. Training with two neural networks helps the Isaac robot not only understand the game, but also how to apply that understanding when interacting with people.

During the game, the first neural network analyzes the captured images of the dominoes and determines the legal moves. The data is then passed to a second neural network, which uses reinforcement learning to decide which domino to select and how to position it.
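The division of labor described above can be illustrated with a small, self-contained sketch: a perception network turns a camera image of the table into a board encoding, and a separate policy network, the part trained with reinforcement learning, scores candidate moves. The architectures, sizes and action encoding below are illustrative assumptions, not NVIDIA's actual models.

```python
# Illustrative two-network split: perception network -> RL policy network.
import torch
import torch.nn as nn

class DominoPerception(nn.Module):
    """Camera image of the table -> fixed-size encoding of the visible dominoes."""
    def __init__(self, encoding_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, encoding_dim),
        )

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.features(image)

class DominoPolicy(nn.Module):
    """Board encoding -> scores over candidate (tile, placement) actions."""
    def __init__(self, encoding_dim: int = 128, num_actions: int = 28 * 2):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(encoding_dim, 128), nn.ReLU(),
            nn.Linear(128, num_actions),
        )

    def forward(self, encoding: torch.Tensor) -> torch.Tensor:
        return self.head(encoding)

if __name__ == "__main__":
    camera_image = torch.rand(1, 3, 96, 96)        # stand-in camera frame
    board = DominoPerception()(camera_image)       # first network: perception
    action_scores = DominoPolicy()(board)          # second network: RL-trained policy
    print("chosen action:", action_scores.argmax(dim=-1).item())
```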

After training in the Isaac environment, the acquired knowledge can be transferred between the physical and virtual worlds.

By building and training robots in a simulation environment and then working with them in a VR environment like Project Holodeck, developers can release robots into the real world faster and at a lower cost.

Early access to the Holodeck beta will open in September 2017. Sign up for Project Holodeck updates.
