
Removing access constraints for large datasets and high-quality rendering


Combine the power of Microsoft Azure Remote Rendering (ARR) with the advanced data preparation and optimization capabilities of the Theorem Visualization Pipeline, part of the Theorem-XR product suite, and the constraints around large datasets and high-quality rendering are removed.

As a member of the Microsoft Mixed Reality Partner Program (MRPP), we have worked closely with Microsoft to ensure the Azure platform integrates seamlessly with our pipeline and experiences. This enables us to deliver innovative, class-leading, collaborative mixed reality experiences for Visualization, Design Review and Factory Layout on the HoloLens 2.


Driving Innovation Forwards

With enhanced data sharing and supplier collaboration, users are taking advantage of the new spatial computing and visualization capabilities of HoloLens. Our Engineering and Manufacturing customers are driving our ARR focus forward, and it is helping them to improve their design quality through world-class, innovative engineering and manufacturing products and services using HoloLens and ARR.

ARR is the obvious next step for existing XR users and for anyone who needs to deploy a Mixed Reality (MR) enabled use case.


Exceeding Design Data Expectations

Theorem’s ARR solution delivers remotely rendered data from the Azure Remote Rendering servers and lets users add locally rendered content, combining streamed data with existing managed assets.

This is all facilitated by the seamless integration between Theorem’s Visualization Pipeline and the ARR Server providing an easy-to-use mechanism to deliver content to the Theorem-XR Experiences.
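For readers who want to see what driving ARR from the service side can look like, here is a minimal sketch using the Azure SDK for Python (the azure-mixedreality-remoterendering package) to start a rendering session. The account values, session identifier, size and lease time are placeholder assumptions, and this is not Theorem's integration code; the Theorem-XR experiences connect to the resulting session and combine its streamed frames with locally rendered assets.

```python
# Minimal sketch, not Theorem's integration: start an ARR session that XR
# clients could then connect to. All account values below are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.mixedreality.remoterendering import (
    RemoteRenderingClient,
    RenderingSessionSize,
)

client = RemoteRenderingClient(
    endpoint="https://remoterendering.westeurope.mixedreality.azure.com",  # assumed region
    account_id="<arr-account-id>",
    account_domain="westeurope.mixedreality.azure.com",  # assumed account domain
    credential=AzureKeyCredential("<arr-account-key>"),
)

# Ask for a rendering server and keep it alive for an hour; the session id,
# size and lease time are illustrative choices.
poller = client.begin_rendering_session(
    session_id="design-review-session",
    size=RenderingSessionSize.STANDARD,
    lease_time_minutes=60,
)
session = poller.result()
print(f"Session {session.id}: status={session.status}, host={session.hostname}")
```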


What is Remote Rendering?

Every day we take for granted that 2D and 3D data is rendered on our phones, laptops and PCs so that we can visualize and interact with it. The same requirement exists for Extended Reality (XR) devices. However, the limited capacity of an XR device can mean that it cannot render the volume of data needed to deliver a good experience, and XR devices also need to achieve high frame rates to keep that experience comfortable.

Remote rendering offloads the rendering process to a separate server, which might be in the cloud or on an internal network. These servers normally include high-end CPUs and GPUs with large amounts of RAM. The rendered data is then streamed to the device, whether that is a phone or an XR headset, and displayed. This significantly reduces the processing load on the device.

Without good rendering performance, the quality of the graphics and the richness of effects such as textures and lighting are lost. Remote rendering removes the need for high-performance CPUs and GPUs on the XR device, as rendering is done by the rendering server, solving the quality and performance problem.
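To make the split of work concrete, the following is a purely conceptual sketch of the per-frame loop on the device when rendering is offloaded: track the head pose locally, ask the server for a frame rendered from that pose, and display the streamed result. Every name in it is hypothetical; real runtimes such as ARR hide this loop and add pose prediction, video compression and late-stage reprojection.

```python
# Conceptual sketch only: every name here is hypothetical and does not
# correspond to any real SDK.

def render_loop(request_remote_frame, get_head_pose, display, num_frames):
    """Per-frame loop on the XR device when rendering is offloaded to a server."""
    for _ in range(num_frames):
        pose = get_head_pose()              # cheap on-device head tracking
        frame = request_remote_frame(pose)  # heavy rendering happens on the server
        display(frame)                      # decode and present the streamed image

# Stub usage with fake functions, just to show the shape of the loop.
render_loop(
    request_remote_frame=lambda pose: f"frame rendered at pose {pose}".encode(),
    get_head_pose=lambda: (0.0, 1.6, 0.0),   # x, y, z in metres
    display=lambda frame: print(frame.decode()),
    num_frames=3,
)
```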


Why use Remote Rendering?

The number of polygons, the file size and the number of individual parts determine whether a 3D object can be displayed with good graphics quality while maintaining performance (frame rate) on an XR device.

One significant issue many XR users face is the limited computing and graphics power of smartphones, tablets and XR headsets. While new devices appear every few months offering better performance, the need to display very large data sets at high quality will remain: there are physical barriers to what these devices can do, as well as the need to reduce the form factor in order to bring immersive technologies to the mainstream.

The result is a trade-off between graphics quality and performance. As with most things, the devil is in the detail; the device, the data size, the quality needed, the target frame rate and the use case will all affect the outcome.


Approaches to the problem

Theorem’s Visualization Pipeline provides a fully automated process to simplify and remove detail from 3D CAD content so that the XR object is delivered at the optimum size and quality to achieve good performance on the user’s XR device of choice. However, there are use cases where very large data sets cannot be optimised down to the volume of polygons needed while maintaining quality.

The trade-off between optimisation and quality can result in a model which is unusable. Remote rendering solves that problem, enabling models with tens or hundreds of millions of polygons to be used on XR headsets and mobile devices.

Offloading the high computational demands of rendering large data sets at high quality from the device to a remote rendering server solves this problem, enabling lower-cost headsets to deliver great graphics quality and performance to the user.
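As an illustration of that routing decision, the sketch below uses assumed numbers: if a model can be decimated to a device-friendly polygon budget without discarding too much detail, it is optimised and rendered locally; otherwise it goes down the remote rendering route. The budget and ratio are illustrative assumptions, not figures from the Theorem Visualization Pipeline or ARR.

```python
# Illustrative routing logic only: budgets and ratios are assumed, not
# figures from the Theorem Visualization Pipeline or from ARR.
DEVICE_POLYGON_BUDGET = 500_000   # rough budget a standalone headset handles well
MIN_ACCEPTABLE_RATIO = 0.05       # decimating below 5% of original detail assumed unusable

def choose_rendering_route(polygon_count: int) -> str:
    """Decide whether a model should be optimised locally or rendered remotely."""
    if polygon_count <= DEVICE_POLYGON_BUDGET:
        return "render locally as-is"
    ratio_needed = DEVICE_POLYGON_BUDGET / polygon_count
    if ratio_needed >= MIN_ACCEPTABLE_RATIO:
        return "optimise (decimate) and render locally"
    return "stream via remote rendering"

for polys in (200_000, 4_000_000, 150_000_000):
    print(f"{polys:>12,} polygons -> {choose_rendering_route(polys)}")
```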


Theorem-XR Experiences

Visualization

The Visualization Experience enables a user to visualize 3D CAD data at full scale and in context and interrogate the data in Augmented, Mixed and Virtual Reality devices.

Design Review

With the Design Review Experience a group of designers can collaboratively review a design milestone in full scale and in context, make comments and annotate the design for later review.

Factory Layout

The Factory Layout Experience enables an individual or a group of engineers to lay out a work cell, production line or full factory using Mixed and/or Virtual Reality.

Guides

Guides provides a solution for creating training or work instructions, either to teach a front-line worker a build process or to guide an experienced worker through a complex task. It does this using Mixed or Virtual Reality headsets and desktop devices.

Visual Digital Twin

The Visual Digital Twin delivers a rich 3D digital model that can be overlaid on top of a physical object and tracked, for use in inspection, training, packaging and in support of a full Digital Twin.

Visualization Pipeline

The Visualization Pipeline is a fully automated solution that prepares your CAD data for use in XR. It can be run in batch mode, processing data triggered by PLM processes, from inside the CAD system using the Save As command, or by drag and drop from the file system.
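As a hedged illustration of the drag-and-drop route, the sketch below watches a drop folder and hands each new CAD file to a pipeline command. The folder layout and the theorem-pipeline command are hypothetical placeholders; in the product the triggering mechanisms (PLM events, CAD Save As, file drop) are configured rather than scripted.

```python
# Hypothetical sketch of a drop-folder trigger; 'theorem-pipeline' and the
# folder layout are placeholders, not the product's real interface.
import subprocess
import time
from pathlib import Path

DROP_FOLDER = Path("C:/xr_pipeline/drop")
CAD_EXTENSIONS = {".catpart", ".jt", ".prt", ".stp"}

def process(cad_file: Path) -> None:
    # Hand the file to the (hypothetical) pipeline command for optimisation.
    subprocess.run(
        ["theorem-pipeline", "--optimise-for", "hololens2", str(cad_file)],
        check=True,
    )

def watch(poll_seconds: float = 5.0) -> None:
    DROP_FOLDER.mkdir(parents=True, exist_ok=True)
    seen = set()
    while True:
        for cad_file in DROP_FOLDER.iterdir():
            if cad_file.suffix.lower() in CAD_EXTENSIONS and cad_file not in seen:
                seen.add(cad_file)
                process(cad_file)
        time.sleep(poll_seconds)

if __name__ == "__main__":
    watch()
```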