Rendering: Where Are We Now?

Rendering is great. You take a flat-looking CAD model or scene, apply some materials and lighting, press some buttons, wait a while…and behold! You have a photorealistic scene, or something as near to that as possible.

Rendering software comes in all shapes and sizes. Hollywood uses renderers in your favorite CGI blockbusters; video game developers use them in your favorite games; and architects, product designers and engineers all use them in more industrial applications, where it’s important to communicate product concepts to clients before cutting any metal.

But in 2018, the distinctions between these industries and the renderers that they use became very blurred indeed.

Real-time renderers from game engine companies such as Epic Games and Unity are vying for the engineering market, traditional engineering and product design renderers are adding physics simulation, some renderers are moving to the cloud, and one company is even using real-time rendering for medical imaging. It’s an exciting time to be involved in rendering, be it as a render engine designer, a design engineer, or, perhaps most importantly, one of the clients who will ultimately be feasting their eyes on your visualizations.

We chatted with six rendering software companies to see what is up in the world of this technology.

Unreal Engine

We have been getting down and dirty with Unreal Engine from Epic Games recently, as we have been test driving the company’s new Unreal Studio feature, which is designed to help engineers, architects and product designers create lovely scenes with their existing CAD data.

For those who don’t know, Unreal Engine is a gaming engine that has been used to develop games such as Unreal Tournament and Fortnite, as well as the slightly more obscure (but awesome) tech-noir horror/thriller Observer (starring Rutger Hauer).

Figure 1. “The Speed of Light” Porsche demo, which was created with Unreal Engine 4 using real-time ray tracing and NVIDIA RTX.

We spoke to Thomas Convard, technical product manager at Epic Games, to get the lowdown on why engineers might like to try Unreal Studio.

“Unreal Studio offers a quick path from the CAD tools to an interactive experience environment, where users can assess their design in real time plus VR or AR,” said Convard.

“[Engineers using Unreal Studio are] designing complex systems that involve reviewing, comparing design solutions, and making decisions.”

For a few years, architects and product designers have been the primary users of Unreal Engine as a tool to create renders of their products. Until the release of Unreal Studio, however, the software’s workflow was very clunky, with users having to create their own solutions for converting their various CAD files into something usable in Unreal Engine. Now, with the aid of Datasmith and the Unreal Studio workflow, an increasing number of engineers are taking to Unreal Engine.

“In aerospace or automotive, for example, Unreal Studio got used initially because of the visual quality of Unreal Engine. So, the first adopters were more from industrial design or marketing,” explained Convard.

“Now we’re seeing a broader adoption in those companies, for system engineering, manufacturing or simulation, where the user profile is more engineer.”

One such company is British supercar manufacturer McLaren, which is using the software to create real-time configurators that enable customers to see how their options will change the appearance of the final product. You can read more about that project, named the McLaren Design Visualization Application, and download a white paper on how McLaren achieved this, over at this link.

Figure 2. McLaren Configurator. (Image courtesy of Epic Games.)

Part of the attraction for engineers surely comes from the physics engine built into Unreal Engine. Collision detection, effects such as gravity, mass and inertia, and lighting can all be simulated within the engine, making for a powerful tool that goes far beyond rendering pretty pictures (although it does that pretty well, too!).
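To make that idea a little more concrete, here is a toy sketch, in Python, of the kind of per-frame update a physics engine performs: gravity applies a force, inertia carries velocity between frames, and a crude collision response handles contact with the ground. This is purely illustrative and far simpler than Unreal Engine’s actual solver; the object, mass and timings are made up for the example.

```python
# Toy sketch (Python) of the kind of per-frame update a physics engine performs.
# Purely illustrative and far simpler than Unreal Engine's actual solver.

GRAVITY = -9.81  # m/s^2, acting along the vertical axis

def physics_step(height: float, velocity: float, mass: float, dt: float):
    """Advance one falling object by one frame using semi-implicit Euler integration."""
    force = mass * GRAVITY              # only gravity acts on the object here
    velocity += (force / mass) * dt     # inertia: velocity carries over between frames
    height += velocity * dt
    if height < 0.0:                    # crude collision detection against a ground plane
        height = 0.0
        velocity = -0.5 * velocity      # collision response: bounce with energy loss
    return height, velocity

if __name__ == "__main__":
    h, v = 2.0, 0.0                     # drop an object from 2 m at rest
    for _ in range(240):                # simulate 4 seconds at 60 frames per second
        h, v = physics_step(h, v, mass=1.0, dt=1.0 / 60.0)
    print(f"height after 4 s: {h:.3f} m")
```

A real engine runs steps like this for thousands of bodies per frame, alongside the renderer, which is exactly what makes game engines interesting beyond pretty pictures.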

“Engineers in the autonomous vehicle field use Unreal Engine as a simulation platform—[with] environment and vehicle motion being simulated in UE, as well as sensor data being synthesized. Engineers are often coupling [it] with a legacy simulation software stack.”

This is an interesting point. Engineers are taking the outputs of their existing simulations and feeding them into Unreal Engine to drive motion, graphical changes and other characteristics. Remember, Unreal Engine was designed as a gaming engine. And what is a video game, if not just a bunch of inputs driving something on screen? In principle, you can take any numerical input, feed it into Unreal Engine, and have your data visualized to your own requirements.

As a very basic example, imagine a desktop fan with a rotational speed sensor outputting the RPM of the fan, a position sensor that matches the 3D model to the real-life blade position, and a temperature sensor on the motor. It’s not difficult to have Unreal Engine take those inputs and translate them into color changes for the temperature, rotations for the blades, and so on. Hey, that sounds like a digital twin!
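As a rough, engine-agnostic sketch of that fan example, the Python below maps the three sensor readings to visual parameters (blade rotation and a temperature color). In practice this mapping would live in an Unreal Blueprint or a C++ Actor; the names FanTelemetry and update_visuals are purely illustrative, not part of any real API.

```python
# Engine-agnostic sketch of the fan "digital twin" described above: three sensor
# readings drive visual parameters each frame. Names here are illustrative only.

from dataclasses import dataclass

@dataclass
class FanTelemetry:
    rpm: float               # from the rotational speed sensor
    blade_angle_deg: float   # from the blade position sensor
    motor_temp_c: float      # from the motor temperature sensor

def temp_to_color(temp_c: float, cold: float = 20.0, hot: float = 90.0) -> tuple:
    """Map motor temperature to an RGB color ramp: blue (cold) through red (hot)."""
    t = max(0.0, min(1.0, (temp_c - cold) / (hot - cold)))
    return (t, 0.0, 1.0 - t)

def update_visuals(telemetry: FanTelemetry, dt: float) -> dict:
    """Translate one telemetry sample into per-frame visual parameters."""
    degrees_per_second = telemetry.rpm * 360.0 / 60.0
    return {
        "blade_rotation_deg": (telemetry.blade_angle_deg + degrees_per_second * dt) % 360.0,
        "motor_color_rgb": temp_to_color(telemetry.motor_temp_c),
    }

if __name__ == "__main__":
    sample = FanTelemetry(rpm=1200.0, blade_angle_deg=15.0, motor_temp_c=65.0)
    print(update_visuals(sample, dt=1.0 / 60.0))  # one frame at 60 fps
```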

“We’re also seeing people using UE as a ‘digital twin’ …for remote supervising or control of complex systems such as factory [processes], robotics, etc.”

Epic Games has some big customers that are using Unreal Engine for engineering purposes. Oceaneering, for example, is using Unreal Engine to augment underwater remotely operated vehicle (ROV) camera vision with HUDs based on the ROV sensor data to aid in poor visibility conditions. NASA uses Unreal Engine to recreate the International Space Station for training astronauts, and BMW is using the platform for its mixed reality lab, which combines VR/AR with tactile hardware for an even more immersive experience.

You can download the latest version of Unreal Studio (beta) for free right here, and try it for yourselves.

SOLIDWORKS Visualize

Next, we spoke to Dassault Systèmes about how SOLIDWORKS’ rendering capabilities have been evolving.

Up until 2011, SOLIDWORKS users would have been familiar with the PhotoWorks add-in, which was based on NVIDIA’s Mental Ray engine (and was used for movies such as Star Wars Episode II).

Since then, SOLIDWORKS users have been employing PhotoView 360, which is described as a one-button render utility. And it’s a fairly accurate description.

You load the PhotoView 360 add-in into SOLIDWORKS, position your scene as you like it, and push a “render” button. Then, depending on how powerful your computer is, you wait for a while and your scene is rendered. It’s very user-friendly, and comes with a whole bunch of high-dynamic-range imaging (HDRI) scenes, materials and default lighting schemes designed to help engineers (who may not be especially artistically minded) create photorealistic images.

More recently, SOLIDWORKS users have been given access to a new application named SOLIDWORKS Visualize Professional, which is much faster than PhotoView 360, includes a pretty sweet Denoiser feature and animation tools, and in its latest release even contains a driving simulator. We wrote a review and tutorial on that driving simulator a couple of months ago.

[Author’s note: If you’re noticing a lot of automotive renders in this article so far, it’s because they look nice. There is something about the curves, paintwork color and reflections on a car model that really makes a vehicle the perfect subject for render testing.]

We spoke to Brian Hillner, senior product portfolio manager at SOLIDWORKS, to see what the company has to say about rendering.

“A good sentence to describe Visualize is that it enables anyone to create photo quality images, animation and other 3D content in the fastest, easiest way possible. So that’s kind of our mantra: fast, easy and fun. We like to think of it as the camera for your CAD data.

“The way that it differs from PhotoView, is that PhotoView 360 is best for designs that are ongoing, with continuous changes, very iterative and very early on in the design process. And because PhotoView is integrated into SOLIDWORKS CAD, it means you don’t have to export [the file] out. You can see all those reflections and materials and surface details, but not in a photoreal way. It’s not highly photoaccurate, but it gets the job across because it’s a very quick and basic idea of what your product will look like, without having to leave the native SOLIDWORKS environment.

“Now, where Visualize has picked up the baton from there, it introduces a very simple and clean interface which is more similar to traditional photography. A lot of the settings, and buttons, and naming conventions reflect what a traditional photographer would see in their photo studio.”

Figure 3. This ’69 Camaro took about 7 minutes to render on full settings at 4K. (Image courtesy of the author.)

So, you can work on your model in SOLIDWORKS and check it in PhotoView 360 to get a rough idea of how it will appear. Then, when you are satisfied with the design, you can load the Visualize add-in within SOLIDWORKS proper and have the model pushed directly to Visualize, enabling you to maintain the materials and lighting that you tried out in the CAD program. Or you can scrap it all and add your own lights, materials and scenery in Visualize. It’s up to you.

“We have a local library in Visualize, and a cloud library with over 500 materials, which look pretty darn good. And there’s infinite material types that you can create because there’s about 15 [base] material types now available in Visualize, so it’s possible to create any material that you see in the real world.”

So, what other differences are there between PhotoView 360 and Visualize? For starters, they use different render engines.

“PhotoView 360 uses a render engine called Modo from The Foundry, and Visualize uses Iray from NVIDIA. Iray is much, much more photorealistic and renders much, much faster. So those are the two biggest improvements.

“The render engine itself is faster than in PhotoView, just because of the way the rays bounce and reflect back to the camera…. I don’t want to get too technical, but the render engine is designed to be extremely fast. But the main improvement is the render performance on NVIDIA graphics cards. So, pair your NVIDIA graphics card with Visualize, which uses Iray, and your renders become lightning quick.

“One of the new features in Visualize is the artificially intelligent Denoiser that NVIDIA came out with. We at Visualize implemented it in the 2018 release, with Service Pack 3.

“What NVIDIA did was, they got thousands of images and thousands of datasets, and trained a neural network to understand what noise is, and how to recognize noise in renders. And they asked [the computers], ‘hey, you know what noise is…why don’t you just remove it?’ So that’s what it is. It’s an insane performance improvement that is literally just a checkbox in Visualize, and the results from our own benchmarking tests show that it is 10x faster.

“So, close-up images of objects such as in automotive, with loads of reflections and surface detail which may have taken an hour before, can now be rendered in 10 minutes. An animation which may have previously been an overnight job, can now be finished over lunchbreak. This is all down to NVIDIA and Iray. I’ve been in the rendering business for 15 years now, and I’ve never seen an innovation [the Denoiser] shave off so much time before. It’s leaps and bounds from where we were previously. So, this AI Denoiser is totally game-changing.”
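The description above boils down to supervised learning on pairs of noisy and clean renders. The sketch below is not NVIDIA’s actual denoiser (that is proprietary and far more sophisticated); it is a minimal PyTorch illustration of the idea, trained here on random stand-in data rather than real renders.

```python
# Conceptual sketch only, not NVIDIA's denoiser: a tiny convolutional network trained
# on pairs of noisy and clean images. Requires PyTorch; random tensors stand in for
# real rendered image pairs.

import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """A deliberately small CNN that predicts a clean image from a noisy one."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
        )

    def forward(self, noisy):
        return self.net(noisy)

if __name__ == "__main__":
    model = TinyDenoiser()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    # Stand-in for "thousands of images": clean targets plus simulated render noise.
    clean = torch.rand(8, 3, 64, 64)
    noisy = (clean + 0.1 * torch.randn_like(clean)).clamp(0.0, 1.0)

    for step in range(200):
        optimizer.zero_grad()
        loss = loss_fn(model(noisy), clean)   # learn to map noisy renders to clean ones
        loss.backward()
        optimizer.step()

    denoised = model(noisy)                   # at inference time, this is the "checkbox"
    print("final training loss:", loss.item())
```

The appeal for the end user is that all of this training happens ahead of time on someone else’s hardware; in the rendering package it really is just a checkbox applied to a half-converged image.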

So, why would engineers want to use Visualize for their renders? We asked for a quick elevator pitch.

“I can answer this in two words…‘photo-quality’ and ‘ease-of-use.’ It doesn’t get any more simple than that.”

That’s technically five words, but we added hyphens, so we think it still counts. Check it out for yourselves, and you’ll be impressed with the speed and quality of the renders it produces.

Figure 4. Nice paint jobs. (Image courtesy of the author.)

Figure 4. Nice paint jobs. (Image courtesy of the author.)

The Vivid Metallic Paint appearance (material) pack is pretty gorgeous, and we have a lot of fun with it (as you can see from the images in this section).

KeyShot

KeyShot from Luxion has been around for a while and is currently on its eighth version.

Luxion claims on its website that KeyShot is “the fastest, most powerful software for real-time 3D rendering and animation.” One thing we can confirm is that KeyShot does make some pretty impressive renders. And we can also confirm that, qualitatively speaking, it is very easy to use, having downloaded the demo and tried it for ourselves.

Figure 5. A machined thing…or is it? Nope, it’s a render. (Image courtesy of Related Fluid Power.)

Rendering with KeyShot is easy. Just how easy? Well, you just import your CAD model, drag and drop the materials onto the 3D model, adjust the lights, position the camera…and then action! It renders before your very eyes.

We spoke with Josh Mings, director of marketing at Luxion, to find out why engineers like to use KeyShot.

“For an engineer, KeyShot is a fast solution for the need of 3D visuals throughout the design engineering process,” said Mings.

“It helps keep the focus on the design and engineering while letting you create a quick rendering when you need it. KeyShot also supports import for all the major 3D modeling applications and 3D file formats, plus has plug-ins for the most common.”

Indeed it does. Engineers will be pleased to hear that those formats include SOLIDWORKS, Solid Edge, SketchUp, PTC Creo, Rhino, Pro/E, IGES, STEP, FBX, OBJ and 3ds, to name but a few.

“We have a lot of engineering companies that use KeyShot. Some in particular to see are Great Plains, Related Fluid Power, and Caterpillar.”

You can see more examples over at this link, and you can see a rendering of some Caterpillar heavy machinery in Figure 6.

Figure 6. Digging things. (Image courtesy of Caterpillar Inc.)

So, what makes KeyShot unique?

You can see on the company’s features page that there are six aspects that Luxion says make all the difference to engineers. These are:

  • Real-time speed
  • Ease of use
  • CPU-powered rendering
  • Accurate materials
  • Advanced lighting
  • Efficient workflow

We asked Luxion what it thought the upcoming trends would be in rendering for product design, architecture and engineering.

“With both hardware and software becoming more powerful, we’ll see some advancement in real-time rendering speeds and the level of realism that is able to be achieved,” said Mings.

“Automation (AI) will be a big topic, not just in generative design, but in adaptation of the software itself. VR capabilities are going to expand and engineers will be able to view their work and even work in the environment their design will be used in.”

So, what is new in the latest release of KeyShot? What new innovations will the company be bringing to the current rendering space?

“KeyShot 8 is a pivotal release that adds tools to do more with materials and removes the dependence on preparing geometry before import to KeyShot or modifying the image after rendered from KeyShot. You can import a model and create interactive cutaways quickly, add 3D textures to surfaces, and create an endless amount of image styles to adjust the feel/appearance of the image.”

You can see more on the What’s New page here, or you can download a fully functional demo right here.

Autodesk Fusion 360

Autodesk has a number of renderers available, including the one found in its flagship Fusion 360 product. More recently, the company has announced further collaborations with game engine maker Unity to appeal to those in technical fields looking to get in on the real-time rendering action.

To talk about Fusion 360, we spoke to Rob Cohee, senior product manager at Fusion 360.

“Our customers love rendering in Fusion 360 because it’s fully integrated with their workflow—there’s no need to switch between programs or convert files—and that saves them a lot of time and energy at the end of the day,” said Cohee.

“Fusion 360 automates the tasks that a user would typically have to complete after making a change. Instead, they can easily drag and drop views into the render gallery to setup renders or save updates.”

With Fusion 360 being partly cloud based, should we expect to see more rendering platforms migrating to the cloud to take advantage of all of that delicious and cheap GPU-based compute? Is the future of rendering in the cloud? We asked Autodesk that exact question.

“The expected answer here is that the future of rendering is cloud rendering,” said Cohee.

“But we offer both and will continue to offer both because that’s what our users are asking for. They want the choice to work online, offline, in the cloud, or to use their own hardware to complete rendering tasks.”

Figure 7. Cloud or local? It’s your choice. (Image courtesy of Autodesk.)

Unity

Epic Games isn’t the only game engine company with its eyes on the engineering, product design and architectural markets.

Just a couple of months ago, we reported how game engine platform Unity had partnered with Autodesk to bring Revit and VRED format support to the Unity engine for VR modeling. Indeed, it’s not the first time Autodesk and Unity have partnered, as the companies previously added support for Autodesk’s Maya and 3D Studio Max to the game engine.

Figure 8. Imported from VRED into Unity with ease. (Image courtesy of Unity.)

File size is one huge difference between engineering CAD models and the 3D models used in gaming. Engineering models tend to be fairly heavy due to the curvatures and high degree of accuracy involved, especially when it comes to CAD parts that are to be used in manufacturing.

But to be honest, you don’t really need so much detail in a video game. Imagine if you used exclusively engineering files when designing a video game. You could end up with a gigabyte of model data just for your main character!

To address this, Unity has partnered with PiXYZ STUDIO on a plug-in meant to deal specifically with CAD data. Like Datasmith, the plug-in reduces the complexity of the CAD model and makes it compatible with the game engine.
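The heavy lifting such plug-ins do is tessellation and decimation: turning precise CAD geometry into a triangle mesh light enough for real-time use. Below is a rough sketch of the decimation step using the open-source Open3D library rather than PiXYZ (whose internals aren’t public); the file names are hypothetical.

```python
# Not the PiXYZ plug-in itself, but the same basic idea: reduce the triangle count of a
# tessellated CAD export so it becomes light enough for a real-time engine. This sketch
# uses the open-source Open3D library; the file names are hypothetical.

import open3d as o3d

def decimate_for_realtime(path_in: str, path_out: str, reduction: float = 0.9) -> None:
    """Simplify a mesh so that roughly (1 - reduction) of its triangles remain."""
    mesh = o3d.io.read_triangle_mesh(path_in)
    original = len(mesh.triangles)
    target = max(1, int(original * (1.0 - reduction)))
    simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=target)
    print(f"{original} triangles -> {len(simplified.triangles)} triangles")
    o3d.io.write_triangle_mesh(path_out, simplified)

if __name__ == "__main__":
    # Hypothetical tessellated export of a CAD part (e.g. produced from a STEP file).
    decimate_for_realtime("bracket_tessellated.obj", "bracket_game_ready.obj")
```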

The Autodesk/Unity workflow is slowly being rolled out over the next few months, with a few customers currently testing the system.

You can see how Revit files can now be used directly with Unity in the video below.

We spoke to Dan Prochazka, senior product manager for AEC at Unity, about why real-time rendering in a gaming engine is an attractive prospect to engineers.

“Real-time rendering enables more comfortable access to rich interactive 2D, 3D, VR and AR experiences,” said Prochazka.

“Brands like Volkswagen and Skanska use real-time rendering to save money, eliminate design friction, and increase efficiencies for better collaboration—and taking advantage of the interoperability between Unity and Autodesk helps improve this all around.

“Being a gaming engine, and having all that physics floating around under the hood, has meant that Unity customers have also discovered that a game engine can be used for much more than rendering.

“Clients including Willow and Stereograph [are using Digital Twins built with Unity], which are enabling inputs from the design phase of a project and from real-time IoT devices to see the current state of a building. Additionally, other clients in areas like manufacturing, shipping and mining have created monitoring systems that connect monitoring data to a 3D visualization model to aid with visual identification of issues.

“This [Autodesk/Unity product] is new and is still being rolled out to clients. An example of a client that will be able to benefit from the collaboration is offshore and onshore geotechnical and survey services company Fugro, who is using Unity to create data visualizations that help identify safety risks.”

Siemens Cinematic Rendering

Finally, we come to Siemens, which also has a wide range of products for rendering. Siemens NX users will be familiar with the Lightworks Iray+ engine, which is also based on NVIDIA’s Iray technology.

For this article, though, we are going to take a look at Siemens’ Cinematic Volume Rendering Technique (VRT) product, which is developed by Siemens Healthineers.

It’s slightly outside the realm of engineering rendering, but it does produce some very awesome and usable imagery for medical applications (and it uses engineering to do so). For those reasons, we are including the product in an article on rendering for engineers. And besides, who doesn’t want to see how rendering technology is letting us see the meaty bits inside us?

“Cinematic VRT basically operates as a virtual camera. The program makes it possible to hide soft tissue, muscles and blood vessels, giving a clear view of the bone structure,” explained Klaus Engel, researcher at Healthineers.

Figure 9. Cinematic VRT at work—weirdly hungry for oxtail soup right now. (Image courtesy of Siemens.)

The software uses computed tomography (CT) and magnetic resonance imaging (MRI) scans to build a 3D model within Cinematic VRT. And once those body parts have been converted into 3D and digitized, they can be switched on and off like parts in a CAD assembly (or used to make cutaway views), as you can see in the realistic and meaty image of a body in Figure 9.
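To give a flavor of how “hiding soft tissue” can work in a volume renderer, here is a heavily simplified sketch: a transfer function assigns opacity by voxel intensity, so low-density tissue becomes transparent while bone-like densities stay visible. This is a generic illustration with synthetic data, not Siemens’ Cinematic VRT algorithm.

```python
# Heavily simplified illustration of hiding soft tissue in a CT-like volume: a transfer
# function maps voxel intensity to opacity, then intensities are composited along one
# axis. Synthetic data stands in for a real scan; not Siemens' Cinematic VRT algorithm.

import numpy as np

def transfer_function(volume: np.ndarray, bone_threshold: float = 300.0) -> np.ndarray:
    """Per-voxel opacity: ~0 for soft tissue, ramping up to 1 for bone-like densities."""
    return np.clip((volume - bone_threshold) / 200.0, 0.0, 1.0)

def composite_along_z(volume: np.ndarray, opacity: np.ndarray) -> np.ndarray:
    """Front-to-back alpha compositing of intensity along the z axis (one fixed view)."""
    image = np.zeros(volume.shape[:2])
    remaining = np.ones(volume.shape[:2])   # how much light is still unblocked per pixel
    for z in range(volume.shape[2]):
        a = opacity[:, :, z]
        image += remaining * a * volume[:, :, z]
        remaining *= (1.0 - a)
    return image

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ct = rng.normal(50.0, 30.0, size=(64, 64, 64))   # soft-tissue-like intensity values
    ct[20:40, 20:40, :] += 600.0                     # a dense, bone-like block
    picture = composite_along_z(ct, transfer_function(ct))
    print(picture.shape, float(picture.max()))
```

Switching tissue types on and off, as described above, amounts to swapping in a different transfer function for each intensity range.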

“For many people, Cinematic Rendering offers the first real insight into what is going on inside their body,” said Bernd Montag, CEO of Siemens Healthineers.

“This example highlights the broad range of opportunities originating from the digitalization of healthcare.”

You can see how Siemens Healthineers is using AI with Cinematic Rendering in the video below.

Final Thoughts

So, that’s about it for this article. We have seen the cross-pollination of technologies and workflows from different industries—all making use of that delicious GPU power. And, in the case of KeyShot and SOLIDWORKS Visualize (among others), some companies are still keeping the option of CPU rendering on the table.

As the cloud develops, and GPUs become cheaper and cheaper still, we may see CPU-based rendering disappear sometime in the future. But until then, there are at least a few companies that are happy to keep rendering on CPUs on bare-metal systems, so there’s no need to throw out your expensive (but aging) workstation just yet.

Also, as with the medical imaging example, we should probably expect to see a few interesting render-based technologies coming out of left field. For sure, we will hear more case studies of companies making use of game engines to produce simulations and other visualizations.

Of course, we didn’t really touch upon VR or AR in this article—many of the real-time renderers have VR capabilities (particularly the game engines). VR will play a big role in rendering in the future, but the “VR in Engineering Design” article is a topic unto itself that is best saved for a later date.

Until then, download yourselves some new renderers and have some fun! Most are free or have demos or student versions, so check them out.

Source: engineering.com