Epic Games has released Unreal Engine 4.27, the latest update to the game engine and real-time renderer. Although it’s tempting to think of it as a stopgap release before the game-changing Unreal Engine 5 – now in early access, and due for a production release “targeting 2022” – it’s a massive update in its own right. The online release notes alone run to over 40,000 words. In this article, we pick out 10 changes that are particularly significant for artists, as opposed to programmers – from headline features like the new in-camera VFX tools to hidden gems like multi-GPU lightmap baking and the new camera calibration plugin. Let’s dive in!
Unreal Engine 4.27 includes a number of new features for in-camera visual effects work, many tested in production during the making of Epic Games’ new live-action demo. They include a set of updates to the nDisplay system, used to project CG environments rendered in real time in Unreal Engine onto an LED wall or dome, against which actors can be filmed. Key changes include support for OpenColorIO (OCIO), the colour-management standard specified by the VFX Reference Platform and described as a ‘gateway’ to ACES colour-managed pipelines for movie work.
In addition, the process of creating new nDisplay set-ups has been streamlined, with a new 3D Config Editor, and all of the key settings consolidated into a single nDisplay Root Actor. Features added in beta include the option to dedicate a GPU to the inner frustum of the display when running a multi-GPU set-up, enabling more complex content to be displayed there. Moreover, experimental support has been added for running nDisplay on Linux, although some key features – notably, hardware-accelerated ray tracing – are not yet supported.
Unreal Engine 4.27 also introduces a number of new features for controlling virtual cameras, intended for scouting locations on virtual sets and for generating camera moves. They include Live Link VCAM, a new iOS app for controlling virtual cameras from an iPad, described as offering a “more tailored user experience” than the existing Unreal Remote app. Related changes include a new drag-and-drop system for building interfaces for controlling UE4 projects from a tablet or laptop, updates to the Remote Control Presets system, and a new Remote Control C++ API.
The release also introduces a new Level Snapshot system, which makes it possible to save and restore configurations for a level without making a permanent change to the project or to source control. For in-camera VFX work, the system makes it possible to adjust a CG environment being rendered in Unreal Engine on a per-shot or per-sequence basis. However, it can also be used more generally to create variant designs for a project for creative reviews.
Support for Universal Scene Description, Pixar’s increasingly ubiquitous framework for exchanging production data between DCC applications, has also been extended. Key changes include the option to export an entire Unreal Engine Level as a primary USD file, with any sublevels or assets automatically being exported as separate USD files referenced by that primary file. Materials with textures can be baked down and exported with the level. In addition, animation sequences can now be exported to “several USD file formats”. Export includes all bone and blendshape tracks, and both the animation preview mesh and the animation itself. The implementation also now supports Nvidia’s MDL material schema, favoured by Nvidia over MaterialX in Omniverse, its USD-based – and UE4-compatible – online collaboration platform.
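To make the export structure concrete, here is a minimal sketch of what a primary USD file referencing separately exported sublevels can look like. The layer is written by hand in Python for illustration; the file and prim names are hypothetical, and this is generic USD layer syntax rather than the exact output Unreal Engine produces.

```python
# Sketch: compose a minimal primary .usda layer that references two
# sublevel layers, mirroring the pattern of a Level exported as a
# primary USD file with sublevels as separate referenced files.
# All names here are hypothetical examples.
PRIMARY_USDA = """#usda 1.0
(
    defaultPrim = "MainLevel"
)

def Xform "MainLevel"
{
    def "Sublevel_Lighting" (
        references = @./MainLevel_Lighting.usda@
    )
    {
    }

    def "Sublevel_Props" (
        references = @./MainLevel_Props.usda@
    )
    {
    }
}
"""

with open("MainLevel.usda", "w") as f:
    f.write(PRIMARY_USDA)
```

Because the sublevels are plain references, a DCC application that opens `MainLevel.usda` composes them automatically, and each sublevel file can be versioned or swapped independently.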
Artists working with hair or fur in Unreal Engine can now attach hair grooms to Alembic caches. The change makes it possible to bind grooms directly to geometry caches imported from other DCC applications, rather than having to use the “awkward workflow” of binding a groom to a Skeletal Mesh. It is also now possible to import grooms that have already been simulated and which contain cached per-frame hair data, and play back the simulation in the editor, Sequencer and Movie Render Queue.
Users of Composure, Unreal Engine’s real-time compositing system, get a new Camera Calibration plugin, for matching the lens properties of the physical camera generating the video to the virtual camera in UE4. The plugin can also be used to apply real-world lens distortion to an Unreal Engine CineCamera.
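For readers unfamiliar with lens distortion, the idea can be sketched with the generic two-coefficient radial (Brownian) model that many calibration tools use. This is an illustration of the concept only, not necessarily the parameterisation the Camera Calibration plugin itself uses.

```python
def distort(x, y, k1, k2):
    """Apply a simple radial lens-distortion model to normalised image
    coordinates (x, y). k1 and k2 are radial distortion coefficients;
    this is the generic Brownian model, shown for illustration."""
    r2 = x * x + y * y                     # squared distance from centre
    scale = 1.0 + k1 * r2 + k2 * r2 * r2   # radial scaling factor
    return x * scale, y * scale

# With zero coefficients the mapping is the identity:
assert distort(0.5, 0.25, 0.0, 0.0) == (0.5, 0.25)
```

Applying such a mapping to a rendered frame lets the CG image match the bending that a real lens applies to straight lines, which is what aligning an Unreal CineCamera with live-action footage requires.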
GPU Lightmass, the new framework for baking lightmaps on the GPU introduced in Unreal Engine 4.26, has been updated, and now supports Level of Detail (LOD) meshes, coloured translucent shadows, and more lighting parameters, including attenuation and non-inverse-square falloff. It is also now possible to use multiple GPUs for baking lighting, although multi-GPU support is currently limited to Windows 10 and Nvidia GPUs connected via SLI or NVLink bridges.
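The difference between inverse-square and non-inverse-square falloff can be sketched in a few lines. The functions below are illustrative stand-ins, not Unreal Engine’s actual light attenuation code, which exposes its own parameters.

```python
def inverse_square(intensity, distance):
    # Physically based falloff: brightness drops with the
    # square of the distance from the light.
    return intensity / (distance ** 2)

def linear_falloff(intensity, distance, radius):
    # One common non-inverse-square alternative: fade linearly
    # to zero at the light's attenuation radius. Illustrative only.
    return intensity * max(0.0, 1.0 - distance / radius)
```

Non-physical falloff modes like this give artists direct control over where a baked light’s influence ends, which is often easier to art-direct than the long tail of a true inverse-square curve.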
Path Tracer, Unreal Engine’s physically accurate rendering mode, gets a sizable update in version 4.27. It now supports refraction; transmission of light through glass surfaces, including approximate caustics; most light parameters, including IES profiles; and orthographic cameras. The Path Tracer can also now be used to render scenes with a “nearly unlimited” number of lights. Epic Games pitches the changes as making Path Tracer a viable alternative to the faster hybrid Real-Time Ray Tracing mode for production rendering, particularly for architectural and product visualisation. According to Epic, Path Tracer now creates “final-pixel imagery comparable to offline renders”.
Sequencer, Unreal Engine’s cinematics editor, also gets a number of new features, of which the most significant is probably the new Command Line Encoder for its Movie Render Queue. The encoder makes it possible to batch render image sequences in custom formats using third-party software like FFmpeg, as well as in the preset BMP, EXR, JPEG and PNG formats. Other changes include a new Gameplay Cue track, for triggering gameplay events directly from Sequencer. In-game movie playback via Media Framework is now frame-accurately synced with the Sequencer timeline.
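As a rough illustration of what such a post-render encode step does, the snippet below assembles the kind of FFmpeg invocation that turns a rendered PNG sequence into an H.264 movie. The paths, frame rate and codec settings are hypothetical examples, not the encoder’s actual defaults.

```python
# Sketch: an FFmpeg command of the sort a command-line encoder step
# might run after Movie Render Queue writes out an image sequence.
cmd = [
    "ffmpeg",
    "-framerate", "24",             # interpret the sequence at 24 fps
    "-i", "render/frame.%04d.png",  # numbered frames from the render queue
    "-c:v", "libx264",              # encode with the x264 H.264 encoder
    "-pix_fmt", "yuv420p",          # widely compatible pixel format
    "shot010.mp4",
]
# import subprocess
# subprocess.run(cmd, check=True)  # uncomment to actually encode
```

Because the encoder simply shells out to an external tool, any format FFmpeg (or a similar utility) supports becomes a render target, without Unreal Engine needing a built-in writer for it.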
Niagara, Unreal Engine’s in-game VFX framework, gets new tools for troubleshooting particle systems, including a dedicated Debugger panel and HUD display. A new Debug Drawing mode can be used to trace the paths of individual particles within a system. In addition, Niagara’s Curve Editor has been updated to match the one in Sequencer, providing “more advanced editing tools to adjust keys and retiming” for particle systems.
And there’s more…
Unreal Engine 4.27 is available for 64-bit Windows, macOS and Linux. Use of the editor is free, as is rendering non-interactive content. For game developers, Epic takes 5% of gross lifetime revenues for a game beyond the first $1 million. You can find a full list of changes here; it also includes significant new features for architectural and product visualisation, design reviews, live visuals, virtual and augmented reality, facial motion capture and more.
As you may know, iRender provides high-performance, configurable servers for 3D rendering, AI training, VR & AR, simulation and more. On our servers, you can install any software you need and work on your projects however you like – Unreal Engine is no exception. At iRender, we offer server package 3A, which pairs a single RTX 3090 with powerful hardware: an Intel Xeon W-2235 processor (6 cores, 3.8GHz, 4.6GHz Turbo), 128GB of RAM and 512GB of NVMe SSD storage. With this configuration, the server can handle any Unreal Engine project, rendering and loading faster and more stably.
When you use iRender, you free up your own computer during the hardest, most time-consuming part of production – rendering. That is why render farm services are becoming more and more popular and essential. iRender is proud to be one of the very few rendering services to offer single-GPU servers with a card as powerful as the RTX 3090. So register an ACCOUNT today and get a FREE COUPON to experience our service.
Thank you & Happy Rendering!
Reference source: Jim Thacker on cgchannel.com