Unreal Engine 5
Augmented Reality Overview
An overview of Epic’s Augmented Reality implementation.
Augmented reality (AR) is a technology that overlays a computer-generated image on a user's view of the real world, providing a composite view.
The Unreal Engine AR framework provides a rich, unified framework for building augmented reality apps with the Unreal Engine for both iOS and Android handheld platforms. The unified framework provides a single development path, allowing developers to build augmented reality apps for both platforms using a single code path. The Handheld AR Blueprint template provides a complete example project demonstrating the augmented reality functionality available in the Unreal Engine.
Augmented reality provides user experiences that add 2D or 3D elements to a live view from a device’s camera in a way that makes those elements appear to inhabit the real world.
iOS and Android Release Support
The unified AR framework includes support for basic AR functionality like Alignment, Light Estimation, Pinning, Session State, Trace Results, and Tracking.
However, the augmented reality story for Android and iOS is constantly evolving. As of Unreal Engine 4.23, we now support some of the advanced functionality available in the latest ARCore and ARKit releases.
ARCore 1.7
Vertical Plane Detection
ARKit 3.0
2D Image Detection
3D Object Detection
Motion Capture (2D, 3D, LiveLink)*
* Unreal Engine 4.23.1 provides beta support for this feature.
Epic Games developer Joe Graf has written several informative blogs discussing ARKit functionality in UE4.
Augmented Reality API
The Handheld AR Blueprint template provides a complete example project demonstrating the new augmented reality functionality available in Unreal Engine. A good place to start exploring the project and the new augmented reality functions is to open the Content Browser, navigate to Content\HandheldARBP\Blueprints\UI and open the BP_DebugMenu asset in the Blueprint Editor.
Supported Handheld Platforms
Currently, we support the iOS and Android platforms. Please read through the following pages to learn which devices are supported on each platform.
It’s worth mentioning that support for handheld iOS and Android devices isn’t new to the Unreal Engine, so if you’re already working with Unreal Engine and iOS or Android devices, you’ll only need a minimal amount of additional configuration to get started with augmented reality in Unreal Engine.
iOS
For detailed iOS augmented reality prerequisite information, see the ARKit Prerequisites topic. Additionally, basic configuration of Unreal Engine and iOS devices is covered in the iOS and tvOS section of the Unreal Engine documentation.
Android
For detailed Android augmented reality prerequisite information, see the ARCore Prerequisites topic.
Getting Started with Unreal AR
Now that you understand some basic information about using Unreal Engine with augmented reality, it's time to get started by running through the Handheld AR Template Quickstart tutorial.
Creating a New Project
Describes how to create and configure a new project in Unreal Engine.
When you launch Unreal Engine, the Unreal Project Browser opens automatically. This is where you can:
Create a new project.
Open an existing project.
Manage existing projects.
The diagram below illustrates the steps to create a new project in Unreal Engine.
Creating a new project in Unreal Engine from the Project Browser window.
To create a new project, follow these steps:
Select the development category (1) that best matches your industry and project goals.
You can select from the following categories:
Film, Television, and Live Events
Architecture, Engineering, and Construction (AEC)
Automotive, Product Design, and Manufacturing (APM)
Select a template (2) for your project. The templates you can choose from are based on the category you selected in step 1.
Unreal Engine contains a number of project templates you can use as a starting point for your own projects. To learn more about the different project templates available, refer to the Templates Reference page.
Configure the Project Defaults (3). In this section, you can select your target platform (that is, the hardware where your game or application will run, like a computer or a mobile device), configure quality and ray tracing settings, and more.
Some of the settings below may not be available for certain templates. For example, the Handheld AR template can only use Blueprint implementation.
You can configure the following settings:
Implementation
Select how you want to implement your project’s logic, such as character movement, level transitions, and so on.
You can choose from the following options:
Blueprint, if you want to build your project in the Unreal Editor, and use the Blueprint Visual Scripting system to create interactions and behavior.
C++, if you want to build your project by programming with C++ in Visual Studio.
For more information about these implementation methods, refer to the following pages:
Target Platform
Select the kind of platform your project is intended for:
Desktop
Mobile
Quality Preset
Select the maximum quality level, based on which platform your project targets. We recommend that you choose:
Maximum, if you are developing your project for a computer or game console.
Scalable, if you are developing your project for mobile devices.
Starter Content
Select whether you want your new project to include starter content. Starter content includes some simple Static Meshes with basic textures and Materials. It is useful if you want to start learning and experimenting straight away, without worrying about sourcing and importing custom content.
Ray Tracing
Select whether to enable or disable ray tracing for your project.
For more information about ray tracing in Unreal Engine, refer to the Hardware Ray Tracing and Path Tracing Features page.
Select where you want to store your project, and give your project a name (4).
Click Create (5) to finish creating your new project.
Result
Unreal Engine creates a new project with the settings you configured, and then automatically opens the project.
Tools and Editors
An overview of the different types of Editors contained within Unreal Engine 5.
Unreal Engine 5 provides a combination of tools, editors, and systems you can use to create your game or application.
This page uses the following terms:
A tool is something you use to perform a specific task, like placing Actors inside a level, or painting terrain.
An editor is a collection of tools you use to achieve something more complex. For example, the Level Editor enables you to build your game’s levels, or you can change the look and feel of Materials inside the Materials Editor.
A system is a large collection of features that work together to produce some aspect of the game or application. For example, Blueprint is a system used to visually script gameplay elements.
Sometimes, systems and editors can have similar names. For example, the Material Editor is used to edit Material assets, while the Material system provides the underlying support for using Materials in Unreal Engine.
Some of these tools and editors in Unreal Engine are built-in, while others come in the form of optional plugins that you can enable or disable depending on your project needs. To learn more about plugins, refer to the Working with Plugins page.
This page gives an overview of the major tools and editors you will be working with inside Unreal Engine 5. The use of various Unreal Engine tools is covered in detail in feature-specific documentation.
Whether you use the Blueprint Editor to script behaviors for the Actors in your level, or create particle effects with the Niagara Editor, a good understanding of what each editor can do and how to navigate each one can improve your workflow and help prevent stumbling blocks during development.
Level Editor
Gameplay Levels
Static Mesh Editor
Static Meshes
Material Editor
Materials
The Material Editor is where you create and edit Materials. Materials are assets that can be applied to a mesh to control its visual look. For example, you can create a dirt Material and apply it to floors in your level to create a surface that looks like it is covered in dirt.
Blueprint Editor
Blueprints
Blueprint Editor inside Unreal Engine 5. Click for full view.
The Blueprint Editor is where you can work with and modify Blueprints. These are special Assets that you can use to create gameplay elements (for example, controlling an Actor or scripting an event), modify Materials, or implement other Unreal Engine features without the need to write any C++ code.
Physics Asset Editor
Physics
Behavior Tree Editor
AI Behavior
The Behavior Tree Editor is where you can script Artificial Intelligence (AI) through a visual node-based system (similar to Blueprints) for Actors in your levels. You can create any number of different behaviors for enemies, non-player characters (NPCs), vehicles, and so on.
Niagara Editor
Particle Effects
The Niagara Editor is for creating special effects by leveraging a fully modular particle effects system composed of separate particle emitters for each effect. Emitters can be saved in the Content Browser for future use, and serve as the basis of new emitters in your current or future projects.
UMG UI Editor
User Interface
The UMG (Unreal Motion Graphics) UI Editor is a visual UI authoring tool that you can use to create UI elements, such as in-game heads-up displays, menus, or other interface-related graphics.
Font Editor
Fonts
Use the Font Editor to add, organize and preview Font Assets. You can also define font parameters, such as Font Asset layout and hinting policies (font hinting is a mathematical method that ensures your text will be readable at any display size).
Sequencer Editor
Cinematics and Dynamic Events
The Sequencer Editor gives you the ability to create in-game cinematics with a specialized multi-track editor. By creating Level Sequences and adding Tracks, you can define the makeup of each Track, which will determine the content for the scene. Tracks can consist of things like Animations (for animating a character), Transformations (moving things around in the scene), Audio (for including music or sound effects), and so on.
Animation Editor
Animation
Control Rig Editor
Animation
Sound Cue Editor
Sound Cues
The behavior of audio playback in Unreal Engine 5 is defined within Sound Cues, which can be edited using the Sound Cue Editor. Inside this editor, you can combine and mix several sound assets to produce a single mixed output saved as a Sound Cue.
Media Editor
External Media Playback
Use the Media Editor to define media files or URLs to use as source media for playback inside Unreal Engine 5.
You can define settings for how your source media will play back, such as auto-play, play rate, and looping, but you can’t edit media directly.
nDisplay 3D Config Editor
Virtual Production and Live Events
nDisplay renders your Unreal Engine scene on multiple synchronized display devices, such as powerwalls, domes, and curved screens. With the nDisplay Configuration Editor, you can create your nDisplay setup and visualize how content will be rendered across all the display devices.
DMX Library Editor
Live Events
DMX in action. This screenshot is from a sample project by Moment Factory. Click for full view.
DMX (Digital Multiplex) is a standard for digital communication used throughout the live-events industry to control various devices, such as lighting fixtures, lasers, smoke machines, mechanical devices, and electronic billboards. In the DMX Library Editor, you can customize these devices and their commands.
Audio in Unreal Engine 5
An overview of the current state and future vision for Audio in Unreal Engine 5.
The Audio Mixer is Unreal Engine’s fully-featured, multi-platform audio renderer. This powerful audio engine was first launched in Fortnite and was later added to Unreal Engine 4.24.
Since its initial release, the Audio Mixer has continued to evolve its legacy feature set, and has added next-generation audio rendering features that provide unprecedented control and flexibility to audio designers working on real-time experiences.
This overview covers the current legacy feature set, as well as the next-generation features added to Unreal Engine 5 (UE5).
Legacy Features
The core of the Unreal Engine legacy audio features are Sound Cues, Sound Classes, and Sound Mixes. These features are supported in the new Audio Mixer to facilitate the migration of projects to the new audio engine.
Sound Cues
Sound Cues are Unreal Engine’s legacy audio objects. Sound Cues are audio parameter graphs from which audio designers can play sound sources and manipulate various parameters, such as volume and pitch.
Sound Cues are not capable of sample-accurate control and do not support the creation of arbitrary digital signal processing algorithms. The workflow is mostly manual, forcing audio designers to create separate Sound Cues for every sound.
Lastly, the inability to handle sample-accurate timing and scheduling of audio makes creating procedural sound more challenging.
Sound Classes and Sound Mixes
Sound Classes represent a set of static parameters corresponding to specific sounds.
These are used to categorize sounds and apply a consistent set of parameters to the project. Due to their static nature, audio designers will often end up creating many similar Sound Classes to cover all the different use cases needed for a large project. Debugging Sound Classes can also become complex due to inheritance inconsistencies between parameters within Sound Classes, where some take precedence over others.
Sound Mixes provide audio designers the ability to modulate a set of audio parameters dynamically at runtime. This is useful in creating dynamic sounds that react to gameplay events and other conditions. Sound Mixes can also become challenging to debug as audio designers can add complexity by creating multiple instances of the same Sound Mix and combining them with Sound Classes.
Sound Rendering in UE5
Many of the features developed for Audio Mixer expand on the legacy features found in Unreal Engine and address the challenges mentioned above. Other features were newly developed to give Unreal Engine capabilities similar to other professional audio authoring tools.
MetaSounds
Unreal Engine 5 introduces MetaSounds, a new high-performance audio system that provides audio designers with complete control over Digital Signal Processing (DSP) graph generation for sound sources. MetaSounds have feature parity with Sound Cues and include next-generation features that empower designers to create immersive audio experiences.
MetaSounds address the issues present in Sound Cues and serve as a replacement for Unreal Engine’s default audio objects. Unlike Sound Cues, MetaSounds are fundamentally a Digital Signal Processing (DSP) rendering graph. They provide audio designers the ability to construct arbitrarily-complex procedural audio systems that offer sample-accurate timing and control at the audio-buffer level.
In addition to runtime audio rendering capabilities, MetaSounds support improved workflows for audio designers, including: building MetaSounds through composition (MetaSounds within MetaSounds), templatization support, and dynamic and static asset instancing.
MetaSounds also provide significant performance improvements over Sound Cues and offer a fully extensible API that can be used by third-party plugins.
Audio Modulation
In Unreal Engine's legacy audio feature set, the Sound Class defines static global parameters and applies them to sounds in groups, and Sound Mixes adjust those parameters dynamically at runtime. These parameters are hardcoded and static, which limits their use during gameplay.
Audio Modulation is the evolution of these legacy classes: it generalizes parameter control and modulation into a generic "parameter bus". Anything can be a modulation source, and anything can be a modulation destination.
The parameter buses can have any number of modulation sources, including Blueprint classes, modulation mixes, and gameplay code. Modulation destinations are parameters that are being modulated and can be locally mapped from the parameter bus in multiple ways.
Audio Modulation Parameter Buses provide audio designers the ability to define their own parameter groups (Sound Classes) and control them however they want (Sound Mixes).
Quartz
Quartz is a feature set that brings sample accurate timing of audio events to Blueprints. Quartz handles the complexity of scheduling audio to play at precise times and is designed to support custom interactive and procedural music systems. Quartz also sends precise timing events back to Blueprints to support synchronized gameplay logic and visuals with audio.
Audio Analysis
Audio Analysis is a set of technologies that offers non-real-time and real-time audio analysis. These tools work with Niagara and Blueprint scripting and will provide integration into Unreal Engine 5’s editor for the creation of UX and debug analyzers, as well as runtime audio analysis to drive gameplay and graphics.
Unreal Editor Interface
Overview of the key elements of the Unreal Editor interface
This page describes the most common elements of the Unreal Engine 5 interface and what they do. It also links to other pages where you can learn more. Some of the elements described on this page, such as panels and menu bars, are generally the same across various parts of the engine. You should spend some time getting familiar with their general purpose and functionality, especially if you're new to developing games and applications with Unreal Engine.
When you open Unreal Engine 5 for the first time, the Level Editor opens. You will see the following window:
Default Unreal Editor interface in Unreal Engine 5. Click the image for full size.
Number
Name
Description
Menu Bar
Use these menus to access editor-specific commands and functionality.
Main Toolbar
Contains shortcuts for some of the most common tools and editors in Unreal Engine, as well as shortcuts to enter Play mode (run your game inside Unreal Editor) and to deploy your project to other platforms.
Level Viewport
Displays the contents of your Level, such as Cameras, Actors, Static Meshes, and so on.
Content Drawer button
Opens the Content Drawer, from which you can access all of the Assets in your Project.
Bottom Toolbar
Contains shortcuts to the Command Console, Output Log, and Derived Data functionality. Also displays source control status.
Outliner
Displays a hierarchical tree view of all content in your Level.
Details panel
Appears when you select an Actor. Displays various properties for that Actor, such as its Transform (position in the Level), Static Mesh, Material, and physics settings. This panel displays different settings depending on what you select in the Level Viewport.
Menu Bar
Each editor in Unreal Engine has a menu bar that is located at the top of that editor's window (Windows) or at the top of the screen (macOS). Some of the menus, such as File, Window, and Help, are present in all editor windows, not just the Level Editor. Others are editor-specific.
Main Toolbar
The Main Toolbar contains shortcuts to some of the most used tools and commands in Unreal Editor. It is divided into the following areas:
1. Save Button
Click this button to save the Level that is currently open.
2. Mode Selection
Contains shortcuts for quickly switching between different modes to edit content within your Level:
3. Content Shortcuts
Contains shortcuts for adding and opening common types of content within the Level Editor.
Shortcut
Description
Create
Choose from a list of common Assets to quickly add to your Level. You can also access the Place Actors panel from this menu.
Blueprints
Create and access Blueprints.
Cinematics
Create a Level Sequence or Master Sequence cinematic.
4. Play Mode Controls
Contains shortcut buttons (Play, Skip, Stop, and Eject) for running your game in the Editor.
5. Platforms Menu
Contains a series of options you can use to configure, prepare, and deploy your project to different platforms, such as desktop, mobile, or consoles.
6. Settings
Contains various settings for the Unreal Editor, Level Editor Viewport, and game behavior.
Level Viewport
The Level Viewport displays the contents of the Level that is currently open. When you open a project in Unreal Engine, the project’s default Level opens in the Level Viewport by default. This is where you can view and edit the contents of your active Level, whether it’s a game environment, a product visualization app, or something else.
The Level Viewport can generally display the contents of the Level in two different ways:
Perspective, which is a 3D view you can navigate to see the contents of the viewport from different angles.
Orthographic, which is a 2D view that looks down one of the main axes (X, Y, or Z).
Different views of the Level Viewport in Unreal Engine 5. Top: Perspective view. Bottom: Top-down orthographic view. Click the image for full size.
To learn more about the Level Viewport, refer to the pages below.
Content Drawer / Content Browser
The Content Browser is a file explorer window that displays all of the Assets, Blueprints, and other files contained in your project. You can use it to browse through your content, drag Assets into the Level, migrate Assets between projects, and more.
Content Browser window in Unreal Engine 5.
The Content Drawer button, located in the bottom-left corner of the Unreal Editor, opens a special instance of the Content Browser that automatically minimizes when it loses focus (that is, when you click away from it). To keep it open, click the Dock in Layout button in the top-right corner of the Content Drawer. This creates a new instance of the Content Browser, but you can still open a new Content Drawer.
Bottom Toolbar
The Bottom Toolbar contains shortcuts to the Command Console, Output Log, and Derived Data functionality. It also displays source control status. It is divided into the following areas:
Number
Name
Description
Output Log
Debugging tool that prints out useful information while your application is running.
Command Console
Behaves as any other command line interface: enter console commands to trigger specific editor behaviors.
Type help and press Enter to open a list of available console commands in your browser.
Derived Data
Provides Derived Data functionality.
Source Control Status
Displays source control status if your project is connected to source control (for example, GitHub or Perforce). Otherwise, this will say Source Control Off.
Outliner
The Outliner panel (formerly known as World Outliner) displays a hierarchical view of all content in your Level. By default, it is located in the upper-right corner of the Unreal Editor window.
You can also use the Outliner panel to:
Quickly hide or reveal Actors by clicking their associated Eye button.
Access an Actor’s context menu by right-clicking that Actor. You can then perform additional, Actor-specific operations from that menu.
Create, move, and delete content folders.
Details Panel
When you select an Actor in the Level Viewport, the Details panel shows the settings and properties that affect the Actor you selected. By default, it is located on the right side of the Unreal Editor window, under the Outliner panel.
This example shows a cube Static Mesh Actor’s Details panel. With the cube Static Mesh selected, the Details Panel appears docked to the right-hand side of the Unreal Editor. Click the image for full size.
Making Interactive Experiences
How to create gameplay mechanics, behaviors, and conditions that make the virtual world responsive to players carrying out actions over time.
This section contains information about high-level gameplay programming and scripting in Unreal Engine, with an aim towards facilitating interaction between the player and the world. In addition to the basic Gameplay Framework, Unreal Engine includes many systems and frameworks for handling common gameplay elements, including AI, physics, user interfaces, camera management, and input. The guides in this section will provide a reference for how to use these features, as well as walkthroughs for how to re-create common mechanics and systems in games.
Top Level
Overview of accessibility features in Unreal Engine 5.
Describes the systems available within Unreal Engine that can be used to create believable AI entities in your projects.
Large World Coordinates
An overview of Large World Coordinates and how they are used in Unreal Engine 5.
Large World Coordinates (LWC) introduces support for double-precision data variant types in Unreal Engine 5 (UE5), where extensive changes are being implemented across all engine systems to improve floating-point precision. These systems include Architectural Visualization, Simulation, Rendering (Niagara and HLSL code), and projects with massive world scale.
In Unreal Engine 4 (UE4), 32-bit float precision types restricted the size of the world. LWC vastly increases the possible size of your projects by basing core data types on 64-bit doubles. These changes enable you to build massive worlds and greatly improve Actor placement accuracy and orientation precision. Large World Coordinates are ready for you to use when you begin a new project in UE5.
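To illustrate at the code level, here is a minimal sketch (variable names are illustrative) showing the double-precision FVector alongside its explicit 32-bit float variant:

```cpp
#include "Math/Vector.h"

// In UE5, FVector components are C++ doubles, so very large world
// coordinates retain their precision.
FVector WorldLocation(1000000.0, -2500000.0, 50.0);
double DistanceFromOrigin = WorldLocation.Length();

// Explicit 32-bit variants such as FVector3f remain available where
// memory or bandwidth matters more than range.
FVector3f PackedLocation(10.0f, 20.0f, 5.0f);
```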
Experimenting with Large Worlds
Because Large World Coordinates are in beta in UE5, the default WORLD_MAX size has been left at the UE4 WORLD_MAX size of 21 km, and the engine check on the world bounds remains enabled. There are two options for experimenting with the scale of your large worlds:
You can disable the bounds checks by accessing your WorldSettings class and setting your bEnableLargeWorlds boolean to true:
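The snippet itself is not reproduced here; a minimal sketch of the idea, assuming bEnableLargeWorlds is exposed on AWorldSettings as described above, might look like this:

```cpp
#include "Engine/World.h"
#include "GameFramework/WorldSettings.h"

void EnableLargeWorldExperiment(UWorld* World)
{
    // Assumption: bEnableLargeWorlds is the boolean referenced above.
    if (AWorldSettings* Settings = World->GetWorldSettings())
    {
        Settings->bEnableLargeWorlds = true; // disables the world-bounds check
    }
}
```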
This will keep the value of WORLD_MAX at approximately 21 km and provides improved stability for experimentation in the initial release of Unreal Engine 5.0.
Alternatively, you can set the global value of UE_USE_UE4_WORLD_MAX to enable larger world bounds:
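The original snippet is likewise not reproduced here. Engine-level defines such as UE_USE_UE4_WORLD_MAX are typically overridden from your target rules; the following is a hedged sketch (whether 0 selects the larger bounds is an assumption to verify against your engine source):

```
// In YourProject.Target.cs (UnrealBuildTool target rules, C#), a hypothetical override:
GlobalDefinitions.Add("UE_USE_UE4_WORLD_MAX=0");
```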
This will set the WORLD_MAX value to approximately 88 million km.
This value may change before future releases of Unreal Engine and may exhibit stability issues that will continually be optimized throughout the development of Unreal Engine 5.
Upgrading your Project to Unreal Engine 5
When upgrading your UE4 project to UE5, we generally recommend you refer to the Migration Guide, as you may experience a few edge cases where your code base loses precision. This is less of a concern for projects that are not using Large World Coordinates. However, for projects that plan to use the double type to increase the scale of their worlds, we recommend you use the Large World Coordinates Conversion Guidelines.
Blueprints
In Blueprints, floats now display with the appropriate single or double precision subtype. This new type supports feature parity with both float and double. All existing Blueprints and Blueprint types (UMG, Control Rig, Animation Blueprints) have been implicitly converted to use either precision type without any need for you to update your previous work.
Source Code Interface
Source code may now expose both float and double types. The Unreal Header Tool (UHT) will interpret any Blueprint-accessible floating-point type in code as a Blueprint Float with the appropriate single (C++ float) or double (C++ double) precision subtype, enabling automatic conversion of Float values of either precision supplied by any Blueprint node.
Exposing UFUNCTION Property Specifiers Expecting Float Values
Any method marked with a UFUNCTION Property Specifier that contains a float data value risks introducing imprecision, because the Blueprint Float value is cast to the lower precision float. It is important to audit any existing UFUNCTION property. This helps to determine if switching the parameters or return values to double may be necessary to avoid precision issues in the future. Switching between float and double types is safe to do at any time, and in either direction.
This applies to any K2 nodes that you may have constructed or exposed in code.
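As a hedged illustration of the audit described above (the class and function names below are hypothetical), the float and double variants of a Blueprint-callable function look like this:

```cpp
#include "CoreMinimal.h"
#include "UObject/Object.h"
#include "PrecisionExample.generated.h"

UCLASS()
class UPrecisionExample : public UObject
{
    GENERATED_BODY()

public:
    // Blueprint Float values are cast down to single precision (C++ float) here,
    // which risks precision loss for large world coordinates.
    UFUNCTION(BlueprintCallable, Category = "Example")
    static float AddOffsetSingle(float Value, float Offset) { return Value + Offset; }

    // Switching the parameters and return value to double preserves precision.
    // As noted above, switching between float and double is safe in either direction.
    UFUNCTION(BlueprintCallable, Category = "Example")
    static double AddOffsetDouble(double Value, double Offset) { return Value + Offset; }
};
```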
Rendering
In Unreal Engine, various coordinate spaces and transformations describe how an object exists in the world: World space (the coordinates of your level / world) and Local space (relative to a specific object). For additional information, refer to the Coordinate Space Terminology documentation.
Each object that can be placed in your project's world has three coordinate axes, an orientation, and an origin. Refer to Transforming Actors for additional information.
Shaders
New HLSL types have been introduced along with Large World Coordinates and can be found in the LargeWorldCoordinates.ush file. Refer to the LWC Rendering Doc for additional information on how to convert your shader code to UE5.
Niagara
In Niagara, the implementation is different from the main engine. To keep particle effects performant, the data is stored as a pair of floats instead of doubles. A sufficiently large world is divided into grid cells: the first float positions the Niagara system within its grid cell, and the second represents which grid cell the system is in.
To accommodate this additional information, there is a new data type called Position for storing the vectors referenced by Particles.Position. Refer to the Large World Coordinates in Niagara document for additional information.
Chaos
Chaos Destruction Physics continues to function unabridged using the double type. The engine implicitly casts in both directions, for example:
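A minimal sketch of those conversions (exact conversion rules can vary by engine version):

```cpp
#include "Math/Vector.h"

FVector3f FloatVec(1.0f, 2.0f, 3.0f); // 32-bit float vector

// Upgrading from float to double precision is straightforward and lossless.
FVector DoubleVec = FVector(FloatVec);

// Narrowing from double back to float must be an explicit conversion,
// since it can lose precision.
FVector3f BackToFloat = FVector3f(DoubleVec);
```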
With further release versions of Unreal Engine 5, the FVector casts will continue to implicitly upgrade your floats to double. If you want to cast your double to a float, then you will need to explicitly perform a conversion. Refer to the Large World Coordinates Conversion Guidelines for additional information.
This document serves as a guide to convert your UE4 Projects to UE5 with minimal precision loss.
Hardware Ray Tracing
An overview of the features that use hardware to render real-time ray traced results.
Ray tracing techniques have been used in offline rendering for film, television, and visualization because they produce high-quality, natural-looking results with soft shadowing for lights, accurate ambient occlusion, interactive global illumination, reflections, and more. However, rendering images at those quality levels has often required powerful computers and long periods of time to render even a single frame.
With Unreal Engine, ray tracing is made possible with supported hardware allowing for interactive experiences rendered with subtle lighting effects in real-time. Unreal Engine’s hardware ray tracing capabilities are coupled with those of traditional raster rendering techniques. Combining the two means tracing rays for fewer samples per pixel and utilizing denoising algorithms to achieve results perceptually close to the results of an offline renderer.
Real-time rendering of Ray Tracing features in the Archviz Interior sample project available on the Epic Games Launcher.
Enabling Hardware Ray Tracing
In the Project Settings under Engine > Rendering > Hardware Ray Tracing, enable Support Hardware Ray Tracing and restart the editor for the changes to take effect.
When Ray Tracing is enabled, it also enables Support Compute Skin Cache for the project, if not already enabled.
Some features of hardware ray tracing, such as Ray Traced Shadows and Ray Traced Skylight, can be enabled independently of other ray tracing features. In the same Hardware Ray Tracing section of the Project Settings, you can enable the features you need for your project.
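For reference, a sketch of the corresponding DefaultEngine.ini entries that these Project Settings map to (names should be verified against your engine version):

```
[/Script/Engine.RendererSettings]
r.RayTracing=True                 ; Support Hardware Ray Tracing
r.SkinCache.CompileShaders=True   ; Support Compute Skin Cache, enabled alongside ray tracing
```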
Features of Hardware Ray Tracing
The following hardware ray tracing features are supported.
Ray Traced Shadows
Ray Traced Shadows simulate soft area lighting effects for objects in the environment. This means that, based on the light's source size or source angle, an object's shadow is sharper near the contact surface, softening and widening farther away.
Ray Traced Ambient Occlusion
Ray Traced Ambient Occlusion (RTAO) accurately shadows areas that block ambient lighting, better grounding objects in the environment, such as shadowing the corners and edges where walls meet or adding depth to the crevices and wrinkles in skin.
When compared with Screen Space Ambient Occlusion (SSAO), RTAO grounds objects and adds depth to the scene to produce natural looking shadowing in indirectly lit areas.
By varying the Intensity and Radius properties of the Ambient Occlusion effect, you can control its size and strength.
Ray Traced Reflections
This feature of ray tracing is deprecated and may be removed in a future release.
Ray Traced Reflections (RTR) simulate accurate environment representation, supporting multiple reflection bounces.
This example shows a single bounce of ray traced reflections compared to multiple bounces of ray traced reflection. Using multiple bounces creates real-time inter-reflection between reflective surfaces in the scene.
In this comparison, SSR is only capable of a single reflection bounce and is limited to what is visible on the screen for representation. On the other hand, RTR is capable of multiple bounces and is not limited to what is visible, meaning that we can see the sides of the book, the reflected floor behind the camera, and additional lighting coming through the window being reflected on surfaces.
Ray Traced Translucency
This feature of ray tracing is deprecated and may be removed in a future release.
Ray Traced Translucency accurately represents glass and liquid materials with physically correct reflections, absorption, and refraction on transparent surfaces.
Ray Traced Global Illumination
This feature of ray tracing is deprecated and may be removed in a future release.
Ray Traced Global Illumination (RTGI) adds real-time interactive bounce lighting to areas of your scene not directly lit by a given light source.
There are two ray-traced global illumination methods to choose from in the Post Process Volume:
Brute Force emulates an offline renderer's indirect lighting, but is slower to render.
Final Gather provides a single bounce of indirect lighting, but is faster to render.
Final Gather Method
This feature is experimental.
The Final Gather approach to ray-traced global illumination uses a final gather-based technique that trades some quality for runtime performance. The technique is a two-pass algorithm, whereby the first pass distributes shading points throughout the scene — similarly to the Brute Force method — but at a fixed rate of one sample per pixel. A history of up to 16 shading point samples is stored in screen space during this pass. During the second pass, the algorithm attempts to reconnect to the shading point history, amortizing the cost of the method.
The Brute Force algorithm is intended to emulate the Path Tracer's ground-truth reference and produces a comparable result. The Final Gather method trades that emulation for performance. This has its own limitations, such as being limited to a single bounce of indirect diffuse global illumination, and reprojection from the previous frame's sample data is susceptible to ghosting when the camera moves quickly.
To aid in suppressing temporal ghosting artifacts, you can use the following command to modify the world space rejection criteria.
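The specific console variable is not reproduced here; based on the naming of the other ray traced global illumination settings, it likely takes the following form (treat the exact name as an assumption to verify):

```
r.RayTracing.GlobalIllumination.FinalGatherDistance 10
```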
It is currently based on a world distance measured from the original shading point. This rejection criterion defaults to 10 units.
The Final Gather method also requires the following settings to be used in the Post Process Volume for it to work effectively:
Max Bounces: 1
Samples Per Pixel: 16
Any Max Bounces value beyond 1 is silently discarded. When adjusting the Samples Per Pixel, it's best to increase the value by powers of two (8, 16, 32, 64).
Using Ray Tracing Features
The sections below provide details for using ray tracing features in your project using the Post Process Volume and individual light properties.
Post Process Volume
Post Process Volumes provide controls for some ray tracing features:
Timing Insights
An Overview of the Timing Insights Window in Unreal Insights.
The Timing Insights window is where you can find per-frame performance data for different tracks, including the CPU and GPU tracks. The Timing view has a new toolbar, splitting the Tracks drop-down menu into multiple menus where you can look at different visualizations of the time your project spends on various tasks.
The Timing Insights window features the Frames panel (1), the Timing panel (2), the Log panel (3), the Timers and Counters tabs (4), and the Callers and Callees panels (5).
Timing Insights Main Toolbar
The toolbar provides the capability to select blocks of time to view in groups, sort, or categorize data, and review log output.
You can toggle visibility for the Timing, Timers, Callers, Callees, Counters, and Log panels by clicking on each respective item in the toolbar.
You can select blocks of time to view in aggregate, show or hide the respective panels, sort or categorize data, and review log output. To select a block of time, click on a single frame in the Frames panel, or click and drag a section of the scrub bar at the top of the Timing panel, known as the Time Ruler.
Frames
The Frames panel displays the total time taken by each frame using a bar graph format. This is useful for identifying general trends, such as low performance or framerate drops when a level is loaded, when an unoptimized scene is visible, or when a large number of Actors spawn simultaneously.
The Frames panel displays the Frames, Timing, Timers, Callers, Callees, Counters, and Log tracks.
Hovering the cursor over a bar causes that frame’s index and running time to appear.
If you right-click on the bar, the following Zoom context menu options appear:
Auto Zoom
Toggles auto zoom, which fits the entire session time range into the frames display window.
Zoom Timing View on Frame Selection
Toggles whether the timing view is zoomed when a frame is selected.
These options are also available to edit in the UnrealInsightsSettings.ini file.
Timing
You can show or hide each track individually by clicking the arrow next to it.
All Tracks
Clicking the All Tracks dropdown arrow displays a list of all available tracks.
CPU/GPU
Clicking the CPU/GPU dropdown arrow displays the CPU and GPU tracks and thread group options.
Other
Clicking the Other dropdown arrow displays options for visibility of the Main Graph, File Activity, Asset Loading, and Frames Tracks.
Plugins
Clicking the Plugins dropdown arrow displays Tracks and options that are exposed by plugins, including information about Slate, Gameplay, Animation, and RDG Tracks.
View Mode
The View Mode dropdown arrow displays controls for various options for the Timing View.
Timing View Option
Compact Mode (key shortcut: C)
Toggles compact mode, which displays the timing tracks with reduced height.
Auto Hide Empty Tracks (key shortcut: V)
Auto-hides empty tracks without timing events. This option is persisted in the UnrealInsightsSettings.ini file.
Depth Limit (key shortcut: X): Unlimited / 4 Lanes / Single Lane
Toggles the display of different CPU depth options. The X key can be used as a shortcut to cycle through the different CPU depth options.
Coloring Mode (CPU Thread tracks)
You can assign a color to CPU / GPU timing events based on their duration (inclusive time).
The color key is as follows:
≥ 100μs : green.
Quick Find
The Quick Find widget search logic is defined using Groups and Filters. Group nodes contain child Filter nodes and define the logic applied to the children's result. Filter nodes are leaf nodes and contain a filter each.
Each filter contains:
The filter type, which can be selected from a drop-down menu.
The filter operator, also selectable from a drop-down menu.
The filter value, which can be entered in a text box.
Once the filter logic is created, it can be used to search for events or filter the tracks from the Timing View.
Find First
Searches for the first event matching the filter in the order of the event’s start time. If a match is found it is selected and the Timing view brings it into view.
Find Previous
Searches for the previous event matching the filter starting from the current selected event’s start time. If no event is selected, it acts as a Find First.
Find Next
Searches for the next event matching the filter starting from the current selected event’s start time. If no event is selected, it acts as a Find Last.
Find Last
Searches for the last event matching the filter in the order of the event’s start time. If a match is found it is selected and the Timing view brings it into view.
Apply Filter
Highlights all timing events that pass the filter logic from the tracks.
Clear filters
Stops highlighting events based on the filter’s logic.
If you make changes to the filter’s logic, you must click Apply Filter again to highlight events based on the new logic.
Timers
The Timers panel lists all timer events that run within the time range designated in the Timing panel. In addition to grouped data based on time range, the list can be sorted in ascending or descending order by the values in any active column.
Flat
Creates a single group. Includes all Timers.
Timer Name
Creates one group for each starting letter of the timer name.
Timer Type
Creates one group for each timer type.
Instance Count
Creates one group for each logarithmic range (1-10, 10-100, 100-1000, and so on).
To change sort order, or to activate or deactivate columns, right-click on the column. The following options are available:
Additional Sort Option
Sort Ascending (by Timer or Group Name, Instance Count, Total Exclusive, or Total Inclusive)
Sorts the column in ascending order.
Sort Descending (by Timer or Group Name, Instance Count, Total Exclusive, or Total Inclusive)
Sorts the column in descending order.
Sort By:
Timer or Group Name: Name of the timer or group.
Instance Count: Number of selected instances.
Total Inclusive Time: Total inclusive duration of the selected timer's instances.
Total Exclusive Time: Total exclusive duration of the selected timer's instances.
Column Visibility Group
View Column
Hides or shows the following columns.
Timer Or Group Name
Meta Group Name
Total Inclusive Time
Max Inclusive Time
Average Inclusive Time
Median Inclusive Time
Min Inclusive Time
Total Exclusive Time
Max Exclusive Time
Average Exclusive Time
Median Exclusive Time
Min Exclusive Time
Show All Columns
Resets tree view to show all columns.
Reset Columns to Min/Max/Median Preset
Resets columns to Min/Max/Median Preset.
Reset Columns to Default
Resets columns to default.
Export Functionality
The Timers panel includes the capability to export your timing event data by selecting one or more timers and right-clicking to open the context menu.
Timer Options Include:
Open Source in Visual Studio
Opens the source file of the selected message in Visual Studio.
Visual Studio must already be open before using this option; otherwise, it may only open its Start page.
Other Miscellaneous export options include the following:
Copy To Clipboard (CTRL+C)
Copies the selection of timers and their events to clipboard.
Export (CTRL+S)
Exports the selected timers and their grouped statistics to a text file.
You can navigate to the Timing view and mark the time you are interested in saving from the main timeline view by clicking and dragging on the Time bar.
Observe that the grouped stats in the Timers panel update to reflect the time selection.
From the Timers panel, manually select the timers you are interested in saving or use Ctrl+A to select all timers.
Export Timing Events
Exports the timing events to a text file.
Navigate to the Timing view, and mark the time you are interested in exporting from the main timeline view, by clicking and dragging on the Time bar.
If no time selection is made, the entire timeline will be exported.
In the Timing panel, click on the CPU/GPU thread tracks to show or hide the tracks you want to export.
Select the timers you are interested in, or use Ctrl+A to select all timers.
Select Export Timing Events (Selection) from the context menu and choose a tab-separated values (*.tsv / *.txt) or comma-separated values (*.csv) file.
You can export the "Threads" and "Timers" in order to match the thread id and timer id with the names of threads and timers.
More Export Options / Export Threads
More Export Options / Export Timing Events (All)
The exported file can be massive in size; even a small session could have several million timing events.
Source File For CPU Timers
In the Timers panel, the source file and line number are visible in the tooltip of each timer.
Callers and Callees
The Callers and the Callees panels display a hierarchical list of task events. When you select an event from the Timing view, hovering over the information icon of an individual task displays the Id, Name, Type, Source, Number of instances (Num Instances) including detailed information on the instance count, and the total Inclusive and Exclusive time.
Counters
The Counters panel lists all stats incremented during the same time period as the Timers panel. You can rearrange the panel by updating the sort order and column organization.
The following Groups are available:
Flat
Creates a single group. Includes all Counters.
Stats Name
Creates one group for each starting letter of the stat name.
Meta Group Name
Creates groups based on metadata group names of counters.
Counter Type
Creates one group for each counter type.
Data Type
Creates one group for each data type.
Count
Creates one group for each logarithmic range (1-10, 10-100, 100-1000, and so on).
Log
The Log view displays all logs generated by calls to the macro UE_LOG from the Trace session. The logs can be filtered by verbosity and category, similar to the Output Log window in the Editor.
Selecting a time period in the Timing panel highlights all log entries that fall within that time. If you select multiple log entries, the Timing panel highlights the time range between those entries. The Log panel features a search box that filters out all log messages that don't match the text you enter. In addition to filtering, clicking any row pans the Timing panel to the time when that row's text was logged. You can save logs by selecting one or more messages, right-clicking to open the context menu, and selecting one of the following options:
Copy (CTRL+C)
Copies the selected log (with all its properties) to the clipboard.
Copy Message (SHIFT+C)
Copies the message text of the selected log to the clipboard.
Copy Range (CTRL+SHIFT+C)
Copies all the logs in the selected time range (highlighted in blue) to the clipboard.
Copy All
Copies all the logs to the clipboard.
Save Range As… (CTRL+S)
Saves all the logs in the selected time range (highlighted in blue) to a text file (tab-separated values or comma-separated values).
Save All As…
Saves all the (filtered) logs to a text file (tab-separated values or comma-separated values).
Open Source in Visual Studio
Task Graph Insights
Enabling CPU Task Trace
Follow the steps below to profile a CPU Task:
Enable the Task and CPU trace channels when running your application from the command line by using the following command:
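The command itself is not reproduced here; a sketch, assuming the trace channels are named cpu and task and using a hypothetical executable name:

```
YourProject.exe -trace=default,cpu,task
```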
a) When hovering over a Timing Event, if the current event is inside a task, extra information is displayed in the tooltip.
b) Navigate to the Other > Tasks submenu from the Timing View track. You can use this to display task graph related visualizations.
Tasks Submenu Option
Show Task Critical Path
Show or hide relations representing the critical path containing the current task.
Show Task Transitions
Show or hide transitions between the stages of the current task.
Show Task Connections
Show or hide task connections between the following options:
The current task’s prerequisites completed time and the current task’s started time.
The current task’s completed time and the current task’s subsequent started time.
The current task’s nested tasks added time and their started time.
Show Transitions of Prerequisites
Show or hide stage transitions for the current task’s prerequisites.
Show Transitions of Subsequents
Show or hide stage transitions for the current task’s subsequents.
Show Transitions of Nested Tasks
Show or hide stage transitions for the current task’s nested tasks.
Show Task Overview Track
Show or hide the Task Overview Track when a task is selected.
Show Detailed Info on the Task Overview Track
Shows the current task's prerequisites, nested tasks, and subsequents in the Task Overview Track.
c) Right-click anywhere in the Timing View panel to open the timing view context menu. This menu is used to display new options that can be used to sort tracks on the Task graph.
Top Docked
Docks this track to the top.
Scrollable
Moves this track to the list of scrollable tracks.
Bottom Docked
Docks this track to the bottom.
d) When you navigate to Tasks, the Show Task Dependencies menu options display in the drop-down menu when a timing event is selected and there is a task that encompasses the selected timing event.
The following data is displayed:
Relations in the form of arrows are drawn on the timing view to show the lifetime and stages of the task. This includes information on when it was created, launched, scheduled, started, finished and completed.
Relations are drawn between the current task and its prerequisite, subsequent, and nested tasks.
A new top docked track appears showing the stages of the task as timing events.
Show Dependencies For Prerequisites/Subsequent/Nested tasks
When activating one or more of these options, all of the relations mentioned in the previous section are drawn for the prerequisite, subsequent, and nested tasks of the current task.
As an example, instead of a single relation from the prerequisite task's Completed Time to the current task's Scheduled Time, you now see the relations depicting the created, launched, scheduled, started, finished, and completed stages of each prerequisite task.
When analyzing your data, we recommend trying to activate each option one at a time as the number of relations can quickly become overwhelming.
This engine feature is new; it is therefore subject to change as it is optimized in the future.
If a trace has Task data, then the Tasks minor tab becomes visible. To populate the tab, select an interval in the timing view. The tab updates to show all the tasks in the selected interval, with each of the included timestamp options: created, launched, scheduled, started, finished, and completed. By default, these timestamps are shown Relative to previous, which means they are relative to the previous stage; for instance, scheduled is relative to launched. Double-clicking a task in the table draws its relations as if it had been selected via a timing event in the timing view.
Context Switches
Context Switches are supported on Windows, XB1/XSX, and PS4/PS5 platforms.
You can enable the ContextSwitch trace channel in the command line:
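A sketch of such a command line (the channel name contextswitch and the executable name are assumptions):

```
YourProject.exe -trace=default,contextswitch
```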
On Windows, depending on your user permission settings, your project runtime may need to be "run as administrator".
Open your trace file in Unreal Insights. If a session has the ContextSwitch trace event enabled, the following information is displayed in the Timing Insights view:
a) Additional CPU Core tracks, one for each CPU core in the recorded trace. These show timing events indicating which thread executes on the respective CPU core. "Unknown" timing events indicate execution of threads from other applications or processes, or from the OS.
b) Each CPU Thread has a header lane with core number events indicating on which core the respective thread is executing. The time ranges when a thread is executing and when it is preempted are highlighted.
c) The CPU/GPU drop-down menu shows additional options related to Context Switches:
d) The context menu of a "Core" timing event in a CPU Thread track shows additional options:
e) The context menu of a "Thread" timing event in a CPU Core track shows additional options:
Virtual Texturing
An overview of the available virtual texturing methods in Unreal Engine.
Virtual Texture support for your project enables you to create and use large-sized textures for a lower—and more consistent—memory footprint at runtime.
Virtual Texturing Methods
Unreal Engine supports two virtual texturing methods: Runtime Virtual Texturing (RVT) and Streaming Virtual Texturing (SVT).
Runtime Virtual Textures:
Supports larger texture resolutions.
Texel data cached in memory on demand.
Texel data generated by the GPU at runtime.
Well suited for texture data that can be rendered on demand, such as procedural textures or composited layered materials.
Streaming Virtual Textures:
Supports larger texture resolutions.
Texel data cached in memory on demand.
Texel data cooked and loaded from disk.
Well suited for texture data that takes time to generate, such as lightmaps or large, detailed artist-created textures.
Runtime Virtual Texturing
RVT supplies an efficient way to render complex, procedurally generated, or layered materials. This makes RVT ideal for rendering complex materials for Landscapes. It enables improved rendering performance and workflows for Landscape Splines, decals for meshes and materials, along with general Landscape and object blending.
Streaming Virtual Texturing
Streaming Virtual Texturing reduces texture memory overhead when using larger texture sizes, including support for virtual texture lightmaps and UDIM (U-Dimension). Streaming virtual textures are an alternative way to stream textures from disk, compared to existing mip-based texture streaming.
Virtual Texture Lightmaps
Enabling support for virtual texture lightmaps can improve streaming performance and quality of lightmap bakes.
Enable virtual texture support for lightmaps from the Project Settings by going to Engine > Rendering and setting Enable virtual texture lightmaps.
Enable the following console variables to control how virtual texture lightmaps are used in your project:
Controls whether non-VT lightmaps are generated/saved. Including non-VT lightmaps will constrain the atlas size, which removes some of the benefit of VT lightmaps.
Enables lossy compression of virtual texture lightmaps. Lossy compression has lower quality on lightmap textures compared to regular color textures.
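For reference, a sketch of the console variables being described (the names below are assumptions inferred from the descriptions; verify them against your engine version). The first enables VT lightmaps, the second controls whether non-VT lightmaps are also generated and saved, and the third enables lossy compression:

```
r.VirtualTexturedLightmaps 1
r.IncludeNonVirtualTexturedLightmaps 0
r.VT.EnableLossyCompressLightmaps 1
```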
Virtual Texture Topics
Overview of GPU memory allocation with the virtual textures physical memory pool.
Reference information for project settings, console commands, and Actor settings for virtual textures.
A walkthrough of setting up runtime virtual textures with a material applied to a landscape.
Getting Started in Niagara
This page collects all the getting started learning materials for Niagara.
To get started in Niagara, the following materials are available to you.
Niagara Overview Pages
These pages go over the basics of Niagara. They describe the basic building blocks of Niagara: systems, emitters, and modules. They also provide some information on how to create Niagara systems and emitters. Read them to learn all about the design philosophy.
This page gives an overview of the Niagara VFX system in Unreal Engine 5.
This page explains the key concepts and design philosophy behind Niagara.
This page gives basic information about using Events and Event Handlers in Niagara.
Niagara Quick-Start Tutorials
These Quick Start guides are step-by-step tutorials that take you through creating your first Niagara projects. Learn how to create an effect and attach it to an animation. Learn how to bake a Niagara system into a flipbook.
A guide to help you quickly start using the Niagara visual effects system.
A quick start guide for creating Niagara flipbooks in Unreal Engine.
Additional Information to Get You Started
These pages are targeted for users with different types of projects. Check the Cascade to Niagara Converter Plugin page if you are migrating from Cascade. Read through the Large World Coordinates page if you are planning to create a large project.
This page discusses and demonstrates the plugin to convert Cascade particle systems to Niagara particle systems.
An overview of how to use the Large World Coordinate System with Niagara.
Multi-User Editing Overview
A conceptual overview of how the Multi-User Editing system works.
The Multi-User Editing workflow is built on a client-server model, where a single server hosts any number of sessions. Each session is a separate virtual workspace that any Unreal Editor instance on the same network can connect to in order to collaborate on the same Project content inside a shared environment.
While inside a session workspace, each user can interact with the Project content in any way made possible by their instance of Unreal Editor. For example:
Some users might work using a keyboard and mouse in a standard desktop PC setup — on different platforms, if needed.
Other users might choose to use the Editor’s VR editing mode to visualize and work in the scene using a VR headset and controllers, or they might use a plugin like the Virtual Camera to enter the same scene using a mobile device.
Whenever any connected user makes a change to a Level or saves an Asset in the Project, their instance of Unreal Editor automatically forwards information about that change to the server. The server is responsible for keeping track of all these change records, or transactions, and for sending out those transactions to all other connected clients. Each client then applies those same changes locally within its own environment. In this way, everyone’s view of their current Level and of other Assets in the Project remains up to date with the latest changes.
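To make the transaction model concrete, here is a simplified, illustrative sketch of the record-and-replay idea. This is not the actual Multi-User (Concert) API; all type and member names are invented for illustration only.

```cpp
// Illustrative sketch only, NOT the real Multi-User API. It shows the core
// idea: the server assigns a global order to transactions, and every client
// applies them in that same order, so all clients converge on the same state.
#include <functional>
#include <vector>

struct FTransaction
{
    int SequenceNumber = 0;               // server-assigned global ordering
    std::function<void()> ApplyLocally;   // replays the recorded change
};

class FSessionClient
{
public:
    // Called whenever the server broadcasts new transactions.
    void Receive(const std::vector<FTransaction>& Transactions)
    {
        for (const FTransaction& Tx : Transactions)
        {
            if (Tx.SequenceNumber > LastApplied)  // skip already-applied changes
            {
                Tx.ApplyLocally();
                LastApplied = Tx.SequenceNumber;
            }
        }
    }

private:
    int LastApplied = 0;
};
```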
Synchronizing Transactions
The Multi-User Editing system chooses different strategies for synchronizing changes between connected clients depending on the type of Asset you’re working with and the type of changes you make.
Levels: Instant Sync
All changes that you make to the contents of a Level are synchronized immediately with all other computers in the session. If you add or remove Actors, move Actors from place to place, swap Materials, or change the properties of an Actor, all other users in the session will see those changes take effect immediately. Dragging a tool to move, rotate, or scale an Actor causes multiple transactions to occur while you’re dragging. Other users see these changes happen even before you release your mouse.
For example, in the following video, when the user on the left makes changes by dragging Actors from place to place, the user on the right sees those changes reflected immediately in their Viewport, even before the drag is released. Similarly, when the user on the right rotates the object, the user on the left sees the changes immediately.
You only see the effects of other users’ changes if you’re in the same Editor mode: that is, if you’re both in editing mode, or if you’re both in one of the Play In Editor (PIE) or Simulate modes.
Other Assets: Sync on Save
The Multi-User Editing system does not immediately synchronize most types of Assets in your Unreal Engine Project each time you make a change. This includes Asset types like Materials and Material Instances, Static Mesh Assets, Blueprint classes, and so on. In these cases, when you make changes to these Assets, you are the only one who sees the changes take effect in your environment immediately. When you save your changes, the Multi-User Editing system sends the transaction around to all other users in the session.
In this case, the transaction is not just a record of the change, but the actual saved Asset. Each instance of the Editor that receives one of these updated packages immediately hot-reloads it, making the changes visible.
For example, in the following video, when the user on the left changes a Material, their changes are not immediately synchronized. However, when the user saves their changes to the Asset, the transaction is processed and the changes show up for the user on the right.
After you save your Asset, you automatically acquire a lock on it. In addition, while you have unsaved changes to some kinds of Assets, the Multi-User Editing system marks those Assets as dirty for other users. See Avoiding Conflicts below.
Sequencer: Playback Sync and Optional UI Sync
The Multi-User Editing system treats Level Sequences and Master Sequences like Levels: when any user makes a change to a Sequence, such as adding or removing a track or adding a new keyframe, it immediately synchronizes that change to all other users in the session.
In addition, when one user plays a Sequence, the same Sequence plays immediately for all users that have that same Sequence open. Only the user who began playback will be able to stop it. After the Sequence has stopped, any user can restart playback.
You can also choose to have the Multi-User Editing system automatically open the Sequencer UI on all clients when one user opens it. Because the Multi-User Editing system synchronizes playback for all users that have the same Sequence open, enabling this UI synchronization option helps you make sure that all users see the same Sequence play back at the same time.
The following video shows how the Multi-User Editing system synchronizes Sequencer UI events, playback events, and editing events.
The Multi-User Editing system synchronizes playback events, but different computers may play back the animations at different frame rates. Do not expect the results to be frame-accurate from one instance of Unreal Editor to another.
Presence
The Multi-User Editing system offers a few different ways to see what the other users in your session are doing.
Location and Viewpoint
While you’re working in the same Level as another user, you’ll see an avatar that represents their position and viewpoint on the scene. As they interact with the scene, you’ll also see lasers that indicate the objects highlighted by their mouse cursors or motion controllers.
For example, the following image shows avatars for two desktop users working in the same Level:
At any time, you can use the controls in the Connected Clients panel of the Multi-User Browser window to work with this presence information.
Teleport immediately to the same location and viewpoint as this user.
Toggle displaying the avatar and laser pointers for this user.
You can also click the Level listed for any connected user to switch to that Level and teleport immediately to that user’s location and viewpoint.
Session Change History
The Multi-User Browser window shows a list of all editing transactions that have occurred in the current session, and each time a user connected to or disconnected from the session.
Asset Change History
You can also view the list of changes made to each individual Asset. Right-click an Asset in the Content Browser, and select Multi-User > Asset History. You’ll get a list of all the transactions that modified this particular Asset.
Avoiding Conflicts
When you allow multiple people to work on the same content at the same time, they’re likely to eventually try to change the same things at the same time. If you are too permissive about allowing this to happen, it can cause frustration as people fight for control or lose the changes they make. On the other hand, if you are too strict, you may reduce people’s ability to modify the things they need when they need to, which can slow down their work or block them entirely.
The Multi-User Editing system attempts to find a balance between these two extremes, preventing fighting and overwriting in the most common cases of contention between users.
Locking Dragged Selections
While you’re actively dragging a selection in the Level Viewport — for example, to move, rotate, or resize it — you have exclusive control over those Actors. If another user attempts to modify any of the same Actors, either by dragging them with a tool in the Level Viewport or by setting properties in the Details panel, their changes will be immediately reverted.
As soon as you stop your drag interaction, other users become free to work with those Actors again, even if you still have them selected.
Locking Assets
If you want to prevent other users from modifying certain Assets, you can lock them. While an Asset is locked, only the user who locked it can save new modifications. Other users can still modify locked Assets locally in their own Project, but they can’t save their modifications until the user who owns the lock unlocks it.
There are two ways to lock and unlock Assets:
Any time you begin modifying an unlocked Asset, you acquire a temporary lock on it until you save your changes.
You can also lock and unlock selected Assets without modifying them. Right-click them in the Content Browser and select Multi-User > Lock Asset(s) or Multi-User > Unlock Asset(s).
You can identify locked and modified Assets by the overlay on the Asset’s thumbnail in the Content Browser. Mouse over the Asset to see details about the user who owns the lock or who has modified the Asset:
White indicates that you locked the Asset. You can continue modifying at any time. Other users cannot modify this Asset until you save or unlock it.
White with avatar indicates that another user has locked this Asset. You can still modify this Asset locally in your Project. If you do, you’ll be notified that another user has it locked.
You won’t be able to save your modifications and sync that change to others until the user who owns the lock releases it.
Asterisk indicates the Asset is dirty (modified but unsaved) on another computer.
Only selected Asset types get marked with this icon. Be cautious about modifying Assets in this state — whoever saves their changes first gets a lock on the file and their change propagates to all other clients, potentially losing changes that others have made.
Undo and Redo
Each connected user has access only to their own history of operations. Within that history, each user can freely undo and redo their own actions, just as if they were working offline without being part of the session. However, a user cannot undo or redo any actions carried out by any other user in the session.
Starting from a Common State
To avoid long delays and high bandwidth usage, the server in the Multi-User Editing system does not transfer the entire contents of your Project between connected users. Instead, it relies on exchanging lightweight transaction records when it needs to achieve instant synchronization, and only circulates Assets like Levels, Static Meshes, Blueprints, and so on when those Assets get modified and saved during the course of a session.
The only way to make sure that incoming transactions are applied in exactly the same way for every connected user, and therefore that every user’s content is kept in sync with every other user’s content, is to have every connected user start with exactly the same content in exactly the same state. That way, as the same list of transactions is applied to each client in the same order, the overall state of each user’s content is guaranteed to remain synchronized.
The typical way to get all users to start with the same content is to use a source control system such as Perforce, Git, or SVN, and to make sure that each user synchronizes their local copy to the same changelist or revision as everyone else before they connect to a session together. See the Multi-User Editing and Source Control section below for more.
Using source control isn’t an absolute requirement; you can use Multi-User Editing without an external source control system. However, in that case it is up to you to find a way to share your Project content within your organization such that all the users who need to work together are reliably able to start with the exact same Project content.
Session Validation
Every time a user attempts to connect to a session, the server checks certain attributes of their content and their working environment to see if they match the content the session was initially started with. If any of these checks fail, the server won’t allow the user to join the session.
Transactions and Persisting Changes
As you and your teammates work in a live session, the Editor applies transactions on top of your local Project content in a kind of virtual sandbox. Your Project files on disk are not actually modified at all as long as you remain in the session. The Multi-User Editing system takes care of keeping track of your changes, and shows you in the Unreal Editor the result of applying those transactions to your Project content.
At any time, any session participant can choose to persist the session changes: to take all the modifications made in the current session, and apply those changes back to their local files on disk. If you began your session while connected to a Source Control provider, you can also optionally check in those same changes in a new changelist or revision.
If you leave a session without persisting changes, the Editor automatically reverts back to the state your Project was in before you started or joined the session. It hot-reloads all the Assets in your Project that you modified while in the session, discarding the session modifications. Although you no longer see your session changes after you leave the session, you don’t lose the transactions entirely. Each session still retains the record of all its transactions, even when all its users have left. If you re-join the same session later, the Multi-User Editing system re-applies all those same transactions in your Editor.
If you leave a session that you own without persisting changes, you’ll also be prompted to persist the changes. This helps avoid the possibility of leaving changes unsaved within the session by accident. Even if you dismiss this prompt, you won’t lose the pending changes forever. You can rejoin the session anytime and persist the changes later. Even if the server shuts down, the session changes aren’t lost; see the next section for details.
Redundancy
The server retains each session until the user who created the session expressly deletes it, or until the server itself is shut down. Therefore, a crash or disconnection for any individual client, or even for all clients, never results in losing modifications. Users can simply reconnect to the server and pick up where they left off.
The server also protects itself against losing information due to unexpected shutdowns by saving its session records to disk.
If the server shuts down abnormally, then the next time you restart that server, it immediately restores all the sessions it had open to the state they were in when it shut down.
If you shut down the server cleanly, it archives all live sessions. You can restore the session from the archive later. See the next section for details.
Multi-User Editing and Source Control
The Multi-User Editing system offers some features that are similar to source control systems, such as having a server that stores a history of transactions, or having users acquire locks on the Assets they work on to reduce contention. However, Multi-User Editing is not a replacement for source control.
Multi-User Editing is best used to augment a standard collaboration workflow in which you use a dedicated source control system such as Perforce, SVN, or Git to regularly record the changes you make to your Project.
When you have such a version control system set up in your team, use Multi-User Editing for limited-time live collaborations on top of a specific, defined changelist.
At the start of each live session (a shoot, a day’s work, a review meeting, or whatever applies to the work you do with your team), all participants decide on the changelist they’ll start from. This is typically the latest revision. Everyone syncs to that revision.
At the end of the live session, one person commits those changes back to the team’s usual source control system to preserve them.
After committing changes from a session back to source control, you can delete the session — you don’t need it anymore.
Next time you need to work live with others, start a new session based on the latest changelist.
We don’t recommend keeping a single Multi-User Editing session in use for a long period of time — that is, for days or weeks of work. Stop and commit your changes regularly to source control.
Networking
When you join a session, your instance of Unreal Editor connects to the server via UDP, on port 6666.
Each computer running the Unreal Editor that wants to connect to a server must be able to see the Private IP Address of that server’s computer.
The server’s computer must have port 6666 open to UDP traffic coming from the local network.
You should only expect this connection to work on a Local Area Network (LAN), or potentially if both endpoints are in the same Virtual Private Network (VPN). Do not expect to share your Unreal Editor sessions over an open Internet connection.
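For reference, the Multi-User server runs as a separate UnrealMultiUserServer executable. If you need it to listen on a specific local address, Unreal’s UDP messaging layer can, as an assumption based on its standard command-line overrides, be pointed at an endpoint such as -UDPMESSAGING_TRANSPORT_UNICAST=192.168.1.10:6666, where the IP address here is purely illustrative.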
Onboarding Guide for Games Licensees
Steps to getting started with Unreal Engine.
Choose your operating system:
Hardware and Software Requirements
This page covers the hardware and software requirements for Unreal Engine 5. It also describes what is installed by the prerequisites installer included in the Unreal Engine installer.
Recommended Hardware
Operating System
Processor
Quad-core Intel or AMD, 2.5 GHz or faster
Memory
Graphics Card
DirectX 11 or 12 compatible graphics card
RHI Version
DirectX 11: Latest drivers
DirectX 12: Latest drivers
Vulkan: AMD (21.11.3+) and NVIDIA (496.76+)
Operating System
Latest macOS Monterey
Processor
Quad-core Intel, 2.5 GHz or faster
Memory
Video Card
Metal 1.2 Compatible Graphics Card
Operating System
Processor
Quad-core Intel or AMD, 2.5 GHz or faster
Memory
Video Card
NVIDIA GeForce GTX 960 or higher with the latest NVIDIA binary drivers
Video RAM
RHI Version
Vulkan: AMD (21.11.3+) and NVIDIA (515.48+)
To get the most out of rendering features of Unreal Engine 5 such as Nanite and Lumen, see the Requirements for UE5 Rendering Features section of this page.
Minimum Software Requirements
Minimum requirements for running the engine or editor are listed below.
Running the Engine
Operating System
DirectX Runtime
Running the Engine
Operating System
Latest macOS Monterey
Running the Engine
Operating System
Any reasonably new Linux distro, from CentOS 7.x and up
Linux Kernel Version
kernel 3.x or newer
Additional Dependencies
glibc 2.17 or newer
The requirements for programmers developing with the engine are listed below.
Developing with the Engine
All ‘Running the Engine’ requirements (automatically installed)
Visual Studio Version
Visual Studio 2019 v16.11.5 or later (recommended)
Visual Studio 2022
iOS App Development
iTunes Version
Developing with the Engine
Xcode Version
Developing with the Engine
Operating System
Compiler
IDE
Visual Studio Code, CLion, QtCreator
Software Installed by the Prerequisite Installer
You can find the installer in the Engine/Extras/Redist/en-us folder under your Unreal Engine installation location.
Support for 32-bit platforms has been removed in Unreal Engine 5.
The following table lists the software that is installed by the prerequisite installer.
Visual C++ Redists:
Visual C++ 2010 CRT
Visual C++ 2010 OpenMP library
Visual C++ 2012 CRT
Visual C++ 2013 CRT
Visual C++ 2015 CRT
Visual Studio 2019 redistributable
DirectX components:
XInput 1.3 (April 2007)
X3DAudio 1.7 (February 2010)
XAudio 2.7 (June 2010)
D3D Compiler 4.3 (June 2010)
D3DCSX 4.3 (June 2010)
D3DX9 4.3 (June 2010)
D3DX10 4.3 (June 2010)
D3DX11 4.3 (June 2010)
The most important DirectX components from that list are the XInput, X3DAudio, and XAudio dependencies. These aren’t included in standard installations of DirectX (and aren’t distributed with Windows by default), so they have to be installed manually or distributed with the application.
Some features of Unreal Engine that require DirectX 12 require at minimum Windows 10 version 1703 (Windows 10 Creators Update). For the best experience with features like Nanite, Virtual Shadow Maps, and Lumen, we recommend using Windows 10 version 2004 or 20H2.
Graphics Card Drivers
We currently recommend using the latest stable releases from each card manufacturer:
Performance Notes
This list represents a typical system used at Epic, providing a reasonable guideline for developing games with Unreal Engine 5:
Windows 10 64-bit (Version 20H2)
256 GB SSD (OS Drive)
2 TB SSD (Data Drive)
NVIDIA GeForce RTX 2080 SUPER
Xoreax Incredibuild (Dev Tools Package)
Six-Core Xeon E5-2643 @ 3.4GHz
If you don’t have access to Xoreax Incredibuild (Dev Tools Package), we recommend compiling on a machine with 12 to 16 cores.
Requirements for UE5 Rendering Features
Some rendering features of Unreal Engine 5 have different requirements than the minimum specifications.
Lumen Global Illumination and Reflections
Software Ray Tracing:
Video cards using DirectX 11 with support for Shader Model 5
Hardware Ray Tracing:
Windows 10 with DirectX 12 support
Video cards must be NVIDIA RTX-2000 series and higher, or AMD RX-6000 series and higher
Nanite Virtualized Geometry
All newer versions of Windows 10 (newer than version 1909.1350) and Windows 11 with support for the DirectX 12 Agility SDK are supported.
DirectX 12 (with Shader Model 6.6 atomics), or Vulkan (VK_KHR_shader_atomic_int64)
Latest Graphics Drivers
Virtual Shadow Maps
All newer versions of Windows 10 (newer than version 1909.1350) and Windows 11 with support for the DirectX 12 Agility SDK are supported.
DirectX 12 (with Shader Model 6.6 atomics), or Vulkan (VK_KHR_shader_atomic_int64)
Latest Graphics Drivers
Temporal Super Resolution
Runs on any video card that supports Shader Model 5, but the limit of 8 UAVs per shader has performance implications. Temporal Super Resolution shaders compile with 16-bit types enabled on D3D12 platforms that support Shader Model 6.
Acquiring Unreal Engine
Source Code Branches
You’ll notice that we’ve published UE5’s source code in several branches.
Branches whose names contain dev, staging, and test are typically for internal Epic processes and are rarely useful for end users. Other short-lived branches may appear from time to time as we stabilize new releases or hotfixes.
Release Branch
The Release branch always reflects the current official release. Releases are extensively tested by our QA team, so they make a great starting point for learning Unreal Engine and for making your own projects. We work hard to make releases stable and reliable, and aim to publish a new release every few months.
Main Branch
Most active development on UE5 happens in the ue5-main branch. This branch reflects the latest development state of the engine, so it may be buggy or may not compile. We make it available for developers who are eager to test new features or work in lock-step development with us.
If you choose to work in this branch, be aware that it is likely to be ahead of the branches for the current official release and the next upcoming release. Therefore, content and code that you create to work with the ue5-main branch may not be compatible with public releases until we create a new branch directly from ue5-main for a future official release.
Downloading the Source Code
Please follow these instructions to download the Unreal Engine source code.
Please make sure you have Visual Studio 2019 or higher for Windows Desktop installed before proceeding.
To use Git from the command line, see the Setting up Git and Fork a Repo articles.
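For example, once your GitHub account has been linked to Epic’s organization, cloning the repository from the command line looks like: git clone https://github.com/EpicGames/UnrealEngine.git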
If you’d prefer not to use Git, you can get the source with the ‘Download ZIP’ button on the right. The built-in Windows zip utility marks the contents of zip files downloaded from the Internet as unsafe to execute, so right-click the zip file and select ‘Properties...’, then ‘Unblock’, before decompressing it. Third-party zip utilities don’t normally do this.
Install Visual Studio 2019.
Open your source folder in Explorer and run Setup.bat. This will download binary content for the engine, as well as installing prerequisites and setting up Unreal file associations. On Windows, a warning from SmartScreen may appear. Click More info, then Run anyway to continue.
A clean download of the engine binaries may take some time to complete. Subsequent checkouts only require incremental downloads and will be much quicker.
Run GenerateProjectFiles.bat to create project files for the engine. It should take less than a minute to complete.
Load the project into Visual Studio by double-clicking the UE5.sln file. Set your solution configuration to Development Editor and your solution platform to Win64, then right-click the UE target and select Build. It may take anywhere between 10 and 40 minutes to finish compiling, depending on your system specs.
After compiling finishes, you can load the editor from Visual Studio by setting your startup project to UE5 and pressing F5 to debug.
To use Git from the Terminal, see the Setting up Git and Fork a Repo articles. If you’d rather not use Git, use the ‘Download ZIP’ button on the right to get the source directly.
Open your source folder in Finder and double-click on Setup.command to download binary content for the engine. You can close the Terminal window afterwards.
In the same folder, double-click GenerateProjectFiles.command. It should take less than a minute to complete.
Load the project into Xcode by double-clicking on the UE5.xcworkspace file. Select the ShaderCompileWorker for My Mac target in the title bar, then select the ‘Product > Build’ menu item. When Xcode finishes building, do the same for the UE5 for My Mac target. Compiling may take anywhere between 15 and 40 minutes, depending on your system specs.
After compiling finishes, select the ‘Product > Run’ menu item to load the editor.
Our developers and support teams currently use the latest version of Ubuntu; as a result, we may not be able to provide support for other Linux distributions (including other versions of Ubuntu).
Open your source folder and run Setup.sh to download binary content for the engine.
Both cross-compiling and native builds are supported.
Cross-compiling is handy when you are a Windows developer (Mac support is planned) who wants to package your game for Linux with minimal hassle. It requires a cross-compiler toolchain to be installed (refer to the Linux cross-compiling page).
Additional target platforms
Android support will be downloaded by the setup script if you have the Android NDK installed. See the Android Quick Start guide.
Development for consoles and other platforms with restricted access, like Sony PlayStation, Microsoft Xbox, and Nintendo Switch, is only possible if you have a registered developer account with those third-party vendors.
If you don’t have access to these resources, first register a developer account with the third-party vendor. Then contact your Epic Games account manager if you have one, or fill out and submit the Console Development Request form for Unreal Engine if you don’t. Epic will contact you with a formal agreement to digitally sign. Once this is approved, you will receive instructions on how to access source code, binaries, and additional instructions for your platform.
Licensing and Contribution
Next Steps
Footnotes
The first time you start the editor from a fresh source build, you may experience long load times. The engine is optimizing content for your platform and filling the derived data cache; this should only happen once.
Your private forks of the Unreal Engine code are associated with your GitHub account permissions. If you unsubscribe or switch GitHub user names, you’ll need to re-fork and upload your changes from a local copy.
Building Unreal Engine from Source
Inside the root directory, run GenerateProjectFiles.bat to set up your project files.
All project files are intermediate ( [Unreal Engine Root Directory]\Engine\Intermediate\ProjectFiles ). You must generate project files each time you sync a new build to ensure they are up to date. If you delete your Intermediate folder, you must regenerate project files using the GenerateProjectFiles batch file.
Set your solution configuration to Development Editor.
Set your solution platform to Win64.
Right-click the UE target and select Build.
Launching the Editor
The process of running the editor requires passing the name of the project to run as an argument to the executable.
Running the Editor from the Command Line
From a command prompt, navigate to your [LauncherInstall]/[VersionNumber]/Engine/Binaries/Mac directory.
Run the UEEditor.app, passing it the path to your project.
From a command prompt, navigate to your [LauncherInstall][VersionNumber]\Engine\Binaries\Win64 directory.
Run the UEEditor.exe, passing it the path to your project.
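For example, on Windows, with a hypothetical project stored at C:\Projects\MyProject, the command would look like: UEEditor.exe "C:\Projects\MyProject\MyProject.uproject". On Mac, you pass the same kind of .uproject path to the app.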
Running the Editor from the Executable
Navigate to your [LauncherInstall][VersionNumber]\Engine\Binaries\Win64 directory.
Right-click on the UEEditor.exe executable and choose Create shortcut.
Right-click on the newly created shortcut and choose Properties.
Add the name of the game to run as an argument at the end of the Target property (an example Target value is shown after these steps).
Press OK to save the changes.
Double-click the shortcut to launch the editor.
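For example, the Target property might read as follows, where the install path and the project name MyProject are hypothetical: "C:\Program Files\Epic Games\UE_5.0\Engine\Binaries\Win64\UEEditor.exe" MyProject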
The editor must be run from the command prompt to load a specific project directly, or with no arguments to access the Project Browser.
Running the Editor with No Arguments (Stand-alone)
Creating Your First Project
When you open Unreal Editor, the Project Browser appears. The Project Browser provides a jumping-off point from which you can create new projects, open or manage your existing projects, and open sample content like sample games and Showcases.
The diagram below illustrates the steps to create a new project in Unreal Engine.
Creating a new project in Unreal Engine from the Project Browser window.
To create a new project, follow these steps:
Select the development category (1) that best matches your industry and project goals.
You can select from the following categories:
Film, Television, and Live Events
Architecture, Engineering, and Construction (AEC)
Automotive, Product Design, and Manufacturing (APM)
Select a template (2) for your project. The templates you can choose from are based on the category you selected in step 1.
Unreal Engine contains a number of project templates you can use as a starting point for your own projects. To learn more about the different project templates available, refer to the Templates Reference page.
Configure the Project Defaults (3). In this section, you can select your target platform (that is, the hardware where your game or application will run, like a computer or a mobile device), configure quality and ray tracing settings, and more.
Some of the settings below may not be available for certain templates. For example, the Handheld AR template can only use Blueprint implementation.
You can configure the following settings:
Implementation
Select how you want to implement your project’s logic, such as character movement, level transitions, and so on.
You can choose from the following options:
Blueprint, if you want to build your project in the Unreal Editor, and use the Blueprint Visual Scripting system to create interactions and behavior.
C++, if you want to build your project by programming with C++ in Visual Studio.
For more information about these implementation methods, refer to the following pages:
Target Platform
Select the kind of platform your project is intended for:
Desktop
Mobile
Quality Preset
Select the maximum quality level, based on which platform your project targets. We recommend that you choose:
Maximum, if you are developing your project for a computer or game console.
Scalable, if you are developing your project for mobile devices.
Starter Content
Select whether you want your new project to include starter content. Starter content includes some simple Static Meshes with basic textures and Materials. It is useful if you want to start learning and experimenting straight away, without worrying about sourcing and importing custom content.
Ray Tracing
Select whether to enable or disable ray tracing for your project.
For more information about ray tracing in Unreal Engine, refer to the Hardware Ray Tracing and Path Tracing Features page.
Select where you want to store your project, and give your project a name (4).
Click Create (5) to finish creating your new project.
Result
Unreal Engine creates a new project with the settings you configured, and then automatically opens the project.
Compiling Code Projects
If you create a project with the Blank template, or any of the Blueprints Only templates, you can immediately begin working with your project in Unreal Editor. When working with any game or engine C++ code, however, you will need to compile your code in order to see any changes reflected in the game.
Unreal Engine (UE) uses a custom building method via the Unreal Build Tool, which handles all the complex aspects of compiling the project and linking it with the engine. This process occurs transparently, allowing you to simply build the project through the standard Visual Studio build workflow.
Build Configuration
Unreal projects have multiple targets (Editor, Client, Game, and Server) described by *.target.cs files, each of which can be built to different configurations. In Visual Studio, this manifests as a Visual Studio *.vcxproj file with different configurations for each target. The solution configurations are named as [Configuration][Target Type] (for example, "DevelopmentEditor" for the default editor target, and "Development" for the default standalone game target). The configuration you use will be determined by the purposes of the build you want to create.
Every build configuration contains two keywords, and the first keyword indicates the state of the engine and your game project. For instance, if you compile using a Debug configuration, you will be able to debug your game’s code. The second keyword indicates the target you are building for. For example, if you want to open a project in Unreal, you need to build with the Editor target keyword.
Debug
This configuration contains symbols for debugging. This configuration builds both engine and game code in debug configuration. If you compile your project using the Debug configuration and want to open the project with the Unreal Editor, you must use the "-debug" flag in order to see your code changes reflected in your project.
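For example, with a hypothetical project named MyProject, you would launch the editor as UEEditor.exe MyProject -debug after building the Debug configuration.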
DebugGame
This configuration builds the engine as optimized, but leaves the game code debuggable. This configuration is ideal for debugging only game modules.
Development
This configuration enables all but the most time-consuming engine and game code optimizations, which makes it ideal for development and performance reasons. Unreal Editor uses the Development configuration by default. Compiling your project using the Development configuration enables you to see code changes made to your project reflected in the editor.
Shipping
This is the configuration for optimal performance and shipping your game. This configuration strips out console commands, stats, and profiling tools.
Test
This configuration is the Shipping configuration, but with some console commands, stats, and profiling tools enabled.
Game
This configuration builds a stand-alone executable version of your project, but requires cooked content specific to the platform. Please refer to our Packaging Projects Reference page to learn more about cooked content.
Editor
To be able to open a project in Unreal Editor and see all code changes reflected, the project must be built in an Editor configuration.
Common UI Quickstart Guide
A first-time walkthrough of Common UI’s core features.
Choose your operating system:
This page provides a walkthrough for how to set up Common UI in your project. You will learn the following in this guide:
Setting up your viewport to support Input Routing.
How to create Input Action Data Tables, which map controller buttons to actions within your UI.
How to set up Default Navigation Actions, which support global click and back button functionality.
How to create Controller Data Assets and assign them to specific types of controllers on specific platforms.
1. Viewport Input Routing Setup
The Viewport is the base for all input routing in Common UI. When Common UI captures input, it sends it to the Viewport first, which then sends it to whichever root node is drawn on top. To support this functionality, perform the following setup steps:
Open Edit > Project Settings > Engine > General Settings.
Set your Game Viewport Client Class to CommonGameViewportClient.
If you need your own custom game viewport class, you will need to extend it from CommonGameViewportClient to use Common UI.
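If you do extend it, the subclass can be minimal. The following is a sketch, assuming the CommonUI plugin is enabled and your module lists it as a dependency; the class and file names here are hypothetical.

```cpp
// MyGameViewportClient.h -- a minimal sketch of extending Common UI's
// viewport class. Names are hypothetical; assumes the CommonUI plugin.
#pragma once

#include "CommonGameViewportClient.h"
#include "MyGameViewportClient.generated.h"

UCLASS()
class UMyGameViewportClient : public UCommonGameViewportClient
{
    GENERATED_BODY()

public:
    // Add project-specific viewport behavior here. Input still flows through
    // UCommonGameViewportClient, so Common UI's input routing keeps working.
};
```

Remember to point Game Viewport Client Class at this subclass in Project Settings after compiling.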
2. Creating an Input Action Data Table
Common UI uses Input Action Data Tables to create named actions that can be associated with various platform-specific inputs. For examples, see GenericInputActionDataTable in Common UI’s content folder, or NavigationInputActionDataTable in the Content Example project.
Common UI’s Input Action Data Tables are not related to the input actions used in Project Settings or the Advanced Input System. They are exclusively used for managing UI input.
Right-click in the Content Browser, then click Miscellaneous > Data Table Asset.
Select CommonInputActionDataBase as your row structure, then click OK to create a new Input Action Data Table.
To add a new Input Action row, click Add in the top bar.
Populate the Input Action with a name and information about which keys activate it.
Input Actions consist of the following parameters:
Display Name: The name of the input action. Displayed in the Nav-bar, if present.
Hold Display Name: The name of the input action if it requires the user to hold the button down.
Nav Bar Priority: Priority used when sorting actions in the Nav-bar from left to right.
Keyboard Input Type Info: Key used for this action when using Mouse & Keyboard, if any.
Default Gamepad Input Type Info: Key used for this action when using Gamepad, if any.
Gamepad Input Overrides: Key used for this action on a specific gamepad. Useful for platform-specific button overrides, such as swapping the back and forward buttons for the gamepad on the Nintendo Switch.
Touch Input Type Info: Key used for this action when using a touch interface, if any.
Common UI widgets map these abstract actions to actual inputs. For example, you can add a data table and row name reference to the Triggering Input Action in the CommonButtonBase widget. After that, pressing the button associated with that action will activate the Common UI button.
For fewer version control conflicts, group related sets of actions into their own data tables. For example, put all menu navigation actions into one table together, then put actions for more specialized menus into their own tables as well. After that, create a single composite data table that references those data tables.
3. Default Navigation Action Setup
Cardinal navigation is natively supported by Unreal Engine. However, Common UI uses the Common UI Input Data asset to define universal Click and Back input actions that are used across all platforms.
Create a new Blueprint Class in the Content Browser.
Search for CommonUIInputData and click Select to create a new Blueprint.
Assign an appropriate Data Table containing your default Click and Back actions.
Assign this asset to Project Settings > Game > Common Input Settings > Input Data.
The asset specified above will be loaded by Common UI and used for default navigation. The designated Click button substitutes for mouse clicks when highlighting buttons or other interactable elements, and the designated Back button is universally used for navigating backward from your current menu to a previous one.
4. Controller Data Binding (Platform-Specific UI Elements)
Controller Data Assets associate key-actions with UI elements. Each Controller Data Asset is associated with an input type, gamepad, and platform. CommonUI uses this information to automatically use the correct platform-specific UI elements based on the current platform and input type. Optionally, for platforms that support multiple input types or unique gamepads, it can also use the user’s input to find the correct gamepad and swap its UI elements at runtime.
Right-click in the Content Browser and create a new Blueprint Class.
Search for CommonInputBaseControllerData and click Select to create a new Controller Data Asset.
Populate the Controller Data Asset with assets and information about one of the controllers you plan to support.
Input Type: Set this to Gamepad, Mouse and Keyboard, or Touch.
Gamepad Name: If the controller is a Gamepad, this is the platform this gamepad corresponds to. The default Gamepad is called Generic.
Input Brush Data Map: Mapping of keys to UI elements and icons.
Input Brush Key Sets: Mapping of multiple keys to a single UI element. Useful for D-Pads and other inputs that can potentially map to different axes.
Once you have created Controller Data for all inputs you plan to support, these classes must be added to their associated platforms under Project Settings > Game > Common Input Settings > Platform Input.
The Default Gamepad Name must exactly match the Gamepad Name field in one of your Controller Data Assets, otherwise it will not be recognized and its icons will not appear.
Assign each gamepad’s data to the Controller Data array for its corresponding platform. You can have multiple gamepads associated with a single platform. For example, PC games would typically support a mouse and keyboard Controller Data as well as a generic gamepad. However, you can also add Controller Data for specific gamepad models.
5. Common UI Widget Library and Widget Styling
Common UI has a library of widgets, listed under the Common UI Plugin section in UMG’s Palette. Many of these are pieces of UI functionality that are commonly reproduced in many games and applications, including:
Specialized text blocks for date / time and numeric values.
Navigation and visibility aids like carousels and animated switchers.
Platform aids like a loading guard and a hardware visibility border.
Widgets that provide basic functionality, like buttons and text, but that use style data assets for styling.
None of these widgets have the styling options of their equivalent base UMG widgets. Instead, they reference Common Style Assets, making it possible to apply a consistent style across multiple menus and HUDs. Any changes you make to a style asset will take effect on every Common UI widget that uses it.
To make a common style asset:
Right-click in the Content Browser and create a Blueprint, then select one of the Common Style classes as the base.
Populate its Details with the styling information that you want to apply to Common UI widgets. These are typically the same as the styling options on standard UMG widgets.
Assign it to a Common UI widget of the appropriate type. For example, if you create a Common Text Style asset, you would assign it to the Style field in the Common Text Widget.
You can also assign these to the Template Styles in Project Settings > Plugins > Common UI Editor.
Any Common UI widgets that you have not manually assigned a style will use the appropriate Template Style instead. This makes it easy to create a global default style for your application.
The Project Settings > Plugins > Common UI Framework menu features several more global assets, including a Default Throbber Material that is used in loading screens, and a Default Image Resource Object that displays as a placeholder for UI assets that have not been loaded.
Opening an Existing Project
Describes how to access and open an existing project in Unreal Engine.
Choose your operating system:
There are several ways to open an existing Unreal Engine project, depending on where it is located and the version of Unreal Engine it was created with.
Opening Projects from the Epic Games Launcher
In the Epic Games Launcher, click Unreal Engine on the left navigation, then click Library at the top of the Launcher. You will see a list of all Unreal Engine projects on your local machine.
If you copied a project from another computer or from the Internet, the project will not appear in this list until you have opened it locally at least once. Refer to the Opening Projects From Disk section on this page to learn more.
My Projects section of the Epic Games Launcher.
Double-click any project thumbnail to open that project.
The Launcher displays the following information for each Project:
Unreal Engine version that the project is compatible with
Right-clicking a project’s thumbnail in the Launcher brings up a context menu with the following options:
Open: Opens the project in the version of Unreal Engine that it is compatible with.
Show in folder: Opens the project folder in a new Windows Explorer (Windows) or Finder (macOS) window.
Create shortcut: Creates a shortcut to the project on the desktop.
Clone: Creates an exact copy of the project. You can specify the name and location of the new project.
Delete: Permanently deletes the project.
The project files are not moved to the Recycle Bin, like they would be if you deleted the project folder from Windows Explorer or Finder. If you accidentally delete a project using this option, you will not be able to recover it.
Opening Projects from the Project Browser
When you launch any version of Unreal Engine, the Project Browser opens and displays the Recent Projects panel by default. Much like the projects list in the Epic Games Launcher, this section displays all the Unreal Engine projects on your disk.
If you copied a project from another computer or from the Internet, the project will not appear in this list until you have opened it locally at least once. Refer to the Opening Projects From Disk section on this page to learn more.
Project Browser in Unreal Engine 5.
Double-click any project thumbnail to open that project. Alternatively, click the project to select it, then click Open.
The Project Browser displays the following information for each Project:
Unreal Engine version that the project is compatible with
Right-clicking a project’s thumbnail in the Project Browser brings up a context menu with an option to open your project folder in a Windows Explorer or Finder window.
Enable the Always load last project on startup checkbox in the bottom-left corner of the Project Browser to have Unreal Engine 5 always open the last project you worked on when you launch the engine.
Opening Projects from Unreal Engine
In Unreal Engine, from the main menu, go to File > Open Project. This opens a window that displays a list of local projects, similar to the Project Browser.
Opening a project from Unreal Engine.
To open a project, either:
Select the project from the list, then click Open.
Double-click the project thumbnail.
If the project you want to open isn’t in the list, click Browse, then browse to the project folder and open the (ProjectName).uproject file.
Opening a project with a different version of Unreal Engine than the one it was created in can lead to data corruption and data loss. It is recommended to open a copy of the project to preserve the original.
Opening Projects From Disk
If you copied a project from another computer or from the Internet, the project will not appear in this list until you have opened it locally at least once.
To open a project from disk, follow these steps:
Make sure you have installed the Unreal Engine version that the project was created with.
Navigate to the project’s folder on disk.
Double-click the (ProjectName).uproject file. This file is located directly inside your project’s folder.
In this example, you would double-click the MyProject.uproject file.
If you don’t have the version of Unreal Engine that the project was created with, you will see a pop-up window asking you to select which version of Unreal Engine to open the project with.
Select an Unreal Engine version from the drop-down, or click the ellipsis (...) button to browse to an Unreal Engine executable on your disk, then click OK.
Opening a project with a different version of Unreal Engine than the one it was created in can lead to data corruption and data loss. It is recommended to open a copy of the project to preserve the original.
XR Best Practices
Best practices for creating and optimizing content for XR projects
Choose your operating system:
Virtual reality (VR) is an immersive new medium, with unique aspects to consider when authoring and presenting content to users on these platforms. Considerations include user comfort, content optimization, and limitations of the platforms. Use this page as a reference for these topics as you develop your projects for VR.
VR Project Settings
VR projects can be created as a Blueprint or C++ project.
When you create a new project targeting VR platforms, start with the VR Template in the Games category. The VR Template provides everything you need to start developing VR projects in Unreal Engine 5.
If you don’t want to use the VR Template as a base for your project, you can create a new blank C++ or Blueprint project with the following settings:
Quality Preset: Scalable
Starter Content: Disabled
These settings create an Unreal project with minimal rendering features enabled. This ensures that the project starts with a good framerate, and you can add any other rendering features you need.
After you create the project, set the following project settings to improve the performance of your apps:
Go to Edit > Project Settings > Description, and enable Start In VR.
In Edit > Project Settings > Rendering > Default Settings, set the Anti-Aliasing Method to Multisample Anti-Aliasing (MSAA).
For mobile VR experiences, enable Mobile Multi-View in Edit > Project Settings > Rendering > VR.
Also, set Mobile HDR to False.
See the Scalability Reference for explanations and examples on scaling the graphics of your app to improve its performance or quality. The following list describes some console variables and their recommended values for VR projects; a code sketch for applying them follows.
vr.PixelDensity: 1 is the ideal resolution for the HMD that is currently being used. Lower values will perform faster but will be undersampled (more blurry), while values over 1 will perform slower and will supersample (extra sharp).
r.SeparateTranslucency: This can be expensive for mobile VR experiences because of fill-rate limits. It’s recommended to disable this feature.
r.HZBOcclusion: See the Hierarchical Z-Buffer Occlusion section in the Visibility Culling page for more details.
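These values can also be applied from game code at startup. The following is a minimal sketch, assuming the recommended values discussed above (vr.PixelDensity at 1, r.SeparateTranslucency disabled); setting the same variables in DefaultEngine.ini or at the in-editor console works equally well.

```cpp
// A minimal sketch for applying the recommended VR console-variable values.
#include "HAL/IConsoleManager.h"

static void ApplyRecommendedVRSettings()
{
    // 1.0 matches the HMD's ideal resolution; lower is faster but blurrier.
    if (IConsoleVariable* PixelDensity =
            IConsoleManager::Get().FindConsoleVariable(TEXT("vr.PixelDensity")))
    {
        PixelDensity->Set(1.0f);
    }

    // Separate translucency is fill-rate heavy on mobile VR, so disable it.
    if (IConsoleVariable* SeparateTranslucency =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.SeparateTranslucency")))
    {
        SeparateTranslucency->Set(0);
    }
}
```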
VR Frame Rate Optimization
Most VR applications implement their own procedures to control the VR frame rate. Because of this, you should disable several general Unreal Engine project settings that can interfere with VR applications.
Follow these steps to disable general framerate settings in UE:
From the editor’s main menu, choose Edit > Project Settings to open the Project Settings window.
In the Project Settings window, choose General Settings in the Engine section.
In the Framerate section:
Disable Smooth Frame Rate.
Disable Use Fixed Frame Rate.
Set Custom TimeStep to None.
VR World Scale
Ensuring the correct scale of your world is one of the most important ways to help deliver the best user experience possible on VR platforms. Having the wrong scale can lead to all kinds of sensory issues for users, and could even cause simulation sickness. Objects are most easily viewed in VR when they are in a range of 0.75 to 3.5 meters from the player’s camera. Inside Unreal Engine, 1 Unreal Unit (UU) is equal to 1 centimeter (cm). This means that objects inside Unreal are best viewed when they are 75 UU to 350 UU away from the player’s camera (when using VR).
1 meter: 100 Unreal Units
1 kilometer: 100,000 Unreal Units
You can adjust the scale of your world using the World to Meters variable located under World Settings. Increasing or decreasing this number will make the user feel larger or smaller in relation to the world around them. Assuming your content was built with 1 Unreal Unit = 1 cm, setting World To Meters to 10 will make the world seem very large, while setting World To Meters to 1000 will make the world seem very small.
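World to Meters is also exposed to C++ through the world settings. The following is a minimal sketch, assuming you have a valid UWorld pointer (for example, from an Actor’s GetWorld()); the function name is illustrative.

```cpp
// A minimal sketch of adjusting world scale at runtime via world settings.
#include "Engine/World.h"
#include "GameFramework/WorldSettings.h"

void SetWorldScale(UWorld* World, float NewWorldToMeters)
{
    if (AWorldSettings* Settings = World ? World->GetWorldSettings() : nullptr)
    {
        // 100 is the default (1 UU = 1 cm). As described above, 10 makes the
        // world feel very large; 1000 makes it feel very small.
        Settings->WorldToMeters = NewWorldToMeters;
    }
}
```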
VR and Simulation Sickness
Simulation sickness is a form of motion sickness that can affect users during immersive experiences. The following list describes some best practices to limit the discomfort that users can experience in VR.
Maintain framerate: Low framerates can cause simulation sickness. Optimizing your project as much as possible can improve the experience for the user. See the following table for the recommended frame rates that you need to target for your XR platform.
HMD Device
Target Frame Rate
90 minimum, up to 144
Windows Mixed Reality VR
Situational: 60/120, 90/90, and 120/120
Keep users in control of the camera: Cinematic cameras, or anything that takes control of camera movements away from the player, contribute to user discomfort in immersive experiences. Camera effects, such as head bobbing and camera shaking, should be avoided because they can lead to user discomfort if the user is not controlling them.
Field of View (FOV) must match the device: The FOV value is set through the device’s SDK and internal configuration, and matches the physical geometry of the headset and lenses. Because of this, the FOV should not be modified in UE, and should not be modifiable by the user. If the FOV value is changed, the world can appear to warp when you turn your head, leading to discomfort.
Use lights and colors that are more dim, and avoid smearing: When designing elements for VR, you might need to use dimmer lights and colors than you normally would. Strong and vibrant lighting in VR can cause simulation sickness to occur more quickly. Using cooler shades and dimmer lights can help prevent user discomfort. This also helps with preventing smearing between bright and dark areas on the display.
Movement speed should not change: Users should start at full speed instead of gradually accelerating to full speed.
Avoid post process effects that greatly affect what the user sees: Avoid post process effects such as Depth of Field and Motion Blur to prevent user discomfort.
Consider things like character height, width, speed, and camera location, as these often need to be slightly modified for VR characters.
VR Camera Setup
VR camera setup in Unreal Engine depends entirely on whether your VR experience is seated or standing (a code sketch of both setups follows this list):
Seated experience: You will need to artificially raise the camera origin to the desired player height for the project, and call the Set Tracking Origin function with Origin set to Eye Level.
Standing experience: Make sure the camera origin is set to 0, relative to the pawn’s root, which is typically on the ground. Attach a camera component to a scene component at the base of the pawn, at ground level, and call Set Tracking Origin with the Origin parameter set to Floor Level.
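The following is a minimal C++ sketch of the two setups described above, assuming the HeadMountedDisplay module is available to your project; the function names are illustrative.

```cpp
// Minimal sketches of seated versus standing tracking-origin setup.
#include "HeadMountedDisplayFunctionLibrary.h"

void ConfigureSeatedTracking()
{
    // Seated: track relative to eye level; raise the camera origin to the
    // desired player height elsewhere in your Pawn setup.
    UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Eye);
}

void ConfigureStandingTracking()
{
    // Standing: track relative to the floor; keep the camera origin at the
    // base of the Pawn, at ground level.
    UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Floor);
}
```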
VR Content Considerations
When creating VR content, remember that users can look at that content from multiple angles. Use the following list as a reference as you create content for VR:
Missing Polygon Faces: In non-immersive experiences, it is a common practice to remove polygon faces from objects that cannot be seen by the player. However, in VR experiences, players may have more freedom to look around. Missing polygon faces can sometimes lead to users being able to see things that they’re not supposed to see.
Lighting Types: You should use Static Lights and Lightmaps for VR projects. This is the cheapest lighting option to render. If you need to use dynamic lighting, limit the number of dynamic lights to as few as possible, and make sure that they never touch one another.
VR and VFX: Some VFX tricks, like using SubUV Textures to simulate fire or smoke, do not hold up very well when viewed in VR. In many cases, you will need to use static meshes instead of 2D particles to simulate effects like explosions or smoke trails. Near-field effects, or effects that happen very close to the camera, work well in VR, but only when the effects are made up of Static Mesh particles.
VR and Transparency: In 3D graphics, rendering transparency is extremely costly, because transparency will generally have to be reevaluated per frame to ensure that nothing has changed. Because of this reevaluation, rendering transparency in VR can be so costly that its cost outweighs its benefits. However, to get around this issue, you can use the DitherTemporalAA Material Function. This Material Function will allow a Material to look like it is using transparency, and can help avoid common transparency issues, such as self-sorting.
Known Limitations
Below you will find a list of features that might not work as expected in VR because of how HMDs are designed, along with possible workarounds to address them.
Screen Space Reflections (SSR): While SSR still works in VR, the reflections it produces can have issues matching up with what is being reflected in the world. Instead of using SSR, use Reflection Probes, as they are much cheaper and suffer less from reflection-alignment issues.
Screen Space Global Illumination: Screen space techniques can generate differences between the displays for the two eyes in the HMD. These differences can cause user discomfort. See Lighting Types for recommended lighting types in VR for replacements.
Ray Tracing: VR apps using ray tracing currently aren’t able to maintain the resolution and frame rate needed for a comfortable VR experience.
User Interface: 2D UI is not supported in stereo rendering since it does not work well when viewed in stereo. Instead of 2D UI, use a widget component placed in the 3D world. See Creating 3D Widget Interaction for more on how to create these widgets.
Normal Mapping Issues
When viewing Normal maps on objects in VR, you will notice that they do not have the impact that they might have once had. This is because Normal mapping does not account for having a binocular display or motion parallax. As a result, Normal maps often look flat when viewed with a VR device. However, that does not mean that you should not or will not need to use Normal maps; it just means that you need to more closely evaluate if the data in the Normal map would be better off made out of geometry. Below, you will find a technique that can be used in place of Normal maps.
Parallax Mapping: Parallax mapping takes Normal mapping to the next level by accounting for depth cues that Normal mapping does not. A Parallax mapping shader can better display depth information, making objects appear to have more detail than they do. This is because no matter which angle you look at, a Parallax map will always correct itself to show you the correct depth information from your viewpoint. The best use of a Parallax map would be for things like cobblestone pathways and surfaces with fine detail.
How to Set up Vehicles
This guide explains how to set up a vehicle to use the Chaos Physics Solver.
Choose your operating system:
The individual assets that make up a vehicle are as follows.
A Skeletal Mesh
A Physics Asset
An Animation Blueprint
A Vehicle Blueprint
One or more Wheel Blueprints
A Float Curve asset that represents the engine’s torque curve
These are the same no matter whether you are creating an automobile or a motorcycle. This document will guide you through the process of setting up a Vehicle.
Enabling the Chaos Vehicles Plugin.
Creating and Editing Chaos Wheel Blueprints.
Creating a Curve Asset for the engine torque.
Importing a Vehicle Mesh.
Creating and Editing a Physics Asset.
Creating an Animation Blueprint with the Wheel Controller node.
Creating a Vehicle Blueprint.
Setting up Vehicle Control Inputs.
Setting up the Vehicle Game Mode.
Enabling the Chaos Vehicles Plugin
Before using Chaos Vehicles, the Chaos Plugin needs to be enabled.
Click Settings > Plugins to open the Plugins Menu.
Click the Physics category and enable the ChaosVehiclesPlugin.
Restart Unreal Editor after enabling the plugin.
This plugin will not work with PhysX enabled.
Creating and Editing Chaos Wheel Blueprints
The wheel Blueprint is where the configuration for a Wheel / Suspension / Brakes combination is set up.
In most cases, you will need at least two wheel types per vehicle: a wheel (or axle) that is affected by the steering / engine / handbrake, and one that is not. You may also want differently sized wheels for the front and back; in that case, you have full control over setting the differing radii, mass, width, handbrake effect, suspension, and many other properties to give your vehicle the handling you desire.
There is no limit on the number of wheels that a vehicle may have. It is possible for multiple vehicles to share the same Wheel Blueprints, but this strategy is only valid if the wheel dimensions and suspension limits are the same.
Create a Wheel Blueprint
In the Content Browser, right-click and select Blueprint Class from the Create Basic Asset section.
In the Pick Parent Class window, under All Classes, search for "wheel" and select ChaosVehicleWheel. Click Select to create the asset.
The new asset will be created in the Content Browser. Give it a recognizable name so that you can easily locate it later (for example, ‘BP_ChaosFrontWheel’).
(Optional Step) Repeat these steps so that you have a front and rear wheel type. Think of each as a setup per axle.
Edit the Wheel Blueprint
Double-click the assets in the Content Browser to open them in the Blueprint Editor, where there are options to edit the wheels.
To start, there are a handful of properties that need to change for each wheel; the rest of the properties depend on how the vehicle performs, and should be tweaked later during testing.
Axle Type
This defines whether the wheel is in the front or the rear of the vehicle.
Wheel Radius
This needs to match the size of the render model in centimeters (cm).
Affected by Handbrake
Enable this in the rear wheels.
Affected by Engine
Enable this in the rear wheels for a rear-wheel drive (RWD) vehicle. Enable this in the front wheels for front-wheel drive (FWD) vehicles. Enable this on all of the wheels for all-wheel drive (AWD) vehicles.
Affected by Steering
Enable this in the front wheels. For vehicles with more steered wheels in the front (for example, a truck cab might have four-wheel steering), enable it on the second axle with a different steering angle. Alternatively, all-wheel steering is possible by enabling steering on the rear wheels with a negative steering angle.
Max Steer Angle
Normally, this is a positive value (specified in degrees). However, a negative value is allowed for rear-wheel counter-steering for all-wheel drive (AWD) vehicles.
Example Class Defaults for the Chaos Wheel Blueprint.
For the buggy example, set the Wheel Radius to 58 for both front and back wheel Blueprints.
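For comparison, the same handful of properties can be set as class defaults in a C++ wheel class. This is only a sketch: the class name and values below are illustrative (a front wheel for a RWD vehicle, using the buggy's 58 cm radius), not part of the sample project.

```cpp
// MyFrontWheel.h — a hypothetical front-wheel class for a RWD vehicle.
#pragma once
#include "ChaosVehicleWheel.h"
#include "MyFrontWheel.generated.h"

UCLASS()
class UMyFrontWheel : public UChaosVehicleWheel
{
    GENERATED_BODY()
public:
    UMyFrontWheel()
    {
        AxleType = EAxleType::Front;
        WheelRadius = 58.f;            // must match the render mesh, in cm
        bAffectedByHandbrake = false;  // handbrake acts on the rear axle
        bAffectedByEngine = false;     // RWD: the engine drives the rear wheels
        bAffectedBySteering = true;
        MaxSteerAngle = 50.f;          // degrees; negative allows counter-steer
    }
};
```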
Creating a Curve Asset for the engine torque
The torque curve represents the amount of torque output from the engine at a given RPM. The graph's X axis represents the engine RPM (revolutions per minute), with a range from 0 to the engine's max RPM. The Y axis represents the engine torque output in Nm (Newton meters). A typical torque curve is an inverted U shape, with the torque peaking near the middle of the rev range and trailing off on either side.
To create your torque curve, follow these steps:
In the Content Browser, right-click and select Miscellaneous > Curve. Select the Curve Float type and click the Select button to create the asset.
Name the asset TorqueCurve.
In the Content Browser, double-click TorqueCurve to open it in the Curve Editor. Add points to create the curve shape to your liking.
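If you prefer to see the curve expressed in code, here is an illustrative sketch that builds a similar inverted-U shape at runtime, with hypothetical RPM and torque values; authoring the Curve asset in the editor, as above, is the normal workflow.

```cpp
// Illustrative only: an inverted-U torque curve built in code.
#include "Curves/CurveFloat.h"

UCurveFloat* MakeTorqueCurve(UObject* Outer)
{
    UCurveFloat* Curve = NewObject<UCurveFloat>(Outer, TEXT("TorqueCurve"));
    Curve->FloatCurve.AddKey(0.f, 300.f);     // low torque at idle
    Curve->FloatCurve.AddKey(3000.f, 450.f);  // peak near the middle of the rev range
    Curve->FloatCurve.AddKey(6000.f, 200.f);  // trailing off toward max RPM
    return Curve;
}
```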
Importing a Vehicle Mesh
This guide uses the Buggy vehicle mesh from the Vehicle Game sample project. In the Epic Games Launcher, click the Learn tab to find the project.
Vehicle Physics Editor
After importing a vehicle mesh to your project, follow these steps to view the mesh in the Physics Asset Editor.
In the Content Browser, double-click the vehicle Skeletal Mesh to open it.
Click the Physics tab to open the Physics Asset Editor.
Creating and Editing a Physics Asset
Creating the Physics Asset
If you have a skeletal mesh that does not have an associated physics asset, then you can create a physics asset by following these steps:
In the Content Browser, right-click the skeletal mesh asset and select Create > Physics Asset > Create and Assign.
Click the Primitive Type dropdown and select Single Convex Hull. Click Create Asset to create a new physics asset.
This will generate a physics asset with default collision shapes for each of the bones. The initial collision setup will most likely not be ideal since it will use the same primitive type to represent all of the bones in the physics asset.
In the Content Browser, double-click the physics asset to open it in the Physics Asset Editor.
Inside the Physics Asset Editor you can adjust the collision primitives used on each bone to better suit the vehicle mesh.
Editing the Physics Asset
In the Skeleton Tree window, click the gear icon and select Show Primitives.
Select all the wheel bones inside the Skeleton Tree window.
Go to the Tools window and under the Body Creation section, click the Primitive Type dropdown and select Sphere. Click Re-generate Bodies.
You can now see the Sphere primitives on each of the wheels.
Select the suspension bones inside the Skeleton Tree window. Right-click and select Collision > No Collision to remove collision from the vehicle suspension.
Creating an Animation Blueprint with the Wheel Controller node
Animation Blueprints are used to control Vehicle Skeletal Mesh animations specific to that vehicle, such as spinning tires, suspension, handbrakes, and steering animations. To offload a lot of the work in creating these types of animations, you can use the Wheel Controller Node to drive the animations.
Wheel Controller Node
Where an Animation Blueprint is used to get and control animations for the vehicle, the Wheel Controller node makes controlling all of the animations for the vehicle straightforward with little to no additional setup.
The node gets the necessary information from the wheels (such as "How fast is it spinning?", "Is it affected by the Handbrake?", or "What are the suspension settings for this wheel?"), and translates the query results to an animation on the bone that the wheel is associated with.
Create an Animation Blueprint
In the Content Browser, right-click and select Animation > Animation Blueprint.
In the Create Animation Blueprint window, select the VehicleAnimationInstance parent class and select the vehicle Skeleton from the list. Click Create to create a new Animation Blueprint asset.
In the Content Browser, double-click the animation Blueprint to open it.
Right-click in the Anim Graph and search for and select Mesh Space Ref Pose.
Right-click in the Anim Graph and search for and select Wheel Controller for WheeledVehicle. Connect the Mesh Space Ref Pose node to the Wheel Controller node.
Right-click in the Anim Graph and search for and select Component To Local. Connect the Wheel Controller node to the Component To Local node. Connect the Component To Local node to the Output Pose node.
(Optional Step) If you have additional struts or other suspension needs (like the sample Buggy from Vehicle Game), you will need additional nodes in the Animation Graph to handle the joints that affect those polygons. For example, in the Buggy, the extra joints are used to control the axle connections to the wheels. These are driven by Look At nodes, which, when given the wheel joints, will be driven by the Wheel Controller node. The Look At nodes will ensure the suspension stays attached to the wheels, as demonstrated in the following example.
The Buggy uses the following Look At node composition:
Creating a Vehicle Blueprint
In this section, you will create the Vehicle Blueprint that uses all the assets created in the previous sections.
In the Content Browser, right-click and select Blueprint Class from the Create Basic Asset category.
In the Pick Parent Class window, expand the All Classes section and search for and select WheeledVehiclePawn. Click Select to create the new Blueprint asset.
In the Content Browser, double-click the Vehicle Blueprint to open it.
In the Components window, click the Mesh Skeletal Mesh Component. Go to the Details panel and under the Mesh section click the Skeletal Mesh dropdown. Select your vehicle’s Skeletal Mesh asset.
In the Details panel, scroll to the Animation section and click the Anim Class dropdown. Select your vehicle’s Animation Blueprint.
In the Details panel, scroll to the Physics section and enable the Simulate Physics checkbox.
From the Components window, click Add Component, and search for and select Spring Arm.
With the Spring Arm component selected, click Add Component, and search for and select Camera.
In the Viewport select the Camera and position it to your liking.
Select the Camera component and go to the Details panel. Scroll to the Camera Settings section and verify that Use Pawn Control Rotation is disabled.
This ensures that the camera is locked to its view direction rather than the Player Controller’s view direction.
Select the Vehicle Movement Component in the Components window.
Go to the Details panel and scroll to the Vehicle Setup section. Expand the arrow next to Wheel Setups and set the following for each wheel:
Set the Wheel Class to the Wheel Blueprint(s) you created.
Set the Bone Name to the name of the joint that should be controlled by the wheel.
The order you assign the wheels has no bearing on whether a wheel is a front or a rear wheel; only the Bone Name and Wheel Class have any effect. For organizational purposes, we recommend keeping the wheels in the same order for each new vehicle (for example: FL, FR, BL, BR). Keeping a standard helps when you access wheel data by wheel index (the index in the Wheel Setups array).
If the vehicle requires more than 4 wheels, click the + icon next to the Wheel Setups property to add more, or conversely remove wheels as needed.
Select the Vehicle Movement Component in the Components window and go to the Details panel. Scroll down to the Mechanical Setup section and expand the Engine Setup category. Expand the Torque Curve category and add the torque curve asset to the External Curve dropdown.
Setting up Vehicle Control Inputs
Click Settings > Project Settings to open the Project Settings window.
Go to the Input category and set up control inputs for steering, throttle, brake, and handbrake. The image below shows the inputs for the Buggy in the Vehicle Game.
In the Content Browser, double-click the Vehicle Blueprint to open it.
Drag the Vehicle Movement Component to the Event Graph to create a node.
Drag from the Vehicle Movement Component node and search for and select Set Throttle Input.
Right-click in the Event Graph and search for and select Throttle to add your Throttle input event.
Connect the InputAxis Throttle node to the Set Throttle Input node. Connect the Axis Value pin from the InputAxis Throttle node to the Throttle pin of the Set Throttle Input node.
Follow the steps above and add the nodes for Set Brake Input and Set Steering Input, and connect them with their corresponding input events.
Follow the steps above and add the nodes for Set Pitch Input, Set Roll Input and Set Yaw Input, and connect them with their corresponding input events.
Follow the steps above and add the nodes for Set Change Up Input and Set Change Down Input, and connect them with their corresponding input events.
Enable the New Gear Up checkbox on the Set Change Up Input node.
Enable the New Gear Down checkbox on the Set Change Down Input node.
Follow the steps above and add the node for Set Handbrake Input twice, and connect each node to the Pressed and Released pins of the InputAction Handbrake node. Enable the New Handbrake checkbox on the Set Handbrake Input node connected to the Pressed pin.
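For reference, here is a hedged C++ sketch of the same control hookup. It assumes the axis mappings named above ("Throttle", "Brake", "Steering") and a hypothetical AMyVehiclePawn class derived from AWheeledVehiclePawn.

```cpp
// A rough C++ equivalent of the Blueprint input hookup above.
#include "WheeledVehiclePawn.h"
#include "ChaosVehicleMovementComponent.h"

void AMyVehiclePawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    PlayerInputComponent->BindAxis("Throttle", this, &AMyVehiclePawn::OnThrottle);
    PlayerInputComponent->BindAxis("Brake", this, &AMyVehiclePawn::OnBrake);
    PlayerInputComponent->BindAxis("Steering", this, &AMyVehiclePawn::OnSteering);
}

// Each handler forwards the axis value to the Chaos vehicle movement component.
void AMyVehiclePawn::OnThrottle(float Value) { GetVehicleMovementComponent()->SetThrottleInput(Value); }
void AMyVehiclePawn::OnBrake(float Value)    { GetVehicleMovementComponent()->SetBrakeInput(Value); }
void AMyVehiclePawn::OnSteering(float Value) { GetVehicleMovementComponent()->SetSteeringInput(Value); }
```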
Vehicle Game Mode Setup
In the Content Browser, right-click and select Blueprint Class from the Create Basic Asset category.
In the Pick Parent Class window, select Game Mode Base to create the Game Mode Blueprint.
In the Content Browser, double-click the new Game Mode Blueprint to open it.
Go to the Details panel and scroll to the Classes section. Click the Default Pawn Class dropdown and select your Vehicle Blueprint.
Click Compile and Save, then close the window.
In the Main Viewport window, click Window > World Settings to open the World Settings panel.
Go to the World Settings panel and scroll down to the Game Mode section.
Click the GameMode Override dropdown and select your Game Mode Blueprint.
Press Play and test your vehicle.
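If you prefer code, a Game Mode can make the same assignment in its constructor. This minimal sketch assumes the hypothetical AMyVehiclePawn class from the input sketch above.

```cpp
// A minimal C++ counterpart to the Game Mode setup above.
#include "GameFramework/GameModeBase.h"
#include "MyVehiclePawn.h" // hypothetical vehicle pawn

AMyVehicleGameMode::AMyVehicleGameMode()
{
    // Spawn the vehicle pawn for the player, mirroring the
    // Default Pawn Class dropdown in the Blueprint Game Mode.
    DefaultPawnClass = AMyVehiclePawn::StaticClass();
}
```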
If you have an existing PhysX vehicle that you want to convert to Chaos, follow the PhysX to Chaos Vehicle Conversion guide.
Level Designer Quick Start
Get up and running with the basics of the Unreal Editor.
Choose your operating system:
In order to understand and use the content on this page, make sure you are familiar with the following topics:
At the end of this guide, you’ll have a room similar to the one pictured above.
The focus of the Unreal Editor Quick Start Guide is to walk you through the basics of working with Unreal Engine 5.
After going through this tutorial, developers will know the following:
How to navigate viewports
How to create a new level
How to place and edit actors in levels
How to build and run levels
1. Required Setup
In the Project Browser, you can create new projects based off several different template types, or open any previously created projects or samples that you have downloaded. Let’s create a new project.
After Installing Unreal Engine and launching the Unreal Editor, the Project Browser appears. Under New Project Categories, select a development category. For this quick start, let’s create a project from the Games category, then click Next.
In the second page of the Project Browser, select the Blank template, then click Next.
On the final page of the Project Browser, select the Blueprint and With Starter Content settings, enter a Folder location and Name for your project, then click Create Project.
2. Navigating the Viewport
With the project open and ready to go, the first thing you may notice is the Viewport in the center of the Unreal Editor.
Inside the Viewport is where you will do most of your level construction. The template project that we selected in the previous step includes a small sample Level and some assets for us to get started with. Using this little area as a point of reference, take a moment to get used to the Viewport Camera Controls by using the most common methods of navigating the Viewport in Unreal Engine 5.
Standard Controls
These controls represent the default behavior when clicking and dragging in the viewports with no other keys or buttons pressed. These are also the only controls that can be used to navigate the orthographic viewports.
Perspective
LMB + Drag
Moves the camera forward and backward and rotates left and right.
RMB + Drag
Rotates the viewport camera.
LMB + RMB + Drag
Moves up and down.
Orthographic (Top, Front, Side)
LMB + Drag
Creates a marquee selection box.
RMB + Drag
Pans the viewport camera.
LMB + RMB + Drag
Zooms the viewport camera in and out.
F
Focuses the camera on the selected object. This is essential to make the most out of tumbling the camera.
WASD Fly Controls
All of these controls are only valid in a Perspective viewport, and by default you must hold RMB to use the WASD game-style controls.
W / Numpad8 / Up
Moves the camera forward.
S / Numpad2 / Down
Moves the camera backward.
A / Numpad4 / Left
Moves the camera left.
D / Numpad6 / Right
Moves the camera right.
E / Numpad9 / Page Up
Moves the camera up.
Q / Numpad7 / Page Dn
Moves the camera down.
Z / Numpad1
Zooms the camera out (raises FOV).
C / Numpad3
Zooms the camera in (lowers FOV).
Orbit, Dolly, and Track
Unreal Editor supports Maya-style pan, orbit, and zoom viewport controls, making it much easier for Maya artists to jump into the tool. If you are unfamiliar, here is a breakdown of the keys:
Alt + LMB + Drag
Tumbles the viewport around a single pivot or point of interest.
Alt + RMB + Drag
Dollies (zooms) the camera toward and away from a single pivot or point of interest.
Alt + MMB + Drag
Tracks the camera left, right, up, and down in the direction of mouse movement.
The use of the F key is not limited to Maya-style controls. You can always press F to focus on a selected object or group of objects.
3. Create a New Level
Next we will create a new Level that we will use to build our game environment in. While there are several different ways in which you can create a new Level, we will use the File Menu method, which lists level selection options.
Selecting File > New Level will open the New Level dialog window:
The Default level includes some of the commonly used assets for constructing levels, the VR-Basic level includes some assets for constructing levels with the VR Editor, and Empty Level is a completely blank level with no assets. For the purposes of this guide we are going to start from scratch with a completely blank slate.
Click the Empty Level to select it.
4. Placing Actors in the Level
In the Place Actors panel on the left, click the Geometry category and select the Box.
Left-click and drag the Box into the Level Viewport.
When you release the Left Mouse Button, the Box is added to the level.
In the Details panel (lower-right window of the editor), with the Box still selected, set Location and Rotation all to 0.
Set the Scale to 4 x 4 x 0.1.
We will use this as the floor on which the player can walk around.
In the Place Actors panel select the Lights tab, then drag-and-drop a Directional Light into the level on top of the floor.
If the Directional Light becomes unselected, you can re-select it by Left-clicking on it in the Level Viewport.
In the Place Actors panel, select the Visual Effects tab and drag-and-drop a Volumetric Cloud into the level.
The Volumetric Cloud Actor will add a basic sky to the level and the level will become illuminated instead of dark.
In the Place Actors panel, select the Basic tab and drag-and-drop a Player Start into the level.
In the Place Actors panel, select the Volumes tab and drag-and-drop a Lightmass Importance Volume into the level.
The Lightmass Importance Volume is used to control and concentrate lighting and shadowing effects within the volume. When placing the Lightmass Importance Volume in the level, the default size of the volume does not cover our playable area, so we will need to scale it up.
Click and drag the white box in the center of the Scale Tool so that the Lightmass Importance Volume encapsulates the floor.
Inside the Content Browser under Content > StarterContent > Props, drag-and-drop the SM_TableRound into the level.
Try to place the table in the center of the floor using the Move Tool (press W if it is not selected).
Also under Content > StarterContent > Props, drag-and-drop the SM_Chair into the level.
Left-click and drag the blue axis arc (the gizmo will update to show degrees) and rotate the chair to face the table.
Using the placement methods above, create a small scene by adding more Actors from the Place Actors panel and the Content Browser.
Try adding some lights, props, walls and a roof (found under the Content > StarterContent > Architecture folder).
5. Editing Placed Actors
With several different Actors placed inside our level, the next step involves editing Actor properties, which can change the look of an Actor or the way it functions in the level, giving us a more customized level. We will start by editing the properties of our Directional Light Actor, then shift our focus to applying Materials to some of the Static Mesh Actors that you have placed in your level.
Once you have finished this step, you will have seen where to access and modify the properties of Actors, so that you can begin editing and experimenting with different settings inside your own levels.
Select the Directional Light Actor by Left-clicking on it in the Viewport.
In the Details Panel under the Light category, enable Atmosphere Sun Light:
Depending on the rotation of your Directional Light Actor, the sky color will change. If you rotate the Viewport around, you will see that the sun now aligns with the Directional Light Actor. This is a real time process, so you can rotate the Directional Light Actor (press E to switch to Rotation Mode) and the sky will change color from night to sunrise, daytime, and sunset.
Next we will change the Material on one of your placed Static Mesh Actors by first selecting it.
With your Actor selected, in the Details panel under Materials, click the drop-down box under Element 0.
In the pop-up window, select the M_Brick_Clay_New Material.
All Actors in your level have many properties for you to adjust inside the Details panel. Explore changing their settings!
Try changing the Light Color of your lights, applying more Materials or changing the Scale of the Actors in your level.
6. Running the Build Process
By now you may have noticed the "Preview" labels in the shadows and the light leaking under walls.
This is because all the lights in the scene are static and use precomputed, or baked, lighting, which has not been calculated yet. The "Preview" text is there to remind you that what you are seeing in the viewport is not what you will see in the game.
In this step, we will go through the Build process which will build all levels (precompute lighting data and visibility data, generate any navigation networks and update all brush models). We will also take a look at Light Quality settings inside of the Build Options, which we can use to adjust the quality of our lighting when it is built.
From the Main Toolbar, click the down-arrow next to the Build option.
Under Lighting Quality, choose the Production setting.
This gives us the highest quality lighting, but it is the slowest in terms of computation time and will increase the time it takes to build the game. Our level is small, so it should not impact us too much, but keep this in mind when you are working on larger levels: you may want to leave the setting at a mid-to-low level while creating your level, and switch it to Production for a "final pass" on your level.
Wait for the Build to complete.
You will see the progress in the lower-right corner of the Unreal Editor as seen in the image above. Once the Build process is complete, the level lighting will update to give you a better indication of the final result.
From the Main Toolbar, click the Play Button to play in the editor.
Using WASD to move and the Mouse to turn the camera, you can fly around your level.
7. On Your Own!
At this point, you should have created a Build of the level lighting and previewed your game with the Play in Editor feature. Each of the steps leading up to this point have been aimed at getting you quickly up to speed on how to perform the most common actions when constructing levels inside the Unreal Editor.
Using the methods that were provided during the course of this guide, try to do the following on your own:
Change the lighting of the level to a moonlit, night scene.
Add another room, attached to the first room.
Try making the attached room elevated, and join the two rooms with stairs.
Add some bushes, a couch, shelves and a front door.
Add different kinds of lights with different colors.
Use different Materials on some of your Actors.
As for specifics covered in this quick start:
For more information on the Level Editor, see: Level Editor
For more information on Viewports, see: Viewports
For more information on the Editing Modes available in Unreal Editor 5, see: Level Editor Modes
For more information on the Content Browser, see: Content Browser
For more information on the Details Panel, see: Details Panel
For more information on Building, see: Lightmass
For more information on Lighting, see: Lighting Quick Start Guide
IK Rig
Retarget and procedurally adjust animations using IK Rig and Retargeting tools.
Choose your operating system:
The IK Rig system provides a method of interactively creating Solvers that perform pose editing for your Skeletal Meshes. The resulting IK Rig asset can then be embedded into any animation system, such as Animation Blueprints, to dynamically modify poses based on the solver parameters.
Additionally, the IK Retargeting system can be used to transfer animations between characters of varying proportions, either at runtime or for offline creation of new animation sequences.
This page contains links to documentation covering Unreal Engine’s IK Rig and Retargeting tools, and real-world examples of their workflows.
IK Rig
IK Rigs are the primary Assets you will work with when using the IK Rig system. These pages describe their usage and primary features.
Niagara Overview
This page gives an overview of the Niagara VFX system in Unreal Engine 5.
Choose your operating system:
Niagara is Unreal Engine’s next-generation VFX system. With Niagara, the technical artist has the ability to create additional functionality on their own, without the assistance of a programmer. The system is adaptable and flexible. Beginners can start out by modifying templates or behavior examples, and advanced users can create their own custom modules.
Core Niagara Components
In the Niagara VFX system, there are four core components:
Systems
A Niagara system is a container for everything you need to build an effect. Inside that system, you may have different building blocks that stack up to help you produce the overall effect.
You can modify some system-level behaviors that will then apply to everything in that effect.
The Timeline panel in the System Editor shows which emitters are contained in the system, and can be used to manage those emitters.
Emitters
Emitters are where particles are generated in a Niagara system. An emitter controls how particles are born, what happens to those particles as they age, and how the particles look and behave.
The emitter is organized in a stack. Inside that stack are several groups, inside which you can put modules that accomplish individual tasks. The groups are as follows.
Emitter Spawn
This group defines what happens when an emitter is first created on the CPU. Use this group to define initial setups and defaults.
Emitter Update
This group defines emitter-level modules that occur every frame on the CPU. Use this group to define spawning of particles when you want them to continue spawning on every frame.
Particle Spawn
This group is called once per particle, when that particle is first born. This is where you will want to define the initialization details of the particles, such as the location where they are born, what color they are, their size, and more.
Particle Update
This group is called per particle on each frame. Define here anything that needs to change frame-by-frame as the particles age: for example, the color of the particles changing over time, the particles being affected by forces like gravity, curl noise, or point attraction, or even the particles changing size over time.
Event Handler
In the Event Handler group, you can create Generate events in one or more emitters that define certain data. Then you can create Listening events in other emitters which trigger a behavior in reaction to that generated event.
Render
The last group is the Render group. This is where you define the display of the particle and set up one or more renderers for your particles. You may want to use a Mesh renderer if you want to define a 3D model as the basis of your particles, upon which you could apply a material. Or, you may want to use a sprite renderer and define your particles as 2D sprites. There are many different renderers to choose from and experiment with.
Modules
Modules are the basic building blocks of effects in Niagara. You add modules to groups to make a stack. Modules are processed sequentially from top to bottom.
You can think of a module as a container for doing some math. You pass some data into the module, then inside the module you do some math on that data, and then you write that data back out at the end of the module.
Modules are built using High-Level Shading Language (HLSL), but can be built visually in a Graph using nodes. You can create functions, include inputs, or write to a value or parameter map. You can even write HLSL code inline, using the CustomHLSL node in the Graph.
You can double-click any module from an emitter in Niagara to take a look at the math that’s happening inside. You can even copy and create your own modules. For example, if you double-click on the Add Velocity module to take a look inside, you can inspect the data flow.
All modules are built with that basic methodology, though for some the internal math may be more complex.
Parameters and Parameter Types
Parameters are an abstraction of data in a Niagara simulation. Parameter types are assigned to a parameter to define the data that parameter represents. There are four types of parameters:
Primitive: This type of parameter defines numeric data of varying precision and channel widths.
Enum: This type of parameter defines a fixed set of named values, and assumes one of the named values.
Struct: This type of parameter defines a combined set of Primitive and Enum types.
Data Interfaces: This type of parameter defines functions that provide data from external data sources. This can be data from other parts of UE4, or data from an outside application.
You can add a custom parameter module to an emitter by clicking the Plus sign icon (+) and selecting Set new or existing parameter directly. This adds a Set Parameter module to the stack. Click the Plus sign icon (+) on the Set Parameter module and select Add Parameter to set an existing parameter, or Create New Parameter to set a new parameter.
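User-exposed parameters can also be driven from code at runtime. The following is a hedged sketch; "User.SpawnRate" is a placeholder parameter name, and NiagaraComp is assumed to be a UNiagaraComponent pointer held by the actor.

```cpp
// Hedged sketch: setting a user-exposed Niagara parameter from C++.
#include "NiagaraComponent.h"

void AMyEffectActor::SetSpawnRate(float NewRate)
{
    if (NiagaraComp) // UNiagaraComponent* property on this actor
    {
        // Writes the float into the system's user namespace each call.
        NiagaraComp->SetVariableFloat(FName(TEXT("User.SpawnRate")), NewRate);
    }
}
```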
Templates, Wizards, and Behavior Examples
When you first create a Niagara emitter or Niagara system, a dialog displays offering several options for what kind of emitter or system you want to create.
You can change any of the parameters in the template. You can add, modify or delete any of the modules. In a system template, you can also add, modify or delete any of the emitters. Templates are just there to jumpstart your creativity and give you something that you can work with immediately.
System Wizard
To create a new Niagara system, right-click in the Content Browser and locate FX > Niagara System.
The System Wizard offers the following options for creating a new system:
New system from selected emitters: If you select this option and click Next, a list of available emitters displays. This list includes both existing emitters in your project and template emitters. Select the emitters you want to include in the new system, and click the green Plus sign icon (+) to add them. Then click Finish to create the system. If you choose an existing emitter, the system will inherit from those emitters. If you choose a template emitter, the system will have no inheritances. Also, the template emitter is an instance that can either be strictly local to that system, or you can save it as a separate emitter asset.
New system from a template or behavior example: If you select this option and click Next, you can choose from a list of templates or behavior examples that represent several commonly used effect systems. As with emitter templates, this list can be curated by art leads or creative directors. If you are new to UE, this option will give you an example of how FX systems are built in Niagara. Behavior Examples are simple examples designed to feature one, isolated aspect of the Niagara system you can examine to help demystify that behavior.
Copy existing system: If you select this option and click Next, a list of existing systems displays. Choose one of them to copy, then click Finish.
Create empty system: If you select this option, your system contains no emitters or emitter templates. This option is useful if you want to create a system that is totally different from your other systems.
Emitter Wizard
To create a new emitter, right-click in the Content Browser and select FX > Niagara Emitter.
The Emitter Wizard offers the following options for creating a new emitter:
New emitter from a template: If you select this option, you can choose from a list of templates that present several types of commonly used effects. In a large development studio, art leads or creative directors can curate the list of templates, ensuring that the company’s best practices are baked into the templates. These templates also offer a great starting place if you are new to UE.
Inherit from an existing emitter: If you select this option, you can create a new emitter that inherits properties from an existing emitter. This option makes the new emitter a child of the existing emitter you selected. If you need many emitters that all have certain properties in common, this is a good option to choose. You can make changes to the parent emitter, and all child emitters will reflect those changes. You must select a parent emitter to use this option.
Copy existing emitter: If you select this option, you can create a new emitter that is a copy of an emitter you already created. This can be useful if you need to create several similar emitters. Click Next after selecting this option, and a list of available emitters displays. You can then choose which one you want to copy.
Niagara VFX Workflow
Create Systems
First create a Niagara System in which you can add one or more emitters. You can then set up the properties of each emitter.
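Once a system asset exists, it can also be spawned from C++ at runtime. The sketch below assumes the Niagara plugin is enabled; the "/Game/FX/FX_Sparks" path is a placeholder, not a real asset.

```cpp
// Hedged sketch: spawning a finished Niagara system at runtime.
#include "NiagaraFunctionLibrary.h"
#include "NiagaraSystem.h"

void AMyEffectActor::SpawnSparks()
{
    if (UNiagaraSystem* Sparks = LoadObject<UNiagaraSystem>(nullptr, TEXT("/Game/FX/FX_Sparks.FX_Sparks")))
    {
        // Fire-and-forget burst at this actor's location.
        UNiagaraFunctionLibrary::SpawnSystemAtLocation(GetWorld(), Sparks, GetActorLocation());
    }
}
```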
Create or Add Emitters
In the Niagara Editor, you can adjust your emitter by changing the properties of the modules already in it, or add new modules for the desired effect. You can also copy emitters and add multiple emitters into a single Niagara system. For an example of this, see the Sparks tutorial.
Create or Add Modules
In your emitter, you can add existing modules from Niagara by clicking on the Plus (+) of the group where you want to add the module. Niagara comes with a lot of pre-existing modules, and for the majority of circumstances you will be able to create your effects without needing to do any custom module design.
However, if you want to create your own modules, it can be helpful to understand how the data flows through a module.
Data flow through a module.
Modules accumulate to a temporary namespace, then you can stack more modules together. As long as they contribute to the same attribute, the modules will stack and accumulate properly.
When writing a module, there are many functions available for you to use:
Nodes that make boilerplate functions easier
Once you create a module, anyone else can use it.
Modules all use HLSL. The logic flow is as follows:
HLSL logic flow.
Remember that each module, emitter and system you create uses resources. To conserve resources and improve performance, look through the modules already included in Niagara to see if you can accomplish your goal without creating a new module. Dynamic Inputs can be used to great effect here.
Niagara Paradigms
Inheritance
With a flat hierarchy, you cannot effectively locate and use the assets you already have in your library, which leads to people recreating those assets. Duplication of effort lowers efficiency and increases costs.
Hierarchical inheritance increases discoverability and enables effective reuse of existing assets.
Anything inherited can be overridden for a child emitter in a system.
Modules can be added, or can be reverted back to the parent value.
This is also true with emitter-level behaviors such as spawning, lifetime, looping, bursts, and so on.
Dynamic Inputs
Dynamic inputs are built the same way modules are built.
Dynamic inputs give users infinite extensibility for inheritance.
Instead of acting on a parameter map, dynamic inputs act on a value type.
Any value can be driven by Graph logic and user-facing values.
Dynamic inputs have almost the same power as creating modules, but can be selected and dropped into the stack without actually creating new modules.
Existing modules can be modified and customized in many ways by using and chaining Dynamic Inputs; this can reduce module bloat and improve performance.
Micro Expressions
Any inline value can be converted into an HLSL expression snippet.
Users can access any variable in the particle, emitter, or system, as well as any HLSL or VM function.
This works well for small, one-off features that do not need a new module.
Events
Events are a way to communicate between elements (such as particles, emitters, and systems).
Events can be any kind of data, packed into a payload (such as a struct) and sent. Then anything else can listen for that event and take action.
Options you can use:
Run the event directly on a particle by using Particle.ID.
Run the event on every particle in a System.
Set particles to spawn in response to the event, then take some action on those particles.
Events are a special node in the graph (structs). How to use the Event node:
Add whatever data you want to it.
Add an Event Handler into the Emitter stack.
Set the options for the Event Handler.
There is a separate execution stack for events.
You can put elaborate graph logic into Event Handlers.
You can have a whole particle system set up, with complex logic, and then have a whole separate set of behaviors that occur when the event triggers.
Data Interfaces
There is an extensible system to allow access to arbitrary data.
Arbitrary data includes mesh data, audio, external DDC information, code objects, and text containers.
Data interfaces can be written as plugins for greater extensibility moving forward.
Users can get any data associated with a skeletal mesh by using a skeletal mesh data interface.
Houdini
Using Houdini, you can calculate split points, spawn locations, impact positions, impact velocity, normals and so on.
You can then export that data from Houdini to a common container format (CSV).
You can import that CSV into Niagara in your UE4 project.
Lighting the Environment
Topics that demonstrate features and tools for lighting scenes.
Choose your operating system:
A significant part of building any virtual world is determining how it will be lit. This ranges from efficiently lighting small enclosed scenes with many small lights to lighting large worlds primarily lit by a single dominant light source. The engine provides the tools and lighting options necessary to achieve the results your project demands.
The topics on this page contain information on the various lighting features and tools available, along with guides that create a learning path through the ins and outs of lighting a scene in Unreal Engine.
New UE5 Lighting Features
A high level overview of Lumen’s dynamic global illumination and reflections features.
An overview of technical capabilities and specifications of the Lumen Global Illumination and Reflections system.
Overview of using high resolution shadowing designed with film-quality assets and large dynamically lit open worlds in mind.
Lighting Essentials
The available types of lights to choose from and how their mobility settings affect lighting in the scene.
An overview of the various properties and features that lights support.
Rendering subsystem including lighting and shadowing, materials and textures, visual effects, and post processing.
Lighting Features and Tools
Components and tools that enable users to build immersive worlds with environment lighting from fog, clouds, sky and atmosphere.
A collection of topics on the global illumination options available to choose from.
A high level overview of Lumen’s dynamic global illumination and reflections features.
An overview of Mesh Distance Fields and its available features that you can use when developing your games.
A collection of topics around the supported features of ray tracing and path tracing.
An overview of available shadowing methods and the properties they support.
Overview of using high resolution shadowing designed with film-quality assets and large dynamically lit open worlds in mind.
Systems for how reflections are captured and displayed on reflective surfaces.
Lighting Tools and Plugins
A collection of tools and plugins that are useful for lighting scenes.
An overview of using screen space shadowing to harden shadow contact points.
Information on using physics capsules for dynamic soft shadowing of Skeletal Meshes.
General
Effects applied to the whole rendered scene prior to being rendered.
An overview of the volumetric fog and lighting options available with the Exponential Height Fog Component.
An overview of using a Directional Light to simulate light scattering through the atmosphere.
This page explains how to use transparency in your Materials.
Guide for using the Bump Offset node in your Materials.
An overview of setting up and using IES textures with lights.
A Blueprint tool to quickly set up product visualization using an HDR image projection with real-time lighting and shadowing.
Materials Tutorials
Tutorials demonstrating various aspects of Material creation in Unreal Engine.
Choose your operating system:
The following step-by-step guides walk you through a wide range of topics designed to help you create and use Materials in your projects.
Starting Out
Guide for placing Material Expressions and Functions inside the Material graph.
Guide for setting up and using the Main Material Node.
A guide for previewing Materials and applying them to Actors.
How to use commenting, reroutes, and other strategies to organize Material Graphs and keep them easy to read and edit.
Essential Material Workflows
These pages continue to introduce foundational principles in Unreal Material creation. Material Instances provide a way to quickly and easily customize Materials, and the rest of the tutorials in this section provide information about how to create a few common Material types.
Guide for setting up and using Material Instances.
A guide to the process of creating and using Material Functions.
This page describes how to use Texture Masking in your Materials.
A tutorial for using the Emissive Material input inside the Material Editor.
This page explains how to use transparency in your Materials.
A guide for using Refraction in your Materials.
A guide for making shiny Materials.
Doing More with Materials
The tutorials in this section move further into the principles of Material logic, introducing workflows and techniques to create more sophisticated and interesting Materials.
A guide for animating UV Coordinates in a Material.
Guide for using the Bump Offset node in your Materials.
A guide for using dual Normal Maps in your Clear Coat Materials.
Guide for setting up and using Colored Translucent Shadows.
A guide for using detail textures in Materials to improve appearance at close distances.
A guide to working with and using mesh decals in your projects.
An introductory document on the Layered Materials technique in Unreal Engine.
Description and technical details of the Subsurface Profile shading model available in Materials.
Guide for using the Sub-Surface Scattering shading model in your Materials.
Putting It All Together
This page explains how to use the Fresnel Material node.
This page describes how to set up Radial Motion Blur in a Material.
Animation In Lyra
An overview of the Animation System in Lyra
Choose your operating system:
Lyra's character animations were created almost entirely in Blueprints using Unreal Engine 5's improvements to the Animation Blueprint system. The system setup is inspired by both Paragon and Fortnite, which achieve similar results through the use of custom C++ functionality.
Asset Overview
The AnimBP_Mannequin_Base Animation Blueprint contains the AnimGraph window, which you can use to observe the architecture of Animation Nodes that contribute to the final output pose of the Character Mannequin. You can navigate to this Animation Blueprint by clicking on the Content Drawer > Characters > Heroes > Animations > AnimBP_Mannequin_Base.
The Mannequin Base Animation Blueprint’s AnimGraph and Blueprint Thread Safe Update Animation function.
Blueprint ThreadSafe Update Animation
The Animation Fast Path helps you keep value-calculating processes off the Game Thread. You can enable this from the editor by navigating to Edit > Project Settings > Engine > General Settings > Anim Blueprints, then enabling Allow Multi Threaded Animation Update.
You can view the function responsible for gathering the animation data and processing these calculations by opening the Class Defaults of the AnimBP_Mannequin_Base, then navigating to the My Blueprint > Functions category and clicking the BlueprintThreadSafeUpdateAnimation function.
When using Thread Safe functions, you cannot access data directly from game objects as you can in the Event Graph. For example, copying a gameplay float value such as the Character's speed directly would not be considered thread-safe, so we recommend using Property Access to accommodate these instances.
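As a rough C++ analogue of this pattern, a custom UAnimInstance can cache game-object data on the game thread, then compute from that cache in the thread-safe update. CachedVelocity and GroundSpeed below are illustrative member variables, not Lyra names.

```cpp
// Hedged sketch of the thread-safe update split in a UAnimInstance subclass.
#include "Animation/AnimInstance.h"

void UMyAnimInstance::NativeUpdateAnimation(float DeltaSeconds)
{
    Super::NativeUpdateAnimation(DeltaSeconds);

    // Game thread: touching the owning pawn is safe here.
    if (const APawn* Pawn = TryGetPawnOwner())
    {
        CachedVelocity = Pawn->GetVelocity();
    }
}

void UMyAnimInstance::NativeThreadSafeUpdateAnimation(float DeltaSeconds)
{
    Super::NativeThreadSafeUpdateAnimation(DeltaSeconds);

    // Worker thread: compute from cached values only, never from game objects.
    GroundSpeed = CachedVelocity.Size2D();
}
```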
Anim Node functions
In Lyra, state-specific logic is created by using Anim Node Functions. This has the benefit of keeping the animation logic organized. If you need to calculate a value while the character is in the Idle animation, you can put that logic into the Idle state. To see an example, follow the steps below:
Navigate to the AnimBP_Mannequin_Base > Anim Graph and double-click the LocomotionSM State Machine to open a window that displays the Locomotion states.
The Locomotion State Machine includes State Aliases to transition between different Animation states.
You can double-click the Idle state and select the Output Animation Pose node, then under Functions in the Details panel, you can see the Anim Node Functions that provide a setup for the initial values of our nodes.
In our example image, we opened the Idle state machine to view the Anim Node functions used in its Output Animation Pose.
On Initial Update
Called before the node is updated for the first time.
On Become Relevant
Called when the node becomes relevant.
On Update
Called when the node is updated.
Navigate to My Blueprint > Functions > State Node Functions and double-click the UpdateIdleState function to view the logic used to calculate the final output pose of the Idle State locomotion node.
In previous Engine versions, Legacy state machine events are fired after the animation update.
State Aliases
As your projects begin to grow in size, you may have multiple animation states that your characters need to transition to. This can result in a State Machine with multiple transitional lines that can make it difficult to view in the Graph. State Aliases are used to simplify the transition logic while providing control over each individual transition between states. In Lyra, you can view an example of a State Alias being used by navigating to the AnimBP_Mannequin_Base > Anim Graph > LocomotionSM graph, then select the JumpSources state node.
The Locomotion State Machine graph with the Jump Sources node highlighted to observe the State Aliases available.
In the Details panel, you can view the Locomotion States which can directly transition to the Jump state.
When a Lyra Character is Idle, and the Player uses the Jump action, then the Lyra Character will enter into a jumping state. Eventually, they will transition into a falling state and then will either enter back into a cycle or idle state.
Upper/Lower Body Layering
Blend nodes are used to blend animations together. Most of the locomotion animations used in Lyra are full body, meaning that the animation plays on the entire skeleton (like the jog_fwd animation). They are then combined with a variety of upper-body actions that the player can use at any time (such as weapon fire or reload animations).
This is achieved through using the Layered blend per bone node, which you can view by opening the AnimBP_Mannequin_Base > AnimGraph, then navigating to the comment Upperbody/lowerbody split.
When you select the Layered blend per bone node, you can view the Details panel which includes Blend Masks that provide explicit control over the weight of individual bones involved with a blend.
Linked Layer Animation Blueprint
The Animation Blueprint Linking system enables dynamic switching between different sub-sections on the Animation Graph. The main Animation Blueprint has multiple places where you can override the pose through Linked Layer Animation Blueprints. In Lyra, this means that depending on which weapon the player is holding, you can have different locomotion behavior, animation assets, or pose corrections. You can keep their functionality separate and allow multiple users to work on the animation simultaneously, or reduce dependencies between assets while still sharing the same core functionality.
Anim Layer Interface
ALI_ItemAnimLayers is an Anim Layer Interface that specifies where you can override an animation in the Animation Blueprint. In Lyra, this is done for locomotion states in addition to layers for aiming and skeletal controls.
The FullBody_Aiming animation layer, which is a part of the Item Anim Layers interface.
ABP_ItemAnimLayersBase is the base Linked Layer Animation Blueprint that all of the weapons use. You can access this Blueprint from Content > Characters > Heroes > Mannequin > Animations > LinkedLayers.
Accessing Data From the Main AnimBP
Inside of the ABP_ItemAnimLayersBase Animation Blueprint there is a custom function, Get Main Anim BPThreadSafe, that is used to get a reference to the main Animation Blueprint (AnimBP_Mannequin_Base).
The Get Main Anim BPThreadSafe function as it appears in the Item Anim Layers Base Animation Blueprint.
This enables the use of Property Access to read all of the main Animation Blueprint's data, and avoids having to re-calculate any values the linked layer may use, like Acceleration or Velocity.
Using Anim Node Functions for animation selection
In Lyra, Linked Anim Layers use Property Access along with Anim Node Functions to run logic when an animation updates (On Update) or becomes relevant (On Become Relevant).
In the example below, we are choosing a directional start animation every time the animation becomes relevant.
Linked Layer Child Animation Blueprint
In Lyra, every weapon has a Child Animation Blueprint that inherits from the ABP_ItemAnimLayersBase. Animators can slot in animations and edit any variables per-weapon as seen in the image of the ABP_PistolAnimLayers Animation Blueprint shown below.
Distance Matching and Stride Warping
Distance Matching adjusts the playrate of an animation in instances where it is difficult to match the motion between Animation assets and Gameplay, for example locomotion animation assets like starts, stops, and landing animations.
Stride Warping is used to dynamically adjust the length of the character’s stride in instances where the playrate adjustment will not, such as when the character enters into the Jog state.
By combining both of these techniques, you can dynamically choose to favor one technique over the other. During start states, we begin with using Distance Matching to preserve the pose, then blend in by using Stride Warping as we approach the Jog state.
Orientation warping
Orientation Warping uses the angle of a Character's motion to bend the lower body of the Character to match that angle. In Lyra, strafe animations are created for only four cardinal directions; because the player can move with 360 degrees of freedom, we use Orientation Warping to procedurally adjust the pose. This technique is also used during starts, because we have limited animation coverage.
Turn in place
In Lyra, the Character Actor is oriented to the Controller's yaw. To minimize foot sliding, the rotation is countered by a Rotate Root Bone node, which is driven by the Root Yaw Offset.
This additional offset is applied when passing the yaw values to the Aim Offset in the FullBody Aiming layer. Depending on what action the player is doing, the Root Yaw Offset changes its mode to one of the states in the table below.
Accumulate
During idles, Accumulate will completely counter the Actor’s rotation.
Hold
During starts, Hold will preserve whichever original offset the animation started with.
Blend Out
When going into a jog cycle, Blend Out will smoothly blend the offset out and follow the default "orient to controller" behavior.
This is done with a Request Root Yaw Offset Mode function that each of the states can call when needed.
When a character is idle and the offset mode is set to Accumulate, then we want to apply the animation’s root rotation to the Root Yaw Offset. This information is baked into curves using the Turn Yaw Anim Modifier, and it requires the source animation to have Root Motion enabled.
Additional Notes
Gameplay Tag Bindings
Lyra uses the Gameplay Ability System for most of the player’s actions. You can respond to these events in the Animation Blueprint by using Gameplay Tag Bindings. You can navigate to the Gameplay Tags inside the AnimBP_Mannequin_Base Blueprint from the Class Defaults > Details > Gameplay Tags > Gameplay Tag Property Map.
Montages
Montages have been updated to support Blend Profiles and Inertialization.
Enabling both simultaneously is currently not supported in 5.0.
Notifies
Additional Animation Notify information has been exposed to State Machine transitions. In Lyra, we use this to control precise timing on when we can transition out of specific states.
Debugging
Using the Pose Watch Manager, you can add pose watches to specific points on the graph to inspect the runtime poses and quickly navigate to them. Open it from Window > Pose Watch Manager.
Sequencer Basics
Get started making cinematics and animations with Sequencer in Unreal Engine.
This page provides a beginner’s overview of the Sequencer tool for anyone who is starting to learn about Cinematics and Unreal Engine.
What is Sequencer?
Sequencer is Unreal Engine’s cinematic toolset where you can directly animate characters, cameras, properties, and other Actors over time. This workflow is achieved by providing a non-linear editing environment in which tracks and keyframes are created and modified along a timeline.
How to Create and Open Sequencer
The easiest way to start using Sequencer is to click the Cinematics dropdown menu in the main toolbar and select Add Level Sequence. This will prompt you to create a new Level Sequence asset. Give it a name and click Save.
Once the Level Sequence is created, the Sequencer Editor will open at the bottom of the Unreal Editor window.
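Level Sequences are usually played back from Sequencer or a Level trigger, but they can also be started from code. Below is a hedged sketch using the engine's ULevelSequencePlayer API; the free function and its wiring are assumptions for illustration, and the project would need the LevelSequence module as a dependency:

```cpp
#include "LevelSequence.h"
#include "LevelSequencePlayer.h"
#include "LevelSequenceActor.h"

// Spawns a transient Level Sequence Actor and plays the given sequence asset.
void PlaySequenceOnce(UObject* WorldContextObject, ULevelSequence* Sequence)
{
	ALevelSequenceActor* SequenceActor = nullptr;
	ULevelSequencePlayer* Player = ULevelSequencePlayer::CreateLevelSequencePlayer(
		WorldContextObject, Sequence, FMovieSceneSequencePlaybackSettings(), SequenceActor);

	if (Player)
	{
		Player->Play();
	}
}
```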
How to Create Content with Sequencer
The following beginner’s guides will assist you in learning common actions performed while using Sequencer.
Unreal Insights
Profile your project with Unreal Insights
Unreal Insights helps developers identify bottlenecks and optimize applications for better performance.
Topics
An Overview of Unreal Insights.
An Overview of using the Trace logging framework in Unreal Insights.
An Overview of the Timing Insights Window in Unreal Insights.
An Overview of Memory Insights.
An Overview of Networking Insights, the network performance profiling tool.
Overview of Slate Insights, an extension of Unreal Insights that helps users debug Slate and Unreal Motion Graphics (UMG).
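As a small taste of the Trace framework covered above: once you run the editor or game with tracing enabled (for example, with the -trace=default command-line argument), you can add your own named CPU timing scopes, which then appear as events in the Timing Insights window. A minimal sketch, with a hypothetical function name:

```cpp
#include "ProfilingDebugging/CpuProfilerTrace.h"

void UpdatePathfinding()
{
	// Emits a scoped timing event named "UpdatePathfinding" on this thread's
	// track whenever the "cpu" trace channel is enabled.
	TRACE_CPUPROFILER_EVENT_SCOPE(UpdatePathfinding);

	// ... the work you want to measure ...
}
```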
Level Instancing
An introduction to Level Instancing and how it can be used in your projects
Level Instancing is a level-based workflow that facilitates the porting of non World Partition worlds to the World Partition system. It can be used to create complex streaming strategies using the following features:
Level instancing with transforms using a specifically-designed Actor.
Streaming multiple instances of the same Level at runtime. This was not possible in previous versions of Unreal Engine.
Hierarchical nesting of additional sublevels within a sublevel.
For example, you can use World Partition to populate a map with a large number of Actors. In cases where Actors need to be specifically arranged and reused, Level Instances provide an in-context editing workflow which replaces the old Level view.
Though it is possible to use Level Streaming without using World Partition, Level instances do not automatically have streaming management or streaming strategies outside of a World Partition main world.
Creating Level Instances
New Level instances are generated from selected Actors and can be created in either the Viewport or the World Outliner.
After selecting the Actors, right-click one of them to pull up the context-sensitive menu and select Create from selection.
You will see a dialog box with the following settings:
The type of Level Instance:
Packed Level Instance
Packed Level Instance Blueprint
Enable external Actors to use the One File per Actor system. For more information, see the One File Per Actor documentation.
The type of Pivot the Level instance will have:
Center Min Z: Pivot will be located at the center of the Level Instance at the lowest Z value.
Center: Pivot will be located at the center of the Level Instance.
Actor: Pivot will be located at the center of the specified Actor.
If Actor is selected as the Pivot, use this dropdown to select which Actor will be the pivot point for the Level Instance.
Once you have selected your settings, you will be asked to save the new Level Instance.
This process will create a new ALevelInstance Actor in the level which replaces the previously selected Actors. This Actor represents the Level that was just created and all transforms applied to this Actor will be applied to the Level Instance.
In the World Outliner, the Level Actors will be grayed out since they can’t currently be selected or edited.
Packed Level Instances
A Packed Level Instance is a type of Level instance that tries to group a Level’s static meshes into as few static mesh instances as possible.
There are two types of Packed Level Instances:
Packed Level Instances: The packing process creates a Packed Level Instance centered on the ALevelInstance Actor’s location.
Packed Level Instance Blueprint: The packing process creates a Level instance that can be referenced from a Blueprint.
Actors and Components that are not static meshes are not included in the packing process and will remain separate in the source Level.
Packing a Level Instance does not change the editing workflow when working with the Level.
Packed Level Instances are currently an experimental feature and may change or be removed in later versions of Unreal Engine.
Editing Level Instances
Level instances can be edited in context from the Viewport or the Details panel.
Shown above, editing mode is opened by doing the following:
Select the Level instance in the Viewport.
Right-click to open the context-sensitive menu.
In the Level Instance section of the menu, hover over the Edit option and select the Level you would like to edit.
Editing mode can also be opened from the Details panel:
Select the Level Instance that you would like to edit and click the Edit button in the Details panel.
Level Streaming at Runtime
Embedded Mode
When using Embedded Mode, Level instances that use the One File Per Actor (OFPA) system are discarded and their Actors are added to the World Partition grid at runtime. This is the default runtime mode and the suggested method of streaming Level content. It also means that Level instances exist only within the editor.
Some Actors that do not use OFPA are lost at runtime when using Embedded Mode. For example, the AWorldSettings object of an embedded Level Instance does not exist at runtime as this is a non-OFPA Actor.
It is your responsibility to avoid relying on this or to use Level Streaming mode when needed.
Level Streaming Mode
Level instances that do not use OFPA cannot be embedded in the World Partition grid and will instead use standard Level streaming at runtime. This means that when the ALevelInstance Actor is loaded through its owning World Partition runtime cell, it will load the associated Level.
This method of Level streaming has an added runtime cost, since it adds more Levels to stream. Using a high density of Levels with this streaming mode is not recommended due to the performance impact.
Embedded mode with World Partition: Data layers is a great way to approximate what could previously be done with dynamically loaded Levels.
Data Layers
Level Instances support World Partition: Data layers when using Embedded Mode or Level Streaming Mode. The Actors contained in the Level Instance will inherit the data layers from the ALevelInstance Actor.
Setting Up a Character
A high-level overview of how to set up a basic character or Skeletal Mesh in Unreal Engine.
No matter your game project or genre, it is likely that at some point you are going to need some kind of animated character to move around in your environment. This may be a character that the player controls or may be some AI-driven entity that interacts with the world in some way. Regardless, you are going to need to know how to set such characters up so that they can properly animate in your world. The purpose of this document is to give you a very high-level overview of how this is done while guiding you to dedicated documents and examples for specific details. For our purposes, we will assume that you want to create a character that is controllable by the player in some way.
Throughout this document, we will make references to various scripting operations that can be done with Blueprints. Anything that can be done in Blueprints can also be done in C++, so you should not feel restricted solely to Blueprint visual scripting. The final section of this document contains references to example content showing the setup in both C++ and in Blueprints.
You can also find an example of the Playable Owen Character on the Animation Content Examples page under section 1.10.
Workflow at a Glance
The primary workflow for character setup in Unreal Engine is as follows:
Create your art assets (Skeletal Meshes) and animations, using a 3rd party digital content creation (DCC) package such as 3ds Max or Maya.
Import your Skeletal Meshes and animations into Unreal Engine by creating a new Skeleton asset for new Skeletal Meshes or by reusing an existing Skeleton asset for identical or similar Skeletal Meshes.
Create a PlayerController script or Blueprint to handle inputs from the player.
Create a Blueprint or script for a Character or Pawn to parse inputs and control the actual movement (not skeletal animation) of the character.
Construct the Animation Blueprint for the character.
Create a GameMode script or Blueprint that utilizes your custom PlayerController and any other custom script assets.
Each of these steps will generally require a wide variety of further sub-steps to be completely successful. This list just gives a general idea of the flow. In the following sections, we will go into further detail on exactly what each one of these steps means and how you can apply them.
Creating Art Assets
In many ways, the creation of your art assets may be the most challenging part of the character development process. Generally, there is significant design, modeling, surfacing, rigging, and animation time that must take place long before you even touch the Unreal Engine. While we cannot teach you the nuances of character design and animation, we do have certain tools to help the process along.
Importing Skeletal Meshes
For more information, please see the FBX Import Options Reference and Skeletal Meshes documentation.
Properly importing your Skeletal Meshes into Unreal Engine is a vital step in the process of creating your animated characters. Unreal contains a robust importing system with a variety of options to speed up your import process.
Creating a Player Controller
The PlayerController is a special type of script or Blueprint whose primary purpose is to parse inputs from the player into events that can drive a character. For instance, it can control how moving the analog stick on a controller upward can cause an event which will eventually be used to push the character forward on the screen.
A PlayerController is already an existing Class within Unreal. In the editor, you can create a new Blueprint with a parent class of PlayerController, and then use this to set up your own events that will take place upon inputs from the player.
For an example of a custom Blueprint PlayerController, you can start a new project within the editor (File > New Project) and check out the Blueprint Top Down template. All of the Blueprint-based templates contain a PlayerController of some sort (either the default PlayerController or a PlayerController Blueprint), but if you want to see a custom application of a PlayerController, the Blueprint Top Down template is the most straightforward.
Once within the new project, you can search within the Class Viewer for PlayerController, turning off the filters in the Class Viewer. Double-clicking on this asset will open it up and you can see the setup for yourself.
You can also see a PlayerController in C++ script by creating a new project (File > New Project) and choosing the C++ Top Down template.
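As a rough C++ counterpart to those template examples, a custom PlayerController typically overrides SetupInputComponent and turns raw input mappings into calls on the possessed Pawn. The class name and the "Jump" action mapping below are assumptions for illustration:

```cpp
#include "CoreMinimal.h"
#include "GameFramework/PlayerController.h"
#include "GameFramework/Character.h"
#include "Components/InputComponent.h"
#include "MyPlayerController.generated.h"

UCLASS()
class AMyPlayerController : public APlayerController
{
	GENERATED_BODY()

protected:
	virtual void SetupInputComponent() override
	{
		Super::SetupInputComponent();
		// "Jump" must exist as an Action Mapping in Project Settings > Input.
		InputComponent->BindAction("Jump", IE_Pressed, this, &AMyPlayerController::HandleJump);
	}

	void HandleJump()
	{
		// Forward the parsed input event to whatever Character we possess.
		if (ACharacter* PossessedCharacter = Cast<ACharacter>(GetPawn()))
		{
			PossessedCharacter->Jump();
		}
	}
};
```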
Creating a Pawn or Character Blueprint
Once you have set up your PlayerController, your system is now prepared to handle inputs from the player. Now, however, you have to translate those inputs into something that can in turn drive a character around on the screen. That means those inputs need to be translated (or parsed) into actions. This is where the Pawn or Character classes come into play.
Choosing Pawn or Character
You will notice that we mention two potential classes here: Pawn and Character. Both are used for entities in the game that are either controlled by the player or by in-game AI. The key difference is that the Character class is an extension of the Pawn class, adding in player physics, support for a specific mesh, and the general types of handling needed when creating a playable in-game character. For our purposes, we will be using the Character class. For simpler elements that would just need to be driven around the scene by AI, for example, you can generally get away with using a Pawn.
Character Class Setup
Your Character Class is going to start with events that are triggered from the PlayerController, and use scripting (including Blueprint visual scripting) to control what to actually do with those inputs and how they can be used to control the character. For instance, where the PlayerController simply creates a basic event for moving the analog stick on a controller in an upward direction, the Character class is responsible for receiving that event and using it to drive the character forward.
The Character Class also holds a reference to a Skeletal Mesh, which will be the basis for what the player sees while playing the game. In the case of a first-person game, this is often just a pair of floating arms, though there may be a full body if you need that body to shadow the environment properly. For third-person games, the mesh will be the Skeletal Mesh that represents the character.
Motion on a character is generally handled by applying some motion to the physics shape (typically a capsule). This motion also coincides with a MovementMode, an enumeration used to keep track of what a character is doing (for example walking, running, falling, or swimming). This information is later used to drive which animations are played on the Skeletal Mesh.
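Putting those pieces together, a bare-bones Character class might bind a movement axis and feed it to the CharacterMovementComponent, which resolves the right MovementMode for you. This sketch uses the classic axis-mapping input system for brevity (UE5 projects may use Enhanced Input instead), and the class and "MoveForward" mapping names are assumptions:

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "Components/InputComponent.h"
#include "MyHeroCharacter.generated.h"

UCLASS()
class AMyHeroCharacter : public ACharacter
{
	GENERATED_BODY()

public:
	virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override
	{
		Super::SetupPlayerInputComponent(PlayerInputComponent);
		// "MoveForward" must exist as an Axis Mapping in Project Settings > Input.
		PlayerInputComponent->BindAxis("MoveForward", this, &AMyHeroCharacter::MoveForward);
	}

private:
	void MoveForward(float Value)
	{
		if (Value != 0.f)
		{
			// The CharacterMovementComponent converts this into walking,
			// falling, or swimming motion based on the current MovementMode.
			AddMovementInput(GetActorForwardVector(), Value);
		}
	}
};
```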
For an example of a custom Blueprint Character class, you should start a new project within the editor (File > New Project) and choose either of the Blueprint templates for First Person or Third Person. All of the Blueprint-based templates will contain a Character of some sort, though we recommend the First or Third Person templates due to their overall simplicity and the common use of those genres.
Once within the new project, you can search within the Class Viewer for Character, filtering by Blueprints in the Game folder. Double-clicking on this asset will open it up and you can see the setup for yourself.
You can also see a Character in C++ script by creating a new project (File > New Project) and choosing either the First or Third Person Code templates.
Animation Blueprint
You do most of the heavy lifting of hooking up animation to Characters in the Animation Blueprint.
After you have defined the ways in which a Skeletal Mesh Actor moves around in the world in the Character Blueprint, you can start assigning specific animations based on those movements (such as velocity) in the Animation Blueprint.
Animation Blueprints are by far the most sophisticated aspect of character setup. This is where all of your data comes together to actually cause your Skeletal Meshes to perform the appropriate animations. In order to fully understand Animation Blueprints and their power, there are many different animation assets that you should know about, including:
These are just the tip of the iceberg. You would do well to look at the Animation Blueprint documentation, and to also look at some of the Animation Blueprints included in some of our example content, such as the First and Third Person Templates and those found within the Content Examples project.
Once you have created the Animation Blueprint that defines the motion of your character, you will need to make sure you assign it to the Anim Blueprint Generated Class property, found under the Mesh Component of the Character Blueprint. This is necessary because you may have multiple Animation Blueprints for each Skeletal Mesh, and the Character Blueprint needs to know the one into which it will be sending the necessary animation and variable data.
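Normally you assign this in the Character Blueprint's Details panel, but the same assignment can be made from C++ on the Mesh component. In this fragment, AMyHeroCharacter is the hypothetical Character from the earlier sketch and UMyAnimInstance is a hypothetical Animation Blueprint class:

```cpp
#include "GameFramework/Character.h"
#include "Components/SkeletalMeshComponent.h"

void AMyHeroCharacter::BeginPlay()
{
	Super::BeginPlay();
	// Point the Mesh component at the Animation Blueprint class that should
	// drive it; equivalent to setting the property in the Blueprint editor.
	GetMesh()->SetAnimInstanceClass(UMyAnimInstance::StaticClass());
}
```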
GameMode Setup
A GameMode is a special type of class that is used to define your game. Generally speaking, it is going to be a collection of properties used to define what the essential classes for your game are going to be.
For testing out your character, you need to at the very least set up the Default Pawn Class and the PlayerController Class properties.
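In C++, those two properties are typically set in the GameMode's constructor. The pawn and controller classes referenced here are the hypothetical ones from the earlier sketches, so the headers are assumptions as well:

```cpp
#include "CoreMinimal.h"
#include "GameFramework/GameModeBase.h"
#include "MyHeroCharacter.h"      // hypothetical, from the Character sketch
#include "MyPlayerController.h"   // hypothetical, from the PlayerController sketch
#include "MyGameMode.generated.h"

UCLASS()
class AMyGameMode : public AGameModeBase
{
	GENERATED_BODY()

public:
	AMyGameMode()
	{
		// The minimum needed to test a custom character, per the text above.
		DefaultPawnClass = AMyHeroCharacter::StaticClass();
		PlayerControllerClass = AMyPlayerController::StaticClass();
	}
};
```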
World Settings
Once you have set up your GameMode, the last step to being able to play with your custom Character is to make sure that the current Level is using your GameMode. This is done using the World Settings tab, accessible from the Settings button located on the main toolbar.
Within the World Settings, you will want to make sure you have set the GameMode Override to the name of your GameMode class. Once you have done that, you are ready to save and test out your new character!
Summary
So, to summarize the flow of setup back up the chain:
Your Level’s World Settings are used to set which GameMode you are using.
The GameMode specifies which Pawn (Character) Class and which PlayerController Class you will need to play the game.
The Character Class:
Contains the Skeletal Mesh that you imported via FBX.
Takes in data from the PlayerController Class and converts it to movement (not animation).
Stores which Animation Blueprint will be used to drive skeletal animation within its Mesh Component.
The Animation Blueprint:
Takes data from the Character Class into its Event Graph.
Uses that data to drive State Machines, Blend Spaces, and other assets.
Those assets use Animation Sequences (skeletal animation data from FBX files) to animate the character.
The final result of the Animation Blueprint is applied to your Skeletal Mesh so that you can see the in-game character animate.
Included Examples
There are several examples you can check out in the engine to see how these setups are done and try them for yourself. We include both Templates, which are basic project genres you can use to make your own games, and Content Examples, which are pre-constructed examples of content created by artists and technicians here at Epic.
Templates
When you create a new project in Unreal Engine (File > New Project), you are given the option to choose a Template. Virtually all of these will use their own GameModes, Character Blueprints, Animation Blueprints, and every asset mentioned in this document. For purposes of simplicity and clarity, we strongly recommend you check out the First Person or Third Person Templates.
Each of these Templates is available in a code form or a Blueprint form so that you can choose to develop in the way that you are most comfortable. If you are a coder, you will probably want to use a Code Template. If you are a more artistic developer, you will likely be more comfortable exploring a Blueprint Template. Be aware that the two are not mutually exclusive; you can add code classes to a Blueprint Template project, just as you can add new Blueprint classes to a Code Template project!
Content Examples
Content Examples are specialized versions of content designed by artists and technicians here at Epic. They are found within a project named ContentExamples, which can be downloaded by users via the Marketplace. Of particular importance would be the assets found within the Animation map inside the Maps folder, which shows a variety of uses for Skeletal Mesh animation on a character.
Landscape Quick Start Guide
Getting up and running with the basics of the Landscape System in Unreal Engine.
The Unreal Editor Landscape Quick Start Guide walks you through creating a new Landscape, sculpting the Landscape, creating new Materials for the Landscape, and painting those Materials on the Landscape.
The Landscape system inside of Unreal Engine 5 (UE5) is a collection of tools that allow you to create expansive outdoor environments. But before we dive into creating our first Landscape, let us first familiarize ourselves with some of the tools and keyboard inputs that are most commonly used to interact with the Landscape system.
Opening the Landscape Tool and Working with Modes
All of the tools that are used to interact with the Landscape system can be found under the Landscape option that is located in the Modes dropdown menu. To enable the Landscape tools, open the Modes dropdown and choose the option from the menu.
The Landscape tool has three modes, Manage, Sculpt, and Paint, that are accessible by clicking their icons at the top of the Landscape toolbar window. Each mode allows you to interact with the Landscape in a different manner. Here is a very quick rundown of what each mode allows you to do.
Manage mode: Create new Landscapes and modify Landscape components.
Sculpt mode: Modify the shape of the Landscape by sculpting its heightmap.
Paint mode: Paint Material layers onto the Landscape.
Interacting with the Landscape Tools
While each of the three modes within the Landscape tools allows you to interact with the Landscape differently, the keyboard and mouse controls you use are similar. Here is a rundown of some of the most common keys, key combinations, and mouse buttons used when working with the Landscape tool.
Ctrl: Allows you to select Landscape components.
Left Mouse Button: Raises the heightmap or increases the selected layer's weight. For example, in Sculpt mode this raises the Landscape heightmap, while in Paint mode it applies the selected Material to the Landscape.
Shift + Left Mouse Button: Lowers the heightmap or decreases the selected layer's weight. For example, in Sculpt mode this lowers the Landscape heightmap, while in Paint mode it erases the selected Material from a particular section of the Landscape.
Ctrl + Z: Undoes the last action.
Ctrl + Y: Redoes the last undone action.
Creating a New FPS Blueprint Project
Before we begin to create our first Landscape, let's create a new First Person Blueprint project.
Creating a Landscape
First, create a new First Person project if you have not done so already. While you can use other templates for this tutorial, the First Person template will make it a little easier to inspect your Landscape. After choosing the First Person option, click the Next button.
Make sure your project is set up to use Blueprints and contains the Starter Content folder. Choose a location where your project will be stored on your computer and make sure it has a proper name. Finally, click the Create Project button to continue.
Once you have created your new project and the editor has been loaded, create a new level using File > New Level and select the Default Level from the New Level Template.
With your new level now created, select the Floor from the level and press the Delete key to remove it from the level.
Select the Player Start and move it up slightly along the Z-axis so that your player does not start underneath the newly created Landscape.
Once completed, you should now have something that looks similar to the following image.
With the level cleared out and the player start moved up in the Z-axis slightly, it is now time to create a new Landscape. To create a new Landscape, click on the Landscape option in the Modes dropdown menu.
Once you have clicked on the Landscape option, you should see the following set of Landscape tools displayed in the Landscape panel.
When done, you should have something that looks like this.
Sculpting the Landscape is a time-consuming process. All of the tools used for sculpting can be found under the Sculpt tab in the Landscape toolbar. If you would like to know more about what each of the Sculpting Tools does in detail, take a look at the Sculpt Mode page. The common controls listed in the previous section apply here as well.
For this part of the Landscape tutorial, we are going to start with a completely flat section of the Landscape and then build up the details as we go along. The goal here is not to exactly mimic what was created in the tutorial but to get you familiar and comfortable with using the various Landscape tools.
There could be a lot of various reasons as to why what you do in this tutorial does not come out exactly the same as what you see in the following screen shots. Working with the Landscape tools requires a lot of trial and error so your results will vary, sometimes greatly, from what you are seeing in the following set of images. The most important thing to get out of this tutorial is to understand how each of the Landscape tools work and how all the tools work together to give you the final product.
To begin, find a section of the Landscape that you would like to work with. For this tutorial, we are not going to fill in the entire Landscape, just a section of it. For ease of use, set a camera bookmark by pressing Ctrl + 1 on the keyboard. This bookmark makes it easier to gauge how your Landscape is coming along by giving you a camera view to always come back to: at any time during your editor session, pressing the 1 key returns the camera to that exact position.
With the bookmark set, begin painting in the larger details for hills and valleys using the Sculpt Tool. You can find the brush size and strength settings that were used for this step listed below and when completed, you should have something that looks like the following. You can change the value of the Brush Size and Strength either in the Landscape panel or the Landscape toolbar located just above the viewport.
Remember that you use Left Mouse Button to raise the Landscape height and Shift + Left Mouse Button will lower the height of the Landscape.
Once the hills and valleys are blocked out, it is time to use the Smooth Tool to refine their look and feel. Using this tool will smooth your Landscape features and make them seem more natural. Be careful not to smooth away all your features! You can find the brush size and strength settings that were used for this step listed below and when completed, you should have something that looks like the following.
Now that the Landscape is smoothed out, it is time to add some flat mesa like sections using the Flatten Tool. The Flatten Tool captures the height information of the location of your first click and raises/lowers the heightmap to meet that point as you drag around the brush. You can find the brush size and strength settings that were used for this step listed below and when completed, you should have something that looks like the following.
It is time to use the Ramp Tool to add some flat ramps between the mesas. This tool works by designating a start point and an end point for your ramp and then clicking the Add Ramp button to create a flat path between the two points. Each point can be moved in any direction to create a ramp that fits each unique situation. You can find the brush size and strength settings that were used for this step listed below and when completed, you should have something that looks like the following. If it is not very clear where the Ramp was used, it has been highlighted in yellow.
Next, we are going to add some erosion effects to the Landscape to give it a weathered look using the Erosion Tool which works by simulating erosion done by wind. This tool is perfect for shaving away parts of your hills to create mountain peaks and ridges. You can find the brush size and strength settings that were used for this step listed below and when completed, you should have something that looks like the following.
In the next step, we will take the erosion that was just added and push it further by adding some Hydro Erosion to the Landscape. The Hydro Erosion Tool differs from the Erosion Tool in that it simulates how water erodes Landscape details over time. Like the Smooth Tool, be careful not to erode away all your detail. You can find the brush size and strength settings that were used for this step listed below and when completed, you should have something that looks like the following.
To break up the surface of the Landscape even more, we will use the Noise Tool. The Noise Tool adds random noise to the surface of the Landscape by randomly moving Landscape vertices up, down, or both. You can find the brush size and strength settings that were used for this step listed below and when completed, you should have something that looks like the following.
For the final step in the sculpting part of the tutorial, we will re-use the Smooth Tool to smooth out some of the more jagged areas of the Landscape and give it a more natural look. While you might not need to do this step yourself, it was done here to even out areas that appear too deep or that the player might get stuck in if they fall in. You can find the brush size and strength settings that were used for this step listed below and when completed, you should have something that looks like the following.
Folder Setup
Now that we have finished sculpting the Landscape, it is time to add some Materials to it so that it better resembles something from the real world. But before you do this, you need to set up some folders to organize the content that you create and migrate into your project.
Start by creating a new folder called Landscape in your project’s Content folder.
Then inside the Landscape folder, create the following three folders: Materials, Resources, and Textures.
When completed, you should have something that looks like the following.
Migrating Textures
When migrating content between projects, you can end up with additional folders that you do not want. To fix this, select the Textures that you want inside the Content Browser and drag them from their current location into the folder where you want them. This is purely a housekeeping step and has no impact on the outcome of the tutorial.
You can find the textures in the following folder of the Landscapes Content Example project.
/Game/ExampleContent/Landscapes/Textures/
The Textures that you will be Migrating over from the Landscape Content Example project are as follows.
T_LS_Grass_01_D
T_LS_Grass_01_N
T_FullGrass_D
T_FullGrass_N
T_IceNoise_N
Once you have the textures migrated over, make sure they are placed in the Textures folder that was created in the steps above.
Creating the Landscape Material
Creating a Material for our Landscape can be done in the following steps.
Navigate to the Materials folder in the Content Browser.
Right-click in the Content Browser and select Material from the Create Basic Asset list.
Name the newly created Material something that will allow you to easily find it, like Landscape_Material for example.
If you have not already done so, please check out the Materials pages to gain a more in-depth understanding of how materials work inside of Unreal Engine 5.
When this is complete, you will have something that looks like this:
With our new Landscape Material created, open up the Material by Double-clicking on it inside of the Content Browser. When you do, you should see something like this come up on the screen:
It is time to start adding nodes inside of the Material Editor. The first node that you are going to want to create is a LandscapeLayerCoords UV node. This node will help to generate UV coordinates that will be used to map the Landscape Material to the Landscape Actor.
The quickest way to find Landscape-specific nodes is to search for them in the Material Editor's Palette using Landscape as the keyword.
The next Material nodes that we are going to add are the textures for the ground's Base Color and Normal maps. For the snow, we are just going to use a Vector Parameter (V + Left-click) set to an off-white color. To make sure that no Metallic information is used, a Constant (1 + Left-click) of 0 is plugged into the Metallic input. For the Roughness, we use a Scalar Parameter (S + Left-click) so that this value can be tweaked via a Material Instance. Finally, make sure that you connect the LandscapeCoords node to the UVs of each of the Texture Samples. When completed, your node network should look like this:
To add the Texture Sample nodes for the various textures, select the desired texture in the Content Browser and then press T + Left-click in the Material Editor‘s graph to create the node.
After the Material nodes have been added and the LandscapeCoords connected to the textures UVs, it is time to add and setup the Landscape Layer Blend node. The Landscape Layer Blend node allows for the blending of layers via Weight blending, Height blending, or Alpha blending. Weight blending uses each layer’s painted weight to determine which to display. We use Weight blending where we want two surfaces to blend seamlessly into each other, such as rock into sand. Height blending uses the same weight information along with an additional height value taken from the Texture Sample’s Alpha channel and is best used when one material needs to clearly sit on top of another, such as the Grass and Snow sitting on top of the Soil layer. Finally, Alpha blending uses the painted weight information with an Alpha layer to determine the final result.
The following summarizes which Blend Type each layer uses.
When you first place down a Landscape Layer Blend node, it will be blank like in the image below labeled one. To add Layers, you need to select the node in the Material Graph and then in the Details panel, click on the Plus icon that is in-between the word Elements and the Trash Can icon. This icon is highlighted yellow in the image labeled two. How many textures you are using will determine how many Layers you will want to have.
Layer Blend Base Color: the Soil layer uses LB Weight Blend, while the Grass and Snow layers use LB Height Blend. The Snow layer is supplied as a Vector 3 color rather than a texture.
Layer Blend Normal: the Soil layer uses LB Weight Blend, the Grass layer uses LB Height Blend, and the Snow layer uses LB Weight Blend.
For more in-depth information about using the Landscape Layer Blend node or to troubleshoot any issues, please read the Terrain Expressions page.
Once the Layer Blend nodes have been set up, it is time to connect the Texture maps to them. Height blended materials will have both a Layer connection and a Height connection to accommodate the need for the additional height information. When completed, you should have something that looks like the following.
The material connections were colored in Photoshop to help better illustrate where everything needed to be connected. Currently there is no way to change the color of lines connecting Material nodes inside of Unreal Engine 5.
With the Landscape material created, it is time to apply the Material to the Landscape and begin using the Paint tools.
Landscape Painting Prep
Before we can begin painting the Landscape, there is some setup that needs to be done first. Start by applying your Landscape Material to the Landscape:
Find your Material in the Content Browser. This should be located under a folder labeled Materials that was created in the previous section. Click on it so that it is selected.
With the Landscape Material selected in the Content Browser, select the Landscape in the world. Then in the Details panel, expand the Landscape section and look for the Landscape Material input.
Apply the Material to the Landscape by clicking the Use Selected Asset from Content Browser (arrow) icon next to the Landscape Material input. You can also drag the Material asset from the Content Browser into the Details panel and drop it on the Landscape Material input.
When completed, you should have something that looks like this:
If you see black lines in your Landscape, they come from unbuilt lighting. If you rebuild your level's lighting, the black lines will go away.
Now that the Landscape Material is applied, it is almost time to start painting, but before we can do that, we must first create and assign three Landscape Layer Info Objects. If you try to paint before assigning the Landscape Layer Info Objects, you will get the following warning message.
You need to create three Landscape Layer Info Objects, one for each Texture that you want to paint. Here is how you do it:
First, make sure that you are in Landscape Paint mode.
In the Landscape panel, under the Target Layers section, you should see three inputs labeled Soil, Grass, and Snow.
To the right of the names, there is a Plus Sign icon. Clicking that will bring up another menu that will ask what type of layer you would like to add. For this example, pick the Weight-Blended Layer (normal) option.
When you select the Weight-Blended Layer (normal) option, you will be prompted with a pop-up box that is asking you where you want to save the newly created Landscape Layer Info Objects. Select the Resources folder that is under the Landscape folder and then press the OK button.
Once you have completed the first one, repeat the same process for the other two. You should have something that looks like the following:
Now with the Landscape Layer Info Objects created and applied, we can begin to paint our Landscape.
Painting the Landscape
Before you begin to paint the Landscape, here is a recap of the most commonly used keyboard and mouse inputs for painting:
Left Mouse Button: Performs a stroke that applies the selected tool's effects to the selected layer.
Ctrl + Z: Undoes the last stroke.
Ctrl + Y: Redoes the last undone stroke.
The main tool that you will use for applying textures to your Landscape is going to be the Paint Tool. To find out more about all of the tools that you can use to paint on the Landscape, check out the Paint Mode documentation.
To apply a Material to the Landscape, press and hold the Left Mouse Button to apply whatever you have selected to the area under the brush.
To select a new texture to paint, make sure you are in Landscape Paint mode, then under the Target Layers section, click the texture you want to paint with in the list. Whichever texture is highlighted will be painted on the Landscape. In the image below, the Soil layer is highlighted, meaning this is the texture that will be painted onto the Landscape.
When you finish painting, you should have something that looks like this.
Possible Issues and Workarounds
When you first start painting on your Landscape you might run into an issue where the base Material disappears or turns black, like in the following picture:
This happens when there is no Paint Layer data on the Landscape when you first start to paint. To fix this issue, continue to paint over the Landscape generating the Paint Layer data as you go. If you would like to fill in the entire Landscape, first select a large brush size, like 8192.0, pick a layer that you want to use as a base and paint over the entire Landscape once. This will create Paint Layer data and allow you to continue to paint without anything turning black.
Another issue that you might run into is that the scale of the Textures on your Landscape is either too big or too small. To fix this, open your Landscape Material and select the LandscapeCoords node. With that node selected, adjust the Mapping Scale in the Details panel and save the Material. Once the Material is re-compiled, check the scale in the viewport. If the scale is not to your liking, repeat the process above until you get the results you want.
Here is a comparison between a Mapping Scale of 0.5 on the Left and 5.0 on the Right.
While the quick start tutorial above will get you up and running with a Landscape, it barely scratches the surface of what the Landscape tools can do. This section aims to show you some tips and tricks for using the Landscape Tool as well as some external tools that you can use to generate your Landscape.
Tips & Tricks
When using the Paint Tools, you might find it easier to paint over what you would like to erase than to try and erase it using Shift + Left Mouse Button.
When using the Alpha Brush, remember that you can change the pattern the brush uses by selecting a different RGB channel from the Texture Channel dropdown menu. This is very handy because you can pack up to three different Alpha patterns into a single texture.
Landscape compiles shaders separately for each component based on which layers are painted on them. For example, if you have a component with a dirt layer on it but no trace of the grass layer has been painted on it, the textures for the grass layer are left out of the material for that component, making it cheaper to render. So when you do an optimization pass, it can be worthwhile to go over a Landscape and look for components that only have a tiny trace of a given layer and erase them to reduce material complexity.
Another issue to watch out for when painting layers is having too many textures on one component. The Material Editor stats show the limit of how many texture samples you are allowed to use, but for Landscape materials the masks for each layer count as texture samples too and do not show in the stats. If a component starts showing the default texture (grey squares) when you paint a new layer onto it, it has likely gone over the texture sample limit, and either a layer needs to be erased or the material needs to be optimized to use fewer textures.
You can change the LOD Distance Factor for individual Landscape components so they will simplify at closer or further distance thresholds. Things like mountain peaks or anything with a distinct silhouette will LOD most noticeably as you move further away, so you can reduce the LOD bias for those components to preserve their shape. You can also raise the LOD bias for low-detail areas like flat plains that will not look noticeably different with less tessellation.
World Composition
Unreal Engine 5 (UE5) offers the ability to create massive worlds made using Landscape that can easily be managed with the World Composition tool. World Composition was designed to help simplify the management of large worlds, especially those built with the Landscape system. To find out more, please refer to the World Composition documentation.
External Creation Tools
While the default Landscape tools do have the ability to meet all of your sculpting and painting needs, there could be some situations where you might want some extra control over your Landscape’s look and feel. The following is a list of software packages that could help you obtain the results you are looking for if you cannot get them from using the Landscape tools.
Procedural terrain generation in Houdini uses a collection of heightfield nodes that let you layer shapes and add noise to define the look of your digital landscapes. Use advanced erosion tools for more control over details such as fluvial lines, river banks, and debris, and hierarchical scattering for more efficient placement of elements into landscapes. You can then export height and/or mask layers to create terrain in UE5, or wrap your terrain networks into Houdini Digital Assets that open up in UE5 using the Houdini Engine plug-in. When Digital Assets include heightfield nodes, they integrate seamlessly with Unreal Engine's native terrain tools.
While primarily a tool for digital sculpting and painting 3D meshes, Mudbox can also be used to generate heightmap data for your Landscape. You can read more about how Mudbox might be able to help you out with your Landscape by checking out their website.
Terragen is another powerful fully procedural terrain creation software. Much like World Machine, it can be used to build, texture, and export both heightmaps and textures for your Landscape. You can read more about how Terragen might be able to help you out with your Landscape by checking out their website.
World Machine is a powerful procedural terrain creation software. It can be used to build, texture, and export both heightmaps and textures for your Landscape. You can read more about how World Machine might be able to help you out with your Landscape by checking out their website.
ZBrush is another digital sculpting and painting tool that can be used to generate heightmap data for your Landscape. You can read more about how ZBrush might help with your Landscape by checking out their website.
Sources:
- http://docs.unrealengine.com/5.0/en-US/creating-a-new-project-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/tools-and-editors-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/audio-in-unreal-engine-5/
- http://docs.unrealengine.com/5.0/en-US/unreal-editor-interface/
- http://docs.unrealengine.com/5.0/en-US/making-interactive-experiences-and-gameplay-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/large-world-coordinates-in-unreal-engine-5/
- http://docs.unrealengine.com/5.0/en-US/hardware-ray-tracing-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/timing-insights-in-unreal-engine-5/
- http://docs.unrealengine.com/5.0/en-US/virtual-texturing-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/getting-started-in-niagara-effects-for-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/multi-user-editing-overview-for-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/onboarding-guide-for-unreal-engine-games-licensees/
- http://docs.unrealengine.com/5.0/en-US/common-ui-quickstart-guide-for-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/opening-an-existing-unreal-engine-project/
- http://docs.unrealengine.com/5.0/en-US/xr-best-practices-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/how-to-set-up-vehicles-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/level-designer-quick-start-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/unreal-engine-ik-rig/
- http://docs.unrealengine.com/5.0/en-US/overview-of-niagara-effects-for-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/lighting-the-environment-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/unreal-engine-materials-tutorials/
- http://docs.unrealengine.com/5.0/en-US/animation-in-lyra-sample-game-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/how-to-make-movies-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/unreal-insights-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/level-instancing-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/setting-up-a-character-in-unreal-engine/
- http://docs.unrealengine.com/5.0/en-US/landscape-quick-start-guide-in-unreal-engine/