Luis Cataldi - Epic Games
Making VR with Unreal Engine
Recent Developments
At Epic, we drive engine development by creating content.
We use these as real-world test cases to drive our framework and
optimizations to the engine.
New: Forward Shading Renderer with MSAA
Supported forward rendering features include:
• Full support for stationary lights, including dynamic
shadows from movable objects which blend
together with precomputed environment shadows
• Multiple reflection captures blended together with
parallax correction
• Planar reflections of a partial scene, composited
into reflection captures
• D-Buffer decals
• Precomputed lighting and skylights
• Unshadowed movable lights
• Capsule shadows
• Instanced stereo compatible
The forward renderer can be faster than the deferred
renderer for some content. Most of the performance
improvement comes from features that can be
disabled per material. By default, only the nearest
reflection capture is applied without parallax
correction unless the material opts in to High Quality
Reflections; height fog is computed per-vertex; and
planar reflections are applied only to materials that
enable them.
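The forward renderer can also be switched on project-wide. A minimal sketch of the DefaultEngine.ini entries, assuming the 4.14-era setting names (the same toggles live in Project Settings → Rendering):

```ini
[/Script/Engine.RendererSettings]
r.ForwardShading=True
; MSAA only takes effect when the forward renderer is active
r.MSAACount=4
```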
Forward Rendering
Forward rendering is the standard, out-of-the-box rendering technique that most engines use. You supply the
graphics card the geometry, it projects it and breaks it down into vertices, and then those are transformed and split
into fragments, or pixels, that get the final rendering treatment before they are passed onto the screen.
gamedevelopment.tutsplus.com/articles/forward-rendering-vs-deferred-rendering--gamedev-12342
Deferred Rendering
In deferred rendering, as the name implies, the rendering is deferred a little bit until all of the geometries have
passed down the pipe; the final image is then produced by applying shading at the end.
gamedevelopment.tutsplus.com/articles/forward-rendering-vs-deferred-rendering--gamedev-12342
Lighting Performance
Lighting is one of the main reasons for going one route versus the other. In a standard forward rendering
pipeline, lighting calculations must be performed for every vertex and every fragment in the visible scene, for
every light in the scene.
Deferred Rendering is a very interesting approach that reduces the object count, and in particular the total fragment
count, and performs the lighting calculations on the pixels on the screen, thereby using the resolution size instead
of the total fragment count.
gamedevelopment.tutsplus.com/articles/forward-rendering-vs-deferred-rendering--gamedev-12342
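The difference above can be sketched as a toy cost model (an illustration, not engine code): forward shading pays for every light on every rasterized fragment, including overdrawn ones, while deferred shading pays for every light only on the pixels that end up on screen.

```python
# Illustrative lighting-cost model: forward shades every rasterized
# fragment per light; deferred shades every on-screen pixel per light.
def forward_lighting_cost(fragments_shaded, num_lights):
    return fragments_shaded * num_lights

def deferred_lighting_cost(screen_pixels, num_lights):
    return screen_pixels * num_lights

# 1080p with 4x overdraw and 8 lights: deferred does 4x less lighting work.
pixels = 1920 * 1080
fragments = pixels * 4  # overdraw: several fragments per final pixel
ratio = forward_lighting_cost(fragments, 8) / deferred_lighting_cost(pixels, 8)
print(ratio)  # -> 4.0
```

The model ignores the G-Buffer write cost that deferred rendering adds per fragment, which is part of why forward can still win for simple scenes.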
There are advantages to both!
The UE4 Deferred Renderer is a full-featured workhorse, but takes a bit of skill to fully leverage.
Temporal Anti-Aliasing (TAA) limits how sharp your image can be.
The new UE4 Forward Renderer is a specialized renderer with fewer features but a faster
baseline. Multisample Anti-Aliasing (MSAA) is the sharpest anti-aliasing solution.
Instanced Stereo Rendering
Lets us use a single draw call to draw both the left and right eyes, saving
CPU (and some GPU) time.
• Currently works on PC, and coming soon on PS4 and mobile platforms
• Enable it in your project’s settings
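In config form, the project setting mentioned above corresponds to the following DefaultEngine.ini entry (assuming the 4.11+ setting name):

```ini
[/Script/Engine.RendererSettings]
vr.InstancedStereo=True
```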
Instanced Stereo Rendering
Instanced stereo is available now for the deferred (desktop) renderer and the PS4.
We’re also implementing multi-view extension support for the forward (mobile) renderer, and
improving PS4 support to better utilize the hardware.
Hidden And Visible Area Mesh
We draw meshes to cull out geometry
that you can’t see, and only apply post
processing where you can.
For the deferred renderer, the visible
area mesh is a bigger optimization!
This is specific per-platform.
Hidden And Visible Area Mesh
The Hidden Area Mask uses a mesh to early-out on pixels that aren’t visible in the final
image.
Hidden And Visible Area Mesh
The Visible Area Mask uses a mesh to constrain our post processing to the visible pixels
only.
Camera Refactor
As of 4.11, we’ve completely rewritten
the camera system in order to make
developing much easier!
• Camera Components now move
exactly as the real HMD is moving
• You can attach components (meshes,
UI, etc) directly to the camera
component!
Platform Support
As of 4.12, we support the following platforms out of the box:
• Oculus Rift
• Steam VR (including the HTC Vive)
• PSVR
• OSVR (preview)
• Samsung Gear VR
• Google Daydream
• Leap Motion
Platform Support
Create once, and deploy anywhere.
Mobile
Desktop / Console
Oculus Rift
HTC Vive / Steam VR
PSVR
OSVR
Samsung Gear VR
Google Daydream
Platform Support
All of these platforms go through UE4’s
common VR interfaces, so you can make
your content once, and deploy it anywhere.
• Unified Camera System
• Motion Controller System
• Optimized rendering paths
• Low-latency optimizations
VR Project Template
We’ve added a new project template designed for Virtual Reality on desktop and console.
The template can be selected from the Blueprint tab as a new project is created.
VR Project Template
The motion controller template provides examples for object interaction and manipulation as
well as point based teleportation.
The VR Editor
New: Contact Shadows
Contact shadows allow for highly detailed dynamic shadows on objects.
The Contact Shadows feature adds a short ray cast in screen space against the depth
buffer to know whether a pixel is occluded from a given light or not. This helps provide
sharp detailed shadows at the contact point of geometry.
New: Automatic LOD Generation
Unreal Engine now automatically reduces the polygon count of your static meshes to create LODs!
Automatic LOD generation uses what is called quadric mesh simplification. The mesh simplifier will calculate
the amount of visual difference that collapsing an edge by merging two vertices would generate. It then picks
the edge with the least amount of visual impact and collapses it. When it does, it picks the best place to put
the newly merged vertex and removes any triangles which have also collapsed along with the edge. It will
continue to collapse edges like this until it reaches the requested target number of triangles.
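The greedy loop described above can be sketched in a few lines. This is a toy illustration, not Epic's implementation: it uses squared edge length as a stand-in for "visual difference", whereas UE4's simplifier uses a quadric error metric, but the shape of the algorithm is the same — score edges, collapse the cheapest, drop degenerate triangles, repeat until the target triangle count is reached.

```python
# Toy greedy edge-collapse simplification (squared edge length stands in
# for the quadric "visual difference" cost used by the real simplifier).
import itertools

def collapse_cost(va, vb):
    # Cheap proxy for visual impact: shorter edges matter less.
    return sum((a - b) ** 2 for a, b in zip(va, vb))

def simplify(vertices, triangles, target_tris):
    verts = dict(enumerate(tuple(v) for v in vertices))
    tris = [tuple(t) for t in triangles]
    while len(tris) > target_tris:
        edges = {tuple(sorted(e)) for t in tris
                 for e in itertools.combinations(t, 2)}
        a, b = min(edges, key=lambda e: collapse_cost(verts[e[0]], verts[e[1]]))
        # Merge b into a, placing the surviving vertex at the edge midpoint.
        verts[a] = tuple((p + q) / 2 for p, q in zip(verts[a], verts[b]))
        del verts[b]
        tris = [tuple(a if v == b else v for v in t) for t in tris]
        # Remove triangles that collapsed along with the edge.
        tris = [t for t in tris if len(set(t)) == 3]
    return verts, tris

# A quad made of two triangles reduces to a single triangle:
quad = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
_, reduced = simplify(quad, [(0, 1, 2), (0, 2, 3)], target_tris=1)
print(len(reduced))  # -> 1
```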
New: Improved Per-Pixel Translucent Lighting
New: Reflection Capture Quality Improvements
New: Full Resolution Skin Shading
UE4.11: Realistic Eye Shading
UE4.11: Realistic Hair Shading
UE4.11: Realistic Cloth Shading
UE4.12: Cinematic Cameras and Cinematic Viewports
All New Audio Engine for Unreal
What’s Next
All New Animation Tools
All New Mesh and Authoring Tools
And much more!
How can we learn to
harness the power of
Unreal Engine?
VR Learning Resources for Unreal Engine:
Starting Out:
• Oculus Quick Starts
• SteamVR Quick Start
• Google VR Quick Start
• Gear VR Quick Starts
VR Platforms:
• Samsung Gear VR Development
• Google VR Development
• Oculus Rift Development
• SteamVR Development
VR Topics:
• VR Cheat Sheets
• VR Best Practices
• Motion Controller Component Setup
• VR Camera Refactor
VR Learning Resources for Unreal Engine:
Video:
• 2015 UE4 - VR and Unreal Engine
• Making Bullet Train and Going off the Rails in VR
• VR Bow and Arrow Tutorial w/ Ryan Brucks - Training Livestream
• Training Livestream - Sam and Wes' VR Stream: Cameras, Multiplayer, Tips and Tricks!
• Creating Interactions in VR with Motion Controllers 1-3
• Setting Up VR Motion Controllers
• VR Networking and 3D Menus
• Up and Running with Gear VR
• Developing for VR
• Integrating the Oculus Rift into UE4
Presentations:
• UE4 VR - Niklas Smedberg
• Lessons from Integrating the Oculus Rift into UE4
• Going Off The Rails: The Making of Bullet Train
Links:
• Sam Deiter - 10 VR tips for Unreal Engine
• Tom Looman’s - Getting Started with VR in Unreal Engine 4
VR Learning Resources for Unreal Engine:
VR Editor Starting Out:
• Activating VR Mode
VR Editor Guides:
• Setting Up VR Editor from GitHub
• Navigating the World in VR Mode
• Working with Actors in VR Mode
VR Editor Reference:
• VR Editor Controls
• Quick Select Menu
• Radial Menu
• Transforming Actors in VR
• Editor Windows in VR Mode
The Unreal Engine Release Notes:
• Unreal Engine 4.14 Release Notes
• Unreal Engine 4.13 Release Notes
• Unreal Engine 4.12 Release Notes
• Unreal Engine 4.11 Release Notes
• Unreal Engine 4.10 Release Notes
• Unreal Engine 4.9 Release Notes
• Unreal Engine 4.8 Release Notes
• Unreal Engine 4.7 Release Notes
• Unreal Engine 4.6 Release Notes
• Unreal Engine 4.5 Release Notes
• Unreal Engine 4.4 Release Notes
• Unreal Engine 4.3 Release Notes
• Unreal Engine 4.2 Release Notes
• Unreal Engine 4.1 Release Notes
Mitchell McCaffrey’s - Mitch VR Labs
• Mitch's VR Lab - an Introduction
• Mitch's VR Lab - Look Based interaction
• Mitch's VR Lab - Simple Teleportation Mechanic
• Mitch's VR Lab - Introduction to SteamVR
• Mitch's VR Lab - Simple Head IK
• Mitch’s UE4 Forum Post
Education Community VR for UE4:
Free Unreal Engine Courses:
• Twin Stick Shooter
• 3rd Person Power-Up Game with C++
• 2D Sidescroller with Blueprints
• Endless Runner with Blueprints
• Unreal Match 3 Game
Education Community VR for UE4:
Free UE4 Community Youtube.com Learning Channels:
• World of Level Design UE4 Fundamentals
• Virtus Education series
• Unreal Engine 4 Beginner Tutorials
• Mathew Wadstein Tutorials
• Leo Gonzales Unreal Basics
• Tesla Dev Tutorials
• UE4 Style Guide
Free UE4 Community Blueprints:
• Communication Training - Zak Parrish
• Blueprints Compendium - VOLUME II
• BP_Compendium.pdf
• Network Compendium
Free UE4 Community VR Learning Channels:
• Unreal Engine VR Curriculum
Free UE4 Community ArchViz Learning Channels:
• Architectural Visualization Tutorials
Education Community VR for UE4:
Paid Elearning Courses:
• Unreal Engine 4: The Complete Beginner's Course
• Learn to Code in C++ by Developing Your First Game
• Complete Introduction to Unreal 4
• An Introduction to Creating Realistic Materials in UE4
• Master Blueprints in Unreal Engine 4 - Endless Runner
• Create a Helicopter Game Control System in Unreal Engine 4
• Unreal Essential Training - Lynda.com
• Unreal: Learn Lighting - Lynda.com
• 3dmotive - Unreal Engine courses
• Pluralsight - Unreal Engine courses
Much more coming soon.
Improved metadata support
● Skill level
● Engine version
● Sitemap filters
● Checkpoints
Learning Resources for Unreal Engine:
Getting the most value from the UE4 Launcher Learn Tab
Learning Resources for Unreal Engine:
Getting the most value from Content Examples
Design Considerations for VR
One of the biggest issues for working in
VR is Motion/Simulation Sickness.
How is it caused?
en.wikipedia.org/wiki/Virtual_reality_sickness
Sensory conflict theory holds that sickness occurs
when a user's perception of self-motion is based on
incongruent sensory inputs from the visual system,
vestibular system, and non-vestibular proprioceptors,
particularly when these inputs are at odds with the
user's expectation based on prior experience.
Five typical causes of Motion/Simulation Sickness in VR
Read more about it
1. Non-forward movements
• Avoid unnatural movement directions
2. Awareness of vection
• When a large part of the visual field moves, the viewer feels as
though they have moved and that the world is stationary
3. The feeling of acceleration
4. Too much camera yaw
5. Sickness is helped by adding a static reference frame
Things we CAN DO in Unreal Engine to
improve VR Games and Experiences
You MUST maintain framerate
For the VR experience to feel smooth, your game needs to run at 75 Hz (Oculus DK2) or even 90
Hz (HTC Vive and Oculus CV1), depending on the device. To see the current framerate, type
“stat fps” or “stat unit” (for a more detailed breakdown) into the console while the game is running.
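The refresh rates above translate directly into a per-frame time budget that the game thread, render thread, and GPU must all fit inside:

```python
# Refresh rate sets the frame-time budget: everything (game thread,
# render thread, GPU) has to finish inside one refresh interval.
def frame_budget_ms(refresh_hz):
    return 1000.0 / refresh_hz

print(round(frame_budget_ms(75), 1))  # Oculus DK2: ~13.3 ms per frame
print(round(frame_budget_ms(90), 1))  # Vive / Oculus CV1: ~11.1 ms per frame
```

This is why "stat unit" matters: it breaks the frame into Game, Draw, and GPU times so you can see which one is blowing the budget.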
Here is where VR Instanced Stereo Can Help
“Basically, we’re utilizing hardware instancing to draw both eyes simultaneously with a single draw call and
pass through the render loop. This cuts down render thread CPU time significantly and also improves GPU
performance. Bullet Train was seeing ~15 – 20% CPU improvement on the render thread and ~7 – 10%
improvement on the GPU.” – Ryan Vance.
To enable this feature in 4.11
and above, go to your Project
Settings and look for
“Instanced Stereo” under the
Rendering category.
Things to keep at the front of your mind:
Check your performance constantly to ensure that you are hitting
your VR performance targets.
Things to keep at the front of your mind:
○ Maintain a very simple approach to making your content.
○ Minimize complex shaders as much as possible.
○ Add detail to the mesh, within reason, in lieu of relying on complex
shaders for surface details.
Things to keep at the front of your mind:
LODs and aggressive culling are a must to ensure that you are
hitting your VR performance targets.
Known issues and possible workarounds:
Screen Space Reflections (SSR)
SSR will work in VR but may not give you the results that you
want. Consider working with reflection probes instead.
Known issues and possible workarounds:
Normal Mapping Issues
When viewing Normal maps on objects in VR, you will notice that they do not have the
impact that they might have once had. This is because normal mapping does not
account for a binocular display or motion parallax. Because of this, Normal maps have
a tendency to look flat when viewed with a VR device.
Known issues and possible workarounds:
Parallax Mapping
Parallax mapping takes Normal mapping to the next level by accounting for depth cues
that Normal mapping does not. A Parallax shader can better display depth information,
making objects appear to have more visually rich detail: no matter the viewing angle, a
Parallax map will always correct itself to show the appropriate depth information from
that viewpoint. The best uses for a Parallax map are cobblestone pathways and fine
surface detail.
Known issues and possible workarounds:
Tessellation Shader Displacement
Tessellation Shader Displacement will displace 3D Geometry in real time by adding
details that are not modeled into the object. Tessellation shaders do a great job of
displaying information because tessellation shaders actually create the missing detail
by creating more vertices and displacing them in 3D Space.
Launching VR Preview:
Testing your VR headset is straightforward: simply select “VR Preview” from the
Play dropdown button. By default, head tracking will work without changes to your
existing project or template.
GPU Profiling:
To capture a single frame with GPU timings, press Ctrl+Shift+, or type “profilegpu” into the
console. This command collects accurate GPU timings; you will often find that certain
processes are a heavy burden on the framerate in VR (Ambient Occlusion is one common
example).
See the GPU Profiling and Performance and Profiling documentation.
Disable Heavy Post-Processing:
Due to the demanding requirements of VR, many of the advanced Post Processing features
that are normally used should be disabled. This needs to be done per-level.
• Add a Post Process (PP) volume to your level if there is not already one there.
• Select the PP volume and enable the Unbound option so that its settings apply to the entire level.
• Expand each section of the Post Process Volume settings and disable any undesired PP feature by ticking its property and
setting the value from the default, usually 1.0, to 0.
• Consider first disabling the biggest offenders to VR performance, like Lens Flares, Screen Space Reflections, Screen
Space Ambient Occlusion, and anything else that might be impacting performance.
• While some of these features can be disabled by settings in your .INI files, doing it in the volume ensures that performance
will not be affected if the .INI is removed by mistake.
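For reference, the project-wide defaults that the last bullet alludes to can be sketched like this in DefaultEngine.ini (assumption: the `r.DefaultFeature.*` names used by UE 4.x; the per-level volume approach still applies on top of these):

```ini
[/Script/Engine.RendererSettings]
r.DefaultFeature.Bloom=False
r.DefaultFeature.AmbientOcclusion=False
r.DefaultFeature.LensFlare=False
r.DefaultFeature.MotionBlur=False
```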
UE4 – Lighting for VR
• Dimmer lights & colors can help reduce simulation sickness.
• Use Static Lighting over Stationary or Dynamic.
• Make sure your Stationary / Dynamic Lights do not overlap.
• Baked lights are the best option for VR environments.
• If using Dynamic Shadows, only have one shadowing light.
• Use Stat LightRendering to see current lighting cost.
• Profile, Profile, Profile to ensure you are maintaining performance goals.
Fake Shadows Wherever You Can!!
Using cheats like a fake blob-shadow drop to simulate dynamic shadows is a good
way to keep a VR project running at frame rate.
Blob shadow example. Image by Eric Chadwick
UE4 – Effects for VR
• Mesh based VFX work the best for VR.
• Camera Facing particles do not hold up well in VR on their own due to the stereoscopic view.
• The Dither Temporal AA Material Function can make Opacity masked objects look like
Translucent ones.
• Local Space rotation does not look correct in VR.
UE4 – Environments for VR
• Use Reflection probes instead of screen space reflections.
• Again… textured blob shadows are a cheap alternative to dynamic shadows.
• The Merge Actors Tool can help cut down on Static Mesh draw calls without having to do work
outside of UE4.
Some very important things we all need
to know about Unreal Engine.
The Unreal Engine
Framework
The Unreal Engine Framework
• GameInstance
• GameMode
• Pawn Class
• HUD Class
• PlayerController Class
• GameState Class
• PlayerState Class
The GameMode is the definition of
the game.
● It should include things like
the game rules and win
conditions.
● It also holds important
information about:
○ Pawn
○ PlayerController
○ GameState
○ PlayerState
The Pawn class is the base class of
all Actors that can be controlled by
players or AI.
● The Pawn represents the
physical location, rotation,
etc. of a player or entity within
the game.
● A Character is a special type
of Pawn that has the ability to
walk around.
A PlayerController is the interface
between the Pawn and the human
player controlling it.
● The PlayerController decides
what to do and then issues
commands to the Pawn (e.g.
"start crouching", "jump").
● Putting input handling or other
functionality into the
PlayerController is often
necessary.
● The PlayerController persists
throughout the game, while the
Pawn can be transient.
The GameInstance is a class whose
state persists across level changes,
game modes, pawns, etc., whereas
classes like GameMode or
PlayerController are reset on level
change and the data stored in those
classes is removed.
A PlayerState is the state of a
participant in the game, such as a
human player or a bot that is
simulating a player. Non-player AI
that exists as part of the game
would not have a PlayerState.
The GameState contains the state
of the game, which could include
things like the list of connected
players, the score, where the
pieces are in a chess game, or the
list of what missions you have
completed in an open world game.
The HUD is the base object for
displaying elements overlaid on the
screen. Every human-controlled
player in the game has their own
instance of the AHUD class which
draws to their individual Viewport.
Base building blocks in the Unreal Engine:
• Object — the base building block in the Unreal Engine
• Actor — any object that can be placed into a level
• Pawn — a subclass of Actor that serves as an in-game avatar
• Character — a subclass of Pawn intended to be used as a player character

Pawn subclasses and their typical components:
• Character — CharacterMovementComponent, CapsuleComponent, SkeletalMeshComponent, etc.
• WheeledVehicle — VehicleMovementComponent, SkeletalMeshComponent, PhysicsHandle, etc.
• DefaultPawn — DefaultPawnMovementComponent, StaticMeshComponent, CollisionComponent, etc.
• SpectatorPawn — a DefaultPawn variant for spectating
A Controller has a 1-to-1 relationship with the Pawn it possesses.
How about programming
interactivity for VR?
Programming VR Interaction with Blueprints
Blueprints in Unreal Engine are a complete visual scripting system based on the
concept of using a node-based interface to create interactions from within the Unreal
Editor.
Programming VR Interaction with Blueprints
Learning Blueprints through Content Examples
Hey!! We need AUDIO for VR too!!
UE4 – Audio for VR
Ambient Sound Actors in VR
The Ambient Sound Actor can be used
for many purposes, such as ambient
looping and non-looping sounds.
Generally, the Ambient Sound Actor
conforms to the real world: the closer
you are to a sound, the louder it
appears.
UE4 – Audio for VR
Sound Properties
You can assign a sound asset from
the Details panel by selecting an
asset from the Sound settings drop-
down menu or by highlighting a sound
asset in the Content Browser and
clicking the button.
UE4 – Audio for VR
Attenuation Properties
Attenuation is the ability of a sound
to decrease in volume as the player
moves away from it.
It is advisable to use Sound
Attenuation objects whenever
possible, if for no other reason than
to give broad control over the
settings for many Actors.
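The falloff behavior a Sound Attenuation object controls can be sketched as follows (an illustration of a linear distance algorithm, assuming full volume inside the Radius and a fade to silence across the Falloff Distance; other falloff curves are available in the engine):

```python
# Linear attenuation sketch: full volume inside Radius, fading to silent
# across Falloff Distance, matching the Radius / Falloff Distance settings.
def attenuate(distance, radius, falloff_distance):
    if distance <= radius:
        return 1.0   # inside the inner radius: full volume
    if distance >= radius + falloff_distance:
        return 0.0   # beyond the falloff band: silent
    return 1.0 - (distance - radius) / falloff_distance

print(attenuate(100, 400, 3600))   # -> 1.0 (right next to the emitter)
print(attenuate(2200, 400, 3600))  # -> 0.5 (halfway through the falloff)
print(attenuate(5000, 400, 3600))  # -> 0.0 (out of range)
```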
UE4 – Audio for VR
New: Stereo Spatialization
3D spatialization is now possible for
stereo audio assets.
The 3D Stereo Spread parameter
defines the distance, in game units,
between the left and right channels
along a vector perpendicular to
the listener-emitter vector.
UE4 – Audio for VR
Audio Volume
Audio Volumes allow you to control
and apply various sounds in your
level as well as provide an avenue
to create compartmentalized audio
zones where you can control what is
heard inside and outside of the
volume.
Additional toolsets in Unreal Engine to enhance VR:
Complete, state-of-the-art suite of AI tools.
Additional toolsets in Unreal Engine to enhance VR:
Complete set of tools for animation and animation retargeting
So What’s Next?
Let’s Build a new VR Project,
VR Pawn and VR Player
Controller!
Begin a new Project with a Third Person Template ✓ Desktop/Console
✓ Maximum Quality
✓ With Starter Content
Bring in new assets ✓ Infinity Blade Grass Lands
✓ Infinity Blade Ice Lands
✓ Audio
Navigate to the Infinity Blade Grass Lands folder ✓ Open the ElvenRuins map
➔ Establish a Profiler workflow
with camera bookmarks
➔ Evaluate the profile data
➔ Address performance issues on
a per-case basis
* Look out for Post Process
related issues
* Look out for lighting and
shadow related issues
* Look out for issues related
to reflections
* Look out for issues related
to transparency
** Consider the tradeoffs for keeping
or changing these elements in
the level
Working with the GPU Profiler
Utilize BumpOffset material nodes to enhance the VR experience
** When viewing Normal maps on
objects in VR, you will notice that
they do not have the same impact.
This is because Normal mapping
does not account for a binocular
display or motion parallax, so
Normal maps will often look flat
when viewed with a VR device.
That does not mean you should not
use Normal maps; it just means you
need to evaluate more closely
whether the data you are trying to
convey in the Normal map would be
better made out of geometry. There
are several techniques that can be
used in place of Normal maps, and
this is a good place to explore
BumpOffset.
The BumpOffset ReferencePlane
✓ ReferencePlane specifies the
approximate height in texture
space at which to apply the
effect. A value of 0 will appear to
distort the texture completely off
the surface, whereas a value of
0.5 (the default) means that some
of the surface will pop off while
some areas will be sunken in.
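The effect of the ReferencePlane can be sketched numerically (based on the documented BumpOffset formula; the default parameter values here are illustrative): UVs are shifted along the tangent-space view direction by how far the sampled height sits from the ReferencePlane.

```python
# BumpOffset sketch: shift UVs along the tangent-space view direction,
# scaled by (sampled height - ReferencePlane) * HeightRatio.
def bump_offset_uv(uv, height, view_dir_ts, height_ratio=0.05,
                   reference_plane=0.5):
    offset = (height - reference_plane) * height_ratio
    return (uv[0] + view_dir_ts[0] * offset,
            uv[1] + view_dir_ts[1] * offset)

# A texel exactly at the reference plane is not shifted at all:
print(bump_offset_uv((0.5, 0.5), height=0.5, view_dir_ts=(1.0, 0.0)))
# -> (0.5, 0.5)
```

Heights above the reference plane push texels toward the viewer ("pop off"), heights below push them away ("sunken in"), which is exactly the 0 vs 0.5 behavior described above.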
Understand the different parts of the Blueprint editor
✓ Components
✓ Menu Bar
✓ Details
✓ Viewport
✓ Construction Script
✓ Graph Editor
✓ My Blueprint: Variables,
Functions, Components
✓ Debug
✓ Compiler Results
Build the VR-Pawn-BP
✓ Scale the Capsule Component
Height to 120
✓ Add a Spring Arm Component
✓ Add a Camera Component
✓ Nest the Camera Component
under the Spring Arm
✓ Zero out the Target Arm Length of
the Spring Arm
✓ Move the Spring Arm up 90 units
in Z
✓ Toggle on the Use Pawn Control
Rotation tick box
Build the VR-Pawn-BP
✓ Move to the Event Graph
✓ Get the InputAxis for LookUp
✓ Connect it to the Add
Controller Pitch Input
✓ Get the InputAxis for Turn
✓ Connect it to the Add
Controller Yaw Input
Create a new GameMode Override
✓ Call the new GameMode
Override something like
VRGameMode
✓ Save it to something like the
Blueprints folder
Change the Default Pawn Class to VR-Pawn-BP
✓ Assign VR-Pawn-BP to the
Default Pawn Class on the
GameMode Override
Create a new PlayerController Class Blueprint in the GameMode Override
✓ Call the new PlayerController
something like
VRPlayerController
✓ Save it in the Blueprints folder
Navigate to the Project Settings ✓ Add some inputs
Create two new Action Mappings
✓ Create a Teleport Action Mapping
and assign it the Middle Mouse
Button and the Gamepad Right
Shoulder Button
✓ Create a Glide Action Mapping
and assign it the Right Mouse
Button and the Gamepad Left
Shoulder Button
Back in the VRPlayerController, build the Glide Linetrace Blueprint graph
✓ In the EventGraph on the
VRPlayerController, bring in the
InputAxis for Glide
✓ Get Player Camera Manager
✓ Pull out the GetActorLocation
✓ Pull out the
GetActorForwardVector
✓ Multiply the GetActorForwardVector
by a float, and promote the
multiplier to a GlideDistance
variable set to something like
1500 units
✓ Create a LineTraceByChannel
✓ Connect the execution pin from
the Glide InputAxis
✓ Connect Start to the
GetActorLocation
✓ Connect End to the sum of the
GetActorLocation and the scaled
forward vector
Create two new Vector Variables
✓ Create a StartTrace Vector
variable
✓ Create an EndTrace Vector
variable
Create a BreakHitResult and connect the new variables
✓ Set EndTrace from the TraceEnd
of the BreakHitResult
✓ Wire through the Set StartTrace,
feeding it from a GetPlayerPawn
into a GetActorLocation node
Create a Timeline called something like GlideCurve
✓ Make a FloatCurve called
something like GlideFloat
✓ Add 2 keys
✓ Set the first key to 0 and 0
✓ Set the second key to something
like 0.9 and 0.9
✓ Tick the Use Last Keyframe
checkbox
Use the Timeline Component to get the SetPlayRate node and adjust the rate
✓ Wire the SetPlayRate through to
the StartTrace
✓ Set the New Rate to 1 divided by a
new variable called RateOfGlide
✓ Set RateOfGlide to
something like 20
✓ Wire this through to the
PlayFromStart of the GlideCurve
Timeline
Add a LERP (Vector) and a SetActorLocation node
✓ Connect the GlideFloat out from
the Timeline Curve to the LERP
Alpha
✓ Connect the StartTrace to A
✓ Connect EndTrace to B
✓ Wire the Timeline to a
SetActorLocation brought in from
the GetPlayerPawn Node
✓ Connect the LERP output to the
Location X and Y but take the Z
Location from the PlayerPawn
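What the LERP wiring above evaluates each timeline tick can be sketched as follows (variable names mirror the Blueprint; the 1/RateOfGlide play rate is our reading of the slide, not confirmed engine behavior):

```python
# Glide graph sketch: lerp the pawn from StartTrace toward EndTrace by
# the timeline's GlideFloat alpha, keeping the pawn's own Z so it does
# not sink into the ground.
def lerp(a, b, alpha):
    return a + (b - a) * alpha

def glide_location(start_trace, end_trace, alpha, pawn_z):
    return (lerp(start_trace[0], end_trace[0], alpha),
            lerp(start_trace[1], end_trace[1], alpha),
            pawn_z)

def glide_play_rate(rate_of_glide=20.0):
    # Larger RateOfGlide values make the glide slower.
    return 1.0 / rate_of_glide

print(glide_location((0, 0, 90), (1500, 0, 0), 0.5, pawn_z=90))
# -> (750.0, 0.0, 90)
```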
Compile and test the Glide Input ✓ Compile
✓ Test
✓ Tweak
Duplicate the Trace with inputs for the Teleport ✓ Copy the first part of the Graph
✓ Paste below
For the Teleport input, change the GlideDistance to a new variable called TeleportDistance ✓ Delete the GlideDistance from the
Teleport trace
✓ Add a new variable called
TeleportDistance
✓ Set it to a distance of 10000
To the Teleport Input, add a DotProduct, CanTeleport Boolean, and Point to Nav
✓ Create a new Variable called
CanTeleport of type Boolean
✓ Set it by dragging out from the
ImpactNormal of the HitResult
into a DotProduct with a vector
whose Z is set to 1, dragging out a
GreaterThan node set to something
like 0.9, and wiring the result into
the CanTeleport node
✓ Wire that through to a Branch
node to set True
✓ Wire all that into the graph
✓ Drag out from Location and
create a
ProjectPointToNavigation
connected to the SetEndTrace
✓ Drag out from it, AddVector, and
connect to the SetActorLocation
after adding 90 to the Z axis to
offset the player height
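The validity test described above boils down to a dot product against world up, with a height offset applied to the final destination (a sketch; the 0.9 threshold and 90-unit offset are the slide's values):

```python
# Teleport validity sketch: dot the hit surface's ImpactNormal with world
# up and require a nearly flat surface before allowing the teleport.
def can_teleport(impact_normal, threshold=0.9):
    up = (0.0, 0.0, 1.0)
    return sum(n * u for n, u in zip(impact_normal, up)) > threshold

def teleport_destination(nav_point, height_offset=90.0):
    # Nudge the projected nav point up so the pawn stands on the floor.
    x, y, z = nav_point
    return (x, y, z + height_offset)

print(can_teleport((0.0, 0.0, 1.0)))  # flat floor -> True
print(can_teleport((1.0, 0.0, 0.0)))  # vertical wall -> False
```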
Let's add a Teleport Preview Sphere
✓ Create a new Actor Class
Blueprint
Make a new Material
✓ Create a New Material called
something like VR-MAT
✓ Change it to a Translucent Blend
Mode with Unlit Shading Model
✓ Add a White Constant3Vector for
the Emissive Color
✓ Wire a Scalar Value, a Fresnel
Node, and a Power node into the
Opacity
✓ Set the Scalar value to something
like 4.5
✓ Apply it to the VR-Preview
Sphere
In the VRPlayerController, let's set up the VR-Preview
✓ Duplicate the Teleport trace
graph
✓ Change the Teleport input to an
Event Tick input
Add a SpawnActor from Class to the new VR-Preview graph
✓ Use Event Begin Play to call the
SpawnActorFromClass node
✓ Assign the VR-Preview-BP to it
✓ Wire the HitResult's Location into
the Spawn Transform
✓ Promote the Return Value to a
variable called something like VR
Preview
Call the PreviewSphere to toggle its Visibility
✓ Use the Branch statement from
the CanTeleport to set up the
Visibility toggle for the VR-
Preview sphere component
✓ Wire the output to the
SetActorLocation node with the
VR-Preview-BP as the target
node to be relocated
✓ Make sure to offset the sphere up
in Z value using the add vector
node
Let's lay in some Localized VR Audio ✓ Navigate to the Audio folder
✓ Find the Bird and Wind sounds
Let's lay in some Localized VR Audio ✓ Drag in the 7 Bird sounds to the
Cue
✓ With the Bird sounds selected,
choose Random from the Palette
to auto-connect them
Let's lay in some Localized VR Audio ✓ Drag in a wind loop sound
✓ Add a Looping Palette node
✓ Wire it all through
✓ Save the node
Let's lay in some Localized VR Audio
✓ Another Mix Cue for the top of the
hill with howling winds
Let's lay in some Localized VR Audio ✓ Set Override Attenuation on the
new Cue
✓ Scale up the Radius
✓ Increase the Falloff Distance
values
Let's build some interactable Blueprints ✓ Make a new Actor Class Blueprint
✓ Call it something like Idol-BP
Assemble the new Idol Blueprints ✓ Bring in an Angel Statue Static
Mesh Component
✓ Bring in a Torch Static Mesh
Component
✓ Bring in a Particle System Fire
Component
✓ Bring in a Sphere Mesh
Component
✓ Duplicate the VR-MAT material
and tint the color to be slightly
yellowish in hue
Right Click to create a new Blueprint Interface
✓ Call the new Blueprint Interface
something like VR-Interact-BPI
Right Click to create a new Blueprint Interface
✓ Make a new BPI Function
called OnLookAt
Let's add an Interact Action Mapping Input ✓ Call it something like Interact
✓ Assign it to the F key on the
keyboard
✓ Assign it to the Gamepad Face
Button Bottom
Add the Interact Action Mapping to the VRPlayerController Blueprint
✓ Duplicate the LineTrace and wire
in the Interact ActionMapping
Wire the OnLookAt BPI into the VRPlayerController
✓ Add an IsValid check to make sure
you are hitting an object, using the
HitResult's Hit Actor
✓ Wire that through to the
OnLookAt node from the BPI
Let’s replace the existing angel Idol with our new Idol-BP
✓ Select the Idol-BP in the Content
Browser
✓ Select the angel mesh in the level
✓ Right click and use the Replace
Selected Actor With utility to
swap it for the Idol-BP
✓ Zero out the rotation in Z to get it
facing the right way
Modify the Interact-MAT to prepare it for interaction ✓ Modify the Interact-MAT by
adding a Scalar Parameter called
something like MatPower
✓ Multiply it with the Fresnel Node
✓ Pipe the output into the Power node
✓ This will make the sphere around
the idol go transparent
✓ We will be driving the MatPower
value with the Idol-BP to toggle
the visibility of the Sphere
Back in the Idol-BP, set up the Torch and Sphere Preview ✓ Call the BPI Event OnLookAt
✓ Wire in a .2 Delay
✓ Wire in a Create Dynamic
Material Instance Node with the
Sphere as the Target
✓ Wire in a Set Scalar Parameter
Value with the Return Value as
the Target and the MatPower as
the Parameter Name and the
Value set to .8
Back in the Idol-BP, set up the Torch and Sphere Preview ✓ Add in another Delay of about 1.3
seconds
✓ Reset the Scalar Parameter
Value to 0 for the Parameter
Name of MatPower again
✓ Wire in a Set Visibility for the
Particle System to turn it on after
the sphere Material blinks out
Place more Idols around the level. Both Idol-BP and Static Mesh Idols.
✓ This way, the VR player can
explore the level looking for
interactable objects
Let’s keep track of all the Idols we interact with to affect gameplay ✓ Add an Integer Variable in the
Idol-BP called IdolValue
✓ Add an Integer Variable in the
VR-Pawn-BP called IdolCount
Let’s keep track of all the Idols we interact with to affect gameplay ✓ Create a new GameState in the
GameMode Override called
VRGameState
In the VRGameState ✓ Create a CustomEvent
✓ Call it something like LevelTest
✓ Cast to the VR-Pawn-BP
✓ Use it to call IdolCount
✓ Test to see if IdolCount is Equal
or Greater than X
✓ Branch to test if True
✓ Create a Boolean Variable called
LevelClear set to False by default
✓ If the Branch Statement tests
True, set LevelClear to True
In the Idol-BP, insert this increment logic ✓ After the OnLookAt Event and the
Delay…
✓ Cast to the VR-Pawn-BP
✓ Set IdolCount
✓ Get IdolCount
✓ Add IdolValue to IdolCount
✓ Wire that through to the
DynamicMaterialInstance
** You can add a PrintString after the
Set IdolCount to see the value
increasing on screen
Let’s set up a simple Sequencer sequence to play once we find all the Idols ✓ In the top menu Cinematics
dropdown, choose Add Level
Sequence
Let’s set up a simple Sequencer sequence to play once we find all the Idols ✓ Once the Sequencer window
opens
✓ Add a Camera Cuts track
✓ An Events track
✓ A Fade track
Let’s set up a simple Sequencer sequence to play once we find all the Idols ✓ Use the Camera Icon to add a
new Cinematic Camera
✓ Keyframe the camera movement
as desired
Let’s set up a simple Sequencer sequence to play once we find all the Idols ✓ Add a set of doors as Actors to
Sequencer
✓ Use the Rotation Transform to
keyframe the doors to open as
the camera animates into place
during the sequence
✓ You can add a Fade track to fade
in and out with ease
Back at the Idol-BP, add a Level Sequence Actor Variable ✓ Add a Level Sequence Actor
✓ Set it to Public
Let’s modify the torch flame and change the color for effect ✓ Change the initial color
✓ Change color over life
✓ Add a light to the smoke
✓ Remove the light from the
Trans_Square
Duplicate the Idol-BP to Idol-Trans-BP ✓ Change the ParticleSystem to the
blue fire
Modify the Idol-Trans-BP EventGraph to open a new level ✓ Use the same Event OnLookAt
from the BPI to run a Delay
✓ Set the Visibility on for the Blue
Flame ParticleSystem
✓ Test to ensure that the LevelClear
is True
✓ Delay again
✓ Then Open a new Level
✓ Promote Level Name to a new
Variable and make it Public
Assign the Next Level in the Idol-Trans-BP ✓ Assign a NextLevel name in the
public variable of the placed Idol
in the Details Panel
Setting up an optional Fade Effect for the Teleport • Add a Sphere Static Mesh
component to the VR-Pawn
• Turn collision OFF on the
Sphere or it will block the
LineTrace
Setting up an optional Fade Effect for the Teleport ✓ If you want to fade to white,
make the Vector3Node color
White
✓ If you want to fade to black,
make the Vector3Node color
Black
✓ Make it Two Sided
✓ Call the Parameter “Alpha”
Setting up an optional Fade Effect for the Teleport ✓ Create a Custom Event in the
VR-Pawn and set up the blink
graph
Setting up an optional Fade Effect for the Teleport ✓ Make a Timeline to drive the
Fade
✓ Set the out value to .09 and set
Use Last Keyframe
Setting up an optional Fade Effect for the Teleport ✓ Drive the Scalar Parameter of
the Alpha with the Timeline
✓ ** Make sure to wire the Timeline’s
Finished pin into a Switch on
ETimelineDirection node
Setting up an optional Fade Effect for the Teleport ✓ Wire the FadeSphere Custom
Event into the Teleport Graph
before the SetActorLocation
Give it a try, it’s a lot of fun.
luis.cataldi@epicgames.com
  • 25. Lighting Performance Lighting is one of the reasons for going one route versus the other. In a standard forward rendering pipeline, the lighting calculations have to be performed on every vertex and on every fragment in the visible scene, for every light in the scene. Deferred rendering is a very interesting approach that reduces the object count, and in particular the total fragment count, and performs the lighting calculations on the pixels on the screen, thereby scaling with the resolution size instead of the total fragment count. gamedevelopment.tutsplus.com/articles/forward-rendering-vs-deferred-rendering--gamedev-12342
  • 26. There are advantages to both! The UE4 Deferred Renderer is a full-featured workhorse, but takes a bit of skill to fully leverage. Temporal Anti-Aliasing limits how sharp your image can be. The new UE4 Forward Renderer will be a specialized renderer with fewer features but a faster baseline. Multisample Anti-Aliasing (MSAA) is the sharpest solution for anti-aliasing.
  • 27. Instanced Stereo Rendering Lets us use a single draw call to draw both the left and right eyes, saving CPU (and some GPU) time. • Currently works on PC, and coming soon on PS4 and mobile platforms • Enable it in your project’s settings
  • 28. Instanced Stereo Rendering Instanced stereo is available now for the deferred (desktop) renderer and the PS4. We’re also implementing multi-view extension support for the forward (mobile) renderer, and improving PS4 support to better utilize the hardware.
  • 29. Hidden And Visible Area Mesh We draw meshes to cull out geometry that you can’t see, and only apply post processing where you can. For the deferred renderer, the visible area mesh is a bigger optimization! This is specific per-platform.
  • 30. Hidden And Visible Area Mesh The Hidden Area Mask uses a mesh to early-out on pixels that aren’t visible in the final image.
  • 31. Hidden And Visible Area Mesh The Visible Area Mask uses a mesh to constrain our post processing to the visible pixels only.
  • 32. Camera Refactor As of 4.11, we’ve completely rewritten the camera system in order to make developing much easier! • Camera Components now move exactly as the real HMD is moving • You can attach components (meshes, UI, etc) directly to the camera component!
  • 33. Platform Support As of 4.12, we support the following platforms out of the box: • Oculus Rift • Steam VR (including the HTC Vive) • PSVR • OSVR (preview) • Samsung Gear VR • Google Daydream • Leap Motion
  • 34. Platform Support Create once, and deploy anywhere. Mobile Desktop / Console Oculus Rift HTC Vive / Steam VR PSVR OSVR Samsung Gear VR Google Daydream
  • 35. Platform Support All of these platforms go through UE4’s common VR interfaces, so you can make your content once, and deploy it anywhere. • Unified Camera System • Motion Controller System • Optimized rendering paths • Low-latency optimizations Oculus Vive PSVR OSVR
  • 36. VR Project Template We’ve added a new project template designed for Virtual Reality on desktop and console.
  • 37. The template can be selected from the Blueprint tab as a new project is created. VR Project Template
  • 38. The motion controller template provides examples for object interaction and manipulation as well as point based teleportation. VR Project Template
  • 40. New: Contact Shadows Contact shadows allow for highly detailed dynamic shadows on objects.
  • 41. The Contact Shadows feature adds a short ray cast in screen space against the depth buffer to know whether a pixel is occluded from a given light or not. This helps provide sharp detailed shadows at the contact point of geometry.
  • 42. New: Automatic LOD Generation Unreal Engine now automatically reduces the polygon count of your static meshes to create LODs!
  • 43. Automatic LOD generation uses what is called quadric mesh simplification. The mesh simplifier will calculate the amount of visual difference that collapsing an edge by merging two vertices would generate. It then picks the edge with the least amount of visual impact and collapses it. When it does, it picks the best place to put the newly merged vertex and removes any triangles which have also collapsed along with the edge. It will continue to collapse edges like this until it reaches the requested target number of triangles.
  • 44. 44 New: Improved Per-Pixel Translucent Lighting Recent Developments
  • 45. 45 New: Reflection Capture Quality Improvements Recent Developments
  • 46. 46 New: Full Resolution Skin Shading Recent Developments
  • 47. 47 UE4.11: Realistic Eye Shading Recent Developments
  • 48. 48 UE4.11: Realistic Hair Shading Recent Developments
  • 49. 49 UE4.11: Realistic Cloth Shading Recent Developments
  • 50. 50 UE4.12: Cinematic Cameras and Cinematic Viewports Recent Developments
  • 51. 51 All New Audio Engine for Unreal What’s Next
  • 52. 52 All New Animation Tools What’s Next
  • 53. 53 All New Mesh and Authoring Tools What’s Next
  • 55. How can we learn to harness the power of Unreal Engine?
  • 56. VR Learning Resources for Unreal Engine: Starting Out: • Oculus Quick Starts • SteamVR Quick Start • Google VR Quick Start • Gear VR Quick Starts VR Platforms: • Samsung Gear VR Development • Google VR Development • Oculus Rift Development • SteamVR Development VR Topics: • VR Cheat Sheets • VR Best Practices • Motion Controller Component Setup • VR Camera Refactor
  • 57. VR Learning Resources for Unreal Engine: Video: • 2015 UE4 - VR and Unreal Engine • Making Bullet Train and Going off the Rails in VR • VR Bow and Arrow Tutorial w/ Ryan Brucks - Training Livestream • Training Livestream - Sam and Wes' VR Stream: Cameras, Multiplayer, Tips and Tricks! • Creating Interactions in VR with Motion Controllers 1-3 • Setting Up VR Motion Controllers • VR Networking and 3D Menus • Up and Running with Gear VR • Developing for VR • Integrating the Oculus Rift into UE4 Presentations: • UE4 VR - Niklas Smedberg • Lessons from Integrating the Oculus Rift into UE4 • Going Off The Rails: The Making of Bullet Train Links: • Sam Deiter - 10 VR tips for Unreal Engine • Tom Looman’s - Getting Started with VR in Unreal Engine 4
  • 58. VR Learning Resources for Unreal Engine: VR Editor Starting Out: • Activating VR Mode VR Editor Guides: • Setting Up VR Editor from GitHub • Navigating the World in VR Mode • Working with Actors in VR Mode VR Editor Reference: • VR Editor Controls • Quick Select Menu • Radial Menu • Transforming Actors in VR • Editor Windows in VR Mode
  • 59. • Unreal Engine 4.14 Release Notes • Unreal Engine 4.13 Release Notes • Unreal Engine 4.12 Release Notes • Unreal Engine 4.11 Release Notes • Unreal Engine 4.10 Release Notes • Unreal Engine 4.9 Release Notes • Unreal Engine 4.8 Release Notes • Unreal Engine 4.7 Release Notes • Unreal Engine 4.6 Release Notes • Unreal Engine 4.5 Release Notes • Unreal Engine 4.4 Release Notes • Unreal Engine 4.3 Release Notes • Unreal Engine 4.2 Release Notes • Unreal Engine 4.1 Release Notes The Unreal Engine Release Notes:
  • 60. Mitchell McCaffrey’s - Mitch VR Labs • Mitch's VR Lab - an Introduction • Mitch's VR Lab - Look Based interaction • Mitch's VR Lab - Simple Teleportation Mechanic • Mitch's VR Lab - Introduction to SteamVR • Mitch's VR Lab - Simple Head IK • Mitch’s UE4 Forum Post Education Community VR for UE4:
  • 62. Free Unreal Engine Courses: • Twin Stick Shooter • 3rd Person Power-Up Game with C++ • 2D Sidescroller with Blueprints • Endless Runner with Blueprints • Unreal Match 3 Game Education Community VR for UE4: Free UE4 Community Youtube.com Learning Channels: • World of Level Design UE4 Fundamentals • Virtus Education series • Unreal Engine 4 Beginner Tutorials • Mathew Wadstein Tutorials • Leo Gonzales Unreal Basics • Tesla Dev Tutorials • UE4 Style Guide
  • 63. Free UE4 Community Blueprints: • Communication Training - Zak Parrish • Blueprints Compendium - VOLUME II • BP_Compendium.pdf • Network Compendium Free UE4 Community VR Learning Channels: • Unreal Engine VR Curriculum Free UE4 Community ArchViz Learning Channels: • Architectural Visualization Tutorials Education Community VR for UE4: Paid Elearning Courses: • Unreal Engine 4: The Complete Beginner's Course • Learn to Code in C++ by Developing Your First Game • Complete Introduction to Unreal 4 • An Introduction to Creating Realistic Materials in UE4 • Master Blueprints in Unreal Engine 4 - Endless Runner • Create a Helicopter Game Control System in Unreal Engine 4 • Unreal Essential Training - Lynda.com • Unreal: Learn Lighting - Lynda.com • 3dmotive - Unreal Engine courses • Pluralsight - Unreal Engine courses
  • 65. Improved metadata support ● Skill level ● Engine version ● Sitemap filters ● Checkpoints Learning Resources for Unreal Engine:
  • 66. Learning Resources for Unreal Engine: Getting the most value from the UE4 Launcher Learn Tab
  • 67. Learning Resources for Unreal Engine: Getting the most value from Content Examples
  • 69. One of the biggest issues for working in VR is Motion/Simulation Sickness.
  • 70. How is it caused?
  • 71. en.wikipedia.org/wiki/Virtual_reality_sickness Sensory conflict theory holds that sickness will occur when a user's perception of self-motion is based on incongruent sensory inputs from the visual system, vestibular system, and non-vestibular proprioceptors, and particularly so when these inputs are at odds with the user's expectation based on prior experience.
  • 72. Five typical causes of Motion/Simulation Sickness in VR Read more about it 1. Non-forward movements • No unnatural movements 2. Awareness of vection • When a large part of the visual field moves, viewers feel as if they have moved and that the world is stationary 3. The feeling of accelerations 4. Too much camera yaw 5. Helped by adding a static reference frame
  • 73. Things we CAN DO in Unreal Engine to improve VR Games and Experiences
  • 74. You MUST maintain framerate For the VR experience to feel smooth, your game needs to run at 75 Hz (Oculus DK2) or even 90 Hz (HTC Vive and Oculus CV1) depending on the device. To see the current framerate, type “stat fps” or “stat unit” (for a more detailed breakdown) in your console when running the game.
  • 75. Here is where VR Instanced Stereo Can Help “Basically, we’re utilizing hardware instancing to draw both eyes simultaneously with a single draw call and pass through the render loop. This cuts down render thread CPU time significantly and also improves GPU performance. Bullet Train was seeing ~15 – 20% CPU improvement on the render thread and ~7 – 10% improvement on the GPU.” – Ryan Vance. To enable this feature in 4.11 and above, go to your Project Settings and look for “Instanced Stereo” under the Rendering category.
  • 76. Things to keep at the front of your mind: Check your performance constantly to ensure that you are hitting your VR performance targets.
  • 77. Things to keep at the front of your mind: ○ Maintain a very simplistic approach to making your content. ○ Minimize complex shaders as much as possible. ○ Add detail to the mesh within reason in lieu of relying on complex shaders for surface details.
  • 78. Things to keep at the front of your mind: LODs and aggressive culling are a must to ensure that you are hitting your VR performance targets.
  • 79. Known issues and possible workarounds: Screen Space Reflections (SSR) SSR will work in VR but may not give you the results that you want. Instead, consider working with reflection probes.
  • 80. Known issues and possible workarounds: Normal Mapping Issues When viewing Normal maps on objects in VR, you will notice that they do not have the impact that they might have once had. This is because normal mapping does not account for a binocular display or motion parallax. Because of this, Normal maps have a tendency to look flat when viewed with a VR device.
  • 81. Known issues and possible workarounds: Parallax Mapping Parallax mapping takes Normal mapping to the next level by accounting for depth cues, which Normal mapping does not. A Parallax shader can better display depth information, making objects appear to have more visually rich detail. No matter what angle you view it from, a Parallax map will always correct itself to show you the appropriate depth information from that viewpoint. The best use of a Parallax map would be for cobblestone pathways and fine detail on surfaces.
  • 82. Known issues and possible workarounds: Tessellation Shader Displacement Tessellation Shader Displacement will displace 3D Geometry in real time by adding details that are not modeled into the object. Tessellation shaders do a great job of displaying information because tessellation shaders actually create the missing detail by creating more vertices and displacing them in 3D Space.
  • 83. Launching VR Preview: Testing out your VR headset is straightforward: simply select "VR Preview" from the Play dropdown button. By default the head tracking will work without changes to your existing project or template.
  • 84. GPU Profiling: To capture a single frame with GPU timings, press Ctrl+Shift+, or type "profilegpu" in the console. This command collects accurate GPU timings; you will find that certain processes are a heavy burden on the framerate (Ambient Occlusion is one common example) when using VR. See GPU Profiling & Performance and Profiling for documentation.
  • 85. Disable Heavy Post-Processing: Due to the demanding requirements of VR, many of the advanced Post Processing features that are normally used should be disabled. This will need to be done per level. • Add a Post Process (PP) volume to your level if there is not already one there. • Select the PP volume and enable the Unbound option so that these settings will be applied to the entire level. • Expand each section of the Post Process Volume settings and disable any undesired active PP settings by enabling that property (clicking on it) and then setting the value from the default, usually 1.0, to 0 to disable the feature. • Consider first disabling the biggest offenders to VR performance like Lens Flares, Screen Space Reflections, Screen Space Ambient Occlusion, and anything else that might be impacting performance. • While some of these features are disabled by settings in your .INI files, doing it in the volume ensures that performance will not be affected if the .INI is removed by mistake.
  • 86. UE4 – Lighting for VR • Dimmer lights & colors can help reduce simulation sickness. • Use Static Lighting over Stationary or Dynamic. • Make sure your Stationary / Dynamic Lights do not overlap. • Baked lights are the best option for VR environments. • If using Dynamic Shadows, only have one shadowing light. • Use Stat LightRendering to see current lighting cost. • Profile, Profile, Profile to ensure you are maintaining performance goals.
  • 87. Fake shadows Wherever You Can!! Cheats like a fake blob drop shadow to simulate dynamic shadows are a good option for keeping a VR project running at frame rate. Blob shadow example. Image by Eric Chadwick
  • 88. UE4 – Effects for VR • Mesh based VFX work the best for VR. • Camera Facing particles do not hold up well in VR on their own due to the stereoscopic view. • The Dither Temporal AA Material Function can make Opacity masked objects look like Translucent ones. • Local Space rotation does not look correct in VR.
  • 89. UE4 – Environments for VR • Use Reflection probes instead of screen space reflections. • Again… Texture Blob shadows are a cheap alternative to dynamic shadows. • The Merge Actor Tool can help cut down on Static Mesh draw calls without having to do work outside of UE4.
  • 90. Some very important things we all need to know about Unreal Engine.
  • 92. The Unreal Engine Framework GameInstance GameMode Pawn Class HUD Class PlayerController Class GameState Class PlayerState Class
  • 93. The Unreal Engine Framework GameInstance GameMode Pawn Class HUD Class PlayerController Class GameState Class PlayerState Class The GameMode is the definition of the game. ● It should include things like the game rules and win conditions. ● It also holds important information about: ○ Pawn ○ PlayerController ○ GameState ○ PlayerState
  • 94. The Unreal Engine Framework GameInstance GameMode Pawn Class HUD Class PlayerController Class GameState Class PlayerState Class The Pawn class is the base class of all Actors that can be controlled by players or AI. ● The Pawn represents the physical location, rotation, etc. of a player or entity within the game. ● A Character is a special type of Pawn that has the ability to walk around.
  • 95. The Unreal Engine Framework GameInstance GameMode Pawn Class HUD Class PlayerController Class GameState Class PlayerState Class A PlayerController is the interface between the Pawn and the human player controlling it. ● The PlayerController decides what to do and then issues commands to the Pawn (e.g. "start crouching", "jump"). ● Putting input handling or other functionality into the PlayerController is often necessary. ● The PlayerController persists throughout the game, while the Pawn can be transient.
  • 96. The Unreal Engine Framework GameInstance GameMode Pawn Class HUD Class PlayerController Class GameState Class PlayerState Class The GameInstance is a class whose state persists across switching of levels, game modes, pawns, etc., whereas classes like GameMode or PlayerController are reset and the data stored in those classes is removed.
  • 97. The Unreal Engine Framework GameInstance GameMode Pawn Class HUD Class PlayerController Class GameState Class PlayerState Class A PlayerState is the state of a participant in the game, such as a human player or a bot that is simulating a player. Non-player AI that exists as part of the game would not have a PlayerState.
  • 98. The Unreal Engine Framework GameInstance GameMode Pawn Class HUD Class PlayerController Class GameState Class PlayerState Class The GameState contains the state of the game, which could include things like the list of connected players, the score, where the pieces are in a chess game, or the list of what missions you have completed in an open world game.
  • 99. The Unreal Engine Framework GameInstance GameMode Pawn Class HUD Class PlayerController Class GameState Class PlayerState Class The HUD is the base object for displaying elements overlaid on the screen. Every human-controlled player in the game has their own instance of the AHUD class which draws to their individual Viewport.
  • 100. Object Actor Pawn Character Base building blocks in the Unreal Engine Any object that can be placed into a level Subclass of Actor and serve as an in-game avatar Subclass of a Pawn that is intended to be used as a player character
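The building-block hierarchy above can be sketched as plain Python classes. This is only an illustration of the inheritance relationship (these are not the actual UE4 classes, whose real names are UObject, AActor, APawn, and ACharacter):

```python
# Illustrative sketch: each level of the hierarchy adds capability.
class Object:                      # base building block
    pass

class Actor(Object):               # any object that can be placed into a level
    def __init__(self):
        self.location = (0.0, 0.0, 0.0)

class Pawn(Actor):                 # subclass of Actor; serves as an in-game avatar
    def possess(self, controller):
        self.controller = controller

class Character(Pawn):             # a Pawn intended as a player character that can walk
    def walk(self, direction):
        x, y, z = self.location
        self.location = (x + direction[0], y + direction[1], z)
```

Because a Character is a Pawn, and a Pawn is an Actor, anything written against the Actor interface (placement, location) works on all of them.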
  • 103. Programming VR Interaction with Blueprints Blueprints in Unreal Engine is a complete visual scripting system based on the concept of using a node-based interface to create interactions from within Unreal Editor.
  • 104. Programming VR Interaction with Blueprints Learning Blueprints through Content Examples
  • 105. Hey!! We need AUDIO for VR too!!
  • 106. UE4 – Audio for VR Ambient Sound Actors in VR The Ambient Sound Actor can be used for many purposes such as ambient looping sounds and non-looping sounds. Generally, the Ambient Sound Actor conforms to the real world, where the closer you are to a sound, the louder it will appear.
  • 107. UE4 – Audio for VR Sound Properties You can assign a sound asset from the Details panel by selecting an asset from the Sound settings drop- down menu or by highlighting a sound asset in the Content Browser and clicking the button.
  • 108. UE4 – Audio for VR Attenuation Properties Attenuation is the ability of a sound to decrease in volume as the player moves away from it. It is advisable to use Sound Attenuation objects whenever possible, if for no other reason than to give broad control over the settings for many Actors.
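Distance attenuation can be pictured as a simple falloff curve: full volume inside an inner radius, silence past the falloff distance, and a ramp in between. A hedged sketch (the function name and linear-falloff assumption are illustrative, not the actual UE4 attenuation API, which also offers logarithmic and custom curves):

```python
# Illustrative linear distance attenuation, assuming:
#   inner_radius     - distance within which the sound plays at full volume
#   falloff_distance - additional distance over which volume ramps to zero
def attenuate(distance, inner_radius, falloff_distance):
    if distance <= inner_radius:
        return 1.0
    if distance >= inner_radius + falloff_distance:
        return 0.0
    return 1.0 - (distance - inner_radius) / falloff_distance

print(attenuate(100, 400, 3600))   # 1.0  (inside the inner radius)
print(attenuate(2200, 400, 3600))  # 0.5  (halfway through the falloff)
print(attenuate(5000, 400, 3600))  # 0.0  (out of range)
```

Sharing one Sound Attenuation object across many Actors means these radii can be tuned in one place, which is the "broad control" the slide refers to.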
  • 109. UE4 – Audio for VR New: Stereo Spatialization 3D spatialization is now possible for stereo audio assets. The 3D Stereo Spread parameter defines the distance in game units between the left and right channels, spread along a vector perpendicular to the listener-emitter vector.
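The geometry of that spread can be sketched in 2D (ignoring height): take the listener-to-emitter vector, rotate it 90 degrees to get a perpendicular, and place the two channels half the spread distance to either side of the emitter. Purely illustrative; the function below assumes the listener and emitter are at different positions:

```python
import math

# Illustrative sketch of the 3D Stereo Spread idea in 2D.
def stereo_channel_positions(listener, emitter, spread):
    dx, dy = emitter[0] - listener[0], emitter[1] - listener[1]
    length = math.hypot(dx, dy)          # assumes listener != emitter
    # Unit vector perpendicular to the listener-emitter vector.
    px, py = -dy / length, dx / length
    half = spread / 2.0
    left = (emitter[0] - px * half, emitter[1] - py * half)
    right = (emitter[0] + px * half, emitter[1] + py * half)
    return left, right

left, right = stereo_channel_positions((0, 0), (0, 100), 50)
print(left, right)   # channels 50 units apart, centered on the emitter
```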
  • 110. UE4 – Audio for VR Audio Volume Audio Volumes allow you to control and apply various sounds in your level as well as provide an avenue to create compartmentalized audio zones where you can control what is heard inside and outside of the volume.
  • 111. Additional toolsets in Unreal Engine to enhance VR: Complete state of the art suite of AI Tools.
  • 112. Additional toolsets in Unreal Engine to enhance VR: Complete set of tools for animation and animation retargeting
  • 114. Let’s Build a new VR Project, VR Pawn and VR Player Controller!
  • 115. Begin a new Project with a Third Person Template ✓ Desktop/Console ✓ Maximum Quality ✓ With Starter Content
  • 116. Bring in new assets ✓ Infinity Blade Grass Lands ✓ Infinity Blade Ice Lands ✓ Audio
  • 117. Navigate to the Infinity Blade Grass Lands folder ✓ Open the ElvenRuins map
  • 118. ➔ Establish a Profiler workflow with camera bookmarks ➔ Evaluate the profile data ➔ Address performance issues on a per-case basis * Look out for Post Process related issues * Look out for lighting and shadow related issues * Look out for issues related to reflections * Look out for issues related to transparency ** Consider the tradeoffs for keeping or changing these elements in the level Working with the GPU Profiler
  • 119. Utilize BumpOffset material nodes to enhance the VR experience ** When viewing Normal maps on objects in VR, you will notice that they do not have the same impact. This is because Normal mapping does not account for a binocular display or motion parallax. Because of this, Normal maps will often look flat when viewed with a VR device. However, that does not mean that you should not or will not need to use Normal maps; it just means that you need to evaluate more closely whether the data you are trying to convey in the Normal map would be better off being made out of geometry. Below you will find some different techniques that can be used in place of Normal maps. This is a good place to explore BumpOffset.
  • 120. The BumpOffset Reference Plane ✓ The ReferencePlane specifies the approximate height in texture space at which to apply the effect. A value of 0 will appear to distort the texture completely off the surface, whereas a value of 0.5 (the default) means that some of the surface will pop off while some areas will be sunken in.
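Conceptually, a BumpOffset node shifts the texture UVs along the tangent-space view direction by an amount driven by a height value, centered around the ReferencePlane. A rough sketch of that math (the parameter names mirror the node's inputs, but the values and function are illustrative, not the actual shader):

```python
# Illustrative BumpOffset-style UV shift, assuming:
#   height_sample - 0..1 height value sampled from a heightmap
#   view_dir_xy   - view direction projected into tangent space
def bump_offset_uv(uv, height_sample, view_dir_xy,
                   height_ratio=0.05, reference_plane=0.5):
    # Heights above the reference plane appear to pop out of the surface;
    # heights below it appear sunken in (matching the slide's description).
    shift = (height_sample - reference_plane) * height_ratio
    return (uv[0] + view_dir_xy[0] * shift,
            uv[1] + view_dir_xy[1] * shift)

u, v = bump_offset_uv((0.5, 0.5), 1.0, (1.0, 0.0))
```

A sample exactly at the reference plane (0.5 here) produces no shift at all, which is why 0.5 is the neutral default.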
  • 121. Understand the different parts of the Blueprint editor ✓ Components ✓ Menu Bar ✓ Details ✓ Viewport ✓ Construction Script ✓ Graph Editor ✓ My Blueprint: Variables, Functions, Components ✓ Debug ✓ Compiler Results
  • 122. Build the VR-Pawn-BP ✓ Scale the Capsule Component Height to 120 ✓ Add a Spring Arm Component ✓ Add a Camera Component ✓ Nest the Camera Component under the Spring Arm ✓ 0 out the Target Arm length of the Spring Arm ✓ Move the Spring Arm up 90 units in Z ✓ Toggle on the Use Pawn Controller Tick Box
  • 123. Build the VR-Pawn-BP ✓ Move to the Event Graph ✓ Get the InputAxis for LookUp ✓ Connect it to the Add Controller Pitch Input ✓ Get the InputAxis for Turn ✓ Connect it to the Add Controller Yaw Input
  • 124. Create a new GameMode Override ✓ Call the new GameMode Override something like VRGameMode ✓ Save it to something like the Blueprints folder
  • 125. Change the Default Pawn Class to VR-Pawn-BP ✓ Assign VR-Pawn-BP to the Default Pawn Class on the GameMode Override
  • 126. Create a new PlayerController Class Blueprint in the GameMode Override ✓ Call the new PlayerController something like VRPlayerController ✓ Save it in the Blueprints folder
  • 127. Navigate to the Project Settings ✓ Add some inputs
  • 128. Create two new Action Mappings ✓ Create a Teleport Action Mapping and assign it the Middle Mouse Button and the Gamepad Right Shoulder Button ✓ Create a Glide Action Mapping and assign it the Right Mouse Button and the Gamepad Left Shoulder Button
  • 129. Back in the VRPlayerController, build the Glide Linetrace Blueprint graph ✓ In the EventGraph of the VRPlayerController, bring in the InputAxis for Glide ✓ Get Player Camera Manager ✓ Pull out the GetActorLocation ✓ Pull out the GetActorForwardVector ✓ Multiply the GetActorForwardVector by a float and promote that float to a GlideDistance Variable set to something like 1500 units ✓ Create a LineTraceByChannel ✓ Connect the Execution Pin from the Glide InputAxis ✓ Connect the Start to the GetActorLocation ✓ Connect the End to the sum of the GetActorLocation and the scaled forward vector
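The vector math this graph wires together is simple: the trace starts at the camera and ends GlideDistance units along the camera's forward vector. A sketch (plain Python standing in for the Blueprint nodes):

```python
# Illustrative version of the Glide line-trace endpoints:
#   end = camera_location + forward_vector * glide_distance
def glide_trace_endpoints(camera_location, forward_vector, glide_distance=1500.0):
    start = camera_location
    end = tuple(c + f * glide_distance
                for c, f in zip(camera_location, forward_vector))
    return start, end

start, end = glide_trace_endpoints((0.0, 0.0, 90.0), (1.0, 0.0, 0.0))
print(end)   # (1500.0, 0.0, 90.0)
```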
  • 130. Create 2 new Vector Variables ✓ Create a StartTrace Vector Variable ✓ Create an EndTrace Vector Variable
  • 131. Create a BreakHitResult and connect the new Variables ✓ Set the EndTrace to the TraceEnd of the BreakHitResult ✓ Wire through the Set StartTrace, feeding it from a GetPlayerPawn into a GetActorLocation node
  • 132. Create a Timeline called something like GlideCurve ✓ Make a FloatCurve called something like GlideFloat ✓ Add 2 Keys ✓ The first Key set to 0 and 0 ✓ The second Key set to something like .9 and .9 ✓ Set the Use Last Keyframe Tickbox
  • 133. Use the Timeline Component to get the SetPlayRate node to readjust the rate ✓ Wire the SetPlayRate through to the StartTrace ✓ Set the NewRate to 1 divided by a new Variable called RateOfGlide ✓ Set this RateOfGlide to something like 20 ✓ Wire this through to the PlayFromStart of the GlideCurve Timeline
  • 134. Add a LERP (Vector) and a SetActorLocation node ✓ Connect the GlideFloat out from the Timeline Curve to the LERP Alpha ✓ Connect the StartTrace to A ✓ Connect EndTrace to B ✓ Wire the Timeline to a SetActorLocation brought in from the GetPlayerPawn Node ✓ Connect the LERP output to the Location X and Y but take the Z Location from the PlayerPawn
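The LERP wiring above amounts to: as the timeline alpha runs from 0 toward ~0.9, interpolate X and Y between StartTrace and EndTrace while keeping the pawn's own Z so the player stays at ground height. A minimal sketch of one tick of that movement:

```python
# Illustrative glide step: lerp X/Y between the trace endpoints,
# but take Z from the player pawn (as the graph above does).
def glide_step(start, end, pawn_z, alpha):
    lerp = lambda a, b: a + (b - a) * alpha
    return (lerp(start[0], end[0]), lerp(start[1], end[1]), pawn_z)

print(glide_step((0, 0, 0), (1000, 0, 500), 90.0, 0.5))   # (500.0, 0.0, 90.0)
```

Note how the target's Z of 500 is ignored entirely: the pawn slides toward the point horizontally without being pulled off the ground.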
  • 135. Compile and test the Glide Input ✓ Compile ✓ Test ✓ Tweak
  • 136. Duplicate the Trace with inputs for the Teleport ✓ Copy the first part of the Graph ✓ Paste below
  • 137. To the Teleport Input, change the GlideDistance to a new Variable called TeleportDistance ✓ Delete the GlideDistance from the Teleport Trace ✓ Add a new Variable called TeleportDistance ✓ Set it to a distance of 10000
  • 138. To the Teleport Input, add a DotProduct, a CanTeleport Boolean, and Project Point to Nav ✓ Create a new Variable called CanTeleport of type Boolean ✓ Set it by dragging out from the Impact Normal of the Hit Result, taking the Dot Product against a vector with Z set to 1, wiring that into a GreaterThan node set to something like .9, and connecting the result to the CanTeleport node ✓ Wire that through to a Branch node to test for True ✓ Wire all that into the graph ✓ Drag out from Location and create a ProjectPointToNavigation connected to the SetEndTrace ✓ Drag out from it, AddVector, and connect to the SetActorLocation after adding 90 to the Z axis to offset the player height
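The surface test this graph encodes is a single dot product: the hit surface's impact normal against world up (0, 0, 1), which reduces to the normal's Z component. Only near-flat surfaces pass the > 0.9 threshold. A sketch:

```python
# Illustrative CanTeleport test: dot(impact_normal, world_up) > threshold.
def can_teleport(impact_normal, threshold=0.9):
    up = (0.0, 0.0, 1.0)
    dot = sum(n * u for n, u in zip(impact_normal, up))
    return dot > threshold

print(can_teleport((0.0, 0.0, 1.0)))   # True  (flat floor)
print(can_teleport((0.0, 1.0, 0.0)))   # False (vertical wall)
```

With a threshold of 0.9 the teleport is allowed on slopes up to roughly 25 degrees and rejected on anything steeper, which keeps the player off walls and cliff faces.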
  • 139. Let's add a Teleport Preview Sphere ✓ Create a new Actor Class Blueprint
  • 140. Make a new Material ✓ Create a New Material called something like VR-MAT ✓ Change it to a Translucent Blend Mode with Unlit Shading Model ✓ Add a White Constant3Vector for the Emissive Color ✓ Wire a Scalar Value, a Fresnel Node, and a Power node into the Opacity ✓ Set the Scalar value to something like 4.5 ✓ Apply it to the VR-Preview Sphere
  • 141. In the VRPlayerController, let's set up the VR-Preview ✓ Duplicate the Teleport trace graph ✓ Change the Teleport input to an Event Tick input
  • 142. Add a SpawnActor from Class to the new VR-Preview graph ✓ Use Event Begin Play to call the SpawnActorFromClass node ✓ Assign the VR-Preview-BP to it ✓ Wire the HitResult's Location into the Spawn Transform ✓ Promote the Return Value to a variable called something like VR Preview
  • 143. Call the PreviewSphere to toggle its Visibility ✓ Use the Branch statement from the CanTeleport to set up the Visibility toggle for the VR-Preview sphere component ✓ Wire the output to the SetActorLocation node with the VR-Preview-BP as the target node to be relocated ✓ Make sure to offset the sphere up in Z using the AddVector node
  • 144. Let's lay in some Localized VR Audio ✓ Navigate to the Audio folder ✓ Find the Bird and Wind sounds
  • 145. Let's lay in some Localized VR Audio ✓ Drag the 7 Bird sounds into the Cue ✓ With the Bird sounds selected, choose Random from the Palette to auto-connect them
  • 146. Let's lay in some Localized VR Audio ✓ Drag in a wind loop sound ✓ Add a Looping Palette node ✓ Wire it all through ✓ Save the node
  • 147. Let's lay in some Localized VR Audio ✓ Another Mix Cue for the top of the hill with howling winds
  • 148. Let's lay in some Localized VR Audio ✓ The new Cue has Override Attenuation enabled ✓ The Radius scaled up ✓ The Falloff Distance values increased
  • 149. Let's build some interactable Blueprints ✓ Make a new Actor Class Blueprint ✓ Call it something like Idol-BP
  • 150. Assemble the new Idol Blueprints ✓ Bring in an Angel Statue Static Mesh Component ✓ Bring in a Torch Static Mesh Component ✓ Bring in a Particle System Fire Component ✓ Bring in a Sphere Mesh Component ✓ Duplicate the VR-MAT material and tint the color to be slightly yellowish in hue
  • 151. Right Click to create a new Blueprint Interface ✓ Call the new Blueprint Interface something like VR-Interact-BPI
  • 152. Right Click to create a new Blueprint Interface ✓ Make a new BPI Function called OnLookAt
  • 154. Let's add an Interact Action Mapping Input ✓ Call it something like Interact ✓ Assign it to the F key on the keyboard ✓ Assign it to the Gamepad Face Button Bottom
  • 155. Add the Interact Action Mapping to the VRPlayerController Blueprint ✓ Duplicate the LineTrace and wire in the Interact ActionMapping
  • 156. Wire the OnLookAt BPI into the VRPlayerController ✓ Add an IsValid to make sure you are hitting an object from the HitResult's Hit Actor ✓ Wire that through to the OnLookAt node from the BPI
  • 157. Let’s replace the existing angel Idol with our new Idol-BP ✓ Select the Idol-BP in the Content Browser ✓ Select the angel mesh in the level ✓ Right click and use the Replace Selected Actor With utility to swap it for the Idol-BP ✓ Zero out the rotation in Z to get it facing the right way
  • 158. Modify the Interact-MAT to prepare it for interaction ✓ Modify the Interact-MAT by adding a Scalar Node called something like MatPower ✓ Multiply it to the Fresnel Node ✓ Pipe the output to the PowerNode ✓ This will make the sphere around the idol go transparent ✓ We will be driving the MatPower value with the Idol-BP to toggle the visibility of the Sphere
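The opacity chain being described is roughly: a Fresnel rim term, scaled by the MatPower scalar parameter, raised to a power. Driving MatPower to 0 zeroes the whole product, so the sphere goes fully transparent. A hedged sketch (the `fresnel` input here is a precomputed 0..1 value, not a real shader evaluation, and the exponent matches the 4.5 scalar from the earlier VR-MAT slide):

```python
# Illustrative opacity for the interaction sphere:
#   opacity = (fresnel * MatPower) ^ exponent
def sphere_opacity(fresnel, mat_power, exponent=4.5):
    return (fresnel * mat_power) ** exponent

print(sphere_opacity(0.8, 0.0))   # 0.0 -- MatPower at 0 hides the sphere
print(sphere_opacity(1.0, 1.0))   # 1.0 -- full rim opacity restored
```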
  • 159. Back in the Idol-BP, set up the Torch and Sphere Preview ✓ Call the BPI Event OnLookAt ✓ Wire in a 0.2 second Delay ✓ Wire in a Create Dynamic Material Instance node with the Sphere as the Target ✓ Wire in a Set Scalar Parameter Value with the Return Value as the Target, MatPower as the Parameter Name, and the Value set to .8
  • 160. Back in the Idol-BP, set up the Torch and Sphere Preview ✓ Add in another Delay of about 1.3 Duration ✓ Reset the Scalar Parameter Value to 0 for the Parameter Name of MatPower again ✓ Wire in a Set Visibility for the Particle System to turn it on after the sphere Material blinks out
  • 161. Place more Idols around the level. Both Idol-BP and Static Mesh Idols. ✓ This way, the VR player can explore the level looking for interactable objects
  • 162. Let’s keep track of all the Idols we interact with to affect gameplay ✓ Add an Integer Variable in the Idol-BP called IdolValue ✓ Add an Integer Variable in the VR-Pawn-BP called IdolCount In the VR-Pawn-BP In the Idol-BP
  • 163. Let’s keep track of all the Idols we interact with to affect gameplay ✓ Create a new GameState in the GameMode Override called VRGameState
  • 164. In the VRGameState ✓ Create a CustomEvent ✓ Call it something like LevelTest ✓ Cast to the VR-Pawn-BP ✓ Use it to call IdolCount ✓ Test to see if IdolCount is Equal or Greater than X ✓ Branch to test if True ✓ Create a Boolean Variable called LevelClear set to False by default ✓ If the Branch statement tests True, set LevelClear to True
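The LevelTest logic can be sketched in a few lines. Names mirror the Blueprint variables (IdolCount, LevelClear); the class itself is illustrative, not UE4 code:

```python
# Illustrative VRGameState: compare the pawn's IdolCount against a required
# number and flip LevelClear once enough idols have been found.
class VRGameState:
    def __init__(self, idols_required):
        self.idols_required = idols_required
        self.level_clear = False        # False by default, as on the slide

    def level_test(self, idol_count):
        if idol_count >= self.idols_required:
            self.level_clear = True
        return self.level_clear

state = VRGameState(idols_required=5)
print(state.level_test(3))   # False -- not enough idols yet
print(state.level_test(5))   # True  -- level is clear
```

Keeping this test in the GameState (rather than in each Idol-BP) matches the framework slides earlier: game-wide progress belongs in the GameState.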
  • 165. In the Idol-BP, insert this increment logic below ✓ After the OnLookAt Event and the Delay… ✓ Cast to the VR-Pawn-BP ✓ Set IdolCount ✓ Get IdolCount ✓ Add IdolValue to IdolCount ✓ Wire that through to the DynamicMaterialInstance ** You can add a PrintString after the Set IdolCount to see the value increasing on screen
  • 166. Let’s set up a simple Sequencer sequence to play once we find all the Idols ✓ In the top menu Cinematics dropdown, choose Add Level Sequence
  • 167. Let’s set up a simple Sequencer sequence to play once we find all the Idols ✓ Once the Sequencer window opens ✓ Add a Camera Cuts ✓ An Events track ✓ A Fade Track
  • 168. Let’s set up a simple Sequencer sequence to play once we find all the Idols ✓ Use the Camera Icon to add a new Cinematic Camera ✓ Key frame the camera movement as desired
  • 169. Let’s set up a simple Sequencer sequence to play once we find all the Idols ✓ Add a set of doors as Actors to Sequencer ✓ Use the Rotation Transform to key frame the doors to open as the camera animates into place during the sequence ✓ You can add a fade track to fade in and out with ease
  • 170. Back at the Idol-BP, add a Level Sequence Actor Variable ✓ Add a Level Sequence Actor ✓ Set it to Public
  • 172. Let’s modify the torch flame and change the color for effect ✓ Change the initial color ✓ Change color over life ✓ Add a light to the smoke ✓ Remove the light from the Trans_Square
  • 173. Duplicate the Idol-BP to Idol-Trans-BP ✓ Change the ParticleSystem to the blue fire
  • 174. Modify the Idol-Trans-BP EventGraph to open a new level ✓ Use the same Event OnLookAt from the BPI to run a Delay ✓ Set the Visibility on for the Blue Flame ParticleSystem ✓ Test to ensure that LevelClear is True ✓ Delay again ✓ Then Open a new Level ✓ Promote Level Name to a new Variable and make it Public
  • 175. Assign the Next Level in the Idol-Trans-BP ✓ Assign a NextLevel name in the public variable of the placed Idol in the Details Panel
  • 176. Setting up an optional Fade Effect for the Teleport • Add a Sphere Static Mesh component to the VR-Pawn • Turn collision OFF on the Sphere or it will block the LineTrace
  • 177. Setting up an optional Fade Effect for the Teleport ✓ If you want to fade to white, make the Vector3 node color White ✓ If you want to fade to black, make the Vector3 node color Black ✓ Make it Two Sided ✓ Call the Parameter “Alpha”
  • 178. Setting up an optional Fade Effect for the Teleport ✓ Create a Custom Event in the VR-Pawn and set up the blink graph
  • 179. Setting up an optional Fade Effect for the Teleport ✓ Make a Timeline to drive the Fade ✓ Set the out value to .09 and set Use Last Keyframe
  • 180. Setting up an optional Fade Effect for the Teleport ✓ Drive the Scalar Parameter of the Alpha with the Timeline ✓ ** Make sure to connect the Switch on Timeline Direction on Finished
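Put together, the fade is a short timeline driving the sphere material's "Alpha" parameter from 0 up to ~0.9, then back down when played in reverse. A sketch of that curve (timing values mirror the slides but are otherwise illustrative):

```python
# Illustrative fade: alpha ramps 0 -> max_alpha over `duration` seconds;
# reverse=True models playing the timeline backwards to fade back in.
def fade_alpha(t, duration=0.09, max_alpha=0.9, reverse=False):
    progress = min(max(t / duration, 0.0), 1.0)   # clamp to 0..1
    if reverse:
        progress = 1.0 - progress
    return max_alpha * progress

print(fade_alpha(0.09))                # 0.9 -- fully faded out
print(fade_alpha(0.09, reverse=True))  # 0.0 -- faded back in
```

This is why the Switch on Timeline Direction matters: the Finished output must branch on whether the timeline just played forward (teleport now, then reverse) or backward (fade complete).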
  • 181. Setting up an optional Fade Effect for the Teleport ✓ Wire the FadeSphere Custom Event into the Teleport Graph before the SetActorLocation
  • 182. 182 Give it a try, it’s a lot of fun. luis.cataldi@epicgames.com