DEV LOG 3: THIRD PERSON AND PHOTOGRAMMETRY
April 1, 2023
PROJECT “BULK/BRANE”
Third-person Camera Modes
With a vast landscape like the Land of ‘Brane’, I wanted to experiment with a different camera mode and give players the freedom to switch between modes. I’ll call this mode the ‘far-angle follow camera’. It looks at the player from a greater distance than the normal camera mode, which allows for an increased field of view along with a cinematic effect that puts the player in contrast with the broad landscape surrounding them. As the camera zooms away, the player smoothly becomes part of the environment.
To achieve this, I have to move the camera smoothly away from its original location (and the player) at the press of a button, then return it smoothly to its original position when the button is pressed again. I tried two different techniques. The second ended up working much better than the first, but each has its own features and drawbacks. First, I need to map a key that switches the cameras when pressed. To do that, I create an Input Action for switching cameras and bind a key to it using Unreal Engine’s new Enhanced Input system, which works a little differently from the action mapping system of previous versions of the engine. Instead of mapping actions to keys in the Project Settings, you now create an Input Mapping Context asset where you map keys to actions. Unreal Engine’s Third Person template content includes an Enhanced Input Mapping Context with keys mapped to the third-person actions of moving and jumping. Here, I create a new mapping for switching the camera and bind the key ‘C’ to it.
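Conceptually, an Input Mapping Context is just a named table from physical keys to abstract input actions. The sketch below is plain Python with hypothetical names (it is not the Unreal API), just to show the shape of the data the editor asset stores:

```python
# Hypothetical sketch of an Enhanced Input mapping context: a named table
# from physical keys to input actions. Names are illustrative, not Unreal's API.
class InputMappingContext:
    def __init__(self, name):
        self.name = name
        self.mappings = {}  # key name -> input action name

    def map_key(self, key, action):
        self.mappings[key] = action

    def action_for(self, key):
        # Returns the action bound to a key, or None if the key is unmapped.
        return self.mappings.get(key)

# Recreate the template's context, then add the new camera-switch mapping.
ctx = InputMappingContext("IMC_ThirdPerson")
ctx.map_key("W", "IA_Move")
ctx.map_key("SpaceBar", "IA_Jump")
ctx.map_key("C", "IA_SwitchCamera")  # the new mapping added in this step
```

The point of the indirection is that gameplay code listens for the action, not the key, so rebinding ‘C’ to something else later touches only this table.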
1st Technique:
The first technique is based on a YouTube tutorial by Gorka Games for switching between third-person and first-person cameras.[1] I wanted to implement it for two third-person camera modes instead. I start by opening the third-person controller blueprint and creating a new camera; let’s call the cameras Camera1 and Camera2. I add the node for the Input Action for switching the camera that I previously created and attach a FlipFlop node to it; this node alternates between output A and output B each time it receives input. To output A, I attach a SetActive node with Camera2 as the target and set it as active, then attach another SetActive node with Camera1 as the target, not set as active. I repeat the process for output B of the FlipFlop node, but this time set Camera1 as active instead of Camera2. While this lets me easily switch between the two camera modes in gameplay, the cut is instant rather than smooth, which breaks immersion, ruins the ‘ASMR-esque’ nature of the game, and would not go well with the soundtracks. That is why I went with the second technique, which ended up working well. Instead of deleting these nodes, I will reuse them later to develop a first-person camera.
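The FlipFlop + SetActive wiring can be sketched in a few lines of plain Python. This is a conceptual model with made-up names, not Unreal code; it just shows how each press of the mapped key alternates which camera is active:

```python
# Sketch of the FlipFlop + SetActive logic from the blueprint.
# Class and method names are hypothetical, not the Unreal API.
class CameraSwitcher:
    def __init__(self):
        self.active_camera = "Camera1"  # the default third-person camera
        self._flip = False              # FlipFlop's internal state

    def on_switch_pressed(self):
        # FlipFlop fires output A on the first press, B on the second, and so on.
        self._flip = not self._flip
        if self._flip:
            # Output A: SetActive(Camera2, active), SetActive(Camera1, inactive).
            self.active_camera = "Camera2"
        else:
            # Output B: SetActive(Camera1, active), SetActive(Camera2, inactive).
            self.active_camera = "Camera1"
        return self.active_camera
```

Because the switch is a single state flip, the view cuts instantly between the two cameras, which is exactly why this technique feels abrupt in gameplay.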
2nd Technique:
This technique is based on another tutorial by Gorka Games, on locomotion.[2] I take the existing Input Action node and FlipFlop node, add a Timeline node, and attach output A to Play and output B to Reverse from End. In the timeline itself, I create a new track, call it Cam Length, and add two keyframes: the first at time 0 with value 0, and the second at time 1 with value 1. This creates a smooth ramp from 0 to 1 over a period of one second. Back in the event graph, I add a Set Target Arm Length node to the Update output of the Timeline node. The original, default third-person target arm length is 400; the new distance the camera will sit from the player is 600. I add a Lerp node, set value A to 400 and value B to 600, attach the Cam Length track output of the timeline to the Lerp’s Alpha, and feed the Lerp’s result into Set Target Arm Length. This lets the camera travel smoothly between the 400 and 600 distances as per the keyframes of the track in the timeline.
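The math the Timeline and Lerp nodes perform each frame is simple enough to sketch directly. The function names below are mine, not Unreal’s; the values (400, 600, the one-second linear track) come from the setup above:

```python
def cam_length_track(t: float) -> float:
    """Sketch of the 'Cam Length' timeline track: keyframes (0, 0) and (1, 1)
    give a linear ramp from 0 to 1 over one second, clamped outside that range."""
    return max(0.0, min(1.0, t))

def lerp(a: float, b: float, alpha: float) -> float:
    """The Lerp node: blends a and b by alpha (0 -> a, 1 -> b)."""
    return a + (b - a) * alpha

def target_arm_length(t: float) -> float:
    """What Set Target Arm Length receives on each timeline Update tick."""
    return lerp(400.0, 600.0, cam_length_track(t))
```

At t = 0 the arm length is the default 400, at t = 0.5 it is 500, and at t = 1 it settles at 600. Playing the timeline in Reverse from End runs the same curve backwards, which is why the return trip is equally smooth.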
Replace mannequin with MetaHuman
It’s time to put a main character in there! I would love to work on a fully realized character customization system for this game. But before that, it’s time to add a character of my own to replace the default mannequin that comes with the Third person template in Unreal Engine 5.1.
I want to use Epic Games’ MetaHuman technology to create my first character. To do this, I followed a tutorial by the YouTuber Jobutsu.[3] MetaHumans are high-fidelity digital humans created using Unreal Engine’s MetaHuman Creator tool. They are designed to be realistic and expressive, with advanced facial animation and motion-capture support. To create your own MetaHuman, you can use the MetaHuman Creator tool,[4] which allows you to customize various aspects of the character, such as facial features, hair, clothing, and body shape. The tool offers a wide range of presets and sliders to choose from, and you can also import your own custom assets.
Once I created my MetaHuman, I added it to my project in Unreal Engine 5.1 using Quixel Bridge, where it automatically appears after creation. I open my MetaHuman blueprint and, inside Class Settings, change the parent class from Actor to the Third Person Blueprint. When I compile this, I get a bunch of errors.
I go to the first error, which tells me ‘Target must have a component’. Clicking on it takes me to the Set Update Animation graph, where the Get Children node is missing a Target; I add the mesh as the target and compile. The second and third errors tell me that usage of Get Skeletal Mesh is deprecated. Clicking on them takes me to the Enable Master Pose and Construction Script graphs, where, in both cases, I replace the Get Skeletal Mesh node with the Get Skeletal Mesh Asset node. Then I open the Third Person Blueprint, go to the event graph, and select everything except the Event BeginPlay node. I copy the selection and paste it into my MetaHuman blueprint’s event graph, where I connect the output of the Hair LODSetup node to the Cast To PlayerController node. Finally, I go to the MetaHuman Blueprint viewport and set the location and rotation to 0.
After that, I click on my mesh. In the details panel, I set Visible to off and Visibility Based Anim Tick Option to ‘Always Tick Pose and Refresh Bones’. Then, under the variables section, I find Live Retarget, select UseLiveRetargetMode, and switch it on under Default Value in the details panel.
Now the MetaHuman is successfully retargeted, but something uncanny is going on with the arms. To fix that, I open LiveRetargetSetup under Functions in my MetaHuman Blueprint. Here there is a node sitting between the Get and Branch nodes. I select it and click Browse, which opens a folder in the asset browser containing the MetaHuman Animation Blueprint. In this blueprint, under Functions, I click on AnimGraph, which takes me to the AnimGraph. Here I click on the Retarget Pose From Mesh node and, in the details panel under IKRetargeterAsset, click Browse. This highlights the RTG_Metahuman asset in the asset browser; I duplicate it, rename the copy, assign the duplicate as the IKRetargeterAsset, and hit compile. Then I open this duplicated RTG asset and see that the arms of the two meshes do not align with each other, which is causing the issue. Under details, I set the SourceIKRigAsset to IK_Mannequin, the Source Preview Mesh to SKM_Quinn, the TargetIKRigAsset to IK_Metahuman, and the Target Preview Mesh to the same body as my MetaHuman (in this case, m_med_nrw_body_preview). Next, under Chain Mapping, I click on Left Arm and, under details, change Blend To Source from 1 to 0. I repeat this for the Right Arm as well. I hit save, close the IK Retargeter, compile the Animation Blueprint, and close it. Now my MetaHuman is completely retargeted.
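Blend To Source can be understood as a per-chain linear blend between the source (mannequin) pose and the retargeted (MetaHuman) pose. The sketch below works under that assumption, with a deliberately simplified data layout (lists of joint values, not the IK Retargeter’s actual internals):

```python
def blend_chain(source_pose, target_pose, blend_to_source):
    """Blend per-joint values for one chain (e.g. the Left Arm).
    blend_to_source = 1.0 -> follow the source (mannequin) pose exactly;
    blend_to_source = 0.0 -> follow the retargeted (MetaHuman) pose exactly.
    Poses are simplified here to flat lists of floats (e.g. joint angles)."""
    return [
        tgt + (src - tgt) * blend_to_source
        for src, tgt in zip(source_pose, target_pose)
    ]
```

Setting Blend To Source to 0 for the arm chains, as above, makes the arms follow the MetaHuman’s own IK rig entirely instead of copying the mannequin’s proportions, which is what removes the misalignment.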
Now, to replace my third-person player: I go to the ThirdPerson folder in the asset browser and, inside its Blueprints folder, open the Third Person Game Mode blueprint. Here I change the DefaultPawnClass from the Third Person Blueprint to my MetaHuman Blueprint and hit compile. It’s finally done! Now when I hit play, I see my MetaHuman character as the main player instead of the default mannequin.
Turning on Collision for 3D Assets
Now that the landscape is fairly set up, it’s time to give things a physical sense by adding collision. Playing the game, I notice that the player character does not have any kind of physical interaction with the 3D assets: the player just passes through an object as if it does not exist. To turn on collision for a particular foliage object or 3D object, I begin by searching for and opening its static mesh from the asset browser. Then I click on Collision, in the top left area of the viewport, and choose the Auto Convex Collision option. This opens a Convex Decomposition widget in the bottom right of the screen. You can adjust the Hull Count, Max Hull Vertices and Hull Precision; these three values control how closely and how accurately the convex collision shape will capture the shape of your 3D asset. Once I set these values, I click Apply.
I close this window and then select all my foliage. Inside the details panel, I can see the root component and all its branches. I select all the branches, search the details for Collision Presets, and change it from No Collision to Block All. And just like that, collision is turned on for that particular 3D asset. Now when you play the game, your player character cannot pass through the object, and can even stand on it if it is level enough.
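Auto Convex Collision wraps the mesh in one or more convex hulls, and the three sliders trade accuracy against cost. As a toy illustration of the underlying idea, here is a standard 2D convex hull (Andrew’s monotone chain) in Python; Unreal’s actual decomposition works in 3D and can split the mesh into several hulls, but the principle of enclosing the geometry in a convex shape is the same:

```python
def cross(o, a, b):
    """Z-component of the cross product OA x OB (positive = left turn)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain: returns hull vertices in counter-clockwise
    order, starting from the lowest-leftmost point. Interior points (the ones
    a convex collision shape ignores) are dropped."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:  # build the lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):  # build the upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]
```

A hull like this is cheap to test the player capsule against, which is why collision uses a convex approximation rather than the full render mesh.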
Photogrammetry
Photogrammetry is a technique that uses photographs to create 3D models of objects, environments, and structures. The process involves taking multiple images of an object or scene from different angles and using specialized software to extract the 3D geometry from the images. Photogrammetry offers many benefits, including the ability to capture accurate and detailed models, the ability to work with large and complex environments, and the ability to easily update and modify models over time. With advancements in technology, photogrammetry has become more accessible, allowing more industries and individuals to take advantage of its capabilities.
In ‘Brane’, photogrammetry is being used to collect 3D scans of objects from around the world, which are then used to populate the virtual world of the game. Users can submit their own photogrammetry assets, which are added to the world, creating a diverse collection of objects from different cultures and places. The use of photogrammetry in this project allows for the preservation and sharing of objects that hold cultural significance. By creating a virtual world that includes objects from all over the world, ‘Brane’ aims to promote cross-cultural collaboration and appreciation. Users can search for and find objects in the world and add them to this landscape, which serves as both a collaboration and an alternate dimension for all these 3D assets. Photogrammetry plays a critical role in this project.
Scanning 3D Objects using Scaniverse
Scaniverse is a powerful app for 3D scanning that allows you to easily scan and export 3D objects as FBX files. Here are the steps to use the app to scan a 3D object and export it as an FBX.
First, open the Scaniverse app on your device. Next, select the option to “Scan an Object” from the main menu.
Position your object on a flat surface and ensure that it is well-lit. Then, use your device’s camera to capture images of the object from different angles. The app will guide you through the scanning process and show you which areas still need to be scanned.
Once you have captured enough images, the app will begin processing the data and create a 3D model of the object. After the processing is complete, you can refine the model by using the editing tools in the app.
Once you are satisfied with the model, select the “Export Model” option from the main menu. Choose the FBX file format and set the desired resolution and quality settings. Finally, save the file to your device or export it to a cloud service. With these steps, you can easily use Scaniverse to scan a 3D object and export it as an FBX file.
Setting up 3D photogrammetry scans for Unreal Engine 5.1
Apps like Scaniverse, Polycam, etc. usually do a very decent job of processing and stitching photographs together to create a 3D mesh. The mesh is good for viewing purposes, but when you import it into a game engine or 3D software, you start to notice cracks: it is not optimized for handling lighting effects, or the mesh is simply not stitched correctly. This can give rise to a lot of collision, lighting and texture issues when importing the asset into the game. The reason these 3D models are not processed correctly is that they are processed within the app itself, which is not meant to be a sophisticated 3D modelling and processing tool. I have a very easy fix for this.
I use Maya as a ‘middleman’ between the photogrammetry app and Unreal Engine. Maya is a robust 3D modelling and animation app, and it is good at processing and exporting 3D objects. We will use it to rebuild our 3D objects, which is very simple to do. You just have to import your 3D scan from your photogrammetry app in FBX format. After importing into Maya, you can crop out unwanted parts by viewing the faces, selecting the unwanted parts, and pressing delete. After making any required changes, use Export All as FBX, and make sure to select the ‘Embed Texture’ option before exporting. This is the FBX file you import into Unreal Engine 5.1. The resulting 3D model is built properly, does not glitch, and even handles collision. This is how I will be refining the 3D models before putting them in the game.
Send me your 3D Photogrammetry Files (FBX Format)!!!!
I am finally set up to accept photogrammetry files from users. While I currently do not have a method for users to directly upload 3D files into the project, I am working on it, because it is a natural upgrade to this project.
Until I figure that out, users can simply submit their 3D scans to me by dropping their FBX files into the Google Drive folder below. Drag and drop your files into the drive, and you will find them in the game.
DROP YOUR 3D FBX PHOTOGRAMMETRY FILES HERE AND SHARE THIS LINK WITH YOUR FRIENDS!
https://drive.google.com/drive/folders/10YXfbfQRLprdQfMcgh0sQ0s1U-rlR8ov?usp=share_link
REFERENCES
[1] Gorka Games. (2022, September 10). How to switch between third person and first person in Unreal Engine 5 - in 3 mins! YouTube. Retrieved April 1, 2023, from https://youtu.be/lMuinhr0SXU
[2] Gorka Games. (2023, February 21). Unreal Engine 5 RPG tutorial series - #2: Locomotion - blendspace, crouching and procedural leaning! YouTube. Retrieved April 1, 2023, from https://youtu.be/WcDj4uZygyE
[3] Jobutsu. (2022, December 5). [NEW] MetaHuman as ThirdPerson - Unreal Engine 5.1. YouTube. Retrieved April 1, 2023, from https://youtu.be/vuhquaUE5HI
[4] Epic Games. (n.d.). MetaHuman. Unreal Engine. Retrieved April 1, 2023, from https://www.unrealengine.com/en-US/metahuman