Categories
3D Computer Animation Fundamentals Immersion

Week 12: Project Submission

Final Showreel

Concept: For my temple cave project, I aimed to create a unique and mystical environment inspired by ancient cave temples to honour God Shiva’s form ‘Nataraja’. I tried to connect Nataraja, the God of Dance, to the environment through elements that symbolise the balance of nature: Earth through the lush greenery and rocky surfaces, Fire through glowing torches and ambient lighting, Air through dynamic lighting and shadows, and Water through subtle streams and droplets. I added an animated man dancing, which not only represents the cosmic balance found in the Nataraja Murti but also creates an immersive experience reflecting themes of creation, destruction, and spiritual transformation.

Design Proposal

In my design proposal, I have explained my previous project and the concepts behind it. For my upcoming project, I’m shifting focus to high-speed car chases and car culture. This video offers a preview of what’s to come, where I’ll be working on a car rig, character animation, and creating a dynamic car meet scene with NPCs. I’m excited to explore this new direction and push my skills further as I develop this project.

Week 10 & 11: Project Final Steps & Polishing

After the base environment was done and everything was set up, I moved on to the foliage. I also decided on my final character for the animation.

For the character, I chose an Indian-looking male character from Sketchfab to fit the theme, added all the textures, and defined the rig in HumanIK in Maya so I could apply motion capture animation to it.

I then exported the character to Unreal Engine along with the rigs and the textures to start with the animation process.

After adding the character and testing out the animations, I added foliage both inside and outside the cave to bring the element of ‘Earth’ into my scene.

I proceeded to add lighting to the scene for those finishing touches. I added a variety of spot lights and point lights (for example, above the fire and between the trees) to light up the darker areas where the sun cast heavy shadows.

I proceeded to add my cameras to the level sequencer and keyframe the locations and rotations. Then I created a Master Shot and added all the shots together to create a cohesive video.

With this, I had a proper sequence ready to render, and I finalised my video by rendering it in 4K resolution with the settings adjusted accordingly.

Week 8 & 9: Term Project Progress

Next, I proceeded to add the element of fire that fit my narrative and played a little with the glow and smoke settings to get the desired effect in my scene. I changed the emissive colour and fade settings according to my liking and played around with a few other parameters as well.

Once I had the fire set up, I started setting up the dynamic sky and atmosphere in the environment and played around with the location of the sun to get the God Rays effect through the cracks in the rock.

After that, I added my second desired element – Water. Adding an ocean from the Place Actors panel made the bridge connect the entrance to the Murti, with water on both sides.

The water, however, did not look right as it had a greenish tint, so I changed a few settings, including absorption, and fixed the water body.

I then moved on to the outside area of the cave to build the entrance part.

Week 7: Project Initial Review

This week, we had to show Serra our progress with the term project and get a review on what we had done so far.

My Concept:

A temple inside the caves holding the Nataraja Murti, the God of Dance, with dance animations that symbolise cosmic movement and energy. The four elements of nature – Earth, Fire, Water and Air – are symbolised through foliage, dynamic lighting, and other such actors.

I started by designing the temple structure, making sure it looked traditional and old. I added multiple stone blocks and pavements to see what fit the scene best.

I started adding boulders and rocks to form the cave structure around the statue and added the side wall assets that were taken from Fab and CG Trader.

I then added railings to the side of the main path turning it into a bridge.

In my review, Serra told me to finalise the shots with a Cine Camera Actor first to see how long the video would be, which would help me decide on the animation for my male character and set a correct timeline. She also told me to watch a few reference dance videos and films to help with the shots.

Week 6: Control Rigs

This week we looked at Control Rigs in Unreal Engine 5 where we started by adding a Control Rig in the content browser via the Animation section. By creating a Modular Rig, we added the Unreal default mannequin’s skeletal mesh into the blueprint.

We dragged and dropped the rig module into the sockets available in the skeletal mesh and added everything to the character.

We also looked at the module settings to change the size or colours of the controllers according to our preferences.

Then we moved on to the second rigging method, which we learnt using an octopus model. We started by editing the skeleton and set the root bone transform to 0. We deleted the existing arm bones, leaving the shoulders as they were. Then we added joints from each shoulder by clicking along the arms, adding as many bones as we wanted, and proceeded to bind the skin.

We then added the control rig samples pack from Fab into our projects. Then we created a regular control rig for the octopus and imported the octopus in the rig hierarchy to look at chain hierarchy.

We then selected the Bone > Right Click > New Control to create a controller for the shoulder and the last joint of the arm. To ensure it was not a child, we pressed Shift + P on the controller. We then created a Parent Constraint and assigned the root bone in the name section. Then we parented the root bone control and attached it to the bone. This ensured that the controller could transform the bone it was assigned to. We then did the same for all the shoulders.
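The parenting chain described here comes down to transform composition: a bone’s world transform is its controller’s world transform multiplied by the bone’s local offset, which is why moving the control moves the bone. A small NumPy sketch of that relationship (hypothetical offsets, not Unreal’s actual rig API):

```python
import numpy as np

def translation(x, y, z):
    """Build a 4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

def world_transform(parent_world, local):
    """A child's world transform is its parent's world transform
    composed with the child's local offset."""
    return parent_world @ local

# Hypothetical offsets: the control sits 1 unit along X,
# and the bone is offset 2 more units along X from the control.
control_world = translation(1.0, 0.0, 0.0)
bone_local = translation(2.0, 0.0, 0.0)

bone_world = world_transform(control_world, bone_local)
# Moving the control moves the bone with it: the bone ends up at x = 3.
```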

Then we added SpringInterp and turned it into a vector. We dragged and dropped the last bone of the arm into the graph and clicked on get bone. We connected the translation to the target. We then dragged the control of the same bone and clicked on set control and joined translation to the result.
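Under the hood, a spring-interpolate node is essentially a damped spring integrator: each step it accelerates the current value toward the target and damps the velocity, which is what gives the arm its soft follow-through. A minimal NumPy sketch of that idea (illustrative stiffness and damping values, not Unreal’s exact implementation):

```python
import numpy as np

def spring_interp(current, velocity, target, stiffness, damping, dt):
    """One integration step of a damped spring pulling `current`
    toward `target`; returns the new value and velocity."""
    accel = stiffness * (target - current) - damping * velocity
    velocity = velocity + accel * dt
    current = current + velocity * dt
    return current, velocity

# Hypothetical setup: the arm tip starts at the origin and should
# settle 10 units up, with a slightly bouncy follow-through.
pos = np.zeros(3)
vel = np.zeros(3)
target = np.array([0.0, 0.0, 10.0])

for _ in range(2000):  # 20 seconds of simulation at dt = 0.01
    pos, vel = spring_interp(pos, vel, target,
                             stiffness=40.0, damping=12.0, dt=0.01)
# The tip settles on the target once the spring energy damps out.
```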

Then we created an Aim Constraint and defined the name and the target of the arm in the node from where the aim starts and where the aim ends. We set the Aim Control’s parent type to Control and attached the execute node from the Parent Constraint to the Set Transform’s execute. We then connected that to the Aim Constraint.
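Mathematically, an aim constraint rotates a bone so its forward axis points along the source-to-target direction. A rough Python sketch of the underlying look-at math, simplified to yaw and pitch (illustrative only, not UE’s actual node):

```python
import numpy as np

def aim_direction(source, target):
    """Unit vector the aim constraint aligns the bone's forward axis with."""
    d = np.asarray(target, dtype=float) - np.asarray(source, dtype=float)
    return d / np.linalg.norm(d)

def aim_angles(source, target):
    """Yaw and pitch (radians) that point the +X axis from source to target."""
    dx, dy, dz = aim_direction(source, target)
    yaw = np.arctan2(dy, dx)
    pitch = np.arcsin(dz)
    return yaw, pitch

# Hypothetical points: aim from the origin at a target 45 degrees
# to the side in the ground plane.
yaw, pitch = aim_angles([0.0, 0.0, 0.0], [1.0, 1.0, 0.0])
# yaw is pi/4 and pitch is 0: the bone turns but does not tilt.
```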

Then we tried rigging a human character using everything we had learnt so far in class. This exercise allowed us to better understand how characters are prepared for animation, emphasising the importance of proper joint placement and control creation.

Week 5: UE Physics

Physics in Animation

  • Explored how physics adds realism to animations, especially in scenes where objects are falling or impacted.
  • Learned that applying physics makes objects react naturally to forces, adding immersion.

Exploring Unreal Engine Modes

  • We usually work in Selection Mode, but this week we explored additional modes:
    • Landscape Mode
    • Foliage Mode
    • Fracture Mode – our main focus, used to apply fractures and destruction effects to meshes.

Modeling and Fracturing Meshes

  1. Started with Modeling Mode to create and place basic meshes in the scene.
  2. Switched to Fracture Mode to break down meshes in various ways:
    • Uniform Fracture – breaks down a mesh evenly.
    • Cluster Fracture – creates smaller, clustered fragments, ideal for explosive effects.
    • Radial Fracture – creates a point-of-impact effect, like a gunshot.
    • Plane Fracture – simulates slicing, like a sword cutting through.
    • Custom Fracture – allows for user-defined break patterns.

Fracturing a Cube Step-by-Step

  1. Selected a cube mesh to fracture.
  2. Generated a fracture asset:
    • Went to Generate > Save in a folder > Generate Fracture.
  3. Enabled physics for realistic impact:
    • In the Details Panel, enabled Simulate Physics so the mesh responds to gravity and collisions.
  4. Applied Uniform Fracture mode:
    • Adjusted the fracture levels to control the number of broken parts.
    • Observed how one cube fractured into 20 separate pieces.

Advanced Fracture and Impact Control

  • Further fractured pre-existing fractured parts to create additional detail.
  • Adjusted impact strength:
    • Lowered the Damage Threshold in the Details Panel to increase impact sensitivity.
    • Turned off bone colours by unchecking Show Bone Colours in the Details Panel.
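The Damage Threshold tweak above can be read as a simple comparison: a piece only separates when the incoming impact exceeds its threshold. A hypothetical sketch of that gating logic (made-up impact values, not Chaos’s actual code):

```python
def surviving_pieces(impacts, damage_threshold):
    """Pieces hold together while the incoming impact stays at or
    below the damage threshold; anything stronger breaks them off.
    Lowering the threshold therefore increases impact sensitivity."""
    return [i for i in impacts if i <= damage_threshold]

impacts = [50.0, 120.0, 300.0, 80.0]  # hypothetical impact strengths
high = surviving_pieces(impacts, 250.0)  # high threshold: most pieces survive
low = surviving_pieces(impacts, 60.0)    # low threshold: almost all shatter
```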

Nanite for Performance Optimisation

  • Switched to Nanite for performance when using heavy physics scenes:
    • In Chaos Physics settings, located the collection, enabled Nanite in the Content Browser, making physics processing more CPU-efficient.

Adding Material Properties to Enhance Physics

  • Applied bounce and material properties to objects:
    • In Details Panel > Collision > Physical Material Override, adjusted friction, density, and strength.

Scene Recording Setup

  • Enabled Engine Settings to access built-in blueprints:
    • Activated by ticking Show Engine Content in Content Browser Settings.
    • Located Anchor and FS Master Field in the content browser, then copied them to a custom physics folder.
  • Set up the scene for recording:
    • Dragged Anchor and Master Field into the scene.
    • Added an Initialization Field and created a Chaos Solver to simulate physics interactions.
    • Added objects to the Chaos Cache Manager for precise recording.

Using the Sequencer to Capture Scenes

  1. Added objects to the Sequencer and set keyframes for start and end points.
  2. Worked with Constraint Actors to apply physics constraints:
    • Added two cubes to test the constraint as an anchor point.
    • Observed how one fractured cube reacted to constraints, swinging around the anchor point.

Alternative Recording with Take Recorder

  • Enabled Take Recorder in Plugins and added actors for recording:
    • Started simulation with Alt + S to capture real-time physics effects.
    • Viewed recorded sequences in the Content Browser after recording.

Creating a Bouncing Ball

  1. Created a sphere and enabled Simulate Physics in Details Panel.
  2. Applied a custom physical material for bounce:
    • Created a blueprint class called BP_Ball and added a static mesh.
    • Created a physical material with customised friction and density settings.
  3. Applied the physical material to the sphere to achieve the desired bounce.
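The bounce a physical material produces comes down to a restitution-style coefficient: each impact scales the rebound speed, and since peak height is proportional to speed squared, every bounce peak scales by the coefficient squared. A small sketch of that relationship (hypothetical values, not Chaos’s exact solver):

```python
def bounce_peaks(drop_height, restitution, bounces):
    """Peak height after each bounce: every impact scales the rebound
    speed by the restitution coefficient, so each peak height scales
    by restitution squared."""
    peaks = []
    height = drop_height
    for _ in range(bounces):
        height *= restitution ** 2
        peaks.append(height)
    return peaks

# Hypothetical material: a ball dropped from 2 m with restitution 0.8.
peaks = bounce_peaks(2.0, 0.8, 3)
# Each bounce reaches 64% of the previous peak (about 1.28, 0.82, 0.52 m).
```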

This week’s exploration provided a solid understanding of fractures, physics settings, and recording techniques in Unreal Engine. These tools allow us to bring more realism and dynamic effects to our animations.

Week 4: Sequencer and Materials

In Week 4, I continued to use the Sequencer in Unreal Engine and started working on creating Materials.

I learned how to set up Shots in the Sequencer, which is key for staying organized when using several cameras. To create a Shot, you first add a Subsequence track by clicking the Add button in the Sequencer menu. This Subsequence Track helps keep track of and manage the different cameras used for various tasks.

I learned how to create Shots in Unreal Engine’s Sequencer:

  • To create Shots, generate a Level Sequence from the Cinematics menu.
  • Ensure proper naming for better organisation.
  • Add these sequences to the Subsequence Track, where you can adjust their lengths.
  • To assign cameras to Shots:
    • Select the desired camera.
    • Press Ctrl + X to cut the camera.
    • Double-click on the chosen Shot to paste the camera into it.

This week, I also learned how to create Materials in Unreal Engine. I started by downloading a Material from Quixel Bridge and importing it into Unreal Engine. Then, I created a new Material and used the Material’s node editor to add the following maps from the imported Material:

  • R Channel: Ambient Occlusion
  • G Channel: Roughness
  • B Channel: Displacement

I connected the maps as follows:

  • The RGB channel of the Colour Map to the Base Colour
  • The G channel of the Roughness Map to the Roughness input
  • The RGB channel of the Normal Map to the Normal input of the new Material
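The channel packing above can be pictured as slicing one RGB array into three grayscale masks. A tiny NumPy sketch (made-up pixel values) of how the packed texture maps onto the material inputs:

```python
import numpy as np

# A tiny stand-in for the packed texture: each pixel stores
# R = ambient occlusion, G = roughness, B = displacement
# (hypothetical values; shape is height x width x channels).
packed = np.array([[[0.9, 0.5, 0.1],
                    [0.8, 0.6, 0.2]]], dtype=np.float32)

ao           = packed[..., 0]  # R channel -> Ambient Occlusion input
roughness    = packed[..., 1]  # G channel -> Roughness input
displacement = packed[..., 2]  # B channel -> Displacement input
```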

Afterward, I started experimenting with the tiling of the Material. To do this, I incorporated several nodes, including Texture Coordinate, Multiply, and Add. By adding these nodes, I was able to adjust the tiling according to different requirements. I played around with different values to see how each adjustment impacted the appearance of the Material. This exploration not only helped me understand how tiling works but also allowed me to visualise a range of outcomes, giving me greater insight into the creative possibilities within materials in UE.
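The Texture Coordinate, Multiply, and Add chain reduces to scaling and offsetting the UVs. A one-function Python sketch of that math (hypothetical tiling values):

```python
def tile_uv(u, v, tiling, offset=(0.0, 0.0)):
    """What the Texture Coordinate -> Multiply -> Add chain computes:
    scale the UVs so the texture repeats, then shift them."""
    return u * tiling + offset[0], v * tiling + offset[1]

# Hypothetical values: repeat the texture 4 times across a mesh
# whose UVs run from 0 to 1.
u2, v2 = tile_uv(0.5, 0.25, tiling=4.0)
# (0.5, 0.25) maps to (2.0, 1.0), i.e. the texture tiles four times.
```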

Once I created a Material, I was guided to generate a Material Instance. This is done by right-clicking on the Material I created and selecting the Material Instance option from the menu. The primary distinction between a Master Material and a Material Instance is that the Material Instance inherits all the properties from the Master Material. 

This inheritance enables real-time updates and adjustments, allowing for more flexibility and efficiency in the material design process. By using Material Instances, I can tweak parameters without altering the original Master Material, making it easier to experiment with different looks while maintaining a consistent base. In conclusion, this week was very informative as I enhanced my skills in material creation and manipulation in UE.

Week 3: Virtual Production Sequencer

This week, I explored the Unreal Engine 5 Sequencer, learning how to apply traditional film-making techniques to Unreal. I focused on the key differences between regular film production and how things work in Unreal.

Film Techniques:

  • Master Scene (Master Shot): This means filming an entire scene in one continuous shot from a wide angle. It shows all the actors and the set, helping establish the action and relationships between characters.
  • Coverage Cameras: After the master shot, additional shots are taken from different angles, like close-ups or over-the-shoulder shots. These highlight emotions and details and help create a smooth final scene during editing.
  • Linear vs. Non-Linear Storytelling:
    • Linear: The story goes in a straight line from beginning to end.
    • Non-Linear: The story can jump around in time, using flashbacks or different sequences for a more interesting narrative.
  • Triple Take Technique: This involves filming three different versions of a shot, with slight changes in performance or camera angles. It gives editors more options to choose from later.
  • Overlapping Action: This technique makes movements feel more natural by staggering actions. For example, when a character turns, their body parts move at slightly different times.
  • Hitting Marks: Actors have specific spots where they need to stand or move during a scene to get the best camera angles and lighting.
  • Film Production Roles:
    • Gaffer: The person in charge of lighting.
    • Grips: Technicians who set up equipment.
    • Production Manager: Manages schedules and budgets.
    • Director of Photography (DP): Decides how the film looks, including camera angles and lighting.

Unreal Engine Techniques:

  • Sequence-Based Linear Workflow:
    • One Level Sequence: Organizes a scene as a single timeline, moving from start to finish without changes.
    • Multi-Camera Setup: Uses multiple cameras to capture different angles of the scene at once.
    • Single Camera Cuts Track: Films each shot separately and then combines them in editing.
  • Shot-Based Non-Linear Workflow:
    • Nested Level Sequences: Smaller parts of a project that can be worked on separately and then combined later.
    • Take System: Helps manage different versions of a shot, making it easier to find the best one.
    • Sub-Scene Tracks: Allows editing specific parts of a scene, like sound or animation, without changing everything.
  • Collaborative Workflow:
    • Sub-Levels: Sections of a project that different artists can work on independently.
    • Sub-Scene Tracks: Focus on specific elements, letting artists work on their parts without affecting others.
    • Visibility Tracks: Control what elements are visible during editing, allowing focus on certain aspects.

Workflow Comparison:

  • Linear Workflow: A simple process where everything is done in order, commonly used in traditional filmmaking.
  • Non-Linear Workflow: More flexible, allowing edits to be rearranged or done out of order. This is helpful for animation and VFX projects, enabling multiple artists to work together.

Both workflows are important for big projects, especially when teams need to work together on some parts of the project.

After exploring the differences between traditional film production and Unreal Engine, we started working with the Sequencer. I learned that the Sequencer is Unreal Engine’s Non-Linear Editing Tool, which includes:

  • Ground-up Shot Creation: This feature lets creators build individual shots from scratch in Unreal, giving full control over camera angles, lighting, and scene layout.
  • Pre-visualisation: A tool to create rough versions of scenes before full production. It helps visualise how the final scene will look and assists in planning.
  • Full Film Creation: Unreal Engine can be used to create entire films, from pre-production to final rendering, providing a virtual environment for production.
  • Game Cinematic Creation: The Sequencer is also used to create cinematic sequences for games, helping to develop narrative-driven cutscenes or trailers with high-quality visuals.

This versatility makes Unreal Engine valuable for both the film and game industries.

I learned that in traditional film, the narrative is usually structured into sequences, often following a three-act format. In contrast, Unreal Engine organises the story using a Level Sequence, which builds the entire narrative using Nested Level Sequences.

We also covered two new terms: Possessable and Spawnable actors. A Possessable actor already exists in the level, and the Sequencer simply takes control of (“possesses”) it, so it is always present in the scene. A Spawnable actor, by contrast, is owned by the Level Sequence itself and is spawned into the scene only when the sequence needs it.

Afterward, we worked on a sample project called DMX Previs Sample. In this project, we learned how to create new cameras in the scene and animate their movements. This experience helped me understand the Sequencer better and how to add cameras and other objects to keyframe and animate them.

We moved on to create Spawnable Actors by adding an actor to the Sequencer. To do this, I right-clicked on the object I wanted to convert and selected Create Spawnable. This process ensures that the object is always accessible in the Sequencer when we need to render the scene.

We created a Level Sequence and opened the Sequencer to add a camera to the DMX Previs Sample scene. After adding the camera, I keyframed the focus property from frame 1 to frame 100 to create a simple animation.
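Between two keys, the Sequencer interpolates the keyed property; with linear tangents that is a straight lerp over the frame range. A small Python sketch (hypothetical focus values) of what a focus animation keyed on frames 1 and 100 evaluates to in between:

```python
def lerp_keyframes(frame, key0, value0, key1, value1):
    """Linear interpolation between two keyframes, which is what the
    Sequencer does between keys that have linear tangents."""
    t = (frame - key0) / (key1 - key0)
    t = max(0.0, min(1.0, t))  # hold the end values outside the key range
    return value0 + (value1 - value0) * t

# Hypothetical focus distances: 1000 on frame 1 easing to 200 on frame 100.
focus_mid = lerp_keyframes(50, 1, 1000.0, 100, 200.0)
```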

We concluded the lecture by experimenting with camera settings and movements to develop different camera animations. I added shots using the camera cut feature, which helped me enhance my understanding of cameras in Unreal Engine while learning to use the Sequencer effectively.

Week 2: Introduction to World Building

Introduction & Basics of World Building in Unreal Engine 5

This week, I delved deeper into world building in Unreal Engine 5 and gained a greater understanding of its powerful tools that allow us to create bigger, more detailed environments with high efficiency.

World Design and Workflow Efficiency

Unreal Engine 5 offers a seamless world-building workflow, making it easier to design vast environments. The introduction of world partition and level instances helps manage large-scale worlds by optimizing the level of detail and memory usage. It simplifies handling of bigger worlds and complex stories, making the process more streamlined. The Windows tab in Unreal Engine is a central part of the interface that allows us to customize and manage our workspace. It provides access to different windows and panels that help streamline the workflow, making it easier to navigate, control, and edit various aspects of the project.

The Content Browser in Unreal Engine is an essential tool for managing all project assets, providing a centralized hub for organizing, importing, and manipulating various content types like models, textures, animations, and blueprints. It features drag-and-drop functionality for easy asset placement, detailed previews for quick assessments, and robust search capabilities to swiftly locate specific items. We can create folders for organization, apply filters and tags for efficient navigation, and edit asset properties directly within the browser.

Lighting and Environment Improvements

Lighting plays a crucial role in creating realistic environments. I explored Unreal’s enhanced lighting tools, including the Environmental Light Mixer, Directional Light, Sky Atmosphere, Sky Light, Exponential Height Fog, and Volumetric Clouds. These tools work together to provide dynamic, real-time updates to lighting, giving immediate feedback and helping us craft realistic, immersive environments.

Quixel Bridge

The updated Quixel Bridge simplifies asset importing, making high-quality models, materials, and MetaHumans easily accessible. The drag-and-drop functionality speeds up workflow, and assets are now optimized for virtual textures, ensuring high fidelity without performance compromises. It’s incredibly useful for large-scale projects, like the environments I will be working on. We were also made aware that Quixel is moving to Fab, which will host all new downloads and Megascans.

Another advantage is that Quixel Bridge assets are now available directly within Unreal Engine’s Content Browser. This eliminates the need to switch between multiple applications or manually import files, which keeps the workflow fluid and focused. You can also make adjustments to assets right in the content browser, making it much easier to tweak and refine them in real-time.

Modeling and UV Tools

The new modeling and UV tools allow for in-engine mesh creation and editing, removing the need for external software. I can now create and modify meshes directly within Unreal Engine, which speeds up the workflow and makes last-minute adjustments much more manageable. From creating new meshes to reviewing and editing them, these tools are a game-changer for asset management.

Nanite Virtualized Geometry

Unreal’s Nanite Virtualized Geometry is another powerful tool I learned about, allowing us to handle millions of polygons without performance loss. Nanite automatically clusters polygons, optimizing them with a single draw call, while still maintaining the high-quality visuals we need. This opens up the potential for highly detailed environments without the usual performance constraints.

Lumen: Real-Time Global Illumination and Reflections

The Lumen system provides real-time global illumination and reflections, which brings unprecedented realism to scenes. It reacts dynamically to changes in the environment, ensuring that lighting and reflections adapt in real-time. Lumen’s ability to work with hardware ray tracing and distance fields adds depth and realism to lighting setups.

Virtual Shadow Maps

Finally, we explored Virtual Shadow Maps, which offer cinematic-quality dynamic shadows in real time. These shadows can be rendered from objects both near and far, with unlimited resolution. Virtual Shadow Maps replace older shadowing methods, providing a more optimal solution for projects using Nanite and Lumen.

Overall, this week’s learnings have equipped me with essential tools to create high-quality, detailed worlds while maintaining performance. I’m excited to apply these concepts in my upcoming projects!

Week 1: Introduction to Unreal Engine

Introduction to Unreal Engine 5.4.4

In week 1 of getting introduced to Unreal Engine, we began with an overview of how the software can be used to create larger and more detailed worlds. The streamlined workflow helps the engine work faster and enables an efficient and seamless process.

The key features include:

  1. Improved asset creation tools inside the engine offering more precision and flexibility 
  2. A robust rendering system ensuring quality without sacrificing performance 
  3. Dynamic GI and reflections

Installing Unreal Engine 

We were told to create an Epic Games account to get started with the software and to access Unreal Engine. After setting up the account, we were told to download Unreal Engine 5.4.4, the latest version at the time, so that we would have the software ready for Week 2.

In this term, we will be working towards mastering the fundamentals of 3D computer animation, focusing on integrating animations and environments in innovative and experimental ways. Our aim is to explore the virtual production pipeline using tools like Unreal Engine 5, Maya, and Premiere Pro to bring our creative ideas to life.

Key goals for Term 1 include:

  • Portfolio Development: We will have to build a portfolio that demonstrates our technical skills and creativity in 3D animation, incorporating both still images and moving videos.
  • Creative Exploration: We’ll be encouraged to push boundaries by experimenting with the deconstruction and reconstruction of environments, playing with proportions, and exploring materiality to evoke different emotions and narratives.
  • Animation Techniques: We will work on creating a showreel (30-120 seconds) as the final submission for Term 1 to highlight our animation techniques, focusing on movement qualities such as linear action, scale, tension, and force in our environments.
  • Research and Reflection: Our journey will be documented through a blog, where we will reflect weekly on our research, design decisions, and technical challenges. This blog will track our progress week by week, serving as a summary of our project development and its connections to broader social and cultural themes.
  • Presentations and Design Proposals: We will also have to create a 5-minute recorded presentation along with the final video to showcase our design concepts, storyboards, and methods, presenting our research and creative outcomes while challenging traditional approaches to 3D animation.