Categories
3D Computer Animation Fundamentals Immersion

Week 6: Control Rigs

This week we looked at Control Rigs in Unreal Engine 5. We started by adding a Control Rig asset in the Content Browser via the Animation section, then created a Modular Rig and added the Unreal default mannequin’s skeletal mesh into the blueprint.

We dragged and dropped the rig modules into the sockets available on the skeletal mesh and added everything to the character.

We also looked at the module settings to change the size or colours of the controllers according to our preferences.

Then we moved on to the second rigging method, which we learnt using an octopus model. We started by editing the skeleton and setting the root bone transform to 0. We deleted the existing arm bones and left the shoulders as they were. Then we added joints from each shoulder by clicking along the arms, adding as many bones as we wanted, and proceeded to bind the skin.

We then added the Control Rig samples pack from Fab into our projects. Then we created a regular Control Rig for the octopus and imported the octopus into the rig hierarchy to examine the bone chain.

We then selected the bone and chose Right Click > New Control to create a controller for the shoulder and for the last joint of the arm. To ensure a controller was not a child of another, we pressed Shift + P on it. We then created a Parent Constraint and assigned the root bone in its name section. Then we parented the root bone control and attached it to the bone, which ensured that the controller could transform the bone assigned to it. We then did the same for all the shoulders.
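To get a feel for what the Parent Constraint is doing under the hood, here is a small Python sketch of my own — not Unreal’s actual node code — showing how a bone’s world transform follows its control. Transforms are simplified to (x, y, yaw) on a 2D floor plan to keep the illustration short:

```python
import math

def parent_constrain(parent, child_offset):
    """World transform of a bone constrained to a control.

    The bone's stored local offset is rotated by the control's yaw and
    added to the control's position -- the core of what a Parent
    Constraint evaluates every frame.
    """
    px, py, pyaw = parent
    ox, oy, oyaw = child_offset
    c, s = math.cos(pyaw), math.sin(pyaw)
    return (px + c * ox - s * oy, py + s * ox + c * oy, pyaw + oyaw)

# A shoulder control at (2, 0) rotated 90 degrees carries a bone whose
# local offset is one unit along the control's x axis.
bone = parent_constrain((2.0, 0.0, math.pi / 2), (1.0, 0.0, 0.0))
```

Because the offset is re-evaluated against the control every frame, moving or rotating the control carries the bone with it, which is exactly the behaviour we saw in the viewport.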

Then we added a SpringInterp node and set its type to vector. We dragged and dropped the last bone of the arm into the graph and chose Get Bone, then connected its translation to the node’s Target. We then dragged in the control of the same bone, chose Set Control, and joined its translation to the Result.
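The SpringInterp behaviour can be sketched as a damped spring stepped once per frame. This is only my illustration of the idea, assuming a simple semi-implicit Euler step; the node’s real implementation may differ:

```python
def spring_interp(current, velocity, target, stiffness, damping, dt):
    """One semi-implicit Euler step of a damped spring toward a target.

    The value accelerates toward the target and lags behind it, which is
    what gives the arm tip its floppy follow-through.  Applied to a
    vector, the same step runs once per component.
    """
    accel = stiffness * (target - current) - damping * velocity
    velocity += accel * dt
    current += velocity * dt
    return current, velocity

# Drive a value from 0 toward 1 for ten seconds at 60 fps.
value, vel = 0.0, 0.0
for _ in range(600):
    value, vel = spring_interp(value, vel, 1.0,
                               stiffness=80.0, damping=12.0, dt=1.0 / 60.0)
```

With the damping below critical, the value overshoots slightly before settling, which is why the arm tips trail and bounce behind the shoulder controls.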

Then we created an Aim Constraint and defined in the node the bone where the aim starts and the target it aims at. We set the Aim Control’s parent type to Control and attached the execute pin from the Parent Constraint to the Set Transform’s execute, then connected that to the Aim Constraint.
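Conceptually, an Aim Constraint rotates a bone so its forward axis points at a target. Here is a small Python sketch of my own (not the node’s actual code) of the direction it aims along and the error angle it drives to zero:

```python
import math

def aim_direction(source, target):
    """Unit vector from the bone toward the aim target."""
    d = [t - s for s, t in zip(source, target)]
    length = math.sqrt(sum(c * c for c in d))
    return [c / length for c in d]

def aim_error(forward, source, target):
    """Angle in radians between the bone's current forward axis and the
    direction to the target -- the error an Aim Constraint rotates away."""
    d = aim_direction(source, target)
    dot = max(-1.0, min(1.0, sum(f * c for f, c in zip(forward, d))))
    return math.acos(dot)

# A bone at the origin pointing along +X, aiming at a target on +Y,
# is 90 degrees away from its goal.
error = aim_error([1.0, 0.0, 0.0], (0.0, 0.0, 0.0), (0.0, 2.0, 0.0))
```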

Then we tried rigging a human character using everything we had learnt in class so far. Working through all the nodes helped us better understand how characters are prepared for animation, emphasising the importance of proper joint placement and control creation.

Categories
3D Computer Animation Fundamentals Animation

Week 7: Walk Cycle Blocking

This week we moved on to learning how to block out a walk cycle using the same character, ‘Walker.’ We were told the main key poses needed for a walk cycle:

  • Contact Pose
  • Down Pose
  • Passing Pose
  • Up Pose

These poses form the foundation of the walk, and we were told to create a simple 24-frame walk cycle from left to right and also from the front view before extending it to 48 frames. For reference, I also used “The Animator’s Survival Kit” by Richard Williams, which helped me understand the movement better later on.

I started by referencing the Walker into my scene. For this assignment, we also played with the lighting by adding two directional lights to make the scene look better. I then placed the Contact Poses at frames 0, 12, and 24, and added the Passing, Down, and Up Poses, spacing them evenly with 3 frames between each. This structure created the rhythm of the walk cycle. It then became easy to replicate these initial 24 frames and change the values to create the opposite step, leaving me with a basic 48-frame walk cycle.
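The pose layout above can be written out as a quick frame table. This is just my own Python illustration of the spacing, assuming a 12-frame stride and the Contact → Down → Passing → Up order we were taught:

```python
def walk_cycle_frames(stride_frames=12):
    """Frame numbers for the key poses of a 24-frame walk, laid out the
    way I blocked it: Contacts at 0, 12 and 24, with the Down, Passing
    and Up breakdowns spaced 3 frames apart in between."""
    poses = ["Contact", "Down", "Passing", "Up"]
    spacing = stride_frames // len(poses)      # 3 frames between poses
    cycle = {}
    for step in range(2):                      # left step, then right step
        for i, pose in enumerate(poses):
            cycle[step * stride_frames + i * spacing] = pose
    cycle[2 * stride_frames] = "Contact"       # frame 24 closes the loop
    return cycle

cycle = walk_cycle_frames()
```

Mirroring the same table over frames 24–48 with opposite limb values gives the full 48-frame cycle.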

After laying out the basic poses, I went further and added some character to the animation. I used the front view of the character and applied the principles of weight shift to make the walk feel more realistic by moving it along the x axis on the leg that it shifts its weight on. I also incorporated a slight rotation with the weight shift on the sides to enhance the fluidity of the movement. Additionally, I brought the feet a little closer together and turned them slightly outward to make the walk appear more realistic. These subtle adjustments helped the cycle feel more grounded and natural.

Categories
3D Computer Animation Fundamentals Animation

Week 6: Weight Shift Spline

This week, I worked on refining my weight shift and converting it to spline, focusing on the feedback and review from the previous session.

I started by revisiting the blockout, where George had provided feedback on the weight shift, certain leg positions, and foot rotations. After making the necessary changes based on his suggestions, I converted the animation to spline and focused on editing the graphs to create smoother transitions. The goal was to eliminate any sharp movements and ensure that the transitions between key poses felt natural and the weight shift felt realistic.

Looking ahead to next week, I’ll continue refining the cycle in spline by adding more details and editing the graphs as needed.

Categories
3D Computer Animation Fundamentals Animation

Week 5: Weight Shift Blocking

We continued working in Maya by setting up our projects and using the “Walker” model as a reference in my file. The first step was to create a selection set of the main controls we needed for keyframing and add it to the shelf in the animation tab.

Our main task this week was to work only in the blocking phase. We started by planning out our poses and shot our own reference videos for accuracy. I chose a simple pose where the character steps from the left to the right. For the first frame, I posed the Walker leaning on its left hip, which helped me understand how the weight shifts left before moving right and then shifts again during the step. I used this principle to animate the weight shift.

To add realism, we adjusted the foot brake value to 100 and tweaked the foot roll and heel roll settings to perfect the Walker’s pose as needed.

Initially, I had some challenges with joint positioning and determining how much to bend the Walker in the anticipation pose. However, after reviewing my reference video and getting feedback from George on my blocking, I was able to improve the pose. We were also instructed to convert our blocking to splines the following week. 

We were also asked to create three poses using references from either our sketches or the internet. For this, we used the ‘Ultimate Bony’ rigged character. I chose to make an action pose, a yoga pose, and a dance pose to add variety. This exercise taught me how to position different body parts like joints in the shoulders, hips, and knees to make the poses look natural. 

It also helped me understand how each body part affects the others when creating a pose. Adjusting joint rotations and considering the angles helped me make each pose feel more dynamic and balanced.

Categories
Design for Animation, Narrative Structures and Film Language

Week 7: Narrative structure and Character Role

Narrative Structure

We started by understanding how narrative structure is crucial for storytelling. It sets up a certain chain of events in a story, guiding how the audience connects with the characters and how the story unfolds. A narrative must be able to engage the audience and ensure a satisfying conclusion. For character-driven stories, it’s vital that the actors not only have appeal but also convey the role convincingly. Directors play a central role in extracting the best performance from the actors, enabling them to communicate the story effectively. This is important because the emotional and dramatic connections to the audience depend on the character’s portrayal.

In my understanding, the focus on character appeal and performance emphasizes how the success of a narrative depends not only on the plot but also on how well the characters are brought to life. This reinforces the director’s responsibility in extracting the best from their actors.

Literary Structures

The traditional forms of storytelling—novels, poetry, plays, short stories, etc.—are often used as references in structuring narratives. These forms influence how stories in animation and other media are built, with different genres (like myths or fairy tales) helping shape the expected flow of events. For animation, these traditional literary forms often serve as the basis for constructing narratives, with their age-old conventions about structure, character roles, and themes guiding the development of the animated stories.

From what I understood, by connecting animation to these literary forms, we can see how certain conventions (such as the “hero’s journey” or the archetypal good vs. evil) are woven into animation narratives, just as they have been in literature for centuries.

The Three-Part Story Structure & The Five-Act Structure

Aristotle’s idea of a beginning, middle, and end serves as the foundation of most stories, including animations. This structure is still relevant today, particularly in understanding how to organize events to create a satisfying narrative.

The Five-Act Structure (Exposition, Rising Action, Climax, Falling Action, and Resolution) helps deepen this concept by offering a more granular breakdown of a narrative’s development. In Act 1, the audience is introduced to the characters, setting, and conflict. Act 2 intensifies the story, with the protagonist encountering obstacles. The Climax (Act 3) is where the story hits its highest point of tension. In Act 4, the action begins to wind down as conflicts are resolved. Finally, in Act 5, we reach the resolution, where the story concludes, and any remaining plot points are tied up.

I learned that these structures help build the pace and tension of the story, ensuring the audience stays engaged from the introduction to the resolution. Applying these structures to animation helps make the plot more digestible, clear, and emotionally engaging.

Equilibrium and Re-Equilibrium

The equilibrium-re-equilibrium model follows a structure where the narrative begins in a balanced state (equilibrium), is disrupted (disruption), and eventually resolves (re-equilibrium). This concept is particularly useful for understanding the dynamic nature of narratives, where the protagonist’s journey or growth leads them to a new equilibrium, often after facing significant challenges.

The idea of disruption and re-equilibrium in animation stories resonates because it shows that animation can manipulate time and space in a way that other mediums can’t. It makes the medium powerful for telling stories where reality can be bent to the narrative’s will.

Metamorphosis in Animation

Metamorphosis refers to the ability of animation to transform objects, characters, or environments in unexpected ways. This process can distinguish animation from traditional cinema by allowing for constant change and transformation within the story. The ability to show fluid, non-linear, and imaginative transformations (like changing shapes or environments) is an essential characteristic of animation, setting it apart from more static live-action films.

I understand that metamorphosis in animation allows for an expression of creativity and flexibility. Characters or worlds can change form or perspective, supporting the fluidity and dream-like qualities that animation offers, which live-action films can’t achieve as naturally.

The Language of Animation: Editing

In animation, editing is essential in connecting shots and scenes to create a coherent narrative. It is a tool for pacing, narrative progression, and maintaining audience engagement. The rules of editing, such as ensuring a smooth transition between scenes or using close-ups for emphasis, are crucial for storytelling. The editing should never distract from the story but instead should flow seamlessly, guiding the audience without them noticing the mechanics behind it.

I understand from this that editing is an art form in itself. It’s not just about technical skills but also about knowing when and how to cut a scene to maintain emotional tension, highlight details, or shift the narrative’s focus. The idea that editing should be “invisible” speaks to how well-crafted edits can make the audience focus on the story and characters, rather than the cut itself.

Disney’s Hyperrealism and Influences in Animation

Disney’s hyperrealistic animation—where even in an artificial medium, realism is emphasized—is a driving aesthetic that many studios have sought to replicate. For example, the attempt to replicate realistic movement and emotions in characters is a major influence in studios such as Pixar, DreamWorks, or Blue Sky Studios. These studios often adopt similar techniques to Disney to evoke believability, such as detailed textures and lifelike movements.

On the other hand, some studios resist Disney’s hyperrealism, focusing on more stylized forms of animation. For example, the animation style in films like “The Triplets of Belleville” or the works of Studio Ghibli takes a more abstract approach to character design and movement. They emphasize artistic expression over hyperrealistic detail, which creates a different emotional connection with the audience.

From this, I grasp that hyperrealism in animation isn’t just about achieving photo-realistic visuals; it’s about conveying believability through the medium’s artificial nature. Studios either adhere to this realism or deliberately choose to defy it for creative reasons, both of which result in distinct viewer experiences.

Research Areas

The research areas raised questions about animation’s disruptive properties, such as its ability to break the boundaries of physical reality. Animation is more fluid, and it has the power to visualize the impossible. Cartoons like Duck Amuck and surreal moments like Pink Elephants on Parade show how animation can surprise the audience by distorting reality in ways live-action can’t, offering a playful, imaginative perspective that challenges traditional cinematic boundaries.

In essence, animation’s freedom allows it to express ideas that would be impossible or highly difficult in live-action, such as visual metaphors, whimsical transformations, or exaggerated emotional expressions. This reinforces how animation can be both a form of entertainment and a medium for exploring more abstract concepts.

Categories
Design for Animation, Narrative Structures and Film Language

Week 6: The Language Of Animation – Mise-en-Scène

Mise-en-Scène is a French term that means “what is put into a scene” or “frame.” I learned that mise-en-scène refers to all the visual elements within a frame that contribute to storytelling in animation and film. These elements work together to communicate essential information to the audience without needing words. The key elements of mise-en-scène are:

Settings & Props

  • I learned that the setting of a scene plays a significant role in shaping the story’s mood and guiding the audience’s expectations. Settings can either be built from scratch or carefully selected to add depth to the narrative. For example, in An American Tail, the location of Manhattan is not just a backdrop, but it also adds to the character’s emotional journey. The setting helps set the tone for the events that unfold. Props, on the other hand, provide additional meaning and context to the characters and the plot, as seen in Toy Story and The Godfather, where the props play key roles in understanding the characters’ personalities and the storyline.
  • My takeaway: The setting and props in a scene are not just there to fill space but are integral to conveying meaning and expectations to the audience.

Costume, Hair & Make-Up

  • I learned that costume, hair, and makeup are used to instantly convey a character’s personality, social status, and occupation. For example, in 101 Dalmatians, the costumes and makeup choices for Cruella de Vil immediately tell us she is extravagant and villainous. In Barry Lyndon, the makeup and costumes highlight the social standing of the characters, supporting the film’s thematic depth.
  • My takeaway: These elements act as immediate visual cues, helping the audience quickly understand who a character is, even before they speak or take action.

Facial Expressions & Body Language

  • I realised that facial expressions are a direct way to show a character’s emotions, while body language can indicate how characters relate to each other. For example, in The Breadwinner, the protagonist’s facial expressions and body language reflect her resilience and determination in the face of adversity. The way characters position themselves or move can subtly express power dynamics or emotional states.
  • My takeaway: Animation and film often rely on non-verbal cues like facial expressions and body language to establish relationships and convey emotions effectively.

Positioning of Characters & Objects within the Frame

  • I learned that where a character or object is placed in the frame directs the audience’s attention. For instance, in Isle of Dogs, the positioning of the characters in relation to one another often signifies their emotional connection or tension. The way characters are placed within the frame can also highlight their importance or vulnerability.
  • My takeaway: Positioning within the frame is a powerful tool for visual storytelling, guiding the viewer’s focus and adding layers to the narrative.

Lighting & Colour

  • I discovered how lighting and colour can shape the mood of a scene. For example, low key lighting, which creates sharp contrasts and deep shadows, is used in films like Citizen Kane to add a sense of mystery or drama. High key lighting, as seen in The Barber of Seville, is bright and natural, making the scene feel more realistic. Colour also plays a critical role, as seen in Amelie, where warm tones create a nostalgic and whimsical atmosphere, or in The Revenant, where the cold, muted colours enhance the harshness of the environment.
  • My takeaway: Lighting and colour are not just technical aspects of filmmaking; they are essential tools for creating mood, character emotion, and thematic depth.

Depth-of-Field

  • I learned that depth-of-field refers to the distance between the nearest and farthest objects in focus. This technique can be used to emphasise certain elements in a scene. For instance, deep focus allows both close and distant objects to remain sharp, making it possible to highlight the character’s isolation or the vastness of their environment. Shallow focus, on the other hand, keeps only a specific area or object in focus, often highlighting a character’s inner thoughts or feelings.
  • My takeaway: The use of focus adds depth to the scene and directs the audience’s attention to what is important in the narrative at that moment.

Types of Shots

  • I learned that different types of shots are used to convey varying perspectives and emotions. For example, extreme close-ups, like in The Incredibles, focus intensely on a small detail, which can amplify tension or importance. A medium shot or long shot, like in Wall-E, helps establish the relationship between the character and their environment.
  • My takeaway: The choice of shot type has a profound impact on how the audience perceives the story, emphasising details or broadening the narrative’s scope.

Special Shot Types

  • I explored shot types that focus on specific relationships, such as a one-shot, which shows a single character (as in Anomalisa), or a two-shot, which features two characters (as in My Life as a Courgette). Group shots, like in Meek’s Cutoff, show multiple characters interacting and can emphasise unity or conflict.
  • My takeaway: Special shot types like the one-shot or two-shot are used to highlight the relationship between characters, influencing how the audience perceives their interactions.

Angle Shots

  • I learned that the angle of a shot can change the power dynamics within a scene. A high-angle shot, like in The Lion King, can make the character seem small or vulnerable, while a low-angle shot, like in There Will Be Blood, can make the character seem powerful or intimidating.
  • My takeaway: Camera angles can visually communicate a character’s emotional state or role within the story, affecting how the audience perceives them.

Point of View (POV) Shots

  • I discovered that point-of-view shots let the audience see the world through a character’s eyes, creating a deeper emotional connection. This technique is effective for immersing the audience in the character’s perspective, as seen in various films and animations.
  • My takeaway: POV shots strengthen the connection between the character and the audience, making the story more personal and immersive.

Moving Shots

  • I learned about the different types of moving shots, such as pan shots (which pivot along the horizon), tilt shots (which move up or down), and dolly shots (which move the camera forward or backward). These shots are often used to follow the action or explore the environment. In The Breadwinner, moving shots help create a sense of urgency and emotional intensity.
  • My takeaway: Moving shots are dynamic tools in animation and film, adding energy and emotional depth to the narrative.

Reading the Mise-en-Scène: The Breadwinner & Isle of Dogs

  • I analysed the mise-en-scène in The Breadwinner, where the combination of settings, lighting, and character positioning reinforces the protagonist’s sense of isolation and her emotional journey. Similarly, Isle of Dogs uses colour, lighting, and character placement within the frame to communicate the characters’ relationships and the thematic elements of loyalty and survival.
  • My takeaway: The way mise-en-scène is crafted in animated films is crucial for conveying the emotional and thematic depth of the story. It’s not just about what is seen, but how it’s presented to shape the viewer’s experience.

Screen Direction

We moved on to screen direction which refers to the movement of characters or objects on the screen from the audience’s perspective, and how it’s essential for maintaining visual continuity. If the movement isn’t consistent, it can confuse the audience. Screen direction is governed by camera positioning and movement, and this continuity is crucial for smooth editing and storytelling.

What I understood is that consistent screen direction is necessary to maintain a fluid and believable flow between shots. Terms like “camera left” and “camera right” help filmmakers define the movement within a frame. This needs to be established early in production, especially in the storyboard and animatic stages, so that the timing and flow of the scenes remain intact.

My takeaway is that screen direction helps us guide the audience’s attention and ensures that the characters’ actions and relationships are clearly understood. Without it, even simple interactions could feel disjointed or confusing. I also learned that pre-determined screen direction is especially crucial in animation, where movements must be precise and planned in advance.

Screen Continuity and the 180-Degree Rule

I learned that once screen direction is established, it must be maintained throughout the scene to avoid visual disorientation. This consistency ensures that the actors are positioned and moving in ways that make sense in relation to each other. The Imaginary Line or 180-degree rule helps keep track of the screen direction.

What I understood from the 180-degree rule is that if we shoot from one side of the axis, the movement and eye lines of the characters will remain consistent. This keeps the audience from getting lost or confused about the characters’ relationships or the direction they’re moving in.

My takeaway is that crossing the axis can disrupt continuity, but certain techniques, like using a neutral shot, can help reset the direction and allow smooth transitions. This flexibility in screen direction is essential when managing the complexity of film and animation production.
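Since the 180-degree rule is ultimately geometric, I found it helpful to think of it as a floor-plan test. Here is a small Python sketch of my own (the coordinates are hypothetical, purely for illustration): the imaginary line runs through the two characters, and two camera setups preserve screen direction only if they sit on the same side of it.

```python
def same_side_of_line(char_a, char_b, cam_1, cam_2):
    """True if two camera positions sit on the same side of the imaginary
    line through two characters (all points are (x, y) floor-plan
    coordinates).  Staying on one side keeps screen direction and eye
    lines consistent; a sign flip means the cut crosses the axis."""
    ax, ay = char_a
    bx, by = char_b

    def side(p):
        # Sign of the 2D cross product of A->B and A->p.
        return (bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax)

    return side(cam_1) * side(cam_2) > 0

# Characters face each other along the x axis; both cameras start below
# the line, then the second setup crosses it.
ok = same_side_of_line((0, 0), (4, 0), (1, -2), (3, -1))
crossed = same_side_of_line((0, 0), (4, 0), (1, -2), (3, 1))
```

A camera exactly on the line gives a zero cross product, which matches the idea of a neutral shot that can reset the established direction.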

Animation Layout and Screen Direction

I learned that animation layout is the process of designing the environments for animated films. This stage is crucial for adapting the story to the film’s style, and it’s closely linked to screen direction, as the layout needs to be planned to maintain consistent movement and positioning of characters and objects.

What I understood is that layout artists need to ensure that the rules of screen direction are considered, especially in camera movements like pans and tracks. This organisation helps avoid confusion and ensures that the audience can follow the animation seamlessly.

My takeaway is that screen direction is just as important in animation as in live-action filmmaking. The movement of the camera and characters must be carefully thought out to keep the audience engaged and the story clear. 

Animation Staging

I also learned that staging in animation shares many purposes with film and theatre in directing the audience’s attention but has unique implications in its execution. This principle focuses on making an idea completely clear, whether it’s an action, expression, mood, or personality.

What I understood is that character placement and composition play a critical role in achieving this clarity. Elements such as camera angles, light and shadow, the dynamics of character movements, and how a character enters a scene all work together to focus the audience’s attention. For example, a sudden entry can create surprise, while an expectant one builds anticipation.

My takeaway is that designing the use of long, medium, and close-up shots helps emphasise and pace the narrative. Each shot type serves specific purposes: long shots establish context, while close-ups highlight emotion or detail. The timing and pacing of these shots have significant production implications, requiring careful planning to maintain the flow and meaning of the scene. If the background clashes with the character or is overly complex, it can distract from the main focus. It’s essential to keep the design clean and scale the key subject properly to avoid unnecessary distractions. Every object or detail in a frame has the potential to be a symbol, so unnecessary elements should be edited out to maintain clarity and impact.

My overall conclusion is that these cinematic principles underscore how storytelling in animation and film is an intricate balance of visual and thematic elements. From the arrangement of props and characters to shot choices and screen direction, each decision plays a role in directing the audience’s focus, building tension, and conveying emotion. Mastering these aspects is essential for creating engaging, coherent, and impactful stories.

Categories
Design for Animation, Narrative Structures and Film Language

Week 5: Social and Political Comment in Animation

Politics and Persuasion in Entertainment

In week 5, we explored the complex relationship between politics and persuasion within entertainment and how these elements manifest in film, animation, and other media. One significant aspect of this topic is understanding the mechanisms through which media platforms can shape, influence, and persuade audiences on both conscious and subconscious levels.

Broadly, audiences can be influenced through various outlets, such as social media, broadcast news, film and animation, and television. Each of these mediums carries a unique potential to embed persuasive messages. For instance, broadcasts and print media maintain an authoritative presence that can sway public opinion, while independent film and animation often provide a platform for personal stories and critical commentary on societal issues.

We analysed how media platforms—from mainstream and independent film to games, podcasts, and social media—hold the power to direct, challenge, and reinforce specific narratives. The potential of these platforms to deliver impactful messages stems from their ability to reach diverse audiences and evoke emotional responses.

A key part of this exploration was understanding how messages within moving images can be presented. These can range from subliminal or masked content, which subtly embeds messages, to overtly propagandist intentions that are clear and direct. Persuasive content might serve commercial purposes, aiming to promote products or ideologies, while documentary or investigative approaches seek to inform or provoke thought.

Under the broad umbrella of politics in media, key areas of focus include political and commercial persuasion, and how subjects like race, gender, equality, disability, ethics, and ecology are depicted. The way these themes are approached can vary widely across documentary films, mainstream cinema, television, games, and advertising. Each of these formats can either challenge existing social norms or reinforce them, depending on the underlying political context. 

In addition to my reflections on political persuasion in entertainment, I also explored the concept of animated documentaries and the unique role animation plays in non-fiction contexts. Animated documentaries, which are recorded frame by frame, represent the real world rather than an imagined one. They are presented as documentaries by their producers or accepted as such by audiences, festivals, or critics. Animation in these contexts is often used to clarify, explain, illustrate, and emphasise certain points.

A key question in this field is what the use of animation means as a representational strategy in documentary. Animation can be a powerful tool for presenting subjective experiences, offering insights into mental states and providing alternative ways of seeing the world. While some critics argue that animation destabilises the documentary’s claim to represent reality, Annabelle Honess Roe suggests the opposite—that animation broadens our ability to depict reality in non-conventional ways, allowing us to explore the world from unique perspectives.

The issue of authenticity in documentary also arises with animated documentaries. Bill Nichols argues that documentary images are often linked to the reality they represent, but animation’s departure from traditional documentary realism raises questions about how authenticity is conveyed. Honess Roe notes that animated documentaries do not easily fit into the traditional documentary mold. This challenges the widely held belief that documentaries should be objective and factual, with their authenticity dependent on their realism.

Furthermore, some critics, like Paul Wells, argue that animation’s inherent subjectivity makes it difficult to achieve objectivity, which is a cornerstone of traditional documentary. However, animation’s ability to present subjective experiences can enhance the understanding of complex, personal narratives. For example, animated works like Waltz with Bashir and the Animated Minds series use animation to convey first-person accounts of trauma and mental health, adding layers to the storytelling that live-action documentary may not easily achieve.

Animated documentaries challenge the notion of what a documentary “should” be, which leads to debates about their place within the genre. As animation becomes more commonplace in documentaries, some worry it may become a “layer” that distances the audience from the real experiences being portrayed, while others express concern about its potential for lazy storytelling, where animation is simply used to illustrate an existing narrative.

Overall, I’ve come to see animated documentaries as a unique and evolving form of storytelling, one that pushes the boundaries of what can be represented in non-fiction and challenges traditional notions of authenticity and objectivity. The evolving role of animation in documentary reflects broader changes in how we define reality and truth in visual media.

Categories
3D Computer Animation Fundamentals Immersion

Week 5: UE Physics

Physics in Animation

  • Explored how physics adds realism to animations, especially in scenes where objects are falling or impacted.
  • Learned that applying physics makes objects react naturally to forces, adding immersion.

Exploring Unreal Engine Modes

  • We usually work in Selection Mode, but this week we explored additional modes:
    • Landscape Mode
    • Foliage Mode
    • Fracture Mode – our main focus, used to apply fractures and destruction effects to meshes.

Modeling and Fracturing Meshes

  1. Started with Modeling Mode to create and place basic meshes in the scene.
  2. Switched to Fracture Mode to break down meshes in various ways:
    • Uniform Fracture – breaks down a mesh evenly.
    • Cluster Fracture – creates smaller, clustered fragments, ideal for explosive effects.
    • Radial Fracture – creates a point-of-impact effect, like a gunshot.
    • Plane Fracture – simulates slicing, like a sword cutting through.
    • Custom Fracture – allows for user-defined break patterns.

Fracturing a Cube Step-by-Step

  1. Selected a cube mesh to fracture.
  2. Generated a fracture asset:
    • Went to Generate > Save in a folder > Generate Fracture.
  3. Enabled physics for realistic impact:
    • In Details Panel, enabled Simulate Physics so the cube responds to gravity and collisions.
  4. Applied Uniform Fracture mode:
    • Adjusted the fracture levels to control the number of broken parts.
    • Observed how one cube fractured into 20 separate pieces.
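The relationship between fracture levels and piece count can be sketched with a simple multiplicative model (this is an illustration, not the actual Chaos algorithm): each additional fracture level splits every existing piece again, so the totals grow quickly.

```python
def piece_count(fragments_per_level: int, levels: int) -> int:
    """Illustrative model: each fracture level splits every existing
    piece into `fragments_per_level` fragments, so totals multiply."""
    pieces = 1
    for _ in range(levels):
        pieces *= fragments_per_level
    return pieces

print(piece_count(20, 1))  # one uniform pass with 20 sites -> 20 pieces
print(piece_count(20, 2))  # fracturing every piece again -> 400 pieces
```

This is why fracturing already-fractured parts adds detail so fast: the count compounds per level.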

Advanced Fracture and Impact Control

  • Further fractured pre-existing fractured parts to create additional detail.
  • Adjusted impact strength:
    • Lowered the Damage Threshold in the Details Panel to increase impact sensitivity.
    • Turned off bone colours by unchecking Show Bone Colours in the Details Panel.
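The Damage Threshold behaves like a gate on impact strength: a piece only separates when a hit exceeds its threshold, so lowering the threshold means more impacts count. A minimal sketch of that idea (not Chaos’s internal damage model):

```python
def breaks(impact_strength: float, damage_threshold: float) -> bool:
    """A piece separates only when the impact meets its damage threshold."""
    return impact_strength >= damage_threshold

impacts = [50.0, 120.0, 300.0]  # hypothetical impact strengths
# With a high threshold only the hardest hit breaks anything off...
print(sum(breaks(i, 250.0) for i in impacts))  # 1
# ...lowering the threshold makes the mesh more impact-sensitive.
print(sum(breaks(i, 100.0) for i in impacts))  # 2
```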

Nanite for Performance Optimisation

  • Switched to Nanite for performance when using heavy physics scenes:
    • In Chaos Physics settings, located the Geometry Collection in the Content Browser and enabled Nanite on it, reducing the rendering cost of scenes with many fracture pieces.

Adding Material Properties to Enhance Physics

  • Applied bounce and material properties to objects:
    • In Details Panel > Collision > Physical Material Override, adjusted friction, density, and strength.

Scene Recording Setup

  • Enabled Engine Settings to access built-in blueprints:
    • Activated by ticking Show Engine Content in Content Browser Settings.
    • Located Anchor and FS Master Field in the content browser, then copied them to a custom physics folder.
  • Set up the scene for recording:
    • Dragged Anchor and Master Field into the scene.
    • Added an Initialization Field and created a Chaos Solver to simulate physics interactions.
    • Added objects to the Chaos Cache Manager for precise recording.

Using the Sequencer to Capture Scenes

  1. Added objects to the Sequencer and set keyframes for start and end points.
  2. Worked with Constraint Actors to apply physics constraints:
    • Added two cubes to test the constraint as an anchor point.
    • Observed how one fractured cube reacted to constraints, swinging around the anchor point.
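The swinging behaviour around the anchor point is essentially pendulum motion. As a rough stand-in for what the constraint does (not the actual Chaos solver), a semi-implicit Euler pendulum shows a constrained piece oscillating about its anchor:

```python
import math

def simulate_pendulum(theta0: float, steps: int, dt: float = 0.01,
                      g_over_l: float = 9.8) -> list[float]:
    """Semi-implicit Euler integration of an undamped pendulum:
    a crude model of a piece constrained to a fixed anchor point."""
    theta, omega = theta0, 0.0
    history = []
    for _ in range(steps):
        omega -= g_over_l * math.sin(theta) * dt  # gravity torque
        theta += omega * dt                        # symplectic position update
        history.append(theta)
    return history

angles = simulate_pendulum(0.3, 400)  # ~4 seconds of swing
print(min(angles) < 0 < max(angles))  # True: it swings to both sides
```

The semi-implicit (symplectic) update keeps the swing amplitude stable instead of gaining energy, which is why it is a common choice in game physics.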

Alternative Recording with Take Recorder

  • Enabled Take Recorder in Plugins and added actors for recording:
    • Started simulation with Alt + S to capture real-time physics effects.
    • Viewed recorded sequences in the Content Browser after recording.

Creating a Bouncing Ball

  1. Created a sphere and enabled Simulate Physics in Details Panel.
  2. Applied a custom physical material for bounce:
    • Created a blueprint class called BP_Ball and added a static mesh.
    • Created a physical material with customised restitution (bounciness), friction, and density settings.
  3. Applied the physical material to the sphere to achieve the desired bounce.
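The bounce a physical material produces can be reasoned about with a restitution coefficient: each impact returns that fraction of the incoming speed, so rebound height scales by restitution squared (since h = v²/2g). A small sketch of that relationship, with illustrative numbers:

```python
def bounce_heights(drop_height: float, restitution: float,
                   bounces: int) -> list[float]:
    """Each bounce returns `restitution` of the impact speed, so the
    rebound height scales by restitution**2 per bounce."""
    heights = []
    h = drop_height
    for _ in range(bounces):
        h *= restitution ** 2
        heights.append(round(h, 3))
    return heights

# A ball dropped from 2 m with restitution 0.7 loses ~half its height per bounce.
print(bounce_heights(2.0, 0.7, 3))  # [0.98, 0.48, 0.235]
```

This is why a restitution near 1.0 gives a lively, rubbery ball while values near 0 make it land dead.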

This week’s exploration provided a solid understanding of fractures, physics settings, and recording techniques in Unreal Engine. These tools allow us to bring more realism and dynamic effects to our animations.

Categories
3D Computer Animation Fundamentals Immersion

Week 4: Sequencer and Materials

In Week 4, I continued to use the Sequencer in Unreal Engine and started working on creating Materials.

I learned how to set up Shots in the Sequencer, which is key for staying organized when using several cameras. To create a Shot, you first add a Subsequence track by clicking the Add button in the Sequencer menu. This Subsequence Track helps keep track of and manage the different cameras used for various tasks.

I learned how to create Shots in Unreal Engine’s Sequencer:

  • To create Shots, generate a Level Sequence from the Cinematics menu.
  • Ensure proper naming for better organisation.
  • Add these sequences to the Subsequence Track, where you can adjust their lengths.
  • To assign cameras to Shots:
    • Select the desired camera.
    • Press Ctrl + X to cut the camera.
    • Double-click to open the chosen Shot, then press Ctrl + V to paste the camera into it.

This week, I also learned how to create Materials in Unreal Engine. I started by downloading a Material from Quixel Bridge and importing it into Unreal Engine. Then, I created a new Material and used the Material’s node editor to add the following maps from the imported Material:

  • R Channel: Ambient Occlusion
  • G Channel: Roughness
  • B Channel: Displacement

I connected the maps as follows:

  • The RGB channel of the Colour Map to the Base Colour
  • The G channel of the Roughness Map to the Roughness input
  • The RGB channel of the Normal Map to the Normal input of the new Material
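Packing AO, roughness, and displacement into one texture’s R, G, and B channels saves texture samples; in the Material graph you just read the channel you need. A small sketch of that unpacking, using made-up pixel values:

```python
def split_channels(pixels):
    """Split an RGB-packed texture into its individual maps.
    Channel assignment follows the packed map described above:
    R = ambient occlusion, G = roughness, B = displacement."""
    ao           = [r for r, g, b in pixels]
    roughness    = [g for r, g, b in pixels]
    displacement = [b for r, g, b in pixels]
    return ao, roughness, displacement

packed = [(255, 128, 0), (200, 64, 32)]  # two sample pixels
ao, roughness, displacement = split_channels(packed)
print(roughness)  # [128, 64]
```

This mirrors dragging only the G pin of the packed texture sampler into the Roughness input.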

Afterward, I started experimenting with the tiling of the Material. To do this, I incorporated several nodes, including Texture Coordinate, Multiply, and Add. By adding these nodes, I was able to adjust the tiling according to different requirements. I played around with different values to see how each adjustment impacted the appearance of the Material. This exploration not only helped me understand how tiling works but also allowed me to visualise a range of outcomes, giving me greater insight into the creative possibilities within materials in UE.
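The Texture Coordinate → Multiply → Add node chain boils down to a simple transform of the UVs: uv′ = uv × tiling + offset, with the sampler wrapping coordinates outside [0, 1). A sketch of that arithmetic:

```python
def tile_uv(uv, tiling, offset=(0.0, 0.0)):
    """Replicates the TexCoord -> Multiply -> Add node chain:
    scale the UVs so the texture repeats, then shift them."""
    u, v = uv
    su, sv = tiling
    ou, ov = offset
    return (u * su + ou, v * sv + ov)

# A tiling of (4, 4) repeats the texture four times along each axis.
print(tile_uv((0.5, 0.5), (4, 4)))                 # (2.0, 2.0)
print(tile_uv((0.25, 0.25), (2, 2), (0.1, 0.0)))   # (0.6, 0.5)
```

Raising the Multiply value shrinks each repetition; the Add value slides the pattern across the surface.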

Once I created a Material, I was guided to generate a Material Instance. This is done by right-clicking on the Material I created and selecting the Material Instance option from the menu. The primary distinction between a Master Material and a Material Instance is that the Material Instance inherits all the properties from the Master Material. 

This inheritance enables real-time updates and adjustments, allowing for more flexibility and efficiency in the material design process. By using Material Instances, I can tweak parameters without altering the original Master Material, making it easier to experiment with different looks while maintaining a consistent base. In conclusion, this week was very informative as I enhanced my skills in material creation and manipulation in UE.
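The master/instance relationship behaves like parameter inheritance with per-instance overrides. A minimal sketch of that idea using Python’s ChainMap (the parameter names here are hypothetical, not UE’s):

```python
from collections import ChainMap

# Hypothetical parameter names: an instance overrides only what you touch
# and inherits everything else from the Master Material.
master_params = {"BaseTint": "grey", "Roughness": 0.5, "Tiling": 1.0}
instance_overrides = {"Tiling": 4.0}

instance = ChainMap(instance_overrides, master_params)
print(instance["Tiling"])     # 4.0  (overridden on the instance)
print(instance["Roughness"])  # 0.5  (inherited from the master)

# Editing the master propagates to every instance that hasn't overridden it.
master_params["Roughness"] = 0.8
print(instance["Roughness"])  # 0.8
```

This lookup order is also why instance edits are cheap and live: only parameter values change, never the compiled shader graph of the master.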

Categories
3D Computer Animation Fundamentals Animation

Week 4: Ball With Tail Spline

Refining the Ball with Tail Blocking

This week, we continued with our work on the Ball with Tail animation, picking up from last week’s progress by focusing on converting our Block Out animations into Spline animations. We kicked off with a critique session, where we received initial feedback on our Block Out animations, helping me pinpoint areas for improvement.

Following the critique, we resumed our work and learned how to transform Block Out animations into Spline. After the conversion, we utilised the Graph Editor to refine the animations and correct the ball’s trajectory using the Motion Trail feature. This allowed us to make subtle adjustments to the ball’s path, enhancing the realism of the jump and making it look smoother than the blocking. I also learned to fine-tune the Translate Y graph in the Graph Editor, incorporating ease-in and ease-out effects to give the jump a more natural feel.
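The ease-in/ease-out shape can be approximated with a cubic smoothstep curve: slow at the start, fastest in the middle, slow at the end. This is a sketch of the curve’s character, not Maya’s exact spline-tangent math:

```python
def ease_in_out(t: float) -> float:
    """Cubic smoothstep: slow start, fast middle, slow end.
    An approximation of an ease-in/ease-out curve, not Maya's
    actual tangent interpolation."""
    return t * t * (3.0 - 2.0 * t)

# Sample the curve at five evenly spaced points across the move.
samples = [round(ease_in_out(t / 4), 3) for t in range(5)]
print(samples)  # [0.0, 0.156, 0.5, 0.844, 1.0]
```

Compared with linear interpolation (0, 0.25, 0.5, 0.75, 1), the motion covers less ground near the ends and more in the middle, which is exactly the feel the Graph Editor adjustment gives the jump.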

In terms of the tail animation, I made slight modifications in the Graph Editor to smooth out its movement. I also cleaned up the rotations so the tail follows the right curvature while in motion. We learned about managing keyframes, focusing on how to delete any extra keyframes at the end of the animation process. This keeps the Graph Editor neat and easy to use. It’s important to remove these extra keyframes only after completing the animation so that nothing changes while we’re still working.
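A keyframe is redundant when deleting it leaves the interpolated curve unchanged. A simplified sketch of that cleanup, assuming linear interpolation between keys (Maya’s actual curves use spline tangents, so its test is more involved):

```python
def remove_redundant_keys(keys):
    """Drop interior (frame, value) keys that sit exactly on the line
    between their neighbours; first and last keys are always kept.
    A linear-interpolation simplification of keyframe cleanup."""
    if len(keys) <= 2:
        return list(keys)
    kept = [keys[0]]
    for (t0, v0), (t1, v1), (t2, v2) in zip(keys, keys[1:], keys[2:]):
        expected = v0 + (v2 - v0) * (t1 - t0) / (t2 - t0)
        if abs(v1 - expected) > 1e-9:   # key actually shapes the curve
            kept.append((t1, v1))
    kept.append(keys[-1])
    return kept

# The key at frame 20 adds nothing: it lies on the flat line from 10 to 30.
keys = [(0, 0.0), (10, 5.0), (20, 5.0), (30, 5.0), (40, 0.0)]
print(remove_redundant_keys(keys))  # [(0, 0.0), (10, 5.0), (30, 5.0), (40, 0.0)]
```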

This week, I adhered to the planning I developed last week and successfully converted my Block Out animation into Spline. By leveraging the Graph Editor, I was able to add ease-in and ease-out effects, while the Motion Trail helped me enhance the ball’s bounce distance and trajectory. I made several additional adjustments to ensure that the animation of the ball with the tail appeared more fluid and realistic. This is my version of Ball with Tail: