After weeks of hard work, I am proud to present my animated short “Override Protocol.” The film follows a woman working alone in a futuristic robotics lab late at night. What begins as a normal routine slowly shifts when her mech unexpectedly enters an override state. The moment the lab’s atmosphere changes, the sense of safety collapses, and she is faced with something she can no longer control.
At its core, the story reflects on the growing uncertainty around AI, authorship, and autonomy. I wanted to explore the idea of machines learning from us so deeply that they eventually act beyond our intentions. The override moment symbolises that tipping point — when programmed systems stop behaving as tools and begin acting on their own, leaving humans powerless in spaces they created. “Override Protocol” is the final outcome of my project, bringing together my research on AI in creative and industrial environments and translating those ideas into a contained, visual narrative. It represents my exploration of control, tension, and the consequences of technological dependence.
Here is my making-of video, which documents the entire journey — from early concept sketches and story development to modelling, animation tests, environment building, and final assembly. It captures how the project evolved step by step and shows the full process behind creating Override Protocol.
This project came with far more technical issues than I expected, and most of them were related to the rigs and the export workflow. The woman’s rig was the biggest challenge. Since it was a downloaded rig, I assumed it would be production-ready, but the deformation quality turned out to be extremely limited. The controllers were very small and often disappeared inside the mesh, so I constantly had to scale them up manually just to animate basic poses. The joint weighting was also unreliable; bending the elbows or knees even slightly caused the geometry to collapse. The facial rig had similar problems — trying to create expressions such as a smirk or a stronger eyebrow raise pushed parts of the face inside the mesh, which made expressive animation almost impossible.
Exporting this rig to Unreal Engine was another major problem. For three days straight, every FBX export attempt broke something. Either the animation wouldn’t import, the mesh came in without a skeleton, or multiple skeletons merged into one mesh. Sometimes the facial skeleton completely disappeared. None of the errors were consistent, which made troubleshooting slow and unpredictable.
The biggest setback came when Maya crashed during export and corrupted the files that contained the polished animation, including some backups. I had to reanimate most of the sequence — the body, the facial timing, and key emotional beats — with only seven days left. To keep up with the schedule, I animated on 5s and occasionally 10s, which affected the fluidity in some moments. Losing the files was a major learning moment; after that, I immediately bought an external hard drive and changed my entire backup workflow.
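The core of that new workflow is simple to automate. As a purely illustrative sketch (not the exact tool I use, and with a hypothetical helper name), a small Python script can save timestamped copies of a scene file to a separate location and prune old versions:

```python
import shutil
import time
from pathlib import Path

def backup_scene(scene_path, backup_root, keep=10):
    """Copy a scene file to a timestamped backup and prune the oldest copies.

    Hypothetical helper for illustration; `keep` limits how many versions stay.
    """
    scene = Path(scene_path)
    dest_dir = Path(backup_root) / scene.stem
    dest_dir.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d_%H%M%S")
    dest = dest_dir / f"{scene.stem}_{stamp}{scene.suffix}"
    shutil.copy2(scene, dest)  # copy2 preserves file timestamps
    for old in sorted(dest_dir.iterdir())[:-keep]:
        old.unlink()  # drop copies beyond the keep limit
    return dest
```

Pointing `backup_root` at an external drive keeps the copies physically separate from the working files, which is what would have saved me here.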
The mech rig had fewer issues since I built it myself, but it still came with challenges. Because the mech was made of many rigid parts, editing the hierarchy sometimes caused pieces to ungroup or lose their skin weights. Some parts froze while the rig moved, so I had to repeatedly rebind or clean up the skeleton. The waist joint was also stiff in certain motions because its rotation range wasn’t planned early enough. On the texturing side, combining meshes destroyed the UVs, so I had to re-UV and retexture the mech several times after rigging.
Cloth simulation for the woman’s suit also required multiple iterations. Even after disabling ground collision, the bodysuit occasionally floated or folded around her feet. I had to keep adjusting simulation settings to make the suit behave reliably, especially during rapid body movements.
Overall, these problems made the process extremely demanding, but they also shaped some of the most important lessons of the project. I learned how critical it is to evaluate rigs before committing to animation, how essential a proper file-backup system is, and why a clean, organised mesh hierarchy prevents major rig failures later. I also gained a much deeper understanding of export pipelines, UV discipline, simulation troubleshooting, and the importance of planning joints and pivots before rigging. Even though the issues slowed me down, they forced me to rebuild my workflow in a more professional way. By the end, the technical obstacles became just as valuable as the final animation.
For the rendering stage, I exported the entire animation from Unreal Engine as a PNG image sequence at 24 fps using Sequencer. The full 4K export took around six hours to complete.
Inside Unreal, I set up a dedicated post-process volume to control the final look. I mainly focused on exposure and lens-related effects. I adjusted the exposure settings so the blue lab lighting and the red override lighting stayed consistent during the render. I also enabled lens flares to enhance the sci-fi feel, especially around the holograms and the mech’s emissive parts. Motion blur and anti-aliasing were enabled to smooth out the mech’s movement and reduce jagged edges, and I added game overrides to stabilise the flickering lights so they wouldn’t behave unpredictably during the render.
Once the PNG sequence was exported, I imported it into Premiere Pro as an image sequence and handled all my colour refinement there instead of using LUTs in Unreal. I applied basic colour corrections, mainly adjusting contrast and saturation, to unify the shots and balance the red override lighting across the sequence. The goal was to keep the colours consistent without overpowering the scene, so I kept the adjustments subtle and focused on matching the overall tone.
I handled all the sound design in Premiere Pro. I first gathered the audio I needed from Pixabay: a few different versions of ambient sci-fi tracks, mechanical movement sounds, alarm loops, footsteps, and a gasp for the woman. I generally browsed categories that fit the vibe of the video and searched with keywords. After organising everything and deciding which clips to use, I started layering the audio in a clear order so the mix stayed readable.
I began with a very low-volume ambient sci-fi track to establish the lab atmosphere. This stayed in the background throughout the sequence and acted as the base layer. Once that was in place, I added the character and mech sounds: metallic footsteps synced to the mech’s movement, rotation/servo sounds for turns, and the woman’s gasp during the override moment. Finding mechanical sounds that matched the weight of the mech without sounding cartoony took time, so I tested several until they fit the movement properly.
The alarms became the main functional layer. They needed to feel urgent but not dominate the mix, so I balanced them above the ambient track but kept space for the mech sounds to cut through. For the shooting moment, I added a laser-type gunshot effect that matched the visual style of the mech.
The final step was adjusting the gain, fades, and timing of each layer to make everything sit cleanly. The alarms and key effects form the top of the mix, while the ambient track stays low enough to give the lab environment depth without ever overpowering the action.
Once the animation was finished, I moved on to the VFX and simulations inside Unreal Engine. This phase involved three main technical elements: the lighting behaviour during the override, the blood decal system, and the cloth simulation for the woman’s bodysuit.
Lighting
The lighting change was one of the key requirements for the override moment. I needed the scene to switch from calm blue lighting to an intense red warning state. To achieve this, I used a light function setup that allowed the red lights to flicker and pulse instead of remaining static. I spent time matching the intensity, flicker speed, and transition timing so the lighting shift aligned precisely with the override cue in the animation. The goal was to make the environment react immediately and consistently the moment the mech activates.
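The flicker itself is just a time-based intensity curve. The real behaviour lives in an Unreal light function material, but the underlying logic can be sketched in Python (all numbers here are placeholders, not the values from the project):

```python
import math

def flicker_intensity(t, base=8.0, pulse_depth=0.5, pulse_hz=2.0, jitter=0.1):
    """Warning-light brightness at time t: a slow sine pulse plus a faster
    irregular shimmer. Illustrative only; constants are made up."""
    pulse = 0.5 * (1.0 + math.sin(2.0 * math.pi * pulse_hz * t))     # 0..1 slow pulse
    shimmer = 0.5 * (1.0 + math.sin(37.0 * t) * math.sin(53.0 * t))  # 0..1 pseudo-random
    level = base * (1.0 - pulse_depth + pulse_depth * pulse)         # pulse 50%..100%
    level += base * jitter * (shimmer - 0.5)                         # small flicker on top
    return max(level, 0.0)
```

Keeping the flicker deterministic like this (driven purely by time) is what makes the lights repeat identically on every render pass.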
Blood Decal
The blood decals required a more manual and detailed workflow. I created the blood texture sheet in Photoshop and, after importing it into Unreal, built a decal so the blood could sit directly on the lab floor without affecting other materials. I set up a decal material that let me control the expansion of the blood pool over time. Inside the material, I added parameters for flow speed so the decal could gradually grow outward rather than appear all at once. Most of the work involved adjusting how the decal blended with the environment, balancing colour, and refining the transparency so it looked correct under different lighting conditions.
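Conceptually, the growth is a time parameter mapped to a radius that rises quickly and then levels off as the pool spreads. A sketch of that curve (numbers are invented, not the material's actual values):

```python
import math

def pool_radius(t, flow_speed=0.4, max_radius=0.6):
    """Blood-pool radius (in metres) t seconds after impact.
    Exponential approach: fast expansion at first, then levelling off."""
    return max_radius * (1.0 - math.exp(-flow_speed * t))
```

In the material the same idea is just a scalar parameter driving the decal's mask size, so the pool never snaps to full size in one frame.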
Cloth Simulation
For the woman’s bodysuit, I relied fully on cloth simulation instead of trying to rig it manually. The suit’s structure and shape would have required complex weight painting, so simulation was more efficient and gave more natural results. I began the animation in an A-pose, allowed the cloth to settle, and then transitioned it into the character’s actual animated poses. This ensured the simulation started clean without any sudden pops. I spent time configuring collision settings so the suit would stay close to her body without clipping. Disabling ground collision was important because it kept the feet of the bodysuit from lifting or drifting away from her actual feet. After that, I adjusted properties like stretch resistance and damping so the fabric maintained its tight form while still reacting organically to her movements.
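The effect of those two properties is essentially that of a damped spring: stretch resistance pulls the fabric back toward its rest shape, and damping kills the oscillation so the suit settles instead of jiggling. A minimal sketch of that behaviour (not the engine's actual solver):

```python
def settle(x0, stiffness=40.0, damping=6.0, dt=1.0 / 60.0, steps=240):
    """Simulate a 1D damped spring with semi-implicit Euler.
    Returns the displacement history; constants are illustrative."""
    x, v = x0, 0.0
    history = []
    for _ in range(steps):
        a = -stiffness * x - damping * v  # pull toward rest, minus damping
        v += a * dt
        x += v * dt
        history.append(x)
    return history
```

With high damping the displacement dies out within a couple of seconds; with low damping the same disturbance keeps oscillating, which is exactly the floaty fabric behaviour I was tuning away.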
After finishing the blockouts, I moved on to polishing the animation. For the woman, most of the work involved cleaning up her arm and hand movements since they interact with the holograms. I refined her arcs, added slight follow-through, and adjusted timing so her gestures felt more natural. The rig limitations from earlier remained a problem — her elbows, knees, and especially her face couldn’t handle exaggerated poses without deforming, so I had to keep everything subtle and rely more on timing and head movement rather than strong expressions.
During polish, Maya crashed and I lost the main file I had been working on. I only had an older backup, so some of the polish I originally added didn’t make it into the final version. I redid what I could, mainly focusing again on the override moment: her pause, eye movement, and the quick turn when she realises the mech is powering on.
For the mech, polish was about clarifying weight. I refined the timing of each step, added slight holds before impacts, and cleaned up torso and arm offsets so the movement felt heavy but intentional. I also tweaked the head turn during the override, giving it a slow, deliberate rotation that helped it feel more aware and threatening.
Finally, I adjusted how both characters interacted with the environment — aligning the woman’s hands with the holograms and making sure the mech’s steps landed correctly in the space. Even with the rig issues and the crash, the polish stage helped tighten the timing and make the whole sequence read more clearly.
Once the characters and environment were ready, I moved on to animation blockouts. I began by recording my own reference videos from both the front and the camera's angle so I could study realistic movement and timing. This was especially helpful for the woman, since her hands and gestures would be interacting with holograms and controls throughout the scene.
I blocked out the key poses on 4s to establish the rhythm of the action. I wanted her movement to show that she had been working for a while before the mech goes into override, so the tension would build naturally instead of appearing suddenly. By the end of the blockout, all the key poses were placed, and the overall pacing of the sequence was set.
While animating, I encountered a major issue: the woman’s rig started deforming a lot when pushed into stronger poses. This slowed down the animation process significantly because even slightly exaggerated movements caused her mesh to bend or collapse unnaturally. Her elbows and knees wouldn’t bend the way I needed, which limited the expressiveness of the body poses.
The deformation problems were even worse in the face. I couldn’t create exaggerated facial expressions at all — whenever I tried making her smile more or raise features for emphasis, the face would distort and parts of it would sink inside the mesh. This made it difficult to convey subtle emotional beats without breaking the model.
Animating the Mech
Animating the mech ended up being one of the most time-consuming parts of the blockout because I had to find a balance between mechanical weight and readable movement. To build a foundation, I loosely followed a human walk-cycle structure — mainly the up-and-down motion — because it helped keep the movement clear and helped me plan steps, spacing, and timing. But I still needed the mech to feel engineered, not human.
To make it feel heavier, I added small jerks and stops in the movement, as if each step required force to land and force to lift again. I didn’t want it to glide too smoothly, because that made it look floaty, but if it was too stiff, it looked locked and unrealistic. I spent a lot of time adjusting the hips, knees, and torso rotations so the weight felt like it was shifting through actual metal parts.
Another challenge was making sure the joints looked like they were rotating from the correct mechanical points. Since the mech had so many rigid parts, every pivot needed to feel like it was actually happening from a hinge or connection, not from a flexible limb. This meant constantly checking angles, making sure no part overlapped awkwardly, and adjusting the rotations so the movement felt like something a real machine could do.
Getting the footsteps right took several tries. If the mech stepped too softly, the whole thing felt weightless. If it stepped too sharply, the motion snapped in a way that didn’t match the rest of the body. I kept reworking the spacing to find something that felt believable enough without overcomplicating the blockout. It isn’t perfect and I know I could improve it, but the blockout helped me establish the general motion, timing, and attitude of the mech.
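One way to picture the timing I was chasing: each foot-drop wants a short hold at the top followed by an accelerating fall into the contact, rather than a constant-speed glide. A hypothetical easing curve for a single step (parameters are made up):

```python
def heavy_step(u, hold=0.15, snap=3.0):
    """Foot-drop progress for normalised time u in [0, 1]:
    a brief hold at the top, then an accelerating fall to the contact."""
    if u <= hold:
        return 0.0                       # the hold before the step commits
    s = (u - hold) / (1.0 - hold)        # remap the rest of the step to 0..1
    return s ** snap                     # power easing: slow start, hard landing
```

Raising `snap` sharpens the landing; lengthening `hold` adds weight but risks the locked, stiff look mentioned above.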
With the models completed, I moved on to the texturing stage, which was one of the most satisfying parts of the project. It was the first time the assets really began to feel alive and part of the same world.
Texturing the Programmer’s Bodysuit
The first asset I focused on was the woman’s sci-fi bodysuit. After creating the garment in Marvelous Designer, I imported it into Substance Painter to begin developing the materials. My goal was for the suit to feel tight but natural, following her silhouette without looking painted onto the mesh.
I experimented with a lot of different colour variations and height maps before settling on the final look. Most of this stage was about testing and adjusting: changing roughness and gloss values, adding subtle folds, and refining the balance between synthetic material and soft fabric. I didn’t want it to feel too shiny or too flat, so finding that middle point took time. Since the skin textures were already done earlier, I made sure the suit’s colours complemented them and stayed consistent with the cool-toned, clinical feel of the lab environment.
Texturing the Mech
Texturing the mech required a completely different mindset. I kept the materials metallic but chose a clean white finish so it visually connected to both the lab space and the programmer’s outfit. I had a broad set of references for white robotics, but I didn’t rely on any one specific image — the goal was simply to maintain a clean, engineered aesthetic. I added accent colours on a few small parts so the model wouldn’t look flat.
To bring subtle life into the mech, I added emissive materials in certain areas. These weren’t meant to glow dramatically, just enough to suggest internal technology and give the model a bit more depth under the lighting. Even though the mech and the programmer were textured separately, I kept referring back and forth between them to ensure the colours, materials, and surface qualities felt like they belonged in the same world.
Throughout the process, texturing became a way to tie the environment, the programmer, and the mech together. Every material choice — from roughness levels to colour testing — helped unify the film’s visual tone. Seeing both characters finally textured made the project feel much more cohesive and grounded, and it marked a major step toward assembling the final scene.
I started the environment by modelling parts of the lab directly in Maya. The original plan was to build two full floors and create every asset manually. I blocked out the rooms just to understand the basic structure, but very quickly it became clear that modelling, unwrapping, and texturing the entire environment would take far more time than the project allowed. The blockout stage was useful for planning scale, but I decided not to continue modelling everything inside Maya and instead focused my time where it mattered more.
Transition to Unreal Engine
Because of the workload, I shifted the environment production into Unreal Engine. This immediately gave me more flexibility, especially for handling large spaces and experimenting with lighting. Unreal allowed me to assemble the scene much faster while keeping the visual quality high. I spent time searching through Fab and other asset libraries for pieces that matched the clean, controlled sci-fi aesthetic I wanted. The goal was a lab that felt modern and almost sterile, with surfaces that looked engineered rather than decorative.
Even though I used ready-made assets, I didn’t bring them in as they were. I mixed elements from different sources, replaced materials, and adjusted proportions so that everything followed a consistent visual language. The space needed to look believable inside a military-scientific facility, so I stayed close to neutral colours, clean panels, reinforced structures, and lighting fixtures that matched the tone of the story. Using kitbash pieces helped accelerate production, but I made sure the overall design still felt curated and intentional.
To maintain cohesion, I imported several assets I had already modelled in Maya. These were mostly smaller mechanical parts and props that supported the lab’s function. Bringing a combination of custom models and library assets into Unreal helped the environment feel less like a collection of prefabs and more like a space built for the specific narrative. The hybrid approach allowed me to keep creative control while still meeting the time constraints of the project.
Final Layout and Cohesion
Even though I changed pipelines, I preserved the general layout from my original Maya plan. Keeping the structure of the lab and computers consistent prevented issues later in animation, especially since the mech had a specific height and required enough room to move.
Once both models were completed, I shifted my focus to rigging the mech, which ended up being one of the most technical stages of the project. Because the mech was built as a collection of separate mechanical components, I first needed to bring order into the structure. It was made up of the gun, the two shoulders, the torso, the head, the waist, and both legs, and each one of these sections contained several smaller meshes. I began by grouping every part carefully and cleaning up the outliner so the hierarchy was absolutely clear before introducing any joints. This preparation was crucial because mechanical rigs rely almost entirely on structure, and any confusion in the hierarchy would have caused problems later when motion needed to flow cleanly through the skeleton.
Working fully in Maya, I started building the joint system by adding bones section by section, always parenting them back to the waist and then the main root. This gave the mech a clear internal logic: the waist acted as the central hub for the upper and lower body, while the root controlled global translation. Creating this order before binding anything helped me think mechanically, not anatomically, which is a mindset you really need for machines. Every joint had to represent a real hinge, a rotation axis, or a point of connection — not something expressive like you would do in a character rig.
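Outside Maya, the structure I was aiming for can be sketched as a simple parent map, with every chain terminating at the root (the joint names here are illustrative, not the exact names in my rig):

```python
# Illustrative parent map for the mech skeleton: each joint names its parent,
# and the root is the only entry with parent None.
HIERARCHY = {
    "root": None,
    "waist": "root",
    "torso": "waist",
    "head": "torso",
    "shoulder_L": "torso",
    "shoulder_R": "torso",
    "gun": "shoulder_R",
    "hip_L": "waist", "knee_L": "hip_L", "ankle_L": "knee_L",
    "hip_R": "waist", "knee_R": "hip_R", "ankle_R": "knee_R",
}

def chain_to_root(joint):
    """Walk a joint's parent chain up to the root."""
    chain = [joint]
    while HIERARCHY[joint] is not None:
        joint = HIERARCHY[joint]
        chain.append(joint)
    return chain
```

Checking that every joint resolves cleanly to the root is exactly the kind of hierarchy sanity check that, done early, prevents motion from breaking partway through the skeleton.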
Solving the Legs and Establishing Motion Systems
The legs quickly became the most complex problem to solve. Each leg needed to be able to carry the weight of the mech and still move with the stability you expect from a piece of heavy machinery. They had three to four joints each, and I needed to build them in a way that would hold up under an IK chain. The feet were especially frustrating because I had designed them with an angled, V-shaped form, which meant that a traditional foot rig setup didn’t apply cleanly. I tested several versions of ankle and toe joints to see what gave the cleanest movement, and after a lot of failed experiments, I simplified everything to a single ankle joint. This ended up giving the IK chain the stability it needed without deforming or collapsing in strange directions.
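Part of why the single-ankle setup stabilised the chain is that a two-joint leg has a closed-form IK solution, while extra joints leave the solver with ambiguous poses. As a sketch of the kind of maths a two-bone IK handle performs (a standard law-of-cosines solver in 2D, not Maya's actual implementation):

```python
import math

def two_bone_ik(x, y, l1, l2):
    """Analytic two-bone IK in 2D: returns the thigh angle and the relative
    knee bend that place the ankle at (x, y), given link lengths l1 and l2."""
    d = math.hypot(x, y)
    d = min(max(d, abs(l1 - l2), 1e-9), l1 + l2)  # clamp target to reachable ring
    cos_knee = (d * d - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    knee = math.acos(max(-1.0, min(1.0, cos_knee)))          # law of cosines
    cos_off = (d * d + l1 * l1 - l2 * l2) / (2.0 * l1 * d)   # hip offset angle
    hip = math.atan2(y, x) - math.acos(max(-1.0, min(1.0, cos_off)))
    return hip, knee
```

With exactly two bones the answer is unique (up to a knee-flip), which is why simplifying to one ankle joint removed the strange collapsing directions.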
After grouping the meshes within each leg, I bound the geometry directly to the four leg joints. Since the mech was fully mechanical, I didn’t need any weight painting. Bind skin alone produced perfect, rigid results and avoided the deformation you would find in an organic model. I did experiment briefly with merging some subgroups to simplify the rig, but it created unnatural sliding and rotational artifacts. Keeping the components separate was the only way to preserve the mechanical logic and prevent any bending.
While the legs relied on IK for grounded, stable motion, the upper body required a different approach. I rigged the torso, shoulders, and head with FK to allow cleaner, more intentional rotational control. FK also matched the animation style I needed: heavy, deliberate pivots instead of elastic, organic movement.
Creating Controls and Building a Usable Rig
Once the skeleton was stable, I added controllers and locators to turn the technical structure into something animatable. I created clear control shapes for the major sections, scaled them up so they wouldn’t get lost inside the mesh, and locked unnecessary transforms to prevent accidental movements. I also designed a system where the shoulder controllers could influence the gun while still letting it behave independently, which was important for the final animation, where the mech twists its upper body while the gun points in a separate direction.
Seeing the rig finally respond the way I intended was incredibly satisfying. It took about three full days to get the entire rig functioning properly, and although the process was slow, it made animating the mech far easier later in production.
One of the biggest lessons I learned during this stage was the importance of establishing a proper root control. At one point I made the waist function as the root, which caused major issues because the entire mech would pivot in strange ways when I tried to move it globally. Realizing this mistake early saved me from much bigger problems later. I rebuilt a true root at the base of the hierarchy, which finally allowed the whole mech to translate cleanly in the scene.
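The problem is easy to see with a little 2D pivot maths (the heights and angle here are made up): rotating the whole mech around a waist-height pivot swings grounded points through an arc, while a root at the base leaves them where they are.

```python
import math

def rotate_about(point, pivot, degrees):
    """Rotate a 2D point (x, height) around a pivot point."""
    a = math.radians(degrees)
    px, py = point[0] - pivot[0], point[1] - pivot[1]
    return (pivot[0] + px * math.cos(a) - py * math.sin(a),
            pivot[1] + px * math.sin(a) + py * math.cos(a))

foot = (0.0, 0.0)                                  # a grounded foot at the origin
waist_as_root = rotate_about(foot, (0.0, 1.5), 90)  # pivot at waist height: foot arcs away
true_root = rotate_about(foot, (0.0, 0.0), 90)      # pivot at the base: foot stays planted
```

With the pivot at the waist, even a simple global rotation lifts and displaces the feet, which is exactly the strange pivoting I was fighting before rebuilding the root.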
By the end of the process, the mech felt like a fully engineered machine — stable, logical, and responsive. Figuring out the rigging was one of the defining steps of the project because it transformed the model from a static design into something capable of carrying the emotional and narrative weight of the film.