Category: Term 3
Artefact project
Final Output
Challenges
Creating physics simulations was the most challenging part of this project.
I first tried to use ragdoll add-ons in Blender with rigid body simulation to get the effect I needed, but the simulations didn’t look natural and produced odd bends at some bones.
For my second try, I used the physics simulations in Unreal Engine.
I set up the physics bodies and their constraints on the rig I had. Depending on the joint, I limited the angular rotation constraints so that the ragdoll simulation looked more human-like.
The main issue, however, was exporting an .fbx or .abc file out of Unreal to apply to the mesh I had in Blender.
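The idea behind per-joint rotation limits can be sketched in plain Python. The joint names and angle ranges below are my own illustrative assumptions, not the actual values from my Unreal physics asset:

```python
# Allowed rotation range per joint, in degrees (min, max) around one axis.
# Ranges are illustrative guesses at "human-like" limits.
JOINT_LIMITS = {
    "neck":  (-40.0, 40.0),
    "elbow": (0.0, 150.0),   # elbows should not hyperextend backwards
    "knee":  (-150.0, 0.0),  # knees bend one way only
}

def clamp_joint(joint: str, angle: float) -> float:
    """Clamp a simulated joint angle into its human-like range."""
    lo, hi = JOINT_LIMITS[joint]
    return max(lo, min(hi, angle))
```

Tightening these ranges per joint is what removed most of the unnatural bends the ragdoll produced with unconstrained rotation.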



Learnings
There were two main things that I realised from this project.
- Pairing the human ragdoll simulation style with the miniature effect made the humans hard to read in the scene. The animation also lost its meaning when the characters were at such a small scale. [close up ss]
- Simulations take a lot of time, which affected my initial timeline plan. Aspects like textures and lighting had to be set aside because the simulation process required trial and error. A better alternative would be to work out the simulation in Houdini.
On the positive side, I learnt how to get the miniature effect in Blender.
Future Explorations
I will work on the idea further while fixing the style of animation. I personally like the theme and want these loops to be thought-provoking bits to reach out to people.
Week 19: Acting Spline
Acting Spline
Week 18: Blocking Plus & VJing
Acting Blocking plus
For the 2nd pass, I turned the head more towards the camera, keeping the eyes locked on the ‘friend’ that the character is saying the line to.
Using a constrained empty object at the nose tip, I generated a motion trail and fixed the arcs of its movement.
I also added exaggeration to the body movements to improve the moment when the character shouts.

Resolume: VJ
I’ve always wanted to learn to play a MIDI controller for music, but using it for video was even better than I expected.
I experimented by making a composition from the default clips available in Resolume and syncing it with electronic music, playing live while changing the visuals to match the rhythm and beats.
Key Learnings –
- We learnt how to map keyboard and MIDI keys to different controls in Resolume.
- How to import different clips into Resolume and add transition effects.
- Using sliders and dials to control speed, opacity, scale and other properties of clips.
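Under the hood, a MIDI dial sends a 7-bit value (0–127) that the software maps onto a parameter’s range. A minimal sketch of that mapping (the opacity and speed ranges are illustrative, not Resolume’s internals):

```python
def midi_to_param(cc_value: int, lo: float, hi: float) -> float:
    """Map a 7-bit MIDI CC value (0-127) linearly onto [lo, hi]."""
    cc_value = max(0, min(127, cc_value))  # clamp to the valid MIDI range
    return lo + (hi - lo) * cc_value / 127.0

# e.g. one dial mapped to clip opacity (0-100%), another to playback speed (0-4x)
opacity = midi_to_param(64, 0.0, 100.0)
speed = midi_to_param(127, 0.0, 4.0)
```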

Week 17: Animation Blocking & 360° Video
Acting Blocking
Based on my planning and reference, I did the blocking. The first step was to get the head and body movements right and then add in the main mouth and eye poses.
After the 1st pass, the head movement didn’t feel right. The head wasn’t following proper arcs, and its movement seemed disconnected from the body.
360° Video



Body Mechanics – Polish
To fix the overall motion path, I imagined the hips as a bouncing ball. Tracking their movement in arcs and stopping after the slide made the physics of the body feel more realistic.
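The bouncing-ball idea can be reasoned about numerically: each bounce loses energy, so successive arcs should shrink predictably. A toy sketch (the restitution value is an illustrative assumption, not measured from my shot):

```python
def bounce_heights(h0: float, restitution: float = 0.6, bounces: int = 4):
    """Return the peak height after each successive bounce.

    Bounce speed scales by `restitution`, so peak height scales
    by restitution squared -- which is why the arcs get smaller
    and smaller until the motion settles.
    """
    peaks = []
    h = h0
    for _ in range(bounces):
        h *= restitution ** 2
        peaks.append(h)
    return peaks
```

Checking the hips’ arcs against this kind of decay is what made the slide-and-stop read as physically grounded rather than floaty.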

Some major fixes done:
- Better arcs for hips, head, hands and legs.
- Improved weight shift on each step.
- Added movement to fingers and sword sheath as overlapping motion.
- Fixed sword flicks to be faster and snappier.
Acting Planning
For the acting animation, I chose a dialogue from a DnD series I watch. The dialogue varied in tone, going from arrogant to angry in a sarcastic way. I decided to change the scenario and context of the dialogue to a funny, helpless situation that the actor is stuck in.

For the planning, I kept the voice audio and marked the key moments where the sentences pause and change tone. Then I recorded multiple references, settling on two in the end: one had good head movement and timing; the other, better expressions.

Motion capture
This week we did a Motion Capture breakout. It was fun to experiment with different movements and see them translated in 3D software in real time.
I tested out some movements with a sword prop to get insights for my body mechanics shot.
I didn’t get time to take the motion capture data and retarget it onto a rig, but I will test that out after the term.

Advanced Body Mechanics – Spline
After fixing the arcs and poses, I switched my animation to spline, and a lot of errors started showing up.
- The hand movement was jittering during the slide.
- The sword and sheath were popping in some places.
- The overall movement felt robotic and not human-like.
Mad Mapper: Projection mapping
This was an interesting session for me. I have wanted to learn projection mapping for a long time but never got the opportunity.
I worked on a custom object made from paper to test the projections. The tool was easy to understand and use, but mapping on thin edge surfaces was a little tricky.


Skill Application: Update after 16th June
I used the techniques and skills learnt in this class on a contract project: an animated mural for the ‘handover event for the Bakerloo line’. I worked on the motion graphics, which were projected onto a painted mural.
I did not do the mapping myself, but knowing how the tool would later be used for the projection made it easier to plan the animations and output files.




Artefact project: Concept & Planning
For my experimental artefact, I want to create small looping animations that would create an ‘oddly discomforting’ feel. The idea is to create a ragdoll-like mass of human bodies and experiment with physics simulations.
These would be based on real-life scenarios, as seen in India, where the huge population causes accidents and harms the natural urban fabric of cities.


I picked 3 scenarios to represent this concept.
1. Ghazipur Landfill Mountain – This is a waste landfill site near New Delhi that has been filled over capacity for the past 20 years, turning it into a huge mountain of waste that pollutes the air and water.
2. Mumbai Local Train – There are a number of deaths every year in accidents at train stations or on the tracks. A recent news story was about several people dying after falling from a moving train.
3. Bengaluru Waterlogging & Flooding – Waterlogging during the rains is common in many Indian cities, due both to corruption and mismanagement by the authorities and to illegal construction by citizens.



Inspirations
For the style and mood, I decided to take inspiration from the style of the artist ‘Extraweg’ (Oliver Latta) https://www.extraweg.com/art
The ragdoll-like simulated bodies, moving in a clump, gave me an eerie feeling that I wanted to recreate in my experiment.
To show the massive scale of the population (and not to complicate my idea further), I decided to use a tilt-shift-like effect for my compositions, keeping an orthographic top view with a still camera, similar to the one seen in BUCK studio’s Night of the Mini Dead from Love, Death and Robots (2022).
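In Blender I got the miniature look with camera depth of field, but the principle behind a tilt-shift fake can be sketched as a blur that ramps up with distance from a sharp focus band (all parameter values below are illustrative):

```python
def blur_radius(y: float, focus_y: float, band: float, max_blur: float) -> float:
    """Blur radius in pixels for image row y.

    Rows inside the focus band stay sharp; blur grows linearly
    outside it, up to max_blur -- the gradient that sells the
    miniature effect.
    """
    d = abs(y - focus_y)
    if d <= band:
        return 0.0
    return min(max_blur, (d - band) / band * max_blur)
```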
Advanced Body Mechanics: Blocking Plus
For the body mechanics, I continued fixing my poses and timings in blocking, adding more keys where there were longer gaps between existing ones.
I fixed some of the arcs and timings from the previous blocking, but it needs another pass before moving to spline.
Advanced Body Mechanics: Blocking Plus
For the next pass, I added moving holds and copied pairs. Both these concepts were new for me.




In the examples above, the overall body is holding a pose, but to keep life in the character, a slight motion of the head or arm is added. This also gives a pause in the overall movement, adding rhythm to the animation.
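The moving-hold idea can be sketched numerically: rather than freezing a value across the hold, drift it a small fraction of the way towards an offset pose. The 10% drift amount here is an illustrative assumption:

```python
def moving_hold(pose_value: float, offset: float, t: float, drift: float = 0.1) -> float:
    """Value of a held channel partway through a moving hold.

    t runs from 0 to 1 across the hold; only `drift` of the full
    offset is ever applied, so the pose reads as held but not dead.
    """
    return pose_value + offset * drift * t
```

A copied pair works the same way: the end key is a near-duplicate of the start key, offset just enough that the curve never goes perfectly flat.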
I also added a placeholder background to define the composition and added projectiles that the character is deflecting to make the shot more engaging.
Errors to be fixed –




Unreal Engine: nDisplay
At its core, nDisplay is a system in Unreal Engine that lets you project your virtual environment onto multiple displays, like LED panels, projectors, or even curved screens. All these displays can be synced together in real time. This is the backbone of many modern virtual production sets, where instead of a green screen, actors are surrounded by digital environments displayed on massive LED walls.
It is certainly an evolution in filmmaking technique: moving on from green screens, it is more immersive for the performer, who can see the background environment in real time. It also allows a better lighting setup for the shoot and reduces post-production lighting work.
I was always intrigued by how The Mandalorian was produced; I had watched the making-of videos but never understood how the displays worked. After this lesson I understood how the combination of nDisplay and VCams might have been used to create real-time changes in the environment linked to the camera movements.

Another use of nDisplay could be anamorphic commercial banners: the artist makes the animation as usual, and the output can easily be displayed across multiple screens or a curved screen.

Week 12: Animation Blocking & VCams
Advanced Body Mechanics: Blocking
I recorded some parts of the reference and revised my planning to add story to the character motion.
I also decided to change the ending to be just a simple jump rather than a spinning movement.

After the planning, I focussed on the timings of animation for the first blocking pass.
Unreal Engine: VCams
This week we tried to use Virtual Camera in Unreal Engine. It is a useful tool to quickly look around the scene and set up camera angles in Unreal Engine using just an iPad or mobile phone.
I attempted to take a few shots from my old project using my phone, but due to the vertical elevations in my project file, I could not record a proper take. Given the right environment, though, this tool can be useful for quick pre-vis or prototyping.
