Week 6: Low-poly-black-liquid-evil-Swarovski doggo
- priscillaanne07
- Jul 9, 2018
- 3 min read
In our NukeX tutorial this week we covered making planet Earth. I'm glad we went a bit more in depth on the 3D side of Nuke. Check out my earth:

I love the fact that we can semi-recreate a 3D scene in Nuke, using a point & normal pass, for the purposes of relighting a scene if need be.
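Relighting from point and normal passes boils down to evaluating a light model per pixel using the geometry data baked into those passes. Here's a minimal Lambert sketch for a single pixel and one point light; the function name and all values are illustrative, not Nuke API calls.

```python
# Minimal per-pixel Lambert relight using a position (point) and normal pass.
# point / normal come from the renderer's AOVs; light_pos, light_col and
# albedo are scene values. Everything here is an illustrative assumption.

def lambert(point, normal, light_pos, light_col, albedo):
    # direction from the surface point to the light, normalised
    lx, ly, lz = (light_pos[i] - point[i] for i in range(3))
    length = (lx * lx + ly * ly + lz * lz) ** 0.5
    L = (lx / length, ly / length, lz / length)
    # clamp the cosine term so surfaces facing away get no light
    ndotl = max(0.0, sum(normal[i] * L[i] for i in range(3)))
    return tuple(albedo[i] * light_col[i] * ndotl for i in range(3))
```

A surface facing the light head-on gets the full albedo-times-light colour; as the normal tilts away, the contribution falls off with the cosine, which is exactly why the normal pass is needed.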
This week I had a go at colour matching a raw image of a sunset to a beautifully graded one, using the ColorLookup and MatchGrade nodes. These two 'autofix' nodes didn't work perfectly (the blue hue of the sea came out slightly greener than the hue of the sky), so I sectioned off the sea with a feathered Crop node, colour matched it separately, then merged it back over the sky. The blues turned out a bit better. Note to self: avoid JPEGs.
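MatchGrade's real algorithm is more sophisticated, but the simplest form of automatic colour matching is a per-channel mean/std transfer: shift and scale each channel of the source so its statistics match the target. A minimal sketch (images as lists of RGB tuples; real footage would be float pixel arrays):

```python
# Statistical colour matching: per channel, remap the source so its mean and
# standard deviation match the target's. A toy stand-in for what a "match
# grade" style operation does under the hood.

def channel_stats(pixels, ch):
    vals = [p[ch] for p in pixels]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return mean, var ** 0.5

def match_grade(source, target):
    """Shift/scale each channel of `source` to match `target`'s stats."""
    stats = [(channel_stats(source, c), channel_stats(target, c)) for c in range(3)]
    out = []
    for p in source:
        q = []
        for c in range(3):
            (s_mean, s_std), (t_mean, t_std) = stats[c]
            scale = t_std / s_std if s_std else 1.0
            q.append((p[c] - s_mean) * scale + t_mean)
        out.append(tuple(q))
    return out
```

This also hints at why the sea went wrong: one global shift/scale per channel can't satisfy two regions (sea and sky) whose hues need different corrections, which is exactly what the feathered crop-and-merge fixed.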
Raw compared to graded:

Target compared to graded source:

I tried my hand at stabilising handheld camera footage using keyframe tracking. The footage of tigers being fed was so jerky that I couldn't get away with automatic camera tracking.
Before setting up trackers, I colour graded and sharpened the footage so the trackers could identify pixel values a bit better. I chose points in the shot that didn't move out of frame or change too much (e.g. the corner of the cage, cement on the cage). I manually set keys at the start, midpoint and end, then at the midpoints between those keys, and kept subdividing. I also set keys wherever the frame jumped significantly (about 60 keys altogether), auto-tracked everything else, then went through and refined any errors.
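The midpoint keying order above is just a breadth-first bisection of the frame range: endpoints first, then each gap's midpoint, then the midpoints of the new gaps. A small sketch (frame numbers are illustrative):

```python
# Generate the order in which manual tracker keys get set when subdividing:
# start and end first, then successive midpoints, breadth-first.

from collections import deque

def midpoint_key_order(first, last):
    """Return frame numbers in midpoint-subdivision keying order."""
    order = [first, last]
    queue = deque([(first, last)])
    while queue:
        a, b = queue.popleft()
        mid = (a + b) // 2
        if mid in (a, b):        # gap fully subdivided
            continue
        order.append(mid)
        queue.append((a, mid))
        queue.append((mid, b))
    return order

print(midpoint_key_order(0, 8))  # → [0, 8, 4, 2, 6, 1, 3, 5, 7]
```

Setting keys in this order means the tracker is always interpolating across the smallest remaining gap, so each new manual key has the least distance to correct.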

I did this twice over and checked the translate (T), rotate (R) and scale (S) boxes for each tracker, so all those factors were taken into account when exporting a Transform (stabilise) node. My two tracking points were centrally placed, so I added an extra one on the side to give the stabiliser more scope to work with. That third track point went off screen in the second half of the footage, so I took advantage of the offset tool to finish tracking. #handy. The third point automatically updated the data in the Transform (stabilise) node, improving my footage a little bit more.
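For the translate part of that exported stabilise, the maths is simple: offset every frame so the tracked point sits where it was on a reference frame (rotation and scale need two or more tracks, which is why the extra point helps). A minimal sketch with made-up track data:

```python
# Translation-only stabilisation from one tracked point: for each frame,
# shift the image by (reference position - tracked position). Track values
# below are illustrative, not from the actual tiger footage.

def stabilise_offsets(track, ref_frame=0):
    """track: {frame: (x, y)} tracked point positions.
    Returns {frame: (dx, dy)} offsets to apply to each frame."""
    rx, ry = track[ref_frame]
    return {f: (rx - x, ry - y) for f, (x, y) in track.items()}

track = {0: (100.0, 50.0), 1: (103.0, 48.5), 2: (97.0, 52.0)}
offsets = stabilise_offsets(track)
# frame 1 shifts by (-3.0, 1.5), frame 2 by (3.0, -2.0)
```

Every frame ends up shifted back toward the reference position, which is also why the frame edges drift out of picture and the footage needs cropping afterwards.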

I had to crop the footage down by a ridiculous amount to cut out all the void edges. The result wasn't perfectly stabilised in the end; it was probably a poorly chosen clip.

I also tested putting a little (pink) Monopoly house into drone footage. The preset camera (lens) settings in the CameraTracker node are super helpful. The test went well; the geo stays put and doesn't float or slide. I still need to fix the texturing and lighting, and maybe add some rotoscoping.



This week I put a lot of hours into testing more dynamic effects for the dog in the boss battle scene. Many were unsuccessful, some were not bad. I tried to avoid spending hours simply exploring fluids, dynamics, MASH, etc. (the possibilities are endless), and instead went to my peers for feedback on what was working well and what wasn't.
Here are some examples:





As it turns out, none of the dynamic tests I came up with this week hit the mark for our client. Regardless, I learnt a lot about fluids and MASH networks. After pressing our client for a clearer vision of the boss dog, I was able to go away and produce something that hit the target!

The successful result was a lot simpler to achieve than I thought. I kept the dog low poly and hardened the mesh display (this was key to keeping the low-poly look sharp). I applied an aiStandardSurface shader with a glass preset, heightened the specularity and changed the colour to black. I then applied a texture deformer with a volume noise texture plugged in, tweaked some attributes to give the dog a slight glitchy look when in motion, and animated the texture deformer handle to keep the mesh moving in a wave-like motion even when the dog comes to a standstill, et voilà! The result was like a low-poly-black-liquid-evil-Swarovski doggo.
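Conceptually, a texture deformer like that displaces each vertex by a noise field sampled at its position, and animating the handle shifts where the field is sampled, so the mesh keeps rippling even when the character stands still. A toy sketch, with a cheap hash-based noise standing in for Maya's fractal volume noise (all names and numbers are illustrative, not Maya API):

```python
# Toy version of noise-driven vertex displacement: sample a noise field at
# (vertex position + animated offset) and push each vertex along an axis.
# hash_noise is a cheap stand-in for proper fractal volume noise.

import math

def hash_noise(x, y, z):
    """Cheap deterministic pseudo-noise in [0, 1)."""
    n = math.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453
    return n - math.floor(n)

def deform(vertices, strength, t):
    """Displace vertices by noise sampled with a time-animated shift `t`,
    so the mesh keeps moving even when the object itself is still."""
    out = []
    for (x, y, z) in vertices:
        d = hash_noise(x + t, y + t, z) - 0.5   # centre noise around zero
        out.append((x, y + d * strength, z))    # displace along Y for simplicity
    return out
```

Because only the sample offset `t` is animated, the vertices themselves never need keyframes: the wave-like motion falls out of re-sampling the same noise field over time, which matches the effect of keying the deformer handle.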

Going forward I want to research which AOVs and render layers/settings are suitable for our short film. I want to put a mock scene together using the test room, textures, lighting and the test 2D dog with its attached camera, render out a few seconds and practise compositing that in NukeX to get a nice, balanced Pixar-like atmosphere.
See you next week!