Muliebrity (2020-22)

Muliebrity [i], describing womanliness, is an ironic name for my programmatic animation system, which functions as an audio visualizer. In parody of the imagining of a Matrix as a reproductive agent, my rendering exploit is generative, cyclic, and glitched. In the spirit of open-source learning, this page unpacks the logic behind the system, which can be reproduced in the Unity Game Engine. I have included a demo package at the end of this entry with a scene already set up.

Why use the Unity Game Engine?

Using a game engine like Unity bypasses the significant wait-times associated with traditional processes of rendering moving-image. Rendering can take considerable time depending on the computing power of a machine, meaning minor adjustments can spell major waits. Unity appeals to me as a storytelling device because it can be interacted with in real time. This affords a degree of play in an otherwise rigid production pipeline. It is incredibly satisfying to hit ▷ and jump into a composition. My approach with Muliebrity [i] is intended as a form of experimental machinima, that is, ‘machine-animation’.

I began articulating Muliebrity [i] in the second of the four Pollinators I attended, which was live-streamed in the absence of Modifyre Festival 2019 at Frequencies TV Meanjin/Brisbane. I applied the techniques I formalised during these sessions to several artworks for public exhibition and learnt a lot about C# and Unity’s animation system in the process. Sound has become an integral component of Muliebrity [i]: it is captured and analysed using a Fast Fourier Transform (FFT) to scale the speed of several elements in real time.

My scripts are by no means programmatically intuitive. For instance, the script I have liberally renamed Stream Audio Data (SAD) is based on a free tutorial series by Peter Olthof on YouTube. Olthof’s free-share of knowledge allowed me to grasp the concept of FFT audio analysis and gave me the confidence as a programmer to branch out and construct my own scripts. I hope to encourage others to play-it-forward.

To Stream Audio Data (SAD):

An audio stream captured from an internal or external source can be converted into data using a Fast Fourier Transform (FFT). An FFT is commonly used in signal processing to represent sound as sinusoidal components (frequency and amplitude). For more information on FFT visit here.

To perform an FFT on spectrum data collected from an audio source in Unity you can use this (SAD) script: Stream Audio Data.
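As a rough illustration of what such a script does, here is a minimal sketch of my own (the class and field names are illustrative, not those of the linked SAD script) that pulls spectrum data from an AudioSource every frame; Unity performs the FFT internally when GetSpectrumData is called.

using UnityEngine;

// Sketch: capture FFT spectrum data from an AudioSource every frame.
// Names are illustrative, not the SAD script's own.
[RequireComponent(typeof(AudioSource))]
public class SpectrumCaptureSketch : MonoBehaviour
{
    // The sample count must be a power of two between 64 and 8192.
    public float[] samples = new float[512];

    private AudioSource audioSource;

    void Awake()
    {
        audioSource = GetComponent<AudioSource>();
    }

    void Update()
    {
        // Unity performs the FFT internally; BlackmanHarris is a common window choice.
        audioSource.GetSpectrumData(samples, 0, FFTWindow.BlackmanHarris);
    }
}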

An audio source could be a combination of sounds collected from the game space, system audio, or sound collected from a microphone in the real world.
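If the sound is coming from a real-world microphone, one way to route it into an AudioSource looks roughly like the sketch below. The device choice (null for the default microphone), clip length, and sample rate are assumptions to adjust for your own hardware.

using UnityEngine;

// Sketch: route the default microphone into an AudioSource so SAD can analyse live sound.
[RequireComponent(typeof(AudioSource))]
public class MicrophoneInputSketch : MonoBehaviour
{
    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();
        // null = default microphone; loop a 10-second clip recorded at 44100 Hz.
        source.clip = Microphone.Start(null, true, 10, 44100);
        source.loop = true;
        // Wait until the microphone has actually started recording before playing.
        while (Microphone.GetPosition(null) <= 0) { }
        source.Play();
    }
}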

It is generally accepted that the human ear can process frequencies between 20 Hz and about 20,000 Hz. This audio spectrum can be collected, converted, and split into categories where the frequencies of instruments are likely to fall. For information on the frequency ranges of common orchestral instruments visit here.

Sub bass: 20 to 60 Hz
Bass: 60 to 250 Hz
Low midrange: 250 to 500 Hz
Midrange: 500 Hz to 2 kHz
Upper midrange: 2 to 4 kHz
Presence: 4 to 6 kHz

To utilise the information falling into each band, we perform a calculation that treats everything within a band (for instance, 20 to 60 Hz) as an interpolation between 0 and 1f.

Floats can be normalised using this logic:

//Clamp the "i" float between the "min" value and the "max" value
i = Mathf.Clamp(i, min, max);
//Calculate the normalized float
normalizedFloat = (i - min) / (max - min);
//Clamp the normalized float between 0 and 1
normalizedFloat = Mathf.Clamp(normalizedFloat, 0f, 1f);
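To make the band list above concrete, here is a hedged sketch of turning one frequency range (say the 60 to 250 Hz bass band) into a single 0 to 1 value. The bin maths assumes the 512-sample spectrum from the capture sketch earlier; SAD's own band-splitting may differ.

// Sketch: average a frequency range of the spectrum and normalise it to 0-1.
float BandValue(float[] samples, float lowHz, float highHz, float min, float max)
{
    // Each bin covers (sample rate / 2) / samples.Length Hz of the spectrum.
    float hzPerBin = (AudioSettings.outputSampleRate / 2f) / samples.Length;
    int lowBin = Mathf.Clamp(Mathf.FloorToInt(lowHz / hzPerBin), 0, samples.Length - 1);
    int highBin = Mathf.Clamp(Mathf.CeilToInt(highHz / hzPerBin), lowBin, samples.Length - 1);

    // Average the amplitude across the band.
    float sum = 0f;
    for (int bin = lowBin; bin <= highBin; bin++)
        sum += samples[bin];
    float average = sum / (highBin - lowBin + 1);

    // Normalise between an expected min and max, following the logic above.
    average = Mathf.Clamp(average, min, max);
    return Mathf.Clamp01((average - min) / (max - min));
}

For example, BandValue(samples, 60f, 250f, 0f, 0.1f) gives a rough bass level, where 0.1f is only a guessed ceiling for raw spectrum amplitudes and needs tuning per source.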

Because Unity is so flexible, SAD can be used to modulate just about anything that is publicly accessible. My graphics and animation speeds are modulated by multiplying them by the interpolated values collected from SAD.
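For instance, a quick sketch of my own (not part of SAD or the artworks below): a Light's intensity driven by one band of the buffer. StreamAudioData.bandBuffer is the band array exposed by the SAD script, which also appears in the speed example further down; the Light reference and multiplier here are illustrative.

// Sketch: drive any public property (here a Light's intensity) from a SAD band.
public Light stageLight;
public int band = 1;

void Update()
{
    stageLight.intensity = 1f + StreamAudioData.bandBuffer[band] * 10f;
}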

The following describes some of the ways to implement SAD.

Modulate the locomotion of a 3D GameObject using SAD:

SAD can be utilised to modulate the locomotion of a GameObject. This is demonstrated in hangmen @ Charon’s Train Station (2021), which features a sentient cyborg train, Charon, carting a wagon containing an over-the-top sound-system.

Powered by Drum & Bass, Charon achieves its ‘chugga chugga’ by scaling the animation of a 3D GameObject between a minimum and maximum threshold using SAD. This logic is applied to several mechanisms, including rotation, force, and speed, which bring the character to life in time with the beat of its environment.

For a higher degree of control over the figure, SAD can also modulate an Animator Controller directly. This means that animations can be keyframed beforehand and then sped up or slowed down in real time.

A strategy follows this logic:

// Fields set in the Inspector
public float speed = 1f;
public bool spedBySAD = true;
public int band = 0;

private float startSpeed;

void Start()
{
    // Cache the base speed so the audio scaling does not compound each frame
    startSpeed = speed;
}

void Update()
{
    if (spedBySAD)
    {
        // Scale the base speed by the current value of the chosen frequency band
        float streamData = StreamAudioData.bandBuffer[band] * 10f;
        speed = startSpeed * streamData;
    }
}
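From there, the modulated speed can be pushed into whatever needs it. One hedged example is handing it straight to an Animator component, so that keyframed clips play back in time with the audio (the Animator reference here is illustrative):

// Sketch: feed the modulated speed into an Animator.
public Animator animator;

void Update()
{
    animator.speed = speed;
}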

The following is as discussed in my exegesis.

Modulate the exposure of a 2D image reel using SAD:

Muliebrity [i] works by collating all the images in a designated array into a list, then cycling through them according to one of the scenarios I have coded: “sequentially”, “pseudo-random”, and “random chunks”.

Sequentially plays the animation in an ordered sequence from beginning to end then loops.

Pseudo-random plays the animation one frame at a time in a random sequence (nothing is ever truly random with computers).

Random chunks picks a random position in the array, plays a set number of frames in ordered sequence, then picks another random position, looping this process.
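Below is a minimal sketch of these three scenarios, assuming the image-reel is an array of Sprites dragged into the Inspector and displayed through a UI Image. The class, field, and mode names are mine; the Muliebrity code linked at the end of this page is the canonical version.

using UnityEngine;
using UnityEngine.UI;

// Sketch of the three playback scenarios over an image-reel of Sprites.
public class ImageReelSketch : MonoBehaviour
{
    public enum Mode { Sequential, PseudoRandom, RandomChunks }

    public Mode mode = Mode.Sequential;
    public Sprite[] reel;          // drag-and-drop the image sequence here
    public Image target;           // UI Image that displays the current frame
    public int chunkSize = 8;      // frames per chunk in RandomChunks mode

    private int index;
    private int chunkRemaining;

    // Call this once per "frame" of the reel (e.g. from a timer or SAD-driven clock).
    public void Advance()
    {
        switch (mode)
        {
            case Mode.Sequential:
                // Ordered sequence from beginning to end, then loop.
                index = (index + 1) % reel.Length;
                break;
            case Mode.PseudoRandom:
                // One frame at a time in a (pseudo) random order.
                index = Random.Range(0, reel.Length);
                break;
            case Mode.RandomChunks:
                // Play a set number of frames in order, then jump somewhere new.
                if (chunkRemaining <= 0)
                {
                    index = Random.Range(0, reel.Length);
                    chunkRemaining = chunkSize;
                }
                else
                {
                    index = (index + 1) % reel.Length;
                }
                chunkRemaining--;
                break;
        }
        target.sprite = reel[index];
    }
}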

My touchstones for artistic production are two software programs from the dial-up internet era: the game Solitaire and the Music Visualiser found in Windows Media Player (Microsoft 1998; Microsoft 2000). With Solitaire specifically, I am interested in how the win animation of bouncing cards is drawn by exploiting an empty, rendered field and limiting the frames per second (FPS). FPS is derived from celluloid film-making and describes how many frames were exposed per second. With the latter, I am attracted to the collaborative possibilities of the computer visually interpreting sound. The speed of Muliebrity [i] is either pre-set or modulated using SAD.

The human eye is thought to be able to process around 10 to 12 individual images per second; any additional frames thereafter begin to fill the gaps and create the illusion of motion. In film the minimum accepted frame-rate is 24 FPS. FPS in digital film is now processed electronically. In videogames the standard is 30 to 60 FPS, but some videogames are uncapped, pushing the hardware to output as many frames as it can reasonably handle.

If one views the panels of a comic book left to right and downwards, they are travelling through a narrative timeline sequentially. On a screen, different cuts of a scene are pieced together and travel spatially across a 3D render field.

Like Solitaire, Muliebrity [i] ‘stamps’ imagery over an empty render field to produce a generative animation. I further innovate on this by modulating the opacity of the image-reel using SAD (following the same logic as speeding up an animation, described above). By layering several image-reels, content organically merges into itself, producing an effect similar to the aesthetic of ‘data-moshing’. *In Unity, the FPS must be limited programmatically (included in my Demo), otherwise the uncapped renderer iterates too fast and the ghosting effect is lost.
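Two fragments that correspond to this paragraph, offered as sketches rather than the demo's exact code: capping the frame-rate programmatically, and modulating a reel layer's opacity from a SAD band. The CanvasGroup reference, band index, target frame-rate, and multiplier are all illustrative values.

// Sketch: cap the frame-rate so the ghosting survives.
void Start()
{
    QualitySettings.vSyncCount = 0;        // vsync would otherwise override the cap
    Application.targetFrameRate = 12;      // illustrative value; tune to taste
}

// Sketch: modulate the opacity of one image-reel layer from a SAD band.
public CanvasGroup reelLayer;
public int band = 2;

void Update()
{
    reelLayer.alpha = Mathf.Clamp01(StreamAudioData.bandBuffer[band] * 10f);
}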

The demo scene I link to at the base of this webpage demonstrates how the win animation for Hydra: An Absurdist Horror about Centrelink (2022) can be reproduced with Muliebrity [i]. The content I used to produce the win-animation of Hydra was machinima of the stars in the far background of Hydra, converted into an image sequence and then reintroduced to Muliebrity [i]. As I did with my other artworks during this project, I repeated this process a few times to promote compression artefacts, resulting in the final composition.

Jaggies:

The visual effect I feature in my artworks, a pixelated stippling that branches into a membrane, is achieved by layering imagery over an empty render field while exploiting Temporal Antialiasing (TAA) within Unity to induce ‘jaggies’.

Jaggies are the stair-like artefacts which emerge when a straight line is drawn across a pixel grid and forced to step vertically or horizontally; antialiasing is the procedure that smooths them. Antialiasing was the achievement of 3D gaming in the 1990s, giving consoles such as the Nintendo 64 and PlayStation 1 their muddy visual quality, smoothing polygonal render edges akin to a painter’s wash. Normally, antialiasing smooths a jaggie where it occurs by calculating the difference between the neighbouring pixels and averaging it.

In combination with a few standard post-processing techniques, including lens distortion and sharpness, alongside a limited FPS, jaggies can be induced and manipulated to animate interesting and often hypnotic visual effects.

I articulate Muliebrity [i] using the most current version of Unity 2019 (at time of writing, Unity 2019.4.40f1) because I exploit Temporal Antialiasing (TAA) to generate an aesthetic. TAA is currently not supported in Unity 2020+ because the Forward Renderer in the Universal Render Pipeline and Lightweight Render Pipeline doesn’t support motion vectors.

According to Unity’s documentation:

“TAA uses frames from a history buffer to smooth edges more effectively than FXAA. It’s better at smoothing edges in motion, but you must enable motion vectors for this. … Because TAA is temporal, it often creates ghosting artifacts in extreme situations, such as when a GameObject moves quickly in front of a surface that contrasts with it.” (Unity 2022)  

In closing, my style of glitched generative animation is an appropriate call-back to the aesthetics of the 1990s, when the SSA 1991 and DDA 1992 were legislatively brought into action.

Demo: link.
Muliebrity Code: link.
SAD Code: link.

*Note to examiner: In my exegesis I wrote that Muliebrity works by collating images found in a Resources folder into an image-reel. While packaging up my demo, I decided that this method was too performance-heavy for lower-spec machines, and so changed it to a system where the user must collate the array themselves by dragging and dropping into the index prior to building.

I also wrote that Muliebrity both happened in real time in Hydra and did not. This is contradictory, but accurate, and something I regret not expanding on. An amendment I would like to include in my final submission is that the win-animation was filmed in real time, exhibited as a video, and, along with the locomotion of ‘The Maggot’ (the videogame’s titular space-ship), modulated by SAD. The TAA exploit does not occur in real time (for the reasons described above).