
What's my project about?

My project for the MA Creative Technologies course aims to explore the relationship between audio and visuals, and how audio can be represented in visual form. Within this project I aim to create a VFX showreel that acts as a kind of VJ pack, mirroring the characteristics of a music video. This showreel will be made from clips showing the different audio visualizers I have made and each of their capabilities. To do this I have equipped myself with a range of game engines and modelling software currently used within today's industry.

Development Timeline

Creative Technologies

Unreal Engine 5 (UE5) is a powerful game engine developed by Epic Games. It's a software framework that game developers can use to create video games, simulations, and other interactive experiences. Unreal Engine 5 is designed to take advantage of the latest hardware technology, such as the next generation of consoles and powerful PCs, to create highly realistic and immersive environments. Some of the key features of Unreal Engine 5 include advanced lighting and shadows, dynamic weather systems and realistic physics (Unreal Engine undated). It's used by game developers all over the world to create some of the most popular games on the market today. Beyond these reasons, I'm using UE5 because of its Niagara system, which is at the forefront of real-time particle generation, making it valuable for my current project.

Blender is a free and open-source 3D creation suite. It supports the entirety of the 3D pipeline: modeling, rigging, animation, simulation, rendering, compositing and motion tracking, even video editing and game creation. Its user interface is highly customizable, making it accessible to beginners and professionals alike. With Blender, users can create anything from simple objects to complex environments and characters, making it a popular choice in the entertainment industry for creating games, films, and other visual media (Blender undated).


TouchDesigner is a visual development platform that allows users to create interactive and real-time 3D graphics, animations, and multimedia experiences. The software operates on a node-based system, where users connect different modules, called operators, to create complex visual pieces. These operators can handle various tasks such as rendering, animation, audio, and data processing. TouchDesigner finds use in a variety of fields, including art installations, live performances, interactive exhibits and more. Its user-friendly interface and vast community support make it an accessible tool for both beginners and seasoned professionals (Derivative undated).

Adobe Premiere Pro is a video editing software program used for creating and editing video projects, such as movies, TV shows and more. It offers a wide range of tools and features for video editing, including audio editing, color correction, and visual effects. Adobe Photoshop, on the other hand, is a digital image editing software program used for creating and editing images. It's often used for graphic design, photo retouching, and creating artwork. You can use it to adjust colors and tones, remove unwanted elements from images, and create visual effects (Adobe undated). Both of these programs are at the forefront of the media industry, receiving constant updates throughout their lifetimes to keep each workspace as innovative as possible.

Dev Blog - Chapter 1

Experimentation & Testing

For this first dev blog I used the opportunity to introduce myself to the UE5 software and, more specifically, the Niagara system within it. With this I aimed to create a very basic and rudimentary audio visualizer. This visualizer was made from only one emitter, since I wanted to create an effect that was not demanding on my PC, this being an introductory exercise.

For the second dev blog I went about expanding on the techniques I used within the first visualizer. With this in mind, I aimed to create a pixelated cloud with movement reminiscent of a tree in the wind. The reason being that when a high-particle-count effect is combined with a force, it makes a highly aesthetically pleasing simulation. I thought that combining this idea with an audiovisual aspect could make for a very responsive visualizer, especially if the effect is made from tens of thousands of meshes/particles, each reacting to the audio individually.
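The per-particle idea can be sketched in plain Python (Niagara itself is configured through its node graph, not code like this; the function name, phase constant and scaling are all made-up illustrative values):

```python
import math

def displace(index, t, audio_level, wind_strength=0.3):
    """Displacement of one particle in the cloud: a shared sinusoidal
    'wind' force plus an individual audio kick, phase-shifted per
    particle so thousands of meshes don't all move in lockstep."""
    phase = index * 0.37                        # per-particle phase offset
    sway = wind_strength * math.sin(t + phase)  # tree-in-the-wind motion
    kick = audio_level * abs(math.sin(phase))   # individual audio response
    return sway + kick

# At one instant, every particle in a 1,000-strong cloud sits somewhere different.
cloud = [displace(i, t=0.0, audio_level=0.5) for i in range(1000)]
```

The phase offset is the important part: it is what stops the whole cloud moving as a single rigid block when the audio hits.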

When working on this third dev blog, I wanted to test the audio-reactive capabilities of Blender. To do this I set out to make a spherical audio visualizer. Prior to this project I had some experience using Blender, but this would be the first time I had created something using the shader workspace.

Within this fourth dev blog, I set out to create an audio visualizer in a more traditional format: a simple waveform. Again, I used Blender for the creation of this effect, since UE5 doesn't give me the same meticulous control that Blender offers when molding and sculpting materials.

Having completed the prior experiments, I now felt confident in my ability to navigate and work within the Blender software with much more ease. As a result, for my fifth dev blog I went about creating a much more complex audio visualizer. To do this I used a spectrogram as the input basis for the audio visualization. Using this over the normal method of tying audio directly to an effect gives me much more flexibility when trying to create my desired effect; this visualizer, for example, would not have been possible to make without the use of a spectrogram.
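The spectrogram idea can be sketched in plain Python (a naive DFT for illustration only; Blender's own audio-baking tools do the real work, and the frame sizes here are arbitrary):

```python
import cmath
import math

def dft_magnitudes(frame):
    """Naive DFT of one frame; returns the magnitude of each frequency bin."""
    n = len(frame)
    return [
        abs(sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
        for k in range(n // 2)
    ]

def spectrogram(samples, frame_size=64, hop=32):
    """Split the signal into overlapping frames and transform each one.
    Rows are time steps, columns frequency bins: the 2-D grid a
    spectrogram-driven visualizer reads its displacement values from."""
    frames = [samples[i:i + frame_size]
              for i in range(0, len(samples) - frame_size + 1, hop)]
    return [dft_magnitudes(f) for f in frames]

# A pure test tone: the energy should sit in one bin of every frame.
tone = [math.sin(2 * math.pi * 8 * t / 64) for t in range(256)]
spec = spectrogram(tone)
```

The extra flexibility comes from having the whole time-by-frequency grid available at once, rather than a single level value per frame.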

Dev Blog - Chapter 2

After completing my 5th dev blog, I reached the end of the development stage of my project. To mark the inception of the realisation stage, the rest of my dev blogs have been split into a second chapter.


This second chapter has also been introduced due to changes to the project. The major change has been the addition of TouchDesigner, because the software caters to the needs of my project far more closely - as can be seen from the 6th dev blog onwards.


The second change is an alteration to the choice of audio. This is because the original audio's waveform was very flat and unvarying and, as a result, the visualisers were lacking in movement.

So what's changed?

Within this 6th dev blog, I went about creating an entry-level effect within TouchDesigner. This dev blog mirrors the characteristics of my first, acting as an introduction to the TouchDesigner workspace. This effect was built from a material rather than a particle emitter. The numerical values of the audio are tied to equal segments within the material, resulting in the individual cells of the worm acting as incremental visualisers.
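A minimal sketch of that audio-to-segment mapping, in plain Python (in TouchDesigner this is done by exporting channel values rather than writing code; the function name and segment count are mine):

```python
def lit_segments(level, segments=10):
    """Map one audio level in [0, 1] to an on/off state per segment,
    so the cells light up incrementally like a level meter."""
    lit = round(max(0.0, min(1.0, level)) * segments)
    return [i < lit for i in range(segments)]
```

Each cell of the worm reads its own slot of this list, so louder audio lights more of the body.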

After completing my preliminary experiment within TouchDesigner and gaining a level of confidence, I went about creating a histogram-based visualizer. I found that the creation of this visualizer was actually easier to carry out, since it followed some of the same development steps seen within dev blog 5, the spectrogram visualizer. Much like the prior visualiser, this one is also material-based. These materials are twenty evenly segmented lines that have been extended and placed along the x-axis. The numerical values from the audio spectrum are then tied to each individual line, resulting in the visualisation you see within the output.
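The step of collapsing a spectrum into twenty bar values can be sketched as follows (plain Python; in TouchDesigner the audio spectrum operator supplies the magnitudes, and the averaging scheme here is one simple choice among several):

```python
def band_levels(magnitudes, bands=20):
    """Collapse a magnitude spectrum into a fixed number of bands by
    averaging the bins inside each band -- one value per bar."""
    per_band = max(1, len(magnitudes) // bands)
    return [
        sum(magnitudes[b * per_band:(b + 1) * per_band]) / per_band
        for b in range(bands)
    ]

def bar_heights(levels, max_height=1.0):
    """Normalise band levels to bar heights, guarding against silence."""
    peak = max(levels) or 1.0
    return [max_height * v / peak for v in levels]

# A spectrum that rises with frequency should give steadily rising bars.
heights = bar_heights(band_levels(list(range(200))))
```

Each of the twenty lines then reads one height per frame, which is all a histogram visualizer needs.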

Through the creation of this dev blog, I aimed to create a visualizer with a psychedelic visual style. Again, this was a material-based visualizer made of individual lines. These lines are placed to make a solid flat plane, which is then tied to a series of nodes, the main one being a noise node. This node tells the plane which parts to display and which to hide. The result is twisted and mirrored to create the effect seen in the output. The audio is put through a spectrum, and the resulting numerical values are tied to the rotation of the visualizer. However, even though the effect is a nice visual piece, I found it hard to make it an effective visualizer. As a result, I decided to leave it out of my final collective output. Even so, many of the skills gained through completing this dev blog would prove useful in my next visualiser.
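The show/hide logic driven by the noise node can be sketched like this (a cheap hash-style value noise standing in for a proper noise operator; the constants are the usual shader-toy magic numbers, not anything from TouchDesigner):

```python
import math

def value_noise(x, y, seed=0.0):
    """Cheap deterministic pseudo-noise in [0, 1) -- a stand-in for a
    real noise operator, good enough to illustrate the masking."""
    n = math.sin(x * 12.9898 + y * 78.233 + seed) * 43758.5453
    return n - math.floor(n)

def visibility_mask(width, height, threshold=0.5):
    """The noise value at each cell decides whether that part of the
    plane is drawn: above the threshold it stays visible."""
    return [[value_noise(x, y) > threshold for x in range(width)]
            for y in range(height)]
```

Animating the seed or threshold over time is what gives the plane its shifting, psychedelic holes before the twist and mirror stages.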

With the prior dev blog complete, I aimed to create a more complex particle cube visualizer. This visualiser used many of the techniques gained from the previous one, particularly in its post-production stages. Its earlier construction steps were also reminiscent of those seen in dev blog 2, the pixel cloud. One of the main characteristics both effects share is that they're based on a particle emitter rather than a material. This was my first time using the particle emitter within TouchDesigner; however, its customizability had the same characteristics seen within UE5, so building this visualiser came to be quite an intuitive process. Once I completed my first render, though, I struggled with how best to incorporate the audio into the visualizer, and I didn't actually nail the effect until returning to it on multiple occasions. The action that really brought the effect success was separating the bass from the audio using the audio analyser, and then tying that value to the speed of the individual particles. I then used a series of mirrors to create the effect seen within the outputs page.
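The bass-to-speed mapping can be sketched in plain Python (a naive DFT stands in for the audio analyser; the bin range, `base` and `gain` are made-up tuning values):

```python
import cmath
import math

def band_energy(frame, low_bin, high_bin):
    """Energy in one frequency band of an audio frame -- roughly what
    an audio analyser reports when asked for just the bass."""
    n = len(frame)
    total = 0.0
    for k in range(low_bin, high_bin):
        coeff = sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        total += abs(coeff) ** 2
    return total

def particle_speed(bass, base=0.2, gain=0.001):
    """Silence keeps particles drifting at the base rate; a kick in
    the bass band makes them jump."""
    return base + gain * bass

# A low tone lands inside the bass band; a high tone misses it entirely.
low_tone = [math.sin(2 * math.pi * 2 * t / 64) for t in range(64)]
high_tone = [math.sin(2 * math.pi * 20 * t / 64) for t in range(64)]
```

Driving speed from only the bass band is what makes the particles snap to the beat instead of jittering with every hi-hat.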

For my final dev blog, I aimed to create my most complex effect yet within TouchDesigner. This visualizer's main characteristic is that it follows a pathway network for its visual style. Much like the prior visualizer, this one uses a particle emitter for its foundation. The pathway structure is set up through a large series of nodes, and the particle emitter uses these pathways as the route for its movement. The effect is then run through a feedback loop, resulting in the whole particle structure seen within the output. I also ended up returning to this effect, since I knew I could create further visualizers by using this one as the basis. The other visualizer made with this effect can also be seen within the output.
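The feedback loop can be sketched as a one-dimensional toy (a single fading "frame" of brightness values instead of a real feedback operator; the decay constant is illustrative):

```python
def feedback_frame(previous, current, decay=0.9):
    """One step of a feedback loop: the last frame is faded by `decay`
    and the new particle positions are drawn over the top, so a moving
    particle leaves a trail that builds up the pathway structure."""
    return [max(decay * p, c) for p, c in zip(previous, current)]

# A single bright particle stepping right leaves a fading trail behind it.
frame = [0.0] * 5
for pos in range(3):
    live = [1.0 if i == pos else 0.0 for i in range(5)]
    frame = feedback_frame(frame, live)
```

Run over thousands of particles following the pathways, the same fade-and-overdraw step is what turns individual moving points into the persistent network seen in the output.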

