Animations Implementation
Our implementation provides two main ways to experience audio visualisation:
- Particle Visualiser Animation
- Shader Visualiser
Particle Visualiser Animation
Overview
The first mode we chose to implement is the Particle Visualiser Animation system, which creates an interactive, physics-based particle effect that responds to user input and audio characteristics. Users can interact with a variety of particle types, each with unique physical properties and visual styles, creating a dynamic and engaging visual experience that accompanies music playback.
Key Technologies
The system is built using several key technologies:
- p5.js: A JavaScript library focused on creative coding and visual arts, providing a complete framework for drawing and animation
- TypeScript: Used to add type safety and improve code organisation
- Electron: For accessing local file paths and resources
- React: For integrating the animation system into the application
Architecture
The particle animation system follows a modular, object-oriented architecture with these core components:
- Particle Class: Represents individual particles with physics properties and rendering logic
- ParticleSystem Class: Manages collections of particles, their interactions, and lifecycle
- ParticleSelector: Handles particle type validation and selection
- particlePhysics: Defines physical properties for different types of particles
- Sketch Module: The main p5.js sketch that initialises the animation
The particle data is stored in a structured JSON file (particleList.json), making it easy to add or modify particle types without changing the core code.
Implementation Details
Particle Physics and Behaviour
Each particle type has unique physical characteristics defined in particleList.json:
{
  "id": "musicNotes",
  "name": "musicNote",
  "weight": 1,
  "gravity": 0.15,
  "bounce": 0.6,
  "airResistance": 0.01,
  "lifespan": 5000,
  "glow": false,
  "images": [],
  "moods": ["happy", "energetic"],
  "dir": "musicNotes", // directory under which the particle images are stored
  "count": 5 // the number of different particle images that exist for this particle group
}
These properties control how particles move and interact:
- Weight: Affects how forces impact the particle
- Gravity: Determines downward acceleration (can be negative for upward movement)
- Bounce: Controls elasticity in collisions
- Air Resistance: Simulates drag
- Lifespan: Sets how long the particle exists before fading out
- Glow: Optional visual effect
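These values are looked up by the particlePhysics module whenever a particle updates. A minimal sketch of such a lookup, assuming it reads directly from particleList.json as in the final design (exact field handling may differ), could be:

// particlePhysics.ts - hedged sketch of the lookup used in Particle.update()
import particleListData from './particleList.json';

export interface ParticlePhysics {
  weight: number;
  gravity: number;
  bounce: number;
  airResistance: number;
  lifespan: number;
  glow: boolean;
}

export function getParticlePhysics(type: string): ParticlePhysics {
  // Find the matching entry in particleList.json, falling back to the first entry
  const entry =
    particleListData.particles.find(
      (p) => p.name.toLowerCase() === type.toLowerCase()
    ) ?? particleListData.particles[0];

  return {
    weight: entry.weight,
    gravity: entry.gravity,
    bounce: entry.bounce,
    airResistance: entry.airResistance,
    lifespan: entry.lifespan,
    glow: entry.glow,
  };
}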
The Particle class implements these physics properties with realistic motion equations:
update() {
  // Get physics properties
  const physics = getParticlePhysics(this.type);

  // Apply gravity
  const gravity = this.p.createVector(0, physics.gravity * physics.weight);
  this.applyForce(gravity);

  // Apply air resistance
  const airResistance = this.vel.copy();
  airResistance.mult(-physics.airResistance);
  this.applyForce(airResistance);

  // Update position based on velocity
  this.vel.add(this.acc);
  this.pos.add(this.vel);
  this.acc.mult(0);

  // Edge detection and bouncing
  if (this.pos.x < 0 || this.pos.x > this.p.width) {
    this.vel.x *= -physics.bounce;
    // Push particles back into view
    if (this.pos.x < 0) this.pos.x = 5;
    if (this.pos.x > this.p.width) this.pos.x = this.p.width - 5;
  }

  // Decrease lifespan for gradual fading
  this.lifespan -= 2;
}
Collision Detection and Response
The system implements two types of collisions:
- Particle-Particle Collisions: When particles interact with each other, using elastic collision physics
- Mouse-Particle Interactions: When users move their cursor near particles
The particle-particle collision code uses conservation of momentum principles:
checkCollision(other: Particle) {
  const dx = other.pos.x - this.pos.x;
  const dy = other.pos.y - this.pos.y;
  const distance = Math.sqrt(dx * dx + dy * dy);
  const minDist = 40; // Interaction radius

  if (distance < minDist && distance > 0) {
    // Calculate collision normal
    const nx = dx / distance;
    const ny = dy / distance;

    // Calculate relative velocity
    const dvx = other.vel.x - this.vel.x;
    const dvy = other.vel.y - this.vel.y;

    // Calculate impulse with bounce effect
    const impulse = (dvx * nx + dvy * ny) * 0.8;

    // Apply impulse to both particles
    const impulseX = nx * impulse;
    const impulseY = ny * impulse;
    this.vel.x += impulseX;
    this.vel.y += impulseY;
    other.vel.x -= impulseX;
    other.vel.y -= impulseY;

    // Add separation to prevent overlap
    const overlap = minDist - distance;
    const separation = overlap * 0.7;
    this.pos.x -= nx * separation;
    this.pos.y -= ny * separation;
    other.pos.x += nx * separation;
    other.pos.y += ny * separation;
  }
}
Mouse interactions create repulsion forces that push particles away from the cursor:
handleMouseCollision(mouseX, mouseY, mouseVelX, mouseVelY) {
  const mouseRadius = 50;
  const dx = this.pos.x - mouseX;
  const dy = this.pos.y - mouseY;
  const distance = Math.sqrt(dx * dx + dy * dy);

  if (distance < mouseRadius) {
    // Create repulsion force that's stronger when closer
    const repulsionStrength = 1 - (distance / mouseRadius);
    const baseForce = 15;

    // Calculate force direction
    let forceX = (dx / distance) * baseForce * repulsionStrength;
    let forceY = (dy / distance) * baseForce * repulsionStrength;

    // Add mouse velocity influence for momentum transfer
    forceX += mouseVelX * 0.3;
    forceY += mouseVelY * 0.3;

    // Apply the force
    this.vel.x += forceX;
    this.vel.y += forceY;
  }
}
Particle Lifecycle Management
The ParticleSystem class manages the full lifecycle of particles:
- Creation: New particles are spawned at intervals or in response to user interactions
- Update Loop: Physics and positions are continuously updated
- Rendering: Particles are drawn to the screen with appropriate visual effects
- Deletion: Old or off-screen particles are removed to maintain performance
The system maintains a maximum number of particles to ensure consistent performance:
async addParticle(x, y, type, imageNum?) {
  if (this.particles.length >= this.maxParticles) {
    // Remove oldest particle when limit is reached
    this.particles.shift();
  }
  const particle = new Particle(this.p, x, y, type, imageNum);
  await particle.loadImage(imageNum);
  this.particles.push(particle);
  return particle;
}
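The update and clean-up steps of the lifecycle run once per frame. A simplified sketch of that loop (method names such as display() are illustrative and may not match the actual code):

// Illustrative per-frame loop: update physics, draw, and remove expired particles
updateAndDraw() {
  for (let i = this.particles.length - 1; i >= 0; i--) {
    const particle = this.particles[i];
    particle.update();   // gravity, air resistance, bouncing, lifespan decay
    particle.display();  // draw the particle image with its current alpha

    // Remove particles whose lifespan ran out or that fell far below the canvas
    if (particle.lifespan <= 0 || particle.pos.y > this.p.height + 100) {
      this.particles.splice(i, 1);
    }
  }
}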
Dynamic Image Loading
Each particle type uses one or more image assets that are loaded dynamically:
async loadImage(imageNum?) {
  try {
    let imagePath;
    if (imageNum !== undefined) {
      imagePath = await getRandomParticleImage(this.type, imageNum);
    } else {
      imagePath = await getRandomParticleImage(this.type);
    }

    // Load image with p5's loadImage
    this.img = await new Promise((resolve, reject) => {
      const img = this.p.loadImage(
        imagePath,
        () => resolve(img),
        (err) => reject(err)
      );
    });
  } catch (error) {
    console.error('Error loading particle image:', error);
  }
}
The getRandomParticleImage function constructs the correct path for each particle type's images:
export const getRandomParticleImage = async (type, imageNumberDet?) => {
  // Find particle config in JSON
  const particle = particleListData.particles.find(p =>
    p.name.toLowerCase() === type.toLowerCase());

  // Determine which image variant to use
  const imageNumber = imageNumberDet ??
    (particle.count > 0 ? Math.floor(Math.random() * particle.count) + 1 : "");

  // Construct relative path
  const imageName = particle.count === 0
    ? `${particle.name.toLowerCase().replace(/\s+/g, '_')}.png`
    : `${type.toLowerCase().replace(/\s+/g, '_')}${imageNumber}.png`;
  const relativePath = `particles/${particle.dir}/${imageName}`;

  // Use electron to get full path
  return await window.electron.fileSystem.mergeAssetPath(relativePath);
};
User Interaction
The sketch module handles mouse and touch interactions to create an engaging experience:
p.mousePressed = async () => {
  isMouseHeld = true;
  isMouseOnCanvas = true;
  // Create an initial burst of particles when mouse is pressed
  if (particleSystem && isActive) {
    createParticlesAtMouse(p, 5);
  }
  return false;
};

p.mouseReleased = () => {
  isMouseHeld = false;
  mouseHoldTimer = 0;
  // Change particle type on release for variety
  randomiseParticleType();
  return false;
};
When users hold down the mouse or touch the screen, particles continuously spawn:
// In draw function:
if (isMouseHeld && particleSystem.particles.length < particleSystem.maxParticles) {
  mouseHoldTimer++;
  if (mouseHoldTimer >= 6) {
    mouseHoldTimer = 0;
    createParticlesAtMouse(p, 2);
  }
}
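The createParticlesAtMouse helper used above is not shown in full; a hedged sketch, assuming a module-level currentParticleType that randomiseParticleType() updates, could look like this:

// Hypothetical sketch of the spawn helper referenced in mousePressed and draw
async function createParticlesAtMouse(p: p5, count: number) {
  for (let i = 0; i < count; i++) {
    // Scatter new particles slightly around the cursor position
    const offsetX = p.random(-10, 10);
    const offsetY = p.random(-10, 10);
    await particleSystem.addParticle(
      p.mouseX + offsetX,
      p.mouseY + offsetY,
      currentParticleType // assumed module-level state holding the active type
    );
  }
}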
Integration with Audio System
The particle system is designed to work with the audio visualisation and analysis:
- Particle Types: Different particle types can be associated with specific audio moods or characteristics
- Interactive Controls: The particle system can be activated/deactivated based on audio playback state
- Visual Feedback: The cursor overlay provides visual feedback that connects user interaction to the audio experience
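For instance, the mood tags in particleList.json allow a particle type to be chosen to match the current track. A hedged sketch of such a selection (the actual ParticleSelector logic may differ):

// Hedged sketch: pick a particle type whose mood tags match the track's mood
import particleListData from './particleList.json';

export function selectParticleTypeForMood(mood: string): string {
  const candidates = particleListData.particles.filter((p) =>
    p.moods.includes(mood.toLowerCase())
  );

  // Fall back to the first particle type when no mood matches
  if (candidates.length === 0) {
    return particleListData.particles[0].name;
  }
  return candidates[Math.floor(Math.random() * candidates.length)].name;
}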
Extensibility
The system is designed for easy extension in several ways:
- Adding New Particle Types: Simply add a new entry to the particleList.json file with appropriate physics properties (see the example entry below)
- Mood Associations: Particles can be tagged with moods, enabling automatic selection based on music mood
- Custom Images: The folder structure and image naming conventions make it easy to add new visual assets
- Customisable Attributes: Simply change any of the physics properties in the particleList.json file
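As an illustration, a hypothetical new particle type could be registered with a single additional entry in particleList.json (all values below are examples, not real assets):

{
  "id": "bubbles",
  "name": "bubble",
  "weight": 0.5,
  "gravity": -0.05,
  "bounce": 0.3,
  "airResistance": 0.02,
  "lifespan": 4000,
  "glow": true,
  "images": [],
  "moods": ["calm"],
  "dir": "bubbles",
  "count": 3
}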
Sequence Diagram: Particle Creation and Lifecycle
Performance Considerations
The particle system includes several optimisations:
- Particle Limit: A maximum number of particles prevents performance degradation
- Image Caching: Images are loaded once and reused
- Off-screen Clean-up: Particles that leave the visible area are removed
- Efficient Collision Detection: Collision checks are performed without expensive operations
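One way to implement the image cache is a simple map keyed by the resolved image path; the following is a sketch under that assumption (the actual caching strategy may differ):

// Hedged sketch of a shared image cache keyed by resolved asset path
import p5 from 'p5';

const imageCache = new Map<string, p5.Image>();

async function loadCachedImage(p: p5, imagePath: string): Promise<p5.Image> {
  const cached = imageCache.get(imagePath);
  if (cached) return cached; // reuse a previously loaded image

  const img = await new Promise<p5.Image>((resolve, reject) => {
    const loaded = p.loadImage(imagePath, () => resolve(loaded), (err) => reject(err));
  });
  imageCache.set(imagePath, img);
  return img;
}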
Initial Implementation
Initially, the particle system was built with a static configuration. All physics properties and behavioural attributes of the particles were hardcoded in the particlePhysics.ts file, while the particleList.json file only contained a list of particle names. This meant that each particle type had predefined characteristics that could not be easily adjusted or modified at runtime.
As development progressed, we recognised the importance of making the system more flexible and customisable. One of the key factors driving this change was the sensitivity of the children using the application. Since each child has unique sensory preferences and needs, we wanted to ensure that the particle animations could be tailored to provide a comfortable and engaging experience for every user. Additionally, customisation was a core design goal of the application, allowing users to personalise interactions based on their preferences.
To achieve this, we refactored how particle data was stored and managed. Instead of relying on a static physics file, we moved all particle attributes (including physics properties, visual effects, and behaviour) into particleList.json. This allowed for dynamic modifications, enabling attributes to be adjusted in real time and new particle types to be added or removed directly within the app. This shift made the particle system more adaptable, ensuring that users could fine-tune their experience and developers could extend the system with ease.
Conclusion
The Particle Visualiser Animation system provides a flexible, interactive, and visually engaging experience that complements the audio playback features. Its object-oriented design and data-driven approach make it easy to maintain and extend, while the physics-based animation creates a natural and satisfying user interaction model.
Shader Visualiser
Integration with React
First, we needed to integrate ThreeJS with the React framework and Electron. To achieve this, I constructed the visualiser as a class-based React component, which allowed for more effective management of component state and lifecycle methods. The componentDidMount lifecycle method was used to initialise the visualiser, load audio and visual assets, and establish the necessary event listeners.
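A trimmed-down sketch of that structure is shown below; everything apart from the use of componentDidMount (class name, fields, camera settings) is illustrative rather than taken from the actual component:

// Hedged sketch of the class-based visualiser component
import React from 'react';
import * as THREE from 'three';

class ShaderVisualiser extends React.Component {
  containerRef = React.createRef<HTMLDivElement>();
  renderer!: THREE.WebGLRenderer;
  scene!: THREE.Scene;
  camera!: THREE.PerspectiveCamera;

  componentDidMount() {
    // Initialise the Three.js scene once the DOM node exists
    this.scene = new THREE.Scene();
    this.camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.01, 10);
    this.camera.position.z = 1;
    this.renderer = new THREE.WebGLRenderer({ antialias: true });
    this.renderer.setSize(window.innerWidth, window.innerHeight);
    this.containerRef.current?.appendChild(this.renderer.domElement);

    // Audio/texture loading and mouse/resize listeners would be set up here
  }

  componentWillUnmount() {
    // Release GPU resources and listeners when the view closes
    this.renderer.dispose();
  }

  render() {
    return <div ref={this.containerRef} />;
  }
}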
Framebuffer Objects (FBO)
With the preparatory tasks complete, the next phase involved constructing the actual particles. To facilitate this, we used a WebGL feature known as Framebuffer Objects (FBOs) to manage the texture data derived from a black-and-white image, using the RGB value of each pixel to decide which pixels to keep.
let pixels = [];
for (let i = 0; i < canvasData.length; i += 4) {
  let x = (i / 4) % img.width;
  let y = Math.floor(i / 4 / img.width);
  if (canvasData[i] < 100) {
    pixels.push({ x: x / canvas.width - 0.5, y: 0.5 - y / canvas.height });
  }
}
By filtering out the black pixels, this step extracted and stored the coordinates of the desired (non-black) pixels in a data array. This data was then processed off-screen using a simulated scene, material, and mesh, enabling accurate rendering positions to be calculated for later use. This technique greatly reduces CPU load because most of the calculations are handled by the GPU.
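A hedged sketch of how the extracted coordinates can be packed into a floating-point DataTexture that feeds this off-screen simulation (texture size and variable names are illustrative):

// Hedged sketch: pack the extracted coordinates into a float texture for the FBO
const size = 256; // the simulation texture holds size x size particles (illustrative)
const data = new Float32Array(size * size * 4);

for (let i = 0; i < size * size; i++) {
  // Sample the extracted pixel list, wrapping around if there are fewer pixels
  const pixel = pixels[i % pixels.length];
  data[i * 4 + 0] = pixel.x; // x in [-0.5, 0.5]
  data[i * 4 + 1] = pixel.y; // y in [-0.5, 0.5]
  data[i * 4 + 2] = 0;
  data[i * 4 + 3] = 1;
}

const positionTexture = new THREE.DataTexture(data, size, size, THREE.RGBAFormat, THREE.FloatType);
positionTexture.needsUpdate = true;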
Creating Particles
We created a BufferGeometry that reads the texture data from the FBO. After retrieving the positions and texture coordinates (uvs), we used the OpenGL Shading Language (GLSL) to program a vertexShader and a fragmentShader that define how the graphics are rendered on screen. In the vertexShader, the positions are read and mapped to fit the screen before being piped into the rendering process. Similarly, the fragmentShader assigns the particle colour uniformly across all particles based on the given RGB value. Finally, ThreeJS renders the resulting points on the screen, displaying a particular shape derived from the texture image (a snowflake, a dog, etc.).
addObjects() {
  // .... Read from the FBO positions
  const color = this.state.track.textureColor;
  this.material = new THREE.ShaderMaterial({
    uniforms: {
      time: { value: 0 },
      uTexture: { value: this.positions },
      uFrequency: { value: this.analyser.getAverageFrequency() },
      uColor: { value: new THREE.Vector3(color[0], color[1], color[2]) }
    },
    vertexShader: vertexShader,
    fragmentShader: fragmentShader,
    depthWrite: false,
    depthTest: false,
    transparent: true,
  });
  this.mesh = new THREE.Points(this.geometry, this.material);
  this.scene.add(this.mesh);
}
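The vertexShader and fragmentShader referenced above are GLSL sources imported as strings. A simplified sketch of what they might contain (the real shaders are more involved, e.g. they also use time and uFrequency):

// Hedged sketch of the GLSL shader sources, embedded as template strings
const vertexShader = /* glsl */ `
  uniform sampler2D uTexture;   // FBO texture holding the particle positions
  varying vec2 vUv;
  void main() {
    vUv = uv;
    // Read this particle's position from the simulation texture
    vec3 pos = texture2D(uTexture, uv).xyz;
    vec4 mvPosition = modelViewMatrix * vec4(pos, 1.0);
    gl_PointSize = 2.0;
    gl_Position = projectionMatrix * mvPosition;
  }
`;

const fragmentShader = /* glsl */ `
  uniform vec3 uColor;          // particle colour passed in from addObjects()
  void main() {
    gl_FragColor = vec4(uColor, 1.0);
  }
`;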
Mouse Intersection
The idea behind the interaction is that when the user moves the mouse pointer through the particle shape, the surrounding points are pushed away from the cursor, like waves or an explosion. Before creating the actual effect, the app needs to determine where the mouse intersects the particles. To achieve this, we added a mousemove event listener to capture the current pointer position. ThreeJS' raycaster then uses this pointer information to calculate the intersection point with the rendered particles. The resulting coordinate is stored in this.simMaterial.uniforms.uMouse.value for later use.
mouseEvents() {
  this.planeMesh = new THREE.Mesh(
    new THREE.PlaneGeometry(10, 10),
    new THREE.MeshBasicMaterial(),
  );

  window.addEventListener('mousemove', (e) => {
    this.pointer.x = (e.clientX / window.innerWidth) * 2 - 1;
    this.pointer.y = -(e.clientY / window.innerHeight) * 2 + 1;
    this.raycaster.setFromCamera(this.pointer, this.camera);
    const intersects = this.raycaster.intersectObjects([this.planeMesh]);
    if (intersects.length > 0) {
      this.simMaterial.uniforms.uMouse.value = intersects[0].point;
    }
  });
}
In the FBO, the simFragmentShader receives the mouse coordinates and uses GLSL to generate a visual effect centred around the cursor. By simulating a gravitational force with mathematical functions, a particle's velocity falls off as it is pushed away from the cursor, producing a smoother, more natural, non-linear trajectory.
// simFragmentShader GLSL
varying vec2 vUv;
uniform sampler2D uCurrentPosition;
uniform sampler2D uOriginalPosition;
uniform float time;
uniform vec3 uMouse;

void main() {
  vec2 position = texture2D( uCurrentPosition, vUv ).xy;
  vec2 original = texture2D( uOriginalPosition, vUv ).xy;

  vec2 force = original - uMouse.xy;
  float len = length(force);
  float forceFactor = 1. / max(1., len * 100.);
  vec2 positionToGo = original + normalize(force) * forceFactor * 0.1;

  position.xy += (positionToGo - position) * 0.05;
  gl_FragColor = vec4(position, 0., 1.);
}
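Each frame, this simulation shader is rendered into an off-screen render target whose output becomes the position texture for the next frame. A hedged sketch of that ping-pong step (render target and scene names are illustrative):

// Hedged sketch of the per-frame FBO simulation step
renderSimulation() {
  // Feed last frame's positions into the simulation shader
  this.simMaterial.uniforms.uCurrentPosition.value = this.fbo.texture;

  // Render the full-screen simulation quad into the spare render target
  this.renderer.setRenderTarget(this.fbo1);
  this.renderer.render(this.fboScene, this.fboCamera);
  this.renderer.setRenderTarget(null);

  // Swap the two targets and expose the new positions to the particle material
  [this.fbo, this.fbo1] = [this.fbo1, this.fbo];
  this.material.uniforms.uTexture.value = this.fbo.texture;
}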
Player Console
To give users better control over the music being played, we first designed a player console that provides play, pause, drag, and progress functionality. The console is connected to the ThreeJS Audio object and controls the behaviour of the player directly. By keeping an internal clock, the progress bar updates itself by comparing the time elapsed since the last frame.
handleProgressClick(e) {
  if (!e) return;
  const rect = e.currentTarget.getBoundingClientRect();
  const clickX = e.clientX - rect.left;
  const clickProgress = (clickX / rect.width) * 100;
  if (this.state.isPlaying) {
    this.togglePlayPause()
  }
  this.audio.offset = (clickProgress * this.audio.buffer.duration) / 100;
  this.setState({ progress: clickProgress });
}

updateProgress() {
  const currentProgress = (this.dtime / this.audio.buffer.duration) * 100;
  if (this.state.progress + currentProgress >= 100) {
    this.setState({ progress: 0 });
    this.setState({ isPlaying: false });
  } else {
    this.setState((prev) => ({ progress: prev.progress + currentProgress }));
  }
}
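The internal clock mentioned above supplies this.dtime once per animation frame. A hedged sketch of that render loop, assuming a THREE.Clock stored on the component (method and field names are illustrative):

// Hedged sketch of the render loop driving the progress bar and shader uniforms
animate() {
  requestAnimationFrame(() => this.animate());

  // Time elapsed since the previous frame, in seconds (this.clock = new THREE.Clock())
  this.dtime = this.clock.getDelta();

  if (this.state.isPlaying) {
    this.updateProgress();
  }

  // Feed the current audio energy into the shader before rendering
  this.material.uniforms.uFrequency.value = this.analyser.getAverageFrequency();
  this.renderer.render(this.scene, this.camera);
}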
Music Mapping
To correctly map the visuals to the corresponding music, we used React Router to pass the song data retrieved from the backend database. Before rendering the visualiser, we processed the JSON data and stored it using React Hooks. This ensured that each visualiser had access to the necessary metadata, such as texture, background, and texture colour, allowing it to dynamically adapt its appearance based on the selected song.
useEffect(() => {
  const loadTrack = async () => {
    // Parse the texture colour string into an array of normalised floats
    const convertColor = (color) => {
      let colorData = color
        .split(',')                         // Split by comma
        .map(num => parseFloat(num) / 255); // Normalise each channel to [0, 1]
      return colorData;
    };

    setTrack({
      title: songDetail.title,
      artist: songDetail.uploader,
      albumArt: await findCompletePath(songDetail.jacket),
      background: await findCompletePath(songDetail.shaderBackground),
      audioPath: await findCompletePath(songDetail.audioPath),
      texture: await findCompletePath(songDetail.shaderTexture),
      textureColor: convertColor(String(songDetail.particleColour))
    });
  };

  if (songDetail) {
    loadTrack();
  }
}, [songDetail]);
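The songDetail object itself arrives through the route that opens the visualiser. A hedged sketch using React Router's location state (component and state names are illustrative):

// Hedged sketch: reading the selected song from React Router's location state
import { useLocation } from 'react-router-dom';

const ShaderVisualiserPage = () => {
  // The library/player page navigates here with the selected song in location state
  const location = useLocation();
  const songDetail = location.state?.songDetail;

  return <ShaderVisualiser songDetail={songDetail} />;
};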