Animations Implementation

Our implementation provides two main ways to experience audio visualisation: the Particle Visualiser and the Shader Visualiser.

Particle Visualiser Animation

Overview

The first mode of particle animation we chose to implement is the Particle Visualiser Animation system, which creates an interactive, physics-based particle effect that responds to user input and audio characteristics. Users can interact with a variety of particle types, each with unique physical properties and visual styles, creating a dynamic and engaging visual experience that accompanies music playback.

Key Technologies

The system is built using several key technologies:

  1. p5.js: A JavaScript library focused on creative coding and visual arts, providing a complete framework for drawing and animation
  2. TypeScript: Used to add type safety and improve code organisation
  3. Electron: For accessing local file paths and resources
  4. React: For integrating the animation system into the application

Architecture

The particle animation system follows a modular, object-oriented architecture with these core components:

  1. Particle Class: Represents individual particles with physics properties and rendering logic
  2. ParticleSystem Class: Manages collections of particles, their interactions, and lifecycle
  3. ParticleSelector: Handles particle type validation and selection
  4. particlePhysics: Defines physical properties for different types of particles
  5. Sketch Module: The main p5.js sketch that initialises the animation

The particle data is stored in a structured JSON file (particleList.json), making it easy to add or modify particle types without changing the core code.

Implementation Details

Particle Physics and Behaviour

Each particle type has unique physical characteristics defined in particleList.json:

{
  "id": "musicNotes",
  "name": "musicNote",
  "weight": 1,
  "gravity": 0.15,
  "bounce": 0.6,
  "airResistance": 0.01,
  "lifespan": 5000,
  "glow": false,
  "images": [],
  "moods": ["happy", "energetic"],
  "dir": "musicNotes", // directory under which the particle images are stored
  "count": 5 // the number of different particle images that exist for this particle group
}

These properties control how particles move and interact:

  1. Weight: Affects how forces impact the particle
  2. Gravity: Determines downward acceleration (can be negative for upward movement)
  3. Bounce: Controls elasticity in collisions
  4. Air Resistance: Simulates drag
  5. Lifespan: Sets how long the particle exists before fading out
  6. Glow: Optional visual effect

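To make the schema above explicit, a TypeScript interface mirroring the particleList.json entries could look like the following sketch. The interface and the `parseParticleConfig` helper are illustrative assumptions, not the project's actual types:

```typescript
// Hypothetical interface mirroring one entry of particleList.json.
interface ParticleConfig {
  id: string;
  name: string;
  weight: number;
  gravity: number;       // can be negative for upward movement
  bounce: number;        // elasticity in collisions
  airResistance: number; // drag coefficient
  lifespan: number;      // milliseconds before fade-out
  glow: boolean;
  images: string[];
  moods: string[];
  dir: string;           // directory under which the particle images are stored
  count: number;         // number of image variants for this particle group
}

// Minimal validation sketch: reject entries missing the core numeric fields.
function parseParticleConfig(raw: unknown): ParticleConfig {
  const entry = raw as ParticleConfig;
  if (typeof entry.name !== "string" || typeof entry.gravity !== "number") {
    throw new Error("Invalid particle config");
  }
  return entry;
}

const musicNote = parseParticleConfig({
  id: "musicNotes", name: "musicNote", weight: 1, gravity: 0.15,
  bounce: 0.6, airResistance: 0.01, lifespan: 5000, glow: false,
  images: [], moods: ["happy", "energetic"], dir: "musicNotes", count: 5,
});
console.log(musicNote.gravity); // 0.15
```

Validating entries at load time keeps a malformed JSON edit from surfacing later as NaN positions in the physics loop.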
The Particle class implements these physics properties with realistic motion equations:

update() {
  // Get physics properties
  const physics = getParticlePhysics(this.type);

  // Apply gravity
  const gravity = this.p.createVector(0, physics.gravity * physics.weight);
  this.applyForce(gravity);

  // Apply air resistance
  const airResistance = this.vel.copy();
  airResistance.mult(-physics.airResistance);
  this.applyForce(airResistance);

  // Update position based on velocity
  this.vel.add(this.acc);
  this.pos.add(this.vel);
  this.acc.mult(0);

  // Edge detection and bouncing
  if (this.pos.x < 0 || this.pos.x > this.p.width) {
    this.vel.x *= -physics.bounce;
    // Push particles back into view
    if (this.pos.x < 0) this.pos.x = 5;
    if (this.pos.x > this.p.width) this.pos.x = this.p.width - 5;
  }

  // Decrease lifespan for gradual fading
  this.lifespan -= 2;
}
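The `applyForce` step that `update()` relies on can be sketched without p5.js as a plain force accumulator, assuming the conventional a = F / m scaling (the class and field names below are hypothetical, not the project's actual code):

```typescript
// Minimal sketch of force accumulation and Euler integration,
// assuming applyForce scales by inverse weight (a = F / m).
type Vec2 = { x: number; y: number };

class PhysicsBody {
  pos: Vec2 = { x: 0, y: 0 };
  vel: Vec2 = { x: 0, y: 0 };
  acc: Vec2 = { x: 0, y: 0 };
  constructor(public weight: number) {}

  applyForce(force: Vec2) {
    // Heavier particles accelerate less under the same force
    this.acc.x += force.x / this.weight;
    this.acc.y += force.y / this.weight;
  }

  integrate() {
    // Same order as update(): velocity, then position, then clear acceleration
    this.vel.x += this.acc.x;
    this.vel.y += this.acc.y;
    this.pos.x += this.vel.x;
    this.pos.y += this.vel.y;
    this.acc.x = 0;
    this.acc.y = 0;
  }
}

// One frame of gravity on a weight-2 particle,
// mirroring the gravity * weight force in update():
const body = new PhysicsBody(2);
body.applyForce({ x: 0, y: 0.3 });
body.integrate();
console.log(body.vel.y); // 0.15
```

Note how pre-multiplying gravity by weight in `update()` and dividing by weight in `applyForce` cancel out, so all particles fall at the rate set by their `gravity` property regardless of weight, while other forces (drag, collisions) still scale with weight.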

Collision Detection and Response

The system implements two types of collisions:

  1. Particle-Particle Collisions: When particles interact with each other, using elastic collision physics
  2. Mouse-Particle Interactions: When users move their cursor near particles

The particle-particle collision code uses conservation of momentum principles:

checkCollision(other: Particle) {
  const dx = other.pos.x - this.pos.x;
  const dy = other.pos.y - this.pos.y;
  const distance = Math.sqrt(dx * dx + dy * dy);
  const minDist = 40; // Interaction radius

  if (distance < minDist && distance > 0) {
    // Calculate collision normal
    const nx = dx / distance;
    const ny = dy / distance;

    // Calculate relative velocity
    const dvx = other.vel.x - this.vel.x;
    const dvy = other.vel.y - this.vel.y;

    // Calculate impulse with bounce effect
    const impulse = (dvx * nx + dvy * ny) * 0.8;

    // Apply impulse to both particles
    const impulseX = nx * impulse;
    const impulseY = ny * impulse;

    this.vel.x += impulseX;
    this.vel.y += impulseY;
    other.vel.x -= impulseX;
    other.vel.y -= impulseY;

    // Add separation to prevent overlap
    const overlap = minDist - distance;
    const separation = overlap * 0.7;
    this.pos.x -= nx * separation;
    this.pos.y -= ny * separation;
    other.pos.x += nx * separation;
    other.pos.y += ny * separation;
  }
}
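Because the impulse is added to one particle and subtracted from the other, total momentum is conserved (for equal-mass particles). The impulse step can be isolated into a standalone function to check this; the names below are illustrative, not the project's:

```typescript
// Standalone sketch of the impulse exchange from checkCollision,
// assuming equal-mass particles and a pre-computed unit normal (nx, ny).
type V = { x: number; y: number };

function exchangeImpulse(a: { vel: V }, b: { vel: V }, nx: number, ny: number) {
  const dvx = b.vel.x - a.vel.x;
  const dvy = b.vel.y - a.vel.y;
  const impulse = (dvx * nx + dvy * ny) * 0.8; // 0.8 = bounce damping

  // Equal and opposite impulses conserve total momentum
  a.vel.x += nx * impulse;
  a.vel.y += ny * impulse;
  b.vel.x -= nx * impulse;
  b.vel.y -= ny * impulse;
}

// Head-on collision along the x axis
const pa = { vel: { x: 2, y: 0 } };
const pb = { vel: { x: -1, y: 0 } };
const momentumBefore = pa.vel.x + pb.vel.x;
exchangeImpulse(pa, pb, 1, 0);
const momentumAfter = pa.vel.x + pb.vel.x;
console.log(Math.abs(momentumAfter - momentumBefore) < 1e-9); // true
```

The 0.8 factor damps the relative velocity on each collision, so clusters of particles settle rather than jittering indefinitely.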

Mouse interactions create repulsion forces that push particles away from the cursor:

handleMouseCollision(mouseX, mouseY, mouseVelX, mouseVelY) {
  const mouseRadius = 50;
  const dx = this.pos.x - mouseX;
  const dy = this.pos.y - mouseY;
  const distance = Math.sqrt(dx * dx + dy * dy);

  // Guard against distance === 0 to avoid dividing by zero below
  if (distance < mouseRadius && distance > 0) {
    // Create repulsion force that's stronger when closer
    const repulsionStrength = 1 - (distance / mouseRadius);
    const baseForce = 15;

    // Calculate force direction
    let forceX = (dx / distance) * baseForce * repulsionStrength;
    let forceY = (dy / distance) * baseForce * repulsionStrength;

    // Add mouse velocity influence for momentum transfer
    forceX += mouseVelX * 0.3;
    forceY += mouseVelY * 0.3;

    // Apply the force
    this.vel.x += forceX;
    this.vel.y += forceY;
  }
}

Particle Lifecycle Management

The ParticleSystem class manages the full lifecycle of particles:

  1. Creation: New particles are spawned at intervals or in response to user interactions
  2. Update Loop: Physics and positions are continuously updated
  3. Rendering: Particles are drawn to the screen with appropriate visual effects
  4. Deletion: Old or off-screen particles are removed to maintain performance

The system maintains a maximum number of particles to ensure consistent performance:

async addParticle(x, y, type, imageNum?) {
  if (this.particles.length >= this.maxParticles) {
    // Remove oldest particle when limit is reached
    this.particles.shift();
  }

  const particle = new Particle(this.p, x, y, type, imageNum);
  await particle.loadImage(imageNum);
  this.particles.push(particle);
  return particle;
}
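The deletion side of the lifecycle can be sketched as a per-frame filter pass that drops particles whose lifespan has expired or that have fallen well below the canvas. This is an illustrative sketch; the function name, field names, and 50-pixel margin are assumptions:

```typescript
// Hypothetical per-frame clean-up pass for the particle lifecycle.
interface SimpleParticle {
  lifespan: number; // decremented each frame, as in update()
  y: number;        // vertical position on the canvas
}

function pruneParticles(
  particles: SimpleParticle[],
  canvasHeight: number
): SimpleParticle[] {
  // Keep particles that are still alive and within a margin of the canvas
  return particles.filter(p => p.lifespan > 0 && p.y <= canvasHeight + 50);
}

const active: SimpleParticle[] = [
  { lifespan: 120, y: 100 }, // alive and on screen: kept
  { lifespan: 0, y: 100 },   // faded out: removed
  { lifespan: 300, y: 900 }, // far below an 800px canvas: removed
];
console.log(pruneParticles(active, 800).length); // 1
```

Running the prune before spawning new particles keeps the array near its cap without ever exceeding it mid-frame.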

Dynamic Image Loading

Each particle type uses one or more image assets that are loaded dynamically:

async loadImage(imageNum?) {
  try {
    let imagePath;
    if (imageNum !== undefined) {
      imagePath = await getRandomParticleImage(this.type, imageNum);
    } else {
      imagePath = await getRandomParticleImage(this.type);
    }

    // Load image with p5's loadImage
    this.img = await new Promise((resolve, reject) => {
      const img = this.p.loadImage(
        imagePath,
        () => resolve(img),
        (err) => reject(err)
      );
    });
  } catch (error) {
    console.error('Error loading particle image:', error);
  }
}

The getRandomParticleImage function constructs the correct path for each particle type's images:

export const getRandomParticleImage = async (type, imageNumberDet?) => {
  // Find particle config in JSON
  const particle = particleListData.particles.find(p =>
    p.name.toLowerCase() === type.toLowerCase());

  // Determine which image variant to use
  const imageNumber = imageNumberDet ??
    (particle.count > 0 ? Math.floor(Math.random() * particle.count) + 1 : "");

  // Construct relative path
  const imageName = particle.count === 0
    ? `${particle.name.toLowerCase().replace(/\s+/g, '_')}.png`
    : `${type.toLowerCase().replace(/\s+/g, '_')}${imageNumber}.png`;

  const relativePath = `particles/${particle.dir}/${imageName}`;

  // Use electron to get full path
  return await window.electron.fileSystem.mergeAssetPath(relativePath);
};
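The naming convention is easiest to see with the path construction factored into a pure function. This is a hypothetical helper with the random variant replaced by an explicit parameter so the output is deterministic:

```typescript
// Pure sketch of the path construction used by getRandomParticleImage.
// `variant` replaces the random draw; defaulting to 1 is an assumption.
function particleImagePath(
  name: string,
  dir: string,
  count: number,
  variant?: number
): string {
  // Lowercase the name and replace whitespace with underscores
  const base = name.toLowerCase().replace(/\s+/g, "_");
  // count === 0 means a single un-numbered image for this type
  const imageName = count === 0 ? `${base}.png` : `${base}${variant ?? 1}.png`;
  return `particles/${dir}/${imageName}`;
}

console.log(particleImagePath("musicNote", "musicNotes", 5, 3));
// particles/musicNotes/musicnote3.png
console.log(particleImagePath("star dust", "starDust", 0));
// particles/starDust/star_dust.png
```

Keeping the path logic pure like this also makes the convention trivially unit-testable, independent of Electron's file-system bridge.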

User Interaction

The sketch module handles mouse and touch interactions to create an engaging experience:

p.mousePressed = async () => {
  isMouseHeld = true;
  isMouseOnCanvas = true;

  // Create an initial burst of particles when mouse is pressed
  if (particleSystem && isActive) {
    createParticlesAtMouse(p, 5);
  }
  return false;
};

p.mouseReleased = () => {
  isMouseHeld = false;
  mouseHoldTimer = 0;
  // Change particle type on release for variety
  randomiseParticleType();
  return false;
};

When users hold down the mouse or touch the screen, particles continuously spawn:

// In draw function:
if (isMouseHeld && particleSystem.particles.length < particleSystem.maxParticles) {
  mouseHoldTimer++;
  if (mouseHoldTimer >= 6) {
    mouseHoldTimer = 0;
    createParticlesAtMouse(p, 2);
  }
}
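The effect of the frame-counter throttle is easy to quantify: spawning every 6 frames at a typical 60 fps draw loop yields about 10 spawn events per second. A standalone sketch of the counter logic (function name hypothetical):

```typescript
// Sketch of the hold-to-spawn throttle: count spawn events
// over a number of frames at a fixed spawn interval.
function countSpawnEvents(totalFrames: number, interval: number): number {
  let holdTimer = 0;
  let spawns = 0;
  for (let frame = 0; frame < totalFrames; frame++) {
    holdTimer++;
    if (holdTimer >= interval) {
      holdTimer = 0;
      spawns++; // createParticlesAtMouse(p, 2) would run here
    }
  }
  return spawns;
}

// One second of frames at 60 fps with the interval of 6 used in draw():
console.log(countSpawnEvents(60, 6)); // 10
```

With 2 particles per event, holding the mouse adds roughly 20 particles per second, which the `maxParticles` guard in the `if` condition caps before the system degrades.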

Integration with Audio System

The particle system is designed to work with the audio visualisation and analysis:

  1. Particle Types: Different particle types can be associated with specific audio moods or characteristics
  2. Interactive Controls: The particle system can be activated/deactivated based on audio playback state
  3. Visual Feedback: The cursor overlay provides visual feedback that connects user interaction to the audio experience

Extensibility

The system is designed for easy extension in several ways:

  1. Adding New Particle Types: Simply add a new entry to the particleList.json file with appropriate physics properties
  2. Mood Associations: Particles can be tagged with moods, enabling automatic selection based on music mood
  3. Custom Images: The folder structure and image naming conventions make it easy to add new visual assets
  4. Customisable Attributes: Simply change any one of the physics properties in the particleList.json

Sequence Diagram: Particle Creation and Lifecycle

Performance Considerations

The particle system includes several optimisations:

  1. Particle Limit: A maximum number of particles prevents performance degradation
  2. Image Caching: Images are loaded once and reused
  3. Off-screen Clean-up: Particles that leave the visible area are removed
  4. Efficient Collision Detection: Collision checks are performed without expensive operations
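The image-caching point above can be sketched as a simple memoisation map keyed by particle type and variant. The cache shape and names here are illustrative assumptions, with the expensive p5 `loadImage` call replaced by a counter so the cache behaviour is observable:

```typescript
// Hypothetical memoisation sketch for particle image loading.
const imageCache = new Map<string, string>();
let loadCount = 0; // tracks how many "expensive" loads actually ran

function loadCachedImage(key: string): string {
  const hit = imageCache.get(key);
  if (hit !== undefined) return hit; // cache hit: skip the load entirely

  loadCount++; // stand-in for the real p5 loadImage call
  const value = `loaded:${key}`;
  imageCache.set(key, value);
  return value;
}

loadCachedImage("musicNote3");
loadCachedImage("musicNote3"); // second request served from the cache
console.log(loadCount); // 1
```

Since a particle type typically has only a handful of image variants (`count` in particleList.json), the cache stays small while eliminating repeated disk reads as hundreds of particles spawn.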

Initial Implementation

Initially, the particle system was built with a static configuration. All physics properties and behavioural attributes of the particles were hardcoded in the particlePhysics.ts file, while the particleList.json file only contained a list of particle names. This meant that each particle type had predefined characteristics that could not be easily adjusted or modified at runtime.

As development progressed, we recognised the importance of making the system more flexible and customisable. One of the key factors driving this change was the sensitivity of the children using the application. Since each child has unique sensory preferences and needs, we wanted to ensure that the particle animations could be tailored to provide a comfortable and engaging experience for every user. Additionally, customisation was a core design goal of the application, allowing users to personalise interactions based on their preferences.

To achieve this, we refactored how particle data was stored and managed. Instead of relying on a static physics file, we moved all particle attributes (including physics properties, visual effects, and behaviour) into particleList.json. This allowed for dynamic modification: attributes can be adjusted in real time, and new particle types can be added or removed directly within the app. This shift made the particle system more adaptable, ensuring that users can fine-tune their experience and developers can extend the system with ease.

Conclusion

The Particle Visualiser Animation system provides a flexible, interactive, and visually engaging experience that complements the audio playback features. Its object-oriented design and data-driven approach make it easy to maintain and extend, while the physics-based animation creates a natural and satisfying user interaction model.