Building Clayable - A 3D Sculpting Web Application
Author: magical paperclip
Project Overview
The goal of this project was to create an interactive 3D sculpting application that simulates the experience of molding clay through a web browser. The application needed to provide real-time vertex manipulation, intuitive controls, and a responsive user experience across both desktop and mobile platforms.
Foundation and Initial Setup
The project began with establishing a basic Three.js environment to render 3D graphics in the browser. Three.js was chosen for its comprehensive 3D graphics capabilities and extensive documentation.
// Initial Three.js scene setup
let scene = new THREE.Scene();
let geometry = new THREE.SphereGeometry(2, 64, 32); // High-resolution sphere
let material = new THREE.MeshLambertMaterial({ color: 0xe8c291 });
let sphere = new THREE.Mesh(geometry, material);
The sphere geometry uses 64 horizontal and 32 vertical subdivisions, resulting in approximately 4,000 vertices. This high resolution was necessary to achieve smooth deformation during sculpting operations.
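The snippet above covers only the clay mesh itself. For completeness, a minimal sketch of the rest of a typical Three.js setup (camera, light, renderer, and render loop) might look like the following - the variable names here are illustrative, not necessarily the ones used in Clayable.
// Minimal sketch of the remaining setup (illustrative names: cam, renderer, animate)
let cam = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 0.1, 100);
cam.position.set(0, 0, 8);

let renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// MeshLambertMaterial only shows up with lighting in the scene
scene.add(new THREE.AmbientLight(0xffffff, 0.4));
let light = new THREE.DirectionalLight(0xffffff, 0.8);
light.position.set(5, 5, 5);
scene.add(light);

scene.add(sphere);

function animate() {
    requestAnimationFrame(animate);
    renderer.render(scene, cam);
}
animate();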
Core Sculpting Algorithm
The primary challenge was implementing real-time vertex deformation that mimics the behavior of physical clay. A 3D mesh consists of thousands of individual vertices, and when sculpting real clay, applying pressure to one area affects surrounding material with diminishing intensity based on distance.
The sculpting system required three fundamental operations:
- Raycasting - Determining the precise 3D coordinates where the user interacted with the mesh
- Proximity Detection - Identifying vertices within the influence radius of the interaction point
- Vertex Transformation - Applying appropriate mathematical transformations to affected vertices
Implementation Details
moldClay(x, y, z, isTouch = false) {
    let pos = new THREE.Vector3(x, y, z); // Interaction point in 3D space
    let geom = this.ball.geometry;
    let verts = geom.attributes.position.array; // Vertex position data

    // Iterate through all vertices in the mesh
    for (let i = 0; i < verts.length; i += 3) {
        let v = new THREE.Vector3(verts[i], verts[i + 1], verts[i + 2]);
        let dist = v.distanceTo(pos); // Euclidean distance calculation

        if (dist < this.size) { // Within influence radius
            // Calculate falloff factor using quadratic curve
            let factor = Math.pow(1 - (dist / this.size), 2);
            this[this.tool](i, v, pos, factor, isTouch);
        }
    }

    // Signal geometry update to rendering engine
    geom.attributes.position.needsUpdate = true;
    geom.computeVertexNormals();
}
Mathematical Foundation
The falloff calculation Math.pow(1 - (dist / this.size), 2) implements a quadratic decay function:
- Normalization: dist / this.size converts the distance to a value between 0 and 1
- Inversion: 1 - (normalized distance) creates a value where 1 represents maximum influence
- Quadratic Curve: Math.pow(..., 2) squares the result, so influence drops off quickly near the interaction point and tapers gently toward the brush edge, creating more natural deformation than a linear falloff
This approach ensures vertices at the interaction point receive maximum transformation (factor = 1), while vertices at the brush boundary receive minimal transformation (factor ≈ 0).
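To make the curve concrete, here are a few sample values, assuming a brush radius (this.size) of 1.0 purely for illustration:
// Illustrative falloff values for a brush radius of 1.0
let size = 1.0;
let falloff = dist => Math.pow(1 - (dist / size), 2);

console.log(falloff(0.0));  // 1.00   - full strength at the interaction point
console.log(falloff(0.25)); // 0.5625 - a quarter of the way out
console.log(falloff(0.5));  // 0.25   - half the radius, a quarter of the strength
console.log(falloff(0.9));  // 0.01   - barely moves near the brush boundary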
Sculpting Tool Implementation
The application implements five distinct sculpting tools, each employing different mathematical approaches to vertex manipulation. Each tool simulates a specific clay working technique through calculated vertex transformations.
Push Tool - Inward Deformation
The push tool simulates pressing into clay material, creating indentations by moving vertices along their normal vectors in the inward direction.
push(i, v, pt, factor, isTouch) {
    let dir = v.clone().normalize(); // Unit vector from origin to vertex
    let amt = this.str * factor * (isTouch ? 6 : 5); // Scaled displacement amount

    // Apply inward displacement along normal vector
    this.verts[i] -= dir.x * amt;
    this.verts[i + 1] -= dir.y * amt;
    this.verts[i + 2] -= dir.z * amt;
}
Mathematical Explanation: The normalize() method converts the direction vector to unit length (magnitude = 1), enabling precise control over the displacement magnitude. The transformation applies a negative displacement along this radial direction, which coincides with the surface normal for a sphere centered at the origin, creating inward deformation.
Smooth Tool - Surface Regularization
The smooth tool implements surface regularization by interpolating vertex positions toward their original coordinates, effectively removing surface irregularities.
smooth(i, v, pt, factor, isTouch) {
    // Reference to original vertex position
    let original = new THREE.Vector3(this.origPos[i], this.origPos[i + 1], this.origPos[i + 2]);
    let amt = this.str * factor * (isTouch ? 3 : 2.5) * 1.2;

    // Linear interpolation toward original position
    this.verts[i] += (original.x - this.verts[i]) * amt;
    this.verts[i + 1] += (original.y - this.verts[i + 1]) * amt;
    this.verts[i + 2] += (original.z - this.verts[i + 2]) * amt;
}
Mathematical Explanation: The expression (original.x - this.verts[i]) calculates the displacement from the current position to the original position. Multiplying by the amount factor (amt) implements partial interpolation, creating gradual smoothing over multiple iterations rather than immediate position snapping.
The complete tool set includes push, pull, smooth, pinch, and inflate operations, each implementing specialized vertex transformation algorithms to achieve distinct clay manipulation effects.
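The pull, pinch, and inflate implementations aren't reproduced in this post. Purely as an illustration, a pull tool could mirror the push tool by displacing vertices outward along the same radial direction - this is a sketch, not necessarily Clayable's actual code:
// Hypothetical sketch: an outward "pull" mirroring the push tool above
pull(i, v, pt, factor, isTouch) {
    let dir = v.clone().normalize(); // Radial direction from the sphere's center
    let amt = this.str * factor * (isTouch ? 6 : 5);

    // Positive displacement moves the vertex outward, raising the surface
    this.verts[i] += dir.x * amt;
    this.verts[i + 1] += dir.y * amt;
    this.verts[i + 2] += dir.z * amt;
}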
User Experience Optimization
Achieving natural, responsive interaction required extensive parameter tuning and behavioral analysis. The challenge was translating physical clay manipulation properties into digital equivalents that felt intuitive across different input methods.
Critical Parameters
Several key parameters required careful calibration:
- Brush radius: Determines the spatial extent of vertex influence during sculpting operations
- Deformation strength: Controls the magnitude of vertex displacement per interaction
- Falloff characteristics: Defines how deformation intensity decreases with distance from the interaction point
- Input sensitivity: Manages the relationship between user input and resulting deformation
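The post doesn't list concrete values for most of these parameters. Purely as an illustration, they might be grouped into a small settings object like the following - aside from the 6/5 input multipliers taken from the code above, the names and numbers are assumptions:
// Hypothetical defaults, for illustration only
let sculptSettings = {
    brushRadius: 0.6,   // this.size - spatial extent of vertex influence
    strength: 0.02,     // this.str - displacement magnitude per interaction
    falloffPower: 2,    // exponent of the quadratic falloff curve
    touchMultiplier: 6, // input sensitivity for touch
    mouseMultiplier: 5  // input sensitivity for mouse
};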
Cross-Platform Input Handling
Different input methods require distinct calibration. Touch interfaces typically demand a stronger response per interaction than precise mouse input, both because fingers are less accurate than a cursor and because user expectations differ between the two.
let amt = this.str * factor * (isTouch ? 6 : 5); // Platform-specific multipliers
The multiplier values (6 for touch, 5 for mouse) were determined through iterative testing to achieve comparable perceived responsiveness across input methods. Touch interactions utilize higher multipliers to compensate for the less precise nature of finger-based input.
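The post doesn't show how the isTouch flag is determined; presumably it simply reflects which event fired. A sketch of that wiring follows, where sculptAt is a hypothetical helper standing in for the raycast-and-mold step shown elsewhere:
// Hypothetical wiring: the isTouch flag reflects which event fired
renderer.domElement.addEventListener('mousedown', e => {
    sculptAt(e.clientX, e.clientY, false); // mouse path, multiplier 5
});
renderer.domElement.addEventListener('touchstart', e => {
    e.preventDefault(); // requires a non-passive listener to take effect
    let t = e.touches[0];
    sculptAt(t.clientX, t.clientY, true); // touch path, multiplier 6
}, { passive: false });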
User Interface Design
The interface design prioritized accessibility and functionality while minimizing visual obstruction of the 3D workspace. After evaluating multiple design approaches, a glassmorphism aesthetic was selected for its modern appearance and functional transparency.
Design Requirements
The control interface needed to satisfy several constraints:
- Maintain accessibility across desktop and mobile platforms
- Avoid obscuring the primary 3D visualization area
- Provide intuitive tool selection and parameter adjustment
- Support responsive layout adaptation
.controls {
position: fixed; bottom: 30px; left: 50%;
transform: translateX(-50%); z-index: 10;
background: rgba(0, 0, 0, 0.8);
-webkit-backdrop-filter: blur(10px);
backdrop-filter: blur(10px);
}
The implementation utilizes backdrop filtering to create visual depth while maintaining interface legibility. Multiple color themes were integrated to provide visual variety and accommodate different user preferences.
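The theme-switching code itself isn't shown. As a rough sketch (the setTheme helper and class names are assumptions, not Clayable's actual implementation), toggling a class on the body and updating the scene background would cover both the UI and the 3D viewport:
// Hypothetical theme toggle - helper and class names are assumptions
let dark = true;
function setTheme(isDark) {
    document.body.classList.toggle('theme-dark', isDark);
    document.body.classList.toggle('theme-light', !isDark);
    scene.background = new THREE.Color(isDark ? 0x1a1a1a : 0xf2efe9);
}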
Mobile Platform Adaptation
Implementing touch-based interactions presented unique challenges in translating 2D screen coordinates to 3D world space. Touch events require fundamentally different handling compared to traditional mouse inputs due to the distinct interaction paradigms.
Touch Event Processing
function onTouchStart(e) {
    e.preventDefault(); // Prevent default browser touch behaviors

    if (e.touches.length === 1) { // Single-touch interaction only
        dragging = true;
        let touch = e.touches[0]; // Primary touch point
        let rect = renderer.domElement.getBoundingClientRect();

        // Convert screen coordinates to normalized device coordinates
        mouse.x = ((touch.clientX - rect.left) / rect.width) * 2 - 1;
        mouse.y = -((touch.clientY - rect.top) / rect.height) * 2 + 1;

        // Perform raycasting to determine 3D intersection point
        raycaster.setFromCamera(mouse, cam);
        let hits = raycaster.intersectObject(clay.ball);

        if (hits.length > 0) { // Successful intersection detected
            let pt = hits[0].point; // Extract 3D coordinate
            clay.moldClay(pt.x, pt.y, pt.z, true); // Execute sculpting operation
        }
    }
}
Coordinate System Transformation
The coordinate conversion process involves several mathematical transformations:
- Screen Space to Percentage: touch.clientX / rect.width converts the pixel position to a relative position (0-1)
- Percentage to NDC: (percentage * 2) - 1 transforms to normalized device coordinates (-1 to 1); the y axis is additionally negated because screen coordinates grow downward while NDC y grows upward
- NDC to World Space: Raycasting projects the 2D coordinate into 3D world space through camera transformation matrices
Normalized Device Coordinates (NDC) represent a standardized coordinate system where the screen center is (0,0), left/bottom edges are -1, and right/top edges are +1. This system enables consistent 3D graphics calculations across different screen resolutions and aspect ratios.
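As a concrete example with made-up numbers, a touch at pixel (300, 150) on an 800×600 canvas converts like this:
// Worked example: pixel (300, 150) on an 800 x 600 canvas
let rect = { left: 0, top: 0, width: 800, height: 600 }; // assumed canvas bounds

let ndcX = ((300 - rect.left) / rect.width) * 2 - 1;  // 300/800 = 0.375 -> -0.25 (left of center)
let ndcY = -((150 - rect.top) / rect.height) * 2 + 1; // 150/600 = 0.25  ->  0.5  (upper half)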
Keyboard Shortcuts for Better Workflow
After testing it for a while, I quickly realized that constantly clicking buttons was disrupting the flow; it felt like having to put down your clay and walk across the room to pick up a different tool every time you wanted to switch modes.
To solve this, I implemented keyboard shortcuts for common actions:
- 1-5 = Switch between sculpting tools instantly
- r = Reset the clay to the original sphere
- space = Toggle automatic rotation (helpful for viewing your work)
- t = Switch between light and dark themes
function onKey(e) {
    if (e.key.toLowerCase() === 'r') reset(); // Reset clay geometry

    if (e.key === ' ') { // Spacebar for auto-rotation
        e.preventDefault(); // Prevent page scrolling
        autoSpin = !autoSpin; // Toggle rotation state
    }

    // Tool selection mapping
    let toolMap = {'1': 'push', '2': 'pull', '3': 'smooth', '4': 'pinch', '5': 'inflate'};
    if (toolMap[e.key]) { // Number keys 1-5
        tool = toolMap[e.key]; // Switch to selected tool
        clay.setTool(tool);
    }
}
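Registering the handler isn't shown in the snippet; presumably something along these lines wires it up:
// Assumed wiring: listen for key presses at the window level
window.addEventListener('keydown', onKey);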
Technical Challenges and Solutions
Building Clayable presented several interesting technical hurdles that required creative solutions.
Here are the main challenges I encountered and how I approached solving them:
Performance: Maintaining 60 FPS with 4000 Dynamic Vertices
One of the biggest challenges was keeping the app running smoothly while manipulating thousands of vertices in real time. Every time you click, the system needs to check all ~4000 vertices to see which ones should move. If this were done naively, the app would crawl at around 5 FPS.
The solution was distance-based filtering: running a cheap distance check on every vertex and applying the expensive transformation only to vertices close to where you clicked. It's like only working the clay where your finger touches instead of reshaping the entire sculpture every time you touch it.
Distance-Based Filtering Algorithm:
// Only process vertices within brush influence radius
for (let i = 0; i < vertices.length; i += 3) {
    let dx = vertices[i] - clickX;
    let dy = vertices[i + 1] - clickY;
    let dz = vertices[i + 2] - clickZ;
    let distance = Math.sqrt(dx * dx + dy * dy + dz * dz);

    if (distance <= brushRadius) {
        // Apply transformation only to nearby vertices
        applyVertexTransformation(i, distance);
    }
}
This approach reduces the per-interaction transformation work from O(n), where every vertex would be modified, to O(k), where k is only the number of vertices within the influence radius (the cheap distance test still touches every vertex), typically cutting processing requirements by 80-90%.
To unpack the notation: O(n) means the work scales with all n vertices in the mesh, while O(k) means it scales only with the k vertices that are actually affected. The influence radius is the region around the click where changes matter; because most vertices fall outside it, 80-90% of the transformation work is skipped.
Analogy: imagine you're looking for people in a crowd who can hear you speak. Instead of asking everyone (O(n)), you only ask those standing close enough (O(k)), saving time and effort.
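A related micro-optimization, offered here only as a possible follow-up rather than something Clayable necessarily does, is to compare squared distances so that Math.sqrt only runs for vertices that will actually be transformed:
// Comparing squared distances avoids Math.sqrt for every rejected vertex
let radiusSq = brushRadius * brushRadius;

for (let i = 0; i < vertices.length; i += 3) {
    let dx = vertices[i] - clickX;
    let dy = vertices[i + 1] - clickY;
    let dz = vertices[i + 2] - clickZ;
    let distSq = dx * dx + dy * dy + dz * dz;

    if (distSq <= radiusSq) {
        // Take the square root only for vertices that will actually move
        applyVertexTransformation(i, Math.sqrt(distSq));
    }
}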
Raycasting: Converting 2D Clicks to 3D Positions
Here's the problem: your screen is flat (2D), but the clay ball exists in 3D space. When you click at pixel (200, 300) on your screen, where exactly did you mean to click on the 3D ball floating in space?
Think of it like this - imagine that you’re looking through a window at a basketball in your backyard. If you point at the ball with your finger against the window glass, your finger touches a specific spot on the window (2D), but you’re actually trying to point at a specific spot on the basketball (3D). The question is: which part of the basketball were you really pointing at?
The solution is raycasting - imagine shooting a laser pointer from your eye, through the spot where your finger touches the window, and out into the backyard. Wherever that laser beam hits the basketball is exactly where you meant to point. That’s essentially what raycasting does: it shoots an invisible ray from the camera through your click point into 3D space.
Real-world analogy - it’s like those old movies where someone shoots through a window. The bullet goes from the gun (camera), through the hole in the glass (your click), and hits something in the distance (the 3D object). We’re just calculating where the “bullet” lands.
The Math Behind It (the coordinate transformations): Think of this like giving directions to someone:
- Screen pixels → “Click the spot 200 pixels from the left, 300 from the top”
- Normalized Device Coordinates → “That’s actually the center-right area of the screen” (converts to -1 to 1 range)
- Camera space → “From the camera’s perspective, you’re looking slightly to the right and down”
- World space → “In the actual 3D world, that ray hits this exact coordinate”
// Convert screen coordinates to world space ray
raycaster.setFromCamera(normalizedCoords, camera);
let intersections = raycaster.intersectObject(clayMesh);

if (intersections.length > 0) {
    let worldPosition = intersections[0].point;
    // Apply sculpting transformation at the exact 3D location
}
It’s like having a really good translator who can convert “I pointed here on my phone screen” into “you meant this exact spot on the clay ball in 3D space.”
Memory Management: Storing Vertex Data Efficiently
The smooth tool needs to remember what the original ball looked like so it can restore vertices toward their initial positions. At 4,000 vertices × 3 coordinates × 4 bytes per 32-bit float, each full snapshot of the mesh comes to roughly 48 KB, so the vertex data is worth handling deliberately to avoid redundant copies and garbage-collection churn during interaction.
Optimization Strategy:
- Store original geometry once at startup (this.origPos = [...this.verts])
- Use typed arrays (Float32Array) for better memory layout
- Avoid creating new objects every frame to prevent garbage collection issues
- Minimize dynamic allocations during runtime
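As a sketch of the first two points, assuming the Three.js position attribute (which is backed by a Float32Array) and noting that this differs slightly from the spread-into-Array snippet above:
// Snapshot the original positions once, into a typed array rather than a plain Array
let verts = geometry.attributes.position.array;   // Float32Array of length vertexCount * 3
let origPos = new Float32Array(verts);            // one-time copy, ~4000 * 3 * 4 bytes ≈ 48 KB

// Later, the smooth tool can read origPos without allocating anything per frame
let ox = origPos[i], oy = origPos[i + 1], oz = origPos[i + 2];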
Technical Skills Developed
Real-Time 3D Graphics: I gained hands-on experience with vertex manipulation, understanding how to move thousands of points smoothly while keeping everything responsive. This involved learning about GPU-CPU communication, buffer management, and rendering optimization.
Cross-Platform Input Handling: Dealing with mouse events, touch gestures, and keyboard shortcuts taught me that different platforms really do “speak different languages.” Each input method required its own approach while maintaining a consistent user experience.
User Experience Design: Making virtual clay “feel” right was harder than any of the math involved. I learned that good UX requires balancing technical precision with intuitive design - if the clay doesn’t feel natural to sculpt, people won’t use it.
Performance Optimization: Keeping 60 FPS with thousands of calculations per frame taught me to think carefully about algorithmic complexity. I learned to identify bottlenecks and implement targeted optimizations rather than premature optimization.
Mathematical Implementation: Translating concepts like coordinate transformations, distance calculations, and falloff curves into working code gave me practical experience with applied math in real projects.
The Final Result
You can try out Clayable at clayable.vercel.app. It works on both desktop and mobile, includes 5 different sculpting tools, multiple color themes, and honestly feels much closer to real clay than I thought would be possible when I started.
The complete source code is available at github.com/magical-paperclip/clayable if you want to see how everything works under the hood or learn from my mistakes.
Future Plans
I’m considering several improvements for future versions:
- Undo/Redo System: Because everyone makes mistakes and Ctrl+Z should work everywhere
- Additional Tools: Flatten, grab, and maybe even a knife tool for cutting pieces
- Texture Painting: Sculpt the shape first, then paint colors and patterns on top
- 3D Export: Save sculptures as .obj or .stl files for 3D printing
- Collaborative Sculpting: Multiple people sculpting the same clay ball in real-time (chaotic but fun)