Work
Selected work exploring real-time performance, motion capture and virtual production for live theatre, immersive storytelling and creative education.
NYT Digital Accelerator: Real-Time Motion Capture Systems
2024–2025
Development and deployment of a portable, suit-free motion capture pipeline for live performance and rehearsal contexts.
The system integrates real-time facial and full-body tracking into Unreal Engine, supporting digital doubles within standard theatre environments without reliance on specialist studio infrastructure.
Presented through live demonstrations and workshops across multiple programme iterations.
StoryFutures VR Performance Environment
2025
Design and build of a cross-platform immersive environment for live performance, developed in Unity and deployed via Spatial.io.
The project explored real-time hybrid performance, enabling actors to perform within a virtual environment accessible across VR and web platforms.
Delivered as part of National Youth Theatre’s StoryFest programme.
Real-Time Unreal Short Film
Eden’s Veil
2025
A fully real-time short film developed in Unreal Engine, exploring ecological collapse and posthuman themes through procedural environments and cinematic virtual production workflows.
The project utilised custom real-time pipelines, digital cinematography techniques and performance-driven animation.
Markerless Motion Capture Pipeline: Independent R&D
2024–Present
Ongoing development of a modular, rehearsal-room-ready motion capture system designed to operate using consumer hardware, optional depth sensors and custom Unreal Engine blueprints.
The pipeline prioritises portability, rapid setup and performer usability, enabling integration into live creative environments.
Real-Time Unreal Short Film
Closed System
2026
Closed System is a real-time experimental animation developed in Unreal Engine, examining capitalist realism through the spatial logic of a city that circulates within itself.
The film follows steam rising from a domestic interior, transforming into cloud and rain, before descending through drainage infrastructure into the urban underlayer below. As the camera shifts perspective, luminous advertising that promises aspiration and upward mobility begins to read differently from the bottom of the hierarchy.
Set within a hyper-commercialised, emissive cityscape, the work explores how environments naturalise ideology. Billboards glow with messages of merit and inevitability while the system quietly reabsorbs everything it produces. Movement, circulation, and containment function as metaphors for a world without an outside.
The version presented here is a real-time screen capture recorded directly from the Unreal Engine viewport via OBS. Due to computing limitations, a full offline cinematic render was not feasible; the capture nevertheless preserves the work as it was experienced in-engine, maintaining its original lighting, motion and atmosphere.
Body Parts: Edinburgh Fringe (Upcoming Aug 2026)
Technological Design & Motion Capture Integration
Body Parts is a new play written by Iola King Alleyne and produced for the Edinburgh Fringe 2026. The production explores female desire, ageing, inheritance and generational shame through an intimate four-character structure set in rural isolation.
The work combines stripped-back staging with carefully integrated digital elements, maintaining an actor-led aesthetic while introducing technological layers that explore fragmentation, exposure and embodiment.
Role: Technological Design Lead
Alensi Studios Ltd will act as the Technological Design Lead, supplying and integrating all production technology.
The motion capture workflow used in this production was independently developed as part of my ongoing research into rehearsal-room-ready virtual production systems. Unlike conventional pipelines designed for film or gaming environments, this process was built specifically for live theatre.
The system uses pre-recorded motion capture, modelled around the actor’s physicality, to generate projected digital imagery that extends and fragments the body on stage. It does not rely on live tracking during performance, ensuring stability, repeatability and minimal technical risk within a Fringe venue.
Key characteristics:
Suit-free motion capture
No fixed installations
Fully portable and Fringe-viable
Actor-centred design
Integrated through standard projection infrastructure
This workflow was previously showcased through the National Youth Theatre’s Digital Accelerator programme and later embedded into audience-facing performance at StoryFutures. Body Parts marks its first presentation at a Fringe festival.
Production Scope (Alensi Studios)
For this production, Alensi Studios provides:
Motion capture hardware and system integration
Projection-ready digital assets
Playback and cueing systems
Technical rehearsal integration
On-site technological supervision
Projector, computers and associated technology
The technology is used selectively and dramaturgically, ensuring that performance remains the primary focus.