John Middlehurst
Developer and designer of real-time interactive experiences and content
Immersive experiential design is the ultimate tool for self-expression. Through creative exploration and a passion for artistic development, I can help bring your ideas to life - let me craft and develop designs that transform your dreams into reality.
Welcome
About Me
I am an interactive audio-visual artist and developer who creates immersive and engaging experiences. I use TouchDesigner and Unreal Engine to combine any type of input device or sensor with any output hardware or media to achieve the desired product. I have a background in computer science, digital media, and music production, and love to combine elements of all these fields into engaging, captivating experiences.
I am a problem solver who can work independently or collaboratively alongside clients, taking ideas through experimentation and prototyping to deliver completed installations.
What I can do
Services I Provide
TouchDesigner
Unreal Engine
Creative Technical Services
Creative technical services for immersive experiences, specialising in system design and the programming of sensor inputs, interactive integration, and final output
Audio-visual Content Design
Design of audio-visual content for use in installations, marketing campaigns, or live performances, specialising in TouchDesigner and Unreal Engine
Interactive Installation Development
Real-time interactive installation development and design using TouchDesigner and Unreal Engine, taking client visions from concept to reality
Stuff I've done
Click items for more info
Projects & Collaborations
Immersive Dining Projection
Projections and content themed around an immersive dining experience
3D Head Tracking Using Kinect
Head tracking in real space to create a powerful illusion of depth and perspective
Quest For Eternal Aether
A kinetic laser installation commissioned for Burning Baer arts festival in Berlin
Facial Gesture 3D Motion Control
Using facial position and gestures to elicit 3D visual responses in real time
Interactive Strings
A Kinect sensor tracks hand and body movements, which are used to strum the mixed-reality strings shown in the final composite output for visual and audio effect
Unreal Engine Cinematic
Cinematic sequence created in UE5 using camera shake effects
Point Cloud Manipulation
LiDAR scanning and photogrammetry used in 3D motion content
Audio-Reactive Planet Visualisations
An audio-reactive visual created for Afrikaburn arts festival
Interactive Tunnels
Control over tunnel visual generation using hardware MIDI controllers
Interactive Generative Art
Manipulated in real time via MIDI control units
Audio-Visual Reactive Projection Mapping
Audio-reactive real-time visual projections for various live music events
Myth of the Ego
A virtual reality soundscape in collaboration with ism.earth