Case Studies

Otherworld Philadelphia
2024

21 Puzzles · 18 Rooms · 150+ Projectors · #7 USA Today Best New Attractions

Designed and developed 21 interactive puzzles across 18 immersive rooms for a permanent, 24/7 art installation. Integrated 50+ computers, 150+ projectors, 30+ screens, and 20+ sensors into a unified system running a cohesive 4-character narrative — all built for long-term reliability with remote support.

Experience Snapshot

Audience + Setting: Walk-in guests, permanent 18-room art installation (Philadelphia)
Intent: Guide visitors through 21 puzzles woven into a 4-character narrative
Core Mechanic: Room-by-room sensor-triggered interactions (LiDAR, motion, touch, object tracking)
Journey: Enter → discover rooms → solve puzzles → follow character arcs → complete narrative
Constraints: 24/7 uptime, all ages, 50+ networked machines, remote-only support
Role: Senior Creative Technologist
Team: 3 core + expanded crew
Duration: Permanent installation
Client: Otherworld

Experience Visual

Room 1 (Puzzle A) → Room 2 (Puzzle B) → Room 3 (Puzzle C) → Room 4 (Puzzle D) → Narrative Finale
Character Arc Threads: Characters 1–4, woven across rooms
Sensors Per Room: LiDAR · Motion · Touch · Object Tracking

Design Decisions

  • Isolated each room on its own VLAN so one crash can't cascade
  • Built self-healing watchdog processes instead of relying on staff restarts
  • Chose LiDAR over camera tracking for privacy in a public venue
  • Designed puzzles with progressive difficulty so casual visitors still feel rewarded
  • Used 12-channel spatial audio per room to avoid bleed between adjacent spaces
  • Created content timelines that loop seamlessly during low-traffic periods

Prototyping + Testing

Prototyped each puzzle mechanic in isolation before integrating into the full room system. Ran week-long soak tests with simulated guest loads to catch memory leaks and sensor drift. Calibration routines were tested overnight to validate LiDAR accuracy over time.

What Went Wrong + How We Solved It

Problem: With 50+ networked machines running simultaneously, early testing revealed cascading failures — one room crashing could ripple into adjacent rooms sharing network resources. Sensor drift on the LiDAR arrays caused false triggers in high-traffic areas.

Fix: Isolated each room onto its own VLAN with independent watchdog processes and auto-restart sequences. Implemented health-check heartbeats so the central dashboard could flag and reboot individual machines without affecting the rest of the venue. For LiDAR drift, built calibration routines that run during overnight downtime and added dead-zone filtering to ignore edge-case noise. The system now self-heals without staff intervention.
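
A minimal sketch of the heartbeat pattern behind that dashboard (machine IDs, timeout, and restart mechanism are illustrative assumptions, not the production code):

    # Heartbeat watchdog sketch: each room machine checks in periodically;
    # anything that goes quiet past the timeout gets an automated restart.
    import subprocess
    import time

    HEARTBEAT_TIMEOUT = 30.0   # seconds of silence before we intervene (assumed)
    last_seen: dict[str, float] = {}

    def record_heartbeat(machine_id: str) -> None:
        """Called when a room machine pings the central dashboard."""
        last_seen[machine_id] = time.time()

    def check_and_heal(restart_cmds: dict[str, list[str]]) -> None:
        """Restart any machine whose heartbeat has gone stale."""
        now = time.time()
        for machine_id, cmd in restart_cmds.items():
            if now - last_seen.get(machine_id, 0.0) > HEARTBEAT_TIMEOUT:
                subprocess.Popen(cmd)        # e.g. a remote reboot script (assumed)
                last_seen[machine_id] = now  # back off until it reports in again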

Responsibilities

  • Led experience design for 21 interactive puzzles and 4 character narrative arcs
  • Developed custom software applications for real-time interactivity and exhibit management
  • Designed UX flows, audience flow diagrams, story arcs, and copy
  • Integrated and stabilized 50+ computers, 20+ sensors, 30+ screens, and 150+ projection systems
  • Built 12-channel spatial audio systems and room-specific audio environments
  • Created content timelines and puzzle sequences for each exhibit
  • Provided team onboarding and ongoing remote support infrastructure

Tech Stack

TouchDesigner · GLSL · LiDAR · Projection Mapping · Spatial Audio · LED Mapping · Motion Tracking · Object Tracking · Touchscreen · Volumetric LED

System Architecture

Central Show Control → Management + Monitoring Dashboard
Room Group Controllers (× 18 rooms), each driving Projection Mapping + LED Mapping, plus room-specific I/O:
LiDAR + 12-ch Spatial Audio · Touch (touchscreens & tablets) · Motion (cameras & surveillance)

Constraints

Uptime: 24/7 permanent operation with no planned downtime windows
Scale: 50+ computers, 150+ projectors, 20+ sensors, 30+ screens across 18 rooms
Safety: Public-facing with continuous guest flow; all interactions must be safe for all ages
Install: Permanent venue; systems must survive years of continuous use with remote-only support

Outcomes

  • 21 interactive puzzles deployed
  • 18 fully integrated rooms
  • 24/7 continuous uptime
  • #7 on USA Today's Best New Attractions list (2024)

Immersive · Show Control · Multi-Room · 24/7 · Permanent

GitHub Universe 2024
2024

3 Interactive Exhibits · Flagship Developer Conference · San Francisco

Produced three interactive exhibits for GitHub's flagship developer conference: a survey-driven projection experience, a 270-degree data visualization of attendee GitHub profiles, and a conference-wide scavenger hunt — all designed to collect research data while reinforcing community engagement.

Experience Snapshot

Audience + Setting: Developer conference attendees, Fort Mason convention center (San Francisco)
Intent: Engage developers with personalized data visualizations while collecting research data
Core Mechanic: Survey → generative balloon, GitHub handle → particle viz, QR scan → scavenger hunt
Journey: Approach exhibit → interact (survey/search/scan) → see personalized output → share
Constraints: 2-day window, thousands of simultaneous attendees, live research data pipeline
Role: Technical Producer
Team: 3 core + install crew
Duration: 2-day conference
Client: GitHub

Experience Visual

Join The Fold: Attendee → Take Survey (<60 sec) → Generate Unique Balloon → Launch into Bay Scene
The Core: Attendee → Enter Handle → GitHub API → Cache Layer → Particle Viz (270° Display)
GitHunt: Attendee → Scan QR Code → Track Progress (10 Mona Cats) → Claim Reward (Gift Card)
Research Dashboard: Real-Time Survey Data · Live Adjustable Questions

Design Decisions

  • Cached GitHub API results locally to survive rate limiting during peak hours
  • Pre-fetched trending handles during off-peak to speed up lookups
  • Made survey flow under 60 seconds so lines stayed short
  • Designed balloon visuals to be unique per response, so every attendee got something shareable (parameter derivation sketched after this list)
  • Routed all data to a real-time research dashboard so the team could adjust questions mid-event
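
The uniqueness guarantee in the balloon bullet above can be as simple as hashing the survey answers into visual parameters. A sketch under assumed field names and parameter ranges (not GitHub's actual pipeline):

    # Derive stable, distinct balloon parameters from a survey response.
    import colorsys
    import hashlib

    def balloon_params(answers: dict) -> dict:
        digest = hashlib.sha256(repr(sorted(answers.items())).encode()).digest()
        hue = digest[0] / 255.0
        r, g, b = colorsys.hsv_to_rgb(hue, 0.6 + digest[1] / 1024.0, 0.9)
        return {
            "color": (r, g, b),                                   # body color
            "stripe_count": 3 + digest[2] % 6,                    # 3..8 stripes
            "pattern": ("chevron", "rings", "diamonds")[digest[3] % 3],
        }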

Prototyping + Testing

Stress-tested the GitHub API integration with simulated concurrent requests to find the rate limit ceiling. Ran the full survey-to-balloon pipeline with test data to tune the generation speed. Validated the research dashboard with sample data before the event.

What Went Wrong + How We Solved It

Problem: GitHub API rate limits throttled The Core during peak hours. With hundreds of attendees searching their handles simultaneously, the system started queuing requests and the particle visualization stalled.

Fix: Implemented an aggressive local cache layer — once a handle was looked up, the data was stored locally for the rest of the conference. Added a pre-fetch queue that pulled popular/trending handles during off-peak. The result was sub-second lookups for returning queries and graceful degradation for new ones during spikes.
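
A sketch of the cache-first lookup, using the public REST endpoint via requests (the exhibit's actual client and error handling are assumptions):

    # Cache-first GitHub profile lookup with an off-peak prefetch helper.
    import requests

    _cache: dict[str, dict] = {}   # handle -> profile JSON, kept for the event

    def lookup(handle: str) -> dict | None:
        if handle in _cache:       # hit: sub-second, no API call, no rate limit
            return _cache[handle]
        resp = requests.get(f"https://api.github.com/users/{handle}", timeout=5)
        if resp.status_code == 200:
            _cache[handle] = resp.json()
            return _cache[handle]
        return None                # rate-limited or unknown handle: degrade gracefully

    def prefetch(handles: list[str]) -> None:
        """Warm the cache with popular/trending handles during off-peak lulls."""
        for h in handles:
            lookup(h)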

Responsibilities

  • Technical production for all three exhibits from concept through strike
  • Designed and built "Join The Fold" — survey responses generated custom hot air balloons launched into a projection-mapped bay scene
  • Developed "The Core" — 270° projection with real-time particle field pulling GitHub account stats by handle
  • Created "GitHunt" — conference-wide scavenger hunt with 10 hidden Mona Cats and gift card rewards
  • Built real-time data pipeline so research team could review survey results live during the event
  • Coordinated A/V integration, projection alignment, and interactive hardware

Tech Stack

TouchDesigner · Projection Mapping · REST APIs · GitHub API · Real-Time Data Viz · Generative Particles · Interactive Survey

System Architecture

Join The Fold: Survey Kiosk → Balloon Generator → Projection Map (Bay Scene)
The Core: Handle Search UI → GitHub API Query → Particle Field (270° Surround)
GitHunt: QR Scan Stations → Progress Tracker → Reward Validator (Gift Card)
All exhibits → Research Dashboard (Real-Time Data)

Constraints

Latency: GitHub API queries had to resolve and render particle visualizations within seconds to keep attendees engaged
Scale: Thousands of conference attendees cycling through 3 exhibits simultaneously across 2 days
Data: Research team needed real-time access to survey data; no batch processing, live review during the event
Install: Convention center load-in with strict union labor rules and tight setup windows

Outcomes

  • 3 interactive exhibits shipped
  • 270° projection surround for The Core
  • 10 hidden Mona Cats for GitHunt
  • Real-time survey insights delivered live to the research team

Conference · Data Viz · Interactive Survey · Projection

TKE Building / Gensler
2022

Largest LED Wall in North America · LEED Gold · American Architecture Award · AGC First Place

Built the content management system and auto-failover infrastructure for the largest LED wall in North America. The system handles smart scheduling, brightness scaling, holiday programming, and real-time generative visuals driven by elevator position — all managed remotely with 24/7 uptime.

Experience Snapshot

Audience + Setting: Building occupants + public passersby, corporate HQ exterior (Atlanta)
Intent: Dynamic LED facade that responds to building activity and time of day
Core Mechanic: Elevator position API drives real-time generative visuals on the LED wall
Journey: Approach building → see ambient content → patterns shift as elevators move → sunset dims the wall
Constraints: 24/7 with zero-downtime failover, remote-only management, non-technical staff
Role: Lead Developer
Team: 3 core
Duration: Permanent installation
Client: Gensler / TK Elevator

Experience Visual

Modes: Ambient (default generative) · Elevator Driven (generative) · Sunset Dimming (auto-scale brightness) · Holiday Override (special event, CMS-scheduled)
Transitions: Ambient → Elevator Driven (elevator data) · Elevator Driven → Ambient (feed drops) · Ambient → Sunset Dimming (sunset trigger)

Design Decisions

  • Built dual-server auto-failover so the wall never goes dark
  • Reverse-engineered the undocumented elevator API rather than waiting for vendor docs
  • Added graceful degradation — elevator feed drops → smooth ambient fallback
  • Designed CMS for non-technical building staff with playlist/schedule/brightness controls
  • Tied LED content to building lighting system for cohesive exterior appearance

Prototyping + Testing

Logged raw elevator data packets for weeks to map the undocumented protocol. Simulated network failures to validate auto-failover switching time. Tested the CMS with building staff during a training session to catch UX issues before handoff.

What Went Wrong + How We Solved It

Problem: The elevator API was undocumented and intermittent — it would drop connection unpredictably, causing the generative visuals to freeze mid-animation. The building's IT team had no technical documentation on the elevator data protocol.

Fix: Reverse-engineered the elevator data feed by logging raw packets over several weeks. Built a resilient connection handler with automatic reconnection, data smoothing (to fill gaps during dropouts), and a graceful fallback to ambient generative content when the elevator feed goes offline. The wall never shows a frozen frame — it degrades into a beautiful ambient state instead.
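
The feed-handling idea in sketch form, with assumed field names and timings (the real protocol was proprietary and reverse-engineered):

    # Smooth elevator positions and fall back to ambient when the feed stalls.
    import time

    class ElevatorFeed:
        def __init__(self, stale_after: float = 2.0, alpha: float = 0.2):
            self.stale_after = stale_after   # seconds before the feed counts as dead
            self.alpha = alpha               # exponential smoothing factor
            self.position = 0.0              # smoothed elevator position
            self.last_update = 0.0

        def on_packet(self, raw_position: float) -> None:
            """Smooth incoming positions so brief dropouts never cause jumps."""
            self.position += self.alpha * (raw_position - self.position)
            self.last_update = time.time()

        def current_mode(self) -> str:
            if time.time() - self.last_update > self.stale_after:
                return "ambient"             # graceful fallback content
            return "elevator_driven"         # generative visuals track position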

Responsibilities

  • Engineered auto-failover hardware architecture for continuous, unattended operation
  • Built custom CMS with playlist creation, smart scheduling, and holiday programming
  • Developed sunset brightness timer for automatic brightness scaling throughout the day (dimming curve sketched after this list)
  • Integrated elevator position API to drive real-time generative visuals on the LED wall
  • Created lighting sync/control system tying LED content to building lighting
  • Built secure 24/7 remote support and monitoring dashboard
  • Calibrated LED wall for seamless panel alignment
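
A sketch of the sunset dimming curve referenced above. In production the sunset time would come from an ephemeris or scheduler; the hard-coded time, ramp width, and levels here are assumptions:

    # Ramp brightness from day level to night level across a sunset window.
    from datetime import datetime, timedelta

    SUNSET = datetime.now().replace(hour=19, minute=45, second=0, microsecond=0)
    RAMP = timedelta(minutes=40)          # dim gradually, never as a hard step
    DAY_LEVEL, NIGHT_LEVEL = 1.0, 0.35    # normalized LED wall brightness

    def brightness(now: datetime) -> float:
        t = (now - (SUNSET - RAMP / 2)) / RAMP   # 0.0 before the ramp, 1.0 after
        t = min(max(t, 0.0), 1.0)
        return DAY_LEVEL + (NIGHT_LEVEL - DAY_LEVEL) * t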

Tech Stack

Custom CMS · TouchDesigner · GLSL · LED Mapping · Auto-Failover · Elevator API · Remote Monitoring · Generative Content

System Architecture

Custom CMS (Web App): Playlist · Schedule · Brightness · Special Event
CMS → Primary Media Server + Failover Media Server (auto-switch on failure) · Monitoring Dashboard (24/7)
Media Servers → LED Wall Controller (largest in North America)
Inputs: Sunset Brightness Timer · Elevator API (→ generative visuals) · Lighting Sync Control

Constraints

Uptime: 24/7 with zero-downtime failover; the wall is a defining feature of the building's exterior
Remote: Fully remote management; no on-site technical staff, all updates and troubleshooting done remotely
Scale: Largest LED wall in North America (as of 2023); massive pixel count requiring precise calibration
Usability: Non-technical building staff need to update content, adjust schedules, and manage playlists daily

Outcomes

  • 24/7 continuous uptime with failover
  • LEED Gold Certification (2022)
  • American Architecture Award
  • AGC Build Georgia Award, First Place

Architecture · CMS · LED · Award-Winning · Permanent

HBO: Bleed for the Throne
2019

SXSW · Cannes Gold Lion · Cannes Silver Lion · Grand Clio · 2x Gold Clio · 2x D&AD Wood Pencil

Developed the wireless headphone audio system for "Bleed for the Throne," HBO's multi-room Game of Thrones blood donation activation at SXSW 2019. The system synced personalized audio to video based on each guest's room location, creating a spatially aware, multi-channel experience as people moved through the space.

Experience Snapshot

Audience + Setting: SXSW attendees, multi-room blood donation activation (Austin)
Intent: Immersive audio-visual journey that syncs to each guest's physical movement
Core Mechanic: Wireless headphones auto-switch audio channels based on room-detection beacons
Journey: Enter → receive headphones → move through rooms → audio follows you → donate blood → exit
Constraints: Medical environment, high-profile press, hundreds of concurrent wireless headsets
Role: Senior Creative Technologist
Scope: Multi-channel audio system
Duration: SXSW 2019 (Austin, TX)
Client: HBO

Experience Visual

Entry (Receive Headset) → Room 1 (Audio Ch. A) → handoff zone → Room 2 (Audio Ch. B) → handoff zone → Room 3 (Audio Ch. C) → handoff zone → Donate Blood → Exit
Beacon Proximity Layer: debounce logic prevents flip-flop in overlap zones between rooms

Design Decisions

  • Used beacon proximity over GPS for indoor room detection accuracy
  • Added debounce logic so room transitions don't flip-flop in overlap zones
  • Adjusted beacon power levels to create clean handoff corridors
  • Synced audio to video via timecode; drift beyond 50 ms was perceptible (drift check sketched after this list)
  • Designed the system to handle headset disconnects gracefully (auto-reconnect on re-entry)
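
The timecode bullet above implies a periodic drift check. A sketch with an assumed player interface (the 50 ms threshold comes from the decision above):

    # Resync headset playback to the room's master timecode when drift exceeds
    # the perceptible threshold; smaller drift is left alone to avoid audible skips.
    DRIFT_LIMIT = 0.050   # seconds

    def maybe_resync(headset_pos: float, master_pos: float, seek) -> bool:
        """`seek` is the headset player's seek callable (assumed interface)."""
        if abs(headset_pos - master_pos) > DRIFT_LIMIT:
            seek(master_pos)
            return True
        return False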

Prototyping + Testing

Tested beacon overlap zones during load-in by walking the full guest path with test headsets. Measured audio-video sync drift across multiple room transitions. Ran concurrent headset stress tests to find the wireless channel capacity ceiling.

What Went Wrong + How We Solved It

Problem: During early load-in testing, the wireless headphone system dropped audio when too many guests clustered in the transition zone between rooms. The room-detection beacons had overlapping ranges, causing the app to rapidly switch audio channels — creating a jarring "flip-flop" effect.

Fix: Added a debounce layer to the room-detection logic — the system required a sustained signal from a new room's beacon before switching audio channels. Also adjusted beacon power levels and placement to create cleaner handoff zones. The transition became seamless, and guests never noticed the switch.
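
A minimal sketch of that debounce (the sustain threshold is illustrative; the deployed value was tuned on-site):

    # Commit a room switch only after a sustained run of readings from the
    # same beacon, so overlap zones can't flip-flop the audio channel.
    SUSTAIN_READINGS = 8   # consecutive readings before a switch commits (assumed)

    class RoomDetector:
        def __init__(self):
            self.current_room = None
            self.candidate = None
            self.count = 0

        def on_beacon(self, strongest_room: str):
            """Feed the strongest beacon per scan; returns the committed room."""
            if strongest_room == self.current_room:
                self.candidate, self.count = None, 0     # nothing to debounce
            elif strongest_room == self.candidate:
                self.count += 1
                if self.count >= SUSTAIN_READINGS:       # sustained: commit switch
                    self.current_room = strongest_room
                    self.candidate, self.count = None, 0
            else:
                self.candidate, self.count = strongest_room, 1
            return self.current_room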

Responsibilities

  • Developed wireless headphone app for room-based audio synchronization
  • Built real-time location tracking to detect which room each guest occupied
  • Designed audio-to-video sync engine that delivered room-specific audio and visual cues
  • Engineered multi-channel audio routing for personalized per-guest experiences
  • Coordinated integration with theatrical design and video playback systems

Tech Stack

Wireless Headphones · Room-Based Audio Sync · Location Tracking · Multi-Channel Audio · Video Sync · Custom App

System Architecture

Show Control Server: Video Playback + Audio Routing Engine
Rooms 1…N (multi-room layout), each with a beacon
Wireless Headset App (per guest): Room Detect ← beacon proximity · Audio Switch ← sync to room video · Playback Sync ← timecode locked

Constraints

Latency: Audio had to sync with video within perceptible thresholds; any drift broke the theatrical immersion
Scale: Hundreds of concurrent guests, each with their own wireless headset, moving through rooms at their own pace
Reliability: SXSW activation with high-profile press and influencer attendance; no second chances
Safety: Blood donation activation; medical environment with strict crowd management and flow control

Outcomes

  • 2x Cannes Lions (Gold + Silver)
  • Grand Clio Award, Experiential/Events
  • 2x Gold Clio Awards
  • 2x D&AD Wood Pencils

Brand Activation · Spatial Audio · Multi-Room · Award-Winning

Fortnite World Cup
2019

Arthur Ashe Stadium · 360° Stage Mapping · Live Broadcast · $30M Prize Pool

Built the generative visual system for the inaugural Fortnite World Cup at Arthur Ashe Stadium. In-game player actions drove real-time visual compositions mapped across a 360-degree stage environment, integrated directly into the global live broadcast reaching millions of viewers.

Experience Snapshot

Audience + Setting: 23,000 live audience + millions streaming, Arthur Ashe Stadium (NYC)
Intent: Real-time visual spectacle on a 360° stage driven by live gameplay
Core Mechanic: In-game events (kills, eliminations, wins) trigger generative visuals mapped to the stage
Journey: Game starts → player actions happen → stage reacts in real time → broadcast captures it all
Constraints: Zero dropped frames on live broadcast, 3-day event, tight integration with broadcast truck
Role: Senior Creative Technologist
Scope: Stage visuals + broadcast
Duration: 3-day event (NYC)
Client: Epic Games / Fortnite

Experience Visual

Game Server (live match data: kills, eliminations, wins) → Event Mapping (action → motif, with a 250 ms throttle/smoothing layer; kill = pulse, elimination = burst, victory = cascade) → Generative Engine (TouchDesigner) → Media Servers (redundant A/B) → 360° Stage Output
Outputs: LED + Projection (stage surround, Arthur Ashe Stadium) · Broadcast Feed (global stream, isolated bus)

Design Decisions

  • Built a throttle/smoothing layer to spread burst events across a 250ms window
  • Separated generative engine from broadcast feed so a render spike can't black-out the stream
  • Mapped specific game events to distinct visual motifs so the audience could "read" the gameplay
  • Used multiple media servers for 360° output redundancy
  • Designed all visuals to be camera-friendly — colors and contrast tuned for broadcast

Prototyping + Testing

Ran rehearsal matches with live data to tune the event-to-visual mapping. Stress-tested burst packets (mass elimination scenarios) to validate the throttle layer. Checked broadcast feed quality on-site with the broadcast engineering team before the event.

What Went Wrong + How We Solved It

Problem: During rehearsals, the game data API would occasionally send burst packets — hundreds of events in a single frame — when a major in-game moment happened (like a mass elimination). This caused the generative engine to spike and drop frames on the broadcast feed.

Fix: Built a throttle/smoothing layer between the data ingestion and the generative engine. Major events were queued and spread across a short animation window (250ms) instead of all hitting in a single frame. This preserved the dramatic visual impact while keeping the render pipeline stable. The broadcast feed ran clean for the entire 3-day event.
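
The throttle idea in sketch form (frame rate, window, and event names are assumptions; the production layer lived inside the TouchDesigner pipeline):

    # Queue burst events and drain them over a ~250 ms window, one budgeted
    # slice per render frame, instead of firing hundreds in a single frame.
    from collections import deque

    WINDOW = 0.25          # seconds to spread a burst across
    FRAME = 1.0 / 60.0     # render frame time at 60 fps

    class EventThrottle:
        def __init__(self):
            self.queue = deque()

        def ingest(self, events: list) -> None:
            self.queue.extend(events)    # e.g. ["elimination"] * 200 in one packet

        def per_frame(self) -> list:
            """Called once per render frame by the generative engine."""
            frames = max(1, round(WINDOW / FRAME))      # ~15 frames per window
            budget = max(1, len(self.queue) // frames)  # events allowed this frame
            return [self.queue.popleft()
                    for _ in range(min(budget, len(self.queue)))]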

Responsibilities

  • Developed generative content system driven by real-time in-game player actions
  • Mapped generative visuals across 360-degree stage surround at Arthur Ashe Stadium
  • Integrated visual output into the live broadcast pipeline for global streaming
  • Built data ingestion layer to translate gameplay events into visual triggers
  • Coordinated with broadcast engineering team for seamless feed integration

Tech Stack

TouchDesigner · GLSL · Game Data API · 360° Mapping · Media Server · Live Broadcast · Generative Content

System Architecture

Fortnite Game Server (live player actions/events) → Data Ingestion (event → visual trigger mapping) → Generative Engine (TouchDesigner, real-time render) → Media Servers (360° stage output)
Media Servers → LED + Projection (stage surround, Arthur Ashe) · Broadcast Feed (global live stream)

Constraints

Latency: Visuals had to react to in-game events within frames; any delay would desync from the live commentary and crowd energy
Broadcast: Output fed directly into a live global stream; zero tolerance for crashes, glitches, or black frames
Scale: 23,000-seat stadium with 360-degree stage mapping; massive pixel surface area across multiple media servers
Coordination: Tight integration between game servers, generative engine, media servers, and broadcast truck, all live

Outcomes

  • 360° stage visual surround
  • 23,000 stadium seats, live audience
  • $30M prize pool at the inaugural event
  • 0 dropped frames during the broadcast

Esports · Generative · 360° · Broadcast · Live

H-E-B Immersive Dinner
2022

10,000 × 2,000 Resolution · Course-Synced Show Control · Shorty Award Finalist

Built the show control system and generative content pipeline for an immersive dining experience inside a 10,000 × 2,000 pixel wraparound display. Content transported diners through Texas farmland, orchards, and coastal scenes, timed to each course of the meal, while coordinating lighting, audio, and video into a seamless multi-sensory experience.

Experience Snapshot

Audience + Setting: VIP dinner guests, wraparound display dining room (San Antonio)
Intent: Course-synced immersive atmosphere that transports diners through Texas landscapes
Core Mechanic: Kitchen cues trigger scene transitions on a 10,000 × 2,000 pixel display
Journey: Seated → first course cue → farmland scene → course arrives → next cue → orchard → coastal finale
Constraints: Kitchen timing varies nightly, dining environment (no visible tech/noise), multi-server sync
Role: Lead Developer
Scope: Show control + content
Duration: Multi-evening event
Client: H-E-B

Experience Visual

Cue 1 (kitchen trigger) → Farmland scene (ambient loop ↻) → Cue 2 → Orchard scene (ambient loop ↻) → Cue 3 → Coastal scene (ambient loop ↻) → Cue 4 → Finale scene
Ambient loops stretch/compress to absorb variable kitchen timing

Design Decisions

  • Switched from timecode-based to cue-based show control to absorb kitchen timing variance
  • Built extendable ambient loops so scenes stretch/compress naturally between cues
  • Gave kitchen staff a one-button tablet interface — no training required
  • Separated video, lighting, and audio buses so any one can be adjusted without affecting others
  • Designed all transitions to be gradual (no hard cuts) so diners never feel jarred

Prototyping + Testing

Simulated variable kitchen timing by randomizing cue intervals during rehearsals. Tested multi-server sync by measuring pixel-level seam alignment on the wraparound display. Ran full dress rehearsal dinners with test audiences to validate pacing and atmosphere.

What Went Wrong + How We Solved It

Problem: Kitchen timing is inherently unpredictable — courses could arrive 5 minutes early or 15 minutes late depending on the evening. The original show control was timecode-based, which meant the visual scene would either finish too early (leaving dead air) or get cut short when the next course arrived.

Fix: Redesigned the show control to be cue-based instead of timecode-based. Each course had a start cue triggered by the kitchen team (via a simple tablet interface), and the content was structured with extendable ambient loops that could stretch or compress gracefully. Transitions always looked intentional, regardless of when the kitchen called the next course.
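
The cue-based structure in sketch form (scene names follow the menu flow above; the tablet handler and loop naming are assumptions):

    # Hold each course's ambient loop until the kitchen tablet fires the next
    # cue, so timing variance is absorbed without dead air or hard cuts.
    SCENES = ["farmland", "orchard", "coastal", "finale"]

    class ShowControl:
        def __init__(self):
            self.index = -1          # nothing playing until the first cue

        def on_kitchen_cue(self) -> str:
            """One-button tablet handler: advance to the next course's scene."""
            self.index = min(self.index + 1, len(SCENES) - 1)
            return SCENES[self.index]

        def frame_content(self) -> str:
            """Between cues the current scene's ambient loop simply keeps looping."""
            return f"{SCENES[self.index]}:ambient_loop" if self.index >= 0 else "preshow"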

Responsibilities

  • Designed and built show control system coordinating video, lighting, and audio playback
  • Created generative content pipeline for a 10,000 × 2,000 pixel wraparound display
  • Produced content sequences synced to each course of the dining menu
  • Built scene transitions timed to kitchen service — farmland, orchards, coastal scenes
  • Integrated lighting control for immersive atmosphere changes per course
  • Managed multi-server playback for seamless high-resolution output

Tech Stack

TouchDesigner · Show Control · GLSL · Generative Content · Multi-Server Playback · Lighting Control · Spatial Audio

System Architecture

Show Control Master (course cues · timecode · manual override) → cue triggers → Video Servers (multi) · Lighting Control · Audio System
All outputs → Wraparound Display (10,000 × 2,000 px) + Ambient Lighting + Spatial Audio

Constraints

Resolution: 10,000 × 2,000 pixels; massive render pipeline requiring multi-server synchronized playback
Timing: Content transitions had to sync with live kitchen service; course timing varies per evening
Coordination: Video, lighting, and audio had to transition together; any desync broke the immersive atmosphere
Environment: Dining room; no loud equipment noise, visible cables, or any tech that disrupts the meal

Outcomes

  • 10,000-pixel-wide wraparound display
  • 100% show sync accuracy every evening
  • Shorty Award Finalist (2023)
  • Multi-evening runs, repeatable and reliable

Dining · Show Control · Generative · Award Finalist · High-Res