Deep Dives
Case Studies
Otherworld Philadelphia
21 Puzzles · 18 Rooms · 150+ Projectors · #7 USA Today Best New Attractions
Designed and developed 21 interactive puzzles across 18 immersive rooms for a permanent, 24/7 art installation. Integrated 50+ computers, 150+ projectors, 30+ screens, and 20+ sensors into a unified system running a cohesive 4-character narrative — all built for long-term reliability with remote support.
Experience Snapshot
Experience Visual
Design Decisions
- Isolated each room on its own VLAN so one crash can't cascade
- Built self-healing watchdog processes instead of relying on staff restarts
- Chose LiDAR over camera tracking for privacy in a public venue
- Designed puzzles with progressive difficulty so casual visitors still feel rewarded
- Used 12-channel spatial audio per room to avoid bleed between adjacent spaces
- Created content timelines that loop seamlessly during low-traffic periods
Prototyping + Testing
Prototyped each puzzle mechanic in isolation before integrating into the full room system. Ran week-long soak tests with simulated guest loads to catch memory leaks and sensor drift. Calibration routines were tested overnight to validate LiDAR accuracy over time.
What Went Wrong + How We Solved It
Problem: With 50+ networked machines running simultaneously, early testing revealed cascading failures — one room crashing could ripple into adjacent rooms sharing network resources. Sensor drift on the LiDAR arrays caused false triggers in high-traffic areas.
Fix: Isolated each room onto its own VLAN with independent watchdog processes and auto-restart sequences. Implemented health-check heartbeats so the central dashboard could flag and reboot individual machines without affecting the rest of the venue. For LiDAR drift, built calibration routines that run during overnight downtime and added dead-zone filtering to ignore edge-case noise. The system now self-heals without staff intervention.
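The heartbeat-and-restart loop described above can be sketched as a small monitor that tracks when each machine last checked in and flags stale ones for reboot. This is a minimal illustration of the idea, not the production code; the class name, timeout, and machine IDs are hypothetical.

```python
import time

HEARTBEAT_TIMEOUT = 15.0  # illustrative: seconds without a heartbeat before flagging


class HeartbeatMonitor:
    """Tracks last-seen heartbeats per machine and reports restart candidates."""

    def __init__(self, timeout=HEARTBEAT_TIMEOUT, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock          # injectable clock makes the logic testable
        self.last_seen = {}         # machine_id -> timestamp of last heartbeat

    def beat(self, machine_id):
        # Called whenever a machine's local watchdog checks in.
        self.last_seen[machine_id] = self.clock()

    def stale_machines(self):
        # Machines whose heartbeat has lapsed; candidates for auto-restart.
        now = self.clock()
        return [m for m, t in self.last_seen.items() if now - t > self.timeout]
```

A central dashboard polling `stale_machines()` can then reboot only the flagged machines, leaving the rest of the venue untouched.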
Responsibilities
- Led experience design for 21 interactive puzzles and narrative arcs for 4 characters
- Developed custom software applications for real-time interactivity and exhibit management
- Designed UX flows, audience flow diagrams, story arcs, and copy
- Integrated and stabilized 50+ computers, 20+ sensors, 30+ screens, and 150+ projection systems
- Built 12-channel spatial audio systems and room-specific audio environments
- Created content timelines and puzzle sequences for each exhibit
- Provided team onboarding and ongoing remote support infrastructure
Tech Stack
System Architecture
Constraints
Outcomes
GitHub Universe 2024
3 Interactive Exhibits · Flagship Developer Conference · San Francisco
Produced three interactive exhibits for GitHub's flagship developer conference: a survey-driven projection experience, a 270-degree data visualization of attendee GitHub profiles, and a conference-wide scavenger hunt — all designed to collect research data while reinforcing community engagement.
Experience Snapshot
Experience Visual
Design Decisions
- Cached GitHub API results locally to survive rate limiting during peak hours
- Pre-fetched trending handles during off-peak to speed up lookups
- Made survey flow under 60 seconds so lines stayed short
- Designed balloon visuals to be unique per response — every attendee got something shareable
- Routed all data to a real-time research dashboard so the team could adjust questions mid-event
Prototyping + Testing
Stress-tested the GitHub API integration with simulated concurrent requests to find the rate limit ceiling. Ran the full survey-to-balloon pipeline with test data to tune the generation speed. Validated the research dashboard with sample data before the event.
What Went Wrong + How We Solved It
Problem: GitHub API rate limits nearly throttled The Core, the 270° real-time particle visualization, during peak hours. With hundreds of attendees searching their handles simultaneously, the system started queuing requests and the visualization stalled.
Fix: Implemented an aggressive local cache layer — once a handle was looked up, the data was stored locally for the rest of the conference. Added a pre-fetch queue that pulled popular/trending handles during off-peak. The result was sub-second lookups for returning queries and graceful degradation for new ones during spikes.
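The cache-first lookup described above might look like the sketch below: hit the local store first, fall back to the API only on a miss, and reuse the same path to warm the cache off-peak. The class name and the injected fetcher are illustrative; the endpoint shown is GitHub's public users API.

```python
import json
import urllib.request


class HandleCache:
    """Cache-first GitHub handle lookup: one API call per handle, ever."""

    API_URL = "https://api.github.com/users/{}"

    def __init__(self, fetch=None):
        self.store = {}                       # handle -> cached profile data
        self.fetch = fetch or self._fetch_remote

    def _fetch_remote(self, handle):
        with urllib.request.urlopen(self.API_URL.format(handle)) as resp:
            return json.load(resp)

    def lookup(self, handle):
        key = handle.lower()
        if key in self.store:                 # cache hit: no API call
            return self.store[key]
        data = self.fetch(handle)             # cache miss: one rate-limited call
        self.store[key] = data
        return data

    def prefetch(self, handles):
        # Warm the cache with popular handles during off-peak hours.
        for h in handles:
            self.lookup(h)
```

Keying on the lowercased handle means re-entered or differently cased queries still hit the cache, which is where the sub-second repeat lookups come from.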
Responsibilities
- Technical production for all three exhibits from concept through strike
- Designed and built "Join The Fold" — survey responses generated custom hot air balloons launched into a projection-mapped bay scene
- Developed "The Core" — 270° projection with real-time particle field pulling GitHub account stats by handle
- Created "GitHunt" — conference-wide scavenger hunt with 10 hidden Mona Cats and gift card rewards
- Built real-time data pipeline so research team could review survey results live during the event
- Coordinated A/V integration, projection alignment, and interactive hardware
Tech Stack
System Architecture
Constraints
Outcomes
TKE Building / Gensler
Largest LED Wall in North America · LEED Gold · American Architecture Award · AGC First Place
Built the content management system and auto-failover infrastructure for the largest LED wall in North America. The system handles smart scheduling, brightness scaling, holiday programming, and real-time generative visuals driven by elevator position — all managed remotely with 24/7 uptime.
Experience Snapshot
Experience Visual
Design Decisions
- Built dual-server auto-failover so the wall never goes dark
- Reverse-engineered the undocumented elevator API rather than waiting for vendor docs
- Added graceful degradation — elevator feed drops → smooth ambient fallback
- Designed CMS for non-technical building staff with playlist/schedule/brightness controls
- Tied LED content to building lighting system for cohesive exterior appearance
Prototyping + Testing
Logged raw elevator data packets for weeks to map the undocumented protocol. Simulated network failures to validate auto-failover switching time. Tested the CMS with building staff during a training session to catch UX issues before handoff.
What Went Wrong + How We Solved It
Problem: The elevator API was undocumented and intermittent — it would drop connection unpredictably, causing the generative visuals to freeze mid-animation. The building's IT team had no technical documentation on the elevator data protocol.
Fix: Reverse-engineered the elevator data feed by logging raw packets over several weeks. Built a resilient connection handler with automatic reconnection, data smoothing (to fill gaps during dropouts), and a graceful fallback to ambient generative content when the elevator feed goes offline. The wall never shows a frozen frame — it degrades into a beautiful ambient state instead.
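The smoothing-plus-fallback behavior can be sketched as a handler that damps jitter in the position feed and raises an ambient flag after a sustained dropout. This is an illustrative sketch, assuming exponential smoothing; the real handler's thresholds and smoothing method are not documented here.

```python
class ElevatorFeed:
    """Smooths an intermittent position feed; signals ambient fallback on dropout."""

    def __init__(self, dropout_limit=30, alpha=0.2):
        self.dropout_limit = dropout_limit  # missed packets before ambient fallback
        self.alpha = alpha                  # smoothing factor (0..1)
        self.position = None
        self.missed = 0

    def update(self, raw_position):
        """Feed one packet (None for a dropped packet); returns (position, ambient)."""
        if raw_position is None:
            self.missed += 1                # gap: hold last smoothed position
        else:
            self.missed = 0
            if self.position is None:
                self.position = float(raw_position)
            else:
                # Exponential smoothing damps jitter and fills small gaps.
                self.position += self.alpha * (raw_position - self.position)
        ambient = self.missed >= self.dropout_limit
        return self.position, ambient
```

When `ambient` goes true the renderer crossfades to generative content instead of freezing, which is the "never shows a frozen frame" guarantee.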
Responsibilities
- Engineered auto-failover hardware architecture for continuous, unattended operation
- Built custom CMS with playlist creation, smart scheduling, and holiday programming
- Developed a sunset-aware timer that scales wall brightness automatically through the day
- Integrated elevator position API to drive real-time generative visuals on the LED wall
- Created lighting sync/control system tying LED content to building lighting
- Built secure 24/7 remote support and monitoring dashboard
- Calibrated LED wall for seamless panel alignment
Tech Stack
System Architecture
Constraints
Outcomes
HBO: Bleed for the Throne
SXSW · Cannes Gold Lion · Cannes Silver Lion · Grand Clio · 2x Gold Clio · 2x D&AD Wood Pencil
Developed the wireless headphone audio system for HBO's multi-room "Bleed for the Throne" blood donation activation at SXSW 2019. The system synced personalized audio to video based on each guest's room location, creating a spatially aware, multi-channel experience as people moved through the space.
Experience Snapshot
Experience Visual
Design Decisions
- Used beacon proximity over GPS for indoor room detection accuracy
- Added debounce logic so room transitions don't flip-flop in overlap zones
- Adjusted beacon power levels to create clean handoff corridors
- Synced audio to video via timecode — drift beyond 50ms was perceptible
- Designed the system to handle headset disconnects gracefully (auto-reconnect on re-entry)
Prototyping + Testing
Tested beacon overlap zones during load-in by walking the full guest path with test headsets. Measured audio-video sync drift across multiple room transitions. Ran concurrent headset stress tests to find the wireless channel capacity ceiling.
What Went Wrong + How We Solved It
Problem: During early load-in testing, the wireless headphone system dropped audio when too many guests clustered in the transition zone between rooms. The room-detection beacons had overlapping ranges, causing the app to rapidly switch audio channels — creating a jarring "flip-flop" effect.
Fix: Added a debounce layer to the room-detection logic — the system required a sustained signal from a new room's beacon before switching audio channels. Also adjusted beacon power levels and placement to create cleaner handoff zones. The transition became seamless, and guests never noticed the switch.
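The debounce logic reduces to requiring a run of consecutive readings from a new beacon before switching rooms, as in this minimal sketch (the hold count and room IDs are illustrative, not the production values):

```python
class RoomDebouncer:
    """Switches rooms only after a new beacon wins several consecutive scans."""

    def __init__(self, hold=5):
        self.hold = hold            # consecutive readings required to switch
        self.current = None
        self.candidate = None
        self.count = 0

    def reading(self, room_id):
        """Feed the strongest beacon from one scan; returns the active room."""
        if room_id == self.current:
            # Back on the current beacon: discard any pending switch.
            self.candidate, self.count = None, 0
            return self.current
        if room_id == self.candidate:
            self.count += 1
        else:
            self.candidate, self.count = room_id, 1
        if self.count >= self.hold:
            self.current, self.candidate, self.count = room_id, None, 0
        return self.current
```

In an overlap zone the beacons alternate scan-to-scan, so the candidate counter keeps resetting and the audio channel never flip-flops; only a sustained signal from the next room triggers the handoff.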
Responsibilities
- Developed wireless headphone app for room-based audio synchronization
- Built real-time location tracking to detect which room each guest occupied
- Designed audio-to-video sync engine that delivered room-specific audio and visual cues
- Engineered multi-channel audio routing for personalized per-guest experiences
- Coordinated integration with theatrical design and video playback systems
Tech Stack
System Architecture
Constraints
Outcomes
Fortnite World Cup
Arthur Ashe Stadium · 360° Stage Mapping · Live Broadcast · $30M Prize Pool
Built the generative visual system for the inaugural Fortnite World Cup at Arthur Ashe Stadium. In-game player actions drove real-time visual compositions mapped across a 360-degree stage environment, integrated directly into the global live broadcast reaching millions of viewers.
Experience Snapshot
Experience Visual
Design Decisions
- Built a throttle/smoothing layer to spread burst events across a 250ms window
- Separated generative engine from broadcast feed so a render spike can't black-out the stream
- Mapped specific game events to distinct visual motifs so the audience could "read" the gameplay
- Used multiple media servers for 360° output redundancy
- Designed all visuals to be camera-friendly — colors and contrast tuned for broadcast
Prototyping + Testing
Ran rehearsal matches with live data to tune the event-to-visual mapping. Stress-tested burst packets (mass elimination scenarios) to validate the throttle layer. Checked broadcast feed quality on-site with the broadcast engineering team before the event.
What Went Wrong + How We Solved It
Problem: During rehearsals, the game data API would occasionally send burst packets — hundreds of events in a single frame — when a major in-game moment happened (like a mass elimination). This caused the generative engine to spike and drop frames on the broadcast feed.
Fix: Built a throttle/smoothing layer between the data ingestion and the generative engine. Major events were queued and spread across a short animation window (250ms) instead of all hitting in a single frame. This preserved the dramatic visual impact while keeping the render pipeline stable. The broadcast feed ran clean for the entire 3-day event.
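The throttle layer amounts to queuing incoming events and releasing only a fraction of the backlog each render frame, so a burst drains over roughly one animation window instead of landing in a single frame. A minimal sketch, with illustrative window and frame-rate values:

```python
class BurstSmoother:
    """Spreads a burst of simultaneous events across a short animation window."""

    def __init__(self, window=0.25, frame_rate=60):
        # Frames available to drain one window's worth of backlog.
        self.frames_per_window = max(1, int(window * frame_rate))
        self.queue = []

    def ingest(self, events):
        # Data-ingestion side: may receive hundreds of events at once.
        self.queue.extend(events)

    def release(self):
        """Called once per render frame; returns the events to animate now."""
        if not self.queue:
            return []
        # Release a proportional slice so the burst drains over ~one window.
        n = max(1, len(self.queue) // self.frames_per_window)
        out, self.queue = self.queue[:n], self.queue[n:]
        return out
```

Because events are released in order and never dropped, a mass elimination still reads as one dramatic cascade on stage; it just unfolds over 250ms rather than spiking a single frame.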
Responsibilities
- Developed generative content system driven by real-time in-game player actions
- Mapped generative visuals across 360-degree stage surround at Arthur Ashe Stadium
- Integrated visual output into the live broadcast pipeline for global streaming
- Built data ingestion layer to translate gameplay events into visual triggers
- Coordinated with broadcast engineering team for seamless feed integration
Tech Stack
System Architecture
Constraints
Outcomes
H-E-B Immersive Dinner
10,000 x 2,000 Resolution · Course-Synced Show Control · Shorty Award Finalist
Built the show control system and generative content pipeline for an immersive dining experience inside a 10,000 x 2,000 pixel wraparound display. Content transported diners through Texas farmland, orchards, and coastal scenes — timed to each course of the meal — while coordinating lighting, audio, and video into a seamless multi-sensory experience.
Experience Snapshot
Experience Visual
Design Decisions
- Switched from timecode-based to cue-based show control to absorb kitchen timing variance
- Built extendable ambient loops so scenes stretch/compress naturally between cues
- Gave kitchen staff a one-button tablet interface — no training required
- Separated video, lighting, and audio buses so any one can be adjusted without affecting others
- Designed all transitions to be gradual (no hard cuts) so diners never feel jarred
Prototyping + Testing
Simulated variable kitchen timing by randomizing cue intervals during rehearsals. Tested multi-server sync by measuring pixel-level seam alignment on the wraparound display. Ran full dress rehearsal dinners with test audiences to validate pacing and atmosphere.
What Went Wrong + How We Solved It
Problem: Kitchen timing is inherently unpredictable — courses could arrive 5 minutes early or 15 minutes late depending on the evening. The original show control was timecode-based, which meant the visual scene would either finish too early (leaving dead air) or get cut short when the next course arrived.
Fix: Redesigned the show control to be cue-based instead of timecode-based. Each course had a start cue triggered by the kitchen team (via a simple tablet interface), and the content was structured with extendable ambient loops that could stretch or compress gracefully. Transitions always looked intentional, regardless of when the kitchen called the next course.
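The cue-based structure can be sketched as a player that idles in the current scene's ambient loop until the kitchen fires a cue, then runs a transition into the next scene. Scene names and the string return values here are illustrative stand-ins for the real playback commands.

```python
class CueShow:
    """Cue-based playback: each scene loops until the kitchen fires the next cue."""

    def __init__(self, scenes):
        self.scenes = scenes          # ordered course scenes
        self.index = 0
        self.in_transition = False

    @property
    def current_scene(self):
        return self.scenes[self.index]

    def tick(self):
        """Called every frame; ambient loops stretch for as long as needed."""
        if self.in_transition:
            self.in_transition = False        # one-frame stand-in for a crossfade
            return "transition->" + self.current_scene
        return "loop:" + self.current_scene

    def cue(self):
        """Fired from the kitchen tablet; advances regardless of elapsed time."""
        if self.index + 1 < len(self.scenes):
            self.index += 1
            self.in_transition = True
        return self.current_scene
```

Since nothing in `tick()` depends on a timecode, a course arriving 15 minutes late just means more loop iterations, and the transition always plays in full when the cue finally lands.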
Responsibilities
- Designed and built show control system coordinating video, lighting, and audio playback
- Created generative content pipeline for 10,000 x 2,000 pixel wraparound display
- Produced content sequences synced to each course of the dining menu
- Built scene transitions timed to kitchen service — farmland, orchards, coastal scenes
- Integrated lighting control for immersive atmosphere changes per course
- Managed multi-server playback for seamless high-resolution output