
Gamified Stereoscopic 3D Experience Design
💡 Game Design | Hybrid Learning | Interaction Design | Rapid Prototyping | System Design | User Testing | Sound Design
🛠️ Figma | Google Slides | Illustrator | Photoshop | After Effects | Unity | Unreal Engine
👥 Designers | Developers | Artists | Experts | Game Testers
Project Overview
The Middle Space Project is an immersive, joystick-controlled digital gallery experience built for naked-eye stereoscopic 3D displays. Visitors board the SpaceTrain and explore MiddleSpace—a procedurally generated galaxy filled with 3D art, fantasy environments, and interactive Points of Interest spread across 46 sectors.

The experience uses large, parallax-shifting LED monitors to create a holographic depth effect without VR headsets. MiddleSpace spans 1,000 light-years, with evolving weather systems, creatures, worlds, and artist-built installations. Once the virtual SpaceTrain is aligned to its physical counterpart, each window becomes a portal where the real and digital worlds meet.
The Need
Traditional galleries limit digital art to flat displays or VR headsets. The Middle Space Project needed a way to showcase 3D and new media art at scale—without isolating visitors or breaking immersion. The goal was to build an accessible, large-format, headgear-free 3D exploration experience that merges physical and virtual space.

The SpaceTrain
The physical space is a warehouse venue designed like a small luxury cruise cabin in space: the SpaceTrain. A bar and central lounge seating give everyone a perfect view through the “windows,” which are actually large naked-eye stereoscopic 3D monitors displaying responsive views of the MiddleSpace galaxy.

At the front, the bridge holds the captain’s chairs, haptic seats, and the main joystick. A simple digital navigation interface sits beside it, letting guests chart courses and explore as if they were steering a real starship.

The Game Mechanics
The SpaceTrain’s “windows” are actually naked-eye 3D LED monitors installed in the physical fuselage. Inside Unreal Engine, the SpaceTrain is built to the exact same dimensions as the real structure, including every window. Each virtual window has its own dedicated virtual camera, positioned and angled to match its real-world counterpart.
That means when the 3D monitors display the feed from their assigned virtual cameras, the world outside the train appears to line up perfectly with the physical windows—creating the illusion that you’re actually looking out into MiddleSpace. No headsets, no glasses, just pure parallax depth that moves when you move.
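To make the setup concrete, here is a minimal Unreal Engine C++ sketch of one window camera: a SceneCapture2D component offset from the train origin by the measured position of its physical window, rendering into the texture that feeds that window’s LED panel. The class name, properties, and measurements are illustrative assumptions, not the project’s actual code.

```cpp
// Hypothetical sketch: one virtual camera per physical window, aligned to the
// measured position of that window in the real SpaceTrain fuselage.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "SpaceTrainWindowCamera.generated.h"

UCLASS()
class ASpaceTrainWindowCamera : public AActor
{
    GENERATED_BODY()

public:
    ASpaceTrainWindowCamera();

    // Offset and rotation of this window relative to the SpaceTrain origin,
    // copied from the physical fuselage measurements (cm / degrees).
    // The values here are placeholders.
    UPROPERTY(EditAnywhere, Category = "Window")
    FVector WindowOffsetCm = FVector(450.f, -120.f, 95.f);

    UPROPERTY(EditAnywhere, Category = "Window")
    FRotator WindowRotation = FRotator(0.f, -90.f, 0.f);

    // Render target consumed by this window's physical LED panel feed.
    UPROPERTY(EditAnywhere, Category = "Window")
    UTextureRenderTarget2D* WindowFeed = nullptr;

protected:
    virtual void OnConstruction(const FTransform& Transform) override;

private:
    UPROPERTY(VisibleAnywhere, Category = "Window")
    USceneCaptureComponent2D* WindowCapture = nullptr;
};

ASpaceTrainWindowCamera::ASpaceTrainWindowCamera()
{
    PrimaryActorTick.bCanEverTick = false;

    // One capture component stands in for one window camera and renders
    // every frame so the parallax view stays live as the train moves.
    WindowCapture = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("WindowCapture"));
    SetRootComponent(WindowCapture);
    WindowCapture->bCaptureEveryFrame = true;
}

void ASpaceTrainWindowCamera::OnConstruction(const FTransform& Transform)
{
    Super::OnConstruction(Transform);

    // Align the virtual camera with its real-world window and point its
    // output at the texture the LED panel displays.
    WindowCapture->SetRelativeLocationAndRotation(WindowOffsetCm, WindowRotation);
    WindowCapture->TextureTarget = WindowFeed;
    WindowCapture->FOVAngle = 78.f; // placeholder; matched per window in practice
}
```

In practice each window of a naked-eye stereoscopic panel would likely need a stereo pair of captures; a single capture keeps the alignment idea readable.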

World Building
MiddleSpace’s core gameplay is built from real constraints—time, distance, and rate of travel—which become the primary mechanics of the experience. Because the galaxy’s scale and travel rates were mathematically recalibrated to ensure realism and meaningful stakes, navigation itself becomes the “game.” Players must make trade-offs: choosing which sectors to visit, which Points of Interest to skip, and how to plan efficient routes through a world that’s simply too large to see all at once. That scarcity creates opportunity cost, pacing, and reward.
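As a rough sketch of that arithmetic (with placeholder units and tuning values, not the project’s calibrated numbers): the time cost of a route is transit distance divided by travel rate plus the dwell time at each stop, and any plan that overruns the session forces a Point of Interest to be cut.

```cpp
// Minimal sketch of the time/distance/rate trade-off behind route planning.
// Units, rates, and session length are illustrative placeholders.
#include <cstdio>
#include <numeric>
#include <vector>

struct Waypoint
{
    float DistanceLightYears; // distance from the previous stop
    float DwellMinutes;       // time spent exploring the Point of Interest
};

// Travel rate expressed as light-years covered per real-world minute.
constexpr float kLightYearsPerMinute = 2.5f;

// Total cost of a planned route: transit time plus time spent at each stop.
float RouteMinutes(const std::vector<Waypoint>& Route)
{
    return std::accumulate(Route.begin(), Route.end(), 0.0f,
        [](float Total, const Waypoint& Stop)
        {
            return Total + Stop.DistanceLightYears / kLightYearsPerMinute
                         + Stop.DwellMinutes;
        });
}

int main()
{
    // Three candidate POIs; with a fixed session length, something has to give.
    const std::vector<Waypoint> Plan = {{12.0f, 4.0f}, {30.0f, 6.0f}, {55.0f, 8.0f}};
    const float SessionMinutes = 45.0f;

    const float Cost = RouteMinutes(Plan);
    std::printf("Planned route: %.1f min of a %.1f min session%s\n",
                Cost, SessionMinutes, Cost > SessionMinutes ? " (over budget)" : "");
}
```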
Hybrid Interface
I conducted early user testing with experts at Carnegie Mellon’s Entertainment Technology Center (ETC) to evaluate feasibility, comfort, and interaction flow. Their feedback showed that difficulty and controls must scale to the physical environment—favoring simple, lightweight mechanics such as travel constraints, gentle motion paths, and randomized weather systems. These insights helped calibrate the experience so it feels intuitive and accessible inside a real-world installation.

Navigation App
The galaxy maps, sector views, and system-level screens share one clear visual language, so players always know their current location, selected route, and time-to-destination. Phenomena, POIs, and artifacts are layered in without cluttering the main navigation, giving guests reasons to explore. The result is a cohesive navigation system that feels intuitive in a dark, social, real-world venue and still supports deep, replayable exploration of MiddleSpace.
Sound Design
MiddleSpace uses Sony 360 Reality Audio to spatialize every sound as a 3D object—engine hums, nebula storms, POI signals, cosmic winds. Instead of using fixed left-right channels, the audio is mapped as positional objects inside the galaxy, so the soundscape shifts naturally as the SpaceTrain moves or turns.
The system is paired with environmental haptics embedded in the physical space. Low-frequency vibrations, subtle rumbles, and directional feedback sync with the 360 audio, making movement, acceleration, weather changes, and cosmic events feel physically present. Together, the sound and haptics create a deep sensory illusion—you don’t just see MiddleSpace around you, you feel it.
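The sketch below shows the relative-position math this kind of object-based mix depends on: each emitter’s world position is re-expressed in the SpaceTrain’s frame every frame, giving the azimuth and distance an object-audio renderer consumes. The struct names and the yaw-only listener pose are simplifying assumptions for illustration; this is not the project’s Sony 360 Reality Audio integration.

```cpp
// Illustrative math behind object-based spatialization: sound sources live in
// world space and are re-expressed relative to the moving SpaceTrain, so their
// apparent direction shifts as the train moves or turns.
#include <cmath>
#include <cstdio>

struct Vec3 { float X, Y, Z; };

// A yaw-only listener pose is enough to show the idea; a full implementation
// would use the train's complete rotation.
struct ListenerPose
{
    Vec3  Position;
    float YawRadians; // heading of the SpaceTrain
};

// World-space source position -> listener-relative position (un-rotate yaw).
Vec3 ToListenerSpace(const Vec3& Source, const ListenerPose& Listener)
{
    const float DX = Source.X - Listener.Position.X;
    const float DY = Source.Y - Listener.Position.Y;
    const float C  = std::cos(-Listener.YawRadians);
    const float S  = std::sin(-Listener.YawRadians);
    return {
        DX * C - DY * S,                 // forward/back in the train's frame
        DX * S + DY * C,                 // left/right in the train's frame
        Source.Z - Listener.Position.Z   // up/down, unaffected by yaw
    };
}

int main()
{
    const ListenerPose Train{{0.f, 0.f, 0.f}, 0.f};
    const Vec3 NebulaStorm{100.f, 50.f, -10.f}; // placeholder emitter position

    const Vec3 Rel      = ToListenerSpace(NebulaStorm, Train);
    const float Azimuth  = std::atan2(Rel.Y, Rel.X); // radians, 0 = dead ahead
    const float Distance = std::sqrt(Rel.X * Rel.X + Rel.Y * Rel.Y + Rel.Z * Rel.Z);
    std::printf("azimuth %.2f rad, distance %.1f\n", Azimuth, Distance);
}
```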
Project gallery: Space Train Interior Render | Navigation Interface Mockup | Galaxy Map Design | Sector View | POI Detail Screen | Sound Design Diagram | Haptic Seat Layout | User Testing Session | ETC Playtest Feedback | Unreal Engine Scene Setup | Virtual Camera Alignment | LED Monitor Installation