# Phase Roadmap
All 16 phases are approved. Phases 1-11 are deployed and live. Phases 12-16 are scaffolded (types, stores, service stubs, panel UIs, Cloud Function stubs) but their ML/compute backends are not yet connected.
## Phase Status
| Phase | Feature | Status | Key Files |
|---|---|---|---|
| 1 | Auth (email/password + Google OAuth) | Live | AuthPage, useAuthStore, auth.ts |
| 2 | Landing page with WebGL hero | Live | yugma-landing/ |
| 3 | Scene templates + loader | Live | sceneTemplates.ts, ProjectInfoPanel |
| 4 | Share links (URL-encoded) | Live | shareLinkService, urlEncoder |
| 5 | Library panel (primitives + GLTF + Sketchfab) | Live | LibraryPanel/ |
| 6 | Real-time collaboration (room codes, cursors) | Live | collabService, CollabOverlay |
| 7 | AI Scene Composer | Live | AIPanel, aiCompose, aiSerializer |
| 8 | Text-to-3D (Meshy) | Live | GenerateSection, generationService |
| 9 | Advanced collab (comments, feed) | Live | CommentPin, ChangeFeed |
| 10 | Export (GLB, screenshot, embed) | Live | exportUtils, EmbedPage |
| 11 | AI Materials (30 presets, AI gen) | Live | materialPresets, AITextureSection |
| 12 | Video-to-3D reconstruction | Scaffolded | video.types, useVideoStore, VideoPanel |
| 13 | Industrial digital twins | Scaffolded + Mounted | twin.types, useTwinStore, TwinPanel |
| 14 | Product vibe-coder | Scaffolded | product.types, useProductStore, ProductPanel |
| 15 | Factory simulation (physics) | Scaffolded + Wrapped | physics.types, usePhysicsStore, PhysicsWorld |
| 16 | AI cinematic director | Scaffolded | cinematic.types, useCinematicStore, CinematicPanel |
## AI Enhancements (Cross-Phase)
These enhancements improve the AI layer across all phases:
| Enhancement | Status | Files |
|---|---|---|
| Spatial preprocessor (circle/grid/stack/spiral/line/scatter) | Complete | spatialPreprocessor.ts |
| Cross-session memory (Firestore) | Complete | aiService.ts (loadLastSession/saveAISession) |
| Style fingerprint + memory | Complete | styleFingerprint.ts |
| Planner→executor decomposition (3D-GPT) | Complete | aiCompose.ts (PLANNING_SYSTEM_PROMPT) |
| Real focus_camera (smooth tween) | Complete | CameraController.tsx, useSceneStore |
| YSL v1.5 schema (userData, semanticRole, relationships) | Complete | scene.types.ts, aiSerializer.ts |
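To illustrate the spatial preprocessor row above, a circle layout reduces to placing `n` points evenly around a center so the AI planner only has to say "arrange in a circle" and never emits raw coordinates. The function name and signature below are hypothetical sketches, not the actual `spatialPreprocessor.ts` API.

```typescript
type Vec3 = [number, number, number];

// Hypothetical circle-layout helper in the spirit of spatialPreprocessor.ts:
// place `count` objects evenly on a circle of `radius` in the XZ plane.
function circleLayout(count: number, radius: number, center: Vec3 = [0, 0, 0]): Vec3[] {
  return Array.from({ length: count }, (_, i) => {
    const angle = (2 * Math.PI * i) / count; // even angular spacing
    return [
      center[0] + radius * Math.cos(angle),
      center[1],
      center[2] + radius * Math.sin(angle),
    ];
  });
}
```

Grid, line, stack, and spiral layouts follow the same pattern: deterministic position math in code, so the LLM's plan stays symbolic.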
## What Each Scaffolded Phase Needs to Go Live
### Phase 12 — Video-to-3D

Build a Cloud Run orchestrator that chains: SAM2 segmentation → monocular depth estimation → object classification → auto-rigging → scene composition. The Cloud Function `startVideoReconstruction` creates the job doc; the orchestrator processes it and writes back `resultGlbUrl`.
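As a minimal sketch of how the orchestrator could model that chain, the pipeline can be an ordered list of stages that the job doc walks through. The stage names and the `stage` field are assumptions for illustration; only `resultGlbUrl` comes from the actual schema, and the real orchestrator would run the ML step for each stage before advancing.

```typescript
// Hypothetical job-doc state machine for the Phase 12 pipeline.
// Stage names and the `stage` field are illustrative assumptions.
type Stage = "segmentation" | "depth" | "classification" | "rigging" | "composition";

const PIPELINE: Stage[] = ["segmentation", "depth", "classification", "rigging", "composition"];

interface VideoJob {
  stage: Stage | "queued" | "done";
  resultGlbUrl?: string;
}

// Advance the job doc one stage; the real orchestrator would execute the
// ML step for the current stage first, then write the updated doc back.
function advance(job: VideoJob, glbUrl?: string): VideoJob {
  if (job.stage === "queued") return { ...job, stage: PIPELINE[0] };
  if (job.stage === "done") return job;
  const i = PIPELINE.indexOf(job.stage);
  if (i === PIPELINE.length - 1) return { stage: "done", resultGlbUrl: glbUrl };
  return { ...job, stage: PIPELINE[i + 1] };
}
```

Keeping the pipeline as data (rather than hard-coded calls) makes it easy to skip stages, e.g. omitting auto-rigging for static objects.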
### Phase 13 — Digital Twins

- Deploy RTDB rules for `/sensors/*` (`firebase deploy --only database`)
- Build an MQTT→RTDB bridge (Cloud Run or Cloud Function) that receives sensor readings and writes to `sensors/{id}/latest`
- The `useSensorSubscriptions` hook + `pushReading` → `updateObject` pipeline is already wired end-to-end
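The core of the bridge is a pure mapping from an MQTT message to an RTDB write. The sketch below assumes topics shaped like `sensors/<id>` carrying JSON payloads; the topic layout and the `Reading` fields are assumptions, not the real wire format.

```typescript
// Hypothetical MQTT→RTDB mapping; topic shape and payload fields are assumed.
interface Reading {
  value: number;
  ts: number;
}

// Translate one MQTT message into the RTDB path that the
// useSensorSubscriptions hook listens on (sensors/{id}/latest).
function toRtdbWrite(
  topic: string,
  payload: string
): { path: string; data: Reading } | null {
  const m = topic.match(/^sensors\/([\w-]+)$/);
  if (!m) return null; // ignore topics outside the sensors namespace
  const data = JSON.parse(payload) as Reading;
  return { path: `sensors/${m[1]}/latest`, data };
}
```

The bridge process would subscribe via an MQTT client, call this on each message, and perform the write with the Firebase Admin SDK.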
### Phase 14 — Product Vibe-Coder

- Set the `NEXAR_API_KEY` secret for the Octopart/Nexar GraphQL API
- Replace the mock data in `octopartProxy.ts` with real API calls
- Wire enclosure geometry into SceneRenderer (convert `Enclosure` → box with mounting-hole subtractions)
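For the last step, one way to sketch the `Enclosure` → box conversion is a pure function that emits box dimensions plus hole centers re-expressed relative to the box center, ready for a CSG subtraction pass. The `Enclosure` shape here is an assumption; the real type in `product.types` is likely richer.

```typescript
// Hypothetical Enclosure shape; the real product.types definition may differ.
interface Enclosure {
  width: number;
  height: number;
  depth: number;
  mountingHoles: { x: number; z: number; radius: number }[]; // measured from one corner
}

interface BoxSpec {
  size: [number, number, number];
  // Hole centers relative to the box center, for the subtraction pass.
  holes: { position: [number, number, number]; radius: number }[];
}

function enclosureToBox(e: Enclosure): BoxSpec {
  return {
    size: [e.width, e.height, e.depth],
    holes: e.mountingHoles.map((h) => ({
      position: [h.x - e.width / 2, 0, h.z - e.depth / 2],
      radius: h.radius,
    })),
  };
}
```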
### Phase 15 — Factory Simulation

- `pnpm --filter yugma-app add @react-three/rapier @dimforge/rapier3d-compat`
- Replace the `PhysicsWorld.tsx` stub body with `<Physics paused={!running}>{children}</Physics>`
- Add `RigidBody` wrappers per SceneObject based on `usePhysicsStore.bodies`
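A small helper can resolve each SceneObject's `RigidBody` props from the store before rendering the wrapper. The store's body-config shape below is an assumption; the `type` values (`"dynamic"`, `"fixed"`, `"kinematicPosition"`) are the strings `@react-three/rapier` accepts.

```typescript
// Assumed shape for usePhysicsStore.bodies entries.
type BodyKind = "dynamic" | "fixed" | "kinematicPosition";

interface BodyConfig {
  kind: BodyKind;
  mass?: number;
}

// Resolve the props to spread onto <RigidBody> for a given SceneObject;
// objects with no physics config stay static so the scene doesn't collapse.
function rigidBodyProps(
  objectId: string,
  bodies: Record<string, BodyConfig>
): { type: BodyKind; mass: number } {
  const cfg = bodies[objectId] ?? { kind: "fixed" as const };
  return { type: cfg.kind, mass: cfg.mass ?? 1 };
}
```

In `SceneRenderer`, each object would then render as `<RigidBody {...rigidBodyProps(obj.id, bodies)}>…</RigidBody>` inside the `<Physics>` provider.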
### Phase 16 — AI Cinematic Director

- Build a `renderStoryboard` Cloud Function: headless R3F render → ffmpeg → video upload
- Wire `CINEMATIC_DIRECTOR_PROMPT` into a new Cloud Function that takes a scene + brief and returns a `Storyboard` JSON
- Build a shot timeline player in CinematicPanel that drives `setCameraTarget` per shot
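The timeline player's core logic is finding which shot owns the current playback time, then feeding that shot's target to `setCameraTarget`. A minimal sketch, assuming shots carry a `target` and a `duration` (the real `Storyboard` shot fields may differ):

```typescript
// Assumed shot shape; the actual Storyboard schema may carry more fields
// (lens, easing, narration, etc.).
interface Shot {
  target: [number, number, number];
  duration: number; // seconds
}

// Walk the cumulative durations to find the shot active at `elapsed`
// seconds; returns null once the timeline has finished.
function activeShot(shots: Shot[], elapsed: number): Shot | null {
  let t = 0;
  for (const shot of shots) {
    t += shot.duration;
    if (elapsed < t) return shot;
  }
  return null;
}
```

CinematicPanel would call this from a `useFrame`-style clock and invoke `setCameraTarget(shot.target)` whenever the active shot changes.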