Custom World Renderer Strategy

Goal

Build an in-house rendering and viewing pipeline that replaces the external BlueMap CLI while supporting:

  1. 3D visualization of each editor world snapshot.
  2. Live player markers on top of that 3D scene.
  3. Multi-layer “fusion” of stacked gameplay worlds.
  4. Transfer zones that push players into the corresponding Minestom instance when they cross configured areas.
  5. Per-map configuration stored in MongoDB and editable from the dashboard.
  6. On-demand rendering triggered together with the existing world-data snapshotter.

Phase Plan

Phase 1 – Configuration & Storage

  • Extend MapVersion with a BlueMapConfig block (layers, viewerSettings, transferZones).
  • REST (MapRoutes) + frontend store/UI to edit that config alongside map versions.
  • Decide on JSON schema (e.g., layer slug, offsets, visibility; zone shape + target instance).

Status: Implemented (backend schema + dashboard editor, Oct 2024).
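
A minimal sketch of how the BlueMapConfig block could be typed; only the three top-level groups (layers, viewerSettings, transferZones) come from the plan above, while the individual field names are illustrative assumptions rather than the final schema:

```typescript
// Hypothetical shape of MapVersion.blueMapConfig. Only layers/viewerSettings/
// transferZones are taken from the plan; the nested fields are assumptions.
export interface BlueMapConfig {
  layers: LayerConfig[];
  viewerSettings: ViewerSettings;
  transferZones: TransferZone[];
}

export interface LayerConfig {
  slug: string;                       // which snapshot directory feeds this layer
  offset: { x: number; y: number; z: number };
  visibleByDefault: boolean;
  tint?: string;                      // optional hex color for the fused overlay
}

export interface ViewerSettings {
  showPlayers: boolean;
  defaultLayerVisibility: Record<string, boolean>;
}

export interface TransferZone {
  id: string;
  shape: { type: 'box'; min: [number, number, number]; max: [number, number, number] };
  targetInstance: string;             // Minestom instance the player is routed to
  minDwellSeconds: number;            // how long a player must stay inside before transfer
}
```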

Phase 2 – Renderer Core

  • Create a new WorldRenderService module invoked by WorldDataGenerationService after snapshots complete.
  • Input: snapshot directory (region/, node/, etc.) + BlueMapConfig.
  • Output: world-data/<slug>/render/manifest.json plus chunk meshes (start with simplified geometry, no textures).
  • Architecture:
    • Chunk reader → voxel mesh generator (surface only initially).
    • Optional layer fusion (merge multiple source snapshots based on config).
    • Material metadata stubbed for future texture work.

Status: ✅ Mesh + heightfield manifest available; the web preview consumes the mesh JSON (Nov 2024). This supersedes the earlier 🚧 milestone, in which the renderer only wrote a metadata manifest and mesh/tile export was still TODO.
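
To make the surface-only meshing step concrete, here is a small heightfield-to-mesh sketch; the per-column height grid as input and the flat positions/indices output are assumptions about the snapshot data, not the actual chunk reader:

```typescript
// Minimal surface-only mesher: turns a per-column height grid into a
// vertex/index buffer (one upward-facing quad per column top).
export interface SurfaceMesh {
  positions: number[]; // x, y, z triplets
  indices: number[];   // two triangles per column
}

export function meshFromHeightfield(heights: number[][]): SurfaceMesh {
  const positions: number[] = [];
  const indices: number[] = [];
  for (let z = 0; z < heights.length; z++) {
    for (let x = 0; x < heights[z].length; x++) {
      const y = heights[z][x];
      const base = positions.length / 3;
      // four corners of the column top
      positions.push(x, y, z,  x + 1, y, z,  x + 1, y, z + 1,  x, y, z + 1);
      // two triangles, wound so the face normal points up (+y)
      indices.push(base, base + 2, base + 1, base, base + 3, base + 2);
    }
  }
  return { positions, indices };
}
```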

Phase 3 – Transfer Zone Runtime Support

  • Server-side service that watches player positions inside editor instances and consults transferZones.
  • When player remains in a zone beyond threshold, move them to the configured target instance (respect permissions/cooldowns).
  • Surface zone list to the renderer so the viewer can overlay the areas.

Status: TransferZoneService prototype online (teleports editors based on config). ✅ Cooldown guard ensures players are not bounced repeatedly after a transfer (configurable via TRANSFER_ZONE_COOLDOWN_SECONDS). ✅ The dashboard monitor now shows live which players are currently inside transfer zones (including remaining time & target instance).
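
A language-agnostic sketch of the dwell + cooldown rule (written in TypeScript for brevity; the real TransferZoneService runs server-side). The helper name, the default cooldown, and the way positions are fed in are assumptions:

```typescript
// Dwell + cooldown check, assumed to be called once per position update
// with a flag saying whether the player is currently inside the zone.
interface DwellState {
  enteredAt?: number;      // ms timestamp when the player entered the zone
  lastTransferAt?: number; // ms timestamp of the last completed transfer
}

// 30 s fallback is a placeholder; the env var name comes from the status note above.
const COOLDOWN_MS =
  Number(process.env.TRANSFER_ZONE_COOLDOWN_SECONDS ?? '30') * 1000;

export function shouldTransfer(
  state: DwellState,
  insideZone: boolean,
  dwellThresholdMs: number,
  now: number = Date.now(),
): boolean {
  if (!insideZone) {
    state.enteredAt = undefined; // leaving the zone resets the dwell timer
    return false;
  }
  if (state.lastTransferAt && now - state.lastTransferAt < COOLDOWN_MS) {
    return false;                // still inside the post-transfer cooldown
  }
  state.enteredAt ??= now;
  if (now - state.enteredAt < dwellThresholdMs) {
    return false;                // not in the zone long enough yet
  }
  state.lastTransferAt = now;    // record the transfer and reset the timer
  state.enteredAt = undefined;
  return true;
}
```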

Phase 4 – Web Viewer

  • SPA module (MapViewer) using Three.js (or similar) to load the generated manifest and display the 3D scene.
  • Controls: orbit camera, zoom, layer toggles, clipping plane, overlays for zones.
  • Hook into existing live-player websocket to draw markers above the geometry.
  • Integrate viewer panel into /maps detail screen (reuse current iframe slot).

Status:
  • ✅ Basic mesh + zone viewer ships on /maps; live WebSocket player markers are overlaid on the Three.js scene (Dec 2024).
  • ✅ Layer fusion overlay pulls meshes from configured slugs and renders per-layer tint/offset toggles in the viewer (Dec 2024).
  • ✅ The renderer additionally generates a color-coded top-down PNG preview per snapshot (Jan 2025).
  • ✅ The viewer now also shows WorldData locations as markers; snapshots export locations.json, and the manifest delivers this data to the Three.js layer. Each location is assigned a persistent UUID that is stored both in Mongo (MapVersion world data) and in the world-data/<slug> directory, so UI/REST calls can reference it unambiguously.
  • ✅ The renderer now generates multi-level meshes (LOD 2×/4×/8×) including a vertex color gradient; the dashboard offers a detail toggle so viewers can switch between "light" and "full" (Jan 2025).
  • ⚙️ The BlueMap CLI has been removed entirely – the in-house renderer is now the sole source for web assets. No jar downloads or extra env flags are needed anymore.
  • The default editor world now loads from the snapshot directory whose slug/base path is maintained in the Mongo collection world_snapshot_preference (no more ENV toggling required).
  • The dashboard (Maps module) now offers a snapshot switcher including a directory listing from world-data/, so admins can select the active snapshot directly.
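
A stripped-down sketch of how the viewer module could load a layer mesh and display it with an orbit camera. The manifest route matches the data-flow overview below, but the manifest and mesh field names (layers[0].meshUrl, positions, indices) are assumptions:

```typescript
import * as THREE from 'three';
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';

// Minimal viewer bootstrap: fetch the manifest, load one layer's mesh JSON
// (assumed to carry flat `positions`/`indices` arrays) and render it with an
// orbit camera. Layer toggles, zones, and player markers are omitted here.
export async function bootMapViewer(container: HTMLElement, slug: string) {
  const manifest = await fetch(`/api/world-snapshots/${slug}/manifest`).then(r => r.json());
  const meshData = await fetch(manifest.layers[0].meshUrl).then(r => r.json());

  const scene = new THREE.Scene();
  const camera = new THREE.PerspectiveCamera(
    60, container.clientWidth / container.clientHeight, 0.1, 4096);
  camera.position.set(128, 160, 128);

  const geometry = new THREE.BufferGeometry();
  geometry.setAttribute('position', new THREE.Float32BufferAttribute(meshData.positions, 3));
  geometry.setIndex(meshData.indices);
  geometry.computeVertexNormals();
  scene.add(new THREE.Mesh(geometry, new THREE.MeshStandardMaterial({ color: 0x88aa66 })));
  scene.add(new THREE.AmbientLight(0xffffff, 0.6));
  scene.add(new THREE.DirectionalLight(0xffffff, 0.8));

  const renderer = new THREE.WebGLRenderer({ antialias: true });
  renderer.setSize(container.clientWidth, container.clientHeight);
  container.appendChild(renderer.domElement);

  const controls = new OrbitControls(camera, renderer.domElement);
  renderer.setAnimationLoop(() => {
    controls.update();
    renderer.render(scene, camera);
  });
}
```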

Phase 5 – Enhancements

  • Texturing (sample from resource pack) + lighting improvements.
  • Level-of-detail tiles to keep payload small.
  • Incremental updates: only re-render changed chunks if needed.
  • Export hooks (e.g., glTF) if external tooling is required later.
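
As a sketch of the level-of-detail idea (and of the 2×/4×/8× levels already shipped), one way to derive coarser meshes is to downsample the heightfield before meshing; the max-pooling choice here is an assumption, picked so tall structures keep their silhouette:

```typescript
// Downsample a per-column height grid by an integer factor (2, 4, 8, ...)
// by taking the max height per block before running the mesher on it.
export function downsampleHeightfield(heights: number[][], factor: number): number[][] {
  const rows = Math.ceil(heights.length / factor);
  const cols = Math.ceil(heights[0].length / factor);
  const out: number[][] = [];
  for (let r = 0; r < rows; r++) {
    const row: number[] = [];
    for (let c = 0; c < cols; c++) {
      let max = -Infinity;
      for (let z = r * factor; z < Math.min((r + 1) * factor, heights.length); z++) {
        for (let x = c * factor; x < Math.min((c + 1) * factor, heights[z].length); x++) {
          max = Math.max(max, heights[z][x]);
        }
      }
      row.push(max);
    }
    out.push(row);
  }
  return out;
}
```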

Next Steps

  1. ✅ Move fusion work into the renderer so we can bake multi-layer meshes server-side instead of fetching each slug separately (Dec 2024). Textures still TODO.
  2. Generate textured tiles/LODs so the viewer can switch between lightweight meshes and detailed renders.
  3. Polish TransferZoneService (cooldowns, target-instance aliases, UI feedback) and surface runtime state in the dashboard.
  4. Add invalidation + diffing so fused layer meshes refresh only when their source snapshots change.

Data Flow Overview

  1. World snapshot – WorldDataGenerationService flushes the active editor instance into world-data/<slug>/snapshot/... and writes a snapshot.json descriptor.
  2. Render job – WorldRenderService consumes the snapshot + MapVersion.BlueMapConfig, fuses all configured layers, and writes meshes (render/<layer>/mesh.json), heightfields, PNG previews, and render/manifest.json.
  3. Storage & sync – everything under world-data/<slug> is git-synced. Renderer guarantees deterministic filenames so CI runners can rsync without churn.
  4. Web delivery – /api/world-snapshots/:slug/manifest streams the manifest + signed mesh URLs to the dashboard. Client caches manifests per slug + version to avoid redundant downloads.
  5. Live overlay – LiveWebSocketRelay tags each PlayerPresence with the slug (via WorldIdentifierUtil). useLiveWorldStore keeps a per-slug registry so MapMeshViewer can render only relevant players.
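
A sketch of the client-side cache mentioned in step 4, keyed by slug + version so repeated viewer mounts never refetch the same manifest; the version query parameter is an assumption:

```typescript
// Cache manifests per slug + version; failed fetches are evicted so retries work.
const manifestCache = new Map<string, Promise<unknown>>();

export function loadManifest(slug: string, version: string): Promise<unknown> {
  const key = `${slug}@${version}`;
  let cached = manifestCache.get(key);
  if (!cached) {
    cached = fetch(`/api/world-snapshots/${slug}/manifest?version=${encodeURIComponent(version)}`)
      .then(r => {
        if (!r.ok) throw new Error(`manifest request failed: ${r.status}`);
        return r.json();
      });
    cached.catch(() => manifestCache.delete(key)); // do not cache failures
    manifestCache.set(key, cached);
  }
  return cached;
}
```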

Renderer Backlog

  • [x] Snapshot listener hook in WorldDataGenerationService that runs WorldRenderService after every editor snapshot and surfaces progress to the debug sidebar.
  • [x] Layer fusion with offset/color metadata and persisted manifest entries.
  • [x] Transfer-zone overlay export (geo + metadata) so the viewer can tint/label them.
  • [x] Heightfield + PNG quicklook assets to power the heatmap preview in /maps.
  • [ ] Textured mesh export that samples from the active resource pack.
  • [ ] Incremental chunk diffing: detect changed regions and only rebuild affected meshes/tiles.
  • [ ] Secondary format export (glTF) for external tooling + possible CDN pre-render.
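
For the incremental diffing backlog item, a plausible starting point is to hash the snapshot's region files and rebuild only those whose digest changed since the last render; the stored-hash format is an assumption:

```typescript
import { createHash } from 'node:crypto';
import { readFileSync, readdirSync } from 'node:fs';
import { join } from 'node:path';

// Hash every Anvil region file (.mca) in the snapshot and compare against the
// hashes recorded during the previous render; only changed files need remeshing.
export function changedRegions(
  regionDir: string,
  previousHashes: Record<string, string>,
): { changed: string[]; hashes: Record<string, string> } {
  const changed: string[] = [];
  const hashes: Record<string, string> = {};
  for (const file of readdirSync(regionDir).filter(f => f.endsWith('.mca'))) {
    const digest = createHash('sha256')
      .update(readFileSync(join(regionDir, file)))
      .digest('hex');
    hashes[file] = digest;
    if (previousHashes[file] !== digest) changed.push(file);
  }
  return { changed, hashes };
}
```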

Viewer & Dashboard Tasks

  • [x] Map detail view fetches manifest + mesh, exposes toggles for zones, layers, and LODs.
  • [x] MapMeshViewer renders fused mesh, transfer zones, manifest locations, and live players (via useLiveWorldStore).
  • [x] Viewer respects per-version settings (viewer.showPlayers, defaultLayerVisibility, etc.).
  • [x] /maps wires live WebSocket connect/disconnect lifecycle so inactive tabs stop streaming.
  • [ ] Add inspector overlays (hover/click) for players + locations (show UUID, layer, instance).
  • [ ] Provide “jump to player/zone” shortcuts + camera bookmark persistence per admin.
  • [ ] Integrate progress bars/status cards for the renderer + snapshot queue so admins see ETAs inline.
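
For the planned hover/click inspector, a sketch of a pointer pick against the marker objects; the userData payload (UUID, layer, instance) is an assumed convention for how markers would carry their metadata:

```typescript
import * as THREE from 'three';

// Convert a pointer event to normalized device coordinates, raycast against
// the marker meshes, and return whatever metadata the hit marker carries.
const raycaster = new THREE.Raycaster();
const pointer = new THREE.Vector2();

export function pickMarker(
  event: PointerEvent,
  canvas: HTMLCanvasElement,
  camera: THREE.Camera,
  markers: THREE.Object3D[],
): Record<string, unknown> | undefined {
  const rect = canvas.getBoundingClientRect();
  pointer.set(
    ((event.clientX - rect.left) / rect.width) * 2 - 1,
    -((event.clientY - rect.top) / rect.height) * 2 + 1,
  );
  raycaster.setFromCamera(pointer, camera);
  const hit = raycaster.intersectObjects(markers, true)[0];
  return hit?.object.userData; // e.g. { uuid, layer, instance } set at marker creation
}
```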

Transfer Zones & Instance Routing

  • Transfer zone definitions live inside MapVersion.blueMapConfig.transferZones and mirror to Mongo so runtime + renderer share the same schema.
  • TransferZoneService runs inside each editor instance, watches player positions, logs pending transfers into Mongo for analytics, and calls WorldAccess.transferPlayer after the threshold.
  • Renderer exports the same zones into the manifest; viewer shows polygon overlays and highlights ones currently busy (via live socket data).
  • Upcoming work: zone groups (stacked corridors), richer cooldown config, and UI to preview the target layer combination before saving.
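
A sketch of the containment test the runtime and the renderer overlay could share, assuming zones are stored as axis-aligned boxes as in the config sketch above:

```typescript
// Axis-aligned box containment; the [x, y, z] min/max tuples mirror the
// assumed TransferZone shape from the configuration sketch.
export interface BoxZone {
  min: [number, number, number];
  max: [number, number, number];
}

export function isInsideZone(
  pos: { x: number; y: number; z: number },
  zone: BoxZone,
): boolean {
  return (
    pos.x >= zone.min[0] && pos.x <= zone.max[0] &&
    pos.y >= zone.min[1] && pos.y <= zone.max[1] &&
    pos.z >= zone.min[2] && pos.z <= zone.max[2]
  );
}
```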

Open Questions & Risks

  • Rendering cost – Texturing + high LOD could spike snapshot time; need heuristics or admin controls (per-layer quality, chunk budget).
  • Storage footprint – world-data/<slug> may explode once we keep history. Plan: move stale snapshots to Git LFS or nightly tarballs.
  • Access control – /maps assumes staff token. If we ever expose viewer to builders, we need scoped manifests + signed download URLs.
  • Multi-instance fusion – design still open for mixing runtime-only corridors with snapshot data (maybe stream runtime nodes as overlays rather than baking them).
  • Testing – no automated coverage yet. Add regression harness that snapshots a tiny template world and asserts manifest schema + mesh stats.
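
A sketch of what the regression harness could assert, assuming a tiny fixture snapshot is rendered ahead of time; the fixture path and the meshPath/layers field names are placeholders:

```typescript
import test from 'node:test';
import assert from 'node:assert/strict';
import { readFileSync } from 'node:fs';

// Render the fixture snapshot before running this test, then check that the
// manifest lists layers and that each mesh carries well-formed buffers.
test('fixture snapshot renders a well-formed manifest', () => {
  const manifest = JSON.parse(
    readFileSync('world-data/test-fixture/render/manifest.json', 'utf8'),
  );
  assert.ok(Array.isArray(manifest.layers), 'manifest lists its layers');
  for (const layer of manifest.layers) {
    const mesh = JSON.parse(readFileSync(layer.meshPath, 'utf8'));
    assert.ok(mesh.positions.length % 3 === 0, 'positions are x/y/z triplets');
    assert.ok(mesh.indices.length % 3 === 0, 'indices form whole triangles');
  }
});
```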