Large projects kill AI context. SRT-1 solves this by giving each module its own engine, running on its own port, with its own manifest. Then the Workspace Connector stitches them together, without re-reading a single line.
Forget monolithic indexing. With SRT-1, you split your project into its natural modules (auth, payments, frontend, api-gateway) and drop an SRT-1 engine into each one. Each engine runs on its own local port with complete process isolation. Your AI never sees code it shouldn't.
Drop SRT-1 into any folder. It starts its own engine on a dedicated port, derived from the folder path. No config needed.
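SRT-1's exact derivation isn't spelled out above, but "a port derived from the folder path" usually means hashing the path into a fixed port range. A minimal sketch of that idea, assuming the dynamic port range and a SHA-256 hash (both are illustrative choices, not SRT-1's documented behavior):

```python
import hashlib
from pathlib import Path

def derive_port(folder: str, base: int = 49152, span: int = 16384) -> int:
    """Map a folder's absolute path to a stable port in the dynamic range.

    Hypothetical sketch: hash the resolved path, fold the digest into
    [base, base + span). The same folder always yields the same port,
    so no config file is needed.
    """
    digest = hashlib.sha256(str(Path(folder).resolve()).encode()).digest()
    return base + int.from_bytes(digest[:4], "big") % span
```

Because the port is a pure function of the path, any tool that knows where a module lives can find its engine without a registry.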
When you ask the AI to "fix the login flow," it only sees auth code. No bleed from payments. No confusion from frontend CSS. Pure, isolated intelligence.
Modules don't exist in a vacuum. Auth calls payments. The gateway routes to both. You need a bird's-eye view.
The Workspace Connector acts as a parent orchestrator. It doesn't re-read your code. Instead, it queries each running Sandbox engine on its port, collects their live manifests, and builds a unified cross-module dependency map.
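In outline, the Connector's job is two HTTP-era steps: pull each engine's live manifest off its port, then fold them into one dependency map. A rough sketch of that flow, assuming a hypothetical `/manifest` endpoint returning JSON with `module` and `depends_on` fields (SRT-1's real wire format may differ):

```python
import json
from urllib.request import urlopen

def collect_manifests(ports, host="127.0.0.1"):
    """Query each running engine for its live manifest.

    The /manifest endpoint and its JSON shape are assumptions
    for illustration, not SRT-1's documented API.
    """
    manifests = {}
    for port in ports:
        with urlopen(f"http://{host}:{port}/manifest") as resp:
            m = json.load(resp)
            manifests[m["module"]] = m
    return manifests

def build_dependency_map(manifests):
    """Merge per-module manifests into one cross-module dependency map."""
    return {name: sorted(m.get("depends_on", []))
            for name, m in manifests.items()}
```

No source files are read here: the map is assembled purely from what the engines already know about their own modules.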
Here is the exact terminal output from running the SRT-1 Workspace Connector on the SRT-1 CORE codebase itself, which is composed of 5 distinct modules, each running on its own port.
The Sandbox is free. The Workspace Connector is $9/mo. Both run locally. Zero cloud dependencies.