@brannn
Created February 4, 2026 21:24
Runcell: AI Coding Assistants in Isolated Workspaces

The Problem

AI coding assistants are powerful, but using them today means manual configuration, exposed API keys, and hoping the AI doesn't accidentally touch something it shouldn't. Every new project requires the same tedious setup dance: install the tool, configure the provider, manage the keys, repeat.

Runcell changes this by letting Claude handle all of that for you. When you ask Claude to build something, it can spin up an isolated workspace with a pre-configured AI assistant inside, delegate the work, and report back the results. The complexity happens behind the scenes while you focus on describing what you want.

How It Works

The experience is conversational. You chat with Claude Code in VS Code, and Claude orchestrates everything:

You: Create a Go workspace and use Cline to build me a REST API with user CRUD

Claude: I'll create an isolated workspace with Cline pre-configured, then 
        delegate the task to the inner AI.
        
        [Creates workspace with Cerebras-powered Cline]
        [Runs: cline -y "Create a REST API with user CRUD endpoints"]
        [Reports results back to you]

You: Add authentication to it

Claude: [Runs: cline -y "Add JWT authentication to the API"]
        Done! I've added JWT auth. Here's what changed...

Behind the scenes, Claude is creating Kubernetes pods, injecting credentials securely, configuring AI tools, and running commands—but from your perspective, it's just a conversation about what you want to build.
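To make the "creating pods, injecting credentials" step concrete, here is a sketch of what a workspace pod manifest could look like. This is illustrative only: the manifest Runcell actually generates is not shown in this document, and the image name, secret name, and environment variable are assumptions; only the pod-plus-secret pattern is standard Kubernetes.

```python
import json

def workspace_pod_manifest(workspace_id: str, image: str) -> dict:
    # Hypothetical sketch; field names follow the standard Kubernetes Pod
    # schema, but the image, secret, and env var names are illustrative.
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": workspace_id, "labels": {"app": "runcell-workspace"}},
        "spec": {
            "containers": [{
                "name": "workspace",
                "image": image,
                "workingDir": "/workspace",
                # Credentials arrive via a secret reference, never via chat.
                "env": [{
                    "name": "CEREBRAS_API_KEY",
                    "valueFrom": {"secretKeyRef": {
                        "name": "runcell-credentials",
                        "key": "cerebras",
                    }},
                }],
            }],
        },
    }

manifest = workspace_pod_manifest("ws-a1b2c3", "runcell/go:1.22")
print(json.dumps(manifest, indent=2))
```

The key design point is the `secretKeyRef`: the API key lives in a cluster secret and is mounted into the container's environment, so it never appears in the conversation or the generated code.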

A Complete Example

Here's what a typical session looks like in VS Code:

┌─────────────────────────────────────────────────────────────┐
│  VS Code                                                    │
│  ┌───────────────────────────────────────────────────────┐  │
│  │  Claude Code Chat                                     │  │
│  │                                                       │  │
│  │  You: Build me a web scraper in Go                    │  │
│  │                                                       │  │
│  │  Claude: I'll create a workspace and have the inner   │  │
│  │          AI build that for you.                       │  │
│  │                                                       │  │
│  │          ✓ Workspace created (ws-a1b2c3)              │  │
│  │          ✓ Cline configured with Cerebras             │  │
│  │          ⟳ Inner AI working...                        │  │
│  │                                                       │  │
│  │          Done! Created:                               │  │
│  │          • main.go - CLI entry point                  │  │
│  │          • scraper/scraper.go - Core scraping logic   │  │
│  │          • scraper/parser.go - HTML parsing           │  │
│  │                                                       │  │
│  │  You: Add rate limiting                               │  │
│  │                                                       │  │
│  │  Claude: ⟳ Asking inner AI to add rate limiting...    │  │
│  │                                                       │  │
│  └───────────────────────────────────────────────────────┘  │
└─────────────────────────────────────────────────────────────┘

The workspace itself is a Kubernetes pod running in isolation:

Pod: ws-a1b2c3
├── Go 1.22 runtime
├── Cline (pre-configured with Cerebras API)
└── /workspace (isolated filesystem)
     ├── main.go
     └── scraper/
         ├── scraper.go
         └── parser.go

Why This Matters

Isolation: Each workspace is a separate container. The inner AI can only see and modify files within that container's /workspace directory. Your local machine, credentials, and other projects remain untouched.

Automatic Configuration: When Claude creates a workspace, it can inject API keys and configure AI tools automatically. The inner AI is ready to work the moment the container starts.

Provider Flexibility: Different tasks benefit from different AI providers. Fast inference for quick iterations, high-quality models for complex reasoning. Claude can pick the right tool for the job, or you can specify what you want.

Reproducibility: Workspaces start from known base images with consistent tooling. Share a configuration with your team and everyone gets the same environment.

Common Workflows

Quick Prototyping

You: Create a workspace and build me a REST API in Go with 
     health check and user CRUD. Use fast inference.

Claude: [Creates workspace, configures Cline with Cerebras, runs task]
        
        Done in 45 seconds! Here's what was created...

Code Review

You: In the csv-tool workspace, have the inner AI review 
     converter.py for potential bugs and edge cases.

Claude: [Runs: cline -y "Review converter.py for bugs and edge cases"]
        
        The inner AI found 3 issues:
        1. No handling for empty CSV files
        2. Unicode characters may cause issues
        3. Memory usage could be high for large files
        
        Want me to have it fix these?
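The three findings above map to a small defensive rewrite. The sketch below is hypothetical (the session never shows converter.py's contents), but it illustrates the shape of fix the inner AI would likely apply: guard the empty case, work on decoded text, and stream rows instead of buffering them.

```python
import csv
import io

def iter_records(text: str):
    """Yield each CSV row as a dict, one row at a time."""
    # Issue 1: empty input yields nothing instead of raising.
    if not text.strip():
        return
    # Issue 2: we operate on already-decoded text, so Unicode values
    # pass through untouched.
    # Issue 3: rows are produced lazily, so a large file's rows are
    # never all held in memory at once.
    yield from csv.DictReader(io.StringIO(text))

rows = list(iter_records("name,city\nAda,Londón\n"))
```

A generator keeps the caller in control of memory: code that only needs the first few rows never pays for the rest.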

Iterative Development

You: Add support for YAML output too

Claude: [Runs: cline -y "Add YAML output support to the converter"]
        
        Added! New files:
        - formats/yaml.py
        Updated:
        - cli.py (new --format yaml flag)
        - tests/test_converter.py (YAML tests)
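The `--format` flag the transcript mentions usually comes down to a dispatch table. This sketch is an assumption about cli.py's internals; a real formats/yaml.py would likely use PyYAML, while the minimal emitter here handles only flat records to stay dependency-free.

```python
import json

def to_yaml(records):
    # Minimal YAML emitter for flat key/value records; a real
    # implementation would use PyYAML instead.
    lines = []
    for rec in records:
        for i, (key, value) in enumerate(rec.items()):
            prefix = "- " if i == 0 else "  "
            lines.append(f"{prefix}{key}: {value}")
    return "\n".join(lines) + "\n"

FORMATS = {
    "json": lambda records: json.dumps(records, indent=2),
    "yaml": to_yaml,
}

def render(records, fmt="json"):
    # A --format yaml flag would route through a table like this.
    return FORMATS[fmt](records)

print(render([{"name": "Ada", "id": 1}], fmt="yaml"))
```

New output formats then become one-line additions to the table rather than edits to the CLI logic.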

Debugging

You: The JSON output has extra whitespace. Fix it.

Claude: [Runs: cline -y "Fix: JSON output has extra whitespace"]
        
        Fixed! The issue was in converter.py line 42.
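For a Python converter, "extra whitespace in JSON output" is almost always the `json.dumps` default separators, which insert a space after every `,` and `:`. The converter.py internals aren't shown here, but the usual one-line fix looks like this:

```python
import json

data = [{"name": "Ada", "id": 1}]

# Default separators are (", ", ": "), which pad every delimiter.
loose = json.dumps(data)                           # '[{"name": "Ada", "id": 1}]'
# Compact separators remove the padding.
compact = json.dumps(data, separators=(",", ":"))  # '[{"name":"Ada","id":1}]'

print(compact)
```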

Available AI Tools

The workspace images include several pre-installed AI coding assistants:

Tool          Description
Cline         General-purpose assistant, works with any OpenAI-compatible provider
Claude Code   Anthropic's official assistant
OpenCode      Open-source alternative
Auggie        Augment Code's assistant

Claude selects which one to use based on your request and the configured providers, though you can always specify a preference.

Provider Options

Cerebras offers extremely fast inference—useful for quick iterations and code generation where speed matters more than depth.

Anthropic provides direct Claude access for tasks requiring complex reasoning or careful analysis.

OpenRouter gives access to a wide variety of models with pay-per-use pricing.

Any OpenAI-compatible API works as well.
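"OpenAI-compatible" means the provider accepts the same chat-completions request shape at a different base URL. The sketch below builds such a request with the standard library; the endpoint URL and model name are illustrative assumptions, and the bearer token placeholder stands in for a key injected from the credential store.

```python
import json
import urllib.request

# Hypothetical endpoint and model name: swapping providers is a matter of
# changing base_url, the model id, and the API key.
base_url = "https://api.cerebras.ai/v1"
payload = {
    "model": "llama-3.3-70b",  # model ids are provider-specific
    "messages": [{"role": "user", "content": "Add a health check endpoint"}],
}
req = urllib.request.Request(
    f"{base_url}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer $API_KEY",  # injected, never hard-coded
    },
)
# urllib.request.urlopen(req) would send it; omitted here.
```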

Questions

Do I need to understand Kubernetes? No. The orchestration is invisible. You describe what you want in natural language, and Claude handles the infrastructure.

How are API keys managed? Runcell has a credential store. You configure keys once, and Claude injects them securely when creating workspaces. Keys aren't exposed in chat history or stored in plain text.

Can I see the code that was generated? Yes. Claude shows you what files were created or modified, and you can ask it to display the contents or copy them to your local machine.

What happens if the inner AI makes a mistake? Ask Claude to fix it, or delete the workspace and start over. Since everything is isolated, there's no risk to your local environment.

What languages are supported? Go, Python, and Node.js have pre-built images. Custom images can be created for other languages.


Runcell — Claude orchestrates AI coding assistants in isolated workspaces
