@brannn
Last active February 4, 2026 21:15
Runcell Inner AI Auto-Configuration Workflow

The Vision: Outer Claude Orchestrates Inner AI

The workflow allows outer Claude (running in Claude Code) to create isolated workspaces with pre-configured AI coding assistants, then delegate tasks to the inner AI running inside those workspaces.

How It Works

┌─────────────────────────────────────────────────────────────────────┐
│  Your Machine (Claude Code + MCP)                                   │
│  ┌───────────────────────────────────────────────────────────────┐  │
│  │  Outer Claude                                                 │  │
│  │  - Receives your request                                      │  │
│  │  - Creates workspace with INNER_AI_* env vars                 │  │
│  │  - Injects credentials (CEREBRAS_API_KEY, etc.)               │  │
│  │  - Runs `cline -y "prompt"` inside workspace                  │  │
│  │  - Reports results back to you                                │  │
│  └───────────────────────────────────────────────────────────────┘  │
│                              │                                      │
│                              ▼                                      │
│  ┌───────────────────────────────────────────────────────────────┐  │
│  │  cell MCP Server (K8s)                                        │  │
│  │  - create_workspace with environment + credentials            │  │
│  │  - execute_command to run inner AI                            │  │
│  └───────────────────────────────────────────────────────────────┘  │
└─────────────────────────────────────────────────────────────────────┘
                               │
                               ▼
┌─────────────────────────────────────────────────────────────────────┐
│  K8s Workspace Pod (runcell-workspaces namespace)                   │
│  ┌───────────────────────────────────────────────────────────────┐  │
│  │  Container Startup:                                           │  │
│  │  1. workspace-entrypoint runs                                 │  │
│  │  2. configure-inner-ai detects INNER_AI_* vars                │  │
│  │  3. Auto-configures cline/claude/opencode                     │  │
│  │  4. Container ready with pre-configured AI                    │  │
│  └───────────────────────────────────────────────────────────────┘  │
│                                                                     │
│  Environment Variables:                                             │
│  - INNER_AI_TOOL=cline                                              │
│  - INNER_AI_PROVIDER=openai                                         │
│  - INNER_AI_API_KEY=$CEREBRAS_API_KEY                               │
│  - INNER_AI_MODEL=qwen-3-32b                                        │
│  - INNER_AI_BASE_URL=https://api.cerebras.ai/v1                     │
│  - CEREBRAS_API_KEY=csk-xxxxx (injected credential)                 │
└─────────────────────────────────────────────────────────────────────┘

Complete End-to-End Example

User Request to Outer Claude

User: Create a Go workspace and use the inner AI to write a REST API 
      server with health check and user CRUD endpoints.

Outer Claude's Actions

Step 1: Create workspace with inner AI pre-configured

# MCP tool call: create_workspace
{
  "name": "rest-api-project",
  "language": "go",
  "inject_credentials": ["CEREBRAS_API_KEY"],
  "environment": {
    "INNER_AI_TOOL": "cline",
    "INNER_AI_PROVIDER": "openai",
    "INNER_AI_API_KEY": "$CEREBRAS_API_KEY",
    "INNER_AI_MODEL": "qwen-3-32b",
    "INNER_AI_BASE_URL": "https://api.cerebras.ai/v1"
  }
}

Step 2: Delegate task to inner AI

# MCP tool call: execute_command
{
  "workspace_id": "abc12345",
  "command": "cline -y \"Create a REST API server in Go with the following:
    1. Health check endpoint at GET /health
    2. User CRUD endpoints:
       - GET /users - list all users
       - GET /users/:id - get user by ID
       - POST /users - create user
       - PUT /users/:id - update user
       - DELETE /users/:id - delete user
    3. Use chi router
    4. Store users in memory (map)
    5. Include proper error handling
    Create all files needed and ensure it compiles.\""
}

Step 3: Verify the result

# MCP tool call: execute_command
{
  "workspace_id": "abc12345",
  "command": "go build -o server . && echo 'Build successful!'"
}

Step 4: Report back to user

Outer Claude: I've created the REST API server. The inner AI generated:
- main.go with the server setup
- handlers/user.go with CRUD handlers  
- models/user.go with the User struct

The project compiles successfully. Would you like me to run it or make any changes?

Container Startup Flow

When the workspace pod starts:

[workspace-entrypoint] Starting...
[configure-inner-ai] Detected INNER_AI_TOOL=cline
[configure-inner-ai] Configuring cline with provider=openai model=qwen-3-32b
[configure-inner-ai] Resolving API key from $CEREBRAS_API_KEY
[configure-inner-ai] Running: cline auth --provider openai --apikey *** --modelid qwen-3-32b --baseurl https://api.cerebras.ai/v1
[configure-inner-ai] cline configured successfully
[workspace-entrypoint] Executing: sleep infinity
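The detection step in that log can be sketched as shell logic like the following. This is a minimal sketch assuming a POSIX shell; the real configure-inner-ai script ships in the workspace image and may differ, and the function name here is illustrative. The accepted tool names are the ones listed under Supported Environment Variables below.

```shell
# Hypothetical simplification of configure-inner-ai's detection step.
configure_inner_ai_detect() {
    # Exit silently when no tool is requested (matches the script's
    # documented behavior of doing nothing without INNER_AI_TOOL)
    [ -n "$INNER_AI_TOOL" ] || return 0

    case "$INNER_AI_TOOL" in
        cline|claude|opencode|auggie)
            echo "[configure-inner-ai] Detected INNER_AI_TOOL=$INNER_AI_TOOL"
            ;;
        *)
            echo "[configure-inner-ai] Unsupported tool: $INNER_AI_TOOL" >&2
            return 1
            ;;
    esac
}
```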

Cline Configuration Result

After auto-configuration, cline's settings are saved to ~/.cline/data/:

// globalState.json
{
  "actModeApiProvider": "openai",
  "planModeApiProvider": "openai",
  "actModeOpenAiModelId": "qwen-3-32b",
  "planModeOpenAiModelId": "qwen-3-32b",
  "openAiBaseUrl": "https://api.cerebras.ai/v1",
  "welcomeViewCompleted": true
}

// secrets.json
{
  "openAiApiKey": "<injected-api-key>"
}
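Outer Claude can spot-check that configuration landed correctly with a quick grep over the generated file. The helper below is illustrative (not part of the workspace image) and assumes the ~/.cline/data/globalState.json layout shown above.

```shell
# Hypothetical helper: extract the configured act-mode model id
# from cline's globalState.json without requiring jq.
check_cline_model() {
    # $1 = path to globalState.json
    grep -o '"actModeOpenAiModelId": "[^"]*"' "$1" | cut -d'"' -f4
}
```

A result of `qwen-3-32b` confirms the auto-configuration took effect.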

Supported Environment Variables

Variable            Description                          Example
------------------  -----------------------------------  --------------------------------------
INNER_AI_TOOL       Which tool to configure              cline, claude, opencode, auggie
INNER_AI_PROVIDER   API provider                         openai, anthropic, openrouter
INNER_AI_API_KEY    API key (can reference other vars)   $CEREBRAS_API_KEY, sk-xxx
INNER_AI_MODEL      Model identifier                     qwen-3-32b, claude-sonnet-4-5-20250929
INNER_AI_BASE_URL   Custom API endpoint                  https://api.cerebras.ai/v1
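Before configuring anything, the variable contract above can be validated up front. This sketch assumes a POSIX shell; the variable names come from the table, but the exact validation policy (which variables are hard requirements) is an assumption, not documented behavior.

```shell
# Hypothetical pre-flight check for the INNER_AI_* contract.
validate_inner_ai_env() {
    # INNER_AI_TOOL is treated as the only strictly required variable
    [ -n "$INNER_AI_TOOL" ] || { echo "INNER_AI_TOOL not set" >&2; return 1; }
    # Provider-backed tools also need a provider, key, and model
    for var in INNER_AI_PROVIDER INNER_AI_API_KEY INNER_AI_MODEL; do
        eval "val=\$$var"
        [ -n "$val" ] || { echo "$var not set" >&2; return 1; }
    done
    echo "inner AI environment looks complete"
}
```

INNER_AI_BASE_URL is intentionally left optional here, since it only applies to OpenAI-compatible custom endpoints.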

Provider Examples

Cerebras (fast inference)

INNER_AI_TOOL=cline
INNER_AI_PROVIDER=openai
INNER_AI_API_KEY=$CEREBRAS_API_KEY
INNER_AI_MODEL=qwen-3-32b
INNER_AI_BASE_URL=https://api.cerebras.ai/v1

OpenRouter

INNER_AI_TOOL=cline
INNER_AI_PROVIDER=openrouter
INNER_AI_API_KEY=$OPENROUTER_API_KEY
INNER_AI_MODEL=anthropic/claude-3.5-sonnet

Direct Anthropic

INNER_AI_TOOL=cline
INNER_AI_PROVIDER=anthropic
INNER_AI_API_KEY=$ANTHROPIC_API_KEY
INNER_AI_MODEL=claude-sonnet-4-5-20250929

Inner AI Command Patterns

Basic prompt execution

cline -y "Your prompt here"

The -y flag enables non-interactive mode (auto-accepts all actions).

Complex multi-step task

cline -y "Step 1: Create the project structure.
Step 2: Implement the core logic.
Step 3: Add tests.
Step 4: Ensure everything compiles and tests pass."

Code review

cline -y "Review the code in main.go and suggest improvements for:
- Error handling
- Performance
- Code organization"

Bug fixing

cline -y "There's a bug where users can't be deleted. 
Debug and fix the issue in handlers/user.go"

Implementation Details

configure-inner-ai script

The script runs at container startup and:

  1. Checks for INNER_AI_TOOL environment variable
  2. Resolves API key references (e.g., $CEREBRAS_API_KEY → actual value)
  3. Runs the appropriate auth command for the selected tool
  4. Exits silently if no configuration is requested
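Step 2, the API key reference resolution, can be sketched as follows. This assumes a POSIX shell and that references are written literally as "$VARNAME" (as in the INNER_AI_API_KEY=$CEREBRAS_API_KEY examples above); the function name is illustrative.

```shell
# Hypothetical sketch of API key reference resolution:
# "$CEREBRAS_API_KEY" resolves to that variable's value,
# while a literal key like "sk-xxx" passes through unchanged.
resolve_api_key() {
    key="$1"
    case "$key" in
        \$*)
            # Strip the leading '$' and expand the named variable indirectly
            ref="${key#\$}"
            eval "echo \"\$$ref\""
            ;;
        *)
            # Already a literal key
            echo "$key"
            ;;
    esac
}
```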

workspace-entrypoint script

#!/bin/bash
# Run inner AI configuration if available
if [[ -x /usr/local/bin/configure-inner-ai ]]; then
    /usr/local/bin/configure-inner-ai || true
fi
# Execute the main command
exec "$@"

Generated with Claude Code - Runcell Inner AI Auto-Configuration
