graph TB
subgraph "Client Layer"
User[👤 User/Client]
Browser[🌐 Web Browser]
end
subgraph "Vercel Deployment Platform"
Vercel[☁️ Vercel Serverless]
VercelConfig[vercel.json<br/>Routes: / → /api/index.py]
end
subgraph "FastAPI Application (api/index.py)"
FastAPIApp[🚀 FastAPI App]
subgraph "Core Endpoints (Part 1)"
ChatEndpoint[POST /chat/angel<br/>Angel Chat Endpoint]
end
subgraph "Advanced Endpoints (Part 2)"
QuestionsForm[GET /questions<br/>Questions Form]
SubmitQuestions[POST /submit-questions<br/>Save Responses]
end
subgraph "Auto-Generated Endpoints"
DocsEndpoint[GET /docs<br/>Swagger UI]
RootEndpoint[GET /<br/>Root Endpoint]
end
subgraph "Internal Components"
PersonaHandler[Angel Persona Handler<br/>System Prompt Builder]
FileHandler[File Handler<br/>Read/Write student_responses.txt]
PromptBuilder[Prompt Builder<br/>Context + User Message]
end
end
subgraph "External Services"
OpenAI[🤖 OpenAI API<br/>GPT Model]
end
subgraph "Storage"
EnvFile[.env<br/>OPENAI_API_KEY]
ResponseFile[student_responses.txt<br/>Student Answers]
end
%% User interactions
User --> Browser
Browser -->|HTTP Requests| Vercel
%% Vercel routing
Vercel --> VercelConfig
VercelConfig --> FastAPIApp
%% Core endpoint flow
FastAPIApp --> ChatEndpoint
ChatEndpoint --> PersonaHandler
PersonaHandler --> FileHandler
FileHandler -->|Read if exists| ResponseFile
PersonaHandler --> PromptBuilder
PromptBuilder -->|Build System Prompt| OpenAI
OpenAI -->|Angel Response| PromptBuilder
PromptBuilder --> ChatEndpoint
ChatEndpoint -->|JSON Response| Browser
%% Advanced endpoints flow
FastAPIApp --> QuestionsForm
QuestionsForm -->|HTML Form| Browser
Browser -->|POST Form Data| SubmitQuestions
SubmitQuestions --> FileHandler
FileHandler -->|Write| ResponseFile
SubmitQuestions -->|Success Response| Browser
%% Auto-generated endpoints
FastAPIApp --> DocsEndpoint
FastAPIApp --> RootEndpoint
%% Configuration
FastAPIApp -->|Read| EnvFile
EnvFile -->|OPENAI_API_KEY authenticates| OpenAI
%% Styling
classDef endpoint fill:#e1f5ff,stroke:#01579b,stroke-width:2px
classDef service fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef storage fill:#f3e5f5,stroke:#4a148c,stroke-width:2px
classDef component fill:#e8f5e9,stroke:#1b5e20,stroke-width:2px
class ChatEndpoint,QuestionsForm,SubmitQuestions,DocsEndpoint,RootEndpoint endpoint
class OpenAI service
class EnvFile,ResponseFile storage
class PersonaHandler,FileHandler,PromptBuilder component
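The `vercel.json` node in the architecture above could look like the following minimal sketch. The exact `builds`/`routes` entries are assumptions (legacy Vercel config style with the `@vercel/python` runtime); check them against the actual deployment file:

```json
{
  "builds": [
    { "src": "api/index.py", "use": "@vercel/python" }
  ],
  "routes": [
    { "src": "/(.*)", "dest": "/api/index.py" }
  ]
}
```

This routes every incoming path to the single FastAPI entry point, which then dispatches to `/chat/angel`, `/questions`, `/submit-questions`, `/docs`, and `/`.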
sequenceDiagram
participant User
participant Browser
participant Vercel
participant FastAPI
participant FileHandler
participant OpenAI
Note over User,OpenAI: Part 1: Basic Angel Chat Flow
User->>Browser: Enter message
Browser->>Vercel: POST /chat/angel<br/>{message: "Hello"}
Vercel->>FastAPI: Route to api/index.py
FastAPI->>FastAPI: Build Angel persona prompt
FastAPI->>OpenAI: API Call with:<br/>- System: Angel persona<br/>- User: message
OpenAI-->>FastAPI: Angel response
FastAPI-->>Browser: JSON {response: "..."}
Browser-->>User: Display Angel response
Note over User,OpenAI: Part 2: Advanced Flow with Context
User->>Browser: Visit /questions
Browser->>Vercel: GET /questions
Vercel->>FastAPI: Route request
FastAPI-->>Browser: HTML Form (4 questions)
User->>Browser: Submit answers
Browser->>Vercel: POST /submit-questions<br/>{q1, q2, q3, q4}
Vercel->>FastAPI: Route request
FastAPI->>FileHandler: Save to student_responses.txt
FileHandler-->>FastAPI: Success
FastAPI-->>Browser: Success response
User->>Browser: Chat with Angel
Browser->>Vercel: POST /chat/angel<br/>{message: "Help me"}
Vercel->>FastAPI: Route request
FastAPI->>FileHandler: Read student_responses.txt
FileHandler-->>FastAPI: Student context
FastAPI->>FastAPI: Build enhanced prompt:<br/>Angel persona + Student context
FastAPI->>OpenAI: API Call with context
OpenAI-->>FastAPI: Personalized Angel response
FastAPI-->>Browser: JSON {response: "..."}
Browser-->>User: Display personalized response
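The Part 2 sequence above can be sketched in plain Python. The function name `chat_with_angel`, the persona text, and the injected `call_model` callable are illustrative assumptions; injecting the model call lets the read-context → build-prompt → call-model flow be shown (and exercised) without a live OpenAI key:

```python
from pathlib import Path
from typing import Callable, Optional

# Placeholder persona text; the real system prompt lives in api/index.py.
ANGEL_PERSONA = "You are Angel, a friendly study mentor."

def chat_with_angel(
    message: str,
    call_model: Callable[[str, str], str],  # (system_prompt, user_message) -> reply
    responses_path: Path = Path("student_responses.txt"),
) -> dict:
    """Mirror the sequence diagram: read context if present, build the
    prompt, call the model, and return the JSON-shaped response body."""
    context: Optional[str] = (
        responses_path.read_text() if responses_path.exists() else None
    )
    system_prompt = ANGEL_PERSONA
    if context:
        system_prompt += f"\n\nStudent background:\n{context}"
    reply = call_model(system_prompt, message)
    return {"response": reply}
```

In the real endpoint, `call_model` would wrap an OpenAI chat-completions call; in tests, a stub that echoes its inputs is enough to verify the context-handling branch.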
graph LR
subgraph "FastAPI Endpoints"
A[POST /chat/angel<br/>Required<br/>Chat with Angel]
B[GET /questions<br/>Advanced<br/>Show form]
C[POST /submit-questions<br/>Advanced<br/>Save answers]
D[GET /docs<br/>Auto-generated<br/>Swagger UI]
E[GET /<br/>Auto-generated<br/>Root]
end
A -->|Uses| F[Angel Persona]
A -->|Reads| G[student_responses.txt]
A -->|Calls| H[OpenAI API]
C -->|Writes| G
style A fill:#4caf50,color:#fff
style B fill:#ff9800,color:#fff
style C fill:#ff9800,color:#fff
style D fill:#9e9e9e,color:#fff
style E fill:#9e9e9e,color:#fff
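The write path shown for `POST /submit-questions` (node C above) can be sketched as a small helper. The field names `q1`–`q4` come from the diagrams; the one-answer-per-line formatting is an assumption:

```python
from pathlib import Path

def save_responses(answers: dict, path: Path = Path("student_responses.txt")) -> None:
    """Persist the four form answers so /chat/angel can read them later."""
    lines = [f"{key}: {answers[key]}" for key in ("q1", "q2", "q3", "q4")]
    path.write_text("\n".join(lines) + "\n")
```

Any later `GET`-side reader only needs `path.read_text()`, which is why the same `student_responses.txt` node appears on both the read and write edges.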
flowchart TD
Start([User Request]) --> Route{Request Type}
Route -->|GET /questions| ShowForm[Display 4-Question Form]
ShowForm --> End1([Return HTML])
Route -->|POST /submit-questions| SaveData[Receive Q1-Q4 Answers]
SaveData --> WriteFile[Write to student_responses.txt]
WriteFile --> End2([Return Success])
Route -->|POST /chat/angel| GetMessage[Receive User Message]
GetMessage --> CheckFile{student_responses.txt<br/>exists?}
CheckFile -->|Yes| ReadFile[Read Student Context]
CheckFile -->|No| BuildBasicPrompt[Build Basic Angel Prompt]
ReadFile --> BuildEnhancedPrompt[Build Enhanced Prompt:<br/>Angel Persona + Student Context]
BuildBasicPrompt --> CallOpenAI[Call OpenAI API]
BuildEnhancedPrompt --> CallOpenAI
CallOpenAI --> GetResponse[Receive Angel Response]
GetResponse --> End3([Return JSON Response])
style ShowForm fill:#fff3e0
style SaveData fill:#fff3e0
style WriteFile fill:#fff3e0
style GetMessage fill:#e1f5ff
style ReadFile fill:#e1f5ff
style BuildEnhancedPrompt fill:#e1f5ff
style CallOpenAI fill:#ffebee
style GetResponse fill:#e8f5e9
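The CheckFile branch in the flowchart reduces to one small prompt-building function. The persona wording is a placeholder; the Yes/No branches map directly onto the diagram's ReadFile/BuildBasicPrompt paths:

```python
from pathlib import Path

# Placeholder persona; the real wording lives in api/index.py.
ANGEL_PERSONA = "You are Angel, a friendly study mentor."

def build_prompt(responses_path: Path = Path("student_responses.txt")) -> str:
    """Basic Angel prompt if the file is missing, enhanced prompt if it exists."""
    if responses_path.exists():  # CheckFile: Yes -> ReadFile -> BuildEnhancedPrompt
        context = responses_path.read_text()
        return f"{ANGEL_PERSONA}\n\nStudent context:\n{context}"
    return ANGEL_PERSONA  # CheckFile: No -> BuildBasicPrompt
```

Either branch then feeds the same CallOpenAI step, which is why both arrows converge on one node in the flowchart.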