This is a slow walkthrough of OpenAI tool calling using only raw API calls and PowerShell.
The point:
- No agent framework is required.
- No harness is required.
- The model does not execute your code.
- The model asks for a tool call.
- Your code runs the tool.
- Your code sends the tool result back.
- The model uses that result to write the final answer.
You can use the same pattern with mocked data, a database, an internal service, or a real public API.
This walkthrough starts with the smallest possible idea, then builds up to a real weather API call using Open-Meteo.
A tool definition is just a contract.
You tell the model:
- the tool name
- what the tool does
- what arguments it accepts
- what JSON schema those arguments must follow
The model can then return something like:
{
"type": "function_call",
"name": "get_weather",
"arguments": "{\"location\":\"Cincinnati, Ohio\"}",
"call_id": "call_..."
}
That is not the weather.
That is the model saying:
I need you, the application, to run get_weather with this argument.
Your code is responsible for everything after that.
This is a PowerShell object that defines one tool named get_weather.
$tools = @(
@{
type = "function"
name = "get_weather"
description = "Get current weather for a location."
parameters = @{
type = "object"
properties = @{
location = @{
type = "string"
description = "City and state/country, such as Cincinnati, Ohio."
}
}
required = @("location")
additionalProperties = $false
}
strict = $true
}
)
Nothing has run yet.
This is just a schema.
This request asks the model to use the tool.
The tool_choice field forces the model to call get_weather instead of answering directly.
$ErrorActionPreference = "Stop"
if ([string]::IsNullOrWhiteSpace($env:OPENAI_API_KEY)) {
throw "Missing required environment variable: OPENAI_API_KEY"
}
function Invoke-OpenAIJson {
param(
[Parameter(Mandatory = $true)]
[hashtable] $Body
)
$json = $Body | ConvertTo-Json -Depth 50
$bytes = [System.Text.Encoding]::UTF8.GetBytes($json)
Invoke-RestMethod `
-Method Post `
-Uri "https://api.openai.com/v1/responses" `
-Headers @{ Authorization = "Bearer $env:OPENAI_API_KEY" } `
-ContentType "application/json; charset=utf-8" `
-Body $bytes
}
$r1 = Invoke-OpenAIJson -Body @{
model = "gpt-5.5"
input = "Use the get_weather tool for Cincinnati, Ohio."
tool_choice = @{
type = "function"
name = "get_weather"
}
tools = $tools
}
Why the Invoke-OpenAIJson helper?
Windows PowerShell can send string request bodies with the wrong encoding, especially once a JSON payload contains non-ASCII text. This helper always sends UTF-8 JSON bytes.
The first response should contain a function_call.
$fc = $r1.output | Where-Object type -eq "function_call" | Select-Object -First 1
if (-not $fc) {
throw "The model did not return a function_call. Full response: $($r1 | ConvertTo-Json -Depth 20)"
}
$fc
You should see the tool name and arguments.
$weatherArgs = $fc.arguments | ConvertFrom-Json
$weatherArgs.location
At this point, the model has not fetched weather.
It only produced a structured request for your code to fetch weather.
Before calling a real API, prove the loop with fake data.
$weather = @{
location = $weatherArgs.location
temperature = "72F"
conditions = "Partly cloudy"
} | ConvertTo-Json -Compress
This is the whole "tool execution" step in mocked form.
No harness did it.
Your code did it.
Now send the result back to OpenAI as function_call_output.
The important fields are:
- previous_response_id: links this request to the first response
- call_id: must match the model's tool call
- output: must be a string, usually compressed JSON
$r2 = Invoke-OpenAIJson -Body @{
model = "gpt-5.5"
previous_response_id = $r1.id
tools = $tools
input = @(
@{
type = "function_call_output"
call_id = $fc.call_id
output = $weather
}
)
}
$r2.output |
Where-Object type -eq "message" |
ForEach-Object { $_.content } |
Where-Object type -eq "output_text" |
ForEach-Object { $_.text }
Expected shape:
Cincinnati, Ohio weather: Partly cloudy, 72F.
That is the full lifecycle:
- Model requests a tool call
- Your code executes the tool
- Your code returns the tool output
- Model writes the final answer
Now replace the fake $weather object with a real API call.
This example uses Open-Meteo:
- no weather API key required for non-commercial use
- geocoding endpoint resolves the city
- forecast endpoint returns current weather
There is still no harness.
The model asks for get_weather.
Your PowerShell code calls the real API.
Copy this whole block into the same PowerShell session.
Only OPENAI_API_KEY is required.
$ErrorActionPreference = "Stop"
if ([string]::IsNullOrWhiteSpace($env:OPENAI_API_KEY)) {
throw "Missing required environment variable: OPENAI_API_KEY"
}
function Invoke-OpenAIJson {
param(
[Parameter(Mandatory = $true)]
[hashtable] $Body
)
$json = $Body | ConvertTo-Json -Depth 50
$bytes = [System.Text.Encoding]::UTF8.GetBytes($json)
Invoke-RestMethod `
-Method Post `
-Uri "https://api.openai.com/v1/responses" `
-Headers @{ Authorization = "Bearer $env:OPENAI_API_KEY" } `
-ContentType "application/json; charset=utf-8" `
-Body $bytes
}
$tools = @(
@{
type = "function"
name = "get_weather"
description = "Get current weather for a location using Open-Meteo public REST APIs. No weather API key is required for non-commercial use."
parameters = @{
type = "object"
properties = @{
location = @{
type = "string"
description = "City and state/country, such as Cincinnati, Ohio."
}
}
required = @("location")
additionalProperties = $false
}
strict = $true
}
)
$r1 = Invoke-OpenAIJson -Body @{
model = "gpt-5.5"
input = "Use the get_weather tool for Cincinnati, Ohio. Summarize the current temperature, wind speed, and weather code."
tool_choice = @{
type = "function"
name = "get_weather"
}
tools = $tools
}
$fc = $r1.output | Where-Object type -eq "function_call" | Select-Object -First 1
if (-not $fc) {
throw "The model did not return a function_call. Full response: $($r1 | ConvertTo-Json -Depth 20)"
}
$weatherArgs = $fc.arguments | ConvertFrom-Json
# Open-Meteo geocoding searches place names, not full "City, State" strings.
# For "Cincinnati, Ohio", search "Cincinnati", then prefer an admin/state match.
$locationParts = @(
@($weatherArgs.location -split ",") |
ForEach-Object { $_.Trim() } |
Where-Object { $_ }
)
$placeName = $locationParts[0]
$regionHint = if ($locationParts.Count -gt 1) { $locationParts[1] } else { $null }
$encodedPlaceName = [System.Uri]::EscapeDataString($placeName)
$geoUri = "https://geocoding-api.open-meteo.com/v1/search?name=$encodedPlaceName&count=10&language=en&format=json"
$geoResult = Invoke-RestMethod -Method Get -Uri $geoUri
if (-not $geoResult.results) {
throw "Open-Meteo geocoding returned no result for location: $($weatherArgs.location)"
}
$places = @($geoResult.results)
$place = $places | Where-Object {
$regionHint -and (
$_.admin1 -eq $regionHint -or
$_.country -eq $regionHint -or
$_.country_code -eq $regionHint
)
} | Select-Object -First 1
if (-not $place) {
$place = $places | Select-Object -First 1
}
$lat = [string]::Format([System.Globalization.CultureInfo]::InvariantCulture, "{0}", $place.latitude)
$lon = [string]::Format([System.Globalization.CultureInfo]::InvariantCulture, "{0}", $place.longitude)
$timezone = [System.Uri]::EscapeDataString($place.timezone)
$forecastUri = "https://api.open-meteo.com/v1/forecast?latitude=$lat&longitude=$lon&current=temperature_2m,weather_code,wind_speed_10m&temperature_unit=fahrenheit&wind_speed_unit=mph&timezone=$timezone"
$forecast = Invoke-RestMethod -Method Get -Uri $forecastUri
$toolOutput = [string](@{
source = "Open-Meteo"
resolvedLocation = @{
name = $place.name
admin1 = $place.admin1
country = $place.country
latitude = $place.latitude
longitude = $place.longitude
timezone = $place.timezone
}
forecast = $forecast
} | ConvertTo-Json -Depth 50 -Compress)
if ([string]::IsNullOrWhiteSpace($toolOutput)) {
throw "Open-Meteo returned no JSON output; refusing to send an empty function_call_output back to OpenAI."
}
$r2 = Invoke-OpenAIJson -Body @{
model = "gpt-5.5"
previous_response_id = $r1.id
tools = $tools
input = @(
@{
type = "function_call_output"
call_id = $fc.call_id
output = $toolOutput
}
)
}
$r2.output |
Where-Object type -eq "message" |
ForEach-Object { $_.content } |
Where-Object type -eq "output_text" |
ForEach-Object { $_.text }
Example final answer shape:
Current weather in Cincinnati, Ohio:
- Temperature: 62.5F
- Wind speed: 3.8 mph
- Weather code: 3
The real version follows the same four phases as the mock version.
- The model returned a function_call for get_weather
- PowerShell geocoded the city with Open-Meteo
- PowerShell fetched current weather from Open-Meteo
- PowerShell sent that JSON back as function_call_output
- The model wrote the final natural-language answer
The model never called Open-Meteo.
The model never executed PowerShell.
The model only asked for a tool call.
Your code executed the real API call.
That is the core idea.
If geocoding or forecast lookup fails, stop.
Do not send an empty or null function_call_output back to OpenAI.
The tool output must be a string.
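One way to enforce both rules in one place is a small guard run before building the second request. This is a sketch, not part of any API; Assert-ToolOutput is a hypothetical helper name.

```powershell
# Hypothetical guard: refuse to send an empty, null, or non-string tool output.
function Assert-ToolOutput {
    param([object] $Output)
    if ($null -eq $Output -or [string]::IsNullOrWhiteSpace([string]$Output)) {
        throw "function_call_output must be a non-empty string."
    }
    if ($Output -isnot [string]) {
        throw "function_call_output must be a string, got: $($Output.GetType().FullName)"
    }
    $Output
}

# Usage: validate just before the follow-up request.
# output = Assert-ToolOutput $toolOutput
```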
This can fail:
curl.exe ... -d '{"model":"gpt-5.5", ... }'
PowerShell can split or mangle the JSON before curl.exe receives it.
Prefer Invoke-RestMethod with ConvertTo-Json.
If you must use curl.exe, write JSON to a file and send --data-binary "@file.json".
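A sketch of that file-based workaround, assuming the same OPENAI_API_KEY environment variable; the payload path and body are arbitrary examples.

```powershell
# Write the JSON payload to a file as UTF-8 without BOM, so curl.exe reads it verbatim
# instead of receiving a string that PowerShell has already re-quoted or re-encoded.
$body = @{ model = "gpt-5.5"; input = "Say hello." } | ConvertTo-Json -Depth 50
[System.IO.File]::WriteAllText("$PWD\payload.json", $body, [System.Text.UTF8Encoding]::new($false))

curl.exe -s -X POST "https://api.openai.com/v1/responses" `
    -H "Authorization: Bearer $env:OPENAI_API_KEY" `
    -H "Content-Type: application/json; charset=utf-8" `
    --data-binary "@payload.json"
```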
The Open-Meteo response can include non-ASCII unit strings (for example, degree symbols).
The Invoke-OpenAIJson helper avoids Unicode encoding errors by always sending explicit UTF-8 bytes:
$json = $Body | ConvertTo-Json -Depth 50
$bytes = [System.Text.Encoding]::UTF8.GetBytes($json)
Then send $bytes with:
-ContentType "application/json; charset=utf-8"
You can replace Open-Meteo with anything:
- a database query
- a billing system
- a CRM
- a local script
- a public REST API
- an internal service
The loop stays the same:
OpenAI request with tool schema
-> model returns function_call
-> your code executes the tool
-> your code sends function_call_output
-> model returns final answer
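The loop above can be sketched as one generic routine. This reuses the Invoke-OpenAIJson helper and $tools schema defined earlier; Invoke-Tool is a hypothetical dispatcher, not part of any API.

```powershell
# Hypothetical dispatcher: map tool names to local functions, return a JSON string.
function Invoke-Tool {
    param([string] $Name, [hashtable] $Arguments)
    switch ($Name) {
        "get_weather" { @{ location = $Arguments.location; temperature = "72F" } | ConvertTo-Json -Compress }
        default       { throw "Unknown tool: $Name" }
    }
}

# Keep answering tool calls until the model stops asking for them.
$response = Invoke-OpenAIJson -Body @{
    model = "gpt-5.5"
    input = "What is the weather in Cincinnati, Ohio?"
    tools = $tools
}
while ($true) {
    $calls = @($response.output | Where-Object type -eq "function_call")
    if (-not $calls) { break }   # no tool calls left: $response holds the final answer
    $outputs = foreach ($call in $calls) {
        # ConvertFrom-Json yields a PSCustomObject; copy its properties into a hashtable.
        $toolArgs = @{}
        ($call.arguments | ConvertFrom-Json).PSObject.Properties |
            ForEach-Object { $toolArgs[$_.Name] = $_.Value }
        @{
            type    = "function_call_output"
            call_id = $call.call_id
            output  = [string](Invoke-Tool -Name $call.name -Arguments $toolArgs)
        }
    }
    $response = Invoke-OpenAIJson -Body @{
        model                = "gpt-5.5"
        previous_response_id = $response.id
        tools                = $tools
        input                = @($outputs)
    }
}
```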
No harness required.
Frameworks can automate this.
But the underlying mechanism is just structured API calls plus your own code.