Commit d95eda4

docs(OpenAI): Adjust README with example for response function calling. (#691)
* Update README.md: fix tool call example
* Revert "Update README.md" (reverts commit 3c11934)
* Add function tool example
* Add an example of how to use a model response with a function tool.
1 parent b42fe2f commit d95eda4

File tree

1 file changed: +44 −0 lines changed
README.md

Lines changed: 44 additions & 0 deletions
@@ -212,6 +212,50 @@

```php
$response->usage->totalTokens; // 123
$response->toArray(); // ['id' => 'resp_67ccd2bed1ec8190b14f964abc054267', ...]
```

Create a model response with a function tool.

```php
$response = $client->responses()->create([
    'model' => 'gpt-4o-mini',
    'tools' => [
        [
            'type' => 'function',
            'name' => 'get_temperature',
            'description' => 'Get the current temperature in a given location',
            'parameters' => [
                'type' => 'object',
                'properties' => [
                    'location' => [
                        'type' => 'string',
                        'description' => 'The city and state, e.g. San Francisco, CA',
                    ],
                    'unit' => [
                        'type' => 'string',
                        'enum' => ['celsius', 'fahrenheit'],
                    ],
                ],
                'required' => ['location'],
            ],
        ],
    ],
    'input' => "What is the temperature in Rio Grande do Norte, Brazil?",
]);

foreach ($response->output as $item) {
    if ($item->type === 'function_call') {
        $name = $item->name ?? null;
        $args = json_decode($item->arguments ?? '{}', true) ?: [];

        if ($name === 'get_temperature') {
            // ✅ Call your custom function here with the extracted arguments
            // Example:
            // $temperature = get_temperature($args['location'], $args['unit'] ?? 'celsius');
            // Then, send the result back to the model if needed.
        }
    }
}
```
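The final comment in the loop above ("send the result back to the model") can be sketched as a follow-up request. This is a sketch, not part of the diff: the `function_call_output` input item type and the `previous_response_id` parameter come from the OpenAI Responses API, `get_temperature()` is the hypothetical local function from the example, and the `callId` property name is an assumption based on the SDK's camelCase convention.

```php
// Sketch only: run the hypothetical local function, then return its result
// to the model so it can compose a final natural-language answer.
$temperature = get_temperature($args['location'], $args['unit'] ?? 'celsius');

$followUp = $client->responses()->create([
    'model' => 'gpt-4o-mini',
    // Link this request to the previous response so the model sees its own tool call.
    'previous_response_id' => $response->id,
    'input' => [
        [
            'type' => 'function_call_output',
            'call_id' => $item->callId, // property name assumed; ties the output to the call
            'output' => json_encode(['temperature' => $temperature]),
        ],
    ],
]);

// Inspect $followUp->output for the model's final message.
```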
#### `create streamed`

When you create a Response with `stream` set to `true`, the server will emit server-sent events to the client as the Response is generated. All events and their payloads can be found in the [OpenAI docs](https://platform.openai.com/docs/api-reference/responses-streaming).
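A minimal consumption sketch, assuming the SDK exposes a `createStreamed` method that returns an iterable of event objects (the `event` property name is an assumption; check the SDK for the exact event shape):

```php
// Sketch: stream a response and react to server-sent events as they arrive.
$stream = $client->responses()->createStreamed([
    'model' => 'gpt-4o-mini',
    'input' => 'Write a haiku about the ocean.',
]);

foreach ($stream as $event) {
    // Event names follow the Responses streaming spec,
    // e.g. 'response.output_text.delta' carries incremental text.
    echo $event->event . PHP_EOL;
}
```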
