
Commit f27a99c

tkat and miguelg719 authored
add support for zod v4 (#1282)
Why

Adds support for zod 4 while maintaining backwards compatibility with zod 3.

What changed

- Introduced a zodCompat layer so Stagehand speaks both Zod v3 and v4 natively. Every place that used to import zod/v3, zod-to-json-schema, or poke at schema internals now routes through the same helpers (StagehandZodSchema, InferStagehandSchema, toJsonSchema, etc.). That keeps existing apps on v3 working while letting us upgrade to the v4 ecosystem (see the sketch below).
- Swapped all examples, eval tasks, tests, and LLM clients over to standard zod imports, updated dependencies to accept 3.25.76 || 4.1.8, and dropped zod-to-json-schema. Tool plumbing (Anthropic/Google CUA) and custom clients now produce the right JSON Schema regardless of which Zod flavor the user passes in.
- Tightened typings so structured LLM calls return { data, usage } directly; no more "as" casts throughout inference or the samples.

Manual testing

To make sure nothing regressed, I ran:

- Custom OpenAI client example (schema conversion + structured output).
- Standard extract (including URL injection path).
- Google computer-use agent with a custom tool.
- Anthropic computer-use agent with a custom tool.

All behaved as before.

---------

Co-authored-by: miguel <[email protected]>
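
For orientation, here is a rough, hypothetical sketch of what a version-agnostic conversion helper can look like. The v4 detection via the internal _zod property and the built-in toJSONSchema converter are real Zod v4 features, but the function name, the exact types, and the zod-to-json-schema fallback are illustrative only; the actual helpers in lib/v3/zodCompat may be implemented differently (this PR in fact drops the zod-to-json-schema dependency).

import * as z4 from "zod/v4";
import type { ZodTypeAny as ZodV3Schema } from "zod/v3";
import { zodToJsonSchema } from "zod-to-json-schema";

// Either flavor of schema a caller might hand to Stagehand.
type AnyZodSchema = ZodV3Schema | z4.ZodType;

function toJsonSchemaSketch(schema: AnyZodSchema) {
  if ("_zod" in schema) {
    // Zod v4 instances carry an internal `_zod` bag, and v4 ships its own converter.
    return z4.toJSONSchema(schema);
  }
  // Zod v3 path, shown with zod-to-json-schema purely for illustration;
  // the real zodCompat helper resolves v3 schemas without this dependency.
  return zodToJsonSchema(schema, { $refStrategy: "none" });
}

Every call site then just imports toJsonSchema (and friends such as StagehandZodSchema), as the customOpenAI and langchain diffs below show.
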
1 parent 04d68f9 commit f27a99c

File tree

95 files changed (+1035, -419 lines)

.changeset/dark-pans-carry.md

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+---
+"@browserbasehq/stagehand": patch
+---
+
+Add support for zod 4, while maintaining backwards compatibility for zod 3

packages/core/examples/2048.ts

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 import { Stagehand } from "../lib/v3";
-import { z } from "zod/v3";
+import { z } from "zod";
 
 async function example() {
   console.log("🎮 Starting 2048 bot...");

packages/core/examples/agent-custom-tools.ts

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 /**
  * This example shows how to pass custom tools to stagehand agent (both CUA and non-CUA)
  */
-import { z } from "zod/v3";
+import { z } from "zod";
 import { tool } from "ai";
 import { Stagehand } from "../lib/v3";
 import chalk from "chalk";

packages/core/examples/custom_client_aisdk.ts

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@
  */
 import { Stagehand } from "../lib/v3";
 import { AISdkClient } from "./external_clients/aisdk";
-import { z } from "zod/v3";
+import { z } from "zod";
 import { openai } from "@ai-sdk/openai";
 
 async function example() {

packages/core/examples/custom_client_langchain.ts

Lines changed: 3 additions & 1 deletion
@@ -3,15 +3,17 @@
  *
  * You will need to reference the Langchain Client in /external_clients/langchain.ts
  */
-import { z } from "zod/v3";
+import { z } from "zod";
 import { Stagehand } from "../lib/v3";
 import { LangchainClient } from "./external_clients/langchain";
 import { ChatOpenAI } from "@langchain/openai";
 
 async function example() {
+  // @ts-expect-error Type instantiation is excessively deep and possibly infinite
   const stagehand = new Stagehand({
     env: "BROWSERBASE",
     verbose: 1,
+    // @ts-expect-error Type instantiation is excessively deep and possibly infinite
     llmClient: new LangchainClient(
       new ChatOpenAI({
         model: "gpt-4o",

packages/core/examples/custom_client_openai.ts

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@
  * You will need to reference the Custom OpenAI Client in /external_clients/customOpenAI.ts
  */
 import { Stagehand } from "../lib/v3";
-import { z } from "zod/v3";
+import { z } from "zod";
 import { CustomOpenAIClient } from "./external_clients/customOpenAI";
 import OpenAI from "openai";
 

packages/core/examples/external_clients/customOpenAI.ts

Lines changed: 13 additions & 8 deletions
@@ -11,7 +11,6 @@ import {
   LLMClient,
 } from "../../lib/v3";
 import OpenAI from "openai";
-import { zodResponseFormat } from "openai/helpers/zod";
 import type {
   ChatCompletion,
   ChatCompletionAssistantMessageParam,
@@ -22,10 +21,10 @@ import type {
   ChatCompletionSystemMessageParam,
   ChatCompletionUserMessageParam,
 } from "openai/resources/chat/completions";
-import { z } from "zod/v3";
 import { CreateChatCompletionResponseError } from "../../lib/v3";
+import { StagehandZodSchema, toJsonSchema } from "../../lib/v3/zodCompat";
 
-function validateZodSchema(schema: z.ZodTypeAny, data: unknown) {
+function validateZodSchema(schema: StagehandZodSchema, data: unknown) {
   try {
     schema.parse(data);
     return true;
@@ -83,12 +82,18 @@ export class CustomOpenAIClient extends LLMClient {
       );
     }
 
-    let responseFormat = undefined;
+    let responseFormat:
+      | ChatCompletionCreateParamsNonStreaming["response_format"]
+      | undefined;
     if (options.response_model) {
-      responseFormat = zodResponseFormat(
-        options.response_model.schema,
-        options.response_model.name,
-      );
+      const responseModelSchema = options.response_model.schema;
+      responseFormat = {
+        type: "json_schema",
+        json_schema: {
+          name: options.response_model.name,
+          schema: toJsonSchema(responseModelSchema),
+        },
+      };
     }
 
     /* eslint-disable */
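
To make the intent of the hand-rolled response_format above concrete, here is a hedged comparison with the zodResponseFormat helper that this file stops using. The schema, the "extraction" name, and the assumption that a Zod 3 install is present (so the removed helper still type-checks) are illustrative; only the shape of the new object mirrors the diff.

import { z } from "zod"; // resolves to whichever Zod major the app has installed
import { zodResponseFormat } from "openai/helpers/zod"; // the removed helper, shown only for comparison
import { toJsonSchema } from "../../lib/v3/zodCompat";

// Illustrative schema, not from the codebase.
const schema = z.object({ title: z.string(), price: z.number() });

// Old path: the OpenAI helper did the conversion itself, tying the client to
// whichever Zod major that helper understands.
const before = zodResponseFormat(schema, "extraction");

// New path: a similar top-level shape, but the schema conversion is routed
// through the version-agnostic compat helper (strict mode omitted, matching
// the diff above).
const after = {
  type: "json_schema" as const,
  json_schema: { name: "extraction", schema: toJsonSchema(schema) },
};

console.log({ before, after });
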

packages/core/examples/external_clients/langchain.ts

Lines changed: 3 additions & 4 deletions
@@ -4,14 +4,14 @@ import {
   LLMClient,
   AvailableModel,
 } from "../../lib/v3";
-import { zodToJsonSchema } from "zod-to-json-schema";
 import {
   AIMessage,
   BaseMessageLike,
   HumanMessage,
   SystemMessage,
 } from "@langchain/core/messages";
 import { ChatCompletion } from "openai/resources";
+import { toJsonSchema } from "../../lib/v3/zodCompat";
 
 export class LangchainClient extends LLMClient {
   public type = "langchainClient" as const;
@@ -60,9 +60,8 @@ export class LangchainClient extends LLMClient {
     );
 
     if (options.response_model) {
-      const responseSchema = zodToJsonSchema(options.response_model.schema, {
-        $refStrategy: "none",
-      });
+      // ref strategy no longer needed; this is now the default behavior
+      const responseSchema = toJsonSchema(options.response_model.schema);
       const structuredModel = this.model.withStructuredOutput(responseSchema);
       const response = await structuredModel.invoke(formattedMessages);
 
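
The ref comment above refers to the $refStrategy: "none" option the old converter needed: when a sub-schema is reused, zod-to-json-schema would otherwise deduplicate it behind a $ref pointer, which this client previously avoided by disabling refs outright. The compat helper is expected to return an inlined, self-contained schema by default. A small illustrative sketch, with a made-up schema:

import { z } from "zod";
import { toJsonSchema } from "../../lib/v3/zodCompat";

// Address is reused twice; with $ref-style output it would typically be emitted
// once and referenced by pointer, while inlined output repeats it in place.
const Address = z.object({ street: z.string(), city: z.string() });
const Person = z.object({
  name: z.string(),
  homeAddress: Address,
  workAddress: Address,
});

// Expected to be a single, self-contained JSON Schema that
// withStructuredOutput can consume directly.
console.log(JSON.stringify(toJsonSchema(Person), null, 2));
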

packages/core/examples/operator-example.ts

Lines changed: 0 additions & 4 deletions
@@ -5,17 +5,14 @@
  *
  * To learn more about Stagehand Agents, see: https://docs.stagehand.dev/concepts/agent
  */
-
 import { Stagehand } from "../lib/v3";
 import dotenv from "dotenv";
 import chalk from "chalk";
 
 // Load environment variables
 dotenv.config();
-
 async function main() {
   console.log(`\n${chalk.bold("Stagehand 🤘 Operator Example")}\n`);
-
   // Initialize Stagehand
   const stagehand = new Stagehand({
     env: "LOCAL",
@@ -48,5 +45,4 @@ async function main() {
     // await stagehand.close();
   }
 }
-
 main();

packages/core/examples/parameterizeApiKey.ts

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 import { Stagehand } from "../lib/v3";
-import { z } from "zod/v3";
+import { z } from "zod";
 
 /**
  * This example shows how to parameterize the API key for the LLM provider.

0 commit comments