5 changes: 5 additions & 0 deletions .changeset/clear-suns-sleep.md
@@ -0,0 +1,5 @@
---
'@sap-ai-sdk/langchain': minor
---

[New Functionality] Add LangChain Orchestration client.
177 changes: 13 additions & 164 deletions packages/langchain/README.md
@@ -2,17 +2,15 @@

SAP Cloud SDK for AI is the official Software Development Kit (SDK) for **SAP AI Core**, **SAP Generative AI Hub**, and **Orchestration Service**.

This package provides LangChain model clients built on top of the foundation model clients of the SAP Cloud SDK for AI.
This package provides LangChain clients built on top of the foundation model and orchestration clients of the SAP Cloud SDK for AI.

### Table of Contents

- [Installation](#installation)
- [Prerequisites](#prerequisites)
- [Relationship between Models and Deployment ID](#relationship-between-models-and-deployment-id)
- [Usage](#usage)
- [Client Initialization](#client-initialization)
- [Chat Client](#chat-client)
- [Embedding Client](#embedding-client)
- [Orchestration Client](#orchestration-client)
- [Azure OpenAI Client](#azure-openai-client)
- [Local Testing](#local-testing)
- [Support, Feedback, Contribution](#support-feedback-contribution)
- [License](#license)
@@ -28,10 +26,11 @@ $ npm install @sap-ai-sdk/langchain
- [Enable the AI Core service in SAP BTP](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/initial-setup).
- Use the same `@langchain/core` version as the `@sap-ai-sdk/langchain` package. To see which LangChain version this package currently uses, check our [package.json](./package.json).
- Configure the project with **Node.js v20 or higher** and **native ESM** support.
- Ensure a deployed OpenAI model is available in the SAP Generative AI Hub.
- Use the [`DeploymentApi`](https://github.com/SAP/ai-sdk-js/blob/main/packages/ai-api/README.md#create-a-deployment) from `@sap-ai-sdk/ai-api` [to deploy a model](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/create-deployment-for-generative-ai-model-in-sap-ai-core).
Alternatively, you can also create deployments using the [SAP AI Launchpad](https://help.sap.com/docs/sap-ai-core/generative-ai-hub/activate-generative-ai-hub-for-sap-ai-launchpad?locale=en-US&q=launchpad).
- Once deployment is complete, access the model via the `deploymentUrl`.
- Ensure that a relevant deployment is available in the SAP Generative AI Hub:
  - Use the [`DeploymentApi`](https://github.com/SAP/ai-sdk-js/blob/main/packages/ai-api/README.md#create-a-deployment) from `@sap-ai-sdk/ai-api` or the [SAP AI Launchpad](https://help.sap.com/docs/sap-ai-core/generative-ai-hub/activate-generative-ai-hub-for-sap-ai-launchpad?locale=en-US&q=launchpad) to create a deployment, as shown in the sketch after this list.
  - For an **OpenAI model**, follow [this guide](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/create-deployment-for-generative-ai-model-in-sap-ai-core).
  - For the **orchestration service**, follow [this guide](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/create-deployment-for-orchestration).
  - Once deployed, access the service via the `deploymentUrl`.
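
For reference, a minimal sketch of creating such a deployment programmatically with the `DeploymentApi`; the configuration ID and the `default` resource group are assumptions, so check the linked `@sap-ai-sdk/ai-api` README for the exact request shape:

```ts
import { DeploymentApi } from '@sap-ai-sdk/ai-api';

// Create a deployment for an existing configuration
// (the configuration ID and resource group below are placeholders).
const deployment = await DeploymentApi.deploymentCreate(
  { configurationId: '<your-configuration-id>' },
  { 'AI-Resource-Group': 'default' }
).execute();

// Once the deployment is running, its `id` and `deploymentUrl` identify the target service.
console.log(deployment.id);
```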

> **Accessing the AI Core Service via the SDK**
>
@@ -40,168 +39,18 @@ $ npm install @sap-ai-sdk/langchain
> - In Cloud Foundry, it's accessed from the `VCAP_SERVICES` environment variable.
> - In Kubernetes / Kyma environments, you have to mount the service binding as a secret instead, for more information refer to [this documentation](https://www.npmjs.com/package/@sap/xsenv#usage-in-kubernetes).

## Relationship between Models and Deployment ID

SAP AI Core manages access to generative AI models through the global AI scenario `foundation-models`.
Creating a deployment for a model requires access to this scenario.

Each combination of model, model version, and resource group allows for a one-time deployment.
After deployment completion, the response includes a `deploymentUrl` and an `id`, which is the deployment ID.
For more information, see [here](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/create-deployment-for-generative-ai-model-in-sap-ai-core).

[Resource groups](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/resource-groups?q=resource+group) represent a virtual collection of related resources within the scope of one SAP AI Core tenant.

Consequently, each pair of deployment ID and resource group uniquely maps to a combination of model and model version within the `foundation-models` scenario.

## Usage

This package offers both chat and embedding clients, currently supporting Azure OpenAI.
This package offers LangChain clients for Azure OpenAI and SAP Orchestration service.
All clients comply with [LangChain's interface](https://js.langchain.com/docs/introduction).

### Client Initialization

To initialize a client, provide the model name:

```ts
import {
  AzureOpenAiChatClient,
  AzureOpenAiEmbeddingClient
} from '@sap-ai-sdk/langchain';

// For a chat client
const chatClient = new AzureOpenAiChatClient({ modelName: 'gpt-4o' });
// For an embedding client
const embeddingClient = new AzureOpenAiEmbeddingClient({ modelName: 'gpt-4o' });
```

In addition to the default parameters defined by the model vendor (e.g., OpenAI) and LangChain, you can pass extra parameters to narrow down the search for the desired model:

```ts
const chatClient = new AzureOpenAiChatClient({
  modelName: 'gpt-4o',
  modelVersion: '24-07-2021',
  resourceGroup: 'my-resource-group'
});
```

**Do not pass a deployment ID to initialize the client.**
For the LangChain model clients, initialization is done using the model name, model version, and resource group.

Note that LangChain clients attempt up to 6 retries with exponential backoff by default in case of a failure.
Especially in testing environments, you might want to reduce this number to speed up the process:

```ts
const embeddingClient = new AzureOpenAiEmbeddingClient({
  modelName: 'gpt-4o',
  maxRetries: 0
});
```

#### Custom Destination

When initializing an `AzureOpenAiChatClient` or `AzureOpenAiEmbeddingClient`, it is possible to provide a custom destination.
For example, when targeting a destination with the name `my-destination`, the following code can be used:

```ts
const chatClient = new AzureOpenAiChatClient(
  {
    modelName: 'gpt-4o',
    modelVersion: '24-07-2021',
    resourceGroup: 'my-resource-group'
  },
  {
    destinationName: 'my-destination'
  }
);
```

By default, the fetched destination is cached.
To disable caching, set the `useCache` parameter to `false` together with the `destinationName` parameter.
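
A minimal sketch, assuming the same `my-destination` destination as above and that the `useCache` flag is passed alongside `destinationName` in the destination configuration:

```ts
import { AzureOpenAiChatClient } from '@sap-ai-sdk/langchain';

// Fetch the destination on every request instead of reusing the cached one.
const uncachedChatClient = new AzureOpenAiChatClient(
  { modelName: 'gpt-4o' },
  {
    destinationName: 'my-destination',
    useCache: false
  }
);
```
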
### Orchestration Client

### Chat Client
For more information about the Orchestration client, refer to the [documentation](https://github.com/SAP/ai-sdk-js/tree/main/packages/langchain/src/orchestration/README.md).
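
As a quick orientation, here is a minimal, hypothetical sketch of the LangChain `OrchestrationClient`. It assumes the constructor takes an orchestration configuration mirroring `@sap-ai-sdk/orchestration` (model under `llm`) and that prompts can be passed directly to `invoke`; the linked documentation is the authoritative reference:

```ts
import { OrchestrationClient } from '@sap-ai-sdk/langchain';

// Assumed configuration shape, mirroring the orchestration module configuration.
const orchestrationClient = new OrchestrationClient({
  llm: {
    model_name: 'gpt-4o'
  }
});

// The client implements LangChain's runnable interface.
const result = await orchestrationClient.invoke("What's the capital of France?");
```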

The chat client allows you to interact with Azure OpenAI chat models, accessible via the generative AI hub of SAP AI Core.
To invoke the client, pass a prompt:
### Azure OpenAI Client

```ts
const response = await chatClient.invoke("What's the capital of France?");
```

#### Advanced Example with Templating and Output Parsing

```ts
import { AzureOpenAiChatClient } from '@sap-ai-sdk/langchain';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { ChatPromptTemplate } from '@langchain/core/prompts';

// initialize the client
const client = new AzureOpenAiChatClient({ modelName: 'gpt-35-turbo' });

// create a prompt template
const promptTemplate = ChatPromptTemplate.fromMessages([
  ['system', 'Answer the following in {language}:'],
  ['user', '{text}']
]);
// create an output parser
const parser = new StringOutputParser();

// chain together template, client, and parser
const llmChain = promptTemplate.pipe(client).pipe(parser);

// invoke the chain
return llmChain.invoke({
  language: 'german',
  text: 'What is the capital of France?'
});
```

### Embedding Client

Embedding clients allow embedding either text or document chunks (represented as arrays of strings).
While you can use them standalone, they are usually used in combination with other LangChain utilities, like a text splitter for preprocessing and a vector store for storage and retrieval of the relevant embeddings.
For a complete example of how to implement RAG with our LangChain client, take a look at our [sample code](https://github.com/SAP/ai-sdk-js/blob/main/sample-code/src/langchain-azure-openai.ts).

#### Embed Text

```ts
const embeddedText = await embeddingClient.embedQuery(
  'Paris is the capital of France.'
);
```

#### Embed Document Chunks

```ts
const embeddedDocuments = await embeddingClient.embedDocuments([
  'Page 1: Paris is the capital of France.',
  'Page 2: It is a beautiful city.'
]);
```

#### Preprocess, Embed, and Store Documents

```ts
// Import paths below are assumptions based on the LangChain packages commonly used alongside this SDK.
import { RecursiveCharacterTextSplitter } from '@langchain/textsplitters';
import { MemoryVectorStore } from 'langchain/vectorstores/memory';
import { AzureOpenAiEmbeddingClient } from '@sap-ai-sdk/langchain';

// `docs` is assumed to be an array of LangChain `Document`s loaded beforehand.
// Create a text splitter and split the documents
const textSplitter = new RecursiveCharacterTextSplitter({
  chunkSize: 2000,
  chunkOverlap: 200
});
const splits = await textSplitter.splitDocuments(docs);

// Initialize the embedding client
const embeddingClient = new AzureOpenAiEmbeddingClient({
  modelName: 'text-embedding-ada-002'
});

// Create a vector store from the document splits
const vectorStore = await MemoryVectorStore.fromDocuments(
  splits,
  embeddingClient
);

// Create a retriever for the vector store
const retriever = vectorStore.asRetriever();
```
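
Building on the retriever above, the most relevant chunks can then be fetched for a question; a minimal sketch:

```ts
// Retrieve the chunks most relevant to a question.
const relevantChunks = await retriever.invoke('What is the capital of France?');
console.log(relevantChunks.map(doc => doc.pageContent));
```
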
For more information about the Azure OpenAI client, refer to the [documentation](https://github.com/SAP/ai-sdk-js/tree/main/packages/langchain/src/openai/README.md).

## Local Testing

1 change: 1 addition & 0 deletions packages/langchain/package.json
@@ -29,6 +29,7 @@
"@sap-ai-sdk/ai-api": "workspace:^",
"@sap-ai-sdk/core": "workspace:^",
"@sap-ai-sdk/foundation-models": "workspace:^",
"@sap-ai-sdk/orchestration": "workspace:^",
"@sap-cloud-sdk/connectivity": "^3.26.1",
"uuid": "^11.1.0",
"@langchain/core": "0.3.40",
2 changes: 2 additions & 0 deletions packages/langchain/src/index.ts
@@ -7,3 +7,5 @@ export type {
AzureOpenAiEmbeddingModelParams,
AzureOpenAiChatCallOptions
} from './openai/index.js';
export { OrchestrationClient } from './orchestration/index.js';
export type { OrchestrationCallOptions } from './orchestration/index.js';