docs/intelligentapps/copilot-tools.md (3 additions, 3 deletions)

@@ -58,7 +58,7 @@ The Agent Code Gen tool has several important features:
 Example requirement:

 ```text
-Create an AI app to manage travel queries, use Azure AI Foundry models.
+Create an AI app to manage travel queries, use Microsoft Foundry models.
 ```

 - **Various agent framework functionality support:** The tool supports many features like function calling, MCP, and streaming responses.

@@ -82,7 +82,7 @@ The Agent Code Gen tool has several important features:

 ## AI Model Guide tool

-The AI Model Guide tool helps developers pick the best AI models for their apps. It recommends Azure AI Foundry and GitHub models, including the latest and most popular ones. The tool provides details like input types, context length, cost, and metrics (quality, speed, safety). It also explains how to connect to models, such as GitHub endpoints and tokens.
+The AI Model Guide tool helps developers pick the best AI models for their apps. It recommends Microsoft Foundry and GitHub models, including the latest and most popular ones. The tool provides details like input types, context length, cost, and metrics (quality, speed, safety). It also explains how to connect to models, such as GitHub endpoints and tokens.

 This tool supports:

@@ -114,7 +114,7 @@ This tool supports:
 Create an AI app to manage travel queries using a cheap and fast azure model.
 ```

-For this example, Copilot selects a model like Azure AI Foundry GPT-4.1-mini model.
+For this example, Copilot selects a model like Microsoft Foundry GPT-4.1-mini model.
docs/intelligentapps/evaluation.md (1 addition, 1 deletion)

@@ -87,7 +87,7 @@ For textual similarity:
 - **GLEU**: Google-BLEU variant for sentence-level assessment; measures overlaps in n-grams between response and ground truth.
 - **METEOR**: Metric for Evaluation of Translation with Explicit Ordering; measures overlaps in n-grams between response and ground truth.

-The evaluators in AI Toolkit are based on the Azure Evaluation SDK. To learn more about observability for generative AI models, see the [Azure AI Foundry documentation](https://learn.microsoft.com/azure/ai-foundry/concepts/observability?tabs=warning).
+The evaluators in AI Toolkit are based on the Azure Evaluation SDK. To learn more about observability for generative AI models, see the [Microsoft Foundry documentation](https://learn.microsoft.com/azure/ai-foundry/concepts/observability?tabs=warning).
docs/intelligentapps/models.md (9 additions, 9 deletions)

@@ -11,14 +11,14 @@ Within the model catalog, you can explore and utilize models from multiple hosti

 - Models hosted on GitHub, such as Llama3, Phi-3, and Mistral, including pay-as-you-go options.
 - Models provided directly by publishers, including OpenAI's ChatGPT, Anthropic's Claude, and Google's Gemini.
-- Models hosted on Azure AI Foundry.
+- Models hosted on Microsoft Foundry.
 - Models downloaded locally from repositories like Ollama and ONNX.
 - Custom self-hosted or externally deployed models accessible via Bring-Your-Own-Model (BYOM) integration.

-Deploy models directly to Azure AI Foundry from within the model catalog, streamlining your workflow.
+Deploy models directly to Foundry from within the model catalog, streamlining your workflow.

 > [!NOTE]
-> Use Azure AI Foundry, Foundry Local, and GitHub models added to AI Toolkit with GitHub Copilot. For more information, check out [Changing the model for chat conversations](/docs/copilot/customization/language-models#change-the-model-for-chat-conversations.md).
+> Use Microsoft Foundry, Foundry Local, and GitHub models added to AI Toolkit with GitHub Copilot. For more information, check out [Changing the model for chat conversations](/docs/copilot/customization/language-models#change-the-model-for-chat-conversations.md).

 

@@ -125,20 +125,20 @@ To add a self-hosted or locally running Ollama model:

 To add a custom ONNX model, first convert it to the AI Toolkit model format using the [model conversion tool](/docs/intelligentapps/modelconversion.md). After conversion, add the model to AI Toolkit.

-## Deploy a model to Azure AI Foundry
+## Deploy a model to Microsoft Foundry

-You can deploy a model to Azure AI Foundry directly from the AI Toolkit. This allows you to run the model in the cloud and access it via an endpoint.
+You can deploy a model to Microsoft Foundry directly from the AI Toolkit. This allows you to run the model in the cloud and access it via an endpoint.

 1. From the model catalog, select the model you want to deploy.
-1. Select **Deploy to Azure AI Foundry**, either from the dropdown menu or directly from the **Deploy to Azure AI Foundry** button, as in the following screenshot:
+1. Select **Deploy to Microsoft Foundry**, either from the dropdown menu or directly from the **Deploy to Microsoft Foundry** button, as in the following screenshot:

-   
+   

 1. In the **model deployment** tab, enter the required information, such as the model name, description, and any additional settings, as in the following screenshot:

    

-1. Select **Deploy to Azure AI Foundry** to start the deployment process.
+1. Select **Deploy to Microsoft Foundry** to start the deployment process.
 1. A dialog will appear to confirm the deployment. Review the details and select **Deploy** to proceed.
 1. Once the deployment is complete, the model will be available in the **MY MODELS** section of AI Toolkit, and you can use it in the playground or agent builder.

@@ -177,7 +177,7 @@ In this article, you learned how to:

 - Explore and manage generative AI models in AI Toolkit.
 - Find models from various sources, including GitHub, ONNX, OpenAI, Anthropic, Google, Ollama, and custom endpoints.
-- Add models to your toolkit and deploy them to Azure AI Foundry.
+- Add models to your toolkit and deploy them to Microsoft Foundry.
 - Add custom models, including Ollama and OpenAI compatible models, and test them in the playground or agent builder.
 - Use the model catalog to view available models and select the best fit for your AI application needs.
docs/intelligentapps/overview.md (3 additions, 3 deletions)

@@ -99,12 +99,12 @@ AI Toolkit opens in its own view, with the AI Toolkit icon now displayed on the
 - **Add MCP Server**: The link for adding and working with an existing MCP server.
 - **Create new MCP Server**: The link for creating and deploying new MCP servers in AI Toolkit.

-- **Help and Feedback**: This section contains links to the Azure AI Foundry documentation, feedback, support, and the Microsoft Privacy Statement. It contains the following subsections:
-- **Documentation**: The link to the Azure AI Foundry Extension documentation.
+- **Help and Feedback**: This section contains links to the Microsoft Foundry documentation, feedback, support, and the Microsoft Privacy Statement. It contains the following subsections:
+- **Documentation**: The link to the Microsoft Foundry Extension documentation.
 - **Resources**: The link to the AI Toolkit Tutorials Gallery, a collection of tutorials to help you get started with AI Toolkit.
 - **Get Started**: The link to the getting started walkthrough to help you learn the basics of AI Toolkit.
 - **What's New**: The link to the AI Toolkit release notes.
-- **Report Issues on GitHub**: The link to the Azure AI Foundry extension GitHub repository issues page.
+- **Report Issues on GitHub**: The link to the Microsoft Foundry extension GitHub repository issues page.
docs/intelligentapps/reference/ManualModelConversion.md (2 additions, 2 deletions)

@@ -7,13 +7,13 @@ MetaDescription: Model Conversion reference about manual model conversion.

 The AI Toolkit supports the [Open Neural Network Exchange](https://onnx.ai) (ONNX) format for running models locally. ONNX is an open standard for representing machine learning models, defining a common set of operators and a file format that enables models to run across various hardware platforms.

-To use models from other catalogs, such as Azure AI Foundry or Hugging Face, in the AI Toolkit, you must first convert them to ONNX format.
+To use models from other catalogs, such as Microsoft Foundry or Hugging Face, in the AI Toolkit, you must first convert them to ONNX format.

 This tutorial guides you through converting Hugging Face models to ONNX format and loading them into the AI Toolkit.

 ## Set up the environment

-To convert models from Hugging Face or Azure AI Foundry, you need the [Model Builder](https://onnxruntime.ai/docs/genai/howto/build-model.html) tool.
+To convert models from Hugging Face or Microsoft Foundry, you need the [Model Builder](https://onnxruntime.ai/docs/genai/howto/build-model.html) tool.
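A product-name rename like the one in this pull request is usually scripted rather than edited by hand. The following is a hypothetical sketch, not the method used in this PR: it stages a sample docs tree so the commands are runnable, and it assumes GNU grep and GNU sed (on macOS, `sed -i` needs an empty suffix argument, `sed -i ''`). Note that the exact phrase match leaves the lowercase `ai-foundry` URL segments untouched, mirroring what the diff above does.

```shell
set -euo pipefail

# Stage a sample docs tree so the pipeline below can run anywhere;
# in a real repo you would run it from the repository root instead.
mkdir -p demo/docs/intelligentapps
cat > demo/docs/intelligentapps/models.md <<'EOF'
Deploy a model to Azure AI Foundry.
See the [Azure AI Foundry documentation](https://learn.microsoft.com/azure/ai-foundry/concepts/observability).
EOF

# Find every Markdown file containing the old name and rewrite it in place.
# The URL segment 'ai-foundry' does not match the capitalized phrase, so links survive.
grep -rl --include='*.md' 'Azure AI Foundry' demo/docs \
  | xargs -r sed -i 's/Azure AI Foundry/Microsoft Foundry/g'

cat demo/docs/intelligentapps/models.md
```

A blanket substitution still needs a manual review pass afterwards: the diff above shows at least one spot where the docs use just "Foundry" and another where "Azure Evaluation SDK" is deliberately left unchanged, which no single regex would get right.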