Commit 33bdaa5

update home page and pictures (#3630)
1 parent 73d340d commit 33bdaa5

File tree

5 files changed: +4 −1 lines changed


README.md

Lines changed: 1 addition & 0 deletions
@@ -23,6 +23,7 @@ Start using OpenVINO Model Server with a fast-forward serving example from the [
 Read [release notes](https://github.com/openvinotoolkit/model_server/releases) to find out what’s new.

 ### Key features:
+- **[NEW]** [Support for AI agents](https://docs.openvino.ai/2025/model-server/ovms_demos_continuous_batching_agent.html)
 - **[NEW]** [Image generation compatible with OpenAI API](https://docs.openvino.ai/2025/model-server/ovms_demos_image_generation.html)
 - **[NEW]** Native Windows support. Check updated [deployment guide](https://docs.openvino.ai/2025/model-server/ovms_docs_deploying_server_baremetal.html)
 - **[NEW]** [Text Embeddings compatible with OpenAI API](https://docs.openvino.ai/2025/model-server/ovms_demos_embeddings.html)

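The README entries above point at OpenAI-API-compatible endpoints (image generation, text embeddings). As a rough illustration of what that compatibility means in practice, the sketch below calls an embeddings endpoint with the official `openai` Python client; the base URL, `/v3` prefix, REST port, and model name are assumptions for a locally running server, not values taken from this commit.

```python
# Hypothetical sketch: query an OpenAI-compatible embeddings endpoint exposed by a
# locally running model server. Base URL, port, and model name are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v3",  # assumed REST port and URL prefix
    api_key="unused",                     # local server; no real key is required
)

response = client.embeddings.create(
    model="Alibaba-NLP/gte-large-en-v1.5",  # placeholder embeddings model name
    input=["OpenVINO Model Server serves embeddings too."],
)

# Print the dimensionality of the returned embedding vector.
print(len(response.data[0].embedding))
```
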
docs/deploying_server_docker.md

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ This is a step-by-step guide on how to deploy OpenVINO™ Model Server on Li
 **Before you start, make sure you have:**

 - [Docker Engine](https://docs.docker.com/engine/) installed
-- Intel® Core™ processor (6-13th gen.) or Intel® Xeon® processor (1st to 4th gen.)
+- Intel® Core™ processor or Intel® Xeon® processor
 - Linux, macOS or Windows via [WSL](https://docs.microsoft.com/en-us/windows/wsl/)
 - (optional) AI accelerators [supported by OpenVINO](https://docs.openvino.ai/2025/openvino-workflow/running-inference/inference-devices-and-modes.html). Accelerators are tested only on bare-metal Linux hosts.

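For context on the prerequisites above, the usual next step in this guide is pulling and running the public `openvino/model_server` image. A minimal sketch using the Docker SDK for Python follows; the model name, host model directory, and port are placeholders and are not part of this commit.

```python
# Hypothetical sketch: start the public openvino/model_server image via the Docker SDK
# for Python. Model name, host model path, and port are placeholders, not from this commit.
import docker

client = docker.from_env()
client.images.pull("openvino/model_server:latest")

container = client.containers.run(
    "openvino/model_server:latest",
    command=[
        "--model_name", "resnet",          # placeholder model name
        "--model_path", "/models/resnet",  # model path inside the container
        "--rest_port", "8000",
    ],
    volumes={"/opt/models": {"bind": "/models", "mode": "ro"}},  # assumed host model dir
    ports={"8000/tcp": 8000},
    detach=True,
)
print(container.short_id)  # container is now serving on localhost:8000
```
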
docs/home.md

Lines changed: 2 additions & 0 deletions
@@ -40,6 +40,8 @@ The models used by the server need to be stored locally or hosted remotely by ob
 Start using OpenVINO Model Server with a fast-forward serving example from the [QuickStart guide](ovms_quickstart.md) or [LLM QuickStart guide](./llm/quickstart.md).

 ### Key features:
+- **[NEW]** [Support for AI agents](../demos/continuous_batching/agentic_ai/README.md)
+- **[NEW]** [Image generation and editing](../demos/image_generation/README.md)
 - **[NEW]** Native Windows support. Check updated [deployment guide](./deploying_server.md)
 - **[NEW]** [Embeddings endpoint compatible with OpenAI API](../demos/embeddings/README.md)
 - **[NEW]** [Reranking compatible with Cohere API](../demos/rerank/README.md)

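One of the entries above advertises reranking compatible with the Cohere API. A minimal sketch of such a request over plain HTTP follows; the `/v3/rerank` path, port, and model name are assumptions based on the OVMS demos and the Cohere rerank payload shape, not values from this commit.

```python
# Hypothetical sketch: call a Cohere-style rerank endpoint on a locally running model
# server with plain HTTP. URL, port, and model name are assumptions, not from this commit.
import requests

payload = {
    "model": "BAAI/bge-reranker-large",  # placeholder reranking model name
    "query": "What is OpenVINO Model Server?",
    "documents": [
        "OpenVINO Model Server hosts models and exposes them over gRPC and REST.",
        "Docker Engine is a container runtime.",
    ],
}

resp = requests.post("http://localhost:8000/v3/rerank", json=payload, timeout=30)
resp.raise_for_status()
for result in resp.json().get("results", []):
    # Cohere-style responses return an index into `documents` plus a relevance score.
    print(result["index"], result["relevance_score"])
```
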
docs/ovms_diagram.png

-62.1 KB

docs/ovms_high_level.png

-104 KB
