Community-developed provider plugins that extend LangExtract with additional models.

| Provider | Package | Maintainer | Repository | Description | Issue |
|---|---|---|---|---|---|
| AWS Bedrock | `langextract-bedrock` | [@andyxhadji](https://github.com/andyxhadji) | [andyxhadji/langextract-bedrock](https://github.com/andyxhadji/langextract-bedrock) | AWS Bedrock provider for LangExtract; supports all models and inference profiles | [#148](https://github.com/google/langextract/issues/148) |
| LiteLLM | `langextract-litellm` | [@JustStas](https://github.com/JustStas) | [JustStas/langextract-litellm](https://github.com/JustStas/langextract-litellm) | LiteLLM provider for LangExtract; supports all models covered by LiteLLM, including OpenAI, Azure, Anthropic, and more (see [LiteLLM's supported models](https://docs.litellm.ai/docs/providers)) | [#187](https://github.com/google/langextract/issues/187) |
| Llama.cpp | `langextract-llamacpp` | [@fgarnadi](https://github.com/fgarnadi) | [fgarnadi/langextract-llamacpp](https://github.com/fgarnadi/langextract-llamacpp) | Llama.cpp provider for LangExtract; supports GGUF models from Hugging Face and local files | [#199](https://github.com/google/langextract/issues/199) |
| Outlines | `langextract-outlines` | [@RobinPicard](https://github.com/RobinPicard) | [dottxt-ai/langextract-outlines](https://github.com/dottxt-ai/langextract-outlines) | Outlines provider for LangExtract; supports structured generation for various local and API-based models | [#101](https://github.com/google/langextract/issues/101) |
| vLLM | `langextract-vllm` | [@wuli666](https://github.com/wuli666) | [wuli666/langextract-vllm](https://github.com/wuli666/langextract-vllm) | vLLM provider for LangExtract; supports local and distributed model serving | [#236](https://github.com/google/langextract/issues/236) |