# Feature Request: VS Code Models Integration for Graphiti
## Summary
Add native VS Code language models and embeddings integration to Graphiti, allowing users to leverage VS Code's built-in AI capabilities without requiring external API keys.
## Problem Statement
Currently, Graphiti users must configure external LLM providers (OpenAI, Anthropic, Google, etc.) with API keys and internet connectivity. This creates barriers for:
- Local development and experimentation
- Users without API access or credits
- Offline or air-gapped environments
- Developers who want to use AI models already available in VS Code
## Proposed Solution
Implement VS Code models integration as an optional dependency package, `graphiti-core[vscodemodels]`, providing:
### Core Features
- `VSCodeClient`: Native LLM client using VS Code's language model API
- `VSCodeEmbedder`: Embedding client with 1024-dimensional vectors and fallback support
- Zero Dependencies: No external packages required for VS Code integration
- Intelligent Fallbacks: Graceful degradation when VS Code models are unavailable
- Automatic Detection: Auto-detects available VS Code models
### Example Usage

```python
from graphiti_core import Graphiti
from graphiti_core.llm_client.vscode_client import VSCodeClient
from graphiti_core.embedder.vscode_embedder import VSCodeEmbedder, VSCodeEmbedderConfig
from graphiti_core.llm_client.config import LLMConfig

# Initialize VS Code clients
llm_client = VSCodeClient(
    config=LLMConfig(model="gpt-4o-mini", small_model="gpt-4o-mini")
)

embedder = VSCodeEmbedder(
    config=VSCodeEmbedderConfig(
        embedding_model="embedding-001",
        embedding_dim=1024,
        use_fallback=True,
    )
)

# Create Graphiti instance - no API keys needed!
graphiti = Graphiti(
    uri="bolt://localhost:7687",
    user="neo4j",
    password="password",
    llm_client=llm_client,
    embedder=embedder,
)
```

## Technical Requirements
### VS Code Language Model API Integration
- Implement `VSCodeClient` using VS Code's native language model API
- Handle LLM requests with intelligent fallback responses
- Support streaming and batch processing
- Follow existing LLM client patterns and interfaces
### VS Code Embedding Integration
- Implement `VSCodeEmbedder` with consistent 1024-dimensional vectors
- Provide semantic clustering fallback when embeddings are unavailable
- Support batch embedding generation for efficiency
- Maintain similarity preservation across different environments
### Package Structure
- Follow the `graphiti-core[provider]` pattern like other integrations
- Add to `pyproject.toml` optional dependencies as `vscodemodels = []`
- Export classes in the appropriate `__init__.py` files
- Maintain compatibility with existing Graphiti interfaces
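Following the existing extras pattern, the `pyproject.toml` entry might look like the sketch below; the surrounding extras are illustrative placeholders, and the empty list reflects the zero-dependency claim above.

```toml
[project.optional-dependencies]
# Hypothetical entry alongside the existing provider extras;
# empty because the integration uses only VS Code's own APIs.
vscodemodels = []
```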
## Benefits
- No API Keys Required: Works entirely within VS Code using native AI capabilities
- Zero External Dependencies: No additional packages needed for basic functionality
- Local Development: Perfect for offline or air-gapped environments
- Consistent Experience: Same interface patterns as other Graphiti LLM providers
- Lower Barrier to Entry: Easier onboarding for developers already using VS Code
## Implementation Considerations
### Architecture

```
VS Code Models Integration
├── VSCodeClient (LLM operations)
├── VSCodeEmbedder (embedding generation)
├── Fallback Systems (when VS Code unavailable)
├── Configuration Management
└── MCP Server Integration
```
### Dependencies
- No external dependencies required (hence `vscodemodels = []`)
- Uses VS Code's native language model APIs when available
- Falls back to semantic chunking for embeddings
- Compatible with existing Graphiti infrastructure
### Installation & Configuration

```bash
# Installation
pip install "graphiti-core[vscodemodels]"

# Optional environment variables
export VSCODE_LLM_MODEL=gpt-4o-mini
export VSCODE_EMBEDDING_MODEL=embedding-001
export VSCODE_EMBEDDING_DIM=1024
export USE_VSCODE_MODELS=true
```

## Acceptance Criteria
- `VSCodeClient` class implements LLM operations using the VS Code API
- `VSCodeEmbedder` class generates consistent 1024-dimensional embeddings
- Package available as `pip install "graphiti-core[vscodemodels]"`
- Automatic detection of available VS Code models
- Intelligent fallbacks when VS Code models are unavailable
- Integration with the MCP server for Model Context Protocol support
- Comprehensive documentation and usage examples
- Validation tests to ensure the integration works correctly
- Follows existing patterns from other LLM provider integrations
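The validation-test item could start as small as the sketch below: check that fallback embeddings are deterministic and fixed-dimension. `fake_embed` is a hypothetical stand-in for the proposed fallback path, since the real class does not exist yet.

```python
# Hypothetical pytest-style validation sketch; fake_embed stands in
# for the proposed VSCodeEmbedder fallback, which real tests would import.
import hashlib


def fake_embed(text: str, dim: int = 1024) -> list[float]:
    # Stand-in: stable pseudo-embedding derived from the text's hash.
    digest = hashlib.sha256(text.encode()).digest()
    return [digest[i % len(digest)] / 255.0 for i in range(dim)]


def test_embedding_dimension():
    assert len(fake_embed("hello")) == 1024


def test_embedding_deterministic():
    assert fake_embed("hello") == fake_embed("hello")


test_embedding_dimension()
test_embedding_deterministic()
print("ok")
```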
## Related Work
- Existing LLM provider integrations (anthropic, google-genai, groq, etc.)
- VS Code Language Model API documentation
- Model Context Protocol (MCP) specification
- Other VS Code extension integrations with AI models
## Priority
High - This removes a significant barrier to entry for Graphiti usage and enables local development without external API dependencies.
## Additional Context
This integration makes Graphiti accessible to developers who want to experiment with knowledge graphs without setting up external LLM services. It leverages AI capabilities already available in VS Code, providing a seamless development experience for users working within the VS Code ecosystem.
**Environment:**
- VS Code version: Latest stable with language model extensions
- Python version: 3.10+
- Operating System: Cross-platform (Windows, macOS, Linux)