Plugin for LLM adding support for Together
Install this plugin in the same environment as LLM.
```bash
llm install llm-together
```
You will need an API key from Together. You can obtain one by creating an account and going to "API Keys".
You can set that as an environment variable called `TOGETHER_API_KEY`, or add it to LLM's set of saved keys using:
```bash
llm keys set together
```
```
Enter key: <paste key here>
```
This plugin adds the Together models that support inference without a VM start (serverless inference), so they respond quickly. List the available models and run a prompt against one of them:

```bash
llm models list
llm -m <one-together-model> "Three names for my new ai project"
```
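If you would rather call a Together model from Python than from the command line, LLM's Python API works the same way. Here is a minimal sketch: `<one-together-model>` is the same placeholder used above, and it assumes the key saved with `llm keys set together` (or the `TOGETHER_API_KEY` environment variable) is picked up automatically:

```python
import llm

# Placeholder, as in the CLI example above; substitute a model ID
# printed by `llm models list`.
model = llm.get_model("<one-together-model>")

# Assumes the stored "together" key or TOGETHER_API_KEY is found automatically.
response = model.prompt("Three names for my new ai project")
print(response.text())
```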
To set up this plugin locally, first check out the code, then create a new virtual environment:

```bash
cd llm-together
python3 -m venv venv
source venv/bin/activate
```
Now install the dependencies and test dependencies:
```bash
pip install -e '.[test]'
```
Run the tests with:
```bash
pytest
```
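As a starting point for writing new tests, here is a minimal sketch. It only checks that LLM can see the plugin, and it assumes the plugin reports itself under its package name, `llm-together`:

```python
# A minimal sketch of a test, assuming the plugin is installed in the
# same environment as LLM (e.g. via `pip install -e '.[test]'`).
from click.testing import CliRunner

from llm.cli import cli


def test_plugin_is_installed():
    # `llm plugins` prints the installed plugins as JSON.
    result = CliRunner().invoke(cli, ["plugins"])
    assert result.exit_code == 0
    # Assumption: the plugin is listed under its package name.
    assert "llm-together" in result.output
```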