Add a section to the installation README to highlight this as an option
#27 (comment)
Running locally with different models may give users inconsistent results... but it also opens the door to training a ChatDev-specific LLM 😀
Which, imo, should seriously be considered 🤔
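For the README section, something like this could work as a starting point. This is only a hedged sketch: it assumes the project talks to an OpenAI-compatible API and respects the standard environment variables, and that a local server (e.g. one exposing a `/v1` endpoint on port 8000) is already running — the variable names and URL here are assumptions, not confirmed config.

```shell
# Hypothetical setup for running against a local model server.
# Assumes an OpenAI-compatible endpoint is listening locally;
# adjust the port/path to whatever your local runner exposes.
export OPENAI_API_KEY="sk-local-placeholder"       # local servers usually ignore the key
export OPENAI_API_BASE="http://localhost:8000/v1"  # point the client at the local server
```

Results will vary by model, which is exactly the inconsistency mentioned above — worth calling out in the README too.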