This project is inspired by [techwithtim/PythonAIAgentFromScratch](https://github.com/techwithtim/PythonAIAgentFromScratch.git). Their work laid the foundation for this research bot; I have modified it for offline use and simplicity.
A simple research assistant chatbot powered by the phi3 model running locally via Ollama.
It fetches information from Wikipedia and DuckDuckGo, summarizes it with an LLM, and saves your research.
- Clone this repo
- Create and activate a virtual environment
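On macOS/Linux this step looks like the following (a standard venv workflow; the `.venv` name is my choice, not prescribed by the repo):

```shell
# Create and activate an isolated environment for the project
python3 -m venv .venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate
```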
- Install dependencies with `pip install -r requirements.txt`. This will install: langchain, langchain-community, langchain-ollama, python-dotenv, wikipedia, pydantic, and duckduckgo-search (installed as a sub-dependency).
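Based on the package list above, the `requirements.txt` likely looks like this (unpinned; version constraints are not given in the source, and duckduckgo-search is pulled in as a sub-dependency rather than listed directly):

```text
langchain
langchain-community
langchain-ollama
python-dotenv
wikipedia
pydantic
```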
- Install and run Ollama, then start the model with `ollama run phi3`.
- Configure environment variables (optional). If you later decide to use APIs like OpenAI or Anthropic, you can store their keys in a `.env` file. For now, since Ollama runs locally, you don't need any API keys.
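Optional key lookup in that setup can be sketched with the standard library; `python-dotenv`'s `load_dotenv()` would populate these environment variables from a `.env` file. The function name and variable name here are illustrative, not from the repo:

```python
import os

def get_optional_key(name: str):
    """Return an API key if configured, otherwise None.

    With a local Ollama model, a missing key is not an error;
    python-dotenv's load_dotenv() would fill os.environ from .env
    if you later switch to a hosted provider.
    """
    return os.getenv(name)  # returns None when unset, no exception

key = get_optional_key("OPENAI_API_KEY")
print("running fully local" if key is None else "hosted API key found")
```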
- Run the application with `python main.py`. You'll be prompted with: "What can I help you research?" Type your query and get a clean summary. The response will also be saved to `research_output.txt`.
- Gets input from the user
- Searches Wikipedia / DuckDuckGo
- Passes the data to the phi3 LLM
- Summarizes the result
- Saves the research output to a `.txt` file
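The five steps above can be sketched end to end. The function name `run_research` and the stub `search`/`summarize` callables are illustrative stand-ins; the real app wires in LangChain's Wikipedia/DuckDuckGo tools and `langchain-ollama`'s phi3 chat model in their place:

```python
from pathlib import Path
from typing import Callable

def run_research(query: str,
                 search: Callable[[str], str],
                 summarize: Callable[[str], str],
                 out_path: str = "research_output.txt") -> str:
    """Search -> summarize with the LLM -> save to a .txt file."""
    raw = search(query)   # step 2: Wikipedia / DuckDuckGo results
    summary = summarize(  # steps 3-4: pass data to the LLM, summarize
        f"Summarize the following for the query '{query}':\n{raw}")
    # step 5: persist the research output
    Path(out_path).write_text(f"Query: {query}\n\n{summary}\n", encoding="utf-8")
    return summary

# Stub callables so the sketch runs without Ollama or network access:
result = run_research(
    "Ada Lovelace",
    search=lambda q: f"(search results for {q})",
    summarize=lambda prompt: "A short, clean summary would appear here.",
)
print(result)
```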