A graphical user interface for interacting with Ollama's large language models, primarily coding models such as Deepseeker-v2:latest, built with Python and Tkinter. Save code blocks to files with the click of a button!
The "Ollama LLM Python Interface" is aimed at users who want to generate code with a local LLM and save the results to files, without manually copying and pasting each code block out of the response.
- Connect to local Ollama instance
- Select from available language models (LLMs)
- Interactive chat interface
- Real-time streaming of AI responses
- Code block detection and saving
- The application attempts to auto-detect a suitable filename for each code block
- Once the AI response is complete, all detected code blocks are listed below the response, and you can choose to save any or all of them to files (a minimal sketch of this detection is included after the model list below)
- Night mode for comfortable viewing (alpha)
- Response information display (start time, end time, length)
- Ability to stop ongoing requests
- Built-in debugging
- Deepseeker-v2:latest
- yabi/codestral_22b_v0_1:q2_k
- shuchi215/python_code_13b:latest
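For illustration, here is a minimal sketch of how fenced code blocks and a suggested filename could be detected in a model response. The regex, the `EXTENSIONS` table, and the `extract_code_blocks` helper are illustrative assumptions, not the exact implementation in `main.py`:

```python
# Illustrative sketch only: the actual detection logic in main.py may differ.
import re

# Matches fenced code blocks such as ```python ... ```
CODE_BLOCK_RE = re.compile(r"```(\w+)?\n(.*?)```", re.DOTALL)

# Hypothetical mapping from fence language to a default file extension.
EXTENSIONS = {"python": ".py", "javascript": ".js", "html": ".html", "bash": ".sh"}

def extract_code_blocks(response_text):
    """Return a list of (suggested_filename, code) pairs found in the response."""
    blocks = []
    for i, match in enumerate(CODE_BLOCK_RE.finditer(response_text), start=1):
        language, code = match.group(1) or "", match.group(2)
        # Prefer a filename mentioned on the first line, e.g. "# main.py"
        first_line = code.splitlines()[0] if code.splitlines() else ""
        name_match = re.search(r"([\w.-]+\.\w+)", first_line)
        if name_match:
            filename = name_match.group(1)
        else:
            filename = f"code_block_{i}{EXTENSIONS.get(language.lower(), '.txt')}"
        blocks.append((filename, code))
    return blocks
```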
- Python 3.12+
- Ollama installed and running locally
- Clone this repository:
- git clone https://github.com/non-npc/ollama-llm-interface.git
- cd ollama-llm-interface
- Install the required packages:
- pip install -r requirements.txt
- Ensure Ollama is installed and running on your local machine and that you have models downloaded and accessible to Ollama.
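To confirm Ollama is reachable before launching the app, you can query its local HTTP API for the downloaded models. This is a rough sketch using the `/api/tags` endpoint, assuming the default address `http://localhost:11434` and the `requests` package; it is not taken from the application code:

```python
# Quick check that Ollama is running and has models available.
# Assumes the default Ollama address (http://localhost:11434) and the requests package.
import requests

def list_local_models(base_url="http://localhost:11434"):
    """Return the names of models Ollama has downloaded, or an empty list on failure."""
    try:
        resp = requests.get(f"{base_url}/api/tags", timeout=5)
        resp.raise_for_status()
        return [m["name"] for m in resp.json().get("models", [])]
    except requests.RequestException:
        return []

if __name__ == "__main__":
    models = list_local_models()
    print("Available models:", models or "none (is Ollama running?)")
```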
Run the application: python main.py
- Select a model from the dropdown menu.
- Type your message in the input box.
- Click "Send" or press Enter to send your message.
- The AI's response will appear in the chat display.
- Code blocks in the response can be saved individually.
- Use the "Stop" button to interrupt long responses.
- Toggle night mode with the moon/sun button in the top right.
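For reference, streaming a reply from Ollama is roughly what happens behind the "Send" button. The sketch below uses Ollama's `/api/chat` streaming endpoint with the `requests` package; the function name and parameters are assumptions for illustration, not the app's exact code:

```python
# Minimal sketch of streaming a chat reply from Ollama (not the app's exact code).
import json
import requests

def stream_chat(model, prompt, base_url="http://localhost:11434"):
    """Yield response text chunks as Ollama streams them."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,
    }
    with requests.post(f"{base_url}/api/chat", json=payload, stream=True, timeout=300) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)
            if chunk.get("done"):
                break
            yield chunk.get("message", {}).get("content", "")

# Example: print a streamed answer chunk by chunk.
# for piece in stream_chat("Deepseeker-v2:latest", "Write a hello world script"):
#     print(piece, end="", flush=True)
```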
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under CC0 1.0 Universal license - see the LICENSE file for details.
- Ollama for providing the local LLM backend.
- The Python community for the excellent libraries used in this project.