
Ollama LLM Python Interface

A graphical user interface for interacting with Ollama's large language models, primarily coding models such as deepseek-coder-v2:latest. Built with Python and Tkinter. Save code blocks to files with the click of a button!

[Screenshot: Ollama LLM Python Interface]

Project Goals

Save code blocks to files in a convenient manner. The "Ollama LLM Python Interface" is aimed at users who generate code with a local LLM and want to save the results locally, without manually copying and pasting each resulting code block.

Features

  • Connect to a local Ollama instance
  • Select from the available language models (LLMs)
  • Interactive chat interface
  • Real-time streaming of AI responses
  • Code block detection and saving (a sketch of the idea follows this list)
  • Automatic filename detection for code blocks, to the best of the application's ability
  • Once the AI response is complete, all detected code blocks are listed below it, and you can choose which (or all) of them to save to files
  • Night mode for comfortable viewing (alpha)
  • Response information display (start time, end time, length)
  • Ability to stop ongoing requests
  • Built-in debugging
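
The README does not spell out how detection works internally, but a minimal sketch of the idea, assuming responses contain Markdown-style fenced code blocks and that a filename hint may appear in a leading comment, could look like this (extract_code_blocks and guess_filename are hypothetical names, not the app's actual API):

    import re

    # Matches Markdown-style fenced code blocks: ```lang ... ```
    FENCE_RE = re.compile(r"```(\w+)?\n(.*?)```", re.DOTALL)

    # Hypothetical heuristic: a filename in a leading comment,
    # e.g. "# main.py" or "// app.js".
    FILENAME_RE = re.compile(r"^(?:#|//|<!--)\s*([\w./-]+\.\w+)", re.MULTILINE)

    def extract_code_blocks(response_text):
        """Return (language, code) pairs found in an LLM response."""
        return [(lang or "text", code) for lang, code in FENCE_RE.findall(response_text)]

    def guess_filename(code, index, lang):
        """Best-effort filename: use a comment hint, else a numbered fallback."""
        match = FILENAME_RE.search(code)
        if match:
            return match.group(1)
        ext = {"python": "py", "javascript": "js", "html": "html"}.get(lang, "txt")
        return f"code_block_{index}.{ext}"

In the application each detected block gets its own save button; saving then amounts to writing the block's code to the chosen filename.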

Tested LLMs include

  • deepseek-coder-v2:latest
  • yabi/codestral_22b_v0_1:q2_k
  • shuchi215/python_code_13b:latest

Requirements

  • Python 3.12+
  • Ollama installed and running locally (a quick availability check is sketched below)
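
The README assumes Ollama is already running. A quick way to confirm that, and to see which models it can serve, is Ollama's standard /api/tags endpoint (standard library only; the URL assumes Ollama's default port 11434):

    import json
    import urllib.request

    # Ollama's default local endpoint; /api/tags lists the downloaded models.
    OLLAMA_URL = "http://localhost:11434/api/tags"

    try:
        with urllib.request.urlopen(OLLAMA_URL, timeout=5) as resp:
            models = json.load(resp).get("models", [])
        print("Ollama is running. Available models:")
        for model in models:
            print(" -", model["name"])
    except OSError as exc:
        print(f"Ollama does not appear to be running ({exc}). Start it with: ollama serve")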

Installation

  1. Clone this repository:
    • git clone https://github.com/non-npc/OllamaLLMPythonInterface.git
  2. Install the required packages:
    • pip install -r requirements.txt
  3. Ensure Ollama is installed and running on your local machine (ollama serve) and that you have models downloaded and accessible to Ollama.

Usage

Run the application: python main.py

  1. Select a model from the dropdown menu.
  2. Type your message in the input box.
  3. Click "Send" or press Enter to send your message.
  4. The AI's response will stream into the chat display (a sketch of the underlying streaming call follows this list).
  5. Code blocks in the response can be saved individually.
  6. Use the "Stop" button to interrupt long responses.
  7. Toggle night mode with the moon/sun button in the top right.
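
This README does not document the app's internals, so the following is an illustration only: real-time streaming from Ollama is typically done against its /api/chat endpoint, which returns one JSON object per line until "done" is true. A minimal sketch (the model name and prompt are placeholders):

    import json
    import urllib.request

    # Ollama's chat endpoint streams newline-delimited JSON chunks.
    request = urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=json.dumps({
            "model": "deepseek-coder-v2:latest",  # any model from the dropdown
            "messages": [{"role": "user", "content": "Write a Python hello world."}],
            "stream": True,
        }).encode(),
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(request) as resp:
        for line in resp:
            chunk = json.loads(line)
            if chunk.get("done"):
                break
            # Each chunk carries a small piece of the assistant's reply.
            print(chunk["message"]["content"], end="", flush=True)
    print()

Stopping an ongoing request, as the "Stop" button does, then amounts to closing this connection mid-stream.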

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the CC0 1.0 Universal license - see the LICENSE file for details.

Acknowledgments

  • Ollama for providing the local LLM backend.
  • The Python community for the excellent libraries used in this project.
