Context: I've built the image and pushed it to my repo so I could run the container from Dockge.
I ran the container bound to 0.0.0.0 on the host and it worked, meaning the container is running and accessible from my host, or really from my whole network. Ollama is configured in a similar manner.
However, when I go to the "Prompt Settings" selection window, it is not listed there. I did copy the .env file and left it at the defaults, since a localhost connection should be possible.
I then ran the compose file with it using my host's network, so a local connection to Ollama should be possible, or am I incorrect about that?

The YAML file configuration:

version: "3.8"
services:
  ollama-deep-researcher:
    image: mikewho90/ollama-deep-researcher:v1
    network_mode: host
    environment:
      SEARCH_API: tavily
      TAVILY_API_KEY: tvly- "obfuscated"
      OLLAMA_BASE_URL: http://127.0.0.1:11434/
      OLLAMA_MODEL: llama3.2
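As a sanity check, something like the following should list the local Ollama models if the container can actually reach Ollama over localhost (the container name is a guess from the compose service, and wget being present in the image is an assumption):

# Adjust the container name to whatever `docker ps` reports.
docker exec -it ollama-deep-researcher sh -c 'wget -qO- http://127.0.0.1:11434/api/tags'
# With network_mode: host this hits the host's Ollama; without it,
# 127.0.0.1 is the container itself and the request will fail.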
Not relevant to the issue, but worth looking into? Another option would be to rebuild the Dockerfile: edit it to hard-code my IP into the .env, or add a RUN command to install Ollama and JupyterLab into the container (a preference of mine) and run ollama serve from a terminal. I am aware that I could docker exec -it into the container, but I like the JupyterLab UI. The other option is to use the sh terminal from Dockge, but that UI leaves a bit to be desired.
@Mikewhodat if you run it as a Docker container, then 127.0.0.1 is resolved inside the Docker container and refers to the container itself. I'm not sure whether using network_mode: host will help you; I never use that option.
The way I run it, with Ollama installed directly on my Mac, is by using either the IP address of my machine or host.docker.internal.
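A minimal sketch of that approach in compose form, reusing the configuration above and assuming Ollama on the host is listening on an address the Docker bridge can reach; the extra_hosts entry is what makes host.docker.internal resolve on plain Linux Docker (Docker Desktop provides the name automatically):

version: "3.8"
services:
  ollama-deep-researcher:
    image: mikewho90/ollama-deep-researcher:v1
    extra_hosts:
      - "host.docker.internal:host-gateway"
    environment:
      SEARCH_API: tavily
      TAVILY_API_KEY: tvly- "obfuscated"
      OLLAMA_BASE_URL: http://host.docker.internal:11434/
      OLLAMA_MODEL: llama3.2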