
Docker container not working #33

Open
arsaboo opened this issue Feb 20, 2025 · 15 comments

Comments

@arsaboo

arsaboo commented Feb 20, 2025

I followed the instructions, built the image, and then used the following docker-compose:

services:
  deep-researcher:
    image: ollama-deep-researcher
    restart: unless-stopped
    environment:
      - TAVILY_API_KEY=REDACTED
      - OLLAMA_BASE_URL=http://host.docker.internal:11434/
    ports:
      - "2024:2024"
    extra_hosts:
      - "host.docker.internal:host-gateway"

However, I see "Not Found" at http://192.168.2.162:2024. Here are the logs:

╦  ┌─┐┌┐┌┌─┐╔═╗┬─┐┌─┐┌─┐┬ ┬

║  ├─┤││││ ┬║ ╦├┬┘├─┤├─┘├─┤

╩═╝┴ ┴┘└┘└─┘╚═╝┴└─┴ ┴┴  ┴ ┴

- 🚀 API: http://0.0.0.0:2024

- 🎨 Studio UI: https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024

- 📚 API Docs: http://0.0.0.0:2024/docs

This in-memory server is designed for development and testing.

For production use, please use LangGraph Cloud.

2025-02-20T22:38:12.521300Z [info     ] Will watch for changes in these directories: ['/app'] [uvicorn.error] api_variant=local_dev

2025-02-20T22:38:12.521430Z [info     ] Uvicorn running on http://0.0.0.0:2024 (Press CTRL+C to quit) [uvicorn.error] api_variant=local_dev color_message=Uvicorn running on %s://%s:%d (Press CTRL+C to quit)

2025-02-20T22:38:12.521547Z [info     ] Started reloader process [100] using WatchFiles [uvicorn.error] api_variant=local_dev color_message=Started reloader process [100] using WatchFiles

2025-02-20T22:38:12.909466Z [info     ] Using auth of type=noop        [langgraph_api.auth.middleware] api_variant=local_dev

2025-02-20T22:38:12.911176Z [info     ] Started server process [107]   [uvicorn.error] api_variant=local_dev color_message=Started server process [%d]

2025-02-20T22:38:12.911283Z [info     ] Waiting for application startup. [uvicorn.error] api_variant=local_dev

2025-02-20T22:38:12.955000Z [info     ] 1 change detected              [watchfiles.main] api_variant=local_dev

2025-02-20T22:38:13.165370Z [info     ] Registering graph with id 'ollama_deep_researcher' [langgraph_api.graph] api_variant=local_dev graph_id=ollama_deep_researcher

2025-02-20T22:38:13.165690Z [info     ] Application startup complete.  [uvicorn.error] api_variant=local_dev

2025-02-20T22:38:13.165850Z [info     ] Starting 1 background workers  [langgraph_api.queue] api_variant=local_dev

2025-02-20T22:38:13.165998Z [info     ] Worker stats                   [langgraph_api.queue] active=0 api_variant=local_dev available=1 max=1

Server started in 1.09s

🎨 Opening Studio in your browser...

2025-02-20T22:38:13.275422Z [info     ] Server started in 1.09s        [browser_opener] api_variant=local_dev message=Server started in 1.09s

2025-02-20T22:38:13.275501Z [info     ] 🎨 Opening Studio in your browser... [browser_opener] api_variant=local_dev message=🎨 Opening Studio in your browser...

2025-02-20T22:38:13.275569Z [info     ] URL: https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024 [browser_opener] api_variant=local_dev message=URL: https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024

URL: https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024

2025-02-20T22:38:13.275758Z [info     ] GET /ok 200 0ms                [langgraph_api.server] api_variant=local_dev latency_ms=0 method=GET path=/ok path_params= proto=1.1 query_string= req_header={'accept-encoding': 'identity', 'host': '0.0.0.0:2024', 'user-agent': 'Python-urllib/3.11', 'connection': 'close'} res_header={'content-length': '11', 'content-type': 'application/json'} route=None status=200

2025-02-20T22:38:13.330921Z [info     ] 18 changes detected            [watchfiles.main] api_variant=local_dev

2025-02-20T22:38:13.670704Z [info     ] Queue stats                    [langgraph_api.queue] api_variant=local_dev max_age_secs=None med_age_secs=None n_pending=0

2025-02-20T22:38:13.671652Z [info     ] Sweeped runs                   [langgraph_api.queue] api_variant=local_dev run_ids=[]

2025-02-20T22:38:21.527219Z [info     ] GET / 404 2ms                  [langgraph_api.server] api_variant=local_dev latency_ms=2 method=GET path=/ path_params= proto=1.1 query_string= req_header={'host': '192.168.2.162:2024', 'connection': 'keep-alive', 'dnt': '1', 'upgrade-insecure-requests': '1', 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/133.0.0.0 Safari/537.36', 'accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7', 'accept-encoding': 'gzip, deflate', 'accept-language': 'en-US,en;q=0.9,zh-CN;q=0.8,zh;q=0.7', 'cookie': 'categories=general; language=auto; locale=en; autocomplete=google; favicon_resolver=; image_proxy=0; method=POST; safesearch=0; theme=simple; results_on_new_tab=0; doi_resolver=oadoi.org; simple_style=auto; center_alignment=0; advanced_search=0; query_in_title=0; infinite_scroll=0; search_on_category_select=1; hotkeys=default; url_formatting=pretty; disabled_plugins=; enabled_plugins=; tokens=; LOBE_LOCALE=en-US; LOBE_THEME_PRIMARY_COLOR=undefined; LOBE_THEME_NEUTRAL_COLOR=undefined; enabled_engines="bing__general\\054deezer__music\\054presearch__general\\054presearch videos__general\\054yahoo__general"; token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpZCI6IjljN2Y4MjJhLTk4YmEtNGZiNi05ZDg0LTBjOGQ4NzU4NmJhNyJ9.gezOAyil6M5mLNFy-h9MAhtwAUvbRA--DeCXcbjw9V0; disabled_engines=duckduckgo__general; selected-model=ollama:qwen2.5:latest; search-mode=true; NEXT_LOCALE=en; NEXT_DEVICE_SIZE=pc; fastgpt_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VySWQiOiI2N2I2MjYzMjRmMjNhZTc0YjRkOGJjNjEiLCJ0ZWFtSWQiOiI2N2I2MjYzMjRmMjNhZTc0YjRkOGJjNjkiLCJ0bWJJZCI6IjY3YjYyNjMyNGYyM2FlNzRiNGQ4YmM2ZCIsImlzUm9vdCI6dHJ1ZSwiZXhwIjoxNzQwNTk1ODc4LCJpYXQiOjE3Mzk5OTEwNzh9.aYfepy8IPhvH5quyEQJxCX3KyB8Hy4vRM6fBXQmInUc; ph_TEST_posthog=%7B%22distinct_id%22%3A%2201951617-d28f-765d-955b-d3ce71695292%22%2C%22%24sesid%22%3A%5B1740017079186%2C%2201952119-9392-7227-8d6c-6770877530de%22%2C1740017079186%5D%2C%22%24epp%22%3Atrue%2C%22%24initial_person_info%22%3A%7B%22r%22%3A%22%24direct%22%2C%22u%22%3A%22http%3A%2F%2F192.168.2.162%3A3004%2F%22%7D%7D'} res_header={'content-length': '9', 'content-type': 'text/plain; charset=utf-8'} route=None status=404
@rlancemartin
Collaborator

@cgmartin can you please have a look?

@arsaboo
Author

arsaboo commented Feb 20, 2025

Just a quick note: http://192.168.2.162:2024/docs works. So it looks like the API does not have a route defined at /.

@arsaboo
Author

arsaboo commented Feb 20, 2025

I also tried the URL https://smith.langchain.com/studio/?baseUrl=http://192.168.2.162:2024

but that gave

Image

@rlancemartin
Collaborator

@dqbd may also have some ideas.

@arsaboo
Author

arsaboo commented Feb 20, 2025

I logged in to LangSmith and I am still seeing:

Image

I see the same even at https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024

The container logs don't have the 404 anymore.

   Building ollama-deep-researcher @ file:///app

Downloading numpy (13.6MiB)

Downloading jsonschema-rs (2.0MiB)

Downloading brotli (2.8MiB)

Downloading langchain-community (2.4MiB)

Downloading lxml (4.6MiB)

Downloading aiohttp (1.6MiB)

Downloading sqlalchemy (3.1MiB)

Downloading zstandard (4.7MiB)

Downloading pydantic-core (1.7MiB)

Downloading tiktoken (1.1MiB)

Downloading cryptography (3.6MiB)

 Downloaded tiktoken

 Downloaded aiohttp

 Downloaded pydantic-core

 Downloaded jsonschema-rs

 Downloaded brotli

 Downloaded sqlalchemy

 Downloaded langchain-community

 Downloaded cryptography

 Downloaded lxml

 Downloaded zstandard

 Downloaded numpy

      Built ollama-deep-researcher @ file:///app

Installed 77 packages in 46ms

INFO:langgraph_api.cli:

        Welcome to

╦  ┌─┐┌┐┌┌─┐╔═╗┬─┐┌─┐┌─┐┬ ┬

║  ├─┤││││ ┬║ ╦├┬┘├─┤├─┘├─┤

╩═╝┴ ┴┘└┘└─┘╚═╝┴└─┴ ┴┴  ┴ ┴

- 🚀 API: http://0.0.0.0:2024

- 🎨 Studio UI: https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024

- 📚 API Docs: http://0.0.0.0:2024/docs

This in-memory server is designed for development and testing.

For production use, please use LangGraph Cloud.

2025-02-20T23:26:27.258226Z [info     ] Will watch for changes in these directories: ['/app'] [uvicorn.error] api_variant=local_dev

2025-02-20T23:26:27.258348Z [info     ] Uvicorn running on http://0.0.0.0:2024 (Press CTRL+C to quit) [uvicorn.error] api_variant=local_dev color_message=Uvicorn running on %s://%s:%d (Press CTRL+C to quit)

2025-02-20T23:26:27.258466Z [info     ] Started reloader process [104] using WatchFiles [uvicorn.error] api_variant=local_dev color_message=Started reloader process [104] using WatchFiles

2025-02-20T23:26:27.648089Z [info     ] Using auth of type=noop        [langgraph_api.auth.middleware] api_variant=local_dev

2025-02-20T23:26:27.649789Z [info     ] Started server process [111]   [uvicorn.error] api_variant=local_dev color_message=Started server process [%d]

2025-02-20T23:26:27.649847Z [info     ] Waiting for application startup. [uvicorn.error] api_variant=local_dev

2025-02-20T23:26:27.683168Z [info     ] 1 change detected              [watchfiles.main] api_variant=local_dev

2025-02-20T23:26:27.902265Z [info     ] Registering graph with id 'ollama_deep_researcher' [langgraph_api.graph] api_variant=local_dev graph_id=ollama_deep_researcher

2025-02-20T23:26:27.902570Z [info     ] Application startup complete.  [uvicorn.error] api_variant=local_dev

2025-02-20T23:26:27.902646Z [info     ] Starting 1 background workers  [langgraph_api.queue] api_variant=local_dev

2025-02-20T23:26:27.902846Z [info     ] Worker stats                   [langgraph_api.queue] active=0 api_variant=local_dev available=1 max=1

Server started in 1.09s

2025-02-20T23:26:28.012980Z [info     ] Server started in 1.09s        [browser_opener] api_variant=local_dev message=Server started in 1.09s

🎨 Opening Studio in your browser...

2025-02-20T23:26:28.013223Z [info     ] 🎨 Opening Studio in your browser... [browser_opener] api_variant=local_dev message=🎨 Opening Studio in your browser...

2025-02-20T23:26:28.013158Z [info     ] GET /ok 200 0ms                [langgraph_api.server] api_variant=local_dev latency_ms=0 method=GET path=/ok path_params= proto=1.1 query_string= req_header={'accept-encoding': 'identity', 'host': '0.0.0.0:2024', 'user-agent': 'Python-urllib/3.11', 'connection': 'close'} res_header={'content-length': '11', 'content-type': 'application/json'} route=None status=200

URL: https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024

2025-02-20T23:26:28.013305Z [info     ] URL: https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024 [browser_opener] api_variant=local_dev message=URL: https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024

2025-02-20T23:26:28.037795Z [info     ] 18 changes detected            [watchfiles.main] api_variant=local_dev

2025-02-20T23:26:28.408255Z [info     ] Queue stats                    [langgraph_api.queue] api_variant=local_dev max_age_secs=None med_age_secs=None n_pending=0

2025-02-20T23:26:28.409342Z [info     ] Sweeped runs                   [langgraph_api.queue] api_variant=local_dev run_ids=[]

2025-02-20T23:27:28.065627Z [info     ] Worker stats                   [langgraph_api.queue] active=0 api_variant=local_dev available=1 max=1

2025-02-20T23:27:28.578989Z [info     ] Queue stats                    [langgraph_api.queue] api_variant=local_dev max_age_secs=None med_age_secs=None n_pending=0

@cgmartin
Contributor

cgmartin commented Feb 21, 2025

Hi. I've pulled the main branch and it is still working for me using the docker run example in the README. Your posted logs look fine (the server is running inside the container). If the /docs endpoint works, that's a good sign; you can also try the /info endpoint. But you're correct that / will return a 404: there is no route defined there.

For the smith.langchain.com baseUrl param, you need to supply the IP and port of the ollama-deep-researcher service from the perspective of where your browser is running. From your examples, https://smith.langchain.com/studio/?baseUrl=http://192.168.2.162:2024 looks like the correct URL to use, especially if the same browser can access http://192.168.2.162:2024/docs. But if you are getting the "Failed to load assistants" error, then something is preventing the LangSmith Studio context in your browser from reaching your ollama-deep-researcher service.
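The "browser's perspective" point can be sketched as a quick check run on the machine where the browser lives (the IP is the one from this thread and the helper function is illustrative, not part of the project):

```shell
# BASE_URL must be reachable from the browser host, not just from inside the container.
BASE_URL="http://192.168.2.162:2024"

# Build the Studio link for a given base URL.
studio_url() {
  printf 'https://smith.langchain.com/studio/?baseUrl=%s\n' "$1"
}

# Endpoints the dev server actually serves ('/' itself has no route and returns 404):
#   curl -s  "$BASE_URL/ok"     # health check
#   curl -s  "$BASE_URL/info"   # server info
#   curl -sI "$BASE_URL/docs"   # API docs page

studio_url "$BASE_URL"
```

If the curl checks succeed from the browser host but Studio still fails, the problem is in the browser context rather than basic reachability.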

If you open the browser dev tools and check the network requests, you should see fetch requests to the baseUrl you provided, which may help with triaging. See the attached Chrome dev tools screenshot: in my case it uses baseUrl=http://127.0.0.1:2024 and successfully accesses /info, /search, /graph?xray=true (and others) on 127.0.0.1.

Image

Hope this info helps you narrow down the issue.

@arsaboo
Author

arsaboo commented Feb 21, 2025

I updated OLLAMA_BASE_URL to http://192.168.2.162:11434/ in the docker-compose. Here are some of the fetch requests:

Image

Just to give you some idea of the setup: I have Ollama running on my Mac, and this container is on the same machine. Interestingly, the URLs I am seeing are different from yours, and I don't see any requests to the OLLAMA_BASE_URL.

@arsaboo
Author

arsaboo commented Feb 21, 2025

Notice that even http://192.168.2.162:2024/info is blocked, even though it works when I navigate to the URL directly.

@cgmartin
Contributor

This sounds like a CORS policy issue to me: you can access the URL directly, but the fetch API is blocked from the LangSmith Studio context.

@arsaboo
Author

arsaboo commented Feb 21, 2025

@cgmartin could be. Here's what I see:

Image

Not sure what to do about this.

@cgmartin
Contributor

cgmartin commented Feb 21, 2025

Just want to make clear to the project authors that this issue doesn't seem specific to Docker; it applies generally to special networking setups (running anywhere other than localhost). They may have better ideas about how to configure CORS for this service. cc @rlancemartin

@arsaboo I should also mention, as a workaround, that if you can configure your docker-compose to run the container so that it is accessed via localhost / 127.0.0.1, it will avoid CORS policy issues like these.
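A minimal sketch of that workaround, assuming the browser and the container run on the same machine: this is the compose file from earlier in the thread with only the ports line changed, so the port is published on the loopback interface and Studio can be opened with baseUrl=http://127.0.0.1:2024.

```yaml
services:
  deep-researcher:
    image: ollama-deep-researcher
    restart: unless-stopped
    environment:
      - TAVILY_API_KEY=REDACTED
      - OLLAMA_BASE_URL=http://host.docker.internal:11434/
    ports:
      - "127.0.0.1:2024:2024"   # bind to loopback so the browser reaches it as localhost
    extra_hosts:
      - "host.docker.internal:host-gateway"
```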

@arsaboo
Author

arsaboo commented Feb 21, 2025

I tried on the Mac with 127.0.0.1 and localhost, but got the same error.

@rlancemartin
Collaborator

Just want to make clear to the project authors that this issue doesn't seem specific to Docker, but in general with special networking setups (running elsewhere than localhost). They may have some better ideas about how to configure CORS for this service. cc @rlancemartin

Yes, I am going to bring this up with @dqbd and @lc-arjun.

@photonjunkie

Got the same: I'm running a local Ubuntu server and deployed the Docker image there. It spins up fine, and I can reach /docs and /info, but nothing on the base IP or DNS name, or via the LangSmith link.

@billcsm

billcsm commented Feb 28, 2025

I got the same problem. The server can reach /docs. Inside the Docker container, when I ran "curl http://host.docker.internal:11434" or "curl http://hostIP:11434", I got "Ollama is running".
