Docker container not working #33
@cgmartin can you please have a look?
Just a quick note: http://192.168.2.162:2024/docs works. So it looks like the API does not have a route defined at the root URL.
I also tried the URL https://smith.langchain.com/studio/?baseUrl=http://192.168.2.162:2024, but that did not work either.
@dqbd may also have some ideas.
I logged in to LangChain and I am still seeing the same error. I see the same even at https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024. The container logs don't have the 404 anymore:

Building ollama-deep-researcher @ file:///app
Downloading numpy (13.6MiB)
Downloading jsonschema-rs (2.0MiB)
Downloading brotli (2.8MiB)
Downloading langchain-community (2.4MiB)
Downloading lxml (4.6MiB)
Downloading aiohttp (1.6MiB)
Downloading sqlalchemy (3.1MiB)
Downloading zstandard (4.7MiB)
Downloading pydantic-core (1.7MiB)
Downloading tiktoken (1.1MiB)
Downloading cryptography (3.6MiB)
Downloaded tiktoken
Downloaded aiohttp
Downloaded pydantic-core
Downloaded jsonschema-rs
Downloaded brotli
Downloaded sqlalchemy
Downloaded langchain-community
Downloaded cryptography
Downloaded lxml
Downloaded zstandard
Downloaded numpy
Built ollama-deep-researcher @ file:///app
Installed 77 packages in 46ms
INFO:langgraph_api.cli:
Welcome to
╦ ┌─┐┌┐┌┌─┐╔═╗┬─┐┌─┐┌─┐┬ ┬
║ ├─┤││││ ┬║ ╦├┬┘├─┤├─┘├─┤
╩═╝┴ ┴┘└┘└─┘╚═╝┴└─┴ ┴┴ ┴ ┴
- 🚀 API: http://0.0.0.0:2024
- 🎨 Studio UI: https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024
- 📚 API Docs: http://0.0.0.0:2024/docs
This in-memory server is designed for development and testing.
For production use, please use LangGraph Cloud.
2025-02-20T23:26:27.258226Z [info ] Will watch for changes in these directories: ['/app'] [uvicorn.error] api_variant=local_dev
2025-02-20T23:26:27.258348Z [info ] Uvicorn running on http://0.0.0.0:2024 (Press CTRL+C to quit) [uvicorn.error] api_variant=local_dev color_message=Uvicorn running on %s://%s:%d (Press CTRL+C to quit)
2025-02-20T23:26:27.258466Z [info ] Started reloader process [104] using WatchFiles [uvicorn.error] api_variant=local_dev color_message=Started reloader process [104] using WatchFiles
2025-02-20T23:26:27.648089Z [info ] Using auth of type=noop [langgraph_api.auth.middleware] api_variant=local_dev
2025-02-20T23:26:27.649789Z [info ] Started server process [111] [uvicorn.error] api_variant=local_dev color_message=Started server process [%d]
2025-02-20T23:26:27.649847Z [info ] Waiting for application startup. [uvicorn.error] api_variant=local_dev
2025-02-20T23:26:27.683168Z [info ] 1 change detected [watchfiles.main] api_variant=local_dev
2025-02-20T23:26:27.902265Z [info ] Registering graph with id 'ollama_deep_researcher' [langgraph_api.graph] api_variant=local_dev graph_id=ollama_deep_researcher
2025-02-20T23:26:27.902570Z [info ] Application startup complete. [uvicorn.error] api_variant=local_dev
2025-02-20T23:26:27.902646Z [info ] Starting 1 background workers [langgraph_api.queue] api_variant=local_dev
2025-02-20T23:26:27.902846Z [info ] Worker stats [langgraph_api.queue] active=0 api_variant=local_dev available=1 max=1
Server started in 1.09s
2025-02-20T23:26:28.012980Z [info ] Server started in 1.09s [browser_opener] api_variant=local_dev message=Server started in 1.09s
🎨 Opening Studio in your browser...
2025-02-20T23:26:28.013223Z [info ] 🎨 Opening Studio in your browser... [browser_opener] api_variant=local_dev message=🎨 Opening Studio in your browser...
2025-02-20T23:26:28.013158Z [info ] GET /ok 200 0ms [langgraph_api.server] api_variant=local_dev latency_ms=0 method=GET path=/ok path_params= proto=1.1 query_string= req_header={'accept-encoding': 'identity', 'host': '0.0.0.0:2024', 'user-agent': 'Python-urllib/3.11', 'connection': 'close'} res_header={'content-length': '11', 'content-type': 'application/json'} route=None status=200
URL: https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024
2025-02-20T23:26:28.013305Z [info ] URL: https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024 [browser_opener] api_variant=local_dev message=URL: https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024
2025-02-20T23:26:28.037795Z [info ] 18 changes detected [watchfiles.main] api_variant=local_dev
2025-02-20T23:26:28.408255Z [info ] Queue stats [langgraph_api.queue] api_variant=local_dev max_age_secs=None med_age_secs=None n_pending=0
2025-02-20T23:26:28.409342Z [info ] Sweeped runs [langgraph_api.queue] api_variant=local_dev run_ids=[]
2025-02-20T23:27:28.065627Z [info ] Worker stats [langgraph_api.queue] active=0 api_variant=local_dev available=1 max=1
2025-02-20T23:27:28.578989Z [info ] Queue stats [langgraph_api.queue] api_variant=local_dev max_age_secs=None med_age_secs=None n_pending=0
Hi. I've pulled the main branch and it is still working for me. For the smith.langchain.com Studio page: if you open the browser dev tools and check the network requests, you should see some fetch requests to the baseUrl you provide, which may help with triaging. See the attached Chrome dev tools capture: in my case, it is using baseUrl=http://127.0.0.1:2024 and it accesses the /info endpoint there.
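For comparison, roughly the same request that the Studio page issues can be reproduced from a terminal on the machine running the browser (a quick sketch; substitute whatever host you pass as baseUrl):

    curl -i http://127.0.0.1:2024/info

A 200 response here only confirms the API is reachable from the terminal; whether the browser is then allowed to read the response is a separate question.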
Hope this info helps you narrow down the issue.
Just to give you some idea about the setup: I have Ollama running on my Mac, and this container is on the same machine. Interestingly, the URLs that I am seeing are different from yours. I don't see any requests to the same endpoints that you do.
Notice that even http://192.168.2.162:2024/info is blocked, even though it works when I navigate to the URL.
This sounds like a CORS policy issue to me: you are able to access the URL directly, but it is blocked when fetched from the LangSmith Studio context.
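One way to verify this outside the browser (a minimal check, assuming curl is available and the container is still reachable at that address) is to repeat the request with an Origin header, mimicking the Studio page, and inspect the response headers:

    curl -i http://192.168.2.162:2024/info -H "Origin: https://smith.langchain.com"

If the response comes back 200 but without an Access-Control-Allow-Origin header that covers the Studio origin, the browser will refuse to hand the result to the page, which is exactly the behavior described above.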
@cgmartin could be. Here's what I see in the dev tools; not sure what to do about this.
Just want to make clear to the project authors that this issue doesn't seem specific to Docker; it applies in general to special networking setups (running anywhere other than localhost). They may have better ideas about how to configure CORS for this service. cc @rlancemartin @arsaboo

I should also mention, as a workaround: if there is a way you can configure your docker-compose so that the container can be accessed via localhost / 127.0.0.1, it will avoid CORS policy issues like these.
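For example, a minimal sketch of that workaround in compose form (service and image names here are placeholders, not the reporter's actual file) would publish the port so the browser reaches the API as http://127.0.0.1:2024 on the same machine:

    services:
      deep-researcher:                  # placeholder service name
        image: ollama-deep-researcher   # placeholder image name/tag
        ports:
          - "127.0.0.1:2024:2024"       # publish on loopback; open Studio with baseUrl=http://127.0.0.1:2024

This only helps when the browser runs on the same machine as the container, which matches the Mac setup described above.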
I tried on the Mac with 127.0.0.1 and localhost, but got the same error.
Got the same thing, running a local Ubuntu server with the Docker image deployed there. It spins up fine and I can get /docs and /info, but nothing on the base IP or DNS name, or via the LangSmith link.
I got the same problem. The server can serve /docs. Inside the Docker container, when I ran "curl http://host.docker.internal:11434" or "curl http://hostIP:11434", I got "Ollama is running".
I followed the instructions, built the image, and then used the following docker-compose:
However, I see
Not Found
on 192.168.2.162:2024. Here are the logs: