fix: add OLLAMA_BASE_URL support to backend config #605
shamil2 wants to merge 1 commit into MODSetter:main from
Conversation
Someone is attempting to deploy a commit to the Rohan Verma's projects Team on Vercel. A member of the Team first needs to authorize it.
Review by RecurseML
🔍 Review performed on 48ea41a..3772901
✨ No bugs found, your code is sparkling clean
✅ Files analyzed, no issues (2)
• surfsense_backend/.env.example
• surfsense_backend/app/config/__init__.py
@AnishSarkar22 Can you test whether this fixes the original issue? IMO we should be storing this in the DB and passing the variable at runtime, as sketched below. Can you try that? @shamil2 Can you raise this PR against the 'dev' branch?
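A minimal sketch of what "store in the DB and pass at runtime" could look like, assuming a SQLAlchemy-style model; the table, column, and helper names here are hypothetical and not part of this PR:

```python
# Hypothetical sketch (not from this PR): persist the Ollama base URL in a
# config table and resolve it at request time instead of from the environment.
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class EmbeddingConfig(Base):
    """Hypothetical table holding runtime embedding settings."""
    __tablename__ = "embedding_configs"

    id = Column(Integer, primary_key=True)
    provider = Column(String, default="ollama")
    base_url = Column(String, nullable=True)  # e.g. an external Ollama endpoint


def resolve_ollama_base_url(session, fallback: str = "http://localhost:11434") -> str:
    """Prefer the DB-stored value at runtime, falling back to a default."""
    row = session.query(EmbeddingConfig).filter_by(provider="ollama").first()
    return row.base_url if row and row.base_url else fallback
```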
Sure, I will test it and confirm.
@MODSetter Tested this locally, but the solution by @shamil2 does not fix the original issue. Chonkie doesn't have native Ollama support, so it crashes with […]. Yes, we need a DB-based solution similar to how LLM configs work. Should I implement an Embedding Configuration UI/table that stores the […]
Fixes #587. Adds support for the OLLAMA_BASE_URL environment variable to allow connecting to external Ollama instances for embeddings, which fixes connection-refused errors during document and YouTube video uploads in Docker.
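For reference, a minimal sketch of how the backend config could pick up the variable with a sensible default; the actual diff is not shown here, so the exact names and default are assumptions:

```python
# surfsense_backend/app/config/__init__.py (sketch, assuming this is roughly
# what the PR adds; the real change may differ)
import os

# Allow pointing embeddings at an external Ollama instance, e.g. from Docker:
#   OLLAMA_BASE_URL=http://host.docker.internal:11434
OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
```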
High-level PR Summary
This PR adds support for the OLLAMA_BASE_URL environment variable to enable connections to external Ollama instances for embeddings. This resolves connection-refused errors that occur during document and YouTube video uploads when running in Docker environments by allowing users to specify a custom Ollama endpoint (e.g., http://host.docker.internal:11434).

⏱️ Estimated Review Time: 5-15 minutes
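To illustrate why the configurable endpoint matters inside Docker, here is a hedged usage sketch that calls Ollama's embeddings API at the configured base URL; the model name, helper function, and config wiring are assumptions for illustration, not code from this PR:

```python
# Illustrative only: request an embedding from an Ollama server reachable at
# OLLAMA_BASE_URL. Model name and helper are assumptions, not from this PR.
import os

import requests

OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")


def embed(text: str, model: str = "nomic-embed-text") -> list[float]:
    """Call Ollama's /api/embeddings endpoint and return the embedding vector."""
    resp = requests.post(
        f"{OLLAMA_BASE_URL}/api/embeddings",
        json={"model": model, "prompt": text},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["embedding"]

# Inside Docker, setting OLLAMA_BASE_URL=http://host.docker.internal:11434 lets
# the backend container reach an Ollama instance running on the host machine.
```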
💡 Review Order Suggestion
1. surfsense_backend/.env.example
2. surfsense_backend/app/config/__init__.py