I'm trying to turn this https://haystack.deepset.ai/docs/latest/tutorial5md tutorial into a Dockerized Django app. The problem is that the code works when I run it locally, but the Dockerized version gives me a "connection refused" error. My guess is that the two Docker containers can't reach each other.
This is my docker-compose.yaml file:
version: '3.7'
services:
  es:
    image: elasticsearch:7.8.1
    environment:
      - xpack.security.enabled=false
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m -Dlog4j2.disable.jmx=true"
      - discovery.type=single-node
      - VIRTUAL_HOST=localhost
    ports:
      - "9200:9200"
    networks:
      - test-network
    container_name: es
    healthcheck:
      test: ["CMD", "curl", "-s", "-f", "http://localhost:9200"]
      retries: 6
  web:
    build: .
    command: bash -c "sleep 1m && python manage.py migrate && python manage.py makemigrations && python manage.py runserver 0.0.0.0:8000"
    volumes:
      - .:/app
    networks:
      - test-network
    ports:
      - "8000:8000"
    depends_on:
      - es
    healthcheck:
      test: ["CMD", "curl", "-s", "-f", "http://localhost:9200"]
      retries: 6
networks:
  test-network:
    driver: bridge
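As an aside, the sleep 1m in the web command is a fragile way to wait for Elasticsearch, and the web service's healthcheck probes http://localhost:9200, which from inside the web container points at the web container itself rather than at es. Since es already defines a healthcheck, startup could instead be gated on it; this is only a sketch, and it assumes a Docker Compose version that supports the long depends_on form (the Compose Specification; the classic 3.x schema does not):

web:
  build: .
  command: bash -c "python manage.py migrate && python manage.py makemigrations && python manage.py runserver 0.0.0.0:8000"
  depends_on:
    es:
      condition: service_healthy   # wait until the es healthcheck passes instead of sleeping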
And this is my apps.py:
from django.apps import AppConfig
import logging
# from haystack.reader.transformers import TransformersReader
from haystack.reader.farm import FARMReader
from haystack.preprocessor.utils import convert_files_to_dicts, fetch_archive_from_http
from haystack.preprocessor.cleaning import clean_wiki_text
from django.core.cache import cache
import pickle
from haystack.document_store.elasticsearch import ElasticsearchDocumentStore
from haystack.retriever.sparse import ElasticsearchRetriever


class SquadmodelConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'squadModel'

    def ready(self):
        document_store = ElasticsearchDocumentStore(host="elasticsearch", username="", password="", index="document")
        doc_dir = "data/article_txt_got"
        s3_url = "https://s3.eu-central-1.amazonaws.com/deepset.ai-farm-qa/datasets/documents/wiki_gameofthrones_txt.zip"
        fetch_archive_from_http(url=s3_url, output_dir=doc_dir)
        dicts = convert_files_to_dicts(dir_path=doc_dir, clean_func=clean_wiki_text, split_paragraphs=True)
        document_store.write_documents(dicts)

        reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2", use_gpu=True)
        document_store = ElasticsearchDocumentStore(host="localhost", username="", password="", index="document")
        retriever = ElasticsearchRetriever(document_store=document_store)

        self.reader = reader
        self.retriever = retriever
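One way to keep the same apps.py working both locally and inside Compose is to read the Elasticsearch host from an environment variable. A minimal sketch; the ELASTICSEARCH_HOST variable name is my own choice, not something from the tutorial:

import os

from haystack.document_store.elasticsearch import ElasticsearchDocumentStore

# Defaults to "localhost" for local runs; set ELASTICSEARCH_HOST=es in the
# web service's environment so the container resolves the es service by name.
es_host = os.environ.get("ELASTICSEARCH_HOST", "localhost")
document_store = ElasticsearchDocumentStore(host=es_host, username="", password="", index="document")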
And this is my views.py:
from django.apps import apps as allApps
from rest_framework.decorators import api_view
from rest_framework.response import Response
from haystack.pipeline import ExtractiveQAPipeline

theApp = allApps.get_app_config('squadModel')
reader = theApp.reader
retreiver = theApp.retriever


@api_view(['POST'])
def respondQuestion(request):
    question = request.data["question"]
    pipe = ExtractiveQAPipeline(reader, retreiver)
    prediction = pipe.run(query=question, top_k_retriever=10, top_k_reader=5)
    content = {"prediction": prediction}
    return Response(content)
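For reference, once the server is up the endpoint can be exercised like this; the /ask/ path is hypothetical, since the urls.py routing isn't shown:

import requests

# Hypothetical route; adjust to whatever urls.py maps respondQuestion to.
resp = requests.post(
    "http://localhost:8000/ask/",
    json={"question": "Who is the father of Arya Stark?"},
)
print(resp.json()["prediction"])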
Again, this Django API works perfectly locally with an Elasticsearch Docker image, but with this configuration I can't get it to work. Any help?
As suggested by @leandrojmp, I just needed to replace "localhost" with "es" in apps.py.
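Containers on the same Compose network resolve one another by service name, so "localhost" inside the web container points at the web container itself and never reaches Elasticsearch. The corrected line, wherever apps.py instantiates the document store:

document_store = ElasticsearchDocumentStore(host="es", username="", password="", index="document")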