I'm interested in running two servers within a single Python script running in a Docker container. The initial dev setup I have uses pynetdicom (see mpps-scp as an example). The main thing of interest there is the line that starts the server in my file:
ae.start_server(('python_mpps', 11112), block=True, evt_handlers=handlers)
There is more detailed documentation about that here: start_server
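For context, a trimmed-down sketch of what the DICOM side looks like, modeled on the mpps-scp example (the handler below is just a placeholder, and the names are those used in the current pynetdicom docs):

from pynetdicom import AE, evt
from pynetdicom.sop_class import ModalityPerformedProcedureStep

def handle_n_create(event):
    # placeholder: accept the N-CREATE and echo back the attribute list
    return 0x0000, event.attribute_list

handlers = [(evt.EVT_N_CREATE, handle_n_create)]

ae = AE()
ae.add_supported_context(ModalityPerformedProcedureStep)
ae.start_server(('python_mpps', 11112), block=True, evt_handlers=handlers)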
pynetdicom implements a DICOM server. As far as I can tell, it doesn't have the ability, nor was it designed, to respond to API requests over HTTP.
I do have the requests module in my container for sending requests to other servers, but I'd also like to add an HTTP listener, something like Flask, so that I can receive and process API requests, all within the same Python script that is executed when the Docker container is built and run.
I guess the basic stub for Flask is something like the code below, although I've never used Flask before. I was able to verify that this basic setup works when it is launched without pynetdicom, just by using curl against http://ip:5000/companies, with port 5000 mapped to port 5000 on my host.
companies = [{"id": 1, "name": "Company One"}, {"id": 2, "name": "Company Two"}]
api = Flask()
@api.route('/companies', methods=['GET'])
def get_companies():
return json.dumps(companies)
return api
api.run(host='0.0.0.0',port=5000,debug=True)
I'd really like to run both servers within the same script because they would share a number of methods that are common to processing both DICOM and API requests. I could probably create separate scripts for each server, but then I'd have to maintain the methods in both, or somehow load a common library of methods into each server script.
Just wondering what the best setup and approach could be. I don't know anything about threading in Python for this kind of application.
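To make the question concrete, this is roughly what I imagine the combined script could look like, with Flask running in a background thread and the DICOM server blocking in the main thread. I haven't tested this, and I don't know whether it's the right approach (the handler and route are the simplified placeholders from above):

import json
import threading

from flask import Flask
from pynetdicom import AE, evt
from pynetdicom.sop_class import ModalityPerformedProcedureStep

companies = [{"id": 1, "name": "Company One"}, {"id": 2, "name": "Company Two"}]

api = Flask(__name__)

@api.route('/companies', methods=['GET'])
def get_companies():
    return json.dumps(companies)

def run_api():
    # Flask's debug mode/reloader only works in the main thread, so keep them off here
    api.run(host='0.0.0.0', port=5000, debug=False, use_reloader=False)

def handle_n_create(event):
    # placeholder: the shared processing methods would be called from here
    return 0x0000, event.attribute_list

handlers = [(evt.EVT_N_CREATE, handle_n_create)]

ae = AE()
ae.add_supported_context(ModalityPerformedProcedureStep)

# HTTP listener runs in a daemon thread; the DICOM server blocks in the main thread
threading.Thread(target=run_api, daemon=True).start()
ae.start_server(('python_mpps', 11112), block=True, evt_handlers=handlers)

In particular, I'm not sure whether running Flask's built-in development server in a thread like this is acceptable.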
Thank you.