Current situation: I have a Rails app with N Delayed Job workers. Whenever I want to run an SSH command on some machine, I create a task for a worker. The worker performs something like:
Net::SSH.start(hostname, username, :password => pass) do |ssh|
  ssh.exec!(command)
end
Sometimes I create e.g. 50 such tasks to be performed one by one, or within 5-10 minutes of each other. Each task then opens a separate connection, which is inefficient and is sometimes blocked by the target servers due to too many connections.
What I want: to have open connections stored somewhere and reused by each worker, so that each worker obtains a connection somehow and then just runs
ssh.exec!(command)
What I have tried:
- storing connections in a file/database/cache appears impossible, as they are not serializable
- I have tried using a singleton class with global variables instantiated in an initializer. However, the class object is different for each worker (I later found that global variables are not an option either).
Is there a way to solve this? Any other ideas? Thanks!!
First some basics; one SSH connection is at its core a low-level socket connection to a remote machine. Sockets cannot be (easily) shared between processes. Therefore things that run on multiple processes cannot share the same SSH connection.
Next you need to know which parts of your current setup run in separate processes. What we get is:
Rails is in most cases process-based: separate web requests run on separate processes. So storing SSH connections in a Rails app is not a reliable solution.
Delayed Job is, as far as I know, also process-based. A master process launches slave processes to handle each job. Therefore DJ doesn't work for this purpose either.
What you need is a master process that stores the SSH session(s), and then waits for incoming messages that are commands which need to be executed on one of your remote SSH machines.
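The core of such a master process is just a thread-safe cache of live sessions, keyed by hostname. A minimal sketch, assuming a hypothetical `ConnectionCache` class where the factory block would wrap `Net::SSH.start` in a real setup (the block is injected here so the skeleton stays independent of the net-ssh gem):

```ruby
require "thread"

# Hypothetical sketch: a thread-safe cache of live connections, keyed by
# hostname. The factory block is only invoked when no connection for that
# host exists yet; in a real setup it would be something like
#   ->(host) { Net::SSH.start(host, user, :password => pass) }
class ConnectionCache
  def initialize(&factory)
    @factory = factory
    @connections = {}
    @mutex = Mutex.new
  end

  # Returns the cached connection for +host+, creating it on first use.
  def connection_for(host)
    @mutex.synchronize do
      @connections[host] ||= @factory.call(host)
    end
  end
end
```

Inside the master process, each incoming command would then boil down to `cache.connection_for(host).exec!(command)`, reusing one session per host.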
Personally I would just code a simple threaded Ruby daemon process myself that handles this task. You could use something like EventMachine to handle the communication and processing if you don't want to deal with socket programming directly.
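One way such a threaded daemon could look, as a rough sketch: it accepts local TCP connections and runs each received "host command" line through a handler. The handler is injected here as an assumption; in a real daemon it would look up the cached Net::SSH session for that host and call `ssh.exec!(command)`.

```ruby
require "socket"

# Hypothetical sketch of the daemon loop: listens on a local TCP port and
# executes each received "host command" line through the given handler.
# In a real setup the handler would resolve a cached SSH session and run
# the command on it; here it is injected to keep the skeleton generic.
def run_command_daemon(port, handler)
  server = TCPServer.new("127.0.0.1", port)
  loop do
    Thread.start(server.accept) do |client|
      while (line = client.gets)
        host, command = line.chomp.split(" ", 2)
        client.puts handler.call(host, command)
      end
      client.close
    end
  end
end
```

Your Rails app (or a Delayed Job worker) would then just open a socket to this daemon and write the command, instead of opening its own SSH connection.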
If you're not comfortable with EventMachine or socket programming, then you could look at some messaging systems like RabbitMQ, or ZeroMQ to create your client and server with.
I also found something for Rails called ActiveMessaging, although I'm not sure how current and well-maintained that project is.
But like I said, I think the simplest implementation is just a socket daemon process that runs in the background and keeps track of the open SSH connections, and then listens to commands from your Rails app.
Remember to take into account security considerations also if you implement something like this. Otherwise you can easily give anyone SSH access to all your remote machines through your daemon process.
EDIT
An even simpler idea:
Just have a daemon process periodically read commands from a database table that your Rails app writes to. The daemon then executes whatever it finds in this "job queue" table. This way you don't have to deal with socket communication at all, with the disadvantage that this is a polling solution.
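The polling loop itself is trivial; a sketch under the assumption that `fetch_pending` stands in for a query against the Rails "job queue" table and `execute` stands in for running the command over the matching cached SSH session (both are hypothetical names injected for illustration):

```ruby
# Hypothetical sketch of the polling daemon: every interval it asks for
# pending commands (in a real app this would be a database query against
# the Rails job-queue table) and executes each one, e.g. over a cached
# SSH session for that host. The +rounds+ cap exists only to make the
# loop finite for demonstration; a real daemon would loop forever.
def poll_and_run(fetch_pending, execute, interval: 5, rounds: Float::INFINITY)
  done = 0
  while done < rounds
    fetch_pending.call.each { |host, command| execute.call(host, command) }
    done += 1
    sleep interval if done < rounds
  end
end
```

Remember to mark rows as done (or delete them) inside `execute`, otherwise the next poll will run the same commands again.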