Consider a company with more than 1000 points of sale across the country that needs to send the sales recorded in each store to its sales center a couple of times a day.
What is the best integration strategy to guarantee delivery? Use a JMS queue at each point of sale, consumed by a middleware? Send XML messages to the middleware and wait for a callback to confirm receipt? Or send the sales through files?
For a couple of transfers per day (we're speaking about high volume but low velocity), you could use files. References and processing confirmations can be sent over JMS or HTTP, for instance. For guaranteed delivery, you'll need to implement the confirmations yourself at the application level. Alternatively, for transferring files, you can use frameworks like StreamSets or Apache NiFi; they are designed for many kinds of input protocols and are very flexible.
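To make the application-level confirmation concrete, here is a minimal sketch (all names and the manifest format are hypothetical, not from any specific product): the point of sale writes the batch file together with a small manifest carrying the record count and a SHA-256 checksum, and the center only sends its confirmation after re-verifying the checksum.

```java
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;

// Hypothetical sketch of file-based transfer with application-level confirmation:
// sender writes batch + manifest, receiver verifies before confirming.
public class SalesBatchTransfer {

    // Hex-encoded SHA-256 digest of the batch payload.
    static String sha256Hex(byte[] data) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(data);
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) sb.append(String.format("%02x", b & 0xff));
        return sb.toString();
    }

    // Sender side: write the batch file, then a manifest the receiver can check.
    static void writeBatch(Path dir, String storeId, String csvPayload) throws Exception {
        byte[] bytes = csvPayload.getBytes(StandardCharsets.UTF_8);
        Files.write(dir.resolve(storeId + ".csv"), bytes);
        String manifest = "records=" + csvPayload.lines().count()
                + "\nsha256=" + sha256Hex(bytes) + "\n";
        Files.write(dir.resolve(storeId + ".manifest"),
                manifest.getBytes(StandardCharsets.UTF_8));
    }

    // Receiver side: recompute the checksum; only a match triggers a confirmation.
    static boolean verifyBatch(Path dir, String storeId) throws Exception {
        byte[] bytes = Files.readAllBytes(dir.resolve(storeId + ".csv"));
        String manifest = Files.readString(dir.resolve(storeId + ".manifest"));
        return manifest.contains("sha256=" + sha256Hex(bytes));
    }

    public static void main(String[] args) throws Exception {
        Path dir = Files.createTempDirectory("sales");
        writeBatch(dir, "store-0042", "2016-01-01;SKU-1;2;19.90\n2016-01-01;SKU-2;1;5.50\n");
        // Confirmation (over JMS or HTTP) is sent only on successful verification.
        System.out.println(verifyBatch(dir, "store-0042") ? "CONFIRM" : "RETRY");
    }
}
```

If verification fails, the store simply resends the batch; since the file is a complete snapshot, the retry is idempotent.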
A couple of times a day is not really a lot. In the future you'll most likely want this working in real time, on a per-transaction basis, so you'll need to redesign your system; it all depends on what this data means to you. Most probably JMS will do just fine in this case. With 1000 points of sale you'll have something like 1000 transactions per minute, I suppose, and that can be handled without problem by today's commodity hardware and well-known JMS queue managers, for instance IBM MQ or ActiveMQ.
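A real JMS setup needs a provider library (ActiveMQ, IBM MQ, etc.), so as a self-contained stand-in, here is a sketch of the decoupling a queue buys you, using only `java.util.concurrent`: producers (the stores) enqueue independently, and the consumer (the middleware) drains at its own pace. All class and message names are illustrative, not from any real product.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Stand-in for a JMS queue: stores produce sale messages, the center consumes
// them asynchronously, so ~1000 tx/min is absorbed without back-pressure.
public class QueueSketch {

    // Blocks up to 5 s for the next message, much like a JMS receive(timeout).
    static String consumeOne(BlockingQueue<String> queue) throws InterruptedException {
        return queue.poll(5, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws Exception {
        BlockingQueue<String> salesQueue = new LinkedBlockingQueue<>();

        // Producer: one point of sale enqueues its transactions.
        ExecutorService pos = Executors.newSingleThreadExecutor();
        pos.submit(() -> {
            for (int i = 1; i <= 3; i++) salesQueue.add("store-0042;sale-" + i);
        });

        // Consumer: the middleware processes messages at its own pace.
        for (int i = 0; i < 3; i++) {
            System.out.println("processed " + consumeOne(salesQueue));
        }
        pos.shutdown();
    }
}
```

With real JMS you additionally get persistence and acknowledgements, which is exactly what gives you the delivery guarantee the question asks for.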
I see you attached some tags like "esb". The ESB, if you have one, could be used for the control messages (e.g., a JMS message saying that new data is available at a specific location, or a confirmation of processing). I would not use an ESB for transferring the big files you'll have. If you really want to push all the data through an "intelligent" middleware like an ESB (not just a queue manager), then look at compositions of big-data-oriented middleware; for instance, check how Hortonworks HDF does it, or see the above-mentioned StreamSets.
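The control message that travels over the ESB can be tiny; only the bulk file stays off the bus. A hypothetical wire format (the `NEW_BATCH|location|checksum|count` layout and all field names are my invention for illustration) could look like this:

```java
// Hypothetical control message announcing that a batch file is ready:
// it carries only metadata, never the sales data itself.
public class ControlMessage {
    final String location; // where the batch file was dropped
    final String sha256;   // checksum for the receiver to verify
    final long records;    // expected record count

    ControlMessage(String location, String sha256, long records) {
        this.location = location;
        this.sha256 = sha256;
        this.records = records;
    }

    // Serialize to the pipe-delimited wire format sent over JMS/the ESB.
    String encode() {
        return "NEW_BATCH|" + location + "|" + sha256 + "|" + records;
    }

    // Parse a wire-format string back into a control message.
    static ControlMessage decode(String wire) {
        String[] p = wire.split("\\|");
        return new ControlMessage(p[1], p[2], Long.parseLong(p[3]));
    }
}
```

The processing confirmation flowing back the other way can use the same shape with a different verb, closing the delivery-guarantee loop at the application level.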