I am writing an image server for a web project that will handle all image requests. The idea is that one image at a maximum resolution (for example's sake, let's say 3000×2000) will be stored in the file system. When an image request comes in from the client, the appropriate file will be found, resized to match the client's dimensions, and returned to the browser. To break it down:
- File is requested by the client and caught by a catch-all servlet.
- Screen dimensions have already been stored in a client cookie.
- The max-resolution file is found on the file system.
- The image is resized to match the user's dimensions.
- The buffered image is returned to the browser. It is never saved.
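The steps above (minus the servlet plumbing and cookie parsing) can be sketched roughly like this. The class name, file path, and dimensions are made-up placeholders, and the scaling mirrors the Graphics2D approach shown further down in the question:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.io.OutputStream;
import javax.imageio.ImageIO;

public class ImagePipeline {
    /**
     * Reads the max-resolution file from disk, scales it to the client's
     * dimensions, and streams the JPEG bytes to the given output stream
     * (in the servlet this would be response.getOutputStream()).
     * The resized image only ever lives in memory; nothing is saved.
     */
    public static void serveScaled(File maxFile, int width, int height, OutputStream out)
            throws IOException {
        BufferedImage original = ImageIO.read(maxFile);           // max file from disk
        BufferedImage resized = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = resized.createGraphics();
        g.drawImage(original, 0, 0, width, height, null);         // scale to client size
        g.dispose();
        ImageIO.write(resized, "jpg", out);                       // streamed, never stored
    }
}
```

In the real servlet the width and height would come from the dimensions cookie, and `out` would be the response's output stream.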
This is of course just an idea. It saves me from having to store five different sizes of every image used on the project, but is this a realistic approach? Is it being used in production by any major sites? At scale, if thousands of requests are coming in, will this cause memory problems? The memory should be freed as soon as the file is returned to the browser, correct?
The file will be returned to the browser like this:

BufferedImage image = ImageIO.read(f);
response.setContentType("image/jpeg");
try (OutputStream out = response.getOutputStream()) {
    ImageIO.write(image, "jpg", out);
}
Some testing with just the basic Java resize tools shows speeds in the milliseconds, so in theory it's possible, but it will of course take longer than serving pre-stored, already-resized images. Maybe being able to serve images that exactly match the requested screen size will offset the slightly slower request?
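For reference, the timing test looks something like the sketch below. This is a rough micro-benchmark, not a rigorous one (no JIT warmup, single run); the image size and target dimensions are arbitrary assumptions:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class ResizeTiming {
    /** Performs one Graphics2D resize and returns the elapsed time in milliseconds. */
    public static long timeOneResize(BufferedImage original, int w, int h) {
        long start = System.nanoTime();
        BufferedImage resized = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = resized.createGraphics();
        g.drawImage(original, 0, 0, w, h, null);
        g.dispose();
        return (System.nanoTime() - start) / 1_000_000; // nanos -> millis
    }
}
```

On a 3000×2000 source image the elapsed time will vary by hardware, but a single resize typically lands in the low tens of milliseconds or less.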
private static BufferedImage resize(BufferedImage original, int width, int height, int type) {
    BufferedImage resized = new BufferedImage(width, height, type);
    Graphics2D g = resized.createGraphics();
    // Bilinear interpolation noticeably improves downscaling quality over the default.
    g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
            RenderingHints.VALUE_INTERPOLATION_BILINEAR);
    g.drawImage(original, 0, 0, width, height, null);
    g.dispose();
    return resized;
}
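One detail worth noting when calling resize with the raw screen dimensions: if the cookie dimensions don't share the source image's aspect ratio, the image will be distorted. A hedged helper (names and signature are my own, not from the question) that scales to fit within the screen instead might look like:

```java
public class FitSize {
    /**
     * Returns {width, height}: the largest size that fits inside maxW x maxH
     * while preserving the source image's aspect ratio.
     */
    public static int[] fitWithin(int srcW, int srcH, int maxW, int maxH) {
        double scale = Math.min((double) maxW / srcW, (double) maxH / srcH);
        return new int[] { (int) Math.round(srcW * scale), (int) Math.round(srcH * scale) };
    }
}
```

For example, fitting a 3000×2000 source into a 1920×1080 screen yields 1620×1080 rather than a stretched 1920×1080.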