How to check if any IO stream is open and forcefully close it in Java


I am building an archiver application using Spring Boot: whenever the application starts, it should fetch data from the database, upload it to S3, and then shut down automatically.

So I set the following property in application.yml:

spring:
  main:
    web-application-type: none

And it was working as expected: whenever my application starts, it creates the S3 client, establishes the database connection, fetches data from the database using Hibernate, uploads it to S3 using PutObject, and shuts down.
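
For context, the entry point is roughly like the sketch below (simplified; the class name, the ArchiverBusinessService type, and the archive() method are illustrative, not the exact production code):

import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Simplified sketch of the entry point (names are illustrative):
@SpringBootApplication
public class ArchiverApplication implements CommandLineRunner {

    private final ArchiverBusinessService archiverBusinessService; // assumed interface behind ArchiverBusinessServiceImpl

    public ArchiverApplication(ArchiverBusinessService archiverBusinessService) {
        this.archiverBusinessService = archiverBusinessService;
    }

    public static void main(String[] args) {
        SpringApplication.run(ArchiverApplication.class, args);
        // With web-application-type: none, the JVM exits once run() returns
        // and no non-daemon threads are left running.
    }

    @Override
    public void run(String... args) {
        archiverBusinessService.archive(); // fetch from DB, upload to S3 (archive() is an illustrative name)
    }
}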

Logs:

2023-09-10 01:04:51.578  INFO 14992 --- [           main] c.j.m.s.ArchiverBusinessServiceImpl  : Archiver process completed with status SUCCESS
2023-09-10 01:04:51.592  INFO 14992 --- [ionShutdownHook] j.LocalContainerEntityManagerFactoryBean : Closing JPA EntityManagerFactory for persistence unit 'default'
2023-09-10 01:04:51.595  INFO 14992 --- [ionShutdownHook] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown initiated...
2023-09-10 01:04:55.606  INFO 14992 --- [ionShutdownHook] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown completed.

Process finished with exit code 0

But recently, to improve S3 performance (since I am uploading around 1-2 million files at a time), I started using TransferManager. Now the application fetches data from the database, creates files in a local directory, and hands that directory to TransferManager for a bulk upload to S3, roughly as in the sketch below.
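
(Simplified orchestration; keys, fetchJsonObjects and archiveDate are illustrative placeholders, while storeFile and bulkUpload are the real methods shown further down.)

// Simplified orchestration of the new flow (illustrative only):
for (KeyEntity keyEntity : keys) {                               // keys: placeholder for the fetched entities
    List<JsonObject> jsonObjects = fetchJsonObjects(keyEntity);  // placeholder for the Hibernate fetch
    storeFile(keyEntity, jsonObjects, archiveDate);              // writes one file per key into the upload directory
}
bulkUpload(ArchiveProperties.getFileUploadPath(), s3Properties, filePrefix); // uploads the whole directory via TransferManager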

The problem is that everything works as expected, except this time the application does not go down. It just stops at

2023-09-10 01:04:51.578  INFO 14992 --- [           main] c.j.m.s.ArchiverBusinessServiceImpl  : Archiver process completed with status SUCCESS

When I did some analysis, it looks like the PrintWriter (used for creating the files) is causing this issue, i.e. it is not getting closed when the operation ends.

However, I am closing the PrintWriter using try-with-resources, and I tried an explicit close() call as well.
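
For what it is worth, dumping the live threads right after the upload finishes should show what is still keeping the JVM alive (diagnostic sketch only, not part of the production code):

// Diagnostic sketch: list all live threads once the archiver reports SUCCESS,
// to see which non-daemon threads are keeping the JVM from exiting.
Thread.getAllStackTraces().keySet().forEach(t ->
        System.out.printf("%s (daemon=%b, state=%s)%n",
                t.getName(), t.isDaemon(), t.getState()));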

I need some help to resolve this.

Here are the code snippets:

For creating the files:

public void storeFile(KeyEntity keyEntity, List<JsonObject> jsonObjects, String archiveDate) {

        String fileName = getFileName(archiveDate, keyEntity);
        String path = ArchiveProperties.getFileUploadPath();
        Path dir = Paths.get(path);

        // Create the upload directory if it does not exist yet
        if (!Files.exists(dir)) {
            try {
                Files.createDirectories(dir);
            } catch (IOException e) {
                log.error("Exception occurred while creating directory {}", ArchiveProperties.getFileUploadPath());
                throw new RuntimeException(e);
            }
        }
        Path file = Paths.get(path, fileName);
        // try-with-resources should close the PrintWriter (and the underlying FileWriter)
        try (PrintWriter printWriter = new PrintWriter(new FileWriter(file.toFile()))) {
            printWriter.print(jsonObjects);
        } catch (IOException e) {
            log.error("Exception occurred while writing into directory {} for filename {}", ArchiveProperties.getFileUploadPath(), fileName);
            throw new RuntimeException(e);
        }
    }

Directory upload using TransferManager:

public Boolean bulkUpload(String path, S3Properties s3Properties, String filePrefix) {

        log.info("S3ObjectStore:bulkUpload :: Initializing bulk upload for path {}", path);

        AmazonS3Client s3 = S3Configuration.getS3Client(provider, s3Properties);
        TransferManager transferManager = TransferManagerBuilder.standard().withS3Client(s3).build();
        try {
            // Upload the whole directory (non-recursive) and block until it finishes
            MultipleFileUpload multipleFileUpload = transferManager.uploadDirectory(s3Properties.getBucket(), filePrefix, new File(path), false);
            showTransferProgress(multipleFileUpload);
            multipleFileUpload.waitForCompletion();
        } catch (Exception e) {
            log.error("S3ObjectStore:bulkUpload :: Exception occurred while uploading directory to s3", e);
            return Boolean.FALSE;
        } finally {
            // Shut down the TransferManager's internal thread pool
            transferManager.shutdownNow();
        }
        return Boolean.TRUE;
    }
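
showTransferProgress is not shown above; it is just a polling loop along these lines (simplified, illustrative version of my helper):

// Rough, simplified version of the progress helper referenced in bulkUpload:
private void showTransferProgress(MultipleFileUpload upload) {
    while (!upload.isDone()) {
        TransferProgress progress = upload.getProgress();
        log.info("Transferred {} of {} bytes", progress.getBytesTransferred(), progress.getTotalBytesToTransfer());
        try {
            Thread.sleep(2000);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return;
        }
    }
}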

S3 client creation:

public static AmazonS3Client getS3Client(final IdaS3ClientKeysProvider provider, final S3Properties s3Properties) {

        log.info("Creating AmazonS3Client for bulk upload ..............");
        IdaS3ClientKeysProvider idaS3ClientKeysProvider = provider;
        // Set up the connection configuration
        // DEFAULT_RETRY_POLICY retries failed requests up to 3 times
        ClientConfiguration config = new ClientConfiguration().withRetryPolicy(ClientConfiguration.DEFAULT_RETRY_POLICY);
        config.setProtocol(Protocol.HTTPS);
        config.setSignerOverride("S3SignerType");
        // Set up the AWS credentials
        BasicAWSCredentials credentials = new BasicAWSCredentials(idaS3ClientKeysProvider.getS3Keys().getS3AccessKey(), idaS3ClientKeysProvider.getS3Keys().getS3SecretKey());
        // Set up the AWS client
        AmazonS3Client client = new AmazonS3Client(credentials, config);
        client.setS3ClientOptions(S3ClientOptions.builder().setPathStyleAccess(true).build());
        client.setEndpoint(s3Properties.getDataplaneEndpoint());
        log.info("Inside AmazonS3Client :: access key :{}, secret key :{}", provider.getS3Keys().getS3AccessKey(), provider.getS3Keys().getS3SecretKey());
        return client;
    }

Expecting: the application should go down / shut down automatically.
