I am using the code below to write log messages to a text file. The program hangs when different sources call this method in parallel. Is there a way to control parallel writing without breaking the program?
sub sLog {
    my ($self, $logMsg) = @_;    # object and message are passed in by the caller
    my $fileName    = join '_', $self->{LogFilePrefix}, &sTimeStamp("Log");
    my $absFileName = "$self->{LogFolder}/$fileName.txt";
    open my $appHandle, '>>', $absFileName
        or &exitErr("Cannot append message to file, $absFileName");
    print {$appHandle} scalar(localtime(time)) . " - $logMsg\n";
    close $appHandle;
}
Opening a file for writing does not by itself coordinate access between processes: depending on the platform, a second writer may block or fail to open the file, or several writers may append at once and interleave or clobber each other's lines. Possible solutions are to feed all output to a single process that owns the log file and does the writing, to have each process take an exclusive lock (flock) on the file before writing and release it as soon as it is done, or to use a library or file format designed for concurrent access. The first two are the easiest and the usual choice for a log file like this; a sketch of the locking approach follows. There are also CPAN modules that manage log files for you.
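A minimal sketch of the locking approach, reusing the sTimeStamp and exitErr helpers and the LogFilePrefix/LogFolder fields from the question (everything else here is illustrative): each writer takes an exclusive flock on the log file before printing and releases it by closing the handle, so parallel callers queue up instead of colliding.

use Fcntl qw(:flock SEEK_END);

sub sLog {
    my ($self, $logMsg) = @_;
    my $fileName    = join '_', $self->{LogFilePrefix}, &sTimeStamp("Log");
    my $absFileName = "$self->{LogFolder}/$fileName.txt";

    open my $fh, '>>', $absFileName
        or &exitErr("Cannot append message to file, $absFileName");

    # Block until this process holds an exclusive lock; other writers wait here.
    flock($fh, LOCK_EX)
        or &exitErr("Cannot lock $absFileName: $!");

    # Move to the current end of file in case another process appended while we waited.
    seek($fh, 0, SEEK_END);

    print {$fh} scalar(localtime(time)) . " - $logMsg\n";

    close $fh;    # closing the handle flushes the line and releases the lock
}

Note that flock takes advisory locks, so this only coordinates writers that all go through sLog; if writers can hold the lock for long stretches, the single dedicated logger process (the first suggestion above) avoids that contention entirely. On CPAN, modules such as Log::Log4perl and Log::Dispatch cover the same ground with more features.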