NLog - Write log entry to different file


I haven't found a solution for this problem yet. In some cases my application produces errors that I have to log to a separate log file. The problem with some of these errors is that they consist of a lot of data, and I'd like to have those log entries in separate files so I can analyze them later. Currently I just log the message to the global log file and manually copy and paste everything between the start and end tags (XML, JSON) of the log entry into a different file, save it, and open it in a JSON/XML viewer. I think the best solution would be a directory of uniquely named files, one per log entry, with a back reference to each file name written to the global log file. But that's just my humble opinion; maybe there is a better solution. Do you have one? :)
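
To make the idea concrete, a naive sketch in plain C# (the logs/blobs folder and the jsonOrXml variable are placeholders I made up, and ideally this bookkeeping would live in the logging configuration rather than in application code):

// Write the large payload to its own uniquely named file... (requires using System.IO;)
var blobDirectory = Path.Combine("logs", "blobs");   // assumed location
Directory.CreateDirectory(blobDirectory);
var blobFile = Path.Combine(blobDirectory, $"{Guid.NewGuid()}.json");
File.WriteAllText(blobFile, jsonOrXml);              // jsonOrXml = the large payload

// ...and leave a back reference in the global log.
Log.Error("Large payload written to {BlobFile}", blobFile);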

Currently I am using NLog, but I could also imagine switching to Serilog if this isn't possible with NLog. I would also say that logging the message in code shouldn't look any different from this:

    public class TestClass
    {
        private static readonly Logger Log = LogManager.GetCurrentClassLogger();

        private void Work()
        {
            var jsonOrXml = "..."; // the large JSON/XML payload
            Log.Error(jsonOrXml);
        }

    }

Because it's the concern of the logging configuration how this is written to files, databases, etc.

Thanks.

There are 3 answers

Answer from Rolf Kristensen

The easy solution is simply to do this (it ensures the blob output doesn't suddenly propagate somewhere unwanted):

public class TestClass
{
    private static readonly Logger Log = LogManager.GetCurrentClassLogger();
    private static readonly Logger BlobLog = LogManager.GetLogger("BlobLog");

    private void Work()
    {
        var correlationId = Guid.NewGuid().ToString();
        Log.WithProperty("BlobGuid", correlationId).Error("Hello");
        var jsonOrXml = "..."; // the large JSON/XML payload
        BlobLog.WithProperty("BlobGuid", correlationId).Error(jsonOrXml);
    }

}

Then do this in your NLog.config, where ${event-properties:BlobGuid} in the file name ensures a new file per correlation id:

<targets>
    <target name="logfile" xsi:type="File" fileName="file.txt" layout="${longdate}|${level}|${logger}|${message} BlobGuid=${event-properties:BlobGuid}" />
    <target name="blobfile" xsi:type="File" fileName="blob.${shordate}.${event-properties:BlobGuid}.txt" layout="${message}" />
</targets>

<rules>
    <logger name="BlobLog" minlevel="Trace" writeTo="blobfile" final="true" />
    <logger name="*" minlevel="Debug" writeTo="logfile" />
</rules>

There are a lot of ways to add context to a LogEvent. See also https://github.com/NLog/NLog/wiki/Context
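
For instance, a minimal sketch (assuming NLog 5.x; the BlobGuid property matches the example above) of two other ways to attach the correlation id besides Logger.WithProperty:

// Option 1: build the LogEventInfo yourself and attach a per-event property.
var logEvent = new LogEventInfo(LogLevel.Error, Log.Name, "Hello");
logEvent.Properties["BlobGuid"] = correlationId;
Log.Log(logEvent);

// Option 2: push a scoped property (NLog 5.x ScopeContext);
// it can be rendered in layouts with ${scopeproperty:BlobGuid}.
using (ScopeContext.PushProperty("BlobGuid", correlationId))
{
    Log.Error("Hello");
}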

The LogEvent context can also be used for filtering in the logging rules. See also https://github.com/nlog/NLog/wiki/Filtering-log-messages

You can also use the LogEvent context in NLog Layouts, and the FileTarget FileName is itself an NLog Layout. So the same file target can write to different file names based on the LogEvent context.
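
As a hedged sketch of that last point (programmatic configuration, assuming NLog 4.6+; the logs/ folder and the "shared" fallback name are my own placeholders), a single file target whose FileName layout picks a file per BlobGuid:

// One target, many files: FileName is a Layout that is evaluated per LogEvent.
var config = new NLog.Config.LoggingConfiguration();
var splitFile = new NLog.Targets.FileTarget("splitfile")
{
    // Falls back to "shared" when the event carries no BlobGuid property.
    FileName = "logs/${whenEmpty:whenEmpty=shared:inner=${event-properties:BlobGuid}}.txt",
    Layout = "${longdate}|${level}|${logger}|${message}"
};
config.AddRule(NLog.LogLevel.Debug, NLog.LogLevel.Fatal, splitFile);
NLog.LogManager.Configuration = config;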

Answer from Rolf Kristensen

If it is really important to only generate a single LogEvent, then you can do this:

public class TestClass
{
    private static readonly Logger Log = LogManager.GetCurrentClassLogger();

    private void Work()
    {
        var theBlob = "..."; // the large JSON/XML payload
        Log.WithProperty("Blob", theBlob).Error("Hello World");
    }

}

Then do this in your NLog.config:

<targets>
    <target name="logfile" xsi:type="File" fileName="file.txt" layout="${message}${when:when=length('${event-properties:Blob}') > 0:inner= BlobGuid-${guid:GeneratedFromLogEvent=true}}"/>
    <target name="blobfile" xsi:type="File" fileName="Blob.${shortdate}.${guid:GeneratedFromLogEvent=true}.txt" layout="${event-properties:Blob}" />
</targets>

<rules>
    <logger name="*" minlevel="Debug" writeTo="blobfile">
       <filters defaultAction='Ignore'>
          <when condition="'${event-properties:Blob}' != ''" action="Log" />
       </filters>
    </logger>
    <logger name="*" minlevel="Debug" writeTo="logfile" />
</rules>

This will reuse the same LogEvent for writing to both files.

Answer from C. Augusto Proiete

Here's a possible approach using Serilog, leveraging the File and Map sinks:

  • Write regular log messages to a file called Application.log (i.e. anything that does not carry large data in the log message)
  • Write large data messages to individual files named Data_{uniqueId}.log
  • Use a property called LargeDataId to store the unique ID of the file where the large data will be stored, and also use this property to determine whether a message is a regular log message (i.e. if a LargeDataId property exists, the message goes to an individual file; otherwise it's a regular message and goes to Application.log):

public class TestClass
{
    private readonly ILogger _logger = Log.ForContext<TestClass>();

    public void Work()
    {
        var jobId = Guid.NewGuid();

        // Writes to Application.log
        _logger.Error("Error executing job {JobId}", jobId);

        var jsonOrXml = "...";

        // Writes to Data_{uniqueId}.log
        _logger.Error("{LargeDataId}{LargeData}", jobId, jsonOrXml);
    }
}

Your Serilog logging pipeline configuration would look something like:

Log.Logger = new LoggerConfiguration()
    .WriteTo.Logger(c =>
        c.Filter.ByExcluding(e => e.Properties.ContainsKey("LargeData"))
            .WriteTo.File("Application.log"))
    .WriteTo.Map("LargeDataId", (id, wt) =>
            wt.File($"Data_{id}.txt", outputTemplate: "{LargeData}"),
                sinkMapCountLimit: 0)
    .CreateLogger();
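
A hedged usage sketch (WriteTo.File and WriteTo.Map come from the Serilog.Sinks.File and Serilog.Sinks.Map packages; flushing at shutdown is standard Serilog practice):

// Configure the pipeline once at startup (as shown above), log as usual,
// then flush so buffered events reach Application.log and the Data_*.log files.
new TestClass().Work();
Log.CloseAndFlush();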