I'm looking for a way to handle the null values produced by the DefaultIfEmpty() extension method when it is used in a LINQ outer join. Code as follows:
    var SummaryLossesWithNets = (from g in SummaryLosses
        join n in nets
            on g.Year equals n.Year into grouping
        from x in grouping.DefaultIfEmpty()
        select new
        {
            Year = g.Year,
            OEPGR = g.OccuranceLoss,
            AEPGR = g.AggregateLoss,
            OEPNET = ((x.OEPRecovery == null) ? 0 : x.OEPRecovery),
            AEPNET = ((x.AEPRecovery == null) ? 0 : x.AEPRecovery),
        });
In the List SummaryLosses there are many years' worth of data that I wish to join to the table 'nets', which contains only a sub-portion of those years. The exception is thrown because x is null, which I assume happens because the years in SummaryLosses that have no match in nets produce null entries in the grouping.
How should one go about checking for the null value here? As you can see, I've attempted to check for null on the properties of x, but since x itself is null this doesn't work.
Kind regards, Richard
Simply check whether x is null instead:
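For example, keeping the rest of the query from the question and changing only the final projection (a sketch; the property names are taken from the question, and OEPRecovery / AEPRecovery are assumed here to be non-nullable numerics):

    select new
    {
        Year = g.Year,
        OEPGR = g.OccuranceLoss,
        AEPGR = g.AggregateLoss,
        // x is the element produced by grouping.DefaultIfEmpty() and is null for
        // years with no match in nets, so test x itself rather than its properties
        OEPNET = (x == null) ? 0 : x.OEPRecovery,
        AEPNET = (x == null) ? 0 : x.AEPRecovery,
    }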
Or, if x.OEPRecovery and x.AEPRecovery are nullable properties too, use:
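Something along these lines (again a sketch reusing the question's names, assuming the properties are nullable value types such as double?):

    select new
    {
        Year = g.Year,
        OEPGR = g.OccuranceLoss,
        AEPGR = g.AggregateLoss,
        // guard against both a missing join match (x == null) and a null property value
        OEPNET = x == null ? 0 : (x.OEPRecovery ?? 0),
        AEPNET = x == null ? 0 : (x.AEPRecovery ?? 0),
    }

On C# 6 or later the same thing can be written more compactly as x?.OEPRecovery ?? 0; that works here because SummaryLosses is an in-memory List (the null-conditional operator is not allowed inside expression trees, so it couldn't be used with an IQueryable provider).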