For lab data, measurements are normally provided with detection/reporting limits and confidence intervals. For example, I might have a measurement of magnesium concentration in water where the minimum reporting value is 5, and I have received two measurements: the first is 10 and the second is "<5" (i.e. below the reporting limit). As an end user, there are times you want "<5" treated as 5, sometimes as 0, and sometimes as 2.5.
I am approaching this problem by constructing an S3 class with an attribute LRL (lower reporting limit). What I would like the user to be able to do is the following:
a <- set_measurement("<5", LRL = 5)
b <- set_measurement(8, LRL = 5)
set_conservatism(1) # sets a global variable called "conservatism_coefficient" to 1
a
# 5 [LRL: 5]
c <- b + a
# 13 [LRL: 5]
set_conservatism(0.5)
a
# 2.5 [LRL: 5]
b + a
# 10.5 [LRL: 5]
c
# 13 [LRL: 5]
What I'm imagining is that the value of "a" is somehow set to LRL * conservatism_coefficient rather than to a fixed number. Then, when some other function accesses the value, it is dynamically computed from the current conservatism_coefficient.
Is this possible, and/or am I just going about this completely the wrong way?
Do not be scared to overload the generic functions you need. You can achieve what you want just by overriding the print method and the Ops group of arithmetic operators. (Side note from a Python programmer: things like this are very easily achieved in Python with properties and by overriding magic methods.)
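Here is a minimal sketch of that approach (function names like as_value and the internal list layout are illustrative choices, not a fixed design): the censored flag and LRL are stored, and the effective value is resolved lazily each time the object is printed or used in arithmetic, so it always reflects the current coefficient.

```r
# Global coefficient; an environment or option() would be cleaner in a package.
conservatism_coefficient <- 1

set_conservatism <- function(x) {
  conservatism_coefficient <<- x
  invisible(x)
}

set_measurement <- function(x, LRL) {
  censored <- is.character(x) && grepl("^<", x)
  structure(list(value = if (censored) NA_real_ else as.numeric(x),
                 LRL = LRL, censored = censored),
            class = "measurement")
}

# Resolve the effective numeric value at access time (hypothetical helper).
as_value <- function(m) {
  if (m$censored) m$LRL * conservatism_coefficient else m$value
}

print.measurement <- function(x, ...) {
  cat(as_value(x), " [LRL: ", x$LRL, "]\n", sep = "")
  invisible(x)
}

# Group generic: covers +, -, *, /, comparisons, etc. in one place.
Ops.measurement <- function(e1, e2) {
  v1 <- if (inherits(e1, "measurement")) as_value(e1) else e1
  v2 <- if (inherits(e2, "measurement")) as_value(e2) else e2
  res <- get(.Generic)(v1, v2)
  if (.Generic %in% c("+", "-", "*", "/")) {
    lrl <- max(if (inherits(e1, "measurement")) e1$LRL else -Inf,
               if (inherits(e2, "measurement")) e2$LRL else -Inf)
    set_measurement(res, LRL = lrl)   # result is a snapshot, not censored
  } else res
}
```

Note the design choice in Ops.measurement: arithmetic snapshots the value at the moment of the operation (the result is an ordinary, non-censored measurement), which reproduces your example where c keeps printing 13 even after set_conservatism(0.5), while a and b + a re-evaluate under the new coefficient.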