Post by LenP
I'm looking for a magic word that represents a base measure - one from which other measures are derived - so that if the base measure is wrong the others are wrong, and the error rate is magnified the farther away you get from the base measure.
I think it's something associated with Game Theory but I can't recall.
That's awfully fuzzy.
"error rate is magnified" suggests to me something about
"propagation of error". You can compute a new variance
for a transformed version of a score, and it will be more
and more inaccurate the further you extend the range
of scores from the mean. It is not in getting "away from
the base measure" where error increases, so much in getting
away from the anchor-point (mean, often) for the base measure.
The variance in that sort of case is taken (say) as the first
term from series-expansion, and is only good while you can
ignore the higher-order terms.
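To make that first-order idea concrete, here is a minimal Python sketch (mine, not from the original exchange) of the usual propagation-of-error approximation Var[g(X)] ~ g'(mu)^2 * Var[X], using a made-up transformation g(x) = x^2. It also shows the approximation drifting as the scores spread further from the anchor point.

import numpy as np

def propagated_variance(g_prime, mu, var_x):
    # First-order (delta-method) approximation:
    #   Var[g(X)] ~ g'(mu)^2 * Var[X]
    # Only the first term of the Taylor series is kept, so it is
    # trustworthy only while higher-order terms are negligible.
    return g_prime(mu) ** 2 * var_x

# Hypothetical example: g(x) = x**2, so g'(x) = 2*x
mu, sd = 10.0, 2.0
approx = propagated_variance(lambda x: 2 * x, mu, sd ** 2)

# Monte Carlo check against the actual variance of g(X)
rng = np.random.default_rng(0)
x = rng.normal(mu, sd, size=1_000_000)
print(approx, np.var(x ** 2))   # ~1600 vs ~1632: close near the mean

# Widen the spread (scores further from the anchor point) and the
# first-order approximation gets noticeably worse
sd = 8.0
approx = propagated_variance(lambda x: 2 * x, mu, sd ** 2)
x = rng.normal(mu, sd, size=1_000_000)
print(approx, np.var(x ** 2))   # ~25600 vs ~33800

The exact variance here is 4*mu^2*sd^2 + 2*sd^4, so the dropped second-order term (2*sd^4) is exactly what grows as the spread increases.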
The perfect base measure is sometimes called a Gold
Standard, in various areas including diagnosis. It does
complicate stuff when your Gold Standard has Error.
Worrying about "a base measure is wrong" somehow
makes me think of econometrics, where they often do have
a plethora of ways to measure (say) income, but have to
choose something available to plug into equations.
--
Rich Ulrich