macterra opened 6 years ago
I was thinking a little about the definition of Value... I think our current definition is slightly inaccurate. An agent is not evaluating between states of the world, but between states of its world model.
A subtle distinction, but an important one for relativism, since an agent can't have absolute information about its environment.
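One way to pin the distinction down, purely as my own gloss in POMDP-style notation (none of this is in the current definition): the current wording implies a value function over world states, while what an agent can actually compute ranges over its model/belief states.

```latex
% My notation, not the wiki's: S is the set of world states,
% \Delta(S) the agent's belief states (distributions over S).
% What the current definition implies:
V \colon S \to \mathbb{R}
% What an agent can actually compute:
\hat{V} \colon \Delta(S) \to \mathbb{R}
```

The relativism point falls out of the second signature: the domain of $\hat{V}$ is built from the agent's own information, so two agents with different models can rank the same underlying situation differently.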
(this is unrelated to the new definitions you posted, but has to do with elaborating value)
That's a good point, and it becomes more obvious when you consider that agents often evaluate counterfactual states of their world model, i.e. all preferences that do not fall under the status quo.
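To make the counterfactual angle concrete, here is a minimal sketch (all names hypothetical, and the utility-sum value function is just an assumption for illustration): the agent only ever compares model states, and a counterfactual is simply another model state it constructs.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ModelState:
    """The agent's internal representation of the world, not the world itself."""
    facts: frozenset

@dataclass
class Agent:
    model: ModelState  # status quo: the current model state
    preferences: dict = field(default_factory=dict)  # fact -> utility weight

    def value(self, state: ModelState) -> float:
        """Evaluate a *model* state; the true world state is never accessible."""
        return sum(self.preferences.get(f, 0.0) for f in state.facts)

    def imagine(self, add=(), remove=()) -> ModelState:
        """Construct a counterfactual model state, i.e. a preference
        that does not fall under the status quo."""
        return ModelState(self.model.facts - frozenset(remove) | frozenset(add))

agent = Agent(
    model=ModelState(frozenset({"raining"})),
    preferences={"raining": -1.0, "umbrella": 2.0},
)
counterfactual = agent.imagine(add={"umbrella"})
# The comparison is between two model states, not two world states.
assert agent.value(counterfactual) > agent.value(agent.model)
```

The point of the sketch is just that `value()` never takes a world state as input; everything the agent ranks, status quo and counterfactuals alike, lives inside its model.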
The Value definition should link to a new page that elaborates the concept.