And let's not forget that there are plenty of people out there who do not (or choose not to) value rationality. I mean, I strive to decrease the entropy of the human universe, yet I consider myself an absurdist and somewhat gnostic. If you have agents in a model that have desires *about* that model (i.e., "what I want is to not be rational"), then you have Gödel's Incompleteness Theorem smack in the middle of things; a set theory problem. Humans are meta. I mean, geez, what about Buddhism?
I'm not sure if it comes across, but I'm agreeing with your post. ;)
Lots of good discussion in these threads; thanks for that. ("Make the ganglia twitch!")