If someone blithely continues to disagree with their (apparent) epistemic peers, how much should we downgrade the rationality and/or intelligence and/or integrity of that person? My answer was:
We can take a dimmer view of them, and should, but we also have to take a dimmer view of ourselves, I think. I don’t think the “they” get downgraded relative to the “us.”
…let’s say we agree with it [Aumann’s construction] completely. Then it would be true and non-operationalizable, keeping in mind that the smartest people I know — by whatever metric — do not themselves internalize the argument. There is some kind of semi-rational back and forth estimation process, where in part you judge [peer] expertise by the actual views people hold, and iterate accordingly. There is probably no fixed point theorem here, and no single obviously best way to proceed. Maybe we should downgrade those who do not know that. But I don’t know by how much. Maybe not by a lot, since knowing all those complications doesn’t improve one’s own rationality by a whole lot, as far as I can tell.
With a bit more thought, I have come up with a further and more specific answer.
Let’s say you are staying at a hotel, and everyone agrees that the hotel offers room service. There is also a very good restaurant one hour away, but people strongly disagree on how to find the place. Half of the people think the restaurant is to the West, and you strongly agree with this group; the other half strongly believe the restaurant is to the East. If you choose the wrong direction, you will have wasted two hours driving and will have to settle for the room service in any case.
If you buy into Aumann, you should be more likely to start with the room service, even though you strongly believe the restaurant is to the West.
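To see the force of this, here is a sketch with made-up numbers (none of the utilities or probabilities below come from the post; they are hypothetical values chosen only to illustrate the logic):

```python
# Illustrative, made-up utilities -- hypothetical numbers, not from the post.
U_RESTAURANT = 10.0    # the good meal, reached on the first try
U_ROOM_SERVICE = 6.0   # the safe option everyone agrees on
U_WASTED_TRIP = 2.0    # two hours of driving, then room service anyway

def ev_drive_west(p_west: float) -> float:
    """Expected utility of driving West, given your credence the restaurant is West."""
    return p_west * U_RESTAURANT + (1.0 - p_west) * U_WASTED_TRIP

# On your private confidence alone, driving wins comfortably:
print(ev_drive_west(0.9))   # roughly 9.2, well above the 6.0 of room service

# After Aumann-style pooling with the evenly split peers, the edge vanishes:
print(ev_drive_west(0.5))   # 6.0 at best; any risk aversion now favors room service
```

Updating toward the evenly split peers collapses your 90 percent credence to something near 50, and with it the case for driving: the option everyone agrees on becomes the sensible default.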
You will note that this is a purely self-regarding choice. For choices in that category, accepting Aumann means you should be more willing to focus on what everyone agrees is good, possible, beneficial, etc. — you might call this common sense morality.
Alternatively, let’s say it is a choice for all of society, and many other people are pitching in their efforts to some kind of common enterprise — let’s call it politics.
You then have to ask what kind of stupidity you are most likely to expect from the contributing others. If the relevant bias is excess conformism, I see no special reason to take greater care to converge upon what others think is best. In fact, there might be external benefits from doubling down on your own stubbornness. You might be wrong a lot of the time, but the occasions when you are right and lots of people are not will be truly rare, and it is important that your voice shine through and be heard in those cases.
So in a nutshell, the implications of Aumann are “common sense morality for yourself, but political orneriness remains on the table.”