Intuitions are our least introspective belief components. We know the least about their origins, or how they would change if our other beliefs changed. Of course this does not make them wrong; since we are only consciously aware of a tiny fraction of what goes on in our minds, in a sense most belief is intuitive.
Alex reminds Tyler that initial moral intuitions are often contradictory, and therefore in error. We should thus “curve fit” around our initial intuitions to create a better estimate of moral truth. And the higher our error rate, the less influential each specific intuition should be. In this post, let me highlight a huge error source: cultural and genetic heritage.
Put yourself into the frame of mind of a reasonable creature of some indeterminate species and culture, before your culture or species arose. Did this creature have a reason to expect the moral intuitions arising in your culture or species to be closer to moral truth than intuitions in other random cultures or species? If not, then any such correspondence would be random luck.
We do not want to just hope that we happen to believe truth; we want to see that the process that produces our beliefs produces a correlation between our beliefs and the truth. So random influences on our beliefs are bad, inducing more error. Unless you can see a reason to have expected to be born into a culture or species with more-accurate-than-average intuitions, you must expect your culture- or species-specific intuitions to be random, and so not worth endorsing.
A similar argument suggests you reject the ways that your intuitions differ from the average in your culture or species. If a neutral observer would have no good reason to think you special, then neither do you.