- The universe doesn’t optimize for morality. The moral arc of the universe doesn’t bend toward justice unless we bend it.
- Even ‘good’ people don’t necessarily optimize for morality unless they are primed by their environment to look at things through a moral lens. We tend to optimize for whatever we’re focused on. That’s not usually morality; more often it’s survival, or something we’ve been told is equivalent to survival (such as money, military dominance, or the relative power of our ethnic or political group).
- Nevertheless, improving the world (and encouraging others to do so) is worthwhile. The world can’t be fixed, but it can be improved. Some improvements are even low-hanging fruit, never attempted because distractions from moral imperatives are so effective.
- Optimizing for morality is just like optimizing for anything else: if you don’t keep your model updated with new information, you will end up maximizing something else entirely — something that isn’t quite your goal, and that (at the extremes) conflicts with it.
- Morality is hard to quantify, but ethical systems are not. Each ethical system is an attempt at codifying what constitutes moral behavior.
- Ethical systems conflict on the margins and in pathological or corner cases. Familiar moral thought experiments tend to highlight these conflicts by design: they exist to differentiate between systems and to test which one is more effective.
- Nevertheless, ethical systems tend…