In general, I say that an agent is causally responsible for some event E when P(E | A), the probability of E occurring given A, the set of all decisions that that agent made, is greater than P(E | B), the probability of E occurring given the alternative set of decisions B that minimizes the probability of E occurring.
Suppose I go out and don’t bring an umbrella. If I get wet as a result, then I am causally responsible for it, because I could’ve avoided it by bringing an umbrella or staying at home. Arguably, even if I stay at home, there is the possibility (a very slight one) that a powerful storm will cause the roof to leak and I will get wet anyway. However, even in that case I would be causally responsible, because I could have hidden out in a different building that would’ve reduced my risk of getting wet. Perhaps I could have taken refuge in a mine shaft, where the probability of getting wet is vanishingly small.
As a quantitative person, I would like to somehow assign a real number to causal responsibility so that my responsibility is lower in the case that I stay in my house than when I go out. For example, perhaps I could define R = P(E | A) − P(E | B). In this case, causal responsibility is a real number between 0 and 1. Hard determinism would imply that R is always identically zero. Supposing that free will exists, on the other hand, I am responsible for each of my own decisions with R = 1. The weather forecast says the POP is 10% right now, so R = 0.1 for getting wet if I go out. R = 0 for the decay of an unstable nucleus, and so on. (It may be that this is not the most mathematically convenient way to define causal responsibility.)
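The definition above is easy to play with numerically. Here is a minimal sketch of R = P(E | A) − P(E | B) applied to the umbrella scenario; the function name and the probabilities are my own made-up illustrations, not real data:

```python
def causal_responsibility(p_actual: float, p_minimal: float) -> float:
    """R = P(E|A) - P(E|B), where B is the set of decisions
    that minimizes the probability of the event E."""
    return p_actual - p_minimal

# Hypothetical probabilities of getting wet under each course of action:
choices = {
    "go out without umbrella": 0.10,   # the forecast's POP
    "go out with umbrella":    0.02,
    "stay home":               0.001,  # the roof might still leak
    "hide in a mine shaft":    1e-9,   # vanishingly small
}

# B is whichever alternative minimizes P(E).
p_min = min(choices.values())

for choice, p in choices.items():
    print(f"{choice}: R = {causal_responsibility(p, p_min):.4f}")
```

Note that because B is chosen to minimize P(E), the difference can never be negative, so R indeed stays in [0, 1].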
Whatever the case, causal responsibility is absolute and objective, though it is impossible to calculate it precisely. It is also morally irrelevant.
In contrast to causal responsibility, moral responsibility is subjective and often relative, just as morality itself is. It is clear-cut most of the time—if I grab a knife and step out now and stab the next person I see, then that is a moral transgression and I am morally responsible—but in many cases it is not so clear-cut. For example, some people think that it’s moral to execute certain criminals, and others disagree. And if one thinks that it is immoral to execute a particular person, then on whom does the moral responsibility lie? The executioner? The judge that handed down the death sentence? The entire society, for failing to democratically abolish capital punishment? All of these entities, to varying extents? (For that matter, what on Earth would it mean to blame society, composed of a large number of moral agents?)
Common sense dictates that moral responsibility should entail causal responsibility. If R = 0 for an event given my actions, then I could not possibly have prevented it, so it would be absurd to blame me for it. In general, people should not be held morally responsible for failing to achieve the impossible. In law, we say ultrā posse nēmō obligātur (“no one is obligated beyond possibility”). But this post is about the converse.
Some people believe that causal responsibility also entails moral responsibility, so that the two are in fact equivalent. This is the point of view that I would expect from anyone who subscribes to the total responsibility doctrine, who would, for example, probably believe that the victims of the Holocaust were responsible for their own deaths in that they failed to flee before Hitler’s Final Solution was implemented. (I would call this point of view extreme, in contrast to the point of view that moral responsibility exists in all cases, including ones without causal responsibility—which I would call absurd.)
The rest of us, the vast majority, believe that causal responsibility does not entail moral responsibility. But the real point here is that as soon as we can find even one example in which we can agree that causal responsibility exists but moral responsibility doesn’t, we have to accept that, in general, we cannot deduce moral responsibility from causal responsibility; to do so would be illogical. I don’t think that anyone I know would claim that the victims of the Holocaust were, in fact, even slightly morally responsible for their deaths. It follows that, if the only reason why they feel that X is morally responsible for Y is that X is causally responsible for Y, then it is illogical to continue feeling that way. (And, by the way, I don’t judge people for having illogical feelings; after all, that’s what feelings are for, right?)
I think we should particularly keep this in mind when we try to decide who is at fault when one person (Alice) says something and another person (Bob) becomes offended. Causal responsibility exists with both parties; Alice is causally responsible for obvious reasons, whereas Bob is causally responsible because his life could’ve unfolded differently so that he would not be offended by the same statement; I can’t think of any statement that would be offensive to everyone. Typically, when we try to apportion moral responsibility, we base our decision upon where the causal responsibility is greater (note that this test might not work with the “naive” mathematical definition of causal responsibility suggested earlier in this article), by using reasonableness arguments about how likely it is that a reasonable person would become offended, and so on. On the other hand, I advocate an intent-based approach: if Alice intended to hurt Bob’s feelings, then she is morally responsible (inasmuch as hurting people’s feelings can ever be considered “wrong” at all); otherwise, she is not. This cleanly avoids any causality-based argument in this kind of issue.
I think most people who consider themselves hard determinists in principle believe in causal responsibility in practice. I guess you don’t think this makes any sense. *shrug* For my part, I think it is a matter of semantics to apply the definition as a determinist and still get workable values (much as you can be a Bayesian or a frequentist but still agree on what the probability of an event is).
Also, care to explain the bit about negative values? I can’t figure out quite what you mean there. =P
You’re right, the negative values don’t make sense. I’ll remove that.