Good in Theory: A Political Philosophy Podcast

31 - Thought Lab 3: Utilitarianism & the Great Spreadsheet

Clif Mark

Today, Paul Sagar and I get into utilitarianism. We talk about thought experiments involving drowning kids, ruined loafers, death squads, and bioweapons.

The drowning children are from Peter Singer. He's a utilitarian who thinks that we rich first-world types should be giving away all our money to save the global poor from starvation and malaria.

Paul disagrees. He brings in another philosopher (Bernard Williams) to argue that worrying about starving children all the time would violate his integrity. As usual, he tries hard not to offend anyone (until he gets to Hiroshima).

References:

Peter Singer, “Famine, Affluence, and Morality”

Bernard Williams, “Against Utilitarianism”


Clif Mark:

Today is the third thought lab with Paul Sagar, and we talk utilitarianism, drowning children, and when it's okay to develop biological weapons. I'm Clif Mark, and this is Good in Theory. Welcome back, Paul, we're glad to have you here for a third thought lab.

Paul Sagar:

Thanks very much for having me back, Clif.

Clif Mark:

We're going to be talking about utilitarianism today, so I'm just going to start by giving a basic idea of what it is. Utilitarianism is a moral philosophy that says the best thing to do, at all times, is whatever produces the best consequences. And the best consequences are understood as the ones that produce the most happiness, or well-being, or benefit, for the greatest number of people, in aggregate. So instead of sitting around worrying about whether it's really wrong to steal, or wrong to do this or that, you just look at what the consequences of your actions are going to be, and you choose the action that's going to have the best outcomes, aggregated over everyone in the world. So, Paul, could you give us an example of how that works in practice?

Paul Sagar:

So a classic example might be something that a government would have to consider. We know that every year many thousands of people in, say, the US and Canada are going to die on the freeway; we know that from the historical incidence of road traffic accidents, and we have good reason to expect it to continue in a similar way into the future. And that means thousands of people will die in horrific ways, right? Now, we could say that is so terrible that we're going to stop having people drive on the freeway, or we're going to make them drive at five miles an hour, because that will get the deaths on the freeways down to zero. And that would, you know, stop people dying on the freeway. But the flip side is that in order to have a functioning, healthy economy, in order to have haulage moving across the country, in order for people to be able to live their lives, go to work, have holidays, see their families, we need a freeway infrastructure which allows people to travel very, very fast. So what governments do is weigh up, on balance, what is going to promote the greatest well-being of the greatest number. And in this case, it's having freeways where people can drive at 70 or 80 or whatever miles per hour, even though that means some people are going to die. So what we choose to do here is tolerate a certain number of deaths in exchange for benefits that accrue to the majority. Even though the benefits to the majority are relatively slight compared to the cost to the minority: the minority that dies loses everything, while the majority that doesn't die, and gets to carry on using the freeway, gains only small increments. But the utilitarian thought here is that if you aggregate all those increments, they outweigh even the very, very bad thing that happens to the minority.
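To make the shape of that trade-off concrete, here is a minimal sketch of the aggregation Paul describes. Every number in it is hypothetical, invented purely to illustrate the structure of the comparison, not drawn from the episode or from real traffic data.

```python
# Toy utilitarian aggregation: many small gains versus a few catastrophic losses.
# All figures are made up for illustration; "utils" is an arbitrary welfare unit.

def total_utility(benefit_per_person, num_beneficiaries,
                  cost_per_victim, num_victims):
    """Sum small benefits across the majority, subtract large losses to the minority."""
    return benefit_per_person * num_beneficiaries - cost_per_victim * num_victims

# Fast freeways: millions each gain a small convenience; thousands lose everything.
fast_freeways = total_utility(benefit_per_person=50, num_beneficiaries=300_000_000,
                              cost_per_victim=100_000, num_victims=40_000)

# A 5 mph speed limit: nobody dies, but the aggregated increments vanish too.
slow_freeways = total_utility(benefit_per_person=0, num_beneficiaries=300_000_000,
                              cost_per_victim=100_000, num_victims=0)

print(fast_freeways, slow_freeways)  # 11_000_000_000 vs 0: the increments win
```

With these invented weights, the aggregate favors fast freeways, which is the utilitarian conclusion; change the weights and the answer flips, which is the whole point of treating it as a calculation.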

Clif Mark:

Good. So utilitarianism is a kind of economic logic. It's a cost-benefit analysis where you add up all the good things and all the bad things, and you just try to go for the best outcome. Now that we've got the basic shape of the theory in place, I'd like to start with the thought experiments. On this show, we've had sex with dead chickens, we've had a little bit of incest, we've served dinner with murder knives. And today, I'd like to start by drowning a child. And I think you know which one I'm talking about.

Paul Sagar:

So, Clif, the problem with utilitarianism is that there are so many examples that involve doing harm to children that I'm not totally sure which one you're referring to. But the one that I teach my students is Singer's very famous example of the drowning child. The thought experiment goes like this: you're walking around the local pond, perhaps you've gone there to feed the ducks, who knows, and you see a small child drowning in the pond. What should you do?

Clif Mark:

Well, you jump in and save the child.

Paul Sagar:

Okay, but what if you were wearing an expensive pair of Italian handcrafted suede loafers that you'd only just bought the day before, and you really didn't want to get them wet?

Clif Mark:

You slip off your loafers and you save the child.

Paul Sagar:

But what if it's muddy and they're going to get ruined? And you can't take them off and leave them; there are vagabonds around who might steal them. You really can't take them off, and you can't get them wet. So it's okay to walk away, right?

Clif Mark:

No, obviously. No matter what you're wearing, no matter how fresh, you've got to jump in and get that baby.

Paul Sagar:

Good. So that's the conclusion that Singer wants everyone to draw. We all agree that you can't let a child drown just because you don't want to get your shoes wet; that's not acceptable. And Singer's point is: we all agree with this. But here's the rub. What he wants to say next is: okay, think about children who are at risk of drowning in Bengal, because the government lacks the infrastructure to protect them from floodwaters, and for whom it would be very easy for us to help, if we simply sent them a small portion of our income, far less than the price of a lovely pair of Italian shoes. And if we simply sent that money, we could save not just one child, but many, many thousands of children from drowning. So do we not have a duty to do exactly that?

Clif Mark:

So the question is whether we have a duty not to spend our money on expensive shoes, because we could just as easily use the money to save the lives of starving children on the other side of the world. Now, we both know that if you start telling people they have a duty to hand over their money, they're going to start coming up with objections. And the first one you usually hear with this example is: why is it my business? These kids are on the other side of the world. Don't they have parents? Don't they have their own government that should be taking care of them? Why is it up to me to save drowning children in another hemisphere?

Paul Sagar:

Well, this is where Singer embodies the ultimately hard-headed utilitarian attitude, which is that it doesn't matter that they're thousands of miles away, Singer wants to say. It doesn't matter that their government has failed them, which is true, or that their parents have abandoned them, which may also be true. What matters is that you have the capacity, through using your money, to save them. And the mere fact that you can't see them, or that they're very, very far away, is morally irrelevant, Singer wants to say. So in one case, you'd have to ruin your new shoes, and that would cost you money if you wanted to replace them; in the other, you'd have to forego the shoes in order to save the children. And what Singer wants to say is that, given that you are in a position to save them, the difference doesn't matter.

Clif Mark:

Right. But what I'm saying is: aren't those children someone else's problem? Why is it up to me to save them?

Paul Sagar:

Well, because you can, right? And insofar as Singer thinks that utilitarianism is the correct moral code, we should try to save as many people as possible, increase the preferences and satisfactions of as many living creatures as possible. If you buy that premise, which of course maybe you don't; but, he says, if you do buy that premise, then you should do everything you can to reduce the amount of suffering, and increase the amount of satisfaction of preferences, as he puts it, in the world. And the fact that these people are unconnected to you personally is just a distraction, right? That's a prejudice, he'd want to say, a hang-up from a pre-scientific bunch of prejudices you have about morality. But if you accept that you should save a drowning child even if your shoes get wet, then by the same logic you should try to save children who are further away. Just because you can't see them doesn't matter.

Clif Mark:

Right? If every child matters, then they all matter

Paul Sagar:

equally.

Clif Mark:

Even Bengali children are worth a pair of shoes, right? That's basically the very low bar that Singer is trying to establish.

Paul Sagar:

Exactly.

Clif Mark:

So suppose I'm convinced by the cosmopolitan impartiality of the utilitarian position, and I say: fine, I'm not going to buy shoes this time, I'm just going to send the money I would have spent on shoes to famine relief. Are we done? Can I just get on with my life and start thinking about my next pair of shoes yet?

Paul Sagar:

So Peter Singer wants to say no. Because, okay, you've foregone a couple of hundred dollars on nice shoes. But look at your life compared to the life of these Bengali children. Look at how much more you have, look at how much better your life is going than theirs, even once you've sent the $200. So surely you can afford to give a little bit more, right? I mean, your life is so much more comfortable than theirs.

Clif Mark:

$400? $500? $1,000? What is this going to cost me? When does Peter Singer stop asking for my money?

Paul Sagar:

Well, more or less until you've given away everything above what you need just to survive at a fairly basic level. And Singer himself does seem to live fairly close to this code. But his conclusion is that everybody should be giving away pretty much all of the surplus that they have to those more needy than they are, because that is a way we can directly improve the amount of pleasure or happiness, or the amount of satisfaction, that there is in the world. So it's not enough just to not buy those shoes and send the money to charity. You've got to give away pretty much everything above the necessities you need to survive.

Clif Mark:

Okay, so just to recap: why is this utilitarian? It's because the pleasure that I'm going to get, the happiness that I'm going to get, out of that pair of shoes, or out of whatever money I have from being a rich Western person, is going to be much smaller than the happiness that these children will have by virtue of being able to eat and not starve to death.

Paul Sagar:

Exactly. The sort of pleasure or utility that you'll get from a pair of shoes is just vastly outweighed, Singer thinks, by, you know, not dying, not drowning, not starving, or by having a life in which you can grow up with some kind of education, some kind of opportunity, the other things the money could be used for. It's basically this idea that utilitarianism compares these different outcomes and endorses the one where the outcome is more beneficial.

Clif Mark:

So this is alienating and repellent, and very compelling, in a very specific kind of philosophical way. It makes sense to me: it is hard to say, no, I'd rather have shoes than save a child's life. However, people look for reasons at this point in the example. Like I said, we've both taught it; students' hands go up: why should I give away everything? I worked for this, I deserve it, this is mine. Who are these strangers on the other side of the earth who have a right to take everything that I've shaped my life around? So what do you say to them, as a Peter Singer utilitarian?

Paul Sagar:

So again, the Peter Singer utilitarian response here is just the hard-headed one. It says: well, that's just prejudice, and that's just selfishness. What you're basically doing is saying that you are more important than these other people, just because you were lucky enough to be born into a culture and an economy where you were given far more at a younger age and had better life chances. But that's all just a matter of luck. There's nothing about you that's special, nothing about you that means you deserve these advantages. And the least you could do now is try to level other people up a bit closer to where you are. What you're really expressing here is a kind of prejudice, a kind of selfishness, a prejudicial favoring of yourself. That's the hard-headed utilitarian answer that I think is most consistent, because utilitarianism says

Clif Mark:

the resources go wherever they can cause the most benefit. And the most benefit is not me updating my shoes every year.

Paul Sagar:

Exactly. It's children not starving, or, you know, mosquito nets in Africa.

Clif Mark:

Right. So here's the question. This way of thinking makes sense if it's just one pair of shoes for one drowning child. But once you tell me that I have to give away everything, until I'm living at a subsistence level just to buy mosquito nets, and that every decision I make is governed by that, it just sounds like a monstrous way to think, and no one should ever think that way. So how does utilitarianism make sense as a way of moral reasoning at all?

Paul Sagar:

Good. So one way to think about this that I find helpful is that there are times when utilitarianism makes perfect sense, and those are usually political situations; there are times when we'd probably have to be utilitarians. But there's a big question about whether utilitarianism can really make sense at an individual level, as a personal ethic. And whether, in fact, in both cases, at the level of personal morality but also of political morality, there are lines which we should be deeply, deeply suspicious of ever crossing, which utilitarianism not only wants you to cross, but wants you to say there's nothing wrong with crossing.

Clif Mark:

Give me an example of utilitarian line crossing.

Paul Sagar:

So there are a few that tend to be used in this area. The first one is about stealing people's organs. The second is about a guy who has to go into a job he doesn't like. And the third has some very problematic language that we may need to modify for a contemporary audience. But maybe we'll start with the famous organ theft example, which crosses the line between political and individual ethical questions; it sort of straddles the two. So the thought here is: imagine you had five people in a hospital, and they're all dying of some kind of organ failure. One's got liver failure, one's got kidney failure, one needs a new heart, and they're all going to die. These are all immensely popular, hardworking members of society, all of whom have loving families. And indeed, some of them are scientists who are on the brink of making major discoveries which are likely to improve human wellbeing on aggregate. But they're going to die, these five guys, or maybe women, or non-binary people, whatever gender they are; these five people are at death's door. And then there's this one person, right? He's a recluse. He lives a very, very healthy life. He has no friends, he has no family, he's a hermit, and he contributes nothing to society, and he never will. But he has a very, very healthy heart, very healthy lungs, very healthy kidneys. And the challenge to utilitarianism here is: well, why not kidnap this person, cut out his organs, and transplant them into the more popular, loved, socially productive people? Save five lives, five lives which are likely to go on to benefit humanity immensely, and sacrifice the one hermit that no one cares about. No one knows him, and he was never going to contribute anything anyway. The challenge to the utilitarian is: what's wrong with doing that? What's wrong with ambushing people and stealing their organs and giving them to more, quote unquote, deserving individuals?

Clif Mark:

Well, utilitarianism says nothing's wrong with it. It says you should do it.

Paul Sagar:

Well, the consistent utilitarian says nothing is wrong. The consistent, hard-headed utilitarian says: look, you find that squeamish, you find that unpleasant, but that's because you're saddled with an outdated morality that can't make these kinds of hard choices. At that point, I think the philosopher Bernard Williams, who you mentioned earlier, really does have the right answer, which is that the only thing you need to resist utilitarianism at that point is a sufficient willingness not to be bullied. If the utilitarian says the right thing to do is to kidnap people and steal their organs, well, then we're at a point where there's not much left to be said in terms of philosophical analysis. The thing to say here is: no, you don't understand why people matter, and why we'd be in the business of making ethical judgments in the first place. So the conversation's over. And I tend to think that's where we are: if somebody really wants to say the right thing to do is to steal organs from the innocent, there's probably nothing I can say to convince them. But there's a very, very good self-refutation there of why we should probably not want to be utilitarians.

Clif Mark:

It's a self-refutation because utilitarianism tells you to steal people's organs, and that's just obviously wrong.

Paul Sagar:

Exactly. And if the utilitarian response is: well, your idea of "obviously morally wrong", your intuition, just needs to be rejected in favor of utilitarianism; well, then we need to know why. Why should it be rejected in favor of utilitarianism?

Clif Mark:

Well, how about this. Since we're on surgery, I do have to bring up the basic utilitarian line of defense, and we don't have to go too deeply into it. They'll just say: well, look, we can't make a rule of ambushing people for their organs, because we don't want to live in a world like that. No one would even come into the hospital, because they'd be afraid that as soon as you put them under, you're going to start looting their bodies. Overall, living in a world where we did steal people's organs would be a worse world. So utilitarianism says that, despite the appearance that it would improve utility, it's better just to have a rule where we don't steal people's organs.

Paul Sagar:

Good. So that's the standard utilitarian response, right? It says: look, you've got to think about all the consequences, not just the consequences of saving these five people. And if we had a system where people's organs could be stolen, then actually the overall consequences would be, on balance, worse. So we wouldn't do that; don't worry, utilitarianism doesn't license it. But here's where doing some good technical work in philosophy, interrogating the thought experiment, can be really helpful, because I don't think that's a particularly good response from the utilitarian. What the utilitarian is doing there is gambling that the calculation will come out in a way that fits with ordinary moral intuitions, saying: oh, don't worry, we won't actually make you harvest people's organs, because it would never be the right thing to do on our calculation. And I think the correct line of response here is: hang on a minute, it's not about the calculation. It's not about how the summing-up comes out. It's about the fact that you were even prepared to consider it. Because the utilitarian has to be prepared to consider the possibility that it might create the greatest utility to steal people's organs. And here we can change the thought experiment: we're only going to do it once, and it's going to be completely in secret, so no precedents will be set; in fact, we're going to pass a law afterwards banning this kind of activity to make sure it never happens again. And you say to the utilitarian: I'm going to really push you on this, you don't get to just declare that the calculation will never demand it. Number one, it might demand it. And number two, why are you even prepared to think about it in those terms? It shouldn't even be up for debate. Ambushing people and stealing their organs is something we just don't do. And we don't refrain from doing it merely because a calculation said that, on balance, it would lead to worse consequences. That, again, is the kind of thing the philosopher Williams, who was very critical of utilitarianism, called "one thought too many". If the reason you think you shouldn't ambush people and steal their organs is that it might, on balance, lead to unintended bad consequences in future, you don't understand why you shouldn't kidnap people and steal their organs. The reason you shouldn't do that is that people have a certain intrinsic moral worth, for which we should have respect. They have rights over their own bodies that we, again, should respect. That's why you don't attack them, right? Consequences be damned; the calculation be damned. And the utilitarian can't say that. And because the utilitarian can't say that, I think something's gone wrong in the way they're thinking about ethics.
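To see why that reply is a gamble, here is a toy version of the act-utilitarian sum for the organ case, with invented utility numbers. Nothing in the theory fixes these weights; the sketch only shows that once the side effects are stipulated away, the calculation really can demand the harvest, which is the point being pressed above.

```python
# Toy act-utilitarian sum for the organ-theft case.
# All utility values are invented for illustration only.

def act_utility(lives_saved, lives_lost, value_per_life, side_effect_cost):
    """Net welfare of the act: lives gained minus lives lost minus knock-on harms."""
    return (lives_saved - lives_lost) * value_per_life - side_effect_cost

# As public policy: fear of being harvested keeps everyone out of hospitals.
as_policy = act_utility(lives_saved=5, lives_lost=1,
                        value_per_life=100, side_effect_cost=1_000)

# One secret, never-repeated case: by stipulation, no deterrence, no precedent.
one_off_secret = act_utility(lives_saved=5, lives_lost=1,
                             value_per_life=100, side_effect_cost=0)

print(as_policy)       # -600: the standard reply — the rule makes things worse overall
print(one_off_secret)  # 400: side effects stipulated away, the sum says "harvest"
```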

Clif Mark:

Good. So what the surgery thought experiment shows is that sometimes utilitarianism gives you these really gross conclusions, and that's one big objection to it. I now want to talk about some examples where utilitarianism doesn't necessarily give the wrong answer, but just doesn't capture how we do moral reasoning. The example I want to start with is from Bernard Williams's article against utilitarianism, and it's about a guy named George. George has just got his PhD in chemistry, he's on the job market, and he's having a tough time finding a job. And it's getting really stressful, because he has a young family; he and his wife have a young child that they're having trouble taking care of. And one of his profs says: I know you're having trouble on the job market, and I know this really isn't your thing, but I know some people at a biological and chemical weapons factory. I could get you in with them, you could go over and design some bioweapons, and that'll solve your feeding-your-family problem. So what should George do in that situation?

Paul Sagar:

So it's actually worse than that. It's not just that if George doesn't take the job, his own family will likely be destitute; it's that this other person, Jackie, will not only do the job, but do it really well, and make really, really effective chemical weapons. George could do the job just well enough not to get fired, being a sort of grit inside the machine, whilst also feeding his family. If he doesn't take the job, Jackie's going to take it. So not only will his own family be destitute, but someone who's going to do worse things in the position will get the job. So that's the challenge.

Clif Mark:

Right. So on the utilitarian view, he should just take the job, because he's probably going to do less good for the biochemical weapons industry, assuming you think that's bad, and he's going to benefit his family. So overall it's just win-win. Whereas if he lets Jackie take the job, soon she's going to be hiding viruses in bats and shutting down world economies.

Paul Sagar:

Exactly. And George's family will suffer. And so the interesting point about this thought experiment, one that Williams wants to make really clear, and that people often miss, is that...

Clif Mark:

Well, first of all, I should say that I don't want to take the job, if I imagine myself as George.

Paul Sagar:

I get that. But you can also see how someone like George might take it, right? There's a genuine conflict here. He may decide, on balance, that he has to take it; he doesn't want to, but in order to feed his family and to stop Jackie, he's going to do this thing. I wouldn't blame him exactly, and you wouldn't blame him if he did. But you could also understand if he refused, right? And what Williams wants to say here is that there's genuine ambiguity in this scenario; there's a genuine sense in which there's a real tragedy in George's decision. On the one hand, if he does take the job, then he's going to be directly implicated in producing chemical weapons, doing something which he thinks is deeply morally wrong; there's a cost there to George, and that's something that should matter. On the other hand, if he doesn't do it, then there are objectively going to be bad consequences: more people will likely die, and his own family will suffer. And one thing Williams wants to say here is that there's a sense in which utilitarianism can't even make sense of there being a cost here, some kind of genuine tragedy at stake. Utilitarianism wants to say: look, it's obvious what he should do, he should just take the job. But it's not obvious. The utilitarian says: well, the calculation is obvious, George, take the job, what's the problem, move on. But that reveals a certain tone-deafness to the texture of ethical experience that, again, I think is important to us.

Clif Mark:

That seems true to me. And this happens to a lot of people when they're thinking about their careers and trying to choose a job, right? Biochemical weapons is just an extravagantly evil example, but there are lots of jobs that people might think are ethically compromised. I know a lot of scientists who went to research for big pharmaceutical companies, which a lot of people think are unethical, and a lot of people who went into consulting or public relations, or at least considered it, or wound up working for Facebook or Google or all sorts of big companies that a lot of people think are not necessarily doing good in the world. And my point is that when people are contemplating this question, it doesn't always seem like just utilitarian calculation. That's not all they're thinking of. They're also thinking about what they personally want to do, and what kind of person they want to become through whatever career they choose.

Paul Sagar:

Yeah, so this is why I'm quite suspicious of the so-called effective altruism movement, as are a number of others. Just to recap, the effective altruists' idea is that if you take a high salary and give a large proportion of it away, you can do more good than if you work for a charity, where you earn less money. So instead of working for Save the Children and trying to help children directly, what you should do is go and work for JPMorgan, get a huge salary, and give 90% of it away to good causes, which is more effective at promoting social change or improving the lives of others. I think a lot of people share the suspicion that there's something morally distasteful about the attitude being reared here, which is that it's totally fine to go and work for the kinds of corporations which we have lots of reason to be very suspicious of, many of whom have done real damage to our society. Look at what Facebook and Google have done to the democratic process in the last 15 years. It seems rather problematic to say: oh well, just go work for Google, but make sure you give a lot of your salary away to buy mosquito nets for people in Africa, because that will promote the most good overall; nothing to worry about here, move on.

Clif Mark:

Good, good. I just want to take a second to explain effective altruism, and why it seems so weird to people and made so many headlines. So, effective altruism: this is a movement, I believe it started in Oxford, and it's directed at upper-middle-class, high-performing young people who want to do good in the world. They're inspired by Peter Singer, so they take a utilitarian approach. And they say: okay, if you want to do good in the world, if you really want to save lives, we know that there are jobs at the top of society where you can get paid a lot of money, and you can get into them. So instead of wasting your life volunteering or working for a charity, go work for Goldman, get that big salary, get that fat bonus. And when you get it, instead of spending it on nice cars and vacations and apartments in London, just fork it over to the Gates Foundation, because Bill Gates has got a really good deal on mosquito nets. That will actually be the most efficient and effective way to save lives, better than working for a charity, so that is the most moral thing to do. And the reason I think this is so alienating and weird and interesting is that the premise is someone who just wants to go and do good in the world, to help as much as they can, and you're advising them to go into this completely opposite culture, this kind of corporate shark tank where money is the most important thing. And you say: go live that life, live the life of a hyper-competitive, overachieving investment banker, but do it in the service of charity. And that just feels like weird and odd advice to give someone. And so that weirdness might reveal that there's something fishy about utilitarianism, right?
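Here is a toy version of the earning-to-give arithmetic that drives this advice. The salary, donation rate, direct-impact figure, and cost-per-life number are all hypothetical placeholders, not real charity-evaluation data.

```python
# Toy earning-to-give comparison. All figures are hypothetical.

COST_PER_LIFE = 4_000  # invented cost of saving one life via donations

# Path 1: work directly for a charity.
charity_lives_per_year = 5  # assumed direct impact of the charity work itself

# Path 2: take a high bank salary and donate a large fraction of it.
banker_salary = 400_000
donation_rate = 0.5
banker_lives_per_year = banker_salary * donation_rate / COST_PER_LIFE

print(charity_lives_per_year, banker_lives_per_year)  # 5 vs 50.0
```

On these made-up numbers, the calculation recommends the bank by a factor of ten; that is exactly the logic, and exactly the weirdness, being pointed at here.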

Paul Sagar:

So I have a thought about that. I think what it is, is that utilitarianism wants to reduce everybody's life to nothing but a kind of cipher of utility promotion. So your life isn't something which is internally complex and meaningful specifically to you and your family, lived in a particular context and a particular history which is ultimately extremely complex. What your life really boils down to is a little node in a giant calculation, which you can influence by doing certain things to send pleasure in one direction rather than another. And I think what a lot of people find intuitively repulsive about utilitarianism at this level, even if they can't always articulate it, is that your life becomes nothing but a cipher for utility. And that's all it's worth. That's all anybody's life is worth.

Clif Mark:

You're a slave to the starving.

Paul Sagar:

Yeah, and even they themselves don't matter as starving Bengali kids. They matter simply as negatives in the great spreadsheet. And something's getting deeply lost there about what human beings are.

Clif Mark:

I love that: the great spreadsheet.

Paul Sagar:

Yeah, that's a good one. And of course, nobody has ever actually been able to fill in the great spreadsheet, or do any real calculations on it. That's an important thing to always remember about utilitarianism: it promises these great calculations of promoting pleasure and avoiding pain or whatever, but no one has ever actually come up with a reliable algorithm to do it. And how could they? How could it be possible to reduce the sum total of human lived experience to preference satisfactions? And again, that's where it's important to just say: hang on a minute. This intuitive idea that it's good to help the many at the expense of the few, when possible: there's a kind of democratic idea there, a social reforming idea. There's a history behind utilitarianism: it emerged in the nineteenth century amongst radicals who looked at a deeply unfair, deeply unequal world and wanted to make it better. And there are still times when a utilitarian argument is welcome in certain political contexts. The problem is when it's taken too far, when it presumes to encompass everything, presumes to be the only game in town. And that's when it's appropriate to step back and say: hang on a minute, that can't be the right way to think about my life or your life.

Clif Mark:

How do you tell when it's been taken too far?

Paul Sagar:

So when it's failing to understand things that ordinary human beings can get a grip on, and can understand as things that matter, utilitarianism has lost its way. And again, the important point here, and here I think we can come back to another famous example of Williams's, is that it's not just about utilitarianism coming up with the wrong outcomes. It's not just about it declaring for things that we don't like, because we don't like the consequences. So in the example we had before, of cutting people up for their organs, we just think utilitarianism has the wrong answer, right? It's come up with an answer that says go kidnap people and steal their organs, and we say: no, that's the wrong answer, that's not acceptable. But utilitarianism can also go deeply wrong when it gets the right answer, but in the wrong kinds of ways, or in ways that push important aspects of ethical life out of the picture in a way that we should resist. And this is Williams's so-called "Jim and the Indians" example. I'm using the original terminology because it's a very famous philosophical thought experiment; of course, we probably wouldn't use that language now, but I keep it for clarity, for those who may recall it from their university days. So, Clif Mark, what do you remember of this one?

Clif Mark:

Jim is on vacation somewhere in South America. He walks into a town square, and he finds a sweaty, fat military officer named Pedro, maybe some kind of death squad guy, who is about to execute 20 indigenous people, "Indians". And Pedro says: yeah, I'm about to kill these guys. But hey, Jim, you're a greatly honored foreigner, and that makes it a special day. I'll give you the chance to shoot one of these guys yourself, and then we'll let the rest of them go. And so Jim is faced with this choice: does he kill one of these people? Some of them are defiant, but most of them are terrified and begging him to shoot one of them, so that at least most of them can live. And so that's the thought experiment: what should Jim be thinking about when he makes this decision, and what should he do? And what Williams says is that the utilitarian answer here is ultimately the correct one.

Paul Sagar:

The utilitarian answer here is that he's got to shoot one indigenous person, because if he doesn't, 20 are going to die. And we can all agree that one person dying is less bad than 20 people dying. But what Williams wants to draw attention to is how much else is going on in this scenario. You described it really well there, Clif: some of these people are terrified, they are crying, some of them are resigned, because they already know. Pedro is there, sweating away, laughing, because he does this every day and gets a kick out of it. And poor Jim, poor Jim, who was just out in the jungle collecting his botany samples, didn't want any part of this. He's never held a gun before. He doesn't want to kill anybody. And he's terrified that Pedro is going to shoot him too if he doesn't engage. So there's a real grittiness, a visceral, emotional, psychological horror to this situation, which utilitarianism encourages us to forget about, or simply to turn into another part of the calculation. But the really important point Williams wants to make is this. Let's say Jim, disgusted and horrified, terrified as he is, decides: look, there are 19 people here who are going to live if I just kill this one person; I have to do it. And let's say Jim does it, he does what the utilitarian says and shoots one person to save 19, and Pedro, a man of his word, at least for today, lets the other 19 go. And then the question here is: what does the utilitarian miss from this scenario?

Clif Mark:

Right. Because the utilitarian says: well done, Jim, you should be proud of yourself. You shot the guy, you saved 19 people. Go home, treat yourself on the plane.

Paul Sagar:

And Williams says: that can't be right. If that really was Jim's attitude, he'd be some kind of psychopath. If Jim's attitude was really, oh well, faced with this choice, I did my calculation, saved 19, killed one, pretty good day, think I'll have a cocktail; that would repulse us, and it should repulse us. Again, the utilitarian here is missing the point, which is that what Jim has been forced to do is violate his own integrity. He's been forced to commit a horrific act of violence, under duress, by another person, and that is going to have profound repercussions for his sense of who he is as an ethical agent. He's going to live with that for the rest of his life.

Clif Mark:

Okay, so you mentioned integrity as an important thing that utilitarianism leaves out, and it's an important concept in Williams's argument. So could you just say a little more about it?

Paul Sagar:

So Jim wanted to be the kind of guy who went through life just collecting botany samples, minding his own business. He just wanted to be nice to other people; he's quite reclusive; he didn't want to hurt anyone. But he was forced into a situation where he became somebody he never wanted to be, and did a thing he never wanted to do. And he's changed as a result of that: he's not the same person he was before, and he never will be again. And that's an assault on Jim. Now, of course, there's also a very serious moral wrong here concerning what happens to the, quote unquote, Indians. And we might want to say something about how interesting it is that Williams reached for this example, in this language, and that it focuses on the white man being faced with this awful thing he had to do. But putting even those relevant concerns aside, there is an issue here about this person being morally changed and abused in a way that cannot be captured simply by taking his sense of himself, his sense of who he is and what he's going to live his life according to, and just weighing that against, you know, how many pizzas somebody had, or whether lots of people down the road needed new organs and we stole some organs from poor people. To collapse all of human ethical experience into these kinds of interchangeable categories is to lose touch with what makes human ethical experience complicated and rich, and what makes tragedy a real thing. The utilitarian worldview kind of wants to eliminate moral tragedy; it wants to say that there will be no remainders if we get our calculations right. But that should set off alarm bells. Because if we get rid of moral tragedy, we've probably gotten rid of an awful lot of what makes our ethical experience worth having in the first place.

Clif Mark:

Okay, I get that. But to take the other side: in this case, it sounds like Bernard Williams is saying, and you're saying, that utilitarianism tells you what to do, you should save the 19 people, but you should also feel bad about it. So the right ethical thing is to do the utilitarian thing, but then to feel all moody about it.

Paul Sagar:

Good. So that's the response that has been made by some utilitarians: well, what is it that Williams actually wants here? R. M. Hare, who was Williams's teacher at Oxford and a staunch utilitarian, sort of ridiculed this as a kind of self-indulgence, saying: oh, you just want to go around feeling bad about this kind of thing? Well, okay, then feel bad about it; you're allowed to do that, go on. But I think that misses the point. It misses the point that the utilitarian doesn't understand what's at stake here. It's not simply about Jim feeling bad. It's about Jim being forced to do something horrific and having to live with that. And also about the 20 Indians, or at least the 19 survivors, having to witness something horrific. And that matters in and of itself, not simply because people feel bad, not simply because it's part of the calculation. And if the utilitarian genuinely doesn't understand what the problem is here, well, then I don't think there's anything more that can be said. This is the kind of point at which philosophy gives out.

Clif Mark:

Well, I think the way you're describing it, if he feels bad and everyone feels terrified, then you can capture that. If it's about bad feelings and terror, you can add it to a calculation, I think.

Paul Sagar:

You can. But that misses the point

Clif Mark:

Right, so what is the point, if it's not that? You just keep describing it as this incredibly traumatic, bad experience, and therefore something that cannot be added into a calculation. But those are exactly the things that can

Paul Sagar:

be added into a calculation. But that's to miss the point of what's wrong with what's going on here, which is that somebody is being forced to do something which, on balance, has good consequences, but the reasons they're having to do it, and what it does to them as an agent, someone identifying with their own life as a continuous experience of developing a certain kind of character, all of that is now being completely fucked with by an external power, in a way that the utilitarian can't make sense of as something that matters in and of itself. It can only matter to the utilitarian insofar as it can be shoved into a calculation, and that's to miss what's wrong here.

Clif Mark:

Okay, so I'm going to have a go at explaining this to see if I've understood it, and I'm going to relate it to the example of George with the weapons.

Paul Sagar:

Yeah, so they're closely connected. Of course, there's the same kind of thought going on here.

Clif Mark:

So, Williams says it's integrity that everyone loses under utilitarianism. And you mentioned something about becoming a kind of entry in the grand spreadsheet. I think this is easier to understand if we fill in a little bit on the integrity side, on what we're missing when we become an entry on the big spreadsheet. And that's this idea that all individuals have values and projects and commitments that are important to us; we have loved ones, all these things that make up who we are and that we're committed to. And utilitarianism completely takes that away, right? It's all about impartial, universal happiness, calculated and aggregated over the entire globe. And that's what makes it so horrific: all you're allowed to think about is saving starving children and mosquito nets. It doesn't matter if you want to work for a charity, or be a dancer, or do anything else. If the calculation shows that the way you can buy the most mosquito nets is to work for Goldman Sachs, that's what you do, your identity and values be damned. And that's, I think, the sense of integrity that Williams says utilitarianism loses: we don't have any integrity, in the sense that we don't have any commitments or values that are consistent and that we can call our own.

Paul Sagar:

I think that's completely correct, and that's a really, really good way to put it. And again, it links back: there's a time and a place where it's right to put aside people's individual projects and commitments. A global pandemic, say: we've got to make some hard decisions, and we've got to make them fast. That's the time when it's appropriate to be utilitarian, when we need our governments to think like that. But the Singer example about giving away all your money is exactly the time when it's not appropriate to think like a utilitarian.

Clif Mark:

Well, we're getting close to time, but I want to try one more idea on you. You've mentioned that maybe the times when we need utilitarianism, when it's most appropriate, are political decisions. And I guess that's because, for political decisions, the stakes are so high that you don't want someone's personal scruples about getting their hands dirty to stop them from doing the most good for the huge number of people under their influence, right? So for political decisions, utilitarianism makes more sense; in personal life, it's maybe not such a good way to think, because you lose your sense of integrity. So: is this why politicians have such a hard time having any integrity? And is it right for them not

Paul Sagar:

to have any integrity? Well, that's where I think utilitarianism can again become dangerous. If politics is reduced to nothing but utilitarian calculations, then it becomes very easy for a certain stripe of politician to say: hey, I'm just doing whatever is required, I'm just being hard-headed and making those tough decisions in order to promote the greatest good. But often what that really means in practice is: I'm going to sacrifice people who I don't need, who don't vote for me, or whose support I don't need, under the guise of it being for the public good. Now, I don't think that's anywhere near enough of an explanation of why politicians often lack integrity. That's a big question.

Clif Mark:

Okay. So let me ask it this way. We're saying that utilitarianism means you can't have integrity, that it doesn't leave any place for integrity. And we're also saying that very often we want politicians to be utilitarians, because we can't have them being too fussy about getting their hands dirty; when the stakes are aggregated over a whole population, we need them to make that calculation. So I'm not saying politicians are bad because they have no integrity. I'm saying: if utilitarianism is appropriate at the policy level, isn't it the case that we want our politicians to have no integrity, so they can be good utilitarians?

Paul Sagar:

Right, so there are times when that is true. There are times when we want them to say: look, just drop the bomb. For example, I have a controversial view, which my students have got very angry about before, which is that I think, on balance, dropping the nuclear weapons on Japan to end the Second World War was probably the correct decision. Truman's first responsibility was to American troops. The estimated cost of an invasion of the Japanese home islands was a million US casualties, and many, many millions of Japanese civilian casualties, because the Japanese would not have surrendered until every man, woman, and child was dead. Given that alternative, using nuclear weapons, which also, by the way, sent a very strong message to Joseph Stalin to stop where he was, was, in my view, the right thing to have done. I think Truman made the right call. That's not a popular view, but I think in that case he did. But if he had said: oh, I'm not going to drop the bombs on Japan, I'm not going to drop any bombs anywhere, because of my integrity, because I don't want that on my conscience, I don't want to be like Jim who had to kill the Indian; well, then something's gone wrong, right? There are times when we need our politicians not to have integrity. The problem is that the ones who are prepared to step down from any integrity are also the ones who are likely to be very dangerous, and there's no mechanism for regulating when they're making the right call. The great German sociologist and political theorist Max Weber described this as the great dilemma of politics: leaders have to know both when to stand up for their integrity and refuse to do certain things, and when to let go of their integrity and do them anyway, because the situation demands it. Making that judgment call is incredibly difficult. But one thing I don't think utilitarianism as a theory can do is make that judgment call any better, because utilitarianism just says: stick it in a calculation. And often the realities of politics can't be reduced to a calculation. So utilitarianism in practice is likely to be no help at all. We have a problem of judgment, about when to hold our politicians to a standard of integrity and when to ask them to give it up for the greater good. And utilitarianism can't tell us when to make those calls.

Clif Mark:

Excellent. Paul Sagar, thank you very much for coming on. I think that is a good place to leave it. Let's do it again sometime.

Paul Sagar:

Thanks again for having me. Dude, that was awesome. Thank you.