
Susan Liautaud: We Must Democratize Ethics and Rethink Greedy Business Models

The ethics expert discusses problematic incentive structures, rising employee activism, and the need for board accountability.

Photo: Susan Liautaud and John Etchemendy

YouTube’s algorithm learns what we like and can deliver more of it, but can also lead us down paths to more extreme views. Social media platforms have perfected ways to keep us online longer, our eyes glued to their sites. But some of these business models, while profitable, are ethically problematic, says corporate ethics expert Susan Liautaud.

“There’s a business model that is maximizing numbers of eyeballs on screens for a maximum amount of time,” she says. “We’ve all seen The Social Dilemma, and there have been many experts who have talked about addiction and various other aspects of social media that can be harmful not only to informed adults, but to teenagers, to children. And at that point it’s pure greed.”

Liautaud, a Stanford HAI advisory council member who teaches an ethics course at Stanford Law School, notes that some companies and governments have begun to counteract these problematic models. China, for example, set a curfew and time restrictions on online gaming for anyone under 18.

“It is possible, if not to have concrete fixes like that, at least to dramatically increase the transparency about the risk,” she notes. “We have cigarette packages that say smoking kills. We could have social media terms of reference that say in big red letters, this is addictive.”

In this latest Stanford HAI Directors’ Conversation, Liautaud, author of the new book The Power of Ethics: How to Make Good Choices in a Complicated World, and HAI co-director John Etchemendy explore the changing world of corporate ethics. She explains why ethics should never be confused with corporate social responsibility, how boards should be accountable for a company’s unethical business practices, and why the growth of corporate ethics boards offers great potential. She also shares her thoughts on the rise in employee activism and the need for the democratization of ethics.

Watch the full conversation, or read the transcript below:

Full transcript:

John Etchemendy: Hello, and welcome to our Directors’ Conversations series, where we talk to multidisciplinary leaders in AI. With me today is Susan Liautaud. Susan is a corporate ethics expert. She guides global corporations, nonprofits, governments, and academic institutions on issues of ethics and governance. She also teaches an ethics course here at the Stanford Law School. We’re lucky enough to have Susan as a valued member of the HAI Advisory Council, and we’re delighted about that. Susan recently wrote a book titled The Power of Ethics: How to Make Good Choices in a Complicated World. It’s just been released by Simon & Schuster. So I’d like to use this opportunity to talk to Susan about some of the ideas that come up in the book. Susan, welcome.

Susan Liautaud: Thank you so much for having me here, John. It’s really a pleasure.

John Etchemendy: So let me start with what seems like a common misconception, or at least what I gather is a misconception. When we think of companies and ethics, sometimes the first thing that comes to mind is corporate responsibility. Now you distinguish corporate responsibility from corporate ethics. Could you explain the difference?

Susan Liautaud: Sure. So what we’ve come to know as corporate social responsibility can mean things like corporate philanthropy, programs for the community, volunteer programs for employees, or similar initiatives. They may very well be run ethically, but even those kinds of good initiatives can have ethical problems. And even if they don’t, that doesn’t substitute for ethical decision making throughout the running of the organization, and CSR initiatives don’t act as erasers for misconduct elsewhere.

John Etchemendy: Oh, they don’t. So with corporate ethics, then, you’re referring to the day-to-day, to everything the company does: are the decisions the company makes ethical decisions? And you want to talk mainly about that, is that right?

Susan Liautaud: Yes. It’s how do we make decisions, and how do we integrate ethics into day-to-day decisions. Some of them will be at the level of the board or senior management and will be very significant and very long-term. Others might be more day-to-day, and nobody has time to belabor every decision, but the question is how to make it a habit rather than an event or an occasional offsite.

John Etchemendy: So, maybe related, one of the things you bring up is the idea of democratizing ethics, and I’m curious what you mean by that. Isn’t ethics by its nature something that we all have to take into consideration, or is that just my old-fashioned approach to things?

Susan Liautaud: Well, I think it should be. But I think the world has gotten so complicated and the ethical issues have gotten so complicated that a lot of people might think that they just don’t have access or just aren’t capable of having an opinion on the ethics of certain things like AI or like gene editing or even like certain business models, like shared economy business models or things that are very complicated. But what I mean by democratizing ethics is making ethical decision-making accessible to people from all walks of life. And at the same time spreading what you just said, John, which is that it’s all of our responsibility. Everything is not the fault of the tech company leaders. Everything is not the fault of a rogue scientist. And everything is not the responsibility of government and company leaders. And the flip side of that is that leaders do have to think about new ways to be listening, to be giving everybody a voice.

John Etchemendy: Susan, if I may, let me just read from the very beginning of the flap on your book, and I know you don’t write the text in the flap, but I think it’s really quite interesting. It starts by saying, “It’s not your imagination. We’re living in a time of moral decline. Publicly we’re bombarded with reports of government leaders acting against the welfare of their constituents, companies prioritizing profit over health, safety, and our best interests, and technology posing risks to society with few or no repercussions for those responsible.” So those are pretty strong statements. I’m curious whether you think we are going through an ethical crisis. It seems to me that maybe we are, but I’d like to know what your perspective is on that.

Susan Liautaud: I should preface my response with the fact that I’m very pro-technology and I’m very pro-business. So this book is not in any way, shape, or form a rant against Silicon Valley or a rant against capitalism. And in fact, I think very often we lose sight of the fact that failure to seize the opportunities that technology brings can be one of the biggest ethics risks. And it’s very easy, coming from a privileged position like mine and like yours, to say, oh, we’re not going to go on Facebook, or we’re not going to use this, or we don’t need the AI diagnostics, when here we’re sitting with one of the best hospitals in the country, or in the world, right down the street from us. So we have to be careful about that.

I’m not sure that we’re in a period of more moral decline. I think we’re in a period of more moral complexity. And I think that the interaction of technology with some of the classic moral dilemmas has turbocharged the threat. It’s one thing to be speaking out against an election. It’s another thing to be doing it on social media, when hundreds of millions of people are going to be rallied all at the same time. Everything is intensified in scope and scale and speed. And more and more, because of the complexity of technologies, as I was saying earlier, people from all walks of life feel like, well, I can’t have a voice in this, I just don’t understand it. The one thing I would expect more of from leaders and experts generally is to help the public form their opinions, to tell them in clear, understandable terms what matters, and to help them feel that they can have the confidence to have a point of view on the ethics.

And that’s part of a bigger question we can get into if you wish about the demise of expertise or respect for expertise. But I’m not sure I’m as focused on the decline as just marking what this moment in time is about and how technology intersects with it.

John Etchemendy: I’m interested in what you just mentioned, the decline of expertise. You talk about democratization of ethics. Does this tie into how you think we can encourage that? What are we to do? What should the ethicists do, the professional ethicists?

Susan Liautaud: I think one of the most fundamental threats of our time that I do think is different from the past, in part because of technology, is the degree to which we are willing to compromise on truth. And I fundamentally believe that without truth, there cannot be ethical decision-making. If we think about corporate principles or values, most of them are hitched to truth. Things like honesty or integrity, or even growth in profit — you have to know truthfully how much growth in profit you have. You need information to make decisions. So if you don’t have your facts, if you don’t have quality information, your decision is going to be skewed. And if your information isn’t good quality, you’re not going to know who the stakeholders are, and you’re not going to be able to think about the consequences of your decision over time. So I don’t think there’s any such thing as ethical decision-making without truth. And what has happened, and has been accelerated at least in the past several years, in particular since 2016, is that we started hearing things like alternative facts. Oxford Dictionaries chose post-truth as the word of the year, in 2016 I believe it was. We seem to think that truth is negotiable, and we don’t seem to value expertise as much. And the experts might be academic experts, or they might be the experts within companies. And I think it’s a two-fold responsibility. One is the public’s willingness to disregard facts, and that’s on all of us. And then there’s the experts needing to understand that so much expertise is sequestered in their brains, in the brains of a very few people.

And they have this responsibility that I referred to earlier to make the public understand what’s at stake, because most of us can’t go out there and understand what’s really at stake with AI or with some of these other more complicated technologies. So I think there’s a two-sided responsibility here, but I think it’s really important, because without a common understanding of what the facts are (and the facts may move: we may learn more about COVID-19, we may learn more about how to combat climate change, et cetera), and even further, without a common sense of the importance of facts, we are really in trouble ethically.

John Etchemendy: Let me throw you a curveball and ask a question that just came to mind. So you talked about the decline of respect for truth, of belief in truth even, belief in an objective world. One of the things that distresses me is that it’s true that people do not have agreed-upon experts. For example, there’s not a newspaper that everybody will agree is objective, that you can check if you wonder what really happened. Now that’s a problem, but it’s also something I believe is to a certain extent brought on by the experts themselves. So in the case of the news media, at every level, every outlet has taken sides. I think arguably that academia itself has taken sides, or at least is viewed as having taken a side. What do we do to combat that? What do we do to regain the respect of the public, the trust of the public?

Susan Liautaud: So, first of all, I agree with you completely that there’s been a shift in the commitment to neutrality that the news media used to have. People pick on Fox News, but quite honestly, there are many other news outlets that are just as guilty, if not of the same politics, then at least of the same lack of commitment to neutrality. So I couldn’t agree with you more. And I also couldn’t agree with you more on the difficulty in academic institutions for conservative voices in particular, and on people feeling uncomfortable if they’re bringing conservative voices. And this is something that I see all over Europe, not just in the U.S.

It’s a very tricky situation, but again, I think it comes down to the responsibility of the experts. It is a huge privilege to be an academic in a significant academic institution. And I believe that part of that privilege is to allow space for diversity of views, and to proactively work to seek out those views and to hire for those views. Obviously, within reason. I’m not talking about hiring people who would incite violence or anything that would be inappropriate. But I think everybody loses without that diversity of views.

As you say so well, everybody loses respect. The institution loses respect, but frankly so do some of the academics who come across as presenting a single point of view or as intolerant. On a slightly smaller level, in my classes, in particular a large seminar I teach called Ethics of Truth in a Post-Truth World, everything is very sensitive, and I try very hard to welcome diversity of views and to make students comfortable expressing views that go against the seven or eight classmates who came before them. But it’s very difficult at the level of faculty. And it’s not helped by the fact that occasionally there are extreme cases where you really do question whether it’s some equivalent of inciting violence or just truly inappropriate.

John Etchemendy: Let me change topics a little bit and get back to the corporate ethics topic. I think that some of the things we have lately been learning to be ethically problematic about some companies, or some corporate behavior, are actually inextricably tied to a company’s business model. And one of the examples that I often use is the YouTube recommendation engine. So on its surface, I appreciate, and I think we all appreciate, that YouTube has an algorithm and that algorithm finds more of the kinds of videos that I like to watch. And that feature keeps me engaged. I enjoy it. They’re very good at finding things that I want to watch. But we’ve also learned that those algorithms can lead people to incredibly problematic content, or if not problematic content, they lead us deeper and deeper into more and more extreme versions of the views that we were initially inclined to like.

I think that’s problematic for society. And yet it’s not that YouTube is doing anything horrible. They’re giving us what we like. So what do we do in a case of that sort, where the business model seems to be at odds with something like ethics?

Susan Liautaud: That’s a great question. I think, first of all, companies have a responsibility, even if we say we like something, to be thinking about the ethics of it. It’s not because we like it that they aren’t responsible for telling us that there’s an extraordinary amount of sugar in it, or that it could be otherwise harmful to us. The example you raised is particularly interesting. And let me just say, I am not a technologist, so I come to the technology side with a tremendous amount of humility. But the idea of getting more and more extreme or siloed is an example of one of the biggest drivers of the spreading of unethical behavior. Silos by definition do on an individual level, in this YouTube example, what we just discussed in academic institutions: they expose us to less and less, and they make us more and more confident that we’re right no matter what, because we keep getting reinforcement and we keep getting told we’re right.

And another fundamental ethics challenge is being able to step back and question ourselves, in particular when we’re in these muddy waters. So I think these algorithms are extremely problematic, and I wonder at what point, both at a governance and management level but also at a technology level, we’re going to be able to find a way to solve these problems, because we know that these companies can eliminate, for example, child pornography. So why can’t they adapt the algorithms in different ways and still give us what we like, while doing less of what you just described? But there’s another kind of business model that is in the same league. And that is maximizing numbers of eyeballs on screens for a maximum amount of time. And that is simply greed.

We’ve all seen The Social Dilemma, and there have been many experts who have talked about addiction and various other aspects of social media that can be harmful not only to informed adults, but to teenagers, to children. And at that point it’s pure greed. There’s a business model that depends on more people staring at screens for longer, and they can do something about that. And I think I mentioned to you an example that I thought was very impressive, in China, where there was a video game, and the Chinese government finally realized that there was such widespread addiction that they required the company to limit the amount of time that anybody could spend on this video game per day. And not only did they do that, but they did it using eye-recognition technology, so somebody couldn’t collect the passwords of their five friends who didn’t care about it and spend five times as much time on the site. So it is possible, if not to have concrete fixes like that, at the very least to dramatically increase the transparency about the risk. We have cigarette packages that say smoking kills; we could have social media terms of reference that say in big red letters, this is addictive, or some variation of warning about the ethics risk that doesn’t require people to read 25 pages of micro print.

John Etchemendy: So Susan, the Chinese example in particular raises the natural objection that this is the company overstepping, or the government overstepping and being paternalistic. Now we know China’s government tends to be paternalistic. So in the U.S. it’s not clear that at least that sort of government regulation would be tolerated.

Susan Liautaud: I agree with you on that. I think it’s a question of how demonstrable the harm is. We certainly regulate things for teenagers and children, like vaping, for example. They finally came around to regulating vaping because there’s demonstrable harm. If there is demonstrable harm to the mental health of young people, and I really am focused here on children and teenagers, then the government might get more acceptance. But at the very least, and you raised an excellent point, these companies don’t need to wait for the government to act to improve their ethics and their impact on society. And it’s my view, and I say this to corporate leaders all the time, that ethics is your greatest corporate strategy and your greatest strategy as a leader, but if you fail to heed it, it will become your greatest risk. And I think a lot of these companies have plenty of space for proactive, strategic, profit-enhancing ethics. And if they just used that opportunity, they would fend off a lot of unwanted regulation, or at the very least a lot of unwanted attention.

John Etchemendy: This is actually another question I had: what are corporate boards’ responsibilities here? I’ve been on corporate boards before, and I think it’s often very difficult for a corporate board member to really assert himself or herself and suggest that the company should do something differently from the way the company is currently operating. So what do you feel the board’s responsibility is?

Susan Liautaud: I think boards have a lot of responsibility, but I’m very sympathetic to what you said. It’s extremely difficult, particularly in some of these very large companies where founders have controlling shares or there are various governance structures that really take power away from the board. It’s very difficult for board members. And honestly, I don’t know a lot of board members who are personally irresponsible or who lack an ethical approach to business. But as you say, there’s the dynamic of the boardroom, there’s the dynamic of the company and the governance structure, and that can make it very difficult. I do think there are examples where boards could have reacted a lot earlier. I think, for example, of the case of Uber, where sexual misconduct was known, and that behavior is clearly illegal.

That’s quite different from pushing the edges of technology or not quite catching up with the ethical consequences of technology, but it is very difficult. And I think the boards could look at the strategy, for example, and say, “Okay, we are seeing attention from Washington that we don’t want. We don’t want antiquated antitrust laws. So what can we do to show Washington that we don’t need those, or that we have a plan B that will take away the greatest risks and protect the opportunity that we’re providing for society?”

John Etchemendy: One of the things you mentioned in that last answer reminded me of a concept that you introduced called “ethics on the edge”: that our laws lag behind technological advances. Can you say something about what you mean by that term, ethics on the edge? And is there any hope that we can catch up with that edge?

Susan Liautaud: What I mean by the edge is that the law lags very far behind technology. The law has always lagged behind reality; there were no red lights and stop signs before Henry Ford developed the Model T. But what’s happening is that the pace, complexity, and ubiquity of technology are such that the legal system and regulators are having a terrible time catching up. Now, there’s more they could do, and some of them, respectfully, are a bit slow on the uptake. We have known some of the risks of social media, for example, for a long time. So they could be moving a little more quickly. But the reality is the technology is moving so fast that we’re not going to catch up, and I’m not actually arguing that the law should completely catch up.

We don’t want to over-regulate, as you alluded to earlier. We don’t want the government in every decision, for all kinds of reasons. We do need to be able to tolerate a certain amount of risk as a society, and we don’t want to impede beneficial innovation. I don’t think the law will ever catch up, and I don’t think it necessarily should, but it could do better. The other part of this, and this is something that is a focus in some of my classes, is that there are issues that are on the edge, so to speak, that really shouldn’t be. Racial discrimination, for example, and there are many issues like that. It’s mind-boggling that we haven’t been able to sort that out even where there is law. We have a Civil Rights Act. We have decades of social movements. And yet look at the news recently. I think it’s really important to keep in mind that the problem isn’t just technology.

John Etchemendy: Right. As you know, one of the things that HAI tries to do is educate, including legislators, judges, lawyers, and journalists, because we believe that, to use your terminology, in order to at least catch up, or be nearer to the edge and not fall further and further behind, we need to make sure that the people who are creating laws actually have some understanding of the technology so that they don’t create stupid laws, which is often a big danger.

Susan Liautaud: Exactly. And the other thing I would say that HAI does so well is that all of the scholars and other leaders in HAI really model keeping humanity front and center, because that’s where we’ve gone off the rails. We’ve had a number of occasions where technology has been unleashed on society without thinking about the humanity. But when we hear about all of the incredible projects happening at HAI, you really are keeping humanity front and center as the science is being developed and tested and explored and shared. I think that’s an incredibly important model for corporations to look at, because they could follow the model that you’re establishing.

John Etchemendy: Well, thank you. I actually want to move from your book to an article that you recently wrote for the Harvard Business Review, which I thought was really timely and fascinating. It was about corporate ethics advisory boards, and you offer some suggestions about setting these up and how to have a successful one. We’ve seen some prominent ethics advisory boards blow up or end up being ineffective. Are you seeing any companies or industries that are having success with this model?

Susan Liautaud: I think success right now is giving it a good, earnest try with respect to some of the things I set out in that article, like really giving the members of these ethics advisory bodies the space to express their views, and an ability to report back to the boards without individuals being identified, so everybody will speak freely. Then there is a report to the board and to the senior management team: this body unanimously feels this, or there was one outlier who thought that. But nobody’s going to be personally connected to anything; nobody’s going to be exposed. The second thing that can be done well is very much along the lines of what you just said about HAI educating journalists and legislators and the like, which is to make sure that there are technologists and members of the management team in the room for at least part of the discussion, so that everybody knows truly what’s at stake from a technology standpoint and from a business model standpoint.

But I think one of the things that ethics advisory boards can do really well, when management is willing to put them in place, is weigh in on risk, because they’re confidential forums. If they’re done right, they can be the kind of diverse set of voices that we were talking about earlier, the kind that is lacking in certain academic institutions and in the media. They can bring diversity of thought. And I take this model all the way back from, well, let me say instead of take, I unabashedly borrow it from Doris Kearns Goodwin and Team of Rivals. And in fact, even with my students, one of the things I say to them is: over the course of the term, I want you to think about who could be your personal team of rivals.

Who in your life, whether a friend, a professor, a family member, maybe somebody you’re doing an internship with, could play that role? You get three or four people who will be your personal ethics team of rivals, and you get into the habit of looking for diversity and looking for respectful challenge. And I think that’s what these advisory boards do. They become the team of rivals for the board and for the management team. And they can be hugely effective in terms of supporting risk when it’s the right kind of risk, but also, to your earlier point, in giving board members a bit of a foundation for standing up to management in the cases when that needs to happen.

John Etchemendy: It’s interesting. This is neither here nor there, but one of the things I’ve always told search committees, particularly university presidential search committees, is that one of the risks of hiring somebody from outside is that you bring them into the organization and they don’t have established relationships with people who are willing to tell them, you know, you’re doing something wrong, or, I don’t think that’s the right thing to do. Whereas if you hire from inside, there’s a greater chance that there will be those people, not necessarily rivals, but people willing to speak the truth, and to speak truth to power, because they knew the person before he or she rose to power. So anyway, I do think it is very important to have that kind of input when you make leadership decisions.

Susan Liautaud: I think that’s right. I had the privilege of chairing the search for the president of the London School of Economics. We had a wide variety of candidates; we did a global search, which is the tradition in the UK, and we used a search firm, which is also the tradition in the UK. It was quite a long process. And there was a lot of discussion about exactly your point, John, which is: how would you as a leader assure that people are going to speak truth to you when you come into this organization? So it’s really interesting. I’m learning a lot from this moment, from your view that it’s easier when people are internal. And I think that makes a lot of sense.

John Etchemendy: I want to move to a slightly different topic, and that is a phenomenon that we’re seeing lately: the rise of employee activism, particularly in some of the technology companies, where employees seem to be rising up and questioning their employers’ ethics or their decisions or their business choices. I’m curious how you think companies should address these kinds of mismatches between the leadership’s judgments about what should be done and the mass of employees’ judgment about what should be done.

Susan Liautaud: There’s no question that today employees have a lot more knowledge and a lot more power, if only because of technology and cell phones and social media and the like. Whether it’s Amazon employees responding to Jeff Bezos’s decision to continue to sell cloud space to fossil fuel companies, or whether it’s Google and drones for the military or a sexual misconduct case, there’s no question that pretty much every corporation has some variation of employees speaking up at different times. And to a certain extent, I think that’s good. Now, on the employee side, what I would say is: pick your battles. You are an employee, and you can’t be speaking up about everything. You have to really think it through and be sure that you’re convinced of the ethics of your argument, and not just have a knee-jerk reaction or just join the crowd.

So I really do encourage employees to pick their battles, and leaders have the decision authority in the end. But one of the things that happens is that even when it’s pretty clear that an issue is going to be controversial or potentially controversial, one of two things happens: either, understandably, the leaders just wait and see what happens, or there are a lot of town hall meetings. That seems to be the thing to do. Town hall meetings are by definition open spaces where anything can come up in any way. A conversation that takes something out of the binary, by contrast, can be much more constructive and specific.

We think we need to be doing this for the following reasons; how can we mitigate some of the risks you see? Not yes or no. And if we can’t as a company mitigate some of the risks you see with our decision, maybe we ask how we can give employees who really just can’t quite reconcile their conscience with the decision an opportunity to move into a different project or a different part of the company. I know that Brad Smith at Microsoft is trying to do that and says, “Look, we’re a big company, and we can find space for you somewhere else.” That doesn’t mean you change every day. It means that if there’s one really big thing, they might help you not work on something that you find really difficult. But this comes up everywhere.

And another place it really came up, and I mean this to be a politically neutral comment, is in the Trump administration, with members of the military and the State Department facing a conscience-versus-country question. I interviewed a number of people who were trying to understand: how do you reconcile the two? And there’s great guidance in general from wonderful people like Jim Mattis, who as you know is at Stanford, but this conscience-versus-institution question is coming up a lot. And the final thing I would say is that I get this question from students a lot. I get students asking me, well, how do I hold my employer accountable? And the first thing I ask is, “Well, how are you holding yourself accountable?”

And the second thing I ask is: have you looked at everybody else holding them to account? Have you looked at the annual reports that are produced for regulators, and at the industry experts and the academic institutions that are interacting with them? What can you learn from that? And then we go on from there. It’s not to say that individuals and young people shouldn’t be holding their employers to account, but it is to say that I try to encourage a questioning mindset and a self-reflective mindset, as opposed to an attacking mindset.

John Etchemendy: That’s terrific. I want to thank you for joining us; this has been a wonderful conversation. I wonder if you have any final pieces of advice. What kind of advice do you give to your students on the final day?

Susan Liautaud: There’s one particular point that I think is important for this generation, which is that perfectionism has no role to play in ethics. Perfection is neither a laudable goal nor an achievable one. And when people set impossible goals, whether it’s “I absolutely have to get into this one medical school,” or a particular sales target, or whatever it is, there are only two things that can happen. One is that people will cheat to try to achieve the impossible, so that’s clearly not an ethical path. The other is banging one’s head against the wall, and that’s where we see these epidemics of mental health problems, in particular among young people. So one of several messages, along with things like the team of rivals, is really about resisting perfectionism. That’s really something that I think this generation needs to learn.

What are your lessons for students? I’d be really curious to hear them.

John Etchemendy: I wasn’t expecting to be interviewed. What lessons do I give? I love the one that you just mentioned, which is something I never would’ve thought about. As far as ethical behavior goes, one of the things that I try to impress on students is that it’s going to be hard. You’re going to find yourself in a context where you’re surrounded by people, and you think, there’s something wrong with this, there’s something wrong with this direction we’re moving in, and you’ll look around and see that nobody else seems to feel that way. And that’s when it’s hard. That’s when you have to decide what exactly to do. And here, Susan, your point about choosing your battles is right: sometimes you’re just wrong; maybe they’re more experienced, and they may be right.

On the other hand, sometimes it really requires you to step up and say, “Have we thought about this?” Or, “Have we thought about the ethical consequences of that?” And when you do, often somebody else will join in and say, “Yes, I agree.” Those are, I think, the hard situations that our students face, and that we are not as good at preparing them for when we teach ethics in the abstract. In the abstract, it’s always: what’s the ethical decision when I’m making my own decision about how I’m going to behave? Am I going to lie or am I going to tell the truth? And that can be easy, or maybe not easy, but when you’re in a context where it’s a joint decision and you seem to be the odd man out, as they put it, that’s different. So I talk to them about that and ask them to think through how they would react to situations of that sort.

Susan Liautaud: Yeah, I think you’re right. I think it is really difficult, and it’s incredibly valuable to convey that it is difficult and that they should expect difficulty. The other one that I would just highlight is that I am always asking people to stay anchored in reality. I tell students, you can do ethics all you want outside of reality, but reality will come back to bite. So you need to stay focused on reality and consider the implications of your decision for the person who will be most disadvantaged by it. And then consider that that person is you, and see how you feel about it.

John Etchemendy: That’s terrific, Susan. Well, thank you so much for talking to us today. I’m delighted, and I know that our listeners are going to enjoy this. I want to mention to our listeners that you can find Susan’s book now at your favorite bookstore, if your favorite bookstore still exists, or get it online from Simon & Schuster, the publisher, or from one of the online booksellers. Also make sure that you have a chance to visit our website or our YouTube channel to find more conversations like this. Susan, I look forward to talking to you again soon at HAI.

Susan Liautaud: Thank you so much, John. It’s such a privilege.
