Workday Recognized as One of the 2026 World’s Most Ethical Companies®

Workday has once again been named one of the World’s Most Ethical Companies® by Ethisphere—our sixth consecutive year on the list. The recognition highlights Workday’s ongoing commitment to business integrity, from our ethics and compliance program and strong corporate governance, to our positive culture of ethics, community impact, and rigorous third-party management.

To dig into what this honor means and how Workday approaches ethics and integrity in practice, we sat down with Rich Sauer, Workday’s chief legal officer and head of corporate affairs.

Q: The environment in which we operate is becoming increasingly complex. Does that complexity make it even more important now to be an ethical company?

Rich Sauer: Absolutely. Substantively, AI surfaces a number of novel questions that we’ve not previously faced, and from a training and enablement perspective, it’s harder to get people’s attention. But Workday has always operated with some degree of complexity. We were founded 20-plus years ago, in the early days of cloud. At that point in the technology revolution, the big issue was privacy. Everybody was saying, “Wait a minute, you want me to pick up my data that I have here in my own data center, or my own servers, or in my basement, and move it over to you, a third-party company? That sounds kind of crazy.”

Trust was key. You had to trust the vendor before you were going to give them your data, and you had to trust that they were going to be responsible stewards of it. That’s how Workday made its name. We made ironclad promises to our customers about how we would treat their data and protect their privacy. It’s their data, not ours.

Flash forward to last year, when everybody was racing to adopt AI. AI was so hyped and exciting that everyone wanted to use it, leverage it, and experiment with it. This year, they’re waking up and saying, “What’s our governance process for managing all this AI that we now have in the workplace?”

If you’re going to adopt a company’s AI and their agents, it’s really important to be able to trust that vendor.

We’re almost back to the same playbook from two decades ago. If you’re going to adopt a company’s AI and their agents, it’s really important to be able to trust that vendor, respect that vendor, and know they have high integrity and operate on a set of principles. Given our 20-year track record as a reliable and trusted partner to our amazing customers, we have a real and meaningful advantage in this race.

Q: Earlier this year, Workday announced it had achieved two top AI certifications for its commitment to responsible AI. Can you share practical examples of how Workday applies its ethics principles to responsible AI and AI agents?

Sauer: We have a great set of principles that guide our AI development and deployment, but principles are the easiest part. Everybody has them, and they all kind of look the same. The hard part is: how do you implement those principles and put them into practice?

When I got here six and a half years ago, we started a Responsible AI team. We now have a team staffed by data scientists and social scientists, and that’s been critical. We created an intake process where we risk-rate the features that Workday is developing. We make sure we understand what they are, and anything that makes consequential decisions affecting human lives is something we pay very close attention to. We curate the development process, we test, and we do everything we can to make sure the results are high integrity and fit for purpose.

Q: Some people see integrity and ethics as “brakes” on innovation. How can a strong ethical foundation actually help companies move faster, especially with AI?

Sauer: It’s true that innovation and engineering groups often see the legal department as slowing them down. That’s not how our engineering department thinks. They understand the importance of what we’re doing, and they themselves want to make sure that what they’re developing is ethical and meets high standards.

Our customers expect high standards in the innovation we’re releasing, so in some ways, it’s self-policing. We do this well because we demand it of ourselves, and we’ve got to do this well because it’s what the market demands.

Q: How do you think about human judgment as you automate more processes with AI?

Sauer: We rate risk as low, medium, medium-high, or high. When something is medium to high risk and involves what’s considered a consequential decision, we have a principle that says humans must be in the loop.

You want to make sure there’s human judgment applied to more sensitive areas.

There are features at the low end that can be fully automated, but you need to be really thoughtful about what those are, what the impacts are, and what the use case is. And you want to make sure there’s human judgment applied to the more sensitive areas.

Q: Ethisphere looks at a range of criteria, including ethics and compliance. Is there anything in that mix you’re especially proud of, or that shapes your priorities?

Sauer: One great thing about the Ethisphere report is that they actually score and benchmark you against others. They show you where you are relative to all the other companies they ranked. It’s a great tool to help you home in on, “We did really well here,” and also, “Here are areas for improvement.”

As a World’s Most Ethical Companies honoree for six years running, we can also benchmark against ourselves. We’re always focused on getting better.

It’s hard to point to one thing I’m most proud of, but I do appreciate that Workday continues to focus on and invest in the environment and social impact. Our commitment in those areas is durable and long term, not a trendy commitment we made once and moved on from.

Q: What does it take to build and sustain a culture of integrity?

Sauer: Our core value of integrity has been there from the beginning, but the things that come to mind are persistence and conviction. This is not a “set it and forget it” area. You can’t just write a policy, walk away, and assume everything will be fine. You have to constantly talk about it and remind people why it’s important, particularly when they have so many things competing for their time.

You can’t just write a policy, walk away, and assume everything will be fine.

Sometimes it can feel like either you have high integrity or you don’t, as if it’s ingrained in you. I don’t think that’s true. You will always have something to learn, and you can get better at it, particularly in the world of AI and increasing complexity.

So you have to be persistent with training and awareness, and you have to have the conviction to always do the right thing even when the most ethical decision might be the toughest one, and especially when no one is watching.
