
Our Core Values: Privacy, Transparency and Algorithmic Responsibility


Last month, This One celebrated its first birthday. Instead of throwing a party, Johnny and I marked the occasion by reflecting on our journey so far. We discussed many topics, including the wonderful people who’ve joined us, our product-development process and our company strategy. But most of all, we discussed the connective tissue that holds everything together: Our core values.

Ultimately, our company is about two things: making discoveries and making decisions. Looking back over the past year, Johnny and I both agree that the discovery of our core values and the decision to put them front-and-centre have been crucial aspects of our journey so far. Now, as we celebrate our first birthday, we’re excited to take the next step: Sharing our core values in public.

I’ll begin this post by discussing why we regard core values to be so important. I’ll then describe our values framework, list our core values, and provide a short discussion of each. I’ll also describe the mechanisms that we currently use to hold ourselves accountable, along with our commitments for how and when we’ll evolve these mechanisms in the future. Finally, I’ll discuss some of the benefits and drawbacks that we’ve experienced as a consequence of positioning our core values so centrally.

Why So Serious?

Back when we founded This One, Johnny and I immediately recognized that we were operating in a high-stakes arena. To recall my previous post: If successful, we believe that our product could reach millions (or even billions) of people, each of whom could use it to help make multiple discoveries and multiple decisions each day. Therefore, the things we do could have a material impact on many people’s lives. At scale, our product may even create emergent outcomes that impact whole societies. We take this situation very seriously.

Although we are still very early in our product-discovery process, it’s clear that AI will play a central role in our journey. When it comes to AI, we believe that it’s reckless to adopt a mentality of “build things and see what happens”, because doing so can create deeply negative outcomes. The media is awash with stories about how recommendation algorithms (which, as I highlighted in my previous post, are one of the main existing approaches to solving discovery problems) have contributed to the polarization of public opinion, the rising popularity of conspiracy theories, and even the spread of anti-vaccine sentiment.

These examples - and many others - motivated us to nurture a set of core values that keep us moving in the right direction, irrespective of what strategy we pursue or what product we build.

“[Core] values are not about markets or products. They are a set of fundamental beliefs about what the company stands for that can endure the test of time: the ethical, moral, and emotional rocks on which the company is built.” - Steve Blank, The Four Steps to the Epiphany.

At This One, alignment with our core values is a must-have, not a nice-to-have. If we’re considering a product feature that doesn’t align with our core values, we won’t build it. If we’re interviewing a candidate who doesn’t align with our core values, we won’t offer that person a job. If we’re considering a partnership with a company that doesn’t align with our core values, we won’t proceed with a contract.

What are “Core Values”?

During my time at Spotify, I was fortunate enough to participate in the company’s executive-development program. One of my favourite parts of the program was a monthly reading group. Many of the books that we studied still inspire me today, but one stands out in particular: The Advantage by Patrick Lencioni.

The book’s central message is simple: Organizational health trumps everything else in business. Lencioni notes that organizational health requires many ingredients, including an authentic set of company values. Diving slightly deeper, Lencioni argues that a company’s values actually consist of four sub-categories:

  • Core values: “These are the few - just two or three - behavioural traits that are inherent in an organization … They should be used to guide every aspect of an organization, from hiring and firing to strategy and performance management.”
  • Permission-to-play values: “These values are the minimum behavioural standards that are required in an organization. Although they are extremely important, permission-to-play values don’t serve to clearly define or differentiate an organization. Values that commonly fit into this category include honesty, integrity, and respect for others … Permission-to-play values must be delineated from the core to avoid genericism.”
  • Aspirational values: “Aspirational values are the qualities that an organization is aspiring to adopt and will do its best to manage intentionally into the organization. However, they are neither natural nor inherent, which is why they must be purposefully inserted into the culture.”
  • Accidental values: “These values are the traits that are evident in an organization but have come about unintentionally … sometimes they even sabotage its success by shutting out new perspectives and even potential customers.”

Lencioni’s framework helps us to clarify what we mean by “core values”: They aren’t our minimum behavioural standards (those are our permission-to-play values); they aren’t things that we aspire to be true (those are our aspirational values); and they aren’t things that we want to mitigate (those are our accidental values). Instead, our core values are the key traits that define and differentiate us as a company.

Our Core Values


Privacy

To understand our users’ personal preferences, we need to collect, process and store data about them. We believe that our users should be in control of who can access this data, and on what terms. Therefore, our first core value is privacy.

Although privacy is a central topic in technology, it’s surprisingly difficult to define what “privacy” actually means. To ground our discussions in solid foundations, we adopt the definition offered by a recognized policy analyst:

“Privacy is a state of affairs or condition having to do with the amount of personal information about individuals that is known to others. People maintain privacy by controlling who receives information about them and on what terms. Privacy is the subjective condition that people experience when they have power to control information about themselves and when they exercise that power consistent with their interests and values.” - Jim Harper.

We regard it as especially important to take an active stance on privacy due to a widely observed phenomenon called the Privacy Paradox: “Consumers consistently say they want more privacy, but they don't do much about it”. Several industry experts have suggested that the Privacy Paradox is underpinned by a simple factor: Existing approaches to user privacy are too complex. To quote David Temkin, former CPO of Brave: “Privacy’s not gonna win if it’s a specialist tool that requires an expert to wield”. By making privacy a core value, we are committing to helping our users overcome the Privacy Paradox by building our product with this consideration firmly in mind.


Transparency

As a company, one of the most important challenges we’ll ever face is to build trust. We believe that the best way to build trust is to be open and honest. Therefore, our second core value is transparency.

We consider transparency via two specific questions: What information do we share internally (i.e., with our teammates/colleagues)? And what information do we share externally (i.e., with our customers/stakeholders)? The answers to these two questions sometimes differ, so we adopt two definitions of transparency, from Parris et al. (2016):

  • Internally transparent organizations “are open to sharing information within and across departments and teams, and from both top down and bottom up.”
  • Externally transparent organizations “are open to sharing information with stakeholders, such as their current and prospective customers, supply chain members, investors, and partners.”

Importantly, these definitions emphasize that we are open to sharing information with specific groups of people. They don’t imply that we always share all information with everyone, because doing so could create negative consequences. For example, extreme internal transparency could cause our employees to feel like they are under surveillance, and extreme external transparency could eliminate our defensible competitive advantages. To quote Forbes: “Here’s an example of safe [external] transparency: You can disclose the ingredients in your restaurant’s signature meal without giving away the exact recipe.”

Ultimately, transparency is a balancing act. By making it a core value, we aren’t committing to a future where the default is to always share all information with all parties. Instead, we are committing to a future where the default is to share, unless there is a specific reason not to.

Algorithmic Responsibility

If we attract a large user-base, the algorithms within our product could create emergent outcomes that impact whole societies. For example, if we help a single person choose which podcast to listen to today, then we might impact that person’s perspective on a particular topic; but if we help millions of people choose which podcast to listen to every day, then we might create large-scale emergent outcomes that impact global culture and politics. We believe that we are responsible for all emergent outcomes created by our product. Therefore, our third core value is algorithmic responsibility.

We certainly don’t deny that using algorithms to make discoveries and decisions could create negative emergent outcomes. However, we do not regard this as a good reason to avoid algorithms altogether, for one simple reason: The same challenges exist with every other approach to solving these problems. That’s not to downplay the potential severity of algorithmic bias (which we regard as a fundamental challenge of the internet era); rather, it’s to say that algorithmic bias is often easier to measure and mitigate than other forms of bias, such as cognitive bias. Put another way: Any form of decision-making is subject to bias, and we don’t regard algorithmic bias as inherently worse than other potential forms of bias.

What’s more, we believe that if we adopt a robust approach to algorithmic responsibility, then the use of our technology could actually lead to less problematic outcomes in the long-run.

“Technology is creating new opportunities – subtler and more flexible than total transparency – to design decision-making algorithms so that they better align with legal and policy objectives. Doing so will improve not only the current governance of automated decisions, but also – in certain cases – the governance of decision-making in general. The implicit (or explicit) biases of human decision-makers can be difficult to find and root out, but we can peer into the “brain” of an algorithm: computational processes and purpose specifications can be declared prior to use and verified afterward.” - Accountable Algorithms

By making algorithmic responsibility a core value, we aren’t claiming that we will ever eliminate all types of algorithmic or cognitive bias, because those are impossible goals. Nor are we saying that we are exclusively responsible for all emergent outcomes of our technology, because discovery and decision-making always exist within complex, human ecosystems. Instead, we are committing to investing time, energy and resources into reducing bias in our product, and verifying that any emergent outcomes we create are aligned with our intentions.

Holding Ourselves Accountable

For the first few years of our company’s life, our main focus will be on product-discovery work, including developing a deep understanding of our users’ pain-points, building lightweight prototypes to test our core hypotheses, and iterating on early versions of our product. During this phase, holding ourselves accountable to our core values will be especially important, because the decisions we make will set the future direction of the company. However, it will also be especially difficult, because things are in a constant state of flux. We take this challenge seriously, and have therefore implemented four concrete mechanisms to hold ourselves accountable.

First, we have invested considerable time into discovering, defining, communicating and discussing our core values throughout the company. Now, with this post, we also share them publicly. In doing so, we hope to build a framework that allows not only our employees, but also our users (and the wider public) to decide whether our behaviour is consistent with our core values.

Second, for each of our core values, we hold ourselves accountable to a set of standards published by an external industry body.

By benchmarking ourselves against these external frameworks, we ensure that we don’t give ourselves “wiggle room” to shy away from the most challenging aspects of accountability.

Third, we dedicate specific time to collecting qualitative and quantitative feedback on these topics from our employees. In our (anonymous) monthly employee surveys, we ask everyone to rate how strongly we are embodying each of our company values. In our weekly retro, we provide a forum for open and honest discussion about how we could be doing better.

Fourth, we ask all of our employees to speak up if they see or experience anything they feel is at odds with our core values. We strive to build a team that centres on divergent opinions and life experiences, so we expect such conversations to take place frequently - and we see that as a good thing. Except in the most extreme cases (such as an employee deliberately breaching privacy by knowingly leaking sensitive data - which would result in the employee’s termination), we commit to approaching these conversations with a growth mindset. We certainly do not regard them as an outlet for “calling out” our colleagues, because we do not believe that this is an effective environment for growth. In short, we hold each other accountable to upholding our values and learning from our mistakes, but we do that from a place of empathy and respect.

Looking to the future, we commit to evolving our accountability mechanisms by publishing our own policies on privacy, transparency and algorithmic responsibility. In order for us to understand what those policies should include, we first need to spend more time on product-discovery work, but we hereby commit to publishing these documents before our company’s 3rd birthday, which is 18th Nov 2024.

Benefits and Challenges

To conclude, I’ll share some of the benefits and drawbacks that we’ve experienced from positioning our core values so centrally.


Benefits

Attracting the right employees: At This One, we strongly welcome divergent opinions on most topics – but not on core values. Therefore, we integrate our core values directly into our hiring process. We’ve declined several otherwise excellent candidates due to their lack of alignment on core values. While it’s always disappointing to lose a great candidate, we’re proud to stand up for what we believe in, and we stand by this approach wholeheartedly.

Alignment and decision-making: In any company, reaching alignment can be hard. But it’s much harder – and often impossible – if different people approach a situation from fundamentally different positions. Within our company, our alignment on core values makes it less likely that we encounter fundamental differences of opinion on the big picture, so we can start our debates much closer to the specific details.

Strategic benefits: One way to interpret our core values is as a set of principles that help us decide what we do and don’t do. Therefore, our core values are intimately related to our company strategy, and they could even hold the keys to our long-term success. For example, we strive to build a product whose privacy considerations far exceed all mandated minimum standards; therefore, it is unlikely that we will need to halt our development in response to an increase in the standards set out by law.


Challenges

Opportunity cost: The journey of discovering and deciding upon an authentic set of core values has taken time. Investments like this come with an opportunity cost, because every hour that we spend thinking about our core values is an hour that we spend not doing something else.

If you stand for something, you don’t stand for everything: Our strong commitment to our core values has caused us to lose several otherwise strong candidates during our selection process (some because they decided to withdraw; others because they took the core values interview but weren’t aligned in practice). As a result, our hiring took longer, which again created a concrete opportunity cost.

Public accountability isn't easy: We are proud to hold ourselves accountable to our core values, but we are fully aware that doing so won’t be an easy ride. We will almost certainly face public criticism, either because people don’t share our core values or because they don’t believe that our actions align with our stated intentions. In both cases, we will need to think carefully about how to interpret such criticism. If we think a criticism is valid, we will need to think about how to get better; if we don’t, we will need to have the determination to keep doing what we think is right. We don’t pretend that this will be an easy journey, but we think that’s a good thing, because it allows us to learn. If we don’t receive criticism, we will have fewer opportunities to grow.


At This One, our company mission is to make life better for everyone by helping people discover things they truly love. We’re keenly aware that this pursuit comes with great responsibility, and that many things could go wrong along the way. We believe that the best way to stay on the right track is to position our core values front-and-centre. We already hold ourselves accountable to these core values internally. Now that we’ve shared them publicly, we hope that you’ll hold us accountable to them too.

We don’t pretend that this will be an easy ride – but if we were looking for an easy ride, we wouldn’t be pursuing this mission to begin with. Sometimes the hard way is the right way.

We hope you’ll join us for the journey.


Thanks to Sarah Drinkwater and Sumaiya Balbale for your comments and suggestions on this post.
