I'm talking to you today from my home in Brazil, where I live with my wife and two kids. Let me start by asking a question to the other parents out there. Would you consider asking a total stranger, someone you've never met before, never even seen before, to meet your kids after school, put your kids in their car -- which, by the way, you haven't seen either -- and drive them halfway across town? Even just asking that hypothetical question freaks me out. Let me ask you another question. Would you invest in a business that does that -- have strangers driving kids around town? It seems like an absurdly untrustworthy value proposition, an impossible business plan, doomed to fail, doesn't it?
Well, this may come as a surprise to you, but back in 2014, three moms started a company called HopSkipDrive, with this exact model. It's served one million customers, and, in February 2020, raised 22 million dollars and expanded to several cities in the US. Is the business foolproof? Well, no business is, but it's good enough to keep growing. How did they do that? How did they create trust in what many of us believe is one of the most inherently untrustworthy situations possible? The short answer -- they built trust in the overall system. Customers don't necessarily trust HopSkipDrive drivers -- that would be relational trust. But they do trust the HopSkipDrive system -- what we call "systemic trust." And that's what makes it work. I am fascinated by this.
Here in Brazil, people these days tend to say that trust is a rare commodity. I don't think we are alone. Trust appears to have broken down all around us. And yet, the concept of trust has never been so fashionable.
But what is trust, really? Is it a feeling, an invisible part of our human DNA or culture, or this quasi-spiritual thing, like the Force in Star Wars? Or is it really something more concrete? I am an engineer and a consultant -- worse still, with a PhD -- sorry about that. I study the structures and systems of businesses and organizations. So, a couple of years ago, I started wondering whether we could decode and manage this seemingly intangible concept of trust. I'm pleased to report we are doing it, which I believe is really important, because from my perspective, if we can decode how trust impacts businesses, we can make them more successful, which might mean that their partners and employees are more engaged and can be more cooperative. And we, as customers, can be happier, more satisfied and safer when we interact with them.
So, today, I want to present to you the results of our study, and also offer to you a toolbox to build systemic trust. We basically started with a sizable graveyard of over 100 failed business ecosystems. And by "business ecosystem," we mean a business that can only function if all participants cooperate. Care.com, a childcare ecosystem, is a great example. Independent babysitters, independent parents, all have to work together in order to make the system work. Amazon and Apple iOS are also business ecosystems. It is that necessary cooperation that makes those business ecosystems a perfect laboratory to study trust. And in this study, we defined trust as the confidence that someone or something will deliver on a promise or behave as expected. We went into this wanting to understand whether trust was playing any role whatsoever in these failed ecosystems' inability to scale and grow in comparison to their successful peers.
For instance, we studied Orkut versus Facebook. What is that? You don't know what Orkut is? Why doesn't that surprise me? RIM / Blackberry versus Apple iOS or Android, HouseTrip versus Airbnb. I bet you haven't booked your last vacation on HouseTrip, have you? You get the idea.
What we found is that trust does play a meaningful role in the success or failure of business ecosystems. It wasn't always the final nail in the coffin, but it was relevant enough to send more than half to the graveyard. Why was that? Many of the failed ecosystems made the mistake of naively assuming that cooperation anchored on trust would spontaneously emerge between complete strangers. And yet, we found uncooperative behaviors in more than 70 percent of the failed ecosystems. In contrast, nearly nine in 10 of the successful ecosystems actively embedded trust right into the workings of the platform. They built systemic trust. In essence, ecosystems were competing on trust. Trust had become a source of competitive advantage. The question, then, is "How did they do it?" How did they design for trust? When we examined the successful ecosystems, we found seven trust tools embedded in them.
Let me start with the first one, access. Many of the successful ecosystems define very well who is allowed in and who can be kicked out of the platform for bad behavior. HopSkipDrive does access well. It takes the drivers through a strict background check before they are hired into the platform. They also have a zero tolerance policy, which is superclear to everyone, so drivers know they can be terminated if they are caught illegally using their mobile phones while driving.
Next is contracts. Trustworthy ecosystems formalize a relationship with all participants through contracts. If you've ever clicked the box "I agree to the terms and conditions," you signed an ecosystem contract.
Then, there are incentives, and this is a big one. Successful ecosystems encourage cooperation through rewards, or by motivating participants to interact with each other in a positive manner. eBay and Amazon use reputation as an incentive. If you're a seller and you have a good reputation, you can charge higher prices for your products.
Then, there is control, and I know it's a bit odd to talk about control in the context of trust, but we are not talking about forceful control -- it's more like gentle guidance, like an invisible hand nudging you in the right direction. Successful ecosystems shape the behavior of participants so that the kind of cooperation required will emerge on the platform. Uber does control well: it dictates to the driver the best route to take, so the passenger trusts that the driver will not take a longer route just to make some more money.
Then, there is transparency, which is superclear, isn't it? Sort of. Trustworthy ecosystems make past and present behavior visible to everyone participating in the platform. And that's the reason why you feel a pit in your stomach if you've ever booked an Airbnb with a host who is new to the platform and doesn't have any reviews yet. And of course, Airbnb has managed to make transparency work both ways. If you are a guest, and you trash a house, the other hosts will know about it thanks to the Airbnb review system.

Then, there is intermediation. How does the platform act as a middleman in the moments of truth of cooperation? Taobao, Alibaba's online shopping platform, does intermediation when it acts as an escrow agent between sellers and buyers. It basically holds the payment until the buyer says she is satisfied with the product.
Last but not least, mitigation. How does the platform handle mishaps or prevent them from happening in the first place? Did you know that LiveAuctioneers, an auctions platform for art, collectibles and antiques, has a broad protection program that guarantees payments on the platform? That's an example of mitigation.
So those are the seven trust tools, the toolbox. Even more interesting is how they appear to combine in the successful ecosystems we studied. On the one hand, there is no silver bullet, no single tool that can solve for trust. On the other hand, you don't need all seven tools to be successful. You need 3.6, on average. So how do you pick? It depends on the kind of ecosystem you design. If interactions among the participants are key, like in most social-media ecosystems, you will require a combination of access, transparency and control in order to be successful. These are the very tools Facebook uses, and these are the tools, interestingly enough, causing Facebook so much grief right now. When the last mile of delivering on the promise is key, like in most gig economy ecosystems, you will require mitigation to deal with failed deliveries. When there is a large asymmetry of information, say between sellers and buyers in used goods marketplaces, then you'll require a combination of intermediation and mitigation. And of course, when there are many dimensions to the platform, you will require a larger combination of tools.
Let me say one more thing, because I've been a consultant long enough to know that many of the business leaders watching this may be saying, "Hey, this is great. Let's digitize all these tools and we'll have the best and most successful ecosystem ever." Well, before you move to action, let me tell you something. Yes, digital plays a meaningful role in enabling trust, and in some cases, the very existence of the ecosystem. You could say that digital could be the backbone of systemic trust. However, there is no such thing as trustless trust. No matter how fabulous the code, how advanced the blockchain, digital cannot solve for trust alone. And that's why we found nine in 10 of the most successful ecosystems to be bionic trust systems, meaning they use a combination of digital and nondigital, human tools -- such as contracts, policies, governance -- in order to build trust on the platform.
At the risk of showing you my Star Wars fan card again, think about it this way: if you want to build a successful ecosystem, and a trustworthy one, you need to think of the Jedi, the Skywalkers. Every time they go on an adventure, they take their favorite droids with them, R2-D2 and C-3PO. They actually make up a bionic team.
I know today we talked a lot about trust in business systems. However, that's not where the conversation should end. Systems are all around us -- schools, governments, health care. Could those systems become more trustworthy through the use of the tools in the toolbox? I don't see why not. So if you are designing any system, but especially an ecosystem, give those tools a try. If you do that, I can almost guarantee the Force will be with you. Trust me.
Thank you.