In the My Story of Trust video, this is what I say:
This is the story of how I see trust, how we got to the current state of trust, and where we might go with it.
I happen to believe that long ago, all around the world, we used to understand how to live in community on the Commons.
On the upper left there is a bunch of megafauna. First, we managed to kill off all the megafauna, because humans were pretty efficient at killing large, slow animals. So we killed off the megafauna, and then slowly, over time, we figured out how to live in community on the Commons.
This was hard-won knowledge, because a lot of communities wiped themselves out. The ones that destroyed their Commons overfished, or cut down all their timber, or, think Easter Island, name your societal calamity; but very often it was because we fought each other to death or eliminated that which sustained us.
If you go to the Dagara tribe in West Africa, if you go to the Quechua and Aymara tribes in South America, if you go to lots of communities around the world and you probe and you ask about their indigenous ways of knowing, their cosmologies, the stories they tell, the words in their language, you'll find they have words like reciprocity and stewardship and so forth — really words about how to build and create community and how to take care of your Commons.
Somewhere along the road we shattered this. We systematically went around the world and destroyed it, in fact. It's a painful topic. It causes lots of controversy. A bunch of people don't particularly want to have this conversation; its causes are multiple.
My own take on it is that somewhere around World War I we sort of finally lost faith in humans. Before World War I we have, you know, the automobile, the airplane, electrification; a whole bunch of technologies seemed to be making life better.
We're figuring out how a lot of things work, and then World War I is a stupid war. We feed people into machine guns; millions of people die. The front across the middle of Europe never moves more than 14 miles one way or the other for four years. We just exhaust each other in this war.
Then at the end of the war we proceed to do a lot of stupid things. The end of World War I basically primes the world for World War II through a whole series of stupid decisions. But on the whole we kind of lost faith in humans in here, and we started handing institutions over to mostly engineers, managers, and white men to design things for efficiency, scale, and often profit.
Why efficiency and scale? Well, there are so many humans, population is growing, we've got to make sure our systems are efficient, so we've got to build, you know, these big industrial sorts of systems, and we told these designers: don't worry about the woo-woo stuff.
We kind of demonized relationships, meaning, purpose, society, trust: important terms between humans. They were soft stuff, they were intangibles, they were very hard to invoke on command. So we kind of told these designers, no, no, build for efficiency and scale, automate and use, you know, coercion when you need to because these things are not very reliable.
So these designers went and industrialized every sector of human activity, not just industry. They also standardized, automated, commoditized, and, a word I really like, consumerized every sector of human activity.
What do I mean? "Consumerized" comes from "consumer," so think consumer goods: toothpaste, floor wax, you know, consumables, all that kind of thing. And I don't mean just laundry products, you know, Tide and Dove. I also mean culture.
Long ago we used to memorize poems and plays, we used to put on musicals ourselves, we would play music for each other and a few people still do that but generally it's like, is it Nintendo or Netflix and chill? We've automated culture, we've turned it into consumer mass marketing stuff. It's more stuff to acquire and do and pay for.
Our identities:
Well, chances are you don't think either thing, because there's a bunch of people in the fashion industry and other places that would like you to buy more stuff to look cool and hip and trendy for the next season.
So we've sort of consumerized both clothing and identity and all the artifacts we put on our bodies. That's how we create our identities: through the purchasing of brands, these secondary attributes. That's the consumerization of identity.
Same thing for health, our food system, shelter, spirituality, governance.
We no longer come together in town halls and make decisions together about how to govern.
Generally, what happens is every four years we have a major election, where we get asked for money over and over and over again, because that money is going to be put into the advertising machine to basically prompt us to vote for A or B. That's not actually governance, that is some kind of consumer mass marketing politics.
We consumerized everything and I can go way deep.
Anywhere you see a little green marker here (in the Prezi), that means I could go off on a tangent here and if we had a magical hypertext that allowed you to go and come back I would use that. But here we are on YouTube.
So we designed most of our institutions from mistrust of the average person. We assumed that, you know, anybody could do something wrong so we might as well design for those people. We designed for them up front.
That means we normalized a whole bunch of really odd behaviors.
For example, the language of advertising is the language of war. Advertising is like going to war with those consumers to try to get them to buy more stuff: you launch campaigns against target demographics, you pay by the impression, you try to achieve market penetration. We're at war.
Meanwhile, we're busy vacuuming up all of your exhaust data and stalking you, as you can see when Zappos follows you from website to website with that same shoe you lingered on for a little too long on the Zappos website. Never mind Zappos, there are plenty of others that are mixing and sharing all of our data to stalk us and get us to buy more stuff.
In other areas, we created a compulsory education system, not trusting that children might be curious and that we might be able to actually teach them well. Our system before the industrialization of schooling — around the Civil War through World War I — basically had thousands upon thousands of one-room schoolhouses that were doing a pretty good job. America was quite literate back in the day.
We're overprotecting intellectual property. Jack Valenti, former head of the Motion Picture Association of America, is famous for having said he would love copyright to last "forever minus a day." Craziness! It's basically stealing from the Commons of culture and information that we all rely on together.
And then we've also militarized our police. Something goes wrong, we send in a SWAT team. In schools, a situation that would previously have gotten a child sent to the principal now brings in police or SWAT, and all kinds of really bizarre things are happening.
I'm calling these things weird names, the stalker economy, but I think you recognize that these are all things we take for granted. That stalking is what companies are doing right now. The compulsory education system is where we put our kids.
The company that does a great job of protecting its intellectual property gets celebrated as a bunch of great entrepreneurs, and I would suggest that these are actually odd behaviors that don't go well with living in community on the Commons.
Sadly, as we did all of this, we turned ourselves from citizens into consumers, or we were turned into them; I'm not sure which way the causality runs. But I know that if you're in government and you're trying to defend citizens, you're trying to defend my right to fair housing and, you know, equity, fairness, access, rights to free speech, things of that nature.
If you're just trying to protect consumers, what you want is everyday low prices. If a big-box store happens to wipe out all the little mom-and-pop stores in town, that's too bad: it led to everyday low prices, so that's clearly a good outcome by that measuring stick. But it wasn't really good for the local town, in particular when that big-box store then closes and goes away. I look at a movement called Reburbia, which attempts to do something with those empty big-box stores.
So as consumers we've ended up alienated, separated from each other, pretty lonely even though we're densely packed; we don't make a lot of contact with other people.
We're kind of stupider than we otherwise would be because we're mere consumers and not engaged citizens.
Alas, keeping trust broken serves a whole bunch of interests. Keeping us not trusting ourselves and not loving who we are, as we are, means we will buy more stuff and need more stuff, because we have to keep up with the Joneses. Keeping us thinking we're not attractive means we'll buy more clothes. These are the simpler kinds of forces.
Keeping us fearful of other kinds of people really serves certain other people. A few parties out in the political sphere have managed to weaponize trust. They've got us fearful of science, of journalism, of foreigners, fearful of the opposition because those people are crazy, fearful of facts because facts are not facts anymore, right?
We're in the post-truth era, the post-fact era.
These things are craziness and this is a rapid spiral downward and you're seeing electoral success around the world as authoritarian populists get elected. You know: Orban, Bolsonaro in Brazil — all over the place! These far-right actors have come into power because they've weaponized trust.
All of this is inflaming recent crises. When you get everybody heated up and start breaking trust and turning people against each other, in particular now when word travels faster and when ordinary people have power tools and power weapons like drones, let's say, you better figure out where the bill is going to get paid at the end.
All of this would be really depressing, and I would like to send a counselor to you right now as you're watching this video, but I want to tell you that we're busy rediscovering trust.
I have found movements around the Earth that are doing something I call Design from Trust. Not design for trust, but rather design from trust: from a working assumption of trust. When I describe these things to you, it sounds a little bit like leaving the key in the front door, and it kind of is, because vulnerability is one example of how Design from Trust works.
So let me say a little bit more about it.
It starts from the premise that people are more trustworthy than we think they are. Our general assumption is skepticism and fear of everyone; if you ever watch the local news in your city, you'll leave fearful. The assumption here, instead, is that you can assume good intent on the part of the next person coming through the door.
If you've done any work in open-source software or on the internet, "assume good faith" or "assume good intent" is a simple meme, a simple phrase in that culture.
But what this leads you to, what it led me to, is the idea that most of our institutions need a redesign from trust because they've been designed from mistrust. We need to figure out how to flip them around.
What on earth does that mean?
Just so you have a more tangible example of what Design from Trust is, the Wikipedia is my favorite example of Design from Trust. Remember the day when you noticed how Wikipedia works, when you noticed this little thing (pointing) that says edit this page. How did you feel? What emotions were running through your mind as you realized that any idiot on earth can come in and edit the world's encyclopedia?
Like, kind of weird, right?
That's an example of design from trust. Hold that thought.
Another example is in governing the Commons. Unfortunately, in 1968 an ecologist named Garrett Hardin wrote an essay, "The Tragedy of the Commons," which stuck in everybody's head. It wasn't a great essay; it was kind of misinformed, because he was making all sorts of assumptions: that basically everybody will eat up a Commons, that greedy behavior will dominate, and so Commons are effectively impossible.
We've got this as a thing in our heads, like "scarcity equals value" or "time is money." There are these little sayings and "tragedy of the commons" lets people dismiss commons right away because no, no, no, they always result in tragedy.
Well, fortunately, it turns out that Elinor "Lin" Ostrom won the Nobel Prize in economics for figuring out the principles for governing Commons. She originally formulated eight. This is a slightly longer list of ten, but following these principles actually helps people learn how to build trust in one another in local communities.
You'll notice in numbers three and nine, for example, that being very local, devolving authority right to the fingertips of the people closest to a Commons, is one of the governing principles. She did some really effective work in figuring out how to take care of Commons together.
Muhammad Yunus won a different Nobel Prize because he gave a very small loan, something on the order of a little more than twenty dollars, to a group of women in Bangladesh, and those women repaid him.
In fact, as this became a mature industry, lots of companies entered it, lending money to people with no collateral, meaning they had no assets to put up that the bank might take away if the loan wasn't repaid. Their repayment rates were better than those for normal loans to people who passed a credit check and did have collateral.
What's up with that? This is an act of trust as well. Microfinance is designed from trust.
I just gave you three examples of Design from Trust, and you might notice that there are two predictable responses. These are a way of noticing when you hit a new system designed from trust.
The first one is, "oh shit, this is impossible, it's crazy, it's never going to work. Who had this stupid idea?" Right?
At this point some people bounce, they don't try the Wikipedia anymore, they dismiss it: "this is impossible so I won't use it."
A bunch of other skeptics just wander on in and then they go to some area in a topic they know a lot about, whether it's Buffy the Vampire Slayer or neurobiology or whatever, and they discover that it's actually really pretty good. And they discover that the Britannica would never have had a page on every episode of Buffy the Vampire Slayer, never mind a page about Buffy Studies, which is an academic field of endeavor.
There you go.
But then everybody goes, "This is naïve! Design from Trust is so stupid because everybody knows there are bad actors in the world." Yes, Wikipedians are clearly aware of bad actors. Vandals come in. If I were 14 years old, what could be more amusing than to go into Wikipedia, change a page, and leave something awful on it?
Except it's easier to revert changes than to vandalize things that are designed from trust. Often, not always.
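That asymmetry can be sketched with a toy version-history model in Python (a hypothetical illustration of the idea, not Wikipedia's actual code): because every revision is kept, undoing vandalism is one cheap operation, while each act of vandalism costs the vandal fresh effort.

```python
# Toy sketch of a wiki page designed from trust: anyone may edit,
# and every revision is kept, so any edit can be cheaply undone.

class WikiPage:
    def __init__(self, text=""):
        self.history = [text]          # every revision is retained

    @property
    def text(self):
        return self.history[-1]        # the current revision

    def edit(self, new_text):
        self.history.append(new_text)  # anyone can edit...

    def revert(self, steps=1):
        # ...but undoing is a single operation: re-publish an older
        # revision (go `steps` revisions back from the current one).
        self.history.append(self.history[-1 - steps])

page = WikiPage("Buffy the Vampire Slayer is a TV series.")
page.edit("lol this page is garbage")   # a vandal strikes
page.revert()                           # one revert restores the page
print(page.text)                        # back to the original text
```

The design choice is that nothing is ever destroyed: vandalism only adds a revision, so the community's cost to repair is constant while the vandal's cost to damage is not.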
So we deal with bad actors in there, and after a while people hit a second predictable response to systems designed from trust. This is again an "oh shit" reaction, except this time it's "oh shit, no way, this really works. This seems to be really working and it feels kind of interesting and good. It feels kind of weirdly warm and connective."
I just want to say that this second insight is at the same time thrilling and really depressing.
It's thrilling because, yes, I want people to have that sense and to understand it and to go look for more. It's really depressing because this is a signal to me of how deeply we've internalized the left-brain, paternalistic, hierarchical, linear, industrial mindset, mostly a male mindset, of control and coercion, which is how everything seems to work. And we accept that that's the way it all needs to work.
We have a long way to go to kind of hit "undo" on all those things and unravel them so that we can get back to some place where we're living inside of systems designed from trust, which have a lot of benefits.
Remember the little petri dish image here where I said we're rediscovering trust? Well, one of these little fuzzy patches of bacteria is actually a collection of movements I've discovered around the world that have already been instituting Design from Trust. None of them call it that; I'm trying to bring that name to the party to help people understand how these things are similar to one another.
I just talked about governing the Commons and wikis, so here's open-source software, crowdsourcing. The Internet was Designed from Trust, and open everything is kind of a placeholder for a whole bunch of things. Open science, amateur astronomy, open government, open scientific journals, garage biology, the public domain, freedom to tinker (which means the freedom to open up things you buy and change them and fix them), open patents.
Some of these different clusters are pretty rich just by themselves. Here we have policing by consent and restorative justice. Down on the left, the sharing economy, microfinance I talked about, the basic income guarantee, peer-to-peer lending, cooperatives. On the right, peer-to-peer everything, open space and unconferences, which are basically self-organizing conference methods. The decriminalization of drugs, for example, time banking, these are all Designed from Trust. And I could, as you might imagine, go on and on and on about any one of these at some length.
Some of my personal heroes are the people who developed these movements.
If I had to summarize, I would say systems designed from mistrust are trying to control crowds. They're trying to make people's lives better, but in order to control crowds, they design to trap and stop the bad actors first, to achieve efficiency and scale. By doing that, they often separate us from each other and limit what we can do within the system. To promote more usage they focus on individualism, but they often have to rely on coercion to actually make the system work.
In systems Designed from Trust, we're really looking for cooperation, not control. We want to deal with the bad actors last, waiting as long as possible before we mess up the system to limit them. In fact, if you can turn a bad actor into a good actor, that is a huge win.
Really what we want is the genius and thriving of the whole, not just efficiency and scale.
To get there, we're going to emphasize our interdependence, not our individualism, so that we can click people into community and get them to understand that when they learn how to work together, when they learn the particular rules of this Double Dutch dance in one of these spheres or another, that's when they actually move into some really good place.
Finally, what are the benefits? Why would you want to design from trust? I've picked my favorite three here.
One is that trust is cheaper than control. Think about the costs of control: let's say you want more security in some place, so you put cameras up, and then you have to record the video, and then you have to store and archive it, and then you have to catch and prosecute violators by going through the tape. There's a whole architecture of security and control behind it. Maybe if the people participating in the space could catch wrongdoers or bad actors and deal with them, that would be a lot cheaper.
Open-source software doesn't require those kinds of protections. I don't know if you're old enough to remember the days of early software, when there were all sorts of copy-protection schemes that everybody griped about: to run things you had to insert a special disk or even plug a dongle into one of your computer's ports to unlock the software. Open-source software and our present software models are much cheaper to operate.
Trust also unlocks abundance.
I went to business school, and one of the phrases they taught me was that scarcity equals value. I think scarcity equals value is kind of a screwed-up way of seeing what's going on. I'll propose to you instead that scarcity is abundance minus trust.
You can have a setting in the world where there is a lot of abundance. If you remove trust from it, the trust that allows us to preserve that abundance and grow that Commons, what you end up with is scarcity. Scarcity equals value is basically a license to any capitalist to go out and create scarcity where there is none on purpose because then there's value; then there's a business model.
I think this actually sucks for the world.
Just go look at the story of water in Bolivia, where a law was passed that privatized all water, including rain that fell on your roof and drained into a bucket you put out by the gutter. Bechtel was supposed to administer it, and the shit hit the fan. It ended badly, but it ended in the undoing of those regulations.
Finally, trust unlocks genius.
When I say "unlocks," I mean there's already genius in the room when you bring people together. It's just that our assumptions about people — that there's so many bad actors, that people are so dumb, that they're not trustworthy — limit the genius we get from any collection of humans.
But in fact there is a whole bunch of genius in there, and if you Design from Trust, you design in ways that let that genius march to where it is best used, whether through self-awareness, feedback from other people, or however that might work. If we have systems that rely on people figuring out where they fit and how best to apply their work in collaboration with other people, we get better results overall.
This brings me to my favorite phrase of the last decade which is also the name of my TEDx talk: "What if we trusted you?"
It implies that we don't trust you right now, because from my history of trust I'm saying that we lost faith in humans, we lost trust in humans, and we've designed everything from the assumption that the next person coming through the door is not trustworthy.
But what if we flip that around?
What if we said, "As a starting point we'll trust you. You might lose our trust, in which case we've got to figure that out, but we're going to start from this assumption that most people have good faith or good intentions and go from there."
That is my story of trust. I hope it's been useful to you. I would really love any comments and look forward to making this better.
Thanks.