Privacy is the New Celebrity

Dr. David Bray on Ransomware, the Uncertainty of Change, and Working for the Government as a Teenager - Ep 10

November 02, 2021 MobileCoin

In Episode 10, Dr. David Bray brings us on a journey into the future of privacy, from ransomware to Sith Lords. David is a distinguished fellow at the Atlantic Council as well as the Stimson Center, and principal at LeadDoAdapt Ventures. David began working for the government at age 15 and has an impressive range of expertise, including cybersecurity, disinformation, military and humanitarian issues, civil rights, energy, biotechnology, and more. David and Josh discuss ransomware as one of the most significant threats of the 21st century, and David explains why it's never worth paying the ransom. David ponders the last 150 years of technological innovation and explains how we're currently witnessing four technological revolutions simultaneously. He also shares his fears that our desire for certainty in an ever-changing world could lead us towards autocracy. Josh reminds us that only a Sith deals in absolutes.

[00:02] - Speaker 2
Welcome back. I'm Joshua Goldbard, the founder of MobileCoin, and this is Privacy is the New Celebrity: privacy, technology, and everything in between. This is our 10th episode. Four hosts, ten guests, and we're still going strong. Thank you for staying with us. If you've enjoyed the show, don't forget to subscribe on your favorite podcast app. Now down to business. Last week we spoke with Alex Feerst, who sits on the MobileCoin Foundation board of directors, and we're continuing that theme with another board member. Today we have Dr. David Bray on the line. David is a distinguished fellow at the Atlantic Council as well as the Stimson Center, and principal at LeadDoAdapt Ventures. He's also a current board member of the MobileCoin Foundation. David has an enormous range of expertise, including cybersecurity, disinformation, military and humanitarian issues, civil rights, energy, biotechnology. You name it, David has worked on it in our government. To give you a taste of just how brilliant our guest is: David began working for the US government at age 15, on computer simulations at a high-energy physics facility investigating quarks and neutrinos.

[01:24] - Speaker 2
He is the recipient of so many awards that I lost count. And today we can add to that amazing list of accolades: esteemed guest on Privacy is the New Celebrity. David, it's so great to have you with us today.

[01:38] - Speaker 1
It's great to be here, Joshua, and thank you for that very humbling introduction. I'm really glad to be here.

[01:42] - Speaker 2
David, you've held so many important roles over the years in such a wide variety of fields. When people ask you what you do, how do you answer that question?

[01:53] - Speaker 1
Usually I say: if I'm fortunate, I make a difference in your life two to three years from now. And if I'm successful, you probably don't know about it, because it's about getting ahead of the curve on how technology and data might disrupt our society in a negative way, and what choices we can make to steer them in a positive direction, to try and heal our planet, heal our society, and heal the world.

[02:17] - Speaker 2
Healing the world is a noble ambition. Tell us about your work at the GeoTech Center.

[02:23] - Speaker 1
So, yes, back in late 2019, I was approached with the opportunity to create a new center at the Atlantic Council called the GeoTech Center, basically focusing on how data and technology are changing geopolitics, and how geopolitics is changing data and technology. And it would involve standing up a new center. The Atlantic Council itself has been around since 1961, and it was really created to remind the United States, in a time of polarization (even then we had polarization between political parties), that we seemed distracted from the reality that we're stronger together with allies: stronger together with allies across borders, stronger together with allies across sectors.

[03:03] - Speaker 1
And so that's what the Atlantic Council's original mission was. And in some respects, it was perfectly fitting in 2019 to be looking at what's happening in the world and saying, well, data and technology, if we're not careful, could increasingly be this source of polarization, whether it's the disinformation that's going on, whether it's not understanding how technology is changing the world, or concerns about surveillance and loss of privacy. If we don't get ahead of this, we may find that our civil societies pull themselves apart, particularly in representative forms of government like the United States, but also among allies in Europe, in Asia, in Africa, in South America, thinking about how representative forms of government can leverage technology and data to uplift and empower their people.

[03:45] - Speaker 1
And then on top of that, we also wanted to create a commission that was bipartisan, including both Democrats and Republicans, including active senators and representatives, to actually map out, when we say tech for good or data for good, what the actionable recommendations are that we can act on. So the good news is, even though we launched in the midst of the pandemic (we actually launched on March 11, the day COVID was declared a pandemic, which always looks suspicious, launching the day a pandemic is declared), 14 months later we had bipartisan recommendations that got praised in Europe, in Australia and New Zealand, and also in Singapore, and that actually mapped a path.

[04:21] - Speaker 1
When we say tech for good, what does that mean for secure data? What does it mean for trust in the digital economy, for resilient supply chains, global health technologies and biotechnologies, and commercial space as well?

[04:33] - Speaker 2
I noticed that you said there's an element of privacy in your work. Can you talk about how privacy intersects with the work at the GeoTech Center?

[04:41] - Speaker 1
So I'd go even further than just the GeoTech Center and say I've always been a passionate believer that part of any society that embraces individual freedoms and individual choice includes the choice as to when you want certain things to be private, or when you want certain things not to enter the public domain. And I think that's just fundamental, whether you're in the United States, whether you're in Europe, again our North American allies, South America, Australia, New Zealand. That is fundamental. And as we look at what's happening with technologies, it's worth noting that concepts like privacy have always come after the technologies.

[05:21] - Speaker 1
It's worth noting that you didn't really have concerns about indecency, there weren't laws against indecency, until people started moving into cities in the late 19th and early 20th century, and it became conceivable that you could be in a building, look out a window, and see something you didn't really want to see. That necessitated indecency laws. We didn't really pass the Privacy Act in the United States until the 70s, and that's because we then had these things called mainframe computers and advanced data processing, in which correlations could be reached about someone who may not have given permission, because the computers could actually bring the data together.

[05:57] - Speaker 1
And so we're now in an era, I think, that is actually surpassing the 70s, where I'd say it goes even further, to almost the digital dignity and integrity of oneself, and to finding ways that civil societies aren't zero-or-one, where either everything about us is known or nothing is. It's about coming up with a way that individuals can make choices, almost like a rheostat you can turn, as to when you want to share information and when you don't. That is part of your integral self, whether it is your digital persona, your physical persona, or even your biological persona, because I think we're soon going to have questions about when you actually have the right to say, I want to withhold my biological information or my biometrics from being shared,

[06:39] - Speaker 1
Because I'm not comfortable with what you're going to do with it.

[06:42] - Speaker 2
Can you give some examples of how that would work in real life? Like, how do you apply these ideas?

[06:46] - Speaker 1
Yeah. So one of the things that I'm really looking forward to seeing done, and that I'm actually trying to help champion right now at LeadDoAdapt Ventures, is the idea of what are called data co-ops or data trusts. This actually dates back to 2017, when the UK government was looking at how AI was going to impact the country. Lord Tim Clement-Jones, who is a good friend now, was the lead for an effort there that produced a report saying we need this idea of data trusts, in which people bring their data together for specific purposes.

[07:19] - Speaker 1
They may bring it together to inform better public health activities, or to help with new drug discoveries, or to inform what's being done with crops. You can imagine small farmers and large farmers sharing data while preserving oversight of and choice about their data. How that would work going forward is that you could pool your data and choose how much you want to share. Maybe you don't want to share your specific age and birth date, but you are willing to share that you're in a certain age range, between 25 and 30.
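To make that age-range example concrete, here's a minimal sketch of this kind of attribute generalization in Python. It assumes a simple fixed bucket width; the function name and the five-year buckets are illustrative, not something any particular data trust prescribes.

```python
from datetime import date
from typing import Optional

# Illustrative bucket width; a real data trust would negotiate
# the granularity its members are comfortable sharing.
AGE_BUCKET_WIDTH = 5

def age_bucket(birth_date: date, today: Optional[date] = None) -> str:
    """Share a coarse age range (e.g. '25-30') instead of the birth date."""
    today = today or date.today()
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    low = (age // AGE_BUCKET_WIDTH) * AGE_BUCKET_WIDTH
    return f"{low}-{low + AGE_BUCKET_WIDTH}"

# The member contributes only the generalized attribute to the pool:
print(age_bucket(date(1994, 6, 15), today=date(2021, 11, 2)))  # "25-30"
```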

[07:50] - Speaker 1
So it's almost like differential privacy of sorts. And you share that, and it helps to inform public policy; it may even inform private sector actions. But it's the community coming together, so you still have a locus of choice and control. And I imagine eventually, because these devices and sensors are going to be surrounding us and everywhere we go is going to be monitored, you're going to need your own sort of digital agent or avatar that is your broker to all these different services.

[08:21] - Speaker 1
And so as I walk into a cafe, or a restaurant, or a shopping mall, and the sensors pick up that it's David, they're querying my digital agent as to what I have given permission to reveal about me or not. If I do reveal certain things, maybe I'm getting an offer that says, here's a 30% off coupon, or here's a flash sale for this cup of coffee, or whatever it might be. And I may also say, look, I want to be in incognito mode. So it's a choice architecture.
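As a rough illustration of the broker David describes, here is a toy permission agent in Python. Everything here, the class shape, the policy format, the attribute names, is hypothetical; it only shows the choice architecture of per-attribute permissions plus an incognito switch.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalAgent:
    """Toy broker that answers attribute queries on the owner's behalf."""
    incognito: bool = False
    # attribute -> set of venue categories allowed to see it
    policy: dict[str, set[str]] = field(default_factory=dict)

    def query(self, venue_category: str, attribute: str):
        """A venue's sensor asks for one attribute; the agent decides."""
        if self.incognito:
            return None  # reveal nothing; treat the owner as a stranger
        allowed = self.policy.get(attribute, set())
        if venue_category in allowed:
            return OWNER_ATTRIBUTES.get(attribute)
        return None

# Hypothetical owner data and policy, for illustration only.
OWNER_ATTRIBUTES = {"age_range": "25-30", "coffee_club_member": True}

agent = DigitalAgent(policy={"coffee_club_member": {"cafe"}})
print(agent.query("cafe", "coffee_club_member"))  # True -> maybe a coupon
print(agent.query("cafe", "age_range"))           # None -> not permitted
agent.incognito = True
print(agent.query("airport", "coffee_club_member"))  # None -> incognito
```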

[08:44] - Speaker 1
And then when I get ready to go to the airport, maybe I've given up a little bit more information because I want to go through the TSA PreCheck lane or the CLEAR lane, so I get on the plane faster. Or maybe I chose not to, and so I do the more strenuous check because I want to be in incognito mode, and they have to basically treat me as a stranger and verify that I'm safe to get on the plane.

[09:03] - Speaker 1
But it's a choice, and it allows for the wide variety of human choices, as opposed to what I see some nation-states doing right now, which is basically giving you no choice whatsoever: they're becoming surveillance states and saying all your data must be provided to the government.

[09:19] - Speaker 2
One of the things we really believe in at MobileCoin is the possibility of an alternative future where people are able to maintain some financial privacy in the 21st century. One of the ways we accomplish this is with a technology we describe as oblivious computing. This is the ability to store information on a server you don't control, where the operator of that server cannot abuse that information, cannot even understand what that information is, but can know that it is correct. And I actually think this is a potential way you could satisfy some of the things you talked about, like being able to pay at a coffee shop without that coffee shop knowing who you are or what the rest of your transactions are.

[09:57] - Speaker 2
And I think that's kind of an essential thing in the world you're imagining, where we have privacy, choice, and the ability to interact with society. And getting all those things together is really, really difficult.
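The episode doesn't spell out how MobileCoin's oblivious computing works, so the sketch below is not that design. It is a toy in Python showing just two ingredients of the idea Josh describes: the server stores only ciphertext it cannot read (the key never leaves the client), and an integrity tag lets the record be checked as correct without decrypting it. Real oblivious computing additionally hides access patterns, which this toy does not attempt; the HMAC counter-mode stream here is for illustration only, not a vetted cipher.

```python
import hashlib, hmac, os

class BlindStore:
    """Toy server: stores opaque blobs; it holds no key and cannot read them."""
    def __init__(self):
        self._blobs: dict[str, bytes] = {}
    def put(self, blob_id: str, ciphertext: bytes) -> None:
        self._blobs[blob_id] = ciphertext
    def get(self, blob_id: str) -> bytes:
        return self._blobs[blob_id]

def client_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy stream cipher built from an HMAC counter; applying it twice
    # with the same key decrypts. Use a vetted AEAD in real systems.
    stream = b"".join(
        hmac.new(key, i.to_bytes(8, "big"), hashlib.sha256).digest()
        for i in range((len(plaintext) + 31) // 32)
    )
    return bytes(p ^ s for p, s in zip(plaintext, stream))

key = os.urandom(32)                    # never leaves the client
server = BlindStore()
ct = client_encrypt(key, b"latte, 4.50, 2021-11-02")
tag = hashlib.sha256(ct).hexdigest()    # correctness check value
server.put("txn-1", ct)
# Later: verify the server returned the record unmodified, then decrypt
# locally; the server never saw anything but random-looking bytes.
assert hashlib.sha256(server.get("txn-1")).hexdigest() == tag
print(client_encrypt(key, server.get("txn-1")))  # b'latte, 4.50, 2021-11-02'
```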

[10:08] - Speaker 1
Exactly. And I think on top of it, it's recognizing that, in addition to making sure there's no excessive information leakage in that transaction at the coffee shop, if the coffee shop is willing to offer you some exchange that you find valuable, you still have that choice. There may be some people that say, look, I'm willing to give my street address because I want to be part of the frequent coffee club or whatever, and in exchange I'm going to get a percentage off my coffee.

[10:37] - Speaker 1
But that's a choice you've made, and hopefully it's an informed choice. Or you may say, look, I don't want to do that. I'm willing to pay a bit more for a regular cup of coffee, and I want to make sure the transaction that just occurred was financially sound, but at the same time didn't leak any information or mine my data exhaust to my detriment.

[10:54] - Speaker 2
I think as a society we're really recognizing the importance and value of data, and this has taken a very long time. You look at the advent of the Internet, late 80s, early 90s, when we started to have mass culture around it, really at the AOL kind of bubble. It's taken us nearly 30, 40 years to recognize that the data we create as individuals has value, that that value is really amazing in aggregate, but that it can also be abused. So dealing with that is just a really critical thing for the 21st century.

[11:27] - Speaker 1
Yes, in some respects. I've looked at the last 150 years of human history and how technology has changed civil societies, how it's changed governments and how nations operate. And you're absolutely spot on. It generally takes between 15 and 25 years for any new technology to be wrestled with by a society, to figure out the good, the bad, and the ugly of the morals and ethics associated with it. Flash back to the early 2000s, or even the late 2000s and early 2010s: there were certain things being done by certain technology companies, which can remain anonymous, that at the time we were like, oh, that sounds great.

[12:06] - Speaker 1
That sounds good. And investors were flocking to them. Now, about twelve to 13 years later, we're like, wait, this is bad. We don't want this. It's not like the companies have changed what they were doing 13 years ago; they're still doing the same thing. It's just that society's consciousness has changed. And this is not the first time, it's worth noting. When submarines first came out, the British actually thought they were unethical, because they were underneath the water, and it wasn't considered appropriate to engage in armed conflict without presenting yourself.

[12:34] - Speaker 1
And of course, later we realized, well, you kind of need submarines, and if you don't have them, other people are going to use them against you. But it shows that as a society it does take us about an entire generation, 15 to 25 years, to wrestle with what the good of these technologies is and what the bads are.

[12:50] - Speaker 2
Well, the British also famously had difficulty with guerrilla warfare, because they thought people should present themselves on the battlefield. And sometimes in war you just have to deal with the situation as it is, not as you want it to be. Ray Dalio has this beautiful quote where he says, embrace reality and deal with it, something the Redcoats did not do too well. Since your work exists at the intersection of technology and so many other fields, like cybersecurity, international relations, medicine: what are some of the biggest technological challenges you encounter in your work?

[13:26] - Speaker 1
It's interesting, because I often ask what makes this coming decade different than any decade before. You can imagine if you had been born at the beginning of the 1900s: you saw the automobile, you saw the airplane, you saw a massive amount of technological change in a very short period of time. And I think what makes it different now is, one, the speed of global adoption is just massive. It's worth noting, for example, you mentioned the Internet, and I've had the honor and privilege to have worked with Vint Cerf for several years now through the People-Centered Internet coalition, and I consider him a friend. Look at what he and others did from the late 60s to about December of 2018.

[14:11] - Speaker 1
It took about that period of time, more than 45-plus years, to get half the planet connected to the Internet. It took 45 years to get about 3.5 billion people online. And now, if current projections are right, in the next five to seven years we're going to bring on another 3.5 billion.

[14:27] - Speaker 2
It's so fast now it's crazy.

[14:30] - Speaker 1
And so I think that is massive. And then here's the other thing that makes it even more complicated: it's not just that one revolution going on. We don't just have one technological revolution, like one industrial revolution. We have what's going on with data and algorithms and AI. We have what's going on with synthetic biology. We have commercial space. It's at least four or five parallel revolutions, each equivalent to the Internet, compacted into one decade. And that's why I think you're seeing societies, companies, and communities going: I'm not sure we can keep up with this.

[14:59] - Speaker 1
Yeah.

[14:59] - Speaker 2
If you look at social networks, you look at the pace at which Friendster grew, and then the pace at which MySpace grew, and then the pace at which Facebook grew. And then you start to look at the messaging applications and the picture applications, the rate at which Instagram grew and the rate at which WhatsApp grew. And then you look at TikTok, and TikTok just crushes all of them. It's this cycle where the vanguard, the new hot thing, grows ever faster. The speed at which we are interconnected in our society has never been higher.

[15:32] - Speaker 2
And there are lots of consequences to that. I think that when you think about privacy in the context of having these incredibly large megaphones that can reach the entire planet in seconds, it's a very complicated thing to wrestle with.

[15:45] - Speaker 1
Absolutely. And on top of it, the whole premise of the deliberative government that is the cornerstone of the United States and other representative forms of government in Europe and abroad was that we would take the time to deliberate on something, figure out the pros and cons, and maybe we didn't get it right initially, but eventually we figured it out. There's a quote attributed to Winston Churchill, though apparently he didn't really say it, but it's still a good quote even if he didn't, which is: you can always count on the United States to do the right thing, after trying every other possible permutation thereof.

[16:17] - Speaker 1
But the trouble is, we're now in a world in which we don't have that much time. And I worry that you're increasingly seeing people who want a world presented to them that is black and white, that is certain, where things are not shades of gray. And that's kind of autocratic, at least autocratic in thought, if not autocratic in reality. I worry that as the world becomes more turbulent in its changes and things accelerate, we will end up creating autocratic societies out of a desire for a sense of certainty, in a world that is changing at a speed where things aren't certain, and then find out we've given up things we shouldn't have given up.

[16:58] - Speaker 2
So I think one of the things that's really interesting in society right now, and you just helped host a symposium with the Department of Justice on this, is ransomware. I'm very curious about a world in which we are able to fight ransomware but also defend privacy. What does that look like? What do you imagine that world looks like?

[17:18] - Speaker 1
Yes. And so, what I applaud is, you're absolutely right: we've done three roundtables, and we have a fourth one coming up, together with a fellow and friend, retired FBI Supervisory Special Agent Trent Teyema. We're doing this with the Department of Justice, and I give kudos to them for asking the question: how can we get ahead of ransomware in a way that recognizes it is bigger than government, bigger than just the United States, truly global in nature? And I think it is recognizing a couple of things. One is the time between a company experiencing a ransomware attack and then choosing to notify the government that this has happened.

[18:00] - Speaker 1
And they don't always notify, at least in the United States, because they choose for various reasons not to. That gap can be upwards of three months, whereas the time for an attacker that has done a successful ransomware attack to pivot and attack another company is 30 minutes. So we're operating on a notification time frame of three months or more, or maybe never, against an attacker operating on a time frame of 30 minutes before they successfully move on to another target. And on top of that, it's global in nature.

[18:32] - Speaker 1
It's not contained within the United States. So it's really thinking about how we can create friction points for the attacker, how we can make it so they aren't able to pivot as quickly, how there can be uncertainty about whether they're going to get a payoff. Because they're running it like a business. They're not just doing this for hacktivism or things like that; they're doing it as a business. And so you can find ways. Look at when banks started being robbed once the car came around, because now you could do interstate crime, and you could do crime in a city you weren't born in. It's kind of like what's happening here.

[19:04] - Speaker 1
And the industry started coming up with solutions that included safes that couldn't be opened by a single bank teller. They came up with ways so it wasn't quick and easy for the teller to give the money to the robber, or for the robber to take it. They introduced delays or exploding dye packs and things like that, which made the robber less attracted to that specific type of crime, so they moved on to something else. And so I think if we're looking for something that is not an autocratic state and not a surveillance state, but is still in keeping with the premises of what we have in the United States in terms of representative government and free markets, it will ideally be the government looking to industry and saying: what can we do together?

[19:46] - Speaker 1
And what can you do to help us? Something that is the equivalent of making safes that are not easy to rob, or making it harder for the robber to make off with the money before the police can get there, something analogous to that. Because we've got a time frame problem of three months versus 30 minutes, and we've got to figure out ways to solve it. Otherwise it's going to continue to get more and more lucrative for the ransomware attackers.

[20:07] - Speaker 2
So I want to roll into this just a little bit more. Thinking about the way cryptocurrencies have changed the world: they really have made money that moves at the velocity of the Internet, and that's a very powerful thing. It's an amplifier for commerce. It's an amplifier for innovation. It's creating a tidal wave of new financial realities that have just never existed before. At the same time, there are new opportunities for bad actors, and I think that's something the industry takes very seriously right now.

[20:38] - Speaker 2
I think what the industry is saying is that the choke points that exist in cryptocurrency are the exchanges and merchants, the people who have de-anonymized or identified the users who are interacting with the system. Do you think that is enough to handle ransomware in the 21st century?

[20:55] - Speaker 1
I think it's a start. I think it's thinking about how, when an exchange is made, if reasonable suspicion or reasonable cause can be provided to the cryptocurrency platforms that something is happening, you can actually say, we have concern about this actor or about this transaction, and then you feel comfortable as a platform acting on it. In some respects, it's just like what we do with real-world currency: if certain banks are robbed and you know the denominations of the currency that was stolen and the serial numbers on the bills, you can say, if you see these serial numbers, let us know, because they may have been involved in a criminal action, or in human trafficking, or something like that.
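A minimal sketch of that serial-number analogy applied to transaction screening, in Python. The flagged identifiers and the reporting flow are entirely hypothetical; the point is only that a platform can match against a shared watchlist without learning anything about unflagged activity.

```python
# Hypothetical flagged identifiers shared with a platform under due process;
# in the banknote analogy, these are the stolen bills' serial numbers.
FLAGGED_IDS = {"serial-4821", "serial-9930"}

def screen_transaction(txn_id: str, identifiers: set[str]) -> bool:
    """Return True if any identifier tied to this transaction is flagged.
    The platform only learns about matches against the shared list."""
    hits = identifiers & FLAGGED_IDS
    if hits:
        print(f"{txn_id}: report to authorities, matched {sorted(hits)}")
        return True
    return False

screen_transaction("txn-77", {"serial-1111"})         # no match, no report
screen_transaction("txn-78", {"serial-4821", "x-9"})  # match -> report
```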

[21:36] - Speaker 1
So I think that's an interesting thing to explore. I think it's also worth exploring because more and more companies are looking to cyber insurance, which can actually be a natural market forcing mechanism: to maintain their insurance, companies have to demonstrate that they have adequate backups, so they don't have to pay off the attacker, that they have adequate defenses, and that they keep their security updated. In some respects, that's just a free market solution. It doesn't even require government intervention.

[22:05] - Speaker 1
And so I think no one silver bullet is going to solve it. The one thing that I will say, and I'm just saying this as one who's looked at the data: it is probably well taken that if you ever get hit with a ransomware attack, even a really painful one, you're probably better off never paying the ransom. Because once you pay that ransom, you get on a list that's exchanged on the dark web that says: they paid off once, they'll probably pay off again.

[22:31] - Speaker 1
Whereas if you don't pay, word goes around that the attackers spent all that effort and didn't get a payoff. And yes, maybe it was painful, and maybe you lost something that was extremely valuable. But with the money you spend to recover, you may actually be able to come back more aware and stronger than you would have been if it had never happened.

[22:48] - Speaker 2
I think it's really fascinating to think about the forcing function of the free market, of the insurance contract. What's interesting is that companies that aren't able to deal with the consequences of an attack, of losing access to the information on their servers, will have higher premiums. They might not even be able to get insurance. And it's a bit like, I hate to use this analogy, but living in a fire zone versus not living in a fire zone. If you choose to have your business in a place that burns down yearly, your insurance premiums are dramatically higher.

[23:21] - Speaker 1
100%. My wife and I had plans to build a house, and it was near parkland. The house itself was not in a flood zone, but the park was. And unfortunately, at the time, the county was using outdated FEMA maps that were accurate to 30-meter resolution, which is not great. So in the end, we said it wasn't worth it, and we sold that property. But you're absolutely right: it allows people to still have choice. And I think that's so key to the future, because the moment you remove choice from people, we've lost.

[23:51] - Speaker 1
We've become autocracies. But if instead you give choice architectures and say, look, you can pick to be there, but there are consequences, or you can pick this, and here are the options, I think giving choice architectures is something governments are increasingly going to need to do.

[24:05] - Speaker 2
I think ransomware is definitely one of the seminal issues in our world right now. I wanted to segue and ask about something else: a lot of your experience has involved problem solving in difficult environments. For instance, you served in Afghanistan in 2009, in the midst of some pretty intense conflict. Why do you think you gravitate toward working in these environments?

[24:28] - Speaker 1
Yeah, that's a great question. I often say I fell on my head at an early age, and it made all the difference. But I guess the longer answer is...

[24:35] - Speaker 2
Did you actually fall on your head?

[24:36] - Speaker 1
I did. Apparently, I was in a high chair and the doorbell rang. I was one, and my parents came back to discover I had tipped over the high chair and had to go to the emergency room to get some stitches on my chin. So yes, apparently it happened. I reached for something beyond my reach, and so maybe there's a metaphor there. Fair enough. Afghanistan. So it's interesting, because back in 2000, I joined a little-known program called the Bioterrorism Preparedness and Response Program.

[25:06] - Speaker 1
And I did that because I knew computer science; I was good with computers. But I had also done biology, and they approached me and said, you have a security clearance as well. Would you like to be part of a 30-person team that's there for bio events? And this was 2000. And I said, sure, I'll do it for a few years. And every six months Congress said we didn't need to exist, that we were a Cold War relic. I mean, we were only 30 people, but even then our budget was always in danger.

[25:32] - Speaker 1
And then, scheduled weeks in advance, I was to give a briefing to the CIA and the FBI on what we would do technology-wise if a bio event happened. And that briefing just happened to be scheduled for September 11, 2001, at 9:00 in the morning. And so, of course, at 8:46 the world changes. We piled computers into the cars. We set up an underground bunker; that was the state of information sharing at the time. Fly people to New York and DC. Don't sleep for three weeks.

[25:55] - Speaker 1
Stand down from high alert on October 1. I fly up to brief DC on October 3, fly back to Atlanta on October 4. And sure enough, I got back just in time for the first case of anthrax to show up in Florida. And I said, you have got to be kidding. It's crazy times. At the time, there was only one case in Florida, and there were no threat letters. But we were a 30-person team in a 15,000-person organization, which is just one agency in a much larger 1.3 million-person government.

[26:21] - Speaker 1
And so we said: if you later find that it's coming from the postal system, you'll need to start doing prophylaxis. But again, we were only 30 people. And so when threat letters actually started showing up on Capitol Hill and everything like that, of course there was a surge, and we were there, but we now had to pull in additional people. And you saw what happened: when things hit the fan, the head of the CDC and the head of the FBI called in their friends, and that's good.

[26:49] - Speaker 1
But they weren't used to working together. They weren't used to the patterns. And so I came to the conclusion that in turbulent environments, people try to operate top-down, when in fact they really should empower the edge and work bottom-up. And I had this premise confirmed when I saw it again: we had West Nile in 2002; we had severe acute respiratory syndrome and monkeypox in 2003; we had other events like that. And I came to the conclusion that, one, we've got to figure out ways of empowering the edge when things hit the fan in turbulent environments, whether they're natural or human-caused.

[27:25] - Speaker 1
But then the other thing is, one of the things we also did when times were not crazy was what's called syndromic surveillance. Prior to the President speaking at some big public event or whatever, we would plug in five days in advance to the sales of over-the-counter drugs. And it's not like we were tracking individual transactions. We were just trying to chart: what's the request rate for gastrointestinal medications, what's the request rate for respiratory medications? We also plugged into 911 call centers, again not to look at individual calls or do tracing.

[27:54] - Speaker 1
So privacy was still preserved. But what's the normal rate of 911 calls throughout the day? Same thing for school absenteeism: what's the regular pattern of school attendance? And then, when the big event happened, if all of a sudden you saw a spike, that would tell you something was going on, whether it was a rush of 911 calls, or people suddenly not attending school, or something going on with over-the-counter drug sales.
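What David describes is aggregate baseline-and-spike detection. Here is a minimal sketch in Python, assuming a simple z-score over recent daily totals; the threshold and the counts are illustrative.

```python
from statistics import mean, stdev

def spike_alert(history: list[int], today: int, threshold: float = 3.0) -> bool:
    """Flag today's aggregate count (e.g. OTC medication sales, 911 calls,
    school absences) if it sits far above the historical baseline.
    Only totals are examined, never individual records."""
    baseline, spread = mean(history), stdev(history)
    if spread == 0:
        return today > baseline
    return (today - baseline) / spread > threshold

# Illustrative daily counts of gastrointestinal medication sales.
normal_week = [104, 98, 110, 95, 102, 99, 107]
print(spike_alert(normal_week, 108))  # False: within normal variation
print(spike_alert(normal_week, 240))  # True: something is going on
```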

[28:23] - Speaker 1
And sure enough, there were some events, including a football game where we saw a dramatic spike in the sale of gastrointestinal medication. That was not a bioterrorism event; it was probably just what people ate at the sporting event causing some interesting issues later. But it was a way to do it with privacy. And so I went to get a PhD at a business school, focusing on how to improve organizational response to disruption, then did postdocs at MIT and Harvard, and then the opportunity came up to go to Afghanistan to think differently. And the promise was that anything I wrote, they could either bury it if they didn't like it, or claim they wrote it if they did.

[28:48] - Speaker 1
And while I was on the ground there, I operated, as I usually do, with a listening and learning model for the first 30 days. I came back after 30 days and said, I'm not getting a good reason why we're still here. I talked to people, and they said, we're here to provide legitimacy to the Afghan government. I'm like, that's an outcome, that's not a reason. And even if that were the case, why aren't we in Somalia, or why aren't we in Yemen? And I came to the conclusion that we needed to find a way to exit.

[29:14] - Speaker 1
This was 2009. And so I proposed that. But at the same time, you don't just exit. There were two things we could have done. One, special operations could have gone to the 13 different tribes, because the reality is Afghanistan has 13 different tribes, and offered them aid in exchange for not abetting a tribe that means harm to our tribe, the United States. And they would have understood that, because that's the Pashtun code: once you make a promise, you're beholden to that promise forever, and breaking it turns them eternally against you.

[29:44] - Speaker 1
So they would have understood that. Or, if we didn't want to do that, invite India and/or possibly China to play a peacekeeping role, so it's not just us. China does share a border with Afghanistan, and India is fairly close by and actually much closer culturally to Afghanistan and the Pashtun.

[30:00] - Speaker 2
One of the questions I have around Afghanistan, and this has come up in many of our episodes: what I notice about what's happening in Afghanistan, and this is something that's just really crazy, is that part of what the Taliban promised the rebels who allied with them is that they would topple Kabul, seize the banks, and use that money to pay out the people who helped them in the conflict. But unfortunately for them, all that money was overseas. And so one thing I've often wondered is: how different would Afghanistan be right now if the government had been a crypto bank and could prove its reserves, to show that they were not seizable?

[30:40] - Speaker 1
Well, let me go even further than that. Even before the fall of Afghanistan and the return of the Taliban, one of the things we did in 2009 and 2010 was start paying those who were assisting us, with either language translation or other activities, directly through digital payments. They actually came back to us and said, we appreciate the raise, or, why are you paying me more? And we're like, we haven't given a raise. And that's when you realize just how much graft was present: even people we thought we could trust, when we gave them actual hard currency, were turning around and handing part of that hard currency to someone else.

[31:20] - Speaker 1
There was always skimming occurring. And so just by going to digital currency to start with, you can make sure it's going to the actual person who's doing the work and getting remunerated for it. But you're absolutely right as well. I think we are very fortunate that they thought they were going to encounter large amounts of hard currency in Afghanistan, and sure enough, that was not the case.

[31:43] - Speaker 2
But what if the Afghan government could have shown definitively that the money was not in Afghanistan, and that there was no way the Taliban attacking Kabul would result in them getting access to the currency? Because they're like, look, it's on a blockchain. This is gone. You can't touch this.

[32:01] - Speaker 1
Yeah, I think it would have been an interesting deterrent. Maybe it would have slowed them down a little bit. I will say the other thing that was interesting in 2009, that I actually was successful with: I got together with some interesting Army people and we did some analysis. At the time, before I arrived, we were burning the poppies. And the trouble was, in some respects the Taliban wanted us to burn the poppies. Why? Well, it made them more pricey, because we made them scarce.

[32:26] - Speaker 1
And so I actually said: crash the market, let them overgrow, and focus on the precursor chemicals they had been using. Because, remember, literacy in Afghanistan is 20%. What would happen is the Taliban would go to different mullahs they had influence over and get them to interpret the Quran differently. They would come out and say it's against the Quran to grow poppies, because they're mind-altering substances. But they would say that only after the Taliban had bought up all the poppies, converted them to morphine base, which is stable for 15 years, and buried it underground.

[32:54] - Speaker 1
And so they created artificial scarcity after buying it all up. And then, when they started to run out, they came back to the mullahs, and the mullahs interpreted for the Afghan people and said, okay, it's okay to grow poppies, because you're not really hurting us, you're hurting the infidels overseas, which is fine. And then they started to grow them again. And they were going to have an oversupply problem, until we showed up. And guess what we started doing? We started burning them, which is exactly what they wanted, because it made the poppies more profitable.

[33:20] - Speaker 1
And it also allowed them to take bribes: you either pay us a bribe, or we're going to turn you over to the US forces, who will then burn your crops. Now, that said, this points to another thing, which is that you can see interesting things in economic data. And this is why it's so important to make sure people have choice about their economic data. For example, in South America, this was done, let's say, a decade ago, so I don't know what the current figures are, but this was done about a decade ago.

[33:46] - Speaker 1
You could look at the price of heroin, which is obviously not produced in South America, so it carries somewhat of a premium. But it was pretty low in a certain set of countries, say Venezuela, versus everywhere else. And you wonder, why is it low in Venezuela? Then, similarly, there are places in the Middle East where you can look at the street price of, say, cocaine. By the way, in the Middle East, if you're found with drugs, that's an immediate death penalty, so cocaine is pretty pricey in the Middle East.

[34:16] - Speaker 1
But for some reason, cocaine is cheaper in certain parts of the Middle East, in places where you wouldn't expect it to be. And so one has to wonder: are there shipments of heroin going in one direction, to South America, and cocaine going in the other direction, to the Middle East? And you can observe that in the economic data.
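David's point, that supply routes show up as anomalies in street prices, can be sketched as a simple expected-premium check. Every number and label below is invented for illustration; real analysis would be far more careful.

```python
# Hypothetical street prices (per unit) for a drug far from where it is
# produced; far-from-source prices should carry a premium over the
# at-source base price.
BASE_PRICE = 60.0
EXPECTED_PREMIUM = 2.0  # assumed multiplier far from the source

street_prices = {
    "Country A": 135.0,
    "Country B": 128.0,
    "Country C": 70.0,  # suspiciously close to the at-source price
}

expected = BASE_PRICE * EXPECTED_PREMIUM
for country, price in street_prices.items():
    if price < 0.75 * expected:  # well below the expected premium
        print(f"{country}: {price:.0f} vs expected ~{expected:.0f}, "
              "possible nearby supply route worth investigating")
```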

[34:31] - Speaker 2
That's fascinating. Let me shift gears for a second. When did you first realize that privacy was important to you?

[34:41] - Speaker 1
So you mentioned how I started working for the government when I was 15. There's another part of that story, from when I was 17. My father had moved up to DC at the time, so I was now in a different school than when I was 15. I got called down to the principal's office, and there were four people in suits. And they said, we would like to offer you a job. It would involve small satellites that were part of an interesting program, which I later learned, if folks remember, was the Ballistic Missile Defense Organization, the quote-unquote Star Wars program.

[35:12] - Speaker 1
It would involve satellites. It would involve classification; you won't be able to tell your parents everything you do. Are you interested? And of course I said yes. I don't think I even asked how much they were going to pay me. Along the way, what we did was use some of these...

[35:26] - Speaker 2
Just real quick: why do you think they offered this to you?

[35:30] - Speaker 1
Well, I was fortunate enough to be really good with computers, and to come up in the mid-90s, when there was still government investment in science fairs.

[35:40] - Speaker 2
How did you demonstrate you were really good with computers?

[35:43] - Speaker 1
Yeah, through science fairs. Because my father was a minister, a Methodist minister, I wanted to figure out who he was really working for. And so I built computer simulations of natural events. I did a computer model in 1992 of where things might be in the next 50 years if you follow current trends with greenhouse gases. Later I did one on ozone layer deterioration, which fortunately we did turn around, so that's at least a small success story. And then oil spills. And that really got me on the map.

[36:13] - Speaker 1
I got invited to the International Science Fair in 1994 with a science fair project on oil spills in the Gulf of Mexico. That later got me to South America for the South American Science Fair as well, which was great; I made a lot of friends there, too. The government uses science fairs as a way to recruit talent, and they did that during the Cold War, too. I wish we still had governments with funds to do that, because I was in Virginia Beach.

[36:44] - Speaker 1
And you hear similar stories from other people who were in the Midwest or places like that. We found talent and nurtured it at a young age; we exposed people to things they would not otherwise have had access to. And so that's why the government approached me to build computer simulations. I got to be part of a team that monitored crop growth from space to try and predict famines in advance, to see if we could get humanitarian aid to those regions ahead of time, because we could see they weren't growing enough crops.

[37:13] - Speaker 1
And this was before geospatial information systems. And then the project I got to lead, about a year-and-a-half-long endeavor, was a prototype demonstration in 1995 of picking up forest fires from space: scanning the wind conditions, the topography, and the foliage, and trying to guess where the forest fire might go, to warn people on the ground if it was going to be an issue and to make sure forest firefighters' lives would be saved. Now, the trouble was, I couldn't fully declassify the project at the time.

[37:40] - Speaker 1
So you never heard about my Westinghouse Science Fair project. But that led to me realizing that this technology was going to become democratized in time. And like all democratizations of technology, eventually it would be in the hands of companies, and eventually in the hands of individuals. We don't want a world in which everything we do is observable from space, is quantified from space. You can use hyperspectral imaging to see heat signatures inside buildings, so you can know how many people are there. You can use, essentially, radar from space to do change detection.

[38:15] - Speaker 1
And sure enough, fast forward from 1995 to where we now are in 2021: there are proposed launches of constellations of commercial satellites, and I sometimes have to pause over whether I can talk about this openly, because I feel like it shouldn't be something we can talk about openly. They're going to be launching, this decade, satellites accurate to 25-centimeter resolution in hyperspectral, again able to see the acidity of soil and heat signatures in buildings, and space radar accurate to 1-millimeter resolution that can in some cases do a persistent stare at certain locations. Which means you could actually watch the sway of a building or the sway of a bridge and say: it's swaying too much.

[38:53] - Speaker 1
It's outside of its stress tolerances. You need to make sure people don't go on that bridge until you can reinforce it.
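As a toy version of that bridge example: persistent-stare radar gives you a displacement time series, and the check is just peak-to-peak sway against a tolerance. The tolerance and the readings below are invented for illustration; real structural limits come from engineers.

```python
# Invented tolerance for a hypothetical bridge; displacement readings
# would come from the radar's persistent stare.
SWAY_TOLERANCE_MM = 40.0

def check_sway(readings_mm: list[float]) -> bool:
    """Return True (and warn) if peak-to-peak sway exceeds tolerance."""
    sway = max(readings_mm) - min(readings_mm)
    if sway > SWAY_TOLERANCE_MM:
        print(f"Sway {sway:.1f} mm exceeds {SWAY_TOLERANCE_MM} mm: "
              "keep people off the bridge until it can be reinforced.")
        return True
    return False

check_sway([-3.1, 6.4, -2.8, 5.9])      # normal oscillation
check_sway([-31.5, 28.2, -31.0, 27.8])  # True: outside tolerance
```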

[38:58] - Speaker 2
That is wild. On the subject of climate change and the ozone layer, I often think about Thomas Midgley, the person who invented both leaded gasoline and chlorofluorocarbons, the chemicals that punched a dramatic hole in the ozone layer. He may be the single organism to have done the most harm to the planet, which is a hell of an honor, or dishonor, as one might say.

[39:26] - Speaker 1
And I think that's the nature of, well, I don't know what the discipline is, because it's not philosophy by itself. Unfortunately, philosophy has 3,000 years of debate, and I'm not sure we came up with anything other than: do unto others as you would have them do unto you. And even then, not everyone agrees with that. Science by itself doesn't ask about the normative ethics of where we want to go and how this is going to impact society, and I'm not sure it should, because that's not the right level of analysis. But at the same time, given the sheer amount of change, I tell people the good news is technology is being democratized.

[40:01] - Speaker 1
The bad news is technology is being democratized at an unprecedented speed, and the choices we make in the next three to five years will have ripple effects for the next 30 to 50 years, on whether we have more freedom of choice, or whether we're all going to be quantified and monetized without our permission in a more dystopian future. And on top of that, you add the fact that we have a representative form of government that is designed to be deliberative, and in that sense reactive, which is great, but it's not designed to deal with these shocks.

[40:32] - Speaker 1
So how do we do demonstrations in the commercial sector that show a better way forward, and then bring enough people along with us to say: this is a better way? We don't have to be a surveillance state. We can actually have a future in which people are uplifted and we all have digital dignity over our identities.

[40:49] - Speaker 2
So something that's making headlines this week as we tape: new reports are coming out that outline the financial and economic risks of climate change. The organization you work for, the Atlantic Council, has put out its own report on this subject. Are you involved in this work of assessing the economic risk of climate change? And if so, what can you tell us about it?

[41:09] - Speaker 1
I know the colleagues at the Atlantic Council doing it; I can't say I'm part of their work. But I can say, interestingly enough, that I got a phone call when I was doing my PhD in 2007, followed by an email, from someone by the name of Carol Dumaine. She's now a senior fellow at the Atlantic Council, because we stayed in touch. At the time, she was publicly at the Department of Energy, but she'll tell you she was a 30-year CIA veteran. And it shows you that even in 2007, 2008, there were people at the CIA and the Department of Defense who were seeing what was coming, who knew it was going to create ripple effects and have impacts on energy markets.

[41:51] - Speaker 1
It was going to have impacts on populations, mass migrations; it was going to have impacts on national security. And so she was pulling together a coalition of international folks who wanted to make a difference in this space, and she asked if I wanted to help, partly because my PhD was looking at this idea of knowledge ecosystems, more decentralized approaches to collaboration as opposed to top-down. And we've stayed in touch ever since. But I find it interesting because, despite the reputation people ascribe to certain parts of defense or intelligence, I think people would be surprised if they went behind the curtain.

[42:26] - Speaker 1
You'd discover there are a lot of people who are aware of these issues, who care about these issues, and who want to make sure we get ahead of them.

[42:33] - Speaker 2
Who do you most look up to in the field of privacy and technology?

[42:38] - Speaker 1
So, in the field of technology and privacy, well, and again I'm biased because I've gotten to know him, I think Vint Cerf has a very distinguished reputation, not just for helping with TCP/IP and what that meant, but for recognizing that this was something that needed to be set free to the public. He shares with me this idea that different norms have come along at different times, as technology has shaped us. What I'm tracking now, and I don't subscribe to any one person, is an idea that builds on the privacy-by-design movement that came out of Canada.

[43:14] - Speaker 1
So maybe I should give credit to that as well. There are parts of Europe now that talk about personal integrity or holistic integrity; it's not an English word, so I'm probably mistranslating the concept, but the idea is that we should have a right to digital dignity, to digital wholeness, that includes our digital identities but also things in terms of our physical person, our biometrics. And I'm also tracking, interestingly enough, an effort afoot by a coalition of Indigenous peoples who share the idea that privacy is not just an individual right, but one of the community as well.

[43:54] - Speaker 1
The idea is that when you make your decisions, you're not just making them for yourself; you're making them for your community. And you can actually find online a set of data values, whether from the Māori, whether from folks in New Zealand, Australia, or Canada, or even from tribes here in the United States. They've pulled together a very interesting set of data values that I think those of us who subscribe to more of a Western school of thought should take a serious look at, because sometimes we can be so focused on the individual side that we forget something.

[44:25] - Speaker 1
It's also not just about us. It's about our communities, our families and future generations well after us.

[44:31] - Speaker 2
So the title of our show is Privacy is the New Celebrity, and we like to ask each of our guests: do you agree? Do you disagree? What do you think of this concept? And just to contextualize it a little further, it's a quote from Aaron Sittig, one of the earliest engineers at Facebook.

[44:51] - Speaker 1
So I recognize what celebrity represents: it means privacy is now having its zeitgeist moment with the public, which in some respects is good. But oftentimes things that get their zeitgeist moment with the public don't get sufficient traction to produce action. And that's where I want to make sure privacy also exists at some other level, at which we are doing demonstration projects to show a way forward, doing things that don't just virtue signal but actually show a way forward that is beyond the world that we know.

[45:24] - Speaker 1
And it's like Einstein's quote: we cannot solve our problems with the same thinking that created them. We have to find new thinking. And having been one who's worked in very change-averse environments within government, and interacted with Congress, and seen what's happened over the last six to seven years, I feel like we're in a world that's increasingly about virtue signaling, about misinformation and disinformation. Truthiness is whoever shouts the loudest, or whoever claims it's the truth, or whoever has the best sound bite, as opposed to the longer, more nuanced answer.

[46:02] - Speaker 1
And at the same time, I sympathize with that, because people are busy. People don't have time to dive deep into the issues. And Congress, for the most part, wants things that are very concrete; abstract concepts are difficult. They want concrete things they can act on for their electorate. So it could very well be that privacy is the new celebrity, but I hope it's something more than that too, because we want to make actionable change. And I think part of what attracts me to the work being done by MobileCoin and the MobileCoin Foundation is that it is about demonstration projects that show a better way forward, something that can be community-centric, something that is uplifting of people. Because right now, I don't see many demonstration projects of a better way.

[46:50] - Speaker 1
You hear about what Tim Berners-Lee is doing with Solid. Okay, maybe there's something there, but we need to make it usable by everybody. We need to show better ways forward, or we are going to be surprised when, ten years from now, privacy has quietly disappeared from us.

[47:03] - Speaker 2
Yeah, I think that's a really beautiful point. At MobileCoin, one of the things we think very deeply about is how we spark that cultural revolution: how do we create a tone where people can opt in to an alternative future that allows them to retain privacy in a way they've never had before? So my final question for you this evening is: what do you think are the biggest threats to privacy that we need to be looking out for right now?

[47:32] - Speaker 1
Well, we talked about ransomware. I think ransomware: we don't want to throw out privacy in the name of somehow getting more secure against ransomware. I think there are increasing tensions at the geopolitical level between more authoritarian, autocratic regimes and the alternatives. And I'm not saying the US is perfect, but we are not authoritarian. And so I watch that, because as those tensions get worse and worse, does that become an excuse for giving up privacy in the name of security, which is not a good thing? Or worse:

[48:07] - Speaker 1
Do those more autocratic regimes demonstrate themselves as being more successful at navigating the turbulence and addressing polarization in their societies? The United States is not setting a glowing example right now of all the goodness an open society gives you, when we're pulling ourselves apart, among a lot of other things. Does that mean autocracies become more attractive? And then lastly, and I know we have fellow board member Renée DiResta on this: misinformation and disinformation are not new concepts for the human species. We've always had them.

[48:41] - Speaker 1
I mean, look at what John Adams and Thomas Jefferson did to each other. We had yellow journalism in the 1890s. But again, on the trend that the good news is technology has been democratized and the bad news is technology has been democratized: we're seeing more and more misinformation and disinformation that anybody can produce, and it's hard to defend against, especially when it gets thrown your way and has already gone halfway around the world. But what I worry about is that some people will claim we have a solution to it that requires knowing who everybody is and exactly what they said.

[49:13] - Speaker 1
And in the process of doing so, we've tossed out privacy. Because I think you should still have the right to sometimes make an anonymous comment, whether you're concerned about your freedom or your privacy, or you just want to comment anonymously. And then the public can make a choice and say: if that was said anonymously, versus said with attribution, we'll give it a different weight.

[49:33] - Speaker 2
I think that a society with perfect surveillance is sterile, one where innovation is impossible, and a society with no surveillance is ungovernable. The conversation is really about where we meet in the middle of those two extremes.

[49:48] - Speaker 1
300%. Well, and that's where I really appreciate your views, Joshua, because you always provide a more nuanced view, and I think that is so needed nowadays. Anyone who ever says never X or always Y, I'm like, there's probably something more there. And what I really appreciate about what you just shared is that it's about where we want to be in the middle. How do we want to balance the social contract of the individual with that of the society?

[50:15] - Speaker 2
On that note, I will say that a Jedi once said: only a Sith deals in absolutes. David Bray, this has been a fantastic episode. Thank you so much for your time. We've been speaking with Dr. David Bray, a distinguished fellow at the Atlantic Council and board member of the MobileCoin Foundation. David, thanks for coming on the show.

[50:34] - Speaker 1
Thank you, Joshua. And may the Force be with you.

[50:43] - Speaker 2
Thanks for listening. Don't forget to subscribe to Privacy is the New Celebrity. And if you like what you hear, please leave us a review. For more amazing content, check out mobilecoinradio.com, where you can find our radio show every Wednesday at 6:00 p.m. Pacific Time. I'm Joshua Goldbard. Our producer is Sam Anderson, and our theme music was composed by David West Paul.

[51:05] - Speaker 2 
And don't forget: privacy is a choice we deserve.