Privacy is the New Celebrity

Enter the Metaverse with Kavya Pearlman: A Deep Dive into Immersive Reality - Ep 17

February 10, 2022 MobileCoin

In episode 17, Henry Holtzman interviews Kavya Pearlman, award-winning cyber security professional and founder of the XR Safety Initiative, a non-profit organization that promotes privacy, security, and ethics in virtual and augmented reality. Henry asks Kavya to break down why there is so much buzz around the metaverse right now, and why we should be concerned for our privacy. Kavya explains how body-worn sensors like VR headsets vastly increase the amount of data companies can collect from users, and the difficulty of developing regulations for technology that doesn't quite exist yet.  Kavya shares her experience serving as the head of security for the oldest existing virtual world, “Second Life” by Linden Lab, and what this taught her about the need for adequate user protections. To learn more about the risks of virtual reality, Kavya suggests this recent report by her organization. To get involved with XRSI, visit

Speaker 2 (00:18)
Hello, and welcome back to Privacy is the New Celebrity. I'm Henry Holtzman. And today on the show, we're speaking with Kavya Pearlman, award-winning cybersecurity professional and founder of the XR Safety Initiative (XRSI), a nonprofit organization that promotes privacy, security, and ethics in immersive technologies. And the reason we invited Kavya onto the show today is because we're doing a deep dive into the Metaverse. As you've probably heard, the word Metaverse is being tossed around a lot lately. Facebook is rebranding their company as Meta, and other technology companies like Microsoft are signaling big investments into virtual reality. So we want to know: what is the Metaverse? Why is there so much hype about it right now? And most importantly, what does this mean for our privacy? So let's get started. Kavya, thank you so much for joining us on Privacy is the New Celebrity.

Speaker 1 (01:10)
Thank you, Henry, for having me. And yeah, what a way to start off the program. It's a very timely topic because, as you said, there is a lot of hype going around. So I'm happy to dive in.

Speaker 2 (01:25)
Can you tell us about your organization XRSI? What is your main focus?

Speaker 1 (01:29)
Yeah. So XR Safety Initiative, XRSI. Our mission is to help build safe and inclusive XR ecosystems. And now we kind of have a word for it; people call it the metaverse. We mainly try to address this from an XR perspective, which is an umbrella term for virtual, augmented, and mixed reality. But when you talk about these immersive technologies collectively, there are these other intersections, such as artificial intelligence, 5G, improved graphics, and all these various other technologies that start to converge. And that's where we focus ourselves: to try to understand what's really going on and help stakeholders, our stakeholders being individuals who need to be aware of what's coming, all the way to organizations such as Meta, Microsoft, the big tech corporations that are potentially building some of these building blocks of the metaverse, as well as regulators that really need to understand the potential consequences. So it's a very large mission, but we break it down into different programs. And overall, what we really want to see as our vision is a safe and inclusive next iteration of the Internet where people are not left behind.

Speaker 2 (03:06)
So in the soup of acronyms that you just went over, I noticed one wasn't there that I've been seeing tied to the Metaverse quite strongly lately, and that's Web3. And I just thought of that because you talked about a new, safe Internet. Do you find Web3 to be part of that future story?

Speaker 1 (03:26)
Oh, absolutely. In fact, it is one of the major components. And yes, it totally skips my mind sometimes because there's just so much that has to go into it: content delivery networks, protocols. Going from Web2, the protocols are all two-dimensional protocols, the traditional TCP/IP and the other networking acronyms. Basically, Web3 is more about protocols based on decentralized ledger technology, which we are still developing. And the distinction here would be that it's really about distribution of power. In the Web2 era, the power stayed with these centralized ecosystems, and with Web3, with developing protocols and more of a grassroots push to adopt those protocols, we may be able to get to a place where the power is not consolidated in these centralized environments. It's more distributed because it's more decentralized, in a way. So Web3 is absolutely a major, you could say, parallel to what we consider the immersive technologies. All these immersive technologies are part of this Web3 aspect as well.

Speaker 2 (04:51)
Yeah. That decentralization is very exciting. Generally, it's being achieved through blockchain technologies, as you said, ledgers. And one of the attributes of a blockchain is that it's immutable. It's a permanent record. So in your focus on privacy, do you worry about people not understanding that personal information may end up in this permanent ledger?

Speaker 1 (05:14)
Oh, absolutely. In fact, yes, I send out tweets often. Just last week, a couple of other enthusiasts and influencers agreed with me, and we were collectively talking about it on Twitter Spaces: just because it's on the blockchain doesn't mean that it's inherently secure. And depending on the nature of the data, it could really put people at risk. It could put organizations at risk, because sensitive data is still sensitive data and should not be visible to all. So that's definitely concerning when it comes to privacy. And it's more of a concern because, as you said, the nature of the blockchain, these DLTs, decentralized ledger technologies, is that it's immutable.
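The two properties described above, that on-chain data is readable by every participant and that it can't be scrubbed after the fact, can be illustrated with a toy append-only ledger. This is a minimal sketch, not any real blockchain's API; the `Block` and `Ledger` names are hypothetical.

```python
import hashlib
import json

class Block:
    def __init__(self, index, data, prev_hash):
        self.index = index
        self.data = data  # payload is stored in the clear, world-readable
        self.prev_hash = prev_hash
        self.hash = hashlib.sha256(
            json.dumps([index, data, prev_hash]).encode()
        ).hexdigest()

class Ledger:
    def __init__(self):
        self.chain = [Block(0, "genesis", "0" * 64)]

    def append(self, data):
        prev = self.chain[-1]
        self.chain.append(Block(prev.index + 1, data, prev.hash))

    def verify(self):
        # Each block must reference the hash of the block before it.
        return all(
            self.chain[i].prev_hash == self.chain[i - 1].hash
            for i in range(1, len(self.chain))
        )

ledger = Ledger()
ledger.append("user location: 37.77,-122.42")  # sensitive data, now public
ledger.append("another transaction")
assert ledger.verify()

# Trying to scrub the sensitive record later invalidates the chain,
# because the following block still references the original hash.
ledger.chain[1] = Block(1, "REDACTED", ledger.chain[1].prev_hash)
assert not ledger.verify()
```

Real chains add consensus and signatures on top, but the core double bind is visible even here: publishing personal data means everyone can read it, and immutability means no one can take it back.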

Speaker 2 (06:06)
Yeah. And as you said, when we bring the Metaverse back into it, which includes augmented reality and cross reality, you could end up with, say, your location being logged onto the blockchain, because all of those sensors and all of those devices are probably recording your location.

Speaker 1 (06:24)
Yeah. And that is something that I constantly think about. It's not even a perfect analogy, but I think it's the best way to describe what we are facing here when it comes to these immersive technologies. And as you said, if I may sum it up, we are capturing everything around us and a whole lot more. So we're moving into this era of constant reality capture, and into the Metaverse. What we're really presented with here is a term called the Collingridge dilemma, which creates this double-bind problem: we can't really predict the issues of a technology until it's extensively developed and widely used, but by the time we do understand them, change would be quite difficult. So we can't undo it, because all these privacy and safety issues, with the nature of blockchain being immutable itself, are already fully baked in. So this is where we sit, in the middle, trying to collect as much evidence as we can, but also anticipating what these risky scenarios are. Because especially when it comes to artificial intelligence, for example, if there are biases baked into algorithms, we can't necessarily go back and undo the impact they may have already caused.

Speaker 1 (07:53)
So this is the Collingridge dilemma. We see it whenever these emerging technologies are being developed, and it could be said about several different things. I'm sure scientists felt this dilemma when they were developing nuclear bombs; nuclear technology itself could be used for good and for bad. And that's the dilemma: we don't know until we develop it fully, but here we are. These dilemmas occur, and that's the role of organizations like mine, XRSI, to try to help with that.

Speaker 2 (08:33)
Let me touch on something that I know I get confused about, and so I'm assuming a lot of people get confused about, when we talk about the Metaverse. Right. And now we have Facebook renaming itself Meta for its focus on the Metaverse. So is the Metaverse now Facebook? Is there actually a Metaverse? Are there many metaverses? What do we mean by the Metaverse in that regard?

Speaker 1 (09:04)
So I would like to offer an analogy from the Internet era, if the Metaverse is the next iteration of the Internet. But first of all, we need to understand that, just like mobile computing, when it came into existence and was fully materialized, didn't replace the desktop, this next iteration is not going to entirely replace what we currently have going on. However, it will offer a new way of doing commerce, connecting with people, creating new things. At XRSI, we even did research where we segmented it into different blocks, and we did this to address it from a privacy and safety perspective. So let me take a step back. Imagine being in the 60s or 70s at, let's say, a university, and saying, I'm building the Internet. You wouldn't really be building the Internet. The Internet emerged by connecting several different nodes and networks. So a person who is building a router back then, or, let's say, a VR device in today's era, is it claiming that they are building the Metaverse? It's kind of funny to me, because how could you possibly say that they are the ones that will own or build the Metaverse? It will come into full fruition the way the Internet did in the 90s and 2000s.

Speaker 1 (10:50)
That's how this Metaverse, the next iteration of the Internet, will emerge, and there is only one that will emerge. It's not quite here yet. We're not quite living in the Metaverse. But at some point this interoperability, just like we had between nodes, servers, networks, and all kinds of routers for the Internet, will happen, where we connect all these spatial systems, where there is an element of immersion, and where there's this flow of users and data and commerce and all these cultures across 3D, three-dimensional, spaces. And one thing to be noted: this next iteration of the Internet, called the Metaverse, will not be owned by one actor. It would not necessarily be closed walled gardens. It may have some elements of closed walled gardens, but largely, and I really hope to God that this is the case, it should be more decentralized, built with nonprofits doing governance and advocacy and putting standards and guidelines together, just like XRSI is doing. It should have lots of cloud computing and blockchain representatives, just like we have various blockchain entities currently working on commerce-specific elements like cryptocurrency, for example. And then there will be an element of virtual economy, which we already have.

Speaker 1 (12:27)
But it's not all interconnected, it's not all interoperable. So we have this little foundation, small nodes, if I may call them that: virtual worlds. Oftentimes people are like, oh, I wonder if GTA is a Metaverse. Well, it is a virtual world. Roblox is a virtual world. And perhaps once we are able to connect all these virtual worlds and seamlessly interoperate between them, like Facebook's Horizon virtual reality platform connecting with, let's say, Microsoft's AltspaceVR platform, two people could move seamlessly between them. Or maybe I'm experiencing something as a holographic display and now I want to go play in another environment, but I can take that hologram and port it over to a virtual world. It seems very complex, and this is why it's not here yet. When we have this 3D, spatialized, fully immersive environment that we can move in and out of seamlessly, that's when we can say, now we are in the Metaverse. Before that, we are still just putting a few building blocks in place. And the reason people say, oh, we're building the Metaverse, or, this is the Metaverse, is because it gives you clicks, it gives you SEO optimization.

Speaker 1 (13:48)
And it's kind of unfair to common people who may have even bought the idea that apparently Facebook is the one building the Metaverse, which is so false. Facebook cannot be and is not the only entity. There are so many actors that will put in resources and efforts and build protocols, and then finally, someday, maybe five years from now, ten years from now, we'll have the next iteration of the Internet.

Speaker 2 (14:22)
I like your analogy to the creation of the Internet, and I want to dig into it a tiny bit deeper and then we can move on, because I got to live through that. I remember quite clearly when we got the term "the Internet." And the point where we got that term was actually the point where there was a concerted effort to glue together all kinds of disparate networks. It wasn't that some people said, I have an internet, and other people said, I have an internet, and they didn't actually talk to each other. It was specifically the rallying cry that we should knit all these networks together, right? And then it just grew from there. So the Metaverse is happening in a different way. It feels like the term is out there, and we have a 900-pound gorilla that's trying to grab that term. Is there actually an effort happening yet to try to knit this all together? Is that something that is happening with intentionality, by, say, a very credible standardization organization, governmental or otherwise? Or maybe we don't need those things today. But is there actually a core set of people that we should be following because they are actually trying to make the Metaverse?

Speaker 1 (15:39)
That is a really important question, and the answer is that the efforts are taking place, but they are not really centralized. In a sense it's very decentralized; everybody is doing something. For example, XRSI is focusing on privacy, safety, and ethics across all these various components of the Metaverse: the commerce pieces, or when you connect with people in social VR, social virtual reality, where harassment and all kinds of online moderation issues arise. So we're trying to address that. There are other groups. For example, one group that I want to call out is the Open Metaverse Interoperability Group, and they are trying to create interoperability within the Metaverse, within the next iteration. These are independent efforts at the moment, and some of these groups do talk to each other. There are also other groups spinning up to specifically focus on Web3 aspects from a security perspective. In fact, one particular group that just came together is a consortium around several different cryptocurrency platforms, like Solana and Polygon; Coinbase is a part of it. There are like twelve different members, including OpenSea, which is the largest NFT platform.

Speaker 1 (17:14)
So they formed a consortium to exchange security information with each other, to make sure that these platforms are secure and fewer scammy activities are going on. So there are these independent efforts, sort of connected, but not under a centralized umbrella or all talking to each other. And the question that you raise is so important, because I think that's exactly what needs to happen: we definitely need to bring everyone who is working in the space together to form this common understanding, this consensus. At our end, at XRSI, we are trying to put together a taxonomy for the Metaverse, and then we're going to open it up. And that's the other challenge, Henry: emerging technologies are very difficult to create consensus around because they are still emerging. That's their nature; they're evolving. I remember in 2019, when we started our mission, XR, the umbrella term for extended reality, for all these immersive technologies, was contested. People were like, oh, what is AR? What is VR? They were fighting this war of words, just like we are with this whole Metaverse thing. And the first thing we did was just put out a taxonomy document, with the idea that if you contest a definition, you can ping back to us and we can modify it.

Speaker 1 (18:50)
And then over time, those definitions have been adopted by Meta and several other big tech companies, and now they're even being adopted in law. One of the very first laws that includes virtual reality and augmented reality, which XRSI advised on, is called the CAMRA Act. It's for children. Another acronym; because I don't want to present wrong information, I'm going to give you the exact meaning: the CAMRA Act is the Children and Media Research Advancement Act. Within this act in the United States, the terms virtual reality and augmented reality will be used for the first time. So this is the notion. As soon as this whole term Metaverse was introduced, I was speaking with 60 Minutes and a bunch of other journalists back in April 2021, and then this whole Meta hype happened in October 2021. We had been anticipating this kind of thing, and we had been researching what could go wrong and what the challenges are, but we look at it from a more global perspective. So now our job, really, is the same as before: back when we were talking about XR-specific extended reality, we put out a taxonomy, and now we're going to do the same thing.

Speaker 1 (20:16)
We're going to put out a taxonomy, create consensus around Web3- and Metaverse-specific terms and terminology, and gather community feedback. So I would say this is what I see XRSI's role becoming, but we always center it around safety and global issues like equity, diversity, and accessibility. Those are the things. And since we're a nonprofit, for us it's more about advocacy and creating consensus. So hopefully this might help. However, you're right, we need more and more entities to band together to create this common grassroots effort, almost like a revolution, to keep the Metaverse open, interoperable, and decentralized, so there is not just one big giant entity controlling these powers.

Speaker 2 (21:15)
Turning back specifically to what we see coming with the Metaverse in terms of new technology. You've been quoted in a few articles speaking about the risk of increased data collection with virtual reality. Can you say a little bit more about what those risks are?

Speaker 1 (21:34)
Yeah. I mean, it all starts with the very same notion of the Collingridge dilemma and the era of constant reality capture. Why is this worse now? Because the consumer devices that enable some of these experiences will potentially have, and the enterprise devices already have, eye-tracking technology; and gaze, pose, and gait, which is a form of how you move, could actually be used to infer a lot of information about a human being. And this is where things get tricky: up until now, we've only dealt with personally identifiable information, which has a very quantified definition, like your Social Security number, patterns that are fully recognizable and quantified. Then the General Data Protection Regulation came about, and we started to think about how somebody's political opinion is also a kind of personal data, or we talked about personal data in the sense of gender or sexual orientation. But what we have not dealt with, and this is the bigger and truly incredible risk coming at us, is that these technologies are intersecting with many new advancements, such as brain-computer interfaces. So we're not just talking about being able to track your eyes, but your brain patterns.

Speaker 1 (23:13)
And when that intersection is fully materialized and ubiquitous, there's the ability to potentially gauge somebody's mood, or infer what kind of health problems they may have based on the way they move, or make other inferences by combining a few other data points. At XRSI, we coined a new term for this, and there was a report that recently came out on Safer Internet Day, February 8, which talks about this very term in a deeply granular way, so we can use it to then build laws, et cetera. The term is "biometrically inferred data." It's very important to pay attention to this, because what's at risk now is our autonomy. We've got to be allowed to have free will. And if somebody has access to your mental thoughts, your brain patterns, then your mental privacy can be at risk. We are beginning to understand algorithmic biases, but this is way beyond that. These are inferences of a whole other magnitude. Currently we talk about certain demographics, like African Americans, or other demographics, like Muslims, being at risk. But this kind of data could put a person at risk for just having thoughts of a certain nature.

Speaker 1 (24:52)
And that's why this is a whole other landscape of risk. I remember, I think it was the Wall Street Journal, I was telling them that previously the attack surface, and I go back to my cybersecurity experience here, was nodes, networks, and servers. But now the attack surface has expanded into our living room and our brain. And that's the challenge.

Speaker 2 (25:21)
We're kind of there already, aren't we, even at a mass scale? Because we can infer mood from your facial expressions with computer software today. We can do vocal stress analysis. We can detect your pulse in video as well. We can do gait analysis with your phone. I'm wearing a fitness tracker, a watch, and a phone that all have accelerometers in them and can do all the measurements you're talking about. They're constantly recording my position, both through GPS and what cell towers I'm near. We're there. We're in the future that you're describing; it's just going to get worse.
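To make the point above concrete: very little code is needed to turn a raw accelerometer trace into a gait inference. The sketch below is hypothetical, not any real phone's or headset's pipeline, but a simple threshold-crossing counter already recovers walking cadence, a feature that can be correlated with identity, mood, or health.

```python
import math

def step_cadence(samples, rate_hz, threshold=1.2):
    """Count upward crossings of `threshold` (in g) and return steps per second."""
    steps = 0
    above = False
    for s in samples:
        if s > threshold and not above:
            steps += 1  # rising edge: treat as one footfall
            above = True
        elif s <= threshold:
            above = False
    return steps / (len(samples) / rate_hz)

# Synthetic 50 Hz trace of someone walking at 2 steps per second for 2 seconds.
rate_hz = 50
trace = [1.0 + 0.5 * math.sin(2 * math.pi * 2.0 * t / rate_hz)
         for t in range(2 * rate_hz)]
print(step_cadence(trace, rate_hz))  # prints 2.0
```

A real pipeline would filter noise and fuse multiple sensors, but the asymmetry stands: the user sees a step count, while the platform holds a motion signature.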

Speaker 1 (26:04)
I agree with you. Yes, we are inching towards that sort of dystopic nightmare, if I may say. All the things that you described are the writing on the wall, because all this will converge into a massive supercomputer having access to this information. So then I'd go on to say it's not even about predicting what a person is going to do. It's more about what they will let you do. Because based on those predictions, you may not even get the interview that you're going for. You may not even get the promotion, because HR has access to your cognitive load, which reveals when you become sleepy, or that you may be pregnant. All these kinds of data exist currently, but they're not in the hands of all kinds of creators and developers quite yet. And when all this data is in the hands of an HR manager, in the hands of your bosses, for example, that's when we've created a really big challenge to address: who makes these decisions? And then we introduce these amazing artificial intelligence algorithms that will inevitably make a lot of these determinations, to scale all of this.

Speaker 2 (27:25)
So it's all very scary. And I want to add to that this notion of vertical integration. Okay, so today we're already on edge about Facebook, about the data gathering they do, about how responsible they are, about how their algorithms work. This is a story we've been talking about now for months, if not years. Well, enter Meta. Enter Oculus. Now Facebook is vertically integrated right down to the sensors that are on your body. Whereas before, they were going through Apple, they were going through Google, who want to keep their customers safe. And so there's tension between Facebook and the hardware and OS companies. What happens when it's one vertically integrated stack, with one company that to date is all about using data to make you what's being sold?

Speaker 1 (28:25)
Oh, I think everybody needs to watch Ready Player One. I am surprised that people have not drawn a direct straight line between what Meta could become and IOI, the corporation in that movie. IOI in Ready Player One is really just all about consolidating all of the power over all the human beings, at least as portrayed there, or at least most of their consumers. And I think that's what might happen if we are not careful about this consolidation of power and about leaving our data in the hands of a few.

Speaker 2 (29:10)
So the New York Times has reported that Meta has pledged billions of dollars to the development of the Metaverse and its products for the Metaverse. And the Wall Street Journal reported that they've pledged $50 million to give to outside researchers focusing on privacy and security for the Metaverse. I'm curious what you think about that. First of all, is that enough money? And second of all, is it the right balance? Is that a high enough percentage to be going towards this very important issue?

Speaker 1 (29:47)
I think it all comes down to intent. And as for that number that you're quoting, it is to be noted that XRSI, as is well known, is also advocating for responsible innovation, privacy, and safety, and we have not seen a cent of that allocated $50 million come to us. In fact, as we speak, I have made specific asks to Meta to allocate some of those funds to XRSI, and we are yet to see what happens with that ask. But $50 million is not enough, and I'll give an example. We did an XR Safety Week, and they funded that week by giving some money, the exact amount of $50,000, as a primary sponsor. But when it came to putting up speakers or sending some concrete message, doing campaigns, there have been only two specific tweets targeted towards the audience, and those were not even during the safety week campaign. They came after, and kind of in a funny way: oh, just disable your safety boundaries, otherwise you'll blah, blah, blah. And so that's the problem: the amount of funds being put into marketing is millions and billions, whereas we have not even introduced the idea of safeguarding the boundaries and clearing out the surroundings in a careful and thoughtful way to the consumers.

Speaker 1 (31:36)
I mean, we've seen several children on the platform using that Oculus Quest headset, and all we have is this tiny little print on a disposable box that says 13-plus. So as encouraging as $50 million sounds, it's not a comparable percentage to what they are spending in order to get people almost addicted to these environments. There is so much campaigning going on about how you can do wellness and health and fitness, but what about the risks of the data that the wellness and health apps are collecting on you? And that's a disparity that I would absolutely address in the upcoming frameworks. Just like the General Data Protection Regulation from Europe said: hey, you need to do adequate security and privacy, and we'll define it, otherwise you'll be at risk of losing 4% of your revenue. That's what it should compare to. It shouldn't be just some arbitrary number, $50 million. What is the percentage of your revenue that you're dedicating to safety awareness, privacy awareness, to creating safeguards, or to allocating to organizations like XRSI?

Speaker 1 (33:00)
They kind of handpicked a bunch of organizations, and that's not good. Most of these organizations are just doing their bidding; it's kind of lip service and PR. I would go as far as saying that, because we have not seen the real outcome, and the outcome matters globally: decisions are being made that impact a person like my sister in India or somebody sitting in Saudi Arabia. The impact is different on a woman in Saudi Arabia when her data is lost. And they are mandating a login ID; it's still a mandate. Even though they said they wouldn't do it, to this date the login ID is a mandatory requirement. So they make these kinds of trade-offs in terms of privacy and safety and say, quote, we're going to put X million dollars toward the companies of our choosing. That, to me, is just not good enough. That, to me, is trying to shrug off responsibility by putting a number to it and saying, oh, well, look at this, we did $50 million.

Speaker 1 (34:16)
But let's compare that to your quarterly revenue. Let's not talk about this quarter, because that was really bad, but generally, the quarterly revenue that you are collecting from the entire globe. And let's also put a person in India, and people in other countries, into scope when these decisions are made, when these trade-offs are made. Those are some of the things that I'm highly concerned about. And very similarly, this applies to any organization, not just Meta. Take Apple: I know we hammer on Facebook all the time, but where the hell is Apple when it comes to disclosing what kind of decisions are being made behind the scenes for privacy and safety with respect to their augmented reality ecosystem? They're even a worse kind of violator, because they're not even involving organizations like XRSI. They're just doing everything closed, like a black box. And one day, just like they did with the child safety and client-side scanning stuff, they will just come out and say, now we are going to scan all of your messages, while calling people like myself and other organizations, quote, "the screeching voices of the minority."

Speaker 1 (35:32)
That's not acceptable. So there is more than Meta; there are other really massive stakeholders that are doing all this decision-making in private, not involving us. And that's not fair. Or they're putting a tiny little number on it and justifying their efforts by saying, oh, $50 million and we're good, you should trust us. That's not good enough. There needs to be way more transparency. In fact, I directly asked Boz, their CTO, what all the third parties are that have access to the data that Oculus is collecting. Well, I'm yet to receive that answer from them. So there are these conversations that are just unanswered. And the accountability is absent; $50 million is nothing compared to what all is at risk.

Speaker 2 (36:25)
So earlier, when you were talking about how the Metaverse is the future of the Internet, I heard genuine excitement in your voice. What do you think of the Metaverse as it is today? Have you put on the goggles? Do you spend time there?

Speaker 1 (36:43)
Oh, that's funny, because I don't know if I mentioned my background, but I was the head of security for the oldest existing virtual world. And if we consider the Metaverse to be these interconnected virtual worlds seamlessly allowing us to go back and forth, I was a part of such a virtual world, and my job was practically to do security, safety, and compliance. One part was, of course, the oldest virtual currency, not crypto, but a virtual currency, the Linden Dollar, making sure that was all compliant. But they were also creating a virtual reality platform called Sansar, which has since been sold to another organization. During my time, this was exactly what I did. It sounds kind of crazy, but I spent about eight and a half hours a day on average in that platform, and also in various other platforms, to try to understand what could go wrong, how we needed to proactively do policy and safety, and, if we were passing data via API to a third party, what kind of privacy considerations applied. And this was 2018, so the General Data Protection Regulation, GDPR, was getting adopted during my time.

Speaker 1 (38:07)
So that was another part of my job, to make sure that we could comply with these laws. So, yes, I have very much been inside virtual worlds, and I have a flavor of what the Metaverse could look like, what it could mean to exist inside these kinds of ecosystems and be at risk of a company just overnight changing their terms and conditions. And now a bunch of your money, which would be virtual currency, is at risk, or you're at a disadvantage, because they changed the game or they changed the rules. One thing that really struck me, and you can only experience it when you're inside these ecosystems, is that the level of intimacy with the interface is so real, and these environments become more and more hyperreal. When you interact with another human being, the same types of harassment and bullying have a very visceral effect. So back when I was spending time in this virtual world, I had this experience where somebody yelled at me and called me names, and they were just going off about F this and F that. And, oh my gosh, it was as if I was in my childhood and being abused all over again.

Speaker 1 (39:34)
So I had a bit of a tough childhood, and I had experienced some abuse, so it really triggered all these negative emotions, to the extent that I, someone who spent about eight and a half hours in there on an average day, didn't go back in for three or four days. And then right away I got together with my security team and we talked about what the potential protections were. Just yesterday, Facebook's Horizon VR announced that they have a safety bubble; this was a very proactive thing that we put in place for Sansar. And there are several other considerations, like kick, block, mute, all these safety features. So it's almost like being in the real world, but you're using these glasses to experience it, because the brain feels it that way. You feel like you're being harassed, you feel like you're being sexually harassed if somebody makes these advances, even if they're just sort of attempting to touch you in a very bad way. So those aspects have been very real for me. And that's why it's also very personal to me, because I understand it not just as an outsider who's trying to secure some data or systems; I've lived in these ecosystems, which, when in full effect, will have a very detrimental impact on our society.

Speaker 2 (41:15)
So we have covered a lot of ground. This has been amazing. I'd like to try a little experiment: a speed round. I've got three last questions, and we'll do them speed-round style, first thought that comes to your mind. You game?

Speaker 1 (41:34)
Let's give it a shot. I generally mess up on all these things, but if no answer is the wrong answer, then I'm game for it.

Speaker 2 (41:43)
That's fine. I just want to try to summarize. So, number one: what is the number one risk, of what we've talked about or maybe haven't gotten around to, when the Metaverse is fully realized?

Speaker 1 (41:56)
The number one risk is really the loss of our agency, our autonomy, the loss of our free will. We may think we're making decisions, but we're not. That is the number one thing that must be addressed, yet it's currently barely even being talked about.

Speaker 2 (42:16)
Okay, what is the number one solution?

Speaker 1 (42:24)
That's our mission. You know, actually, honestly, there is no silver bullet. However, there are a couple of things on a technical level, because it's all based on code and technology. We need to do a lot of deep thinking within our cultures, because we have not resolved the issues of our in-real-life, IRL, society and how we are biased toward different segments of society. But technically, I could say there are a couple of technologies, such as homomorphic encryption. It's basically a property of an encryption scheme where you can do computation on encrypted data without decrypting it. So it not only provides privacy, it also provides security, because if the data is lost to a bad actor, its sensitivity is still protected. So that's one; then there is differential privacy. But this is why XRSI is working toward a privacy and safety framework, because technology alone is not going to be able to tackle all the challenges we're up against. So, unfortunately, I wish I had a one-word answer. It's going to be a combination of a whole lot of things, which means exactly that: a lot of hard work.
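[Editor's note] To make the homomorphic-encryption idea concrete, here is a minimal sketch of an additively homomorphic scheme, a toy Paillier cryptosystem, where multiplying two ciphertexts yields an encryption of the sum of the plaintexts. The tiny primes are purely for readability and provide no real security; actual deployments use vetted libraries and key sizes of thousands of bits.

```python
import math
import random

# Toy Paillier cryptosystem: additively homomorphic, so computation (here,
# addition) can happen on encrypted data without ever decrypting the inputs.
# Demo-sized primes for readability only -- NOT secure.

p, q = 17, 19
n = p * q                       # public modulus
n2 = n * n
g = n + 1                       # standard generator choice g = n + 1
lam = math.lcm(p - 1, q - 1)    # private key: Carmichael's lambda(n)
mu = pow(lam, -1, n)            # private key: inverse of lambda mod n

def encrypt(m):
    """Encrypt m (0 <= m < n) with fresh randomness r coprime to n."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Recover the plaintext using the private key (lam, mu)."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts.
a, b = 42, 99
c_sum = (encrypt(a) * encrypt(b)) % n2
assert decrypt(c_sum) == a + b
```

This same property is what lets an untrusted server aggregate encrypted values, say, tallies or sensor readings, and return a result that only the key holder can open.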

Speaker 2 (43:54)
So we do the hard work and we're successful. What's the thing you're most excited about that this technology brings us in the future?

Speaker 1 (44:03)
Medical. And this is why: medical, health care, mental health, human health, just being able to be the best versions of ourselves, realizing our full potential, where these mental health challenges can be tackled seamlessly, easily. Currently, our health care infrastructure is so burdened, and COVID has been fueling that burden, too. So that's what excites me: it's going to allow us to heal ourselves. And if we can trust these technologies, and that's what I want to enable, the trust in them, we even have a Medical XR Council that pays attention to these issues, we can really be better versions of our societies, better versions of humans, better versions of ourselves.

Speaker 2 (44:51)
Great. Yeah, I like that future vision. So I want to switch gears for a second and ask you a couple of questions that we like to ask all of our guests. The first one is: what is a privacy technology that does not exist right now but should?

Speaker 1 (45:07)
I think I kind of hinted at that: homomorphic encryption and differential privacy. What exists right now is still in sort of research mode, still being used on a very isolated, use-case basis. These technologies need to become ubiquitous. Instead of breaking encryption, we need to utilize homomorphic encryption schemes that allow us to compute on encrypted forms of data. A new kind of distributed computing would emerge as soon as we adopt these things. So it's a flavor of these two that I would love to see, and maybe more privacy-preserving technologies beyond them, but those two are top of mind, and I would really love to see them become ubiquitous.
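[Editor's note] Differential privacy, the second technique Kavya names, can likewise be sketched in a few lines. The classic Laplace mechanism adds noise calibrated to a query's sensitivity and a privacy budget epsilon, so that any one individual's presence or absence is statistically masked. The dataset and epsilon below are hypothetical, and this is an illustration, not a production DP library.

```python
import random

# Laplace mechanism for differential privacy: add noise scaled to
# (sensitivity / epsilon) so one person's data barely shifts the output.

def laplace_noise(scale):
    # A Laplace(0, scale) sample as the difference of two exponential samples.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def private_count(records, predicate, epsilon):
    """Differentially private count of records matching `predicate`."""
    true_count = sum(1 for r in records if predicate(r))
    sensitivity = 1  # adding/removing one record changes a count by at most 1
    return true_count + laplace_noise(sensitivity / epsilon)

# Hypothetical data: release a noisy count of users aged 40 or over.
ages = [23, 35, 47, 52, 61, 19, 44]
noisy = private_count(ages, lambda a: a >= 40, epsilon=0.5)
```

Smaller epsilon means more noise and stronger privacy; the analyst sees an answer that is useful in aggregate while no single record is exposed.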

Speaker 2 (46:05)
When did you first realize that Privacy was important to you?

Speaker 1 (46:09)
Oh, it's a very funny story. Too funny, because it's not even funny, but the context and how it happened are really hilarious. So in 2016, I was hired at Facebook as a third-party security adviser. And as I was walking through on my first day, getting my introduction and all, I learned I must create a Facebook account. I didn't have one back then, so I was kind of hesitant. I went up to my manager at the time, my lead, and I was like, hey, do I really need this account? And she said, yeah, you know what? Just put up a few pictures. I'm walking through the hallway and it says, be social, be connected, move fast, break things. And it was like, okay, I don't want to be shamed by people because I don't have this. And in order to work there, you really had to have a Facebook account. So I created the account. The one thing I forgot and didn't pay attention to: between the time I had left India in 2007 and the time I joined Facebook, I had changed my religion.

Speaker 1 (47:24)
I had gone from being Hindu to being a hijabi woman who covers her hair. And I was like, okay, I'll just put up a few pictures and whatnot; I was a very private person back then. Those pictures were, under Facebook's default privacy policy, public, and I didn't pay attention to that. I didn't know. And within a matter of hours, Facebook's AI started suggesting to my relatives back in India who I am, this person who had just signed up. And apparently a couple of my relatives, it's really sad, but they were just anti-Muslim. I started to get death threats, where one of my uncles, my mum's own brother, who had been involved in these anti-Muslim, Hindu-Muslim riots and such, was sending things like, how dare you, and if you dare come to India we will kill you. And at that time somebody actually called me frantically: hey, do you know your hijabi pictures are online? I'm like, what? How do you know? Because, look, it suggested that I should be your friend. And I'm like, oh my God, what am I supposed to do?

Speaker 1 (48:40)
You need to modify your default privacy settings. And that's when I realized for the first time how bloody important privacy can be, just based on your lifestyle choices. And I can only imagine how difficult it may be for somebody with, let's say, a sexual orientation or gender identity that doesn't fit into their culture. It could be really hurtful. It could literally kill you, and it has killed people. That day it became so personal and intimate to me, and I realized that this is a very important issue and it must be addressed. And the conclusion of that story: the uncle who sent me death threats has since died. Thankfully, I'm very careful now when I go to India. For a couple of years I didn't go to India, but now I'm able to, and my parents are more accepting. So thankfully it's worked out. But for some people, that may never be an option. They may just be facing literal death because of who they became.

Speaker 2 (49:49)
Thank you so much for sharing. Who do you look up to in the field of Privacy?

Speaker 1 (49:55)
This may not be everyone's favorite, but I look up to Kamala Harris, for many reasons. If I may add to it: right before Facebook, I was doing security work for a corporate immigration law firm. That's when Safe Harbor had been rendered useless and Privacy Shield came into effect. And that's when I got introduced to Kamala Harris's work; she was basically the reason why privacy laws in California became the toughest in the nation. She started in 2012, and what I'm talking about was 2015. In 2012, as California's Attorney General, she created the Privacy Enforcement and Protection Unit, enforcing federal and state privacy laws regulating the collection and retention of all this data. I look up to her specifically because she has had this ability to see it through: how you can regulate these things, what all needs to be regulated. She has been a fierce advocate for all kinds of privacy challenges. She was this boots-on-the-ground person, and now she sits in the White House. And that is the kind of fierce advocacy and understanding of technology that we need today to confront all this, not just at a technical level but at a regulatory level, taking it all the way to the White House and a global stage.

Speaker 1 (51:32)
And I think that I really look up to her work for being able to do all of that.

Speaker 2 (51:39)
That's very impressive. Yeah. All right. So the title of our show is Privacy is the New Celebrity, and we like to ask each guest: do you agree? Do you disagree? What do you think of that concept?

Speaker 1 (51:53)
I have to say I agree that privacy is the new celebrity, but there is a caveat. Privacy is the new celebrity; however, the celebrity needs new clothes and possibly a new avatar. We need to really rethink what privacy means in the era of constant reality capture. And the way to really do it, at least I would offer my two cents here, is borrowed from Noble Ackerson, who is the president of our diversity and inclusion coalition, Cyber XR. He says it's context, control, choice, and respect. So if we can give this celebrity some new clothes, a new avatar, and this new concept, then I'll agree with that.

Speaker 2 (52:47)
Yeah, I like that. Well, our guest has been Kavya Pearlman, immersive technology expert and founder of the XR Safety Initiative. Kavya, thank you so much for being with us today.

Speaker 1 (53:00)
Thank you, Henry. And thank you for allowing me to dive into these concepts, which are often just very complicated. So great questions and great insights, even for myself thinking about these things. So thank you so much.

Speaker 2 (53:19)
Thanks for listening. We'll see you next time. As always, please subscribe to Privacy is the New Celebrity on all podcast platforms, and visit our website to listen to our radio show and the full archive of our podcast episodes. I'm Henry Holtzman. Our producer is Sam Anderson, and our theme music was composed by David Westbang. And as we like to say at MobileCoin, privacy is a choice, and we deserve the right to choose.