The Monica Talks Cyber Show

Is Your Privacy at Risk?

Monica Verma Season 2 Episode 12

As technology, social media and smart devices have seeped into our private and business lives, do we have privacy as a human right any more?

In this episode, Monica Verma, CEO & CISO, talks with Debbie Reynolds, "The Data Diva", about the myths around privacy as a human right, privacy challenges related to social media, artificial intelligence and emerging technology, and how these are disrupting our private and business worlds.

Looking to become an influential and effective security leader? Don't know where to start or how to go about it? Follow Monica Verma (LinkedIn) and Monica Talks Cyber (YouTube) for more content on cybersecurity, technology, leadership and innovation, and 10x your career. Subscribe to The 10x Circle newsletter at https://www.monicatalkscyber.com.

Monica Verma  0:00  
Good morning, good afternoon, good evening, wherever you're tuning in from. Welcome to today's episode. 

Monica Verma  0:10  
Today, I have a fantastic topic, one that affects every single individual on planet Earth. We'll be talking about privacy, with none other than Debbie Reynolds, The Data Diva.

Debbie Reynolds  0:21  
Thank you so much. I'm so happy to be on the show and happy to collaborate with you again.

Monica Verma  0:27  
Very nice. Would you like to say a few words about yourself?

Debbie Reynolds  0:30  
Sure. As you said, I'm Debbie Reynolds; people call me The Data Diva. I'm the co-founder and Chief Data Privacy Officer of Debbie Reynolds Consulting. I work at the intersection of law, privacy and technology, and I help companies make privacy a business advantage, as opposed to something that they do grudgingly.

Monica Verma  0:54  
Before we hop into the nuances of it, let's first talk a bit about whether there are any differences, and if there are, what the differences are between data privacy, consumer privacy and privacy as a human right.

Debbie Reynolds  1:11  
Yeah, that's a good question. Privacy in and of itself is really about a person's rights, and how those rights are articulated depends on the region or country. There are actually three major schools of thought, but we'll talk about two in particular. One is privacy as a human right, which is what we see in the EU. It means that from cradle to grave you have a right that cannot be taken away, can't be sold, can't be traded. A lot of the laws in the EU over the decades have tried to codify that right, which has been established in your constitutions, into law. The most popular one we talk about is the GDPR, obviously, because it has such a big influence around the world. Then there are privacy laws like we have in the US, which are more consumer-based, meaning these rights kick in when you're consuming something: you buy something, or you sign up for an account, and then there are rights that protect that. The difference between a human right and a consumer right is that a human right covers more things than a consumer right. I'll give you an example. In the US, if you go to a grocery store and purchase something, they have your data, and because you're a consumer they have an obligation to protect it in certain ways. There are laws in different states, and they differ, about what obligation a company has to a person who's consuming something. Now let's say you go across the street to a church and sign up for things like newsletters. The church, for example in the state of California, doesn't have an obligation to handle your data in the same way. That's a really big difference, I think, between the EU and the US: a lot of the laws you're seeing passed here are based on consumerism, not on you being a human and wanting to protect your rights.

Monica Verma  3:35  
And that's one of the reasons I asked this question: to me, privacy is always a human right, and to Europeans in general, privacy is always a human right. In the conversations we had before, we talked about these nuances in the understanding of privacy as a human right in different countries and cultures, how it's perceived and how it's understood. It was really interesting to see this different perspective, how some organizations think of privacy mostly from a consumer perspective but not from a human rights perspective. So where do you see the biggest challenge with that? Obviously, one of the challenges is that the human right to privacy is not always protected this way. But what are some of the other challenges that you've seen?

Debbie Reynolds  4:27  
I think the other challenge is, as different privacy regimes are put in place in different countries, knowing, as you said, the nuances and the differences; it's not the same everywhere. In the EU, because privacy has been a fundamental human right in your constitutions since the 50s, the laws you're passing protect a privacy right that you already have. Whereas in the US, the laws we're making are trying to establish privacy rights and protect them at the same time. So that's very different.

Monica Verma  5:04  
Right. And one of the other things that is always a heated topic in this discussion is privacy versus security. I have worked with both security and privacy in my career, and I have to tell people: they're not the same thing; you really have to understand there's a difference. At the same time, they do overlap; there are definitely interacting and converging parts of both security and privacy. So could you say something about the biggest conflicts you've seen or experienced in the way people understand the difference? And also, where is the possibility, and the importance, of collaboration, of mutually complementing each other?

Debbie Reynolds  5:57  
Yeah, it's hard, because there's a lot of misunderstanding about cybersecurity, and I think about technology in general. If you do any job in technology, people think you do everything: hardware, software, pen testing, all this other type of stuff. And that's not true. So what I try to do is equate it to being a doctor. Doctors can have many different specialties. When someone says they're a doctor, are they a doctor of education or a doctor of medicine? If they're a doctor of medicine, are they a brain surgeon or a podiatrist? You just don't know. Cybersecurity is that way: people have different forms and functions and specialties within it. That's one thing people don't understand. As for privacy, the way I explain privacy and cybersecurity that makes the most sense is to describe it as a bank. Think about the security at a bank: they secure the outside, the inside, everything happening within, like the tellers logging in, all that type of stuff, everywhere. Cybersecurity is responsible for the security of everything that happens within an organization. Privacy is about what's in the vault and why it's in there: why are you protecting this data differently? Cyber isn't as granular; cyber's responsibility is to protect everything. Obviously, if something in the vault is more valuable than the other things in the organization, it gets a higher level of security, but cybersecurity will protect whatever is in the vault regardless of what's in there. Whereas in privacy, we're going to say: this is in here because of X, and this is why we need to treat it differently. So it's a more granular way to talk about particular things within the organization, not everything.

Monica Verma  8:13  
And somehow, still, privacy and cybersecurity need to collaborate, right? And they can mutually complement each other in certain areas. Where do you see the biggest potential for collaboration and complementing each other?

Debbie Reynolds  8:30  
I think the biggest way they can complement one another is that privacy should impact the whole organization. There should be a culture of cybersecurity and a culture of privacy, so everyone has a part to play. In a way, though, that can make it seem generic: if you don't give people a specific role, they don't know what they're supposed to do. If you say it's everybody's responsibility but you aren't specific about what that means, I think that's problematic. But the really important thing about cybersecurity and privacy is that they do work together; they have a symbiotic relationship. Being able to collaborate, and really breaking down the silos, is very important. Over the years, I've seen the technical people in one silo and the legal people in another. In order to have privacy and security work together, they have to break down those walls; they have to meet with each other and talk with each other, and make sure they understand one another. They're all professionals in their own right, in their own areas of expertise, but they have to really join together to create something that's of value to the business.

Monica Verma  9:56  
Right, absolutely. So the key really is both communication and the culture around it; that's what helps build better collaboration between the two areas. Absolutely. So let's talk a bit about what's happening. Obviously, Schrems I came, Schrems II came, and GDPR came before that. Even before GDPR, privacy as a human right had been there for decades; it's not something new. GDPR just added the formality and legal framework around consent: what do you need the data for, for what purpose is it being used, and so on. With all the changes that have happened, we've had Cambridge Analytica, and we've had cases where major companies value consumer data only for their own purposes, without really taking into account the human privacy aspect, the data privacy aspect, the questions of how much data they need and why they need it, before they gather and collect all that data for selling or for their business model, whatever that is. Where do you see there is still a big need for change going forward? Where are we still struggling the most, despite all this?

Debbie Reynolds  11:17  
Good question. Let's see. First of all, people need to understand that these laws may apply to them. I see some smaller companies say, well, that's for the big companies, that's for Google and Facebook, it doesn't apply to me, which is not true. The flip side is that it can be very overwhelming, all the different regulations and so on. So then, realize that not every single thing in a regulation is going to apply to you; don't get upset or feel overwhelmed. You just have to look at your business and the data that you're handling, and then pick out the things that apply to you. Then you don't feel like, 'Oh my God, I have these 99 extra things I have to do', which is not true; there are probably things in a regulation that don't apply to your company. I'll give you an example. I have a client in the UK, a membership organization with members in all these different countries around the world. Because this particular company is a not-for-profit organization, the CCPA, the California Consumer Privacy Act, does not apply to them, because it does not apply to not-for-profit organizations. They were really concerned: 'Oh my God, this law comes out, it's going to apply to me.' And no, it doesn't apply to you, so you don't even have to think about it. Another one that comes up: let's say your business doesn't do anything with children, you don't have any consumer under a certain age; then you don't have to worry about the laws that have to do with children. So part of it is just right-sizing it for your company, figuring out what applies to you, and then minimizing your data: really think about why you're capturing it in the first place, and if you can't think of a good reason, then you should not be capturing it.

Monica Verma  13:28  
Yeah, right! Coming back to the purpose of why you're collecting the data in the first place. Absolutely. So let's talk a bit about how technology has been changing. We've talked about the cases that have happened, and how a lot of companies still don't understand data privacy and consumer privacy to the extent that they should. Add to that the complexity of technology. We're seeing ethical risks, biases and discrimination; we're seeing what happens in the software, the systems and the code, all of which leads to additional privacy issues. What do you see as the biggest things we need to work on here? How do we make sure we can consume technology going forward without making it a big hindrance or setback for privacy?

Debbie Reynolds  14:29  
Yeah. Well, privacy by design is very important. It's a great concept to apply as you're developing tools, as opposed to: let's develop this cool tool, and then figure out what we're going to do about privacy. It doesn't really work that way, especially if your system is supposed to ingest or interact with people's data. People own their own data, in my opinion, and companies use that data; they're like a steward, a data steward. The data is on loan to you, and you have a responsibility to handle it in a certain way. Some of the biggest challenges I see have to do with AI, and things like facial recognition and the use of biometrics in technology applications, because the harm can be really devastating to an individual if we get it wrong. Think of using AI in policing, where you're arresting the wrong person, or using AI in evaluations in a way that discriminates against people, or even in schools. With distance learning as a result of COVID, there was an example where some of the software created for these education systems was grading students of color poorly because the camera could not recognize them. It made it seem like they weren't doing their classes, which was not true. That's an example of an algorithm not working the way it should, because it's supposed to work for everybody, not just for some people. I think there was a leaked manuscript or document about the EU coming up with standards for AI, which I think is amazing, and I want to see that kind of thing around the world. There are people who can look at code and tell you: this is a problem, let's change it here. Instead of that being a nice-to-have for companies, I think it should be mandatory, especially if you're dealing with situations that could cause potential harm to individuals.

Monica Verma  17:02
Well, technology is always used for good and for bad. So we need to minimize the bad impacts of AI, not only in the traditional cybersecurity sense, from cyber attackers, but also from our own mistakes and faults in the way systems are built and what they're used for, and how they carry these risks of bias and discrimination, and also just invasion of privacy in general, right?

Debbie Reynolds  17:27  
Oh, yeah, totally. Well, invasion of privacy, I guess, has two parts. One is being observed in a space where you don't want to be observed. The other is where people want to use a lot of smart tech, through which you're basically inviting surveillance: smart speakers, smart doorbells, smart thermostats; they're collecting so much data about you. I think people really need to understand what that tech is doing, to be able to make an informed decision about whether they want to use those things or not.

Monica Verma  18:03  
There's a role that consumers or users have to play here, and we will come to that. But there's also a big role that companies, businesses and governments in general have to play. This is where we also talk about how we cannot have, for example, so much security that it invades privacy to an extent that conflicts with human rights. So what can businesses do here? How should businesses move forward to prevent that, whether it's invasion through apps or mass surveillance happening through different technologies?

Debbie Reynolds  18:41  
Yeah, well, I'm thinking about two types of businesses: technology businesses that are creating these technologies, and businesses that are using them. For the creators of the technologies, it's important that you do things like privacy by design and give people choices over features. When people make these tools, they want them to be usable by anyone, so they'll put in as many features as they can, and not all of those features may be lawful or appropriate for your business to use. A company that has purchased a tool that uses AI needs to really look through it and see what it does. If it's doing something that is unlawful, or that you don't think is ethical, you hopefully have the ability to turn those features off and not use them. I've talked to people a lot about that: the thing does ten things, but maybe you only need six, so turn off the other four.

Monica Verma  19:51  
Right, right. And how about the mass surveillance bit? Because this is where the big question comes in as well: are we being mass-surveilled by governments? To what extent is it, or should it be, lawful? And what would be considered unlawful?

Debbie Reynolds  20:11  
Yeah, that's a tough question, especially because of COVID. Even before COVID, governments had the capability to do all types of surveillance, and if surveillance is deemed to be in the public interest, that's something they can do, and that can sort of short-circuit an individual's rights. As we're seeing with COVID, a lot of countries are creating things like health passports, health checks and tests, and because it's a public health crisis, they can take over more individual rights; they're saying this is something that impacts the whole country. So there's more data going into the public domain about individuals as a result of COVID. The way it's supposed to work is that the government can have these special powers in an emergency, and once the emergency is over, they're supposed to give those rights back. But I don't know how that's going to happen; a lot of times when governments take those rights, they never want to give them back. I've seen documentaries where people say, that van is surveilling you, or if you walk past this point you'll be recorded, and people have to make a choice. But a lot of times you don't have a choice. I live in Chicago, which is one of the most heavily surveilled cities you can imagine, and has been for over 20 years. For me, I do look up and see a camera or a listening device or whatever, and it doesn't interrupt what I'm doing. But think about the idea that everything you do is being recorded in some way. Even your phone is surveilling you: you put a phone in your pocket, and it knows you took ten steps to the left or ten steps to the right, it knows what street you're on, and all that type of stuff. In the past, before computers and cell phones, no one was collecting that data and no one cared. But now, because there's so much data being collected, people are wondering for what purpose, and what is going to happen with that data in the future.

Monica Verma  22:51  
Right. I have two follow-up questions, because what you just brought up is really interesting. One: have you seen other challenges that COVID-19 has brought with it in terms of invasion of privacy? And two: as you say, we don't have to look as far as governments mass-surveilling us; a lot of it is happening on our phones on a daily basis. So the second question would be: what can consumers or users really do? What would be your top recommendations? But let's take COVID-19 first; what other challenges have you seen?

Debbie Reynolds  23:29  
Yeah. So another challenge I've seen with COVID: when people think about medical information, they think their medical information is always protected. But medical information is typically protected in a patient-provider type of situation. If you volunteer that information, it's not protected; people may say that because you shared your information publicly, you may not have a right to privacy over it, because you volunteered it. There's a debate about that. But there's obviously more medical information in the public space now. We're seeing companies saying you can come back to the office only if you've been vaccinated, which makes it a public issue in some way, or you can't get on a plane unless you've been vaccinated. We saw that recently when something was happening on a Caribbean island, St. Vincent, I believe, and they were evacuating people; they said if you hadn't had a vaccine, you couldn't evacuate. That's kind of crazy, right? You basically have to show someone who's not a medical professional that you've been vaccinated to be able to get on a plane or go to different places. That's very different, and I don't think it's going to change anytime soon. As for what consumers can do: they obviously don't have a lot of control over what happens with COVID and how governments decide to give or deny people access. But be aware of all the technologies that you use, and read the privacy policies where you can; I think they're really helpful. You want to be an informed consumer. You want to be able to say: I don't want to use smart lightbulbs, because I don't like what they do. So you read the policy and say, 'Okay, I don't like that, so I'm using regular light bulbs, not smart ones.' For some people it's fine; they say, 'I like the advantages I get, like my vacuum cleaner that goes around the house so I don't have to vacuum, so I'm willing to give up some privacy because there's a benefit to me.' To me, it's a trade-off, and people have to decide what's best for them. But trying to educate yourself is important, and it's hard, because not everyone is like you, really smart about cyber stuff. One thing I started doing during COVID, because our family is spread all over and we do these Zoom calls to catch up, is bringing up cybersecurity topics: I tell people not to click on links from people they don't know, and things like that. So I think it's important to talk to your family members about this stuff as well.

Monica Verma  26:34  
So privacy awareness, very, very important. Absolutely. The discussion has been amazing; I think I could talk to you about this for another hour, it's so fascinating, and it's lovely to talk to you. But let's wrap up with one last question: how can we make vendors and businesses more accountable for privacy in the products or services that they provide?

Debbie Reynolds  27:02  
I think people should vote with their feet, or vote with their money. If you tell a company, 'Hey, I like your product, but I'm concerned about this', and they're not willing to change, you can move to a different product or do something differently. What's happening now is that companies are starting to see, especially because of Apple, that privacy can be a benefit to your company. When Apple started putting these privacy features in their phones, it really pushed the industry in that direction. Apple announced last summer the iOS 14 update, which has privacy features that let people opt in and give them more visibility. And as a result, I think Apple had the most profitable quarter they've ever had in the fourth quarter of 2020, and I think part of that is because people are saying: yes, help me protect myself, help me protect my privacy. It's having an impact on other technology companies now as well; they're seeing that privacy can be profitable, and that's what companies want.

Monica Verma  28:24  
So, privacy as a business differentiator. Lovely, that's fantastic. Thank you so much, Debbie, for today's episode. It was a lovely, lovely conversation.

Debbie Reynolds  28:33  
Yeah, this was great. Thank you. I love the stuff that you're working on; cyber is really important, and education is really important. People just don't understand cybersecurity at all, so the more you can do this, the better things will be.

Monica Verma  28:49  
I hope you had fun, and that the conversation was engaging and educational. I'll be back with more amazing episodes, fantastic guests and amazing conversations. Take care and stay safe.