Redefining CyberSecurity

The California Delete Act: Emerging Changes for Data Brokers and Its Impact on Data Privacy | A Conversation with Nia Luckey | Redefining CyberSecurity Podcast with Sean Martin

Episode Summary

In this episode of Redefining CyberSecurity Podcast, host Sean Martin and guest Nia Luckey discuss the California Delete Act (California Senate Bill 362), data privacy, and the responsibility of businesses in protecting sensitive information.

Episode Notes

Guest: Nia Luckey, Senior Cybersecurity Business Consultant at Infosys [@Infosys]

On LinkedIn |  https://www.linkedin.com/in/nia-f-713270127/

____________________________

Host: Sean Martin, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining CyberSecurity Podcast [@RedefiningCyber]

On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/sean-martin

____________________________

This Episode’s Sponsors

Imperva | https://itspm.ag/imperva277117988

Pentera | https://itspm.ag/penteri67a

___________________________

Episode Notes

In this episode of Redefining CyberSecurity Podcast, host Sean Martin and guest Nia Luckey discuss the California Delete Act (California Senate Bill 362) and its impact on data privacy and protection. They delve into the concept of data brokers and the sensitive information they gather, such as personal details, credit data, facial recognition, and driving behaviors.

Presenting a couple of examples, the conversation raises questions about responsibility for data protection in the realms of autonomous vehicles and platforms like Meta. They emphasize the need for businesses to understand the data they collect, educate themselves on data privacy regulations, and consider offering opt-out options for customers. Of course, providing the option to delete data is going to be a non-negotiable customer feature.

The discussion also touches on the challenges faced by smaller organizations in complying with the bill and provides advice on data inventory and protection. They stress the importance of knowing what data is being collected, where it is stored, and how to protect it to an appropriate standard. They highlight the need for businesses, regardless of size, to prioritize data protection and privacy. The ultimate aim is to empower individuals and businesses to have control over their data and protect privacy in an interconnected world.

The conversation takes a consumer-centric approach, discussing the implications for individuals and their rights to opt out of data collection. They explore the potential difficulties in deleting data from various platforms and emphasize the importance of making the process accessible and user-friendly.

Throughout the episode, Sean and Nia engage in a thoughtful and informative conversation, touching on topics such as data classification schemes, data handling practices, and the overall spirit of the California bill. They encourage businesses to proactively manage risk and ethics and take steps to protect data and privacy.

By listening to this episode, listeners can expect to gain a deeper understanding of the California Delete Act, its implications for data privacy, and the responsibilities businesses have in protecting sensitive information. Sean and Nia provide practical advice and insights to help individuals and organizations navigate the complex landscape of data protection and privacy regulations.

____________________________

Watch this and other videos on ITSPmagazine's YouTube Channel

Redefining CyberSecurity Podcast with Sean Martin, CISSP playlist:

📺 https://www.youtube.com/playlist?list=PLnYu0psdcllS9aVGdiakVss9u7xgYDKYq

ITSPmagazine YouTube Channel:

📺 https://www.youtube.com/@itspmagazine

Be sure to share and subscribe!

____________________________

Resources

An Analysis of California Senate Bill 362 - The California Delete Act: https://www.linkedin.com/pulse/analysis-california-senate-bill-362-delete-act-nia-f-luckey-lssbb

International Association of Privacy Professionals (IAPP). California Legislature Passes Delete Act for PI Aggregated by Data Brokers: https://iapp.org/news/a/california-legislature-passes-delete-act-for-pi-aggregated-by-data-brokers/#:~:text=The%20California%20State%20Legislature%20passed,information%20collected%20by%20data%20brokers

California Legislature. (2023). Senate Bill 362.: https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202320240SB362

California's 'Delete Act' Could Let You Scrub Your Data From Brokers' Files.: https://fortune.com/2023/09/15/california-delete-act/

____________________________

To see and hear more Redefining CyberSecurity content on ITSPmagazine, visit:

https://www.itspmagazine.com/redefining-cybersecurity-podcast

Are you interested in sponsoring an ITSPmagazine Channel?

👉 https://www.itspmagazine.com/sponsor-the-itspmagazine-podcast-network

Episode Transcription

Please note that this transcript was created using AI technology and may contain inaccuracies or deviations from the original audio file. The transcript is provided for informational purposes only and should not be relied upon as a substitute for the original recording, as errors may exist. At this time, we provide it “as it is,” and we hope it can be helpful for our audience.

_________________________________________

[00:00:00] Sean Martin: Hello, everybody. You're very welcome to a new episode of Redefining Cybersecurity here on the ITSPmagazine Podcast Network. This is Sean Martin, your host, where I have the pleasure of not actually having to do cybersecurity in the company. I just get to talk to loads of people who, uh, who get to and, uh, and are challenged, perhaps, with some of the requirements that often come from outside the organization.
 

I'm talking about regulations here that could detract or distract or take away otherwise from things they think are important to the business. We have to remember that businesses usually run for a reason, and that's to serve a customer, be it an individual or another entity. And we have a responsibility as a business to protect the information of those that we're serving as we do that.

That's landed on the shoulders of, of cybersecurity professionals, in collaboration with privacy folks. Uh, so we're going to talk about yet another regulation that's hit the streets or the wire, whatever you want to call it, and that's the, uh, the California Delete Act.
 

And, uh, I know nothing about it, uh, other than what its name is. And that's why I'm thrilled to have Nia Luckey on, to, uh, to kind of walk through some of this stuff with us. So Nia, thanks for being on.
 

[00:01:25] Nia Luckey: Happy to be here, Sean. Thank you for inviting me. Um, so as we know, regulations have a space in all businesses, especially regarding cyber security practices. 
 

And here we go, California with SB 362. So it's a Senate bill. Um, and it is kind of changing the game for data brokers and how we're protecting our data, and the customers and the clients, and what rights we have, and how they can kind of just get in line to provide better security for all of their clients and even the companies that they do business with.
 

[00:02:04] Sean Martin: Perfect intro. And before we get into, uh, all the nitty gritty fun stuff. Um, we met, uh, I don't know how many years ago now, uh, back at the ISSA International conference. It was a pleasure to meet you then. And, um, so I know you from then, I don't know what you're up to now necessarily. And, uh, I suspect our audience knows you even less.
 

So if you can kind of give us a view of your, your journey to this point and what you're focused on now, that'd be fantastic.  
 

[00:02:34] Nia Luckey: Well, goodness, Sean, what was that five years ago now? So I think at the time, if memory serves, I was working for Cisco in their public sector and I was helping them stand up a security operations center. 
 

Um, since then I decided during a global pandemic to kind of pivot, as all businesses were pivoting, into the consulting world. So I currently work for a global consulting firm, Infosys Limited is the name of the company, and I specialize in providing CIO and CISO level advisory services, um, for various clients for different reasons.
 

And what we're talking about today might end up being one of those reasons, but who knows? Um, what I like about it is the versatility, so we don't have to be, you know, one area of expertise. Now, granted, I have my strengths in certain areas, and others are a little bit weaker, but compliance, regulatory, governance has always been a big part of kind of what drives my passion in cyber.
 

Um, and it really, I like being able to delve into the new stuff and kind of pull it apart to make it make sense for businesses. Um, and I really like strategy, um, and helping them design and figure out what their strategic alignment looks like when new regulations or bills or anything are updated or just kind of hit, hit the market. 
 

Um, so that's a little bit of what I've been up to.  
 

[00:04:00] Sean Martin: That's great. And I, I think. That's why I connected with you so well so long ago is our brains kind of are wired the same a lot of things you're saying I'm like, yeah, I'd love that too. I can see that as well. So, uh, I'm excited to have this conversation as with many that, uh, that I bring on to the show in terms of topics. 
 

They're usually spurred by some post or conversation online, and this is no different. You wrote an article on the topic of the bill on LinkedIn, and of course, for folks that listen, uh, and watch, we'll include the link to that article so they can, uh, enjoy it as much as I did. And so we're gonna walk through a bit of that, um, as it does a really good job unpacking the, the new bill.
 

And, and what's, what's involved there? Um, I think to start, SB 362, uh, the California Delete Act. Can you, I know you provided a little, little view of what it is, but perhaps why, why now? What's its intention, uh, as a bill? Maybe even, I know we talked as we were briefing, kind of CCPA and CPRA, if I remember those acronyms correctly.
 

Why this bill now, perhaps in relation to those other two? That makes it something that's necessary at the moment.
 

[00:05:24] Nia Luckey: Absolutely. And great question. Um, so I think what automatically pops into my mind is, as larger regulations, like what you mentioned in California, so we have CCPA, we have CPRA, there's also the European GDPR. So when the larger regulations that are in place, where we know their names, we know the acronyms, we know what they're about, update and change, sometimes what's observed is a gap.

And what we've observed this year, especially in the changes that have been made to such regulations, is, well, what's happening to the data that businesses are either using to fine-tune their marketing or their research, or that's driving other business purposes, or that's shared with external, what are called data brokers, to help drive this innovation and kind of these ways that we run analytics and metrics to better the business?
 

And the gap that kind of became really apparent this year, in 2023, was that we have a lot of organizations that are now being held here in the United States to a much higher standard, right? And what are we talking about here? Well, we're talking about data privacy. So we're talking about the data privacy and the data protection of our customer, our client, our internal data.
 

So even as an employee for Infosys, the data I give the company needs to be protected to a certain standard. Well, if they decide that they are going to share any subset of that information with a data broker, that information also has to be protected while in transit, while sold, and then with the third party data broker itself.
 

And that's what Senate Bill 362 really delves into, is why that's important. And the reason it's important is, if the parent companies that are gathering the data that you're giving, that you opt in to give, to have your phone number, to have your email, to have your address, because that's what we're talking about here, Sean, right?
 

When you're giving that data to Facebook. Facebook owns that data, but we don't necessarily know as consumers if Facebook's selling that data to a data broker or how that data broker is protecting our data. What happens if that data broker has a breach? What happens if our information is compromised because of that breach? 
 

Are there ramifications in place? And if not, then should there be? And that's really what's behind the curtain of the bill itself is saying, hey, we're doing great, we're getting better, but here's another way to even further improve how we're protecting the data of our U.S. citizens, of our consumers, of our clients.
 

[00:08:14] Sean Martin: Yeah, I love it. And I want to touch on two things. We'll see how this unfolds. So the first is just that the idea... And our employees. First thing is the, um, sorry, I think you paused there for a second. First thing is the, the, the concept of data broker. And then the other part is some scenarios surrounding the exchange of this information. 
 

'Cause, you know, there are companies that we share information with and then they're using it. And then they perhaps share it with others. An employer collects information and may share it with others. So I want to talk through some of those scenarios. Maybe as we're doing that, we can say, that's the data broker in this story.
 

Um, I'll pick a couple of examples that come to mind for me, and then maybe you can share a few as well. First would be, we've seen the ads on, on television, maybe, and some people use it for tracking all of our subscription services. So we feed a service through an app that, that says, or they can look it up to see what, where our, our, uh, we give them, I guess, the, uh, the permission to look up, I don't know how it works exactly, permission to look up what subscriptions we have, so that then we can manage that, or they can help manage that stuff for us, uh, a bit better. Uh, so that's one example. And then internal, just, uh, the employer, for health insurance, or other, uh, gym memberships, whatever it is, wherever it is.
 

On our behalf, they're connecting us with another, another party and sharing potentially very sensitive personal information with them. Um, or perhaps saying, we don't want to be in the middle, so we're just going to point you, and then you give it to them. And then they, they collect information. So those are a couple of examples that I can think of, but perhaps a view from you, or some scenarios where this bill really hones in. And then as we're doing that, we can kind of point to the data.
 

[00:10:20] Nia Luckey: There's two that really come to mind when I, and the first example I'm going to throw out there, everybody should be familiar with. Um, so Cambridge Analytica, I know it's been a while, right? 
 

Data broker company, they bought data, or they were given data, it wasn't necessarily monetarily transactional, um, to run marketing and targeting of ads for the election through Facebook. So that's one giant example. Cambridge Analytica partnered with Facebook to kind of help, kind of get a gauge of how voters were going to vote, right?
 

Now, the ones that come to mind behind Cambridge Analytica might shock some people, and it's Experian, Equifax, TransUnion, they are all considered data brokers. Why? Because not only do they just have your data, because that's what runs your, your credit reports, your credit scoring, all of that. That's, you're welcome.
 

That's what they're there for. Um, but they also interact with a number of other external businesses. So when you're buying a home, when you're purchasing a car, so any large purchases, that parent company is interacting with that third party data broker company. So whether it's credit, whether it's marketing, those are two really large examples that are relevant to our audience that would help make it make sense.
 

Now there are others that specialize in other specific data sets. Um, so you can, like I said, you could use them for marketing. Not sure what type of sensitive data would be involved in there though. So that's why I bring up the examples of Cambridge Analytica, and I bring up the examples of TransUnion because now we're talking sensitive data. 
 

Now we're talking data that if it got out there, the average American would be really upset, um, and oftentimes would result in, hey, free credit monitoring or identity theft protection, things of that nature. But let's go a layer deeper, Sean. So we're adults, we're consenting adults, we consent to these services, right? 
 

Well, I just brought up a massive social media platform that now goes by Meta, but also encompasses Instagram. What other demographic are we not talking about? Are there data on those platforms that could also be given to data brokers? Now we're talking about our children. We're talking about citizens that don't have a voice or say other than, you know, their parents can advocate, but we can't advocate on if their data gets sold. 
 

Or if it gets traded, or if it's used in any, in any way, shape, or form. And the reason this is especially important in 2023 is with the emergence of external technologies that have really nothing to do with this conversation, it's just a broader, bigger picture, um, and that's the advancement of artificial intelligence and automation and all the capabilities there.
 

So if we don't know, um, where our data is outside of the companies we shared it with, from a, like, threat footprint perspective, it just becomes really large. And so then the question becomes, well, how do I opt out of this? How do I, how do I protect myself? How do I protect my children? And how do we spread the word?
 

Um, and so that's a big, that's a big driving force behind a bill like this, is because they saw the problem, right? The California AG said, hey, we, we've got an issue. They passed the bill. And now we have to work towards working with the businesses that do work with third party data brokers.

Um, we also have to work with the data brokers themselves, because there's going to be compliance pieces on both ends, right? So the parent company is going to have a set of compliance and regulations that they have to adhere to, but data brokers are now in that mix. And so they have a date where they have to disclose how much sensitive data they have. Um, they also have to come up with ways for us to opt out or say, hey, please delete my data.
 

Um, and a lot of that is coming from GDPR where you have the right to erasure. So that's where that big term's coming in and that big push at the beginning of this year in January when GDPR updated and now is enforceable. Here comes the California SB 362 Act.  
 

[00:14:40] Sean Martin: So, so, I love the scenarios you presented as well. And I know, um, Marco, uh, my co-founder, for those who don't know, I'm sure you do, uh, just had a conversation with Chris Pearson looking at, uh, autonomous vehicles, or vehicles in general, that are collecting information.
 

And as I'm saying this, I want to, I want people to think, well, what, what is data? So in the examples you provided with Experian, we can, we can picture the data. Yes. Our name, address, our social, last time we bought a home, our credit cards, that kind of stuff. When we move to a platform like Meta and all of, all of its stuff, some of that information is there, our personal information. 
 

But it's also our face, to your point on AI and other such technologies, our face, our expressions, our connections are there. When we move to a car platform, uh, it's, it's how we deal with stress. It's how we drive. It's the places we drive. It's who's with us, perhaps, in the car. Um, do we like to take risks, uh, that kind of thing. Which is a ton of data that, that can be very powerful to use as a data broker or as a company.
 

So, uh, I guess my question is in, in the, in the meta sphere, the meta world of all this stuff that they have in, in the autonomous vehicle sphere, 
 

who's the data broker and who's responsible? I think you kind of pointed to Meta kind of having the responsibility at the biggest level, honestly, and who they're working with. But how does that all play out? And, and does the bill actually touch on the fact that there's a bunch of, I'll call it, metadata?
 

[00:16:44] Nia Luckey: The bill doesn't go that deep, at least not from my familiarity. But in the, in the example you're referencing right now, let's take that autonomous vehicle as an example, right?
 

And all the capabilities it presents, you can plug in your iPhone, you can plug in your Samsung, you can plug in any of that stuff. In addition, a lot of these vehicles also have a capability very similar to Alexa. So it's collecting voice data. There's so many data points. So you have a responsibility of the dealership who's selling the car. So that's, that's one point of compliance.

They're going to ensure that if it is Siri, for example, or if it is a Google kind of equivalent, like I have in my vehicle, that those secondary companies are also aware of my voice. Like you, you were touching on likeness, right? So your voice, those data scrapes, those are all part of the data that those parent companies collect.
 

And I'm not sure they're completely aware of how much of our actual essence and likeness they do have, and how sensitive we deem that to be, because it is, it falls up under biometric data, right? So you can do voice recognition for multi factor authentication. You could do eye scanning. You could do fingerprint.
 

That falls up under a classification of biometrics. And that is definitely, absolutely, 100 percent sensitive data. And when I worked for the Department of Defense, um, in the special operations community, we really honed in on that, saying, hey, so from an individual, just kind of device capability perspective, what are we doing to protect, kind of, at that time, government secrets from getting out, right?
 

Um, but it's this, it's kind of the same, should be the same methodology that we're, we're kind of thinking this through. Um, but it really comes down to knowing what data you have, knowing where it lives. And then protecting it to an appropriate standard, which the government, thank goodness, kind of already has a really good framework in place for that. 
 

Um, but outside of public sector, it's not highly publicized. We find other ways to make it work within private sector that are equally compliant with the other regulations in play. Um, But it does, it comes down to how are we protecting the device, so in this case, how are we protecting the car, um, how are we protecting those voice services, how are we protecting any of the data that goes into the computer of that vehicle, along with any of your mobile devices that just so happen to be with you, because it's all applicable. 
 

Um, when we're, you know, when we're seeing advances in vishing attacks, which is voice phishing over the phone, um, that my son could call me right now, and I wouldn't be able to tell the difference, Sean. I would tell you it was him calling me and he needed $10,000, and it's not, they're scammers. So if this data is out there, and we know it's out there, and we know it's being compromised, then the conversation, I like to pull it back with just businesses and strategists and just say, okay, so how are we, how do we use what we currently have to make it better, to protect it in a way where,
 

You know, if Nia wants her information deleted, or she's not cool with CoreLogic having her information, that she can opt out, right? And I know that might be simplified, and there are some, you know, there's two sides of the argument. There are people that are for it, and then there are people that are like, this is going to be a burden, this is going to be unrealistic, this is going to impact business. 
 

We have to find a middle ground. We have to find a way to protect the data, because going back to my original point, it's less about us consenting adults, and I think it's more about just holistically protecting the data of a citizen and what rights we have.  
 

[00:20:34] Sean Martin: And I, I forget how long ago I thought about this. 
 

I may have written a blog or something at some point about it, but I think it was around the time that Alexa and Siri and all that stuff started to find its way on devices. And not just the, not just the mobile phones, but the devices sitting in people's living rooms. And it's one thing, in my opinion, might be a slight tangent, but I'm going to go there anyway. 
 

It's one thing, in my opinion, as an adult to say, I'm okay with that device using my voice and whatever else to take actions on my behalf. And perhaps identify things that I might not know I need to do and help guide me to do those things. And then maybe even do them for me if I, if I granted access. And I'm okay as an adult and a parental guardian of the, of my children to say, that's okay. 
 

They're running around the house. I'm okay with their voice. Right. Um, and at some point, with this bill, I could say, myself and my family, I'm, I'm deleting this account. Now, as a guest in a home, or as a person renting an autonomous vehicle in a, in a town where they don't live, this data is being collected without permission, right? Just, just because I rented a car, just because I visited somebody's house, doesn't mean I gave their device permission to collect and store the data and perhaps identify me and use it for or against me in some way. Um, I don't know if the bill touches on that. It kind of goes to the point of identity.
 

It does.  
 

[00:22:23] Nia Luckey: And it, it touches on it, and it hints that we're really going to have to dig into the contracting language, Sean. We're going to have to dig into the agreements that are in place. So you rent a car, right? In our rental agreements, there's nothing that covers this. Well, it's, it might be a possibility.
 

It might just be Nia, you know, going off on a tangent over here, but maybe in five years that would change and there would be an addendum that states, like, hey, just know this. Um, and if you want to opt out, then this is how you go about that. Um, I don't think, in the scenario you gave, that there would be a way to opt out at the time of renting that vehicle.
 

But what it's sounding like is like, let's just take, um, I don't know. What's another, what's another broker other than Cambridge Analytica, but, um, there's, there's many out there that work in terms of like targeted marketing campaigns, things like that, that we can say on the backend, Hey, listen, I rented this car, Tesla. 
 

Can you please delete my data? Or, hey, Samsung, I've been a loyal customer, here's the make and model of my phone, please delete my data. Um, if we can kind of use a simple approach like that, I think it'll be better received. Um, I, I like to postulate on it a little bit. I have a hard time rationalizing, like, if emails went out, I think a lot of people would think they would be phishing emails, um, asking you to opt out of things. But if you know where you rented a vehicle from, who you've interacted with, who you share your information with, and just start compiling a list, um, you should be able to reach out in the next five years to say, hey, thank you.
 

Did business with you, but can you please delete my data? Now I'm done. Um, obviously I don't want Equifax, Experian, or TransUnion to delete any of my data, but for anybody else that may have it, one, I would like to know who. And I would think it would be interesting to see what type of technology actually develops to kind of aggregate who has your data, and then maybe be able to, like, just go in and say, okay, delete my data, delete my data, delete my data. Very similar to what we see with, if you have open accounts that you don't realize, subscriptions. I know that's, that's now a business model, and it's very successful.
 

So I almost want to postulate that, that, that might, that might happen. Um, but there needs to be an ease of accessibility for us as the clients and as the consumers to just say, hey, done doing business, it's been great, but you know what, Google, I don't want you to have my data anymore, or Amazon, I don't want you to have my data anymore.
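Nia's suggestion here, compile a list of who has your data and work through delete requests one by one, can be sketched in a few lines of code. This is a purely illustrative sketch with hypothetical company names, contacts, and field names, not a tool discussed in the episode:

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class DataHolder:
    """A company believed to hold your personal data (all fields hypothetical)."""
    name: str
    privacy_contact: str                  # email or URL for deletion requests
    requested_on: Optional[date] = None   # when a delete request was sent, if ever

def pending_deletions(holders: List[DataHolder]) -> List[DataHolder]:
    """Return the holders we still need to send a 'please delete my data' request to."""
    return [h for h in holders if h.requested_on is None]

holders = [
    DataHolder("Example Rental Cars", "privacy@rental.example"),
    DataHolder("Example Marketing Co", "optout@marketing.example",
               requested_on=date(2023, 10, 1)),
]

for h in pending_deletions(holders):
    print(f"TODO: send deletion request to {h.name} via {h.privacy_contact}")
```

A real workflow would route each request through the broker's own deletion process, or, once SB 362's mechanisms are in place, through the single registry request the bill contemplates.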
 

[00:25:11] Sean Martin: Yeah. And I can keep going on tangents, uh, a lot of, a lot of which are very consumer focused because I'm thinking of it from a consumer perspective. Right. Um. I want to switch it though in the time that we have left to really get into kind of the business here. We can name huge brands, uh, like we've done and one can, one can presume that, that they have the, the staff and the knowledge and the processes and the funds to do what's required of them, uh, with this bill. 
 

And hopefully they're already doing some of it anyway, or are prepared to manage some of it anyway, having looked at risk and ethics and things like that, uh, proactively. When we start to get to some of the smaller organizations, though, where they might say, well, I'm not one of those big, big companies, how does this impact me as a, as a small business or a medium business, uh, where I just want to provide a service to my customer?
 

And I need certain information to do that business. In order to use a third party service that enhances my service to them, I need to share some of that information with that service as well. What advice or thoughts might you have for some of those organizations that now have to pay attention to this bill?
 

[00:26:48] Nia Luckey: Right, especially in the state of California, where it is the most heavily, like, um, enforced right now. That's a great question. And so my advice would be this: know the data that you have, know and understand what sensitive data is, what it's comprised of. So I know we've, we've touched on some of those examples on this podcast alone.
 

Educate yourself. Um, in addition, there's no harm in offering to those clients, those external clients of this business that we're talking about, an opt out, or a do-not-collect, or a right to erasure, or some type of clause to honor the new bill in your contracting language. So again, it's, it's right there at the surface and it's at your fingertips.
 

And to your point, a lot of these smaller companies, they're not going to be like largely aggregating sensitive data, right? No. A lot of times we're talking about contract, uh, contractual invoices. 
 

Those are all sensitive data. Um, so it's, it's understanding what is applicable to your business, and then doing that data inventory, knowing where it is, and then saying, okay, you know what, if I'm going to maintain this data, how am I going to protect that? And let me protect it, right? Look to CPRA and CCPA just for guidance on that data protection, because that's more so what those are about, data protection and privacy, all, you know, kind of not at the same time, but in the same vein. And this just kind of amplifies it up a level to say, hey, listen, this also includes, and then here's your data sets.
 

And so, yeah, we have names, we have addresses, we have social security numbers, we have financial data. Those are four primary examples of sensitive data sets that a lot of businesses, whether they realize it or not, have. You have it in some capacity. If you buy something from me right now and I write you up an invoice, or the other way around, right, that's technically financial data.
 

And so how are we protecting that data? And that's, that's really the spirit of the bill here, ensuring that, hey, at the end of the day, if I don't want you to have that data, please get rid of it. Um, now, obviously, as we look at larger organizations, the scale of the data increases. But when, when it is smaller, if we can define the core components of what that compliance looks like from, you know, a bill perspective, from an overall data protection and data privacy perspective, I think for smaller businesses, that is a fantastic place to start.
 

And then look at your data handling practices. If you have to, instill some type of data classification scheme at a very low level; you don't have to go over the top with it. Again, we're not a banking institution. Just some ideas that come to mind: my sister owns a small business in Ohio, so some of this could apply to her larger contracts, because she does contracting work and she's providing a third-party service to these larger companies, right? But if we go to the simplicity of the contract and say, hey, third-party company, her company, you will not sell the data, you're including these clauses down the chain. 
 

And that's what the bill points to: hey, let's really deep dive into our understanding of the surrounding regulations. What has changed? Let's look into our policies and into our contracts, and ensure the language is in there and is clear, not buried somewhere, sandwiched in a paragraph all the way at the bottom of page 36. 
 

Make it clear, make it apparent. And then address the questions, if there are any questions, because most likely there will be. 
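The low-level classification scheme mentioned above doesn't need to be elaborate. One possible sketch, with hypothetical tier names and field lists:

```python
# A deliberately simple three-tier classification scheme for a small
# business. Tier names and the field lists are hypothetical examples,
# not anything defined by the Delete Act.
from enum import Enum

class Tier(Enum):
    PUBLIC = 0     # marketing copy, published pricing
    INTERNAL = 1   # contracts, invoices without payment details
    SENSITIVE = 2  # SSNs, financial account data, anything regulated

def classify(fields):
    """Classify a record by its most sensitive field."""
    sensitive = {"ssn", "card_number", "bank_account"}
    internal = {"invoice_total", "contract_terms", "email"}
    tier = Tier.PUBLIC
    for f in fields:
        if f in sensitive:
            return Tier.SENSITIVE
        if f in internal:
            tier = Tier.INTERNAL
    return tier

print(classify({"name", "invoice_total"}))  # Tier.INTERNAL
print(classify({"name", "ssn"}))            # Tier.SENSITIVE
```

The point is only that every record gets a tier, so the handling rules (where it may be stored, who may see it, whether it can be shared) can key off that tier rather than being decided ad hoc.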
 

[00:30:49] Sean Martin: Yeah, and my questions kind of sit around the locations and the processes that the data is part of. One of the things I like to get people to think about on the show is what we can do differently to minimize, reduce, or eliminate exposure in the first place. 
 

There's no better topic to have that conversation around. If we are capturing a bunch of data, replicating it in a bunch of places, and sharing and using it in a bunch of different processes, then protecting it in the first place is hard, and when it comes time to delete it, it's going to become an unwieldy mess. 
 

Unless you can limit the places where the sensitive information is stored and processed and transferred and shared. And I don't know if the bill gets into any of that (it does not, unfortunately): where the data lives, what the processes are, and also the types of data. Can a consumer come and say, I want all of my data deleted in every place you have it, regardless? Or do they have to say that anything with sensitive personal information needs to be deleted, which might not include an account that only has a name and a password, so as a business I don't have to delete that, just the places where the sensitive data is? 
 

[00:32:28] Nia Luckey: So my understanding is that this bill closely aligns with the right to erasure that falls under GDPR, meaning you don't have to be specific if you're asking for your data to be deleted. 
 

That's literally what it means: anything associated with Nia Luckey is to be deleted and removed from the enterprise. Again, that might be oversimplified, but that's my understanding of both. If a customer, a consumer, or a business were to say, hey, I had Client A reach out and we have to delete this data. 
 

You would need to comply. And there is a time window associated with that compliance under GDPR; not under this bill specifically, but under GDPR compliance there is. I like the way you were thinking, though, because that is the step beyond just understanding what type of data you have: then we're going to classify your assets, right? 
 

So if you can define what your critical assets are, and they are hardened to an appropriate security standard, that's where your sensitive data should live. That way you're not hunting all over your enterprise looking for this data. If there's data outside of that defined area where it should be stored, technically that would be the non-compliance. But all of your sensitive data should be in a highly secure location. How you do that, the bill does not deep dive into. It really provides a lot of security autonomy to private sector and public sector businesses, which I appreciate. 
 

Nobody likes the red tape of regulation. But with that said, if and when they decide to audit or assess for compliance, anything that deviates could be considered a finding. 
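The delete-everywhere behavior discussed here is easier to honor when every data store is registered in one place. A hypothetical sketch of that pattern in Python (the store names, subject keys, and callbacks are all invented for illustration):

```python
# Sketch of a right-to-erasure handler: every data store registers a
# delete callback, so a single request removes a subject's data from
# every registered location. Stores and subject ids are hypothetical.

erasure_registry = []

def register_store(name, delete_fn):
    """Register a store under a name with a delete(subject_id) callback."""
    erasure_registry.append((name, delete_fn))

def erase_subject(subject_id):
    """Run every registered store's delete callback; return an audit log."""
    log = []
    for name, delete_fn in erasure_registry:
        removed = delete_fn(subject_id)
        log.append((name, removed))
    return log

# Two toy in-memory "stores" keyed by subject id.
crm = {"nia": {"name": "Nia Luckey"}, "sean": {"name": "Sean Martin"}}
invoices = {"nia": ["inv-001"]}

register_store("crm", lambda sid: crm.pop(sid, None) is not None)
register_store("invoices", lambda sid: invoices.pop(sid, None) is not None)

print(erase_subject("nia"))  # [('crm', True), ('invoices', True)]
print("nia" in crm)          # False
```

The audit log matters as much as the deletion itself: under a time-bound erasure obligation like GDPR's, you want a record of which stores were swept and when.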
 

[00:34:24] Sean Martin: So one of the things we touched on before we started recording is that the regulatory landscape is not going to slow down, right? Probably not. It's going to continue to grow. I mean, we've seen other bills similar to CCPA in other states, and it's not out of this world to think that we might see the same type of bill passed in other states. 
 

And let's just face reality: AI and other advanced technologies are going to bring with them more regulations, globally, nationally, regionally, and locally. So I guess the point of that is, if you're an organization constantly reacting to these new laws and regulations, and then to the standards that hopefully help you tackle some of these challenges, 
 

you're always going to be on the back foot. So, any thoughts or advice on how to switch things up and get ahead of the curve on this? Because we're going to see more come. 
 

[00:35:45] Nia Luckey: Yeah, I agree. It's a big part of why I love cybersecurity so much: it's always evolving. But sometimes the rate of change really impacts the way we do business, or our ability to even have an educated conversation about something that's emerging. So I treat it very similarly to threat intelligence. I encourage board members, and anybody at the C-suite level or above: keep your ear to the ground with a lot of these emerging changes. 
 

Stay in the know. This bill literally just hit the press, what, a month ago? And more information is already starting to come out of it. Stay ahead of it. Start to strategize now. Instead of waiting a quarter or two after you hear about a change, whether it's in regards to a threat or in regards to a regulation, which is what we're talking about today, start strategizing immediately. 
 

It's important to have your advisors around you. It is important to strategize, because bills like this stand to impact the way a lot of organizations do business. And it stands to impact the data brokers themselves most of all, and how they handle and manage our data. So that's going to take strategy. 
 

That's going to take time. That's going to take planning. But what we don't want to see is: okay, we waited, we took six months to strategize and plan, and then come January 2024 you're held to that standard and you have to be compliant. And if you're not compliant, there are fines associated, right? 
 

Nobody likes that type of language. So it's taking that proactive approach, leaning in a little bit, and having these types of conversations. And it's no secret: data protection is a common struggle across organizations, period, regardless of whether they're a data broker or not. It is not an easy nut to crack, and it is not a one-size-fits-all approach. 
 

So it's having your smart people in the room, being willing to actively listen to their ideas, and seeing what has the most long-term benefit to everything, especially compliance. From my understanding, in a lot of these spaces, when it comes to sensitive data there's a very low risk appetite. 
 

And on the auditing side, it's almost borderline zero tolerance, without calling it zero tolerance. So lean in, lean on your advisors, talk about it. Don't avoid it. Do as much research as you possibly can to get ahead of it. And build a strategy together that is cohesive, 
 

because the strategy for a large company is going to be different than for that small company we were talking about. So find what makes sense for your company, find what makes sense in terms of feasibility and applicability, and then figure out what that roadmap looks like. 
 

[00:38:57] Sean Martin: This may be a bit dystopian, but I think it was just this past weekend I published an article, a fictional story about dealing with zero-day attacks. And all I can think of is zero-day compliance, where new requirements come out all the time and you have hours to respond. 
 

[00:39:18] Nia Luckey: I love it. It's true though. That would be a scary world. 
 

[00:39:21] Sean Martin: That would be a scary world. But we deal with that with vulnerabilities at the moment, so it wouldn't be a new model to have to deal with, though certainly a different one, with different types of information. That being said, one of the takeaways I mentioned in that article was that 
 

we have tremendous amounts of data; that's what we're talking about here. And we have tools, we have intelligence, and we have technologies like AI that I think we should be tapping into. Organizations spend ungodly amounts of money trying to find that millisecond of savings in a workflow. Can we not take a fraction of that and find new ways to approach risk in cybersecurity and data privacy? 
 

[00:40:07] Nia Luckey: I think that would be very innovative, and I agree with you. That would be a space where you really want to get aligned on what's applicable to your organization, right? And then, to your point, is there a way to automate the checks and balances? I know there are products out there that check for compliance at the security control level. 
 

Okay, well, what about this bill? You're not going to be able to point a tool, a vulnerability scanning tool, at a document that doesn't actually call out a specific security control. Look at Sarbanes-Oxley; it's the same way. Everybody knows maybe one control, and Section 404 is the big one there. But to your bigger point, there has to be a way to streamline the compliance check aspect internally for businesses, right? 
 

A way that the CISO, the CIO, or anybody on the GRC or risk team could say: all right, it's time to do our quarterly assessment, and this is what we're assessing for. Ready, go. Right now, in a lot of organizations, it's a tiger team approach. It's still manual. There are ways to aggregate the data that make it easier, but it's not automated yet. 
 

And I would be very interested to see how we could automate that process. Examples could include taking years of the previous assessments you have to do. I know in the DoD specifically, it's annual, right? So if you're feeding your AI that data every year, it knows what to check for. 
 

It's going to develop a pattern. But my issue there is that it would have to be proprietary to the company, right? You can't feed it external data, because it's trying to assess the compliance of your ecosystem, of your enterprise. But it would be interesting to see if that develops as well. 
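An internal assessment like the quarterly one described here could start as something as plain as a table of named checks run against your own inventory. Everything below, the check names and the inventory fields alike, is a hypothetical illustration, since the bill itself names no specific security controls:

```python
# Sketch of an internal quarterly compliance check: each check is a
# named predicate over the organization's own data inventory.
# Check names and inventory fields are hypothetical examples.

inventory = {
    "sensitive_records_encrypted": True,
    "erasure_requests_open_over_30_days": 0,
    "data_stores_outside_approved_zone": 1,
}

CHECKS = [
    ("encryption-at-rest", lambda inv: inv["sensitive_records_encrypted"]),
    ("erasure-sla", lambda inv: inv["erasure_requests_open_over_30_days"] == 0),
    ("storage-boundary", lambda inv: inv["data_stores_outside_approved_zone"] == 0),
]

def run_assessment(inv):
    """Return (passed, findings), where findings are failed check names."""
    findings = [name for name, check in CHECKS if not check(inv)]
    return (not findings, findings)

passed, findings = run_assessment(inventory)
print(passed, findings)  # False ['storage-boundary']
```

Each failed check maps naturally to the "finding" language used earlier in the conversation, which is what an auditor would flag.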
 

[00:41:59] Sean Martin: Lots to think about. 
 

Lots to think about. And I believe that if we take an advanced approach to some of this, we can get out of our own way, because as humans we're not going to keep up with the amount of change that we're already seeing and that's coming our way. Of course, you have to balance the future view with today's actions. 
 

And so if this bill applies to you, it's time to take a look. And thankfully, Nia has done a lot of work for you already in terms of referenceable material: the bill itself, some articles, and of course her article on LinkedIn is a good place to get an overview, so I'll include that. 
 

I'm not going to repeat all those resources; in my notes I'll just point to your article. Perfect. You can grab them all from there. And I suspect this won't be the last time you hear about the Delete Act, not just in California but elsewhere, and we'll see how it all plays out. And yeah, it's great to see you again. 
 

Fantastic chatting with you. Loads of fun. Hopefully we got people to think, and I appreciate having you on the show. 
 

[00:43:17] Nia Luckey: Thank you for having me, Sean. It was my absolute pleasure. And check out the article. Let me know if you have any questions, and again, thank you. 
 

[00:43:29] Sean Martin: Perfect. 
 

And thanks, everybody, for sharing and subscribing, and stay tuned for more here on Redefining Cybersecurity.