Redefining CyberSecurity

How Do We Handle Sneaky Changes in Terms and Conditions That Allow Training of AI with Sensitive/Customer Data Essentially Without Our Knowledge | A Conversation with Nigel Cannings | Redefining CyberSecurity with Sean Martin

Episode Summary

In this episode of the Redefining CyberSecurity Podcast, host Sean Martin and guest Nigel Cannings dive deep into the murky waters of data privacy and AI. They shed light on the ethical consequences of using customer data for AI training, urging listeners to rethink their data-sharing practices and to scrutinize the services they trust with this information.

Episode Notes

Guest: Nigel Cannings, CEO at Intelligent Voice [@intelligentvox]

On Linkedin | https://www.linkedin.com/in/nigelcannings/?originalSubdomain=uk

Google Scholar | https://scholar.google.co.uk/citations?user=zHL1sngAAAAJ&hl=en

____________________________

Host: Sean Martin, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining CyberSecurity Podcast [@RedefiningCyber]

On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/sean-martin


___________________________


In this episode of the Redefining CyberSecurity Podcast, host Sean Martin is joined by Nigel Cannings. The conversation centers on the evolving landscape of data privacy, particularly the implications of companies using customer data to train AI models, with a specific look at DocuSign's recent policy changes. Martin and Cannings discuss the fine line between using data to enhance services and the ethical, legal, and privacy concerns that arise when companies change terms and conditions to harness customer data for AI training without explicit consent.

Cannings, drawing on his background as both a lawyer and a technologist, provides insights into the challenges of truly anonymizing data and the potential risks of data misuse. He shares his personal decision to cancel his subscription to the service in response to these practices, urging listeners to reconsider their use of services that do not transparently and responsibly handle their data. The conversation also touches upon the broader implications for cybersecurity, including third-party risk assessments and the responsibility of companies to not only secure consent for data usage but to continuously update and inform customers about changes to terms and conditions.

Martin and Cannings stress the importance of consumer awareness and the need for businesses to balance innovation with ethical data practices. By highlighting examples from various industries, this episode calls for a more transparent and responsible approach to data usage in the digital age, emphasizing customer rights and the potential repercussions of neglecting privacy concerns.


___________________________

Watch this and other videos on ITSPmagazine's YouTube Channel

Redefining CyberSecurity Podcast with Sean Martin, CISSP playlist:

📺 https://www.youtube.com/playlist?list=PLnYu0psdcllS9aVGdiakVss9u7xgYDKYq

ITSPmagazine YouTube Channel:

📺 https://www.youtube.com/@itspmagazine

Be sure to share and subscribe!

___________________________

Resources

Inspiring Post: https://www.linkedin.com/posts/nigelcannings_privacymatters-docusign-aiprivacyconcerns-ugcPost-7168953031135322112-vZSM

___________________________

To see and hear more Redefining CyberSecurity content on ITSPmagazine, visit:

https://www.itspmagazine.com/redefining-cybersecurity-podcast

Are you interested in sponsoring this show with an ad placement in the podcast?

Learn More 👉 https://itspm.ag/podadplc

Episode Transcription

How Do We Handle Sneaky Changes in Terms and Conditions That Allow Training of AI with Sensitive/Customer Data Essentially Without Our Knowledge | A Conversation with Nigel Cannings | Redefining CyberSecurity with Sean Martin

Please note that this transcript was created using AI technology and may contain inaccuracies or deviations from the original audio file. The transcript is provided for informational purposes only and should not be relied upon as a substitute for the original recording, as errors may exist. At this time, we provide it “as is,” and we hope it can be helpful for our audience.

_________________________________________

Sean Martin: [00:00:00] Good morning, afternoon, evening, wherever you are. Welcome to Redefining CyberSecurity here with Sean Martin, your host, where I get to talk about all kinds of cool things related to cybersecurity as organizations attempt to protect their information and the wealth that they're generating for themselves and their employees and their investors and the whole lot. 
 

Um, sometimes you can't protect them from themselves though. And, uh, organizations, a lot of them have the right intentions, uh, when it comes to cybersecurity, but then the business does something that makes people scratch their heads. Sometimes it's overtly visible, in the form of a product launch. 
 

Sometimes it's covert and, uh, not as highly visible, when it's deep in a feature that doesn't get promoted. And, uh, you might be lucky to see something come through in a policy. [00:01:00] Uh, we're talking about privacy here, of course, and well beyond individual privacy, and you'll understand why in a moment. I'm thrilled to have Nigel on. 
 

Nigel, it's been a while since you've been on the show. Nigel Cannings, good to see you.  
 

Nigel Cannings: No, great. Thanks for having me on, Sean. Always a pleasure to, uh, to come on.  
 

Sean Martin: And, uh, it's my honor to have a TikTok, uh, star on the show.  
 

Nigel Cannings: Yeah, I wouldn't say, I wouldn't say star. What's smaller than a star? 
 

You know, whatever's a lot smaller than a star, that's definitely me.  
 

Sean Martin: Well, you're a star in my eyes, Nigel.  
 

Nigel Cannings: Oh, you're so sweet. You really are. I wish my wife would say that.  
 

Sean Martin: There you go. There you go. I think most people like other people to say good stuff about them. So, but anyway, you are amazing. And, uh, to be honest, I'm not on TikTok. 
 

So I only know that from a little birdie that said something about you being there. But, uh, I do know you're on LinkedIn, and I presume some of the same type of content we find there, which is what prompted today's conversation. But, uh, we're not here to [00:02:00] talk about TikTok. We are going to talk about one episode that you did, a little video where you walk around and you talk about stuff. 
 

And this one caught my attention, and caught my co-founder Marco's attention as well.  
 

Nigel Cannings: Yeah, well, I do, I do like to do these kind of walk and talk videos. I think the only reason people watch them is they're hoping that one day they'll see the episode where I get hit by a bus. I'm not, I'm not looking properly. 
 

Sean Martin: Will you become part of the zebra crossing?  
 

Nigel Cannings: Absolutely. Yeah, that would be me. I'd be flattened on the floor.  
 

Sean Martin: So let's not let that happen, Nigel. Pay attention, my friend, watch out for those big red buses. Absolutely. Um, so for those that didn't catch our end-of-year, call it a recap and predictions episode, a group of us talking about what might be coming. 
 

Uh, if they watch that, they know who you are. Maybe they forgot, but, uh, let's give them a refresher of who you are and what you're up to at the moment.  
 

Nigel Cannings: Yeah, cool. So, um, as you can probably see from the bottom of the screen, I'm Nigel Cannings. Um, [00:03:00] I'm the CEO and, uh, co-founder of a company called Intelligent Voice. 
 

We're based out here in London. Um, we're known very much for our secure, confidential speech processing. And so we do a lot of work for banks and insurance companies and governments and police and all sorts of people like that, um, processing voice identity, voice transcription, that type of thing. 
 

Um, but we've actually got a kind of very deep expertise in all this kind of fancy large language model stuff that people are talking about at the moment. And we've been working with transformers since about 2018, actually, pretty much since the genesis of it. Um, and very much in the kind of, how do you make these things secure? 
 

How do you make them explainable? How do you make them responsible? Um, and also, how do you train them without getting access to a customer's data? How do you actually kind of tune them towards that? And the thing I don't normally tell people out loud is I'm a lawyer by training, 
 

but a technologist [00:04:00] by passion. So I'm kind of one of those rare people, I think, who kind of came from the law, um, and has a strong legal background, but actually is not bad at programming and thinking up technical stuff as well.  
 

Sean Martin: So you have a constant internal struggle there.  
 

Nigel Cannings: Oh, it's awful. It really is. 
 

It really is. Yeah. So, uh, so yeah, I spend a lot of my time kind of thinking about privacy and thinking about security, and thinking about it from a kind of practical perspective, but also from a legal and ethical perspective as well, um, and the interplay between them. And then of course, you know, what we're seeing going on in the LLM world at the moment is just incredible from that perspective, the ethical and privacy concerns that are coming up. 
 

Um, and as you said, I posted a video about one particular incident, um, which seems to have broken LinkedIn. Um, certainly, certainly my [00:05:00] LinkedIn. So it's, yeah, it's, um, a really interesting time. So I don't know, do you want to segue into it, Sean? Or do you want me to talk about it? 
 

Sean Martin: Let's do that, because I think there are two points I want to make as we kick this off. So first, I mean, these videos that you do: these are from-the-heart videos. I mean, this is just you talking. So first, it's not rooted in any agenda or anything. 
 

Second is the brand that you are discussing. It caught my attention because it's a service that, uh, we use. And then of course, when you hear and read what you say, it's like, yeah, we know that this is possible, and we know it probably is happening. But, kind of to my intro, unless somebody comes out and says, this is exactly how [00:06:00] we're doing something, 
 

You're not going to really know. So you mentioned in your intro kind of the, I'll call it the transparency part of what you're doing. Um, so how does it work? Why do it that way? How are we getting the data? All of that, to me, is transparency: to show and demonstrate, one, you know what you're doing, you're taking time to think about it, and then it comes through 
 

in the implementation as well, hopefully. Um, but this case was the second half of my point, which is, it seemed to be somewhat covert. And I don't know that they're trying to be malicious or trying to hide anything in particular. Maybe the response tells a tale there; we'll see. Um, so let's just get into it. 
 

This is about DocuSign using signed, I forget, they call them envelopes, right? So content that goes in their envelopes, uh, to train [00:07:00] AI. And, uh, yeah. So go, take it from there,  
 

Nigel.  
 

Nigel Cannings: I know, it's like, I mean, it's a shocker, this one. Um, so back in January, very quietly, DocuSign changed their terms and conditions, and they very explicitly said in there, um, they can use your content to train their own machine learning models. 
 

I mean, it's as blunt as that, basically. Um, now, the problem is, everyone wants to be a data company now. That's the problem. You know, data is the new oil. Everyone wants to be a data company. 
 

Um, so the reason I came across it is I saw an FAQ pop up on DocuSign's, um, site a few days ago, a week or so ago now, which basically said, um, you know, do we use your data to train our machine learning models? And it said, yeah, yeah, [00:08:00] we do. But don't worry. Don't worry. Um, we're anonymizing it all. 
 

And so that's fine. Uh, and that was actually the bit, you know: oh yeah, and we're only doing it if contractually authorized. So, um, I did a bit of digging on this. Um, as I said, I've got, you know, a legal background, so I read terms and conditions for fun. And, um, what I realized was, you cannot use DocuSign's service without agreeing to these terms and conditions. 
 

So it was really disingenuous to say, yeah, yeah, only if you contractually authorize us to do it, because it's right there in the middle of the terms and conditions. Um, and there's a whole bunch of things that came out of this for me. I mean, one, the interesting timing of this is the FTC, your Federal Trade Commission, um, about a month or so ago, um, put a load of guidance out, basically saying, [00:09:00] if you sneakily change your terms and conditions to allow yourself to train machine learning models using people's data, um, that's probably illegal. 
 

And I thought, this is great. So you've got, you know, DocuSign. So, you know, do you know I use DocuSign? Well, I did use DocuSign. Let me be really clear: the second I saw this, we canceled our subscription. And I'm probably going to get in, you know, lots of trouble saying this to people, but until DocuSign changes their terms and conditions back, I would urge people to look at using other services. 
 

Um, because, you know, all my contracts are there. I mean, all your contracts are there, Sean. Half the world's contracts are there, and you cannot anonymize data 100%. And even if you could anonymize it a hundred percent, the other [00:10:00] stuff there, it's copyright, it's confidential, you know. It's just like, I don't even know where to start with this one. 
 

I really don't.  
 

Sean Martin: I know, and we were talking briefly before we started recording: just the term anonymize might mean one thing. And I've done some work in the healthcare space, where de-identification is another, which is an effort to be much more rigorous. So you can't, not just anonymize, but you can't use multiple contracts to decipher, right? 
 

So it's de-identification, not just anonymization, but both of those are about the identity. And to your point, uh, it's, yeah, the contract data, right? Um, which could be used to determine how much people make on, uh, on their contracts, what their services are, what their liabilities and responsibilities are, all this stuff, right? 
 

And forget about IP. [00:11:00] If somebody's submitting a patent application, as I think somebody commented in your thread there. So it goes way beyond, um, just the identity, way beyond privacy. We're talking intellectual property and all kinds of things that, yeah, as a data company, would be cool to have and find some interesting stats for, but, but geez.  
 

Nigel Cannings: I mean, it's, it's horrifying. 
 

And the thing is, there are a whole bunch of studies which show it's almost impossible to de-identify slash anonymize a large volume of data. And there was a great one. Um, there was an attack done on the U.S. census, um, a little while back, and again, I posted a link to that in my comments on the, uh, on the video as well, which actually talks about how easy it is to reconstruct anonymized data. 
 

They really thought hard, um, the census office, about how to, um, kind of put it into blocks of people and not go down to the individual.
 

[00:12:00] Plus, um, just using public data. So, um, they would probably say that the value, um, in the confidentiality there is really far greater than whether someone can work out where I live, um, and who I am. You know, I personally think I'm quite important, but, you know, the law firm probably disagrees on this. So, you know, but there's this idea of a reconstruction attack. And actually, the census guys,
 

they're completely revamping the way they publish census details, because they red-teamed their own census records, which I thought was brilliant. It was so good to see someone taking it that seriously. Um, and that's what DocuSign should have done. I mean, if they really intended to do this, at the very least they should have had a, uh, documented, um, verifiable, replicable red-team kind of report that said: [00:13:00] we have taken an independent company, an independent bunch of people. 
 

They've looked at what we've done. They've proven a hundred percent there's no names in there, there's no numbers, there's no dates, there's no whatever it might be. And also, they've attempted a reconstruction attack, um, and they can't do it. You know, that at least would have been, you know, something. But just to say in an FAQ, yes, yeah, we're anonymizing it. 
 

Don't worry. It is tragic.  
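
To make the reconstruction and linkage risk Nigel describes concrete, here is a minimal sketch in Python, using entirely hypothetical records, of a linkage attack: an "anonymized" dataset is re-identified by joining it against public records on quasi-identifiers such as ZIP code, date of birth, and sex. This is an illustration of the general technique, not the census attack itself.

    # A linkage attack in miniature: all data here is made up, but the
    # mechanics are the same ones census-style red teams exercise.
    anonymized = [  # names stripped, quasi-identifiers and a sensitive value kept
        {"zip": "10001", "dob": "1976-03-02", "sex": "M", "salary": 185000},
        {"zip": "94105", "dob": "1981-11-19", "sex": "F", "salary": 240000},
    ]

    public = [  # e.g. voter rolls or social profiles: identified and freely available
        {"name": "A. Smith", "zip": "10001", "dob": "1976-03-02", "sex": "M"},
        {"name": "B. Jones", "zip": "94105", "dob": "1981-11-19", "sex": "F"},
    ]

    def link(anon_rows, public_rows, keys=("zip", "dob", "sex")):
        # Index the public data by quasi-identifiers, then join the two sets.
        index = {tuple(p[k] for k in keys): p["name"] for p in public_rows}
        for row in anon_rows:
            name = index.get(tuple(row[k] for k in keys))
            if name:
                yield name, row["salary"]

    for name, salary in link(anonymized, public):
        print(name, "earns", salary)  # the "anonymous" record, re-identified

With only three quasi-identifiers, both toy records resolve to a name, which is exactly why "we removed the names" is not the same as anonymization.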
 

Sean Martin: Yeah. Well, I mean, presumably if everything was all text, there might be ways to find and scrape and redact and cut and whatever. Um, but the reality is PDF, well, it's its own format first off, not too terribly complex, but it does include other elements, right? 
 

Uh, it could perhaps include a [00:14:00] code block, it could include an image, it could include an embedded video, right? Tons of things can be inside this thing that you're uploading, that becomes part of their envelope. And I'm just trying to think of what might be going through one's mind as they think about this process. They may have thought, well, we could 
 

do the anonymization easily with just text data, but it was really hard before with all the other crap in there. Now come LLMs: we can build something that not just analyzes the text, but goes into all the other multimedia elements that are in these PDFs, or whatever documents or elements you're uploading, and can actually now do it. 
 

So I think the technology may have given them, and we'll talk in a second about the whole decision to do it in the first place, [00:15:00] but may have given them the opportunity to do something that they couldn't do before, which I think is cool. The question still remains, though: is it something you want to do? 
 

And I think we're picking on DocuSign here, but I can almost guarantee for every DocuSign there's another hundred that have done the same thing and maybe 1 percent of them have updated their terms and conditions.  
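
Sean's point about PDFs is easy to check for yourself. A small sketch, assuming the third-party pypdf library and a hypothetical file name, that inventories what is actually inside a document before it gets uploaded anywhere; text-only redaction pipelines tend to miss everything after the first number in this report.

    # Inventory the contents of a PDF before sharing it.
    # Assumes: pip install pypdf; "contract.pdf" is a hypothetical file name.
    from pypdf import PdfReader

    reader = PdfReader("contract.pdf")
    for num, page in enumerate(reader.pages, start=1):
        text = page.extract_text() or ""
        images = list(page.images)          # embedded images (logos, scans, photos)
        annots = page.get("/Annots") or []  # links, comments, form widgets
        print(f"page {num}: {len(text)} chars of text, "
              f"{len(images)} image(s), {len(annots)} annotation(s)")

Anything that shows up in the image or annotation columns is content a plain text scrubber never even sees.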
 

Nigel Cannings: Well, and I think that's what the FTC were getting at, actually: the fact that, you know, there's no informed consent. And, you know, I know that 
 

we here in Europe, uh, are a bit smug about all of our data privacy laws and that type of stuff. But there's a certain basic business sense in kind of making sure that your clients understand what's happening to their data, and aren't surprised at some point later to find that you've done something with their data which goes against the whole kind of ethos of what 
 

you're doing. [00:16:00] Uh, and I think, uh, you know, with a company like DocuSign, I think the thing that's so shocking about it is you just assume that the data you put in there is kind of locked up and, you know, can't be accessed by anyone else. And the thought that anyone has got access to it, um, is the frightening part. I mean, there was an interesting thing, and this is happening a lot, and you're absolutely right, Sean. 
 

So, um, Microsoft, of course, have tried to put some clear blue water between themselves and OpenAI recently in a number of different ways. Um, one is the Mistral deal they did over here in Europe, but also they've got their own OpenAI enterprise service whereby you can use OpenAI models, but there's this huge, great big kind of block of text. 
 

So: we will not send data to OpenAI, we will not [00:17:00] use your data to train models. And you're thinking, that's good. You know, I like that. But actually, if you dig into it, you'll see they've got something called an abuse monitoring process. And basically, what it says is, um, ignore all the stuff we said above. 
 

Um, we can keep all of your data. So imagine, you know, you're training an LLM for internal use, you're using highly confidential data. Um, we can keep that data and the prompts that you've used for 30 days. And if we believe that the data is, um, being used in an abusive way, or you're misusing the service in some way, we can read it. And again, it's one of those kind of, hang on a second. 
 

No one's ever done this before. You know, if you built a search engine in Azure, you don't expect a human being is going to be reading your data. And so, you know, there's a real, I don't know what [00:18:00] it is about LLMs particularly that's got everyone thinking, yeah, do you know what, we can just use this stuff. 
 

You know, we'll just use your data. And people kick Microsoft a lot, but actually, in general, their security policies are pretty good. You know, when you're paying for Azure, they take it pretty seriously. But, you know, the thought that there is an Azure service there which, 
 

very quietly, and again, it's very quiet, that's the thing about it, requires you to kind of swim into the terms and conditions and kind of work through and go, yeah, there's something going on there. So you're right, there's a lot of companies doing it, big and small. Um, and they're quietly changing terms and conditions. 
 

They're quietly kind of re-architecting the way they use the data. And I think it's going to stop, and I really think it's going to stop. 
 

Sean Martin: Yeah, well, this goes back a number of years now. [00:19:00] Um, I know somebody, she's no longer with us, sadly, but, uh, I knew somebody who was working for the big G and, uh, working as part of the voice product group, I believe it was. 
 

And the responsibility, or the role, required that she listened to half of the conversations, mapped on screen to the transcript. So that was live transcription validated by humans. But of course, before the human validated it, it was being used for training in real time as well. Um, nobody expects their voice conversations to be monitored in real time by humans either. 
 

No. And then of course, the technology just scales that and makes it even more scary.  
 

Nigel Cannings: Yeah. And Amazon, funnily enough, did the same with Alexa. [00:20:00] So there was basically this big office block in Romania where they had people listening in to calls. And the problem with things like Alexa, of course, is that it has a lot of, um, accidental activations. So it wasn't just the fact they were listening in to the, you know, I've got to be careful what I say, because I've got one of these devices very close to me, or my lights might go off, but  
 

Sean Martin: I just got a different view of you, my friend. 
 

Nigel Cannings: But, you know, it's not just the things you expect to say; it's the things you don't expect to say, the accidental activations, all of which were being picked up and transcribed. And of course, again, it's just done so quietly. You know, as you say, you really think when you're in your own home that 
 

it's just you and the robot. I think that's the point, isn't it? You know, we think it's just us and the robot, [00:21:00] but actually, and it's a point that's been made many times before by people, we are the product. That's the problem, you know, and the fact is we just don't realize it. So our data, our private data, is fair game. 
 

And I suppose, just to hook back to the DocuSign thing, you know, I think where the line blurs there is: do you know what, if I use a free service like Gmail, you know, maybe I kind of expect that Google are going to run algorithms over it and serve me adverts. You know, I kind of get that. But when it's a service I'm paying for, and when it's a service where you expect a level of confidentiality, and that kind of goes the same for the Microsoft OpenAI service as well. 
 

You know, you're paying for this stuff. I think that you shouldn't be the product there. I think the money that [00:22:00] you're handing over ought to guarantee that. You know, you've actually become a consumer at that point, rather than the stuff that you're handing over being part of that transaction.  
 

Sean Martin: I want, I want to talk about the consumer. 
 

Um, so we've become the product if you're using that service, and DocuSign is, or I'm saying, yeah, we're the product; they're using us, right? Yeah. The consumer of us, of our data. Um, but one of the other comments here, and I'm always thinking about this as well: third-party risk. Um, I don't know, maybe you can comment if you can recall what it talks about in terms of who it shares that information with, because DocuSign and many other services like this are embedded in forms, legal processes, [00:23:00] custom apps, um, I mean, all over the place. 
 

And then, I don't know for certain, but I believe there's a marketplace where you can connect other stuff to DocuSign as well, not just embed it in other things. So there's a whole web there where, presumably, the insights are valuable to those other consumers. So they then, they use our product to create another product that they pass on. 
 

Yeah. So I don't know. I can only assume that that's probably happening. Um, but I don't know, and whether or not the terms and conditions say it is another thing.  
 

Nigel Cannings: Well, it's the Samsung problem. So the Samsung problem, you know, where, um, you know, they were uploading internal company data to OpenAI. 
 

They were using ChatGPT, and that data was getting [00:24:00] regurgitated elsewhere. I mean, this is the thing. So going back to the third-party consent problem. So yeah, absolutely, let's say that you and I are involved in a transaction, Sean, and, you know, we reach a price and we say, yeah, it's great. Thanks very much. And I put the document into DocuSign. So I have unilaterally decided that we will use DocuSign for that. 
 

Thanks very much. And, and I. Put the document into DocuSign. So I, I have unilaterally decided that we will use DocuSign as that. Effectively, I am consenting on your behalf for the data that you have got in there to be put into a machine learning model, which could then potentially be regurgitated because let's say it doesn't get anonymized quite as well as it should do. 
 

Perhaps, you know, Sean's name comes out because People, you know, it's been mistyped slightly. So someone's put an, a space between the S and the E. So the, uh, the NLP didn't pick up that [00:25:00] it was a name. So actually it's there. Um, you know, and that's the sort of stupid stuff that happens, you know, in, in, in some cases. 
 

You know, DocuSign would say, well, actually, before anyone uses the service, when it gets routed to you, you click on something saying that you consent to use it. But actually, you know, really, I forced you into doing that. I'm the one who's made you do that. So, yeah, there's a whole issue of me uploading your content to DocuSign for them to then use and regurgitate to someone else. 
 

Yeah, I mean, it's horrendous. Uh, I mean, they do say that it's only internal, you know, they're not sharing it with anyone else. So I just feel so much better about that. Great. That's right.  
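
The mistyped-name failure Nigel describes is easy to reproduce. A toy sketch, with a hypothetical known-names pattern standing in for a real NER model; production redaction systems fail the same way on perturbed names, just less visibly.

    import re

    # Toy redactor: matches known names exactly, as a stand-in for an NER model.
    KNOWN_NAMES = re.compile(r"\b(Sean Martin|Nigel Cannings)\b")

    def redact(text):
        return KNOWN_NAMES.sub("[REDACTED]", text)

    print(redact("Signed by Sean Martin on 1 March."))
    # -> Signed by [REDACTED] on 1 March.

    print(redact("Signed by S ean Martin on 1 March."))
    # -> Signed by S ean Martin on 1 March.  (one stray space defeats the pattern)

A statistical model is more robust than a regex, but the failure mode is the same: anything the recognizer misses flows straight into the training set.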
 

Sean Martin: So, um, I don't want to presume what, uh, what responsibility or liability [00:26:00] DocuSign might have. 
 

But, uh, what do you think needs to happen? I mean, we've talked about the FTC thing. Um, I didn't look closely at the timing of that compared to this, and the responses and things like that. But do you think that will have any teeth? And of course, that covers the U.S. Um, you touched briefly on, in the UK and the EU, just not using services that do these types of things. But how do you know, first off? And then, what's going to have teeth to bring visibility and action? 
 

Nigel Cannings: Well, I mean, what you have to hope is that something like this really grabs the attention of people. You know, the only thing that's really going to stop people doing this is where suddenly half of a company's [00:27:00] subscribers say, we're not doing this anymore. So, you know, we know that regulatory pressure takes a long time to ramp up. 
 

So, you know, people get a bit of a slap on the wrist. They get a couple of million dollars in fines. You know, it's like, yeah, you know, and it takes quite a long while. And we see this in Europe as well as in the U.S.: the big fines take a long time. So of course, in that time, someone's had first-mover advantage. 
 

It's this kind of, you know, move-fast-and-break-things attitude that comes out of California. Um, which is kind of great, you know, a lot of great companies have come out of that. But when you apply that mentality to, um, this type of process, which is a very structured, very rigid process, one that should be highly confidential, um, they're going to get away with a lot. 
 

So I think that it's customer pressure that's going to do it. And I think there's actually, um, I've got [00:28:00] an idea for a new product, which I will share with you here first, Sean. Um, and you and I can be partners in this one. So I'm actually seriously thinking that any company who uses cloud products, or any type of product, will be able to just use a service which will alert you when the terms and conditions are changed. 
 

And also, using generative AI, um, tell you what those changes are as well. Um, but I actually think, you know, there's a real supply chain issue here, because, you know, we sort of know what we're signing up for when we sign up for things. But if you look at all of the services that you use, Sean, and you know, we're here on a recording platform now, um, you use email, you use instant messaging, 
 

and we all use them: when was the last time you actually sat down and said, have those terms and conditions changed since we first signed up for that?  
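
Nigel's half-joking product idea is small enough to sketch. A minimal version in Python, assuming the third-party requests library and a hypothetical terms URL: hash the page on each run, compare with the stored hash, and flag changes for review. The generative-AI summary step he mentions would hang off the "changed" branch.

    import hashlib
    import pathlib
    import requests  # third-party: pip install requests

    TERMS_URL = "https://example.com/terms"  # hypothetical supplier terms page
    STATE = pathlib.Path("terms.sha256")     # where the last-seen hash lives

    def check_terms():
        body = requests.get(TERMS_URL, timeout=30).text
        digest = hashlib.sha256(body.encode("utf-8")).hexdigest()
        previous = STATE.read_text().strip() if STATE.exists() else None
        if previous is None:
            print("first run: baseline stored")
        elif digest != previous:
            # A real service would diff the stored and current texts and,
            # as Nigel suggests, have an LLM summarize what changed
            # before alerting a human.
            print("terms changed since last check: review the diff")
        else:
            print("no change")
        STATE.write_text(digest)

    if __name__ == "__main__":
        check_terms()

Run it on a schedule per supplier; a hash mismatch is the signal that someone quietly edited the terms since you last agreed to them.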
 

Sean Martin: So for the [00:29:00] silent ones, never. For the kind providers that send me an email that says, we changed, thank you for recognizing that you'll be using the service under these new terms, um, which I never read. Um, and then there's the, it's funny, I don't know why, maybe it's not funny, but, uh, the only ones that 
 

catch my attention are from publications that say, we're updating the terms, we haven't seen you engage in a while, we're going to disable your account. Of course, um, most of those are just, uh, free subscriptions, not ones I'm paying for, but yeah. Those ones trigger me to think, oh, I might go look and see, and at least engage so I don't get disabled. 
 

But for ones that are purely terms and conditions, I probably don't care. And if I do, [00:30:00] it's, just to your point: how do I know? How do I know, and how do they change? More importantly, because all of that stuff is, you know, legal mumbo jumbo as well, what's the real impact to me, to the people who interact with me, in this case, people that do contracts with us and whatnot, um, contracts we do with other organizations, right? 
 

Things like that. 'Cause to your point, their customer base, uh, needs to speak up. There are two customers, right? There's the one that holds the subscription, that sends the envelope, and then there's the poor sucker on the other end, the one that's about to get a business deal, that says, man, I'm not going to sign that contract. But then they're not going to want to lose that business deal just because the signature comes from that service. 
 

It's going to be a hard nut to crack from that perspective, but, I [00:31:00] don't know, maybe something will happen that way.  
 

Nigel Cannings: Yeah. Well, I think, you know, it's this type of outreach, Sean, and thanks very much for having me on, because, you know, it's only by talking about these things and exposing them. You know, yeah, I did my funny little video on LinkedIn, and it's just gone mad. 
 

I mean, I've done almost a hundred of these videos now, just walking along, talking about different subjects, often in and around the AI space. And I picked on this because it had a really nice crossover for me between my kind of legal background and my AI background. It was the two kind of clashing together. 
 

Um, and this is my most-viewed video ever, because I think it's really hit a nerve. You know, people are saying, hang on, um, my data is my data, and when I've paid you... You know, it's obvious if I send data to ChatGPT, everyone knows that it's going to get used for training. You know, yes, there's an opt-out, but, you know, realistically, if you do that, that's fine. 
 

Um, but if I send it somewhere confidential... And it has, honestly, it's made me think: now I am genuinely going to start going around and checking up on the terms and conditions of all the things that I'm using. You know, I pay for Slack, for example. I bet you there's something deep in Slack. I mean, Slack, forgive me, please. 
 

I've not checked yet, so don't take this as read, but I bet you there's something in there which says, oh yeah, and by the way, we can take all your chats and use them to train our machine learning models. Um, I think Zoom did something fairly similar as well, um, a little while back. Again, they changed the Ts and Cs, um, to allow the use of your data. 
 

Um, this was their AI Companion product. So I switched it off, because I realized that actually this data was going to places that you didn't [00:33:00] realize, to be processed by people you didn't know, for reasons that you weren't really sure about. Um, so, you know, in Europe, we've got a bit more protection, you know, theoretically. Um, if your personal data is being processed, you're supposed to give informed consent for it. In practice, particularly if you're in a business, because you're subject to the fact it's done by the company, 
 

uh, the company has certain rights vis-à-vis your data; it's a bit more opaque. Um, it's nice to see in the U.S. there have been a lot of moves towards greater privacy. I mean, we have seen, um, you know, from the top down, from executive level, that there's much, much more focus on trying to, um, preserve people's privacy and think more about it. 
 

Whether, if there's a change in administration, you'll still have that same kind of momentum behind it is tough to know. But I think, you know, consumers... ultimately, big [00:34:00] companies only care about, you know, as we say over here, the pound in your pocket; so, you know, the dollar in your wallet. 
 

Um, you know, that's what people care about. And so consumer education, customer education, supply chain education, that is really the important thing here.  
 

Sean Martin: Yeah. And taking a moment to pause, 'cause you've mentioned dollar in the wallet, pound in the pocket, um, and you mentioned Zoom and its recording. 
 

And I know there are countless organizations using other call recording services that transcribe. And I'm just thinking about the conversations between security vendors and the security leaders that the vendors are selling to: talking about the change in their environment, the challenges they're having, the breach they just had, the protections they need. There's [00:35:00] a lot of interesting stuff in those calls. And I'm sure the terms and conditions, when somebody bought that call recording service that's going to transcribe the whole thing, 
 

probably said, we're going to use AI for this stuff; of course, there's an ability to transcribe it. Um, so the terms and conditions didn't change, but people are still accepting that we're doing that. So I think at a personal level, we have to figure out when and where this is appropriate, and at least make a conscious decision that it's okay. I mean, I think we all kind of blindly 
 

I mean, we all, I think we all kind of blindly. Click the yes, it's okay that this call is being recorded. This one in particular.  
 

Nigel Cannings: Yeah. Well, this one seems fair enough. If we didn't record this one, it would be somewhat pointless, I think. But yeah. And, yeah, it's what I do for my day job. 
 

I mean, we, um, transcribe [00:36:00] recordings of meetings and things like that, but we do it with the express knowledge of the people who we're doing it for. You know, it's absolutely, it's done for regulatory purposes. I mean, the regulator insists it's done. Um, but yeah, for a lot of us, you do go on a sales call now and, you know, the helpful bot is there, you know, the bot of the company who you're calling, and it's like, what the hell, you know, um, you know, because  
 

Sean Martin: I know, some meetings, I've sent invites, and I don't get a response from the person. I get a response from the recording bot. 
 

I send, I don't get a response from the person. I get a response from the recording, but. They're going to be there before the, before the actual person.  
 

Nigel Cannings: Yeah, exactly. And the fact is, you don't know where that's going. And it is, it's very difficult. Um, I always tell people if I'm recording a call. So we do, uh, we do a lot of call recording; I send emails out to people, I say, you know, we're intending to record this conversation, you let me know if you didn't want to, um, you know, very explicit about it. But to turn up on a call and the bot's there, and [00:37:00] you're like, where's this data going? 
 

So we, we do, uh, we do a lot of call recording. I send emails out to people. I say, you know, we're intending to record this conversation. You let me know if you didn't want to, um, you know, pretty, very explicit about it, but to turn off on a call and the bots there. And [00:37:00] you're like. Where's this data going? 
 

Who's going to see it? There's going to be a massive data breach at one of these companies. Um, and all sorts of very personal and private, um, data will be exposed. And, and I, and I say that as an absolute, and, you know, you as a cyber security professional know that these breaches. Are inevitable. It's not a question of if it's just a question of when, and if it's not properly protected, if it's not, you know, immediate, if it's not stored securely, if it's not, um, encrypted properly, this stuff just gets out there. 
 

Um, and it is, it's, it's scary stuff that we're allowing it to happen. I mean, it's even worse than when we actually click the accept button and say, it's all right for you to do it.  
 

Sean Martin: Exactly. So, at the end of the day, we're all people on the web. Somebody said that the other day: we're all people on the web. 
 

For me, this [00:38:00] should raise a flag on the personal level, uh, to take notice and take inventory of what's going on, to your point, um, earlier. I think for organizations that are building products and services: I beg you, please don't do it just because you can. Please don't try to ignore that you're doing it, and that we've become the product. 
 

Don't try to sneak it into the terms and conditions. And if you are doing it, don't try to sneak it by leaving it out of the terms and conditions either. Um, so I think we've talked a lot about those two audiences. My core audience also fits those audiences, many of them also build products and are people working for companies, but my core audience is cybersecurity leaders and business owners and security practitioners. 
 

And I'd like your take on this as well, but my advice to them is: [00:39:00] this is something that they probably need to look at. Uh, probably it's a third-party risk assessment. Maybe they need to extend that program to look at the terms and conditions of the providers their organizations are using, um, and the types of data 
 

that's going into those services. Um, well, knowing what data is going into these services is probably first, but I think they probably do that already. But then: what are those services doing with that data? And keeping abreast of the terms and conditions there. Any other points for that group, do you think, Nigel? 
 

Nigel Cannings: Um, I mean, I think, going back to what you said before, you know, for themselves not to be tempted to go down that route. Um, you know, I think that your audience is one of the most risk-averse, and quite rightly so, audiences that anyone could ever address, and I don't think any of them would be thinking about it. 
 

Um, but the key thing there, I think, your last point, was not just checking it when you sign up, but making sure that you have a regular [00:40:00] review process as part of your supplier management to ensure that, um, these things are not happening. And actually, even kind of stupid stuff, just like staying on top of people's blogs, because actually, you know, I pick up a lot of stuff just because a researcher at such-and-such a company has released a paper or has written a blog post, and you're like, hang on a second. 
 

What they're describing is not what I would expect. So I think it's not just the terms and conditions; it's a holistic look at your supplier. Um, and, you know, even job ads. Do you know what, Sean, job ads tell you more about what a company's intentions are than almost anything else. 
 

Interesting. You know, if you see a job ad from the company whose data you're giving, uh, talking about, you know, someone who's an OpenAI specialist, or someone who's got experience in using APIs for, you know... run fast in the other direction.  
 

Sean Martin: It's, we have 10 reqs for, uh, [00:41:00] data scientists. 
 

Nigel Cannings: Yeah, exactly. It's just like, quick, that way.  
 

Sean Martin: Interesting point, Nigel. Interesting point. Alright, well, I'm glad you brought this to our attention. 
 

My goal isn't to shame any organization in particular. I think we need to point it out, though. Um, I'd very much welcome having a representative on the show to describe what's going on, what they're doing. Um, so if you're listening and you want to share, I'd be happy to have you on and bring that perspective to bear as well. 
 

Um, but for me, my whole point, yes, I care about society, I care about us as people, my whole point is for the security leaders and organizations that are bringing these products to market and have a responsibility to their customers, and that are using these products and also have [00:42:00] a responsibility to their company 
 

and their customers. So, um, there are some things to think about because of this. And I'll let somebody else deal with the FTC; I'm not interested in that, but I am interested in the outcome, whatever it is. But, uh, I'm not going to head down that path. But anyway, Nigel, it's been, uh, fantastic. 
 

Any, anything else you want to add before we wrap?  
 

Nigel Cannings: No, sure, just thanks again for having me on. Um, it's always a great pleasure to do this. I mean, these are really important issues, and I think, as I say, your audience are, you know, the type of people who will take this very seriously. Um, and so, you know, I'm delighted with that. 
 

So again, thanks for having me on.  
 

Sean Martin: Thank you. And thanks, everybody, for listening and watching. I'll of course include the, uh, the link to the LinkedIn post, which is full of resources and interesting comments from throughout the thread, and, [00:43:00] and a bonus TikTok-like video, Nigel. 
 

So keep in touch with Nigel there too; he's doing some good stuff. Um, so thanks, Nigel. Keep well. Uh, we'll hope to see you soon in London when we're out there, and everybody else, when Marco and I go on location to Infosecurity Europe in June. So we'll hope to see everybody there for that as well. 
 

Please subscribe, share with your friends and enemies and we'll catch you on the next one.