SPONSOR 0:02
We’re proud to announce CISO Stories – a new podcast series in partnership with Cybersecurity Collaborative and Cybereason. CISO Stories features the candid perspectives and experiences of frontline senior security executives and dives deep into timely security topics. CISO Stories is hosted by Todd Fitzgerald, VP of Cybersecurity Strategy at Cybersecurity Collaborative, and Sam Curry, Chief Product and Security Officer at Cybereason. Listen weekly as they speak with extraordinary CISOs by visiting securityweekly.com/CSP.
Jeff Man 0:34
And welcome back to Security and Compliance Weekly. Hey, if you want to stay in the loop on all things Security Weekly, visit securityweekly.com/subscribe to subscribe on your favorite podcast catcher or on our YouTube channel. You can also sign up for our mailing list, join our Discord server, or follow us on our newest live streaming platform, which is a brand new platform in Bedrock, and it’s called Twitch. Anyway, you have to be a Flintstones fan to get that. I’m old. Also, our next technical training will be on May 6 at 11am EST, and it will be exploring common misconfigurations of Nginx, the damage they could do, and how to avoid them. Next up, see how attackers can gain access to endpoints and learn defensive strategies to protect against those attacks in our May 13 technical training, also at 11am EST. You can sign up for both of those by visiting securityweekly.com/webcasts and registering. And of course, if you’ve missed any of our previously recorded webcasts or technical trainings, you can find them at securityweekly.com/ondemand.
Alright, let’s get back into this whole data privacy versus security discussion. I do want to make sure that we get to the who, what, where, when, why, and how part of the discussion, and focus, you know, on what the ramifications are of the new Virginia law in the opinion of our esteemed guest, Mr. Chris Pin. So Chris, I guess the “who” is the first question. For GDPR and for CCPA, probably most people are somewhat familiar. But as far as the new Virginia law goes, who should be worrying about this?
Chris Pin 2:21
So I won’t talk about the exemptions, because quite frankly, the way they’ve been written today is not going to stay that way; it will be amended, and the exemptions are going to flex. But currently, in order to be in scope for the Virginia law, it’s any organization that processes data on at least 100,000 Virginia residents, or that processes data on 25,000 Virginia residents with more than 50% of company revenue generated by the sale or sharing of personal data. And there’s no threshold as far as income or revenue goes on this. So if you’re processing 25,000 residents’ data and you make $10 a year, and $5 of that comes from the sharing of information, then you’re subject to this law as it’s written today.
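The scoping test Chris describes can be sketched as a simple predicate. This is a hypothetical illustration only: the function name and parameters are invented, and the statute's actual definitions of "consumer," "processing," and "gross revenue" are more nuanced than this.

```python
def in_scope_va_cdpa(residents_processed: int,
                     total_revenue: float,
                     data_sale_revenue: float) -> bool:
    """Rough sketch of the applicability thresholds as described:
    100,000+ Virginia residents, OR 25,000+ residents with more than
    50% of revenue from the sale of personal data. Note there is no
    minimum-revenue floor on the second prong."""
    if residents_processed >= 100_000:
        return True
    if residents_processed >= 25_000 and total_revenue > 0:
        # Chris's "$5 of $10" example sits right at the 50% line;
        # a strict "more than 50%" reading would make it borderline.
        return data_sale_revenue / total_revenue > 0.5
    return False

print(in_scope_va_cdpa(120_000, 0.0, 0.0))   # True  (first prong, no revenue needed)
print(in_scope_va_cdpa(25_000, 10.0, 6.0))   # True  (60% of revenue from data sales)
print(in_scope_va_cdpa(25_000, 10.0, 4.0))   # False (only 40%)
```

The point of the sketch is the one Chris makes: the second prong has no revenue floor at all, only a proportion test.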
Josh Marpet 3:21
Okay, okay. I’ve been predicting for years that we’re going to see data laundering companies rise up because of GDPR, and because of CCPA, and now because of this. So your point is: if you make $5 on the data collection, data dissemination, data processing, whatever, you’re subject to this. Am I correct?
Chris Pin 3:41
Absolutely.
Josh Marpet 3:42
Okay, so what if I don’t make any money, but that little firm over there — that happens to have the exact same board of directors and happens to be owned by the same people, and that I just gave bushels of data to start them off — they make money, and I just buy it back. I buy a license to use the data or something like that. What do you think?
Chris Pin 4:02
So that goes into what the law calls a processor: you would be processing data on behalf of the organization that uses the data to make money. So you’d have to have a binding contract between you and the organization you’re selling to.
Josh Marpet 4:17
I mean, I own both companies. So I can just sign that’s no problem. I’ve got a, I’ve got a contract. We’re good, man. It’s cool.
Scott Lyons 4:26
No, it all comes down to liability and where that lies.
Josh Marpet 4:29
Right there. Exactly. And that’s where I’m going with this.
Jeff Man 4:31
And you’re small, so you can self-assess. You can self-attest.
Chris Pin 4:36
Until the Attorney General steps in — the AG is the enforcement body for these regulations — and says, nope, you’ve got to pay us $7,500 per person who’s impacted by this.
Scott Lyons 4:46
But we transfer
Josh Marpet 4:47
A small company would have to pay for that.
Scott Lyons 4:48
Hold on, hold on, hold on, timeout. Let’s back up here for a second. Josh, in your example, how’s the transfer of data happening?
Josh Marpet 4:56
Well, let’s assume the most innocuous possible way — that I’m smart, and I set the small company up that way from day one. I’ll give you an example, and this is a totally legit example. Bowflex is the exercise equipment company, right? Except it’s actually two companies: they have a lead generation company, which is the one you see with all the advertisements on TV — and this is an old example, I don’t know if this is still true — that goes out and gets the leads, and then sells them to Bowflex, the exercise equipment company. So they each have different goals, but they’re very focused on their particular goals. So let’s assume that I decided to set up a company to do things with data, but I set up a second company at the exact same time. The small company, the data company, has been collecting data from day one, and it’s been selling it to the larger company, or leasing it or licensing it or whatever. But the small company has the breach? Well, I mean, that’s fine. I don’t care. I’ll set up another one.
Chris Pin 5:58
Yeah, you’re still uh…
Scott Lyons 6:01
If I may — that really seems like you’d be able to pierce the corporate veil on that.
Josh Marpet 6:05
That’s not piercing the corporate veil. That is going from company to company, not person to person.
Scott Lyons 6:10
Same company, same owners, dealing with the same data being resold. You know what we need to do? We need to save this question for Priya and ask her: what is the ability to pierce the corporate veil when somebody has two companies that they set up to exchange data between, where one company is making money in one way and the other company is making money in another way? Hmm.
Josh Marpet 6:36
Well, what if they don’t even have the same interlocking directorates or whatever? I mean, there’s a lot of different variables here. But the point is, and Chris..
Scott Lyons 6:44
Yeah, but the directors would be the same, because you said it yourself in your example. Well, no, hold on. You said it in your example: you’d be the owner of both companies. So..
Josh Marpet 6:44
Yeah, yeah… I said it in the example. But it doesn’t have to be. For example, uh, you know, there’s another one: there’s a large chemical company that actually took all of its lawsuits, threw them off into a spun-off company, and basically said goodbye. Okay. I mean..
Scott Lyons 7:08
What if you were to throw a nonprofit into that mix?
Josh Marpet 7:13
I don’t know. I mean, there’s a lot of — again, lots of variables, right. So Chris, what do you think about the idea of — and I’m calling it data laundering, there’s probably a better term — but, you know, data obfuscation, data laundering. What do you think of that?
Chris Pin 7:26
Data obfuscation? Right, we’ll talk about that later — that’s one of the safeguards. But data laundering — manipulating and selling data in ways that try to get around the laws, essentially — that’s going to show up quickly. The first time an AG comes in, because they get alerted to this type of activity, they’re gonna throw the book at you, because not only are you not following the law, you’re taking measures to try to avoid the law. You know, think about tax evasion, right? If I create the shell for that shell…
Josh Marpet 8:03
You know, that thing that Apple and all those large companies do really successfully? I’m not saying they have…
Scott Lyons 8:09
Yes, yes — and that other companies have tried when they’ve spun off other businesses to try to hide their activities. Hmm.
Jeff Man 8:17
So let me throw in a couple questions that came up in the Discord. And I think at least the second one, in part, ties into what was just being said. Maybe they both do. The first question is a little bit long, so let me just read it: it seems to me most of the privacy laws are enacted to protect against companies hoovering up consumers’ personal data, which is understandable. But the vast majority of spending I see in the corporate world is on trying to protect customer and employee data — and in my humble opinion, employees usually already have an explicit agreement for data collection, and customers are mostly corporate identities, not really individuals. Is that an overreaction?
Chris Pin 8:55
So depending on the organization, its customer might be a company, right? But if you think about a retailer — the Walmarts, the Amazons, the Costcos of the world — their customers are you and I. We walk in, we buy things, and they track us from the moment we walk in the store. Think about a casino, right? They’re not the only ones that watch biometrics and thermals and all these other things. A lot of retailers are doing that these days, too, especially with the pandemic. A lot of companies have now incorporated thermals into their stores. You didn’t consent to that when you walked in; there was no signage stating that they’re using this technology. And because of that, there’s no purpose attached to it, right, inside of the company. They know that, okay, we’re going to use it because of COVID, right? We want to make sure that, you know, it’s proximity tracing, or, you know, seeing..
Josh Marpet 9:57
Contact tracing.
Chris Pin 9:58
Yeah, contact tracing, there you go — seeing who has been impacted because you walked into the store, and maybe you had a fever that day. So I think the law is twofold. One, it wants to ensure companies are doing the right thing to protect the consumer side of the house. It’s also setting bounds so that everybody — you know, the everyday Joe — can say, I know what this company is supposed to be doing with my data. And you should have some sort of — I don’t want to say comfort, but some level of expectation that the company is going to do just that. Prior to these laws being created, it was really the wild wild west. You sign up for a Facebook account back in 2015, they can do whatever they want with that data. They can sell it to 50,000 different companies at $1 apiece — they just got $50,000 for your Facebook account that didn’t cost you anything. And that’s really what they’re getting at now. What happens with the 50,000 companies they sold your data to? All those companies have to make that dollar back, so they sell it to 100 other companies, and it gets out of hand. You end up with the robocallers texting your phone, you end up with all these marketing emails — you sign up for Facebook, and now you’re getting ads about Chinese food, right? There’s nothing relative to it. I think Jeff talked about it earlier, right? You were speaking with somebody, and all of a sudden advertisements for something started popping up. That’s really what they’re looking to protect against — avoiding those types of situations.
So, to help coalesce a lot of that together — you’re saying a good deal of this has to do with business ethics, right?
Ethics and purpose. Like, why did you collect my data in the first place? And the usage of my data should be limited to just that purpose. If you want to go above and beyond that, then you need to get my consent or my acceptance for that type of activity to happen.
Jeff Man 12:08
So let me ask a very basic, pointed question, because I haven’t read the law — I haven’t read any of the laws that we’re talking about, I don’t think. Is the law…
Josh Marpet 12:18
Wait a minute ..you just admitted you haven’t read PCI?
Jeff Man 12:20
PCI is not a law. Hello.
Scott Lyons 12:22
It’s not a law? Try again.
Josh Marpet 12:24
It absolutely is a law too, Jeff, come on.
Scott Lyons 12:28
Yeah, but it’s not a law on the books, as in traditional law, like CCPA or CDPA.
Jeff Man 12:33
If y’all don’t shut up, I’m going to forget my question! I’m an old man.
I already forget.
Scott Lyons 12:40
Stop shaking your cane at us!
Jeff Man 12:42
Is the nature of the Virginia law — let’s stick to the law at hand — designed to protect against data theft and data loss from these companies, making sure that they’re protecting the data? Or is it more focused on the responsible, appropriate, ethical, legal use of data by said companies?
Chris Pin 13:03
So there are requirements in there that the company must, you know, do its best. We all know it’s not realistic to say a company shouldn’t have a data breach — or, I mean, we can all say that, but in legal terms, you can’t come out and say that you can’t be breached, right? So there are requirements that you have to provide X, Y, or Z type of control in order to best ensure, you know, the security and handling of the data — to try your best to ensure there is no breach. But in the case that there is a breach, just like with PCI, there’s a timeline you have to follow. You have to, you know, alert people in 30 days, 45 days. And you have to also have a plan, right? What’s your response plan? What’s your action plan? What are the next twelve steps you’re taking because the breach happened? And then, when you’ve identified how the breach happened, how are you going to fix it? All this comes into question. So yeah, it’s absolutely applicable, if that’s where you’re going with it, Jeff?
Jeff Man 14:10
Well, I was trying to ask a simple question and it got too complicated. Is the law focused on, you know, making sure companies protect the data in a secure manner — similar to something like PCI, which is “make sure this data doesn’t get stolen,” or “try to prevent this data from getting stolen”? Or to what degree is the law focused on actually trying to dictate the appropriate use of all this data that said company might be collecting?
Chris Pin 14:38
Yeah, so with these privacy laws — and this goes across the board for GDPR, CCPA, and now CDPA — because these laws are written by legal folks, they don’t necessarily know the difference. And I’m speaking generally here; I’m sure there are some people out there that do. But generally speaking, your average lawyer is not going to understand the difference between obfuscation or pseudonymisation or encryption or masking — you start hearing all these different terms thrown around. Even if I just say the word labeling, right — even in your head, Jeff, you probably just went to ten different things. Labeling could be this, this, this, this — it all depends on who you’re talking to, right? So what you see in these laws is that appropriate safeguards must be used. That word “appropriate” is going to vary from organization to organization, because the objective of these laws isn’t to have your company go out and purchase some new multimillion-dollar security stack to protect the data. That’s not where they’re going with it. But — because we’re all familiar with it — if you’re an organization that’s subject to PCI, and you’re meeting the DSS year after year, that means you must have things in place like FIM and DLP, and monitoring, and logging. All these types of things must be in place, right? You’ve got two-factor, you’ve got secured network zones — you have all these things that are there. Why aren’t you using those for your other data sets that aren’t financial? Why aren’t you using those for your marketing database? Why aren’t you using those for your HR database? That’s where these laws are focused. We’ve been so honed in on just looking at your credit card that we’ve forgotten about everything else that makes you a person.
Jeff Man 16:24
Okay, so what I’m hearing you say is the law is primarily focused on data loss, data theft — you know, protecting the data — without going so much into the details of what the company might be using the data for. I’m hearing you say there’s an assumption that, as a company, you’re going to be collecting some set of data from your customers, so just protect it. And they’re attempting to expand beyond things that are specifically cited, like your credit and debit card information, because that’s already covered under PCI. Is that a fair conclusion?
Chris Pin 16:59
Yeah, they want everything that could potentially identify a person or household to be adequately protected, but also for organizations to be 100% transparent about their intended purpose for that data collection and data usage.
Jeff Man 17:13
So, to what degree is the law dictating to companies a strategy, I guess, for protecting the data? Or to what degree are they responsible for protecting the data from being identified with a particular person or household — I think you had said that earlier — versus just, sort of, the general collection of data? Is my question too vague, maybe?
Chris Pin 17:48
Possibly. But I think where you’re going with that is: should the company protect everything, or just what could be tied back to an individual? Right? I think that’s kind of, I guess..
Jeff Man 17:58
My question is: is a way to solve the problem just to anonymize the data? If the company collects all this data and focuses — like the Anthem example I used earlier — if they focus their efforts on separating all the data they collect from the actual names of their customers, so that the data, if it’s compromised, can’t be correlated back to individuals — are they done? Or is there something above and beyond or besides that?
Chris Pin 18:28
So there was an MIT study done, and they looked into how you de-identify — how do you properly de-identify somebody’s data? Just removing the names isn’t enough, right? Removing addresses isn’t enough. And this is what the MIT study found: in order to truly anonymize somebody, you would have to remove so much data that the data that remains is not worth the cost of the storage to store it. So it’s very unlikely that you could properly anonymize somebody and still get value from the data. Which poses the question, why…
Jeff Man 19:18
What’s that Josh?
Josh Marpet 19:19
Properly anonymizing data these days is God-awfully unpleasant. You know, you can leave things like a state or zip code or something, but if you even have a birthdate — I think you can get 96% of the US with a zip code and a birth date. So it’s not easy.
Chris Pin 19:41
If I have a job title and a salary range, I can get within 75% of who it is. And depending on that title — if it’s CEO and it gives me a salary range, I could probably get within 98% of who that person is.
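The re-identification risk Josh and Chris are describing — a handful of quasi-identifiers uniquely pinning down a person even with the name removed — can be measured with a k-anonymity-style count. The records and field names below are invented toy data, just to make the mechanics concrete:

```python
from collections import Counter

# Toy records: even with the "name" column dropped, the combination
# of quasi-identifiers (zip, birthdate, job_title) can be unique.
records = [
    {"name": "A", "zip": "22182", "birthdate": "1970-01-01", "job_title": "CEO"},
    {"name": "B", "zip": "22182", "birthdate": "1985-06-12", "job_title": "Analyst"},
    {"name": "C", "zip": "22182", "birthdate": "1985-06-12", "job_title": "Analyst"},
]

def k_anonymity(rows, quasi_ids):
    """Smallest equivalence-class size over the quasi-identifier columns.
    k == 1 means at least one person is unique, i.e. re-identifiable."""
    counts = Counter(tuple(r[q] for q in quasi_ids) for r in rows)
    return min(counts.values())

# Dropping the name column alone is not anonymization:
print(k_anonymity(records, ["zip", "birthdate", "job_title"]))  # 1 -> the CEO is unique
print(k_anonymity(records, ["zip"]))                            # 3 -> safer, but far less useful
```

This is the trade-off the MIT study points at: pushing k up means stripping or coarsening columns until the remaining data loses most of its analytic value.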
Jeff Man 19:54
It almost sounds like cryptanalysis, you know — it’s not the direct correlation between the data set and the name, it’s all the associated data that together can sort of point you towards the solution, as it were.
Scott Lyons 20:12
That’s exactly what I was saying earlier with the concatenation piece. You know, here on SCW we’ve said it before, and we’re gonna say it again — to your point that lawyers are the ones who are writing these policies. There are two ways to handle dealing with this type of thing: you can either do it through the courts, or you can do it through legislation. And we’re now starting to see that a lot of this is being derived through the courts, where a lot of people don’t understand the precedents that are being set and how that really affects them via the legislation. Would you agree?
Chris Pin 20:51
Absolutely. That’s oftentimes the breakdown, and that’s where I have to help a lot of organizations out. Even on my customer calls — pick a company, right? Maybe I’m only speaking with them, or rather, I’m talking to the security person, which I oftentimes do — they don’t understand the wording of the law. Right?
Josh Marpet 21:08
[Sarcastically] No!!!!
Chris Pin 21:08
Or a privacy person — a lawyer, an analyst — calls me up. They understand the law, but then when I say things like obfuscation, it goes right over their head. They’re like, well, I don’t know — does that meet the parameters? It absolutely does, and this is why. And so I have to get into showing them examples, right? Here’s data on the A-side, here’s data on the B-side; when we’re done, it’s gonna look like this. So it doesn’t matter who I’m talking to — you’re in some sort of a training session, because there is that persistent gap between IT and legal. GDPR kind of set the precedent that that gap needs to be overcome, and that’s when people like myself started to exist: to fit in and help lawyers understand IT, and help IT understand legal speak. Although, you know, the lawyer may say encrypt everything — we all know that’s not a good strategy, for so many different reasons.
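The before-and-after demonstration Chris mentions — "here's data on the A-side, here's data on the B-side" — can be sketched for two common techniques, irreversible masking and keyed pseudonymization. This is a toy illustration under assumed names; real deployments use vetted tokenization products and keys managed in a KMS, not a hard-coded constant:

```python
import hmac
import hashlib

SECRET_KEY = b"example-only-key"  # hypothetical; never hard-code a real key

def mask_email(email: str) -> str:
    """Irreversible masking: keep just enough shape for support/debugging."""
    local, _, domain = email.partition("@")
    return local[0] + "***@" + domain

def pseudonymize(value: str) -> str:
    """Keyed pseudonym via HMAC: a stable token that only the key holder
    can regenerate -- roughly 'pseudonymisation' in GDPR terms."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

# A-side (raw) vs. B-side (what analysts or downstream processors see):
print(mask_email("jane.doe@example.com"))    # j***@example.com
print(pseudonymize("jane.doe@example.com"))  # stable 16-hex-char token
```

The design difference matters legally: masking destroys the linkage entirely, while a keyed pseudonym stays re-linkable by whoever controls the key, which is why GDPR still treats pseudonymized data as personal data.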
Scott Lyons 22:09
Well, why wouldn’t it be? Because there are compliance regimes that say encrypt data at rest, encrypt data in transit. Why can’t we encrypt all of the things?
Chris Pin 22:19
Sure, but encrypt what data at rest? Encrypt what data in transit, right?
Scott Lyons 22:27
The way the standard is written, it’s not prescriptive — or, I’m sorry, it’s not descriptive of what data needs to be encrypted. It just says data at rest, data in transit, encrypt both. So is it really fair to ask, well, what data do we encrypt, what data don’t we encrypt? Whereas if you get hit by ransomware, right — how do you make that distinction?
Chris Pin 22:51
Well, it goes down to the sensitivity of the data, right? I think Jeff’s trying.. is that Jeff trying to jump in?
Jeff Man 22:56
Well, yeah, I’m trying to jump in because Scott is more or less citing PCI. And my initial comment is yes, PCI says protect the data at rest and protect the data in flight, but those are two sub-requirements out of the whole mass of 400-some-odd requirements. The point being, those are two important steps to take for where data is being processed, but there are so many other things that you have to do. I think the problem, Scott, is that the masses don’t really understand how encryption works, and what the limitations of encryption — or obfuscation, to use the more generic term — are. They see it as a Holy Grail, a silver bullet: well, if we just encrypt the data, we’re done, because nobody can read it. It goes back to Bruce Schneier — and I’ll throw him under the bus here in a loving sort of way. When he wrote his original book, Applied Cryptography, he more or less said, why is security such a big deal — computer security, Internet security — all you have to do is encrypt the data and you’re done. And I remember reading that as an employee of a certain, you know, three-letter agency, and a certified cryptologist, thinking, wow, this guy doesn’t know what he’s talking about, because that’s only one small piece of the problem. It continues to be one small piece of the problem. But I think, especially — and I don’t want to throw anybody under the bus — companies that are touting encryption solutions, I think they’re relying on people thinking, wow, if we just encrypt it, we’re done, it solves all our problems.
Scott Lyons 24:43
Yeah, but Jeff, I’ve got a major problem with that, because HIPAA is also under the prescription of in-flight and at-rest encryption. Even though it may not say that verbatim in HIPAA, HHS has come out and said yes, encryption in flight and yes, encryption at rest for all data is required. So it’s not just PCI here.
Chris Pin 25:06
But let me come back to you, though, and say in both of those cases, right — when you say data in the PCI lens, it’s talking, you know, payment card data, largely; when you say data in the HIPAA lens, it’s talking about your healthcare data that’s not for clinical research purposes. If it doesn’t meet one of those two qualifications, then it’s not HIPAA, it’s something else. If it is clinical research data — and I saw this actually at an insurance provider — it goes into a data lake, it’s not protected, and it’s used for science and analytics and judging next year’s rates and all this kind of stuff. It’s not protected to the same means that your healthcare data is, even though it is healthcare data, because it wasn’t used for that intended purpose.
Jeff Man 25:55
Well, let me throw this out, just for the benefit of our audience, because I don’t want to assume that people understand what I’m getting at. I want to be very specific.
There’s data at rest — that means it’s being stored on a hard drive somewhere, in some sort of file or folder, whether it’s a spreadsheet or a database or a flat file. It’s stored somewhere on a drive, virtual or otherwise.
There’s data in flight — it’s being transmitted over the network, over the internet, over some form of networking communications. But there’s at least one other place where that data exists, and that’s in memory, when it’s being processed. And “processed” could mean the act of encrypting or decrypting the data, but it’s happening in memory — it’s happening on the stack. And there’s been a plethora — there’s another SAT word for, hopefully, some people today — there’s been a plethora of PCI breaches where, largely, the bad guys figured out, hey, the data is in memory. We don’t have to steal it from the database, because it’s been encrypted. We don’t have to try to intercept it in flight, because it’s encrypted. But if we can attack the memory — attack the stack — we can scrape it all out of there, because it’s all there in plain text. So memory-scraping malware, which was at the heart of breaches like Home Depot and Target, you know, several years ago, and has been happening ever since — they’re going after the data that isn’t specifically addressed, although you can argue that it is, in any of the PCI requirements or the HIPAA requirements. So all I’m trying to say is there’s more places to find the data than where it’s being transmitted or where it’s being stored. It’s when you’re processing it or doing something with it at an application level, which generally means it’s happening in memory. So I just wanted to make that distinction.
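Jeff's point — encrypted at rest, encrypted in flight, but plaintext in memory while being processed — can be sketched in a few lines. The "cipher" below is a deliberately insecure XOR stand-in for real encryption (the card number and key are invented); the only thing it's meant to show is where the plaintext lives:

```python
import hashlib

def toy_cipher(data: bytes, key: bytes) -> bytes:
    """Stand-in for real encryption (XOR keystream) -- NOT secure.
    Applying it twice with the same key round-trips the data."""
    stream = hashlib.sha256(key).digest()
    ks = (stream * (len(data) // len(stream) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, ks))

pan = b"4111111111111111"              # hypothetical test card number
stored = toy_cipher(pan, b"disk-key")  # "encrypted at rest" on the drive
assert stored != pan                   # an attacker reading the disk sees ciphertext

# But to authorize a transaction, the application must decrypt, and the
# plaintext now sits in process memory -- exactly what memory-scraping
# malware (as in the Target and Home Depot breaches) harvested:
in_memory = toy_cipher(stored, b"disk-key")
print(in_memory)                       # b'4111111111111111'
```

The at-rest and in-transit requirements never touch that decrypted buffer, which is why the processing step is the soft spot Jeff describes.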
Chris Pin 27:59
And I think there’s a good reason why that hasn’t been addressed through PCI and other means, right, Jeff — why not encrypt the data in memory, too? I think the answer to that is simply that I don’t believe there’s a technology out there that can do that today and still allow the system to process it.
Josh Marpet 28:15
Well, no, there is homomorphic encryption, which actually does allow that, but it’s still iffy to a certain extent. Yeah.
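The property Josh is referring to — computing on ciphertexts without decrypting them — can be illustrated with textbook RSA, which happens to be multiplicatively homomorphic. This is a toy with tiny, completely insecure parameters, not how real homomorphic encryption is deployed; full FHE schemes are vastly heavier, which is part of the "iffy" performance story:

```python
# Textbook RSA with toy parameters (insecure, illustration only).
p, q = 61, 53
n = p * q                  # modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent (modular inverse, Python 3.8+)

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

a, b = 6, 7
# Multiply the *ciphertexts*: (a^e * b^e) mod n == (a*b)^e mod n,
# so the product decrypts to a*b without ever decrypting the inputs.
c_prod = (enc(a) * enc(b)) % n
print(dec(c_prod))   # 42
```

Real FHE schemes extend this idea to both addition and multiplication (and hence arbitrary circuits), at a large cost in ciphertext size and compute, which is why it isn't a drop-in answer to "encrypt the data in memory."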
Jeff Man 28:23
Don’t get me started on homomorphic encryption. That’s a whole… we’ll have a topical discussion on that someday.
Chris Pin 28:30
Okay.
Jeff Man 28:31
But, um, I lost my point. Go ahead, somebody else talk. Dang it.
Josh Marpet 28:37
You know, we’re talking on the Discord — Jim, for example, is bringing up that real-time data shouldn’t be encrypted because it can lead to delaying communications. So it’s not so much the safety and security; sometimes it’s about the money involved. I remember when I was dealing with a financial firm, a trading firm — they ran with no firewalls, no IDS, no IPS, no nothing, simply because of any delay. They literally ripped out, I think it was 150 switches at one point, because they were four nanoseconds slower than advertised and promised. And I said, I’m sorry — it’s your call, it’s fine, but why? And they said, every nanosecond — no joke — every nanosecond is about a million dollars every three minutes.
Chris Pin 29:17
I’ve seen the same analytics. You know, picture the retailer: in the e-commerce department, management looks at the milliseconds. What’s the average number of milliseconds for a consumer to get to the web page and have it load? How long does it take somebody to check out — when they clicked on the shopping cart, how long did it take for that cart to show up? And if there was even a two or three millisecond increase because some new security feature was enabled on the website, they would remove it. That’s got to go — that’s altering the shopping experience of my consumer, and it’s costing me money.
Jeff Man 29:51
Well, speaking of shopping experiences — and this is an analogy, or an example, that hopefully most people can relate to — when you go to a brick and mortar store, which we used to do in the old days and we’re starting to do again, and you go to the checkout lines…
Josh Marpet 30:06
What are these brick things you speak of?
Jeff Man 30:08
Bricks — what are those? You know, nowadays, over the last couple of years, more than likely we have a chip card in the United States. And we’re chip-and-signature; we’re not chip-and-PIN, as Europe and Canada are, which is a whole other topic. But when you stick your card in, you’re now waiting 5, 10, 15 seconds for the processing to go on. Sometimes it happens sooner rather than later, but there’s some sort of measurable, visible delay while all the cryptography is happening. Are we really pained by that, versus a few years ago, when you swiped the card and got the transaction approved — the authorization — usually in less than a second? It was measured in milliseconds, microseconds, whatever the term is. Do any of us care, as consumers, that we’re standing at the checkout line for a few seconds now, versus when it used to be a lot faster? And if you’re old enough to remember the old days, when the clerk had to pick up the telephone and call a number — and if you’re really old, talk to an operator and read off all the transaction information to get the authorization — versus typing it in on a keypad and getting an automated response? I mean, these are the real, tangible things that most people can see. Not to disparage the nanosecond analogy you were just using — you know, it’s stuff that the companies trying to do this stuff worry about. And I’m sorry, Scott, go ahead.
Scott Lyons 31:53
No, no, I prematurely thought that you were leading into a new question. Bringing back the days of having to verify a signature might be a good idea. With the advent of chip and PIN and trying to speed things up, more often than not it doesn’t happen that way. You stick your card in, it says, okay, are you good? Do you want credit or debit? You hit credit, and bang, you’re done. There’s no other type of authentication that goes into it. And chip and PIN is supposed to be more secure than signing something. But there has to be an even better way of doing it than just sliding your card in. I can take your card, Jeff, or Josh, I can take your card, and slide it in, as you know.
Jeff Man 32:39
Actually — and this is just a nitpick difference — it’s not more secure. It’s allegedly harder to commit fraud, because you’re hopefully applying things that make it less likely somebody can clone a card and use a fake card, since a clone doesn’t have all the inner workings that are in the chip. Harder to defraud, if not more secure. Yep.
Josh Marpet 33:07
Are you talking chip and PIN or chip and signature?
Scott Lyons 33:12
Either.
Josh Marpet 33:12
Chip and pin is more secure because it’s two factor authentication.
Scott Lyons 33:17
No, you can hit credit and bypass the PIN altogether.
Josh Marpet 33:20
Okay, that was true. But that’s not chip and PIN.
Scott Lyons 33:24
Okay.
Josh Marpet 33:24
That’s chip and signature.
Jeff Man 33:25
Chip and PIN in the United States is more like how we use a debit card as a debit card. And I think what Scott’s saying is that you can use a debit card as a credit card and get the same net result…
Josh Marpet 33:41
You know, and he’s absolutely right. I’m just saying that the specific technologies, chip and PIN versus chip and signature, are very different in terms of security. That’s all, now…
Scott Lyons 33:49
We agree on that, 110%. It’s just the bypassing of that methodology that’s the problem.
Josh Marpet 33:57
Yeah, absolutely. Totally ridiculous. But I want to point out, there was a comedian that actually did a routine. He walked into stores and he would draw little pictures in the signature box instead of signing his name. He’d sign it that way, and it worked every time. So he decided to go for broke. He went into, I don’t know if it was a Best Buy or some technology store, and he said, you know that video wall right there, with like nine huge monitors? Like, yeah. He goes, can you buy that? And they had a little discussion and found out, yeah, you can, it’s like $150,000. He goes, I’d like to buy one. And they’re like, ah, okay. And they set him up for it. And he wrote “not authorized” as the signature. And they went, no, no, no, no, you have to sign it. And he went, that’s my signature. And they said, ah, and eventually they basically declined him from purchasing it. But like, they don’t really check those things. And to Scott’s point, you know, maybe checking his signature would be a great idea. I personally think having a second factor of authentication would be better.
Scott Lyons 34:52
That’s where I was going to head with this. What you have, what you know, what you are. What I’m trying to say is that multi-factor authentication, like we do now, sure, it may hinder the way that we log into certain systems, but we have to put up with some hindrance to stay secure and stay ahead of, or break, that quote-unquote cyber kill chain of attacks from last week. I mean, gentlemen, am I wrong?
Jeff Man 35:22
No, that’s a fair point, Scott. You’re right. And I think you’re answering the question of why we have chip and signature in the US: because the card brands and the major merchants determined that the public in the US is too stupid to remember a second factor.
Scott Lyons 35:42
And that begs the question, though: is there an ulterior motive behind moving to that methodology? Right, we have Facebook collecting data on us now, and using that as a way that they’re making money through data brokerage and selling it off. Could that move have had an ulterior motive that we’re just not realizing?
Jeff Man 36:09
Possibly.
Scott Lyons 36:09
And silence.
Jeff Man 36:11
Dimitry brought up an interesting scenario: you claim fraud, the bank goes to the merchant; the merchant says, here’s the signature; the bank says, that’s not the signature; the merchant says, we don’t employ signature experts; the bank loses the claim and ends up paying for the fraud, you know, which basically…
Josh Marpet 36:29
The insurance pays for the fraud.
Jeff Man 36:31
Yeah, I mean, basically, it’s cheaper, which is what drives most everything: just not bothering and, you know, whatever, just issue the refund, issue the chargeback, let the fraud happen. I know of one major retail customer that I had, and it was a deep, dark secret, but I know it now, that they do not fight any disputes. So you could go in and, you know, spend $1,000, and then claim, I didn’t make that purchase, and they would just make it go away without question. That’s their policy, to not fight it. Because it’s just cheaper to just…
Scott Lyons 37:12
You know, that’s sort of like having a really shitty product on Amazon, and when it breaks and somebody leaves you a really bad review, you say, oh, I’m gonna pay you, oh, gee, I don’t know, $30 to take that review away. Now I’m gonna pay you $50. Come on, man. Can we be better at this?
Jeff Man 37:31
Right? Well, anyway, Chris, you haven’t said anything in a while, and you’re the guest.
Chris Pin 37:38
I’m listening to you guys get on this PCI tangent. Brings back good memories.
Jeff Man 37:45
I was gonna say, is it good memories, or flashbacks of sleepless nights?
Chris Pin 37:51
Maybe a bit of both. But it’s always comical. I will say.
Jeff Man 37:56
Well, you know, you lived the PCI world a little bit. And one of my biggest frustrations with PCI is how the game has become: how do we eliminate card data storage and transmission? Let’s encrypt it, let’s tokenize it, let’s make it not be card data, because the rules say if it’s card data, then we have to do all these different security requirements, but if it’s not, then we don’t have to. And the game has very much become what they call a scoping exercise: you know, how can we get out of having to go through this PCI stuff?
Chris Pin 38:36
Even where the thought process went earlier, right? It’s like, we have these new privacy laws. Well, what if I do X? What if I set my company up this way? How do I get around it? Right? That was the first thing that came to mind. It’s the same thing with PCI. It’s like, yeah, there’s 50 new controls coming out next year. Okay, great. Well, how can you get out of them? Right? Let’s not even think about, how can I just do them? They spend so much thought and so much money on how to get around things that oftentimes it’s just cheaper to do it.
Jeff Man 39:07
Well, and that’s literally what I used to tell so many of my customers that would, you know, spend hours and hours agonizing over how to avoid a requirement, or how to get around it in terms of a compensating control. And I’m like, you know, it’d be cheaper if you just did what it said you needed to do and stopped trying to dance. And some small percentage of the time, they’d be like, oh, you have a point, yeah, we probably should just do that. Not that it wouldn’t take a while; no, it was a major effort to do whatever it was, but still cheaper in the long run.
Chris Pin 39:42
Yeah, the other thing that comes up, and you see this with PCI when you’re trying to scope the card data environment, but the other thing that comes up with these privacy laws is when you have to surface everything. Like, someone comes in and says, okay, I want to know all the data you have on me and why you have it; you have to surface all the data you have, obviously. But what happens with, you know, the backup of the database, or the backup of the file share? You have all these redundant and duplicated data sets that, even if you’re a DBA, maybe you don’t pay attention to; if you’re infosec, you certainly don’t pay attention to that. Right? You’re looking at your production systems and, you know, how do you keep the business going; that’s your primary focus. But these privacy laws, they don’t care. They don’t care what your production system is, they don’t care what your QA system is. What they care about is data that could be reidentified or attributed back to an individual. And so it kind of blows the CDE apart, right? It brings everything into scope.
If it’s controlled, or owned, or could be accessed by your organization, you’re responsible for every piece of data that’s on it. And then think about things like tokenization in the PCI world, right? That was another kind of catch-all: well, I’m going to throw my payment card information over to this other vendor, I’m going to shift the risk, if you will. Right? It’s not my problem, it’s their problem; here, I’ll pay you for that. Privacy doesn’t care. That session ID, that payment card ID that you get from the tokenization vendor? That’s personal information, because you as an organization could reidentify what that card number originally was. All it’s gonna take you is a phone call to that vendor you’re paying, and they’re gonna tell you what that card number is. So because it’s within your control to reidentify, it does not meet the deidentified, you know, specification for opting out of the law, if you will. Just like with encryption: encryption doesn’t opt you out of the law, because you can always just decrypt it, right? Chances are, if you encrypted it, you have the key, or you have a way of obtaining the key, so you can always reidentify it. So really, the only way to permanently deidentify something would be encrypting it and throwing away the key, destroying the key, whatever the case may be, or something like masking, right, where you’re taking a name and you’re transforming it to a different name with no way of reverting it. So you take Jeff, you turn him into Mickey Mouse, right? It’s a fictitious name. You take his address, and…
Josh Marpet 42:30
How’d you find that out?
Chris Pin 42:33
How do you find out what?
Josh Marpet 42:35
That we turned him into Mickey Mouse?
Jeff Man 42:39
Don’t give Josh ideas.
Scott Lyons 42:40
By joining the discord.
Chris Pin 42:44
Yes, you can go to discord and change his name. There you go.
Scott Lyons 42:47
Yep.
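[Editor's note: the reversibility distinction Chris draws above can be sketched in a few lines of Python. This is an illustrative toy, not any vendor's implementation; the `TokenVault` class and `mask` helper are hypothetical names, and a real masking scheme would use a vetted pseudonymization method rather than a truncated hash.]

```python
import hashlib
import secrets

# Reversible tokenization: the vault retains a token -> PAN mapping,
# so tokens are still "personal information" under most privacy laws.
# Reidentification is one lookup (or, as Chris says, one phone call) away.
class TokenVault:
    def __init__(self):
        self._vault = {}

    def tokenize(self, pan: str) -> str:
        token = secrets.token_hex(8)
        self._vault[token] = pan      # mapping retained: reversible
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

# Irreversible masking: replace the value with a stable pseudonym and
# keep no mapping back. (A truncated hash stands in here for a proper
# pseudonymization scheme; small namespaces would need salting too.)
def mask(name: str) -> str:
    digest = hashlib.sha256(name.encode()).hexdigest()[:8]
    return f"user-{digest}"           # "Jeff" becomes a fictitious name

vault = TokenVault()
t = vault.tokenize("4111111111111111")
print(vault.detokenize(t))            # the vault can always go back
print(mask("Jeff"))                   # nothing here can go back to "Jeff"
```

The point is structural: the vault keeps a mapping, so the data can always be reidentified; the mask keeps nothing, so there is no road back from the pseudonym.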
Jeff Man 42:49
Hey, well, our time is running out, unfortunately. And you know, as usual, we could probably go on and on and on. Let me give you the floor, Chris, for any parting thoughts, and do feel free to very briefly share with us, because I’m curious: what is PKWARE doing these days that would help companies with this whole problem of meeting these burgeoning data security laws?
Chris Pin 43:21
So there’s two things, at a high level, that PKWARE is doing today. One is what they call Smartcrypt, and it essentially gives you the ability to share encrypted data from one company to another by a secure means, without having to leverage your traditional secure transfer mechanism, right? So if I encrypt data, say it’s a file, and I email it over to you, Jeff, you could have this PKWARE reader, and if I’ve given your email address access to that key, then only you could open it and read it, right? You would now have access to the encryption key, and I didn’t have to have a secure channel or a key transfer or anything like that with you. So that’s one way of ensuring your data is always protected no matter where it’s going.
The other thing that PKWARE does is very interesting, and that’s data discovery. So think about any type of data out there. It could be an identifier, a customer ID, an employee ID; you know, just pick any type of data, anything your organization may be interested in, and PKWARE can discover it across a wide array of platforms, almost every platform out there: structured, unstructured, semi-structured. You know, a very good use case for this type of technology is as an assessor, as a PCI assessor: you come in and you say, okay, you told me your CDE is this little bubble. Well, how do I know? How do I know you’re not storing credit cards outside of that? Well, technology like PKWARE, doing its discovery on every other platform, can come back to you very quickly and say in a report, okay, here’s the location where I found payment cards, and I didn’t find them anywhere else. So that would just give you, as the assessor, assurance that your client isn’t hiding anything from you. So that’s kind of where PKWARE fits in.
And in light of privacy, as we talked about, you know, privacy could be associated with just about any piece of data out there. So organizations really need to boil down what is important to them, what is associated back to their client base, whether that be a company or, you know, a legitimate consumer: what elements do they have that can be associated? And once you have that thought process, you configure PKWARE to go look for those types of elements and surface it back in a meaningful report, to take action on it, whether that’s encryption, redaction, masking, or just awareness, right? So all that comes up.
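[Editor's note: a toy version of the payment-card discovery check Chris describes, sketched in Python. This is not PKWARE's engine; it only illustrates the common core of PAN discovery, which typically pairs a digit pattern with a Luhn checksum to filter out random digit runs.]

```python
import re

def luhn_valid(digits: str) -> bool:
    """Luhn checksum: separates plausible PANs from arbitrary digit strings."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:        # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# Candidate PANs: 13-19 digits, optionally separated by spaces or dashes.
PAN_RE = re.compile(r"\b(?:\d[ -]?){13,19}\b")

def find_pans(text: str) -> list:
    """Return de-formatted card numbers found in free text."""
    hits = []
    for m in PAN_RE.finditer(text):
        digits = re.sub(r"[ -]", "", m.group())
        if 13 <= len(digits) <= 19 and luhn_valid(digits):
            hits.append(digits)
    return hits

sample = "invoice ref 1234, card 4111 1111 1111 1111, phone 555-0100"
print(find_pans(sample))   # only the Luhn-valid 16-digit run survives
```

A real discovery product layers this kind of matcher over connectors for databases, file shares, and cloud stores, which is what makes the "I didn’t find them anywhere else" report possible.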
Jeff Man 46:05
So safe to assume that it would find credit, debit card information, primary account number..
Chris Pin 46:11
Absolutely. CVV, account number, the whole nine.
Jeff Man 46:15
Cool, I’ll keep that in mind for my customers that are trying to carve out their CDE, their card data environment, and then prove that’s the only place where there’s card data. The upshot of that exercise, I will add as a closing thought: the good news is, even though companies made it a game, and, like, whoo, we can do away with PCI if we just, you know, shrink down our CDE, the upshot is they cleaned up a lot of data that was in a lot of different places throughout their organization, where it wasn’t being encrypted, people didn’t even know it was there, and it was just ripe for the stealing. So, you know, maybe the wrong motive, but the right result in terms of reducing the footprint of how much data was throughout their network.
Chris Pin 47:01
Yeah, I would say another thing you can add on there, Jeff, as you’re talking to your various clients, other guest speakers, and everyone else: inevitably, someone’s going to ask you, well, Jeff, what did you see this other customer do to prepare for GDPR, or for this or for that, right? And now what you can take away and tell them is: you have all these controls for PCI, because it’s been mandated for years and you’ve been leveraging them for years, so why not just expand the scope, right? Your company’s scope. Don’t focus on payment cards anymore; focus on payment cards plus all the consumer stuff, and then, based on their sensitivity and their own threshold, determine what control is appropriate for the type of data at hand. Right? Maybe you encrypt credit cards because you need to be able to decrypt and process them. Maybe you mask email addresses, right? If you’re not doing marketing, and you’re not doing email newsletters, that kind of stuff, maybe you mask it; maybe you don’t have a need to actually email it, right? And so these are all thought processes that people can go through.
Jeff Man 48:13
Cool. Yeah, we could keep talking PCI, ’cause I love to do that, but we won’t. Chris, thanks so much for joining us today and helping to facilitate this hopefully interesting discussion on data privacy issues and challenges and goals and dreams. We would really like to consider having you back at some point, when we have our cohost on that’s a lawyer, which might sound kind of intimidating to you, so maybe we need to find somebody else to interview that’s ready to go deeper on the legal aspect.
Chris Pin 48:52
I’d be more than happy to hop on.
Jeff Man 48:55
All right, well, we may take you up on that. That’s gonna wrap it up for us today for Security and Compliance Weekly. You know, stay secure, stay safe, stay compliant, keep building those bridges and tearing down the silos. Until next time, we’re out.