Security and Compliance News SCW #2
Jeff Man 0:02
Welcome back to Episode Two of Security and Compliance Weekly. I’m your host, Jeff Man, joined by my co-hosts, Josh Marpet, Scott Lyons, and Matt Alderman. One more ad before we jump into the news, and this is great news for us. A lot of us hosts have been waiting for this anxiously for months. We are proud to announce that the new Security Weekly website has gone live. So visit securityweekly.com and check out all of our new sorting and filtering features and functionality. That’s right, you can now look up topics of interest on our various programs and not just fumble around like most of us have to do. What was that thing we talked about? Who were we talking to? When was that show? So we’re looking forward to it even if you’re not, but please check us out. And if you find any kind of issues, or if you have any kind of feedback, you know, pros and cons, things you like and don’t like, like the font or the color, we welcome all of your comments, and you can send them to website@securityweekly.net. That’s website@securityweekly.net. Alright guys, so let’s talk about the news. Anybody want to jump in with a favorite story of the week?
Matt Alderman 1:18
Oh, we’re all so fast.
Jeff Man 1:21
Yeah, somebody? Like Josh, how about you, before you blue screen again?
Josh Marpet 1:26
Yep. Yeah, we’ve got one of yours, the New York breach laws. Fascinating. I really like that one, I’d love to chat about it. There are a lot of amendments, and I really, really want to talk about them, because I think some of the new amendments are fascinating. Is that okay?
Jeff Man 1:39
Yeah, let’s do it. Do it. So.
Josh Marpet 1:42
So it’s in the show notes, but it’s from dataprotectionreport.com. It’s talking about the breach law amendments in New York, DFS 500, 23 NYCRR 500, whatever you call it, okay. And it’s fascinating, because the amended breach law talks not only about notification and how they’ve expanded the time; they’ve expanded the penalties, they’ve done all sorts of things. But what’s really fascinating is it expands what they’re reaching for. Let me explain what I mean. First, it takes in biometric data now. So if you have biometric data, that’s now technically PII. Does that mean that if I video, I don’t know, the Brooklyn Bridge, with people walking across, I’ve sort of collected biometric data? Does that count as PII data on my video? Second thing is, instead of “you’re doing business in New York,” it’s now “you have information on a New York resident.” So if it’s any information on any New York resident, then the question is, who’s not under this law?
Matt Alderman 2:56
Yeah, and they’ve expanded scope from just acquiring the personal information to accessing that personal information, which is another interesting change in the law. Even if you’re not the original acquirer, if you’re accessing any private information, you’re also in scope for the law.
Jeff Man 3:19
So yeah, I
Josh Marpet 3:20
This is like GDPR.
Jeff Man 3:21
I was gonna say, it almost sounds like GDPR. You know, any company in the world, if they’re obtaining personal information on European citizens, they’re subject to GDPR. So it sounds like the corollary here is: any company anywhere, if it’s a New York resident’s information, they’re subject to these laws.
Josh Marpet 3:42
So what, do you have to cordon off New York residents? This is crazy.
Jeff Man 3:45
Well, didn’t they do that in the movie once?
Scott Lyons 3:50
No, pretty soon there will be a feature in video cameras, and in all regular cameras, where you have to pay a certain amount to be able to film people, to be able to access that data. I could see it coming.
Matt Alderman 4:07
Well, I already saw parts of this when I was in Italy this summer. Restaurants will not let you pull your phone out and take pictures or video; the restaurant will not allow it. I mean, it’s just because of GDPR, because of the privacy. You’re already seeing this stuff being banned in European restaurants. Just don’t pull out your phone and try to take a picture, because they’re going to come over and tell you not to.
Josh Marpet 4:30
Well, like, if you’re taking a picture of your food, it’s okay, so…
Matt Alderman 4:33
I assume so. But as soon as that camera comes up and you try to take any picture across the restaurant, that’s the big no-no.
Jeff Man 4:42
What I find interesting is that New York apparently has different definitions; they break out a difference between what they call private information and personal information. Anybody have any insight on that one?
Matt Alderman 5:02
I don’t. I just know that personal information is any information concerning a natural person which, because of a name or other identifier, can be used to identify such natural person. It’s a pretty broad scope already, one of the broadest of the breach laws.
Jeff Man 5:19
I mean, you know, just from a quick read of the article, it looks like, and I might be wrong, but it looks like when they’re talking about private information, they’re addressing not only personal information, but also information that can be used as part of authentication. So they’re extending, you know, something like a password or a biometric to be something that we care about, but they’re actually labeling it as private information.
Scott Lyons 5:49
They’re trying to granularize and individuate the different levels of information that can be collected about a person, right? I can always know somebody’s name, but if I know their fingerprints too, that’s a little bit more. It’s a little bit more, what’s the word that I’m looking for here? Obtrusive?
Josh Marpet 6:11
Invasive?
Scott Lyons 6:12
Invasive? Yeah, that sounds alright. Let’s go with that. Rather than just sitting back and saying, well, I’ve got your name, your phone number, your Social Security number, you know? And that’s really what this is going towards: personal information, right, name, date of birth, Social Security number, address, versus private information, which is facial recognition, biometrics, right, what really defines you as you?
Jeff Man 6:40
Yeah, it’s interesting, too, because there’s definitely some PCI-centric language in this law, because they talk about personal information including your credit card number, your debit card number, and also some reference to the security code, which, if you’ve ever used a credit or debit card, is that three or four digits printed on the back, or in Amex’s case on the front of the card. I think it’s laudable. You know, I’m not trying to give too many props to anybody, but the fact that they’re trying to put down in writing some of the things that we’ve been talking about for months now on Security Weekly, and I’m sure it comes up on the other shows: just the fact that there’s so much information out there right now that’s personal information but doesn’t fit the prevailing definitions of name, address, phone number, and so on and so forth. So the fact that they’re writing it down and trying to declare certain things as, yes, this really is stuff that we care about, I have to think that’s a step in the right direction.
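[Show notes] To make the personal-versus-private distinction the hosts are drawing a little more concrete, here is a minimal sketch, in Python, of how a team might tag stored fields along the lines discussed above. The field names and category lists are illustrative only, paraphrased from the conversation rather than taken from the statutory text.

```python
# Illustrative only: categories are paraphrased from the discussion above,
# not copied from the New York statute.
PERSONAL_INFO_FIELDS = {
    "name", "date_of_birth", "phone_number", "mailing_address",
}

# "Private information" as discussed: data elements such as SSN, card numbers
# with security codes, passwords, and biometric data.
PRIVATE_INFO_FIELDS = {
    "ssn", "drivers_license", "credit_card_number", "debit_card_number",
    "card_security_code", "account_password", "biometric_template",
}

def classify_field(field_name: str) -> str:
    """Return a rough classification for a stored field name."""
    if field_name in PRIVATE_INFO_FIELDS:
        return "private information (breach-notification trigger)"
    if field_name in PERSONAL_INFO_FIELDS:
        return "personal information"
    return "unclassified, review manually"

if __name__ == "__main__":
    for field in ["name", "biometric_template", "credit_card_number", "loyalty_id"]:
        print(f"{field}: {classify_field(field)}")
```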
Matt Alderman 7:44
Yes, and we’re gonna…
Jeff Man 7:45
…from a privacy perspective,
Matt Alderman 7:47
Right, and we’re gonna see more of this with the California law that’s getting ready to take effect. Right. That’s one of the other stories that’s in here, CCPA, and what it’s going to cost potentially to comply with CCPA in the state of California. I mean, this is going to be GDPR 2.0 when it gets rolled out next year in California, and they’re talking up to $55 billion in cost to comply with this thing.
Scott Lyons 8:15
Right, right, right. But the thing that we need to keep in mind here, right, is that GDPR goes for the individual, right? CCPA goes for the family, right? And the new amendment to the New York regs that we’re talking about deals with the private information of a New York resident, right? So there’s no ambiguity about who it would touch and who it doesn’t touch, like we see in GDPR and CCPA.
Jeff Man 8:42
So you’re saying that’s a good thing?
Scott Lyons 8:45
Yeah.
Jeff Man 8:46
Okay. So, you know, the article talks about the cost of complying with CCPA. Is there a consideration for, you know, what are the costs if they don’t comply, in terms of fines? I mean, is it appropriate for a business to make a cost justification? Do I care about CCPA? Or is it a price of doing business?
Matt Alderman 9:10
Well, I mean, we’ve seen this already in some of the GDPR litigation. You know, companies who are not complying are going to get fined. The question is, how big is the fine versus how much is it going to cost you to comply? It’s going to be one of those decisions every business is going to have to make. When you look at Facebook, fined $5 billion for the Cambridge Analytica thing, you know, that’s a drop in the bucket for them. Does it change their behavior? Probably not. But for somebody that’s smaller, I mean, it could be disastrous; the company could completely go out of business if the fines are big enough.
Josh Marpet 9:50
We’ve actually got historical perspective here. HIPAA, in 2013 or ’14, I forget, changed drastically and raised their fines massively. So what you’ve got is all the small doctors’ offices around that time started going out of business, because it became more than the $1,000 they were paying previously as a cost of doing business. Now it’s large enough that they’re like, this is too much. That’s when all the small doctors’ offices started selling off to the conglomerates and just disappeared from the environment, if you will. So is that the same kind of thing that’s going to happen? You’re going to have small businesses that simply don’t exist anymore. Or, and this is the one that I’m curious about, will you see data-holding companies? Because that way, I’m not responsible for the data; the company that I send the data to, and rent or lease it back from, or access it from, is the one that’s responsible for it. I really think that’s going to be happening. And so we actually coined a term recently, not coined it per se: data laundering. We’re gonna see data laundering. We have data, where did it come from? Well, it’s clean data. I didn’t ask that, where did it come from? It’s gonna be interesting.
Scott Lyons 11:01
Yeah, and as far as the fines go, you know, the article is ballparking that small businesses with fewer than 20 employees will have to spend about $50,000. You know, that’s a big hit to the bottom line of any small business; it doesn’t matter what it is. Now, earlier, in the first segment with Alex, we were talking about how do we get small businesses to be compliant, right? And if you have something like this, it’s going to be very difficult. You know, watching how all of this unfolds is going to be paramount for any future privacy regs that are coming out, because they’re going to be saying, well, on one hand, we’re protecting our people, but on the other hand, the trade-off is we’re sacrificing the small and medium businesses that have to become compliant with this, right? The largest, they’ll always be able to deal with this, right? Unless the compliance team that they spend the money on, the resources, the time, the training, and so on and so forth, can’t handle the volume of requests that are coming in. Right. And at that point, you know, you’re looking at anywhere from 2 to 5 million to start.
Jeff Man 12:17
So it’s sort of an interesting segue. We’re sort of talking in this article about the cost it takes to become compliant versus the cost of non-compliance, which we assume comes primarily in terms of, you know, fines. Another article I saw this week, and I put it up, let me get the name right: “Cybersecurity, the C-Suite and the Boardroom: The Rising Specter of Director and Officer Liability.” Which begs the question: you know, what if it’s my money and not the company’s money? How does that factor into these decisions of whether we’re going to follow various compliance standards?
Josh Marpet 13:07
There is D&O, director and officer insurance. So D&O insurance will actually cover that to a certain extent. But you never know how big those fines can get, or those lawsuits, or, even worse, those class-action suits. So you’ve got to be very careful with that.
Matt Alderman 13:26
Yeah, but you can get into large class-action lawsuits…
Jeff Man 13:36
I guess my question, one of my initial questions, is: you know, how does anybody feel about making members of the board or executive management personally responsible for the actions or inactions of the company that they work for? I mean, are we okay with that?
Matt Alderman 13:58
You’ve seen aspects of this already with Target, right? You’ve seen aspects of this with the Equifax breach. And I don’t think it’s going to go away. If there is negligence by a director or officer, there is going to be liability. The question is how big, and I think a lot of it will depend on where the suits come from, right? If it’s fines from an agency, that’s one thing, but if there are class-action lawsuits for damages, and sometimes those amounts can get high, or there are criminal suits for negligence, I mean, that’s where things get really, really interesting. And D&O won’t cover some of that if there is just outright negligence. So, you know, you’ve got to be a little careful. But this is something I think most boards and directors and officers are taking a serious look at: how do I put the right risk model in place? How do I cover aspects of this from an insurance perspective? They can’t account for all scenarios. But this is going to be a growing area, I think, for people to consider.
Scott Lyons 15:14
Well, when you start looking at the fiduciary responsibility of boards and directors, with the strategy and vision that needs to be handed down from the top, the real question is: where does the buck stop? Right? Do we keep passing it all the way up the chain? Where does, you know, the proverbial stuff roll downhill? You know, it’s not just negligence from a director or officer. I mean, what if management screws it up, right? Are directors and officers still liable for that? You know, really trying to figure out how to navigate these waters is really, really interesting, right? Can a board claim that they knew about risks, or that they have visibility into their organization? Right? Recently, Josh and I were at the NACD Summit, and the big theme of the summit was shift, right? So shifting not just responsibility, but shifting the way boards interact with their respective companies. So realistically, the question is how to reduce board-level liability, right? It’s a really big, significant issue, it needs professional help, and it’s going to be interesting to see how this plays out.
Matt Alderman 16:28
But the board’s responsibility is to provide oversight. So if you isolate them, how do they effectively provide oversight?
Scott Lyons 16:37
It’s not that simple, though. You can’t just sit back and say that a board is there, you know, to provide oversight. You know, boards provide long-term strategy and vision, right? We were talking about this earlier. If a board doesn’t have visibility into an org, right, where does that lie? That’s number one. Number two, does the board actually take a proactive measure in getting down into the weeds and understanding what the company is doing? Right, what’s going on in the inner workings? How is the compliance team handling all of the regimes to keep the risk down? How is the security team working with the compliance team to ensure that business can be maintained? And then how is all of that bundled up and shoved into sales and marketing to say, yes, we are providing the best widget we can?
Matt Alderman 17:31
Yeah, good. Good. Good question.
Jeff Man 17:35
Good discussion. So I know, Josh, you’re a little bit interested in talking about PSD2, mostly because we’re talking to MasterCard, but you know, please, please pipe up.
Josh Marpet 17:49
So I listed four articles in the show notes. And the articles are basically about this: when banks are required to give open, secure APIs, which might be a slight contradiction in terms, they’re required, at their own expense, to provide secure API access to anybody that’s a registered account information service provider or payment initiation service provider. Who is going to win? And the answer is probably the tech giants: Facebook, Apple, Amazon, and Google. They’re going to create banking front ends for 100, 200, 300 banks. And all of a sudden, you’re not going to bank by going to your Royal Bank of Scotland website, or whomever; you’re going to bank by going through your Facebook account. You’re going to bank by going through your Amazon account. You know, “Alexa, transfer $5,000 to my personal checking,” and I’m terribly sorry if anybody has an Alexa listening. Don’t do that. But I mean, that’s literally what’s going to happen. And you’ve got significant issues there. What happens if I can trigger someone’s Alexa to transfer money? What happens if I can trigger a Google Home? What happens if I get your Google account through some phishing? You know, oh, well, I’ve got 2FA on your Google account, didn’t you authorize it for 30 days? These are questions that have to be answered. And so the tech giants, who have been making it easier and easier for us to do everything, and working on security as well, don’t get me wrong, some of them have been making amazingly good strides on security. Where does this end? Where do we help a consumer, rather, by providing access to our banking? And where do we hurt a bank by not allowing them to advertise: hey, you’re looking for cars, we can get you a loan? Because Google is gonna see that traffic and they’re gonna offer me the loan; my bank won’t.
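[Show notes] As a rough illustration of the kind of third-party access being described, here is a minimal sketch of the account-information request a registered provider might make against a bank’s open API under a PSD2-style regime. The base URL, headers, and response shape are hypothetical placeholders, not any real bank’s or standard’s API.

```python
# Hypothetical sketch of a PSD2-style account-information request.
# The URL, headers, and response shape are placeholders, not a real bank API.
import uuid
import requests

BANK_API_BASE = "https://api.example-bank.test/psd2/v1"  # placeholder host

def list_accounts(access_token: str, consent_id: str) -> list:
    """Fetch a customer's accounts as a registered account information service provider."""
    headers = {
        "Authorization": f"Bearer {access_token}",  # OAuth token obtained with the customer's consent
        "Consent-ID": consent_id,                   # hypothetical consent reference
        "X-Request-ID": str(uuid.uuid4()),          # per-request trace id
    }
    resp = requests.get(f"{BANK_API_BASE}/accounts", headers=headers, timeout=10)
    resp.raise_for_status()
    return resp.json().get("accounts", [])

if __name__ == "__main__":
    for account in list_accounts(access_token="example-token", consent_id="example-consent"):
        print(account.get("iban"), account.get("name"))
```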
Scott Lyons 19:41
So what you’re saying is how do we diagnose liability and the transfer of risk inside of doing something like this?
Josh Marpet 19:50
And where are we going to, in my opinion, as we did in 2007, continue creating banks “too big to fail,” air quotes, air quotes, air quotes? Except these companies don’t even have the banking responsibilities; they just have the perception of being the bank, and that perception is huge.
Jeff Man 20:10
So it’s interesting, the notion of banks being too big to fail because they, you know, were greedy or mismanaged funds or whatever. Contrast that with, for lack of a better term, “fair competition,” you know, there’s another sheriff in town that’s offering an alternative, which is what the large tech companies might offer. How’s that going to change the landscape? Or how’s that going to change the battle, when the concept is too big to fail? Because isn’t this country built on competition?
Josh Marpet 20:52
So the country is built on competition, but we also have antitrust laws, because after a while things coagulate, things come together and flow up, and we break up monopolies at various times in our existence as a country. We’ve done it many, many times, and it’s cyclical, okay, and those who do not learn their history are doomed to repeat it. So we have some very, very smart companies out there. Apple, depending on whether you’re an Irish tax person or a US tax person, is an Irish or an American company.
Scott Lyons 21:25
Hold on, hold on, can you break that down?
Josh Marpet 21:30
If you’re an IRS person, a US IRS person, and you talk to Apple, they go, hey, we’re an Irish company. If you’re... and this is what I understand from reading newspaper articles, please don’t quote me on this, and correct me if I’m wrong. But if you’re the Irish equivalent of an IRS person and you talk to Apple in Ireland, they go, no, we’re a US company, what are you talking about? So there are various ways that they move money around that lower their tax liability. And it gets interesting, because, according to an article I just read, the larger companies pay less in taxes than we do.
Jeff Man 22:06
It’s interesting. Hey, and speaking of banks, Matt, I saw you had an article about the FFIEC, and that piqued my interest simply because the very, very first public speaking opportunity I had was a million years ago at an FFIEC convention. So what’s going on with the FFIEC, Matt?
Matt Alderman 22:34
So the FFIEC has provided guidance to the banking regulators for a while on how to assess security posture and preparedness, right, and built their own tool that they promoted for years as a way to go out and assess yourself, get ready for an audit, right, if the OCC or SEC or FDIC would come in. So they do a press release about the tool, but then they kind of start walking back from their tool a little bit. They don’t want to say “use our tool” anymore. They’ve now opened it up to say, well, you can use our tool, you could use the NIST Cybersecurity Framework, you could use the Financial Services Sector Coordinating Council’s Cybersecurity Profile, or the Center for Internet Security’s Critical Security Controls. So which is it, right? How, after years of guidance saying “use our tool,” are they now kind of backing away from that tool and saying you can look at all these other standards as a potential way to prepare yourself? And what I don’t like about this, Jeff, is it muddies the waters: which framework should I adopt? Which framework am I going to get assessed against when the regulator comes in? See, if it was the FFIEC tool, it’s really easy for people to know exactly what the audit is going to look like. Now it could vary depending on the framework, and that’s what was really interesting about the press release, and why the waters are now a little muddier.
Jeff Man 24:06
Yeah, that is interesting. I wonder if they had any struggles based on, you know, the PCI Council’s long-standing reluctance, learned from making their own mistakes, to be too prescriptive and set the bar, because then if they missed the mark, they have potential liability issues. I wonder if they’ve run into any of that, you know, with “just use our standard.” But what if it turns out their standard isn’t a good enough measuring stick, if you will?
Scott Lyons 24:38
So that’s one way of looking at it. The other side of looking at this, right, is understanding: does the auditor understand the business, and understand the inherent regulations that need to be placed on the business, right? So it’s one thing to look at it from a business perspective, but bring in an auditor who is brand new, right? Just fresh out of audit school, right? I know there’s no such thing, but you get what I’m saying. Do they understand that not only is it FFEIC, right, and I’m sure I just butchered that, FFIEC, sorry, but you also need NIST, you also need 800-171, 800-53, and you also need HIPAA on top of it, depending on who you are? Right? I know, I know, banks don’t go for HIPAA, but multiple standards are going to be a composite issue when trying to say, well, this is one way to solve all the problems, especially when looking at it from the auditor perspective.
Josh Marpet 25:41
Yeah, I mean, so one part of this is maybe the smaller institutions. NIST CSF is not a small thing, and the FFIEC tool, as I understand it, is based on NIST CSF, I’ve read the profile, and so did the NCUA, I think they did pretty much the exact same one. So doing a full NIST CSF is not fun. It’s totally doable, but a smaller institution, a small savings and loan, whatever, might have said, hey, look, we’re already doing the CIS Top 20. These are very straightforward recommendations and sets of things that we can do, let us do that. And remember that if the FFIEC and the NCUA endorse a particular tool, they’re also putting some, what’s the word, they’re putting some market force behind it, I’m not sure of the correct verbiage, forgive me. And to them, that’s not something they’re really supposed to be doing, as a regulatory body, not a marketing body, if that makes sense.
Scott Lyons 26:36
Well, compliance, compliance as a whole has, like, natural steps to try to figure out both of what we’re talking about, right? First step is completing a gap assessment, a gap analysis: figuring out what you have, where you need to go, and then how you get there. And then the second step is actually having an auditor come in and start checking: have you done everything correctly? Do you have the correct controls in place? Right, we’ve said it before on Security and Compliance Weekly, you know, compliance is driven by three things: policies, procedures, and control implementation. And that third part is where a lot of companies really get it wrong, and a lot of companies get it really, really right.
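[Show notes] A minimal sketch of how that three-part test, policies, procedures, and control implementation, might be tracked during a gap assessment. The control names and statuses are made up for illustration; in practice they would come from whichever framework the organization is being assessed against.

```python
from dataclasses import dataclass

@dataclass
class ControlStatus:
    """Tracks the three things mentioned above: policy, procedure, implementation."""
    name: str
    policy_documented: bool
    procedure_defined: bool
    control_implemented: bool

    def gaps(self) -> list:
        """Return which of the three pieces are still missing for this control."""
        missing = []
        if not self.policy_documented:
            missing.append("policy")
        if not self.procedure_defined:
            missing.append("procedure")
        if not self.control_implemented:
            missing.append("implementation")
        return missing

# Hypothetical controls for illustration only.
controls = [
    ControlStatus("Multi-factor authentication", True, True, False),
    ControlStatus("Vendor risk reviews", True, False, False),
    ControlStatus("Encryption of data at rest", True, True, True),
]

for c in controls:
    gaps = c.gaps()
    print(f"{c.name}: {'no gaps' if not gaps else 'missing ' + ', '.join(gaps)}")
```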
Matt Alderman 27:17
Yeah. And to your point, Josh, right, the NIST Cybersecurity Framework is 200 to 300 controls. That’s a whole heck of a lot more than the Top 20, right, the Critical 20. Yeah, that expands scope dramatically. Even when you’re doing NIST 800-53, and you’re looking at low, medium, and high and the different baselines, you know, you can break the big thing up into smaller chunks. But you’re right, it’s hard for a smaller bank to implement all of the NIST Cybersecurity Framework; it’s just massive.
Jeff Man 27:52
Yeah, and all of this underscores the complexity, which is why we’re doing this podcast, of compliance versus security. Because I think a lot of people are faced with, oh, I’ve got a choice now among these yardsticks. While I’m sure there are some organizations that would want to do some sort of assessment of what’s the best one for them, what’s the most directly applicable, or, let’s just say, the easiest to implement, I think very often they would devolve into, it reminds me of teaching to the test in the education realm, you know, just sort of the bare minimum, and missing the point of the big picture, which is the big knock against PCI, not that it’s one of the measuring sticks on the list here. But, you know, if you aim too small, you miss things in a large way. And that’s just sort of this tension that is built in between compliance and security. There’s always this presumption, when you’re presented with options, that there’s somebody within the organization who knows what they’re doing, who has the background and the expertise to intelligently look at these various standards and decide, A, which one’s best for our organization, then B, of the one that you select, which of the groups or controls are applicable and not applicable, or at least, you know, how do you prioritize them? And then the third element would be actually translating that, not only into implementing and following the controls, but how do you evaluate whether you’ve done the job adequately, or, you know, whether you’ve met the objective, whether you’ve met the requirement? All sorts of fun stuff, which is why I think we keep having these interesting conversations. Well, that’s gonna wrap us for this week, for Episode Two of Security and Compliance Weekly. Hope everybody has a good week, and stay tuned for next time. Over now.