

START HERE PODCAST: Cyber Public Policy Fundamentals

Welcome to "START HERE," the educational resource for public policymakers seeking to delve into the intricate world of cyber policy. In this evergreen series, we bring you the foundational knowledge you need to navigate the complex realm of cybersecurity with confidence. Join us as we sit down with top experts in the field, uncovering the essential principles and insights that shape cyber policy today.

NEW Episode!

Listen to our new episode. Episode 12 - Key Players in U.S. Cyber Policy: The States!



Drew Bagley, CIPP/E, is CrowdStrike’s Vice President and Counsel for Privacy and Cyber Policy, responsible for leading CrowdStrike’s data protection initiatives, privacy strategy and global policy engagement. He serves on the Europol Advisory Group on Internet Security, the U.S. Department of State’s International Digital Economy and Telecommunication Advisory Committee, and the DNS Abuse Institute’s Advisory Council.


Megan Brown is Co-Chair of Wiley Rein LLP’s Privacy, Cyber and Data Governance practice and former senior DOJ official. She advises multinational companies and industries on complex cybersecurity and data privacy challenges, including risk management, incident response and reporting, compliance with emerging regulations, and government investigations.


Sasha Cohen O’Connell, Ph.D. is the Executive in Residence and Senior Professorial Lecturer at AU (SPA), where she teaches cyber policy at the undergraduate, graduate, and executive level. Additionally, she serves as the Director of Programming and Curriculum at the Shahal M. Khan Institute. Prior to joining the university full-time, she spent the majority of her career at the FBI, where she served most recently as the organization's Chief Policy Advisor, Science and Technology, and as the Section Chief of the Office of National Policy.

MEET THE TEAM BEHIND THE POD

Wiley Rein, Content Development & Strategy

Wiley Rein, Audio Production

AU Khan Institute, Streaming & Web Resources

CrowdStrike, Graphics

START HERE In the News


Episode 1 - Welcome to START HERE

In the first episode of “START HERE”, Sasha O’Connell, Drew Bagley, and Megan Brown debut the podcast and explain why it is filling a gap in discussions about cybersecurity and policy.


Transcript

Sasha O’Connell

Hello and welcome to Cyber Policy Fundamentals, the Start Here series. My name is Sasha O'Connell and I'm a Senior Professorial Lecturer at American University, and I'm joined by Drew Bagley and Megan Brown from CrowdStrike and Wiley Rein, respectively. We are thrilled to be kicking off this audio resource series. Drew, do you want to introduce yourself and talk a bit about the gap we're hoping to address with this product?

Drew Bagley

Sure, thank you, Sasha. I'm Drew Bagley and I'm the Vice President and Counsel of Privacy and Cyber Policy at CrowdStrike. And let me pass it to Megan, to introduce herself.

Megan Brown

Hi, glad to be here. I'm Megan Brown. I'm a partner at Wiley Rein, a Washington, D.C. law firm, where I co-chair our Privacy, Cyber and Data Governance practice.

Sasha O’Connell

Awesome. Drew, can you talk a bit about what this project is all about and what gap we're trying to address?

Drew Bagley

Sure. We are kicking off the Start Here podcast series to address a gap we have seen in policy discussions. When we think about cyber policy, the focus is on trying to solve problems, but we also have to consider the process and incentives involved in solving those problems. And oftentimes, there can be faulty assumptions when we're talking about cyber policy because it's a complex issue. And faulty assumptions can lead to bad policy, like assuming that regulations are always good and going to solve the problem, or that the private sector is only looking to maximize profits, or that decisions are simpler than they actually are. There is also a lack of common language and fundamental understanding in this field, and that can lead to a lack of productive communication on policy, and these are very important issues. There's also a gap in fundamental educational resources on cyber policy, and ultimately, there's a disconnect between problems, the policies proposed to solve those problems, and the metrics for success in determining whether or not the policies put in place actually solve the problems they were purporting to solve. Cyber policy is about properly identifying the problem that needs to be solved and coming up with a solution that solves it. And the audience here, Sasha, for this podcast is anyone finding themselves needing to get up to speed on cyber policy topics. It's for you.

Sasha O’Connell

Perfect, and when we were thinking about this project and that audience, we also gave a lot of thought to the setup and layout and the way to best deliver this information so that it's super useful for folks who really truly need a place to start on these complex topics. So, what we've landed on is short segments, about 15- or 20-minute audio files, that are essentially mini classes that can be evergreen. So, as time goes by, and new proposals come up, they can serve as that key educational resource or touch point to get up to speed on these issues. Our goal is to produce and deliver a twelve-episode starter pack in 2024, and each of these episodes again will serve as a primer, covering the history, key people, and policy principles behind these key cybersecurity policy issues of the day. And to do that, our promise here is to use straightforward language and analogies and storytelling to the extent we can to try and get out of this world and get down to the key issues. Speaking of key issues, Megan, what are the topics? What's on the top of the agenda?

Megan Brown

Well, there is no shortage of topics to address in 2024. The three of us have been living with these policy issues for years now, and so the challenge really was narrowing it down and sort of where to start. So, I think our first episode is going to try and tackle cyber incident reporting mandates and kind of the pros and cons and the different equities there; but we also want to talk about ransomware and do a primer for folks, so they can understand what the buzzwords are, what the trade-offs are, and what's really going on for policymakers. There was a lot of discussion last year and heading into 2024 about accountability and liability. Who should own this problem where you've got nation-state actors, but you've also got private choices that might have externalities? So, how do you encourage good behavior and set up the right incentives? There's been a lot of discussion about standards, whether it's at NIST or other organizations that put out best practices. Should the private sector face voluntary standards and encourage best practices, or is it time to move to regulation and sort of nudge behavior? And then finally, kind of in the mix is who's who; looking at the various regulators, the agencies, the incident responders, the private companies that are involved as well as the policymakers, the congressional folks, the staffers, and the agencies who are confronting all of these questions that we hopefully will be able to unpack in this series. So, no shortage of things to talk about as we head into 2024, and we really look forward to helping folks approach some of these policy issues.

Sasha O’Connell

Absolutely. Drew, why are you fired up about this project?

Drew Bagley

Well, I'm fired up about this project because I think that the three of us really bring together a lot of different perspectives that often don't come together in policymaking, right? We have, obviously academia, private sector, the cybersecurity world all brought together, and I think that, ultimately, when we think about just even the basic questions that need to be asked when proposing cyber policy and trying to solve cyber policy problems, it really is important to understand certain fundamentals and to think about all the different intended and unintended consequences that policies can have. So, I think that this series is going to give us a unique opportunity to really get into those issues.

Sasha O’Connell

I agree, and from my perspective, sort of doing cyber education full time now in my current role, I know there's a real gap in resources that do exactly what you described. So, I'm excited to get it out to audiences, be they educators who want to use them in the classroom, policy makers who have a new role or new responsibilities, anyone who all of a sudden needs to take on these issues and needs to find a place to start. Megan, what do you think?

Megan Brown

I think it's exciting because Drew is working at CrowdStrike, so you're seeing the tactical in the trenches kind of trends and can speak to what is coming at policymakers and organizations. I've worked at the Department of Justice and currently advise companies on compliance and policy from incident response to reporting to DHS and their regulators, and I think the perspective that I hope we can offer is that practical, not theoretical, perspective to sort of say, what does this really look like? And some folks in the policy world might not have been in the chair that Drew sits in to understand what some of these choices actually mean.

Sasha O’Connell

Absolutely. Well, thank you all for joining us. We look forward to seeing you at our next episode on Incident Reporting, and please also check out our affiliated website for Start Here. We will drop the link for the website in the show notes. There will be additional resources and other materials associated with the topics covered. See you there!

Episode 2 - Incident Reporting Part One

In the second episode of “Start Here”, Sasha O’Connell, Drew Bagley, and Megan Brown discuss cyber incident reporting. They cover state and federal mandates and proposals, including the Cyber Incident Reporting for Critical Infrastructure Act (CIRCIA), and discuss the tradeoffs of reporting from both the public and private perspectives.

Cyber Incident Reporting: When an organization experiences a cybersecurity incident and reports it voluntarily or by mandate.

Key Questions

  • What are the tradeoffs associated with making incident reporting mandatory?
  • What is the history of incident reporting, and how has it evolved?
  • How do California's 2002 data breach reporting obligation and HIPAA relate to incident reporting?
  • What are the pros and cons of implementing mandatory incident reporting?
  • Which agencies should be responsible for reporting?
  • Should incident reports be in the public or private domain?
  • What level of detail do incident reports require?
  • What should the timing of reporting look like?

Extra Resources

Transcript

Sasha O’Connell

Welcome back to Cyber Policy Fundamentals, the Start Here series. My name is Sasha O'Connell, and I'm a Senior Professorial Lecturer at American University and I'm joined by my colleagues, Drew Bagley and Megan Brown from CrowdStrike and Wiley Rein, respectively. We're super excited to launch this episode and dig right into incident reporting. Mandatory cybersecurity incident reporting is a hot topic these days and an ongoing policy discussion at many levels. There are numerous proposals and new mandates coming online, both at the federal and state level. In this episode, we want to break down the fundamentals, assumptions, and tradeoffs so that you can evaluate various proposals and approaches as they come online. So, we're going to start with what is this policy issue? And to start, let me offer from, where I sit, the policy issue most simply is whether cybersecurity incident reporting, so the reporting of cybersecurity incidents, should be mandatory and if so, to whom. And that's really the challenge that policymakers are struggling with to formulate those rules. Megan, is that right? Is that the place to start?

Megan Brown

Yeah, I think it is. I mean, there's a lot of different tradeoffs, but I think a fundamental principle to keep in mind is that cyberattacks and cyber breaches are fundamentally criminal activities, right? If a major company is subject to a major incident, someone has done a very bad thing and violated several federal laws at a minimum, and it is unusual in our society and sort of civil justice system to require victims of crime to report that crime. In other circumstances, if you think of victims of robbery, identity theft, physical violence, there’s not some overarching mandate that says all crime victims have to run to the police or tell anybody. That might be a good thing, it might be nice if people did in certain circumstances, but that's not the assumption when you're the victim of a crime, and so, I think sometimes these discussions forget that private organizations and governments, frankly, are victims of crimes when we're talking about cyber incidents.

Sasha O’Connell

Absolutely. That is really good context, and I think that analogy to kinetic or real-world crimes is an important one to keep in mind; if there is going to be a change from that tradition, is the justification there? Drew, is there historical context here for cyber incident reporting? What's your thought on that?

Drew Bagley

Absolutely. Cyber incident reporting isn't all that new, especially when we think about the broader family of data breach reporting obligations. So, beginning about 20 years ago, or actually a little more than 20 years ago, in 2002, California passed the first state data breach reporting obligation. The Health Insurance Portability and Accountability Act, also known as HIPAA, was amended in 2003 with its security rule, and with those requirements, what you really had was the beginning of an impact-driven regime. In other words, where there was some sort of impact on victims and the type of data was the type that could be used, such as to perpetrate identity theft - it was sensitive personal information known as PII - then that sort of personal information breach would need to be reported. It also mattered how many victims there were, what the circumstances were, and whatnot, and these early data breach reporting requirements had very long lead times from when you would discover an incident to when you needed to tell your regulator. And so, what we've seen over the past few decades is things have gotten more industry specific. The definition of what an incident or breach is has changed; it really applies in some ways to a broader category of data, and at the federal level, we haven't seen one overarching federal requirement. Instead, we've seen more of these industry-specific approaches. In the same way HIPAA was focused on health care, we now have reporting obligations for critical infrastructure, and we have reporting obligations for the financial sector and other sectors.

Sasha O’Connell

Awesome. I think that is super important context. So maybe then, taking next, we can revisit this idea of defining terms real quickly, then maybe talk a bit more about what the real problem is that policymakers are trying to solve. Again, Megan, to your point, if this isn't how we do it generally, right, why is cybersecurity different, and what problem are we trying to solve, and then some of the factors that make it particularly tricky. So just going back to the definitions real quick. So, as I mentioned, right, incident reporting is when an organization experiences a cybersecurity incident and reports it, generally to a government agency, which is what we're talking about, right, and it comes in two forms. It can be voluntary, the organization decides for itself whether it tells someone, or mandatory, a law, regulation, or contract situation that requires the organization to notify the government. Is that a fair description? Am I forgetting anything, just to level set on what we're talking about here?

Megan Brown

Yeah, no, I think that's right. There also could be, as Drew mentioned earlier, sort of data-driven requirements, right? You may have to notify consumers or others whose data was affected; but, yeah, that's what we're talking about.

Sasha O’Connell

Perfect. So that's what we're talking about, why is this different? Why do we even‑ what problem is created in cyber that to your point, Megan, it's not created in other families of crime and reporting? What problem is the government trying to solve with this intervention, Megan?

Megan Brown

Yeah, so for years, on top of the few mandatory regimes that Drew talked about, it's mainly been a system of voluntary reporting of cyber incidents, and Congress has done a few things to try and encourage that. Call the FBI, call DHS. They've enacted statutes to make it easier to do that and to protect companies who do that. And there's a lot of reasons, frankly, why the victims of a cybersecurity incident might not want to go public or might not want to call the government, hesitation about having the FBI in your system. The FBI has done a great job recently over the past, say, 5 to 10 years of changing that culture, but it's heady to pick up the phone and call a law enforcement agency and ask them to sort of get in your business. In addition, you might have an incident that doesn't really have strong indications of harm to consumers or likely identity theft or systemic impacts. Or you might not know the impacts early on; investigations can often take weeks or months to really figure things out. The victim might not want to tell the government because of regulatory risk. They might not want to tell the public or other companies because of brand impacts as well as class action litigation exposure, or, as we saw after Colonial Pipeline, your CEO gets hauled before Congress and yelled at for any missteps that might have been made, and we've seen this. There is a risk of revictimization after you go public, if reporting happens before your systems are secured or before you know what's going on, or even some of the ransomware bad actors will double dip. So, I see a lot of reasonable hesitancy to make certain incident reports both to the government and to the public.

Sasha O’Connell

Okay, I'm officially then not convinced, right. Drew? What's the argument on the other side? What are proponents of mandating reporting arguing? What problem are they trying to solve?

Drew Bagley

Yeah, supporters of reporting mandates are generally citing several different problems that they're trying to solve, and it really depends on the type of reporting requirement, the sector they're focused on, and whatnot. So, if we take all of that together, the list of some of the most common policy aims would be, first of all, raising awareness of some of these data breach incidents or cyber incidents, and then, in theory, what that means is that if victims know that this sort of thing has happened, that helps them ensure they can better secure their own data. That also helps them think more about who they're going to trust their data with, which then in theory can create incentives for companies to bolster their security and make sure that they're protecting data better. There are other arguments depending on the government agency involved and supporting certain types of requirements; there are investigation and enforcement equities at stake. If the government doesn't know about these incidents, then the government doesn't have the information to investigate them, and then also the government doesn't have the ability to use its resources at scale to potentially disrupt adversaries who may be behind these types of attacks or to enforce against companies or even other government agencies that might not be responsible stewards of data. And so there really is this enforcement angle: if you have laws to protect data to begin with, how are you going to be able to enforce those if you're not aware of when there have been lapses? And then there are some incentives for those companies themselves. If a company is aware or a government agency is aware that they would be publicly exposed in some sort of way or obligated to tell a regulator or obligated to tell the victims about a cyber incident, then that's going to further incentivize them to protect their data with even more resources, to devote more resources to that, so that a breach or a cyber incident doesn't happen to begin with. We certainly see this a lot today in terms of some of the ideas behind some of the modern legislation, where you don't just see a notification obligation, but you also see an obligation to protect data in the first place with reasonable measures, something that's intended to change over time. And then there's also this notion that if you have a bunch of these obligations, if they're broad enough, you can overall change the culture. So, a couple decades ago, for example, a Social Security number was something that was much more commonly used as an identifier for things that have nothing to do with Social Security or filing taxes, and now, of course, we've seen instead a migration into other forms of identity for various applications. So there really can be this change of culture when certain categories of data are deemed to be regulated and other categories less regulated and have fewer penalties attached. And then, ultimately, there are arguments that if you have some of these obligations in place requiring this type of data to be produced on cyber incidents and data breaches, that can better inform policy as a whole.
So, for example, it might be that you're seeing lots of different patterns that are consistent with certain types of cyber incidents, and they're all coming from a particular threat actor or using a particular payment method or targeting a particular sector, and that can inform policymakers as to where they should devote their energies and attempt to come up with some sort of new policy solution that, instead of focusing merely on the reporting, is focusing on stopping the cause of the problem to begin with.

Sasha O’Connell

It's so interesting, and hearing you both lay those out, it's clear why this is an evergreen issue, right, that there's a lot to contemplate in terms of costs and benefits on both sides of the argument. I would say maybe in summary, and I'm curious what you guys think, the reason this is particularly challenging, or what makes this so hard to nail down and define, sort of falls into a couple of priority buckets. One is that there are clearly, as you both said, complexities and tradeoffs, right, for reporting mandates, because victims and investigations could be harmed by over-reporting or premature reporting. So that's not even necessarily obvious on the investigation side. Additionally, consumers might suffer from notice fatigue, so over-reporting can be harmful in that way too, particularly for organizations or companies that are in a regulated space. And then I hear you guys talking about, too, the move from voluntary to mandatory and the impacts and complexities for that relationship between the private sector and government, the trust piece there and how that works both positively and negatively. Before we move on to laying out the key players, any other thoughts, you guys, on why this is particularly tricky, given the arguments you both laid out?

Megan Brown

I like that you just introduced that concept of trust, Sasha, and I think we'll come back to it, but I think the relationship so far in cyber has frequently, but not always, been built on trying to create that trust that makes companies want to call the FBI, or call DHS, or even call their regulator, and there may be downsides to some of these new mandates because with mandates come accountability and enforcement. So that's, I think, one of the big things that policymakers have to balance, and you're seeing that in a lot of the regulatory comment cycles that are ongoing.

Drew Bagley

Yeah. I think another difficult issue to grapple with in some of these reporting requirements is that we've moved away from an era in which there was a long lead time with these reporting requirements to one in which there's an expectation to report nearly immediately, and granted, there definitely are thresholds. There are thresholds for determining whether an incident is of high impact, making that determination, etc., but ultimately, in those first few hours and days after there's some sort of data breach in modern times, you're usually dealing with a cyber incident, and usually, the priority is to mitigate that incident, stop the bleeding, and ensure that more data is not going to get out. And so sometimes I think that there are many equities and interests at stake with regard to the breach reporting itself, but this isn't done within a vacuum. I think that's really important to remember.

Sasha O’Connell

Yeah, that's great. So, in addition to this idea of voluntary versus mandatory, you're pointing out two other things: how big of a breach or cybersecurity incident it needs to be to be reported, this idea of materiality put another way, and then, you said a couple of times too, I think it's really important, the time aspect. How long do victims have to report? So those are all three kind of important things that need to be considered when thinking about any of these policies going forward. That's super helpful. Let's turn for a second to the key players here and their equities and interests. I think it really helps to understand the context of these discussions to understand where people are coming from and what their incentives and equities are when they bring them to the debate. Drew, can you start with the private sector side? I know that’s a big task. Can you sum up how the private sector feels real quick on this? But what are some key points there?

Drew Bagley

Yeah, absolutely. If we look back at the history of data breach reporting, of course, all 50 states have data breach reporting laws, but now what we've layered upon that are many industry-specific reporting obligations, so now you have many larger organizations in the private sector that really fall into multiple categories. So, they have a Venn diagram of reporting obligations whenever there is an incident, and there's no consistency over even how those incidents are defined or what those timelines are for reporting those incidents. So, I think that when you think about where the private sector is with anything in the regulatory space, they're always looking for consistency. Then you can resource around consistency, design processes around consistency, and then comply with the law, and for organizations that aren't complying, there can be enforcement penalties; but where you make things very confusing and layered, then it's very hard for organizations to comply with the law, especially smaller ones, when they have potentially competing legal standards. So I think consistency is one of the key things that the private sector is interested in.

Sasha O’Connell

That's perfect, and then, thinking about it from a consumer perspective or an advocacy, civil society perspective, I think it's important to think about too, the dual piece of being concerned about one's personal data, security, and privacy, and how those things kind of come out in the wash in terms of a balance between protecting one's data and getting that notification, and also protecting the ability to control one's data when reporting to government and where that data goes. So that's a whole other piece; I think that's very much in the mix. Megan, what do you think about both of those, and also, can you talk about the government's perspective real quick?

Megan Brown

Yeah, sure. I think on the consumer side, one element that we've seen is not just an interest among consumers in the security of their data, but also the reliability of the services that they have come to expect, and so I think you've seen a shift in how the government thinks about the private sector's duties, which is, I think we've kind of moved beyond, frankly, some of the data-focused regulations, and now we're talking about resiliency, availability, etc.; really important services. That's why there's this focus on critical infrastructure. From the government's perspective, there are a lot of actors, and whenever I think about who the key players are in government, I also, because I'm a lawyer, think about what are their authorities? Have they been told to go do this? So we have a few key players. The FBI has been doing this for a really long time. The Department of Homeland Security is now a major player, and I think that creates some friction with the FBI in policy decisions, but DHS has the Cybersecurity and Infrastructure Security Agency, which has been given a bunch of new authorities from Congress. They're supposed to be kind of the belly button of the civilian critical infrastructure government relationship, but you have these other regulators out there that layer on top, from the Federal Communications Commission to the Federal Trade Commission to the Transportation Security Administration, EPA, the securities regulators. They are all doing things on cyber, and I think everyone in policy world needs to think about what are they doing, how does it relate to all the other activities, and what are their authorities. And then finally, another huge player in this space is the state legislatures and regulators. Because, as Drew mentioned, there's not an overarching federal privacy or cyber regime right now, the states are kind of off to the races and creating some additional complexities. California is one, the New York Department of Financial Services is another, but there's a whole bunch. So that's kind of the, as you like to call it, Sasha, the who's who in the zoo. From my perspective, on the federal side, it is an ever-expanding set of characters, unfortunately.

Sasha O’Connell

Absolutely. Thank you both for your help breaking this all down. While many know about current or pending incident reporting requirements or proposals, unfortunately, I don't think we often stop and think about the definitions, historical context, and real problems or issues we're trying to solve, as we've covered here. Because we want to take some time on this topic to dig into some of the proposed interventions more specifically, we're going to reconvene for another episode of Start Here to do just that. Thanks for joining us. We look forward to seeing you next time when we continue this conversation, and please be sure to follow the link to our website available in the show notes to access additional resources on this and other related topics.

Episode 3 - Incident Reporting Part Two

In the third episode of “START HERE”, Sasha O’Connell, Drew Bagley, and Megan Brown continue the discussion of cyber incident reporting, digging deeper to discuss the main aspects of proposed mandates and new government approaches. This episode addresses the state data breach reporting landscape and new laws like the Cyber Incident Reporting for Critical Infrastructure Act, as well as new rules at the Securities and Exchange Commission. Sasha, Drew, and Megan discuss hard operational questions, including whether reporting should be public or confidential, timelines for reporting (and tradeoffs of speed versus accuracy), and how reporting mandates can put victims at further risk.

Key Questions

  • To whom should reporting be required?
  • How can we achieve uniformity in reporting regulations?
  • Should reports be public or confidential?
  • What should the timing standard be for reporting?
  • What data is used to decide reporting standards?

Transcript

Sasha O’Connell

Welcome back to Cyber Policy Fundamentals, the Start Here series. My name is Sasha O'Connell and I'm a Senior Professorial Lecturer at American University, and I'm joined by my colleagues Drew Bagley and Megan Brown from CrowdStrike and Wiley Rein, respectively. Today we're going to continue the conversation about Incident Reporting Policy. If you're looking for history and context on this topic, I would recommend a listen to Episode 2 first, where we dig into those issues, because today we're going to jump right into policy choices currently under consideration. And with that Drew, can you kick us off?

Drew Bagley

Sure. Well, today, there are state data breach reporting obligations in all 50 states. There is no federal equivalent, so no federal standard that supersedes those 50 independent laws. Now, those laws obviously have developed over the course of a similar time period, so many of them are similar in their requirements and in how they define what type of data needs to be reported. But then layered on top of that, both in some states and certainly at the federal level, are sector-specific reporting obligations, and that's where the definitions of what needs to be reported change, but also the thresholds of when something needs to be reported. In other words, you could have some sort of incident where you have a data leak, data loss, data breach, but it doesn't rise to the threshold of what needs to be reported, and that is something that varies greatly with different laws. And then on top of that, we now have seen newer requirements through the rulemaking process, such as by the SEC, for publicly traded companies to report material cybersecurity incidents. That means that you have some requirements where the duty is to either tell a regulator or directly inform a victim of some sort of data breach or cyber incident, and you have a newer requirement where there's a duty to tell the world, and so that's a bit different as well.

During this time too, what we've seen is, because you have 50 different state regulators and then different government agencies that regulate different sectors, even how you report a cyber incident or data breach is different, in terms of what form you fill out, what information is required, who you send it to, and when you need to turn over that information and do that notification. So, for example, something like HIPAA, going back to the early days, that standard is still in place, and that's a 60-day notification window, whereas there are now some that are as short as 24 hours. In addition to that, we have seen some discussions in Washington and some efforts to try to even picture what harmonization would look like. So, for example, if we take one of the more recent requirements, CIRCIA, the Cyber Incident Reporting for Critical Infrastructure Act, one of the things that law created, in addition to new mandates for critical infrastructure, was something known as the Cyber Incident Reporting Council. That council was chaired by DHS and recently came out with a report. What the CIRC was focused on was coming out with a report on what harmonization could, should, and would look like in terms of breach reporting, so that even if there were agencies that have completely different remits and completely different equities, there could at least be a way in which a victim could report to one place and have their data then sent to the appropriate venue, or venues, depending on how they were regulated.

So we do see certainly an appetite for harmonization, but no clear path at this time.

Sasha O’Connell

So, there are 50 different versions of this, plus the new requirements coming out of CIRCIA and the SEC, addressing critical infrastructure and publicly traded companies, specifically. And this is all a bit of a jumble in terms of the specifics, and we could use some harmonization, perhaps through federal law, but that doesn't seem to be happening. Do we know what the issue is here in terms of getting this all aligned?

Megan Brown

Yeah. I think we've identified at least six questions that have come up in the development of the law that Drew was describing, CIRCIA, that Congress had to grapple with, but also that DHS and CISA are going to have to resolve, and they're resolving them right now in many respects as they implement that law. But the first big question we stumble upon, I think, is to whom should reporting be made, if you're going to have mandates? And one major policy debate underlying that legislation was the role of DHS versus the FBI. Sort of, does a mandate to report to CISA undermine voluntary reporting to the FBI, or should the FBI have some sort of more robust role? And I think that's a hard question. Congress resolved it in CIRCIA: major incidents are going to be reported to DHS and CISA. Related to that, should regulators get the information that is reported? Should they have their own mandates? Should they rely on DHS, or should DHS keep that information kind of siloed for operational uses and not use it for regulatory enforcement type things? And related to those questions: does the agency receiving reports, whether it's TSA or the Environmental Protection Agency or DHS, have the capacity and capability to do something meaningful with the reporting? Those are kind of key policy questions about that first question, which is, to whom should reporting be required?

Sasha O’Connell

And Megan, what's your thought just on that, to finish up "to whom"? Right? Drew also raised this question of public versus confidential. Right? That if you're reporting, is that report protected in that way? Or, again, we saw this with the FCC and maybe some other things coming forward, that requirement to make it public too. Right? That adds another complexity.

Megan Brown

I personally think it does add a lot of complexity, and I think policymakers are grappling with that. We saw in the regulatory process at the Securities and Exchange Commission basically the private sector crying out for less public reporting, or at least less fast public reporting, and we'll talk about timing shortly. But this question of, should you report to a law enforcement agency who can operationally help find bad guys, or are you reporting publicly for some other purpose? It goes back to that threshold question Drew identified at the outset, kind of what is the purpose of the reporting? And that is going to be an evergreen question. Whenever a reporting mandate is being discussed, I think policymakers and regulated entities need to think about public or confidential. What are the benefits and tradeoffs of each of those?

Sasha O’Connell

Okay. So, we've got to whom – and we've got a bit of discussion about then what the receiving entity does with it or doesn't do with it. Right? So, and I interrupted you there, but what about the question of what to actually report? Right? And then that triggering mechanism, what triggers it and the timing? I think those are the outstanding issues.

Megan Brown

So, I think in terms of what you report, this is going to be a real challenge for regulators across settings: how detailed do you want reports to be, and what is the purpose of the information you're demanding? Right.

I think there can be a tendency to want more and more information because someone might find it useful or interesting, but I think policymakers need to ask themselves, what is that trade-off about? The more information you demand, the harder the report is going to be to do quickly and accurately. And then what is going to be the use of that information? And you've seen in a lot of comments the private sector encouraging actionable, timely information. And is the government going to do anything with it that will help the private sector, or are they just pulling in a lot of information that may not actually be actionable? I think Drew had some thoughts on some prior legislation.

Drew Bagley

Yeah. What we've seen certainly is that there is a government interest in knowing, especially at a macro level, not a trend level, what sorts of threat actors are posing a threat to companies in the United States, as well as government in the United States and victims as a whole, so that they can understand where government resources are appropriate to address those and where victim self-help is the most appropriate way to defensively protect against those. And so, years ago there was legislation passed that went into effect in 2015 that allowed, and even created conditions to encourage, the sharing of certain information with the government, with DHS, so cyber threat indicators and defensive measures.

But what we've seen is that that's specific, of course, to sharing with one agency, and there certainly is always skepticism from the private sector in sharing certain types of information with government agencies, because government agencies, of course, depending on the agency, also have the ability to impose fines against the company. They have the ability to enforce criminal penalties and whatnot, or just even raise costs by creating more process in the form of more information requests and subpoenas, and the like. So, that's where I think it's really important, to Megan’s point, to be really thoughtful in terms of what is actually necessary in terms of information sharing. How's that information going to be used? But also, what is the benefit for the victim organization sharing that information? Is it that over time, they're going to get some information back that helps them be more secure? Is it that there's some sort of benefit from an immunity perspective with regard to what else could happen with that information being shared and whatnot? So, that's where those sorts of incentive structures are really at work in this realm.

Sasha O’Connell

That makes sense. Before we sum up, let me throw back on the table this idea of timing too. In addition to everything that's been put out there for consideration, Congress seems to like this idea of 72 hours for reporting. Any quick thoughts on that? I know you guys have been involved in real-life incident reporting situations. Give us some context for 72 hours. What's happening in response? Is that reasonable? And where do you see the conversation about timing going? Megan, I know you've got thoughts on this one.

Megan Brown

So, yeah, Sasha, there are various timeframes being adopted and considered for incident reporting. Some are really in contrast to existing state data breach laws that may provide that you have to notify a state AG in a “reasonable time” or “as soon as practicable.” We're seeing now more rigid and shorter time frames. Congress seems to like the 72 hours. That's in the new incident reporting law for critical infrastructure. I find 72 hours to be a bit arbitrary. Why that number? Did anybody study the benefits of that as opposed to another period, perhaps longer? The other federal proposals that are on the table right now range from 8 hours for federal contractors, which I think is frankly wildly unrealistic. Some regimes require 24 hours from some other agencies. The FCC, for its part, just recently adopted a data breach reporting rule, and it requires reporting “without unreasonable delay and in no case later than 30 days.” So you can see there's a lot going on in this space, and the Cyber Incident Reporting Council that Congress set up at DHS to look at these issues about different incident reporting obligations issued a report in 2023, which hopefully we can have in the resources with this podcast; it identified 45 different federal incident reporting requirements administered by 22 federal agencies, and that's just the federal ones. Other countries are also adopting some pretty aggressive and unrealistic time frames that affect critical infrastructure, and some of those are less than 24 hours, and that's really hard for companies that are multinational. The challenge, as you've alluded to, is that at 72 hours after you've decided that an incident has happened and you have spun up an incident response team, Drew does this regularly with his clients, you are probably still in the very early stages of figuring out what happened, trying to figure out what your recovery plan looks like. Do you have backups? Are they accessible? Have you figured out where they are in your system if there's persistent activity? You might not have even started to think about kind of root cause analysis. From my personal experience, 72 hours is very fast because there are a lot of unknowns, and you're going to be expecting companies to tell the government things. People care deeply about being accurate when they're speaking to the government. So, it means that you are involving lawyers, and you are taking some time away from your incident responders to validate what you're going to tell the government, and that's just a real challenge. So, from my perspective, 72 hours is real fast, but that's what we're going to be stuck with for certain reporting regimes. And I really hope the government calibrates their – I guess, the government will need to have some recognition of tradeoffs. Drew, what do you think?

Drew Bagley

I completely agree. I mean, in those immediate hours, the priority is always going to be to mitigate the risk, whatever it is, just like with any other risk mitigation practice. And so with a cyber incident, part of that is determining whether or not you actually have full visibility into the incident. Oftentimes you might have some indication that there was some sort of infiltration into a computer, but you might not know right away, and for a bit of time, where else on the network the adversary may be.

And, I think that's really important to remember. A lot of these cyber incident reporting requirements are predicated on the notion that a cyber incident is a single moment in time. If we look at some of the more modern trends like data leak extortion, that's certainly not the case, right? A victim can be revictimized over and over again. And so, what I think is much more important than even nailing down a particular time window is making sure that the threshold is right. In other words, 72 hours might not be an impossible standard to meet if that 72-hour clock does not begin until you actually have a handle on the cyber incident itself, you've mitigated the risk, and then can spend those resources on getting a timely report. Whereas the 72 hours can be very disruptive if the threshold for reporting, and when the clock starts, is much more vague and broad, something that effectively means the victim has to report while they're still putting out the fire. I think that's a good way of thinking about it: if your building was on fire, you'd want to put out the fire first, then write up the report on how the fire happened. I think that should really be the focus here. So, if we look historically at HIPAA, there's a 60-day reporting timeline, and the HIPAA security rule has been in place for 20 years, and I don't know that we can say that, obviously, hospitals are no longer under attack or that hospitals are more secure because we have that reporting rule in place, or that the reason the health sector is still under attack is because the reporting requirement is 60 days instead of 72. Right? I don't think either is true.

Sasha O’Connell

It makes sense, and it really speaks to my last question for both of you. On behalf of the listener who is new to this: there's so much to think about, right? We have the history, the definitions, we have this patchwork, the context, efforts at kind of reconciliation and alignment, and then all of these sub-issues to any of the individual policies. In the end, if I were new to this, I'd be wondering, do we know what works? To your point, Drew, right? Are there cost-benefit analyses that have come out? Do we have data on kind of what works given the different policy objectives? Megan, you do this all the time. Is there a place to go to figure out what works? What did Congress use when they were deciding these things for CIRCIA, for example?

Megan Brown

Well, one thing that I think people should stop and sort of scratch their heads about is the lack of data about the effectiveness of some of the prior reporting regimes, or analysis of the uses of that data and its utility. I don't believe there has been an overarching review of what is good and bad in existing reporting. We have not just HIPAA, which Drew mentioned; we've had several years of Department of Defense mandatory reporting under some clauses there that are fast, and it's one of the places the 72 hours came from. And I don't know that folks have looked back and said, has good information come from that? Has DOD been able to help the private sector with that information? Likewise, the 2015 law Drew mentioned, which was about voluntary sharing of cyber threat indicators, a recent report came out from the intelligence community that suggested that that's good stuff, it's effective. But Congress, I don't think, looked at those precedents to say, how can we build on what has worked and improve what hasn't? They kind of moved to this CIRCIA law assuming that reporting will be beneficial. I think that's a policy blind spot for some people, to really think about what is the past experience with these regimes? What can we learn from that? Because I don't know that the data would support a 72-hour threshold as being particularly beneficial, but that's what we've got in the law.

Sasha O’Connell

Perhaps a great project for any researchers and academics in our listening community to take on and help our policymakers with. Drew, any thoughts on that before we wrap?

Drew Bagley

Sure. I think with some of these reporting requirements in modern times, we've seen that some regulators somewhere came up with a theory behind a number or picked an arbitrary number, so 72 hours, for example, if you look to the European Union and the GDPR, and then others, for purposes of pure consistency, have piled on, without going back to ask those very questions that Megan has pointed out, as to whether or not that number makes sense to begin with or whether we should all have a new standard. I think that's really important. I think the other thing is, some of the information that's out there and been reported, it might be that that information could be really useful if certain actions were taken with that information. So, for example, when the private sector is voluntarily sharing information with the government, it might be that the government could use that information in actionable ways to disrupt e-crime actors and their infrastructure. And so, for some of these, we might think of the reporting and we might not think of an equivalent sort of action that could be taken, and for others, we might say, oh, actually, the part that's missing here is the impetus to use this data in the sort of way that would benefit victims as a whole. So, I always think it's really important to ask, where is there a realm where the government is uniquely situated to do something in cyber, versus where is it that victims need to improve their own defenses, get better about protecting data, be better about transparency, about how they're protecting that data? So, I think that we should never assume that there's any sort of single silver bullet solution for any of this.

Sasha O’Connell

Perfect. I think that just about sums up our rundown on incident reporting. Right? The complexities, the history, and certainly the activity that's ongoing here make it super relevant. So, with that, we're going to wrap our episode. We hope everybody visits us on the Start Here website, and the link will be in the show notes, to see additional resources and to access the transcript of this episode for reference. We hope to see you next time, when we are going to tackle ransomware and other extortion challenges. So, Drew, Megan, thanks so much for joining me, and we'll see you next time.

Episode 4 - Ransomware Part One

In the fourth episode of "START HERE," Sasha O’Connell, Drew Bagley, and Megan Brown delve into the alarming world of ransomware and extortion schemes. This episode addresses the evolution of ransomware and the approaches that organizations often take in response to such threats. Sasha, Drew, and Megan discuss how ransomware affects operational technology and the Colonial Pipeline ransomware attack of 2021.

Key Questions

  • What is ransomware?
  • What is operational technology and how is it affected by ransomware attacks?
  • What goes into the decision to pay ransoms?
  • What was the attack on Colonial Pipeline? What can we learn from it?

Extra Resources

Transcript

Sasha O’Connell

Welcome back to Start Here. In this series of podcasts, we are working to give you, our audience, a framework for analyzing foundational cyber policy questions. In our previous episode, we looked at the policy context, history, and choices surrounding the potential to mandate cyber incident reporting. I'm joined today to follow up on that conversation by Drew Bagley, Vice President and Counsel, Privacy and Cyber Policy at CrowdStrike, and Megan Brown, partner at Wiley and co-chair of the firm's Privacy, Cyber and Data Governance practice, to take on our next topic, that is, to break down and explore ransomware and other extortion challenges. Before we start, or maybe actually, this is a great starting place for providing context for a discussion about ransomware and extortion schemes: let's talk briefly about how you both think about how cyber policymakers should prioritize issues when the underlying technology or adversarial tactics change so quickly.

Megan, what do you think?

Megan Brown

So I really think this is an important topic that highlights a fundamental question for policymakers, and it applies not just to ransomware and extortion threats, but to other questions across the cyber landscape, across the privacy landscape and others, which is, you know, how should policymakers think about particular types of threats when we see over and over, in the past two decades at least, that threats and tactics change? So, I think about it in terms of, should Congress or federal agencies be kind of running after specific threats that might be in the news, or should they instead focus on some fundamental principles that might have wider applicability and might be more durable and don't become quickly obsolete?

Sasha O’Connell

Drew?

Drew Bagley

Ransomware is a great example of Megan's point about policymakers sometimes chasing either a bespoke problem or a timely problem of today, but not necessarily one that's going to be a problem that we still need solved the same way in three years, or where the way we're solving today's problem is going to solve the future problems. So, when you think about ransomware, it's obviously been all over the front pages for the past several years. However, even when we look at that phenomenon, adversaries have actually changed their tactics in recent years. For example, data leak extortion is now becoming much more of a threat than ransomware in and of itself. By analogy, if you think about it, malware was the hot topic a decade and a half ago, and, therefore, there were a lot of efforts spent trying to solve the malware problem, both from a technology standpoint and from a policy standpoint. You fast forward to today, and there are still certain standards that were designed to address malware, whether or not a file on a computer is malicious, something binary, and yet today, what we see is that actually 71 percent of attacks don't use malware, and even with the data we've seen at CrowdStrike, 80 percent of attacks are actually using legitimate, real credentials. And so just with that example alone, tactics have changed, and therefore, the policy means to address them needs to change too.

Sasha O’Connell

That makes sense. So just to summarize the context: perhaps ransomware is a version of something that continues to evolve, and thus, for policymakers, the important thing is to focus on key principles, is what you guys are saying, not so much on this specific kind of implementation or specific threat activity of the day, and tailoring interventions more broadly and less specifically makes a lot of sense. With that context in mind, and you started this, Drew, can I ask you, let's start on ransomware. What is it?

Drew Bagley

CISA and the FBI's Joint Ransomware Task Force has a great definition for ransomware. They define it as a form of malware designed to encrypt files on a device, rendering them and the systems that rely on them unusable, and the reason why I like that definition is that the definition is not specific to what an adversary might do after encrypting the files. So in other words, an adversary may use ransomware for its namesake, locking up files and offering to release them for a ransom, or conversely, the adversary can use ransomware for purely destructive purposes, and we've certainly seen that before, not necessarily asking for something monetary in return. This is helpful in understanding that ransomware is only one form of a broader trend of extortion. For example, in today's trend of data leak extortion, ransomware may be used in addition to file exfiltration, or pure file exfiltration may take place without the use of ransomware. In other words, use of ransomware is no longer as simple as an adversary locking up files and asking for payment and then potentially unlocking those files in exchange for payment. This means ransomware may be tied to double extortion, asking for a payment to unlock files and subsequently asking for a payment not to leak or further disclose information. So, I believe the ransomware definition is very helpful, but it's also important to remember that extortion is actually the broad policy problem that we're trying to solve, and ransomware is just one part of it.

Sasha O’Connell

Megan, from the kind of victim perspective, can you talk us through what people see? I assume that now we're in this era of ransomware as a service, we're not still seeing requests to mail checks to PO boxes. Can you talk a little bit about what it looks like on the receiving end? I know you've worked with clients in that situation.

Megan Brown

You know, the paradigm example is an employee tries to log on to their account or workstation and a banner pops up. “Hi. Your network has been penetrated and your files have been encrypted. To recover your files, send a hundred thousand dollars in Bitcoin to the following address.” Or, you know, the security team gets reports that databases just aren't available, and when they start investigating, they might find a ransom note embedded in other systems. We've seen artifacts like that in the past. Or you could have a note sent to executives that says, we have encrypted your files, here's a screenshot of a file tree, and to get the keys to unlock your data or your system, you have to go to this website, which is on the dark web, on, like, Tor, and prepare to pay us $10 million in digital currency. Drew's team at CrowdStrike has put together some examples that they make available on the web, so maybe you can look at some of those on our resources page. Some of them are very sophisticated. Some of them are full of typos and sort of seem like something Saturday Night Live would put out, or The Onion. It varies, but that's kind of the game, and then that sets off a whole array of choices and things and playbooks that have to get executed.

Sasha O’Connell

And what are organizations thinking about? What does this mean in those playbooks? What are people balancing and thinking about when that goes down in an organization?

Megan Brown

Well, I've seen an array of reactions and challenges, and Drew's probably seen it far more, but every hour that your system is inaccessible, you either can't conduct a part of your business or you can't provide some service, hopefully, not all of your service, or you're wondering about what data has potentially been exfiltrated and this happens to retailers, hospitals, small businesses. If you're a medical facility or a doctor's office, for example, you might have to try and work with backup systems or figure out what paper records you have. If the incident is affecting your operational technology, maybe a factory has to shut down or critical services have to be paused because the security team doesn't know the extent of the intrusion and so they're trying to triage, how do we contain the damage and then think about remediation or getting backups restored.

Sasha O’Connell

Megan, let me actually ask you to just pause for a second. Can you say a little bit more about what you mean by operational technology in this context?

Megan Brown

Yeah, sure. Great point. Operational technology is different from information technology, and as government regulators are thinking about regulation here, they're sometimes distinguishing between operational technology, OT, and IT. There aren't clean definitions that are universal, but you can generally think of operational technology as what makes a service operate or go. In the Colonial Pipeline example, it was making sure that oil could get where it needed to go. It makes trains run on time; cranes operate. By contrast, IT is more about the business systems in the back end, email and billing. Coming back to the playbook question: sometimes, you need to consider whether you have to pay folks who claim they have your info so that you can actually determine what data has been accessed. The example I mentioned of, like, a screenshot of a file tree, you might not know what is in all of those files, and so you might need to buy some data back to figure out, you know, have I triggered my state data breach notification laws? Do I have to call so and so? Like, a whole bunch of things, and, fundamentally, the threat actors here are making money from that uncertainty, which forces victims to feel like they have to pay either to restore service, to examine the damage, or to be able to function and run their business.

Sasha O’Connell

Thanks, Megan. That is so interesting. I never really thought about the assessment piece, the need to potentially pay to even know what was lost, taken, or frozen. Drew, I'm going to ask you about the payments piece and how that works functionally; but before we get there, can you say more on this need to assess, or pay to even know what was taken? Are there other implications or considerations?

Drew Bagley

Absolutely, Sasha, and actually for listeners of this episode, this touches upon a lot of the topics we discussed in the incident notification episode, because whether it's a cybersecurity mandate or a privacy law mandate, a lot of these breach notification laws actually require notification to a regulator or even individuals if the impact is great enough. So in other words, a victim really needs to have enough information about what that impact is. There are going to be situations in which a victim does have visibility into their systems, so even if they had some sort of ransomware incident or data extortion, they might actually know what was exfiltrated. But in other cases, to Megan's point, a victim might not have any of that information. In other words, they might not have any endpoint detection and response data, any log data, or anything, and therefore, they might need to learn a bit about what the adversary may have. Similarly, a victim may need to know which type of adversary this is and what they traditionally do, and that's where a victim may want to analyze some sort of threat intelligence to understand who they're dealing with and what that impact may be. As for payments, sometimes ransomware payments use other methods, for example gift cards, or other payment providers like MoneyPak, Paysafecard, and whatnot. But, traditionally, cryptocurrency is the preferred method of payment by adversaries, and the way in which that works, when a victim organization does indeed want to pay or to have the option of payment, is that they need a Bitcoin wallet. And so in recent years, what you've seen is that rather than victims going and setting up their own Bitcoin wallet or other cryptocurrency wallet and transferring funds to it, they might go through a third-party company that specializes in doing those types of payments and that holds wallets in a variety of different cryptocurrencies, and then, also, that company usually would be the intermediary that would help negotiate. Whereas on the back end, the victim organization would be working with their outside counsel in weighing their different options and also in checking to make sure that they have enough information to make a valid judgment about whether or not their payment would likely violate any sort of sanctions law; because even if you are a victim of ransomware, you're still subject to different considerations, such as with regard to money laundering or terrorist financing, for example. So there are AML/CFT, anti-money laundering and countering the financing of terrorism, restrictions, and you have to ensure that you're not inadvertently paying money that would go on to finance terrorism or to facilitate the laundering of money, and you have to watch out for bans on payments to certain sanctioned entities. So that's an important consideration in the payment process, but the way that it works in practice is that there will usually be some sort of communications portal set up by the adversary. So the adversary will provide a URL that the victim can navigate to, and that's where there will be some sort of online chat, and that's where, back and forth, the victim or the entity negotiating on behalf of the victim will be interacting with the adversary, and then also that's where they'll get the information about which wallet to send money to.
Now anytime you're doing that, what you're doing is, of course, weighing that risk against the risk that you never get your files decrypted, because there's certainly no guarantee that an adversary is going to decrypt your files just because you paid them. And in fact, you might be opening the door to being revictimized. On the other hand, sometimes that is the only option for a victim. And interestingly enough, another thing that's happened as data volumes have grown is that the process for decrypting files now takes a lot longer. So oftentimes, the victim is also weighing how long it would take to decrypt their files versus how long it would take to recover from a backup, because recovering from a backup may take a long time, and decrypting files may take a long time. So those are sometimes considerations being weighed too in these situations.
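To make that weighing concrete, here is a minimal sketch in Python. Every figure in it (hours, costs, success probabilities, the outage cost per hour) is a made-up assumption for illustration only; it is not guidance on whether to pay a ransom, and it is not how any real incident response team models the decision.

```python
# Illustrative only: a toy version of the trade-off Drew describes, weighing
# restore-from-backup against paying for a decryptor. All numbers are
# hypothetical assumptions, not real-world estimates.

from dataclasses import dataclass


@dataclass
class RecoveryOption:
    name: str
    estimated_hours: float      # time until systems are usable again
    direct_cost_usd: float      # ransom, rebuild labor, consultants, etc.
    success_probability: float  # chance the option actually restores the data


def rough_expected_cost(option: RecoveryOption, outage_cost_per_hour_usd: float) -> float:
    """Direct cost plus downtime cost, inflated by the chance the option fails
    and the organization has to start over with something else."""
    downtime_cost = option.estimated_hours * outage_cost_per_hour_usd
    return (option.direct_cost_usd + downtime_cost) / max(option.success_probability, 0.01)


if __name__ == "__main__":
    options = [
        RecoveryOption("restore from backups", estimated_hours=96, direct_cost_usd=500_000, success_probability=0.90),
        RecoveryOption("pay and decrypt", estimated_hours=72, direct_cost_usd=2_000_000, success_probability=0.60),
    ]
    for opt in options:
        cost = rough_expected_cost(opt, outage_cost_per_hour_usd=20_000)
        print(f"{opt.name}: rough expected cost ~ ${cost:,.0f}")
```

Even in this toy form, the point from the conversation comes through: the answer depends heavily on inputs the victim may not know with any confidence in the first hours of an incident.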

Sasha O’Connell

Thanks. That's super helpful in thinking about how this plays out. Before we pivot to thinking about policy intervention options for this persistent challenge, Megan, can you talk a little bit about Colonial Pipeline? It's probably one of the most high-profile examples people are familiar with. What were the decisions, and how did they play out for that organization, as far as we know?

Megan Brown

Sure. I'm happy to, and, of course, there's tons of publicly reported stuff. The CEO has been really transparent. But back in 2021, some bad guys, some hackers that I believe the government is confident are associated with the Russian Federation, infiltrated the computer networks of Colonial Pipeline. They demanded more than $4 million in ransom. The company shut down the pipeline operations, which I would consider to be the operational technology side of things as opposed to the IT side of things, and that's a technical concept that policymakers struggle with, drawing those distinctions, but the hackers didn't get in and shut down the pipeline operations themselves. So, it's really interesting how, in the heat of an unfolding attack, you can see the decisions that have to be made against a backdrop of significant uncertainty in the Colonial Pipeline situation. The CEO has described the decision process and the timing both to Congress and to the press, and based on that public sourcing, they detected the attack just before 5 am on a Friday, when an employee found a ransom note on a system in their IT network, and as we discussed a moment ago, IT is the information technology side, billing systems, back end stuff, not the pipeline operations. That employee notified the operations supervisor, who is in sort of the pipeline control center, and the CEO has described sort of a broad stop-work authority that allows the pipelines to be shut down quickly if there's concern about safety; and here, they didn't know. They didn't know what was really happening, and they were worried that there had been access to, or there could be future access to, their operational technology, their actual pipelines. So they put in a stop-work order to halt those operations. That was done to contain the attack and help ensure that any malware didn't get into their OT systems if it hadn't already, and that happened, as I understand it, within an hour; employees began the shutdown process, and then within 15 minutes, over 5,000 miles of pipelines had been shut down. The CEO, even though he faced some criticism for it, you know, he has said that it was an extremely tough decision, but he made that decision. They made that decision, and then, as folks know, they paid the ransom. It took them, I think, around 6 days to get back online, and so you can see there both the important differences between sort of that IT, OT issue, but really this broader uncertainty, that decisions in a rapidly unfolding ransomware attack have to be made with really imperfect information.

Sasha O’Connell

Well, we've certainly covered some ground here in terms of the context, sort of depth and degrees, and implications of both ransomware and other associated extortion schemes and, using examples like Colonial Pipeline, it's easy to see these complexities in action and certainly relevant to our day to day.

We're going to pause here and regroup for our next episode where we're going to talk more about what policymakers can and are doing about this challenging issue. Thank you for joining us, and we look forward to seeing you on that next episode; and in the meantime, please don't forget to check out our associated resources in our Pause Here section of our website, where the link is, of course, available in the show notes. We'll see you next time.

Episode 5 - Ransomware Part Two

Join Sasha O’Connell, Drew Bagley, and Megan Brown as they embark on a journey to demystify cyber policy, addressing the vital gap in understanding and communication that hinders effective policy development. Through engaging discussion and expert insights, this episode serves as your gateway into the intricacies of cybersecurity public policy, specifically focusing on ransomware and policy approaches.

In this episode, you'll discover:

Understanding the Ransomware Economy: Delve deeper into the complexity of the ransomware economy, exploring its various facets and implications for policy development.

Regulating Ransomware Attacks: Explore policy approaches to regulating and mitigating ransomware attacks, considering the multifaceted interests that must be balanced by policymakers.

Challenges and Considerations: Examine the challenges inherent in crafting effective ransomware policies, including the need to address divergent interests and perspectives.

Solutions and Strategies: Gain insights into identifying real problems posed by ransomware and formulating practical policy solutions to combat this evolving threat.

A Holistic Perspective: Benefit from a unique perspective that integrates academia, private sector expertise, and real-world policy application to offer a well-rounded view of ransomware policy.

This episode is just the beginning of a series designed to equip current and future policymakers with the knowledge and tools needed to navigate the complex landscape of cyber policy, with a specific focus on ransomware. Don't miss out on this essential resource—follow and subscribe to stay updated with the latest episodes and insights.

Key Questions

  • Why can’t we just “shut down” the ransomware economy?
  • What regulation exists in the United States on ransomware?
  • How effective is paying ransoms?
  • How are organizations handling ransomware attacks?

Transcript

Sasha O’Connell

Welcome back to Start Here. We're working to give you a framework for analyzing cyber public policy questions. In our previous episode, we looked at the policy context and technology and impact associated with ransomware and other extortion schemes. I am so pleased to be joined again today for a follow-up on that conversation by Drew Bagley, Vice President and Counsel for Privacy and Cyber Policy at CrowdStrike and Megan Brown, Partner at Wiley and co-chair of the firm's Privacy, Cyber and Data Governance practice. We're going to take on today options facing policy makers regarding this really challenging problem. So, let's get right to it. Drew, why can't we just shut down this sort of economy that's driving this? Why can't we go after these bad actors who are doing this as a service and close this business down? Is it more complicated than that?

Drew Bagley

This is one of those situations where it's just about as complicated as it gets when it comes to disrupting the adversaries behind this. So, to break this down from a policy standpoint, we obviously already have disincentives in place with regard to criminal penalties for those who are engaging in criminal behavior, which would be inclusive of ransomware, of course, but nonetheless, that doesn't mean that that solves the problem, of course, right? Ransomware and other related forms of extortion, such as data leak extortion, are gaining traction, and so policymakers really need to look at what else they could do to disincentivize this sort of behavior. What can they do to incentivize would-be victims into bolstering their defenses ahead of time and not waiting until they're a victim to respond to these sorts of attacks, and what other sorts of IT hygiene could be incentivized right now? So, for example, could victims, in addition to implementing better cybersecurity, actually be incentivized to have better backups in place? And similarly, when there are ransomware attacks or other sorts of extortion campaigns, policymakers can look at whether or not the status quo is sufficient in terms of getting all of the information that's out there on what the adversary's infrastructure actually is and being able to disrupt that infrastructure and take it down. And so I think that's an important framework for policymakers to look at when drawing a conclusion about what they can do about this. Whereas the debate we've seen, I think, has really focused instead on forms of payment, whether or not payments for ransomware should be allowed or not. So in other words, whether or not victims should have that option, and while there's a legitimate debate there, I think there are many other aspects of this problem, and it's something that policymakers need to look at holistically, like they do with other problems.

Sasha O’Connell

So it sounds like what you're really laying out when it comes to policy options here is, first of all, it's not just about payments. This question of outlawing payments seems to be a real hot topic today, but, as you said, Drew, that's not really what it's about. There's a whole set of possibilities around disincentivizing this kind of behavior, disrupting this kind of behavior, and then there's a whole set of choices around how we incentivize good hygiene, either at an individual level or at an organizational level. So in that laydown, it sounds like there's still a lot of responsibility on the part of users or the private sector to sort this out, but potentially some role for government. Where is the government on this, Megan? Where are things right now in terms of what they're trying?

Megan Brown

People, you know, sometimes roll their eyes when the government indicts bad actors overseas, because they scoff that those actors can't actually be captured and put in jail. I still think there's a process value to the government pursuing the bad actors even in absentia, to send the message and push those international norms that, like, you should not be housing these open and notorious criminal enterprises in these dark corners of the world. That said, that's sort of a different bucket of issues, which is what the United States government is capable of doing offensively outside of the country to disrupt some of these actors. I'm going to park that as a totally interesting and separate discussion. There is additional regulation and movement coming to try and change the incentive structures here. Unfortunately, I think a lot of that is targeted at the victims themselves, but that is who the government has in front of them to regulate, because they can't go after the bad guys hiding in the dark corners of the world of the Internet. So you've got mandatory reporting. Congress in 2022 passed the Cyber Incident Reporting for Critical Infrastructure Act, which is a mouthful, and the rules for that are currently being developed by the Cybersecurity and Infrastructure Security Agency over at DHS. That's going to be a big deal. Congress has said if you get hit by a ransomware attack and you pay a ransom, you better show up and tell the government. They want to be collecting that information; they want to be collecting that information quickly. That is new. There is currently not a broad prohibition on the payment of ransoms, nor is there a broad reporting obligation. You might have reporting obligations if a ransomware event qualifies as some other kind of cyber event, but the nature of the attack does not currently drive it. You've got a lot of voluntary work that goes on, which I think is really important and laudable not to lose sight of. Many times, the first piece of advice I'm giving to clients when they have a situation is call the FBI. Call the FBI because they might know something; call the FBI because if you end up making a payment, you want to show that you worked with the FBI. Call the FBI. That is frequently, but not always, the advice. The Secret Service is also involved in some of these issues. You've got DOD and the intelligence community; some of these actors are nation state. There's new mandatory incident reporting that was a direct response to the Colonial Pipeline attack; they started with a security directive toward the pipelines, and now they've expanded to other critical infrastructure. The Securities and Exchange Commission has new rules that have come online for reporting of material incidents by publicly traded companies. You've got state government activities. The New York Department of Financial Services has robust cyber rules. They also have issued specific guidance to their regulated entities, and here we're talking insurance companies, people offering financial products, debt products, etc., warning them about double extortion. I think we're going to try and drop a copy of that in our resources for the show. And then, as Drew said, you've got Treasury, which administers our sanctions program, which is a strict liability regime. If you accidentally pay a terrorist group, you can go to jail. So they take this stuff really seriously, even though you're just trying to get your data back online.

Sasha O’Connell

In that context, it's so complicated. Even from the FBI's perspective, they're kind of threading that needle where, on the one hand, they absolutely do not support the paying of ransom. As we talked about, they argue ransom payment not only encourages and furthers the business model, but it often, as you guys have both said, goes into the pockets of terrorist organizations or money launderers or rogue nation states. But on the other hand, I think the AD even testified that the FBI is not encouraging Congress to make it illegal because you're then, as you just said, Megan, sort of creating this double extortion risk. I think his quote was, if we ban ransom payments now, you're putting US companies in a position to face yet another extortion, which is being blackmailed for paying the ransom and not sharing that with authorities. So you can see, in that FBI position alone, the sort of complexities for the government of, on the one hand, not wanting to feed this ecosystem. Megan, on the government side, banning payments, voluntary measures, what else is being discussed here?

Megan Brown

There are proposals that pop up from time to time to try and ban most extortion payments. Deputy National Security Adviser Neuberger said in May that they're grappling with that, and there have been a couple of large reports on these topics, so I think the government recognizes those tradeoffs, but there are voices who want to go ahead and ban it and believe that there would be some short-term, hideously painful consequences of that, but that it's the only way to fix the problem. I think those are hard decisions to make. It's hard for me to tell a hospital, go out of business, or let patients' health outcomes get affected, or to tell a small business that has been around for years, you just go out of business; I'm sorry. That's just, I think that's tough, because we don't ban the payment of ransom in other settings. If there is a, let's call it a domestic kidnapping or an international kidnapping, you do not violate the law by paying the ransom unless you violate the sanctions that we mentioned already. So I think it's a big change to consider banning ransom, and I think policymakers have a lot on their plates.

Sasha O’Connell

Absolutely. Drew, what do you guys see at CrowdStrike? I mean, does paying ransom work most of the time at the end of the day? It seems like an empirical question for policymakers that they'd want to put in the mix.

Drew Bagley

Yeah. At CrowdStrike, we really see the whole gamut, with regard to us being brought in at completely different times. Sometimes it's after the fact, so the options on the table, you know, can vary. Sometimes companies are actually in a pretty good position, because even if they were hit with ransomware, they have visibility into their systems and can determine the impact right away and understand that there might not actually be much of a risk; there certainly could be a disruption, but not much of a risk that they have data that will remain inaccessible. Other times they may not have any visibility into that, which can affect their decision-making process. But ultimately, it really becomes something where each individual organization has to weigh its own risk. It depends a lot on what the state of their IT was, what the state of their security was, whether they had any visibility into their systems, what the state of their backups was, and even their industry and what's at stake. So if we're talking about a hospital, then that might be a lot different than talking about a small business that might be selling something and might have to redo an inventory list or something like that. So that's where all of those variables can come into play. From a policy standpoint, I actually think it's really helpful to think about this not as something that is just specific to ransomware but, when I'm listening to some of the examples Megan's giving, to really think about this as extortion and what it is that policymakers can do, and how they can help victims of extortion, cyber extortion, because extortion here can take many forms. So, when we think about ransomware, we're thinking about the encryption of files and then the extortion for payment to get access to those files again. When we're thinking about data that's been exfiltrated, we might be thinking about extortion where data might be leaked, with a greater impact, if a payment is not sent to an adversary. But then we can also think about modern extortion where an adversary is threatening to go to a regulator to report that a victim has faced an incident and not yet reported it, and in fact, the victim might be in a position where it's not yet reportable because the impact has not yet risen to the reporting threshold, and yet the adversary would nonetheless be able to extort the victim. And so I think this notion of extortion is what's really important to address, because the methods to extort victims will evolve, as they have over the past couple of decades; they'll continue to evolve, and that's really the public policy challenge today.

Sasha O’Connell

Yep. That makes sense. And, again, you raise the potential for creating compounding extortion opportunities by outlawing payment or even mandating reporting, which is so important to consider and maybe under-considered when thinking about policy interventions. Megan, before we wrap, can you, for policymakers out there, give us one more time kind of what victim organizations are thinking? What are they balancing when this happens? Because I think that's a really important, sometimes overlooked perspective for people who haven't sat in that chair and are now sitting in the policy chair. Like, are they thinking about sanctions? Are they thinking about company impacts only? Like, what's the structure they're running through that policymakers should keep in mind when thinking about crafting any kind of intervention?

Megan Brown

Oh, I love that question because, of course, I want policymakers to have sat in the seat or at least talked to people who have sat in the seat, but I think from a victim company's perspective, they're thinking about all of those things, Sasha, which is: is it illegal to pay? What are the risks of payment? What are the reputational harms of payment? What are the reputational harms if I do not pay? What are the harms that my customers or employees will face that payment might obviate? If someone's threatening to release all kinds of sensitive health information of your employees, it's perfectly reasonable to think about whether a payment could prevent any harm to them from the public release of that data. Similarly with customers, right, so there's lots of those things. There are questions about what the actual impact on your operations is and how quickly you can diagnose those impacts. Has the data actually been exfiltrated, or are they lying to you? Are your systems encrypted? Are your backups adequate, like Drew mentioned? How quickly can you spin up your backups, because that's not like an instantaneous cutover kind of thing. How sensitive are the systems and data that might be at issue? Those things all factor into the decision to negotiate and a final decision to pay, and those factors change over time. What is right on Monday might change by Thursday, by the time you've done certain things, so it is not a binary thing, and I think policymakers also need to keep that in mind. As well, you know, if you're looking at rebuilding, if you've got a ransom demand for $2 million and it's going to cost you $20 million to rebuild your systems, that may be a pretty simple calculus. As those numbers get closer, maybe you have less of a need to pay, and because it is extortion more than anything else, there is a heavy degree of psychology. I'm sure that on the matters Drew has worked on, there is a lot of psychology here, you know, from the bad actors and others, that you have to try and think about, and that's why it's such an imperfect system. And then finally, you know, I think there are a lot of considerations about whether the incident is already public and what the consequences of that publicity will be. And that, Sasha, relates to a related policy question. So there's the policy question of should payment be illegal? Should payment be approved? Should payment be acceptable? Then there's the ancillary question of should we make you tell on yourself? Should you have to publicly report? And I think there are some really interesting tradeoffs there as well, about, you know, a reasonable reluctance on the part of a victim to disclose that they paid a ransom, because disclosing that information might make other people see them as a good target in the future. And so there are just all of these complicated tradeoffs that I think do not lend themselves easily to new federal regulations, but I think policymakers are really grappling with it. They understand all of that. All the folks at the NSC are well aware of all of that, and I think they are trying to approach these issues with thoughtfulness and humility. Maybe not the best data, because I think that's hard to get, but I think those are the issues I would say I would love a policymaker to just keep in the back of their mind.

Sasha O’Connell

Amazing place to end. I mean, no simple answers here. But, again, like you said, having more information about what we're really talking about, as you guys broke down the technology, the players, the ecosystem, and, I think importantly, the context, that this is a changing threat and we need to keep that in mind. Then also the secondary and tertiary kinds of consequences of any potential policy intervention, as you guys laid out, are all super important. So with that, we are going to wrap this episode. We hope everyone will visit us for future episodes on our website; the link will be in the show notes, and there you will find additional resources on this and previous topics. Thanks again, Drew and Megan, for joining me today, and thanks to all of you for listening. We will see you next time.



Episode 6 - Dataflows

Join Sasha O’Connell, Drew Bagley, and Megan Brown as they embark on a journey to unravel the complexities of cross-border data flows and their impact on global cyber policy. In this enlightening episode, they delve into the benefits of data flows and internet connectivity, highlighting their crucial role in driving innovation, economic growth, and international collaboration.

Key Questions

  • What are the benefits of cross-border data flows?
  • Why is there a need for a global workforce for data flows?
  • What is telemetry data?
  • What does data flow regulation look like in the United States?

Resources

Transcript

Sasha O'Connell

Welcome back to Start Here. In this series of podcasts, we are working to give you a framework for analyzing foundational cyber policy questions. In our previous episode, we looked at ransomware and some of the challenges companies and governments face in trying to stop it. My name is Sasha O'Connell, and I'm a Senior Professorial Lecturer at American University, and I'm joined again today by Drew Bagley, Vice President and Counsel for Privacy and Cyber Policy at CrowdStrike, and Megan Brown, a partner at Wiley and Co-Chair of the firm's Privacy, Cyber, and Data Governance practice.

We're going to take on the next topic, and that is cross-border data flows. As we all know, being online is completely fundamental to our lives. I start every semester with an exercise where I work with students to list the ways we interact with the internet from sunup to sundown, and sometimes, honestly, right, overnight, and it's, of course, every time we do it, a remarkable list. For the purposes of this episode, I think it's really important that we start with the recognition that one of the reasons being online has been so, frankly, incredibly useful and functional and really a necessity in our daily lives is that it's all built on technical protocols that are in fact global and decentralized, and those are also dependent on the ability of major providers to move data around the world at lightning speed. It's what makes all the magic work. Over time, of course, it turns out that that set of global protocols, and in particular, as relevant to this episode, the associated free flow of data, has some consequences that are important to acknowledge and balance with all the primary benefits that it also provides.

In an effort to assert their vision of the correct balance of those two things, the costs and benefits, if you will, governments around the world have started to push for data localization policies and laws. This is a concept that essentially requires companies to store data in particular countries and puts restrictions, sometimes outright bans or export licensing requirements, on moving data around the world. We've even seen some recent moves in the U.S. to limit the transfer of data to certain countries. So we want to talk about it. Should data be treated more like a physical product in international trade, with a whole scheme of rules and requirements for bringing it into the U.S. or sending it out? We picked this topic for Start Here because there are real practical and policy issues at play, including impacts on cybersecurity activities, which we're going to talk about in just a second, when governments limit the movement of data across borders or require in-country storage.

So with that, I'm going to turn to Megan and Drew here. I mean, we talked a bit about the benefits, the free flow of data underlying just about everything we do on the internet. Can you talk a little bit more about maybe some of the other benefits that aren't so intuitive?

Drew Bagley

Sure. Absolutely, Sasha. Yeah. As you noted, there are many economic benefits, from a business perspective, even in the ability to set up a multinational business across borders. There are cultural benefits. But importantly, and I think policymakers often overlook this, there are lots of cybersecurity benefits and even, I would argue, cybersecurity necessities to the free flow of data. So for example, all of the devices that we use today have some sort of unique identifier being generated, or otherwise statically associated with the device, and those unique identifiers are important, because as we're online, that means that if an adversary is attempting to get into our laptops, our phones, or any other device, there is some sort of interchange of data between unique identifiers that might be associated with that adversary and unique identifiers that could be associated with the victim. And those breadcrumbs are really important for cybersecurity, and they can be important in terms of preventing a cyber attack, detecting something that could be adversarial behavior, or even in terms of investigating a cyber attack after the fact. And in fact, these days, there are even cybersecurity mandates backed up by official guidance that call for the use of threat hunting, for example, whereby you would have people 24/7, so, because of time zones, you'd have to have people around the globe, looking at this sort of telemetry data that has these unique identifiers, for purposes of catching things that the technology alone might not catch: finding that hands-on-keyboard activity. There's also use of identifiers in red teaming, where you are asking a security company to come in and try to penetrate your defenses and see how good a job you do protecting against that. All of those things, by the very nature of how you describe the Internet, require some sort of cross-border data flows, and this all exists in an era in which, in some legal regimes, even something as public as an IP address is sometimes categorized as regulated data that should be localized, and so that's where cybersecurity really is affected by data localization.
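As a rough illustration of the point about identifiers, here is a small, hypothetical sketch in Python. The device IDs, countries, IP addresses (drawn from documentation ranges), and event names are all invented; this is not how any particular vendor structures telemetry, only a way to see why pooling identifiers across borders lets a defender connect events that would otherwise look unrelated.

```python
# Illustrative only: hypothetical telemetry records showing why unique
# identifiers matter for threat hunting. All device IDs, IPs, countries, and
# event names are made up for this example.

from collections import defaultdict

telemetry = [
    {"device_id": "LAPTOP-DE-001", "country": "DE", "remote_ip": "203.0.113.7", "event": "login_failure"},
    {"device_id": "SERVER-US-042", "country": "US", "remote_ip": "203.0.113.7", "event": "login_success"},
    {"device_id": "LAPTOP-JP-013", "country": "JP", "remote_ip": "198.51.100.2", "event": "login_success"},
]

# Group events by the remote identifier to spot the same actor touching
# machines in multiple jurisdictions.
by_remote_ip = defaultdict(list)
for record in telemetry:
    by_remote_ip[record["remote_ip"]].append(record)

for ip, records in by_remote_ip.items():
    countries = {r["country"] for r in records}
    if len(countries) > 1:
        print(f"{ip} seen on devices in {sorted(countries)} -> worth a closer look")
        # If each country's telemetry had to stay local, no single analyst
        # could make this connection.
```

The sketch only groups records by a shared identifier, but that simple join across jurisdictions is exactly the kind of step a strict localization rule can prevent.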

Sasha O'Connell

Drew, I know you have a paper out that you co-authored on this topic. One thing you mentioned briefly that I want to go back to is the need for a global workforce and the need to move data around. Can you explain that? What does that mean for CrowdStrike? And what does that mean for the need to move data across borders?

Drew Bagley

Sure, so I co-authored a paper with Peter Swire and several other co-authors that focused on taking the MITRE ATT&CK framework, which is the closest thing you can get to an industry standard for what a cybersecurity framework looks like, and we applied that to various data localization rules. And essentially, an adversary is going to take data across the border anyway, regardless of the rules; they're not exactly rule followers. And yet a defender, if a defender is hamstrung by data localization rules, that means that you could have a defender in one jurisdiction able to look at only a certain set of data, and as soon as there was, let's say, lateral movement within a system, meaning an adversary is moving from one machine to another across the network, then all of a sudden that same defender might not be allowed to look at the data that technically is in another jurisdiction. Where this really comes into play these days is that most attacks use legitimate credentials, as we've talked about in previous episodes, and so if you're talking about identity credentials, and those are personal data, well, there's an interplay going on constantly through various internet protocols to authenticate those identities, and so if a defender is not allowed to look at authentication logs once they cross a certain threshold, then that's very problematic. And essentially, what you have under ideal circumstances, and, again, under official guidance, even from the European Union's cybersecurity agency, ENISA, is 24/7 security operations centers staffed by people who are doing threat hunting. But forgetting even that technical cybersecurity, you just have 24/7 customer support if something technical is going wrong, and that requires access to data, basic data. And if those things are disrupted, then that's something that can really only benefit the adversary, who doesn't have to play by the rules, and doesn't have a whole lot of benefit for the defender that's going to be boxed in by the rules.

Sasha O'Connell

I heard you say, too, that it makes sense for CrowdStrike, for example, to have folks work the evening when it's day in another country, just sort of by definition, which interests me because it's a real people issue, at the end of the day, not a technical issue. So that aspect is interesting as well.

Drew Bagley

Even when we think about cyber workforce shortages today. Think about a single jurisdiction, even a larger jurisdiction with a big population; I have yet to find a policymaker in the world who doesn't complain about the cyber workforce shortage. So, imagine if you've then reduced your pool because of data localization. It's hard enough with a follow-the-sun model, but with a follow-the-sun model, at least you can do this. And especially where we have cyber haves and cyber have-nots, you have a lot of organizations that have to depend on managed service providers that have that 24/7 backbone to help them from a security standpoint. So, then you're really just shifting that burden if you're making it so that the rules are that you can only find talent in a certain jurisdiction, within certain hours that they're going to work, and then hope for the best during the other parts of the day.

Megan Brown

Or, worst case, I think, or an additional downside of all of this, is that you're just introducing friction. You may have to have multiple people doing the same thing because they're in different jurisdictions, and I think when you're talking about cyber defense or response, speed is really important, and so it's not satisfying when I hear some policymakers say, well, you know, you can just contract around that, or there are ways to work around that. Maybe, maybe, but it introduces friction, contract negotiations, additional bodies, and it just gets in the way of the speed that Drew's been talking about being so important.

Sasha O'Connell

So, anything else on benefits? Again, we sort of brushed over it, but importantly, probably 90 percent of what we do on the internet is dependent, just from a convenience perspective, on this global flow of data. We obviously have this really interesting addition of the cybersecurity aspect to it. Any other benefits before we move to maybe what some of the risks are to this kind of data flow?

Megan Brown

I mean, I'll just flag, I think we take for granted in our connected economy that when we travel, for example, our services are going to work. Like, you can hail a ride share in Greece just as you could here in Washington, and if you don't have data transfer and portability, all of that can be much more difficult, much more costly, and less seamless for end users. In addition, cloud services are enabling huge, cost-effective storage as well as ready access around the globe, and a lot of these data localization questions impact those efficiencies. So I think policymakers just need to understand that whenever you're putting up an additional hurdle to the use of data or the movement of data, you are having these effects on services and technology. Say a company wants to be able to send telemetry data to its engineers in India for processing for some cool new thing. If they have to check with their lawyers every time they want to do that, you're introducing a lot of friction to the economy.

Sasha O'Connell

Can I ask you guys to define telemetry data?

Megan Brown

No.

Sasha O'Connell

Thanks. Good talk. Drew? It came up twice. On behalf of our listeners everywhere, telemetry data.

Drew Bagley

Absolutely. I'd be more than happy to. It's such a fascinating term to define. So, sure, at its core, telemetry data is, generally speaking, and this is something that changes over time as technology evolves, the metadata being generated by a device, so we can think of Internet of Things devices generating some sort of data about what's going on on the device. But more commonly, in the context of cybersecurity, it's really the metadata about the processes going on on a device. So when you open your office software, for example, there's an executable file that opens. The content of that executable file is not the telemetry; it's the fact that that executable file opened, and then whatever happens subsequently. So that file might call out to different libraries that are on the system, and then the operating system might take other types of actions, and so that chain of events is very important from a cybersecurity standpoint. If, for example, you opened a Word document, and then all of a sudden there was a file delete event after that, that chain would just be the telemetry itself; you wouldn't have to look at the document to know what's in it. But that pattern might be indicative of ransomware on the system, and so that data, again, is useful for cybersecurity, but it's only useful if you're able to identify the adversary and stop the adversary, identify the victim machine and block what's going on on the victim machine. If you remove all those identifiers, then that's something where you can't have it both ways. And so I think oftentimes, when these data localization conversations happen, cybersecurity is being thought about as if it's 20 years ago during the malware wars, and most attacks today don't even use any malware. So you're not talking about matching hashes to a known list of badness. Instead, you're talking about using this telemetry.
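To ground the Word-document example, here is a deliberately simplified sketch in Python of a heuristic over a chain of telemetry events. The event names, timestamps, and the three-deletions threshold are all invented for illustration; real detection logic is far richer than this, and nothing here reflects any actual product's rules.

```python
# Illustrative only: a toy heuristic over a telemetry event chain, in the
# spirit of the example above (a document opens, then files start
# disappearing). Event names and the threshold are invented.

from datetime import datetime, timedelta

events = [
    {"time": datetime(2024, 3, 1, 9, 0, 0), "process": "WINWORD.EXE", "action": "process_start"},
    {"time": datetime(2024, 3, 1, 9, 0, 5), "process": "WINWORD.EXE", "action": "file_delete"},
    {"time": datetime(2024, 3, 1, 9, 0, 6), "process": "WINWORD.EXE", "action": "file_delete"},
    {"time": datetime(2024, 3, 1, 9, 0, 7), "process": "WINWORD.EXE", "action": "file_delete"},
]

WINDOW = timedelta(seconds=30)
DELETE_THRESHOLD = 3  # arbitrary: this many deletes soon after a process starts

starts = [e for e in events if e["action"] == "process_start"]
for start in starts:
    deletes = [
        e for e in events
        if e["action"] == "file_delete"
        and e["process"] == start["process"]
        and start["time"] <= e["time"] <= start["time"] + WINDOW
    ]
    if len(deletes) >= DELETE_THRESHOLD:
        print(f"{start['process']}: {len(deletes)} file deletions within "
              f"{WINDOW.seconds}s of starting -> ransomware-like pattern, flag for review")
```

Note that nothing in the sketch looks at file contents: only the sequence and timing of events, which is the distinction Drew draws between telemetry and the underlying data.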

Sasha O'Connell

Okay, perfect. So the point here is that telemetry data would be included in any data restriction?

Megan Brown

It could be. It's all definitional, and I think that's another thing policymakers just have to keep in mind: don't cast too wide a net when you're defining what data has to be protected.

Sasha O'Connell

Yep. That makes sense. Okay. So, I see a ton of benefits here. Are there any benefits to restricting data? Can you guys explain, where this is coming from? Either from a general perspective or a more tactical perspective? What are the benefits on the other side of some restrictions?

Megan Brown

So, there's always going to be some risk relating to the collection, handling, sharing, and trade of data, right? There's commercial data, there's sensitive personal data, there's all kinds of data, and that's what, many times, the bad guys are looking for, so it is reasonable to try and minimize that. It may be reasonable in certain circumstances to try and keep that data, certain kinds of data, from getting out to countries of concern; we don't want this kind of data in our adversaries' hands, policymakers might say. But there are several justifications that regulators around the world will offer. One frequent one is, a country may have made a value judgment about what privacy and security demands they have domestically, and they are worried that the export of that data will subject that data to less protection. So that's one model: export your privacy standards to the destinations of your citizens' data. Often, it's to ensure that data is going to stay available in a country so that that government can have access to it for their own purposes. That might be counterintelligence; that might be law enforcement, regular old surveillance; if someone is in their country, being able to get that information for law enforcement purposes, for example. And some countries really do want to support their own domestic economic growth by encouraging companies to build data centers and offer cloud services in their own countries. That generates jobs and promotes their own economic growth in the tech space. So, that's another kind of motivation that I think many would say is a benefit from some of these limitations on cross-border data movement.

Sasha O'Connell

That makes sense. So I hear it, again, just saying it back to you to make sure I've got it too: there's kind of an export of privacy values, right, to make sure that the way we do business in our country, we're protecting that as our data goes forward. There are government uses, be it for surveillance or law enforcement, and the last, which, again, I really gravitate toward the human, sort of non-technical aspects, is this idea of literally creating jobs by keeping the data in your country to build data centers or otherwise. Anything else, Drew, on that, or do you think we covered the benefits?

Drew Bagley

Sure. I'd say sometimes you see a conflation of all of those in different jurisdictions. So, for example, in Europe, there are definitely data localization restrictions that exist under privacy regimes like GDPR, that, of course, have exceptions, and the idea is how you create mechanisms for other countries to either bolster their privacy protections or at least bolster contractual protections that follow the data. But then there are also, equally, and maybe right now getting even more traction, trade equities at stake. So, this notion that if you are able to control and regulate the data, then you're going to be able, in theory, to shape the marketplace and shape how companies are able to play by the rules, especially in certain jurisdictions where your entire marketplace might be dominated by foreign companies. It's a way to have a stake in the market, and so in Europe right now, in addition to privacy laws having some data localization requirements, there are also different certifications. So for example, there's a certification in France called SecNumCloud that's been proposed that would actually have...

Sasha O'Connell

Good French, Drew.

Drew Bagley

Thank you! It would have data sovereignty provisions in addition to data localization, meaning that there would even be this component in it where the data being stored by a certified company would not be allowed to be subject to foreign laws. And so, we have to remember that data localization actually comes in different flavors. As Megan was outlining, there's data localization where you have to keep a copy of data in a jurisdiction, and then you go all the way to the other extreme, where you're not even allowed to have any other law apply to that data.

Sasha O'Connell

That's fascinating and creative policymaking, as I hear it. So in a second, I want to get back to this kind of balance of trade versus national security, because I think it speaks to a lot of what's going on in the U.S. right now in terms of some ongoing discussions, but where is China on all this, Megan? Can you talk a little bit about that?

Megan Brown

Yeah. I would put China at the extreme of the data localization mandates and sort of data sovereignty, as Drew described it. They want data that is created in China to stay in China. They have some permissions for exporting that data, but it has made the business climate for multinational companies very challenging, and they justify their rules based on national security interests. I've heard Chinese government officials at various events say, we need this data because we need to be able to make sure that we don't have domestic terrorism, and we don't have, et cetera. So I would put China on the far extreme in terms of the domestic obligations and the rights that they assert to look at that data, and that has given a lot of U.S. companies and multinationals real heartburn, and it creates problems, which we'll talk about in a minute, for how U.S. law is going to deal with that. India is another example of a pretty big data localization country. I think some observers have said that they're starting to moderate that a bit because they were seeing some economic downsides from being kind of an island, but there's a spectrum of how countries have approached it.

Sasha O'Connell

So it's interesting. So countries can be in this game but for different reasons, Drew, is kind of what you were saying, and maybe framed politically one way, with maybe some other stories going on. I'm getting big nods here in the studio. So okay, let's turn to the U.S. It's a really interesting story right now, and maybe we can start with just the laydown of where we are, Megan. Like, are there U.S. data localization laws today? My understanding is that traditionally, globally, we have been all for the free flow of data. Where are we today on that? And then maybe, Drew, you can add in some of the players today, and we can talk about what's going on.

Megan Brown

Yeah. So I think we are at a really interesting pivot point for U.S. policy in this area. You're right. Traditionally, the U.S. has been a champion of not just the global open internet but also the free flow of data, and that's been a policy that the U.S. Trade Representative has long championed as a way to push back on some of the arguably protectionist impulses that Drew discussed; maybe the European Union is taking a different approach for maybe different reasons. But I think the U.S. government is starting to shift that approach, and you see that in several ways, and there are a lot of different dynamics going on here. You've had pressure from Europe for a long time. They have traditionally looked skeptically at U.S. law because they think there's too much surveillance. I think we could have a whole separate discussion about whether that's correct or not. But they've sort of wanted the U.S. to do more from a privacy perspective, and the U.S. used to push back, I think, a little bit on that. They've pivoted a little bit now. Notably, the U.S. Trade Representative last fall made a big change in the policy approach. They walked back some longstanding advocacy supporting the free flow of data so that the United States can take a more regulatory approach. One of the reasons the United States wants to take that more regulatory approach and tighten up the flows of data is national security concerns: folks are saying all this data, lots of data, is ending up in China, and we are deeply uncomfortable with genomic data, with lots of sensitive personal information, going there. So that's caused this, and maybe it's given them an excuse to do something that they otherwise wanted to do, but there is a pivot going on at the United States government level, to entertain these notions and to start going toward a more permission-based approach to the global data trade.

Sasha O'Connell

And who do you talk to about these issues in the U.S. government? Which departments and agencies have equities here? Can you kind of break that down for us?

Drew Bagley

Yeah, absolutely. There are many players in the U.S. government. We can think about the Department of Commerce traditionally having a very big role here, especially if we think about the Bureau of Industry and Security maintaining a list of different types of export-controlled information; for example, munitions information is on that list. And we can also think about new kids on the block, like the Cyber Bureau at the State Department, having an equity here and definitely being involved in the discussion about cross-border data flows. And so, even though different data types have always had different rules, right now what we're seeing is that data as a whole, or entire subsets of data like personal data, are now facing more friction. This isn't just happening at the federal level. In fact, just last summer, the state of Florida actually enacted a law that regulated healthcare data and the flow of healthcare data, and that essentially forbids certain types of PHI from being processed outside of the United States or Canada. And that's something that on its face seems simple enough and probably, you know, is intended to apply to medical records, but again, when we start thinking about how complex data sets are, whether with cybersecurity or even with AI training models and whatnot, that's something that can really quickly trickle down and add a lot of friction to how data flows work.

Sasha O'Connell

Where do you guys see this all going in the U.S.? So there's a little movement at the state level, as you described, Drew. And I know maybe we should talk a little bit about a recent executive order that just came out on this. Where do you think this all ends up?

Megan Brown

Well, I think we are seeing this accelerate. I used to think it was incremental; I feel like it's accelerating rapidly. You know, we've seen in the past some data localization, for instance, through the Committee on Foreign Investment in the United States and the Team Telecom process, which reviews when foreign companies want to buy parts of U.S. telecommunications companies. They impose mitigation agreements that require data localization, or they prohibit storage in certain countries. So we've seen bits of that. We've now seen the Commerce Department being more active. There's a rulemaking that they've kicked off to try and get a better handle on U.S. computing infrastructure. They call it infrastructure as a service, and they want to understand which companies are making use of that here in the United States. But I think the biggest pivot, which I'm looking forward to and I think Drew's going to address, is this executive order, where we've really just jumped into the deep end of the pool, I think, on broad new government oversight of data transfers, and it starts out by looking at so-called countries of concern. But I don't know that it stops there, and even if the ultimate restrictions are focused on a few countries, the friction that we keep talking about will apply broadly across the economy as companies have to figure out if they're covered by these new restrictions that the President wants to put on data transfer.

Sasha O'Connell

So here in March 2024, for those listening later on down the road, what just happened, Drew? What is this executive order Megan's talking about?

Drew Bagley

Sure, so the President issued an executive order on the bulk transfer of data to foreign countries. Right now we're actually in a holding pattern to figure out what the implementation will look like, because the executive order delegated different rulemaking responsibilities to various agencies. So what we're going to see, for anyone listening in early 2024, are lots of public comment opportunities on how this gets implemented. But the overarching framework the President is laying out is that there should be an apparatus for restricting certain data types from being stored in countries of concern; all of the certainty is coming later.

That's the general framework, and the hypothetical sorts of scenarios and threats it appears this is intended to deal with are things like bulk biometrics: what if those were stored in a country that was hostile to the United States? What would that mean in terms of protecting citizens from a national security perspective?

This executive order is not really designed to address privacy or trade or some of the other things we've talked about. That's not to say that, especially once we see how it's implemented, it won't affect all of those things, but it's really set up around the notion that there is more data collected about individuals than ever before, that this data can be used for nefarious purposes if it gets into the wrong hands, and that there needs to be some means of oversight over where that data is going and of curtailing those data flows.

The executive order itself, even though it's not super specific since the specifics are going to come later, still has a notion of exceptions built in. So we'll see where this really ends up. My suspicion is that it's one of those powers the President is laying out in order to have a card to play in some future event, or to have some leverage in situations down the road, rather than a new overarching framework akin to a privacy framework.

Sasha O'Connell

And I'm thinking as you're talking, does what the U.S. does matter disproportionately because so many of the tech companies that have our data are here in the United States? How does that play into this kind of global discussion?

Drew Bagley

I think it does, and you can see that even in some of the examples Megan cited about other countries attempting to influence our behavior. The European view, for example, is: we're going to scrutinize whether or not the United States has a federal privacy law, because U.S. tech companies tend to dominate the market. Whereas the European Union generally does not come out and take an opinion on China, which has laws that literally force encryption keys to be turned over, source code to be turned over, et cetera; their argument being, you know, that that's less relevant.

Megan Brown

Well, I think we do have some indications of where the government's going to go, and I may disagree a little bit with Drew on the ultimate breadth here. I think the government, and the executive order, says it has kind of this modest goal, but the devil really is going to be in the details. The DOJ, the Department of Justice, is delegated substantial regulatory power under that executive order, and right on the heels of the executive order they put out what's called an advance notice of proposed rulemaking, an ANPRM. I'm not going to take us down an administrative law nerd rabbit hole, but they've telegraphed in this ANPRM that they are interested in treating a broad array of different data categories as sensitive. They have a lot of questions that show a potentially quite broad reach for the ultimate rules: how they define what a bulk transfer is, for example. The countries of concern are fairly narrowly circumscribed, but the rulemaking envisions that even if you're not directly giving data to a country of concern, you'll potentially have to put contract terms in your vendor agreements that restrict third parties' ability to give your data to countries of concern. So I think there is a lot to unpack here as this executive order flows through, and it really is a big change in how the United States has thought about the free flow of data. It's a fundamental philosophical move to perhaps a more license-based, permission-based approach.

Sasha O'Connell

So, on that example of the new executive order: if the issue the U.S. is thinking about, in terms of what it wants to solve, is the potential misuse or abuse of sensitive data of U.S. persons, are these kinds of data restriction regulations or laws going to get us in that direction, Megan? You know, we talked about how we love the free flow of data, but there are sometimes unintended, challenging consequences; and then we make a move to address those consequences and potentially create new unintended consequences from the policies that restrict that data. What do you think?

Megan Brown

Yeah. I mean, I think the question policymakers have to keep in mind when they're drafting something like the notice of proposed rulemaking at the Department of Justice, or BIS export controls, is: what are the unintended consequences? Are they focused really tightly on the actual problem they're trying to solve? Have they gotten good data about the costs? And those costs are not just the restrictions themselves. Let's hypothesize they get it right and they're really focused on a few types of data and a few countries of concern. One of the challenges is all the other companies that have to go through the process of figuring out whether they're covered, talking to their lawyers, analyzing every word in the new rule. Then there's the potential for overbreadth, that the rules themselves may in fact sweep more broadly than the government thinks is necessary, and the damage done in economic costs by subjecting businesses to additional uncertainty and more regulatory hurdles. All of that, I think, suggests that folks need to tell the government these things in response to the public comment opportunities, and that policymakers really need to focus on getting a cost-benefit analysis, good definitions that are realistic, and talking to the people who will actually have to live under these regimes, because there are very real practical impacts on intercompany transfers and all kinds of things they might not anticipate.

Sasha O'Connell

Interesting. Well, this topic is certainly one on the front burner and one to watch going forward. I think with that, we're going to wrap this episode. We hope everyone visits us at the Start Here website, and the link will of course be in the show notes, to see additional resources and the transcript as well. We hope you join us next time, and we want to send a special shout out for this episode to the production team of Erica Lemen and Josh Walden and the team here at Wiley for hosting us in the studio this month. Woohoo. Thanks guys.

Drew Bagley

Better production quality.

Sasha O'Connell

Exactly. See you next time.

Episode 7 - Digital Identity

On this episode of START HERE, join Sasha O'Connell, Drew Bagley, and Megan Brown as they unravel the complexities of digital identity and its implications for cybersecurity. Delve into the definition of digital identity, the critical role of authorization, and the emerging technologies shaping authorization processes.

Through insightful discussions and expert analysis, this episode explores the challenges faced by key players in digital identity and authentication, and the workstreams and policies aimed at addressing these challenges.

Resources

  • Washington Post article:

Transcript

Sasha O’Connell

Welcome back to Start Here. In this series of podcast episodes, we provide a framework for analyzing foundational cyber public policy questions. In our previous episode, we looked at the international flow of data and how governments are looking at changing rules that apply to that data. Today we're on to the next not so simple challenge, digital identity. To work through the ins and outs with me on this topic, I am again joined by Drew Bagley, Vice President and Counsel for Privacy and Cyber Policy at CrowdStrike and Megan Brown, a partner at Wiley and Co-Chair of the firm's Privacy, Cyber and Data Governance Practice. We are going to take on this next topic and break it down.

So with that, let's get to it. There is a famous 1993 New Yorker cartoon by Peter Steiner, now, somewhat ironically, also an internet meme, I believe, which shows a dog typing on a computer and "talking" with another, in-real-life dog at its feet, and the caption reads "On the Internet nobody knows you're a dog." That seems like a great place to start this episode, because the ability to at least appear or feel anonymous online is one of those fundamental features of the internet, and it's something with both benefits and costs that the policy community continues to grapple with in terms of what the right balance is. Digital identity in particular, as it presents itself in the specific context of cybersecurity, wrestles with some of these challenges, and getting it right while maintaining the benefits where possible is a huge issue.

So with that, we want to get into it, and Drew, why should we even talk about digital identity? Doesn't everyone know what that means, and it's crystal clear in the policy community and we can move on or what?

Drew Bagley

Everyone knows what that means, and everyone has a different understanding of what it means, I would say. Identity is something where, when we think about it in the context of information about ourselves, we think we know what that is: it's our PII, or maybe even our actual identity card. But in other contexts, identity is actually about the authentication protocols through which either we, as individuals, sign into computers and devices, or software authenticates itself and is able to interact through APIs with different web platforms. A good illustration of this: recently I was involved in a conversation where one person was talking about identity in the context of authentication and trying to protect it from adversaries and threat groups, another person was thinking about identity more as a digital identity, kind of like a digital passport, and another person was thinking about it in the context of PII. I think that's a great illustration of how identity can mean different things in the cyber context, and yet each aspect of identity is very important when it comes to understanding cyber policy and what we can do about protecting the various forms of identity.

Sasha O’Connell

All right, perfect. So, Megan, in that context, when you think about cybersecurity in particular, how do you think about digital identity, or which of those aspects that you just mentioned are most important?

Megan Brown

Well, I think both of them are, in different ways. So many of the legal frameworks around cybersecurity, and the expectations for companies and organizations, depend on this concept of authorized or unauthorized access. For someone, either a person or a device or a software system, like Drew just mentioned, to be authorized, the system owner has to know who or what it is: is the person who they say they are, is the device what it says it is? The ways in which we know and verify who someone is online are one of those key foundational pieces of connected services, networks, and information systems. It affects who's allowed to buy and pay for things, send an email, make a medical appointment, and organizations have to have processes and technologies to validate who the people and entities connecting to their systems are, whether it's a large enterprise network or your son's school portal you're logging into. A lot of the privacy and cyber incident reporting requirements turn on whether something is authorized or unauthorized, so you have to be able to tell whether the access you've seen was authorized or unauthorized. But then there are also core cyber expectations that you will build systems, and regulate how systems access data, based on digital identity, based on who's allowed to do what. So it really is a foundational piece of doing cybersecurity right, and of what the government is looking at when it thinks about how companies need to be doing cybersecurity.

Sasha O’Connell

That makes sense. So it's obviously critically important that we have the capacity to identify people in a secure way, and digital identity is extraordinarily important in cybersecurity, as Megan just outlined. Drew, can you talk about how it's handled? Break it down: how do we know who's a dog and who's not a dog, as it were, on the internet?

Drew Bagley

Yeah, as Megan noted, at the end of the day, whether we're talking about the first example I was giving, identity in the sense of PII, or we're talking about the authentication piece, it's all focused on validation. Traditionally, when we think about validation in terms of authentication, we've of course had usernames and passwords going back decades as one way to authenticate us. In recent years, multi-factor authentication has become a lot more common, where it's not just something you know, your username and your password, but also something you have, which in modern times could be your mobile device, confirming with a code or a push notification that you are who you say you are by having those two things. There is, of course, the verification we think about in the physical world, having our passport on us, maybe even multiple forms of ID; so even though they're different contexts, in the digital space we're still thinking of different ways to confirm that somebody is who they say they are and that their credentials were not just co-opted by somebody who shouldn't have them. That's one of the fundamental problems of having merely a username and password. In addition to multi-factor authentication, biometrics have become increasingly popular, even on our mobile devices, with either fingerprint or face ID, and then there's also the notion of browser fingerprinting and other aspects of identity, where a composite can be made of different factors, in addition to your username and password, to have a high degree of confidence that the person logging in really does, or should, own those credentials. But that's all really on the front end. On the back end, it's not going to scale for a user to manually log in at each layer, to say, okay, I need access to this computer, and my computer needs access to these three services and these five web services. A user is not just logging in at each layer of the internet.
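To make the "something you have" factor a bit more concrete, here is a minimal sketch, in Python, of how a time-based one-time password (TOTP), the kind of six-digit code an authenticator app shows, can be derived and checked on the server side. It follows the general RFC 6238 approach rather than any particular vendor's product, and the secret, digit count, and time step shown are illustrative assumptions.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, timestamp=None, digits=6, step=30):
    """Compute a time-based one-time password in the RFC 6238 style.

    The server and the user's authenticator app share the base32 secret;
    both derive the same short code from the current 30-second window.
    """
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((timestamp if timestamp is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret_b32, submitted):
    """Accept the code for the current window or the previous one (clock skew)."""
    now = time.time()
    return any(hmac.compare_digest(totp(secret_b32, now - drift * 30), submitted)
               for drift in (0, 1))

# Hypothetical usage: the enrolled secret would normally come from the user record.
SECRET = "JBSWY3DPEHPK3PXP"
print(totp(SECRET), verify(SECRET, totp(SECRET)))
```

The point is simply that the server and the device share a secret and a clock, so a stolen password alone is not enough to produce a valid code.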

Sasha O’Connell

Got it, Drew. So that all sounds like front-end, or kind of user-forward, ways to do it. What's happening on the back end? Are there other things going on there that we need to understand?

Drew Bagley

Sure. On the back end, fortunately, generally speaking, we don't have a universe in which users in modern times have to manually authenticate themselves to each individual system with their username and password and multi-factor authentication. Instead, what happens is that once that initial front-end authentication occurs, there are what are called tokens on the back end, and that's how machines are able to talk to machines, using a secret that is known to be associated with the user who has logged in. So if there is that validation of the user, then after that, the machines are talking to machines based on that initial validation. For purposes of scaling in modern times, that has really benefited the user experience. But in terms of security vulnerabilities, there have been lots of problems with the way this authentication architecture has been built, and it actually goes back a quarter century: some of the flaws that were initially in the on-premises versions of older authentication protocols have now made their way into the cloud era.
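As a rough illustration of that back-end pattern, here is a minimal sketch of issuing and validating a signed, expiring session token using only the Python standard library. It is a simplified stand-in for real-world protocols such as OAuth access tokens or SAML assertions, and the signing key, lifetime, and claim names are assumptions for illustration; the flaws Drew describes, tokens that never expire or that an attacker can mint without the signing key, are exactly what the expiry and signature checks here are meant to rule out.

```python
import base64
import hashlib
import hmac
import json
import time

SIGNING_KEY = b"server-side-secret"          # assumption: held only by the token issuer

def issue_token(user, lifetime_s=900):
    """Issue a signed token that the user's client presents on later requests."""
    claims = {"sub": user, "exp": int(time.time()) + lifetime_s}
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SIGNING_KEY, body.encode(), hashlib.sha256).hexdigest()
    return body + "." + sig

def validate_token(token):
    """Return the claims if the signature is genuine and the token has not expired."""
    try:
        body, sig = token.rsplit(".", 1)
    except ValueError:
        return None
    expected = hmac.new(SIGNING_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, sig):
        return None                          # forged or tampered token: signature mismatch
    claims = json.loads(base64.urlsafe_b64decode(body.encode()))
    if claims["exp"] < time.time():
        return None                          # stale token: expiry enforced server-side
    return claims

tok = issue_token("alice@example.com")
print(validate_token(tok))                   # valid claims
print(validate_token("x" + tok))             # tampered body -> None
```

In practice the signing key lives only with the identity provider; if it leaks, or if validators skip the expiry check, an adversary can impersonate anyone.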

Sasha O’Connell

So, I know we're heading toward the classic question of trade-offs, which you already mentioned, between user efficiency, access, and ease on one side and, on the other, the security of these systems to make sure things are locked down as appropriate, and toward thinking about the policy of balancing those two things. Before we get to more on the user experience with Megan, Drew, can you say any more about how adversaries take advantage of these systems, what the risk is around digital identity, and how adversaries are targeting that very specific element with cyber-attacks?

Drew Bagley

Absolutely. Digital identities, in the form of credentials as well as tokens, have become immensely valuable in recent years. For example, on the dark web there are a lot of marketplaces that have digital identities for sale. So credentials are for sale, and that allows would-be threat actors to buy access to a victim organization. Similarly, because those credentials are so valuable, adversaries are very focused on trying to obtain them. Those credentials can come from previous data breaches, but they can also come from basic phishing attempts. In fact, we've seen, especially over the past year, a group known as Scattered Spider using all sorts of phishing techniques to dupe even very sophisticated organizations into turning over their credentials, and then also finding ways to co-opt MFA processes, in other words, by transferring SIMs from an individual's phone that is set up for two-factor authentication to the threat actor's device, so the threat actor can handle the multi-factor authentication once they've stolen the credentials. There's been a lot of that going on. And what adversaries want to do when they're breaking in on the front end, or logging in, as is more common now, is get in and then quickly escalate their privileges. The credentials they get might be for an individual who doesn't have access to much on that victim network, so the adversary will try to find ways to escalate those privileges to administrative privileges. Sometimes, even if a user doesn't on their face have access to different things, they might actually be in some user group that's part of another user group, that's part of another user group, and so on, that does have access to different things, or has full administrative credentials, and that can be incredibly valuable. Other times we've even seen threat actors call the help desks at victim organizations to complain that "Oh, I'm logged in, but I just can't seem to access this one folder I'm supposed to have access to," and get the help desk to escalate those privileges. So, in other words, the social engineering doesn't just stop at getting the credentials to begin with; it can even be used to make those credentials more powerful. But then again, as I mentioned a moment ago, and as the CSRB report pointed out, there are inherent flaws in certain authentication protocols such that tokens aren't expiring when they should, or tokens can actually be generated by somebody other than the authentication provider. If the threat actor can generate their own tokens, then they don't even necessarily need to worry about getting credentials to begin with, and then it's game on for getting access.
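The nested-group point is easy to miss, so here is a toy sketch of how transitive group membership can quietly confer far more access than a user's direct memberships suggest. The group names and permissions are invented for illustration and do not represent any real directory service.

```python
# Hypothetical directory data:
# group -> (permissions granted to that group, groups it is itself a member of)
GROUPS = {
    "helpdesk":  ({"reset_passwords"}, {"it-staff"}),
    "it-staff":  ({"read_tickets"}, {"infra-ops"}),
    "infra-ops": ({"admin_servers"}, set()),
}

def effective_permissions(direct_groups):
    """Walk nested group memberships and collect every permission they confer."""
    seen, perms, stack = set(), set(), list(direct_groups)
    while stack:
        group = stack.pop()
        if group in seen or group not in GROUPS:
            continue
        seen.add(group)
        group_perms, parents = GROUPS[group]
        perms |= group_perms
        stack.extend(parents)        # inherit whatever the parent groups are granted
    return perms

# A user who only appears to be in "helpdesk" ends up with server admin rights.
print(effective_permissions({"helpdesk"}))
```

In a real directory the chain can be many levels deep, which is why attackers hunt for paths from an ordinary account to an administrative group.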

Sasha O’Connell

So it seems clear then, right? We just need to lock all this stuff down to the max possible; that should be the goal of all policy, Megan. Can we just lock this down and stop this nefarious activity?

Megan Brown

No!


Sasha O’Connell

Why not? Come on.

Megan Brown

Stop it, Sasha! Yeah, if we just unplug all of our computers and networking, we won't have cyber-attacks, and then we'll all be fine. We can just go back to pen and paper. Oh, but there's still fraud then too.

No. The processes that these threat actors are exploiting have really important business purposes, legitimate business purposes, and I think it is this balance that folks are trying to strike. We've seen token issues create and contribute to security incidents, regulators are looking at digital identity, they're looking at multi-factor, but I think folks have to keep in mind that when credentials are stolen, the systems they go to don't exist in a vacuum. They're not built just to be secure. They're not only being built to resist criminal activity; they are being built to facilitate the business. They're being built to enable customers to log into their accounts, to enable fast payments, mobile activities. So it's about seamless or near-seamless transactions with the minimum amount of friction, and these solutions that we all depend on, yes, they can be exploited by bad guys, but the organizations that manage them, and are being held responsible for them, also have to ensure that they work effectively for the customers they're trying to serve. The government faces this problem too: you want to log on to various government sites, you need to authenticate yourself. So just keep that in mind. You can overcorrect if you lock things down too much, but these systems have really important business purposes, and companies are trying to manage the tension that if you want people to be able to access your services, you want it to be fast, but you have to take certain steps to try to make sure that only the right people are doing it, and that's not going to be perfect every time.

Sasha O’Connell

Absolutely right, and I think in terms of foundations on these issues, like many things, there's no such thing as perfect security; even if we locked it all the way down, things continue to evolve and there's always threat risk. And then on the other hand, as I always talk about with my students, as security goes up, efficiency tends to go down. Friction is created when you're building those security features in, and for policymakers, just having that as a foundational construct and acknowledging it from the jump is, I think, an important place to start. What is the government's role here, Megan? Who's active in this space, and how do they think about trying to strike or guide this balance in the private sector, which owns many of these systems?

Megan Brown

Well, I think it's a really fragmented space. There are a lot of private sector companies out there trying to develop solutions for digital identity and selling those solutions, both to the government and to the private sector. There are standards groups out there trying to build consensus about interoperability and how you want to do various kinds of authentication. Obviously, in terms of our personal and financial identities, there are the bedrock foundational documents, and the government runs those: your Social Security number, your passport, your driver's license. But as we transition to digital identity, the question is how you prove that. How do you take what we all rely on at the airport? I was just at the airport yesterday, TSA is looking at my passport; this is where things like mobile driver's licenses come in. But the security challenge is: how can system owners build a system they can reliably use to determine who can have access and what they can access? I don't know that we're going to get into it in this session, but least-privilege principles are core parts of zero trust: how much do you put on your users to prove themselves? So, moving to the agencies that have actually been active. There's the National Institute of Standards and Technology, or NIST, which we've talked about many times before. They have digital identity guidelines, Special Publication 800-63, and that sets some pretty darn specific standards for how federal networks are supposed to do digital identity. President Biden had an executive order on cybersecurity not long ago that mandates that federal agencies use multi-factor authentication. We see some regulators starting to look at pieces of this, trying to nudge companies along, or force them along, with using multi-factor authentication. Examples of that are the New York Department of Financial Services, which has cybersecurity regulations that require multi-factor for a variety of things, and the Federal Trade Commission's safeguards rule under the Gramm-Leach-Bliley Act, which has expectations for multi-factor. So you see this move to try to get companies to do more, to move away from passwords to multi-factor, and then we can talk a little bit in the future about better multi-factor, phishing-resistant multi-factor. But there is this move to push people along this spectrum of authentication, making it harder, making it more rigorous, to prove your digital identity.

Sasha O’Connell

Awesome. Drew, how do you see those work streams? I know we'll talk about some of the specific policy proposals in a second, and some of the trade-offs inherent in those, but how do you see those work streams and how do you see this globally? Where's the U.S. in comparison to some of our international partners in terms of the government thinking about this issue?


Drew Bagley

Well, I think Megan's proposal for us to go back to pen and paper is probably the most extreme that has been proposed so far.

Sasha O’Connell

You're right, that's perfect security, but Megan doesn't like that either because of fraud. I get it. I get it.

Drew Bagley

If we can't do that, then, as Megan noted, there's been a very big push for multi-factor authentication across the board, even at the state level; if we look at the New York Department of Financial Services, their cybersecurity requirements now require it much more explicitly. But we are in an era in which even MFA is being co-opted, and so there's a lot of push to make sure organizations are actually monitoring the identity plane the same way they've traditionally monitored the network plane, the endpoint plane, and, in more recent years, the cloud plane, to look for things like impossible travel: if somebody logs in from one geolocation and then a minute later accepts a push notification from the other side of the world, maybe that doesn't make sense. So, in other words, multi-factor authentication is certainly part of basic security hygiene now, but it's not necessarily even the end goal anymore, and that's where you see a lot of movement in the space of identity threat detection and response.
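A toy version of the "impossible travel" check Drew mentions: compare consecutive sign-ins for the same account and flag the pair if the implied speed is physically implausible. Real identity threat detection products weigh many more signals; the coordinates, speed threshold, and event format here are assumptions for illustration.

```python
from dataclasses import dataclass
from math import asin, cos, radians, sin, sqrt

@dataclass
class SignIn:
    user: str
    timestamp: float      # seconds since epoch
    lat: float
    lon: float

def distance_km(a, b):
    """Great-circle distance between two sign-in locations (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (a.lat, a.lon, b.lat, b.lon))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))          # Earth radius of roughly 6371 km

def impossible_travel(prev, curr, max_kmh=900.0):
    """Flag the pair if the implied speed exceeds roughly airliner speed."""
    hours = max((curr.timestamp - prev.timestamp) / 3600, 1e-6)
    return distance_km(prev, curr) / hours > max_kmh

# Hypothetical events: a login from Washington, D.C. and, one minute later,
# an MFA approval from the other side of the world for the same account.
a = SignIn("alice", 0, 38.9, -77.0)
b = SignIn("alice", 60, 1.35, 103.8)
print(impossible_travel(a, b))               # True, so worth challenging or blocking
```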

On the digital identity side, you really have the full gamut. If you look at countries like Estonia, for example, they embraced a full digital identity, in the sense of individuals having full validation for basic government services, for voting, for everything else, well over a decade ago. Then you have other places around the globe where digital identity is being embraced just for specific government services, specific things. But nobody has solved this notion of authentication, found the perfect authentication model that doesn't create a whole bunch of friction for the user, to where it's miserable to use, and is also completely secure. So I think we see a whole marketplace of ideas, but there has been a lot more interest in recent years in governments getting more involved in the protocols themselves related to identity. I think we've seen less of that in the States, but in the European Union, for example, there's been movement around this notion of some sort of electronic identity: at the European Union level, guidelines have been issued on what the protocols should be for digital identities to interact with one another, and on who the validator of that identity should be. Whereas traditionally with authentication and identity, authenticating your driver's license and validating that is the government's role, while authenticating things online has traditionally been the private sector's role, when we think about the domain name system; and now, with some of these recent proposals, we're kind of seeing a merger of the two.

Sasha O’Connell

It's so interesting; again, a place in cyber policy where there really is a patchwork of things going on, and nothing's entirely figured out. I know both of you mentioned the Department of Financial Services in New York. Are there other policy proposals? What's CISA up to in this regard? The FTC, the FCC? What else is on the table these days?

Megan Brown

Yeah, so the Department of Homeland Security's Cybersecurity and Infrastructure Security Agency, CISA, has been using its bully pulpit. They do not have plenary regulatory authority, other than the incident reporting stuff we've talked about separately, but they've been churning out performance goals and guidance documents, and they have their More than a Password program; so they're really pushing MFA, and stronger MFA. One example of their work is their cross-sector cybersecurity performance goals, which lay out the use of passwords of fifteen or more characters and phishing-resistant multi-factor authentication as the new emerging standard. They promote MFA on their website as a best practice, and they're definitely leaning into that, the use of biometrics, et cetera. The FCC, the Federal Communications Commission, is starting to play in this space as well. They have long required telecom carriers to authenticate their customers before giving them access to certain data. Now, if you've tried to call your wireless carrier, sometimes that can be frustrating because you have to go through all these hurdles to prove you are who you say you are, but the FCC wants to safeguard that information and make sure you are who you are. The FCC is now requiring wireless providers to do more authentication before SIM swaps, because they're worried about the downstream effects of fraudulent SIM swaps, which can lead to other fraud and other problems. So now they have new rules that carriers have to use, quote, "secure methods" to authenticate customers who seek a SIM swap. What I thought was interesting was that they declined to provide regulatory certainty about what they consider adequately secure. People were saying, "Well, there are all these different ways to do secure authentication. Tell us what is adequate." And they said, "No, no, no, you figure it out, because it may evolve over time." Personally, I thought that was a frustrating approach. New York DFS, as we've mentioned several times, really is into MFA, and I've certainly been on the phone with some of their investigators when they were concerned about the lack of MFA in certain places. Their rules haven't been all that specific to date about precisely which systems and networks have to have MFA, but they've broadened that and it's now much more broadly required. And then, as we mentioned, the FTC safeguards rule: if you're going to get into financial information, they're requiring MFA for access to that subset of protected information. Most of these, I will say, have exceptions; they don't require MFA for everything in all instances. They have an exception process where you can show there's an equivalent or more secure alternative, which is a nice flexibility piece. But lots of agencies are now in the space, and I think you'll see some enforcement actions that rap people on the knuckles for not doing enough, in the government's eyes, to authenticate and to have tools that are secure.

Sasha O’Connell

Interesting. And Drew, how do you see where the private sector is today on this? Are they waiting for clear guidance from government, waiting to be regulated, waiting for enforcement actions? What's the stance in the private sector today?

Drew Bagley

So, on protecting identity in terms of protecting credentials, as we were discussing a moment ago, there are sector-specific MFA requirements popping up, and I think MFA is becoming much more mainstream, and it will become more so as we see enforcement related to cyber incidents and data breaches going forward. More and more, especially in litigation, you're going to see whether or not MFA was enabled being a factor in whether an organization was negligent, whether it was compliant with various regimes, et cetera. So I think that's becoming mainstream and already is. Whereas I think it really depends on the maturity of the organization and the sector as to where you see identity threat detection and response being used, or some form of actually monitoring that the MFA and the credentials are being used by the people who should be using them, getting to the point where you're challenging who's logging in, over and over again, on a technical basis, making sure tokens expire when they should, things like that. That's really all over the place in the private sector right now. You definitely have some places and some protocols that really are inherently vulnerable, where once you're in, you're in, and you can walk around; you're not just in the apartment building, you can then walk around and unlock every apartment in the building, and that's what we see on the back end with the way a lot of this architecture is set up. So it's all over the place, but there's a long way to go to getting credentials protected and getting that more secure. And I don't think, as a society, with every data breach we see, we've ever figured out how to protect PII, so we've got to solve that one too.

Sasha O’Connell

Yeah. So in solving this, whether from an internal policy perspective, from a company's perspective or one of the federal agencies working on their own data, or from a public policy perspective in terms of what's either required or recommended, we've talked about some of the trade-offs here: increasing friction in the system along with the increase in security, that makes sense; and clearly there's this idea of customization versus uniformity in definitions or in what's required. What else? What, at its heart, is the issue in terms of policy trade-offs when we think about this space? Megan, do you want to go first before we wrap?

Megan Brown

Maybe two things. The first is the risk-based approach. It sometimes seems like regulators just want to say, just do MFA, do MFA for everything, and that sort of sounds okay at a surface level, but I think it's important to take a step back, and I could cite you a bunch of NIST documents that would back me up on this, and consider what the use case is and what the risks are that you're trying to address. What is the sensitivity of the data or the information or the service you're trying to protect? So I don't think people should look at MFA, for example, as this panacea that you just throw on everything, because that might be prudent in many cases, but there may be reasons why there's a compensating control or something else.

The other piece is that anything human-facing has to have this balance we talked about a few minutes ago, making things easy for the customer while achieving some degree of reasonable security, and remembering that customers come with all different levels of knowledge and capability, especially with security. I don't think it's fair to consumers to just say everyone has to use token-based MFA for the majority of online services. There's a lot of stuff that may not need that: you may not need MFA to access Facebook, but maybe you should have MFA to get into your bank account. So I think that's an important thing for trade-offs as well. One example I point back to: I remember speaking with NIST ten years ago when they were updating their digital identity documents, and they really wanted to tell everyone to stop using text messaging for MFA. There are trade-offs with the use of SMS for MFA. I use it a lot, a lot of people use it; maybe it's not the best thing for banks or for your crypto wallet, but we discussed with them the trade-offs for your average consumer who needs to log on to their Social Security account. Would my grandma be able to handle token-based MFA, or put an app on her smartphone? Footnote: not everybody has a smartphone. So I do wonder, in Estonia, with all the people Drew mentioned and their online everything, whether there are people who are being left behind on the government services side. It's just that trade-off: customer behavior has to be taken into account. People are human, and you want to set up protections for their mistakes, but also not set up a system that's too onerous, especially in light of whatever the risk is and the sensitivity of the data or system.

Sasha O’Connell

It's so interesting, and when we think about those different populations, we tend to think about maybe a more senior population having difficulty with that functionality. But I'll tell you, working with young people at the university, their expectations in terms of instantaneous services and their lack of patience for friction in the system are pretty real, and so is their willingness, and I'm not saying all of them, to move away from products that introduce more friction. It's another thing just to consider. Drew, other thoughts on trade-offs before we wrap up on this one?

Drew Bagley

Megan hit it spot on with the digital divide. We've been talking about the digital divide for thirty years in other contexts, oftentimes just about getting folks access to broadband internet, which is still something we as a country are working on, and there is absolutely also a digital literacy component that goes into this. So even if we know what the best practices are, figuring out ways to get them into the hands of the people who are potentially the most vulnerable and need them the most is not trivial at all. At the end of the day, we should get to a place in which the person using the service can just focus on using the service and not be over-rotating on the fifteen different things they have to do from a security standpoint to use it. Part of that goes to the secure-by-design movement: the entity best suited to provide security and bring security to the table should be doing that. So I think that starts in the identity ecosystem with authentication providers ensuring that their code is secure to begin with, and ensuring there's interoperability so other security can be layered on top. When we think about the systems that drive things like updating your voter registration online or updating your driver's license online, that entire burden shouldn't be shifted to the individual, to the prospective victim; a lot of it should be on the entity providing the actual service. But there are also operators of technology where this is very tricky in modern times. If we look at a lot of critical infrastructure entities, you have operational technology systems that can be very old, or even internet-of-things systems, that may have been designed in a way where it's not easy to layer security on top. That's certainly challenging, and that's where it's incumbent upon the government to provide the right incentives to make sure that entities that are cybersecurity have-nots are able to get cybersecurity, even if that means they have to first modernize their stack to do so. But like Megan said, there certainly is a lens of risk you have to view all of this through. You have to consider whether layering MFA and all these things onto every application is necessary, or realistic; it would be great in every application, but is it realistic, or are there other ways to wall off certain types of access from other types of access? Because the other problem is the way access has been architected over the past several decades: once you have access to one service, it's not hard for an adversary to get in and get access to other things. So I think there's a lot more thinking that can be done on the back end there too.

Sasha O’Connell

All right, with that, as usual, I don't think we solved any of the policy issues, but we certainly framed them up and offered great-

Drew Bagley

Megan did, I think.

Sasha O’Connell

Megan does, as she does, but thank you both for joining me, and thanks to our listeners. We are going to wrap it here. I hope everybody visits us at the Start Here website, which is, as always, in the show notes, where there will be additional resources on digital identity and also a transcript of this episode for reference. We also look forward to having you join us next time, when we kick off our series on who's who in U.S. cyber policy, starting with a bit of a deep dive into the players at the White House who work on cyber policy and how they rack and stack up, which actually might be a little less intuitive than one might think; there's kind of an interesting story there. So we look forward to getting back together soon to share that. Drew, Megan, thanks again for joining me, and we'll see everyone next time.

Episode 8 - Key Players in U.S. Cyber Policy: The White House

On this episode of START HERE, join Sasha O'Connell, Drew Bagley, and Megan Brown as they navigate the intricate landscape of U.S. government cyber policy. This comprehensive exploration delves into the roles and responsibilities of the key players shaping the nation's cyber defenses and their interactions with the private sector, academia, and civil society.

Transcript

Sasha O’Connell

Welcome back to Start Here. My name is Sasha O'Connell and I am a Senior Professorial Lecturer and Executive in Residence at American University. In this series of podcasts, we provide a framework for analyzing foundational cyber public policy questions, including previous episodes on topics ranging from incident reporting to ransomware and what to do about it.

For this episode and the next several, we're taking a bit of a different approach. Instead of being topic-based, in this first one and the series of five that will follow, we're going to look at players in the U.S. government who have a role in cyber policy and walk through six aspects of each of those players: one, what their role is; two, what their internal structure is; three, what tools and authorities they have; four, where they tend to play in the policy space; five, recent trends in terms of their priorities; and six, how they interact with stakeholders, be they from the private sector, academia, civil society, et cetera, just to make sure those folks have their voices heard. That will be the last piece we cover. So, needless to say, we have a lot to do here, and for those who know me and my areas of interest for both research and teaching, how the U.S. government is organized to address cyber policy, and whether, frankly, we're optimized in terms of that structure, is a passion project for me, so I'm excited to dig in. To work through the ins and outs, as usual, I'm again joined today by Drew Bagley, Vice President and Counsel for Privacy and Cyber Policy at CrowdStrike, and Megan Brown, a partner at Wiley and Co-Chair of the firm's Privacy, Cyber, and Data Governance practice. We're going to take on this topic.

So, let's jump right in. One of my pet peeves, coming out of my role in government where I had the opportunity to work on behalf of the FBI on interagency cybersecurity policy matters coordinated by the White House, is that in the popular media we often hear phrases like "the White House decided," or "the White House has issued or convened on a topic," and that really doesn't give folks a good sense of what's going on, how ideas are being contemplated, and how decisions are being made. This is obviously a huge topic with much to discuss, but to get us started, Megan, can you talk a bit about how you see the White House's role overall in cyber policy?

Megan Brown

Yeah, sure, absolutely, and I agree, this is a super interesting aspect of cyber policy, because there are just so many different players at the federal level alone, not even discussing the other layers of government or overseas governments. But the White House has a few key roles in cyber policy. I think of it as: there's the bully pulpit, there's the convener role, and then there's actually the driver of policy. Taking those in order, the White House sets the agenda. It can drive what people are going to pay attention to, whether that's remarks by the President, what gets into the State of the Union, or whether there's going to be a White House event where folks are brought to the Rose Garden to talk about whatever the priority is; and when senior staff talk about cyber threats, or a particular incident that has an impact, they're trying to move the needle on what people are focusing on. They're also a convener; people really do like going to the White House and meeting with senior, fancy people, so they can bring people together. Sometimes you're sort of voluntold: you might get a request from the White House, as a senior executive, that your company's presence is requested at this kind of event or meeting. There's also internal convening, separate from the private sector: we're going to have a task force, or we're going to have a big meeting, and the White House has the ability, probably uniquely across the federal government, to pull together the federal executive branch agencies as well as the independent agencies and say, what are you working on, or, we need a task force to address this problem, and that can be done through the interagency process. It can include civil society groups and others as well. So that's the convening role. And then there's policy development, which is really pretty broad, and they have a lot of tools at their disposal. Whether it's a presidential policy directive, which we'll talk about, an executive order, or a speech that gets given, they can really set things in motion by directing; frankly, they're the boss of the FBI, they're the boss of DHS, so they can set those policy directions and tell the agencies to go forth and pursue certain policy priorities, and the agencies have to listen. Sometimes those aren't effectively done, but those are the three main things I think the White House does to drive policy.

Sasha O’Connell

Absolutely, and having been in that White House-facing role at the FBI, I can say that in my experience they certainly exercise all of those quite a bit; keeping up can be quite a bit of a full-time job. With that, Drew, can you talk a little bit about structure? Obviously it varies from administration to administration how they want to organize to get those three things done in cybersecurity in particular, and we've had a lot of change over the last couple of years with the creation of the Office of the National Cyber Director. But within that kind of mysterious black box of the White House that's doing its thing in terms of the bully pulpit, convening, and policy development, as Megan described, can you talk about some of the subcomponents? I mentioned the NSC and others. Who are the big players within that mysterious group called the White House?

Drew Bagley

Sure, there are some classic big players, and then there's the new kid on the block, as you mentioned. You have the National Security Council, the Office of Science and Technology Policy, the National Economic Council, and the Office of Management and Budget; these are entities that really inform cybersecurity in a meaningful way and have done so for 20 years. But the new kid on the block is the Office of the National Cyber Director, and that office was created under the NDAA that passed for fiscal year 2021, though in somewhat comical fashion it was authorized and came to fruition but was not funded in the beginning, and it took several months to actually get the appropriations to go along with the authorization, which gave them the ability to hire up to 75 people. So ONCD kicked off in the summer of 2021, and ONCD today really is an entity that can do all three of the things Megan was talking about a moment ago, providing the means for the bully pulpit and the convening, but at its core what ONCD is focused on is policy development. ONCD, unlike, say, an agency outside of the White House, does not have the traditional tools you would think of in cybersecurity to cajole other agencies or outside entities to do something; they're not a regulator. But as a centralized office for creating policy and strategy related to cybersecurity, ONCD serves a unique position that previously was filled by different administrations in different ways. We can go back and think about there being a cyber coordinator in multiple administrations who, on a much smaller scale, would serve this sort of role of working with other agencies and the other parts of the White House I just mentioned, to try to have some sort of coordination on cyber policy.

Well, now ONCD is stood up to be an institution that persists from administration to administration to do just that. But backing up a little bit: we've had some of this muscle memory on cybersecurity in recent years even before ONCD. If you look at the NSC, for example, by design the NSC is made up of staff who come from several different agencies throughout government, so that interagency muscle memory is automatically built into the NSC as an institution, and that's something I think is really helpful for ensuring you have all those perspectives at the table at the White House.

Now, ONCD naturally needs to engage with other stakeholders throughout government for input on its own, but those other White House components like the NSC are already designed with folks who come from different agencies and are just temporarily assigned there, so ONCD is also getting that sort of interagency feedback through that means as well. Taken together, despite the fact that ONCD exists and that these other entities like the NSC, OSTP, the NEC, and OMB have different functions, that doesn't necessarily mean everything's been figured out. You still have a new kid on the block, and there's still naturally a process of figuring out when ONCD takes the lead, and when something that is a national security issue but also a cyber issue is squarely in the realm of the National Security Council. But when ONCD was created, those at the helm of each subcomponent at the time made very public, very concerted efforts to talk about how they were going to coordinate and help make ONCD a success.

Sasha O’Connell

So we've talked a bit about the overall role the White House has here, and thank you both for that rundown on structure as well. So you have this structure; what tools, which is what I always come to next, are at the White House's disposal? We've talked about some of them: Presidential Policy Directives, PPDs, Executive Orders, EOs, and these kinds of convenings, Megan, that you walked through, this sort of voluntold forced family fun in the interagency and, it sounds like, the private sector as well. In my experience, the White House can also work on potential draft legislation to push forward from its perspective, and recently, actually with every recent administration on cyber, we've seen strategies, the drafting and release of strategies, as Megan said, to set the tone. I'm curious, from your current roles, which of those tools you think are most effective for moving the needle. I think one of you mentioned that the White House doesn't have traditional regulatory authority, and it certainly doesn't have legislative authority; it can draft language, but it doesn't have the authorities Congress has to make laws. So which of those tools do you see as most effective and impactful in the cybersecurity space in particular? Megan, do you want to start?

Megan Brown

Sure. From sitting where I sit, which is advising private companies a lot on cyber policy, cyber regulations, and procurement policy, I think executive orders are doing a lot of work in the cyber space. They were before, too: if you go back to President Obama, the big cyber executive order set in motion the cybersecurity framework produced by the National Institute of Standards and Technology. So EOs really have been driving a lot of cyber policy, and I think that has gone on steroids in the past three to four years with the proliferation of executive orders. The reason I think they are effective, and I want to caveat "effective," is that they create a lot of activity and they do have impacts. I don't necessarily agree that the ones we've seen so far have always made good policy, but they're effective in certain ways because they can direct federal agencies to do a bunch of things that then they have to do. When you are the Department of Justice and an executive order comes out and says the Department of Justice shall do the following, you're going to do it. You probably negotiated it beforehand through the interagency; you might not like how it ended up, and there are frequently turf wars and things, but executive orders can direct federal agencies to take actual, meaningful steps. They have limitations for dealing with independent agencies, and we can maybe talk later about, for instance, the cyber trust mark and how that flowed to the Federal Communications Commission, which is interesting because, for students who are listening, executive orders theoretically direct the executive branch agencies, so think DOJ, DHS, Treasury, State, not the FTC, et cetera. But they also really drive federal procurement policy, and that has been a place with real teeth: the power of the government to buy what it wants from the companies it wants and attach many, many strings. Executive orders have directed federal procurement policy to take meaningful steps. So, from my perspective, executive orders are doing the lion's share of the White House's work on cyber these days.

Sasha O’Connell

Thanks, Megan, and thanks for bringing up procurement policy. It's something we always talk about in class: don't overlook that as a lever the White House, the administration, can pull to have real impact on companies that have such large service contracts and relationships with the government and need to make changes to comply with whatever standards are in there; in some cases that then has spillover effects to other customers. So it's a way of setting the tone and potentially raising the floor. What do you think, Drew, executive orders, PPDs? Can anyone vote for a strategy? You guys know I love a strategy.

Drew Bagley

I have to agree with Megan that executive orders have certainly had a tremendous impact in recent years. I think we need to look no further than Executive Order 14028, which was actually issued before ONCD was even created. So a lot of that work came out of the NSC and the other existing functions within the White House, and what we've seen with that executive order, which imposed mandates for federal agencies and also created work streams within federal agencies, is the adoption of better cybersecurity technology and practices throughout government. But then we've also seen different agencies tasked with coming out with, for example, with NIST, new cybersecurity standards for various sorts of things, as well as, to Megan's point, ways to enhance the power of the purse to influence cybersecurity in the private sector as well. So I think that's something that's been very significant, and then to tie back to your favorite thing, strategies.

Sasha O’Connell

Thank you.

Drew Bagley

EO 14028 also being in place before ONCD was staffed absolutely helped inform the first cybersecurity strategy that was created by ONCD. Cybersecurity strategies have been created for decades, but the first one created by ONCD absolutely was informed by that blueprint that existed from the executive order. As a strategy it took the approach of coming up with the five pillars, but that notion of having very specific taskings to agencies, and seeing where the strategy could feed into some of the existing directives that existed under the EO, I think was naturally something that was very complementary.

Megan Brown

I'll just say, on the strategy piece, I have sometimes been a little frustrated by the strategy documents because, at least in this most recent National Cyber Strategy, there was a big pivot towards new regulatory mandates, and one of the challenges with a lot of these White House-driven products is that there are limited and selective opportunities for input from the private sector. The White House gets to choose who they invite to review a pre-draft. They get to choose who is looking at this, and there's typically no public comment on a strategy document. But it is a way, and this White House has certainly signaled on cyber; they told the regulatory agencies, have at it, start regulating, and it was a notable pivot for a strategy document. It was not about the federal government; it was very much private sector focused.

Sasha O’Connell

This is a good pivot to our speed round or lightning round. We're going to try something new here, and maybe we can go straight to your point, Megan, on stakeholder engagement with these White House processes and just talk a little bit. I think you can see how it's a difficult balance, how much is too much, and again, we always talk about efficiency versus inclusivity and consensus. I will say one thing that I have seen that I'm a huge fan of is the actual exchange of people. So things like Presidential Innovation Fellows or academics on IPAs, getting actual people into the White House, or components of the White House, who come from private sector experience or academia, researchers, civil society. What do you guys think? Where does that work? Megan, I hear what you're saying that there's often not enough. Do you see processes where that does go well, where there's good stakeholder engagement, in this administration or others? What are some best practices for the White House in that regard?

Megan Brown

You know, I have seen some good examples of it. I think the NSC has tried to bring in stakeholders and get good input. I guess I'm just a little on the skeptical side that a lot of the ideas are pretty baked by the time they're doing that stakeholder engagement, so I think it's around the margins. There are several cybersecurity advisory committees that, some people can question whether they're all that effective, but you've got NSTAC, you've got other things that bring the private sector together through federal advisory committees to advise the President and the White House, and I think those are moderately effective in identifying issues of interest to the private sector. But I think sometimes when you have a pivot on public policy in the direction of a sort of government philosophy, there should be more opportunity, particularly at the independent agencies, for public comment; the Cyber Trust Mark is one example, a big priority of the White House to get labels put on Internet of Things devices through a voluntary program. It ended up that the Federal Communications Commission was going to own that program. I think that would have been something for public comment and sort of a broader discussion. Should it be at the FTC? Should it be at the Commerce Department? But that's sort of where it landed after the interagency.

Sasha O’Connell

So, Drew, what do you think from where you sit, and I know you have a lot of experience in this regard? How do you see the interactions and what vehicles work when you talk about the White House convening, or when the White House seeks stakeholder engagement? Are there avenues or examples where you do see constructive stakeholder engagement?

Drew Bagley

Yeah, absolutely. I think that the convening authority, even though it's a soft power and absolutely something where there really aren't any sort of mandates, is impactful when used appropriately. So, for example, one of the things that ONCD did once it was stood up was hold a Cyber Workforce Summit at the White House. It invited many large companies, from within technology, outside of technology, within the cybersecurity realm, et cetera, to the White House for the Summit, and naturally, when the White House holds these sorts of summits, there are expectations that stakeholders will show up with some sort of commitment, and that was one where there were lots of commitments made. Naturally, in D.C. you get commitments made; whether organizations hold to their commitments is something that needs to be tracked later. But I think in the case of the Cyber Workforce Summit, there were many commitments made, and really big commitments held to, that probably wouldn't otherwise have been made but for this summit, meaning it's not that organizations wouldn't have been willing to do it, but they would not necessarily have organized around the principle of cyber workforce development, or at least the timing of it, and done it all together but for that sort of summit. You can see similar actions being taken even with, for example, the Counter Ransomware Initiative. That's one that's more government-to-government facing and does not look as much like some of these other summits, where it's all about what the private sector will do, but it's the sort of thing where you could ask, okay, how can a bunch of governments coordinate on ransomware? Maybe you could have treaties and go through that process, which could be really long, or pass something similar, some sort of model legislation or whatnot. Instead, maybe the Counter Ransomware Initiative cannot achieve the things that you could achieve through a treaty or through legislation, but nonetheless, you get a degree of coordination that you wouldn't have otherwise but for the convening authority of the White House. And so I think there are those sorts of things. Now, that's not to say that having mere convening authority is a substitute for actually having new legislation, having treaties if you are talking about working with other countries, or, at the end of the day, having actual operational work streams at the agency level, but I really do think it moves the needle forward, and I think we've certainly seen that in recent years.

Megan Brown

Just to follow up on that. I think one of the challenges with something like the Ransomware Task Force is that the government has to be really thoughtful about how it's getting good inputs from a lot of different stakeholders, because there are a lot of companies that have the resources to play in these venues, and they have the relationships, and sometimes the government may need to stop and think: how am I getting small business inputs? How am I getting mid-market inputs? How am I getting the innovators and the disruptors instead of the typical participants? And, you know, they're great American companies, but they are large and resourced and they can participate in things like those White House meetings, so I think it's important to always try and remind policymakers to look at who is not in the room for those discussions.

Sasha O’Connell

One hundred percent! So, thinking about examples of things that have come out of the White House in recent years, in a very quick lightning-round kind of way: are there things you feel this conversation about White House activity around cybersecurity would be incomplete without at least a mention, that listeners can follow up on and that we can put some more information about on our website? I will start with PPD 41, important to me personally and to our colleague Josh Waldman, who was instrumental in this podcast as well. We spent a lot of time on the policy development and negotiations surrounding it. It is a presidential policy directive that articulates the roles and responsibilities of government agencies during and in response to critical cyber incidents. So this is a cornerstone one; I feel like we'd be remiss not to at least drop it in our lightning round, and we'll post it on our website with a little more analysis and thoughts on its relevance and ability to withstand an increasingly complex world. Drew, how about you? Is there something this conversation would be remiss not to include?

Drew Bagley

I think I've already hit on the one that I think is fundamental to modern cybersecurity in the federal government, which is EO 14028, because that executive order, in addition to influencing the federal government directly, is one where we've seen other jurisdictions, over which the executive order has zero authority, actually copy some of the very same approaches and principles that are in it. The reason why is because Executive Order 14028 is based off of the recommendations from the Cyberspace Solarium Commission, and so I think that is one that, in the same way we look at PPD 41 and we look at many other executive orders and PPDs from even decades ago, we'll still be talking about 20 years from now.

Sasha O’Connell

Awesome, and thanks for dropping the Solarium Commission Report. We will, of course, get to that in our episode on Congress. Couldn't miss the bipartisan, bicameral report. Megan's going to talk about her deep love for every aspect of everything included in that report when we get to that episode. That's a joke, listeners. We'll get there…

Drew Bagley

She loves laws and she loves regulation.

Sasha O’Connell

Megan loves it all! But speaking of which, Megan, and I'm not asking you to say that it necessarily had the most positive impact, but in terms of something we've omitted from this view of White House activity over the last several years, is there something on your list that it's important for listeners to keep in mind, that you want to mention here?

Megan Brown

Yeah, I would say one of the executive orders from 2021, the executive order on improving the nation's cybersecurity. That was a really big executive order that covered how the federal government does cybersecurity, but it also had a lot of other provisions in there. I sort of saw it as a grab bag of ideas that had been around the White House for a bit, and they stuck it all in a pretty important executive order. You know, there's lots of stuff in there that has flowed down, affecting the private sector, and then I do have to pick up on the PPD concept.

Sasha O’Connell

Absolutely. Okay. Before we wrap, as we are here in the spring of 2024 and at the tail end of the Biden administration, I'll just ask you guys for a hot take on overarching trends in what we've seen over the last four years of this administration. From where I sit, just generically, the level of activity has grown exponentially. I began teaching U.S. cyber policy five years ago and it was relatively slow and quiet in terms of keeping up to speed on the news and bringing that into the classroom. Certainly, over the last couple of years, even just looking at the White House alone, the pace of movement in all of these areas, the convening, the strategies, the PPDs, the executive orders, has been exponential and really challenging to keep up with. I wonder, for you guys, are there other trends or thoughts; just your hot take before we wrap on this topic. Megan, do you want to start?

Megan Brown

Sure. I think the biggest trend that I see, which also worries me, is that we have taken a turn in policy towards mandates and regulation. That is going to impose a lot. It not only imposes the challenge you identified, which is that it's just hard to keep up with, but there's going to be an array of fragmented compliance challenges for companies. The government talks about harmonization. Congress clearly wanted there to be harmonization when it passed the incident reporting legislation a couple of years ago, but I don't see a lot of meaningful work to actually harmonize and take some of that burden off of the private sector, and so that's a trend that I observe and, frankly, find a little worrisome. The multiplicity of agencies that are churning out mandates, both substantive and about incident reporting, is just a trend that I think we're not going to get away from.

Sasha O’Connell

Drew, what do you think? Are you seeing the same landscape?

Drew Bagley

Yeah, I think the word I would use is momentum. There's been a lot of momentum in all things cybersecurity over the past couple of years, and we can look back decades and see that each administration has built upon the last with regard to cybersecurity, but we've certainly seen an acceleration in cyber activity in the policy space over the past couple of years, everywhere and in every sort of realm, to Megan's point. And the harmonization gaps are only getting wider because there is such an increase in activity. So, on the positive side, we definitely see an enormous prioritization of cybersecurity. Cybersecurity is no longer some sort of fringe IT issue in parts of government. Cybersecurity is absolutely mainstream, and I think even if we look at developments, we've seen absolute paradigm shifts, where now in CISA you actually have an agency that can be the government CSO and a shared services provider to agencies that might not have the resources and means to do a lot of that themselves. But at the same time, in much the same way that you see the threats constantly evolving and defenders having to keep up with that, defenders also have to keep up with the regulatory and legal risk that comes from all the new cybersecurity requirements, and that's something that is certainly challenging for everyone.

Sasha O’Connell

Thank you for that. And that is going to be our final word on this topic, so we're going to wrap this episode up. We hope everyone visits us at the Start Here website; the link will be in the show notes, with additional resources, and we will put up those PPDs and EOs for everyone's easy reference. We'll also have a transcript of this episode available. We hope you'll join us next time, when we're going to do our next round of Who's Who with a focus on the world of internet governance, a mysterious and yet critically important set of players relevant to the U.S. space as well as internationally. So Megan, Drew, thanks as always for joining me, and we'll see you next time.

Episode 9 - Key Players in U.S. Cyber Policy: Internet Governance

On this episode of START HERE, join Sasha O'Connell, Drew Bagley, and Megan Brown as they navigate the intricate landscape of U.S. government cyber policy. This exploration delves into the roles and responsibilities of the key players shaping the nation's cyber defenses and their interactions with the private sector, academia, and civil society. In this episode, we focus on the constellation of global standards bodies commonly referred to as Internet Governance.


Transcript

Sasha O’Connell

Welcome back to Start Here. My name is Sasha O'Connell, and I'm a Senior Professorial Lecturer and Executive in Residence at American University. In this series of podcasts, we provide a framework for analyzing foundational cyber public policy questions, building on our previous episodes on topics ranging from cyber incident reporting to ransomware and what to do about it.

For this episode, we are again going to take a bit of a different approach. Instead of being policy-topic based, this is the second in a series of five episodes where we're looking at the players in cyber policy, with a focus on those who have a role in the U.S. space, and walking through six specific aspects for each: what that player's role is, what its structure is, what tools and authorities it has, where it tends to play in the policy space, and then a kind of lightning round on recent trends in terms of its priorities and how stakeholders, be they from the private sector, academia, civil society, et cetera, can engage and have their voices heard. For the last episode, we dug into all of these aspects as they relate to the White House, and today we're going to turn our attention to a set of institutions really unique in this series, because they are mostly based outside of the U.S. and certainly outside the U.S. government, and that is the constellation of global standards bodies commonly referred to as Internet Governance. As I've been discussing with my colleagues, who again join me here, Drew Bagley, Vice President and Counsel for Privacy and Cyber Policy at CrowdStrike, and Megan Brown, a partner at Wiley and co-chair of the firm's Privacy, Cyber & Data Governance practice, these issues are on my list of things to discuss whenever I teach, always the most challenging to frame up, and they have probably been the hardest for me to really understand: that there is a series of organizations, referred to as capital-I, capital-G Internet Governance, that are responsible for maintaining the digital scaffolding behind the public internet, or, recognizing that there are private networks that make up the public internet, the technical standards that allow those networks to talk to each other and function in the way we all rely on every day, and that this constellation of international organizations, structured differently from one another, works together to make all of that go. It's a lot to get our minds around, and to do so quickly and without the benefit of visual aids, which we will in fact post on the website, since they are helpful for picturing some of this. But Megan, let me turn to you. Can you break it down for us? What do these organizations do functionally? What are the buckets or flavors of what they're responsible for?

Megan Brown

So, thanks, Sasha, and I do think it is a challenging thing and was new to me as a regulatory lawyer to realize that there's this whole chunk of actors that are not operating by notice and comment that your students could just go track down and follow.

Sasha O’Connell

Exactly.

Megan Brown

It is somewhat impenetrable. There are different flavors of standards bodies, or standards development organizations, SDOs. They have different rules. There are a bunch of different ones, but I kind of think of them as falling into two main categories: big-picture strategy, policy, and agenda setting, and then technical standards and other groups, these may be the IG organizations if you will, that kind of make the guts of the networks work together. I think they're both really relevant to U.S. industry and policymakers, and the importance of these groups is something that folks in Congress and students may sometimes overlook. So, on the former side, sort of the strategy and agenda flavor of this, you've got international organizations like the ITU, the International Telecommunication Union, which is actually a United Nations agency, and therefore its members are actually governments. So this is a piece of the United Nations. It works on some technical standards. It has branches that focus on satellite spectrum and radio spectrum, making sure that all of our things can talk to each other across the different bands, but it also talks about global development and digital equity. And because it's a UN agency, it has a lot of influence on telecom and internet policy strategy and agenda setting, driven by the governments who are member states.

On the other end of the flavor spectrum, there are voluntary organizations and industry organizations that are made up of individual members, and they can develop and maintain a variety of technical standards. It's where all the engineers and the brain people show up and figure out how the bits and bytes are going to travel across the world, and it's really important because you need interoperability; you need the ability for your smartphone to work in a bunch of different countries. These are typically multi-stakeholder organizations that have a transparent, consensus-based governance model, and that's really important under U.S. law. U.S. law favors those kinds of standards groups. An example is the Internet Engineering Task Force, or IETF. Another example is 3GPP, the 3rd Generation Partnership Project, which develops standards for wireless communications networks, and one of the other organizations that's really critical is ICANN, the Internet Corporation for Assigned Names and Numbers. U.S. policy recognizes the role of these voluntary standards organizations and embraces it. Sometimes policy actually commands that U.S. law defer to or use these standards. Students could look up, for example, the Technology Transfer Act and see how important this stuff is to federal policy, but sometimes these bodies get overlooked as folks rush to try and develop cyber standards or international trade, digital trade type approaches. But that's kind of how I bucket the SDOs that are out there.

Sasha O’Connell

Thanks, yeah, and just one shout-out for those listeners in our American University community: the current Secretary-General of the ITU, Doreen Bogdan-Martin, is an AU alum, so we like to bring it home in that way and always give her a shout-out for her incredible international leadership in that space. Okay, Drew, so we've got this very complex constellation of organizations that do the two big buckets of things Megan described. Can you break down how these are organized, both functionally and within the organizations themselves? What are we talking about in terms of structure here?

Drew Bagley

Well, there are enough organizations that I don't think we will demystify everything in the limited time we have in a podcast, but the way I like to think about it is that the internet in some form or another is governed at the physical layer, at the protocol layer, and at the content layer. And all of these are treated a bit differently, with some overlap.

So fundamentally, at the physical layer, with a lot of what Megan was alluding to when she was speaking about telecommunications, there are physical connections that create the internet, and those physical connections, of course, include not only terrestrial connections but also space-based connections, connections between satellites and terrestrial networks, et cetera. And so at the physical layer you have a lot of legacy regulations in place from the telecommunications era, some that really go back over a century under some auspices, when we think about the ITU as an organization, for example, and there's a huge role for national governments to play there. Increasingly, the physical layer also includes data centers, because a lot of traditional internet connections that might have mirrored what we saw in the traditional telecommunications world, or even the telegraph world, are instead now really virtualized, with lots going on in data centers with packet switching and whatnot.

Then you have the protocol layer, and that's often what folks think about when we think about internet governance. That's where you have a lot of entities, or successors to entities, that have been created over the past 30 or 40 years. So, for example, what Megan mentioned a moment ago, the IETF, the Internet Engineering Task Force. The IETF is where the requests for comments that create a lot of the basic internet protocols we've layered onto for decades come from. So, these are protocols we think about with just, how do networks even talk to each other? How do they figure out the most efficient way to route somebody's request to go to a domain name from one country to a server located in another country? A lot of that really comes down to the nuts and bolts that the IETF has created over the years through those who volunteer to be part of it. And then also at that protocol layer is ICANN, the Internet Corporation for Assigned Names and Numbers, which manages both the numbers side of the house, so when we think about IP addresses, as well as the names side of the house, domain names. Interestingly, ICANN has morphed a bit just in the past decade. ICANN came out of an authority that was originally developed in the U.S. government decades ago, first at DARPA, eventually with the National Science Foundation, eventually with the Department of Commerce, and then eventually with a contract the Department of Commerce granted to a nonprofit that was created for that purpose, and now ICANN basically is a multi-stakeholder organization that is in charge of a couple of different functions. One of those is the IANA function, the Internet Assigned Numbers Authority function, and then ICANN also governs contracts with internet registries, which will control a full zone, so the dot-com top-level domain zone, for example, or a more modern top-level domain like dot-security, and there are thousands of these. Those will be governed by contracts that ICANN has in place with registries, and then with registrars. So for the registrars that users oftentimes buy domain names from, the rules that govern how those registrars work and the rules that flow down to a registrant will come from those contracts with ICANN. Then there's a whole group of domain names that exist outside of ICANN; they're called country code top-level domains, and those are regulated on a country-by-country basis. The ITU also plays a role in some protocols, and then there are regional internet registries. Regional internet registries basically have authority over blocks of IP addresses and ensure that others are not trying to set up IP addresses in a way that would lead to collisions, number collisions and whatnot. So they'll delegate IP address blocks accordingly.

And then, beyond all that, so that's the protocol layer, you have the content layer. Sometimes today, when we're talking about the internet and internet governance, what folks will really be getting into a debate over is the content layer: whether or not entities should be doing more or less to block content, say content that is actually at a specific domain name pointing to a specific server, and what's hosted there. They're really talking about what's hosted, and so those protocols can certainly influence content one way or another, but they're not necessarily intended to, and that's what makes this interesting. Content, generally, is regulated domestically, even if people are getting into these debates globally and thinking about things from a global standpoint; there really isn't a pure global mechanism to regulate content. These debates can absolutely affect one another, but I think it's really important to note that the layers are really designed to be regulated in different ways.
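[A brief technical aside for listeners following along with the resources: below is a minimal sketch, in Python using only the standard library, of the two halves of the protocol layer Drew describes, the "names" side (a domain) and the "numbers" side (the IP addresses the DNS hierarchy resolves it to). The domain used is example.com, which IANA reserves for documentation; nothing here depicts any specific registry or registrar system.]

```python
# Minimal sketch (Python standard library only): the "names" side of the
# protocol layer is a domain name, and the "numbers" side is the set of IP
# addresses that the public DNS hierarchy maps that name to.
import socket

def resolve(domain: str) -> list[str]:
    """Return the IP addresses the DNS hierarchy currently maps to `domain`."""
    # getaddrinfo asks the local resolver, which in turn walks the delegation
    # chain (root -> top-level domain registry -> authoritative name server)
    # that the registry/registrar contracts Drew describes make possible.
    results = socket.getaddrinfo(domain, None)
    # Each entry is (family, type, proto, canonname, sockaddr); the address
    # string is the first element of sockaddr.
    return sorted({entry[4][0] for entry in results})

if __name__ == "__main__":
    # example.com is reserved by IANA for documentation and testing.
    print(resolve("example.com"))
```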

Sasha O’Connell

That makes sense, and while content moderation, to your point, may not be a tech or cyber policy issue that internet governance organizations directly address, I think we were talking about some examples like DNS abuse and its connection to phishing, or creating capabilities that allow countries, or potentially organizations, to do surveillance or have access to data about users. When you think about those kinds of policy examples, and you think about this constellation of resources, Megan, maybe I'll start with you, but then go back to Drew too. Again, back to the structure, governmental versus these non-governmental organizations: are there pluses and minuses here, and what authorities do these non-governmental organizations have to enforce any of the decisions they make? How does that all work?

Megan Brown

Well, I think in terms of the structure, most of these standards development organizations, or standard-setting bodies, are supposed to be policy agnostic. They are about making things work. They're about letting certain technologies use certain radio frequencies, in the case of some aspects of the ITU. For 3GPP and some of the telecom standards, they're about having standards that manufacturers of equipment and network operators can look to, to know that their stuff will work in a variety of places. They're not making judgments about under what circumstances surveillance should happen, but about how the surveillance is going to be done from a technical perspective: how can your networks talk to each other? So I think that's kind of the conceit of a lot of this, that they're supposed to be policy agnostic, and it's maybe a whole separate class on whether that actually happens at the various bodies, given who shows up and who's trying to influence them. But they're ideally about technical interoperability and technical best practices, not weighing in on the why; the policy discussions Drew was raising, the decisions about whether someone should do something, are different than can they and how.

Sasha O’Connell

Yes, Drew, we know though that this is sort of a blurry line between standards, which you can see everyone's incentive to collaborate on to make sure their products function globally and are open to the open markets, and then the potential policy implications, or governments' desire to normatively put a finger on the scale one direction or another around some of these capabilities. What's your thought on that, or your experience with these different organizations, and the balance between governmental structures, non-governmental structures, and potentially government engagement in non-governmental structures? How do you see that?

Drew Bagley

Yeah, as Megan noted, I think that a lot of these organizations were set up, and still try to adhere, to this notion that their job is to make things work, to make things as efficient as possible, rather than get into policy setting on how to use these networks. Nonetheless, because of the design of especially some of these groups that have a multi-stakeholder model, where you have a lot of different parties showing up, you naturally have these sorts of debates creeping in, and oftentimes when these debates come up, over, say, DNS abuse for example, there's a big focus on figuring out, within the confines of what a particular body focused on protocols is supposed to work on, how they can address the issue with that toolkit without mission creep or without inadvertently having unintended effects on other things like content regulation and whatnot. However, there are bodies that are set up to engage much more on those policy topics, even though they don't have rulemaking authority. So, for example, the Internet Society has existed for decades; the Internet Governance Forum, which is part of the United Nations, gets into all of these thorny issues, even though it has no rulemaking authority like these protocol bodies do; and then the World Summit on the Information Society, which is newer, also convenes to engage in these sorts of debates. But these debates naturally break down into different countries having completely different values and views on these sorts of topics, and at the end of the day, when it comes down to real regulation, the heart of real regulation in most of these areas is still domestic. I do see, though, that there are persistent debates over DNS abuse, as you were mentioning, Sasha, that appear in all of these bodies these days, because that ties so closely to cybercrime, and just about every country is affected by cybercrime, by data breaches, by how that affects their infrastructure as well as the privacy of their citizens. So naturally, when there's a choice between building something more secure or less secure, or setting a new policy in a way that's more secure or less secure, you see these debates come up, but that doesn't mean that any of them have easy solutions.

Sasha O’Connell

Sure, absolutely. Okay, two quick lightning-round questions, which I'm going to ask and also answer myself before heading over to you guys to wrap. So the first is, what do we see trend-wise? I will just throw out there that when I was at the FBI and had the cyber policy, tech policy, externally facing portfolio, so around 2014, we had a very, very small team, a team of two at the time, who on behalf of the FBI, and sort of U.S. law enforcement too, to the extent it was possible to coordinate, was engaged with these organizations. They became engaged because of interest around, as we were just talking about, cybercrime and attribution: how do you figure out who did it? It requires a certain ability to look up names and numbers, as we were discussing, or resolve back to individuals or organizations. So things like that really pique the interest of law enforcement, and they want to make sure at the standards level that some of that exists, or if it doesn't, that people at least understand the trade-offs when you're making those kinds of functional and technical decisions. So that's where we were in 2014. I know today that team at the FBI is much bigger, and law enforcement globally has gotten much more engaged with these policy conversations, both from their desks, in comment periods, and also in terms of traveling to some of these meetings. We haven't really talked about it, but many of these meetings are open, and people who go can have a vote, and it really depends on who shows up. So, I know that is a trend that's happened over the last 10 years or so. What are you guys seeing? Megan, maybe back to you to start. What's one of your hot takes on where we are with all this today?

Megan Brown

We've seen Congress and the White House at various points put emphasis on trying to engage more, whether it's the U.S. government directly, sort of DOD or others, wanting a seat at the table for some of these standards activities, or trying to push U.S. industry to be engaged, as you've alluded to. These engagements take time, they take money, and you need people to actually show up at these things, so it's been a challenge sometimes for the government, I think, to get the right balance of having the right government folks. I personally don't think it should be DOD. I think NIST, our favorite, the National Institute of Standards and Technology; some of those folks are active in these bodies, and they may be preferable to something like DOD, because there is this concern, people worry about the influence of foreign adversaries in these bodies. I personally don't think the answer is to model their behavior and have our government become super assertive in these bodies. I think it's to encourage U.S. industry to be really active in these bodies, so Congress regularly considers legislation, and the White House will have plans for this. So I think it's a challenge, but for policymakers, I would say, just remember that the bodies in play have different constituencies. So if the government wants to show up at the ITU, yes, that's where they should be; governments run the ITU. Maybe less so at some of the private standards organizations, where maybe they should encourage U.S. industry to take the lead.

Sasha O’Connell

Makes sense. Drew, what are you seeing here in terms of recent trends from where you sit?

Drew Bagley

Sure, to take the DNS abuse example we were speaking about a moment ago, that has been a really hot topic, and to Megan's point on different nation states showing up at different venues and trying to have a lot of influence, you really see that come out, where there might be a lot more engagement in a positive way from folks who are worried about DNS abuse and some of the security issues on the internet, but the proposals run the gamut across the globe. You'll see certain countries advocating for what would be essentially content regulation, and then you'll see others pushing proposals to curb DNS abuse that focus much more on what could be done proactively at a technical level, or with much more objective standards. So, for example, take phishing, which I think is a great way to think about how complex these issues get. If you're dealing with a domain name that's part of a botnet, you want to just block that domain name, and at the end of the day you can feel pretty confident you weren't curtailing anyone's free speech rights. When you talk about phishing, there are obvious examples where you know a web page that's being hosted, the hosted content, is being used for phishing. There are other examples that are much more gray and much less obvious, and if you take an approach where you're just automatically blocking everything, that's the type of process that, once developed for something that might be very well intended, could then be abused by others for purposes that are not well intended, to block the speech of opponents and whatnot. And then similarly, with turning over the information behind a certain registration, a lot of that is necessary for correlation analysis, but there are certain details where, when you're talking about setting some sort of global standard, you have to picture how that would be abused in certain countries. Is that the sort of internet we want, or would that have some sort of chilling effect? So, with all these things, the debates are about real issues, the issues do need to be solved, but the solutions generally are ones where, even if they would work in certain countries with the rule of law, they might not be acceptable in other jurisdictions. So a lot of that, I would say, is really coming to a head now because of cybercrime just being so rampant.
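[Another brief technical aside: the sketch below, in Python, illustrates the kind of resolver- or registry-level domain blocking Drew contrasts with content regulation, and why it can sweep more broadly than intended. The blocklist entries and domain names are invented for illustration only; this is not a depiction of any actual registry, registrar, or resolver product.]

```python
# Hypothetical sketch of domain-name blocking. Blocking is keyed on the name
# itself, not on what content is hosted there, which is why Drew frames it as
# a protocol-layer tool rather than content regulation.
BLOCKLIST = {"botnet-c2.example", "phish-login.example"}  # invented names

def should_block(domain: str, blocklist: set[str]) -> bool:
    """Block a lookup if the domain, or any parent domain, is on the blocklist."""
    labels = domain.lower().rstrip(".").split(".")
    # Check "a.b.example", then "b.example", then "example".
    for i in range(len(labels)):
        if ".".join(labels[i:]) in blocklist:
            return True
    return False

# The parent-domain match is the double-edged part: blocking an entire zone
# also blocks every innocent subdomain underneath it.
print(should_block("phish-login.example", BLOCKLIST))        # True
print(should_block("mail.phish-login.example", BLOCKLIST))   # True (swept in)
print(should_block("news.example.org", BLOCKLIST))           # False
```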

Sasha O’Connell

It's so interesting. Before we jump, I also just want to go back to stakeholder engagement, which we talk about with all the players, and I think it's so interestingly unique here too. So just a quick hit on this: we've outlined these really critical issues, the underlying scaffolding of which is being decided in these bodies. My thought on stakeholder engagement here, and I'm curious how you guys see this, is that it's so different in some of these organizations from, say, my experience interacting, as we talked about last time, with the National Security Council and interagency policy development, a highly structured process. I've never attended myself, but I hear, as you guys mentioned, you can fly to some of these meetings, IETF or others, and when they're trying to finalize standards, the rumor, as I understand it, confirmed by multiple sources, is that there's humming, and someone has to decide who hums the loudest for the different positions. This, to me, is so interesting: open meetings and such an informal way of proceeding. And again, I understand, having sort of studied this, that it aligns more with the processes around technical development of products, and the folks who are involved in that are much more comfortable with this structure than someone who grew up in a very hierarchical government organization. It makes me a little edgy to think about. How do you guys see stakeholder engagement? Megan, when you work with industry folks, do they feel heard? Is it easy to engage? How does the stakeholder piece work with these organizations?

Megan Brown

I think it really depends on the venue, but from my experience and from talking to clients and others who participate, they're genuinely consensus based. Now, you can pick apart the attendance levels and who showed up and who ran a steering committee and who did this, but I have been assured, with some comfort, that in the telecom space they really are consensus based, that bad ideas get weeded out, and that there's not some backdoor, sneaky way to create new telecom standards that favor one player versus another. I'm not a telecom engineer, so I couldn't tell you personally, but there's this consensus-based approach, and then there's whether a standards development organization actually meets the requirements of U.S. law, like the Technology Transfer Act and others: you want open governance documents, you want transparency, you want some geographical balance, and I think they get that. It's different in the ITU and the governmental bodies, but I do think, at least for my narrow slice of the world, it seems like they get that right and they're abiding by their commitments.

Sasha O’Connell

It's fascinating. It begs the question whether, to really engage successfully in cyber and tech policy going forward, governments are going to have to act more like these non-governmental organizations in terms of transparency and the other things, or not, if this is in fact the best practice. Drew, what do you think? What do you see?

Drew Bagley

Yeah, there are lots of strengths to the multi-stakeholder models. There's also lots of frustration, naturally, for the very same reasons Megan was mentioning: because it's hard to sneak in nefarious protocols or whatnot, it's also hard to really do big transformational things and tackle some of these difficult issues. The internet has now aged into its adolescent years and has the classic problems that come with them, and so naturally, when it comes to tackling cybersecurity and some of these issues, that's what makes it very difficult in places like ICANN, for example, to get consensus on what to do about DNS abuse and some of these other issues. Nonetheless, these bodies are set up in such a way that, because they are consensus driven, they're inherently strong from a muscle memory standpoint. You have folks there for the right reasons, generally speaking. But what that also means is that there are countries trying to have an outsized influence in some of these organizations, trying to flood the zone sometimes with policy proposals and whatnot, because if you are able to submit enough, and there are too many for people to thoroughly review everything, for there to be enough other countries and enough private sector and civil society stakeholders on all the committees, then that's how you can, over time at least, have some outsized influence that you might not have had when there were fewer proposals. So, I know that there certainly have been talks on how to fine-tune some of these processes that have been in place for decades to prevent some of that sort of abuse.

Sasha O’Connell

Well, with that, I think we at least got a start, guys, on what is a truly, truly complicated environment, but I do think it makes a profound difference, certainly it has for me in trying to understand U.S. cyber policy, to at least bring these organizations into my peripheral vision and start to understand the role they play in the broader context, and then directly on some of, as we mentioned, the issues that are front of mind in U.S. cyber policy today. So, with that, we are going to wrap this episode. We hope everybody visits us at the Start Here website, and the link for that will, of course, be in the show notes, to see additional resources, and we'll have the transcript of this episode available there as well for reference. We also hope you join us next time. We're going to do our next round of players with a focus on the U.S., with a specific discussion about who's who in the Executive Branch, an interesting area that has certainly grown, speaking of trends, over the last several years. So, Drew, Megan, thanks again for joining me, and we'll see you all next time.

Episode 10 - Key Players in U.S. Cyber Policy: The Executive Branch

On this episode of START HERE, join Sasha O'Connell, Drew Bagley, and Megan Brown as they navigate the crucial role of the Executive Branch in shaping U.S. cyber policy. This episode highlights how federal agencies like the Cybersecurity and Infrastructure Security Agency (CISA) operate at the forefront of national cybersecurity efforts, coordinating responses, and setting policies.


Resources

Key Questions

  • What are the main federal Executive Branch activities that relate to cyber?
  • What is the regulatory role of federal cyber agencies?
  • What legal role do federal agencies play in regulating cyber incidents?
  • How do various federal cyber agencies coordinate their regulatory activities?
  • What are the most important levers and tools used by the executive branch in cyber policy?
  • What is the public engagement role of the executive branch in cyber policy?

Transcript:

Sasha O’Connell

Welcome back to Start Here. My name is Sasha O'Connell, and I'm a Senior Professorial Lecturer and Executive in Residence at American University. In this series of podcasts, we provide a framework for analyzing foundational cyber public policy questions, building on previous episodes on topics ranging from incident reporting to ransomware and what to do about it. For this episode we are again going to take a bit of a different approach, and instead of being topic-based, this is the third in a sub-series of five episodes where we're looking at the players who have a role in cyber policy, with a focus here in the U.S., and we're walking through six aspects for each of those players: one, what their role is; two, what their structure is; three, what tools and authorities they have; four, where they tend to play in the policy space; five, recent trends in terms of their priorities; and six, lastly, how stakeholders, be they from any aspect of the private sector, academia, civil society, et cetera, can engage and have their voices heard with those players. For the last episode, we dug into these aspects as they relate to the wonderful world of internet governance, and today we're going to turn our attention back here, specifically to the United States, to the current configuration of the Executive Branch and how it, specifically, is positioned to address cyber and cyber policy. To work through the ins and outs as usual, I'm again joined today by Drew Bagley, Vice President and Counsel for Privacy and Cyber Policy at CrowdStrike, and Megan Brown, a partner at Wiley and co-chair of the firm's Privacy, Cyber & Data Governance practice, and I should say we are recording in person today at Wiley, so it's fun to be together!

Good to see you guys! So fun. Okay, let's get to it and jump right in. When it comes to the Executive Branch and cyber, I think the foremost place to start is how much change we've really seen, certainly in the last ten years or so. I know when I teach, I pull out the, I guess it's 2010(ish), bubble chart titled U.S. Federal Cyber Security Operations Team Roles and Responsibilities. I know you guys have seen that as well. I saw it live and in person when I was in government. Some might remember that's a PowerPoint slide we literally used to carry around with us and share in the interagency to explain the who's who in the Executive Branch. So fast-forward to today, or relatively recently: there is a really nice 2020 GAO report titled Key Federal Entities That are Responsible for the Nation's Cyber Security, and the thing here is that back in 2010, there were about three departments listed in that bubble chart, while the GAO report from 2020 lists seventeen. So that kind of change is obviously super important and something we can try and walk through a bit. In addition to blooming in terms of just numbers, the roles and responsibilities today are commensurately much more expansive than they once were. So, I think we might start with some very basics in terms of their priorities, that being resilience, maybe at DHS and DOD, and maybe we can talk too about some intrusion responsibilities, investigation and prosecution, at DOJ, but really it is cyber for all in the Executive Branch.

Yes, Megan, I know you're in deep with many of these functions. Do you want to kick us off?

Megan Brown

Yeah, absolutely. So, I think you captured a couple of the main federal Executive Branch activities that relate to cyber, and I can address, kind of, who's doing those things, but another big piece of what's going on in the Executive Branch is a move towards regulations. So, I don't want to leave out some of those agencies, but yeah, jumping into the resiliency piece that you teed up, you've got CISA, right. CISA is within DHS, but it's its own entity. It is focused on national resiliency. It considers itself the hub for getting information out to critical infrastructure. It's also kind of the cyber point agency for the federal civilian agencies, the non-DOD part of the house, when another federal agency has a cyber issue or DHS says there's a problem. They can issue binding operational directives that tell agencies what they have to do to secure their systems. So that's a really key piece. CISA is partnering with a bunch of regulatory agencies as well, and so there's lots of activity coming out of CISA.

You've also got, when it comes to protecting the federal government's data, as I alluded to, CISA has a role to play there, as do the federal chief information security officers. You have a main Federal CIO, and then you've got CISOs across the different agencies. They're responsible for complying with things like FISMA and FedRAMP and the things that Congress tells the government it has to do to protect citizen data and government functions, which are constantly under attack from nation states. There are a lot of GAO reports, I will point out, that think there remains a lot of work to be done, but that's a big piece of it. Then you've got cybercrime and cyber intrusion investigations and prosecutions, which is, you know, Sasha, your old stomping ground: the FBI, the Department of Justice, the Secret Service; DHS also investigates a lot, there's HSI, you've got the U.S. Postal Service, and a whole bunch of different entities that have a role to play in investigating cybercrime and cyber intrusions. That doesn't get into the DOD side of the house, which Drew might want to chat about later.

And then I think I'll pivot to the regulatory piece of it; some people would think of this as the consumer protection type bucket of government functions. You've heard loud and clear from the White House telling the Executive Branch to go forth and make regulations, and to understand that, a key differentiator within the Executive Branch is that there are two flavors of agencies that have a role to play, or can have a role to play: independent agencies and executive branch agencies, and that will become important to a whole bunch of policy disputes down the road. But suffice it to say, the Executive Branch ones are the agencies like DOJ, DHS, and Treasury that are directly responsive to the President. The independent agencies, theoretically, are insulated from that political control, and here we're talking about the Securities and Exchange Commission, the Federal Communications Commission, et cetera. The last agency I'll plug is within the Department of Commerce; a super important player in the cyber policy and technical standards role is NIST, the National Institute of Standards and Technology, absolutely critical and doing tons of things for more than the past ten years. So, I think that's the whirlwind tour of the Executive Branch and some of its functions related to resiliency, standard setting, and investigations.

Sasha O’Connell

Absolutely, and you touched on the protection of federal data too, right. We always talk about, in addition to these externally facing roles and responsibilities, every department and agency being responsible for its own data, and then some departments and agencies are responsible for helping all federal agencies elevate their game in that regard. And then, I think we talked about it briefly, but just to put a finer point on it: within all of those, as you mentioned, Megan, there are priorities. You can't do everything, so the Executive Branch, and we talked about the White House last time, has a focus on critical infrastructure, everything else being equal, a place to start in terms of resiliency, and in terms of investigations, as you mentioned, a focus on strategic nation state adversaries and nation state actors.

Okay. That's a lot of things. Drew…

Megan Brown

One more thing…

Sasha O’Connell

Yeah.

Megan Brown

I would be remiss if I didn't also note that the last role that the government has to play is in, as you said, Sasha, sort of protecting its own stuff. It does that not just through regulation and what CISA does, it also does that through federal contracting and so we've been watching and participating in many activities at the federal level to elevate and impose additional requirements on federal contractors; not just at DOD, but it's expanding dramatically across federal contractors and that can affect everyone from your traditional defense contractors, to the folks who help with Medicaid and Medicare and anyone who sells a widget to the government, so it's a broad way for the government to try and use that additional power to adjust private cybersecurity.

Sasha O’Connell

Absolutely! So it sounds like the Executive Branch is super busy with just that. Drew, are there other things going on or have we covered everything?

Drew Bagley

I think we covered it all…

Sasha O’Connell

We can go home. Okay, great.

Drew Bagley

So, I think it's important, when we think about what the federal government's role is in cybersecurity, that oftentimes what we're thinking about actually is coming from the Executive Branch. Rather than always needing a new law or a new policy to solve a cyber problem, sometimes it's a matter of the fact that crime has migrated to new electronic forms, and we're thinking about, how do we prosecute this? How do we ensure that adversaries are not able to leverage new means to commit old crimes? That's a lot of what we see the Executive Branch doing in cybersecurity, even where there are not necessarily bespoke cybersecurity authorities. So, for example, with DOJ, naturally, ever since there's been crime using computers, DOJ has been involved. A lot of that, of course, stems from what we've covered in earlier episodes with the CFAA, but even when we see other forms of crime that deal with extortion, where there can be other statutes at play, we see DOJ playing a big role in those investigations and prosecutions. We also see a lot with regard to the inherent and very unique authority of the government to actually conduct offensive activities against adversaries. A lot of that naturally stems from victims asking, what can we do to disrupt the adversaries? Why can't we hack back? Naturally, private sector entities cannot do that, for very good reasons, but that's where the government does have that unique authority to do takedowns, sometimes working with private sector partners on those takedowns. And then something very inherent to that government authority is offensive cyber operations that might be tied to active kinetic conflicts, or to more surreptitious means and things we never even read about in the news headlines, and a lot of that comes from the Executive Branch. That's where it gets really interesting, even with what Megan was mentioning a moment ago with DOD: DOD has aspects and responsibilities that are in charge of protecting its own data from adversaries and also maintaining the integrity of that data, so that other branches relying upon DOD data have information assurance. But then under DOD you have the NSA, and the NSA wears multiple hats, including that of signals intelligence, gathering information that increasingly is in the cyber realm and not in the traditional over-the-air signals realm, as well as its leadership of U.S. Cyber Command, which can take these offensive actions. And in addition to that, there are softer, kinder things like coming up with standards for workforce development, like the NICE Framework, which I know you're very familiar with, Sasha.

Sasha O’Connell

Yeah.

Drew Bagley

And then also there's that general convening authority we've talked about before, where sometimes it can be used for more formal things like a takedown, but sometimes the convening authority - we talked about it in the context of the White House, but other agencies can also get people together - can shine the spotlight on certain cyber issues and even get voluntary commitments from different organizations that can move the needle forward, or at least get the conversation going on different topics, even in the absence of new policy.

And then in addition to all of that, I think the Executive Branch importantly also helps set the tone and the narrative and the messaging on what we see as a priority in cybersecurity, and perhaps what might not be a priority, because there's a signal versus noise problem in all things in public policy, and cybersecurity is certainly one of the realms that suffers from that as well. That's where I think it's really helpful when the Executive Branch is able to kind of highlight certain topics, and that can then actually get activity from Congress and even from the private sector.

Sasha O’Connell

Absolutely. So, in terms of role, I mean, as you guys laid it out, there's a lot. From protecting data to investigations, to workforce standards, as you mentioned with the NICE framework, to offensive operations. And, in terms of structure, I think it can be really kind of complicated externally to understand where all that sits, and I know, even having grown up professionally at the FBI, I didn't come to realize until later how complex it is. The environment's gotten more complex in terms of players on the playing field, but even from the inside, it's hard to see. In particular, one thing I experienced was the internal separation between the national security and homeland security agencies and then the more kind of commerce-focused or consumer-protection-focused departments and agencies, and the lack of kind of ability to make sure there's coordination across it all.

I know, so when we think about structure, you hear the language 'team sport' in cyber all the time from leaders in government, and certainly the lessons learned around the task force model and applying it here seem to be a real focus of the Executive Branch. One example, of course, is the NCIJTF, which is that investigative coordination hub run by the FBI out in Chantilly, which brings together departments and agencies - federal, state, local, and private sector - to make sure there is that coordination piece, but I know those things, again, particularly across the kind of DOJs and commerce departments of the world, remain hard to coordinate.

Drew, how do you see all that? Like how, you know, from your perspective, do these things get coordinated or what structure is happening and maybe what structure, question to both of you, should be happening in the Executive Branch around all this?

Drew Bagley

I think, you know, on the one hand, many folks dream of there being one central place to go for all of these issues. On the other hand, cybersecurity, like anything else, is part and parcel of the everyday functions of all the other agencies and is going to be. So that's where I think there have been two developments that obviously do not decomplicate things as much as we need, but are at least a step in this direction, in theory. One is, of course, the way in which CISA has been empowered to be a shared services provider to the federal government, where then, within the Executive Branch at least, you don't have every agency fending for itself, and in theory too, over time you can have the federal government leading by example with one set of standards for what it is doing on cybersecurity. That naturally takes time, but that's a key part of CISA's role, in addition to CISA working with critical infrastructure providers and doing a lot of other things. And then the other is this notion that, for a couple of decades, we've had some sort of coordinator role that's either sat in the White House or the State Department or both, and now what we've seen is the formalization of both, with the Office of the National Cyber Director sitting in the White House and with the State Department Cyber Bureau taking on the cyber role there. So, right now we kind of have the first generation of a lot of these things, and what's going to be important is seeing how they're institutionalized. How do these newer institutions maintain structure from administration to administration, and as they get new leaders in, especially in roles that are political roles where you have people swapping out, will they be able to actually bring all the agencies together and coordinate? That remains to be seen, but we at least have the infrastructure in place to start that journey and start that process, because before, since there was such a lack of coordination, every administration saw the need for there to be somebody at least doing some sort of coordinating. But it's interesting, because you have coordination without the power of mandates - that's what it was like when it was a one-person job, and that's what it's like now when we have, you know, a new agency within the White House - and that's something I think is going to remain challenging. Not to say that you necessarily want to give that office mandates at the expense of all these agencies that do have autonomous roles to play, but just to say, it's a trickier thing to solve.

Sasha O’Connell

Yeah, and thanks for bringing up the State Department. We didn't talk about that previously, but obviously there's been growth there with Ambassador Fick, and then coming out of RSA - for our listeners who weren't one of the 40,000 people joining us out there last week, the three of us were all out there when we weren't on our flights to DC - it was really interesting to see the Secretary of State give a keynote. Really interesting to see that leadership on behalf of the Executive Branch coming from the State Department. So to Drew's point, it adds another player, sort of, in terms of leadership that needs to be coordinated in.

So Megan, what are your thoughts on that? Is the Executive Branch organized as it should be, or what do you think about where this is all heading?

Megan Brown

I mean, I will say I get a little frustrated by the multiplicity of folks who want to be out in front on cyber. It was great to see the Secretary of State out at RSA in a sense. At the same time…

Sasha O’Connell

Raises questions, sure.

Megan Brown

Right; what are you doing here? DHS is here in force and half of CISA was at RSA to roll out many of their varied initiatives. So, I think if it's being done in a thoughtful and coordinated way led by the White House, great, but if it's very fragmented, which right now it really feels very fragmented, then everyone wanting to be point on some aspect of cyber just strains private sector resources and presses, I think, on that partnership model. I don't know if the listeners recall - we've talked about it in various of the pods before - but there's been decades of public-private partnership and collaboration that I think this administration has decided doesn't fit the bill. It's not enough. But I worry that some of this push, this pivot to regulation, is straining those partnerships; and another thing that will strain the partnerships is just, you only have so many people staffed in the DC federal cyber offices of major corporations. So, you know, there's only so much resource to go around to have people monitoring what's State doing today, what's DHS doing today. So, I mean, I don't have a great solution of like, oh, we should reorient and restructure the federal government. I think the ONCD experiment is sort of still open, and we'll see what people think of that in a couple of years - if that has worked, or if it just created yet another, you know, group of people that need to be out in front on cyber.

Sasha O’Connell

Sure. Drew sees it optimistically as a tipping point toward fantastic collaboration and Megan is a little more skeptical. I was shocked!

So I think we all agree, though, in terms of structure, in addition to role, it's a lot. There are a lot of players on the playing field, very, very senior people in government all now have very senior roles, and ensuring that coordination, both on the government side and, Megan, to your point, so that folks who interact with government can keep tabs and coordinate as needed, is super important going forward.

Okay. So we have all these departments and agencies on the playing field here with all of these responsibilities we've talked about. You know, we could go on, and I know I sit with two lawyers who are very familiar with the actual authorities that exist, so we're going to try and keep it a little bit high level here. Keep it in buckets. We've already talked a little bit about investigation. We've talked a little bit about regulation. Megan, you mentioned contracting also as a super important kind of tool or lever the federal government can use. What do you guys think here? There's also funding and how that gets used. There's also prosecution, sort of that other piece that comes after investigation in some instances. What do you guys think are the most important levers and tools, or anything else I'm forgetting from my list?

Megan Brown

I mean, one thing that I felt was not on the list that should be is that partnership model that I alluded to, and information sharing. One thing that kind of bewilders me about how we've gotten to where we are is, you know, nine years ago Congress passed the Cybersecurity Information Sharing Act of 2015, and I don't know that that got enough focus, or attention, or love, and now here we are in a very much different kind of, let's move to mandates and standards and things like that. But that partnership model of information sharing, I think, is a really important piece. The government has a monopoly on the offensive use of force, like Drew mentioned. They also have a whole bunch of information that they can and should be sharing with the private sector, and that's an old saw that people have complained about for a long time, but it remains true. They put out a lot of information post the Ukraine invasion, but was it really actionable? Are they really putting out into the private sector the kind of things that cyber defenders can meaningfully use? I think that's an important piece of the puzzle, in addition to all of the other things that you mentioned, that we can certainly do a dive on.

Sasha O’Connell

Absolutely; information sharing, definitely old challenge, in terms of operationalizing, but I think your point's well taken. As important as ever today for sure.

Drew, what else? What are we forgetting or funding or prosecution or what's your thought on this?

Drew Bagley

I think we see a lot of frustrations with prosecutions because we're dealing with multi-jurisdictional global e-crime or nation-state actors, and that's something that, you know, from a victim perspective, really is a game of whack-a-mole in terms of attempting to do indictments and whatnot, but nonetheless they still send a message, send a signal. I still think that's important. But I think that ultimately what's really important for the Executive Branch is to constantly ask how we raise the cost for adversaries so that it's more difficult, slower, and we disrupt their ability to act at scale in doing these types of attacks. And that's where, to Megan's point on these public-private partnerships, I think the most important part is for the government to use the authorities where it has its monopoly and focus on that, and focus on its convening power where it can do something unique, rather than attempt to recreate what already exists in the private sector in terms of threat intelligence information and all these other things that weren't as robust, frankly, over a decade ago. But where things are already that robust, I think what's really important is that the government has a unique ability to raise the cost for the adversary, and that needs to be the focus.

Sasha O’Connell

Can you talk, in that vein, about something I talk to students a lot about too? When we talk about investigations and then prosecutions, if the adversary is not in the United States, what tools does the United States have? And we talked a little bit about the power of name and shame. What does that mean? Can you guys talk a little bit about that or explain how that might be used as a tool? Again, it needs to be closely coordinated across the functions of the federal agencies, but what is that all about as a tool?

Megan Brown

I mean, I think name and shame does make a difference and I'm a fan of the prosecutions and indictments in absentia where, you know, these bad guys are scattered across the world. They're in places where we may realistically never, ever be able to get ahold of them, but I still think there's an important norm, and you know, I feel like five years ago we talked a lot about international norms in cyberspace, and that seems to have faded away a bit. And yeah, you're never going to get certain countries to agree with our norms, but it's still very notable, I think, when you have the LockBit takedowns and some of these big international collaborative efforts.

Drew Bagley

Avalanche.

Megan Brown

Yeah, and it sends a message, and yeah, it might be frustrating because maybe you're just preventing some of these people from traveling internationally for, you know, a few years, but it's important to keep saying that these are crimes, because I think it's important to respect who the actual victims are in the United States and to remind people that these are bad people doing things to U.S. citizens and businesses, and that's an important thing for the Department of Justice to keep doing.

Sasha O’Connell

Perfect.

Drew Bagley

There's also a muscle memory created by doing that sort of coordination and as much as the tempo can be increased, that's important for really saying, okay, if you're going to set up infrastructure and commit cyber crime, well then you can't expect your infrastructure to be resilient and there is going to be a cost and your infrastructure is going to get burned and the more of these we do in coordination with allies and the more often we do them, then that actually can lead to disruption; I think that's important.

Sasha O’Connell

Absolutely. Okay. So we've talked about name and shame as one example of the Executive Branch in action. Let's just maybe go around real quick and add some other examples of what this can look like. I'll start: we haven't talked a ton about education. I mean, Megan, you mentioned information sharing, and sort of the next phase of information sharing, right, is actual education. You mentioned the role of CISA. I'll call out two campaigns specifically. One is the Shields Up campaign, which probably most folks are familiar with. They did a tremendous job kind of amplifying that information around the invasion of Ukraine in terms of, you know, the need for folks to focus on resiliency at the organizational level and at the small business level - it was really focused on organizations, the Shields Up campaign - and again, sharing information, but in a way that takes that additional step to really help folks understand, and sort of goes into the world of teaching and education. CISA also has, and has recently reupped, their Secure Our World education campaign, and that's for individuals, so families and individuals. It's focused around things like updating your software, complex passwords, backing up your data, and avoiding phishing - four things they have a very large campaign focused around. And we have found - I did some research last summer - that there are actually lots of dot-govs that have done training materials that fit in those four buckets. So the government's really invested a lot there in terms of education, and it's something you can really point to as a work product of the Executive Branch.

What do you guys think? So we talked about sort of name and shame campaigns and the impact they can have, education.

Megan, what do you think? What would you point to as kind of the government in action in this regard?

Megan Brown

I think I've got two, to draw some contrasts. One is the massive new rulemaking at the Department of Homeland Security - really within CISA, for those who are picky about that - to implement the Cyber Incident Reporting for Critical Infrastructure Act. That is a huge new piece of regulation. It is being developed right now. It is going to affect, I think, millions of U.S. businesses; CISA says maybe just 300,000, but we'll see. That to me is an example if someone's looking for a case study on, like, hey, here's cyber regulation. That's that, right? By contrast, I wanted to point, to go back to the convening function that you and Drew had talked about, to the Federal Communications Commission. They make good use of advisory committees, which we didn't really talk about, but there's a whole bunch of them across the federal government that enable the private sector to work with the federal government in non-regulatory ways. One that I wanted to highlight - we've done a fair bit of work with it, and it's fairly influential in cyber in the comms sector - is the Communications Security, Reliability and Interoperability Council, lovingly referred to as CSRIC. You know, this is weedy, weedy stuff…

Drew Bagley

Sounds like a medication.

Megan Brown

Brought to you by GlaxoSmithKline.

Drew Bagley

Side effects may include…

Megan Brown

I'm going to keep this PG.

The CSRIC gets chartered by the chairwoman of the FCC, and she's added DHS to it, and they bring together industry folks to look at problems and put out reports that identify solutions and best practices. That to me is on the opposite end of the spectrum from the CISA rulemaking on incident reporting, and those two, to me, jump out as good examples for anyone new to these policy areas to kind of look at different models.

Sasha O’Connell

Awesome. What do you think Drew?

Drew Bagley

I think it sounds like a medication.

Sasha O’Connell

Beyond that, any examples of the Executive Branch in action you want to highlight?

Drew Bagley

When I think about the example of the Shields Up campaign, I think what was successful about that is it was very tailored and focused, even focused toward specific types of data centers and whatnot. So I think it did a better job of avoiding, say, what we were doing 20 years ago with terror alerts, where they were so generalized and so far-reaching that you really had alert fatigue. Nonetheless, if we, you know, do these sorts of alerts all the time, you'll get alert fatigue, but I think the government can play a great role in educating about specific threats when it actually gets very specific and is using, for example in the case of CISA, the communication channels it already has with the sectors it is already working with to convey this information. I think that's really important. In terms of the very broad initiatives, like Secure Our World, I think things like that can be effective if they're universal, they're concise, and they're, you know, repeated for a long period of time. There was one - now, I'm going to call the campaign effective, and I might butcher what the campaign was, but I think it was Stop, Think and Click…

Megan Brown

Stop, Think, Connect.

Drew Bagley

Stop, Think, Connect. Sorry, Stop, Think, Connect - not Click; I'm dating myself, clicking on things. The Stop, Think, Connect campaign that APWG and the government had a partnership on, and I think that was very effective too. Where now you at least have folks being skeptical - even though folks still fall for phishing all the time, there's this notion that you're at least thinking about whether or not something's legitimate. I think those things are important. I think that's something the government can do in a unique way, or do in a unique way with partnerships, because of, you know, having the megaphone. I think that is important.

Sasha O’Connell

Absolutely. Okay. So our category five is recent trends. I think we've already talked quite a bit about the growth of roles across agencies. Again, when I left government in early 2017 there was no ONCD, there was no ambassador for cyber at the State Department, and CISA definitely didn't have both the authorities and the funding it has today - just as a couple of examples. Also, this kind of approach of cyber for all - that every department and agency has, at a minimum, the responsibility for securing its own data and thinking about things that way - I think is pretty new. Drew mentioned offense; we obviously were talking about this back at the FBI back in the day, but there's a much broader conversation about this today, both offense and defense. You mentioned workforce - what's old is new again. My partner Diana Burley at the White House, I mean not the White House, at AU, has been working on these issues with the White House and others for many, many years, but it's sort of always and forever at the forefront as a persistent and evergreen issue.

Megan, you mentioned regulation and that dramatic increase recently, and the need then, because of these broad new roles and the growth of new departments and agencies, for harmonization or deconfliction. I think the piece in terms of recent trends, unless I'm missing something, that we haven't talked about is kind of public engagement, and it kind of leads us to our last category of stakeholder engagement too. I can say from my experience, again, leaving government in early 2017 - and this is sometimes pretty typical of the FBI too - we tended to be pretty internally focused, not super focused on external engagement with stakeholders. That did change under Director Comey; Director Mueller had a vision for an office of private sector coordination, and Director Comey put that into play. So OPS, as it is today at the FBI, is an actual headquarters office that is specifically focused on that stakeholder engagement, particularly with the private sector. So that is something I've seen just exponentially grow. Also, just speaking of RSA - Megan, you mentioned CISA's presence there, the FBI's presence, the State Department's presence there. Again, compared to my time in government, the willingness of leaders across government, particularly in cyber, to engage at conferences, to be on a long-form podcast talking about their roles and responsibilities, it really has changed exponentially. I think that's because of the acknowledgement that cyber is everybody's job, and if the government sort of stays in its own space, that kind of partnership, Megan, you were talking about won't happen.

How do you guys see that? Any other new trends I forgot? Anything that I didn't list that we should mention?

Drew Bagley

Well, I think, as we were talking about at the top of the discussion, we do have this new infrastructure in place, like ONCD for example, and we have every agency, or seventeen agencies, with some sort of cyber role. Something that has been important for a very long time, but is increasingly important, is the need to harmonize obligations for would-be victims, for those with the responsibility to protect data, and to deconflict across government, and there's that opportunity, with an office set up to focus on coordination, to at least lay the blueprint for that. I think that's really important, because right now you kind of have two different trends. You have regulatory traditions and authorities that date back in some cases at least a century, coupled with newer laws and requirements that are meant to meet the challenges of today with regard to cybersecurity and cyber incidents and data breach reporting, but what that creates is really this Venn diagram of obligations for victims, would-be victims, and those responsible for protecting data, that is not always easy to follow, but also arguably means that when there is an incident, resources are not necessarily devoted exclusively to stopping the bleeding and dealing with the incident but instead to…

Sasha O’Connell

Sorting out obligations.

Drew Bagley

Yeah, sorting out the regulatory matters after the fact. And even if there are reasons why different agencies have certain cyber authorities, it's not always clear why there need to be all these different reporting apparatuses and whatnot. I know that's something that, on the back end, the Cyber Incident Reporting Council recommendations that came out of CIRCIA perhaps will help ameliorate a little bit, but it still doesn't solve it. We're still not waving a magic wand and harmonizing regulation, but I think that's something really important and something, again, unique for the Executive Branch to perhaps come up with a blueprint for, even though there's an enormous role for Congress to play with whatever the solution would be.

Sasha O’Connell

And then our last category of kind of stakeholder engagement; where do you see all that? Drew, do you want to start?

Drew Bagley

Sure. I think all of those trends of folks who previously wouldn't be speaking about their roles and wouldn't be out there are really positive, because, in addition to, you know, knowing where to go, it demystifies things. I think often in any realm of public policy, not just cybersecurity, you sometimes think the worst if you don't have anybody out there and you can't personify what the role or the function is and what it actually does. The other thing, too, is it kind of helps, I think, demonstrate the real limitations of government in many realms, because otherwise, especially if you're a victim, you think that perhaps in that moment the government's going to be able to solve all your problems and make you whole again and all that, and that's not the case either. And then it helps, I think, with figuring out where we need resources. So I think that's very positive. To your point, we've had certain roles that have always been engaged with the private sector, but this notion of just about every agency having some sort of stakeholder engagement is definitely a development that's accelerated in the past decade. I think now it's a matter of prioritizing and coordinating around the things where, again, government has a unique authority, and not trying to replicate what already exists; and then where the government is actually using its unique position to convene even competitors together to do something in a coordinated fashion, that's important. But I think, you know, what's still difficult is if you're the victim of a cyber crime, there just aren't enough resources to go around in terms of helping you, and so that goes back to a lot of the other initiatives the government's pushing, like secure by design.

So those best suited to secure things should be doing so, and then there are a lot of the best practices that the government's embracing for cybersecurity and trying to get the private sector to replicate, for those responsible for data - realizing they have a responsibility to protect that data. I think that's really important too. So, yeah, as the optimist on the podcast, it's a positive development. And the final thing I'll just say on that is, my eyes glaze over at, you know, today's conferences when I just hear, oh, if only we did information sharing, the rest would take care of itself, as if that's the goal. That's just a means, but there has to be a goal attached to it. We have to be talking about what information and what's actionable and all of that, because, you know, there are software platforms that are sharing information in real time and doing things. Information sharing's not the issue; it's information being shared as a space for collaboration on some specific action. That's what's important.

Megan Brown

That lets me grab the apparent role of pessimist since you called dibs on optimist. I guess I’ll be pessimist.

Drew Bagley

I was designated optimist to be clear, by the host.

Megan Brown

So some of that public engagement is very helpful. It is very important. I go back to the partnerships that I think sometimes were being taken for granted or overlooked, frankly - some of the operational work that has been going on for a very long time that I think is being undervalued at the moment. But with a lot of what we're seeing now, this pivot to mandates and standards and regulation, there's another aspect of stakeholder engagement, and that is how government obtains good information on which to base policy. I see some real gaps, and I see some real reliance on, we're going to have some workshop calls where the government reads some stuff to the private sector, listens, and then goes away. One thing that just jumps to mind is this question that's right now before CISA on their rulemaking on incident reporting. They've taken a lot of flak for how they've structured it, but also for whether or not they're going to accept meetings and ex parte communications, as we nerdy lawyers call them, and it appears their position is no. If they're about to regulate the entire U.S. economy, they need to have a way to get good information, to do good cost-benefit analysis, and to figure out, to your point, Drew, whether what they're doing is for a real good purpose or if it's just information collection for its own sake. So, that's my pessimist two cents.

Sasha O’Connell

I know you're not all pessimists. So can you tell us, like, are there rulemaking comment period processes that you do think are examples that CISA should be adopting?

Megan Brown

Yeah, I mean, I think there are lots of agencies who do rulemakings, and they may not get to the results that I like, or that clients like, but you know, there's a bunch of them across the government, and you can just pick from the alphabet soup, and they can look to examples like the Federal Communications Commission. Now, we could have a whole philosophical discussion about regulatory capture and whether this is the right approach to cyber, but yeah, there are places they could look that allow for good policymaking and good input.

Sasha O’Connell

Unstructured rulemaking.

All right, well, lots on the table, but I think for today we're going to leave it at that. We hope everyone visits us at the Start Here website. As always, the link will be in the show notes to see additional resources. We've got those maps that I talked about earlier on, that bubble chart, and then the GAO report visual will be there, and we hope you join us next time, where we will do our next round on Congress, which I'm excited about because I'm ready to learn more, you guys. It is complicated! Alright, Drew, Megan, thanks so much for joining me, and we'll see everyone next time.


Episode 11 - Key Players in U.S. Cyber Policy: Congress

On this episode of START HERE, join Sasha O'Connell, Drew Bagley, and Megan Brown as they navigate the crucial role of Congress in shaping U.S. cyber policy. This episode highlights how Congressional committees and lawmakers operate at the forefront of national cybersecurity efforts, coordinating responses, and setting policies.

Resources

Transcript

Sasha O'Connell

Welcome back to Start Here. My name is Sasha O'Connell and I'm a Senior Professorial Lecturer and Executive in Residence at American University. In this series of podcasts, we provide a framework for analyzing foundational cyber public policy questions, to include previous episodes on topics ranging from incident reporting to ransomware and what to do about it.

For this episode, we are again, taking a bit of a different approach and instead of being topic-based, this is the fourth in a sub series of five episodes, where we are looking at the players who have a role in cyber policy with a focus here in the United States. And for each, we're walking through six aspects: one, what is their role, two, what is their structure, three, what tools and authorities they have, four, where they tend to play in the policy space, five, recent trends, and then six, how stakeholders, be they from the private sector, academia, civil society, et cetera, can engage and have their voices heard.

For the last episode, we dug into these aspects as they relate to the increasingly complicated world of the Executive Branch here in the U.S. and the different departments and agencies and their roles in cyber and cyber policy, and today we're turning our attention to Congress.

To work through the ins and outs and particularly to help me on this issue because this is certainly not my area of expertise, it is complicated and I am ready to hear more from my co-hosts here. I'm joined again by Drew Bagley, Vice President and Counsel For Privacy and Cyber Policy at CrowdStrike and Megan Brown, partner at Wiley and co-chair of the firm's Privacy Cyber and Data Governance practice.

All right, let's get into it and jump in here and Drew, maybe I'm going to just turn the wheel over to you. Can you start us off with some context for the role Congress has and currently does play in cybersecurity? We know they are charged with making the laws.

Drew Bagley

Yep.

Sasha O'Connell

How are they doing there?

Drew Bagley

I’ve heard that.

Sasha O'Connell

Yeah, I got that far. But are there other things - setting strategy, convening, as we've talked about in other contexts? Where has Congress been focused in cyber in particular, and where are they today?

Drew Bagley

Yeah. So Congress, like with anything else, has the ability to authorize other parts of the federal government to do certain things; to pass laws that can create incentives or disincentives for others, for the private sector and whatnot; and to appropriate money to these things. So you can have a new law pass with no money to back it up, and it might not do anything. You can have a new law pass with money, or you can have what we often have today, which is where we're not necessarily getting legislative solutions to problems, but that doesn't mean that Congress is inactive on the topic of cybersecurity. So one of the most significant things, and perhaps I would dare call it really a once-in-a-generation sort of thing set into motion by Congress, was the Cyberspace Solarium Commission. The Cyberspace Solarium Commission created a set of recommendations that were then followed up, in some cases by passing laws and in other cases by the Executive Branch implementing some of the recommendations, and that's something that certainly created new paradigms for how cybersecurity was viewed by the federal government.

So for example, we've talked about it on some of the past episodes: CISA being stood up as an agency, migrating from NPPD under DHS to CISA, and then being the federal CISO as well as the provider of shared services to the rest of the federal government. Executive Order 14028 from May of 2021, which set in place modern cybersecurity standards for the federal government. Things like that - in one case we're obviously talking about a law that created CISA, but in another we're talking about an executive order that was still influenced by Congress. So Congress was actually setting strategy and tone with some of those things, and then there's the creation of a National Cyber Director. That one, though, is a great example where, you know, Congress actually passed the legislation to authorize the creation of the National Cyber Director based on the recommendations of its own commission, and then did not fund it at first. So sometimes those things are happening at different paces, but what we see far more often than that is Congress holding hearings. Sometimes those hearings are related to a specific oversight authority over some sort of agency - and we're talking about the committee, which I'm sure Megan will get into the weeds on, with oversight authority over a certain agency that has an authorization related to something in cybersecurity - and sometimes they're really just hearings that are tied to the news of the day. If there are very high-profile cyber incidents, then Congress will generally hold a hearing asking for answers, and you might not necessarily get any new laws out of it. You might not get any solutions, but you're at least bringing to light the notion that something is an issue and is a problem, and that at least helps with the dialogue and can help those in the Executive Branch who might already have some sort of authority think about how to leverage that authority. And then, related specifically to oversight, sometimes there are long-standing laws where there are questions over how those laws are being applied and whether they're even being applied as often as they should be, and sometimes those hearings help with that. So we can think of the Computer Fraud and Abuse Act, which goes back to 1986 - it's very old - and we still have hearings sometimes related to how CFAA is being applied in certain circumstances, from ransomware in modern times to data leak extortion, et cetera. And then we still have Congress, you know - I said a moment ago we were lamenting that maybe they're not always passing legislative solutions, but that's not to say they aren't passing any new laws in cybersecurity. CIRCIA is a great example of a recent law, passed a couple of years ago, that is going to be impactful one way or another, but now we've had two years of the rulemaking process. It's still not over yet. We'll see where it goes, and then I'm sure we'll have Congress debate whether or not the rulemaking actually meets the spirit of what Congress intended to pass a couple of years ago with regard to regulating critical infrastructure. So there's really a lot going on, but I'd say the bully pulpit and the hearings are where we see Congress most active these days.

Sasha O'Connell

That's super helpful, right. So bully pulpit, authorization of funding, oversight as you mentioned, setting strategy, all in addition to the potential to actually write and pass laws.

Okay, Megan. So all that is going on in the cyber context in Congress, and it's really actually challenging to figure out who in Congress is responsible for what. I mean, Drew pointed out there was the bicameral, bipartisan Cyberspace Solarium Commission, which issued its report in 2020. We know there's a cybersecurity caucus that has a website; you can find them and see those members. But it's actually really tricky to find the committees and what their responsibilities are. Can you help us navigate that space?

Megan Brown

I will do my best.

Sasha O'Connell

Please do. Help!

Megan Brown

We currently have divided government. The House is controlled by the Republicans. The Senate is controlled by the Democrats. That goes to committee and subcommittee leadership positions. Cyber has, until recently, I'll just observe, been a largely bipartisan kind of issue. There are some partisan fissures that are emerging to the extent cyber is being linked to privacy law or policy. That's an area, to the extent they're moving in a much more regulatory direction, that might make it a little more partisan, but you know, going back to 2008 and 2009, when cyber was first being debated with, like, Lieberman-Collins, it was a bipartisan kind of thing to try and solve this problem. But in terms of the key actors in each house - and Drew should absolutely jump in and correct me if I miss some - starting with the House, a couple of observations before I start to tick through some of the key committees and subcommittees. I think the House has less institutional stability than the Senate does, just by virtue of personnel changes. We've had some big cyber leaders over the past five years or so cycle out - Langevin and Katko, who were big bipartisan folks who really cared about cyber, didn't run for reelection a couple of years ago - and so there's more change that happens. But again, it's sort of bipartisan; a few names that jump to mind, like Yvette Clarke and Bennie Thompson on the Democratic side, have always been interested in this issue and sort of pushing it forward. Congressman Mike Gallagher on the Republican side is leaving Congress or has left Congress. So the ranks are changing, but you've got a couple of key committees, and I'm not going to do these in order of importance. The challenge in Congress, for the listener, is there are so many committees that can claim a jurisdictional hook for cyber, because it's either an agency that they have oversight of, or it's an industry that their committee deals with, and so there are many ways in, and that's why you see these cyber hearings just pop up all the time; it's exhausting. So a good example is the House Committee on Homeland Security, right - Chairman Mark Green and Bennie Thompson. They have a Cybersecurity and Infrastructure Protection subcommittee, and they recently had a hearing on implementation of that new law we keep talking about, CIRCIA. That subcommittee is led by Garbarino from New York and Swalwell from California. Both are pretty prominent and, you know, have started wanting to make a name for themselves on these issues. And that's one thing for the listeners to understand: the subcommittees do a lot of the grunt work here; when it's a full committee issue or a big hearing, then you kind of take more notice. The entire committee is going to do a hearing later this month to follow up on a major report from the Cyber Safety Review Board about Microsoft cybersecurity. We can talk more about the use of hearings later, but a chunk of the work is in that committee and its subcommittees. Another example of a subcommittee of that House committee is the Transportation and Maritime Security subcommittee, so they're going to be looking at industry-specific things that might be relevant to ports, for example. And you've got other committees that play a role here.

For example, Oversight and Accountability has a subcommittee on cyber, IT, and innovation that's chaired by Nancy Mace. They've had a hearing on the cyber threat from China. You've got the Armed Services Committee, because of the National Defense Authorization Act, which is what begot the Solarium Commission that you guys have spoken about. They have hearings on this stuff, and they're important for overseeing the offensive work the United States does, as well as that NDAA, which hopefully we can talk about. Energy and Commerce has a big role to play, and then you've got the Select Committee on the Communist Party of China, which has been generating a lot of activity - both hearings, but also letters they will send to all kinds of entities asking about things, and that's a driver of activity. And then there's the House Intelligence Committee. There's more, but that's a list of the ones that jump to mind.

Sasha O'Connell

So like the Executive Branch, it's sort of cyber-for-all popping up everywhere, and it can be difficult, as both of you have mentioned, to even just keep up as an external stakeholder on what's going on. It kind of begs the question - I know one of the proposals the Solarium Commission made that wasn't adopted was to reorganize Congress and have some more focused committees, but obviously the trickiest thing can be to change one's own house. Congress was happy to make suggestions for reorganizing the White House, but as I understand it, the suggestions about re-orging the structure on the Hill side didn't go too far.

Drew Bagley

Marie Kondo it.

Sasha O'Connell

There you go. There you go. Okay. So, is it the same on the Senate side then Megan? Are we looking at a similar situation?

Megan Brown

It's similar in that lots of committees and subcommittees have jurisdiction and play a role. It is different in that there's more stability in the Senate, and you have some senators who have been longstanding participants in a lot of these debates, so I'll just tick through a few of the most important committees that I watch, for example. So you've got Senate Armed Services - interesting there, Senator Wicker moved from the Commerce Committee over to Armed Services. Senator Wicker from Mississippi has been interested in cyber for a long time, and so that shows you that this cyber issue is, you know, evergreen. You've got the Intelligence Committee, which I recall was a major contributor to the 2022 legislation we keep talking about. You've got Homeland Security and Governmental Affairs, or HSGAC for the cool kids, led by Gary Peters and Rand Paul. Gary Peters has had a history of trying to work on a bipartisan basis on cybersecurity issues, and pretty thoughtfully. So you can see lots of continuity there. The Commerce Committee, Senators Cantwell and Cruz - they have a subcommittee on Communications, Media and Broadband. Senators Luján and Thune lead that, and Thune has had a long history of interest in cyber, you know, for more than a decade. So there is this continuity. Judiciary and Finance also get in on it. They obviously have jurisdiction over the banks, but they're holding an upcoming hearing looking at the Change Healthcare cyberattack. So you can see that these hearings just pop up, the activities pop up, and it is very difficult to track.

Sasha O'Connell

Absolutely, sort of begging for a project to map it, I would say. Having done some looking in preparation for this episode, it doesn't exist, right? It is very difficult to find in one place information for external stakeholders, be they private sector, academic researchers, or civil society folks, to know what's going on - and you know this obviously from experience, having done this for so many years and sort of adding to your list - but it is a challenge from the outside to keep tabs on what the Hill is up to.

So speaking of what the Hill is up to, we talked a little bit about their role, but maybe to put a little finer point on the tools or levers they have - Megan, you mentioned things like sending letters of inquiry. I remember that pretty well from my time at the FBI; we got a lot of letters, and some of them trickled down to my different responsibilities at different times. Can you talk about that? So obviously Congress can convene, Congress can write and pass laws - what are these other kinds of authorities and tools they have, more specifically, where they can push things forward?

Megan Brown

Yeah, I mean, one thing we see them use - and the Select Committee on the Communist Party of China is doing this a lot, but it's not limited to them - is letters: letters to private industry, letters to federal agencies, many examples of that. And on the private sector side, I will say you don't usually want to be getting a letter from Congress asking about anything.

Sasha O'Connell

Not at the FBI either, it's not ideal. Not the best.

Megan Brown

They're not love notes.

Drew Bagley

It's like having a pen pal who is also going to show up at your door.

Megan Brown

Yeah, and has, like, scary authorities, and you can't blow it off. So they send letters, and they'll do things like ask about security incidents, and that's a public thing and there's a whole process for managing those. Staff on the Hill are eager for information and often can be great conveners. Sometimes, again, you don't always want to go talk to staff, because you may not want to elevate your profile on something that looks hard or complicated. So that's, I think, one area where they'll talk to the private sector. They certainly do investigations after major breaches, and that gets real serious, real fast. And then there's a lot of cajoling and oversight and letters, for example, to federal agencies - those agencies have to respond to Congress because Congress holds the purse strings. So that's another lever that Congress has: even though it's one branch, it can try to affect what the Executive Branch is doing by sending letters and conducting oversight hearings to sort of nudge them on policy.

Sasha O'Connell

Yeah, Drew, I think you mentioned this before - kind of seeing how implementation is going on some of their priorities, which they can do via calling hearings or sending, yes, detailed inquiries, for example, to departments and agencies asking for updates on implementation of laws. What else, Drew? And we're going toward examples too, of the kinds of tools Congress has. We've talked about, obviously, the Solarium Commission Report as setting strategy and writing reports. We've talked about some legislation. Are there other legislative examples that are relevant here? I know a sort of lack of activity might be one hallmark of Congress in cyber, but are there other legislative accomplishments outside of CIRCIA worth mentioning as examples?

Drew Bagley

Well, I'd say even if we're not talking about legislation that's passed, what we have seen - for example, I think Megan mentioned it a little bit - is that even with the privacy legislation that gets proposed in each Congress, we see cybersecurity incorporated into each bill, and that stays largely consistent even if other aspects of the proposed privacy legislation change, like should there be a private right of action? Should there be federal preemption? The security piece has stayed fairly consistent, and so, you know, part of that process means that Congress continues to convene, continues to have hearings, continues to have staffers do outreach with leaders, with would-be victims and whatnot, to figure out what should be in some eventual legislation. And so oftentimes what you see with those sorts of things is, even if you have years and years in which legislation does not pass, if the same principles stay alive persistently, then whenever there is some sort of opportunity to actually pass legislation, the work that was done by previous Congresses and in previous iterations actually turns into something meaningful in future legislation. So I think it's always important to know that that's going on in the background, and that's an important role - probably much to the chagrin of the staffers actually doing some of the hard work, knowing their work might not pay off for several years - but I think it's important work. So even legislation that's proposed and doesn't go anywhere in a certain Congress can still create ideas that are persistent, and that's important in cyber.

Sasha O'Connell

Yeah, that makes sense. You both mentioned staff, and Megan, you mentioned too the retirement of some of the OG cyber folks on the Hill. How do you both feel about - a question that comes up a lot - the knowledge base on the Hill in cyber? We'll talk in a second about, you know, ways for stakeholders to interact and obviously the need for continual dialogue and education. There's so much to keep track of. Obviously it's not unique to cyber for members of Congress to have a lot on their plate, but just in terms of people and educational background, where do you see that going compared to how it used to be, both for staff and members? Something I've seen, certainly around the conversation on AI, for example - we have news coverage of members of Congress actively seeking technology education, and maybe I'm particularly sensitive to this being in the education space, but we have members of Congress actively seeking education on emerging or existing technologies that are relevant to policy. So we've seen more interest in terms of that. You know, generationally, are things starting to change with staff? How do you guys see that piece? It's a question that comes up a lot.

Megan Brown

I mean, one reaction I have is the lack of hearings on legislation. Like, I think you've probably heard me point this out in the past, the information sharing or incident reporting legislation from 2022 had not a lot of legislative history. It did not go through a lot of markups and hearings to benefit from the process that Drew alluded to, which is, you know, something gets introduced and five years later it's matured and been refined. So I think a really key piece that is missing from a lot of discussion right now is that we're not getting a lot of good general input into legislative drafts, partly because some of them pop out of the Solarium Commission or are otherwise just sort of popped into the National Defense Authorization Act, and I think those are missed opportunities to educate staff and members.

Drew, to what I think your broader question was - everyone likes to laugh at some of the social media hearings and talk about how people on the Hill don't have this knowledge base. I don't think we're there with cyber, but there is a knowledge gap, and I think the challenge they have is getting good and reliable information from the private sector about things that are proprietary, very sensitive, and not often made public. So I think it is a challenge.

Sasha O'Connell

And in that vein, Drew, to our last piece about stakeholder engagement - obviously public televised hearings are a very different environment to get information. What do you see as effective ways, to Megan's point, and Megan, I'm sure you have thoughts too, to collect the information, the updates, and the education that folks on the Hill need to be wise in policy in this area?

Drew Bagley

Yeah. In addition to hearings, you know, there are a lot of congressional offices, and even committees or groups of members, that will work in a bipartisan way to actually have closed-door workshops and round tables and get candid briefings on different topics, and then there's, you know, every organization and stakeholder under the sun trying to brief different congressional offices and whatnot, and there's a lot of talent among congressional staffers. So naturally, if you have a staffer who stays in an office with a certain portfolio over time, they really develop a deep expertise, and that's something that's really beneficial for drafting legislation. But then we have certainly had members of Congress who have really developed a great expertise, and then several who have announced retirements in recent years.

Sasha O'Connell

The brain drain.

Drew Bagley

And that definitely has an impact, where you get people who not only became experts but were seen by the other members of Congress as the experts, meaning the other members may not have felt as invested in the topic because they knew and trusted the go-to person. And as Megan was mentioning, cyber has often been much more of a bipartisan sort of issue. So I think that's something where it hasn't always been the case that every member has sought out the expertise; they sometimes assume someone else will lead the way on that. So when you have multiple retirements at the same time, that can really have a ripple effect. But I mean, there are so many sharp staffers on the Hill that I think that's where the knowledge kind of stays institutionally, even if there is disruption.

Megan Brown

Yep, and I just want to make two additional comments. I'm very glad that they can get technical assistance from agencies like the Department of Justice and otherwise - that, you know, there is that kind of collaboration of, here's what we're doing, is it legal, is it sound, can we draft this better - because I think that's just really important, and you need people who understand legislative drafting and administrative law to do this.

The last thing I wanted to say, since you mentioned sort of classic hearings, is that I do think one of the weakest parts of our cyber policymaking landscape is this kind of drag-a-CEO-up-to-Congress-and-scream-at-him-or-her after a cyber incident. It's not productive. It does not yield actionable help. It is performative, and I think it contributes to a blame-the-victim approach that is unhelpful to encouraging collaboration. So that's just something for, you know, listeners and students to pay attention to. When you see these hearings, they are designed to create a lot of heat but not a lot of light, and so it's theater, and I think it does a disservice to good, meaningful discussions about cyber.

Sasha O'Connell

Well, we're going to leave it there and let you have the last word on that, Megan. We're going to wrap this episode here, and we certainly hope everyone, as always, joins us at the Start Here website. The link will be in the show notes, and there will be additional resources and a transcript from this episode for reference there as well. We hope you join us next time, when we're going to do our next round of players with a focus on the states. So get outside the beltway and think about what's going on in the states as well.

So Drew, Megan, thank you so much as always for joining me and we'll see everyone next time.

Episode 12 - Key Players in U.S. Cyber Policy: The States

On this episode of START HERE, join Sasha O'Connell, Drew Bagley, and Megan Brown as they navigate the crucial role of states in shaping U.S. cyber policy. This episode highlights how state governments operate at the forefront of national cybersecurity efforts, coordinating responses and setting policies.

Resources

Transcript

Sasha

Welcome back to Start Here. My name is Sasha O'Connell and I'm a Senior Professorial Lecturer and Executive in Residence at American University. In this series of podcasts, we provide a framework for analyzing foundational cyber public policy questions, including previous episodes on topics ranging from incident reporting to ransomware and what to do about it. For this episode, we are going to again take a bit of a different approach, and instead of being topic specific, this is the fifth and final in a sub-series of five episodes where we are looking at the who's who of players in U.S. cyber policy. For each, we're going to walk through six aspects as always: (1) what their role is, (2) what their structure is, (3) what tools and authorities they have, (4) where they tend to play in the policy space, (5) recent trends in terms of their priorities, and lastly, (6) how stakeholders, be they from the private sector, academia, or civil society, can engage and have their voices heard. For this last [00:01:00] episode, we have dug into all these aspects as they relate to the often overlooked, here in DC, role of the states. So we're getting outside the beltway today and thinking about the states. To work through the ins and outs as usual, I'm joined again today by Drew Bagley, Vice President and Counsel for Privacy and Cyber Policy at CrowdStrike, and Megan Brown, a Partner at Wiley and Co-Chair of the firm's Privacy, Cyber and Data Governance Practice. Okay guys, we're going to jump in here. Megan, when I think of the role of states in U.S. cyber policy, I think of you, because back in 2019 when I first started teaching my classes on the who's who of U.S. cyber policy, and I was so inside-the-beltway focused, you came to class and brought the role of the states to our attention. And needless to say, since 2019, that role has only bloomed. We were just saying off mic that the role of the states really has continued to expand in this specific area. Can you help us start off? What role are the states playing here? Is it [00:02:00] strategy? Is it convening, as we've talked about? Is it rulemaking and legislation? What is going on, big picture, with the states and their role in U.S. cyber policy?

Megan

Well, I think they are doing, and I'm glad you think I made an impression back in 2019 or 2020. I'm sure I had one or two states on the brain that were vexing my clients at the time. But I think they do all of those things that you just referenced. They legislate, they superintend their own purchasing, right? They purchase a lot of IT, and the federal government is frankly a little concerned about some of that, because, you know, we've seen some pretty bad attacks on state and local government. But also, right, states have traditionally, if you go back, been the 50 laboratories of democracy, and they have what's called the police power. So they believe they can engage in a lot of direct regulation of the businesses and citizens that are resident in them. So they kind of do all of those things. And I think it has been challenging for companies and folks in the space, because state [00:03:00] legislatures and state regulators are harder to track than Congress. Things pop out a lot more quickly. The process is not necessarily as transparent, and whether it's something like artificial intelligence, true cybersecurity, or privacy, there is a lot of time being spent by U.S. businesses tracking and influencing what the states are doing. In part because they often really need the help. Some of them may respond to the news of the day, but they might not have that deep technical expertise on some of these issues. And so it is a real challenge to be tracking and influencing all of these developments, whether it's procurement issues, direct regulation, utility regulation by a state public service commission, for example, or otherwise. But there is a lot going on, on cyber.

Sasha

Is it also complicated because the things we're mostly talking about here are internet based, right? And the internet does not have sort of strict state boundaries. I mean, it's obviously a challenge globally, right, country to country in terms of governance, but does that add complications when you have different states making different [00:04:00] decisions in this context?

Megan

I think it does. It has several layers of complications, right? You have California and New York that are regulating aspects of internet service, and, sort of at a high level, those are being litigated, and those are really important questions about the role of states to do that. But even in terms of, you know, issues where states might be moving in the same direction, you might have different definitions and different approaches. And then sometimes they're moving in completely different directions. And you see that in the state privacy discussions, where there are different models being adopted. So I do think when you're regulating something like interstate networks that don't pay attention to whether a bit is traveling between Alabama and Virginia or Maine and California, it's difficult to try and put up technical requirements or even policy requirements that are going to vary from one state to another. And I think it is very challenging, because we're not talking about mud flaps on trucks, which is the paradigm Dormant Commerce Clause example. We're talking about the internet, and it's really hard.

Sasha

Complicated! I guess a question to both of you. [00:05:00] Are there strategic areas where it makes more sense, kind of normatively, for the states to be active? So, we talked about how the federal government is focused, for example, when it comes to resiliency, on critical infrastructure first, right? It makes sense to start there, or with investigations into the activities of adversaries that are strategic nation states. In terms of the states, I mean, I started to think about things like elections, right? Are there areas in cyber where it does make sense, or where states have a specific role that needs to be filled? Drew, do you want to start on that one?

Drew

Um, sure. I think what's interesting when we think about states is we're really thinking about state, local, territorial, and tribal jurisdictions, and sometimes certain combinations of those that could all be coordinating together, and in many cases they're not. So, at the state level, you could have a state that might have a CIO position and a CSO position focused on [00:06:00] all of the state IT assets. But then you could still have cities that are being hit by ransomware all the time that are following completely different standards and whatnot. And that's been one of the challenges. Some states in recent years have taken a whole-of-state approach, where the state government has decided to be a shared services provider, for whatever it ends up procuring from an IT and security standpoint, to all of the cities and towns within its jurisdiction. And that becomes interesting because it kind of mirrors the attempted approach right now in the federal government, where CISA is taking on this shared services role with certain federal agencies and whatnot. And so with that, states also kind of have a holistic picture of whether or not there are certain threats that are focusing on the state specifically and hitting all the cities, hitting, you know, all the municipalities with the same ransomware attempts or whatnot. And other states have taken a different approach where they've decided to give [00:07:00] funding to localities, but then it's up to the localities what they do with that funding for cybersecurity. And then in other states, you kind of have towns fending for themselves and states fending for themselves and whatnot. You also have an interplay with the federal government, with grant programs from the past few years that have actually been quite successful in moving certain states and local jurisdictions along with cybersecurity and helping them modernize, in many cases at a pace that's much faster than the federal government, where the localities and the states have been the shining examples of success in upgrading cybersecurity. And then the other unique thing you have there is, you know, states have this unique role with their own infrastructure, sometimes with their cities to that point, and sometimes with federal grants. But then there are multi-state efforts, and those are interesting. So for example, the Center for Internet Security has different means by which it's able to pull states together around common standards, but also even shared resources. So, you were [00:08:00] mentioning a moment ago elections: there is an ISAC devoted to elections. There's also an ISAC devoted to multistate cybersecurity. And so you have a nonprofit center that's able to play a role in providing tools to states that want to participate in information sharing, but also in leveraging different tools, and that's, for example, for election security. And that's been something that's been interesting, where, due to, you know, historic federalism reasons, you might not see mandates from the federal government, and you might not see states necessarily even embracing all the resources that the federal government has to offer. But that doesn't mean they're not coordinating amongst themselves and using resources for elections that are provided by a nonprofit. So there's a lot of work there, but I don't even know if I answered the original question.
But going back to some of the roles of states, you know, there are inherent roles for states in election security that, just from a jurisdiction standpoint, the federal government does not have as clear of a role in, [00:09:00] or as clear of a right to even play a role in, where states do, in addition to protecting the state infrastructure itself and really taking care of the cities and those cybersecurity have-nots that are not going to be as well-resourced.

Megan

A couple of thoughts; I just wanted to respond to some of what Drew said. There are some really interesting fights going on, or that we're about to see, with the federal government trying to tell the state and local governments what to do for their own cybersecurity. Fun fact: state and local government and schools and law enforcement are being treated as critical infrastructure for this new incident reporting rule. And I think that will be very interesting to see, because it sort of looks to me like an unfunded mandate, to go back to some fun 1990s legal disputes about whether the federal government can tell the state and local governments what to do. And on elections, you're right. To pick up on something Drew was saying: yes, states have a unique role to play in election security because, in the Constitution, the states are vested with that, and that means that each individual state secretary of state actually has an [00:10:00] important role to play in securing their own election infrastructure. We've seen election systems be designated as critical infrastructure by DHS, by CISA, and they have access to support. But it's entirely within the state's discretion to figure out how to do that when it comes to election security. That's a thing, and they can get help from CISA, but they don't have to. And so I think we have a lot of interesting questions about the role of states going forward, both as plenary regulators but also as recipients of regulation.

Sasha

Absolutely. So the buckets in my mind then: I mean, you mentioned, Megan, sort of this plenary regulation, right? Everything from banning TikTok to different privacy regulations. We know we have victim notification laws in all 50 states, for example. And you mentioned, Drew, the function around protecting the state's own data, right, and its own functionality, and for consumers in their states. So that's another bucket. Obviously elections. The two we haven't talked about, just thinking about kind of roles and structures of the states together, are, one, education and, two, investigation. So on the [00:11:00] investigation side, a question I get a lot, just from people sort of associating me with the investigation side of the house, is, you know, for routine phishing and fraud, and even maybe a pretty significant cyber attack at a small business, for example, in a town, who do you call? The FBI, right? And so what authorities and resources do state and local law enforcement have? The one piece I'll just point out that has been a real leveraged resource for state and local law enforcement that I'm aware of, and then I'm curious for your guys' perspective on this, is IC3, which, for listeners not familiar, is the coordinating body for reporting routine cyber fraud and phishing. It's an interagency effort led by the FBI, but highly leveraged by state and local law enforcement to kind of put the pieces together, or connect those dots, as they say, of the sort of broad-net activities by nefarious actors, enabling law enforcement to use the limited resources they do have to try and put these things together and go after priority targets. Any other thoughts on investigations at the state and local level, Drew? I know [00:12:00] you live in this world. Like, where are we with that? And is it a resources issue or an authorities issue, or what's going on at the state level?

Drew

It can be resources, authority, or even just a talent issue, depending on the jurisdiction. And so that's where there's the opportunity for states to actually play a much bigger role for most victims, because most victims generally are not going to meet the threshold where the FBI is going to be able to devote the resources from a priority standpoint. The FBI tends to focus on, you know, those sorts of crimes, or potential crimes, that have met very high thresholds in terms of number of victims impacted, amount of money, infrastructure targeted, etc. Generally speaking, though, from a jurisdictional standpoint, the FBI, because of the interstate commerce clause, is going to find, you know, usually a pretty good argument for where there is jurisdiction when there is some sort of computer-related crime, some sort of cybercrime. But [00:13:00] that's where I think there's been, you know, a huge effort in certain jurisdictions in recent years to try to build the capacity of state and local law enforcement to better address cybercrime. But absolutely, you know, one state does not remotely look the same as another, much less one city or town versus another within a state, in terms of whether or not they're able to help somebody who's been the victim of a phishing scam or whatnot. And also, even if the victims are local, and even if you have local jurisdiction, that does not mean that the perpetrator is remotely local. The perpetrator could be on the other side of the world, which then goes back to who has the resources to deal with that. And as we've mentioned in past episodes, even if the FBI has the best resources to deal with that in cooperation with foreign allies, that doesn't mean they're even able to necessarily get the perpetrator or whatnot. But that's where, again, the states could at least have a touch point with the victims. The other thing is that, I think, you see certain states have bespoke [00:14:00] authorities that actually have an influence on wide segments of the American economy. So for example, in cybersecurity, the New York Department of Financial Services has leaned in, in recent years, on various specific cybersecurity regulations, and they updated them even recently, affecting financial service providers above a certain threshold. So even though we're talking about one state and one state's authority, because of the concentration of financial service providers in New York, or that are subject to New York jurisdiction in some sort of way due to their presence, New York can actually set the tone in that industry on what cybersecurity mandates look like. Similarly, New York recently rolled out new cybersecurity requirements for healthcare providers, again focused on New York, on presence in New York, but because of the way in which healthcare providers have consolidated in recent decades, that can have a large impact on the economy. And then you can look at non-cybersecurity-[00:15:00]related laws like CCPA in California, focused on privacy, but that has cybersecurity mandates in it. Same thing, where that has already had an effect on what sort of cybersecurity mandates companies that aren't primarily based in California need to follow, because they deal with California consumers.

Megan

I just want to pick up on something Drew said in characterizing the New York DFS rules: they're actually pretty darn broad in terms of what the state attorney general thinks of as its jurisdiction, right? It's not just large banks, right? They tag anyone who registers or is a licensed New York insurer, and other people adjacent to that. And just as an example, a couple of years ago they hit Carnival Cruise Line with a fine for violating some of their rules, or allegedly violating some of their rules, and having a data breach in 2019. And we've handled New York investigations. They do not think of themselves as limited to New York consumers, nor are they focused solely on the financial space. [00:16:00] And I think that shows the complexities here of state activity in these spaces, right? Because they can reach way across the country, and the globe, quite frankly. And it just shows that it's challenging. You can have duplicative investigations in New York, California, Illinois, you know, at the Federal Trade Commission. And it's a real challenge as they get more and more into cyber.

Sasha

And in that vein, Drew mentioned California, Megan, and, for example, we'd be remiss, right, talking about sort of the states' role here without a little bit more on California. Can you talk a little bit, like the situation in New York, about the outsize kind of role that state legislation in California has in fact played in this space, in terms of privacy in particular and related issues?

Megan

Yeah. I mean, they've been sort of the incubator of new and aggressive privacy laws for many years now, sort of picking up on the European approach. There were some referenda; California has a very permissive referendum process. And so a couple of very active folks got some [00:17:00] referenda passed, and that set in motion a whole bunch of new regulatory obligations on companies that do business in California, which is most of them, above a certain size threshold. And I think it's fair to say California has inspired some other states to follow suit. We've seen a proliferation of privacy laws at the state level. Some of them now are including more robust data security and cybersecurity obligations, and that has affected the whole federal legislative discussion and sort of what now is the baseline. And as a practical matter, a lot of companies have had to race to the bottom, or the top, depending on how you think of it, and treat all national consumers like they reside in California. And, you know, that's a real challenge for companies, and it raises some interesting questions of, you know: how do Louisiana and Virginia feel about that? Right? They're being supplanted in a way by what California is doing.

Sasha

So interesting. The other piece, before we move off of kind of tools and roles and structure, is just to think about education for a second. So, thinking about the resiliency side of cyber, obviously, in this kind of cyber for [00:18:00] all, and the push CISA has had around individuals, right, at least in the world as it is currently structured, there's a dependency we have on individuals and individual consumers to be cyber smart, right? And getting that into curricula, whether it's K through 12 or the state-led universities, right, is something that falls to the state level. And in terms of recent trends, every semester, as part of the end of the semester, I have my students research their home state a little bit, spend a little bit of time just seeing what's going on there. And in the education space, like these other areas you all have mentioned in terms of regulation and legislation, it's a patchwork, right? So we have some states that are really forward leaning, like Maine and others, in terms of driving cybersecurity education into their K through 12 curricula, and we have others where it's just not on the docket yet, right? It just hasn't really made it to the front. So it's another really interesting place because of the way we're structured, obviously, in federalism, with that primary responsibility for education being at the state level. To have that piece, you know, elevated and resourced, we're [00:19:00] really dependent on the 50 different states and their approaches to that, and seeing where that goes going forward. So with that, what are the other recent trends? We've talked about New York and California; we've talked about increased activity. What else are you guys seeing at the states? Drew, you also mentioned kind of the way states structure themselves, whether they have CIOs or others. What else are you seeing states move toward? Is there anything we haven't mentioned yet?

Drew

Well, first of all, just to comment on the education piece, I think that cyber literacy is so important, especially just to introduce that skepticism. And we think about that today in terms of, you know, we want the children growing up to not easily fall for phishing. Well, what about deepfakes? What about everything else that they're going to have to face, where they won't necessarily have the analog for it, or the arc of what it was like before you could generate things as easily? And so I think that's even more urgent. It's always been important, but I think there's a sense of urgency to really get that into the education [00:20:00] system all across the country, that sort of cyber literacy. I think that's really important. In terms of other trends, I think there's really been, as I was alluding to before, we now have CIOs across the country in different states, and we also have the role of CSO at the state level too. We don't necessarily have those things at the local government levels at all. So that's where I think states are still figuring out what's the best way to protect.

Sasha

Yeah.

Drew

Especially if they're government-owned university hospitals and things like that, you know. And so, I think we've seen an increasing trend toward the whole-of-state model and the shared services model, but that's certainly not what every state's doing. But it seems like the trend in recent years to use grant money, in a way, to modernize cybersecurity has been one that, you know, has been embraced in different ways throughout the country. But what's going to be interesting is, once that grant money's gone, what are states [00:21:00] going to do to sustain that? And are they going to come up with models to sustain that? I think that's one of the things to be on the lookout for.

Sasha

Are you optimistic?

Drew

You know, I'm always an optimist. I'll say this: I think the need's going to be there, whether or not the money's there. I certainly don't have a crystal ball to predict the future, but I mean, I think every state's going to have to figure out something.

Sasha

Absolutely. What else, Megan?

Megan

I mean, I think one trend we're seeing, since I'm more on the law and policy side and less on the sort of operational side of what states are doing, is this flurry of legislative interest in, you know, now it's artificial intelligence, and there's cybersecurity being thrown into all of those. And I think states are struggling in a couple of different ways: (1) they've got to procure this stuff and manage it for their own uses, right, and figure out if what they're buying and being sold is adequately secure; and then (2) they're, you know, toying with plenary regulation and sort of going out and doing direct regulation. I think that's a trend that, unfortunately from my perspective, we're not going to see abate; there's just so much [00:22:00] interest, and it is the latest thing. So I think we're just going to see this continue to chug along at the state level.

Sasha

That makes sense. So as this chugs along, and I know, Megan, you mentioned previously just the effort; I've talked to folks at companies whose whole job it is to try and track and keep up, both globally, in terms of changes around regulation and laws, and even now across the states here in the U.S. So in terms of kind of stakeholder engagement, I think one of the first challenges is even knowing where to plug in and what's going on. One resource, which I think you probably introduced me to but I've been using ever since, is the National Conference of State Legislatures, NCSL, and their legislative tracker, which is available online; it can be a huge help. I also often go to NGA; the National Governors Association has a cyber initiative with a lot of resources that really gives you a chance, whether it's researchers or industry folks trying to think about engagement, to understand what the governors are thinking, what they're looking at and reading, and what's going on in the states. Are there other things like that, or thoughts on engagement from where you guys sit, that are worth mentioning before we [00:23:00] wrap?

Megan

I guess in terms of engagement, I think you've hit the NCSL; that's a great resource. I actually have it up right now, and they were racking up 2023: 40 states, 130 bills adopted in 2023. And I think, you know, state procurement officers are doing a lot, and I think you can learn some lessons from what they're looking at and what they're doing. But I think you hit the high points of it. It's just very challenging. Legislative sessions are short, and there are a lot of folks out there in state houses who want to do stuff, and the pace is just really tough.

Sasha

Yeah.

Drew

One other trend just to comment on is something we've seen throughout the country on the legislative front: zero trust legislation, either proposed, passed as a nonbinding resolution, or passed as a mandate in several states. There seems to be this trend of mirroring the approach of the federal government with Executive Order 14028 at the state level, at least in [00:24:00] some flavor, maybe not in its totality. But that's something where there have actually been some similarities. So, there are lots of differences with certain states, like we were talking about with California and New York versus other states, but there's actually been a lot of similarity, across completely different types of political climates and jurisdictions, in the approach to zero trust. And that is something where, even in legislative sessions where it doesn't pass, we see that similar language proposed over and over again. Yeah.

Sasha

So with that, we're going to wrap this episode. As a follow-on to this little mini-series, Start Here is going to be on a short hiatus until Fall of 2024. We're going to spend some time this summer adding additional resources to our website related to previously recorded episodes, and we will be back with you in the Fall with new episodes. So we hope you join us then. Drew, Megan, thanks again for getting together today in person, and I look forward to continuing the conversation.

START HERE is sponsored in part by a grant from the Special Competitive Studies Project

Questions, comments, or recommendations for future START HERE episodes?

Contact Us