A Conversation with Chris Inglis and Anne Neuberger

James Andrew Lewis: Good morning. Welcome to CSIS.

A year ago, roughly, we had Chris Inglis and Anne Neuberger here to talk about what they were planning to do. But this isn’t going to be a retrospective. If you’ve gone through some of the documents that have come out recently, like the National Security Strategy, we’re going to talk about a lot of opportunities, a lot of the challenges that lie in front of us. So I appreciate you both taking the time to do this again. Thank you.

We will have the ability in the final 10 or 15 minutes to ask questions. If you’re watching online, there’s a button somewhere on the screen that you can click to submit a question, and we’ll see what we can do. Let me remind you that what usually happens is in the last 90 seconds we get a flood of questions while everyone has been bashful. (Laughter.) Do not wait. (Laughs.) Do it first.

But let me start by asking Anne and Chris where they think we are, right? It’s been a year. I’d say it’s been a year of some progress, but where do you think we are in cyberspace? Anne, do you want to go first?

Anne Neuberger: I’m happy to start. Chris and I talk so frequently, we’re on the same page, so we’ll finish each other’s sentences here.

Where we are: the threat in cyberspace continues to rapidly advance. Right? We live in such a digitized society, and the more connected it becomes – in our personal devices, our personal data, and, from a national perspective, in critical services – the more opportunity there is for adversarial entities, from countries to criminals, to leverage those connections. We used to be most concerned about the collection of intelligence; our concerns have evolved to where we’re most concerned about degradation or disruption of critical services.

Over the last year we took that on in really three bins of work we’re happy to talk about more.

The first one is a relentless focus on improving the security of critical services in the country: focusing on which are the key companies at the core of delivering services for a large number of Americans – the types of services that could bring hazard if disrupted, from providers of transport for chemicals, or oil and gas, to hospitals – and really focusing on putting in place security requirements to lift up, and have confidence in, what the floor, the threshold, is for security in those sectors. So that relentless focus on critical infrastructure has been step one. And real shout-outs to the agencies who’ve led that, who we’ll talk more about.

The second has been a recognition that cybersecurity is a global fight, and that we want to be arm-in-arm with our partners and allies around the world to wage it. So in that way, the U.S. has led: building coalitions to tackle things like ransomware, working bilaterally with individual countries where we see significant compromises, and setting standards for critical infrastructure, for example, so that companies around the world get one voice from various governments.

And then finally, in the third space, emerging technology, saying, wouldn’t it be wonderful if we could secure by design? So looking at new areas, from digital assets to quantum-resistant cryptography, to start doing that from the ground up so that we give our successors potentially an easier fight than the one we’re in.

Dr. Lewis: Great. Thanks.

Chris.

Chris Inglis: I would say ditto. (Laughter.) I agree with all of that. Anne rightly started with a focus on outcomes – which are, by the way, to actually deliver not just critical functions, but the confidence that those critical functions will deliver for our citizens. And not just U.S. citizens, but all those who use cyberspace alongside of us. That’s been our relentless focus. So let me say a little bit about the means.

There are two things that I think have kind of driven a focus on the means by which we do that. The first is to achieve resilience by design. Today – or I might say a year ago – the unrelenting focus oftentimes was on operational response. We responded extremely well about a year ago to something called Log4j. But if we responded that well, excellently, time after time after time, we’d just lose more slowly. We need to actually push responsibility for building resilience in by design – into the technology, into the roles and responsibilities, and into the people skills – such that we avoid those events. Or even when we do suffer those events, we’re in a place where we can catch it early, box it, and evict it at the earliest possible moment.

So we’ve had a focus on how do we get the roles and responsibilities right, not just within the federal government, which is important, but at the state and local levels as well, and across the private sector. Having done that, we need to also make sure that the people skills are up to speed. We hosted a conference, a summit, at the White House in July of this year, where we could focus on those people skills. And not just the jobs that have the word “cyber” or “IT” in them, but to make sure that every person who uses cyberspace has enough information and knowledge, and some intuitive and convenient technology, that they can actually exercise their aspirations in cyberspace without fear, or worry, or apprehension. We don’t want them to obsess about threats. We want them to obsess about their positive aspirations. So let’s get people in the right place.

Finally, if we do those two things – if we get the roles and responsibilities right, if we get the people in the right place – then technology can serve their purposes. We then double down on what are the attributes of technology. Anne’s leadership, with Executive Order 14028 in the early part of 2021 – delivered in May of 2021 – has been a watershed moment for us to actually make the commitment to get the technology, the architecture, right. But on top of that, we’re now pioneering something called a zero-trust architecture, which I think we all know is forty years old, just renamed again. But how do we make that architecture verify, vouchsafe, that our expectations of it are, in fact, being met?

Finally, the last thing I would say is if you get the resilience by design right, then what happens on top of that, in the actual defense of that, has to walk away from this idea that we can do that by division of effort. We have to actually achieve a degree of collaboration where we can discover things together, address things together, mitigate, kind of deal with those situations in a way that they have to beat all of us to beat one of us. That’s a collaboration, not a division of effort.

And that’s been borne out in many examples over the last year, not least of which is the Ukrainian crisis, where the Ukrainians have shown us that a modest investment in the underpinning architecture – resilience by design – actually has a very high leverage effect in your ability to defend it. A collective, collaborative defense between the public and the private sector, and governments, plural, can in fact mount a stout defense that holds its own against a numerically superior, and sometimes a technologically superior, offense. We had perhaps underestimated the power of defense. And so we need to focus on that as the predicate, so that we can then hold our own and achieve our positive aspirations in that space.

Dr. Lewis: So one of the things that’s impressed me, not necessarily in a good way, is the willingness of you guys to take on what I think is one of the biggest problems. And that is that we built an infrastructure, an internet, based on light-touch regulation, on voluntary efforts. People in the audience know what shrink wrap is – the shrink-wrap rule: once you remove the shrink wrap, all liability transfers to you.

This is a sector that’s very different from other sectors, and you’ve been more than nibbling at the edges. So I say impressed because it’s a big bet.

How do you intend to change the market? That’s an open question. But we can look at specifics.

You’ve talked about mandatory versus voluntary, both of you. You’ve talked about securing IoT. I want to come back to IT standards, which is a pet project of mine.

What are you thinking in terms of shifting us away from the 1990s approach to how we govern cyberspace? What are the tools? What are the policies you’re going to pursue?

Mr. Inglis: I think what’s old is new again. We have done this before when we addressed the safety of transportation systems, whether it was the devices themselves – automobiles or airplanes – or the systems that kind of users kind of employed to convey themselves from one place to another.

There’s, of course, a large degree of self-enlightenment of the manufacturers, the suppliers, the integrators, about what safety precautions or what safety features they’ll build in. Market forces might take you a further distance.

Remember the day when Volvo started crashing cars into walls and competing on that basis with other car manufacturers? At some point, you get to a point, inevitably, that what remains in terms of the features that must exist in order to guarantee the confidence in the safe use of those systems has to be specified and has to be delivered.

So I don’t think it’s a big bet as much as it’s the inevitable kind of truth. We need to apply the lightest possible touch in that last phase to specify what are the nondiscretionary features, but no lighter.

I think the good news there is that there’s a huge degree of collaborative effort, not just across the federal bureaucracy and the states but across the private sector. When you talk to the private sector leaders, both the technologists and the committing officials, perhaps, from the boardrooms, they acknowledge that resilience has to be by design built into the systems that they deliver.

Now, we can have some discussions about the degree to which we assign responsibility. But everyone agrees, I think, that the first and last line of defense can’t be the user at the end of that supply chain. We have to push some responsibility along that supply chain.

Ms. Neuberger: Building on Chris’ comments, I think there are three areas that are really guiding our approach to this – the three key areas that, I think, are at the root of the market failure in cybersecurity that Chris was describing. The first is visibility.

When somebody’s making a purchase decision for tech, whether a consumer buying a smart TV or a power grid operator buying an unmanned sensor, they have no way to know what is the security of this device and what risk does it bring to me.

The second is, what makes a difference, right. What kind of security features actually drive down risk, and I’ll talk more about that.

And then the third is we don’t – it’s not correct to treat everybody and everything the same because there’s a spectrum of risk, and we need to ensure that the security requirements match the risk in that area.

So three specific examples. The first, visibility. At the root, the Internet of Things labeling effort that we’ll be rolling out in the spring – we hosted an event at the White House a couple of weeks ago – is saying, you know, there’s data that shows – and Carnegie Mellon had some really fantastic data that shows – that consumers are willing to pay more for security. They value security. But they can’t make a security decision, because when they’re buying that smart TV there’s no way to assess one against the other, right.

So the core goal there is, you know, kind of like – you know, I’m a New Yorker and I often think about restaurants in New York. When they started having to do the ABCD rating in the front window that gave consumers a very rapid way to decide which restaurant am I going to and it, certainly, wasn’t the one with a C rating.

We’re trying to do the same for your smart TV, right, and a lot of great work has been done across the tech industry that we’re building on. And, really, a shout out to Laurie Locascio, the director of NIST, and her team, who’s been thinking through how to make this real. So that’s the first.

On the second – what makes a difference – you know, the president’s National Security Memorandum a year ago tasked CISA to create a set of performance controls that CISA, under Jen Easterly’s great leadership, rolled out just last week. That lays out the most important performance controls that will impact security. So it’s that common set of do this, you’re in a better place.

And then, finally, what’s the spectrum of risk. Clearly – and here I’m going to highlight Dave Pekoske, the administrator of TSA, and the tremendous leadership he has done at TSA using security directives to improve the security of oil and gas pipelines, the maritime sector, and the railroad sector.

And what they did so thoughtfully was to start by saying, who are the critical providers in the sector? There’s 96 oil and gas pipelines, there’s 57 railroad-related entities, and start with those who are at highest risk because they transport hazardous materials, they’re the largest in the country, and lay in place security directives for those.

So those three things: giving visibility so people can then make the choices they want to make but they don’t have the data for; making it easier by saying, here’s the standard controls that we’re looking for you all to implement; and finally, the leadership of key agencies who are lead agencies for those sectors, saying, not everybody in the sector is equal; we’re going to start with the highest-risk entities in that sector and put in place requirements for them.

Mr. Inglis: Anne and I, I think, agree on one thing more which is I think we agree that there are distinguished, differentiated kind of attributes of the various sectors and so therefore there will be some tailoring in each of those, but we well realize that there are some entities kind of in this ecosystem that actually operate in many of them, and we need to make sure that we harmonize these regulations, these expectations. Even within some sectors, there are multiple parties, multiple federal organizations, state organizations, international organizations that weigh in and levy expectations, whether that’s in the form of reporting requirements or regulations about the attributes of the architecture that they build and operate. We need to make sure we rationalize that, harmonize that, so, again, that we’ve actually specified what is necessary but with the lightest possible touch so that innovation and capacity generation can continue unabated because that’s been a great benefit, a great boon over the last 40, 50 years.

Dr. Lewis: You’ve both used the word regulation and that used to be a third rail, so let’s talk a little about that. Later on I will tease you about voluntary digital identity – been there, done that, doesn’t work. But what are you envisioning when you say regulation? Is it still the sector-specific approach we’ve been taking? Do you need new authorities? How are you going to do this?

Ms. Neuberger: I think principle number one is use what you’ve got, because you can move fastest that way, and principle number two is one size does not fit all, because each sector is different. You know, when you look at a water system, the operational water system and what we’re concerned about with regard to risk is different from the electricity grid, is different from transport of materials. So, to that point, the approach is to say, what are the requirements? And certainly, as I noted, the work CISA’s done lays a standard in place. And then, sector by sector, what’s the additional delta for that sector-specific risk? And we feel that the sector lead agencies – EPA for water, DOE for energy, CISA for chemicals – have the best understanding of their sector to design this, where there’s commonality across so they learn from each other, and then the distinct differences ensure that we’re addressing the core risk there. So that’s the –

Mr. Inglis: I might add to that, I think that the word regulation, or reporting requirements, often conjures up in the mind’s eye this sense of burden – someone’s about to acquire a burden, bear some penalty, some cost – but we too seldom think about what is the more important feature, which is: What’s the benefit? Who are the beneficiaries? I think Congress, in passing the cyber incident reporting law, was quite careful about specifying who the beneficiaries would be and relieving some of the liability – that if you report fully and faithfully under that law, you will not be held accountable under compliance regimes. The beneficiaries in all of these are intended to be the ultimate users, right, in this ecosystem, so that they can have confidence that the critical functions – those things that they use to conduct their daily lives – will work as advertised.

All of us when we walk over to a light switch have every confidence in the world when we flick that switch the lights will come on. Why? Because someone has attended to the resilience and the robustness underneath of that. That benefit is often kind of looked past, it’s often not in the conversation when we talk about how then do we deliver that confidence? There need to be expectations across the length and breadth of the supply chain. It doesn’t relieve the burden on the end user to participate in his or her own defense. We hold our citizens accountable to not drink and drive, to not text while they drive. There’s an equivalent to that in cyberspace. But they alone today seem to be the ones that are bearing the entire burden. We need to push some of that accountability across the system so that the beneficiaries, the maximum number of beneficiaries can profit from the benefits of cyberspace.

Dr. Lewis: The light switch analogy made me think about cloud computing and third-party services, which is sort of how you began your tenure; it was a welcome present from our Russian friends. They’ve been doing it for a while. But what are you thinking? I’m a little worried about federal use of the cloud because there’s actually some retrocession. CIOs are moving away from using the cloud, which is a mistake. It’s a very contentious issue, though. So when you think about securing the cloud, how do you also think about where this fits into a larger federal IT ecosystem?

Mr. Inglis: I just think that if cloud is an important commodity for us going forward – and it is; there’s so many enormous benefits from that – we have to make sure that it’s resilient by design. I know we don’t buy cloud by the pound, but let me just use that kind of rough analogy. When you buy a pound of the cloud, it should have its resilience and robustness built in. You should have an expectation you don’t have to go to a separate showroom to then argue for what security features you’re going to pay for above and beyond the base case. We don’t do that with cars, right? When you buy a car, it comes with an air safety bag. It comes with a seatbelt. It comes with – imagine that – brakes, maybe even antilock brakes built in. We need to treat the cloud the same way.

There is an increasing kind of willingness and market forces that exist within this space where that’s happening naturally. We need to make sure, though, for those critical services that commodity actually can deliver the goods. And so we will ensure that those specifications about what is not discretionary – what must be in there – is in there so that we’ll all benefit from that. And the economy of scale I think will make that economically viable, and at the end of the day for all of the users of that not just the federal government something that actually is well worth the investment.

Ms. Neuberger: So what drives the move to the cloud is far easier administration, right? Large enterprises, whether private or government, have thousands of devices that they have to manage, patch, maintain. So moving to the cloud helps – first from a security perspective, but also from a use-of-tech perspective, right? There’s that sense of paying for what you use versus all the capacity that’s in every desktop under a desk.

But moving to the cloud does make it easier to administer; as Chris noted, though, unless the cloud is properly administered for security, one doesn’t get the full security benefits. For too long, cloud providers said it’s up to the customer. And I think the argument we’ve been making is to say that worked in the old environment, where an OEM was bringing together a chipset from one company, an operating system from another, application software from a third; but when you have cloud providers building on bare metal, that’s the place to drive the accountability we’re talking about – if you’re a provider of tech, you’re responsible for providing a baseline of security in that tech. Yes, you may have some customers who have higher security requirements and will pay a higher delta. They’ll use proprietary encryption if they don’t want to use commercial encryption, et cetera, right? But the core baseline – the idea that it’s on the customer we just think is fundamentally false. And this is the place to shift the requirement and responsibility you talked about in your shrink-wrap example, Jim, a moment ago, to the provider, because you’re delivering a service; you’re delivering a secure service.

Mr. Inglis: But one thing we haven’t spoken to – I agree with all of that – one thing we haven’t spoken to yet is the high degree of consultation that’s absolutely required to get this right. I can still remember, in one of the earlier engagements I had in my tenure, talking to a major manufacturer of software. And they said, I love this newfound ability to collaborate with you, to answer your questions. And he said, what would be even better in the collaboration is if you let me nominate some of the questions. He was right. He was exactly right. We need to make sure that this consultation helps us identify – through the lens that they enjoy, where the innovation takes place, the capacity generation takes place, the sustainment and the defense take place – what the right goals are. And that degree of consultation is, in fact, taking place, where the government, which will act on behalf of its citizens, must have a high degree of consultation with those folks who actually have 85, 90 percent of the work before them – the work that is going to be necessary to understand what the right questions are, let alone what the answers to those questions are.

Dr. Lewis: So I think we are in a process of evolution from the old cyberspace to a more mature one where it’s treated more like other industries. And that’s a good thing, but it’s also challenging. So I give you credit for tackling it. Perhaps in two years we can assess how it’s worked out. I agree with Chris we’ll probably be driven there.

But a couple issues have come up. And so if you’ve been – everyone in the room has been following cybersecurity for a while and we see remakes of ideas. Let me go over a couple of them.

You’ve talked about using insurance. By the way, the very first event I did on cybersecurity 20 years ago was on how insurance would drive us to more secure networks. I’m still waiting. You’ve talked about maybe –

Ms. Neuberger: Didn’t say when. (Laughter.)

Mr. Inglis: Nothing says if it was a good idea then that it’s not a good idea now, right? So –

Dr. Lewis: (Laughs.) Oh, they’re so hard on me! (Laughter.)

But you’ve talked about maybe we’ll just insure – one little dilemma – there’s a couple dilemmas with insurance. The first is moral hazard. You all know that. The second is the tendency of some companies to declare it’s an act of war – reasonable, considering that it’s often a state actor who’s responsible – and then therefore they’re excused. And the third is we would only insure for catastrophic events. And I’m not quite sure what a catastrophe in cyberspace would look like. And maybe you could just touch on where do you think the insurance avenue will lead us, if we pursue it?

Mr. Inglis: Well, let’s step back for a minute and be agnostic about what we would apply the insurance market, or the insurance craft, to. Let’s just think about what insurance typically does. It doesn’t just transfer risk from party A to party B. If properly done, it differentiates between risks, and it addresses bad risk by imposing expectations about how it becomes a good risk, so that it can in fact have rates that are preferable, perhaps technology that drives those rates, and so on and so forth. It drives a body of practice that essentially gets everyone to a better place. It’s a rising tide that can raise most, if not all, boats.

Now, why does that not work in the cyber marketplace? One, there’s no universal expectation that people are going to buy it; therefore, there’s no diversified risk. Two, there’s not enough information to do the actuarial analysis, because of factor one. There’s not enough information to say, I can assess and address that risk. Three, there’s a high degree of hazard in that space that often goes to the darkest possible corner of the room. It’s high-end risk. And, four, there’s not a mature industry that can install itself, insert itself, into that to say: I can help you buy that risk down by doing the equivalent of the smoke alarms, or the fire detectors, or placing fire departments in a logical place, or adding fire-retardant materials.

All of those are within the realm of possibility for cyber. We’ve just not organized it such that those forces can stand in and achieve the beneficial effects. I think cyber insurance could, in fact, be viable. But we haven’t actually taken care of the underpinnings to make it such. So I think the government and the private sector together can consider how to create a viable cyber insurance marketplace. Not so we can transfer risk from party A to party B. That would be somewhat feckless. But so that we can actually achieve the rising tide raises all boats proposition, which we have in so many other industries.

Ms. Neuberger: So two anecdotes that, to me, say why you were right 20 years ago, and why it’s worth it to think about insurance. As a child, of course. One is, my husband and I bought a 100-year-old home. And we couldn’t get home insurance without putting in place smoke alarms, right? An alarm system. Because of the idea that if you can’t detect a threat, it’ll lead to catastrophe. The second, when our teenage son joined our family car insurance you all know what the impact of that was, because of the data that shows what teenage boys do, or potentially could do.

Why do I say this? Because I think insurance has the opportunity to incentivize good and punish negligence. Incentivize good. So for – we now have a good understanding of which practices drive down cybersecurity risk. Insurers can say: If you put those practices in place, your premium price will change. Or post an incident, we’ll actually look to say were you, the entity, doing those best practices? And if you weren’t, we’ll treat you differently. If you were, we’ll treat you differently. So that opportunity to incentivize the good and punish the negligence – punish is too strong a word – but make it clear that negligence can play a factor, I think is important.

And the second aspect is the gathering of data across incidents that happen, or across incidents that don’t happen, to further give us insight on what matters with regard to security investment. Security costs money, fundamentally. People want to get the highest return on investment for every dollar they spend. And that communal data about practices and what they lead to with regard to compromises, incidents, or lack thereof, is so important to us as we continuously work against the evolving threat, right? Because there’s always new and creative offensive techniques – whether they’re from nations or criminals – to break things. And I think what we always want to do is say: What makes a difference in driving down that risk? And I think there’s opportunity for insurance to help us get there.

Mr. Inglis: So what’s nested in the middle of that is this notion that the insurer looks at the insured party and says: I have expectations of you as you acquire this service from me. That’s been missing in cyberspace. Too often in cyberspace there are people who say: I want to take risk, but I hope somebody will actually relieve me of that risk, will perhaps save me from that risk. All of us need to participate in the defense of this space. Now we need to push some responsibility upstream in ways it hasn’t been before, so that it’s actually a defensible proposition when it gets to me, now a user, in that space.

But that doesn’t absolve me from participating in my own defense. I need to make the necessary investments in my local environment. I need to make some changes in my behavior in order to be worthy of being insured. That happens naturally in just about every other insurance marketplace. Whether it’s the anecdotes that Anne used or a dozen others. That contract or that compact between the insured and the insurer is really, really important in this space. And it will modify behavior as much or more as it will bring financial instruments to bear.

Dr. Lewis: So to make insurance work – and I don’t want to spend the whole session on insurance – you need actuarial data and you need standards. And there’s some effort to change that picture, with the notification requirements that Congress has created and with NIST’s efforts to update their cybersecurity framework.

Mr. Inglis: (Off mic.)

Dr. Lewis: Hmm?

Mr. Inglis: Is this a great country, or what?

Dr. Lewis: Amazingly, we seem to be getting ahead of the curve, yeah. So but where do you think it will lead? Will it be enough? I mean, most people conceal when they’ve been hacked, so it makes it very hard to judge risk. And you have big, dramatic incidents. Is that what we should focus on?

Mr. Inglis: On the intrusion reporting, I think we’ve been careful. And I give high credit to the Congress for this. We’ve been careful to identify who the expected beneficiaries are and hold harmless those who faithfully execute the reporting requirements. That they’re not going to be held liable. We’re not going to find and stab the wounded. So that should be something that drives us to have a greater understanding of what’s broadly happening across this space, so that we have, what you’ve described, as the actuarial data.

It puts us in a place then, to Anne’s point, to know more about what are the practices that actually can bend the needle, that can bend that needle down. It puts us in a place then to be able to judge risk, bad risk and good risk, and be able to then stand in and say: I can help. If you participate in your own defense, I can help you, through economy of scale, bear that risk. I think all of those are the predicates for ultimately a viable insurance marketplace. It’s early days. It’s not something we have a script to say if we do one through five we’ll be there. But I think that the tenets of that have been well tried and true in other domains of interest, and we should apply those here as well.

Dr. Lewis: I had a guest from the Danish central bank recently, who said they were shocked at the lack of a common digital identifier in the United States. That it was so much more convenient in Denmark. Now, I’ll note that the countries that have been successful at creating digital identifiers tend to be small. So Estonia, Denmark, others. But what are you thinking, in terms of fixing this? I mean, a voluntary approach, this would be the fourth try at a voluntary approach if you do it, so perhaps there’s some pitfalls. On the other hand, is America so uniquely driven by individual privacy that a government identifier is impossible? What’s your thinking on digital identifiers? Because this has been a problem from the start.

Ms. Neuberger: So, I admit – and Chris is probably smiling, although I haven’t looked at him – I’ve spent hours tilting at this windmill, with an amazing team who’ve been at it together. So here’s – here are the thoughts. Overall, my thinking generally is that the absence of a trusted ID – when I say trusted, I mean, you know, some type of securely encrypted identity – causes billions of dollars in fraud, billions of dollars of identity theft, and tremendous harm to Americans every single day. Harm, wasted time – you know, identity theft is a major issue. And it typically, frankly, hurts the most vulnerable populations.

And what I find is, when we start to have the conversation, we cannot allow it to become a privacy-versus-security conversation – one or the other. Instead, we need to put both on the scale and say: This is doable. Because there’s a cost to the current ecosystem. And the most vulnerable parts of our population pay – in terms of government programs that don’t necessarily go to the right people at the right time, in terms of identity theft online. And we can do it in a secure way that protects privacy, accounts for bias, et cetera.

One of the most helpful developments in this space has been the mobile digital license pilots that individual states have done. We’ve taken detailed briefings from those states – specifically, a shout-out to Maryland and Arizona, two different states; we intentionally chose, you know, different states. And we’re impressed with their security features, their privacy features, their civil liberties features. Those are thoughtful programs. And the reason they’re building on mobile digital licenses is because our licenses are our IDs. There’s already a picture taken. There’s already significant data that we give. Let’s just make it one that can be used online, remotely attested to. So when you’re logging into your medical records, or logging in to your bank account, that license in your pocket can, in a digital way, actually validate who you are. And frankly, states are trusted. When we go and we give our license information, right – don’t laugh – we’re giving all that information. So let’s take it to that next step, to make it useful to address the significant harm online.

And I think that’s where we’re watching those state-by-state efforts closely. We’re currently thinking through, you know, how that could be used to address the problems you talked about. But this is certainly a challenging issue. It’s an issue a number of countries around the world, as you noted, have solved. There are now, you know, digital IDs for individuals to use voluntarily when they want to. They’re not mandated, but if an individual wants to be safer online they have a digital ID that can be recognized. So there’s a lot we can learn from all the countries who have gone before us to address this issue. But it’s important to take it, as you know, step by step, in a way that people can understand, in a way that is putting in place the right steps for the long haul.

Mr. Inglis: So I agree with all of that and I will double down on one point in particular, which is that oftentimes cybersecurity is described as something that contends with privacy. I think we would agree – we would argue that we should put cyber in its proper place by subordinating it to our larger societal or individual interest. Cyber should be expected – cybersecurity in particular should be expected to deliver privacy and all other aspirations that individuals and societies have. If we get that right, then cybersecurity can, in fact, enhance privacy as opposed to hold it at risk.

Dr. Lewis: So we’ve talked about at various times data stewardship, and that relates to this. And data stewardship is now part of cybersecurity. Is that a fair statement? What would – what would you want to see change in our national approach to data stewardship? Anne, do you want to go first?

Ms. Neuberger: I do. When we look at data, there’s some data that we intuitively know needs to be protected: national-level health data, right – national information regarding health in the country, reactions to particular sicknesses, the spread of a disease or illness; sensitive data related to banking or financial systems; and information sharing. So, for example, one of the reasons that social media platforms and their role in countering disinformation is such a priority for us is because the health of a society is also the way it debates hard issues and the way it comes together, in democracies, on hard issues. So certainly we know that there is data that’s important to us from a national level, in addition to the individual data, as people, that we want to protect. And we need to have both data protection standards and processes in place to ensure that data doesn’t fall into malicious hands.

We’re thinking through how to do this in a way that also accounts for the innovation of what makes our economy unique, right? Each of us who may use Waze, we know that the algorithms and the AI and the data aggregation that enables that additional service so we don’t spend hours wasted in traffic is something we value. But there may also be information there that’s useful from a national perspective. How do you prevent vehicle accidents? But also, how do you ensure that you protect sensitive information? So we’re thinking through how to best get the innovative aspects of that, but also protect where aggregated data brings national-level risk. So more to follow in this space.

Mr. Inglis: I think for too long technology has been the organizing principle of cyberspace. It’s understandable. It’s the visible thing. It may be the pacing function as technology turns over month by month. But the organizing principle, I think, should return to people and their aspirations, and the data that is the embodiment of the value that they store in that space, or the choices, decisions, and coordination that they would exercise through that space.

Our colleague Jen Easterly, who directs the efforts of CISA, makes a compelling argument that we shouldn’t talk about cybersecurity; we should talk about data care in the same way we talk about health care. I think it’s a compelling argument. It’s a worthy kind of consideration to say: Should we reorient ourselves to think about the core issue here, which is given, driven, delivered by technology? But technology is a means. It’s not an end in and of itself.

Dr. Lewis: This is just a yes-or-no question. Do you think we need to beat the drum on multi-factor authentication or have people gotten the message?

Ms. Neuberger: We’re seeing improvements. (Laughter.) Until we’ve got to a hundred percent, why not beat the drum?

Dr. Lewis: OK. Well, if people – those who are watching, if you aren’t using dual-factor authentication, now would be a good time.

Ms. Neuberger: Jim is saying please do so he can stop hearing more about it.

Dr. Lewis: On that note, two things.

First, if you have questions, now would be a good time to start writing them on your card. Please give them to Georgia and we’ll go through them.

Second, one of the issues – you both came in at a really good time for cybersecurity because there was a lot of action, and one of the big issues was ransomware. You’re having an initiative – is it next week? – with 30 countries, or 32.

Ms. Neuberger: Thirty-seven.

Dr. Lewis: Thirty-seven. Oh, it’s gone up.

What are we doing – how well are we doing on ransomware? Some of it is tied to the media. So, you know, when it’s in the news you get a lot of attention. But what I’m hearing is it really hasn’t gone down, in some ways. What’s the take on ransomware?

Ms. Neuberger: I’ll start on that one.

Ransomware is a tough problem because, fundamentally, deterrence is hard when many of the criminal actors are sitting in countries where we don’t have law enforcement relationships with them, right.

So it’s, fundamentally, a hard deterrence problem. So what we’ve done – tremendous work done across the interagency, and I’ll specifically really shout out Treasury and Justice, who’ve done the bulk of the work there, is approach this as a fiscally-driven problem, right. Ransomware is interesting because it’s where the money is, and in that way we’ve done a set of things within the U.S. and a set of things internationally.

Since you mentioned it, I’ll highlight the international aspect first because it’s the core example of transnational crime. You could have infrastructure in six countries, the target in a seventh country, the actor in an eighth country, oh, and the money moving across a cryptocurrency ecosystem that is in virtual space.

So we stood up a year ago the International Counter-Ransomware Initiative to say America is going to lead on a problem that’s disrupting hospitals and banks and schools around the world, and we will lead by both providing capacity and guiding an international partnership and, really, being arm in arm with our partners.

So it has five working groups underneath led by other countries, and tremendous work has happened throughout the year. Monday and Tuesday, those 37 countries are coming together in person for two days of work to build on what occurred during the year, get threat briefings, focus on how we more effectively disrupt actors and infrastructure, how we more effectively get cryptocurrency exchanges to implement “Know Your Customer” rules, to pursue money, how we more effectively work in international diplomatic venues to hold countries accountable for harboring these actors.

So we’re really excited – watch this space – and deeply appreciative of the partnership around the world. And I will really call out colleagues – you know, David Koh in Singapore, Rajesh Pant in India, our colleagues in Lithuania, and Michael Pezzullo in Australia – who have led a lot of that work.

In the U.S., we focused on, as I’ve mentioned, disrupting the funding ecosystem by designating, specifically, mixers. Mixers are entities that, to use a simple word, launder funds across the blockchain, which is public. We designated the largest one, Tornado.cash, after a series of hacks by the DPRK – not a criminal actor – of cryptocurrency ecosystems, which used Tornado.cash to launder the funds and convert them to hard currency.

So we’ve designated – we’ve done extensive exchanges with cryptocurrency exchanges about “Know Your Customer” implementations and also, of course, there have been, you know, arrests of key ransomware actors.

But there is a level, to the point we’ve been talking about, where it’s also a call to responsibility by entities – hospitals, schools, and others – to put in place the resilience practices that we know make them far harder targets against criminals and, frankly, not to pay ransoms, because every time a ransom is paid, while it might make it easier for that entity, it incentivizes the ongoing activity.

Dr. Lewis: So you need someone to volunteer to be victim number one. That could work.

Ms. Neuberger: Not quite.

Dr. Lewis: Just kidding.

Mr. Inglis: I think the point that Anne was making at the end there, which is exactly right, is that we should address the hazard, which continues to be real in the world. It’s a really pernicious set of capabilities that allow folks in these kinds of sanctuaries around the world to operate, up until this point in time, with some degree of impunity. We can address those, kick the legs out from underneath those, to remove the hazard.

But there’s an equal if not greater body of work that must be done to remove the target, right, to make it such that they simply cannot prevail. For every ransomware attack that kind of shows up above the fold on page A1 of the newspaper, there are several that have been avoided because organizations made a choice about what information they would hold at risk in their systems, about whether they would have backups, about whether they would encrypt data at rest so that it was useless to somebody when they absconded with it, and so on and so forth.

There’s a website that CISA put out there a year ago, stopransomware.gov; if you read that and do those very practicable things – you don’t need to have money to do these things; you just need to have time and attention – and if what is actually on your digital systems is valuable to you, then it’s worth that time and attention. You can therefore remove yourself as a viable target. Not completely, because the hazards are still there, and the busyness of daily life makes it such that we might be looking in the wrong direction when something comes our way. You need to work both ends of that problem. But we are not going to shoot our way out of this. We have to actually practice those things that make it such that we’re not a viable target.

Ms. Neuberger: I really appreciate that Chris noted CISA’s stopransomware.gov site because that is really key to resilience efforts, so thank you for highlighting –

Dr. Lewis: What’s the reaction been from partner countries? I mean, there’s some topics that they still remain very nervous about addressing, and there are others like ransomware where they seem to want to engage.

Ms. Neuberger: That’s exactly it. When we talk about countering malicious Chinese, Iranian, or Russian activity online, there’s always someone who falls out or is not ready to do so publicly. When we talk about stopping criminal behavior that’s disrupting critical services in country after country, everybody jumps in and says I’m in, thank you, America, for leading, thank you, America, for partnering; we want to be in on this. And I think that’s why – what’s cool about the International Counter-Ransomware Initiative is, first, the set of countries is not just the traditional ones we usually partner with. It includes Brazil, the Dominican Republic, Kenya, South Africa, right – a broader set of countries. And we’re hopeful that taking on a target we all agree is harmful builds the processes and sharing and relationships and diplomatic norms that we can then build on for tougher targets, where there may not be, to your point, the agreement right away that we can do this arm in arm.

Mr. Inglis: But, Jim, your question and Anne’s answers are the points, which is there is much in the world we don’t agree on, but on cybersecurity we agree on a profound amount, right? Sixty nations showed up at the White House in the early part of the summer to sign something called a declaration for the internet, which aspirationally talked about the common attributes and values of our various societies – which don’t have the same forms of government across the spectrum – but that we can agree these are the things that we owe our citizens. Thirty-seven nations are going to show up next week for a two-day conference on ransomware to essentially put our money where our mouth is, to say: How do we actually address the hazards in that space and promote the best practices that would help us avoid those hazards, even if we can’t remove them entirely? That, I think, is a really good-news story. There’s work to be done, but there’s a coalition formed to do it. And we are making real this premise: What if it was true that an adversary needed to beat all of us, or at least many of us, to beat one of us? That would be a new day. We stop fighting alone, we stop defending alone, in a way that the risk to the adversary is much, much higher that they’ll be caught, that they’ll be boxed, that they’ll be evicted before they can do their dirty work.

Dr. Lewis: Yeah, and I should note that we hope at some point to get Nate Fick here, the new cyber ambassador from the State Department, and State has reorganized itself to be –

Mr. Inglis: I’m delighted to say he’s on the job. I don’t know where he is, but I’m sure he’s doing kind of the Lord’s work as we speak.

Dr. Lewis: Yeah. In trying to get him to come to this event, I got responses from at least two continents, so he’s taken the bull by the horns, which is good.

Mr. Inglis: I can offer the larger premise here. This is not a throwaway remark; I mean it. I think Anne and I have a regret that it’s just the two of us up here on the stage. The team that has been arrayed across the private sector and the public sector is simply extraordinary. The opportunity to serve at this moment in time alongside that team – again, private and public – is simply a privilege, it’s simply a pleasure. You know, I find that on a daily basis the degree of collaboration is unlike anything else I’ve experienced in times past, where the complementariness of these roles and responsibilities, because of the culture that has been encouraged, is, I think, the main thing, the main play.

Dr. Lewis: No, I’d agree. I think this is – some of it’s, again, this maturation process. You guys have been in the business for a long time, right, and we didn’t have that before. You have people who know the field, right? We didn’t have that before.

And the final thing I’ll say is if you can figure out a way to get four or five people on stage at the same time, let me know, because we did try. (Laughs.)

Ms. Neuberger: The truth is the team, to Chris’ point, is not four or five, right? The team across the U.S. government is probably 12 or 13, right? You have sector risk-management agencies. You have agencies like Commerce and NIST. You have – you have the intelligence community, the FBI. That’s the team across government. And the private sector are partners in everything, both in visibility. Frankly, the Counter-Ransomware Initiative Monday and Tuesday, we have a session where we have companies coming in from around the world – intentionally not just American companies – and we’re asking them three questions.

Question A: What can you do to help in this fight?

Question B: What can government do better in this fight?

Question C: What can we do better together in this fight?

And we’ve said please come with hard-hitting ideas. You know, don’t be embarrassed to say: Guys, you’re failing in X; please do better. But I think that sense that we’ve really – you know, at the superb EV event, Chris, that you hosted earlier this week, you know, when you looked around the room – and Chris had – I think I can say who is there, right?

Mr. Inglis: Various vendors in the electric vehicle –

Ms. Neuberger: Various vendors. (Laughter.)

Mr. Inglis: – in the electric-vehicle marketplace, right, who were committing officials for the innovation production that they undertake, and government officials from the requisite agencies, the federal entities there across innovation and the various support activities that sector-specific agencies undertake.

And the question on the table was: If the American people and likeminded nations have an expectation that electric vehicles will perform reliably and safely when they’re dependent on digital infrastructure, which they already are, what is the role and responsibility of each and every person in this room to deliver that in innovation, in standards, in the execution of those standards? It was a wonderfully collaborative discussion. The government learned as much as it might have said in that room. And we walked away from that committed to then, for that particular initiative, to undertake the work necessary to deliver to the American people what they expect.

Dr. Lewis: Yeah. My sense is that the politics of cybersecurity have changed in a good way. So it was perhaps in 2012 much more confrontational. Now there’s a greater interest in collaboration.

Mr. Inglis: I think that’s true.

There’s something else that’s changed as well. It’s always been true that cyber is not a vertical, it’s a horizontal; meaning that you can’t subordinate all of the parts to one of the parts. But we’ve now acknowledged that and we’re actually taking that as a feature, where if we can apply, concurrently and in a complementary fashion, all of these insights, capabilities, authorities, and talents, and do that in a way where we are coherent, we can, in fact, overwhelm any adversary, any kind of opposition that we might have, and stop obsessing about those threats and start obsessing about our aspirations in that space. But if we operate independently in our stovepipes, using what limited insights, capabilities, and authorities we have, we will be, as we have been, picked off one at a time. I think there’s an acknowledgement that that’s true.

Dr. Lewis: So, speaking of authorities, one of the questions we got was: What do you need from Congress? And of course, I immediately thought of the NDAA, which has provisions on notification and on important infrastructure and on accountability for vendors – all topics that you’ve raised here. What is it you need from Congress, to the extent I can ask you that without getting in trouble with your –

Mr. Inglis: I don’t think there’s any trouble in that. A continuation of what we’ve received, which is a nonpartisan/bipartisan approach to this. Congress asks hard questions. They expect solid answers. And frankly, you can’t tell whether it’s a Republican, an independent, or a Democrat who’s asking you those questions. That actually is a feature of the moment.

In the NDAA – what, ’21? – kind of a byproduct of some of the Solarium Commission recommendations, there was a watershed delivery of legislative reform, or legislative additions, that we have been busily executing on the executive-branch side. I think this year there will be a few more provisions that essentially add to the authority and the expectations, through the Congress, of the American people about what more should be done. I think a continuation of that is what we would ask for. But hold the executive branch accountable, as they do, in a way that’s nonpartisan/bipartisan, with a focus on the benefits to the American people as opposed to a political – small-P political – exercise, which I have not experienced.

Dr. Lewis: We have time for a couple questions. And so going through this stack here – and I apologize if we didn’t get to your question – where are we on workforce? You just started a big initiative. And my one contribution was to say do it at scale or don’t bother. Where are we on workforce?

Mr. Inglis: I agree. So let me start with that. Anne, I’m sure, will have some thoughts to add.

At the moment, I think most people know that we have been successful in filling two-thirds of the jobs that have the words “cyber” and “IT” in them. That’s the good news. The bad news is the denominator is flying away from us. At the moment, 711,000 of those jobs – one-third of them – are empty. That means we need to reexamine everything about the proposition. Have we specified those jobs properly? Have we appealed to the broadest possible population that would fill those jobs? And have we managed the transition from those aspirants to those jobs? We need to take that on by reexamining every piece of that.

But there are adjacent disciplines, adjacent fields. They might be lawyers. They might be CEOs. They learned their craft through kind of curricula that was developed over millennia, or journeyman programs developed over millennia. Tradesmen who are on digital factory lines. The choices they make implicate cyber futures as well. And they need to know more about how cyber works and the consequence of those choices than they do. We need to get into that kind of training program or that curricula.

And then there’s everyone, meaning the whole population – in the United States, upwards of 350 to 360 million people who are not digital natives. They’re for the most part app natives. We teach them more about managing hot stoves and crossing a city street than we do about cyberspace in their formative moments. We need to address that as well. Now, all of that said, we need to get technology that serves people, that is inherently resilient and robust, and is as intuitive to use as an automobile is. But we still have a people issue.

And so we convened at the White House – Anne and I were there – a summit of leaders from the private sector, which employs and often generates talent and skills; academics, traditional and nontraditional; and federal leaders. So that we could then understand what is the nature of the challenge, and what’s a framework that we can use to address it. To your point, what are those practices that we can, at scale, connect, leverage, and resource, so that we can actually solve those three problems, right? Fill those jobs. Get the adjacent disciplines in the right place. Give everyone the awareness necessary.

We will ultimately write a national strategy for cyber education. Maybe we’ll call that upskilling the American people, so they take full advantage of cyberspace. It’s not pithy, but that’s a more accurate description of what it is. But that will not be a vertical strategy, which has a script of enumerated responsibilities and someone with a bullhorn who’s directing those. It will be a framework within which we say: This is what we can do to, at scale, reverse that trend and get this in the right place. The better news is the private sector is all in. Academia is all in. And other countries are all in.

In any conversation I’ve had about the topics we’ve discussed today, where the lights really turn on and the faces brighten is when we turn to the people component, the education component. Everyone is in the same place, and everyone has the same desire to turn that around.

Dr. Lewis: Well, that’s upsetting because I’d gotten you a bullhorn for Christmas, and now I guess I have to – (laughter) –

Ms. Neuberger: I would – just a quick thing, because I know we’re short on time. What’s so cool about cybersecurity is there are open jobs. And the traditional path of a college education is not needed to start out or grow your career to become an expert. And I think that’s so cool, because it offers opportunity to folks from all different backgrounds, in a growing field where one can be a hands-on technical expert, one can be a compliance or policy expert, or one can be kind of a risk – you know, bigger picture, risk-thinking expert.

And when you look at, you know, just giving one example – there are several – when you look at, for example, online services like LinkedIn, right, that let you say: Here are some certificates you can get that show you have those hands-on skills. And then here are the jobs that this combination of certificates makes you eligible for. That brings in a workforce who may have thought, that’s not something for me. I don’t know math. I don’t know cryptography. And some folks for whom college may not be the right path.

So I think the fact that it offers an alternative path, that there are open jobs, and thinking through what are the ways to ensure that folks – both the average American knows how to stay safe online, absolutely, but that people who wouldn’t have thought of themselves as taking advantage of those open job opportunities now see a path to do so, is just tremendously exciting.

Dr. Lewis: All right. We have at least – we have multiple good questions. What I’m going to do, though – yeah, so –

Mr. Inglis: Call us up next year.

Dr. Lewis: Yeah, oh great. (Laughs.) Let me ask one final one, for me, which is: What did you learn from Ukraine? I was talking to an Israeli general, who said that Ukraine made him wonder about the utility of tanks. I think that’s interesting, from an Israeli. It’s emerging technology. It’s a different kind of war. Cyber’s been a big part of it, if only in the inability of the Russians to execute. What did you learn from Ukraine? What are you going to do differently after Ukraine?

Ms. Neuberger: It’s such a good question. I’d say three Ps – preparation, partnership, and private sector. What do I mean by that, right? Ukraine is actually a rallying call for cybersecurity experts – who may sometimes say, oh my God, we’re on the losing end – to recognize that with hard work and preparation one can be in a place where one can defend. Ukraine took a lesson from 2014-’15, and the Russians’ attacks on their electricity infrastructure. They got to work with international partners – the Department of Energy was a key partner along the way, some of the national labs – and worked to secure their grid. They cut their ties to the Russian grid right before the invasion and connected to the European grid for resilience. So fundamentally, that preparation and that international partnership put them in a place to defend key components of critical infrastructure.

Next, international partnership. Every country was on alert to see where there was destructive use of anything going on in Ukraine, and to share that information rapidly, deploy defenses rapidly, so we don’t have a repeat of NotPetya. And certainly we’ve seen – now, we continue to remain vigilant. So learn from Ukraine, continuously watching – none of us, you know, have stopped that. That’s a continuous concern of where this may lead.

And then finally, private sector partnership. Moving away from cyber for a moment, right, the role of commercial space in providing capabilities that heretofore were national-level capabilities is a huge opportunity and a huge risk – thinking about those kinds of capabilities, whether it’s surveillance, whether it’s communications, in the hands of malicious actors. So how we ensure we both leverage the opportunity and consider the risks in an area beyond cyber. Thinking about the advances that have happened in an emerging technology area, like commercial space, is really key.

And I’ll stop there, because I know we’re short on time and I want to make sure Chris –

Mr. Inglis: So I certainly agree with that. And I’ll help you on time. I would simply say I would go back to what your Israeli friend told you, and say I think he’s right, but I would generalize the proposition. It’s not about tanks. It’s not about technology. It’s about expertise, as Anne said, and the way we apply that expertise in a collaborative fashion, right? Expertise actually dominated the battlefield, which is the Ukrainians were really good at cyber defense. It wasn’t about their perfect architecture. They don’t have a perfect architecture any more than we do. And the collaborative mode between the private sector and the public sectors, plural, actually made a discernable difference – a difference that actually has them prevailing.

I think I, in my own case, underestimated the power of defense against a numerically superior offense. And interior defense, meaning when you’re back-to-back with a defender and you’re actually in a collaboration, actually can hold its own. That might not be an enduring proposition. There could be something that is sufficiently overwhelming that it will prevail. But so far they’ve shown us that expertise and collaboration matters.

Dr. Lewis: We have a lot of work to do here. A company called Skydio gave me a drone. And what’s cool about it is it’s in my office. It fits in a briefcase. And it turns your phone into a targeting device, because it uses artificial intelligence. And you just tap on the thing you want the drone to follow, and it follows it till its battery runs out. So that’s pretty cool.

Mr. Inglis: I was thinking of more to come, but I think I’ve changed my mind.

Dr. Lewis: (Laughs.) Let me go through – I have four questions that I want to at least mention to you. We’ll do a smorgasbord, and then I know you have day jobs, so you can pick. Someone said: Can you discuss – let me do them all, then pick the one that you think you have time for. Can you discuss the mandates that will be included in the cyber strategy? What about improving accountability for negligence in cybercrime? We’ve touched on that issue. We’ve touched on all these issues. A question about legislation, that some of the recent legislation doesn’t mention cybersecurity. How did that happen? The Inflation Reduction Act – I can’t read your writing – why wasn’t it in the bill? And that is a good question. You can – I love search because you can just search for the word. Can you discuss partnership to enhance chemical sector cybersecurity? And then finally – we can talk about post-quantum later. So why don’t you do – were there any of those that piqued your interest?

Mr. Inglis: Well, we’re in the middle of that, when you talked about legislation. Maybe you did a word search, and it didn’t have enough occurrences of cyber or cybersecurity. Again, cyber is not an end in and of itself. Cybersecurity is not an end in and of itself. It’s a means by which we deliver the things we care about, right? So we’re about to spend – we’re in the process of spending $1.2 trillion, right, on infrastructure massively across the nation – the bipartisan infrastructure law.

A billion dollars of that has been specifically allocated for cybersecurity, deployed to the states and the local governments. But $1.2 trillion of that is going to be spent on things that are fundamentally dependent upon digital infrastructure. And if we get that right, if we build resilience into that digital infrastructure, we will make a massive uplift in our cyber resilience. Not for its own sake, but so that we have confidence in the functions that we really do care about.

The CHIPS Act is another of those, $52 billion being spent to essentially improve our ability to have confidence in a supply chain that has microprocessors in it. There isn’t anything that doesn’t have – I’ll bet your drone has more than a few microprocessors in it, and you can use it for good purposes as much as perhaps for other purposes. But the CHIPS Act is all about resilience and robustness of a supply chain that we’re fundamentally dependent on. So cyber, whether the word exists in that or not, is in there. And the Inflation Reduction Act, as well, is trying to talk about resilience and robustness in our society so that we have confidence that the functions we care about are delivered to us.

So I look at everything and see cyber in it. Now, I know that that’s because cyber is in my job title, Anne probably the same way. But cyber’s the means, not the end. And so you can actually use every activity and every dollar that you spend to achieve the resilience and robustness that we need, that we want in the society.

Dr. Lewis: One change is we actually have money in the equation now, which is –

Mr. Inglis: We do.

Dr. Lewis: – a tribute to Congress and to bipartisanship.

But Anne, yeah.

Ms. Neuberger: Exactly. And in both the IIJA and the IRA, which really represent President Biden’s focus on addressing domestic resilience, whether physical or digital. And we’re plugged in to the implementation teams, right, to Mitch Landrieu’s teams, to ensure that, for example, a bridges bill includes smart sensors for weight management to determine whether something is carrying too much weight. And there are just cool opportunities to think about – instead of just sending a maintenance team, having those sensors say, well, this system needs maintenance, this one doesn’t, right? So we’re plugged in in terms of execution. So while we may not – while this topic may not necessarily be in the words, in execution there’s certainly an understanding. And when the president talks about resilience, he means physical and digital.

And on post-quantum, what I would just say is it’s the finest example of securing by design today for a threat that can be a decade or more down the line, right? So what we want to do is give – you know, the process of rolling out new encryption that can defend against a potential quantum computer is not a one-year effort; it’s a lengthy effort, right, to build the algorithms – NIST standardized them over the summer – to deploy them in new devices, to deploy them at scale so that the elements of secure digital infrastructure that encryption gives us today – like the padlock when you go to a browser – will continue to exist in a way that’s transparent and invisible. It’s baked in. And the partnership between the public and private sectors in doing that, you know, that’s really foundationally building new digital infrastructure that we can trust and rely on.

So it’s just a great opportunity. There’s a lot of hard technical problems. There’s a lot of hard making-it-happen problems. But I think the fact that we got started at this time and got started in this way gives us a sense that we can do this, solve a hard problem by thinking about it well in advance and building the team to execute day after day, week after week, month after month.

Dr. Lewis: Great.

One thing you learn when you study history is that when you look at a very granular level things appear to be bumpy and scratchy and difficult, and that certainly would apply to cybersecurity. But when you take the longer perspective, it looks like a steady upward path. And so many of the successes in American history, when you look at the implementation, the day to day – when you read the people’s notes – there were tough fights, and you guys are doing great. So taking a step back, the path has been upward. And so thank you for that.

Thanks for taking the time. Thanks for coming out this morning.

Mr. Inglis: If I might say, I think – (off mic) – in this regard.

Dr. Lewis: Ah, more history. OK. (Laughs.)

Mr. Inglis: I think our truest regret in being here today is that Jen Easterly’s not here, Brian Boynton’s not here, Rob Joyce is not here, any number of other leaders across the federal government – Nate Fick is not here. We represent their story and the private sector’s story because this is a team fight. This is a team endeavor.

Dr. Lewis: And we did reach out to some of those people. And if you can – as I said, if you can figure out a way to get –

Mr. Inglis: It’s a small stage.

Dr. Lewis: No, four people’s schedules to align.

Mr. Inglis: Somebody has to be doing the real work while we’re here with you. (Laughter.)

Dr. Lewis: Maybe I should start planning for one a year from now and we might have a better chance.

Ms. Neuberger: But thank you for having us, for the questions, and for being, as you know, somebody we always talk to when we’re facing hard problems and want to think through them.

Dr. Lewis: Well, thank you for coming today and thanks to everyone. Thank you. (Applause.)

(END)