Dual_EC_DRBG with Justin Schuh and Matthew Green

Nothing we have ever recorded on SCW has brought so much joy to David. However, at several points during the episode, we may have witnessed Matthew Green’s soul leave his body.

Our esteemed guests Justin Schuh and Matt Green joined us to debate whether Dual_EC_DRBG was intentionally backdoored by the NSA or ‘just’ a major fuckup.


This rough transcript has not been edited and may have errors.

Deirdre: Hello. Welcome to Security Cryptography Whatever. I’m Deirdre.

David: I’m David.

Thomas: I’m Thomas.

Deirdre: And we have two special guests with us today, returning champions. We have Justin. Hi, Justin.

Justin: Hello.

Deirdre: And we have Matt. Hi, Matt.

Matt: Hi, how are you?

Deirdre: Hi. We’re chatting on the Internet the other day about our old friend, our favorite conspiracy theory, Dual_EC_DRBG. And we started shouting about this because Justin shared a very good talk from, I’m forgetting the person’s name because I don’t have it in front of me, but it was basically a talk from 2014, InfiltrateCon. Is that right?

Justin: Something like that. It was. His name’s Dickie George. And yes, you’re gonna chuckle when you say it.

Deirdre: It was a very interesting perspective, from a perspective you don’t usually hear, which is inside the NSA, on what happened with Dual_EC_DRBG. And it sparked a lovely little debate. Justin, can you tee us up on what Dual_EC_DRBG is and what Dickie was talking about?

Thomas: I’m going to cut in here, right? So Justin and I are old friends. We’re both Chicago people, and me and David are friends. And for whatever reason, Justin and David both worked as security PMs on the Chrome browser.

David: Don’t slander Justin like that.

Thomas: I demoted Justin, apparently. Justin, what were you.

Justin: I was an engineer when I was.

Thomas: So I demoted you to PM.

Justin: One of the only engineers.

Thomas: But you both worked on Chrome, and David actively works on Chrome right now. And they needed to meet up at some point to share war stories. And they happened to meet up in a place where I was. And they were drinking. And at one point, Justin said something about how fed up he was with people talking about NSA backdoors and how quite sure he was that Dual EC probably wasn’t a backdoor, which is what lit this whole fuse.

Justin: I said it’s not a backdoor.

Thomas: I was giving you some room with.

David: The same vibe as Alex Jones explaining the Star Wars prequels.

Justin: Now, it is hard, after several drinks, to bring up thoughts about something that you haven’t even considered in, like, 10 years. And I have no idea why I went sideways on that one. But, yeah, I mean, if you know anything operationally about how, if you know anything about the structure of NSA, the notion that it’s a backdoor is, like, crackpot. Like, if you understand how that organization works. And if you look at the evidence in front of, like, all this stuff has been made public.

Thomas: And here we should make clear that Justin is not a cryptographer. Justin is an exploits guy. That’s Justin’s background, is the way I would put it.

Justin: Yes. I said I don’t know. I fully admit it. I’m like, crypto is not my thing. I’m not looking at it from that standpoint. I’m looking at it from the standpoint of, I’ve seen a lot of vulnerabilities that are like, wow, this totally looks like it would be a great backdoor, knowing that they are just, you know, vulnerabilities you find in the world. And in this case, knowing the context, I mean, I have.

I can tell a bunch of stories about vulnerabilities that absolutely look like they should be backdoors, but I know the full context of how they happened.

Matt: Yeah.

Justin: No, that’s not a back door.

Thomas: Yeah. And so, like, in a podcast, first for us, we have warring guests because our other guest is Matthew Green. And my feeling is that Matthew Green has some emphatic takes on this topic. Am I misrepresenting there, Matt?

Matt: I definitely have opinions. Yes.

Thomas: So if you were to try and, like, sum up your response to the short thing that Justin just said, just to tee this off, what would your response be? If you were on my couch in my living room, four drinks in, and Justin had just told you there’s no way, given what he knows about NSA, that Dual EC was a backdoor, what would your response have been?

Matt: I mean, if I was to go out of my way to build, like, you know, a case study of a backdoor, right, if I was to, like, you know, go somewhere, like an artist studio, and construct what a backdoor should look like, the Dual EC case is exactly what it would look like. There’s so much evidence around it, and it’s not just, like, one piece of evidence that’s the, you know, the smoking gun. It’s more like there are bullets strewn across the lawn and there are guns piled along the doorway and holes everywhere. And any one of those things, you could look the other way, but not all of them. It’s not possible.

Justin: And I would say that’s how I describe the assertion that it is a backdoor. There are so many holes. The thing that spun off the conversation online was I linked to the Dickie George post. And yes, I hold back every time I say anything about the talk. And I guess he gave that talk a few places. Can I say something about what he says about the NSA in that talk?

Matt: I’m very familiar with that talk. I have a quote from it in one of the presentations I give. He says, if anybody can prove to us, by generating their own parameters for Dual EC, not saying the NSA actually, you know, made a backdoor, but if they can prove that if they put their own parameters into that generator, it can actually be exploited, then I will buy them dinner. So we went off and we did that, and we wrote a paper, and it got into USENIX Security. And he has not bought us dinner.

Thomas: He owes a lot of people dinner, doesn’t he? It’s not, it’s not a particularly hard backdoor to demonstrate.

Matt: Not at all.

Justin: I did not see that version of the talk; the transcript that I have does not include that. But it is totally fair of you, as you did. I’m arguing some garbage. You’re talking about decades-old technology that was for a specific narrow use case, not the generalized use case. And I’m not saying that the notion of pushing it through as a NIST standard was a particularly good idea. Unfortunately, with the weirdness of the way government procurement and everything works, yeah, that’s how it happens that there’s this “backdoor.”

Justin: I mean, the only ones they backdoored were themselves, if you’re saying it’s a backdoor because it got into crypto libraries.

David: Let’s temporarily limit ourselves to, like, just the algorithm itself. Not NSA, not its usage. Just look at how the numbers are multiplied to generate random numbers.

Justin: Okay, this is where I have to tune out though, because I.

David: It’s safe to say that the algorithm is basically a keyed backdoor. Like, that’s just how it works. That doesn’t mean that it’s intended to be used that way, but the way in which the algorithm is constructed is basically multiplying two group elements together. And it turns out that if you can take the discrete log of where you started, you can figure out what’s coming next.
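
[Editor’s note: to make the structure David is describing concrete, here is a minimal toy sketch in Python. Everything below is invented for illustration, with a tiny curve and small numbers; the real generator runs over NIST curves like P-256 and truncates the output bits, but the keyed-backdoor shape is the same: whoever knows the discrete log d relating Q back to P can turn a single output into the generator’s next internal state.]

```python
# Toy sketch of the Dual EC DRBG structure (illustration only; the real
# generator uses NIST curves and truncates the output x-coordinate).
p = 17                            # tiny prime field
A, B = 2, 2                       # curve y^2 = x^3 + 2x + 2 over F_p
P = (5, 1)                        # base point, prime order n = 19
n = 19

def add(P1, P2):                  # textbook affine point addition
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None               # point at infinity
    if P1 == P2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, pt):                   # double-and-add scalar multiplication
    acc = None
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

# The designer picks Q = e*P and publishes P and Q as "random" constants.
# The backdoor key is d = e^(-1) mod n, which satisfies d*Q = P.
e = 7
d = pow(e, -1, n)
Q = mul(e, P)

def dual_ec_round(s):
    """One round: next state is x(s*P); output is x(s*Q) (no truncation here)."""
    return mul(s, P)[0], mul(s, Q)[0]

state = 5                         # the generator's secret internal state
next_state, output = dual_ec_round(state)

# The attacker sees only `output`: lift it back onto the curve and multiply
# by d, since d*(s*Q) = s*(d*Q) = s*P, whose x-coordinate is the next state.
y = next(y for y in range(p) if (y * y - (output**3 + A * output + B)) % p == 0)
assert mul(d, (output, y))[0] == next_state  # future output is now predictable
```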

Thomas: It almost feels similar to the conundrum people had early on in TLS 1.3 standardization, where prior to TLS 1.3, if you were a large financial organization and you wanted to do intercept on all the traffic coming in and out of your network, there was an easy way to do that, right? You would just escrow all the private keys, then you would use them to reconstruct the traffic. And TLS 1.3 cost you that capability. The major reason to have TLS 1.3, well, besides performance, is to have everything be forward secret, so you could no longer passively, you know, reconstruct TLS conversations just given recovered keys, right? So you can imagine, and this is, like, 1980s, 1990s cryptography, I guess this is ’99, the early 2000s, right? You can imagine an IT environment where they’re running something like a forward secret protocol across their entire network, and they themselves want to be able to reconstruct their own traffic. And one way you could do that, my understanding of the situation is, is you could standardize a random number generator whose output you could recover given, you know, a recovered key. And that would give you kind of a sane, I mean a stupid but sane, thing you could deploy enterprise-wide to get that capability.

Matt: And you know, Dan Brown, who was one of the authors of the standard, actually patented the idea of doing that, right? Like in the middle of the standardization process, somebody said, hey, this thing could be like a key escrow backdoor and patented the whole thing.

David: Well, that’ll stop anyone else from doing it because they hold the patent. It’s defensive.

Justin: Isn’t that kind of the argument that it wasn’t? Like, your argument here is that NSA built a backdoor into this thing in the most incompetent way possible?

David: Let’s back up again. So I think what we can all agree on, then, is that from a purely technical standpoint, the algorithm has a backdoor in it. If you know this parameter, you can figure out what’s coming next. That’s just an indisputable mathematical fact at this point.

Deirdre: Which has a vuln. It has a vuln.

David: Yeah.

Justin: I trust your expertise 100%.

David: That’s thing one. So it effectively has a keyed backdoor in the algorithm, and the key varies based on the parameter set. The question is, for the default parameter set, does anybody, i.e. NSA, have the key? And then, are people using non-default parameters where someone else has the key? And then, was this intentional by NSA, or was it just, like, an early use of elliptic curves? Dickie George in that same talk implies that this was used to secure a phone on Reagan’s desk in the 80s, which would have been very early for elliptic curves, kind of pre everybody else using elliptic curves. And he makes the claim that it was standardized so that they could have classified and unclassified material use the same phone, because the unclassified material at the time needed NIST standards.

Thomas: Hold on, he said that? So he was talking about the STU-III phones. He said that they had a prior experience trying to standardize a single set of algorithms so that you could have one phone, because the STU-III phones were not authorized for non-classified stuff. But that wasn’t Dual EC. That wasn’t why they did that. Right. It was the experience of going through that that made them say, okay, well, we need, like, Suite B: a set of cryptography that we trust, but that is also authorized for non-classified stuff.

Thomas: And then Dual EC kind of slipped into that. Right.

David: My read of it is they were using it for the STU phones, but that would imply that they had elliptic curves a bit earlier than anybody else. And I don’t know the timeline on elliptic curve cryptography.

Justin: That was my read of it as well. My read was not that it was. I mean, the problem was the talk had a whole lot of, you know, a whole lot of vagueness, because it’s the goddamn NSA and they won’t just, you know, come out and say it.

David: Also, Dickie George is, like, 70 years old.

Thomas: I feel like he talks about like the STU-III incident being in the 1980s.

Justin: Yeah, no, no, the STU-III was late 80s. The STU series, I think, was, like, the 70s or the 80s. You know, there was a STU-III on my desk.

Thomas: Yeah, I don’t know. If that detail shatters your worldview and now you agree with us about the backdoor, then this is not important. Then we’re fine, we can move on.

Justin: My take on this is just. So this is the NSA, where their entire reputation in the intelligence community is that they would rather have a capability die on the shelf unused than ever have it burned in public. Like, the worst thing, from the NSA perspective, is to get caught. And frankly, other intelligence agencies take offense at that. They’re like, what’s the point of having it if you don’t use it? And so the notion that they would use the public standards process this way. That they would take people from the IAD side of the house, who don’t work on offense, they work on defense, people who have had their names publicly attached to these things and actually did end up getting threatened and getting harassed after all this stuff happened, and that they would expose them to this. And in such a shockingly clumsy and awkward way, where the only places that saw any meaningful use were FIPS libraries.

Libraries for government systems. I mean, it just doesn’t make any sense. And then they sent Dickie George out there to be like, no, no, we did not do this. This is not a thing. They never confirm, they never deny. This is like the one time they ever denied. And it’s like, yeah, yeah.

Thomas: I mean, everyone has different smoking guns on this. And, you know, I remember the patent smoking gun being a really big deal for a bunch of people. And then for me, Juniper was probably the big deal. Even BSAFE was not a smoking gun for me.

Matt: Let’s start with the biggest deal. Let’s start with what we know for a fact. And that comes from the Snowden documents back in 2013, which is we have a document that defines a program called the Computer Network Operations SIGINT Enabling Project. And this is straight. It’s a top secret, SI, et cetera, NOFORN document. And in a top secret SI section, it basically says their goal is to insert vulnerabilities into commercial encryption systems, IT systems and so on, to allow the NSA to decrypt.

Justin: Yeah, I don’t know where you go with that one, because, all right, I haven’t seen the document. I intentionally, for reasons, mostly avoid looking at those documents. And I haven’t seen that one. And also, honestly, just the whole hack-and-leak thing, the way people jumped on that, and, like, the amount of damage.

David: Not a fan of Russian Patriot Edward Snowden???

Justin: I think going there just reflects poorly on other people.

Matt: I assume that there was a lot of damage, and I’m not going to dispute that, et cetera. But here’s the deal, right? This is a $250 million a year program. It has a very clear mission. And the mission is to tamper with commercial encryption systems. And specifically, let me read you the second thing it says: to influence policies, standards and specifications for commercial public key technologies. Yeah.

Justin: And I’m saying I don’t trust the authenticity of the document. This is why I’m not going near it. I know from the various hack-and-leaks that have happened, I know for a fact that some documents have been modified. Not the government ones; ones that were leaked from other places, where I have direct knowledge of the companies they were leaked from, and they were modified, et cetera. I don’t think that that is a safe place.

Matt: The Snowden documents. The Snowden documents are something different though. The Snowden documents—

Justin: No, there were. We had. There were people talking about PRISM as if it was, like, a backdoor into every company.

Thomas: There are separable issues here, right? Like, there’s the issue of whether disclosures did damage, and then there’s the issue of whether reporting and publication, like secondhand and thirdhand publication, propagated misinformation. And there’s a stronger case on the latter than on the former.

Matt: But not on this.

Thomas: I agree with you, but I want to jump in here and say this is interesting as long as it’s interesting, right? But there are so many other interesting smoking guns about Dual EC that, when you guys get bored of arguing about whether those things are real, we can just stipulate that, okay, the Snowden thing has no probative value, and then move on from that.

Justin: No, see, this is the thing. I’m not arguing that it doesn’t look exactly like a backdoor. Not from an operational perspective; from a technical perspective. I’m not dismissing that. I’m saying otherwise people wouldn’t have bought into it. You know, like the whole 15 Minute City conspiracy.

Justin: Right, right. It’s like, nobody’s arguing the facts. They’re just arguing that the interpretation of the facts is wrong.

Thomas: I would say that, a couple of steps into this conversation, we are going to get to a point where it’s going to be difficult to make the case that it wasn’t actually a backdoor, given other things that you are allowed to hear about and that are disclosable in this conversation with you. I think we’re going to get to a place where it’s going to be pretty difficult to argue that it was not, in practice, an actual, in-use backdoor.

Deirdre: I’m going to throw something out there, which is that, given the context of Dickie’s presentation, it was not intended to be a backdoor. It had a vuln. It was used by China, who swapped in their own parameters, to either mitigate the vulnerability or to insert their own. And a lot of this can be chalked up to bureaucracy gonna bureaucracy. And, you know, APT5 took advantage of it. And only in the context of Dickie’s talk have I updated my analysis.

Justin: That is my argument.

Matt: Thank you. I mean, we can get into that, but there is actually enough information from the NIST investigation that there is no way you can look at the NIST standards process and say that the NSA did not deliberately push hard to have these backdoored parameters in this standard. They were asked by people, why are you using these parameters? And the response was, the NSA has specifically asked us not to talk about that and not to talk about other ways to generate the parameters.

Justin: No, but that actually. No, no, that’s complete. Okay, you’re interpreting that exactly the opposite of how I would interpret it, because that is the way the NSA responds on everything. It was the same on the frickin’ S-boxes. They always push the never-explain. Okay, Tom.

David: Never say anything. And I say point of order.

Thomas: Point of order. The person who ran the Technical Directorate that did this got up on stage at Infiltrate, and he has an explanation for this. It wasn’t “I can’t comment on it.” His explanation was that the NSA operates the world’s largest printing press. I don’t know why it needs to be a big printing press for this, but it is, all right. And it generates a constant stream of random numbers. And they’re the same random numbers that are used for, like, nuclear command and control. That’s what he said.

And whenever they need randomness for anything, they just go take randomness from the world’s largest printing press of random numbers. And that’s where the P and Q points on that curve came from. And that is why they can’t reconstruct them, according to him. And then he further says that when they standardized this. So he asked NIST to standardize.

Matt: But that’s not true.

Thomas: I believe you. I’m just, I’m just, I’m just giving his explanation.

Justin: But what reason does he have for going up there and lying? That is my question. Why would the NSA? Everything else.

Matt: I don’t think he’s lying. I honestly don’t think he knew. I don’t think he was read in. I don’t think of him as a liar.

Justin: Wait, you’re saying the technical director of IAD, one of the top-ranking people at NSA, responsible for all of this?

Matt: Whistleblower, I think.

Thomas: Point of order. Justin.

Matt: Yeah.

Thomas: Dickie George gets up on stage at Infiltrate and says out loud to the crowd there that he does not know what the fuck is going on with these points. He said he didn’t know where they came from. He said he didn’t know why people wanted them. He said that he hated the standard. He didn’t know why the NSA was using the standard. He said he didn’t know a whole bunch of things.

Matt: He.

Justin: No, he didn’t say he didn’t know why NSA was using it. That’s not true. He said he didn’t know why anyone else would use it.

Thomas: He.

Justin: He knows why NSA is using it, because it’s in ancient devices where it’s actually built into hardware, shipped fricking all over the globe. That’s why NSA is using it.

David: So there’s, like. He makes a comment when he first brings it up about how they were using these, like, hardware diodes for randomness, but then they stopped working for randomness because the diodes started working correctly, instead of giving off a bunch of noise. So they’re like, we need a random number generator, and we need it to be, like, verifiably random. And, like, one really stupid definition of verifiably random is: you build a backdoor into your random number generator, then you hide the backdoor from the implementers, then you define a parameter set, you see what pops out, then you go take your backdoor and you check all of them, and you’re like, good job. You did it. Right?

Thomas: I just want everybody. You guys can’t see David on video right now, but he is a pig in shit right now.

Matt: Yeah, so here’s the thing. The Q point is a public key. We all agree with this. We did want to talk about the math of this. There are two points hard-coded into the standard. One of them is P, and that’s the same generator point that’s in all of the NIST curves. There’s nothing special about that.

Matt: The Q point is a public key. It’s a point on the elliptic curve that is x times P, where P is the original point. Now, if you’re using the standard process to generate public keys, whatever that is at NSA, do you believe there is a separate process that does not also generate the secret key along with the public key? Because I don’t believe there is. I think if you’re making public keys, you’re making secret keys and public keys. And I think that if they did generate Q in the standard way that you generate a public key, they have the secret key right next to it inside of some computer. And you would have to convince me differently.
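
[Editor’s note: “the standard way” Matt is alluding to is just textbook elliptic curve key generation, where the secret scalar and the public point come out of the same step. A minimal sketch, reusing the toy curve and `mul` helper from the earlier snippet; names are illustrative.]

```python
import secrets

def ec_keygen():
    # Textbook EC key generation (toy curve from the earlier sketch): the
    # secret scalar d and the public point Q = d*P are created together.
    d = secrets.randbelow(n - 1) + 1   # uniform in [1, n-1]
    Q = mul(d, P)
    return d, Q

# If Q was produced this way, a d satisfying Q = d*P existed the moment Q did.
d, Q = ec_keygen()
```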

David: Deirdre has this very like scientific, I’m thinking with my hands on my chin face right now.

Justin: You remember when I said I wasn’t here for the technical stuff, for the technical crypto stuff? I’m here for all sorts of other technical stuff.

David: Just, just for background. Can you say some of your former employers besides Chrome?

Justin: Yes, I did spend close to a decade in the intelligence community, and I did work at NSA, among others.

David: He’s a spook.

Matt: Did you generate the Q point?

Justin: No. Wait, so Dickie George, what he said is, like, no, no, we’re just spitting them out of the thing. And it’s like, I have worked with that group. They have their weird things. Like, I know you’ve seen this before in code, where you look at code, or you look at mechanisms, whatever, and you see the way they’ve evolved, and it is this giant kludgy mess. And you’re like, why would anyone ever build this? And the answer is, no one would ever build it this way if they started from scratch.

Justin: It’s just layered on layer on layer. And it’s very clear to me from that talk that they had criteria. This random number generator had to fit these criteria. And there’s no way in the world they’re ever going to tell you what those criteria are, because the criteria themselves are obviously something they consider very sensitive, whatever. And in his mind, he’s like, if you knew the criteria, because he does know the criteria, if you knew the criteria, you’d be like, oh, well, duh, it’s not a backdoor. But you don’t know the criteria.

Deirdre: But yeah, I still.

Justin: The thing that drives me insane is we’re not even talking about how boneheaded this is operationally. Now, you were like, they’ve got a $250 million budget, et cetera. And yeah, but they’re the Keystone Cops. Suddenly, in this one instance, this one time, they’re the complete Keystone Cops. And not only do they make themselves look horrible, they also expose the actual names and histories of a whole bunch of people and put them at actual risk. This is my point, where it’s like, this does not connect.

Matt: But the thing is, the Snowden documents were the Keystone Cops, right? The Snowden documents should never have happened to an organization that had their crap together.

Justin: No.

Matt: And honestly, they would have gotten away with it if it wasn’t for, you know, that meddling kid Ed Snowden.

Deirdre: That meddling Snowden not only stole.

Justin: Snowden stole people’s certs. He was a help desk tech who was actually stealing people’s credentials and using them to access their information.

Thomas: But, like, I’m really sympathetic to the argument that you just made, because prior to the run-up to BULLRUN, for me, probably prior to Juniper, I will never live down the fact that I was all over Hacker News saying: this is an idiotic random number generator, no one will ever use this thing, so if this is a backdoor, this is the stupidest backdoor anybody’s ever come up with.

Matt: And yet we were all wrong.

Justin: No, you believe you were wrong. And I understand this is a strongly held belief, but we all know it doesn’t make any sense operationally. It is nonsensical. It is peanut butter voodoo. Looks crazy.

Matt: Crazy that it makes sense.

Deirdre: This is the layers of detritus, and also why cryptographic standards are so important, because NSA had strongly held requirements and they wanted to get it into the standard so they could, like, interoperate with other stuff. Right?

Justin: That’s a fair criticism.

Deirdre: It’s just for them.

Justin: This is a fair criticism.

Deirdre: No one would ever use it because it’s so crap, right? But because it’s a FIPS-compatible random number generator, aha, we can use this for X, Y and Z. And then someone decided, we need this in a very common commercial library that serves, you know, the needs of FIPS compatibility, RSA BSAFE, and that’s how it gets shipped. And then that is the target of APT5 or whatever, Juniper, so-and-so, whatever the major target was that had all these, you know, government records protected by Juniper. This is why cryptographic standards matter. And putting something in there that’s so shit no one would ever use it is not a good reason to put it in a cryptographic standard.

Thomas: Now, now Deirdre is speaking on behalf of the cryptographic standards enterprise, which not all of us on this podcast agree with.

David: Deirdre, could you quickly say your title at work?

Deirdre: Oh, standardization research engineer.

Justin: Deirdre, I completely agree with you that laundering their own tech debt through a standard is a terrible thing to do. And that is not excusable.

Thomas: So, like, we were talking a little bit about this. One issue I have with attempts to describe Dual EC and talk through why it’s a backdoor is that we tend to go right to, you know, there’s this P and there’s this Q, and if you have the secret d, then you can relate the P to the Q and break it. But the way I looked at Dual EC, when I was very wrong about it, was: it’s a random number generator that uses bignum math, and no one would ever do that, that’s very dumb, right? I’m just ruling it out. I didn’t even think about what the structure of it was.

Right. It was only when Dual EC became very salient after the Snowden disclosures that you actually take a second and look at it. Right. And the structure that Matt is describing, that base point P, public key Q. Right. This is just basic, textbook elliptic curve DL. Right. It’s like if I had published a standard.

Justin, I know you’re not a cryptographer, but I can get you through this. Right. If I had published a standard for a random number generator where I used RSA, an RSA key, and the output of my random number generator was just, encrypt this value with this public key and send it off into the world, you would get why there’s no other reason to do that.

Justin: Oh, no. I, I do not know what these secret criteria are.

Matt: Right.

Justin: And this is where we go back to, like I said, operationally it doesn’t make sense to me, and, like, people who I trust to tell me the truth. And I have no reason, just given the insanity of the technical director of IAD going out and saying, okay, we never talk about anything, but this time we’re telling you: no, this was not a backdoor, we didn’t do it. When they say, we have secret criteria, we’re not going to tell you those criteria, but we have some secret criteria.

Justin: Those criteria did not involve it being a backdoor.

Thomas: This is like one of those New York street shell games. We have to watch really carefully what’s happening here, because we’re talking about the secret criteria for these constants. And I think the underlying point that your opposition in this conversation is making is: sure, P is random, and there’s some secret criteria for the randomness that is in P. And there’s another value besides P that is also random and subject to all that criteria, but that other random value is not Q. The other value that is random is d, the scalar multiple of the base point that gets you Q.

David: Right.

Thomas: Q is not random. Q is the result of an elliptic curve multiplication by the secret d parameter. And if you’re not a cryptographer, it sounds like that’s a lot of crypto stuff, but I’m really dumb and I could follow that one, right?

David: Yeah. It’s literally the equivalent of a random number generator where you have a counter and you encrypt it to an RSA public key and take the output every time.

Matt: Yeah, you’re encrypting the seed and you’re writing the seed out every time.
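
[Editor’s note: as a toy rendering of that analogy, here is a sketch of an “escrowed” RNG whose output is just its internal counter encrypted to a fixed public key. Textbook RSA with tiny, insecure numbers, purely for illustration; whoever holds the matching private key decrypts any single output and recovers the state.]

```python
# Toy "escrowed" RNG: output = RSA-encrypt(public key, internal counter).
N, E = 3233, 17          # toy RSA public key (p = 61, q = 53)
D = 2753                 # matching private key: the escrow key

state = 42               # the generator's internal counter

def next_output():
    global state
    state += 1
    return pow(state, E, N)   # "random" output = Enc(pub, state)

outs = [next_output() for _ in range(3)]

# The escrow holder decrypts any one output and recovers the internal state,
# which makes every subsequent output predictable.
recovered = pow(outs[0], D, N)
assert recovered == 43
```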

Justin: I understand that. Yes, it looks like a backdoor, and it looks like an escrow key. My full thing that I am holding to here is the operational aspect, the total Keystone Cops aspect of it, because I don’t think you can point to anything operationally that would have ever been such a train wreck. And the notion that they intentionally exposed their own people and jeopardized their own people like that. That’s just. No. This is where I go back to: if you know anything about how they work operationally, none of this makes any sense and none of this sounds believable.

Matt: Let’s look at the real world, the world that we actually live in. Right? All of those people.

Justin: That’s what I’m doing. I feel like you’re theorizing and I’m telling you what the world is.

Matt: No, what I’m saying is, all those people who could have been jeopardized by a deliberate backdoor. Let’s hypothesize that it was all a terrible coincidence. Right? Like, they didn’t try to rob the bank. They were just standing near the bank. And, you know, the broken windows were just an accident, and whatever, the hole in the vault was just, I don’t know. Let’s.

Matt: Let’s hypothesize they’re totally innocent. All the people that would have been damaged by a real backdoor attempt are now compromised by it. All the reputational damage they could have, you know, been subject to by a real backdoor.

Justin: Do we know. But the argument presumes that there is a back door. Like, let’s assume. No, no, no.

Matt: Even NIST had to write, in front of the entire, you know, Congress, a report saying, we will never trust the NSA to standardize anything ever again. We do not trust them. We will send it out and make sure it gets reviewed by other people.

Justin: I feel like you’re making my point. For me, I’m confused.

Matt: The point is that the way this was done, whether it was a deliberate backdoor or just ineptitude of, like, the first order, led to all these bad outcomes, where people’s reputations got trashed, the NSA got trashed, the whole world ends up thinking the worst. And not just that, but when you factor in this Juniper episode, which we’ll talk about in a bit, actual US systems got compromised and real Americans got hurt.

Thomas: Yeah, I think Justin’s point there would be that all the bad things that you’re talking about are real, but also that NSA could trivially have predicted those things back in 2005 or whatever, and that, like, he would say, this is a reason why this isn’t what happened. Right. Because they knew that if they had done that, NIST would never take another standard from them again.

Matt: Here’s my point, right? They could have designed this generator so that there was no suspicion of a back door, and they chose not to.

Deirdre: I think, to both Justin and Matt: yes, incredible reputational damage. They’re not trusted to ever actually publicly advise on standards like this again, and the distrust is basically perpetuated forever. I think, to Justin’s point, they were so just stuck in their own, like, foxhole. Not pejorative, but they had their own things, and they weren’t looking over there, and they weren’t thinking that far, and there are just lots of layers of bureaucracy, and these people are looking at their knitting and those people are looking at their knitting, and then it just kind of leaks out, and then someone uses it, and so on and so on.

I think that is perfectly doable. And it sucks, too, that they are so bad at their broader mission because they’re so narrowly focused on their particular tiny mission that is backwards compatibility.

Justin: Can I point out that half of the NSA is IAD? Half of it is just supposed to be defense.

Matt: IAD is gone. They got rid of it.

Justin: Well, no, they’ve changed names and stuff like that, but half of the mission here was defense. I know they’ve restructured and stuff like that, but they still have the defense mission. Right? I mean, essentially you’re saying that one side of the house somehow tricked the other side of the house into doing something, and confessed.

Thomas: You’re what?

Justin: No, no, you’re taking.

Thomas: I’m watching Matt Green’s soul slowly leaving his body.

Justin: You’re responsible for.

Matt: Justin. Justin, I know that we started this conversation with the idea that the Snowden documents are all forged and you can’t read them and therefore they don’t exist. But let’s be real. I’m not trying to be rude. Okay. We’re having a conversation about a crime that happened.

Justin: No, we have.

Matt: I’m using a metaphor. I’m using a metaphor. We are having a conversation about a metaphorical crime, and we’re debating whether it was an intentional crime or just. Do you agree with that metaphor? We’re debating whether it was an intentional crime or just a very bad set of coincidences.

Justin: Negligence doesn’t have to be a crime, right? I’m not denying negligence.

David: I think legally negligence is a crime.

Matt: But yeah, we’re debating about whether it was an intentional criminal act or somebody was just careless and it was negligence. That’s the debate we’re having, right? Yes, that’s the debate. But here’s the thing. We have a document from the accused criminal saying: by the way, for the next several years we have a program, and I will read the program description again, that says we are going to spend $250 million a year sabotaging commercial encryption systems to make it so that we can decrypt. And keep in mind, this is after 9/11.

Justin: Matt, do people from entirely different parts of the university, like totally laterally different parts of the university, come and tell you, hey, you have to do this? You have a document. You don’t understand the context. You’re not certain about it. But even based on what you know, it’s not the same people.

Justin: Let me read the organization.

Matt: The SIGINT Enabling Project actively engages the US and foreign IT industries to covertly influence and/or overtly leverage their commercial products’ designs. These design changes make the systems in question exploitable through SIGINT collection, endpoint, midpoint, et cetera, with foreknowledge of the modification. To the consumer and other adversaries, however, the systems’ security remains intact.

Justin: Seriously, I have intentionally avoided this, so I couldn’t hear what Matt was saying there. I’m not touching those documents. It is too risky for me to go near them or have someone read them to me. My only point is, NSA is a massive organization. I don’t know the authenticity of those documents, but I sincerely doubt they came from IAD.

Justin: I cannot imagine that they came from the side of the house that you’re blaming.

Matt: But they specifically talk about.

Thomas: That does seem like a reasonable point.

Deirdre: No, but even if they did, and if they succeeded, it is also possible that they came from different sides of the house, and different parts may have been infiltrating commercial encryption.

Matt: Sure.

Deirdre: And/or standards. But it is possible that those things can be true, and Dual EC and its parameters are totally just a fuckup, or a perfectly secure thing in the setting it was designed to be used in.

Matt: Yeah.

Deirdre: While someone else on the other side of the house is also doing all these other shenanigans.

Matt: But let me articulate another possibility. Right. Another possibility is that a small group within the NSA designed a random number generator, told the rest of the organization, we can’t tell you our very secret design principles and our requirements, just trust us, you have to use this. And then a whole bunch of very good, innocent people trusted them and took it out to the world. And I think that’s a very reasonable hypothesis.

Justin: That’s not the way IAD works. They have SID do their homework for them. Right.

Matt: It’s kind of what you’re saying though. Right? Look.

Justin: What?

Matt: No, I’m saying that there were.

Justin: No, no, no, no way that way.

Deirdre: Whoa, whoa.

Justin: We can’t. No, what I am saying is there is no way that IAD intentionally backdoored this.

Matt: I’m perfect—

Justin: You’re trying to sort of make counterarguments for how and why. But I’m saying, look, I still don’t understand how, operationally, you thought this was ever going to work.

Thomas: Like, it did work. It worked for.

Justin: No, no, for someone else, when they hacked in and just swapped out the code. I mean, you’re already in there swapping out the code anyway. You could have put anything you wanted in there.

Matt: It got inserted into the most popular commercial encryption library starting in about 2005, and it continued to be in there until 2013. And it was included in printers. There are printers out there that have BSAFE in them.

Deirdre: So this is kind of circling over to Juniper and these attacks. So, contextually: after Dual EC became a FIPS-compliant random number generator, RSA, the company, their encryption library BSAFE, which is very popular amongst a lot of FIPS-compliant customers, supported Dual EC, which is not very performant but is FIPS-compatible, and it was part of the Juniper system. Does someone have more background than mine on what to describe about that attack?

Matt: Yeah. So the RSA BSAFE library is one product, one library, and it was used in a bunch of products, but not in Juniper. Juniper did their own implementation. Juniper did the strangest thing ever, where in about 2008 they said, we’re going to put Dual EC into all of our firewalls. And this is NetScreen at the time. So they put it into all of their firewalls, so all VPN connections were using it, but they didn’t publicize it.

What they did is they put in another random number generator that would post-process the output of Dual EC. And Justin, you’re an exploit person, so I want to describe an exploit. So you have random number generator one, which might be vulnerable, and random number generator two, which runs on the same buffer and should overwrite and remove any vulnerability, because of the way it works. But the for loop that is inside of random number generator two, instead of saying, for local variable i equals 0 to 32, uses a global variable. It says, for global variable i equals 0 to 32. And somewhere, in a subroutine call, that global variable gets set so that the for loop never runs. So if you saw that kind of vulnerability, if you knew that was the difference between a completely exploitable system and a secure system, would you look at that bug and sort of think, wow, somebody did that on purpose? That’s a very strange bug.
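
[Editor’s note: a Python rendering of the bug shape Matt is describing; the real NetScreen code was C, and all names and details below are invented for illustration. The “protective” post-processing loop reuses a shared global index that a prior routine leaves at its end value, so the overwrite never executes and the raw Dual EC output goes out as-is.]

```python
import secrets

BLOCK = 32
index = 0                        # shared global loop index: the fatal detail

def dual_ec_byte():
    # Stand-in for one byte of Dual EC output (predictable to the backdoor
    # holder in reality; random here just to make the sketch runnable).
    return secrets.randbits(8)

def fill_from_dual_ec(buf):
    global index
    for index in range(BLOCK):   # leaves the shared global at BLOCK - 1 ...
        buf[index] = dual_ec_byte()
    index += 1                   # ... and a later touch lands it on BLOCK

def ansi_x931_postprocess(buf):
    # Intended to overwrite buf, destroying any structure Dual EC left
    # behind. Because it reuses the shared global `index`, which is already
    # BLOCK, the loop body never runs.
    global index
    while index < BLOCK:
        buf[index] ^= secrets.randbits(8)   # stand-in for X9.31 output
        index += 1

buf = bytearray(BLOCK)
fill_from_dual_ec(buf)
raw = bytes(buf)
ansi_x931_postprocess(buf)
assert bytes(buf) == raw         # the "protective" generator changed nothing
```

The exact mechanism in the real code differs, but the effect Matt describes is the same: a one-line, entirely deniable slip decides whether the backdoored output ever reaches the wire.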

Justin: Matt, if I did that, I would be saying that about like half the bugs I looked at. Like, I mean, that’s.

Thomas: No, actually, every time I talk, I’m just going to reiterate that I believe that Dual EC is a backdoor, but I’m more on Justin’s side of that one.

Matt: Thomas, would you put two random number generators into a firewall product?

Deirdre: No, but by accident and by negligence.

Thomas: I would also only goto fail once. I would not goto fail twice. Right. And then people are like, there’s a whole conspiracy.

David: The qualifier here isn’t the for loop, it’s the firewall product.

Thomas: Right?

David: Like, that’s the qualifier on the sentence is firewall product.

Matt: Yeah, but would you put an undocumented random number generator in a FIPS-certified crypto product? Knowing that you have to disclose all of your algorithms, would you put in one certified random number generator and then backstop it by having it also use Dual EC?

Justin: Oh, yeah. None of this is surprising.

Thomas: So this is, to me, probably the most interesting conversation that’s left to have about Dual EC. So you said, like, a minute ago. First of all, I have weird takes about BSAFE as well. Right. But like you said, because of BSAFE, Dual EC’s in a whole bunch of printers. Right. And your first thought there is, well, that’s actually a pretty slick attack. Right.

Thomas: Because printers are the file servers for the most important files.

David: Right.

Matt: I don’t.

Thomas: I don’t think the downside of that is that nothing goes to a printer encrypted, especially in the DoD.

Matt: I don’t think the printers were the target. Right. My point is just to illustrate that it was not government-only. I don’t think they were government printers. Right. Like, the allegation throughout this entire thing is, oh, it’s just some government random number generator, and FIPS is just some library used in government products. Do you think HP printers only use BSAFE in the government printers?

Justin: I didn’t say only. I said they primarily backdoored themselves. I didn’t say no one else. I said primarily, because this is mostly used by the federal government.

Thomas: When you’re talking about the Juniper construction there, right? Like, that’s not the NIST standard, and that’s not the BSAFE library either. That’s Juniper, or really NetScreen, engineer code in that situation.

Matt: It is, but it’s weird. It’s a weird decision to use that generator. Like, can you think of a reason that anyone would use that generator and then not get it certified?

David: What was the other generator? Because the thing that I could see is that you’re using Dual EC for FIPS compatibility, and then you shove another one in there because you heard it was bad and then you totally screwed that up.

Matt: No, they did not. They never documented in any FIPS documents that Dual EC even existed. They only certified and described the ANSI generator that was there as the second generator, which, by the way, didn’t run.

Justin: Yep.

Matt: So why would you, in a product where you’re not certifying Dual EC or even telling anyone it’s there, why would you include Dual EC?

Deirdre: Oh, you mean Juniper never documented that Dual EC was in there?

Matt: Not until after the Snowden documents, when they did a code review and discovered it and said, whoops, it’s there, but we’re not taking it out.

Deirdre: Fascinating.

Justin: I mean. I mean, the sloppiest code I’ve ever looked at was firewall code.

Matt: You don’t. You don’t.

Justin: I mean, like, the worst. It’s like, just pile more and more layers of stuff on it. So, I mean, does it surprise me? No.

Matt: Do you include a second random number generator and the infrastructure, like, including flags to turn it off? Like, that’s nuts.

Justin: I thought firewalls essentially jumped the shark when they started going up, you know, to, like, layer three or whatever it was. For me, application inspection, all that seemed crazy. So am I surprised?

Matt: First, we have a generator that’s exploitable. Second, we have the generator appearing in code where it doesn’t need to exist, because it’s not a FIPS-certified generator. And then third, the second generator that’s supposed to protect it doesn’t work. And my view is that one coincidence is pretty bad, two is starting to feel ugly, and three is just. Come on, let’s not kid ourselves.

Justin: Except you don’t have any. Like, your core thing is that they did this as a backdoor.

Thomas: I don’t think NSA did it as a backdoor. I think NSA did it as an internal key escrow thing. I think the problem that we have here is that there are three NSAs. Right? And, like, you’re saying, “I didn’t do this as a backdoor,” to use the parlance of the time. Right. I didn’t do it.

I mean, I agree with you. It’s not.

Justin: They sent the IAD director out there to tell everyone: no, we made this, and it was not a backdoor.

Thomas: But, like, Dickie George was like, oh, we made this thing. Right. Like, the IAD came up with this. Don’t check me on the group names. Right. Clyde Frog.

Justin: Oh, no, no. It’s more than that. SID doesn’t tell IAD what to do.

Thomas: They didn’t have to. IAD made it for legitimate reasons. IAD came up with a key escrow random number generator for wholly defensible and legitimate reasons. Right. All SID had to do was notice that they did that and then take advantage of it.

Matt: Yep.

Justin: All right, but then your argument is that they didn’t backdoor the standard, because the standards work all comes out of IAD.

David: Are we just having a debate on the semantics of the word backdoor?

Matt: Yeah, somebody backdoored the standard. We’re debating about who it was.

Thomas: I think the step past that you want to go, in terms of intentionality, is that the impetus to create it as a NIST standard was primarily there to subvert cryptography. Right. That, you know, whoever told Dickie George to go talk to NIST and convince them to put this in the standard, whoever told Dickie George to do that, their goal was to recreate key escrow in 2005.

Matt: Yeah.

Justin: I don’t subscribe to that in the slightest. No, in fact, that’s what I’m arguing against.

Thomas: I can see that we disagree on this point. I’m just saying that that’s the point. Right.

Justin: Okay.

Deirdre: Yeah.

Thomas: I’m still hung up on a Juniper point here. Right?

Deirdre: Yeah, yeah, yeah.

Thomas: What I want to hear from Matt is, like. To me, first of all, Juniper is the biggest smoking gun. Right. But that second random number generator thingy that we’re talking about there: Juniper, not Juniper, but NetScreen, had to go build that. Right. Why did NetScreen build that?

Matt: You got me?

Thomas: NSA didn’t build it.

Matt: No, they didn’t. But here’s the thing. Once you’ve built a standard, all you need is one engineer. You need one engineer who is willing to work for you and put a piece of code, for entirely defensible reasons, into a product. And if anybody asks, why are you putting that standard into a product? You say, how could it possibly hurt anyone? It’s a NIST-certified government standard.

Thomas: Yeah, I agree with all that stuff. But we’re still talking past each other, right? At some point, that global variable and the broken for loop had to be introduced into NetScreen’s code.

Matt: Sure.

Thomas: How did that come about?

Matt: Someone worked there and, obviously, used a, you know, lame name collision with a global variable. And you can say it’s an accident, it’s entirely defensible, and if somebody had found it, they would have fixed it and said, whoopsie daisy. But nobody found it. And as a result, there was an exploitable vulnerability in every NetScreen firewall from 2008 to 2015.

Thomas: Interesting. Okay, so you’re saying that the original NetScreen backdoor was just that they used Dual EC, and then after that.

Matt: They used Dual EC and then disabled the second random number generator, so it didn’t run, so it didn’t overwrite.

Thomas: So it’s just, like, a stroke of good luck for NSA TAO that, oh no, there’s no overwrite.

Matt: There’s no luck in that. This stuff is not.

Thomas: Hold on. In your story, it has to be luck, right? Because a NetScreen engineer had to be enough of a doofus to use a global variable in a name collision like that.

Matt: That’s not.

Thomas: Otherwise they run the ANSI generator and break the backdoor.

Matt: Here’s the thing. So let’s say they found the vulnerability. What would Juniper have done? Would they have panicked and told all their clients there was a serious vulnerability, and they had an emergency and they had to do an emergency patch?

Thomas: That’s what they do routinely. Yes.

Matt: Or would they have said, hey, whoops, we used one NIST-certified generator instead of the other NIST-certified generator. There is no vulnerability. Don’t worry about it.

Thomas: Right. No, I get that point. I’m with you on that. Right. But, like, still, NSA’s access to those firewalls, those NetScreen VPN terminators, right, was predicated on that stupid bug being there. If the bug hadn’t been there, it wouldn’t have worked.

Matt: But the bug is easy to add.

Thomas: You lose me there, right? No, no, I see that there’s an innocent explanation for the bug. I’m with you on that. Right. I get that you could add it. But you wouldn’t bet on that.

Matt: Here’s what I’m saying. There is no innocent explanation for Dual EC being there. And the minute Dual EC is there, the question you’re going to ask yourself is, huh, that’s very strange that Dual EC is there. There are two hypotheses here. One is that Dual EC is there because somebody loves Dual EC, and the other is because somebody’s putting a backdoor in this. And then the next question you’d ask is, let’s assume the second. Let’s assume it’s a backdoor. There’s no way that backdoor could be exploitable, because a second random number generator would cleanse the output, and that second random number generator would break the backdoor.

Matt: And immediately you would think, the only way that second hypothesis could be true is if somehow, due to a very silly error, the second random number generator never runs. And when you look at the code, surprise, surprise, that’s the case.

Thomas: Right. It sounds like you’re making a case that NSA had some agency in getting the second random number generator broken.

Matt: I think that NSA had one person who worked there who did them a favor. I really do. I can’t prove it. I’ll never be able to prove it, but it looks too deliberate. You know, it looks much too deliberate.

Justin: Yeah, I do feel like “I’ll never be able to prove it” applies to this whole conversation. At least I’m not denying how busted it is. I don’t know the history or reasons for that. I mean, basically: did you do it? That’s the secret criteria, right?

Matt: It’s post-9/11. It’s post-9/11.

Justin: “It’s post-9/11.” What does that mean?

Matt: It means. Okay. You’re as old as me. And maybe, actually, some of you are blessed to be a little younger.

Matt: But you remember how crazy things were after 9/11.

Justin: These people weren’t even born when the STU-III shipped. Right? And this is, as far as we know, the cryptography that we are talking about.

Thomas: Right? And that’s just.

Matt: David, it’s post-9/11. It’s post-9/11. The war on terror is going on. It feels like an existential thing. It really did, for a few years. And this is the exact time period we’re talking about. You’ve got terrorist organizations, you’ve got people like Hamas and Hezbollah, et cetera, all over the world, who are communicating with banks and so on, moving sums of money.

Matt: Banks are using firewalls. Commercial products. And you’re thinking to yourself: literally, the difference between an American city going up in a mushroom cloud and not might be my ability to access that firewall traffic I’m intercepting over that cable.

Justin: But this is my view.

Matt: You think they’re not going to do this stuff?

Justin: You’re telling a story. I don’t want a story.

Matt: I have a document. I have a document that says.

Justin: You have a document from an entirely different organization, a totally separate part of the agency. This is the thing.

Matt: It’s $250 million a year. In NSA terms, is $250 million a year a small program or a large program?

Justin: It’s a medium. You’re talking about. This is the thing. You are looking at something from the outside that you know nothing about. You’ve already made your decision on the intent. Like, you go in knowing the intent, and you’re putting a story together to fit that.

Matt: No, I mean, I’m looking at. For one thing, there is a series of documents that says they specifically altered this specific standard. The documents don’t just say, we have a program to alter standards. They say this one, with Dual EC in it, on a top secret line.

Thomas: Just because I’m nerding out on this one specific point: is $250 million a big program or not? What military vehicle do we pay for that costs just $250 million?

Matt: Is it small? It doesn’t have to be.

Thomas: It might be. I don’t know. Is it?

Matt: I don’t know. Is it, like, two guys in a room with no influence?

Deirdre: Isn’t an F-35 like $100 million or $200 million or something like that? But that.

Thomas: Anyway, it’s a real program at NSA.

Matt: Could you miss it?

Thomas: Yeah.

Matt: Could it happen accidentally, and you’d be like, whoops, you didn’t even know that was happening?

Thomas: It’s not a side hustle. I have one other big thing I want to hit before we let this spin out of control again, which is: just in the grand scheme of things, if you’re coming at this like, you know, a TAO operator, right, or, you know, any CNE person, a serious exploits or vulnerability person or whatever, if you work for Azimuth, that kind of thing. Is this a good backdoor? Is this effective? Is this giving them a capability that is really, really important in the world, compared to what they had before? And I know that you’re going to say yes, right? And I believe it’s a backdoor, right?

Deirdre: But I feel like it’s fair to ask that question, because at least one independent operator went and switched out those P’s and Q’s for some reason. And they targeted this for some reason, as opposed to, I don’t know, shimming in some other infiltrating code.

Justin: Well, they targeted it after the issues with it had been disclosed, right? So you’re talking about exploiting an N-day vulnerability, right? At that point you’re exploiting an N-day vulnerability.

David: The issues with it were disclosed before it was standardized.

Thomas: Yes. I mean, the exploit for it is pretty obvious.

David: It was a ’99 paper and a 2000 standardization.

Thomas: But it’s also like, there’s an elegance to the idea that it’s a NOBUS backdoor, just meaning, you know, only NSA can actually exploit it. But it’s kind of a weird border area in terms of nobody-but-us capability, because it’s a random number generator: there’s no way to verify that you’re running the right points anyway. So other people can, in fact, swap the points out. If you swap the points out... I agree, it’s a NOBUS backdoor, right? It’s just...

Deirdre: Looking at the numbers coming out.

Matt: Yeah, it’s true.
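
To make the mechanism being described concrete, here is a minimal sketch of the Dual EC structure and the planted point relationship. It is illustrative only, not the real construction: actual Dual_EC_DRBG runs on NIST P-256 with the standardized points and truncates 16 bits from every output, while this sketch borrows the secp256k1 curve (only because its parameters are compact to embed), skips truncation, and uses made-up values for the secret scalar `e` and the seed.

```python
# Minimal Dual EC sketch with a planted point-relationship backdoor.
# Illustrative only: real Dual_EC_DRBG uses NIST P-256 and truncated outputs.
# Requires Python 3.8+ for pow(x, -1, p) modular inverses.
p  = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEFFFFFC2F
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8
# Curve: y^2 = x^3 + 7 over GF(p) (secp256k1); None is the point at infinity.

def add(A, B):  # affine point addition
    if A is None: return B
    if B is None: return A
    (x1, y1), (x2, y2) = A, B
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if A == B:
        m = 3 * x1 * x1 * pow(2 * y1, -1, p) % p
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def mul(k, A):  # double-and-add scalar multiplication
    R = None
    while k:
        if k & 1:
            R = add(R, A)
        A = add(A, A)
        k >>= 1
    return R

def lift_x(x):  # recover a point with this x-coordinate (works since p % 4 == 3)
    return (x, pow((x ** 3 + 7) % p, (p + 1) // 4, p))

# The two public points. The backdoor is the hidden relation P = e*Q:
# whoever generated the points can keep e; nobody else can compute it.
Q = (Gx, Gy)
e = 0xC0FFEE              # stand-in secret scalar, not any real value
P = mul(e, Q)

def step(s):              # one Dual EC iteration (the real DRBG truncates r)
    s = mul(s, P)[0]      # state update: s' = x(s*P)
    return s, mul(s, Q)[0]  # output:     r  = x(s'*Q)

s = 123456789             # secret seed, unknown to the attacker
s, r1 = step(s)           # attacker observes r1 on the wire...
s, r2 = step(s)           # ...and wants to predict r2

# Attack: lift r1 back to a point R = s'*Q (the sign ambiguity is harmless,
# since e*R and e*(-R) share an x-coordinate), then e*R = s'*(e*Q) = s'*P,
# whose x-coordinate is exactly the generator's NEXT internal state.
s_recovered = mul(e, lift_x(r1))[0]
assert mul(s_recovered, Q)[0] == r2
print("state recovered; every future output is now predictable")
```

In the real generator, the 16 bits of truncation only force the attacker to try on the order of 2^15 candidate x-coordinates per 30-byte output block, which is why a single contiguous run of output is enough to recover the state in practice.
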

Justin: Wait, the answer is FIPS?

Thomas: So Dickie George would say that when he brought this to NIST, he said, standardize this, but let people use their own points. And then NIST came back at them and said, no, it’s got to be these specific points. Which is just, like, another nail in this whole thing, right? Like, why wouldn’t you let people...

Deirdre: From when I watched him, they stuck with those points. Not necessarily that it had to be only those points, with no availability for other points. Which is part, in my perspective, in my narrative, of the negligence, or the unfortunate not looking beyond just the thing in front of you and seeing the consequences of only allowing the points that had been shipped over from the NSA and not allowing any others, especially when those points had no explanation of how they were generated, other than: we did it securely, according to our secret, you know, parameters.

Matt: I think you’re missing that there are three levers at play. The first is the standardization, which includes the default points. The second is FIPS certification. Who does FIPS certification? Well, it’s the Cryptographic Module Validation Program, which is run out of the NSA, and they can decide whether they actually certify anything. At least it was back in the 2000s; it was an office at Fort Meade.

David: Well, it’s not anymore, but anyway.

Thomas: I believe Nate Lawson was a FIPS validator before.

Matt: No, no, no. There are labs. CMVP sets the standards for the labs; they are the ultimate controller of the labs. And CMVP could decide: hey, have we ever certified a Dual EC implementation with alternative points? And the answer is no, they never have. The only points they ever allowed for certification were the canonical P and Q. The third lever they have is developers and contracts with organizations, where they can say: implement this generator. And those three things together are a lot more powerful than any one alone.

Justin: To take a step back on that one: they never certified anything else. This is sort of the textbook example of what sucks about government contracting, because you write something down once, and even if, when you write it down, you say this is not mandatory, this is just an example, everyone from then on treats it as mandatory.

Matt: You don’t.

Justin: You don’t have a choice, right? So Dickie George’s argument was that they didn’t care what the points were. They didn’t even want to supply the points. But NIST said, no, we can’t do it without you supplying something. They’re like, okay, fine, here, use these, we don’t care. But the problem is, now it’s written down and everybody’s going to use the same thing. So for you, that’s a backdoor argument.

Justin: For me, you didn’t say...

David: The government stuff was going to use those government parameters.

Matt: Yeah, I mean, I think the argument was there were a lot of legacy customers, AKA government customers, in BSAFE that were using the original points. And I think that there was some pressure to use the original points. And I don’t know, maybe NIST dropped the ball. It’s possible NIST dropped the ball. I mean, in retrospect, I think you could make a case: they said you could use alternative points, but I also think that they knew the momentum was on their side and the alternative points wouldn’t be used, and they were happy to.

Justin: Were happy to assume you’re conveying intent. You’re assuming intent when you say that. Like, I think it’s hard to have an actual discussion about this without going in immediately assuming intent.

Thomas: Be careful, because he’s not necessarily saying intent about IAD, just somewhere else in the organization.

Justin: Yeah, but structurally, like, organizationally, the somewhere-else-in-the-organization still doesn’t make sense.

Deirdre: Yeah. As a person who is blocked on generating test vectors for a completely open standard right now, I understand. Once you have test vectors and interop parameters, you’re kind of done. You’re like, do I have to go through all this work to generate even more? What if I don’t? Everyone’s fine with that? Cool. Great. Ship it. I completely understand that scenario happening for the people at NIST, given parameters handed to them via, you know, Dickie George.

Thomas: I also feel like I have some empathy for the position you’re in here, because I picked the king-hell mess of a fight with Matthew and with Trevor Perrin about Extended Random. In my case it was mostly, like you’re saying, that Dickie George or whoever didn’t do any of this stuff maliciously. And I still mostly, I still entirely believe this, right? But my whole thing with it: there’s a follow-on standard called Extended Random. There are like six different permutations of it that make Dual EC exploitable in a conventional TLS setting, right? And EKR was involved in the standardization of it, and when the first Extended Random stuff came out, people were like, well, how much culpability does EKR have in that? Which set me off and all that, right? So I have some empathy for the position you’re in there. But all you have to say is: okay, whoever you’re saying was good, was good. Whoever you like in this situation, they’re fine. We’re just saying there’s some other evil person out there that you don’t know of.

Matt: I don’t think they’re evil. I think they were trying to save the world. I think they were trying to save the U.S., but they really screwed up.
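
For a rough sense of scale on the Extended Random point: Dual EC outputs are truncated x-coordinates, so an attacker has to enumerate the missing bits and filter candidates against other visible randomness. The sketch below is a back-of-the-envelope estimate using ballpark figures from the public analyses (e.g., Checkoway et al.’s 2014 paper on Dual EC in TLS), not anything stated in this conversation.

```python
# Rough cost of Dual EC state recovery from one TLS connection.
# Assumed shape: P-256 x-coordinates are 32 bytes; Dual EC outputs drop the
# top 16 bits, so each 30-byte block hides 16 bits from the attacker.
missing_bits = 16
candidates = 2 ** missing_bits      # possible full x-coordinates per block
ops_per_candidate = 2               # roughly: one lift, one scalar multiply
print(f"~{candidates * ops_per_candidate:,} EC operations per output block")

# A standard TLS client random exposes only about 28 bytes of contiguous RNG
# output, less than one full block; the Extended Random drafts would have put
# several times more DRBG output on the wire in cleartext, which is what makes
# candidate filtering, and hence state recovery, dramatically cheaper.
```
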

Thomas: We got off on a really good tangent about the is-this-a-good-backdoor thing. But I still come back to: even in 2006, was Dual EC an effective Internet-wide mass-dragnet capability?

Justin: There was a table.

Thomas: That’s certainly what that’s.

Matt: I mean, I don’t agree. Look, here’s the situation that the NSA was in in the 2000s, by the way.

Justin: I was there during the period we’re talking about.

Matt: Okay, but I think that’s relevant, right?

Justin: The story that you’re painting... I mean, I was there, and it doesn’t. Okay.

Thomas: And you were at Google at some point, but just because you were at Google doesn’t mean you know everything that was happening at Google.

Matt: The NSA saw themselves in a situation where encryption was not very widely used yet in the early 2000s, but it was slowly becoming increasingly deployed, and it was starting to block capabilities. We agree on that, right?

Thomas: Yeah, unfortunately that’s, that’s just going down.

Justin: But yeah, no, I disagree with the notion...

David: Fair.

Matt: Like, George says this as well.

Justin: That it was an issue in the early 2000s. It became an issue starting...

Matt: I think it was people who were prescient, looking forward and seeing TLS and encryption as being an issue. IPsec was starting to come out.

Justin: This is where I’m going to say I disagree, but I can’t get into the details of why.

Matt: Okay. But I think people saw it. Maybe they looked forward a decade and saw this being a potential issue, and they said, we need to develop a whole bunch of strategies to deal with this. And I think their strategies broke up into three different branches. One was people doing exploits, and I think that’s been very successful; I think we can all agree on that. The second one was looking for vulnerabilities, sort of a passive exploit thing. And the third, which unfortunately we do have evidence for, because we have a document that says they did this, was: let’s try to build backdoors into commercial crypto systems.

Matt: The stakes are so high. We need to do this. And maybe at the end of the day, it actually didn’t turn out to be as useful to them as they thought it would be, but they certainly tried it.

Thomas: Justin, I can see the look on your face right now, and I just want to say that based on the rules that we agreed to before we started this podcast, the terms of engagement here, based on the evidence that I have heard in this conversation, I have determined, and I’m now ruling that that document that Matthew Green has is real.

Justin: I’m not saying it’s not real. I agree it’s real. I don’t know the provenance of it. I don’t know if it was somebody making a proposal. I don’t know if it was a proposal that nobody ever acted on. All right, so there’s a budget. I’m comfortable talking about other employers.

David: Right.

Justin: There’s no risk of me, like, going to jail for revealing details. And I will say it: at other past employers, non-government employers, I saw documents leaked, things that were proposals nobody was working on. Somebody just wrote it up as a proposal; nobody ever did it. But it ended up getting leaked, and it was treated as.

Justin: It was like, oh, this is something that they’re absolutely doing. And it was like, no, what? And as I said, this has not happened for anything I’ve worked on, but I know people in other contexts who were victims of hack and leak, and the leaks had minor alterations to the documents, little tweaks, et cetera. And this is why the provenance of these matters to me. As soon as the hack and leak stuff started, I was like, no, this is not good.

Matt: This is not hack and leak, though. This is not hacking.

Justin: Oh, no, it was hack and leak. He stole credentials.

Matt: But these did not.

Justin: In order to get the documents. It was still.

David: It was insider hack and leak.

Matt: This did not pass through, say, Russian intelligence. And secondly, wait, no.

Justin: Went, he went to China and then delivered it to Russian intelligence.

David: Yeah, but afterwards.

Matt: And secondly, you’re not trying to...

Justin: Argue that Snowden doesn’t, for all intents and purposes, present just like a foreign intelligence asset, right? I don’t know if that was his intent.

Matt: Intent, but I’m trying to argue two things. One is that this did not go through some foreign intelligence agency. And secondly, this was reported in the New York Times. And third, the NSA was given copies of these documents and they were asked to comment and make corrections. And at that point they could have said these documents are forged. And they did not.

Justin: Yes.

Matt: And that was the time.

Justin: Thank you. This actually gets back to a really important point that I was making earlier, which is: the NSA, in this case, could have come out, they could have said something. They didn’t. They’re like, screw it, we’re just leaving it be. But you know where they did say no, this was not a back door? They actually sent somebody out to.

Matt: Say they sent one. Really?

Justin: To say this was not a back door, for Dual EC. So you’re saying if they said no, you wouldn’t trust it? But you don’t trust them.

Matt: That, I would have trusted. And if I wouldn’t have trusted anything.

David: Actually, based on this podcast, retired former NSA people will say anything.

Matt: You may need to plug your ears for just 10 seconds. The fiscal year 2011 actual funding for this program was $298.6 million. Fiscal year 2012 enacted was $275.4 million. And the fiscal year 2013 request was $254.9 million. It had a headcount of 144 full-time equivalent positions, which went down to 141 in 2013. So over three years, there were over 140 people working on this thing, with funding that lasted over three years. It was not some idea that somebody had, or a request that was made but never enacted.

Justin: So you’re just talking about like money and personnel stuff. Right.

Matt: You made the point that maybe this was some idea somebody had and it didn’t actually exist.

Justin: No, no, that’s not what I said. I said that maybe the document that you’re looking at was just a proposal specific to...

Thomas: Right, but it describes a funded program.

Justin: No, no. Does the whole thing describe a funded program? Or is there, like... this is the thing, that’s not how pro... that’s how stuff gets written up.

David: Like, it might not be the entirety of the program.

Justin: I never saw, whenever someone was putting forth a proposal or, like, a Stryer program... I mean, attaching dollars is something different.

Matt: But I’m looking at a budget table that says TOP SECRET//SI/TK//NOFORN at the top. And then it talks about the computer network operations. This part is unclassified: computer network operations, SIGINT Enabling Project. And then everything else is sigma.

Justin: This is my point. You’re... it’s...

Matt: By the way, this is all like.

Justin: You’re treating a bunch of documents in the same way you were treating SID and IAD as interchangeable. It sounds like you’re treating different documents that say different things as interchangeable.

Matt: No, this is one document, and then below it, it says Project Description. I already read you that paragraph, but I don’t know if you heard it. It says exactly what the money is designed to do. It’s really unambiguous. I understand your feelings about this classified document, but my feeling is, once it’s been published in the New York Times and 11 years have gone by, you have to just be willing to read it.

Justin: Yeah, I just avoided it because, in particular, I’m doing this podcast right now where I’m trying very carefully not to talk about anything.

Matt: Okay.

Justin: I’m intentionally avoiding any of the things where I could step on myself.

David: All right, on that note, let’s close this out. We’re going to do two things: we’re going to go around the horn first, and then we’re going to have Thomas adjudicate. So first we’re going to go around the horn, and we’re going to say, on a scale from, let’s say, 7 to 24, where 7 is an absolute impossibility and 24 is total metaphysical certainty, what is the likelihood that the standardization process was a specific attempt by a part of NSA to backdoor a standard? That’s 7 to 24, where 7 is impossible and 24 is absolute metaphysical certainty. Justin?

Justin: Obviously, I was 7, and I resent having this argument with four cryptographers.

Thomas: And David?

David: The correct answer is 18.5. Thomas.

Thomas: I’m sure that Dual EC did make things pretty hard for people doing cryptographic standards. I am not so sure that there’s a lot of evidence that that’s a bad thing.

Justin: Secret hero.

Thomas: Like, things could be easier and, like, it could be easier to deal with NIST and have people trust NIST. I’m not sure why that would be a good thing.

Justin: You know, when I was at NSA working in IAD, I had to go give a talk on something at NIST. And I didn’t realize until I was there, getting yelled at on stage by a bunch of NIST people, that they really just wanted someone to take flak for their password policy, which had literally nothing to do with me at all. I was, like, set up as a sacrificial lamb. It was a weird experience.

Thomas: Well, I mean, to wrap this whole thing up, I think we can all safely conclude that Justin is wrong.

Justin: You know, here’s the thing, actually. Wait, I got one. Can we make a bet? The problem is it’ll be like 40, 50 years before we get the answer. But who wants to make a bet? How much do you want to bet, inflation adjusted? Because we’re not betting, like, money now. How about, how about a thousand dollars? I’ll bet $1,000.

Matt: I’m in on this bet.

Justin: Inflation adjusted to when these documents can eventually be FOIAed. It’ll turn out, no, it’s just really, really ugly legacy garbage and not a backdoor.

Thomas: Oh, yeah, no, I’ll take that bet.

David: Yeah, I think this is an undecidable question.

Thomas: Oh, it’s a pretty safe bet, in that either a set of smoking-gun documents comes out about how they literally operationalized this, or nothing comes out. So there’s really no way for me to lose the bet. Are you going to put a time limit on it? Are you saying nothing will come out in 40 years, and if nothing comes out, I lose the bet?

Justin: See, this is the problem. This is why I’m screwed on this one.

Thomas: It sounds like you’re saying either a set of documents comes out proving that it’s not a backdoor and it’s just legacy cruft, or a set of documents comes out proving that I’m right. And I’ll take that bet.

Justin: I mean, that’s basically the document.

Thomas: If no documents come out, I’m not betting anything.

Justin: No, no, I’m not betting anyone on no documents coming out. No one should bet on that.

Thomas: Yeah, obviously, I take that bet.

Justin: The absence of proof is not proof.

Thomas: Yes. Okay. This is very good. I have the opportunity to win money and no opportunity to lose money. This is great.

David: Luckily, that bet will be able to be paid off with just one copy of the Art of Software Security Assessment.

Thomas: And I’m holding onto my copy, which will only appreciate in value, so.

Justin: See, this would have felt better, though, if we could have gotten someone else who’d ever spent time in an intelligence agency.

Thomas: We feel for you, Justin.

Matt: Yeah, I try, actually.

Thomas: I really. I really appreciate you taking this side of it. There’d be nothing for us to talk about with Dual EC if you weren’t here.

Justin: Well, nobody’s talking about it anymore. It was ten years ago. It’s over.

Matt: Just for the record, I strongly disagree with you, but I completely respect your position on this, and I hope we don’t fight too much on Bluesky or Twitter or wherever we are now.

Justin: I try not to fight online anymore.

David: I just.

Justin: I just dodge at this point. My fighting-online days are gone.

David: He says, while currently fighting online.

Justin: No, anytime somebody got argumentative, I just disengaged.

Thomas: That’s what I noticed in this podcast: anytime anybody got argumentative, you just disengaged.

Justin: I said online. I didn’t say here. But Matt, I did want to say it: none of this in any way disagrees with your work or the technical capability or anything else like that. I agree with you. It sure as hell looks like a backdoor.

Justin: Yeah, we just disagree on the intent and whether anything like this could ever have been operationalized.

Thomas: Well, I’m posting online the series of screenshots I took of David Adrian’s face during this whole thing. You brought him so much joy. So much joy.

David: Yes, this was great. And on that note, I would just like to remind everybody that you can buy merchandise at securitycryptographywhatever.com, and that Security Cryptography Whatever is a side project from Deirdre Connolly, Thomas Ptacek, and David Adrian. Our editor is Nettie Smith. You can find the podcast online @scwpod, including on Bluesky, and the hosts online @durumcrustulum, @tqbf, and @davidcadrian. If you like the pod, give us five-star reviews wherever you rate your podcasts.

David: Thank you.