Security Cryptography Whatever
Dual_EC_DRBG with Justin Schuh and Matthew Green
Nothing we have ever recorded on SCW has brought so much joy to David. However, at several points during the episode, we may have witnessed Matthew Green's soul leave his body.
Our esteemed guests Justin Schuh and Matt Green joined us to debate whether `Dual_EC_DRBG` was intentionally backdoored by the NSA or 'just' a major fuckup.
Transcript: https://securitycryptographywhatever.com/2024/12/07/dual-ec-drbg
Links:
- Dickie George at InfiltrateCon 2014, 'Life at Both Ends of the Barrel - An NSA Targeting Retrospective': [https://youtu.be/qq-LCyRp6bU?si=MyTBKomkIVaxSy1Q](https://youtu.be/qq-LCyRp6bU?si=MyTBKomkIVaxSy1Q)
- Dickie George: [https://www.nsa.gov/Press-Room/Digital-Media-Center/Biographies/Biography-View-Page/Article/3330261/richard-dickie-george/](https://www.nsa.gov/Press-Room/Digital-Media-Center/Biographies/Biography-View-Page/Article/3330261/richard-dickie-george/)
- NYTimes on Sigint Enabling Project: [https://archive.nytimes.com/www.nytimes.com/interactive/2013/09/05/us/documents-reveal-nsa-campaign-against-encryption.html](https://archive.nytimes.com/www.nytimes.com/interactive/2013/09/05/us/documents-reveal-nsa-campaign-against-encryption.html)
- On the Practical Exploitability of Dual EC in TLS Implementations: [https://www.usenix.org/system/files/conference/usenixsecurity14/sec14-paper-checkoway.pdf](https://www.usenix.org/system/files/conference/usenixsecurity14/sec14-paper-checkoway.pdf)
- Wired - Researchers Solve Juniper Backdoor Mystery; Signs Point to NSA: [https://www.wired.com/2015/12/researchers-solve-the-juniper-mystery-and-they-say-its-partially-the-nsas-fault/](https://www.wired.com/2015/12/researchers-solve-the-juniper-mystery-and-they-say-its-partially-the-nsas-fault/)
- ProPublica - Revealed: The NSA's Secret Campaign to Crack, Undermine Internet Security: [https://www.propublica.org/article/the-nsas-secret-campaign-to-crack-undermine-internet-encryption](https://www.propublica.org/article/the-nsas-secret-campaign-to-crack-undermine-internet-encryption)
- DDoSecrets - Sigint Enabling Project: [https://data.ddosecrets.com/Snowden%20archive/sigint-enabling-project.pdf](https://data.ddosecrets.com/Snowden%20archive/sigint-enabling-project.pdf)
- IAD: [https://www.iad.gov/](https://www.iad.gov/)
- Ars Technica - “Unauthorized code” in Juniper firewalls decrypts encrypted VPN traffic: [https://web.archive.org/web/20151222023311/http://arstechnica.com/security/2015/12/unauthorized-code-in-juniper-firewalls-decrypts-encrypted-vpn-traffic/](https://web.archive.org/web/20151222023311/http://arstechnica.com/security/2015/12/unauthorized-code-in-juniper-firewalls-decrypts-encrypted-vpn-traffic/)
- 2015 IMPORTANT JUNIPER SECURITY ANNOUNCEMENT: [https://web.archive.org/web/20151221171526/http://forums.juniper.net/t5/Security-Incident-Response/Important-Announcement-about-ScreenOS/ba-p/285554](https://web.archive.org/web/20151221171526/http://forums.juniper.net/t5/Security-Incident-Response/Important-Announcement-about-ScreenOS/ba-p/285554)
- Extended Random Values for TLS: [https://datatracker.ietf.org/doc/html/draft-rescorla-tls-extended-random-00](https://datatracker.ietf.org/doc/html/draft-rescorla-tls-extended-random-00)
- The Art of Software Security Assessment: [https://www.amazon.com/Art-Software-Security-Assessment-Vulnerabilities/dp/0321444426](https://www.amazon.com/Art-Software-Security-Assessment-Vulnerabilities/dp/0321444426)
"Security Cryptography Whatever" is hosted by Deirdre Connolly (@durumcrustulum), Thomas Ptacek (@tqbf), and David Adrian (@davidcadrian)
Deirdre:Hello, welcome to Security Cryptography Whatever. I'm Deirdre.
David:I'm David.
Thomas:I'm Thomas.
Deirdre:and we have two special guests with us today. Uh, returning champions. We have Justin. Hi, Justin.
Justin:Hello.
Deirdre:And we have Matt. Hi, Matt.
Matt:Hi, how are you?
Deirdre:Hi. We were chatting on the internet the other day about our old friend, our favorite conspiracy theory, Dual_EC_DRBG, um, and we started shouting about this because Justin shared a very good talk from, I'm forgetting the person's name because I don't have it in front of me, but it was basically a talk from InfiltrateCon 2014, is that right?
Justin:Uh, something like that. It was, uh, his name's Dickie George. And yes, you're going to chuckle when you say
Deirdre:It was a very interesting perspective, from a perspective you don't usually hear, which is from inside the NSA, on what happened with Dual_EC_DRBG. Uh, and it sparked a lovely, a lovely little debate. Justin, can you tee us up on what Dual_EC_DRBG is and what Dickie was talking about?
Thomas:and me and David are friends, and for whatever reason, Justin and David both worked as security PMs on the Chrome browser. Whoa, whoa, whoa, whoa!
David:The PM thing... don't slander Justin like that.
Thomas:I demoted Justin, apparently. Justin, what were you?
Justin:I was an engineer when I was
Thomas:So I demoted you to
Justin:one of the only engineers.
Thomas:both worked on Chrome, and, and David actively works on Chrome right now, and they needed to meet up at some point to share war stories, and they happened to meet up in a place where I was, and they were drinking, and at one point, Justin said something about how fed up he was with people talking about NSA backdoors, and how quite sure he was that Dual_EC probably wasn't a backdoor, which is what lit this whole fuse.
Justin:I said, it's not a
Thomas:I was giving you some room there.
David:same vibe as Alex Jones explaining the Star Wars prequels.
Justin:Now, it is hard after several drinks. Uh, it is hard to, uh, bring up thoughts on something that you haven't even considered in, like, 10 years. And I have no idea why I went sideways on that one, but yeah. I mean, if you know anything operationally about how, if you know anything about the structure of NSA, the notion that it's a backdoor is, like, crackpot. Like, if you understand how that organization works, and if you look at the evidence in front of, like, all the stuff that's been made public,
Thomas:And here we should make clear that Justin is not a cryptographer. Justin is an exploits guy. That's, that's Justin's background, the way I would put it.
Justin:Yes. I said, I don't know, I fully admit it, I'm like, crypto is not my thing, I'm not looking at it from that standpoint, I'm looking at it from the standpoint of, I've seen a lot of vulnerabilities where I'm like, wow, this totally looks like it would be a great backdoor. Knowing that they are just, you know, vulnerabilities you find in the world. And, in this case, knowing the context. I mean, I could tell a bunch of stories about vulnerabilities that absolutely look like they should be backdoors, but I knew the full context of how they happened. It was like, yeah, no, that's
Thomas:yeah. And so like in a podcast first for us, we have warring guests, because our other guest is Matthew Green. And my feeling is that Matthew Green has some emphatic takes on this topic. Am I misrepresenting there, Matt?
Matt:I definitely have opinions. Yes.
Thomas:So if you were to try and, like, sum up your response to the short thing that Justin just said, just to tee this off: if you were on my couch in my living room, four drinks in, and Justin had just told you there's no way, given what he knows about NSA, that Dual_EC was a backdoor, what would your response have been?
Matt:I mean, if I was to go out of my way to build like a, you know, case study of a backdoor, right? If I was to like, you know, go somewhere in like an artist's studio and construct like what a backdoor should look like, like the Dual_EC case is exactly what it would look like. There is sort of, there's so much evidence around it. And it's not just like one piece of evidence that's the, you know, the smoking gun. It's more like there are bullets strewn across the lawn, and there are guns piled along the doorway and holes everywhere. And, you know, like any one of those things, you can look the other way, but not, not all of them, it's not possible.
Justin:And I would say that's how I describe the assertion that it is a backdoor. There are so many holes. The thing that spun off the conversation, uh, online was I linked to the Dickie George talk, and yes, I hold back every time I say the name, but, um, the talk, and I guess he gave that talk a few places,
Matt:Can I say something?
Justin:like the NSA, I
Matt:I'm familiar with that talk. I have a quote from it in one of the presentations I give. He says: if anybody can prove to us, by generating their own parameters for Dual_EC, not saying the NSA actually made bad parameters, but if they can prove that if they put their own parameters into that generator, it can actually be exploited, then I will buy them dinner. So we went off and we did that, and we wrote a paper, and it got into USENIX Security, and he has not bought us dinner! Not
Thomas:He owes a lot of people dinner, doesn't he? It's not a particularly hard backdoor to demonstrate.
Matt:Not at all.
Justin:I did not see that version of the talk; the transcript that I have does not include that, but it is totally fair that you did it. I'm not arguing it isn't garbage. You're talking about decades-old technology that was for a specific narrow use case, not the generalized use case. And I'm not saying that the notion of pushing it through as a NIST standard was a particularly good idea. Unfortunately, with the weirdness of the way government procurement works, that's how it happens. I mean, the only ones they backdoored were themselves, if you're saying it's a backdoor because it got into crypto libraries.
David:Let's temporarily limit ourselves to, like, just the algorithm itself. Not NSA, not its usage. Let's just look at how the numbers are multiplied, um, to generate random numbers. This is,
Justin:This is where I have to tune out
David:um, because it's safe to say that, like, the algorithm is basically a keyed backdoor. Like, that's just how it works. That doesn't mean that it was intended to be used that way, but the way in which the algorithm is constructed is basically multiplying two group elements together. And if you have the discrete log of where you started, you can figure out what's coming next.
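To make the structure David is describing concrete, here is a minimal sketch in Python. It is a toy, not the real construction: modular exponentiation over a small made-up prime stands in for elliptic-curve scalar multiplication, all parameters are invented for illustration, and the real generator truncates its output where this one does not.

```python
# Toy model of the Dual_EC structure. "k times P" on the curve becomes
# pow(P, k, p) here; every number below is illustrative only.
p = 10007                 # small safe prime; the group is (Z/p)*
P = 5                     # stand-in for the base point
d = 12345                 # trapdoor scalar: Q is d "times" P
Q = pow(P, d, p)          # whoever generated Q this way holds the trapdoor

def dual_ec_step(state):
    output = pow(Q, state, p)        # published output block, ~ x(s*Q)
    next_state = pow(P, state, p)    # internal state update,  ~ x(s*P)
    return output, next_state

# Attacker side: with e = d^-1, P is e "times" Q, so one raw output
# block yields the next internal state, and with it all future output.
e = pow(d, -1, p - 1)

def recover_next_state(output):
    return pow(output, e, p)         # (Q^s)^e = P^s = the next state

out, s1 = dual_ec_step(4242)
assert recover_next_state(out) == s1
```

The real generator publishes a truncated x-coordinate rather than a full point, so an attacker has to brute-force the missing bits (on the order of 2^16 candidate points per output block); that is essentially the work the Checkoway et al. paper linked above carries out against TLS.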
Thomas:It almost feels similar to, like, the conundrum people had early on in TLS 1.3 standardization. Prior to TLS 1.3, if you were a large financial organization and you wanted to do intercept on all the traffic coming in and out of your network, there was an easy way to do that, right? You would just escrow all the private keys, then you would use them to reconstruct the traffic. And TLS 1.3 cost you that capability; the major reason to have TLS 1.3, well, besides performance, is to have everything be forward secret, so you could no longer passively, you know, reconstruct TLS conversations just given recovered keys, right? So you can imagine, and this is like 1980s, 1990s cryptography, I guess this is actually '99, the early two thousands, right? But you can imagine an IT environment where they're running something like a forward-secret protocol across their entire enterprise, and they themselves want to be able to reconstruct their own traffic. And one way you could do that, like, my understanding of the situation is, you could standardize a random number generator whose output you could recover given, you know, a recovered key. And that would give you kind of a sane, I mean a stupid but sane, thing you could deploy enterprise-wide to get that capability.
Matt:And you know, Dan Brown, who was one of the authors of the standard, actually patented the idea of doing that, right? Like, in the middle of the standardization process, somebody said, Hey, this thing could be like a key escrow backdoor, and patented the whole thing.
David:Well, that'll stop anyone else from doing it because they hold the patent. It's defensive.
Justin:Isn't that kind of the argument, that it wasn't? like your argument here is NSA built a backdoor into this thing, in the most incompetent way possible.
David:Let's back up again. So, like, I think what we can all agree on then is that, like, the algorithm, from a purely technical standpoint, has a backdoor in it. Like, if you know this parameter, you can figure out what's coming next. Like, that's just an indisputable mathematical fact at this
Deirdre:It has a vuln. It has a vuln.
Justin:Not my expertise.
David:like, that's thing one. So it has, effectively, a keyed backdoor in the algorithm, and the key varies based on the parameter set. The question is: for the default parameter set, does anybody (NSA) have the key? And then, are people using non-default parameters where someone else has the key? Um, and then, was this intentional, um, by NSA, or was it just, like, an early use of elliptic curves? Um, Dickie George in that same talk implies that this was used to secure a phone on Reagan's desk in the 80s, which would have been, like, very early elliptic curves, kind of pre everybody else using elliptic curves. And he makes the claim that it was standardized so that they could have classified and unclassified material use the same phone, because the unclassified material at the time needed NIST standards
Thomas:Hold on. He said that, so he was talking about the STU-III phones, he said they had a prior experience trying to standardize a single set of algorithms, um, so that you could have one phone, because the STU-III phones were not authorized for non-classified stuff. But that wasn't Dual_EC, and that wasn't why they did that, right? It was the experience of going through that that made them say, okay, well, we need, like, Suite B, a set of cryptography that we trust, but that is also authorized for non-classified stuff. And then Dual_EC kind of slipped into that. Right?
David:My, my read of it is they were using it for the STU phones, but that would imply that it had elliptic curves like a bit earlier than anybody else. I don't know the timeline on elliptic curve cryptography.
Justin:That was my read of it as well. I mean, the problem was, the talk had a whole lot of, you know, a whole lot of examples, a whole lot of by-analogy, because it's the goddamn NSA and they won't just, you know, come
David:Also, Dickie George is like 70 years old.
Thomas:I feel like he talks about, like, the STU-III incident being in the 1980s.
Justin:Yeah, no, the STU-III was late 80s. The STU series, I think, was like 70s to the 80s. I had, you know, a STU-III on my desk.
Thomas:Yeah, I don't know if that detail shatters your worldview and now you agree with us about the backdoor. Then this is not important. Yeah, we're fine, we can move on.
Justin:My take on this is just: all right, so this is the NSA, where their entire reputation in the intelligence community is that they would rather have a capability die on the shelf unused than ever have it burned in public. Like, the worst thing from the NSA perspective is to get caught. And frankly, other intelligence agencies take offense at that; they're like, what's the point of having a capability if you don't use it? And so the notion that they would abuse the public standards process this way, that they would take people from the IAD side of the house, who don't work on offense, they work on defense, people who have had their names publicly attached to these things and actually did end up getting threatened and getting harassed after all this stuff happened, that they would expose them to this. And also, it's just a shockingly clumsy and awkward way to do it, where the only places it ever saw any meaningful use were FIPS libraries, libraries for government systems. I mean, it just doesn't make any sense. And then they send Dickie George out there to be like, no, no, we did not do this. This is not a thing. They never confirm, they never deny; this is like the one time they ever denied. And it's like, yeah.
Thomas:Yeah, I mean, everyone has different smoking guns on this and, you know, I remember the patent smoking gun being a really big deal for a bunch of people. And then for me, like, Juniper was probably the big deal. Even BSAFE was not a smoking gun for me.
Matt:Let's start with the biggest deal. Let's start with what we know for a fact. And that comes from the Snowden documents back in 2013, which is, we have a document that defines a program called the Computer Network Operations SIGINT Enabling Project. And this is straight, it's a TOP SECRET, SI, et cetera, NOFORN document. And it basically says, in a top-secret SI line, that their goal is to insert vulnerabilities into commercial encryption systems, IT systems, and so on, um, to allow the NSA to decrypt.
Justin:Yeah, I don't know where to go with that one. Cause, like, alright, I haven't seen the document. I intentionally, for, for reasons, mostly avoided looking at those documents. Uh, and I haven't seen that one. And also, honestly, just the whole hack-and-leak thing. The way people, like, jumped on that. And, like, the amount of damage, like, the, the
David:Not a fan of Russian patriot Edward Snowden?
Justin:going there is just for Flex before the other people
Matt:I assume that there was a lot of damage and I'm not going to dispute that, etc. But here's the deal, right? Like this is a 250 million a year program. It has a very clear mission. And the, the mission is to tamper with commercial encryption systems. And specifically, let me read you the second thing. It says to influence policies, standards, and specifications for commercial public key technologies.
Justin:I'm saying I don't trust the authenticity of the document. I don't want, this is why I'm not going there. I know from the various leaks that have happened, I know for a fact that some documents have been modified. Not the government ones that were leaked; other ones where I have direct knowledge of the companies that were leaked on, and they were modified, et cetera. I don't think that that is a safe place
Matt:The Snowden documents, the Snowden documents are something different though. The
Justin:No, there were, Matt, there were people talking about PRISM as if it was, like, a backdoor in every company.
Thomas:There are separable issues here, right? Like, there's the issue of, um, you know, whether the disclosures did damage, and then there's the issue of whether reporting and publication, like secondhand and thirdhand publication, propagated misinformation. And there's, like, a stronger case on the latter than on the former.
Matt:But not on
Thomas:I'm, I agree with you, but I want to jump in here and say, this is interesting as long as it's interesting, right? But there are so many other like interesting smoking guns about Dual_EC that like, when you guys get bored of arguing about whether those things are, we can just stipulate. We can just stipulate that okay, the Snowden thing has no probative value and then move on from that.
Justin:I see. I'm not arguing that it doesn't look exactly like a backdoor: from a technical perspective it does, from an operational perspective it doesn't. I'm not dismissing that; otherwise people wouldn't have bought into it. It's like the whole 15-minute-city conspiracy, right? Where nobody's arguing the facts; they're just arguing that the interpretation of the facts is wrong.
Thomas:I would say that, like, a couple of steps into this conversation, we are going to get to a point where it's going to be difficult to make the case that it wasn't actually a backdoor. Given other things that you are allowed to hear about and that are disclosable in this conversation with you, I think we're going to get to a place where, um, it's going to be pretty difficult to argue that it was not, in practice and in actual use, a backdoor.
Deirdre:I'm going to throw something out there. Which is that, given the context of Dickie's presentation, it was not intended to be a backdoor, it had a vuln, it was used by China to swap out their own parameters, to either mitigate the vulnerability or to insert their own, and a lot of this can be chalked up to bureaucracy gonna bureaucracy and, uh, you know, APT5 took advantage of it. And only in the context of Dickie's talk have I updated my analysis.
Justin:That is my
Deirdre:there are three.
Matt:I mean, we can get into that, but there is actually enough information from the NIST investigation that, like, there is no way you can look at the NIST standards process and say that the NSA did not deliberately push hard to have these, these backdoored parameters in the spec. They were asked by people, why are you using these parameters? And the response was, The NSA has specifically asked us not to talk about that and not to talk about other ways to generate the parameters.
Justin:But that actually, no, no, okay, you're interpreting that exactly the opposite of how I would interpret it. Because that is the way the NSA responds on everything. On the frickin' S-boxes, it was the same: they always push, they never explain. Sorry,
David:Never say anything, NSA.
Thomas:Point of order. Point of order: the person who ran the technical directorate that did this got up on stage at Infiltrate, and he has an explanation for this. It wasn't "I can't comment on it." His explanation was: the NSA operates the world's largest printing press. I don't know why it needs to be a big printing press for this, but it is, all right. And it generates a constant stream of random numbers, and they're the same random numbers that are used for, like, nuclear command and control. That's what he said. And whenever they need randomness for anything, they just go take randomness from the world's largest printing press of random numbers. And that's where the P and Q points on that curve came from. And that is why they can't reconstruct them, according to him. And then he further says that when they standardized, so he asked NIST to standardize
Matt:not true.
Thomas:I believe you, I'm just giving his explanation.
Justin:But what, what reason, what reason does he have for going up there and lying? Like, this is my question. Why, the NSA, everything
Matt:I don't think he's lying. I honestly don't think he knew. I don't think he was read in. I don't think of him as a
Justin:Wait, you're, you're saying the technical director of IAD, one of the top-ranking people at NSA, responsible for all of this, was a
Thomas:Hold on. Point of order, Justin. Dickie George gets up on stage at Infiltrate and says out loud to the crowd that he does not know what the fuck is going on with these points. He said he didn't know where they came from. He said he didn't know why people wanted them. He said that he hated the standard. He said he didn't know a whole bunch of
Justin:No, he didn't say he didn't know why people were using it. That's not true. He didn't say he didn't know why NSA was using it. He said he didn't know why anyone else would use it. He knows why NSA is using it: because it's in ancient devices where it's actually built into hardware and shipped freaking all over the globe. That's why
David:Yeah, so, he makes a comment when he first brings it up about how they were using these, like, uh, hardware diodes for, um, randomness, but then the diodes stopped working correctly, instead of giving off a bunch of noise, so they're like, we need a random number generator, and we need it to, like, be verifiably random. And, like, one really stupid definition of verifiably random is: you build a backdoor into your random number generator, and then you hide the backdoor from the implementers, and then you define a parameter set, you see what pops out, then you go and take your backdoor, and you check all of them, and you're like, good job, you did it right.
Thomas:I just want everyone to know, you guys can't see David on video right now, but he is a pig in shit right now.
Deirdre:Yep.
Matt:So here's the thing. That Q point is a public key. We all agree with this. So let me, we did want to talk about the math of this. There are two points hard-coded into the standard. One of them is P, and that's the same generator point that's in all of the NIST curves. There's nothing special about that. The Q point is a public key. It's a point on the elliptic curve that is x times P, where P is the original point. Now, if you're using the standard process to generate public keys, whatever that is at NSA, do you believe there is a separate process that does not also generate the secret key along with the public key? Because I don't believe there is. I think if you're making public keys, you're making secret keys and public keys. And I think that if they did generate Q in the standard way that you generate a public key, they have the secret key right next to it inside of some computer. And you would have to convince me differently.
David:Deirdre has this very, like, scientific, mmm, I'm thinking with my hands on my chin face right now.
Justin:remember when I said I wasn't here for the technical stuff? For the technical crypto stuff. I'm here for all sorts of other technical stuff.
David:Just, just for background, could you say some of your former employers besides Chrome?
Justin:Yes. I did spend close to a decade in the intelligence community and I did work at NSA, among other.
David:He's a spook!
Deirdre:Hmm. Ha
Matt:How did you generate the Q points?
Deirdre:ha.
Justin:No, wait, so Dickie George, when he said, like, no, no, we're just spitting them out of a thing. I have worked with that group. They have their weird things. Like, I know you've seen this before in code, where you look at the way something has evolved, and it is this giant kludgy mess, and you're like, why would anyone ever build this? And the answer is: no one would ever build it this way if they started from scratch; it's just layered on, layered and layered. And it's very clear to me from that talk that they had criteria. There are criteria for this random number generator. It had to fit these criteria. And there's no way in the world they're ever going to tell you what those criteria are, because they obviously consider the criteria themselves very sensitive, whatever. And in his mind, because he does know the criteria, he's like, if you knew the criteria, you'd be like, oh, well, duh, it's not a backdoor. But you don't know the criteria. The thing that drives me insane is, we're not even talking about how boneheaded this is operationally. Now, you're like, they've got a 250-million-dollar budget, et cetera. And it's like, yeah, but they're the Keystone Cops, suddenly, in this one instance. This one time, they're the complete Keystone Cops. And not only do they make themselves look horrible, they also, like, expose the actual names and histories of a whole bunch of people, put them at actual risk. This is my point where it's like, this does not connect.
Matt:But the thing is, the Snowden documents were the Keystone Cops, right? The Snowden documents should never have happened to an organization that had their crap together. And honestly, they would have gotten away with it if it wasn't for, you know, that, like, meddling Ed
Deirdre:meddling kids! That meddling Snowden!
Justin:Snowden stole, Snowden stole people's certs. He was a helpdesk tech who was actually stealing people's credentials and using it to access their information.
Thomas:But like, I'm really sympathetic to the argument that you just made, because prior to the run-up to BULLRUN, for me, probably prior to Juniper, I will never live down the fact that I was all over Hacker News saying, "This is an idiotic random number generator. No one will ever use this thing. So if this is a backdoor, this is the stupidest backdoor anybody's ever come up with."
Matt:And yet we were all wrong.
Justin:No, you believe you were wrong, and I understand this is a strongly-held belief,
Matt:We, we
Justin:It doesn't make any sense, operationally. It is nonsensical. It is, it is peanut butter woodle
Matt:so crazy that it makes sense.
Deirdre:This is, this is the layers of detritus, and also why cryptographic standards are so important. Because NSA had strongly held requirements, and they wanted to get it into the standards so they could, like, interoperate with other stuff, right? It's just for them! Um, quote, no one would ever use it because it's so crap, right? But because it's a FIPS-compatible random number generator, aha! We can use this for X, Y, and Z. And then someone decided, ah, we need this in a very common commercial library that serves, you know, the need for FIPS compatibility: RSA BSAFE. And that's how it gets shipped. And then that is the target of APT5 or whatever, Juniper, so and so, and whatever the major target was that had all these, you know, government records protected by Juniper. This is why cryptographic standards matter. And "it's so shit no one would ever use it" is not a good reason to put something in a cryptographic standard.
Thomas:Deirdre is speaking on behalf of the Cryptographic Standards Enterprise, which not all of us
Deirdre:Yeah, well,
Thomas:agree with.
David:Deirdre, could you quickly say your title at work?
Deirdre:Standardization Research Engineer. I am.
Justin:I agree with you. Laundering their own tech debt through a standard is a terrible thing to do, and that is not excusable.
Thomas:So, like, one issue I have with attempts to describe Dual_EC and talk through why it's a backdoor is that we tend to go right to, like, you know, there's this P and there's this Q, and if you have the secret d, then you can relate the P to the Q and break it. But, like, the way I looked at Dual_EC when I was very wrong about it: it's a random number generator that uses bignum math, and no one would ever do that, that's very dumb, right? I'm just ruling it out. I didn't even, like, think about what the structure of it was, right? It was only when Dual_EC became very salient after the Snowden disclosures that you, like, actually take a second and look at it, right? And the structure that Matt is describing, base point P and public key Q, this is just basic textbook elliptic curve DL, right? It's like, Justin, I know you're not a cryptographer, but I can get you through this, right? If I had published a standard for a random number generator where I used RSA, and an RSA key, and the output of my random number generator was just: encrypt this value with this public key and send it off into the world. You would get why there's no other reason to do that. But
Justin:Oh no, I do not know what these secret criteria are, right? And this is where we go back to: like I said, operationally, it doesn't make sense to me. And, like, people who I trust to tell me the truth, and I have no reason, just given the insanity of the technical director of IAD going out and, like, it's like: okay, we never talk about anything, but this time we're telling you, no, this was not a backdoor, we didn't do it. When they say, we had secret criteria, we're not gonna tell you those criteria, but we have some secret criteria, and those criteria did not involve this being a backdoor.
Thomas:This is like one of those New York street shell games, where you have to watch really carefully what's happening here. Because, like, we're talking about the secret criteria for these constants. And I think the underlying point that, you know, your opposition in this conversation is making is: sure, P is random, and there's some secret criteria for the randomness that is in P, and there's another value besides P that is also random and, you know, subject to all that criteria. But that other value that is random is not Q. The other value that is random is d, the scalar multiple of the base point that gets you Q, right? Q is not random. Q is the result of an elliptic curve multiplication on the secret d parameter. And if you're not a cryptographer, it sounds like that's a lot of crypto stuff, but I'm really dumb and I could follow that one, right?
David:Yeah. It's literally the equivalent of a random number generator where you have a counter and you encrypt it to an RSA public key and take the output every
Matt:Yeah, you're encrypting the seed and you're writing the seed out every time.
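A toy version of the RSA framing Thomas and David are gesturing at, with tiny textbook-RSA numbers made up for illustration (no padding, nothing like real parameter sizes): the generator publishes the encryption of its own counter, so whoever holds the private key reads the internal state right back out of any output block.

```python
# Toy "RNG" in the shape Thomas describes: each output block is an RSA
# encryption of the internal counter. Purely illustrative numbers.
p, q = 1009, 1013
n = p * q                          # public modulus
e = 65537                          # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent: the "escrow" key

state = 123                        # internal counter/seed

def next_output():
    global state
    state += 1
    return pow(state, e, n)        # output = Enc_pub(state)

block = next_output()
assert pow(block, d, n) == state   # key holder recovers the state
```

Nobody would accept an RNG specified this way in the open, which is the point of the analogy: Dual_EC has the same shape, with the public key hiding in innocuous-looking curve constants.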
Justin:I understand that, yes, it looks like a backdoor and looks like an escrow key. My full thing that I am holding to here is the operational aspect, the total Keystone Cops aspect of it. Like, operationally, that would have been such a trainwreck. And the fact that, like, jeopardizing their own people, like, the notion that they intentionally exposed their own people and jeopardized their own people like that, that's just, no. This is where I go back to: if you know anything about how they work operationally, none of this makes any sense, and none of this sounds believable.
Matt:Let's look at the real world, the world that actually we live in, right? All
Justin:That's what I'm doing! I feel like you're theorizing and I'm telling you what the world is!
Matt:what I'm saying is all those people who could have been jeopardized by a deliberate backdoor, let's hypothesize that it was all a terrible coincidence, right? Like they didn't try to rob the bank, they were just standing near the bank and, you know, the broken windows were just an accident and whatever, the hole in the vault was just, I don't know. Let's, let's hypothesize they're totally innocent. All the people that would have been damaged by a real backdoor. Now, you might say, okay, this is a bad idea. But, I don't
Justin:Wait, wait, I don't understand, do we know, but the argument presumes that there is a backdoor.
Matt:think it's a
Justin:but now you're presuming.
Matt:And I think it's a good outcome that no one will trust the NSA to standardize anything ever again. We do not trust them. We will send it out and make sure it gets reviewed from
Justin:I feel like you're making my point for me. I'm confused.
Matt:But my point is that, that, the point is that the way this was done, whether it was a deliberate backdoor or just ineptitude of, like, the first order, led to all these bad outcomes where, like, people's reputations got trashed. The NSA got trashed. The whole world ends up thinking, and not just that, but when you factor in this Juniper episode, which we'll talk about in a bit, like actual, actual US systems got compromised and real Americans got hurt
Thomas:Yeah, I think Justin's point there would be that all the, all the bad things that you're talking about are real, but that also NSA could trivially have predicted those things back in, you know, 2005 or whatever, right? And that they would, like, he would say, like, this is a reason why this, this isn't what happened, right? Because they knew that if they had done that, NIST would never take another standard from them
Matt:Here's my point, right? They could have designed this generator so that there was no suspicion of a backdoor and they chose not to.
Deirdre:I think to both Justin and Matt, like, yes, incredible reputational damage, they don't, they're not trusted to ever actually publicly advise on standards like this again. And there's the, like, the distrust is like basically perpetuated forever. Forever. I think, to Justin's point, they are, they were so just stuck in their own, like, foxhole. Like, not, not pejorative, but just, like, they had their own things, and they weren't looking over there. And they weren't thinking that far, and there's just, like, lots of layers of bureaucracy, and, like, these people are looking at their knitting, and these people are looking at their knitting, and then it just kind of leaks out, and then someone uses it, and so on, and so on, and so on. I think that is perfectly doable, and it sucks, too. It sucks, too, that they are so bad at their broader mission. Because they're so, so narrowly focused on their particular tiny mission that is backwards compatibility.
Justin:Can I point out that half of the mission, half of the NSA, is IAD? Half? Like, half of it is just supposed to be defense, there's,
Matt:IAD is gone. They got rid of it.
Justin:Well, no, they've changed names and stuff like that. But the half of the mission here was defense. And, like, I know they've restructured and stuff like that, but they still have the defense mission, right? And essentially you're saying that one side of the house somehow tricked the other side of the house into doing something,
Matt:And confessed,
Justin:you're, um, what? No, no, you're, you're, you're taking,
Thomas:Matt Green's soul slowly
Justin:this is like me saying you're responsible for,
Matt:but Justin, Justin, I, I know that we started this conversation with the idea that like, the Snowden documents are all forged, and you can't read them, and therefore they don't exist, but let, let's be, let's
Justin:what I said at all.
Matt:and not trying to be rude, okay, we started the idea, we're having a conversation about a crime that happened, And
Justin:wait a wait a minute.
Matt:I'm using a metaphor, I'm using a metaphor. We are having a conversation about a metaphorical crime. Do you agree with that metaphor? We're debating whether it was an intentional crime or just a very bad set of
Justin:No. Negligence doesn't have to be a crime, right? I'm not denying negligence.
David:I mean, I think legally negligence is a crime, but yeah,
Matt:debating about whether it was an intentional criminal act. Or somebody was just careless and the, it was negligence. That's the debate we're having, right? That's the debate. But here's the thing, like we have a document from the accused criminal saying, by the way, I am going to like for the next
Justin:And that's saying it's a person!
Matt:We have a program, and I will read the program description again. It says we are going to spend 250 million a year. Sabotaging. Sabotaging commercial encryption systems to make it so that we can decrypt. And keep in mind, this is after 9/11.
Justin:But do people from entirely different parts of the university, like, totally laterally different parts of the university, come in and tell you, hey, you have to do this thing? You have a document, you don't understand the context, you're not certain about the authenticity of it, but even based on what you know, it's not the same
Matt:Let, let, let
Justin:totally different part of the organization.
Matt:The SIGINT Enabling Project actively engages the U.S. and foreign IT industries to covertly influence and/or overtly leverage their commercial products' designs. These design changes make the systems in question exploitable through SIGINT collection, endpoint, midpoint, et cetera, with foreknowledge of the modification. To the consumer and other adversaries, the system's security remains
Justin:Seriously, I have intentionally avoided, I muted, so I couldn't hear what Matt was saying there, so I'm like, I'm not touching those documents! I am not touching those documents. It is too risky for me to go near them or have someone read them to me. My only point is, NSA is a massive organization. I don't know the authenticity of those documents, and I sincerely doubt they came from IAD. I cannot imagine that they came from the side of the house that you're blaming.
Matt:but they specifically
Thomas:That does seem, that does seem like a reasonable
Deirdre:Even if they did, and if they succeeded, it is also possible that they came from different sides of the house, and different parts may have been infiltrating commercial encryption and or standards.
Matt:But it's
Deirdre:It is possible that those things can be true, and Dual_EC and its parameters are just a fuck up, or a perfectly secure thing in the setting it was designed to be used in. While someone else on the other side of the house is also doing all these other shenanigans.
Matt:let me, let me articulate another possibility, right? Another possibility is that a small group within the NSA designed a random number generator, told the rest of the organization, we can't tell you our very secret design principles and our requirements, just trust us, you have to use this. And then a whole bunch of very good, innocent people trusted them and took it out to the world. And I think that's a very reasonable hypothesis.
Justin:That's not the way IAD works. They don't have SID do the homework for them, right?
Matt:That's kind of what you're saying though, right? Like,
Justin:No, no, when I've
Matt:of saying that there were,
Justin:No, no, no, no, wait, wait, whoa, whoa, whoa, we can't, no. What I am saying is there is no way that IAD intentionally backdoored this.
Matt:I'm perfect.
Justin:you're trying to sort of make counter arguments for how and why, but I'm saying, look, I still don't understand how operationally you thought this was ever going to, like,
Matt:It did work
Justin:work for, no, no, work for someone else, when they hopped in and just swapped out the code. And if you're already in there swapping out the code anyway, you could put anything you want in there!
Matt:It got inserted into the most popular commercial encryption library starting in about 2005 and continued to be in there until 2013 and it was included in printers. There are printers out there that have, that have, have BSAFE in them.
Deirdre:So, this is kind of circling over to Juniper and these attacks. So, contextually: after Dual_EC became a FIPS-compliant random number generator, RSA the company's encryption library, BSAFE, which is very popular amongst a lot of FIPS-compliant customers, supported Dual_EC, which is not very performant but is FIPS-compatible, um, and it was part of the, uh, Juniper systems. Does someone have more background than I on what to describe about that attack?
Matt:Yeah, so the RSA BSAFE library is one product, one library, and it was used in a bunch of products, but not in Juniper. Juniper did their own implementation. Juniper did the strangest thing ever, where in about 2008, they said: we're going to put Dual_EC into all of our firewalls. This is NetScreen at the time. So they put it into all of their firewalls, so all VPN connections were using it, but they didn't publicize it. What they did is they put in another random number generator that would post-process the output of Dual_EC. And Justin, you're an exploit person, so I want to describe an exploit. So you have random number generator one, which might be vulnerable, and random number generator two, which runs over the same buffer and should overwrite and remove any vulnerability because of the way it works. But the for loop that is inside random number generator two, instead of saying "for local variable i equals zero to 32," uses a global variable: "for global variable i equals zero to 32." And somewhere in a subroutine call, that global variable gets set so that the for loop never runs. So if you saw that kind of vulnerability, if you knew that was the difference between a completely exploitable system and a secure system, would you look at that bug and sort of think, wow, somebody did that on purpose? That's a very strange bug.
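A minimal Python rendering of the bug class Matt is describing, with all names invented here (the real ScreenOS code is C, and this follows the published descriptions rather than the actual source): the post-processing loop's index is a global that the reseed routine leaves at its end value, so the loop body never executes and raw Dual_EC output escapes.

```python
import os

def dual_ec_block() -> int:
    return 0x41                 # stand-in for the backdoored generator

def ansi_x931_block() -> int:
    return os.urandom(1)[0]     # stand-in for the vetted second generator

BLOCKS = 8
index = 0                       # global loop index, shared across routines

def prng_reseed(buf):
    global index
    for index in range(BLOCKS):         # fill buf with raw Dual_EC output,
        buf[index] = dual_ec_block()    # walking the *global* index up...
    index += 1                          # ...and leaving it at its end value

def prng_generate(buf):
    global index
    prng_reseed(buf)                    # subroutine call clobbers `index`
    while index < BLOCKS:               # condition is already false, so the
        buf[index] = ansi_x931_block()  # "safe" generator never overwrites
        index += 1                      # the raw Dual_EC bytes
    return buf

out = prng_generate(bytearray(BLOCKS))
assert bytes(out) == b"A" * BLOCKS      # pure Dual_EC output goes out
```

The post-incident analyses describe essentially this shape in ScreenOS: the reseed routine left the global output index at its terminal value, the certified ANSI X9.31 pass became dead code, and raw Dual_EC output went straight onto the wire.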
Justin:Matt, if I did that I would be saying that about, like, half the bugs I looked at, like, I mean, that's, no!
Thomas:I actually, I'm, I'm, every time I talk I'm just going to reiterate that I believe that Dual_EC is a backdoor, but I'm more on Justin's side of that one.
Matt:But Thomas, would you put two random number generators into a firewall product?
Thomas:No, but,
Matt:No, no, no.
Deirdre:negligence.
Thomas:would also only go to fail once, I would not go to fail twice, right? Like, and then people like, there's a whole conspiracy
David:The big qualifier here isn't the for loop, it's the firewall product, right? Like, that's the qualifier on the sentence: firewall product.
Matt:Yeah, but would you put an undocumented generator in a FIPS-certified crypto product? Knowing that you have to disclose all of your algorithms, would you put in one certified random number generator and then backstop it by having it also use Dual_EC?
Justin:yeah, none of this is surprising.
Thomas:this is like, to me, this is the, probably the most interesting conversation that's left to have about Dual_EC. So you said like a minute ago that like, first of all, I have weird takes about BSAFE as well, right? But like, you said, because of BSAFE, Dual_EC is in a whole bunch of printers. Right, and your first thought there is, well, that's actually a pretty slick attack, right? Because printers are the file servers for the most important files, right? But the flip side of that is that nothing goes to a printer encrypted,
Matt:Oh, I don't think the printers were the target. My point is just to illustrate that it was not government-only. These were not government printers, right? Like, the allegation throughout this entire thing is: oh, it's just some government random number generator, and FIPS is just some library used in government products. Do you think HP printers only used BSAFE in the government printers?
Justin:I didn't say only. I said they primarily backdoored themselves. I didn't say no one else. I said primarily, because this is mostly used by the federal government.
Thomas:When you're talking about the Juniper construction there, right? Like, that's not the NIST standard, and that's not the BSAFE library either. That's like Juniper or really NetScreen engineer code in that situation.
Matt:is, but it's, it's weird. It's a weird decision to use that generator. Like, can you think of a reason that anyone would use that generator and then
Thomas:Yeah, I,
David:what was the
Thomas:I can't, but I
David:The thing that I could see is that you're using Dual_EC for FIPS compatibility, and then you shove another one in there because you heard it was bad, and then you totally screw that
Matt:No, they did not. They never documented in any FIPS documents that Dual_EC even existed. They only certified and described the ANSI generator, the one that was there as the second generator, which, by the way, didn't run. So in a product where you're not certifying Dual_EC, or even telling anyone about it, why would you include Dual_EC?
Deirdre:Oh, you mean Juniper never documented that Dual_EC was in
Matt:Not until after the Snowden documents, when they did a code review and discovered it and said, whoops, it's there, but we're not taking it out.
Deirdre:Fascinating.
Justin:I mean,
Matt:Aren't you?
Justin:the sloppiest code I've ever looked at was firewall code.
Matt:You don't, you don't. I mean.
Justin:Like the worst, like the, just, it's like, just pile more and more layers of stuff on it. I mean, does it surprise me? No.
Matt:Why do you include a second random number generator, and the infrastructure, like, including flags to turn it off? Like, that's nuts.
Justin:I thought firewalls essentially, like, jumped the shark when they started going up, you know, to, like, layer three or whatever. For me, application inspection and all that seemed crazy. So am I surprised? No.
Matt:First, we have a generator that's exploitable. Second, we have the generator appearing in code where it doesn't need to exist, because it's not a FIPS-certified generator. And then third, the second generator that's supposed to protect it doesn't work. And my view is that, like, one coincidence is pretty bad. Two is starting to feel ugly. And three is just, come on, let's not kid ourselves.
Justin:Except for you don't have any, like, like your core thing is that they did this as a backdoor.
Thomas:I don't think NSA did it as a backdoor. I think NSA did it as an internal key escrow thing. I think the problem that we have here is that there are three NSAs, right? Like you're saying, IAD didn't do this as a backdoor, to use the parlance of the time, right? IAD didn't do
Justin:I mean, they sent
Thomas:I agree with you. It's not, it's not an
Justin:they sent their tech director out there to tell everyone: no, we made this, and it was not a backdoor.
Thomas:But, like, the SIGINT Directorate was like, oh, they made this thing, right? Like, IAD came up with this. Don't check me on the group names, right? Clyde Frog,
Justin:No, no, no. It's more than that. SID doesn't tell IAD what to do.
Thomas:They didn't have to. IAD made it for legitimate reasons. IAD came up with a key escrow random number generator for wholly defensible and legitimate reasons, right? All SID had to do was notice that they did that and then take advantage of it.
Deirdre:Mm hmm,
Justin:but then your argument is that they didn't backdoor the standard, because the standard's work all comes
David:Are we just having a debate on the semantics of the word backdoor?
Matt:Yeah. Somebody backdoored the standard. We're debating about who it was.
Thomas:I think the step past that, in terms of intentionality, is that the impetus to create it as a NIST standard was primarily there to subvert cryptography. Right? You know, whoever told Dickie George to go talk to NIST and convince them to put this in the standard, whoever told Dickie George to do that, their goal was to recreate key escrow in 2005.
Justin:Yeah, I don't subscribe to that in the slightest. No, in fact, that's what I'm arguing against.
Thomas:I can see that we disagree on this point. I'm just saying that that's the point, right?
Justin:Okay. I,
Thomas:I'm still hung up on a Juniper point here, right? What I want to hear from Matt is, so, first of all, Juniper is the biggest smoking gun for me, right? But, like, that second random number generator thingy that we're talking about there, like, NetScreen, not Juniper, but NetScreen, had to go build that, right? Why did NetScreen build that?
Matt:You got me.
Thomas:NSA didn't build it.
Matt:No, they didn't. But here's the thing, once you've built a standard, all you need is one engineer. You need one engineer who is willing to work for you and put a piece of code for entirely defensible reasons into a product. And if anybody asks, why are you putting that standard into a product? You say, how could it possibly hurt anyone? It's a NIST certified government
Thomas:Hold on, I agree with all that stuff, but we're still talking past each other, right? At some point, that global variable in the broken for loop had to be introduced into NetScreen's code. How did that come about?
Matt:Someone worked there and obviously used a, you know, name collision for a global and a local variable. And you could say it's an accident. It's entirely defensible. And if somebody had found it, they would've fixed it and said, whoopsy-daisy. But nobody found it. And as a result, there was an exploitable vulnerability in every NetScreen firewall from 2008 to 2015.
Thomas:Interesting. Okay, so you're saying that, like, the original NETSCREEN backdoor was just that they used Dual_EC. And then after that
Matt:They used Dual_EC, and then disabled the second random number generator. So it didn't run. So it didn't
Thomas:So it's just, like, a stroke of good luck for NSA, for TAO, that the NetScreen firewalls.
Matt:There's no luck in that. This
Thomas:Well, hold on there. In your story, it has to be luck, right? Because, like, a NetScreen engineer had to be enough of a doofus to use a global variable in a name collision to make that kind of thing happen. Otherwise the ANSI generator runs and breaks the backdoor.
Matt:Here's the thing. So what? So let's say they'd found the vulnerability. What would, what would Juniper have done? Would they have panicked and told all their clients there was a serious vulnerability and they had an emergency and they had to do an emergency patch? Or would they have
Thomas:That's what they do routinely, yes.
Matt:Would they have said: hey, whoops, we used one NIST-certified generator instead of the other NIST-certified generator. There is no vulnerability, don't worry about it.
Thomas:Right, no, I get that point, right? I'm with you on that, right? But like, still, NSA's access to those, to those firewalls or those VPN terminators, right? NSA's access to those NetScreen VPN terminators was predicated on that stupid bug being there. If the bug hadn't been there, it wouldn't have worked.
Matt:But the bug is easy to add.
Thomas:You lose me there, right? Like, I see that there's an innocent explanation for the bug. I'm with you on that, right? I get that you could add it. But you wouldn't bet on
Matt:Here's what I'm saying. There is no innocent explanation for Dual_EC being there. And the minute Dual_EC is there, the question you're going to ask yourself is, Huh, that's very strange that Dual_EC is there. There are two hypotheses here. One is that Dual_EC is there because somebody loves Dual_EC, and the other is because somebody's putting a backdoor in this. And then the next question you'd ask is, let's assume the second, let's assume it's a backdoor. There's no way that backdoor could be exploitable because a second random number generator cleans the output. And that second random number generator would break the backdoor. And immediately you would think the only way that second hypothesis could be true is if somehow, due to a very silly error, the second random number generator never runs. And when you look at the code, surprise, surprise, that's the case.
Thomas:Right, it sounds like you're making a case that NSA had some agency in getting the second random number
Matt:I think that NSA had one person who worked there who did them a favor. I really do. I can't prove it. I'll never be able to prove it. But it looks too deliberate. You know, it looks much too
Justin:I do feel like "I'll never be able to prove it" applies to this whole discourse. I'm not denying how busted it is. I don't know the history or reasons for that. And I'm sure they have, I mean, basically Dickie George said they have the secret criteria, right?
Matt:It's post-9/11. It's post-9/11.
Justin:I don't know what that means, "it's post-9/11."
Matt:It means, okay, you're as old as me, and maybe some of you are blessed to be a little younger, but you remember how crazy things were after 9/11.
Justin:These people weren't even born when the STU-IIIs shipped, right? And this is, as far as we know, this is the cryptography that we are talking about
Thomas:That's just David.
Matt:It's post-9/11, it's post-9/11, the war on terror is going on. It felt like an existential thing, it really did for a few years. And this is the exact time period we're talking about. You've got terrorist organizations, you've got people like Hamas and Hezbollah, et cetera, all over the world, who are communicating with banks and so on, moving sums of money, and the banks are using firewalls, commercial products. And you're thinking to yourself, literally the difference between an American city going up in a mushroom cloud and not might be my ability to access that firewall traffic I'm intercepting over that cable. And you think they're not going to do this stuff?
Justin:You're telling a story. I don't want a story.
Matt:I have a document. I have a document that
Justin:You have a document! You have a document from an entirely different organization, a totally separate part of the agency. This is the thing.
Matt:Is 250 million a year, in NSA terms, a small program or a large program?
Thomas:It's a medium program.
Justin:This is the thing. You are looking at something from the outside that you know nothing about. You've already made your decision on the intent. You go in knowing the intent, and you're putting a story together to fit that.
Matt:No, I mean, I'm looking at, for one thing, a series of documents that say they specifically altered this specific standard, right? The documents don't just say we have a program to alter standards, they say this one, with Dual_EC in it, in a top secret line,
Justin:Wait, what, what,
Thomas:Just because I'm nerding out on this one specific point: is 250 million a big program or not? What military vehicle do we pay for that
Matt:Is it small? Is it small? It doesn't have to
Thomas:It might be, I don't know. Is it, is it? I don't know.
Matt:Is it like two guys in a room with no
Deirdre:An F-35 is like a hundred million dollars or two hundred million dollars or something like that, but that
Thomas:It's a real program. Yeah, it's real.
Matt:Would you miss it? Could it happen accidentally? And you'd be like, whoops, we didn't even know that was happening.
Thomas:It's not a side hustle. Well, I have one other big thing I want to hit before we let this spin out of control again, which is just, in the grand scheme of things, if you're coming at this like a TAO operator, right, or any CNE person, a serious exploits or vulnerability person, or whatever, if you work for Azimuth, that kind of thing: is this a good backdoor? Is this effective? Is this giving them a capability that is really, really important in the world compared to what they had before? I know that you're going to say yes, right? And I believe it's a backdoor, right?
Deirdre:And I feel like it's fair to ask that question, because at least one independent operator went and switched out those P's and Q's for some reason, and they targeted this for some reason, as opposed to, I don't know, shimmying in some other infiltrate-y
Justin:That's what they targeted after the issues with it had been disclosed, right? So at that point you're exploiting an N-day vulnerability.
David:The issues with it were disclosed before it was standardized.
Thomas:Yes, I mean, the exploit for it was like, it's a pretty
David:A '99 paper and a 2000 standardization. Like,
Thomas:But it's also like, there's an elegance to the idea that it's a NOBUS backdoor, meaning only NSA can actually exploit it. But it's kind of a weird border area in terms of a nobody-but-us capability, because it's a random number generator, there's no way to verify that you're running the right points anyway. So other people can, in fact, exploit it if you swap the points out. It's, I agree, it's a NOBUS
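[Editor's note: the point-swapping attack Thomas is gesturing at is easy to demo in miniature. The sketch below substitutes a multiplicative group mod a prime for the real elliptic curve and skips the 16-bit truncation of actual Dual_EC outputs, so all parameters are illustrative, but the algebra is the point: whoever knows the secret scalar d relating Q to P can recover the generator's next state from a single raw output block.]

```python
import math
import secrets

p = 2**127 - 1            # a Mersenne prime; Z_p^* stands in for the curve
P = 5                     # toy "base point"

# Whoever picks the points can embed a trapdoor: choose d, publish Q = P^d.
while True:
    d = secrets.randbelow(p - 1)
    if math.gcd(d, p - 1) == 1:       # d must be invertible mod the group order
        break
Q = pow(P, d, p)

def dual_ec_step(s: int):
    """One Dual_EC-shaped step: next state comes from P, output comes from Q."""
    return pow(P, s, p), pow(Q, s, p)

s = secrets.randbelow(p)              # the generator's secret internal state
s_next, r = dual_ec_step(s)           # r is what an eavesdropper sees on the wire

# The point-holder's attack: r = Q^s = P^(d*s), so r^(1/d) = P^s = next state.
d_inv = pow(d, -1, p - 1)
assert pow(r, d_inv, p) == s_next     # full state recovery from one output
```

[In the real generator an attacker also has to brute-force the 16 bits truncated from each output block, which is why seeing more contiguous raw output, as with ScreenOS's large VPN nonces or TLS extended random, makes the attack so much cheaper. And since any d works, anyone who swaps in their own Q gets the same capability, which is the hole in the NOBUS story.]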
Matt:way. The
Deirdre:looking at the numbers coming out? Yeah.
Thomas:Yeah, it's true.
Justin:Wait, the answer is FIPS?
Thomas:So, Dickie George would say that when he brought this to NIST, he said, standardize this, but let people use their own points, and then NIST and FIPS came back at them and said, no, it's gotta be these specific points, which is just like another nail in this whole thing, right? Like, why
Deirdre:Well, from when I watched him, they stuck with those points, not necessarily that it has to be only those points with no availability for other points. Which is part, in my perspective, in my narrative, of the negligence: the unfortunate not looking beyond just the thing in front of you and seeing the consequences of only allowing the points that had been shipped over from the NSA and not allowing any others, especially when those points had no explanation of how they were generated other than, we did it securely according to our secret, you know, parameters.
Matt:What I think you're missing: there are three levers at play. The first is the standardization, which includes just the default points. The second is FIPS certification, and who does FIPS certification? Well, it's the Cryptographic Module Validation Program, which is run out of the NSA. And they can decide whether they actually certify anything. It is. At least it was back in the 2000s, it was an office at Fort Meade.
David:it's not anymore, but it
Thomas:I mean, Nate Lawson was a FIPS validator
Matt:No, no, no. There,
Thomas:serious crypto, and he didn't
Matt:CMVP sets the standards for the labs. They are the ultimate controller of the labs, and CMVP can decide: hey, have we ever certified a Dual_EC implementation with alternative points? And the answer is no, they never have. The only points they ever allowed for certification were the canonical P and Q. The third lever they have: they have developers, and contracts with organizations, where they can say, implement this generator. And those three things together are a lot more powerful than any one alone.
Justin:Wait, take a step back to that one, the, they never certified anything else. This is sort of the textbook example of what sucks with government contracting, because you write something down once, and even if, when you write it down, you say, this is not mandatory, this is just an example, everyone from then on treats it as mandatory. You don't have a choice, right? So Dickie George's argument was that they didn't care what the points were, they didn't even want to supply certain points, but NIST said, no, we can't do it without you supplying something. Okay, fine, here, use these, we don't care. But the problem is, now it's written down, and everybody's gonna use the same thing. So, for you, that's a backdoor argument. For
David:He did say the government stuff was gonna use those government
Matt:Yeah. I mean, I think the argument was there were a lot of legacy customers, aka government customers, in BSAFE that were using the original points, and I think that there was some pressure to use the original points. And I don't know, maybe NIST dropped the ball. It's possible NIST dropped the ball. In retrospect, I think you could make a case: they said you could use alternative points, but I also think that they knew the momentum was on their side, and alternative points wouldn't be used, and they were
Justin:You're conveying intent. You're assuming intent when you say that. Like, I think it's hard to have an actual discussion about this without going in immediately assuming intent.
Thomas:Be careful, because he's not necessarily saying intent about IAD, just somewhere else in the
Justin:Yeah, but structure, like, organizationally, somewhere else in the organization still doesn't make sense.
Deirdre:As a person who's blocked on generating test vectors for a completely open standard right now, I understand: once you have test vectors and interop parameters, you're kind of done. You're like, do I have to go through all this work to generate even more? What if I don't? Everyone's fine with that? Cool, great, ship it. I completely understand that scenario happening for the people at NIST, given parameters handed to them via, you know, Dickie George.
Thomas:I also, Justin, I feel like I have some empathy for the position you're in here, because I picked a king-hell mess of a fight with Matthew and with Trevor Perrin about extended random, where, in my case, it was mostly, like, you're saying Dickie George or IAD didn't do any of this stuff maliciously, and when I was talking about it, I still mostly, I still entirely believe this, right? But my whole thing with it: there's a follow-on standard called extended random, there's like six different permutations of it, that makes Dual_EC exploitable in a conventional TLS setting, right? And ekr was involved in the standardization of it, and when the first extended random stuff came out, people were like, well, you know, how much culpability did ekr have in that? Which set me off and all that, right? Like, I have some empathy for the position you're in there. But all you have to say is, okay, whoever you're saying was good was good, right? Whoever you like in this situation, they're fine. We're just saying there's some other evil person out there that you don't know of.
Matt:I don't think they're evil. I think they were trying to save the world. I think they were trying to save the US, but they, they really screwed up.
Thomas:We got off on a really good tangent about the, is this a good backdoor thing, but I still come back to: even in 2006, was Dual_EC an effective internet-wide mass-dragnet cryptography backdoor? Because that's certainly what, that's,
Matt:I don't agree. I don't agree. Look, here, here's the situation that the NSA was in, in the 2000s.
Justin:By the way, I was there at the period we're talking about.
Matt:Okay, but, but, and I think, I think that's, I think that's relevant,
Justin:The story that you are pinning to, I'm like, I mean, I was there, and it doesn't, okay.
Thomas:It's like if you were at Google at some point, just because you were at Google doesn't mean you know everything that was happening at Google.
Matt:The NSA saw themselves in a situation where encryption was not very widely used yet in the early 2000s, was slowly becoming more widely deployed, and was starting to block capabilities. We agree on that, right? Right.
Thomas:Yeah, of course you do, that's just 'going dark'.
Justin:but, no, I disagree with the
David:To be fair, like, Dickie George says this as
Matt:this is in
Justin:with the notion that it was an issue in the early 2000s. It became an issue,
Matt:It was starting. I think it was people who were prescient, looking forward and seeing TLS and encryption as being an issue. IPsec was starting to come
Justin:This is where, alright, I'm gonna say I disagree, but I can't get into the details of why I disagree.
Matt:Okay. But I think people saw it. Maybe they looked forward a decade, saw this being a potential issue, and said, we need to develop a whole bunch of strategies to deal with this. And I think their strategies broke up into three different branches. One was people doing exploits, and I think that's been very successful, I think we can all agree on that. The second one was looking for vulnerabilities, sort of a passive exploit thing. And the third, which unfortunately we do have evidence for, because we have a document that says they did this, was: let's try to build backdoors into commercial crypto systems. The stakes are so high, we need to do this. And maybe at the end of the day it actually didn't turn out to be as useful to them as they thought it would be, but they certainly tried it.
Thomas:Justin, I can see the, I can see the look on your face right now, and I just want to say that based on the rules that we agreed to before we started this podcast, the terms of engagement here, based on the evidence that I have heard in this conversation, I have determined and I'm now ruling that that document that Matthew Green has is real.
Justin:I'm not saying it's not real. I agree that it's real. I don't know the provenance of it. I don't know if it was somebody making a proposal, a proposal that nobody ever acted on. Alright, so, I'm comfortable talking about other employers, right, there's no risk of me going to jail over revealing details, and I will say it: at other past employers, non-government employers, I saw documents leaked, proposals that nobody was working on, somebody just wrote the proposal up, but it ended up getting leaked, and it was treated as, oh, this is something they're absolutely doing, and it was like, no. And as I said, I also know people in other contexts, this has not happened for anything I've worked on, but I know people in other contexts who were victims of hack and leak, and the leaks had minor alterations to documents, little tweaks, et cetera, et cetera. And this is why the provenance of these, like, as soon as the hack and leak stuff started, I was like, no, this is not good. And I am
Matt:This is not hack and leak, though. This is not
Justin:Oh no, wait, how is that not hack and leak? He stole credentials! He stole credentials
Matt:this, these
Justin:in order to get the documents! It was still a
David:It was insider hack and leak.
Matt:this did not pass through, say, Russian intelligence, and,
Justin:No, no, wait, wait, no, he went to China and then delivered it to Russian intelligence.
David:Yeah, but afterwards,
Matt:and secondly, this was published in
Justin:Wait, you're not trying to argue that Snowden doesn't, for all intents and purposes, present just like a foreign intelligence asset, right? I don't know if that was his intent, but
Matt:I'm trying to argue two things. One is that this did not go through some foreign intelligence agency. And secondly, this was reported in the New York Times. And third, the NSA was given copies of these documents, and they were asked to comment and make corrections. And at that point they could have said, these documents are forged. And they did not.
Justin:Yes, thank you. This actually gets back to a really important point that I was making earlier, which is, the NSA, in this case, they could have come out, they could have said something. They didn't. They're like, screw it, we're just leaving it be. But you know what they did say? They actually sent somebody out to say, no, really, this was not a backdoor, Dual_EC_DRBG was not a backdoor. So you're saying if they had denied the documents, you would have trusted them that time, but you don't trust them on that?
Matt:I would have trusted, and, actually, no, I wouldn't have trusted anything.
David:Based on this podcast, retired former NSA employees will say anything.
Matt:You may need to plug your ears for just ten seconds. The fiscal year 2011 actual funding for this program was 298.6 million. Fiscal year 2012 enacted was 275.4. And the fiscal year 2013 request was 254.9. It had a headcount of 144 full-time-equivalent positions, which went down to 141 in 2013. So over three years there were over 140 people working on this thing, with funding that lasted over those three years. It was not some idea that somebody had, or a request that was made but never enacted.
Justin:So you're just talking about like money and personnel stuff,
Matt:You made the point that maybe this was some idea somebody had and it didn't
Justin:No, no, that's not what I said. No, I said that maybe the document that you're looking at was just a proposal specific to what,
Thomas:Right, but it describes a funded program.
Justin:No, no, does the whole thing describe a funded program? Or is there, like, this is the thing, that's how stuff gets written up. Like.
David:It might not be the entirety of the program, I believe, is your
Justin:Whenever I saw someone putting forth a proposal or, like, trying a program, it didn't seem like... I mean, attaching dollars is something different,
Matt:I'm looking at a budget table that says TOP SECRET, SI, TK, NOFORN at the top, and then it talks about the computer network operations. This is unclassified: Computer Network Operations, SIGINT Enabling Project, and then everything else is
Justin:Well, no, but this is my point, you're,
Matt:By the way, this is
Justin:like you're treating a bunch of documents in the same way you were treating SID and IAD as interchangeable. It sounds like you're treating different documents that say different things as interchangeable.
Matt:No, this is one document, and then below it it says, project description. I already read you that paragraph, but I don't know if you heard it. It says exactly what the money is designed to do. It's really unambiguous. I understand your feelings about this classified document, but my feeling is, once it's been published in the New York Times and 11 years have gone by, you have to just be willing to read
Justin:Yeah, I just avoided it because, in particular, I'm doing this podcast right now where I'm trying very carefully not to talk about anything. So I'm intentionally avoiding any of the things where I could step on myself,
David:Alright, on that note, I think let's kind of close this out by, well we're gonna do two things. We're gonna go around the horn first, and then we're gonna have Thomas adjudicate. But first, um, we're gonna go around the horn, and we're gonna say, on a scale from let's say 7 to 24, where 7 is an absolute impossibility and 24 is total metaphysical certainty, Uh, what is the likelihood that the standardization process was a specific attempt by a part of NSA to backdoor a standard? And we're just gonna go around the horn. That's 7 to 24, where 7 is impossible and 24 is absolute metaphysical certainty. Justin,
Justin:Obviously, I was a seven, and I resented having this argument with four cryptographers.
Thomas:And David,
David:Uh, the correct answer is 18.5. Thomas?
Thomas:I'm sure that Dual_EC did make things pretty hard for people doing cryptographic standards. I am not so sure that there's a lot of evidence that that's a bad thing.
Justin:Secret hero.
Thomas:like, things could be easier, and like, it could be easier to deal with NIST and have people trust NIST. I'm not sure why that would be a good thing.
Justin:You know, when I was at NSA working in IAD, I had to go give a talk on something at NIST, and I didn't realize until after I was there, getting yelled at on stage by a bunch of NIST people, that they really just wanted someone to take flak for their password policy, which had literally nothing to do with me at all. I was set up as, like, a sacrificial lamb. It was a weird experience.
Thomas:Well, I mean, to wrap this whole thing up, I think we can all safely conclude that Justin is wrong.
Justin:You know, here's the thing. Actually, wait, I got one. Can we make a bet? The problem is it'll be 40, 50 years before we get the answer, but who wants to make a bet? How much? How much do you want to bet? Inflation-adjusted, because we're not betting, like, money now. How about a thousand dollars? I'll bet a thousand dollars, inflation-adjusted to when these documents can eventually be FOIAed, that it'll turn out, no, it's just really, really ugly legacy garbage, and not a backdoor.
Thomas:Oh yeah, no, I'll bet. I'll take that bet.
David:Yeah, I think this is an undecidable question.
Thomas:I think it's a pretty safe bet, in that either a set of smoking-gun documents is gonna come out about how they literally operationalized this, or nothing will come out. So there's really no way for me to lose the bet? Are you gonna put a time limit on it? Are you saying nothing will come out in 40 years, and if nothing comes out, I lose the
Justin:bet? See, this is the problem. This is, this is why I'm screwed on this one.
Thomas:It sounds like you're saying either a set of documents comes out proving that it's not a backdoor and it's just legacy cruft, or a set of documents comes out proving that I'm right, and I'll take that bet.
Justin:I mean, that, that's basically the document. The
Thomas:If I lose, if no documents come out, then I'm not betting anything.
Justin:No, no, I'm not betting anyone on there being documents. No one should
Thomas:Okay, yeah, obviously I take that
Justin:The absence of proof is not proof.
Thomas:Yes, okay, this is very good. I have the opportunity to win money and no opportunity to lose money. This is great.
David:Luckily, that bet will be able to be paid off with just one copy of the Art of Software Security Assessment.
Thomas:I'm holding onto my copy, which will only appreciate in value, so.
Justin:this would have felt better, though, if we could have gotten, like, someone else who'd ever spent time in an intelligence agency.
Thomas:We feel for you, Justin. I actually really appreciate you taking this side. There'd be nothing for us to talk about with Dual_EC if you weren't here.
Justin:Well, nobody's talking about it anymore. It was 10 years ago. It's over.
Matt:So, just for the record, I strongly disagree with you, but I completely respect your position on this, and I hope we don't fight too much on, uh, Bluesky or Twitter or wherever we are now.
Justin:I try not to fight online anymore, I just dodge at this point. My fighting-online days are gone.
David:He says while currently fighting online.
Justin:No, any time somebody got argumentative, I just disengaged. That's
Thomas:That's what I noticed. What I noticed in this podcast is that anytime anybody got argumentative, you just disengaged.
Justin:Online, I didn't say here. But Matt, I did want to say: none of this in any way disagrees with your work or the technical capability or anything else like that. I agree with you, it sure as hell looks like a backdoor. We just disagree on the intent and whether anything like this could have ever been operationalized.
Thomas:Well, I'm posting online the series of screenshots I took of David Adrian's face during this whole thing. You brought him so much joy. So much joy.
David:Yes, this was, this was great. Um, and on that note, I would just like to remind everybody that you can buy merchandise at securitycryptographywhatever.com, and that Security Cryptography Whatever is a side project from Deirdre Connolly, Thomas Ptacek, and David Adrian. Our editor is Nettie Smith. You can find the podcast online @scwpod, including on Bluesky, and the hosts online @durumcrustulum.com, @tqbf, and @davidcadrian.com. If you like the pod, give us five-star reviews wherever you rate your podcasts. Thank you.