End of Year Wrap Up

David and Deirdre gab about some stuff we didn’t get to or just recently happened, like Tailscale’s new Tailnet Lock, the Okta breach, what the fuck CISOs are for anyway, Rust in Android and Chrome, passkeys support, and of course, SBF.


This rough transcript has not been edited and may have errors.

Deirdre: Hello, welcome to Security Cryptography Whatever. I’m Deirdre.

David: I am David. We don’t have Thomas today, but I would like to note that Michigan football is undefeated against Ohio State since we started podcasting. So I’m gonna take full credit for that.

Deirdre: Cool. We should have gotten Chris Peikert on for an end of year holiday call to celebrate another win, cuz he was on last time they won.

David: standardization.

Deirdre: Right. This is basically like a catch-up, holiday, not even a mailbag, just a "what else did we not talk about this year on the pod" episode. And one of the things we didn’t talk about is, uh, the NIST post-quantum KEM, key encapsulation mechanism, contest came to a close, and the winner, shitty drum roll, was Kyber, uh, surprise, surprise, a lattice-based KEM. However, there were some patent issues or licensing issues, and so a very particular parameter set of Kyber has been, uh, licensed to whatever, the community, the world. But all other parameter sets for using Kyber are not available, and people are a little bit unsure about that.

It seems

David: like nobody is going to use Kyber unless they resolve this licensing issue, but, uh, that is kind of like their entire job, so, they’ll probably get it done, albeit slowly.

Deirdre: NIST’s job is to make sure that people can use the, the cryptography that they

David: The things that they standardized, yeah. It's to find something and make it standard and open.

Deirdre: It’s like you have one parameter set, one very narrow parameter set. You’ve got no room for error.

David: mm-hmm.

Deirdre: But, uh, yeah. Yay, Kyber.

David: Congratulations to the Kyber team, whoever they may be. I don’t actually know.

Deirdre: I forget who was on that team. Like I know, I know a

David: We’ll look it up and put it in the show notes.

Deirdre: Uh, I know a lot about the, the dead and dusted. I saw the SIKE team.

David: Mm-hmm.

Deirdre: but not, not much of the, the other teams.

David: Mm-hmm.

Deirdre: what else?

David: I will say, on the post-quantum stuff, the good news is the KEMs we will be able to, like, integrate into TLS, modulo deployment issues of, like, things on the internet don’t necessarily like big ClientHellos, and the Hellos get a lot bigger with these schemes. But aside from that, there’s no fundamental reason that it can’t go into TLS, whether that is Kyber or another lattice.
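(For a rough sense of why the Hellos get bigger, here is a back-of-the-envelope sketch. The sizes are the published Kyber768 and X25519 public sizes; the exact overhead in a real handshake depends on the hybrid group and TLS framing, so treat the totals as illustrative, not definitive.)

```rust
// Back-of-the-envelope: how much a hybrid X25519+Kyber768 key share adds
// to a TLS handshake, compared to X25519 alone. Sizes are illustrative.
fn main() {
    // Classical key share
    let x25519_share = 32; // bytes, each direction

    // Kyber768 (now ML-KEM-768) public sizes
    let kyber768_public_key = 1184; // bytes, sent in the ClientHello
    let kyber768_ciphertext = 1088; // bytes, sent back in the ServerHello

    let classical_total = x25519_share * 2;
    let hybrid_client = x25519_share + kyber768_public_key;
    let hybrid_server = x25519_share + kyber768_ciphertext;

    println!("classical key exchange:   ~{} bytes total", classical_total);
    println!("hybrid ClientHello share: ~{} bytes", hybrid_client);
    println!("hybrid ServerHello share: ~{} bytes", hybrid_server);
    // The ClientHello alone grows by more than a kilobyte, which is what
    // trips up middleboxes and implementations that assume small Hellos.
}
```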

Deirdre: Right.

David: Signatures, on the other hand, are just not feas—, like, there are too many signatures in TLS. It’s like one to two kilobytes per post-quantum signature, something like that, I’m not quite sure, and there’s like 18 or 20 or 40 signatures in the TLS handshake, but not actually quite that many.

But between CT, like, you know, each SCT has, uh, a signature from one of multiple CT logs, signatures over the handshake, signatures in the certificate chain, um, signatures everywhere. So the approach to TLS for the last few years has kind of been: we will simply solve a problem by slapping another signature on the handshake, cuz they’re small and quick.

Deirdre: Oh,

David: But in the post-quantum world, signatures are neither small nor quick. I don’t know, maybe they’re still quick, I’m not actually sure. They’re definitely not small.
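(To put rough numbers on that: the per-signature sizes below are the published ones for Ed25519, Dilithium2/ML-DSA-44, and Falcon-512; the "five signatures per handshake" figure is purely an illustrative assumption, not an exact census of any real handshake.)

```rust
// Rough arithmetic on signature bytes in a TLS handshake, classical vs
// post-quantum. The "five signatures" figure is an illustrative assumption:
// e.g. two SCTs, two signatures in the certificate chain, and the
// CertificateVerify signature. Real handshakes vary.
fn main() {
    let signatures_in_handshake = 5;

    let ed25519_sig = 64; // bytes
    let ecdsa_p256_sig = 72; // bytes, roughly, DER-encoded
    let dilithium2_sig = 2420; // bytes (ML-DSA-44)
    let falcon512_sig = 666; // bytes, roughly (variable length)

    for (name, size) in [
        ("Ed25519", ed25519_sig),
        ("ECDSA P-256 (approx)", ecdsa_p256_sig),
        ("Dilithium2 / ML-DSA-44", dilithium2_sig),
        ("Falcon-512 (approx)", falcon512_sig),
    ] {
        println!(
            "{name}: ~{} bytes of signatures in the handshake",
            size * signatures_in_handshake
        );
    }
}
```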

Deirdre: Can I interest you in a small and somewhat quick isogeny-based K—

David: I

Deirdre: signature?

David: Still

Deirdre: EVAs

David: still think it’s way too freaking big. Um, right, like we’re talking 64 bytes for like an Ed25519 signature?

Deirdre: That’s true. Hold on. SQIsign size. The caveat being that, well, would you like a, would you like a signature of 200 bytes?

David: That’s not

Deirdre: to a kilobyte. There is a post-quantum signature competition that is going, um, still, still early-ish, not related to the KEM one that just finished.

And there is possibly another, like, lone isogeny candidate. But, again, one of the attractive parts of isogeny-based cryptography is that it tends to have very small public data. Um, so like the public keys are small, the signatures are small. It seems to be fast: two seconds for signing, 50 milliseconds for verifying, for these out-of-date metrics I’m looking at.

David: Two seconds for signing is not fast, but

Deirdre: No, but for isogenies, that’s really fast.

Um,

But the downside is it’s not just isogenies, it’s got, like, quaternion algebras and other complex stuff.

David: You hate it when your TLS ends up in gimbal lock. That’s why you’ve got quaternions in it, right? Like

Deirdre: Yeah. It’s complex, and like even, uh, even some very in-the-weeds mathy people are like, yeah, this is, this is complicated to implement. So it might be small and it might get faster, but it might be an unattractive implementation target for, uh, people just trying to deploy the software.

We’ll see.

David: The good news is that signatures, like, they just don’t matter, in the sense that they don’t matter until there actually is a quantum computer. Whereas you could argue that KEMs matter now, because you could be recording all the traffic and saving it for when you have a quantum computer. Um, I’m not sure I entirely buy that argument, but like,

Deirdre: the signatures matter, like right now when you’re making the connection, and then if you didn’t, you’re just like, eh. Oh, okay. Whereas with the, the

David: well, they only matter during the connection like if a quantum computer actually exists, whereas like you’re, right now, your traffic could be being recorded to be saved, to be decrypted later by the quantum computer when it exists, but it doesn’t matter how you signed it

Deirdre: Yes, like, you could record the whole, uh, handshake and all the keys that you exchange, and then just save them for later, along with all the traffic that you encrypted after that. But for the signatures to be vulnerable, you have to have, like, an online quantum computer.

Uh, issuing signatures or cracking signatures, like, online. And we think that even if that were to happen, that would be a stupid use of your expensive quantum computer.

David: And also, I think, like, the various parts of the US government, including some well-known academics, did a bunch of studying and released a report that was basically like, we would see more second-order effects if a quantum computer existed. Uh, the subtext being that if China somehow developed a quantum computer, like, the US would know.

Deirdre: Yeah, yeah.

David: And there’s no evidence that the incremental steps that would need to happen to

Deirdre: yeah. Sort of reminiscent of when, um, all the academics stopped publishing about nuclear fission research in the forties, and everyone was like, wait, what happened? Where did they all go? Why did they all go quiet? It’s like, oh, they went to a place in the, in the

David: They went to Los Alamos, which let me tell you, you should not drive to Los Alamos in a rear wheel drive car in the winter, without first checking the weather, lest you need to turn around and get out of Los Alamos very quickly. it turns out that Los Alamos is at the top of a mountain, inside of a valley, um, which in retrospect makes a lot of sense.

Yeah.

Um, and there’s just a single two lane road up there. Um, and I drove up there, realized it was about to snow, and was like, I’m turning this Miata around and I’m getting out of here.

Deirdre: Oh my God. "Don’t make me turn this Miata around." God. Okay, cool. On to not-post-quantum things: things that, uh, got published recently that are related to things we’ve talked about this year. Tailscale released a new thing that they’re calling Tailnet Lock, I think, um, which is actually kind of related to solving a problem that we talked about two episodes ago, which is that it’s all great if you have your traffic end-to-end encrypted or your messaging end-to-end encrypted.

What really matters is who’s controlling the keys to those ends. And so Tailnet Lock is basically, when you add a new Tailscale device, you have a key pair that goes with your WireGuard connection. Previously, they would just upload the key, the public key, to the control plane or whatever they call it, and the control plane could just see all those keys and, you know, it could in theory just add more keys and just say, ‘Hey, connect to this other end of your Tailnet.’ Um, and that was allowable. And now they’re basically doing a thing where every new key that gets uploaded has to be signed by another trusted device, and if the signature doesn’t verify, then you just don’t get access to that.

And so it’s avoiding compromise of your ends, of your end-to-end encryption, from the, like, control service itself. The devices that you control and you already trust are the only ones that can add new devices, or I think remove devices, I haven’t looked that far into it, from who you can connect to, which is neat.
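(As a sketch of the rule being described, not Tailscale’s actual implementation, data structures, or wire format: a new node key is only accepted if it carries a signature that verifies under a key already in the locally trusted set. The SignedNodeKey struct and accept_new_node function below are made up for illustration; the signature types assume the ed25519-dalek 2.x API.)

```rust
// A simplified model of a "tailnet lock"-style rule: the control plane can
// hand you a new node key, but you only accept it if it was signed by a
// device key you already trust. Not Tailscale's real design, just the shape
// of the check. Assumes ed25519-dalek 2.x.
use ed25519_dalek::{Signature, Verifier, VerifyingKey};

/// A new node key plus a signature over it, as delivered by the
/// (untrusted) control plane. Hypothetical struct for illustration.
struct SignedNodeKey {
    node_public_key: [u8; 32],
    signature: Signature,
}

/// Accept the new node key only if some already-trusted signing device
/// vouched for it. If no trusted key verifies the signature, reject it,
/// no matter what the control plane says.
fn accept_new_node(update: &SignedNodeKey, trusted: &[VerifyingKey]) -> bool {
    trusted
        .iter()
        .any(|vk| vk.verify(&update.node_public_key, &update.signature).is_ok())
}
```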

David: Do we know what the interface for that looks like? Like

do you get like a push notification on it? Like, I wonder how,

Deirdre: I don’t know. They have a little diagram, but uh, I haven’t looked at the literal UX, and uh, I have to update my own setup over here cause I logged out of it. But yeah, I don’t know what it looks like, but, uh, I like their diagram. I like their diagram, and I like that they wrote up a white paper, but instead of it being a PDF, a literal white paper, it is a webpage.

And I appreciate that. Another cool part that I noticed was, uh, I think they were already using, cuz WireGuard uses, uh, Curve25519 key exchange already. And it’s like a well-established trick, you can kind of call it a trick, where you can take your, uh, Curve25519 key exchange keys and you can turn them into Ed25519 signing keys.

And you can use them to sign stuff with the same key pair that you would do either Diffie-Hellman with or whatever.
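(For the curious, the "trick" rests on the fact that Curve25519 in Montgomery form, used for X25519 key exchange, and edwards25519, used for Ed25519, are birationally equivalent, so a key on one curve can be mapped to the other. A sketch of the map is below; the sign of the Edwards x-coordinate still has to be carried or fixed separately, which is what schemes like Signal’s XEdDSA pin down.)

```latex
% Curve25519 (Montgomery form, X25519), over GF(2^{255}-19):
%   v^2 = u^3 + 486662\,u^2 + u
% edwards25519 (Ed25519):
%   -x^2 + y^2 = 1 + d\,x^2 y^2, \qquad d = -121665/121666
% Birational map between the two (sign of x carried separately):
\[
  y = \frac{u - 1}{u + 1},
  \qquad
  u = \frac{1 + y}{1 - y}.
\]
```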

David: believe it’s called XEd25519. X-E-D.

Deirdre: for Signal, it definitely is. I don’t know if they’re using it for this one, or at least for the signatures.

But uh, if you’re using Ed25519 signatures in a distributed setting that has to, uh, have independent parties verifying Ed25519 signatures, and they all have to agree on what a valid signature is, you should not use just any Ed25519 implementation that is compliant with RFC 8032, because that gives you a variety of implementations that, depending on how they conform to RFC 8032, may or may not agree on whether a signature is valid or not.

And if two ends of your Tailscale network do not agree on whether this thing that you added to your bag of keys is validly signed or not, then you will have a split in consensus of what is a valid bag of keys.

David: The good news is that, like, most of the time when you’re dealing with that, you just end up with, like, oh, validation failures. It doesn’t really mean, like, the whole security of the system is broken, unless you are also trying to rely on it as, like, a uniqueness property, like some early blockchains did.

Deirdre: Yeah. But if you just wanna, like, have some real clear rules about how to verify which Ed25519 signatures are valid, there is a handy-dandy document from the Zcash people about which signatures, uh, and, like, what points are valid, what keys are valid and all that, called ZIP 215, that has seen adoption because it is very clear and explicit about how to verify these signatures.

And so they’re using that. And so I was like, hey, I know that document.
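(As one concrete example of the kind of rule ZIP 215 and RFC 8032 make explicit: the scalar half S of an Ed25519 signature must be canonical, i.e. strictly less than the group order, otherwise signatures are malleable, and verifiers that skip the check disagree with ones that enforce it. A self-contained sketch of just that one check follows; it is not a full ZIP 215 verifier, which also pins down how non-canonical point encodings and the cofactor are handled.)

```rust
/// One of the checks that ZIP 215 / RFC 8032 make explicit: the scalar S in
/// an Ed25519 signature (the last 32 bytes, little-endian) must be strictly
/// less than the group order l. Legacy verifiers that skip this check will
/// accept malleated signatures that strict verifiers reject, which is exactly
/// the kind of disagreement that splits consensus. This is only one rule out
/// of several; it is not a complete verifier.
///
/// l = 2^252 + 27742317777372353535851937790883648493, little-endian bytes:
const L: [u8; 32] = [
    0xed, 0xd3, 0xf5, 0x5c, 0x1a, 0x63, 0x12, 0x58, 0xd6, 0x9c, 0xf7, 0xa2,
    0xde, 0xf9, 0xde, 0x14, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x10,
];

/// Returns true if the signature's S component is canonical (S < l).
fn s_is_canonical(signature: &[u8; 64]) -> bool {
    let s = &signature[32..64]; // little-endian scalar
    // Compare little-endian byte strings from the most significant byte down.
    for i in (0..32).rev() {
        if s[i] < L[i] {
            return true;
        }
        if s[i] > L[i] {
            return false;
        }
    }
    false // S == l is not canonical either
}

fn main() {
    // An all-zero signature trivially has S = 0 < l.
    assert!(s_is_canonical(&[0u8; 64]));
    // A signature whose S bytes are all 0xff is far above l.
    let mut bad = [0u8; 64];
    bad[32..].fill(0xff);
    assert!(!s_is_canonical(&bad));
    println!("canonical-S check behaves as expected on both test inputs");
}
```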

David: So what you’re saying is it’s only really an Ed25519 signature if it comes from the Montgomery region of MSR.

Deirdre: I wish, uh, what I’m saying is I wish that someone had, uh, seen these things in 8032 before everyone just went and had slightly different but conformant implementations of Ed25519 based on 8032. But, well, we’re learning. So, but this is very cool, neat to see from Tailscale, and I am always happy to not just have to always trust the server, even if the server is being operated by pretty trustworthy people.

We just would like to not need to trust you at all, cuz shit happens.

David: And since Tailscale came on the podcast in January, like three or four months later, they raised a hundred million dollars. So I think all, uh, WireGuard-derivative VPNs that have come on the podcast have now raised a hundred million dollars. So once again, we’re, we’re making waves.

Deirdre: Good business decision to come on the podcast. Yes. Well done to Tailscale. Big fans over here.

Let’s see, what else haven’t we talked about? There are beaches, but there are also breaches, like security breaches. There was like an Uber breach where they, like,

David: Yes. I believe someone bought, like, Uber credentials off of the gray market or black market or whatever, I think was the landing point. You know, like an initial access broker, I don’t know where they came from. Some, like, employee laptop credentials. And then they used that to get access to source, which they then used to find, like, admin credentials to the cloud account or something.

I don’t know. Like, they found credentials in source, they pivoted. Oh, they push-spammed somebody, and then they impersonated IT to trick them into accepting the push spams, even though they had been rejecting them. And then they were on, I think the notable thing to know is that the, um, the attacker was on Slack, like, actively interacting with Uber employees. And since then, the alleged hacker is, I believe, arrested in the UK? Maybe, like, a 17-year-old British kid, I think? A 17-year-old someone, allegedly. Um, so I don’t know, maybe don’t interact with employees on Slack if you’re, uh, breaching a tech company.

Deirdre: But how do you brag? How do you not just be like, yo, I am in your cloud.

David: this podcast condones doing financial crimes once you’ve breached into, uh, breached a company rather than just screwing around on Slack.

Deirdre: Just be, oh my God, uh, yeah, I don’t know. So like, did they have literally like a, you know, "are you trying to log in to AWS? Yes? No?" And they just got pushed to people’s devices and they

David: I don’t remember the specifics. It was one of those, whether it was Duo or something else, that, like, sends a request and you say no, but if you keep getting them, you might eventually say yes. But I, I think what was notable was they then did a spear phishing attempt where they pretended to be Uber, like, internal IT, and had that person tell the employee to accept the push.

They’re like, we’ll deal with it, like, accept it now and then we’ll do this. Or like, we’ll handle it and mark it as fraud or something. And, uh, so they accepted it, and I think that was the, perhaps the initial foothold, or maybe that was what got ‘em on the Slack. I don’t

Deirdre: Yeah. That’s impressive. Yeah. Like you don’t just have to trick them one time. You can literally DoS them and just overwhelm them and just be like, it’s fine. Just click accept. Accept. You’re like, please. Thank you.

David: This goes back to my longstanding policy of: you can’t be phished if you never check your email. And I think you can, uh, amend that with: you don’t have to worry about your Slack credentials being stolen if you don’t use Slack.

Deirdre: Yeah. Teams. The next thing will be Teams. I don’t know.

David: I don’t know. No one has successfully pivoted across an Active Directory-based network in the past. So.

Okta was breached at some point this year. I think that was this year, it was like January or February or something. And they had a really bad, they kind of denied it at first, or downplayed it, and then apologized later.

Deirdre: Yeah. I remember it feeling kind of like, really? I would’ve thought a little bit better of Okta. And then they finally kind of stopped downplaying it a bit.

David: Yeah. I feel like a lot of the people that I talked to never had a particularly high opinion of, like, Okta security, but used Okta anyway because they had the feature set that was needed. And so what I think was interesting about this breach was, like, everyone yelled at Okta about it, and I believe for good reason, I don’t remember the details, but no one was like, ‘Hey, you know, vendor X, your security’s like a piece of shit cause you use Okta’. Like, you just kinda get a free pass for using Okta.

It’s like a way, as, like, a head of security, to just dump the blame on somebody else. Cause at this point, like, no one gets fired for buying Microsoft, you know, no one gets fired for buying Okta, but also no one gets fired for buying Okta when Okta then gets breached either.

Um, so,

Deirdre: Like, this is supposed to be your job. It’s like, yeah, but also

David: Dump the blame through Okta, I guess, is the, the takeaway for security teams?

Deirdre: Oh God. Is there any proper competitor to Okta? Like, all I can think of is, like, you already have Google Workspace or you already have some other thing, some IdP, identity provider, and you just completely leverage that.

David: Yeah. I mean, Microsoft, Google, um, Duo to some extent, are, I think, the big ones. Maybe there’s Ping Identity. I don’t know, I don’t know much about them. Yeah, I mean, it, it depends. I

Deirdre: Hmm. Didn’t Duo get bought by Cisco?

David: Is owned by Cisco.

Deirdre: Okay. But it’s on, it’s still its own thing. It, you can still just do Duo, cuz I, I know it’s still hooked up into lots of things.

David: Yeah. So I mean, those are, those are options. The feature sets between everything vary. Um, I don’t think we necessarily want to go into a deep dive on picking an SSO provider.

Deirdre: Like, the principle makes sense, which is, like, not everyone can build their own, you know, fully stood-up two-factor, if you wanna do a good job with FIDO, with, like, all the recovery flows. Not everyone has the, like, staff and talent and can just do that. And so that’s why they exist, so that you can plug in all of that via Okta, and it usually works fine.

But I definitely thought that they were better?

David: Don’t know. I probably have like people like off to the side looking at me sideways already. Um, but, uh, I don’t know. The, the other thing to keep in mind is like, do you need to authenticate your employees or do you also need to like, authenticate customers? Like do you need a provider to do MFA for your customer accounts?

Like who can do that? all things to keep in mind.

Deirdre: Yeah. All right. There seemed to be, like, a running presence of CISOs getting into trouble this year, and not all of them were associated with Twitter.

David: Yeah. Um, so there was a Drizly consent decree from the FTC, but that was actually about their CEO. I don’t know that they

Deirdre: Oh, and Drizly is like the deliver booze to your

David: yes.

Deirdre: for booze, right?

David: or GrubHub for alcohol? Yeah.

Deirdre: booze.

David: Or Instacart for alcohol, I guess. And yeah, I believe that, like, the exec, someone on the executive team, maybe the CEO themselves, had like a seven-character password for GitHub with no 2FA, that

Deirdre: No,

David: they reused on other services, and that, like, got breached, and then they, like, didn’t do anything about it. Um, or something like that. Yeah. And then they, like, kicked the hackers out or changed the password, but, quote, according to the FTC, "failed to take steps to adequately address its security problems while publicly claiming to have put appropriate security protections in place." And two years later, um, a similar thing happened.

So they got a consent decree from the FTC and, uh, have to take some steps to improve security, and someone, presumably the CEO or someone else on the executive team, as part of those consent decrees, it usually ends up with personal liability,

Deirdre: huh? Yeah.

David: which is why they’re a big deal. Um, so like as long as you do the thing, you’re fine. But if you don’t do the thing, it’s not just like Drizly gets in trouble.

Deirdre: yeah.

David: It’s you, personally.

Deirdre: Is that what happened to Uber as well? Like Joe Sullivan, that guy?

David: So Joe Sullivan, who was the CISO of Uber, there was some sort of FTC investigation of Uber in, like, 2016, I believe. Um, I think it had to do with privacy protections. So they were being investigated for unrelated reasons, but, like, had reasons that they had to disclose things to the FTC, and during that time they were breached slash bug-bountied by somebody?

It was kind of a gray line between whether or not this was someone who was doing a bug bounty or someone that was extorting them, but they were like, okay, we’ll pay you like a bug bounty and make you sign an NDA and then this problem will go away. Um, but they apparently did not disclose it correctly and specifically didn’t tell the FTC about it.

Um, and then maybe, I don’t, I don’t know, might have lied about it after the fact, I’m not sure, to the FTC. And so Joe Sullivan, who I believe now was at Cloudflare, ended up being found, like, personally liable for, for handling this, because, like, on one hand it’s like, oh, we had a successful bug bounty. On the other hand, it’s like, you paid off some attackers and didn’t tell anybody. Um, and so I, I think the moral of the story here is don’t lie to the FTC.

Deirdre: Yep. Or they will find you personally, not your company, you personally, guilty of obstruction of the FTC, like a proper, capital-F felony. Um, which is

David: You can lie to your local cops, but you can’t lie to the FBI and you can’t lie to the FTC.

Deirdre: Yeah, it’s shut the fuck up Friday for your local little coppers. But if you’re under an FTC consent decree, don’t lie to the feds because they will convict you of lying to the feds.

Not just for companies that have FTC issues, but do CISOs do anything, or are they just a scapegoat, including for FTC issues?

David: Yeah, I

Deirdre: issues

David: mean, I think, like, you’re in this weird spot as a CISO. Like, do you have influence on engineering? Like, where, where are these things? Like, are you doing compliance? Well, compliance might not even be under you.

You, uh, do you have like product stuff to like make the product be designed and built securely?

Well, maybe, maybe not. I don’t know. And then like, you could, you could easily find yourself in the spot where like, I don’t know, maybe you have the security analysts or something, but like, uh, can you actually do anything to like, improve the, uh, security posture, or are you just being paid a lot of money to, um, have the personal liability from the FTC and, you know, hope that everyone else is, uh, uh, complying with your consent decree if you have one.

I don’t know. That, that is a failure situation that you could be in as a CISO.

Deirdre: I have a theory that CISOs make more sense if you are not a technology-first organization. If you are an insurance company that has technology, then a chief information security officer, or chief security officer, makes a little bit more sense. Cuz you have to kind of be in charge of a lot of things, like internal HR systems, but also, like, whatever software actually manages your insurance stuff.

I don’t know, whatever. But if you are a technology company, an engineering company, that is the thing, you know, that is your primary offering of value as a company, it might not quite make sense anymore. Like, you kind of want good security practices to be part of how you build your technology and how you run your service.

So having that be outside of the, I don’t know, technology org, engineering org, product org, however you set that up, might not make sense.

David: Yeah, like, if you kind of need to do technical security work, that’s probably gonna be handled by the technical teams who, like, understand the platform and can build security into the product, which is why you see, like, product-aligned security leaders, but they might not be at the C level, right?

So, which kind of means that if the CISO is migrating back towards the business side, then it’s not like, what are they doing, but like, are you starting to look perhaps more like a legal or, like, a CFO type person, and less like a, uh, a VP Eng type role? Um, we’ll see.

Deirdre: Yeah. It’s definitely, like, it really depends on how you build stuff, if you are an engineering-driven sort of company, but also what responsibilities you have. Like, there’s compliance and there’s audits, and like, if you’re processing payments, you have, like, a whole bunch of people that just have to be in charge of making sure that you’re PCI compliant and that you pass your required audits and stuff like that.

So there might be several people that, like, have something to do with quote-unquote security, but they maybe have a bunch of different titles because of everything that you try to do. But yeah, I don’t know.

David: and then if things go weird, you end up testifying to Congress. At least one CISO testified to Congress about Twitter this year,

Deirdre: Yeah. A guy named Mudge. That was interesting.

David: and uh, and we’ve all moved past that. Yeah.

Deirdre: Yeah. I, I don’t know how I feel about that. It was weird.

David: Let’s stop talking about boring organization stuff and let’s talk about Rust.

Deirdre: Ooh, yes. All right. You, you work at, you work at this company, so I’ll talk, cuz I’m, I’m excited. There was a nice blog post out from Google Security, blah, blah, blah, 1st of December, "Memory Safe Languages in Android 13". And just lots of, lots of nice pie charts, bar charts, measurements of lines of code, and how many memory safety vulnerabilities were found,

proportional to their code per Android release. We’ll link to the post in the show notes, but I’m gonna read some of these boldface lines from, from this, uh, this blog post: Android 13 is the first Android release where a majority of new code added to the release is in a memory safe language.

Aw, that’s nice. And some of those memory safe languages include Java, because it’s Android; Kotlin, which is new Java, basically; and Rust. We all love Rust. Another thing that they said is 2022 is the first year where memory safety vulnerabilities do not represent a majority of Android’s vulnerabilities.

Not just vulnerabilities in new code, not just in Android 13, but the first year where memory safety vulnerabilities do not represent the majority of Android’s vulnerabilities. And that is a big win, because in almost every large project, like a browser or an OS, almost always 50, 60, 70% of vulnerabilities are memory safety issues.

So basically Android is showing that all of those sort of wins that a lot of us have been trying to evangelize about memory safe languages such as Rust and like how to deploy them in some of these existing projects, that it will drive down your memory safety issues, is showing real fruit in a huge, a hugely deployed project and a very large project.

There’s millions and millions of lines of code in Android, and it’s so exciting to see. This was very nice to see. What else: "to date there have been zero memory safety vulnerabilities discovered in Android’s Rust code." That’s so sexy. "Historical vulnerability density is greater than one per thousand lines of code"

In many of Android’s C, C++ components, uh, such as Bluetooth, media decoding and things like that. "Based on this historical vulnerability density, it’s likely that using Rust has already prevented hundreds of vulnerabilities from reaching production." And I’m just like, I’m just, I’m not gonna read this whole thing,

David: Deirdre is

Deirdre: yes,

David: bouncing back and forth

Deirdre: yes, yes.

Like, this is it. This is the shit. This is exactly, this is exactly it. Let’s see, what about non-memory-safety vulnerabilities, you ask? Cuz you can still have logic errors, right? Yes, you can: a drop in severity. So, uh, "despite only representing 36% of vulnerabilities in the security bulletin, memory safety vulnerabilities accounted for 86% of our critical severity security vulnerabilities, uh, and 89% of our remotely exploitable vulnerabilities," blah, blah, blah, blah, blah. Basically, with the drop in memory safety vulnerabilities, we’re seeing a corresponding drop in vulnerability severity. So if you’re not introducing new memory-unsafe code or modifying memory-unsafe code, and all the new code that you’re introducing is memory safe, it turns out the vulnerabilities you do have are less severe.

So it’s just all these wins all the way down. Anyway, I’m, I’m not gonna read the whole thing, but basically the new code that they’re introducing is basically all memory safe. It’s a lot, it’s millions, it’s not small. It’s millions of lines of code in Rust, Kotlin, and Java, and they’re not touching old C and C++ code.

And that seems to match the expectation, uh, of experienced people in these sorts of code bases, where, if you don’t touch the older, kind of battle-tested C and C++ code, you’re generally not gonna, like, magically find new vulnerabilities in there. It’s if you’re modifying the C and C++ code.

And so you wanna try do new stuff in the memory safe languages. If you add or rewrite something, you want to do it in the memory safe language, and that generally will keep you in a good spot. And that seems to be playing out for Android at least.

David: Yeah, it’s not clear that the fact that, like, vulnerabilities were mostly in new stuff and not old stuff in Android would

Deirdre: Hmm

David: necessarily apply to other projects. Cause, like, another failure scenario, like, that’ll certainly apply to one class of projects. In another class of projects, vulnerabilities can stem from, like, older code that worked one way and had a mental model of the code, but that could not necessarily be enforced entirely by the type system, the compiler, et cetera.

But, you know, you reason through it, uh, it gets complicated, you figure out the right way to do it. And then in the future, a new feature comes in and changes, um, changes the constraints or changes invariants around that code. And then suddenly, even though you didn’t touch some of the old code, the way in which, like, if you have code that has a lot of callbacks, for example. Um, kind of think about the memory safety of projects like that being like JavaScript before TypeScript.

You change one part of the code and then suddenly a callback in a completely unrelated section is throwing errors. Um, so that, that is another failure mode that you can end up in, that results in memory safety vulnerabilities. Um, but, you know, it’s going to depend on, on your project. Now, does that mean that, like, you shouldn’t be trying to use memory safe languages?

No. Just that, like, you might see the, the impact of it play out differently as you start to, to add them, uh, in various parts of the project.

Deirdre: That’s wild. Like, the, the JavaScript example is sort of like, yep. Like, I can totally, like, I remember that, where I changed something and something in a completely different place just broke, because it assumed that maybe not global state, but its global context, uh, had certain, uh, invariants, and it’s like, nope.

Uh, that was not controlled by your runtime. It was not controlled by whatever compiler you have, or the type system you have, or, you know, the memory model you have, and whoopsie doodle, uh, you’re trying to call something that doesn’t exist anymore and, and it’s broken. Uh, yeah. That’s interesting.

David: I remember there’s an old Twitter account, maybe it still exists, that was a magical realism bot that would tweet, like, short descriptions of fake magical realism stories. Um, and one of them was like, uh, uh, it’s revealed that the world is actually, like, a hallucination in the mind of an ostrich, or something like that. And someone, like, quote-tweets it, that is like, "With shared, mutable state, anything is possible!" And that, that’s what I think of in these situations.

Deirdre: Oh,

David: I’ve, I’ve tried to find this again. I, it’s very, I did not save it well, and I can’t ever seem to find it well, but

Deirdre: magical realism. I’m excited to see this progress, and hopefully we’ll see more, like, hard metrics and, like, results from any projects that are adventuring into a, uh, hopefully more memory-safe future. Thanks, Android, and I’m, I’m a happy, happy Android user.

I’m all patched up on Android 13, blah, blah, blah. So I’m, I’m very excited to have all this Rust running on my fucking glowing square device. All right.

Ooh. Another project that has more Rust is Chrome. They’re doing all the tooling, apparently, so that they will support Rust in their whole build pipeline, and they’re accepting third party dependencies in Rust as well. And like, I can think of, I can think of some that I would like to use if I were doing cryptography in a Chrome context.

this is very exciting. And let’s see. Hold on.

David: No Rust in the actual, like, Chromium source though. Just dependencies.

Deirdre: Okay. But still, but I mean, you all build it all in. Cool. I wonder if they’ll pull in rustls, the, the TLS— no, they won’t pull in the Rust implementation of TLS? Uh,

David: Is that how you pronounce that? rustls? Oh, I always said Rust-TLS. But now that I think about it, there is only one T.

Deirdre: Yeah, it’s like "rustle my jimmies", but it’s just

David: Yeah. Uh, too bad Thomas isn’t here. He loves the phrase "rustle my jimmies".

My Gs.

Deirdre: Um, yay. Yay for Rust in these big, big, big, big C, C++ projects. I’m very happy to see, slowly but surely, Rust getting pulled into Chrome. Feels like the Rust usage in Android started very slowly and quietly, and then all at once they’re just like, oh, we’ve deployed like 1.3 million lines of Rust in Android 13.

Like, surprise, we’ve been doing this and working on this the whole time, and it’s like, yeah, surprise. They’re not rewriting Android in Rust, but they are doing, like, these big components, um, and adding new pieces in Rust, and all of a sudden you get to 1.3 million lines or whatever they, whatever they’ve deployed.

So maybe more with Chrome?

Okay. What else?

David: Well, over the summer we had, um, Adam Langley on to talk about passkeys, um, and those have now, like, officially launched in Chrome,

Deirdre: Yes.

David: weeks ago.

Deirdre: I am excited about this. I think this got officially announced on the 8th of December, and I think I used this, I used it at least once, on my phone. But what does it say? Uh, so just a reminder, passkeys are supposedly going to replace passwords.

So instead of having a alpha numeric phrase that you have to remember or write down or put in your password manager, and you really, really hope that you don’t use the same one in more than one place, even though people do. instead of that, you basically do what you have been able to do with like a Yubikey or another Fido, uh, key pair,

that, uh, will get generated for you, for the website that you’re trying to log into, and do kind of the same challenge-response protocol, where you will register the public part of your passkey with your website, and then when you try to log in to the website, it’ll issue you a challenge.

And then you will, you know, sign the challenge with the private key part of your passkey and then return it. And if they can verify the signature, basically, over the challenge, you can log in. You can use this with your real YubiKey as well, whether or not your YubiKey is still your two-factor. But the idea is that there’s no more typing passwords.

It’s just a passkey that can get synced as well. You can sync it between devices and use it for your website, the same way you would sync passwords, but it’s not phishable. You can phish a password: like, say, you know, you send someone the wrong link that looks like, you know, facebök.côm, and then you’re like, put in your Facebook password, and then you just type it in, and the attackers are like, oops, I have your password now.

Bye. I’m gonna reset your account and, uh, use it to spam people and, and scam them too. And it seems like this cross-browser, cross-OS consortium with the FIDO industry, I forget what they’re called, the FIDO consortium, I forget, Apple and Google and Microsoft are all, you know, holding hands and moving together to deploy passkeys across the industry, so that no, no one is getting left behind and they can be set up for success for getting adoption.
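(A minimal sketch of the register-then-challenge-response shape described above. This is not the actual WebAuthn/FIDO2 protocol, which wraps the challenge in client data and authenticator data and binds the origin, which is where the phishing resistance comes from; it only shows the core sign-a-fresh-challenge idea, assuming the ed25519-dalek 2.x API and rand 0.8.)

```rust
// A toy model of the passkey login flow: the site stores the public half at
// registration, then for each login it sends a fresh random challenge, the
// device signs it, and the site verifies the signature. This is NOT WebAuthn,
// just the core shape. Assumes ed25519-dalek 2.x (rand_core feature) + rand 0.8.
use ed25519_dalek::{Signature, Signer, SigningKey, Verifier, VerifyingKey};
use rand::{rngs::OsRng, RngCore};

fn main() {
    // Registration: the device generates a key pair; the site stores only
    // the verifying (public) half.
    let device_key = SigningKey::generate(&mut OsRng);
    let registered_at_site: VerifyingKey = device_key.verifying_key();

    // Login: the site issues a fresh random challenge...
    let mut challenge = [0u8; 32];
    OsRng.fill_bytes(&mut challenge);

    // ...the device signs it with the private half of the passkey...
    let response: Signature = device_key.sign(&challenge);

    // ...and the site checks the signature against the registered key.
    match registered_at_site.verify(&challenge, &response) {
        Ok(()) => println!("login accepted"),
        Err(_) => println!("login rejected"),
    }
}
```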

David: No login left behind.

Deirdre: No login left behind. So yeah, with Chrome, the latest version of Chrome, which is what, 110 or something like that? 119, no, sorry, 118.

David: It’s either 110 or 111. I’m not sure.

Deirdre: Okay. Uh, we’re, we’re asking the guy, the guy that works on Chrome.

David: I always go, so there’s a chromium.org or .com or something like that, and they have a release schedule on there,

Deirdre: Yeah.

David: whenever I need to figure this out.

Deirdre: "With the latest version of Chrome, we’re enabling passkeys on Windows 11, macOS, and Android." Uh, with, "on Android, your passkeys will be securely synced through Google Password Manager or, in future versions of Android, any password manager that supports passkeys", and I betcha a dollar, maybe more than a dollar, that 1Password, big fan, 1Password for your password manager, will be supporting those.

David: I think they already do, actually.

Deirdre: they, they, might. Let’s see,

David: Um, I’m not sure. Maybe only in like beta or something like that.

Current Chrome version is 108. Apparently I was just completely wrong.

Deirdre: um, they are posting about them. They are coming. Um, let’s see, yeah, "passkey support will be available in early 2023," so it’s not in 1Password yet, but it is coming like, rapido, and I will be super ready to, so the caveat being. If your browsers support it and your, there’s a little bit of os support involved.

Actually, I’m not sure if there is. For Chrome syncing, that’s the same sort of sync mechanism as for your password manager and things like that. You still need the websites to support using passkeys. So similar to how not every website will let you, uh, register a YubiKey or a FIDO key for your two-factor, not every website will just be ready to let you ditch your old password, uh, and use passkeys instead. They have to support it.

But I think that will, that will happen. And, like, the first stumbling block that, you know, of course needs to be overcome is: all the browsers support it. And that seems to be just happening, and with good cross-company coordination, and that is so exciting.

David: It’s just this week, I think, uh, or maybe late last week, SBF got arrested.

Deirdre: Oh fuck,

David: While earlier I might have, uh, suggested that you do financial crimes if you were hacking companies, in general, my advice is to not do financial crimes. Um, and if you are doing financial crimes, don’t create a group chat that’s named after the federal offense of the financial crime, and then

Deirdre: fraud.

David: Do not create a group chat called "wire fraud" where you discuss all the wire fraud that you’re doing.

Deirdre: My God. Or just "fraud". Even if, I don’t, I am not a lawyer. I am not. This podcast is not to be considered, uh, financial advice. Like, investment involves risk and all that crap.

David: Not only that, but while Deirdre may be a cryptographer, she’s not your cryptographer, so don’t take any of her advice.

Deirdre: yeah. The, this

David: Have you ever seen "It’s a Wonderful Life"?

Deirdre: No, I’ve seen like clips from the very end, but

David: Really? Well, there’s a scene in "It’s a Wonderful Life" where the main character runs a bank, and then there’s, like, a run on the bank and they can’t give the money out, right? Because he’s, like, a good bank: he’s taken the money and then he’s loaned it to people to get mortgages. So in this situation he had a liquidity crisis, and I’m just stealing this explanation from Matt Levine’s "Money Stuff", right?

He did not have all of the money to give back to everybody, cuz he had loaned it out, but he was still solvent, because people were still paying their loans and those houses really existed. And so, like, if you add up all the numbers in the spreadsheet, or in the, the accounting book or whatever it is, it’s all fine. It’s just the money wasn’t there right now.

That is not what happened with FTX. Imagine if instead he took all that money and then lit it on fire, and then people came and asked for it back. Then the spreadsheets don’t add up. And then to make the spreadsheets add up, maybe you, like, pull a piece of shit outta your pocket and you’re like, this is actually worth $8 billion.

And that’s how you made the, uh, uh, balance sheet add up. Um, and that’s closer to what FTX did, rather than, uh, what George Bailey did.

Deirdre: God, was it, what was the line? Like, "weird fiat"? Like, God, that spreadsheet, it was just, it was just sunshine and rainbows and wishes and dreams, instead of actual accounting of, of what, seven to eight billion dollars that they lost?

Yeah.

It was very funny, because, you know, I work on a private cryptocurrency and a lot of cryptography that’s, you know, in and around it, and when, like, FTX just went, and this guy just was, like, on his little speaking tour trying to be like, "I feel really bad. I, I fucked up." And it’s like, uh-huh, uh-huh, and now you’re arrested. People are like, is this going to affect you? Is this gonna affect Zcash? I’m like, let me check. No, no.

There’s all this, you know, worry about contagion, about the rest of crypto and it’s just like, ah, like I don’t, I don’t think it really affects Zcash.

Zcash is over here, and it’s kind of in this nice little bubble.

David: and while it is perhaps kind of a goofy situation, presumably a lot of like, normal people lost their shirts because all the money is gone.

Deirdre: yes.

David: that is certainly not good.

Deirdre: No, this guy just his fucking,

David: maybe Tom Brady lost a bunch of money too, so,

Deirdre: right.

David: win some, you lose some.

Deirdre: Yeah. And I’m trying to figure out if Gisele was smart enough to, like, just be completely separate

David: I, I don’t actually know if Tom Brady had any actual money in FTX or if he was just in their commercials.

Deirdre: Yeah. I don’t know either, but.

David: I will say, one, like, weird side effect of this is, uh, UC Berkeley sold the naming rights of Cal Memorial Stadium to, uh, FTX a year or two ago. Um, and they got to take it down, right? It’s no longer this; I think this became a thing after Enron.

I feel like a basketball stadium in, I wanna say, Dallas was named, like, the Enron Arena for like five years after Enron went under. And apparently as a result of that, or nowadays, it’s more clear that, like, if your sponsor goes bankrupt, you simply take it down.

Deirdre: Yeah. what was the other stadium that FTX bought the naming rights to? Was this the

David: uh, the Miami

Deirdre: Oh, was it the Heat? Okay.

David: Or the one in LA, yeah.

Deirdre: The other basketball stadium.

David: Another, arena that LeBron

Deirdre: Yes. Okay. Yeah, that one.

David: maybe that’s a new arena. I don’t know.

Deirdre: Did they, did the Heat arena take it down too? I thought they

David: I, I, think so.

Deirdre: Yeah. Yeah, you’re just allowed to do that

David: mm-hmm.

Deirdre: It’s just

David: apparently, like spending at nightclubs in Miami has gone down a lot because the

Deirdre: Aw,

David: price of cryptocurrency is way lower.

Deirdre: Yeah. It’s definitely a crypto winter. So the crypto city, Miami, seems to be going into hibernation, as it were. Um, yeah. This whole FTX, Alameda, Sam Bankman-Fried shit show, it just seems like plain old embezzlement. It’s, like, not even anything interesting with crypto, or anything interesting with the blockchain, or anything with, like, Tether or stablecoins or any of these other sort of things that have, like, happened earlier this year, like Terra and Luna and all that crap.

It doesn’t seem like this collapsed really because of any of that. It was just, he and they were taking money out the back door and not paying it back, and, like, whoopsy doodle, the price of your self-issued token, that, like, your whole balance is being held in, starts to drop. And then there was a run on FTT, on FTX.

And then it’s just, uh, we, we don’t have $8 billion worth of assets?

David: Yeah. Imagine if you went to like Bank of America and you were like, Hey, I’m a hedge fund. Can I borrow 8 billion? And they’re like, well, what can you put up as collateral for the 8 billion? And you’re like, how about 8 billion of Bank of America stock? Bank of America’s answer would be no. Fuck you. Like, that’s not something you’re allowed to put up as collateral.

Like that’s just like double leveraging yourself. Right. Um, because if they don’t pay you back, then not only. do you not have that money, but presumably your stock goes down like it’s a highly correlated asset. Uh, maybe it would need to be a bigger number for, for Bank of America, but like still in general,

Deirdre: God,

David: you, that’s not something that, um, normal banks would accept.

Deirdre: no.

David: did anything else happen this year? What’s gonna happen next year?

Deirdre: Oh God. All right, we’re gonna slog through this NIST signature competition, and of course there’s gonna be one little isogeny scheme that could. This one doesn’t involve any torsion points, so this one is not gonna just die a fiery death in, like, half an hour on a laptop on a weekend in August, like, uh, the last one did.

But who knows? It seems a bit complicated, so we have to do some analysis on it to see if, uh, we want to use it. But signatures, lattice signatures, they’re big, they’re beefy, but they’re fast-ish. I think this competition will have less intense vigilance over it than the, the KEM one, for reasons that we described earlier, where, like, the attack scenario on post-quantum signatures, uh, is just a very different thing than recording the entire key exchange and all the traffic you encrypted under it and just saving it for later, which is what you cared about with the post-quantum KEM.

So that will happen

David: Mm-hmm.

Deirdre: I was talking to a couple of cryptography people, and I’m sort of like, what are you nerding out about? Like, what are you doing for funsies that’s cryptography related? And almost all of them were like, I’m learning about lattice schemes. I’m learning about Kyber, I’m learning about this, I’m learning about that.

They’re either implementing it or they’re learning, like they’re finally getting into the nitty gritty of these, of these lattice based schemes. I’m sort of like, all right. I guess I gotta start learning more in depth about these lattice schemes, and maybe it’ll be Kyber. I don’t know. What do you think?

David: I think we’re gonna get another notable supply chain security breach, but I don’t think it’s gonna come through, like, an open source dependency.

I feel like a lot of the, uh, discussion of, like, supply chain security ends up being around, like, oh, you know, where are your Node dependencies or where are your Python dependencies coming from? Stuff certainly does happen badly. Like, there’s been, like, ransomware, other leaks, that type of stuff. I think the token example so far has been SolarWinds. Um, and I think it’s more likely we’d see something else like SolarWinds, where a provider that has access to something that goes into software gets popped, than that you would see a targeted attack through a

Deirdre: Hmm.

David: dependency. That’s my prediction.

Deirdre: Hmm. I don’t like that. I don’t want that to happen.

David: My second prediction is that Michigan will defeat TCU in the college football playoff semifinal, and then go on to re-defeat Ohio in the college football championship,

Deirdre: yeah.

David: making it the first year that Michigan beats Ohio twice.

Deirdre: Go Blue. I have no college football, uh, loyalties except sort of the UCLA Bruins, cuz Nathan went to UCLA. I went to MIT, we had a D3 football team, they made it to the playoffs once in my waking memory, and that was like, "we have a football team?" So, go Michigan.

I’m trying to think. I don’t, I don’t want more supply chain attacks, but I can’t deny that they are scary. What happened recently, Log4j? We had all these executive orders coming after SolarWinds, but also related to Log4j.

David: Yeah.

Deirdre: So I don’t

David: Log4j wasn’t really what you would call a supply chain attack, though. It was just a vulnerability in a third party dependency.

Deirdre: exactly.

I don’t know.

David: That is more commonly known as, like, security, right? Or patching, right?

Deirdre: Yeah. I keep hearing about SBOMs. What fucking,

David: Software bill of materials.

Deirdre: And I just don’t, like, okay. I don’t want to have to do more work to make, I don’t know, a checklist be checked, for something that I don’t know is gonna really offer a tangible security benefit.

David: I mean, I don’t, I don’t know specifically how the, like, requirements from the executive order play out. I’ll say this is the same executive order, we had Eric Mill earlier in the year talking about Zero Trust, the same executive order that was doing Zero Trust is also mandating some secure software practices, part of which includes, like, tracking an SBOM, a software bill of materials, for what you do.

And it’s unclear, like, I don’t know the details of, like, what is required, what isn’t, the format that things come out in, and so on. But, like, unless you are a mega, mega, mega large organization, if you are tracking just your third party dependencies sanely, like, in your development process, like, if you’re just using Cargo, like, if you have a Rust project, that is, I believe, enough tracking, in terms of, the fundamental data that you need is there.

It’s just, it gets complicated, cuz, like, in reality you probably have more dependencies than might be in your Cargo.lock, whether it’s because of the software you use, like, what about the third party services that you, like, communicate with in other ways, or, you know, you probably have things coming in from different, uh, different versions and

Deirdre: That’s fair. So, like, my Cargo.lock is most of the way there. All right. All right.

David: But I’m also kind of making a bunch of that up, so I could be completely wrong.

Deirdre: I really like the trend line of more projects incorporating more memory safe languages like Rust. I’ve been hearing more about more Swift being implemented in iOS and, um, other Apple systems, which is nice.

I don’t see a day where Apple is going to start pulling Rust into iOS, but if they make Swift an even more fully featured language, uh, so that people can use that on Apple systems, that would be great, because I know some people who would really like to write nice crypto or other tools in Swift, and it’s not as nice as Rust.

So,

David: Mm-hmm.

Deirdre: it’s a bit harder.

David: Does Zig count as a memory safe language, or no?

Deirdre: I, doesn’t it, I think it does, doesn’t it have a

David: It doesn’t. So it gives you, um, spatial memory safety, but not temporal memory safety, meaning that, uh, you’re not gonna have buffer overflows, but you could still have use-after-frees.

Deirdre: I don’t like that. That’ll get ya. I feel like those temporal ones, those are the ones that’ll get ya. They would get me, they would get me. I’m really looking forward to the rollout of passkeys. I really hope we can get sites and web services to adopt passkeys.

It would really help with a lot of the attacks that we’ve just talked about, like people getting phished, people getting targeted. And if you just, you need to have a trusted device, or you need to have a stored, non-duplicated, asymmetric crypto key pair to log into whatever, Okta or Slack or whatever it is,

you’re less likely to get a huge fucking breach. So, you know, crossing my fingers for wide adoption of passkeys through the next year, that would be really, really, really great, I think. The ease of use is really sexy. And, like, if you just show people the demo of just, like, click log in, and, like, your browser knows your credentials and will log you in, and, you know, really, this is way better than you remembering a password. I think that is so nice that it’ll be very attractive to just a generic human user. But I do worry that people will be like, where’s my password? I just want my password. I just wanna be able to remember, like, my child’s name and the, the year of their birth. Yeah. And, like, I just, that’s just the thing I remember, and it’s just like, I swear to God.

This is better. And look, you just click the button. Like, you don’t even have to remember that. So I think there will be, you know, a bit of evangelizing to do, but that’s not my job.

David: I, uh, it’s, uh, so I, I got an M2 MacBook Air

Deirdre: Ooh.

David: a

Deirdre: Nice.

David: and I was setting it up, and I was logging into my Google account so that I could have my contacts synced and stuff. And it prompted, the first thing that it asked when I was, like, signing into Google was, like, do you wanna sign in with a passkey, or something else, like in Apple? Um, and I was like, yeah, I wanna sign in with a passkey, but I didn’t have a passkey, so it failed. I was hoping that it would, like, push me through a flow where I then signed in with something else and created a passkey.

Deirdre: Yeah. But it didn’t quite do that. But that’s nice. It’s just like, you didn’t have to, like, go into the superuser, awesome, "you know what you’re doing" flow to go find it. It was like, do you want to, and it’s like, yes, yes. Good. Cool. Yeah, that’s, that’s the thing that I am excited about, and I hope, like, we’ve got some good momentum going with all of these browser vendors and all these OS vendors, like, being ready to go.

Like, what if, this is wild, I don’t know if it’ll happen, but what if TikTok supported passkeys? That’d be fucking amazing.

David: Well, TikTok supports Sign in with Apple,

Deirdre: Ah, all right.

David: will presumably support passkeys.

Deirdre: That’d be fucking great. That would be amazing. But yeah, the, uh, I’ll leave it there.

I am hoping that that is what our new year brings in security, in cryptography, and whatever.

David: Well done. We brought it back to the name of the, we said the name of the thing and the thing.

Deirdre: roll credits.

David: Exactly. Um, well, I think on that note, um, uh, I guess we’ll thank our listeners

Deirdre: Yes.

David: for another year of support, and wish them all a happy holidays.

Deirdre: Happy holidays. Happy New Year. Thank you all for listening.

David: insert your preferred car dealership holiday sale, um, here.

Deirdre: Awesome. All right. We’ll talk to you in the new year. I’m waving at you, listener, and this is a podcast. Bye.

David: And Thomas wishes you a happy New Year as well.

Deirdre: Yes, wherever he is.

David: Security, cryptography, whatever is a side project from Deirdre Connolly, Thomas Ptacek and David Adrian. Our editor is Netty Smith. You can find the podcast on Twitter @scwpod and the hosts on Twitter @durumcrustulum, @tqbf, and @davidcadrian. You can buy merchandise at merch.securitycryptographywhatever.com.

Thank you for listening.