Real World Cryptography 2023 is happening any moment now in Tokyo. Also, some phone basebands are broken.
This rough transcript has not been edited and may have errors.
Deirdre: Hello. Welcome to Security Cryptography Whatever! I’m Deirdre
David: I am David.
Deirdre: We don’t have Thomas. Thomas has Covid. We are very sad, but he sounds like he’s on the mend, even though he’s still infectious. So today we get to talk about Real World Crypto coming up again, and we get to talk about it without Thomas again.
David: A podcast tradition.
Deirdre: Yeah, it’s a new SCW tradition: there will be a Real World Crypto and we wanna talk about it, and we don’t have a Thomas with us, so we’ll see if we can get a streak going. The other thing we wanna talk about today is a blockbuster report from the Google Project Zero team about some remote code executions at the baseband level in some Samsung chips.
And it sounds kind of bad.
David: Yeah, well, why don’t we just go into that first. The longtime listener might remember that this podcast started out by complaining about the security of phones. And then we learned that this podcast was gonna have guests when Justin Schuh called us and was like, I’m coming on your podcast to talk about phones. After that we talked about phones, and that was where we got our start. So vulnerabilities in phones have been our bread and butter.
Deirdre: At least for this one, this is not an Android thing or an iOS thing, or even specifically, like, a Pixel thing or an iPhone thing. This is in Samsung Exynos chipsets. They are in a lot of devices, and they actually connect your device, the rest of your computer, to the network.
David: They’re the modems, right?
Deirdre: Yeah, like the specific modem.
They’ll actually, like, turn your cellular connection into something you can do, you know, TCP/IP over. Project Zero usually does this 90-day disclosure deadline: they find a bug or a vulnerability, they disclose it to a vendor, and they say, you’ve got 90 days to fix this and patch this, or we’re just gonna tell people, because people need to know about this.
And for this one, they found 18 vulnerabilities. They disclosed 18 things, and some of them have CVEs, but some of them are very severe. They are telling people to turn off Wi-Fi calling and Voice over LTE on vulnerable devices, but they’re not fully disclosing the nature of the four worst vulnerabilities.
Is that correct?
David: Yeah, and like, uh, 2G and 3G are going away, which means that like, voice over LTE is the only option for voice in a lot of places. And like, I don’t even know that you can turn it off on, on some phones now. Like, I think it, it’s just like a thing.
Deirdre: I tried to turn it off and it didn’t, that was not a thing I could do.
David: Yeah, you just have to like, I don’t, I don’t know, I guess turn off all calling or something.
Or only use Google Meet for calls.
Deirdre: Oh God. So, some of the affected devices, okay, so the actual, like, risk is, um, a remote attacker can get remote code execution on your phone with no user interaction at all. The attacker just needs to know your phone number. These chips are in, um, mobile phones and, uh, the mobile modems of certain cars that have them deployed, which also sounds really shitty, and they’re able to remotely compromise a phone at the baseband level with no user interaction.
That sounds really bad. It’s really bad. This is not even at the OS level; this is below the OS level. This is, like, the firmware or capability built into these actual chips, before it even gets out of the modem level of the actual chips, you know, the firmware, before it even gets out of there into the OS level and that privilege ring.
David: Some sort of, like, buffer overflow in some sort of parsing done on the chip, or something like that.
Deirdre: Yeah, I mean, yeah. And so some of the devices that are affected include the Pixel 6 and Pixel 7, which really sucks because those are generally pretty good phones. Um, but this is in one of your hardware dependencies, as it were. A bunch of Samsung phones, like, there’s a long list of these; we’ll put the blog post, uh, from Project Zero in our show notes. And any vehicles that, uh, use the Exynos Auto T5123 chipset. And so now I really need to go check what kind of chips are in our cars. My old-ass car is probably fine, but, uh, we have a newer car and I want to check about that one.
David: Yeah, and I wonder, like, even if your car doesn’t do, like, internet or something, it may still just, like, have a modem in it for the purpose of downloading, like, updates to itself and so on, that’s effectively just paid for by the manufacturer.
Deirdre: Yeah. Like there, there are newer cars that don’t do anything smart. They’re not doing like an over the air update like the Teslas do, but they still have a modem in the car for things like OnStar and like trying to find your location and having like emergency calling or something like that.
David: Or just updating the head unit. I think even if it’s not like giving you wifi.
Deirdre: yeah, exactly.
So if you can just completely remotely compromise my car in certain ways, silently, that’s not fun. So one of the ways, if you have, like, a regular mobile phone, to mitigate this is to turn off Wi-Fi calling. Um, and if Voice over LTE is toggleable on your device, you can turn that off as well.
I was able to turn off Wi-Fi calling, but not Voice over LTE, on my Pixel. Yeah. This is just kind of wild.
David: This is just further proof you shouldn’t ever call anybody or accept any calls.
Deirdre: This is, yeah. I generally live by this philosophy, but sometimes the world just forces me to pick up the phone, because they just will not communicate in the asynchronous manner, and I think it’s very rude of the world. This is kind of wild: the Project Zero team withheld details about some of the worst vulnerabilities, even when they went beyond the 90 days, basically cuz they said they were so bad, and they thought that disclosing them would distinctly privilege an attacker versus defenders trying to mitigate them.
So they’ve disclosed them, and there are, like, 14 of these 18 that are out in the open. They have a bunch of CVEs, but the worst ones are not.
David: Yeah, well, four have passed their 90-day deadline and are withheld for safety reasons, four have passed their 90-day deadline and are public, and ten have not yet passed their 90-day deadline.
Deirdre: Okay. Wild. Thanks, Project Zero. I wanted another reason to turn off Wi-Fi calling. Cool. All right. What else? So Real World Crypto’s coming. They’ve got a cool program out. I am going; it’ll be fun. If we get this out before, uh, Real World Crypto: see you in Japan. Um, we’ve got the cherry blossom season happening, and all of that.
And it’ll be nice to see the other side of the world again. Um, but we’ve got a program.
David: IETF is also happening, um, in Tokyo, uh, around roughly the same time. So,
Deirdre: There are some people who are grumpy. They’re like, why did you pick this place and time? And the IACR people are like, we picked it over a year ago, so, like, I don’t know what to tell you. It just happened to all be in the same place at the same time. Maybe everyone likes the cherry blossoms.
But, uh, I am excited to see some of these. Um, I want to catch up on some of this stuff, including some PQC stuff. “Breaking a Fifth-Order Masked Implementation of CRYSTALS-Kyber by Copy-Paste”: Kyber is the PQC finalist from the NIST post-quantum competition, and, like, masking is a technique that you usually do to mitigate side-channel analysis and side-channel attacks.
So, uh, I’m very interested to see exactly how they were able to attack this. Because, you know, you have these kind of traditional techniques of trying to implement your cryptography in a resilient way, and they’re just sort of like, nah,
Deirdre: I could just, like, go, whoop. I am very interested.
David: Yeah. I mean, but it’s, like, power analysis on an ARM Cortex-M4. And so you probably only care about this if you’re putting Kyber into, like, an HSM or a hardware wallet. And even then it’s not clear how much you care about this. Cause, the fact that, like, I don’t know, it’s almost like a feature that several years after you make, like, a hardware wallet, the technology gets good enough that you can side-channel the key out of it, you know, just in case.
Deirdre: But, uh, this is sort of the thing, though. Especially with NIST, like, they want to be able to deploy it in some of these, uh, embedded environments, or they want to be able to deploy it in environments where those attacks would be in scope. And the sort of mitigations they basically recommend, like, if you want to be resilient to power analysis and side-channel attacks, you should use this masking technique, or, you know, this implementation technique for your, say, field arithmetic if you’re doing stuff with, like, elliptic curves. And this attack is just basically: nope, nope, nope. Not for you.
David: I think it’s a reminder of, like, what, uh, Nate said when he was on: when you’re building systems like these, you shouldn’t build one where, like, everything falls apart if you compromise the security of, like, an embedded device at the edge.
David: Assume that someone can do power analysis and get the key out, and hopefully that, you know, only affects that device and not, I don’t know, every credit card reader or something.
Deirdre: Yes, it’s not, like, the same embedded key in every single device, where if you crack one, everything falls down; that would be very bad. And hopefully these Kyber deployments do not... well, you never know. Kyber’s gonna be used for a whole bunch of stuff, for KEMs and for anything else that needs to establish keys between two parties.
So you never know. Uh, we’ve got some more attacks on Frodo, um, which is interesting because Frodo got pretty far in the post-quantum, um, competition. It’s another lattice-based, uh, learning-with-errors sort of deal, uh, ring learning with errors. That’s, you know, Frodo, Frodo in the ring, ha ha ha, ring learning with errors.
David: Yes. I believe a former podcast guest, Chris Peikert, um, worked on Frodo.
Deirdre: Yeah, I
David: That’s definitely the main title that he has.
Deirdre: Former podcast guest. And this is, uh, using Rowhammer on FrodoKEM. That’s neat. So Rowhammer is actually a technique based on, like, bit flipping in memory. It’s not necessarily specific to any one cryptosystem. It’s exploiting a kind of, like, underlying physical weirdness about memory, where if you do certain reads and writes in a certain pattern (and I’m wiggling my fingers right now that no one can see, cuz it’s a podcast), you do bit flips on either side of, like, a line in memory, a cache line or whatever.
Um, you could influence it and make it flip the wrong way. And it’s basically a way to leak information out of a side channel in a weird way. And I guess they use that against Frodo, which is interesting. I don’t know if Rowhammer is, like, moving forward in, like, attacks; I don’t see a lot of Rowhammer.
Like, I feel like Rowhammer happened, and then Spectre and Meltdown happened, and there’s been a little bit less focus on Rowhammer. Or am I just sort of showing my ass or something?
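The bit-flip mechanism described above can be modeled with a toy simulation. To be clear, this is only a conceptual sketch: real Rowhammer attacks hammer physical DRAM rows with tight, cache-bypassing read loops, and the `FLIP_THRESHOLD` number here is invented purely for illustration.

```python
# Toy model of the Rowhammer disturbance effect (NOT an exploit):
# repeatedly "activating" the rows next to a victim row leaks charge
# until a weak cell in the victim flips. The threshold is made up.
FLIP_THRESHOLD = 50_000

class DramRow:
    def __init__(self, bits):
        self.bits = list(bits)
        self.disturbance = 0

    def neighbor_activated(self):
        # Each activation of an adjacent (aggressor) row disturbs this row.
        self.disturbance += 1
        if self.disturbance == FLIP_THRESHOLD:
            self.bits[0] ^= 1   # the weakest cell flips

def hammer(victim, n):
    # Stands in for the tight uncached read loop of a real attack.
    for _ in range(n):
        victim.neighbor_activated()

victim = DramRow([1, 0, 1, 1])
hammer(victim, FLIP_THRESHOLD - 1)
before_flip = victim.bits[0]    # not enough activations yet
hammer(victim, 1)
after_flip = victim.bits[0]     # the cell has now flipped
```

The attack on FrodoKEM uses flips like this to poison key material; the simulation only shows why an attacker needs a sustained window of accesses, which is where keygen speed (discussed below) comes in.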
David: I don’t, I don’t know. Um, I didn’t keep up with it.
Deirdre: But this is neat: achieving public-key poisoning requires an extreme engineering effort, because FrodoKEM’s keygen is so fast. You can do keygen in eight milliseconds, which is one of the pluses of all these lattice-based schemes: they can be very fast; they’re not very space-efficient.
I kind of love that, though. Like, kudos to the attackers. But, like, sometimes efficiency is not just good because you, the person using it, want it to be efficient and you wanna get in, like, a bunch of signatures and a bunch of, you know, key establishment, and it’s just nice to have things go fast. It can actually be a mitigation against attacks.
Um, it can be much harder to leverage an attack against a cryptosystem that goes really, really, really fast. And so, uh, last year we talked about, what was the attack on SIDH?
David: Uh, Hertzbleed.
Deirdre: Yeah. And at the time it was an over-the-network, uh, side channel to break SIDH in SIKE. And, like, my first question was, okay, well, can you do it against some cryptosystem that’s not so slow?
And the answer is, yeah, they could do it, but it was harder. And then later, SIDH and SIKE got broken for completely different reasons, um, having to do with auxiliary points. But the point kind of stands: it becomes harder to leverage these attacks against a very fast cryptosystem. Uh, even if you’re not doing a keygen every time; but if you’re doing a KEM, you are doing keygen.
David: And, uh, Rowhammer specifically, I think we did reasonably well at mitigating it in both, uh, firmware and hardware RAM modules. I haven’t kept up with the state of the art, but I think that was mostly for Spectre. And it was just like, whoa. There was a Rowhammer-plus-NaCl exploit at some point that I think they found some way to mitigate, NaCl in this case being Native Client,
Deirdre: Yeah, not NaCl the crypto library, or libsodium, the descendant of NaCl; crypto library naming schemes suck, um, for people who don’t know about either of them. And the last one in that section is, um, protecting CRYSTALS-Dilithium, which is yet another lattice-based post-quantum scheme.
David: That’s a signature, right?
Deirdre: Yeah, I think Dilithium is the signature and Kyber is the KEM.
Um, and I know less about Dilithium, so I’m very interested in this. This might be more, you know, mitigations for these lattice-based post-quantum implementations and how to deploy them.
David: And you know they’re related, cuz kyber is the lightsaber crystal and dilithium is the Star Trek warp crystal.
Deirdre: Mm-hmm. And they made it easier by just calling it CRYSTALS, all caps, dash Kyber and CRYSTALS dash Dilithium, because we’re all a bunch of nerds. It’s either Star Wars and lightsabers, or Lord of the Rings.
David: Nerds in this establishment? No, I can’t believe that would be happening.
Deirdre: Yeah. The next section is, uh, post-quantum protocols and agility. I kind of love this, because it’s literally, like, we have a bunch of protocols that we’ve built, like Signal, like TLS, like Noise, that were all built around these pre-quantum primitives, and now we’re just like, how do we move these forward into a post-quantum world?
There’s a lot of pieces we get to keep, like our block ciphers and our hash functions and a lot of key derivation stuff. But there’s some stuff that doesn’t have real straightforward post-quantum slot-ins. Like, we have all these KEMs now, but KEMs are shaped differently than Diffie-Hellman, for example.
We use Diffie-Hellman all over the place, in places like TLS and Signal. So what do you do to make those protocols, at the top level, post-quantum secure? And this whole section is basically about that. Um,
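The shape difference being discussed here can be sketched in code. This is a toy with deliberately insecure parameters, shown only to contrast the two APIs: in Diffie-Hellman both sides run the same operation on the peer’s public key, while a KEM splits the roles into encapsulate and decapsulate, and the encapsulator must send a ciphertext back to the key holder.

```python
# Toy sketch (NOT secure; tiny parameters) contrasting the Diffie-Hellman
# API shape with the KEM API shape. The "KEM" here is just ephemeral DH
# wrapped into encaps/decaps, to make the asymmetry of the roles visible.
import hashlib
import secrets

P = 2**127 - 1   # a Mersenne prime; fine for illustrating the API, nothing else
G = 3

def keygen():
    sk = secrets.randbelow(P - 2) + 1
    return sk, pow(G, sk, P)

def dh(sk, peer_pk):
    # Symmetric: Alice and Bob each run this on the other's public key.
    return hashlib.sha256(pow(peer_pk, sk, P).to_bytes(16, "big")).digest()

def kem_encaps(pk):
    # Asymmetric: anyone with pk can produce (ciphertext, shared_secret).
    esk, epk = keygen()
    return epk, dh(esk, pk)

def kem_decaps(sk, ct):
    # ...but only the secret-key holder can recover the shared secret,
    # and the ciphertext has to flow back to them: an extra message
    # compared to two static DH keys just sitting there.
    return dh(sk, ct)
```

This is why KEMs slot awkwardly into protocols that relied on the symmetry of Diffie-Hellman, as the discussion of TLS and SIDH below gets into.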
David: KEMs all look like a TLS RSA key negotiation, um, which we dropped because of RSA and because it requires an extra round trip. And now all this, uh, reduced-RTT stuff, the zero-RTT, one-RTT, decreased-round-trip key negotiation stuff, kind of goes out the window, or you have to rethink it, um, in this KEM world, which is disappointing.
Deirdre: Yeah. That’s why, like, we were just holding on with SIDH, because SIDH was, like, full-on Diffie-Hellman. Like, you just had to decide that the client would always be Alice and the server would always be Bob, or vice versa, and you have something that is very Diffie-Hellman-shaped. But then it fell to an active adaptive attack.
So we were like, okay, well, you can use SIKE. And then you couldn’t do that either. Um, so RIP, SIDH.
David: And they’re all freaking gigantic. They’re what? Like,
Deirdre: Except SIDH wasn’t; it was the small one. But all the ones...
David: I mean, all the KEMs are gigantic. What are you gonna be, like, 500 bytes or something for, like, a KEM, and, like, one to ten kilobytes for a signature, or something ridiculous.
Deirdre: Something like that. It’s either 500 or 700-something bytes for these KEMs, for these lattice-based KEMs.
David: And a TLS handshake right now can easily have two signatures in the cert chain: one on the leaf, one on the intermediate. Um, you don’t need to verify the root. And then, like, you probably have three SCTs, at least on the leaf. So we’ll call that, like, five signatures. And then a signature over the key exchange.
So you’ve got six signatures. So if you just did, like, a naive, oh, plug in some post-quantum algorithms, right, you’re looking at, like, uh, 10 kilobytes or something insane of signatures.
Deirdre: I think the smallest parameter set for Dilithium, which is the post-quantum finalist, is, uh, two and a half kilobytes. And I don’t know which, you know, security level that is.
David: We’re talking about, like, a non-negligible portion of a floppy disk in the naive, uh, update of the TLS handshake?
Deirdre: Yeah. And that’s to compare to, you know, maybe an EdDSA signature, which is, what, 64 bytes or something like that. It’s small, because you have, like, a point, or even, like, the x-coordinate of a point, and a scalar, which is another, you know, 32-byte value. And then you have a lovely little signature, and you could just throw tons of them into your, you know, whatever, your signature chain or your back-and-forth of your protocol.
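The back-of-envelope above can be checked with quick arithmetic, using published sizes (64 bytes for an Ed25519 signature per RFC 8032; 2420 bytes for a Dilithium2 signature, the smallest parameter set) and the six-signature tally of leaf, intermediate, three SCTs, and the signature over the key exchange:

```python
# Rough size comparison for a naive post-quantum swap of every TLS
# handshake signature. Counts follow the tally above; sizes are the
# published Ed25519 and Dilithium2 signature sizes.
ED25519_SIG = 64        # bytes (RFC 8032)
DILITHIUM2_SIG = 2420   # bytes (smallest Dilithium parameter set)

# leaf cert + intermediate cert + ~3 SCTs + CertificateVerify
sigs_per_handshake = 1 + 1 + 3 + 1

classical_bytes = sigs_per_handshake * ED25519_SIG      # 384 bytes
pq_bytes = sigs_per_handshake * DILITHIUM2_SIG          # 14520 bytes, ~14 KB

print(classical_bytes, pq_bytes)
```

That lands in the same ballpark as the ten-kilobytes-of-signatures estimate above, and is indeed a visible fraction of a 1.44 MB floppy once you add certificates and key shares.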
And now it gets harder. So, trying to adapt for a post-quantum world: we’ve got post-quantum Noise, which I’m interested about. Noise is the, uh, pattern for how you establish a secure channel; it underlies WireGuard, it underlies Tailscale, friend of the pod Tailscale. And you might see Noise, the Noise protocol, in a whole bunch of places.
Like, you might see it in your WhatsApp; they’re using it under the hood there so that you can send your encrypted WhatsApps from WhatsApp Web or whatever it was, or at least you did once upon a time. Noise, uh, shows up all over the place nowadays. It’s really, really nice. Shout out to Trevor Perrin, who, uh, came up with the protocol. And what are they doing here? So they’re slotting in these KEMs and they’re updating Noise to PQ Noise, and I can’t tell much more from this abstract, but I am very interested. And they prove a bunch of things about it in their new model. What is a fACCE model? F-A-C-C-E, I don’t know what that stands for. But, uh, I’m excited to look at it. Shout out to Peter Schwabe, who’s done a lot of research in this area. It’s very cool.
I’m very excited about post-quantum Privacy Pass. Privacy Pass is a cool protocol that was developed by some people at Cloudflare at the time, including friend of the pod George Tankersley; I think Filippo Valsorda was involved in it. But it has become a lovely evolving protocol. Apple has used it, I think Google has used it, where, using anonymous credentials, you can do something like a CAPTCHA challenge. The server, having accepted your CAPTCHA challenge, basically gives you a bunch of anonymous credentials.
You can show up to the service. Say you’re trying to use a, uh, Cloudflare service from Tor, and you’re trying to keep your privacy, you know, your IP, private. Instead of Cloudflare being like, I don’t know who you are, you’re coming at me from behind a Tor exit node, you can hand the service behind Cloudflare an anonymous credential from when you were authorized before.
They can verify it and authorize you, and you can access, say, twitter.com via Tor, because they know you’re a real person: the only way they can verify your credential is that they, uh, gave it to you in the past, when you passed a challenge. So this is a very attractive thing for, basically, privacy-preserving stuff out in the world.
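The issue-then-redeem flow described above can be sketched as follows. This is emphatically not the real Privacy Pass construction, which uses a verifiable oblivious PRF precisely so the issuer cannot link issuance to redemption; plain HMAC stands in here just to show the shape of the flow, including the double-spend check.

```python
# Toy sketch of the Privacy Pass issue/redeem flow (NOT the real protocol:
# the real one uses a VOPRF so issuance and redemption are unlinkable).
import hashlib
import hmac
import secrets

class Issuer:
    def __init__(self):
        self._key = secrets.token_bytes(32)
        self._spent = set()

    def issue(self, solved_challenge: bool, token_preimage: bytes) -> bytes:
        # Hand out a token only after the client passes, e.g., a CAPTCHA.
        if not solved_challenge:
            raise PermissionError("no challenge solved")
        return hmac.new(self._key, token_preimage, hashlib.sha256).digest()

    def redeem(self, token_preimage: bytes, tag: bytes) -> bool:
        # Later, possibly from behind Tor: verify the tag and make sure
        # this token hasn't been spent before.
        expected = hmac.new(self._key, token_preimage, hashlib.sha256).digest()
        if hmac.compare_digest(expected, tag) and token_preimage not in self._spent:
            self._spent.add(token_preimage)
            return True
        return False
```

The post-quantum version discussed next replaces the number-theoretic machinery behind the unlinkability with a zero-knowledge proof over Dilithium, which is where the huge token sizes come from.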
Um, another fun one is in Signal’s protocol, when they’re trying to do private updates to group metadata. They do a little bit of this anonymous-credential shuffling so that, like, they don’t know who is updating the group metadata; they just know that someone they have previously, uh, given anonymous credentials to is adding a new person to the group or making some update to the group metadata. Anyway.
So, post-quantum Privacy Pass. Um, anonymous credentials are one of those things that don’t have an obvious post-quantum solution, and it looks like the solution here is zkDilithium: doing a zero-knowledge proof of knowledge, a SNARK, not quite a SNARK, based on Dilithium2. So instead of just, like, the full-on commitments and revealed commitments that you can do with something like an abstract group, which I think the original Privacy Pass used, you can do it with one of these, uh, SNARK/STARK constructions.
It’s like a little bit more complicated. The downside is that the token size is somewhere between 76 and 172 kilobytes
Deirdre: which is big.
David: One of the use cases for Privacy Pass right now is what Google calls Private State Tokens. It’s just, like, what if we put Privacy Pass on the web generally and, um, made, uh, anti-bot stuff just, like, a thing that anybody could use, but without tracking. Uh, at least that’s the goal.
And, like, man, shuffling around 76 to 172 kilobytes as, like, something that looks like a cookie... Yeah.
David: Yeah.
Deirdre: And verification is 30 milliseconds. That’s not so bad.
Deirdre: I feel like this is one of those never-ending, you know, tug-of-wars: we make things faster so that things are more performant, and then we fill up that performance.
David: It’s the induced demand problem. It’s just like adding a lane to a highway. Like, every time, uh, a web browser gets 10% faster, users open 10% more tabs. And then they’re like, why does my browser keep getting slow? Right? Like, this is a real thing.
Deirdre: Yes, a hundred percent. Anyway, I’m very interested in that. I’m interested in any of the new post-quantum solutions for the things that don’t have, well, kind of straightforward replacements, because we need them for a whole bunch of cryptography that’s currently deployed, if we want to make each of these things post-quantum resilient, and we need some results like this.
So I’m very interested in that one.
David: Next session, we’ve got network security and key exchange. Um, the thing that jumps out to me here, as, like, uh, a secure-transport person, is iCloud Private Relay. Um, I know loosely, uh, you know, how that works, but I’m sure that some more details would be interesting, anything that they have to share there.
Looks like we have both Apple and the CloudFlare people on this talk.
Deirdre: Yeah, Chris Wood is my co-author, uh, Chris Wood’s at Cloudflare, on the FROST, uh, IETF specification. Um, he’s pretty cool and smart, so good luck to them on that talk. So it looks like there will be Zooms for some people who are not gonna be in the flesh in Tokyo, which is good, hopefully. Uh, okay.
Sidebar: every time Apple gives a presentation at Real World Crypto, there’s always a bit of a debate about whether Apple is gonna come and be like, you can’t record this, you can’t livestream this, you can’t have a phone, you can’t have a camera. Like, they have done that in the past. So I’m very curious to see, like, whether they’ll allow it to be livestreamed.
I have live-tweeted and posted pictures from previous Apple presentations at Real World Crypto in the past, and no one at Apple has come and tracked me down and threatened me. So I will
David: mean, it’s not like you left an iPhone, an unreleased iPhone at a bar.
Deirdre: No, no, I would never do that. No, I would never do that. I’ve never used an iPhone, but I will try to
David: How’s your baseband firmware doing?
Deirdre: Oh, fuck. Um, I will probably live-tweet it, just because I am also on Mastodon and the server just has latency, and I’m still on Twitter, and even though Elon Musk is doing his damnedest to drive it into the ground, I will probably just keep doing it on there.
David: I will probably be asleep cuz it’s Tokyo. I just assume everything that happens in Tokyo happens while I’m asleep.
Deirdre: It’s, uh, 15 hours ahead of you. So, yes. Anyway, hopefully they will allow this to be livestreamed, because then we could record it even if they don’t record it. I’ll try to get as much detail about that as possible, cuz it’s interesting and Apple does interesting things.
David: Yeah, the other one jumping out to me here is, uh, TLS-Anvil: adapting combinatorial testing for TLS libraries.
David: Doesn’t look like they found anything, like, too groundbreaking in there, but, you know, always good to have more tools like this. And they have a really big number in terms of, like, the flaws that they found,
Deirdre: Ooh. Ooh.
David: Even if none of them have big fancy names.
Uh, 116 problems related to incorrect alert handling and a hundred other issues across all tested libraries. And then two new exploits in MatrixSSL.
Deirdre: Where does this testing live?
David: I assume this is, like, state-machine-type stuff.
Deirdre: No, no, but, like, is this public? Like, can we use it? Is there...
David: I assume it’s open source based on the authors.
Deirdre: Okay. I’m interested.
David: Um, they’re the same people that created the TLS-Attacker tool.
Deirdre: Cool. All right. This is so useful, and this is so good for updating TLS for post-quantum stuff; like, it’s not necessarily done yet. What else?
David: The big takeaway is to just not use MatrixSSL.
Deirdre: I mean, where is that deployed? That’s
David: I don’t know, it’s, like, a C++ SSL library. I’m not quite sure who uses it. The big embedded one is the one from, like, ARM or Qualcomm, uh, mbed TLS. Yeah, that’s the big one.
Deirdre: No, okay, so that’s not the same. Okay. Yeah. Cellular radio null ciphers in Android. Cool. Hey, Yomna! Yomna’s really cool. Um, they do cellular security stuff a lot. Whatever they found, I’m very interested; tell me how my phones are broken even some more. And then, I don’t know, “Careful with MAc-then-SIGn”: EDHOC, a lightweight authenticated key exchange protocol.
Not familiar. Or, wait, I mean, any of the lightweight stuff I don’t pay very close attention to, because I just don’t tend to work in lightweight environments.
David: Yeah, but, uh, I mean, if it wasn’t for post-quantum... Some of the primitives there are really cool. Duplex objects are just, like, the greatest thing ever when doing protocol handshakes, which, as discussed, is, like, the only thing that I care about. Um, it’s like, if you think about it, in, like, Noise or kind of TLS, you basically have your kind of encryption state and then your handshake state. And then at some point you have to, like, hash everything that came through and make sure that the state’s aligned and that nobody tampered with your handshake. And, like, that’s fine; that’s not, like, horrendous to do or anything. Um, duplex objects let you combine that into a single state, so the operations you get are, like, encrypt, decrypt, squeeze, and, um, like, MAC. So, absorb and squeeze, basically. So effectively: shove data in, and then get a MAC out. And so you just use that for everything in the handshake. And then at some point you roll your Diffie-Hellman into an absorb, privately, and then you pop out a MAC and check that everything aligns on both sides,
David: And you can squeeze out a key too.
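The absorb-everything-then-squeeze idea described above can be sketched with SHAKE-256 as a stand-in. A real duplex object (e.g. the Cyclist mode, as instantiated in Xoodyak) also lets you keep absorbing after squeezing, which SHAKE cannot, so this models only the end-of-handshake transcript check:

```python
# Sketch of the "shove everything in, then squeeze out a MAC and a key"
# idea, with SHAKE-256 standing in for a true duplex object. A real duplex
# interleaves absorb/encrypt/squeeze throughout the whole handshake.
import hashlib

def finish_handshake(transcript, shared_secret):
    sponge = hashlib.shake_256()
    for msg in transcript:           # absorb every handshake message, in order
        sponge.update(msg)
    sponge.update(shared_secret)     # roll the Diffie-Hellman output in
    out = sponge.digest(64)          # squeeze
    return out[:32], out[32:]        # (MAC to compare, session key)

transcript = [b"client_hello", b"server_hello", b"certificate"]
secret = b"\x42" * 32
alice_mac, alice_key = finish_handshake(transcript, secret)
bob_mac, bob_key = finish_handshake(transcript, secret)
```

If both sides absorbed the same messages and the same secret, the squeezed MACs match; any tampering with a handshake message changes the output, which is exactly the transcript-integrity property the single combined state buys you.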
Deirdre: God. What’s the, there’s, like, all these sponge-based hashes nowadays that, like, I think are the foundation of that sort of
David: Yeah, you need a sponge.
David: Cyclist is, like, a general construction, and then you can apply it to, like, any sponge. And so there was Xoodoo, the permutation, and then Xoodyak, I think, was the instantiation of Cyclist by the SHA-3 people. It’s a different hash function that went into the NIST competition.
And then there were some others that I assumed did similar things.
Deirdre: I just googled “sponge crypto” and the first thing was, there’s a coin called Sponge, like a cryptocurrency, and I’m just like, dammit. Dammit.
David: Anyway, the takeaway here is something got picked by the NIST lightweight crypto, um, competition, and it is also a duplex object, I don’t know. And it is not from the regular crowd that usually wins these, so it
Deirdre: from the,
David: And it wasn’t Xoodyak; it was not the
Deirdre: From the Belgians. It was not from the Belgians this time. The Belgians are great, but it was someone else this time. Cool. Next session: building and breaking secure systems. WhatsApp end-to-end encrypted backups. Cool. I think we talked about this previously, but there’s, like, some HSMs on their backend.
They deployed a bunch of stuff so that you can end-to-end encrypt your backups: your decrypted texts, once they reach one of your WhatsApp ends, get encrypted from WhatsApp to their cloud, so they can’t look at your backup. That way, if your phone or your, you know, computer falls in the river, you can log into WhatsApp on a new phone and be like, give me my history of messages, please. And they did it in a nice way, because they have kind of the straightforward way, which is, like, derive a key, or derive a, like, very good password and that is your key and you must remember it, which is nice for the people that really want to do that. But they also have, I think, this HSM-protected way: you set a password for your key, that key is encrypted under your password, and that encrypted blob lives in their HSMs. And then, like, they do the key stretching and whatever from your password, like PBKDF2 or whatever the new hotness for password-based key derivation is. And it was just very nice that they were able to deploy the thing straight off the bat.
So I’m very interested in this whole talk, because their white paper is, all right, they have, like, a few-page white paper, uh, officially from WhatsApp. I’m interested to see what they say on the specifics about this, although it’s only half an hour.
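The password-stretching-plus-key-wrapping shape described above can be sketched as below. This is not WhatsApp’s actual construction (their whitepaper specifies the details, including the HSM-enforced guess limits); the wrap here is a toy XOR-plus-HMAC, and the iteration count is illustrative.

```python
# Toy sketch (NOT WhatsApp's scheme): stretch a password into a
# key-encryption key, wrap the random backup key under it, and store only
# the wrapped blob server-side (e.g. guarded by an HSM).
import hashlib
import hmac

ITERATIONS = 100_000   # illustrative; real deployments tune this

def wrap_backup_key(password: str, backup_key: bytes, salt: bytes):
    kek = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    wrapped = bytes(a ^ b for a, b in zip(backup_key, kek))   # toy wrap
    tag = hmac.new(kek, wrapped, hashlib.sha256).digest()     # integrity tag
    return wrapped, tag

def unwrap_backup_key(password: str, wrapped: bytes, tag: bytes, salt: bytes):
    kek = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    if not hmac.compare_digest(hmac.new(kek, wrapped, hashlib.sha256).digest(), tag):
        raise ValueError("wrong password")
    return bytes(a ^ b for a, b in zip(wrapped, kek))

salt = b"\x00" * 16                 # per-user random salt in practice
backup_key = bytes(range(32))       # the random key that encrypts the backup
wrapped, tag = wrap_backup_key("hunter2", backup_key, salt)
```

In the real system the wrapped blob never leaves the HSM boundary and the HSM rate-limits password guesses, which is what makes a human-memorable password workable at all.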
David: And then they presumably throw away the credentials to the HSM. I believe Apple described it as, uh, a one-way physical hash function, a.k.a. a shredder. And, you know, having talked to some people who have worked at Apple, that did actually happen.
Deirdre: Yeah, I hope so too. Uh, the Apple HSMs: if you get Apple iCloud backup service in China, they’re required to live in China and use Chinese hardware and things like that. So that’s a whole nother ball of wax.
David: the China ones.
Deirdre: Also in this session: end-to-end encrypted cloud storage is hard, from another friend of the pod, Kenny Paterson, talking about their further attacks on MEGA, which is supposed to be, uh, an encrypted-backup-in-the-cloud storage thing. And then a thing about e-voting in the French elections and how they broke it. Neat.
Deirdre: Everyone keeps trying to do that
David: Without Thomas here, we'll never know if either of these count as stunt hacking.
Deirdre: Uh-huh, yeah. We need that professional eye to rank them. But I like all this stuff, because this whole session is building and breaking secure systems. Whenever you have end-to-end encryption, people lose their devices, and the device is the repository, because the provider can't back things up for you, or if they can, the backup has to be encrypted as well.
This is sort of the natural follow-on from end-to-end encrypting things: you need backups or fallbacks in some way, and designing those also relies on secure encryption, secure design, and all that sort of stuff. Cool. There's some stuff about crypto for people.
David: Yeah, we've got a talk from Steven Samo about designing applied cryptography for humans. He's from Google; he does a lot of, you know, anti-abuse, that type of thing. For the people, you know?
Deirdre: Mm-hmm, for humans. Cryptography for grassroots organizing. I know Seny Kamara does a lot of encrypted search, if I recall correctly, not fully homomorphic encryption stuff, but the kind of work that had been ongoing before fully homomorphic encryption was a thing.
David: He gave the keynote at Crypto in, I wanna say, 2020. It was very good. It roughly covered how most of the work in crypto ends up just kind of helping cloud providers, and not a lot of it is directly applicable to Joe Sixpack, to steal a term.
Deirdre: And this talk seems to be about collecting information about people, decentralizing movements, using cryptographic tools to adapt to the existing trust and communication protocols of organizing, from physical to digital spaces. Great, awesome, love it. And then PPM, Privacy Preserving Measurement, and Private Information Retrieval, from Sofía Celi, Pete Snyder, and Alex Davidson. I know Sofía works at Brave. This is cool because I think Sofía is part of a new IETF working group on private measurement protocols like PPM. And if you've ever heard of Prio,
Deirdre: and I think the newer one is STAR, which is trying to collect metrics in a kind of private, aggregate way, using techniques similar to multi-party computation, so that you can get useful statistics out of private aggregate data.
And I'm very happy to see more of those, because currently the metrics that we use in the world, think Google Analytics, think the crash reporting on your Android or iOS app, tend to be just handed over to the app developer, or the service provider the app developer is contracting with, and they just say, oh, we treat your data very securely, we encrypt it in transit, and the hard drives that our databases run on are encrypted, or something like that. It's like, okay, but you can see everything; it's just not spilling out onto the internet. So this is very nice, because you don't necessarily care all the time about individual users' or individual apps' metrics. What you do care about is: hmm, we have a baseline across all of our deployments that the 90th percentile response time is blah, and then you deploy a change and, oh no, that time doubles, or whatever. In those sorts of situations, privacy-preserving metrics, if they're performant, are just a no-brainer. So it'd be nice to see more of them.
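A minimal sketch of the additive-secret-sharing idea at the core of systems like Prio (real deployments add zero-knowledge proofs that each share is well-formed and use real transport between non-colluding servers; the modulus and server count here are arbitrary):

```python
import random

P = 2**61 - 1  # a prime modulus; shares are elements of Z_P

def share(value: int, n_servers: int = 2) -> list[int]:
    # Split a client's measurement into additive shares: each share alone
    # is uniformly random, but together they sum to the value mod P.
    shares = [random.randrange(P) for _ in range(n_servers - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

# Each server sums only the shares it receives; combining the per-server
# totals reveals the aggregate, never any individual measurement.
measurements = [3, 5, 7]
server_totals = [0, 0]
for m in measurements:
    for i, s in enumerate(share(m)):
        server_totals[i] = (server_totals[i] + s) % P
aggregate = sum(server_totals) % P
assert aggregate == 15
```

The privacy property rests on the servers not colluding: either server's view is just uniformly random field elements.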
David: Yeah, I know Firefox has some PIR stuff that they were using for at least some metrics, and Chrome might have had it at some point; they don't currently. Instead they take steps to anonymize the metrics at submission, and they don't record anything that has too low a prevalence, like how many clients it's seen on, for things that might involve a URL or a cert. That's still, you know, good practice. I have faith that they're doing it, but it would be good to not have to trust them.
David: But if you do wanna know more about that, I think one of his students, Kimberly Ruth, did a paper, "A World Wide View of Browsing the World Wide Web", that used a bunch of that Chrome data, and they describe all of those practices as part of it. So that's probably a good reference. I'm sure it's on Google's site somewhere too, but if you want to see it from a third party.
Deirdre: Uh-huh. Cool. All right: side channels and backdoors. It wouldn't be a Real World Crypto if there wasn't an SGX-falling-on-its-butt talk. And I love these, because they reinforce my biases.
David: Intel got rid of SGX, like,
Deirdre: For consumer chips, specifically. I forget exactly, but blah, blah, blah. Let's see, do I recognize some of these names?
I think Daniel Genkin and Andrew Miller are part of the usual suspects who look at any secure enclave, trusted compute, or SGX-like environment and are just like, let me at 'em. Okay, what else? Randomness in Cisco.
David: In the Cisco adaptive security appliance.
Deirdre: What is that?
David: Well, I first encountered Cisco ASAs almost ten years ago now. We had just built Censys, and I had seen this article, it showed up on Hacker News, I think, but it was from some random no-name website, that was like, oh hey, sometimes your mail SMTP connections can be downgraded to not use STARTTLS
Deirdre: Uh huh.
David: by changing the command on the wire.
And then when we built Censys internally, I was like, I wonder what happens if I search for a bunch of Xs in this field, which is what they were doing to mail. And it was something like 10% of all the mail servers that we saw. So we were like, what's going on here? And so we wrote a paper, "Neither Snow Nor Rain Nor MITM", back in 2015, in partnership with Google, that characterized how often opportunistic STARTTLS was getting downgraded. And the primary downgrader was Cisco appliances doing it so they could inspect the mail,
David: because if it gets encrypted, then they can't read the contents in a middlebox.
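The downgrade is mechanically very simple: the middlebox rewrites the server's EHLO response so the STARTTLS capability never reaches the client, and the client falls back to plaintext. A sketch of that rewrite (the exact replacement bytes varied by device; same-length junk like "XXXXXXXX" keeps offsets unchanged, which is what made it searchable):

```python
def strip_starttls(ehlo_response: str) -> str:
    # Mimic the observed downgrade: replace the STARTTLS capability
    # with junk of the same length so message sizes don't change.
    return ehlo_response.replace("STARTTLS", "XXXXXXXX")

resp = "250-mail.example.com\r\n250-STARTTLS\r\n250 8BITMIME\r\n"
mangled = strip_starttls(resp)
assert "STARTTLS" not in mangled
assert "XXXXXXXX" in mangled
assert len(mangled) == len(resp)
```

Scanning for those junk bytes in EHLO banners is exactly the kind of search that turned up the 10% figure.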
Deirdre: Boo. It smells exactly like how the Great Firewall feels, which is like, you are not allowed, because we are not
David: As you know, if you do a good job, you won't see this at all, right? If the attacker is talented. But I went through every error message we got in a scan, grouped them, and bucketed them, like, okay, bucket that one out, keep going. And I did find one that only showed up on, like, four servers, where they replaced "chunking" with "funking", and they replaced, I don't remember what they replaced STARTTLS with, but it was only on four mail servers, and they were all in Ukraine.
Or, excuse me, they were all in Crimea.
Deirdre: Oh wow.
David: well, I know what’s happening here.
Deirdre: Hmm. That's funny. Yeah, I don't hear much about this. So, this presentation: ECDSA signature nonces and key duplication in a large number of X.509 certificates generated by Cisco ASA gateways. After some yada yada yadas, we recover some RSA keys, some ECDSA keys, and signature nonces.
David: It's like the Debian weak keys all over again.
Deirdre: nonsenses EC D S A bad random. Shutter. We, we have a little section in our, uh, schor signature thing about mitigating bad randomness because for, for all of these reasons, but it’s slightly different with EC dsa. I think ec DSA is even more fragile with these, with the bad nuances. But fun
And then we've got friend of the pod Matt Green, and Nadia Heninger, and a bunch of other people, with speaker Adam Suhl: a backdoor in the Micali-Schnorr generator. So this is like flavors of Dual EC DRBG, but it's this other one, the Micali-Schnorr DRBG.
David: Ultimately we were unsuccessful in finding a plausible back door in it. So
Deirdre: bring more
David: interesting. But, uh,
Deirdre: All right.
David: Definitely the people to look at that.
Deirdre: Yes. Uh huh.
David: moving on, we’ve got a section on messaging and encryption, including the Threema stuff that we’ve already discussed on this podcast.
Deirdre: Interop. I think this is about the EU Digital Markets Act, a lot of policy stuff that'll affect deploying end-to-end encryption for messaging. This is a cool one: metadata protection for MLS and its variants.
I think this is a follow-on from another paper by these researchers, or just another paper. If you switch from a Signal world to an MLS world, that's Messaging Layer Security, you have this asynchronous ratcheting tree between all these people for figuring out how to encrypt messages to thousands and thousands of group members.
How do you protect the privacy of the group metadata, or any other metadata about your secure messaging session? It turns out that once you can all agree on secret key material, you can do a lot of stuff with it, and I think they're using that to sign and encrypt metadata about the group.
And then you can check that everything is good by verifying signatures with some of the same material. From a quick skim, that's what this is about. It's cool to see; there's a nice line of research coming out of, we have MLS now, how do we get more good stuff out of this new way to do end-to-end encrypted group messaging? I'm excited to see that.
And then deniability. This is interesting, because it's about how deniability is not used in practice, and the challenges arising from the design of a deniable system. Yeah.
David: Nobody takes advantage of that property, but maybe that means it's successful, because nobody's doing the thing you would do if it didn't have that property either.
Deirdre: Right. FHE and multi-party computation. Real FHE, an invited talk. Cool. Prime Match, a privacy-preserving inventory matching system. Okay: trading financial stocks. Huh, interesting.
David: I’m not sure we need more anonymity in financial stocks, but if there’s a reason for it, I’m sure they’ll tell us.
Deirdre: Benchmarks of Prime Match in production, adopted by J.P. Morgan. Hmm. Interoperable Private Attribution.
David: At least it wasn’t adopted by Silicon Valley Bank. Am I right?
Deirdre: God, am I right? IPA, Interoperable Private Attribution, for private advertising, huh? Yeah. This is one of the applications of multi-party computation that I am aware of: matching advertisers with ads and targets. Interesting. Oh, learn more?
David: Google has a bunch of work in that space under the Privacy Sandbox banner.
Deirdre: Yeah. Okay, speed-running the rest of these. hacspec! Love a hacspec. I endorse hacspec. It lets you write something that looks like Rust as kind of a reference implementation of a crypto primitive or whatever, but you can compile it to EasyCrypt, Coq, Jasmin, a bunch of cool backends. It's very fun.
Deirdre: High-assurance Go, from friend of the pod Filippo Valsorda. Yay, love Filippo. He's doing this on his own now, as kind of an independent Go cryptographer.
David: Professional open source maintainer, I believe.
Deirdre: Yes, a cool way to do it. And it's not just a "please donate to my Patreon to support my open source software" sort of deal; he's doing it a different way.
This is cool: cryptographic proofs in Coq. They're connecting it up with fiat-crypto, which also does a bunch of stuff in Coq, and then you can compile it to a C target, a Rust target. I've used it for stuff; it's very nice. This is neat, I'm interested. And then: ZK proofs to fight disinformation, from Dan Boneh and Trisha Datta.
Deirdre: I am very interested in this, because I was part of a workshop recently where we had all these tabletops, like: say it's 30 years in the future, and the prime minister of your country has just died, and the oppressive government is trying to say he just died for non-suspicious reasons,
and the rebels are trying to say he was killed because he was too sympathetic to the rebels, blah, blah, blah. How do you deal with this, with all the technology that's theoretically available in 30 years? And it just all devolved into campaigns and appeals to emotion and trusted speakers, and we were trying to figure out, does technology or cryptography help at all in this scenario?
We were like, nope, we don't trust any of it. So this is kind of interesting, because my first instinct is that it's not gonna help. But I wanna see what Dan says about it.
David: It looks to be about verifying where and when a digital image is taken, which is interesting. Then there’s a retrospective from DARPA about real world crypto, which should be interesting.
David: The next session is all about threshold crypto, which, I think NIST is doing a threshold crypto standardization effort.
Deirdre: Threshold signatures, yeah. And leading with Chelsea Komlo, go Chelsea, and Elizabeth Crites and Tim Ruffing. Yep. There's a bunch of FROST stuff, also MuSig2 stuff, that you can hopefully use in your future Bitcoin signatures, in your Zcash signatures, and a whole bunch of threshold ECDSA.
I think FROST is better. And then blockchain, blockchain, blockchain. A new primitive called verifiably encrypted threshold key derivation. Ooh, I need to learn about this. I'm familiar with Aisling Connolly. No relation, I'm pretty sure there's no relation; all the Connollys know each other and are related to each other, just statistically speaking.
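The common core under all of these threshold schemes is secret sharing. Here's a toy 2-of-3 Shamir split over a small prime field; FROST and friends do much more (signing without ever reconstructing the key), but the interpolation step looks like this:

```python
import random

P = 2**31 - 1  # small prime field for the toy example

def split(secret: int, t: int = 2, n: int = 3):
    # Shamir t-of-n sharing: embed the secret as f(0) of a random
    # degree t-1 polynomial, hand out the points (i, f(i)).
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    f = lambda x: sum(c * pow(x, j, P) for j, c in enumerate(coeffs)) % P
    return [(i, f(i)) for i in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x=0 recovers the secret from any t shares.
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = split(123456789)
assert reconstruct(shares[:2]) == 123456789   # any two shares suffice
assert reconstruct(shares[1:]) == 123456789
```

Any single share is information-theoretically useless on its own, which is the property threshold signing then builds on.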
Okay. Advanced encryption.
David: last session of the conference.
Deirdre: tlock: time-lock encryption based on threshold BLS. Cool. All right. Portunus: access control in a distributed system using attribute-based encryption. Cool. Key-manager sort of deals. And the last talk: ask your cryptographer if context-committing AEAD is right for you,
from Mihir Bellare, Phil Rogaway, Thomas Ristenpart, Paul Grubbs, a star-studded collection. Yes: commitment in AEAD. I think I saw this paper; it's pretty cool. Context commitment, not seeing the same kinds of attacks as key commitment, and standardization efforts are there to target context commitment.
I'm interested in this because there's a lot of talk about committing to your protocol transcript, committing to, if you're doing a proof or a signature, all these things that you want to put into the protocol you're running: the keys, the nonces, the commitments, all the stuff going into the thing that you are going to verify at the end to make sure that everything is kosher and no one has tampered with it.
And so I think this is basically taking authenticated encryption with additional data and expanding that additional data to have more context to it. And maybe there's some good reason to do that. Cool.
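A toy sketch of the commitment idea only (NOT a real committing AEAD: the "cipher" here is an insecure hash-keystream stand-in, and real proposals build the commitment into the AEAD itself): carry along a tag that binds the ciphertext to the key and the surrounding context, so one ciphertext can't decrypt validly under two different key/context pairs.

```python
import hashlib
import hmac
import os

def commit(key: bytes, context: bytes) -> bytes:
    # Commitment tag binding the key and the context together.
    return hmac.new(key, b"commitment" + context, hashlib.sha256).digest()

def seal(key, nonce, context, plaintext):
    # Toy "encryption": XOR with a hash-derived keystream (insecure on
    # purpose; it only shows where the commitment tag rides along).
    stream = hashlib.shake_256(key + nonce).digest(len(plaintext))
    ct = bytes(a ^ b for a, b in zip(plaintext, stream))
    return ct, commit(key, context + nonce)

def open_(key, nonce, context, ct, tag):
    # Refuse to decrypt unless key AND context match what was committed.
    if not hmac.compare_digest(tag, commit(key, context + nonce)):
        raise ValueError("wrong key or context")
    stream = hashlib.shake_256(key + nonce).digest(len(ct))
    return bytes(a ^ b for a, b in zip(ct, stream))

key, nonce = os.urandom(32), os.urandom(12)
ct, tag = seal(key, nonce, b"session-42", b"hello")
assert open_(key, nonce, b"session-42", ct, tag) == b"hello"
```

The attacks on non-committing AEADs work precisely because a single (ciphertext, tag) pair can verify under multiple keys; the extra binding rules that out.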
David: Yep. That’s the program.
Deirdre: That's Real World Crypto. I'm excited. I'm hopefully gonna make it to Japan and have a very busy week, and hopefully see some people, and I will try to live-tweet it on Elon Musk's twitter.com.
David: Uh, I was at Enigma, you know, a month or two
Deirdre: Oh yeah,
David: and I kept seeing people wearing security, cryptography, whatever, hoodies
David: and I’m like, shit, I don’t even have one of those. All I have is a mug. It was a very weird experience. Um
Deirdre: Real people in the real world do buy our merch and wear it!
David: and then people who didn’t have the merch were starting to feel left out.
So be sure to get merch before you go to Real World Crypto. Um, we make
Deirdre: Security Cryptography Whatever.
David: of cents per purchase. We’ve decreased our profit margins as much as possible, so,
Deirdre: I can tell you, we have made a hundred dollars in profits from all the merch that we have made. No, the merch is just for fun. We like the merch, and because someone was like, ooh, could we do a thingy, there might be some more merch soon, but I don't know if we'll do it before Real World Crypto.
David: And we have a new website.
Deirdre: Hey, we
David: Same, it's at the same domain, securitycryptographywhatever.com. But we will be posting transcripts there, and episodes will be listenable there, as well as, you know, wherever you get your podcasts. And some of the transcripts may even be very nicely edited.
Deirdre: Mm-hmm. And so you can read it nicely on a nice webpage, and you can link to the webpage, and it looks like something close to a blog post or article version of the episode, not just some weird link that doesn't seem to make any sense.
David: Not a link to a random Descript URL or a random Buzzsprout URL.
Deirdre: Yeah, links that, you know, get tied to versions of Descript that I didn't realize.
People are like, your links are rotting, and I'm like, oh, shit. That's because we upgraded.
David: Link rot at SCW Pod.
Deirdre: So we have that new website. Same place: same great SCW place, same great SCW time. Cool. That's it. Bye.