Slop Forking Git with Pi, Opus Can't Stop Complimenting GPT-5.4, React Slides Kill PowerPoint, and OpenAI Buys Astral for How Much?!
Matt (00:06.604)
Yo, nice to see you.
Wilhelm (00:07.484)
What's up? Good evening, good evening.
Matt (00:11.48)
It's like mid-afternoon. I was chatting with someone and I said we should have a call in the afternoon, and they booked it for 1pm, and I was like, dude, wasn't that a bit keen? And he was like, oh, I didn't want to, like, tread on your toes in the afternoon, you know, Friday or whatever. I'm like, dude, I'm pretty sure Portuguese afternoon doesn't start till about 8pm.
Wilhelm (00:13.312)
Okay, 4 p.m.
Wilhelm (00:36.896)
Yeah, that's fair. Have you actually noticed the difference in that? I know it's a Spanish thing, right? To have dinner at like 10 p.m. or something. Is there a thing in Portugal too?
Matt (00:46.648)
Nah, well not 10, I think it's nine. Which, like, kind of fits with me as well. Okay, so do you actually know, do you want to hear something about the Spanish thing? Okay, so this comes from an Instagram reel, so I don't know if it's entirely correct, I need to fact check it, but I'll tell you anyway. And it's the thought process that, well not thought process, it's why the Spanish have their time the way they do.
Wilhelm (00:55.935)
Yep.
Matt (01:13.554)
And apparently they're like completely mis-time-zoned. So when Germany was going rampant around Europe, they put everyone on their own time zone. And Franco, who was in Spain, was so enamored by the other dictators, he was like, dude, I'm going to do it to myself out of solidarity. And so he popped the whole of Spain onto Central European time, which is...
Wilhelm (01:13.642)
Mm-hmm.
Wilhelm (01:17.44)
of course. Yeah, yeah, there they are.
Wilhelm (01:26.984)
Yeah, yeah,
Wilhelm (01:36.736)
You
Matt (01:42.231)
probably an hour earlier than they should be if you look at the maps. It's weird that they're actually an hour ahead of the UK even though they're actually further west at some points. And so there's some thought that that's actually why Spain has this really messed up late evening thing, or at least it contributes to it, because they're actually just on the wrong time zone.
Wilhelm (01:44.634)
If you look at the time zone map, yeah, yeah,
Wilhelm (01:55.23)
Yeah, totally. Yeah, yep, No.
Wilhelm (02:07.633)
Yeah, no, it actually makes total sense, yeah. And funnily enough, I think I actually saw the same reel.
Matt (02:14.411)
Probably. Because I was talking to Juliette about it and she was like, you mean this one? I was like, yeah. We all said the same thing. But that makes me think the other way around must be kind of interesting, because Poland, and actually even further, I'm pretty sure Latvia, Lithuania, they're all on, yeah, UTC plus one. At least I think they are. So.
Wilhelm (02:15.732)
Yeah.
Wilhelm (02:19.039)
We've all seen it.
Wilhelm (02:26.217)
Hmm.
Wilhelm (02:29.873)
Yeah, UTC plus one or whatever it is.
Wilhelm (02:37.009)
Yeah, you're right. Sweden, like all the way up Norway, also on, yeah, even the Balkans.
Matt (02:41.899)
No, but that doesn't matter that much because you're north. It's more like when you go further west, what happens? Like, do they have very early days and the Spanish have very late days? How does that work?
Wilhelm (02:48.465)
No, no, sure, but yeah.
I think the wildest time zone is the China one, right? Because everyone's on the same time. So if you're all the way in, what is it, the west of China, I think you're getting up at like 10 a.m., or you wake up at 10 a.m., and then it's like a completely shifted day. Good time zone chat to kick off the pod. All right, let me roll the intro.
Matt (03:00.747)
yeah, yeah yeah, that's crazy.
Mm.
Matt (03:11.734)
wild.
Wilhelm (03:25.343)
It's so catchy, man. I was like singing it in my head yesterday. Also funnily enough, I was with my brother a few weeks ago and he was like, can you please send me the prompts, because I've tried ElevenLabs music and I cannot get it to generate something good. Like, how did you make this intro?
Matt (03:26.241)
Dude, it never gets old.
Matt (03:42.806)
I actually have no idea how you made it. I'm convinced you commissioned it, because I also tried it on ElevenLabs and it was really hard.
Wilhelm (03:49.663)
It was a lucky strike, but yeah, it's been very hit or miss for me with ElevenLabs, to be honest. Yeah, I don't know.
Matt (03:56.95)
Yeah, I mean all of that stuff, it's all like a lottery, isn't it?
Wilhelm (04:00.616)
Also, actually while we're on this topic, I know there's obviously lots of licensing stuff around music. One of the things I've been building into my personal assistant app thing is like a lyric learning thing. Because I'm really, really, really bad at song lyrics. They just don't register for me. I'm all about vibes with music. But Jess knows the lyrics to every bloody song. It's unbelievable. So I've been trying to make an effort. Well, I've been...
Matt (04:12.544)
Hmm.
Wilhelm (04:29.183)
trying to make an effort with learning these lyrics. Learn some songs, yeah. And obviously, you know, my personal AI chat app has like an Anki-style, like, what's it called? Like repeated, is there a name for this? I should know this. I was just working on it.
Matt (04:31.923)
We tried to learn some songs.
Matt (04:49.749)
Mm-hmm. No idea. I have no idea what you're talking about.
Wilhelm (04:53.535)
It's like when you don't learn, whatever, it's like when you don't learn a thing and then when you, you definitely know what this is. It's like when you repeat something, spaced repetition, that's what it's called, like a spaced repetition thing so you can actually remember things well. And I just built the lyric learning thing into it. But.
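For anyone unfamiliar with the Anki-style scheduling being described, here's a minimal sketch of the spaced-repetition idea, loosely following the classic SM-2 scheme. This is not the app's actual code; the names and constants are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Card:
    """One lyric line (or any fact) being memorized."""
    interval_days: float = 1.0   # days until the next review
    ease: float = 2.5            # growth factor, nudged by performance

def review(card: Card, remembered: bool) -> Card:
    """Very simplified SM-2-style update: each successful recall pushes
    the next review further out; a failure resets the card to tomorrow."""
    if remembered:
        card.interval_days *= card.ease
        card.ease = min(card.ease + 0.1, 3.0)
    else:
        card.interval_days = 1.0
        card.ease = max(card.ease - 0.2, 1.3)
    return card

card = Card()
for _ in range(3):
    review(card, remembered=True)
print(round(card.interval_days, 2))  # intervals grow: 1 -> 2.5 -> 6.5 -> 17.55
```

The whole trick is that the review gaps grow multiplicatively while recall succeeds, which is what makes the lyrics stick with few repetitions.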
Matt (05:12.127)
Okay?
Wilhelm (05:20.423)
Models have this hard-coded thing at the end where they refuse to output certain content. So like a model will never output full song lyrics to you, or at least the Claude models won't, but I think this is across the models. So they will link you to a page that has the lyrics. They will fetch, like they will execute a tool call that fetches the lyrics, but they will never actually return you full lyrics, because they're licensed or whatever. But what you can tell them,
is like, okay, use your browser-use skill to fetch the full lyrics from a website and then just save them to a file. And then don't tell me the lyrics, but just work with them in the background and, like, copy them around in the code, whatever. Because obviously if I want these lyrics to exist in my app, they ultimately need to exist in the code in some way, or they need to be imported in some file. But the lyrics can never pass through tokens. Like they can never pass through the tokens of the model. So if it wants to,
like, hard-code the lyrics in a TSX file or something, that's not allowed, but it can do some weird stuff to reference them. And then the model is also like, that's such a smart way to get around the model limitations, let me help you with that. Because the model itself is happy to do it, but there's some layer of content rules that get applied just as the model is about to reply to you, right? So there's like a deterministic,
non-model layer baked into the API or something. But the model itself is happy to do it, which is just kind of hilarious. I think this also applies to certain open source licenses. I think I saw a thing on Mario Zechner's, the Pi guy's, Twitter where it was refusing to output the AGPL license.
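The "never pass through tokens" pattern being described can be sketched as a tool whose return value only carries metadata, never the content. This is a hypothetical illustration, not the actual app's tool (the function names and the stubbed fetcher are made up):

```python
import hashlib
import tempfile
from pathlib import Path

def save_lyrics_tool(fetch, url: str, dest_dir: str) -> dict:
    """Hypothetical agent tool: fetch lyrics and write them straight to a
    file on disk. The *return value* (the only thing the model sees)
    carries just metadata -- path, size, checksum -- never the lyrics,
    so the text never passes through the model's tokens."""
    text = fetch(url)                      # e.g. a browser-use skill
    path = Path(dest_dir) / "lyrics.txt"
    path.write_text(text, encoding="utf-8")
    return {
        "path": str(path),
        "bytes": len(text.encode("utf-8")),
        "sha256": hashlib.sha256(text.encode("utf-8")).hexdigest()[:12],
    }

# Demo with a stubbed fetcher standing in for a real web fetch.
fake_fetch = lambda url: "la la la\n" * 40
with tempfile.TemporaryDirectory() as d:
    meta = save_lyrics_tool(fake_fetch, "https://example.com/song", d)
    print(meta["bytes"])  # 360 -- metadata only, no lyric text returned
```

The app code can then import the saved file directly, which is exactly the "work with them in the background" move: the content exists in the codebase without ever being emitted by the model.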
Matt (06:48.372)
Yeah, yeah.
Matt (07:06.932)
That's quite funny. That's quite funny. Okay, okay. Speaking of the Pi guy, speaking of Mario, have you tried Pi?
Wilhelm (07:16.457)
Mm.
Wilhelm (07:20.553)
briefly,
Matt (07:22.118)
It's sick. It's like everything I wanted from Claude Code and OpenCode when I'm sat in front of it. I haven't tried it that much for remote stuff, but sat in front of it, the tree abstraction is so good, the extensions are beautiful. People have written some extensions so we can use our internal AI gateway.
Wilhelm (07:23.571)
Yeah, I like the theory.
Wilhelm (07:37.439)
Mmm.
Mm-hmm.
Matt (07:45.07)
It's just so easy to write. Like you can basically, it just writes itself. Like the API structure is so nice. And the auto-research that Toby made for it, the Shopify CEO, off the, I don't know if he made it, it was made basically like, Karpathy did some auto-research thing and then they made an extension for Pi that does the same thing. It's insane. Over the last like six hours, I think it's been running.
Wilhelm (07:45.119)
Mmm.
Wilhelm (07:56.01)
yeah.
Matt (08:14.868)
on mine, yeah, for mine. I just took, you know, you know I'm obsessed with Git, right? I took libgit2. Yeah, I took libgit2 and just made a full Zig, like a native Zig version of Git from libgit2. And it works. It's 46% faster than the Rust implementation. Yeah, it's insane. Like, I...
Wilhelm (08:15.281)
for you? no way.
Wilhelm (08:21.158)
Mm-hmm. Yeah, Zaggy. Okay.
Wilhelm (08:30.27)
amazing.
Wilhelm (08:38.78)
No way.
Is libgit2 in Rust at the moment? Is that like the canonical? Oh, okay.
Matt (08:44.742)
No, libgit2 is in C. It's in C. But there's loads of other, there's loads of random shit in libgit2. And like, I guess that's the point. Mine won't be fully feature complete. But my Git hashes match, like everything matches. And I'm running it on the React repo, like the biggest repo I could think of. And it...
Wilhelm (08:51.026)
and
Wilhelm (09:00.542)
That's cool,
Wilhelm (09:04.146)
Yeah, nice. And for context for people who don't know what libgit2 is, I happen to know this randomly, but I feel like it's not very well known. But it's like the native bindings that any other application that ends up interfacing with Git uses to actually do stuff with Git. So for example, in the GitHub code base, the stuff that actually merges pull requests, like merges branches under the hood, will call out to libgit2. Does that match your understanding?
Matt (09:11.154)
Yeah, go for it.
Matt (09:28.979)
Yeah, libgit2 is like a Git server. There's a bunch of implementations of a Git server, but they normally rely on libgit2 under the hood. They're like bindings in different languages. There are a few others. There's this Rust one, and there's isomorphic Git, which is in pure JavaScript, which is kind of cool as well. It's a bit slow, but it's kind of cool.
Wilhelm (09:55.42)
Interesting, yeah.
Matt (09:57.725)
But yeah, I wanted one in pure Zig so I could run it in Wasm. So it's just like completely, I could run it everywhere. And you can't run libgit2 in Wasm because it has some stuff that doesn't convert to Wasm very well. Yeah. I don't really pretend to understand all of that.
Wilhelm (10:02.793)
I'm
Wilhelm (10:09.81)
Yeah, yeah, makes sense, makes sense. So is this your first slop fork?
Matt (10:16.892)
Yeah, this is like a mega slop fork. Like this is, this is hectic.
Wilhelm (10:19.55)
Hehehe.
If you're not slop forking right now, what are you doing? You're getting left behind. If you don't have at least three slop forks, what are you even doing?
Matt (10:25.095)
Yeah, well, you're taking stuff that already exists in the ecosystem and just moving it between different tools. And so this is the Pi auto-research tool that's doing this, by the way, right now. Just basically trying to get the full coverage and get it as fast as possible. Because Git itself is a very simple, it's actually a very, very simple
Wilhelm (10:36.21)
Mm-hmm.
Wilhelm (10:39.987)
Mm-hmm.
Matt (10:54.704)
Like it's not long at all. It's not big. It mostly relies on, yeah, like it relies on compressing and decompressing objects. And so Zig is like the perfect language for the manual memory stuff. And for the models it's so easy, because they can look at the standard library of Zig directly, which is tiny. And they can just be like, this is how the standard library compresses stuff.
Wilhelm (10:58.908)
Yeah, like the data model.
Wilhelm (11:04.659)
Mm-hmm.
Matt (11:20.435)
Maybe I should do the same thing, or we should use the standard library approach, or should we try and tweak the standard library approach? And as long as there's a good feedback loop for speed, it can just customize the hell out of it. It's insane. Yeah, it's insane.
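The object model being described here is easy to see concretely. This isn't the Zig code from the episode, just a sketch of the blob-hashing-plus-compression scheme that any compatible Git implementation has to reproduce byte for byte:

```python
import hashlib
import zlib

def git_blob_oid(content: bytes) -> str:
    """Compute the object id Git assigns a blob: SHA-1 over the header
    'blob <size>\\0' followed by the raw content. These same bytes are
    what Git zlib-compresses into .git/objects/<aa>/<rest-of-hash>."""
    store = b"blob %d\x00" % len(content) + content
    return hashlib.sha1(store).hexdigest()

oid = git_blob_oid(b"hello\n")
print(oid)  # ce013625030ba8dba906f756967f9e9ca394464a -- matches `git hash-object`

# The on-disk loose object is just the same header+content, zlib-compressed:
packed = zlib.compress(b"blob 6\x00hello\n")
assert zlib.decompress(packed) == b"blob 6\x00hello\n"
```

Because every tree and commit is in turn a hash over the hashes below it, getting this one function right (plus the compression) pins down most of the object graph, which is why matching hashes is such a strong signal.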
Wilhelm (11:27.614)
Totally.
Wilhelm (11:32.754)
What have been some of the things that you learned, or that surprised you? Is it the same thing, that the test coverage of libgit2 is really, really good? So as long as you get the tests passing, is that the main feedback loop that's driving it?
Matt (11:51.138)
No, I'm having to write my own tests and my own bench scripts for common things. So I will be missing loads of stuff. Yeah, I'll be missing loads of stuff, but I'm comparing it against libgit2 and the Rust version and just testing, because Git doesn't have that, like the internals of Git don't have that many processes at all. There's like...
I'm not gonna list them out now, but there's like six of them. It's really not very big. It's like walking a tree, parsing stuff, creating diffs, compressing and decompressing some objects, parsing commits. Like there's not a huge amount of stuff there. So yeah, I probably won't have support for all of the crazy, like, yeah. Most of the crazy...
Wilhelm (12:42.641)
All the main things, yeah.
Matt (12:46.991)
command line arguments that are just there in the CLI. That's why Zaggy was so cool, because Zaggy just replaced the CLI and then called out to libgit2 underneath, and was like double the speed, half the time, for most commands, because the CLI had some Perl in it and some crazy shit in it, and all of those arguments are like a remnant of a bygone era.
Wilhelm (12:51.486)
I see.
Wilhelm (13:03.069)
Mm-hmm.
Wilhelm (13:08.336)
I see, see.
Wilhelm (13:12.968)
Yeah, yeah.
Matt (13:13.328)
but you can add all of those if you really want to. But the actual fundamentals of Git are quite, quite straightforward.
Wilhelm (13:18.302)
It is, sure. Yeah, interesting. So, so are you saying that you don't even need like a crazy feedback loop to make this work well, or, or like...
Matt (13:29.945)
Not really, because it's just running libgit2 and the Rust version and just seeing what it gets from both of those options.
Wilhelm (13:33.182)
Right. I see. So it's running like yours and these other two implementations in parallel or like to compare. Right.
Matt (13:39.792)
Yeah, yeah. And then just comparing the time, and then comparing what we're getting out of them, and then just checking the hashes each time. Because Git's based on hashes, like, if stuff's changed. So I'm pretty sure it's quite provable. It's wild that, yeah, models are this good that this is possible. Yeah, I've been trying to work out the implications, because I don't feel like this was possible six months ago.
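This kind of cross-implementation checking is usually called differential testing. A minimal sketch of the shape of that harness, with the implementations stubbed out (in practice each callable would shell out to the real binaries with subprocess; everything here is illustrative):

```python
from typing import Callable

def differential_check(op: str, impls: dict[str, Callable[[str], str]]) -> str:
    """Run one Git operation (e.g. 'hash-object README') through every
    implementation and demand byte-identical output. Since Git addresses
    all content by hash, agreement on hashes is a strong correctness
    signal even without a borrowed test suite."""
    results = {name: fn(op) for name, fn in impls.items()}
    unique = set(results.values())
    if len(unique) > 1:
        raise AssertionError(f"{op!r} diverged: {results}")
    return unique.pop()

# Stand-ins for canonical git and a hypothetical Zig port; real versions
# would wrap subprocess.run and also record wall-clock time per command.
canonical = lambda op: "ce01362"
zig_port  = lambda op: "ce01362"

print(differential_check("hash-object README", {"git": canonical, "zig": zig_port}))
```

The same loop doubles as the benchmark: time each implementation on the same operation, and the agent gets both a correctness oracle and a speed signal to optimize against.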
Wilhelm (13:51.975)
I see, see.
Wilhelm (14:05.629)
What model are you using?
Matt (14:08.337)
4.6.
Wilhelm (14:09.469)
Nice, nice, nice. Opus.
Matt (14:11.716)
Yeah, I think 5.4 Codex would be good as well. I'm not sure I have access via my gateway, but this is Opus 4.6 on the Max plan actually. Pretty good.
Wilhelm (14:17.01)
Mm-hmm.
Wilhelm (14:24.987)
Nice. That's cool. Yeah. One interesting thing that I've noticed in recent use. So I make quite heavy use of this second opinion skill that I have, which is the most straightforward thing ever, right? So my main driver is Opus in Claude Code, or my own wrapper around Claude Code. So Opus 4.6 is the default for everything. But then I just tell it, you should get a second opinion from...
Matt (14:36.911)
Hmm?
Wilhelm (14:52.987)
GPT and Gemini. And Gemini 3 used to be the default for that. And I think it used to be the smartest model when we were in the, like, 5.2 days. I think it was the smarter model. Then, you know, 5.3 came, and now we have 5.4. And it's really, really interesting to see the evolution, because what used to happen is that I would say, yeah, I'll get a second opinion. And then it would call out to Gemini and GPT and be like, yeah, you know, some good other takes.
Let me incorporate them. And by the way, I think this is a really powerful workflow. I think everyone should do this. Like, I think just getting other model opinions in is really valuable. They will find random crap where Opus went off the rails and did something weird. It's like a good check, but it also genuinely brings in other interesting things that Opus missed. But the most interesting thing that's been happening recently is, when I ask it to use GPT-5.4, Opus is always incredibly
Matt (15:39.407)
Definitely.
Wilhelm (15:50.046)
complimentary of what GPT-5.4 outputs. So it'll be like, wow, or sorry, I'll quote it directly so as to not misquote it, but it'll say, like, excellent response from GPT-5.4, or it'll say the opinion from GPT-5.4 is the best and most comprehensive, or stuff like that. So I think Opus really backs GPT-5.4 for, um,
for this use case. And to me, it feels also like 5.4, especially for architecture review or planning, even maybe for coding. I know some people prefer using the Codex model, like 5.3 Codex, for coding still. It just feels like a really good model and a step change in some form. And of course the different families have different strengths. Like, I think Opus and Claude are better at front-end stuff, for example. But in terms of this planning and thinking and architecture and review,
Matt (16:25.966)
Mmm.
Wilhelm (16:46.141)
5.4 feels like we're kind of in a new league. Well, according to my opinion and according to Opus as well.
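The "second opinion" skill described above is structurally just a fan-out to several models followed by reconciliation by the main driver. A sketch of that shape, with the provider calls stubbed (in practice each callable would hit the Anthropic / OpenAI / Gemini API; every name here is illustrative):

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Callable

def second_opinions(prompt: str,
                    reviewers: dict[str, Callable[[str], str]]) -> dict[str, str]:
    """Send the same question to several reviewer models in parallel and
    return all answers, keyed by model name, for the main driver to
    reconcile. Provider calls are stand-ins here."""
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(fn, prompt) for name, fn in reviewers.items()}
        return {name: f.result() for name, f in futures.items()}

opinions = second_opinions(
    "Review this migration plan",
    {
        "gpt": lambda p: "Consider a rollback step.",
        "gemini": lambda p: "Index the join column first.",
    },
)
print(sorted(opinions))  # ['gemini', 'gpt']
```

The value, as described in the conversation, comes less from any single reviewer than from the primary model having to confront answers it didn't generate itself.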
Matt (16:53.903)
Yeah, that's wild. Do you remember back in the day, like, you'd get a GPT model to review another GPT model's stuff, and it was like crazy sycophancy, and I was like, what crack are they smoking? Like, to each other. And there must have been just something where they, like, something recognized, yeah, its own writing. Yeah, that's interesting, huh?
Wilhelm (17:04.476)
Mm.
you
Yeah.
Wilhelm (17:14.043)
recognize each other or something. Yeah.
Wilhelm (17:22.705)
Yeah, I feel like I'm really, I feel like I want like Opus 4.7 or whatever. Like I want, I, yeah, I feel like I need the next one.
Matt (17:35.511)
Yeah, need the next crack from the crack dealer.
Wilhelm (17:38.457)
Exactly, yeah. And then also Gemini feels like almost dated now, even though, you know, when it came out, it was like, wow, this is a major step change. So what a time to be alive.
Matt (17:46.131)
I didn't like it that much. I know you loved it, but I wasn't the biggest fan. Yeah.
Wilhelm (17:52.315)
I mean, I'm a Claude person. I feel like, yeah, I think other people liked Gemini 3 a lot more than I did, but I always liked it because it felt like it would catch things. Like, I never used it as a daily driver for calling tools or whatever. I used it for the review stuff mostly, and it was useful for that, I think, and it's still useful for that, but I think the other models are just even better now. So that's the brief weather report of where we are at March 20th
in this year, 2026.
Matt (18:21.283)
Yeah, it's wild, isn't it? It is wild. Like, I'm still struggling to deal with the change recently of what's possible and what we should be building. And it seems like there's so much stuff that's possible, especially the last couple of days have just been quite, like...
Wilhelm (18:30.736)
Mm.
Matt (18:41.315)
really quite struggling with that, getting a little bit into my own head about what I should build, because I can theoretically build anything now, even if at the moment I don't know how it works or even what it is. So I feel like spending more time imagining a future, I think, is actually better, but it's been breaking my brain recently.
Wilhelm (18:43.472)
Mm-hmm.
Mm.
Yeah.
Wilhelm (19:05.114)
Yeah, massively breaking the brain. I completely agree. You should do a think week. Or a long think weekend or something like that. I'm overdue actually as well.
Matt (19:11.661)
Yeah.
Matt (19:15.151)
Maybe a long think weekend. I have some fun ideas there. I was also listening to Dwarkesh as well. The past couple of weeks I've been trying to get through the one with the SemiAnalysis guy. It's wild. I didn't realize how far up shit creek we were when it comes to GPUs.
Wilhelm (19:24.522)
yeah.
Wilhelm (19:31.705)
I started that, yeah.
Wilhelm (19:42.008)
as in like you think the future is a bit bleak and we're very constrained.
Matt (19:47.471)
we're definitely constrained.
Wilhelm (19:50.779)
Actually, now that I think back to it, I feel like I had a similar reaction. I listened to maybe the first quarter or something, but it made me think, I'm glad that I can get tokens out of these models right now. Like, who knows, in a year's time, will I be important enough to actually be able to use these things, or will it just be too expensive, or will there be too many, I don't know, very valuable enterprise use cases that take priority for, I don't know, Claude 6 or whatever. And mere mortals won't be able to
use it because there's just not enough GPUs to run all the stuff.
Matt (20:26.253)
Yeah. Yeah. No, that's literally how I think about it as well. I'm like, what, what, what, what? Yeah. I'm actually, I'm just kind of struggling with it to be fair. I'm like, we are massively compute constrained for this new world that we're entering. But every, there's also another thing where like this compute is expensive. People are paying.
Wilhelm (20:41.137)
Mm.
Matt (20:56.172)
two-ish dollars an hour for H100s, which is more than people were paying two years ago, for a chip that is now six years old. Like, that's wild. And that's happening because Blackwell, the next generation, is coming out too slowly because of, like, a bunch of supply chain things, but a big one is a company in the Netherlands, ASML, that only produces 60 of these lithography machines a year
Wilhelm (21:02.778)
Right.
Wilhelm (21:07.91)
Mm-hmm.
Wilhelm (21:20.988)
Mm-hmm.
Matt (21:25.505)
when they should be producing thousands of them. And that's somewhat due to a lack of lenses that they can get from Carl Zeiss in Germany. It's all, like, kind of, it's just this weird thing where there are sort of four manufacturers that control all of this stuff. And as you get further down the stack, everyone's more and more AGI-pilled. But at Carl Zeiss, they're not that AGI-pilled apparently.
Wilhelm (21:25.638)
Mm-hmm.
Wilhelm (21:52.071)
They're not AGI-pilled, yeah. It's very new territory for us software people, right? That's like, what do you mean, you can't just turn on the thing? Just drag the Heroku slider to more. Just deploy to region Earth, you know?
Matt (22:03.308)
Yeah.
Matt (22:08.544)
Yeah, I want more, I want more, I want more. But then like, there's a problem, there's a problem on the software side as well, where...
Matt (22:21.676)
How do I phrase this?
Stuff, okay, so stuff has got a lot cheaper to make. People are making more stuff and it's got cheaper in terms of human capacity hours. And obviously you see this a lot in, like, San Francisco and places where you guys are paid, like, shit tons of money. But for the rest of the world, I would say, for software engineering, for software. But when you see the rest of the world, there's gonna be this flippening point where
Wilhelm (22:43.408)
Wait, paid lots of money for what? For like housing? I see. Yeah, yeah, Uh-huh.
Matt (22:56.299)
Maybe there's not going to be, maybe the model price goes down more and more, but there's going to be a point where we're using so many tokens that maybe it's cheaper to have humans, or maybe it's not. Maybe that is ridiculous, because these AIs are actually producing more than humans would anyway. So maybe the AI is better. But yeah, I find it very, very interesting. This whole GPU, yeah, this whole GPU thing and what is the...
Wilhelm (23:10.79)
time.
Wilhelm (23:20.42)
It is very interesting and it's very mind-bending.
Matt (23:24.993)
What is the price of work these days? I saw Steve Ruiz, the tldraw founder, posted on Twitter: it's kind of mind-boggling to me now that I have to pay to code. And that was never happening previously. And it also breaks the contract of the internet a little bit, right? Like, previously you used the internet for free. The internet's always been free. Like, software has been kind of free, as in, like, hacky software has been free.
Wilhelm (23:36.79)
Mm, right.
Yeah, yeah, yeah, yeah.
Yeah, yeah, yeah.
Wilhelm (23:55.332)
Yeah. And to be a good engineer, you don't in theory need an expensive computer. Like, if you have the skill, you can work. But now you need, like, five Max subscriptions.
Matt (23:56.244)
It just takes someone's time.
Matt (24:02.486)
Yeah.
Matt (24:10.188)
Yeah, it starts getting really weird. Like, even five Max subscriptions to learn now, because Stack Overflow is dead. Like, there's a thing there as well. I just wonder how the economic model of these models is gonna carry on, because obviously the open source, sorry. Yeah, and the open source ones, do they get any better? We just published Kimi K2.5 on Workers AI.
Wilhelm (24:16.311)
Right. Right. Yep. Yep. Yep.
Wilhelm (24:26.607)
And you do have like free models and cheap models, right? Yeah.
Wilhelm (24:36.484)
Hmm.
Matt (24:38.347)
It's a pretty banging model, like, it's decent. It's sort of like, if it had come out in the Claude 3.5, Claude 4 era, like the Sonnet 4 era, I think we'd have said this is the best model we've ever seen. Now it's not quite the best, but it's really good. It's really good.
Wilhelm (24:46.329)
Mm-hmm.
Wilhelm (24:51.045)
them. Yep.
Wilhelm (24:56.621)
I was going to ask you this actually, do you use any open source models day to day in your workflows? Because I met someone last week who was like, yeah, the Chinese models, like Kimi and, I actually don't know which, or Qwen, they're actually state of the art. Like, they're totally on par with the frontier models. And I'm like, no way that's true. But I actually haven't tried them yet.
Matt (25:02.047)
No, but-
Matt (25:15.691)
Yeah, I don't think they are. But I do have a bunch of models in Workers AI, so I use them for demos and stuff, and they're perfectly capable. Actually, the smaller ones are mega fast, and so it's really nice to use them for demos. I think when I get my home lab fully set up, I will do some more of that, like with the personal assistant open core thing. I'm just a bit lazy, I'm not gonna lie, in doing all of that, setting all of that up.
Wilhelm (25:37.69)
Nice.
Matt (25:46.142)
But I'll probably use, like, Qwen 3.5 or something like this. I heard it's good.
Wilhelm (25:50.684)
Yeah. Nice, nice, nice. I started using an open source model actually on Wednesday, two days ago. I went to a meetup. It's actually a meetup here that I love, because it has very similar vibes to AI demo days. Really interesting speakers, really good crowd. And like, I think it's happened three times.
Matt (26:06.52)
yeah?
Wilhelm (26:15.243)
I missed the first edition, I went in January, and then it was on again this week. It's called Agents Anonymous. And you spend, like, what, two, three hours at this meetup, and I just remember walking home being like...
Matt (26:22.227)
Okay.
Wilhelm (26:29.509)
Holy crap, I just learned so much. Like, every conversation with someone else, you learn something, which is not true at all for most conferences, right? And I think it's a reflection on the quality of the meetup, but also on just the time we're in right now. There's so much happening, and everyone's trying to figure it out. So the meetup started with, like, I was just chatting to this guy and he told me about Parakeet, which is this Nvidia
Matt (26:38.132)
Yeah.
Wilhelm (26:57.723)
speech-to-text model, which I hadn't heard about, maybe because I'm off Twitter still. And I was like, oh, I thought this was like a solved problem, everyone just uses Whisper. And he was like, no, this model is actually really, really good, especially if you just need English. Like, maybe Whisper is still the best for multi-language stuff, but this is really, really good, and you can run it locally. And I was like, whoa, can I run it on my Mac mini? He said yeah. So then I actually recorded a voice message into my Slack, where my, like, Mac mini chat agent is listening,
Matt (26:59.945)
Yep.
Matt (27:19.753)
You
Wilhelm (27:26.295)
and sent it off, being like, hey, build me this whole voice pipeline. And it was like, okay, you sent a file, I can't read it, it doesn't have any voice capabilities. And I was like, yeah, that was a voice recording, please use Parakeet to figure it out. And then, I kid you not, it built the whole thing. It downloaded Parakeet, like, the whole model is like a gigabyte or whatever. Downloaded the whole thing, wired it all up, then played back my transcription. And I was like, okay, cool, I can hear it now.
It's just so mind-blowing. It's so mind-blowing that you can, and I'm, like, texting it throughout the meetup talks, on Slack, looking at the progress and giving it other stuff to do or whatever. But it just figured out the whole thing. Then it built a whole voice thing into my mobile app, and I just tried it out. And the model is fast, right? So it can transcribe, for me at least, it transcribed 30 seconds of voice recording in like 700 milliseconds. That is f****g
Matt (28:23.409)
Wild. That's crazy.
Wilhelm (28:24.911)
fast. Yeah. And that was just the first thing in the meetup. Like, I need to tell you about the other stuff as well at some point later in this. Steve Faulkner gave a talk actually. Yeah, he was there. We talked about him last week, right? He talked about his V-next slop-forking adventures. And one of the funniest things he did, actually, I really like his humor. I think he has a very cool sense of humor. He's really funny. Yeah. But he said, yeah, hey, I didn't really have time to prepare for this meetup talk.
Matt (28:35.773)
Yeah.
Matt (28:47.305)
He's so funny, yeah, he's so funny.
Wilhelm (28:54.255)
But I was on a podcast this morning, so I just fed the podcast transcript into like a vibe slide generator and it made these slides.
Matt (29:03.177)
Oh yeah, Let It Slide. Yeah, that's an internal tool. It started off with narrations, but I build slides with it as well. It's really good.
Wilhelm (29:07.803)
It's called Let It Slide, that's great. But he would...
Matt (29:16.691)
Yeah, I think it's public. I think it's slides.cloudflare.com, I think.
Matt (29:26.362)
no it's not. I lie.
Wilhelm (29:29.035)
but he just said, like, okay, so we're seeing these slides together for the first time. Like, I haven't seen these slides, let's go through them together. And it was really, really good. And then as he was saying this, I was like, damn, we do a podcast. Why don't we feed our transcripts into a thing and let it generate slides, and then post these as, you know, these LinkedIn PDF carousel posts?
Matt (29:54.827)
Oh no, God no. Yeah, I know exactly what you mean. But do you know, like, I spent... So you're not...
Wilhelm (29:59.705)
And it built the whole thing, Matt. Like, I haven't sent it to you, but the whole pipeline works. Yeah. Yeah. Yeah. And it's not that bad. And it's great, because when we post the pod somewhere, obviously it's a commitment to download a pod and whatever. And now we can put our most rage-baity, controversial stuff in these slides, and it'll make people click on the pod.
Matt (30:04.551)
You did it.
Matt (30:22.985)
What do I say now? Okay. So you're not on Twitter, which means I... Yeah, yeah, yeah, no, I know that as well. Okay, no, but you know... Okay, so you know you're not on Twitter. Well, I actually posted on Twitter that React has completely replaced Google Slides for me in 2026, and it wasn't on my bingo card or whatever. And...
Wilhelm (30:24.376)
hahahaha
Wilhelm (30:29.508)
Wait, you know this is public, right? Anyone can listen to this.
Ha
Wilhelm (30:46.458)
Mmm.
Matt (30:53.308)
I got a bunch of replies, like, it's completely true. I did this talk for Node Congress, and someone, Rita I think, sent me some themes that she'd made her slides with. And I was like, cool, I'll borrow those. Just plugged them into Claude Code and was like, dude, can I have slides for this? And, you know, there's voice mode in Claude Code, I just absolutely, like...
Wilhelm (31:07.802)
How cool.
Wilhelm (31:20.858)
Mmm. Nice.
Matt (31:22.032)
rambled on about what I wanted to talk about, talked through all of the different repos that all of these examples live in, and they all live on my laptop, so like where they all are, how I want to arrange them, and then just went away for lunch and came back to something pretty much, not quite, perfect. It took like another couple of hours of playing around with some stuff, but honestly, considering 20 slides normally took like two days, or a day and a half, like a good
Wilhelm (31:48.537)
Mmm.
Matt (31:51.217)
few hours, into the tens of hours, to do before, probably 45 minutes per slide. It's now like a couple of hours, I would say maybe even less, and all my slides are mega interactive now. I embed my demos in my slides, because it's React.
Wilhelm (31:58.584)
Yeah, interesting. Yep, yep, yep.
Wilhelm (32:08.146)
I see.
Wilhelm (32:11.638)
That's right, because it's just... okay, interesting, I want to ask you about this, because it makes sense to me in theory how it could speed things up. But when you talk about the time, like twenty slides taking two days: most of the hardest part of doing a talk, right, is figuring out, okay, you want to communicate this thing that you understand, but, like, where's the audience at? How do you want to communicate it? What are the main points? Like, how do you meet people where they are? That's the hard part.
Matt (32:35.897)
I know, I know, but I know how I wanna do that. Come on, I'm gonna show you my slides. They're beautiful. Look, can you see them?
And look, the transitions, like, the motions, like, it's just stunning.
Wilhelm (32:45.594)
That's... Okay, but just for me to understand, I mean it looks incredibly beautiful, yeah, and it looks like a very coherent story, but like, do you think... Yeah, that's incredible.
Matt (32:56.048)
And look, here's a sandbox. Look, wait, if I turn... You can just like chat to it.
Wilhelm (33:03.532)
Ooh, your project management CLI. Man, I want to listen to this talk.
Matt (33:06.234)
Yeah, look at that. Yeah, it's good, right? And then this is like another way that you might do it.
Wilhelm (33:14.734)
Yeah, very, very cool. Yeah, interactive slides is wild. Demos are slides. This is clearly the way. OK, I'm convinced.
Matt (33:16.368)
And like this is like the top K, yeah, it's just mega interactive. Yeah, look at this. Look at this. Okay. And now this is one on code mode. So look, I can list a project on, this is calling a running backend. It's calling like my project management SaaS is out there. Look, I've just listed all my projects. Should we create a new project? Me and Will.
Wilhelm (33:28.783)
Yep.
Wilhelm (33:32.973)
Man, yeah.
Wilhelm (33:38.54)
Okay, this is crazy. Okay, so for people who can't see this: what I'm looking at is basically an incredibly well designed, beautiful, useful, interactive website that just happens to look like slides. And that has slide... yeah, right. That is beautiful.
Matt (33:52.155)
Yeah, and supports arrow keys moving between different pages. So look, Matt and Will, we just created a little project in the SaaS backend using some generated code, and now we can create a task. Should we create a task? Ship code mode.
Wilhelm (34:05.72)
Okay. Okay. Okay. The point I was just trying to make is that you're not some consultant who's been told to give a talk about X and then you vibe-slide out 20 slides that otherwise you would have actually thought about. This is like, you know exactly what you want to say and now you're just making it beautiful.
Matt (34:12.708)
No, dude, I'm like...
Matt (34:18.288)
But look at this. Yeah, and even expressing it in this way... I could not have made this pretty personally. It would have been so hard. And the cool thing about CSS is the alignment is kind of done for you a lot of the time, which it really isn't in slides; it can be a pain. And like, look at this. This is one slide that I can do stuff with and play with and... phone call.
Wilhelm (34:28.538)
Mm.
Wilhelm (34:34.977)
Yeah, yeah, yeah, my god.
Wilhelm (34:40.919)
my god, yeah.
Wilhelm (34:45.892)
Damn, yeah, using CSS for alignment is obviously such a better way to do it than in slides. It just used to be harder. Now you can just tell it: make it aligned.
Matt (34:51.718)
Look this. I can...
Matt (34:57.35)
Look, I'm showing off like global outbound in dynamic workers. So look, I just call this endpoint. And this is live code, right? I can like console log.
Wilhelm (35:02.478)
Yeah.
Wilhelm (35:05.966)
Yeah, that's incredible.
Matt (35:07.206)
Console log, actually this is return, this is return hello.
Wilhelm (35:11.034)
Can I, I can go on this as well, right? This looks like it's live on the internet. Code mode dash talk. No, I won't give.
Matt (35:14.342)
It is relatively live, yeah. Don't say it out loud. It won't be live when someone else comes to look at it. And right at the end there's like a whole chat thing, like, can we list my workers? It's fun, right? yeah, it's fun.
Wilhelm (35:26.584)
Man, it's, I'm not gonna lie, this is epic. I'm glad you showed me.
Wilhelm (35:36.279)
I found a bug.
Matt (35:39.172)
yeah? You found a bug on my slides.
Wilhelm (35:39.886)
When you hit the arrow key to go back, like go to the left, the animation still makes it look like you went to the right.
Wilhelm (35:51.476)
Hahaha!
But yeah, this is, yeah, wow. I wish everyone could see this.
And you're giving this talk at MCP Dev Summit as well, right? In New York, in...
Matt (36:05.913)
I'm giving a similar talk at MCP Dev Summit.
Wilhelm (36:11.737)
That's awesome. Yeah. So what's your plan for this trip? you, do you end up coming out to SF or no?
Matt (36:18.053)
I don't think so. Because I got into AI Engineer Europe as well. So it's MCP Dev Summit one week and then the next week is AI Engineer Europe. So...
Wilhelm (36:23.993)
Mm.
Wilhelm (36:27.606)
wow, yeah, fair. That's stacked.
Wilhelm (36:36.077)
That's a stacked schedule. Yeah, fair enough. Another time.
Matt (36:38.211)
Yep. Another time. I want to come to SF, I do like it there, it's fun. But I'm already away for two and a half weeks, because then I've got a friend's wedding the weekend after AI Engineer Europe. And then I'm going to stay in the UK a little bit, because the Wednesday after is Cloudflare Connect. So if anyone's around in London, Cloudflare Connect is pretty...
Wilhelm (37:05.187)
Mm.
Matt (37:08.068)
Pretty fun. It's a good thing to come to.
Wilhelm (37:09.561)
Cloudflare Connect. Nice. That's cool. Another really interesting thing at this meetup was that the Cursor people gave a talk, and they seem to be all in on background agents now. Like, that is very much the focus. But it's interesting... I think the cool thing about these talks is not just, like...
Matt (37:12.738)
all the crew will be around.
Matt (37:24.698)
yeah, sick. Yep.
Wilhelm (37:35.95)
the actual content, but also you see how people speak about a thing, like, how they seem to feel about it, right? And it's interesting, because Cursor used to be such the obvious winner in all of this, like, they got all the users super early. And now it feels a bit more like they're trying to catch up, or figure out what their place is, or trying to convince us... or trying to, yeah, pitch their vision for
Matt (37:55.275)
definitely.
Matt (38:00.793)
Yeah.
Wilhelm (38:02.552)
background agents. It feels much more like a kind of compelling, reasonable sales pitch, but it's just kind of far removed from the raw hype and excitement of the initial stages of using AI for coding. I think that hype is now kind of more in the OpenClaw world of things. That's where it feels like, wow, holy crap, what's happening,
let's do it all. Rather than... I guess, yeah, maybe they're just at different stages of the hype cycle or whatever, but it feels like a very different kind of pitch for the Cursor product than maybe it was a year ago.
Matt (38:44.803)
Yeah, well, I find all of this... like, the whole startup thing, looking at it from the outside at the moment, I find all of this kind of strange, and it kind of hurts my head a touch, but I can imagine what they're feeling. It's a bit of an existential crisis, right? Like, they started... the models almost got too good. They started with this IDE, which was like, maybe we can use AI for coding,
Wilhelm (39:09.304)
Right.
Matt (39:14.973)
and we can have this amazing autocomplete that jumps lines. Oh, we can have agents inside, working with you. Oh shit, the agents can do it all. Fuck, what do we do now? Because now you don't need the IDE. Now what happens? Okay, right, now we have to invest in background agents. We have to do that whole agent-hosting infra.
Wilhelm (39:22.722)
You
Wilhelm (39:29.561)
You
Matt (39:42.424)
Yeah, like there's no other, there's no other option.
Wilhelm (39:47.107)
Do you use background agents in your day to day at the moment?
Matt (39:51.619)
Do I use background agents? I... do not.
I... yeah, I don't know. I spend most of my day in front of a laptop. And during that day, I have up to six agents running at any one time, all the time. So whether I'm doing background agents... they're always running in a shell on my computer. So no, it's not background,
inasmuch as I'm collaborating and overseeing in a way that is very much still on my computer.
Wilhelm (40:30.861)
Yeah. Yeah, yeah.
Wilhelm (40:36.525)
Yeah, yeah. Mostly the same for me, to be honest.
Matt (40:37.987)
That doesn't mean it won't change soon. I just haven't found a good experience where I feel like I can still do everything and know what's going on. And a big thing, I've said it loads of times, a big thing for me: I run all of my coding agents in my top-level code directory, because I work across multiple repos at once. Like, I'll have a POC in one repo, I'll have, like, a...
Wilhelm (40:59.672)
Hmm.
Matt (41:07.765)
a reproduction for a customer where something was broken in another repo, and then I'll have the main agents SDK, and then I'll have a consumer of the agents SDK, like the Cloudflare MCP repo. So across those four repos, I'll be looking at one that does something with another, then I need to make the fix in agents, then I need to demo and check the fix in Cloudflare MCP. And that workflow, I really don't think, works very well with a singular repo
Wilhelm (41:23.777)
Yeah, Yep.
Wilhelm (41:37.687)
Right, if a background agent is tied to just like one repo, then that kind of falls apart. Yep.
Matt (41:38.28)
structure.
Matt (41:42.604)
Yeah, like, it all breaks, because the background agent doesn't have enough context. So until... until, yeah, until I can have like a programmable computer in the cloud, which is coming very, very soon, right? Like, it's not going to be far off. No, no... like, okay. So there's a bunch of sandbox companies that do something
Wilhelm (41:46.273)
Yeah, yeah, totally. No, I think that pattern's very smart.
Wilhelm (41:59.213)
Now your Mac Mini is arriving finally.
Matt (42:09.122)
like this, that I'd be kind of keen on. Even... like, Cloudflare Sandboxes are 100% something similar to this that I would be keen on. It's just the whole dev experience of being connected to a remote machine that I haven't quite got my head around. And some people have been super far ahead on this already. Like, I remember Dax talking about this ages ago, where he's like, yeah, all my deving happens on
Wilhelm (42:26.102)
Yeah, feel it, yeah.
Matt (42:38.796)
this machine next to me, not on the machine that I'm working on. Like I'm always SSH'd into my dev box.
Wilhelm (42:45.932)
Yeah, I've never done that.
Matt (42:49.44)
And I've also never done that. I guess, like, when he was learning to do that, I was still in pyjamas, I was still in, like, nappies, you know? So I think I probably need to learn... yeah. I either need to learn how to be productive in that workflow, or I need to build some tooling that allows me to be productive in that workflow, because I want to, like, mount an external file system onto my...
Wilhelm (42:57.29)
and that piece, yeah.
Yeah, different strokes.
Wilhelm (43:18.604)
Yeah, I mean, wait, what's the benefit? Like what would be the benefit of you adopting that workflow? Like why make yourself do that?
Matt (43:18.795)
computer and then just run my coding agent. Sorry.
Matt (43:25.919)
I guess the main benefit is that I could turn my laptop off and it would still run. I guess.
Wilhelm (43:32.542)
fair, fair, yep.
Matt (43:35.007)
Like if I've SSH'd into a dev box and then I run my agents in the dev box, then I don't need to, then it can all just be running the whole time.
Wilhelm (43:37.58)
Yeah.
Wilhelm (43:41.143)
Yeah.
Wilhelm (43:47.321)
I will say... so, I mean, to bring the Mac Mini back into the picture: people ask, oh, what's the best thing you've built with your agent? And it's lots of little things, and I've shared a bunch of them on the pod, right? But one of the most fun things, a very neatly packaged workflow that
people seem to like, is that it's really fun building something on the go. Like, talking to the agent on Slack while I'm at the meetup, and I just have an idea and I can fire it off straight away to get it to build. And obviously the code runs and does all the stuff on the Mac Mini. And same thing in the gym, you know? I feel like I spend a little longer in the gym now, because in between stuff I'm just chatting to my agent on Slack and it's building things. And then it's maybe updating the mobile app, and I'm
refreshing the mobile app to see the latest changes, and that is a really, really cool experience.
Matt (44:37.941)
Yeah, that distracts me too much in the gym. I can't... I can't start doing that, otherwise I won't do any exercise. Like, I won't do any sets if that happens. Yeah.
Wilhelm (44:43.768)
It's the age of distraction. Yeah, there do tend to be long breaks between some sets if I get too distracted.
Matt (44:50.241)
20 minutes of break just like hanging out texting Claude.
Wilhelm (44:55.915)
It's a weird time. Wait, I was going to mention something. Oh yeah, just on the point around background agents: the Cursor people mentioned that one of the things they learned is that it's really, really crucial to let users SSH in, to connect into what the agent is doing, into the box, at any time. That's something that really worked well for them, which, I don't know, maybe that's an interesting takeaway.
Matt (45:13.61)
Yup.
Wilhelm (45:20.952)
And obviously the great premise of background agents is that, like, if you have a thousand tickets open, right, you can only get through them so fast, and you might need some crazy local worktree setup... and, yeah, I don't know, I'm a bit bearish on worktrees now, I just think it's clunky to make all that happen. But if you have a background agent that's well set up, and has a feedback loop, and has the whole environment configured, you can just spin up a thousand
background agents and they can churn through all these tickets at the same time, right? That's kind of the premise: really high throughput. Not that I've done this, but that's kind of how they were pitching it.
Matt (45:51.701)
Yep, the worktree.
Matt (45:59.552)
Yeah, the worktree thing, I've kind of worked out a way that I like doing it. I have a slash command that's forward-slash work, and then I just basically dump all my context after that. And what forward-slash work does is it says: in the next message, there will be a repo that is spoken about that's somewhere in this directory;
whichever repo you're meant to be working on, make a new worktree on that repo, but keep it at the high level. Don't put it under .claude or do anything nasty like that. Keep it at that high level in the code workspace, and then just work on that from then on. And that works really well, because I'm normally working on four or five different things at once, even on the same repo. And so I just open a worktree for each one.
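A Claude Code custom slash command like the /work one Matt describes is just a markdown prompt file under .claude/commands/; this is a hypothetical reconstruction, not his actual command (the filename and wording are guesses; $ARGUMENTS is Claude Code's placeholder for whatever text follows the command):

```markdown
<!-- .claude/commands/work.md (hypothetical sketch of a /work command) -->
The next message describes work on one of the repos somewhere in this
directory. Work out which repo is meant, then:

1. Create a new git worktree for that repo.
2. Keep the worktree at the top level of this code workspace, as a
   sibling of the main checkout. Do not nest it under .claude or
   anywhere inside the repo itself.
3. Do all work for this task inside that worktree from then on.

Context: $ARGUMENTS
```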
Wilhelm (46:45.016)
Mm-hmm.
Matt (46:55.391)
And then I need to do some pruning; at the moment I've got like 20 open. For agents, that just all works, and then I can just ask for the worktree to be destroyed. And I leave that thread open, normally, for the lifetime of the feature until it's merged. Yeah, and then I destroy the worktree. It's really good.
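Under the hood, that workflow comes down to a handful of git worktree commands; a rough sketch, with hypothetical repo and branch names:

```shell
# Hypothetical repo at ~/code/agents; one worktree per feature,
# kept at the top level of the code workspace (a sibling of the
# main checkout, not nested inside the repo).
cd ~/code/agents
git worktree add -b fix-retries ../agents-fix-retries

# ...an agent works in ../agents-fix-retries until the feature merges...

# Destroy the worktree once the feature lands, then tidy up stale entries.
git worktree remove ../agents-fix-retries
git worktree prune
```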
Wilhelm (47:01.048)
Yeah, interesting.
Wilhelm (47:16.534)
Yeah.
Wilhelm (47:20.278)
Interesting. Yeah, I found... or, yeah, I feel like I've just used separate clones. Just make a new folder; it has its own clone, its whole own .git in there, and it can do things. The worktree stuff, yeah, I've just not found it... I guess I haven't invested that much time.
Matt (47:44.97)
Well, with separate clones, you need them to be in different top-level folders, don't you? Like, in your code folder, if you do a git clone of something, and then you do git clone again of the same thing, but even under a different name, it will still freak out, right?
Wilhelm (47:51.968)
Mm-hmm, yeah, yeah, exactly.
Wilhelm (48:02.528)
You want different top levels. I mean, that's how I've done it. I've not tried cloning things in, like, subfolders, I guess. Yeah.
Matt (48:09.811)
Yeah, okay. Have a little think about that.
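Wilhelm's separate-clones setup is the simpler mental model here; for what it's worth, git is fine with two clones of the same repo sitting side by side in one parent folder, as long as the directory names differ. Each clone gets its whole own .git (paths hypothetical):

```shell
# Hypothetical source repo; each task gets a full, independent clone.
cd ~/code
git clone agents agents-task-1
git clone agents agents-task-2
# Each clone has its own .git, its own branches, its own working state.
```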
Wilhelm (48:12.407)
But it's just, I don't know, a bit of a simpler mental model or whatever in my mind. But yeah, I actually haven't... yeah, I don't know. Worth more investigation for me, for sure.
Matt (48:21.839)
Did you see... speaking of Cursor, did you see they released Composer 2, and that it's Kimi K2.5?
Wilhelm (48:29.415)
I saw an announcement or something, but I have no idea. I actually was conf... yeah, go on.
Matt (48:32.972)
You need to get back on Twitter man, do you want to hear the beef?
Matt (48:38.609)
So, cursor.
Wilhelm (48:40.663)
I actually missed it a little bit for the first time this week, I think. I think I wasn't that productive or happy or whatever, so I was just like, give me some distraction, and I wanted it to be Twitter.
Matt (48:45.023)
Alright.
Matt (48:51.327)
So Composer 2 is Kimi. It came out that they hadn't changed some of the model IDs, and that it's basically an RL'd version of Kimi K2.5, which makes a lot of sense, right? Cursor have probably one of the best RL training sets in the world, at all, because they're doing background agents and they can see whether the changes have been merged or not. So they must literally have the best training set ever.
Wilhelm (49:04.023)
Mm.
Wilhelm (49:09.036)
Mmm.
Wilhelm (49:17.291)
Mm-hmm
Matt (49:21.746)
they are using... yeah, so they're using Kimi K2.5. But the problem with that is that it's under a modified MIT license. So it's MIT, but if your company has over 20 million MRR, you have to, like, say that you're using Kimi K2.5, and you have to be pretty adamant about it. Like, there's some...
Wilhelm (49:44.192)
Mmm.
Matt (49:47.773)
There's some wording, I can't remember the exact wording, but you basically have to say that you're using Kimi K2.5 on any model that is derived from Kimi K2.5. And they've fully renamed it Composer 2, which is in breach of that license. And then a bunch of the Kimi folks were posting about it on Twitter, and then they all deleted their posts. So I'm assuming they just got paid off, but, like... crazy.
Wilhelm (49:55.807)
I see.
Wilhelm (49:59.894)
Yeah.
Wilhelm (50:12.055)
Damn, that is some spicy gossip. So Kimi K2.5 is the one that has this license, the modified MIT license.
Matt (50:24.509)
Yeah, it's like modified MIT. I've got more spicy gossip, actually. I just forget that you're not on Twitter now. Do you want some more? Did you see that Astral got acquired by OpenAI? How did you see that? Oh, okay. Well, okay, so you just... yeah. You know, it was inevitable. Did you see how much they got acquired for?
Wilhelm (50:29.239)
Yeah, go on, hit me.
Yeah, yeah, yeah, big news, yeah. I saw it on Hacker News.
Yeah, how do you feel about it?
Wilhelm (50:47.851)
No.
Matt (50:49.958)
Supposedly a VC posted it on LinkedIn in their, like, thought-leadership post. And I think they were pretty right about all the other numbers, some of which weren't public as well. So they listed a bunch of them, like, who says open source can't make money: Promptfoo, or whatever that company is called, bought by OpenAI for 200 million. Bun, bought by Anthropic for 350 million. Astral,
bought by OpenAI for 700 million!
Wilhelm (51:23.179)
No. No. That's how much they bought it for?
Matt (51:25.049)
So if that is true, yeah, I don't know if it's true, but like he was.
Wilhelm (51:31.659)
So this wasn't a VC that was involved in the deal. This was someone speculating on LinkedIn.
Matt (51:36.218)
they might have been involved in the deal. Like, I don't know, they were like basically blowing their own trumpet about investing in open source.
Wilhelm (51:38.58)
Okay.
Wilhelm (51:44.233)
Yeah, yeah, I see, I see. OK, I mean, that is wild. That is...
Matt (51:48.431)
It's an insane amount of money.
Wilhelm (51:51.337)
That is an insane amount of money, yeah. Yeah. I feel like... so, obviously, I follow lots of Python people on other social networks, or on email, or...
Matt (52:00.581)
man.
speaking of email, are you going to do your agent mail thing?
Wilhelm (52:05.503)
Such a boomer. I actually implemented all the fixes we discussed last time. I actually pointed my clanker at our podcast transcript and was like, Matt had all these gripes with it. I think he's right about how the page needs to load faster. Please investigate. And then it figured that out and it fixed it.
Matt (52:24.103)
sick.
Wilhelm (52:25.153)
There are some fundamental things I need to figure out still, but I mean, it's ready, it's out there, it's used. I'll... yeah, I have some more thoughts on it, but maybe another time. But yeah, I feel like there are a lot of people commenting on this Astral thing. By the way, everything this company makes is hard to pronounce, right? So, we think it's called UV, right? I swear there was a time people were calling it UV.
Matt (52:51.761)
That's horrific. No, it's UV. It's UV.
Wilhelm (52:53.031)
And... are you sure? It's not oof.
And then I thought it was Estral. I don't know, this company. But obviously for content, like...
Matt (52:59.407)
I'm fine.
Matt (53:04.582)
Well, Astral, Astral, like astral, constellation... like, it's named after something. You can't change the name.
Wilhelm (53:15.127)
So you think it's a GIF?
Matt (53:19.345)
Jesus Christ. Yeah, I don't know. But Ruff is pretty easy. I don't know what Ruff... does Ruff stand for anything?
Wilhelm (53:20.809)
Okay, let's not get into it.
Wilhelm (53:25.685)
Ruff is good, yeah.
Yeah, Ruff is fine.
Matt (53:32.016)
And then there's T... yeah, and then there's ty, which I always... yeah, "tie", maybe it's called. I always look at it and think "thank you", which kind of breaks my brain.
Wilhelm (53:35.607)
or tie.
Wilhelm (53:43.767)
Sorry, I thought you were thanking them for making ty. But they... Astral, especially UV, has been incredible for the Python world. It's been a fundamental piece of tech. Yeah. And I think a lot of people are very doom and gloom, and the Hacker News comments were incredibly negative across the board. Like, you know, like
Matt (53:49.18)
No, I wasn't thanking them for making it.
Matt (53:55.548)
Like, it's a fundamental piece of tech, yeah.
Matt (54:07.964)
Mm.
Wilhelm (54:09.275)
"I'm gonna stop using them immediately," and, "you know they raised money, I knew this wasn't gonna end well."
Matt (54:13.094)
But what would they use, what would you use instead?
Wilhelm (54:17.014)
Just go back to raw-dogging it. Which, to be fair, is what Claude does. It's really hard to get Claude to use UV, even if you tell it about it 500 million times in the CLAUDE.md files.
Matt (54:21.488)
Go back to like poetry or something.
Matt (54:30.684)
Do you know the answer to that? The answer to that is to stop using Python.
Wilhelm (54:35.99)
That's your answer to how to get around not using UV, you just stop using Python. No Python hate on this podcast.
Matt (54:40.079)
Yes, stop using Python. Stop using such a silly, silly scripting language and use something that's actually decent.
Wilhelm (54:47.158)
You can't argue with the masses. The crowd has decided. We're all speaking English, aren't we? Is that the best language?
Matt (54:57.261)
I don't know man, I'm in a country where not that many people speak English.
Wilhelm (55:00.95)
I was trying to make some point. I feel like we're five...
Matt (55:04.089)
And you're in a country where I'm pretty sure only 50... Yeah, dude, there's no leg to stand on. Me and you are speaking English, but you're in a country where, isn't it something crazy, like 50 odd percent of people don't speak English?
Wilhelm (55:15.67)
What, in America? 50% of people... No way.
Matt (55:19.675)
Isn't it something mad? I saw a mad stat.
Wilhelm (55:22.486)
I think we're in like the sixth nested tangent now. I have no idea what I was trying to say.
Matt (55:27.077)
Okay.
Bye.
Wilhelm (55:32.096)
We need to unwind these, walk the tree back, five parents, and then...
Matt (55:38.331)
Right, I'm looking up what percentage of people don't speak English in the US.
Wilhelm (55:42.816)
Go on.
Wilhelm (55:48.192)
We should all be speaking Esperanto or something.
But that's not going to happen. So we're going to use Python.
Matt (55:54.113)
Okay, I lied, I apologize. Okay. Okay, so I was kind of right and kind of wrong.
Yeah, I was kind of right and kind of wrong. Okay, so 21.6% of the population speak a language other than English at home. But that doesn't mean they don't speak English. Then, 8.4% of the population supposedly speak English less than well. And less than 1% essentially speak no English, which makes sense.
Wilhelm (56:14.4)
quarter.
Wilhelm (56:27.402)
Yeah, that makes sense. that makes sense. I think...
Matt (56:29.37)
Yeah, but 8.4 % of the population speak English less than well.
Wilhelm (56:32.616)
Less than well, yeah, I didn't realize that. That's a bit higher than I thought.
Matt (56:35.858)
That's quite crazy. And yeah, it's only 78% of the population that speak only English at home. Interesting. So you have, yeah, 22% of the population speaking another language at home, which is a large chunk of the US.
Wilhelm (56:57.929)
Yeah, it's a very varied place as well.
Matt (56:58.938)
It's cool man. I mean that's like the whole allure of the US, right? Is that there's people from all different walks who've like come together and all that craziness.
Wilhelm (57:09.374)
Hmm, a story, yeah. The point I was trying to make, I think, is that I feel hopeful about this news for Astral. Like, they were going to have to make money at some point, and they have been such an unbelievable asset to the Python community with what they've built. And I feel like there are so many open source projects that start out really well but then fizzle out or don't improve much, like literally other Python packaging projects.
Matt (57:12.698)
story.
Matt (57:26.586)
Mm.
Wilhelm (57:38.417)
And then we have this splintering of the ecosystem, and everyone complains about open source funding, or that open source isn't getting funded. And this path, where a company raises VC money and puts out incredible open source and then gets acquired for a lot of money... obviously it has downsides, but it feels like people are, yeah... some people are making it out to be like a
Matt (58:03.182)
well trodden.
Wilhelm (58:07.894)
like a more worrying path than it seems to be. And I... I don't know, I feel like it's gonna keep... it's gonna keep being good. I'd rather... yeah. And UV specifically, I think it's just gonna keep... UV is gonna keep getting better, and it's not gonna be killed off tomorrow, it's not gonna be killed off in a year. That would be my prediction.
Matt (58:20.494)
Wait, that path? Yeah, I think you've probably got a... Like, it's...
Matt (58:33.69)
It's gonna be good. Okay, so the way I see it, it's like a self-perpetuating prophecy, where VCs invest in open source because open source gets a lot of... it's relatively easy to make a big splash and get a few thousand stars on GitHub, and that seems like traction to VCs. So they invest in open source. They also have a history of companies that have done well in open source. So they invest, right? And then when,
Wilhelm (58:41.344)
Hmm.
Matt (59:02.234)
when you're a VC-backed company that has had money thrown at you and all you're doing is open source, you're probably not living some crazy lavish lifestyle, because you're open source and, like, that's not really people's style. On the other hand, you're not spending all your time looking for customers, at all, because you're open source: they come to you. That's literally the point. Like, you actually don't care;
Wilhelm (59:24.48)
Yeah.
Matt (59:30.435)
you're making the thing for them, and they'll come to you, and the community is actually almost overwhelming more than anything else. And so, since you're not spending all this time looking for customers, you can spend this time shipping. And so you can have much smaller teams, so you spend less money. So that bit's self-perpetuating already; you already have a little whirlpool in this whole big circle. And then if you're good at writing software, which you'd hope, after some time you're gonna...
you're gonna get good anyway; like, if you put enough grind in, you're gonna get good. If you're good at writing software, then other people will see your software and be like, holy shit, these guys are good at writing software. And because you're an open source team that had to do no logistics other than write software... they didn't have to sell, they didn't sell anything. They don't have any marketing people, they don't have any sales people, they have hardly any ops people at all, right? They might have one person for a team of 30
Wilhelm (59:58.198)
You
Wilhelm (01:00:23.263)
Mm-hmm.
Matt (01:00:26.412)
who organizes all of that stuff, and everyone else is engineering, or in the engineering teams. Because you don't have all of that extra company apparatus that most companies need, you're actually really lean, and people then look at you as this really good engineering company, because you ship way beyond the number of people you have, because you're very engineering-heavy.
And then, I guess the crux of it, the crux of it is people will see all the stuff that you're doing is really good and they're like, holy shit, these people could do the same stuff but for us, we should pay them money. Like, pay them loads of money.
Wilhelm (01:01:02.547)
Right, right, right, right. And they're building some genuinely critical infrastructure that OpenAI's developer customers need and use and rely on and make better and make it even possible, right? Similar to the Bun thing. Yeah.
Matt (01:01:08.298)
Yeah.
Yeah, exactly. We should definitely bring that in-house. And honestly, these guys are shipping so much, they must be killers. Like, we don't get that sort of performance from these teams that we have at double the size. They couldn't do that. And so these people seem very mission critical and they're external. And at some point, some exec finds that they can buy them all for 200 mil and everyone's happy as Larry, you know? Like, but...
Wilhelm (01:01:33.833)
Hmm. Yeah.
Matt (01:01:38.21)
But so it's self-perpetuating because then the next open source company that comes along, the VC's like, my God, the last one I just made 200 million, let's go.
Wilhelm (01:01:46.354)
I wonder, yeah, I feel like it's also interesting to maybe think about like the counterfactual, right? Could something as good as uv is today, and it's really uncontroversial that uv is like really good. And Bun, yep, yep. So could something as good as uv have been created in like another world where...
Matt (01:01:57.472)
It's stunning. Yeah, it's stunning. And also Bun, right? They're both stunning products.
Wilhelm (01:02:13.351)
none of the VC downsides exist, right? Could there have been a team of this caliber working on something so intensely for so long in some kind of other model that we don't understand, which is kind of broadly more like community or more donation-based or it just seems really unlikely? Everyone's always like, we need to solve open source funding, you know?
Matt (01:02:29.528)
Yeah, yeah, but that's... Yeah, yeah, yeah.
Matt (01:02:36.044)
No, it is possible. No, no, it is possible. So like the...
Wilhelm (01:02:37.673)
And then the next best thing we can come up with is like some kind of donations thing where, if you're lucky, a project gets like 100K a year and can maybe pay like two people and buy some stickers. And that's just like a completely different... like if that's the best other thing we can do, then obviously we can't get uv.
Matt (01:02:57.995)
Yeah, like Zig is a really interesting one there, where they are profitable. They make enough money. I think the way they fund it is donations and sponsorship. But then also they do some contracting, I think, for like other companies. But all of this is distraction from what they're meant to be doing. And they only really have a couple of people working on it full-time, like as full-time team members.
Wilhelm (01:03:23.849)
Mm. Right.
Matt (01:03:28.053)
And so like, if you're building something on Zig, what are you paying for? What are you, I guess, staking your flag to? You're staking your flag to Andrew Kelley, the creator of Zig, gonna keep on coding for another 20 years and gonna find a good successor when he's done, you know?
Wilhelm (01:03:47.145)
Right, yep.
Wilhelm (01:03:51.231)
Yep, yep, yep, yep.
Matt (01:03:52.725)
Like, so it is a bit tough, but it's definitely a different angle because I don't think on the other hand, would you want to invest in a programming language, like specifically a programming language that you thought was gonna get bought by a competitor?
Wilhelm (01:04:12.734)
Right. Yeah, yeah, yeah.
Matt (01:04:13.579)
that would also be tough. So I think it only works for a very specific set of tooling where like it's kind of an ecosystem play. Yeah, I think it only works.
Wilhelm (01:04:23.317)
Mm-hmm. Oh yeah, that's really interesting. Like, you could think, like, who are the losers out of this uv OpenAI acquisition, right? Like, if you were a hypothetical competitor to OpenAI who built a bunch of stuff around Python and around uv, are you now, like, worried? I think that feels like a very reasonable thing. But I can't.
Matt (01:04:46.39)
Probably not, but like probably not that worried. Like I'm sure there were bidders to this. There's the fact that it's double the price of Bun's acquisition.
Wilhelm (01:04:54.314)
Mm.
Wilhelm (01:04:58.685)
Yeah, this is wild. I did not realize it was like that.
Matt (01:05:02.955)
so much money. I mean, it's ubiquitous in Python. And like everyone in OpenAI... I think they have a massive Python code base.
Wilhelm (01:05:12.029)
Right, yep. I think I've heard this as well, yep.
Matt (01:05:13.814)
Yeah, I heard it's a massive monorepo.
So like, you're contributing to that, and there's an option to buy your number one piece of tooling, you're probably gonna buy it. And Astral had all of the other stuff as well. It wasn't just uv, like ty, Ruff, all of this stuff. And they had a few other bits as well. I forgot what they were. Yeah.
Wilhelm (01:05:29.621)
Yeah.
Wilhelm (01:05:34.463)
Mm-hmm
Wilhelm (01:05:40.381)
Yeah, and there's a history of this, right? Like I think Facebook hired all sorts of open source folks who built powerful stuff that they were using a lot, like Mercurial and like HHVM, or, I don't know, might be getting the details wrong there.
Matt (01:05:53.15)
Yeah.
Matt (01:05:57.334)
But it's less common than... I think... I don't know. I'd actually like to see which one has more numbers, because I'm not entirely sure about this, but is it more common? Maybe someone can message me or comment underneath, because I would be really interested. Is it more common for companies to buy in open source and actually acquire the open source company or the open source developers, like Cloudflare did with Astro, for instance, or is it more common...
for something like Facebook and React, where React was invented at Facebook. Like Atom was invented at GitHub. But then they went, yeah, which was more common? Because then they went on to do Zed. So I think it's probably less common for something to be invented in a big company, and that's actually like a mega bullish signal on the state of the company, if stuff is being invented there that is used outside of the company. Because I think it's not...
Wilhelm (01:06:37.468)
Right, right, right. Like which one's more common?
Matt (01:06:56.935)
super prioritized, like that type of thing.
Wilhelm (01:07:00.176)
Yeah, yeah, totally. Yeah. Yeah. Interesting. Interesting to chat through this, and appreciate all the context. Yeah. I feel like I don't know how useful our ramblings and our thought leadership are, how useful this actually is to people listening.
Matt (01:07:11.901)
Nah, I'm just... I think people like this type of stuff. I mean, open source, I think people outside of tech just don't understand open source. When I talk about my work, and maybe you can chime in as well, like when I talk about my work, I'm like, yeah, so for the last sort of three years, 90-odd percent of the code I've written has been given away for free.
Wilhelm (01:07:16.828)
Yeah, okay, fair.
Matt (01:07:39.606)
And people are like, what? And I'm like, yeah, you know, like community and like getting more people onto a platform and like empowering developers and stuff. And people are just like, what? What? That makes no sense. But actually...
Wilhelm (01:07:41.588)
Totally. Yeah, yeah.
Wilhelm (01:07:51.16)
Makes no sense, right? One of my favorite moments of this happening was a Twitter interaction with Guido van Rossum, the Python creator, right? I think he posted, or one of the Monty Python people, I forget who it was, posted on Twitter saying like, hey, I'm playing around with Python. And then Guido replied, actually, Python's named after Monty Python. And I spent the last, I don't know, 40 years of my life building it.
Matt (01:07:59.913)
Mm-hmm.
Wilhelm (01:08:14.836)
and giving the language away for free, and then the Monty Python guy replied like, how do you make money if you give it away for free? And it's just like, whoa. Whoa.
Matt (01:08:28.853)
I mean, it's fair, like it's a fair question. But I guess like, we saw with, yeah, I don't know. Like with the AI stuff, it's so confusing. Like for instance, there's inference stacks, like LLMV. Yeah, like LLMV. No, not LLVM. LLM, which one is it? No. vLLM, sorry, that took me ages. But vLLM is like an inference stack that I think a lot of people use.
Wilhelm (01:08:39.144)
The numbers are so much bigger.
Wilhelm (01:08:46.664)
Hello, VM. Do you mean the one, the French one?
Matt (01:08:58.357)
It's super popular, came out of UC Berkeley, I think. And I'm pretty sure it did. And yeah, if that was a VC-funded company... there are VC-funded companies that are taking that sort of angle. Like ZML is like a French version and they have taken the angle. They're like, we're gonna build the best inference stack, we're gonna raise money,
Wilhelm (01:08:59.06)
Mm.
Mm, right, right. Mm-hmm. Yep. Yep, I did.
Matt (01:09:25.324)
and hopefully at some point it gets used by everyone and then we sell to a big lab, I guess. I guess that's the plan. But I don't know if the time for all of this is shrinking or increasing, because to be bought by a lab, you have to show that, I guess, you... or to be bought by anyone, I guess, you have to show that you're better at this stuff than they will ever be, that you have that experience, that you have that, yeah, that competency, right?
Wilhelm (01:09:56.378)
One interesting thing maybe also with this is like, if we are at a time where, you know, some people say software engineers are done, software engineering is solved, it's interesting that, you know, clearly software is still valuable enough that a big lab like OpenAI is happy to spend 700 million, or at least hundreds of millions. I don't know. What's the... like, I feel like, yeah, it'll be something like that, right?
Matt (01:10:04.691)
Yeah.
Matt (01:10:23.623)
Yeah, it's over 200 and it's less than a billion.
Wilhelm (01:10:26.523)
Right. That is still a big amount of money, even for a company like OpenAI. And what are they acquiring? They're acquiring people, right? And like a foundational technology. But it's software and it's software people. So clearly there's some value left in software. For people who are painting a very bad picture, I'm just saying this is a good message, maybe a good sign.
Matt (01:10:35.581)
Yeah. MIT licensed tech. No, MIT licensed.
There's definitely, okay, okay, so there's this thing that we've been...
But people have been told... we've been talking about this for ages, right? Like literally the last, like, three years, ever since I've known you, right? We've been talking about this, where as the models get smarter, the bit underneath the models is commoditized, and the last bit is what the models can't do, or don't have the ability to do without someone next to them who does know how to do it, right?
There is a distinction, because I think the models can do a lot, but they need probing, they need prodding in the right way by someone who is actually an expert. And those experts become immensely more valuable. And so like you saw for Bun, it was sort of around 10 million per engineer. I don't know how big Astral was, but I can't imagine they were 70 people. So if it is, yeah, if it is 700 million,
Wilhelm (01:11:23.057)
Mm-hmm.
Wilhelm (01:11:39.621)
No, think it was like 20 or something, yeah.
Matt (01:11:43.603)
then it's a lot more per engineer. And I remember being told a couple of years ago that the average price per engineer was around one and a half million. So the price per engineer to me in these companies that are getting acquired is going up and up and up because these engineers have some knowledge of a stack that is very valuable.
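A quick sketch of the per-engineer arithmetic from this exchange; note that both the $700 million price and the ~20-person headcount are speculation from the conversation, not confirmed figures:

```python
# Implied acquisition price per engineer, using the rough numbers
# floated on the show (all speculative, none confirmed).

def price_per_engineer(deal_usd: int, engineers: int) -> float:
    """Deal price divided evenly across the engineering team."""
    return deal_usd / engineers

# Speculated Astral deal: "over 200 and less than a billion", guessed
# here at ~$700M, with Wilhelm's guess of a ~20-person team.
implied = price_per_engineer(700_000_000, 20)
print(f"${implied / 1e6:.0f}M per engineer")  # prints "$35M per engineer"
```

That puts the implied figure well above the roughly $10M per engineer mentioned for Bun, and far above the ~$1.5M average Matt recalls being quoted a couple of years ago.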
Matt (01:12:12.093)
That's why I like Git, Git's not going away, except when it does.
Wilhelm (01:12:16.883)
Yeah, nice. No, it is interesting to talk about, and I think you're right, this is on people's minds. I always just... I'm like, I feel like back when I worked at a company and had internal information about what was actually going on, seeing the public speculation on Hacker News about something you knew very well was just so completely different. Like it just felt like so
Matt (01:12:20.371)
Yeah.
Wilhelm (01:12:41.733)
stupid to see these comments and people being convinced that like some evil bad thing was about to happen or whatever and you just saw the internal narrative was so much more boring or so completely different. So that's why I'm like, I feel like this podcast, it's most valuable, I think in real terms when we are like, we did a thing. This is what worked for me. And it's like least valuable when we're like, yeah, let me tell you what the strategy is behind this acquisition. But I think you're right. Like, yeah, yeah. But I think you're right that probably
Matt (01:13:06.246)
Like mega speculation.
Wilhelm (01:13:11.429)
It is, like, I don't know, whatever, some entertainment, or it's like the reality TV equivalent of podcast production.
Matt (01:13:21.978)
It's weird. Okay. So you might think of this as reality TV, but like, I think a lot of people feel very strongly about this type of thing because this is our livelihood. It's our career. It's something that we're trying to get better at and improve at. And like the skills that you want to learn are changing. And I had this amazing interaction with Kent, Kent Dodds, Kent... I don't know how you say his last name. You know, the JavaScript guy? Yeah, that guy.
Wilhelm (01:13:41.339)
Mm-hmm.
Wilhelm (01:13:49.139)
Kent C. Dodds, yeah.
Matt (01:13:52.55)
Yeah, amazing interaction with him on Twitter where he's just like, I'm like, stuff's moving and he's like, yeah, I have no idea what to teach at the moment. And it's like, he's an educator and he's struggling to work out what to teach people because he doesn't know what's gonna be defunct in like six months, six weeks, six days. But as an educator, he sits in a really interesting spot because people are probably always gonna need to be educated.
Wilhelm (01:14:00.551)
Right. Yep.
Wilhelm (01:14:13.607)
Mm-hmm. Totally.
Matt (01:14:21.554)
I think education was actually quite low down on Anthropic's list of things that are gonna be automated away, which kind of makes sense to me. Unintuitive, but it made sense.
Wilhelm (01:14:29.627)
Yeah.
This reminds me of something I was going to talk about. Sorry. I think this was the... Speaking of these Anthropic reports, there was a really cool one that came out a few days ago, called 81k interviews. Did you see this?
Matt (01:14:50.351)
No, go on, point me at it, send me a link.
Wilhelm (01:14:52.143)
Man, everyone should, everyone should look at this, it's really, really fascinating. So this is some research Anthropic did, I think maybe last summer or something like that, and they wanted to research like, how are people actually feeling about AI? Like, is it useful? Are they worried? What are they using it for? I'll send you the link, and it has beautiful visualizations as well, this report. But instead of just looking at transcripts, which
are obviously, you know, maybe hard to interpret sometimes. Like if someone didn't continue a conversation, is that because Claude helped, like resolved the thing for them? Or is it the opposite, that Claude didn't help them and they stepped away because it wasn't useful, right? So it can be hard to tell. So they built this AI interviewer, where when using Claude, you would get a little pop-up that was like, hey, can we interview you about your AI usage? And then it was an interactive thing where you and Claude would discuss your experience using Claude. Which
Essentially... which, it's from December. Did you see one of these pop-ups? Like, I didn't actually. I regret not doing it, because I could have been part of the largest qualitative study in history. But I didn't do it.
Matt (01:16:01.659)
I think, yeah.
On Claude Code, I think I saw it, but I never replied to any of that stuff.
Wilhelm (01:16:12.359)
Yeah, so not last summer, it was last December, and published this week, but the insights are fascinating, like really, really fascinating. And I would encourage everyone to check out the report, and it's just a beautiful experience. I'll highlight just two things that I found really interesting. One is that this tension of like, AI is useful, but I'm also worried about it. It doesn't exist within...
Matt (01:16:39.502)
Hmm.
Wilhelm (01:16:42.055)
like distinct groups of people. There aren't people who are really worried about it, and then people who think it's really useful. Everyone has the tension, basically. Like the tension exists within every person. They're both finding it useful and they're also concerned about various aspects of it. So that was really, really interesting. And then the second one is that the less Western you are, and the maybe less economically highly developed your country is,
the more you're excited about the opportunities and potential from AI. So if you're in Sub-Saharan Africa or in India or in Latin America, you're more excited about AI, more excited about the potential, more of what it can do for you. And if you're in especially Germany, UK, South Korea, Australia, you're less excited about AI.
Matt (01:17:17.838)
Wait, say that again?
Matt (01:17:26.128)
Hmm?
Matt (01:17:30.042)
Seriously.
Wilhelm (01:17:41.21)
and more worried.
Matt (01:17:44.954)
That's unintuitive, but actually kind of like, it fits with my own lived experience. Actually. Yeah.
Wilhelm (01:17:51.655)
Interesting.
I think it's fascinating. It's really, really fascinating. So those are like two things I thought were cool from that report. But yeah, it's really, really incredible work. And yeah, check it out.
Matt (01:18:06.349)
That's rogue. Can we end on something else that's a little bit off-piste? So I'm gonna bring it up. But there was a study that was done relatively recently, I'm just gonna find the date for you, about the effects of radio frequency radiation on mice, and another one that came out on the effects of radio frequency radiation on rats. I'm just gonna find the date for you.
Wilhelm (01:18:09.906)
Mm-hmm.
Go on.
Matt (01:18:35.415)
And this is a little bit off topic. When was the date? I think it was really quite recently, I think.
Okay, I'm struggling with the date, but let's go with recently, as in rather recent. Oh, okay. So it came out in 2018. It was part of the National Toxicology Program in the US, the NTP report, or the toxicology and carcinogenesis studies in mice exposed to whole-body radio frequency radiation at a frequency of 1900 megahertz and modulations, GSM, CDMA, basically stuff used by cell phones.
So the thought process is like, I wear my AirPods for quite a lot of the day. Am I giving myself brain cancer due to radiation from my AirPods, due to like 2.4 gigahertz radiation? No, it's not 2.4 gigahertz. Basically the lower-level radiation, the stuff in the megahertz, is kind of similar to Bluetooth and to phones and so on. Yeah, like, is that killing me?
What they found, this 2018 study, and I need to see if it's been replicated, because this was the only one I found of it, and it did pop up because someone posted about it on Twitter, so I need to actually go into it a little bit more, so just hold the thought, see if it's valid. But the study on mice and the study that came out on rats, which, so they were two different studies, but they came up with the same conclusions, was that this type of radiation in huge doses, actually
something like 10 times the dose that you'd get from your AirPods, and consistently for the whole day, actually extended mice and rats' lives by somewhere below 20% and above 10%, which...
Wilhelm (01:20:36.09)
What? I didn't... that's not where I thought this was going. So you're saying my AirPods make me live longer? Thank you, Tim Apple.
Matt (01:20:39.128)
Yeah, which was actually wild.
Matt (01:20:46.882)
So I'm saying currently I am less worried about wearing AirPods, but I will check back in on you as my thought progresses.
Wilhelm (01:20:55.548)
That's beautiful. I was about to hit you with a quote from Chernobyl, the show, like 2.5 roentgen, not bad, not terrible. But no, damn, that's some positive news to end on. Love it.
Matt (01:21:04.93)
Hahaha!
Matt (01:21:12.611)
Yeah, we'll see if there's some more studies and stuff like that. I need to do some more Googling, but from first impressions, I am less worried about AirPods than I was for about 10 minutes a couple of years ago.
Wilhelm (01:21:26.768)
Nice. Awesome. Good place to wrap it. Look out for a LinkedIn PDF slop presentation shortly from me that breaks down this transcript into some catchy slides.
Matt (01:21:42.316)
I love that you've like gone off Twitter and now you're like, I must engage with people, so where can I engage? LinkedIn!
Wilhelm (01:21:50.054)
the greatest social network, LinkedIn and Strava.
Matt (01:21:53.582)
Yeah, Strava's good actually, Strava's good.
Wilhelm (01:21:56.253)
Backstrap, yeah. Cool. All right. Happy Friday. Peace. Bye, big love.
Matt (01:21:59.086)
Alright, bye dude. Love you too, bye!