WEBVTT

00:00:00.000 --> 00:00:04.600
Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.

00:00:05.040 --> 00:00:09.700
This is episode 477, recorded April 20th, 2026.

00:00:10.020 --> 00:00:11.020
I am Brian Okken.

00:00:11.280 --> 00:00:12.140
And I'm Michael Kennedy.

00:00:12.440 --> 00:00:17.540
And this episode is sponsored by us and you guys.

00:00:18.240 --> 00:00:21.520
So there's a bunch of courses over at Talk Python Training.

00:00:21.920 --> 00:00:27.200
There's pytest courses over at PythonByte, wait, Pythontest.com, my own site.

00:00:27.320 --> 00:00:28.280
I forgot the name of it.

00:00:28.280 --> 00:00:35.920
But thanks to all the Patreon supporters and a lot of people encouraging me and grabbing copies of Lean TDD.

00:00:36.460 --> 00:00:38.700
And I've gotten some good feedback.

00:00:38.840 --> 00:00:40.880
So I'll plug that later in the show as well.

00:00:41.180 --> 00:00:43.580
Yeah, if you'd like to reach out, send us topics.

00:00:43.700 --> 00:00:47.440
One of the topics I'm covering is from somebody that sent it in.

00:00:47.700 --> 00:00:49.000
And we really appreciate that.

00:00:49.000 --> 00:00:56.520
So get a hold of us through Bluesky, Mastodon, or through email or the contact form on pythonbytes.fm.

00:00:56.520 --> 00:00:59.520
And all of that info is at pythonbytes.fm.

00:00:59.620 --> 00:01:07.080
If you are listening and you're thinking, hey, I'd like to watch this live sometime, you can just head on over to pythonbytes.fm.

00:01:07.080 --> 00:01:09.120
Or just look around.

00:01:09.220 --> 00:01:13.920
You can find links to watch us live on YouTube or watch the past episodes.

00:01:14.620 --> 00:01:21.940
And finally, please join the newsletter because we send out links and information and background information.

00:01:22.340 --> 00:01:27.220
And some people have mentioned to us before that some topics are a little over their head.

00:01:27.220 --> 00:01:28.500
But we don't want that.

00:01:28.680 --> 00:01:33.600
So we send you some background information so that you can understand every topic we talk about.

00:01:33.720 --> 00:01:37.860
And with that, I'll take a rest and it'll be your turn.

00:01:38.240 --> 00:01:38.500
You know what?

00:01:38.520 --> 00:01:40.200
I need to take a rest too, honestly, Brian.

00:01:40.340 --> 00:01:47.160
I've had a long weekend, big party for my wife and had a bunch of our friends in.

00:01:47.220 --> 00:01:49.280
We rented this party van bus thing.

00:01:49.360 --> 00:01:50.720
I drove around a bunch of wineries.

00:01:51.080 --> 00:01:52.240
And I just need to rest.

00:01:52.240 --> 00:01:55.720
But, you know, the Django rest type?

00:01:55.860 --> 00:01:56.260
I don't know.

00:01:56.340 --> 00:01:57.460
I need legit rest.

00:01:57.640 --> 00:01:58.660
I need legit rest.

00:01:58.860 --> 00:02:00.860
But I'm going to tell you about Django rest.

00:02:00.940 --> 00:02:07.120
In fact, Django Modern REST, which is a framework for Django that is type-based.

00:02:07.400 --> 00:02:13.720
So all the classes use runtime type information to do all of their magic, right?

00:02:13.740 --> 00:02:16.280
Not just autocomplete or linting.

00:02:16.580 --> 00:02:18.840
And it has true async support.

00:02:18.840 --> 00:02:23.860
So think Django Ninja-like, but with a different take, okay?

00:02:24.120 --> 00:02:25.760
So this is a pretty cool project here.

00:02:26.080 --> 00:02:30.840
And, you know, I just looked at it and thought, you know, this looks like something that's really fun.

00:02:31.240 --> 00:02:40.880
Actually, one of the differences from, say, Django Ninja is it supports multiple model foundations, I guess.

00:02:41.220 --> 00:02:44.560
So Pydantic, which is the first one listed here, which is great.

00:02:44.560 --> 00:02:47.400
But also msgspec and Attrs.

00:02:47.640 --> 00:02:48.360
Remember attrs?

00:02:48.520 --> 00:02:49.620
Like Attrs is still a thing.

00:02:49.760 --> 00:02:49.960
Yeah.

00:02:50.200 --> 00:02:53.540
From Hynek and, you know, kind of data class style.

00:02:53.880 --> 00:03:02.300
And one of the things that's interesting here is it says if you use msgspec, it allows you 5 to 15 times faster APIs than the alternative.

00:03:02.620 --> 00:03:07.060
msgspec is all about ultra-compact exchange on the wire type of thing.

00:03:07.500 --> 00:03:07.600
Okay.

00:03:07.600 --> 00:03:12.980
Also has true support for ASGI, async applications.

00:03:13.300 --> 00:03:16.580
But one of the things that's interesting is it's just good old Django.

00:03:16.840 --> 00:03:18.320
Like nothing too new.

00:03:19.020 --> 00:03:20.900
Nothing that you wouldn't expect.

00:03:21.040 --> 00:03:24.620
So if you're doing Django, it feels just like, yeah, that totally fits in.

00:03:24.820 --> 00:03:31.280
There's a getting started page here, which is a little zoomed for all of us, that we can go down and sort of go through.

00:03:31.440 --> 00:03:32.020
It's pretty interesting.

00:03:32.020 --> 00:03:41.880
One of the things that's interesting: it also supports PyPy, P-Y-P-Y, not PyPI mispronounced, but literally PyPy, which I think is interesting.

00:03:42.000 --> 00:03:43.860
And Django 4.2 or above.

00:03:44.540 --> 00:03:46.480
Hat tip to an upcoming topic.

00:03:46.800 --> 00:03:50.640
The default recommended way to install it is uv, then Poetry, and then Pip.

00:03:50.740 --> 00:03:51.460
So that's pretty cool.

00:03:51.460 --> 00:04:08.040
So you've got to do things like when you install it or you set it up, you'd say, I want Django Modern REST as a package, bracket pydantic or bracket attrs or bracket msgspec, so that you get your various dependencies installed that you're going to need.

00:04:08.100 --> 00:04:09.020
Or, you know, just whatever.

00:04:09.200 --> 00:04:10.860
Just put Pydantic as a dependency as well.

00:04:10.900 --> 00:04:11.520
Then you're good to go.

00:04:11.720 --> 00:04:12.560
Also interesting.

00:04:12.760 --> 00:04:18.680
Remember I talked about llms.txt and how I added that to Talk Python?

00:04:19.040 --> 00:04:20.960
So LLMs understand better how to work with Talk Python.

00:04:20.960 --> 00:04:23.740
They also understand how better to work with my courses.

00:04:24.120 --> 00:04:25.100
So this does this as well.

00:04:25.200 --> 00:04:32.020
It explicitly has an llms.txt and an llms-full.txt.

00:04:32.180 --> 00:04:37.000
So if you just say, hey, Claude or whatever I'm working with, I'm going to start this project.

00:04:37.320 --> 00:04:40.180
And it's using Django modern rest, a pretty new framework.

00:04:40.280 --> 00:04:41.160
You might not know it.

00:04:41.420 --> 00:04:48.700
So you can actually just drop that URL and say, please read this before you begin this project and make a note that this is a resource for you.

00:04:48.700 --> 00:04:51.020
It also has support for Context 7.

00:04:51.220 --> 00:04:52.240
Are you familiar with Context 7?

00:04:52.580 --> 00:04:52.780
No.

00:04:53.040 --> 00:04:57.620
So Context 7 is, I honestly don't really know what to make of Context 7.

00:04:57.720 --> 00:04:59.960
I thought I understood it, but I kind of don't necessarily.

00:05:00.180 --> 00:05:06.160
But what it does is Context 7 is a website where you can enter different libraries into.

00:05:06.160 --> 00:05:13.660
And then they parse it and turn it into something that AIs can use to understand that library better.

00:05:13.820 --> 00:05:18.520
I'm not sure how great it works, but you come in here and it has different skills, for example.

00:05:18.660 --> 00:05:23.920
Like it has a Django modern rest from Django rest framework skill.

00:05:23.920 --> 00:05:37.240
So you could give it this skill and say, hey, use this agent that understands both of these frameworks because I want to upgrade from Django rest framework DRF, which has some of the craziness that we talked about last week.

00:05:37.300 --> 00:05:37.560
Remember?

00:05:37.920 --> 00:05:39.380
Or two weeks ago, but last episode.

00:05:39.640 --> 00:05:41.860
Anyway, this is a pretty cool project.

00:05:42.580 --> 00:05:43.020
Interesting.

00:05:43.580 --> 00:05:44.300
Yeah, yeah, yeah.

00:05:44.640 --> 00:05:47.080
It's got stuff for, hey, for mine.

00:05:47.280 --> 00:05:48.480
Let's see what it says about it.

00:05:48.500 --> 00:05:48.800
I don't know.

00:05:48.860 --> 00:05:49.940
Actually, it just takes me to it.

00:05:49.940 --> 00:05:54.100
But, you know, it's got, you can submit your own library to this, by the way.

00:05:54.160 --> 00:05:55.860
So that's, I think, how this got here.

00:05:55.900 --> 00:05:58.140
I think I may have submitted this and so on.

00:05:58.140 --> 00:06:01.820
But, yeah, anyway, you can say, hey, I want AIs to understand my library better.

00:06:01.960 --> 00:06:04.440
And this also has a MCP, which you can install.

00:06:04.740 --> 00:06:06.400
Actually, I'm not a super huge fan of it.

00:06:06.420 --> 00:06:07.760
I got other things that I do for this.

00:06:07.820 --> 00:06:15.320
But, anyway, it's interesting that they explicitly went to that effort to help you get started, both converting and just working with.

00:06:15.500 --> 00:06:16.600
All right, so let's look at this showcase.

00:06:16.600 --> 00:06:21.360
Like, notice here, oh, actually, I gave them some short shrift here.

00:06:21.460 --> 00:06:21.800
Look at this.

00:06:22.220 --> 00:06:23.860
They do msgspec, which is cool.

00:06:23.940 --> 00:06:24.820
So you can do msgspec.

00:06:25.000 --> 00:06:27.900
But they also do Pydantic, attrs, data classes.

00:06:28.100 --> 00:06:29.160
We're going to be coming back to that.

00:06:29.440 --> 00:06:30.100
Typed Dict.

00:06:30.440 --> 00:06:31.840
Now, that I did not see coming.

00:06:32.100 --> 00:06:32.660
A TypedDict.

00:06:32.820 --> 00:06:36.540
And straight named tuples as one of your foundations, if you like.

00:06:36.680 --> 00:06:37.380
How interesting is this?

00:06:37.860 --> 00:06:38.040
Okay.

00:06:38.500 --> 00:06:39.300
All right, so let's go down.

00:06:39.340 --> 00:06:41.820
If you scroll down, I didn't want to do the, I'm going to do msgspec.

00:06:41.960 --> 00:06:43.120
I'll do Pydantic, whatever.

00:06:43.120 --> 00:06:44.760
So we can go down a little bit further.

00:06:44.760 --> 00:06:46.440
And there's a full example here.

00:06:46.700 --> 00:06:49.940
And it just shows you, like, a one file Django thing.

00:06:50.020 --> 00:06:53.260
So it shows you how to set up your Django app, your templates, et cetera, et cetera.

00:06:53.420 --> 00:07:00.860
And then you just create these models for data exchange in your web app, which I think is pretty interesting.

00:07:00.860 --> 00:07:05.840
Because a lot of people think of the exchange models as being, like, basically database classes.

00:07:06.060 --> 00:07:07.420
But that's not really what you want.

00:07:07.480 --> 00:07:09.080
You want, like, what does this form submit?

00:07:09.120 --> 00:07:13.280
Or what does this API receive as data, regardless of how we store the database, right?

00:07:13.800 --> 00:07:14.000
Yeah.

00:07:14.000 --> 00:07:21.700
So a user create model, which just has an email, or a user response model, which just has a UUID, which is the ID of the created user, right?

00:07:21.920 --> 00:07:23.480
So then you can create an endpoint.

00:07:23.760 --> 00:07:26.040
And you say this thing accepts a post.

00:07:26.500 --> 00:07:33.960
It is, you derive the class from controller of Pydantic serializer or controller of attrs serializer.

00:07:33.960 --> 00:07:40.340
And then for your signature of your function, you say, hey, I want body of user create model.

00:07:40.420 --> 00:07:43.820
And then it automatically parses and validates that Pydantic style.

00:07:44.060 --> 00:07:44.920
And then you just use it.

00:07:44.980 --> 00:07:49.120
So you'll never get to your code if your Pydantic model doesn't validate, parse, all those things.

00:07:49.200 --> 00:07:50.560
So you don't have to check that kind of stuff.

00:07:50.720 --> 00:07:51.080
Pretty neat.
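
The flow he's describing, declare a model, annotate the endpoint, and the framework validates the body before your code runs, can be sketched with plain data classes and runtime type inspection. To be clear, this is an illustrative stand-in, not django-modern-rest's real API; the class and function names here are made up:

```python
from dataclasses import dataclass, fields
from typing import get_type_hints
from uuid import UUID, uuid4

@dataclass
class UserCreateModel:    # what the endpoint accepts
    email: str

@dataclass
class UserResponseModel:  # what the endpoint returns
    id: UUID

def parse_body(model_cls, raw: dict):
    """Build model_cls from a raw dict, checking each field's runtime type."""
    hints = get_type_hints(model_cls)
    kwargs = {}
    for f in fields(model_cls):
        if f.name not in raw:
            raise ValueError(f"missing field: {f.name}")
        if not isinstance(raw[f.name], hints[f.name]):
            raise TypeError(f"{f.name}: expected {hints[f.name].__name__}")
        kwargs[f.name] = raw[f.name]
    return model_cls(**kwargs)

def create_user(raw_body: dict) -> UserResponseModel:
    # The handler body never runs if the request body doesn't validate.
    body = parse_body(UserCreateModel, raw_body)
    return UserResponseModel(id=uuid4())
```

So a request body like {"email": 42} gets rejected before the handler ever sees it.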

00:07:51.340 --> 00:07:51.980
Yeah, that is cool.

00:07:52.240 --> 00:07:52.380
Yeah.

00:07:52.400 --> 00:07:53.580
So there's a lot more to this.

00:07:54.160 --> 00:07:57.400
People can dig into it and explore it.

00:07:57.400 --> 00:07:59.200
But it looks like a pretty strong contender.

00:07:59.200 --> 00:08:03.860
If you bump over to GitHub, it's got about 1,000 stars, 100 forks.

00:08:04.020 --> 00:08:04.620
Pretty active.

00:08:04.700 --> 00:08:05.460
How old does it look?

00:08:05.700 --> 00:08:06.100
A month?

00:08:07.000 --> 00:08:07.680
No, six months.

00:08:07.960 --> 00:08:09.260
It was created six months ago.

00:08:09.380 --> 00:08:11.260
So I think that's still, that's pretty good growth.

00:08:11.520 --> 00:08:13.180
1,000 stars in six months.

00:08:13.440 --> 00:08:15.100
I mean, it's not OpenClaw, but that's.

00:08:15.520 --> 00:08:18.600
It looks like there's a lot of active development going on still right now.

00:08:18.800 --> 00:08:20.740
Yeah, let's, yeah, a commit an hour ago.

00:08:20.800 --> 00:08:21.600
Let's look at the commits.

00:08:22.060 --> 00:08:22.920
What's going on here?

00:08:23.100 --> 00:08:26.460
Four hours, five hours, seven hours, 11 hours.

00:08:26.460 --> 00:08:29.100
Yeah, that's just, those are the active commits of today.

00:08:29.200 --> 00:08:30.380
That's pretty solid, honestly.

00:08:30.580 --> 00:08:31.080
Really good.

00:08:31.340 --> 00:08:31.460
Yeah.

00:08:31.740 --> 00:08:37.400
So anyway, I throw it out there as another Django area that people can pay attention to.

00:08:37.560 --> 00:08:38.560
Another Django framework.

00:08:38.820 --> 00:08:45.900
It's very similar to FastAPI and very similar to Django Ninja, but it seems like it's a little more flexible in the way you work with it.

00:08:46.440 --> 00:08:46.880
Yeah.

00:08:47.220 --> 00:08:51.440
Interesting comment here from SendPos.

00:08:51.820 --> 00:08:54.500
Django Ninja, Django Bolt, Django Modern REST.

00:08:54.500 --> 00:08:56.780
I guess people get back to Django and enjoy it.

00:08:57.000 --> 00:08:57.560
Nice to see.

00:08:57.560 --> 00:09:07.600
And I think there's a lot to be said for using LLMs and AIs, because Django's been around for a while, so they know how to deal with it.

00:09:08.620 --> 00:09:08.820
Yeah.

00:09:09.780 --> 00:09:12.480
Django is very well understood by AI.

00:09:12.700 --> 00:09:15.640
So that's actually, that's actually a huge bonus in my mind.

00:09:15.820 --> 00:09:16.040
Yeah.

00:09:16.280 --> 00:09:16.460
Yeah.

00:09:16.620 --> 00:09:18.380
Well, should we shift gears?

00:09:19.020 --> 00:09:19.480
What's new?

00:09:19.820 --> 00:09:21.580
Well, what's new is Python.

00:09:22.160 --> 00:09:22.180
So.

00:09:22.460 --> 00:09:23.640
I think it's been around for 30 years.

00:09:23.640 --> 00:09:24.360
What are you talking about?

00:09:25.400 --> 00:09:28.420
Well, so Python, I'm, I'm looking forward.

00:09:28.860 --> 00:09:31.440
So there's Hugo van Kemenade.

00:09:32.540 --> 00:09:33.240
I'm sorry.

00:09:33.360 --> 00:09:36.060
I always mispronounce your name, but we love Hugo.

00:09:36.060 --> 00:09:49.340
So, so there's 3.15 alpha 8 out, plus a release of 3.14.4, and also 3.13.13 is out.

00:09:49.580 --> 00:09:53.420
There's a post about that, but I'm looking forward to 3.15.

00:09:53.700 --> 00:09:58.400
So when does 3.15 come? We're, we're looking at the status of the versions.

00:09:58.400 --> 00:10:04.260
We still have like six months to go before we can really solidify and start using it.

00:10:04.340 --> 00:10:08.980
But I think that I'm excited to get started sooner anyway.

00:10:08.980 --> 00:10:12.280
So we've got, what, a beta coming out in May.

00:10:12.620 --> 00:10:15.640
The RC is in September and the final is planned for October.

00:10:15.760 --> 00:10:17.180
But, look at it.

00:10:17.180 --> 00:10:23.500
What's already in there, already in the alpha? We've got explicit lazy imports.

00:10:23.500 --> 00:10:27.740
And we've been talking about that on the show.

00:10:28.100 --> 00:10:30.420
And that's, that's already there.

00:10:30.500 --> 00:10:32.160
Frozen dict built-in type.

00:10:32.300 --> 00:10:34.600
Anyway, what do I have up here?

00:10:34.640 --> 00:10:38.100
I've got, yeah, the frozendict built-in type.

00:10:38.200 --> 00:10:45.740
This is pretty cool, to be able to, by default, do a dictionary that's hashable.

00:10:45.740 --> 00:10:52.560
You assign it at instantiation time, or when you define it, and you can't change it after that.

00:10:52.560 --> 00:10:54.020
So it's hashable.

00:10:54.160 --> 00:10:55.380
So that's, that's pretty cool.

00:10:55.640 --> 00:10:55.760
Yeah.

00:10:55.780 --> 00:11:05.580
And one thing I'd like to add to this frozendict that I think is super interesting: we've had other frozen types like set, I think, and list, but in this Python t, the free-threaded

00:11:05.580 --> 00:11:13.280
Python world, one of the things that can really unlock concurrency is not having to worry about locking on different objects.

00:11:13.400 --> 00:11:19.460
If you work with frozen dicts, it's read only, and you can just have all the threads ram on it all at once, right?

00:11:19.520 --> 00:11:21.760
You don't have to worry about locks once it's created.

00:11:21.760 --> 00:11:27.280
So people just consider adopting immutable data types in general when possible.

00:11:27.280 --> 00:11:34.380
Like if you're creating a dict, but you're not going to change it, frozen dict seems like something cool to put in place that like adds a little more security.

00:11:34.380 --> 00:11:37.900
So like if you don't expect it to change, like you can set it up now.

00:11:37.900 --> 00:11:38.980
So it cannot change.

00:11:39.220 --> 00:11:39.560
Yeah.

00:11:39.560 --> 00:11:45.660
And even for things that generally we think of as changing, you can use data flows, like algorithmic stuff.

00:11:45.660 --> 00:11:46.840
That's functional.

00:11:46.840 --> 00:11:55.300
Like, like you said, it's a functional model for some part of your system that can be easily asynced because it's all immutable types.

00:11:55.300 --> 00:11:56.780
So yeah, pretty cool.
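
A frozendict won't exist before 3.15 lands, but the idea, a mapping you fill once and can then hash and share freely, can be approximated today with a small wrapper (a sketch, not the eventual built-in's API):

```python
class FrozenMap:
    """Minimal immutable, hashable mapping: set once, never mutate."""
    __slots__ = ("_items",)

    def __init__(self, **kwargs):
        # Sort so equal contents hash equally regardless of keyword order.
        object.__setattr__(self, "_items", tuple(sorted(kwargs.items())))

    def __getitem__(self, key):
        for k, v in self._items:
            if k == key:
                return v
        raise KeyError(key)

    def __hash__(self):                  # hashable, so usable as a dict key
        return hash(self._items)

    def __eq__(self, other):
        return isinstance(other, FrozenMap) and self._items == other._items

    def __setattr__(self, name, value):  # block mutation after construction
        raise TypeError("FrozenMap is immutable")

config = FrozenMap(host="localhost", port=8000)
cache = {config: "connection-pool-1"}    # works because config is hashable
```

Because it can never change after construction, threads in free-threaded Python can read it without locks.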

00:11:57.660 --> 00:12:02.080
We got unpacking of comprehensions better.

00:12:02.460 --> 00:12:03.420
So that's kind of fun.

00:12:03.700 --> 00:12:06.340
Star and star star are working better for that stuff.

00:12:07.200 --> 00:12:07.800
Let's see.

00:12:07.840 --> 00:12:09.620
What did I have here that I wanted to talk about?

00:12:09.800 --> 00:12:10.340
I don't remember.

00:12:10.660 --> 00:12:10.840
Anyway.

00:12:11.960 --> 00:12:12.440
Yeah.

00:12:12.720 --> 00:12:13.900
Lots of great stuff.

00:12:14.140 --> 00:12:14.460
Let's see.

00:12:14.540 --> 00:12:15.720
Annotated type forms.

00:12:16.180 --> 00:12:18.720
Oh, Python now uses UTF-8 as default encoding.

00:12:19.020 --> 00:12:21.960
Can't wait for that because I'm tired of typing UTF-8.

00:12:21.960 --> 00:12:25.180
Did you shout out lazy imports?

00:12:25.940 --> 00:12:26.200
Yeah.

00:12:26.400 --> 00:12:27.900
Well, explicit lazy imports.

00:12:28.080 --> 00:12:37.300
That's, that's probably what I'm most excited about, is being able to just say lazy import json or lazy import whatever.

00:12:37.540 --> 00:12:41.680
And it doesn't actually get imported until somebody actually uses it at runtime.

00:12:41.940 --> 00:12:49.080
That's just such a clean interface, and it's going to make a lot of stuff so much faster.

00:12:49.080 --> 00:12:58.520
In my world, it's the testing stuff, because pytest imports everything to start with, but it doesn't need everything right away.

00:12:58.520 --> 00:13:01.880
Like the tests only need this stuff when the tests are actually running.

00:13:02.080 --> 00:13:06.080
So having test runs will be a lot faster with lazy imports.
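
PEP 810's lazy import syntax only arrives with 3.15, but you can get the same pay-on-first-use effect today with the stdlib's documented LazyLoader recipe:

```python
import importlib.util
import sys

def lazy_import(name: str):
    """Return a module object whose code only really executes on first use."""
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)     # defers execution until an attribute is touched
    return module

json = lazy_import("json")         # nearly free at startup
data = json.dumps({"fast": True})  # the real import happens here, on first use
```

In 3.15 that whole helper collapses to a single statement, lazy import json.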

00:13:06.560 --> 00:13:09.940
And can I add that that's a little bit of foreshadowing right there?

00:13:10.300 --> 00:13:11.040
Is it?

00:13:11.220 --> 00:13:11.680
It is.

00:13:11.800 --> 00:13:12.120
Carry on.

00:13:12.120 --> 00:13:12.560
Okay.

00:13:12.560 --> 00:13:12.760
Okay.

00:13:13.020 --> 00:13:20.320
No, just a lot of exciting stuff going on in 3.15.

00:13:20.600 --> 00:13:23.120
Oh, I'm already excited to play with it.

00:13:23.120 --> 00:13:31.120
And it used to be, I don't even remember how long ago it was, that it was sort of hard to grab an alpha release.

00:13:31.120 --> 00:13:39.040
But now with uv, I just said uv self update and uv python install 3.15, and bam, I had the alpha.

00:13:39.840 --> 00:13:41.120
So it's pretty great.

00:13:41.120 --> 00:13:41.220
That's awesome.

00:13:41.400 --> 00:13:41.560
Yeah.

00:13:41.600 --> 00:13:47.520
You can even just say uv venv and, you know, dash dash python, like give it a version of Python.

00:13:47.760 --> 00:13:50.560
And if you don't have it, it'll just go and say, okay, we're getting 3.15.
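
Roughly, the commands in question (assuming a reasonably current uv):

```shell
uv self update            # update uv itself
uv python install 3.15    # fetch the 3.15 pre-release interpreter
uv venv --python 3.15     # make a venv; uv downloads 3.15 if it's missing
```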

00:13:50.820 --> 00:13:51.160
Yeah.

00:13:51.360 --> 00:13:51.900
Pretty cool.

00:13:52.180 --> 00:13:52.340
Yeah.

00:13:53.240 --> 00:13:57.240
Anyway, that's all I wanted to say, but just, I'm excited about 3.15.

00:13:57.480 --> 00:13:57.680
Cool.

00:13:57.780 --> 00:14:00.140
I am excited about 3.15 as well.

00:14:00.480 --> 00:14:13.200
And that is because I'm excited about a lot of the things there, but I'm extra excited about this PEP 810 lazy imports, because I've discovered that it can make a mega

00:14:13.200 --> 00:14:17.860
difference, and it lets you write a lot cleaner code than you can right now.

00:14:17.860 --> 00:14:24.960
So I wrote an article that was sort of a guide to some project work I did a couple of weeks ago.

00:14:25.180 --> 00:14:28.980
And I would have talked about it last week, but we skipped last week because I was at a conference.

00:14:29.200 --> 00:14:36.760
So we're talking about it this week, and the title of the article is cutting Python web app memory by over 31%.

00:14:36.760 --> 00:14:44.240
And that's across the entire server, running Python Bytes, running Talk Python, Talk Python Training, all those things.

00:14:44.240 --> 00:14:50.640
So I just sat down and said, you know, it's kind of ridiculous how much memory these apps use, like Talk Python Training alone.

00:14:50.960 --> 00:14:55.020
I'm just going to focus on that, but like apply to this, to most Python web apps or APIs.

00:14:55.180 --> 00:14:59.360
It alone was using almost 1.3 gigs to run.

00:14:59.460 --> 00:15:01.140
Like that seems a little ridiculous.

00:15:01.560 --> 00:15:03.480
And there's a separate little search daemon process.

00:15:03.480 --> 00:15:05.860
And it was using 700 megs.

00:15:06.020 --> 00:15:06.740
Just chilling.

00:15:06.920 --> 00:15:08.380
Why do you need so much memory?

00:15:08.720 --> 00:15:09.560
Bad Python app.

00:15:09.740 --> 00:15:10.860
Who wrote you is what I want to know.

00:15:10.860 --> 00:15:17.020
So I set about the process of going, like, at least let me understand where this memory is going.

00:15:17.360 --> 00:15:20.420
And if there's any way that I could do something to make it better.

00:15:20.520 --> 00:15:20.920
Okay.

00:15:21.240 --> 00:15:23.240
It's not like we were running out of memory, right?

00:15:23.260 --> 00:15:30.460
I have a 16 gig server running in the cloud and I think it was using nine or 10 gigs.

00:15:30.520 --> 00:15:34.200
So there were six gigs left, but at the same time, it's what if I want to run other apps?

00:15:34.200 --> 00:15:43.580
Like I want to self host something that would maybe power the web apps or, you know, be like a CRM or some other thing that I just want to run and not have to set up other infrastructure.

00:15:43.800 --> 00:15:45.580
It'd be great if there's like, oh, there's so much RAM.

00:15:45.640 --> 00:15:46.400
It doesn't even matter.

00:15:46.640 --> 00:15:47.360
You know what I mean?

00:15:47.640 --> 00:15:54.000
And RAM is by far more critical and scarce than CPU.

00:15:54.400 --> 00:15:58.820
Not, you know, put this RAM crisis aside, this stuff that AI is triggering.

00:15:58.820 --> 00:16:04.960
Just straight, you get 16 gigs RAM, you get eight CPU for most people, most workloads.

00:16:05.240 --> 00:16:08.820
The CPU is pretty chill and the RAM is a lot higher.

00:16:08.940 --> 00:16:12.420
You got to get a lot of traffic before CPU becomes the problem, right?

00:16:12.660 --> 00:16:14.200
So thinking about the RAM, I think is important.

00:16:14.480 --> 00:16:17.760
So I started working on this and said, well, what can I do?

00:16:17.820 --> 00:16:22.160
And the starting point was 1,280 megabytes.

00:16:22.380 --> 00:16:33.240
And the little search daemon thing that I told you about, once an hour, once every day, I can't remember the schedule, I think it's a few times a day, will pull all the content of Talk Python Training and turn it into a search engine.

00:16:33.360 --> 00:16:38.100
So like you go over here and you're like, hey, I'm interested in, you know, taking some class.

00:16:38.360 --> 00:16:43.680
And in your class, you can say, oh, I'm not logged in, but you could just go over and search and say, well, do you talk about pytest?

00:16:43.740 --> 00:16:44.140
Let's see.

00:16:44.460 --> 00:16:45.780
Well, why, yes, we do.

00:16:46.280 --> 00:16:51.260
And it has like all these really nice, deep understanding of like the hierarchy of stuff.

00:16:51.340 --> 00:16:52.820
It's not just like a regular search, right?

00:16:52.880 --> 00:17:00.780
Like it's something I put together, but it's still, this is a ridiculous amount, 700 megs for something that just like reads from the database and writes to the database and otherwise is doing nothing.

00:17:01.020 --> 00:17:02.880
So I'm like, well, how can I do this better?

00:17:03.080 --> 00:17:05.840
The first thing I did, there's five things I'm going to talk about.

00:17:06.080 --> 00:17:14.760
Number one is I was running two to three worker processes to scale out web requests because everything was still based on Pyramid.

00:17:15.020 --> 00:17:15.860
It's synchronous.

00:17:16.060 --> 00:17:16.660
It's WSGI.

00:17:17.000 --> 00:17:23.000
If I'm going to have fewer worker processes, then I really want to have better concurrency in the one worker process that's there, right?

00:17:23.000 --> 00:17:26.680
I don't want like one slow request to be just like, well, that's it.

00:17:27.180 --> 00:17:29.000
The site's not responding, right?

00:17:29.040 --> 00:17:29.600
Because of the GIL.

00:17:29.600 --> 00:17:38.240
I decided the first thing to do was to rewrite everything in Quart, and it could be other frameworks, anything that's async.

00:17:38.300 --> 00:17:45.480
I could have rewritten in FastAPI, but I really like the Flask model and Quart is the true async version of Flask, right?

00:17:45.800 --> 00:17:47.500
So that's what I did.

00:17:47.500 --> 00:17:54.220
And that let me turn the other worker process off, which right there just cuts your memory straight in half, right?

00:17:54.240 --> 00:17:56.560
Because they both use the same amount of memory and there's two of them.

00:17:56.780 --> 00:17:58.320
And you know, people think like, oh yeah, whatever.

00:17:58.440 --> 00:18:00.480
Like Michael, your site's just a, it's just a blog.

00:18:00.560 --> 00:18:01.620
Like dude, what are you talking about?

00:18:01.640 --> 00:18:05.240
Like you seem to think like there's a lot going on here, but I work on a real app.

00:18:05.380 --> 00:18:06.580
So it doesn't apply to me.

00:18:06.800 --> 00:18:11.840
I ran my little tallyman thing against Talk Python Training, not any of the podcasts, just the courses.

00:18:12.680 --> 00:18:16.040
178,000 lines of Python, 300,000 lines total.

00:18:16.040 --> 00:18:19.320
Like that's enough to like spend some time figuring out what's going on, right?

00:18:19.340 --> 00:18:23.380
That's complicated enough for most apps I imagine, at least to be somewhat representative.

00:18:23.820 --> 00:18:26.740
But you're like running courses and podcasts and stuff.

00:18:26.820 --> 00:18:28.440
This is more complicated than just a blog.

00:18:28.660 --> 00:18:29.280
Yeah, that's true.

00:18:29.380 --> 00:18:29.880
That's true.

00:18:30.060 --> 00:18:30.240
Yeah.

00:18:30.300 --> 00:18:31.260
But tell Reddit that.

00:18:31.560 --> 00:18:43.540
So then the next thing, number two, was I'm going to rewrite this in the raw-plus-DC design pattern that I talked about, just using straight queries, not ORMs or ODMs.

00:18:43.640 --> 00:18:46.020
Although I have something interesting to say in my extras about that, but still.

00:18:46.040 --> 00:18:50.680
So just straight queries and then mapping to data classes, or it could be Pydantic or attrs, right?

00:18:50.720 --> 00:18:51.340
It doesn't really matter.

00:18:51.440 --> 00:18:53.420
Pick a model, a simple data model.

00:18:53.640 --> 00:18:55.600
And that actually made a pretty big difference.

00:18:55.600 --> 00:19:04.300
That dropped 200 megs, 100 megs per worker off just switching away from an ORM to just raw queries and data classes.

00:19:04.840 --> 00:19:05.540
Makes sense.

00:19:05.860 --> 00:19:08.660
And it almost doubled the request per second, which is wild.

00:19:08.660 --> 00:19:09.060
Okay.

00:19:09.160 --> 00:19:17.740
So then number three, well, once I had done the Quart thing, I was able to just tell Granian, like, I want one worker, not two or three.

00:19:18.040 --> 00:19:19.180
For a while it was three or four.

00:19:19.260 --> 00:19:20.780
And I've been like dialing it back.

00:19:20.860 --> 00:19:21.520
It's just got faster.

00:19:21.860 --> 00:19:23.640
So that saved 500 megs there.

00:19:23.720 --> 00:19:27.720
So now we're down to, yeah, we're down to 530 megs for this process.

00:19:27.720 --> 00:19:36.020
And then for the search thing, what was happening is it would load up a bunch of imports. PEP 810, getting exciting here.

00:19:36.060 --> 00:19:49.100
It would load up a bunch of imports and then it would run a bunch of code, and, I don't know exactly where the memory was going, but a lot of the stuff it would work with by interacting with the database and so on would get cached or just left in memory.

00:19:49.100 --> 00:19:53.440
So it was using 708 megs of memory.

00:19:53.720 --> 00:20:05.620
So I said, well, what if like really the main core loop of the app is just start, look at a timer after a certain amount of time, run a really complicated set of queries, and then

00:20:05.620 --> 00:20:11.740
write a bunch of structured data indexed back into a certain structure so I can query ultra fast, right?

00:20:11.960 --> 00:20:20.460
That main part doesn't need much, but it was importing, like, all of Talk Python Training, which would pull in everything in Talk Python Training's main dunder.

00:20:20.560 --> 00:20:25.280
Which would pull in all the libraries, you know, it would cascade into this mega import.

00:20:25.600 --> 00:20:38.120
So I said, well, what if you just had the loop, and then in a separate file, you would start a process that ran that separate file, did the indexing, and then stopped, just finished, because it didn't need a response.

00:20:38.180 --> 00:20:39.600
It was like, okay, I'm done indexing.

00:20:39.660 --> 00:20:44.360
And that temporary sub process would be the thing that did all the imports.

00:20:44.360 --> 00:20:46.080
And when it shuts down, those imports go away.

00:20:46.080 --> 00:20:49.200
So that took it from 708 megs to 22.
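The subprocess trick described here can be sketched in a few lines. This is a minimal illustration, not the actual site's code: the real version would launch something like an `indexer.py` script (a hypothetical name), which the inline `print` stands in for, so the heavy imports live and die inside the child process.

```python
import subprocess
import sys

def run_indexing() -> str:
    # Launch the heavy indexing job as a short-lived child process.
    # The child pays the import cost; when it exits, all of that
    # memory is returned to the OS, so the parent stays small.
    result = subprocess.run(
        # Stand-in for: [sys.executable, "indexer.py"] (hypothetical script)
        [sys.executable, "-c", "print('indexing done')"],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout.strip()

print(run_indexing())
```

The parent only needs `subprocess` and `sys`; nothing the child imports ever touches the parent's working set.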

00:20:49.540 --> 00:20:50.180
That's great.

00:20:50.420 --> 00:20:51.540
That's insane, right?

00:20:51.840 --> 00:20:54.240
And all I had to do is change where, sorry, go ahead.

00:20:54.260 --> 00:20:58.220
The sub process is still getting like 700 megs or whatever, but it goes away.

00:20:58.380 --> 00:21:01.360
But for like 30 seconds to a minute, not constantly.

00:21:01.620 --> 00:21:01.840
Right.

00:21:02.060 --> 00:21:07.680
And these spikes are, I mean, they're fine, but it's like, you know, not everything is spiking in memory at the same time, just like they don't in CPU.

00:21:07.860 --> 00:21:11.540
It's just like, it's not a fixed cost, which is pretty interesting.

00:21:11.540 --> 00:21:22.900
You know, the work was just: move the indexing function to a new file, move the imports that it needs to that new file, and do a sub process call instead of calling it directly. Done.

00:21:23.160 --> 00:21:23.240
Yeah.

00:21:23.460 --> 00:21:27.240
And it made, I don't know what that division is, but like 20 times, 30 times better.

00:21:27.440 --> 00:21:28.420
You know, it's incredible.

00:21:28.420 --> 00:21:32.940
And the last thing, this one I think is going to surprise people.

00:21:33.040 --> 00:21:37.220
And this is the one that really hits the point home for lazy imports.

00:21:37.380 --> 00:21:47.340
If you type the words import boto3, because you're doing something with S3 or something similar, your working memory goes up by 25 megs per process, per worker.

00:21:47.480 --> 00:21:51.820
If you type the words import matplotlib, your working memory goes up by 17 megs.

00:21:51.900 --> 00:21:54.640
If you type import pandas, your memory goes up by 44 megs.

00:21:54.640 --> 00:21:58.080
Those three imports right there are almost a hundred megs of memory usage.
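If you want to measure import costs like these yourself, the stdlib's `tracemalloc` gives a rough number. Note it only sees Python-heap allocations, so real RSS growth (like the per-library figures quoted here) is usually larger; this is a sketch of the technique, not a reproduction of those numbers:

```python
import importlib
import tracemalloc

def import_cost_kb(module_name: str) -> float:
    """Rough Python-heap cost of importing a module, in KiB."""
    tracemalloc.start()
    before, _ = tracemalloc.get_traced_memory()
    importlib.import_module(module_name)  # cached modules cost ~0
    after, _ = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return (after - before) / 1024

# json is tiny (and may already be cached); a heavyweight import
# like pandas or boto3 would show a far bigger number.
print(f"import json costs ~{import_cost_kb('json'):.0f} KiB")
```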

00:21:58.080 --> 00:21:58.560
Yeah.

00:21:58.680 --> 00:21:59.500
It's a lot.

00:21:59.760 --> 00:22:00.840
So are they needed?

00:22:00.920 --> 00:22:02.620
If you're doing core data science, they are.

00:22:02.940 --> 00:22:08.120
But for me, there's like an admin section where I can go and view some reports.

00:22:08.280 --> 00:22:16.820
If I view the reports, I use these libraries, but if I don't view the reports, and I really don't look at them hardly ever, just maybe once a month, like, hey, I wonder what that looks like.

00:22:17.100 --> 00:22:17.820
Let me go hit it.

00:22:17.920 --> 00:22:20.540
It pulls all that stuff in and then it generates the report.

00:22:20.540 --> 00:22:24.260
But the worker process recycles a couple of times a day.

00:22:24.720 --> 00:22:24.740
Yeah.

00:22:24.740 --> 00:22:32.500
So even if I view the report five hours later, that stuff's unloaded again and I get a new version, and it's not another month until I load up that hundred megs.

00:22:32.600 --> 00:22:44.260
So I went from like 500 megs to 450 just by saying, well, instead of importing at the top of the file, let's import in the function that generates the actual picture, you know, the report that I need from this.

00:22:44.300 --> 00:22:46.460
And boom, a hundred megs less memory usage.
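The rewrite being described, moving the import from module top into the report function, looks roughly like this. Here `statistics` stands in for a heavy library like pandas so the sketch stays runnable, and `generate_report` is an invented example name:

```python
# Module top: only cheap imports stay here.
import datetime

def generate_report(rows: list[float]) -> dict:
    # Deferred import: the heavy library is only loaded the first time a
    # report is actually viewed. sys.modules caches it afterwards, so
    # repeated calls don't pay the import cost again.
    import statistics  # stand-in for pandas / matplotlib / boto3

    return {
        "generated": datetime.date.today().isoformat(),
        "mean": statistics.mean(rows),
    }

print(generate_report([1, 2, 3])["mean"])
```

Workers that never render the report never pay for the import at all.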

00:22:46.460 --> 00:22:50.460
And if that had the word lazy in front of it, I wouldn't have to rewrite my code.

00:22:50.580 --> 00:22:53.120
It would have effectively the same behavior.

00:22:53.260 --> 00:22:57.940
It wouldn't import until I actually run the function, but I could do PEP 8 magic and put it at the top.

00:22:58.040 --> 00:22:58.680
What do you think of that?

00:22:58.880 --> 00:22:59.500
That's pretty cool.

00:22:59.560 --> 00:23:05.220
So now I'm thinking that like lazy imports, it imports when it's needed, but it doesn't ever unimport.

00:23:05.220 --> 00:23:14.720
And I'm wondering if like a future Python will add like, you know, some, something to the lazy import that like caches stuff out of memory.

00:23:15.000 --> 00:23:15.900
Right, right, right.

00:23:15.900 --> 00:23:22.320
Like the runtime could see nobody is caring about it being imported anymore and no one has set a value on it.

00:23:22.380 --> 00:23:24.460
So maybe it could just go away a hundred percent.

00:23:24.460 --> 00:23:35.560
But, I mean, people often think of lazy imports as a speed-up, like, oh, it's faster because you don't have to do all the imports until you use them.

00:23:35.740 --> 00:23:36.680
I'm sure that's true.

00:23:36.760 --> 00:23:37.760
I don't have numbers around it.

00:23:37.900 --> 00:23:44.120
But what's really interesting is there are some imports that are mega in how much they actually increase your working memory.

00:23:44.560 --> 00:23:48.240
If they're lazy and you don't use them very often, they will not run very often.

00:23:48.240 --> 00:23:50.580
And I think it'll actually make a pretty big difference.

00:23:50.580 --> 00:23:55.420
So, you know, like you just write code, like just do import inside the function instead of at the top.
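Until the `lazy import` syntax proposed in PEP 810 lands, the stdlib already offers an approximation via `importlib.util.LazyLoader`, which defers executing a module until its first attribute access. A sketch of that stdlib recipe (the PEP's final semantics may well differ):

```python
import importlib.util
import sys

def lazy_import(name: str):
    """Return a module whose real import is deferred until first attribute
    access -- a stdlib approximation of PEP 810's proposed syntax."""
    spec = importlib.util.find_spec(name)
    spec.loader = importlib.util.LazyLoader(spec.loader)
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    spec.loader.exec_module(module)  # registers laziness; module body has not run
    return module

decimal = lazy_import("decimal")  # nothing heavy has executed yet
# First attribute access triggers the real import:
print(decimal.Decimal("1.5") + decimal.Decimal("2.5"))
```

Unlike a function-local import, the name is available at the top of the file, which keeps the linters happy.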

00:23:55.520 --> 00:23:57.240
Like your linters will go, you shouldn't do this.

00:23:57.340 --> 00:23:58.340
Like you leave me alone.

00:23:58.420 --> 00:23:59.000
I'm doing this.

00:23:59.060 --> 00:24:00.640
This is, this is really good for me.

00:24:01.400 --> 00:24:07.580
So the final thing, I just moved a bunch of caches to disk caches instead of memory caches, which is good.

00:24:07.880 --> 00:24:19.660
And so I put a little picture in there, but it saved a ton of memory: 3.2 gigs less memory used on the server by applying that to Python Bytes, Talk Python, and Talk Python Training.
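The disk-cache-instead-of-memory-cache move can be sketched with the stdlib's `shelve`; a production site would more likely use something featureful like the `diskcache` package, so treat this as the shape of the trade only: values live in a file rather than in each worker's resident memory.

```python
import os
import shelve
import tempfile

# Disk-backed cache file (path is just for the demo).
CACHE_PATH = os.path.join(tempfile.gettempdir(), "demo_cache")

def cached_expensive(key: str) -> str:
    with shelve.open(CACHE_PATH) as db:
        if key not in db:
            db[key] = key.upper()  # stand-in for an expensive computation
        return db[key]

print(cached_expensive("episode-477"))
```

Lookups cost a disk read instead of RAM, which is usually the right trade for rarely-hit caches.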

00:24:19.660 --> 00:24:22.940
There's a bunch of other apps, but they, oh, and the search thingy.

00:24:23.220 --> 00:24:25.760
But yeah, those were the four big ones.

00:24:26.040 --> 00:24:26.140
Cool.

00:24:26.180 --> 00:24:26.620
Pretty cool, huh?

00:24:26.820 --> 00:24:26.960
Yeah.

00:24:27.040 --> 00:24:27.780
Yeah, it's very cool.

00:24:28.060 --> 00:24:35.820
So a little bit long there, but I thought that people might appreciate some kind of a roadmap of how you can do that yourself.

00:24:35.940 --> 00:24:44.920
Like, so the total savings was from 1,988 megabytes down to 472 megabytes used across the apps.

00:24:44.920 --> 00:24:49.180
That is a lot of difference, and you can run more apps.

00:24:49.380 --> 00:24:54.940
You can scale out more and get way better performance because now you can like run more workers if you really needed to or whatever.

00:24:55.020 --> 00:24:57.980
I think there's a lot of, a lot of benefits there.

00:24:58.480 --> 00:24:59.380
So cool.

00:24:59.460 --> 00:25:01.860
A lot of excitement out in the audience.

00:25:01.860 --> 00:25:03.720
They're talking about this topic.

00:25:03.800 --> 00:25:04.340
I think it's cool.

00:25:04.600 --> 00:25:05.120
All right.

00:25:05.220 --> 00:25:07.660
Well, one of the things that I get excited about is testing.

00:25:07.660 --> 00:25:09.440
You don't say.

00:25:09.700 --> 00:25:11.520
Pick that up about me a little bit.

00:25:12.160 --> 00:25:17.260
So I want to talk about trike right now, like as in a tricycle.

00:25:17.520 --> 00:25:20.300
So this is a new, a new project.

00:25:21.060 --> 00:25:22.320
And how new?

00:25:22.540 --> 00:25:29.200
So it's got four stars, but it, I mean, it just went up like last month or something.

00:25:29.720 --> 00:25:30.620
Very recent.

00:25:30.620 --> 00:25:37.140
So taking a look at this, this was submitted by the person that created it, Justin Chapman.

00:25:37.260 --> 00:25:41.940
But, but I'm, you know, I like the idea of like thinking outside the box.

00:25:41.940 --> 00:25:43.140
So for testing.

00:25:43.280 --> 00:25:55.020
So trike is a Rust-based Python test runner with a Jest-style API. I'm not familiar with Jest, but that's what it said, JavaScript's Jest.

00:25:55.180 --> 00:25:55.960
I don't remember.

00:25:56.300 --> 00:25:57.500
Anyway, maybe.

00:25:57.500 --> 00:25:58.100
I don't know.

00:25:58.100 --> 00:25:58.800
I think so.

00:25:58.800 --> 00:25:59.080
Yeah.

00:25:59.460 --> 00:26:04.940
So yeah, you, you can tell how Python focused I am most of the time.

00:26:05.680 --> 00:26:07.840
But so what is it going to look like?

00:26:07.880 --> 00:26:09.620
Let's zoom in a little bit.

00:26:10.940 --> 00:26:11.800
Getting started.

00:26:12.020 --> 00:26:16.140
So it looks way different than pytest.

00:26:17.140 --> 00:26:24.160
So let's say we've got a normal function, add, for example, and we want to test that.

00:26:24.160 --> 00:26:27.960
We'd say like with describe and add.

00:26:27.960 --> 00:26:32.580
So with describe and then some comment with this, like your test name, I guess.

00:26:32.580 --> 00:26:35.580
And then decorators of tests.

00:26:35.660 --> 00:26:39.000
And then another, I think that's just a description, but it's saying.

00:26:39.000 --> 00:26:39.300
Yeah.

00:26:39.300 --> 00:26:42.120
First I thought it was a string that was being parsed to make it run.

00:26:42.120 --> 00:26:42.280
Yeah.

00:26:42.280 --> 00:26:43.060
Whereas this one plus one.

00:26:43.120 --> 00:26:44.840
I think that's just the message that comes out.

00:26:44.840 --> 00:26:45.020
Yeah.

00:26:45.020 --> 00:26:46.140
It's just the test case.

00:26:46.140 --> 00:26:51.140
And so this is a very basic, we're going to get more from Justin to describe this.

00:26:51.220 --> 00:26:53.480
I've like reached out to him and said, this is really interesting.

00:26:53.600 --> 00:26:54.560
I'd like to know more.

00:26:54.660 --> 00:26:55.760
So I'm going to do more research.

00:26:55.880 --> 00:27:00.620
I haven't really played with this yet, but I'm intrigued by it.

00:27:01.360 --> 00:27:03.160
I kind of like pytest.

00:27:04.140 --> 00:27:07.360
pytest just uses assert.

00:27:07.360 --> 00:27:14.960
And for things that need soft asserts, I guess I use my pytest plugin called check, pytest-check.

00:27:15.260 --> 00:27:16.440
But this is different.

00:27:16.540 --> 00:27:21.400
By default, all of these are soft asserts.

00:27:21.400 --> 00:27:22.660
So it doesn't stop the test.

00:27:22.720 --> 00:27:24.640
You can, you can expect a lot of things.

00:27:24.940 --> 00:27:26.160
It uses the expect keyword.

00:27:26.360 --> 00:27:27.460
So what do we got here?

00:27:27.500 --> 00:27:33.420
We've got a watch mode that watches to see if you have new things to test.

00:27:33.420 --> 00:27:37.540
Native async support, fast test discovery, in-source testing.

00:27:37.940 --> 00:27:43.880
So you have the ability to just put tests right in the source code, instead of having to have a separate test file.

00:27:44.000 --> 00:27:47.620
You can do that with pytest, of course, but mostly people don't.

00:27:47.760 --> 00:27:49.440
Doc test support, kind of like pytest.

00:27:49.600 --> 00:27:52.520
Client server mode, which is, that's, this is an interesting one.

00:27:52.620 --> 00:27:57.760
So the client server, they're all interesting, but this idea that you can have a server running.

00:27:58.060 --> 00:27:59.060
So why would you do that?

00:27:59.120 --> 00:28:06.440
So, like one of the things I was just talking about memory-wise: if you run pytest, it has to import everything, it imports a lot of stuff, and then you're running tests.

00:28:06.700 --> 00:28:11.700
The server is just doing that up front so that, while you're watching, it's already done that.

00:28:11.760 --> 00:28:16.840
It's got a warm cache of everything so that, that the individual tests can go faster.

00:28:17.240 --> 00:28:22.040
So you get client server mode, you know, pretty assertion diagnostics.

00:28:22.160 --> 00:28:25.360
Of course, we would expect no less from a new test framework.

00:28:25.360 --> 00:28:31.300
So does it basically do the test discovery once and then just rerun it?

00:28:31.500 --> 00:28:33.660
Is that why, then?

00:28:34.040 --> 00:28:35.120
I think so.

00:28:35.240 --> 00:28:38.440
So it's doing, like, for instance, it's doing a changed mode.

00:28:38.780 --> 00:28:43.760
Oh, like there's, I think it's for picking up new tests.

00:28:43.760 --> 00:28:52.020
So if you're doing testing and development and you're modifying a test and modifying code, it'll pull the stuff that changed into the server.

00:28:52.020 --> 00:28:59.560
But it can probably skip discovery, or limit discovery to, like, the changed file or something like that.

00:28:59.620 --> 00:28:59.780
Yeah.

00:28:59.880 --> 00:29:00.360
That's interesting.

00:29:00.400 --> 00:29:00.780
Well, right.

00:29:00.860 --> 00:29:09.200
And I think it's using Git information to find out which elements have been modified, and, yeah, why not?

00:29:09.320 --> 00:29:10.760
Most people are using Git anyway.

00:29:10.920 --> 00:29:12.860
So I think that's how it's using it anyway.

00:29:13.140 --> 00:29:13.720
Oh, interesting.

00:29:14.000 --> 00:29:21.800
I commented that I just sent him information about pytest-check, and he added that. Oh yeah.

00:29:21.920 --> 00:29:23.680
Soft assertions, like pytest-check.

00:29:23.840 --> 00:29:24.160
Pretty cool.

00:29:24.660 --> 00:29:32.700
And I'll throw out there that I'm also plus one for them on the fluent API instead of the raw assert.

00:29:32.900 --> 00:29:37.120
I really like the expect-this-dot-to-equal style.

00:29:37.120 --> 00:29:49.820
And like, you kind of put it together as an English-like sentence, where, you know, you can say like to-be-in and give it an item and a list or something, rather than, you know, just doing the raw assert.

00:29:50.100 --> 00:29:50.320
I don't know.

00:29:50.320 --> 00:29:51.600
I like that fluent API.
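To make the fluent, soft-assert style concrete, here is a tiny illustrative `expect` in plain Python. This is not trike's actual API, just the general idea being described: failures are recorded rather than raised, so the test keeps going.

```python
class Expect:
    """Minimal fluent soft-assert helper (illustrative only)."""
    failures: list[str] = []  # collected across all expectations

    def __init__(self, actual):
        self.actual = actual

    def to_equal(self, expected):
        if self.actual != expected:
            Expect.failures.append(f"{self.actual!r} != {expected!r}")
        return self  # soft assert: record and keep going

    def to_be_in(self, container):
        if self.actual not in container:
            Expect.failures.append(f"{self.actual!r} not in {container!r}")
        return self

expect = Expect

expect(1 + 1).to_equal(2)
expect(3).to_be_in([1, 2, 3])
expect(5).to_equal(6)  # recorded, but doesn't stop the run
print(f"{len(Expect.failures)} soft failure(s)")
```

A real framework would report the collected failures at the end of each test instead of printing a count.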

00:29:51.600 --> 00:29:51.940
Do you?

00:29:52.160 --> 00:29:52.360
Okay.

00:29:52.440 --> 00:29:52.640
Yes.

00:29:52.880 --> 00:29:53.920
Cause that's not something.

00:29:54.020 --> 00:29:54.260
I don't know.

00:29:54.480 --> 00:29:55.280
No, it's not weird.

00:29:55.400 --> 00:30:05.020
It shows up a few times, I've seen it in a couple of different test frameworks, and there's even an extension for unittest to be able to do this.

00:30:05.080 --> 00:30:10.380
And then I think I've seen pytest extensions to do this, but it's not something I use.

00:30:10.380 --> 00:30:16.920
I could get used to it, and it reads in English fairly well.

00:30:17.160 --> 00:30:18.140
So yeah, that's why I like it.

00:30:18.180 --> 00:30:25.560
It's more writing, but the truth is, if your editor is not auto-completing when you type to, you're probably doing it wrong, right?

00:30:25.560 --> 00:30:29.440
You're not writing all those words, those characters you're, you're selecting them from a short list.

00:30:29.680 --> 00:30:29.860
Yeah.

00:30:29.920 --> 00:30:37.680
And for me, I like the readability, rather than looking at an assert statement and going, what is it really trying to get at with this combination of something that resolves to a Boolean?

00:30:37.680 --> 00:30:44.580
You can say like, you know, expect this value, like to be in the list or to not be in, you know, like something like that.

00:30:44.620 --> 00:30:44.800
Right.

00:30:45.060 --> 00:30:45.500
Or, yeah.

00:30:45.820 --> 00:30:46.140
I don't know.

00:30:46.160 --> 00:30:47.740
I like the readability of it.

00:30:47.740 --> 00:30:53.840
I would just also notice this is made by Zensical, to tie into previous conversations.

00:30:54.060 --> 00:30:55.180
At least the website is.

00:30:55.420 --> 00:31:02.660
So, anyway, I think there's some interesting work here.

00:31:02.780 --> 00:31:04.300
I like the async support.

00:31:04.300 --> 00:31:08.380
I definitely want to play with this because I think that's, it's an interesting idea.

00:31:08.680 --> 00:31:12.100
so anyway, cool, cool.

00:31:12.220 --> 00:31:12.640
Well done.

00:31:12.740 --> 00:31:13.400
It's very new.

00:31:13.520 --> 00:31:15.620
We'll have to, we'll have to see where it goes.

00:31:15.620 --> 00:31:22.860
And I sent him some questions. I'm sorry, Justin, I haven't read your reply yet.

00:31:22.920 --> 00:31:26.080
He already responded to me.

00:31:26.080 --> 00:31:27.360
I sent some questions yesterday.

00:31:27.560 --> 00:31:29.880
kind of like I was curious about startup.

00:31:29.980 --> 00:31:33.560
Apparently there is a setup/teardown sort of fixture.

00:31:33.560 --> 00:31:35.560
Like there is a fixture feature.

00:31:35.620 --> 00:31:36.760
I just haven't figured it out yet.

00:31:36.760 --> 00:31:38.300
So it's somewhere in the documentation.

00:31:38.600 --> 00:31:44.800
So anyway, exciting things, and I'll keep an eye on this space.

00:31:44.800 --> 00:31:50.420
So indeed, let me throw out a meta topic before we get to our extras, Brian, sort of inspired by this, but more broad.

00:31:50.760 --> 00:31:57.700
I know people are hesitant to adopt new frameworks, like a new testing framework or a new web framework or a new database thing or something.

00:31:57.820 --> 00:32:12.200
But with the agentic AI stuff that we have these days, if you pick one and you're like, oh, it turns out it's no longer updated, or I don't like it anymore or whatever, you know, it's so much easier to just migrate back to one of these or convert to the next thing.

00:32:12.200 --> 00:32:17.880
Instead of having some huge, like, Oh no, now we've got to take, you know, two weeks and we're all rewriting the tests.

00:32:17.880 --> 00:32:20.860
Like you probably could get to it from pytest pretty quickly.

00:32:21.200 --> 00:32:21.480
Yeah.

00:32:21.840 --> 00:32:22.040
Yeah.

00:32:22.100 --> 00:32:23.340
We probably could.

00:32:23.400 --> 00:32:30.780
And also, yeah, the questionable thing, the scary thing, is like, this is pretty new.

00:32:30.780 --> 00:32:31.960
Is it going to stick around?

00:32:32.480 --> 00:32:35.660
Like, that's the big one.

00:32:35.780 --> 00:32:38.540
A lot of people have excitement around something.

00:32:38.700 --> 00:32:44.880
So I took a look at this; it's under the Jay Chap account.

00:32:44.980 --> 00:32:48.280
and he's the current CTO of a new V.

00:32:48.460 --> 00:32:53.240
So he's probably using this at work, on his day job.

00:32:53.240 --> 00:32:54.820
So this is a good thing.

00:32:54.820 --> 00:32:59.520
he's contributed to ty and uv, which is kind of cool.

00:32:59.520 --> 00:33:00.360
So that's cool.

00:33:00.640 --> 00:33:09.700
There are some hints that maybe this will stick around, especially if he's using it on a regular basis; he's probably using it and supporting it himself.

00:33:09.700 --> 00:33:11.840
So, a little bit more.

00:33:11.940 --> 00:33:12.120
Yeah.

00:33:12.140 --> 00:33:14.040
So I do check these, these sort of things out.

00:33:14.080 --> 00:33:21.840
I've seen a lot of new projects that are probably assisted by AI to get created.

00:33:22.000 --> 00:33:24.220
It might just be a fun toy for somebody.

00:33:24.220 --> 00:33:30.100
And that's not something I really want to cover on this podcast, but something where it looks like maybe they're serious about it?

00:33:30.280 --> 00:33:30.500
Yeah.

00:33:30.560 --> 00:33:31.320
We'll cover it.

00:33:31.400 --> 00:33:31.980
It doesn't matter.

00:33:32.300 --> 00:33:32.980
So cool.

00:33:33.120 --> 00:33:33.320
Yeah.

00:33:33.320 --> 00:33:34.480
It looks very, very neat.

00:33:34.960 --> 00:33:35.560
All right.

00:33:35.800 --> 00:33:37.540
I've got a handful of extras.

00:33:37.760 --> 00:33:40.380
You want to hit your extras next or go ahead.

00:33:40.660 --> 00:33:40.840
Go ahead.

00:33:40.980 --> 00:33:41.220
Okay.

00:33:41.220 --> 00:33:42.280
let's see.

00:33:42.280 --> 00:33:49.820
I saw this came up in a couple of newsletters, and I was intrigued by an article called "Why aren't we uv yet?"

00:33:50.060 --> 00:33:59.180
talking about it did, did some analysis of different, not sure how they got them, but like some top Python projects, stack over.

00:33:59.320 --> 00:34:02.360
I don't know, that, you know, uv is popular.

00:34:02.500 --> 00:34:04.020
Why isn't it being used more?

00:34:04.020 --> 00:34:18.140
The interesting thing, and we'll link to it of course, but I think one of the reasons why this gap looks so big is that a lot of people with requirements.txt are still using uv.

00:34:18.500 --> 00:34:29.880
I have a lot of projects at work that are requirements.txt based, and everybody I know, like, our instructions are to use uv. Just because

00:34:29.880 --> 00:34:34.860
we're not publishing a uv lock file doesn't mean that we're not using uv.

00:34:35.140 --> 00:34:38.400
So every single one of my projects, no, that's not true.

00:34:38.480 --> 00:34:44.440
Almost every single one of our, my, projects has a requirements.txt and no uv.lock, and it's all uv.

00:34:44.700 --> 00:34:44.980
Yeah.

00:34:45.320 --> 00:34:50.720
So I don't know if the assumptions of this study are correct is all I'm saying.

00:34:50.960 --> 00:34:51.180
Yeah.

00:34:51.180 --> 00:34:51.880
That's a really good point.

00:34:51.880 --> 00:34:56.740
I very much prefer requirements.txt with pinned stuff, like, over the uv.lock.

00:34:56.820 --> 00:34:57.800
I don't know why I just do.

00:34:58.400 --> 00:34:59.480
Well, I don't know why.

00:34:59.480 --> 00:35:06.740
Well, right now I'm leaving it open for the developer to choose if they want to use uv or not.

00:35:06.880 --> 00:35:07.060
Yeah.

00:35:07.280 --> 00:35:12.180
We'll probably get to the point where we're not making that a choice, but anyway.

00:35:12.420 --> 00:35:15.740
I find that requirements.txt diffs a little nicer.

00:35:15.880 --> 00:35:19.280
It's like, it's easier to read, especially git diffs, to look and go, oh yeah, okay.

00:35:19.280 --> 00:35:20.300
This is what changed.

00:35:20.600 --> 00:35:28.760
Whereas the uv lock has got like so much, especially with the hashes and so on that it's like, ah, you know, there's so much noise in the uv lock versus the requirements.txt.

00:35:28.760 --> 00:35:29.560
So yeah.

00:35:29.720 --> 00:35:32.420
But mine are pretty noisy though.

00:35:32.420 --> 00:35:41.000
Cause I'm using, like, requirements.in and using uv to publish a more detailed requirements.txt.
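The requirements.in-to-requirements.txt workflow described here can be sketched as a single uv invocation. The flags follow uv's pip-compatible interface, and `--generate-hashes` adds the hashes mentioned a moment later; treat the exact command as an assumption about the setup rather than the actual project config.

```shell
# Loose, human-edited pins live in requirements.in;
# uv compiles them into a fully pinned requirements.txt.
uv pip compile requirements.in -o requirements.txt --generate-hashes
```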

00:35:41.220 --> 00:35:41.340
Yeah.

00:35:41.340 --> 00:35:43.500
That's, I do the same thing, but I don't include the hashes.

00:35:43.580 --> 00:35:44.040
Maybe I should.

00:35:44.180 --> 00:35:44.600
I don't know.

00:35:44.600 --> 00:35:51.120
I think, according to what I heard listening to Python Bytes recently, we are supposed to save hashes there.

00:35:51.580 --> 00:35:51.980
Okay.

00:35:52.240 --> 00:35:53.460
what else?

00:35:53.560 --> 00:35:57.240
The PyCon US talk schedule is up.

00:35:57.500 --> 00:35:59.760
if you're going, you can check it out.

00:35:59.760 --> 00:36:10.240
One of the things I noticed, and I knew this as a submitter, is there's an entire AI channel, or what do they call it, a track, an AI track?

00:36:10.240 --> 00:36:13.440
And I don't, I don't know how I feel about that, but yeah, whatever.

00:36:13.560 --> 00:36:14.540
I think I'm excited about it.

00:36:14.540 --> 00:36:14.860
Honestly.

00:36:15.160 --> 00:36:15.420
Oh yeah.

00:36:15.620 --> 00:36:15.820
Okay.

00:36:15.860 --> 00:36:16.060
Yeah.

00:36:16.060 --> 00:36:16.540
I think so.

00:36:16.580 --> 00:36:20.280
I won't be there, but everybody that does, you're going to be there.

00:36:20.440 --> 00:36:27.180
I think you will be there in spirit and I will hand out Python bytes stickers and they will be carrying some of you with them.

00:36:27.820 --> 00:36:28.220
Cool.

00:36:28.760 --> 00:36:29.720
let's see.

00:36:29.920 --> 00:36:37.540
Oh, I wasn't going to cover this, but now that I already have it up, Justin Jackson has a "What has technology done to us?"

00:36:37.540 --> 00:36:38.680
blog post.

00:36:38.800 --> 00:36:40.420
I was just reading about that recently.

00:36:40.620 --> 00:36:43.380
Oh, that's it for my, Oh, I have one other extra.

00:36:43.540 --> 00:36:44.000
Here it is.

00:36:44.000 --> 00:36:48.400
The LeanTDD book, I hinted at that earlier.

00:36:48.560 --> 00:36:54.900
A couple of days ago, I put out version 0.6.1.

00:36:55.600 --> 00:36:57.160
I'm still on, on track.

00:36:57.280 --> 00:37:00.640
This is really close to what I want it to read like.

00:37:00.640 --> 00:37:04.100
I'm taking a business trip.

00:37:04.100 --> 00:37:10.940
And when I get back from the business trip, I'm going to start recording the audiobook for this. But I've got new cover art with a little rocket.

00:37:10.940 --> 00:37:16.360
I like rockets, and I'm pretty excited about the state of this right now.

00:37:16.400 --> 00:37:17.520
I'm happy with the flow.

00:37:17.760 --> 00:37:21.440
The first iteration of it, I didn't enjoy reading.

00:37:21.440 --> 00:37:24.560
And why would I want somebody to buy a book that I'm not enjoying reading?

00:37:24.620 --> 00:37:26.600
But now I'm like reading it all the time.

00:37:26.600 --> 00:37:28.340
I don't want to read other people's books anymore.

00:37:28.340 --> 00:37:31.000
I'm liking mine. Anyway, enough about me.

00:37:31.480 --> 00:37:33.520
that is my extras so far.

00:37:33.760 --> 00:37:34.200
Awesome.

00:37:34.260 --> 00:37:34.640
Awesome.

00:37:34.880 --> 00:37:36.240
Congrats on making progress on the book.

00:37:36.240 --> 00:37:42.800
So I was going to cover that Python 3.14.4 is out, but you kind of already talked about that before.

00:37:42.920 --> 00:37:45.500
Well, we just zoomed by 3.14.4 though.

00:37:45.640 --> 00:37:46.340
So I'm glad.

00:37:46.520 --> 00:37:46.740
Yeah.

00:37:47.000 --> 00:37:49.400
There's some nice stuff out here and I'm not going to go into detail.

00:37:49.520 --> 00:37:54.620
It's just an extra and all those kinds of things, but there are fixes, CVE such and such.

00:37:54.880 --> 00:37:57.480
There's two security vulnerabilities addressed.

00:37:57.540 --> 00:37:58.140
Actually three.

00:37:58.260 --> 00:37:58.380
Sorry.

00:37:58.400 --> 00:37:58.800
I missed one.

00:37:58.800 --> 00:38:06.580
So there's at least three security CVE things fixed in just 3.14.4.

00:38:06.740 --> 00:38:10.760
And there's a couple of other security issues as well that don't seem to have CVEs.

00:38:10.760 --> 00:38:12.100
So that alone,

00:38:12.200 --> 00:38:12.780
It's probably worthwhile.

00:38:12.780 --> 00:38:18.060
So I will instead tie back to your "Why aren't we uv yet?" topic.

00:38:18.060 --> 00:38:29.280
And just point out that if you just type uv python upgrade, it will now do an in-place upgrade of 3.14.3 or .2 or .1 into 3.14.4.
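For reference, the command being described; on uv versions that support it, this upgrades uv-managed interpreters in place:

```shell
# Upgrade uv-managed CPython installs to the latest patch release,
# e.g. 3.14.3 -> 3.14.4; venvs built on them pick up the new patch.
uv python upgrade
```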

00:38:29.680 --> 00:38:33.160
And so your virtual environments and all that stuff should just pick that up.

00:38:33.260 --> 00:38:36.620
If not, at a minimum you can now recreate the virtual environment with it.

00:38:36.920 --> 00:38:37.020
Yeah.

00:38:37.080 --> 00:38:43.320
Well, not only that, it does any of them that you have installed, if you've got all the versions of Python installed.

00:38:43.460 --> 00:38:46.400
If you say uv python upgrade, it upgrades all of them.

00:38:46.600 --> 00:38:47.760
So yeah, it's excellent.

00:38:47.760 --> 00:38:50.920
And in true uv style, it does it in parallel because it should.

00:38:51.440 --> 00:38:51.520
Yeah.

00:38:51.520 --> 00:38:56.420
Actually, after you mentioned that, I just went over and clicked and did it and it's done already.

00:38:56.680 --> 00:38:57.980
So yeah, that's awesome.

00:38:58.380 --> 00:38:58.540
All right.

00:38:58.540 --> 00:38:59.560
One more release.

00:38:59.620 --> 00:39:00.580
I just want to give a shout out to you.

00:39:00.580 --> 00:39:04.420
I've been kind of like bagging on Beanie a little bit, although I'm a huge fan of Beanie.

00:39:04.420 --> 00:39:05.120
Bagging on Beanie.

00:39:05.380 --> 00:39:06.220
Beanie bags.

00:39:06.420 --> 00:39:09.560
Just saying, like, remember, I've been talking about my raw DC pattern.

00:39:09.720 --> 00:39:11.640
Like there were two reasons I was moving to it.

00:39:11.640 --> 00:39:24.760
One, because I think AIs are much, much better at understanding raw query syntax than ORMs or ODMs wrapped around classes, which then somehow, sometimes, resolve into raw queries, that kind of thing.

00:39:25.000 --> 00:39:25.140
Yeah.

00:39:25.280 --> 00:39:31.680
So I've been talking a lot about that and so on, but also that it was some of the libraries I was using were no longer updated.

00:39:31.920 --> 00:39:34.200
And I'm like, ah, that's such a, such a hassle.

00:39:34.200 --> 00:39:37.800
Like, you looked at the releases for Beanie and you waited for them to come out.

00:39:37.840 --> 00:39:44.020
You'd see that like seven months ago, it was like, oh yeah, we fixed a couple of things.

00:39:44.160 --> 00:39:51.200
And then a while ago, we made some changes that introduced like some weird breaking changes, but like we kind of fixed the breaking changes later.

00:39:51.300 --> 00:39:53.380
You know, it was not really getting a lot of love.

00:39:53.500 --> 00:40:00.280
So there's actually a major release to Beanie that has like a ton of fixes and a ton of contributors, a ton of stuff.

00:40:00.380 --> 00:40:02.700
So if you're using Beanie 2.1.0 is out.

00:40:02.760 --> 00:40:04.080
You should definitely check that out.

00:40:04.480 --> 00:40:04.700
Nice.

00:40:04.980 --> 00:40:05.140
Yeah.

00:40:05.760 --> 00:40:07.600
I feel like we got a joke maybe.

00:40:07.820 --> 00:40:08.380
What do you think?

00:40:08.660 --> 00:40:09.720
You want to take this one?

00:40:09.940 --> 00:40:10.720
Yeah, I'll take this one.

00:40:10.820 --> 00:40:25.100
But on the topic of updates, that's one of the things with agents that I didn't realize I was going to enjoy. I think I want to write this up for next week or the next time we record.

00:40:25.100 --> 00:40:33.500
But maintaining an open source project is easier now, when you can offload some work to an agent.

00:40:34.300 --> 00:40:37.000
I'm actually a better maintainer now than I was before.

00:40:37.500 --> 00:40:38.160
Me too.

00:40:38.400 --> 00:40:39.180
There have been some projects.

00:40:39.300 --> 00:40:40.860
I'm like, gosh, that's kind of tricky.

00:40:41.040 --> 00:40:41.380
I don't know.

00:40:41.620 --> 00:40:42.940
If it really justifies the effort.

00:40:43.080 --> 00:40:44.360
Some people are asking for features.

00:40:44.480 --> 00:40:47.800
I'm like, yeah, we really should support that new feature.

00:40:48.180 --> 00:40:49.540
Hey, Claude, how hard would it be?

00:40:49.580 --> 00:40:50.280
And that'll sketch it.

00:40:50.320 --> 00:40:52.600
I'm like, okay, yeah, this is totally doable.

00:40:52.600 --> 00:40:58.940
Yeah, I was doing some gardening this weekend and having an agent work for me while I was doing my own thing.

00:40:59.240 --> 00:41:03.080
So anyway, let's have something funny.

00:41:03.520 --> 00:41:05.240
And I was going to bring this up.

00:41:05.340 --> 00:41:13.300
This is, I think, an April Fools' joke from MotherDuck, the DuckDB company.

00:41:13.620 --> 00:41:13.720
Right.

00:41:13.840 --> 00:41:16.460
MotherDuck, they're the company behind DuckDB.

00:41:16.620 --> 00:41:19.200
And this is like their commercial offering.

00:41:19.300 --> 00:41:20.980
So you can run DuckDB better, basically.

00:41:20.980 --> 00:41:21.540
Okay.

00:41:21.820 --> 00:41:26.020
Well, they put out HumanDB instead of DuckDB. HumanDB.

00:41:26.360 --> 00:41:28.960
And they did like Human Feet instead of Duck Feet.

00:41:29.860 --> 00:41:33.600
And it says, blazingly slow, emotionally consistent.

00:41:34.140 --> 00:41:37.240
The world's first human-powered analytical database.

00:41:37.600 --> 00:41:39.880
Why pay for compute when Dave is right there?

00:41:41.260 --> 00:41:51.080
This is just pretty darn fun: you can pip install HumanDB and import HumanDB and do queries.

00:41:51.080 --> 00:41:54.600
And it just, like, plays it out for you.

00:41:54.720 --> 00:41:55.540
And it plays audio.

00:41:55.980 --> 00:41:57.560
Dave is squinting at this.

00:41:57.820 --> 00:41:58.940
And it's like, yeah.

00:41:59.660 --> 00:42:01.140
Can you actually install it?

00:42:01.440 --> 00:42:02.140
Yeah, I did.

00:42:02.280 --> 00:42:04.640
I did install and ran it.

00:42:04.640 --> 00:42:07.020
And it's pretty funny.

00:42:07.220 --> 00:42:10.000
Dave is, yeah, contacting Dave for this query.

00:42:10.780 --> 00:42:14.300
The website is very complete about how this all works.

00:42:15.200 --> 00:42:18.340
It's got in-brain storage, post-it indexing.

00:42:19.620 --> 00:42:20.880
Suboptimal but colorful.

00:42:21.140 --> 00:42:24.700
Each index is handwritten and stuck to the monitor bezel.

00:42:25.880 --> 00:42:26.040
Yeah.

00:42:26.340 --> 00:42:29.240
OLAH processing, online analytical humans.

00:42:29.240 --> 00:42:30.780
Eventually consistent.

00:42:31.020 --> 00:42:31.860
Dave will get back to you.

00:42:32.100 --> 00:42:33.220
He'll get back to you.

00:42:33.600 --> 00:42:35.140
SLA is one business day.

00:42:35.440 --> 00:42:37.240
Or three if it's quarter end.

00:42:37.320 --> 00:42:39.060
Or five if Dave's on PTO.

00:42:39.440 --> 00:42:40.620
We'll circle back.

00:42:41.340 --> 00:42:42.800
SQL or natural language.

00:42:43.580 --> 00:42:43.740
Yeah.

00:42:45.040 --> 00:42:47.340
Dave learned SQL first, then English.

00:42:47.460 --> 00:42:48.340
He understands both.

00:42:48.720 --> 00:42:49.740
Just ask him anything.

00:42:50.160 --> 00:42:51.280
So, yeah.

00:42:52.920 --> 00:42:54.620
I did pip install this.

00:42:54.760 --> 00:42:55.560
Played with it.

00:42:55.560 --> 00:42:59.180
And it's pretty funny to watch.

00:42:59.320 --> 00:42:59.900
Oh, you can do.

00:43:00.040 --> 00:43:00.820
There's examples.

00:43:01.460 --> 00:43:04.400
So, select average salary from employee where or whatever.

00:43:05.320 --> 00:43:06.160
It just runs it.

00:43:07.020 --> 00:43:09.920
Borrowing Gary's ledger pad for the query.

00:43:10.220 --> 00:43:11.220
And then you have to wait for it.

00:43:11.580 --> 00:43:17.500
The average engineering salary is somewhere around $87,000, give or take.

00:43:17.800 --> 00:43:20.320
I ran those numbers using Gary's ledger pad.

00:43:20.460 --> 00:43:21.520
But he wants it back.

00:43:21.640 --> 00:43:22.640
So, you know, rounding.

00:43:22.640 --> 00:43:27.860
So, I just had a lot of fun with it.

00:43:27.900 --> 00:43:33.340
I probably spent 20 minutes playing with HumanDB a couple weeks ago.

00:43:33.760 --> 00:43:34.840
Yeah, that's really funny.

00:43:35.400 --> 00:43:35.920
Benchmarks.

00:43:36.140 --> 00:43:37.160
Oh, it's got benchmarks.

00:43:37.800 --> 00:43:43.000
So, DuckDB is 0.003 seconds.

00:43:43.180 --> 00:43:45.040
HumanDB, two to four business hours.

00:43:45.680 --> 00:43:45.980
Nice.

00:43:46.980 --> 00:43:49.220
DuckDB is pennies to run.

00:43:49.220 --> 00:43:52.920
But HumanDB is $49 a month plus snacks.

00:43:53.400 --> 00:43:53.600
Nice.

00:43:53.940 --> 00:43:54.540
The vibes.

00:43:54.860 --> 00:43:56.200
Clinical versus immaculate.

00:43:56.520 --> 00:43:57.180
Gut feeling.

00:43:57.540 --> 00:43:58.540
A built-in gut feeling.

00:43:58.680 --> 00:43:59.120
That's great.

00:43:59.540 --> 00:44:00.520
Remembers your birthday.

00:44:00.700 --> 00:44:01.320
Dave is thoughtful.

00:44:01.720 --> 00:44:03.380
And DuckDB will not remember your birthday.

00:44:03.520 --> 00:44:05.340
But it probably will if you put it in the database.

00:44:05.740 --> 00:44:06.580
Anyway, that's funny.

00:44:06.840 --> 00:44:07.360
Oh, wow.

00:44:07.420 --> 00:44:08.200
They've got pricing.

00:44:08.520 --> 00:44:09.660
They have an enterprise tier.

00:44:10.020 --> 00:44:10.400
Enterprise.

00:44:10.640 --> 00:44:11.240
Let's talk.

00:44:11.620 --> 00:44:12.900
Unlimited human analysts.

00:44:13.100 --> 00:44:14.260
On-call overnight human.

00:44:14.700 --> 00:44:16.340
We'll figure out the SLA.

00:44:16.800 --> 00:44:18.560
Dedicated Slack workspace.

00:44:18.940 --> 00:44:20.280
Quarterly team pizza party.

00:44:21.460 --> 00:44:22.440
This is funny.

00:44:22.600 --> 00:44:23.120
Dave gets equity.

00:44:23.440 --> 00:44:24.320
Dave gets equity.

00:44:25.440 --> 00:44:27.660
The $0 one for the free one.

00:44:27.960 --> 00:44:29.360
Emotional support not guaranteed.

00:44:30.120 --> 00:44:30.960
So, that's funny.

00:44:31.260 --> 00:44:32.900
I wonder, can you just buy it?

00:44:33.120 --> 00:44:33.860
Upgrade to Pro.

00:44:34.720 --> 00:44:35.240
No.

00:44:35.540 --> 00:44:36.440
Because it is a joke.

00:44:36.720 --> 00:44:39.120
But it's funny when people carry it farther.

00:44:39.120 --> 00:44:41.200
And they actually take a credit card or something.

00:44:41.980 --> 00:44:42.920
Yeah, it's pretty funny.

00:44:43.080 --> 00:44:43.580
I love it.

00:44:43.580 --> 00:44:43.800
Anyway.

00:44:44.040 --> 00:44:46.300
I have one really quick thought to close things out.

00:44:46.560 --> 00:44:46.980
Okay.

00:44:47.000 --> 00:44:47.680
That's a little bit practical.

00:44:47.960 --> 00:44:57.000
For those folks out there that are interested in this "why aren't we on uv yet" study, you could look at the requirements.txt file.

00:44:57.320 --> 00:45:04.400
And if it was generated with a pip compile sort of thing, a uv pip compile, it would say the command is uv pip compile.

00:45:04.460 --> 00:45:05.580
It'll say "generated with" and then the command.

00:45:05.660 --> 00:45:06.840
And that command will start with uv.

00:45:06.840 --> 00:45:11.080
So, you could parse the requirements.txt files and get greater visibility.

00:45:11.080 --> 00:45:15.120
But only for the ones that use pip compile, not the ones that just install unpinned versions.

00:45:15.560 --> 00:45:20.240
Well, and also, only if they are generating the requirements.txt with uv.

00:45:21.320 --> 00:45:21.400
So.

00:45:21.680 --> 00:45:22.500
It would raise the numbers.

00:45:22.660 --> 00:45:24.080
I don't know by how much.

00:45:24.340 --> 00:45:24.540
Anyway.
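[Editor's note: the header check discussed here can be sketched in a few lines of Python. This is a hedged sketch, not part of the show: it assumes the header comment formats that uv pip compile and pip-tools' pip-compile currently emit (uv writes a line like "# This file was autogenerated by uv via the following command:"), and the detect_compiler name is made up for illustration.]

```python
import re

def detect_compiler(requirements_text: str) -> str:
    """Guess which tool generated a requirements.txt from its comment header.

    Assumes the header conventions uv and pip-tools emit today:
      uv:        '# This file was autogenerated by uv via the following command:'
      pip-tools: '# This file is autogenerated by pip-compile ...'
    Only the leading comment block is inspected.
    """
    for line in requirements_text.splitlines():
        stripped = line.strip()
        if not stripped.startswith("#"):
            break  # header is over; stop at the first real requirement line
        if "autogenerated by uv" in stripped or re.search(r"\buv pip compile\b", stripped):
            return "uv"
        if "pip-compile" in stripped:
            return "pip-tools"
    return "unknown"  # hand-written or unpinned files carry no such header
```

As noted in the discussion, this only catches projects that compile pinned requirements; a hand-written list of unpinned names comes back "unknown".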

00:45:24.880 --> 00:45:25.060
Yeah.

00:45:25.060 --> 00:45:32.840
Oh, also, so many projects or libraries don't have either of those.

00:45:33.800 --> 00:45:33.840
Like.

00:45:34.080 --> 00:45:34.260
Yeah.

00:45:34.340 --> 00:45:34.560
True.

00:45:34.820 --> 00:45:35.220
That's true.

00:45:35.700 --> 00:45:38.260
Libraries just have their dependencies in pyproject.toml.

00:45:38.340 --> 00:45:40.800
So, there's no uv.lock or requirements file.

00:45:41.220 --> 00:45:41.440
You know.

00:45:41.900 --> 00:45:42.940
The world's complicated, Brian.

00:45:43.920 --> 00:45:45.200
Do you know what's not complicated?

00:45:45.640 --> 00:45:45.940
What's that?

00:45:45.940 --> 00:45:47.320
We are at the end of the episode.

00:45:47.700 --> 00:45:48.660
So, thanks everybody.

00:45:48.660 --> 00:45:50.460
I'm going to press the goodbye everybody button.

00:45:50.600 --> 00:45:51.140
Goodbye everybody.

00:45:51.380 --> 00:45:51.700
Bye.
