Brought to you by Michael and Brian - take a Talk Python course or get Brian's pytest book

#455: Gilded Python and Beyond

Published Mon, Oct 27, 2025, recorded Mon, Oct 27, 2025
Watch this episode on YouTube
Play on YouTube
Watch the live stream replay

About the show

Sponsored by us! Support our work through:

Connect with the hosts

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 10am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form? Add your name and email to our friends of the show list, we'll never share it.

Michael #1: Cyclopts: A CLI library

Brian #2: The future of Python web services looks GIL-free

  • Giovanni Barillari
  • “Python 3.14 was released at the beginning of the month. This release was particularly interesting to me because of the improvements on the "free-threaded" variant of the interpreter.

    Specifically, the two major changes when compared to the free-threaded variant of Python 3.13 are:

    • Free-threaded support now reached phase II, meaning it's no longer considered experimental
    • The implementation is now completed, meaning that the workarounds introduced in Python 3.13 to make code sound without the GIL are now gone, and the free-threaded implementation now uses the adaptive interpreter as the GIL enabled variant. These facts, plus additional optimizations make the performance penalty now way better, moving from a 35% penalty to a 5-10% difference.”
  • Lots of benchmark data, both ASGI and WSGI
  • Lots of great thoughts in the “Final Thoughts” section, including
    • “On asynchronous protocols like ASGI, despite the fact the concurrency model doesn't change that much – we shift from one event loop per process, to one event loop per thread – just the fact we no longer need to scale memory allocations just to use more CPU is a massive improvement. ”
    • “… for everybody out there coding a web application in Python: simplifying the concurrency paradigms and the deployment process of such applications is a good thing.”
    • “… to me the future of Python web services looks GIL-free.”
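The "one event loop per process" to "one event loop per thread" shift from the quote can be sketched with the stdlib; this is an illustration of the concurrency model, not code from the article:

```python
import asyncio
import threading

async def handle(i: int) -> str:
    await asyncio.sleep(0.01)  # stand-in for async I/O work
    return f"loop {i} handled a request"

def worker(i: int, results: list) -> None:
    # One event loop per thread: each thread runs its own asyncio loop,
    # sharing the process's memory instead of forking whole worker processes.
    results.append(asyncio.run(handle(i)))

results: list = []
threads = [threading.Thread(target=worker, args=(i, results)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)
```

On a GIL-enabled build these threads still serialize CPU work; the point of the article is that on a free-threaded build this layout scales across cores without multiplying memory per worker.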

Michael #3: Free-threaded GC

  • The free-threaded build of Python uses a different garbage collector implementation than the default GIL-enabled build.
  • The Default GC: In the standard CPython build, every object that supports garbage collection (like lists or dictionaries) is part of a per-interpreter, doubly-linked list. The list pointers are contained in a PyGC_Head structure.
  • The Free-Threaded GC: Takes a different approach. It scraps the PyGC_Head structure and the linked list entirely. Instead, it allocates these objects from a special memory heap managed by the "mimalloc" library. This allows the GC to find and iterate over all collectible objects using mimalloc's data structures, without needing to link them together manually.
  • The free-threaded GC does NOT support “generations”
  • By marking all objects reachable from these known roots, we can identify a large set of objects that are definitely alive and exclude them from the more expensive cycle-finding part of the GC process.
  • Overall, the free-threaded GC collection is between 2 and 12 times faster than the 3.13 version.
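Since the two builds ship separately, a small stdlib probe (a sketch; `sys._is_gil_enabled()` is a 3.13+ private API, so it is guarded here) shows which one you are on:

```python
import sys
import sysconfig

# True when the interpreter was compiled as a free-threaded build.
free_threaded_build = bool(sysconfig.get_config_var("Py_GIL_DISABLED"))

# Even on a free-threaded build the GIL can be re-enabled at runtime
# (e.g. by an incompatible extension); guard for interpreters without the API.
gil_enabled = getattr(sys, "_is_gil_enabled", lambda: True)()

print(f"free-threaded build: {free_threaded_build}, GIL enabled: {gil_enabled}")
```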

Brian #4: Polite lazy imports for Python package maintainers

  • Will McGugan commented on a LinkedIn post by Bob Belderbos regarding lazy importing
  • “I'm excited about this PEP.

    I wrote a lazy loading mechanism for Textual's widgets. Without it, the entire widget library would be imported even if you needed just one widget. Having this as a core language feature would make me very happy.”

    https://github.com/Textualize/textual/blob/main/src/textual/widgets/__init__.py

  • Well, I was excited about Will’s example for how to, essentially, allow users of your package to import only the part they need, when they need it.

  • So I wrote up my thoughts and an explainer for how this works.
  • Special thanks to Trey Hunner’s Every dunder method in Python, which I referenced to understand the difference between __getattr__() and __getattribute__().
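The pattern boils down to a module-level `__getattr__` (PEP 562). Here is a self-contained sketch with hypothetical names (Textual's real mapping lives in the `__init__.py` linked above); in a real package the function would simply be defined at the top of `__init__.py`, so it's attached to a dynamically created module only to keep the example runnable:

```python
import importlib
import types

# Hypothetical name -> defining-module map; a real package would point
# these at its own submodules (e.g. "mypkg._button").
_LAZY = {"JSONDecoder": "json", "OrderedDict": "collections"}

# Stand-in for the package itself.
pkg = types.ModuleType("lazypkg")

def __getattr__(name):
    # Called only when normal attribute lookup on the module fails (PEP 562).
    try:
        module_name = _LAZY[name]
    except KeyError:
        raise AttributeError(f"module 'lazypkg' has no attribute {name!r}") from None
    obj = getattr(importlib.import_module(module_name), name)
    setattr(pkg, name, obj)  # cache it so future lookups skip __getattr__
    return obj

pkg.__getattr__ = __getattr__

print(pkg.JSONDecoder)             # first access triggers the real import
print("JSONDecoder" in vars(pkg))  # cached on the module now
```

Importing the package stays cheap because nothing in `_LAZY` is loaded until someone actually asks for it.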

Extras

Brian:

  • Started writing a book on Test Driven Development.
    • Should have an announcement in a week or so.
    • I want to give folks access while I’m writing it, so I’ll be opening it up for early access as soon as I have 2-3 chapters ready to review. Sign up for the pythontest newsletter if you’d like to be informed right away when it’s ready. Or stay tuned here.

Michael:

Joke: You're absolutely right

Episode Transcript

Collapse transcript

00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.

00:05 This is episode 455, recorded October 27th, 2025.

00:10 And I'm Brian Okken.

00:11 I'm Michael Kennedy.

00:13 And this episode is sponsored by the wonderful people at Python Bytes.

00:17 Actually, both and everybody here.

00:20 So please check out the work we do for a little bit of money.

00:26 We've got Talk Python training with tons of wonderful courses.

00:29 And his new book, of course, Michael's new book.

00:32 And then also, if you want to check out pytest, there's the complete pytest course.

00:38 Or you can take the pytest course at Talk Python Training.

00:41 And as always, thank you to our Patreon supporters.

00:45 You guys rock.

00:47 We don't shout you out all the time, but we always appreciate you.

00:50 If you'd like to submit a topic idea or some feedback for us or just say hi,

00:57 Check out the show notes at pythonbytes.fm.

01:00 There's links to our socials.

01:04 We've got the show, and both Michael and I are on Bluesky and Fosstodon,

01:10 but that's really Mastodon, so wherever you're at there.

01:14 If you're listening to this and you're like, what, sounds like they're recording live somewhere.

01:18 We can see it.

01:19 Yes, we are recording and streaming, but you can watch all the back episodes or participate in the chat if you want to.

01:28 Check out pythonbytes.fm/live with all the details there and a link to when we're going to record next.

01:36 Before we get started, one last thing.

01:39 We do send out a newsletter that's getting better and better as we go along,

01:44 thanks to both Michael and I working on it a little bit.

01:47 So there's a newsletter we send out with all the links.

01:51 So you don't have to take notes.

01:53 We'll send you out an email.

01:55 So sign up for the newsletter and we'll send you those links and extra details too.

02:00 It's a great resource.

02:02 Yeah, I think people really appreciate it, Brian.

02:04 You know, if you look at when you send out the email and then look back in the campaign

02:09 and how many people actually open the thing that we send out or click on it,

02:13 those are really close to 100%, which is ridiculous.

02:16 So that's a really good sign that other people might also like it.

02:20 Yeah, I think that people are using it in place of taking notes.

02:23 And also, if you're in a car or something, you don't want to try to remember.

02:26 You don't have to.

02:27 So awesome.

02:29 All right, Michael, what do you got for us right away?

02:33 Well, let me see what I can do here.

02:37 Let's see what the Cyclops says.

02:39 A Cyclops.

02:40 It's like a little programming kitty, but it's a Cyclops.

02:43 It's cute.

02:44 Like Opt, O-P-T.

02:46 operations cycle.

02:48 I get it.

02:49 So we've heard of Click.

02:51 You and I were both just actually singing the praises of argparse,

02:55 which is pretty interesting.

02:57 I am such a fan of argparse these days because it's just built in and it's

03:02 simple and it's good enough.

03:04 If you were building something where the CLI

03:08 API or interface or dev user experience was super

03:12 important, you might choose something other than argparse.

03:15 So what are your options?

03:16 You could choose Click, you could choose Typer.

03:19 Typer is actually built on Click, so in a sense it's like Starlette and FastAPI.

03:23 You choose one, you get the other in a sense.

03:25 So there's a blend there, and there's others.

03:27 I know there's a bunch of different CLI building options.

03:31 But Cyclopts is one that sort of has the second mover advantage over Typer.

03:37 And their sort of tagline, I don't remember where I saw it, but

03:40 somewhere in here, it's not actually right here on this page.

03:44 but, no shade at Typer, it says: what you thought Typer was going to be.

03:48 And the second mover advantage is like, all right, we've seen different interface things built based around type information.

03:55 And now that we've seen the pluses and the minuses, could we do a slightly different version that has more pluses and fewer minuses?

04:02 And so there's a whole page of the typer comparison here.

04:05 And I'm not a super expert on typer.

04:09 My CLI-fu is not that great.

04:12 So there's 13 different things, 13 different components or features or aspects of the API

04:20 that they chose to address to say, Typer's great.

04:23 We're basically inspired by Typer.

04:25 That's what we're doing.

04:26 However, there's a couple of these 13 areas I think we should make better, right?

04:30 So you could actually go through and look at all of them.

04:33 Probably the biggest one is that Typer was made with Python 3.7, 3.8,

04:39 where Python's types were at an earlier stage in their life.

04:44 You had to say things like, this is an argument of int,

04:49 or it is an int when it's actually an argument of int or something like that.

04:52 And this Cyclopts thing was built once Annotated was introduced,

04:58 which allows you to have sort of a dual role.

05:00 Like, it really is one of these, but its type could also be like its native value, right?

05:05 Allowing the removal of proxy defaults.

05:08 Why does that matter?

05:09 Because if you build a CLI with Cyclopts, and you use types,

05:14 you can still use it as regular functions as if they just had an integer

05:18 where it said it took an integer and so on.

05:20 So things like that.
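The point he's making about Annotated can be sketched with the stdlib alone. The `Option` marker below is hypothetical, not Cyclopts' or Typer's actual API; it just shows how CLI metadata can ride along in the type while the default stays a plain Python value, so the function remains callable as ordinary code:

```python
from typing import Annotated, get_args, get_type_hints

# Hypothetical marker a CLI framework might attach to a parameter.
class Option:
    def __init__(self, help: str = ""):
        self.help = help

def serve(port: Annotated[int, Option(help="port to bind")] = 8000) -> str:
    return f"serving on {port}"

# The function still behaves like ordinary Python: a real int default, no proxy object.
print(serve())      # serving on 8000
print(serve(9000))  # serving on 9000

# A framework can still recover the metadata by inspecting the annotation.
hints = get_type_hints(serve, include_extras=True)
base, marker = get_args(hints["port"])
print(base, marker.help)
```

With the older proxy-default style, the default value would be a framework object rather than `8000`, so calling the function directly would misbehave.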

05:21 It's got about, I think, a thousand GitHub stars or so.

05:24 It's not crazy, crazy popular.

05:26 But if you're really loving that concept of having types for your CLI,

05:31 give this thing a look.

05:33 Yeah, actually, I got a shout out to him for doing things like comparing to Typer

05:39 because, like you said, a lot of us... I use Typer as well.

05:43 And I've used Typer, I've used Click and argparse.

05:47 And so you're probably not going to grab too many of the argparse people

05:53 because they're just using it because it's built in, maybe.

05:57 You're like the unit test people, Brian.

05:59 What can you do?

06:01 You talk pytest for 10 years.

06:02 I yell louder.

06:04 That's what I do.

06:06 But I think it's a fair thing because I'd be thinking, well, if I'm not going to use argparse, I'm probably going to reach for Typer or Click, lately Typer.

06:17 But having this comparison here and then even a migration from Typer, I think that's cool to show that.

06:27 And there's always room for making CLI tools better.

06:31 We love CLI tools.

06:32 Yeah.

06:33 Yeah, there definitely are.

06:34 But yeah, so cool.

06:35 I ran across this and I built something with it recently and it was nice.

06:40 Oh, did you?

06:41 Checking out.

06:41 Yeah.

06:42 A little, you know, I'm starting to add.

06:44 I've been kind of, I've drunk the Kool-Aid now that maybe having an actual CLI,

06:50 even from my own tools, might be a little bit better than just,

06:53 here's a script I run and maybe I'll pass it something

06:56 and just look at sys.argv and see if it's in there.

06:59 Oh, man.

07:00 Let's do a little bit better.

07:01 And now that I've started to do that, I've been playing with these different ones,

07:04 which is, I think, where I kind of got inspired to go down this path and talk about it today.

07:08 Yeah, awesome.

07:09 Cool.

07:11 I want to talk about the present, the future, a little bit of both.

07:17 So there is this interesting article, of course, in Python 3.14.

07:22 Well, we had it in 3.13, is that the GIL was optional with what they call the free threading version, right?

07:32 Free threaded Python.

07:33 Not the GIL-less, not the gilectomy.

07:37 Free-threaded, I think, is where we landed from the PEP.

07:40 Okay, but is it kind of the same thing?

07:42 Yeah.

07:43 Okay.

07:46 Oh, right.

07:47 The gilectomy was a different project.

07:48 Yeah, it was.

07:49 Gosh.

07:50 Anyway.

07:51 Larry Hastings was doing that.

07:53 But I believe, I mean, the free-threaded stuff, there's a lot of things that had to happen.

07:57 And actually, my next topic is also on free-threaded Python.

08:00 And it's not just let's take out the GIL, but they had to rework memory management,

08:04 the structure of objects.

08:05 There's like a lot of moving parts in order to make this possible.

08:09 Okay, cool.

08:11 Okay, so with the free-threaded Python, so now we have in 3.13,

08:16 we had two versions of Python released.

08:18 So if you go to download Python, you get a choice.

08:21 And with uv, too, you can add a t to it, 3.14t, and you get the free-threaded version.

08:29 But like, and also definitely encourage everybody test and publish that you've tested.

08:36 In CI, you go ahead and test it because the CI tools support both.

08:42 Why do we care about this?

08:44 Well, because some people are migrating to it because as with 3.14,

08:49 it's no longer just an experimental thing.

08:52 So here, I'm going to cover an article by Giovanni Barillari.

08:58 Sorry, Giovanni.

08:59 I can't pronounce that one.

09:00 I'm pretty sure this is the same Giovanni that makes Granian.

09:04 Oh, okay.

09:06 I think so.

09:07 The future of Python web services looks GIL-free.

09:10 So it starts out with two major changes when comparing free-threaded variant of 3.14 versus 3.13.

09:19 First, free-threaded support now reaches phase two, meaning it is no longer considered experimental.

09:25 So everybody's on board with this is what we're going to at least try to do for a while.

not just try to do, but the Python core team is supporting this, and we're going to do it

09:36 moving forward. What does that mean? It means that people can depend on a free-threaded version and

09:42 change their code accordingly if necessary, which is great, especially for a lot of async stuff

09:49 like web services. So, secondly, the implementation is now completed, meaning that the workarounds

09:56 introduced in 3.13 to make code sound without the GIL are now gone, and the free-threaded implementation

10:05 now uses the adaptive interpreter as the GIL-enabled variant. What does that mean? It means that the

10:13 additional optimizations... Oh, okay. Also, those facts plus additional optimizations make the

10:19 performance penalty that we had in 3.13 way better, or way less annoying. So in 3.13, with the free-threaded one, if

10:27 you're doing non-async code, just straight code, you had to face like a 35% penalty, which sucks.

10:35 Now it's like a five to ten percent difference, and I think it's going to shrink even more.

10:40 So if you don't need the free-threaded one, don't use it, or use it, whatever.

10:47 But if it's time critical, don't right now.

10:51 But I think it's going to get better.

10:54 And especially if you are using async and stuff, it's way faster to use the free-threaded, of course.

11:00 So this article does a shout-out to Miguel Grinberg's article about performance.

11:06 I think we covered it last week.

11:07 We did. You did.

11:09 No, maybe I did. We did. I can't remember who started it.

11:13 So this article has a whole bunch more benchmarks.

11:17 And I'm not going to walk through all of them, but it talks about doing a service with JSON responses, and both measuring with WSGI and ASGI implementations.

11:31 And yes, talking about using Granian.

11:35 So that makes sense.

11:37 So I'm going to jump down.

11:38 So there's a bunch of metrics, but basically the end result is, if you're using ASGI, the free-threaded one is obviously the way to go.

11:49 And you don't get really much of a hit because of free-threaded implementation at all.

11:56 You get speedups and memory usage is reasonable.

12:00 So what I want people to do, if you're going to check this out, is to jump down to the final thoughts because there's some great stuff here.

12:08 Essentially, so what do I want to pick out?

12:13 On asynchronous protocols like ASGI, despite the fact that the concurrency model doesn't change that much,

12:20 it's a shift from one event loop per process to one event loop per thread.

12:25 I think that's a big change, actually.

12:27 Just the fact that we no longer need to scale memory allocations

12:30 just to use more CPU is a massive improvement.

12:33 That's cool.

12:34 Didn't know that.

12:35 That's neat.

12:36 For everybody out there coding a web application in Python,

12:40 simplifying the concurrency paradigms and the deployment process of such applications

12:44 is a good thing, obviously.

12:47 And the conclusion being, for me, the future of Python web services

12:52 looks GIL-free.

12:54 So if you're doing web services, try out the free-threaded version.

12:58 At least one developer there is completely happy with what we have now.

13:02 Yeah, super nice.

13:03 A bit of real-time follow-up.

13:05 Yes, indeed. It is the same Giovanni.

13:08 All right.

13:08 Who creates Granian.

13:10 So with any of these web production app servers where you run your Python code when it's running,

13:16 whether it's Django, Flask, whatever, there's a process that runs your code.

13:19 Very often, the way that we've got them to work around the GIL,

13:23 like in the web, the GIL is a much smaller problem.

13:26 It's still a problem, but it's less.

13:28 Because what we do often is we create a web garden.

13:33 So when you set up Granian, you can say, start four copies of yourself.

13:37 So that, yes, they all have the GIL, but we'll round robin requests between the four.

13:43 If you've got four cores, four CPUs, you basically can say, well, each one of these

13:47 is dedicated to a CPU and it can kind of like match, right?

13:51 Yeah, so we're running on good stuff.

13:53 But the thing is, when you have this free threaded option,

13:57 you can actually have true concurrency in your one worker.

14:00 So instead of scaling out four copies, you could have just one and just say,

14:04 let that one take 10 concurrent requests or whatever it needs to take.

14:08 - Yeah.

14:08 - Right, so that's how the free thread gets better.

14:10 And like, well, okay, why?

14:14 What's that, like, six of one, half dozen of the other?

14:14 No, the memory becomes a problem when you create these little web gardens

14:17 because if normally your server would use half a gig and you create four, well now it's two gigs

14:23 and maybe that bumps you up in another tier and so on, right?

14:26 - Yeah, and if you need that memory for data, you don't have it, so.

14:31 - Yeah, so I think that's kind of one of the angles. You're writing concurrent code, but the foundations of your code running can do so more efficiently

14:39 because they're no longer needing to work around process limitations, like, well, we've got to

14:44 fan out these processes because of the GIL. Okay, back to the regularly scheduled programming,

14:49 free threaded Python, let's keep going. So here's an interesting article from a snake wearing a very

14:57 fast, it's a very fast snake with jet engines that is wearing a garbage collector. So what I

15:04 want to talk about is unlocking the performance in Python's free-threaded future: GC optimization.

15:09 So garbage collection is not something that we frequently talk about a ton in Python. You know,

15:15 it's just, we don't think a lot about GC. We don't think a lot about memory. I don't know why, like,

15:20 as you know, Brian, in C, C++, people are like always about it, right? Like, oh my gosh,

15:25 what size of point and what size of integer are we going to use for this, right? And like,

15:29 it's just way more sort of in the forefront, but it's nice every now and then to peel back the

15:34 and get a look at what's going on here.

15:36 So Neil Schemenauer wrote this article over here from Quansight.

15:42 They do a bunch of data science primarily.

15:44 But here, check this out.

15:45 This is news to me, actually, even though I'm still interested in these things.

15:49 It says, first, the most important thing to know is that free-threaded Python uses a different garbage collector

15:54 than the default Python, the GIL-enabled.

15:57 The gilded, I still want to call it the gilded Python.

15:59 The gilded Python.

16:01 So we have the default GC, which people probably understand pretty well.

16:05 When you create a number, like a thousand, when you create a list,

16:10 when you create an object from a class, all of those things have a data structure at the top of them

16:16 that allows them to manage and track the memory.

16:19 And people think of the GIL as being a threading thing.

16:22 The GIL is really a protection mechanism for memory management in Python, right?

16:27 It's basically protection against a race condition in the reference counting.

16:32 So basically interacting with this structure that has seven things reference me

16:37 and I reference it, however, you know what I mean?

16:39 Like that sort of deal.

16:41 - Yeah.

16:41 - And when the thing goes out of scope, it decrements that and that's where the GIL comes in

16:45 to like make that decrement safe and so on.
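The reference counts Michael is describing are visible from Python itself. This is a CPython-specific illustration (on free-threaded builds the absolute numbers can look different because of biased reference counting, so only the relative changes matter):

```python
import sys

data = []
print(sys.getrefcount(data))  # the getrefcount call itself adds one temporary reference

alias = data                  # a second reference to the same list
print(sys.getrefcount(data))  # one higher than before

del alias                     # dropping the alias decrements the count again
print(sys.getrefcount(data))
```

It's exactly these increment/decrement operations that the GIL keeps race-free in the default build, and that the free-threaded build has to make safe by other means.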

16:47 So that's the regular one, but we don't even have that structure anymore

16:52 in the free threaded one.

16:54 So what?

16:56 So instead, when you allocate stuff, it goes into a special memory-managed heap called mimalloc,

17:04 or managed by mimalloc, right?

17:06 And this allows the GC to loop over all these objects

17:09 and figure out which ones are junk and which ones aren't.

17:13 A little bit like a mark and sweep garbage collector.

17:17 So there's a couple of interesting things.

17:20 One is that most of the mark and sweep garbage collectors

17:23 and that type of thing, they have generations, like a generational one.

17:28 So it has Gen 0s where most objects come, then it'll check that frequently and so on.

17:33 That's also how the garbage collector that looks for cycles

17:36 in the regular Python goes, right?

17:39 But it doesn't work here because of unmanaged C extensions

17:43 and a bunch of stuff like that.

17:45 So what it does is it can mark all the objects reachable

17:50 from what it calls known roots.

17:52 So instead of trying to scan the entire memory space,

17:54 it says, well, what are globals?

17:55 What are locals?

what are, like, you know... It sort of follows those and marks those as active, and they can

18:02 automatically be excluded from the search, which is kind of like a generational optimization. So,

18:08 pretty interesting. But here's the takeaway. We were talking about making things faster by using free-threaded

18:13 Python. With this one, the free-threaded GC collection is between 2 and 12 times faster

18:19 than the 3.13 version. Wow, that's pretty wild, right? Yeah, yeah. So depending on how

18:25 your algorithms work, do you have a super pointer-heavy,

18:27 allocation-heavy type of algorithm,

18:30 then, you know, we'll probably see a bigger benefit

18:32 than if you allocate a few things and jam on those.

18:35 So anyway, if you want to peel back the covers

18:37 and see the GC, I didn't even know they were different,

18:39 but apparently they're different.

18:41 And presumably, maybe they could even have

18:43 like multi-threaded GC stuff going on.

18:46 That might be cool.

18:46 Or background thread GC.

18:48 I know those are a lot of different things that happen in some of these systems.

18:51 These garbage collection systems.

18:52 Yeah.

18:53 And finally, A little bit of real time, not quite real time, but real world follow up, I'll say.

18:58 So for some reason, I can't remember what I was doing.

19:01 Something I had to do, I had to go back, go through all the examples of my async programming course at Talk Python.

19:07 So I went back in there and I said, all right, well, let me just make sure everything's running, update all the dependencies.

19:14 I can't remember what it was.

19:15 But anyway, something had to be updated.

19:16 So I made sure all the dependencies got updated and pinned them to newer ones.

19:20 And I'm like, well, there's a lot of changes here.

19:21 Let me make sure they're all running.

19:23 So go around and I ran every example from the course, probably 25 little apps.

19:28 And many of them are like, well, here's a web app, or here's just a way to do

19:33 multiprocessing, have a look at it.

19:33 But a lot of them were like, let's actually do this with threads.

19:37 And in this example with doing IO, you see it does make a difference.

19:40 In this example doing computational stuff, no difference or slower because there's

19:44 just overhead and it's still running one at a time, right?

19:47 Well, I decided to type uv run --python 3.14t, run those examples, and run them free-threaded.

19:56 The ones that used to have no benefit or a net negative are now like seven times faster.

20:03 It would go like synchronous one, 10.2 seconds.

20:08 Async or threaded one rather, 10.25 seconds.

20:13 And now just put the T on the end, three seconds.

That's pretty cool. That is pretty cool, man. So to me, it's really down to what are the frameworks,

20:25 what are the packages and libraries that you're using. If it's green across the board for free-threaded,

20:30 that's pretty interesting. It opens up some real possibilities. Yeah. Also, I think I'll

20:37 be looking forward to... I don't know if we have a deadline or timeline for this, but

20:41 when we can go back to having one version of Python and it's being the free-threaded one,

20:49 it'll be, you know, maybe we can switch it to be the default is the free-threaded one.

20:55 And for a couple of versions, there's the other one also.

20:57 You got to say 3.14G.

21:00 Yeah, or something like that.

21:02 And I think that would be good because the default way we think about how to program

21:08 and how to program quickly And the rules of thumb based on the GIL will be gone.

21:15 So we'll teach people different.

21:17 Yeah.

21:18 So anyway.

21:19 Yeah.

21:20 And before we move on real quick, a follow-up to your topic from the chat here.

21:24 Also, if you're on a Kubernetes, if you're on Kubernetes,

21:27 it makes setting reasonable CPU and memory requests more difficult

21:31 to have to scale out like the web garden type stuff.

21:35 Oh, okay.

21:36 Yeah.

21:36 So if you can just have one process like this is the one,

21:39 I think it's easier.

21:40 I believe that's what he was getting at.

21:41 Yeah.

21:42 And also, I mean, from the C++ world, yes, processes and threads both can work.

21:49 But thinking in threads is easier than thinking about multiprocesses.

21:53 100%.

21:54 Because you have shared memory.

21:54 It just is.

21:55 You don't have to proxy stuff over and figure out how to sync it back up and copy it.

22:00 To queues and all sorts of...

22:02 Anyway.

22:03 Okay.

22:04 I can't resist.

22:05 I got to say, I got to put this out there in the world and see what you think.

22:09 One thing right now, we have the gilded Python and we have the free threaded Python.

22:13 And I can choose to run the free threaded one, as we've been talking about, if I want that.

22:16 But what about this, Brian?

22:17 This is what I want to put out in the world.

22:19 I want, like right now, if I have a project and I've got some data and I need to process

22:24 this in parallel on the gilded one, I need to maybe do multi-processing.

22:28 So that's going to spin up five little baby Python processes all to go jam on that, right?

22:34 That's how you get around the GIL.

22:35 What about this?

22:36 what if you could pass a flag to multi-processing and it would start up free-threaded Python with threads

22:43 instead of a bunch of different processes that can't coordinate on whatever algorithm

22:47 that you just say run multi-processing, but this time do it in free-threaded Python

22:53 as your little multi-process burst and then come back to me with the answer.

22:56 That would be cool.

22:57 - Yeah, yeah.

22:58 - You're not convinced?

22:59 I mean, you could that way, 'cause you weren't right. - I mean, your code

23:01 has to be different though.

23:02 - Just that one function though.

23:04 And you don't have to pay, you know what I mean?

like, this one function that I'm going to call multiprocessing on, I'm going to write it assuming

23:09 that it will be an advantage to be threaded, or it will be an advantage to be concurrent,

23:14 but the rest of my code I don't have to rewrite. Like, it'd be a cool way to sort of, kind of

23:19 like Cython lets you, say: this one function, if just this were way faster, it would

23:22 make all the difference. You could do that, but for free threading. I would like to see that. That would be

23:26 interesting. Yeah, yeah. Or just a chunk, to say, like, this subsystem is all through

with free threading.
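Something close to the flag Michael wants already exists one level up, in `concurrent.futures`: the same code can target threads or processes by swapping the executor class. This is a sketch of that idea, not an actual `multiprocessing` flag:

```python
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

def crunch(n: int) -> int:
    # Stand-in for CPU-bound work on independent inputs.
    return sum(i * i for i in range(n))

def run_parallel(jobs, use_threads: bool = True):
    # One flag picks the backend: threads (a real win on a free-threaded
    # build) or processes (the classic way around the GIL).
    pool_cls = ThreadPoolExecutor if use_threads else ProcessPoolExecutor
    with pool_cls(max_workers=4) as pool:
        return list(pool.map(crunch, jobs))

if __name__ == "__main__":
    print(run_parallel([50_000] * 4, use_threads=True))
```

Because both executors share the `Executor` interface, only the one hot function needs to be written with concurrency in mind; the rest of the program is unchanged.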

23:37 All right, go ahead.

23:38 Okay, completely different tangent is we're going to go back in time by a couple of one or two episodes.

23:45 We've been, the last few episodes or two or three talking about lazy imports because I thought they were coming in 315.

23:55 They're proposed in 315, but they're not here.

23:58 So, or they're not accepted yet.

24:00 I haven't checked.

24:01 I don't think that anything's changed.

24:04 So last episode, I did talk about lazy imports that you can use today.

24:10 Responding to that, Bob Bilderbos had a discussion on LinkedIn.

24:17 And part of that discussion, we had Will McCoogan hop in and said that he's excited about the PEP

24:26 and that he has a lazy loading mechanism for textuals widgets, which totally makes sense.

24:33 like widgets, you might just need a button, but you don't need all of the other widgets.

24:40 Be cool if you could just load what you needed.

24:42 And he's got that.

24:43 So I checked it out.

24:44 I was checking this out.

24:45 And I think this is sort of a small topic, but I think it's cool.

24:49 So the idea that I want to highlight is I presented last week a method for doing lazy

24:58 loading now or lazy importing now on things that you depend on.

25:02 But what if you're like, you don't have to wait.

25:04 You can be lazy today.

25:06 But what about like, yeah, exactly.

25:09 What about if you're the package?

25:10 If you have a big package that you have a bunch of stuff that people might want to import,

25:15 making it so that you're not the problem, that your package is going to import really fast

25:21 because behind the scenes, you're doing lazy importing for people.

25:25 And that's essentially what Textual does.

25:27 And the implementation is like really easy.

25:35 So he's using, basically, when you access anything, he's overriding the __getattr__ function.

25:40 So if you access a widget, it's going to try to just grab it out of a dictionary.

25:46 And of course, right away, it's going to fail.

25:50 And then if it fails, he goes ahead and imports it and then stores it in.

25:56 So the next time somebody tries to grab it, it's going to be there.

26:00 And so there's a little bit of indirection here.

26:04 But in the end, you get lazy loading on every widget access.

26:11 So pretty cool.

26:13 And so I thought, you know, this is pretty neat.

26:16 I don't want it just to be hidden.

26:18 It's not hidden.

26:19 It's an open source project.

26:20 So I went ahead and wrote all this up in a new post called Polite Lazy Imports for Python Package Maintainers.

26:28 Very nice.

26:28 I'll link to that also. But I just thought that the implementation is totally clever, and maybe

26:35 for other package maintainers this is like, well, yeah, that's the way you do it. But it

26:40 was new to me, I thought it was neat, so there we go. Also want to shout out, I didn't realize there

26:45 were both __getattr__, which is used when a lookup fails, for missing attributes, and

26:53 another one called __getattribute__. And I did look it up. I really appreciate that Trey Hunner had posted

27:00 this "every dunder method" article. So pretty cool. I totally have this bookmarked so that,

27:08 if I want to understand a dunder method in Python, I go here.

27:12 Yeah, very cool. I know it for classes, but for modules as well. Interesting.

27:18 Yeah, and packages. Well, I guess

27:23 it is for modules, but if you put that in a dunder init file, __init__.py,

27:28 then it's for your entire package.

27:30 Yeah, exactly.

27:31 It's pretty cool.

27:32 I love it.

27:33 Nice.

27:34 All right.

27:34 Is that all your extras?

27:36 Oh, that's just...

27:37 Your main thing.

27:38 That's my main thing.

27:39 What do you got for extras?

27:40 I do have one extra.

27:40 The one extra: so one of the things I've been doing is, well,

27:47 you know, I paused Python People a long time ago, and I totally stopped doing Test & Code.

27:53 I'm focused on this show and focused on work, but I'm also writing more.

27:57 And one of the things I'm doing is writing a book on test driven development.

28:01 And I don't have an announcement yet, but in the next couple of weeks, I think I might

28:06 have an announcement.

28:07 But if you'd like to know what I'm doing: I guess I'm going to write it.

28:11 I'm going to write the book, but to motivate me and to make people not have to wait,

28:16 I'm doing it like, you know, Manning does the early access books and

28:22 Pragmatic has beta books.

28:24 I'm kind of doing that.

28:25 I'm trying to do a lean startup approach applied to books.

28:29 For every chapter, I'm going to do a rough first draft,

28:34 clean it up a little bit, and then release it.

28:38 And then as I go along, incorporate feedback from people.

28:41 And then once the whole thing is good and polished, I might bring on an editor or something

28:46 or maybe just release it.

28:48 But if you'd like to know more about that, I'll announce it here.

28:52 But also, if you'd like to know the minute it's available,

28:56 join the newsletter over at Python Test, and I'll let you know.

28:59 Yeah, we really do appreciate when you join our newsletter.

29:01 You might think, oh, we can just announce these things

29:03 on social media or on the podcast, but it's not the same people.

29:07 A lot of people miss it because they miss an episode,

29:09 or social media is just a screaming feed of stuff.

29:12 It really makes a difference.

29:13 So join the newsletter.

29:15 Also, I want to talk about there's newsletters and then there's newsletters.

29:20 So this isn't really – I don't do a weekly eight tips on testing.

29:24 That would be cool.

29:25 I just don't have time to do that.

29:28 So it's mostly announcements if there's something to keep track of.

29:32 But people like you and me and others, we're using things like – I'm using Kit, but there's others.

29:39 I've used other email things before.

29:41 I don't keep track of who's subscribed.

29:43 So I'm not going to sell this to anybody or anything.

29:45 It's just an announcement thing.

29:47 And you can unsubscribe anytime.

29:49 And I don't even know.

29:50 Like once you're unsubscribed, you're gone.

29:52 I don't have the access to that.

29:53 So there are people that abuse newsletters, but I don't think there are a lot of people in the Python tech space that do, though.

30:00 So it's not spammy.

30:02 Anyway, moving on.

30:03 Do you have any extras for us?

30:05 I actually have a couple.

30:07 Let's jump over and hear about it.

30:08 Very exciting news.

30:10 I am really happy with this new course that I created called

30:13 Agentic AI Programming for Python Devs and Data Scientists.

30:17 Cool.

30:18 Yeah, and the idea is basically, so we have these agentic AI tools

30:24 like Claude Code and Cursor and Juni and all of them,

30:28 but how do you actually be successful with them, right?

30:31 I've talked a few times about just insane productivity

30:34 and really good outcomes that I've had, and people are like, how are you doing that?

30:38 So this course is like a real world set of guidelines and examples and best practices

30:46 for working with agentic AI.

30:47 So we build, spend an hour building, like, a really detailed application.

30:52 But it also talks, for almost an hour, about guardrails and roadmaps

30:56 and how do you get it to do exactly what you want.

30:59 You're like, when I say build an app, don't just build the app, but build it with uv,

31:03 write pytest tests, format it with Ruff, with this TOML config.

31:07 You don't have to do any, you don't have to, but you know, like how do you get it to give

31:10 you what you want?

31:11 Not something in the vague general space of what you asked for.

31:15 You know what I mean?

31:16 So sort of a practical agentic AI programming course.

31:20 So I really, I'm getting like crazy amounts of good feedback on this course.

31:24 So people check it out.

31:25 The link is in the, in the show notes.

31:27 I think it's talkpython.fm/agentic-ai.

31:30 So that one's fun.

31:31 What else?

31:31 I'm also going to be talking with Hugo Bowne-Anderson.

31:37 He's running the Vanishing Gradients podcast.

31:40 He was on Talk Python a while ago, and now I'm going to be on his show,

31:44 Data Science Meets Agentic AI.

31:47 Sort of a follow-up from my Talk Python in production book,

31:50 plus this course I just talked about.

31:52 Those are those kind of ideas.

31:53 We're going to spend some time riffing on that.

31:54 So that's tomorrow night US time.

31:57 So check that out.

31:58 All right. Also, OpenAI introduced an AI browser, built on Chromium, called Atlas.

32:05 This is interesting.

32:06 I played with it.

32:08 I'm not convinced yet.

32:09 I mean, it's kind of fine.

32:10 I don't know.

32:11 I don't even know what to think about these things.

32:12 That's not why I included it.

32:13 Maybe it's sort of interesting as a side note.

32:15 But I linked to the Ars Technica article.

32:18 Holy smokes, you guys.

32:19 Check out the comments.

32:21 They are not loving it.

32:23 And it's not OpenAI's fault exactly.

32:25 It's just this idea of AI taking over everything.

32:32 Even if it's not everyone hating on it, there's a very vocal group of people who don't love it.

32:37 So, and I just noticed that Kyle Orland wrote this and he actually wrote a whole book on Minesweeper,

32:42 which, you know, props to him for that.

32:43 But check out the- - Oh, that's awesome.

32:45 I gotta check that out.

32:50 - I'm like, geez, it's been a long time since I thought about Minesweeper.

32:50 Anyway, this is actually just a really interesting cultural like touch point, I think.

32:54 So people can check that out.

32:55 All right, James Abel, one of the, I think he was behind PyBay this year, but yeah.

33:01 In the San Francisco area, he said, hey, somehow we were talking today about

33:07 if __name__ == "__main__". I have a package.

33:10 It's real, real simple.

33:11 But instead of writing this, if you want to get people started, you can just

33:15 import my thing and say "if is_main()" as a simpler, saner way. That maybe

33:20 should be a built-in, don't you think?

33:22 Python.

33:23 Yeah.

33:24 Or, here's what I would rather see, but

33:28 I don't know that Python has a mechanism for it.

33:30 I would like to say at the top something to the effect of: register this function as

33:36 main. And then when it's done parsing the file, it runs it. Because even this,

33:42 which is super helpful, is still a problem if you accidentally put code below it.

33:46 You know what I mean?

33:47 I would like Python to enforce that: load the whole file, and then it runs.

33:50 I'd like a cool mechanism for that.

33:51 But still, this is pretty nice.

33:53 And at first I thought it was just wrapping if __name__ == "__main__", right?

33:57 But in fact, it's doing something a little bit different here.

34:01 It's using the inspect stack to go back and then look at the caller's __name__.

34:06 Because obviously, it itself is not main, so it can't do that test.

34:10 So I don't know, kind of cute.
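The trick being described can be sketched like this; it's a hypothetical re-implementation of the idea, not the package's actual code. Because the helper lives in another module, its own __name__ is never "__main__", so it walks one frame up and checks the caller's globals:

```python
import inspect

def is_main() -> bool:
    """True when the calling module is being run as a script."""
    # The helper's own __name__ would be its package name, so it
    # inspects the caller's frame instead.
    caller = inspect.stack()[1].frame
    try:
        return caller.f_globals.get("__name__") == "__main__"
    finally:
        del caller  # avoid the reference cycle that holding frames can create

if is_main():
    print("running as a script")
```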

34:12 Yeah.

34:12 If you don't want to have a dependency on this library,

34:16 you could take that line of code and put it somewhere as a utility function in yours.

34:20 But, okay, I'm just going to throw this out there.

34:22 If you want to do something really fast, make sure you time this.

34:26 Because every time I've added inspect... I love the inspect library,

34:30 but whenever I add it, it slows things down.

34:33 Okay.

34:35 Just benchmark.

34:38 Benchmark it.

34:38 All right.

34:39 Benchmark it.
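Brian's point is easy to check. A quick comparison sketch (exact timings vary by machine and Python version) of `inspect.stack()` against the cheaper, CPython-specific `sys._getframe()`:

```python
import inspect
import sys
import timeit

def name_via_inspect():
    # Builds FrameInfo objects for the whole call stack: convenient, not cheap.
    return inspect.stack()[1].frame.f_globals.get("__name__")

def name_via_getframe():
    # CPython-specific, but grabs just the one frame it needs.
    return sys._getframe(1).f_globals.get("__name__")

n = 1_000
t_inspect = timeit.timeit(name_via_inspect, number=n)
t_frame = timeit.timeit(name_via_getframe, number=n)
print(f"inspect.stack(): {t_inspect:.4f}s for {n} calls")
print(f"sys._getframe(): {t_frame:.4f}s for {n} calls")
```

On a typical machine the `inspect.stack()` version is slower by a couple of orders of magnitude, which is exactly the "benchmark it" caveat.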

34:40 All right.

34:41 Inline.

34:42 All right.

34:43 If you're an IntelliJ person like PyCharm, wouldn't it be nice, Brian?

34:47 Wouldn't it be incredible if while you're working, a little pet could walk around in the bottom?

34:53 Anthony Shaw, I believe, is the one who added this to VS Code.

34:57 Yeah.

34:57 PyCharm people can love animals too.

35:00 So you could install the pet into your PyCharm now.

35:03 Oh, cool.

35:03 Nice.

35:04 I don't know that I'm going to be doing it.

35:05 This is a last-minute addition.

35:07 I just got a brand new M5 iPad.

35:10 How insane is that?

35:11 The last one I had, I got five years ago.

35:13 I'm like, it's probably time.

35:14 And I got a 13-inch iPad Pro, which is an insanely large iPad.

35:19 But I started using it as the second monitor for my laptop.

35:23 if I'm traveling or if I'm at a coffee shop or something.

35:26 And that's a super cool experience.

35:27 Just put them side by side.

35:29 If they're on the same network or the cable, they just do real, real low latency, dual monitor stuff.

35:35 And yeah, it's pretty neat.

35:36 So just a little shout out to that.

35:37 That's incredible.

35:38 It's like having a new laptop, a second laptop for the price of a second laptop.

35:43 Yes, it is.

35:44 But here's the thing.

35:46 Yes, I agree.

35:47 But it's also a really nice reading device, which I do a lot of reading and stuff.

35:50 And I was going to get one of the cheaper ones.

35:53 One of the things that drives me, I have a MacBook Air as my main laptop computer and

35:57 the same chip in my main computer.

35:58 I don't have a high end computer and it's plenty good for Docker stuff or programming

36:04 stuff, all these things.

36:05 It's totally fine.

36:06 But here's the thing.

36:07 The iPad Air, while great, its peak brightness is something like 500 or 600 nits.

36:13 And if you're working, like I like to work in my back outside area in the summer, like

36:17 that's kind of not relevant right now.

36:18 But in general, you know, sit outside, enjoy the weather, get out of the office.

36:22 And if it's at all bright through a window or somewhere, it's like really a pain.

36:26 This thing is like the best screen you can buy on a Mac, period, whatever.

36:30 And it's a thousand nits.

36:32 So you could push the computer to the side and just put the laptop in front of you and

36:36 type on it.

36:36 It's really nice.

36:37 Anyway, that's the main reason.

36:39 I want a brighter screen without having a MacBook Pro.

36:41 I feel so bad for you having to deal with like working outside in such bright light.

36:47 It's really horrible.

36:48 I know.

36:48 You should really.

36:50 It's hard.

36:50 It's hard doing me.

36:52 All right.

36:54 Carrying on with the jokes.

36:55 No, go ahead.

36:55 Well, before we get to the joke, I wanted to, I guess, highlight, going back to my announcement

37:01 of possibly a book.

37:05 Japanol7 says, a TDD book by Brian.

37:08 Can't wait.

37:08 Also, Talk Python training courses are great.

37:11 Kudos.

37:12 Yeah.

37:12 I'm awesome.

37:13 Yeah.

37:13 I'm looking forward to getting that book out, and I'm looking forward to that AI course

37:17 of yours.

37:18 Yeah, same.

37:20 All right.

37:21 I'm not looking forward to surgery, I'll tell you what.

37:23 And it's getting to be weird, Brian.

37:25 I mean, like doctors using AI and stuff.

37:28 Actually, we probably are going to get better diagnoses as for certain things.

37:32 Oh, dear.

37:33 But here's a surgeon situation.

37:35 There's a person who just, they're in post-op, okay?

37:38 They're laying there like, oh, man, a little woozy from the anesthesia coming out of it.

37:42 And the doctor, which is a robot with a ChatGPT-like, I don't think it's, is it the same? You

37:48 know, some AI logo for a robot face. And the patient says, but why is the scar on the left

37:55 if the appendix is on the right? The AI surgeon says, you're absolutely right, let me try that one

38:00 more time. Please don't try it one more time. It's so bad. It's pretty bad. It's pretty funny. That

38:10 actually drives me nuts, when I'm like, this doesn't sound right, is this, you know.

38:15 The user is pointing out I've made a mistake.

38:17 Oh, you're right.

38:18 Yeah.

38:19 Oh, well.

38:20 Yeah.

38:22 So, yeah, just get a second opinion.

38:25 So if OpenAI is going to operate on you, have Anthropic be the backup, I guess, is the moral of the story.

38:31 I don't know.

38:33 I don't think that's the moral of the story.

38:34 You don't think so?

38:36 Okay.

38:36 Maybe somebody trained in medicine.

38:39 Trained on medicine?

38:40 I'm sure they are.

38:43 Trained with heavy medication.

38:45 Yeah, exactly.

38:47 All right, cool.

38:47 Well, fun as always.

38:49 Well, definitely fun as always.

38:50 And we'll see everybody next week.

