Transcript #374: Climbing the Python Web Mountain
00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.
00:05 This is episode 374, recorded March 11th, 2024.
00:11 I am Michael Kennedy.
00:12 And I'm Brian Okken.
00:13 And this episode is brought to you by Scout APM.
00:16 Check them out at pythonbytes.fm/scout.
00:22 I'll tell you more about them later.
00:24 If you'd like to connect with us, Mastodon is the main place.
00:27 We're also on X Twitter.
00:29 But primarily on Mastodon.
00:31 And that's at mkennedy, at brianokken, at pythonbytes, all at fosstodon.org.
00:36 And watch us live, usually Tuesdays at 10 a.m. Pacific time.
00:40 However, different today.
00:42 But check it out at pythonbytes.fm/live.
00:44 You'll be able to get notified about the next upcoming recording.
00:47 I usually put that up right after we're done with this one.
00:50 Brian, before we jump into your first topic, I want to just take a moment and appreciate the beginning of summer.
00:59 First step towards summer coming.
01:01 It is daylight savings switch over here in the U.S.
01:05 And I know it is much maligned by people.
01:07 But I was delighted to see the sun was up past 7 p.m. yesterday after living in dark and rain for months and months.
01:14 I'm feeling the summer.
01:16 I know it's not quite there yet.
01:18 Yeah, I didn't really notice it until after I woke up.
01:21 Yeah.
01:22 I woke up and I'm like, oh, gosh.
01:25 I thought I'd slept into like 10, but it was really 11.
01:28 So sleep the day away.
01:32 Oh, beautiful.
01:32 Beautiful, beautiful.
01:33 All right.
01:34 Well, over to Python.
01:35 What's your first thing?
01:36 Okay.
01:36 Well, I want to talk about spaghetti code a little bit.
01:39 Actually, or how to fix it.
01:41 So there's an article.
01:43 Is that what you get when you're real hungry?
01:44 You're like, what are you going to have for dinner?
01:46 Maybe some spaghetti code.
01:47 A little carbonara.
01:48 Is that how you pronounce it?
01:51 I've never been able to pronounce it.
01:53 I'm probably going to get email that it's not, but that's how I pronounce it.
01:56 Article from somebody that goes by Piglei.
01:59 Six ways to improve the architecture of your Python project using import linter.
02:04 And I kind of like this article, even if you don't want to use import linter.
02:08 So because of the six ways to improve the architecture, I love that.
02:12 I love digging into that.
02:14 So, and I actually have a couple of projects that I'm working on that I would like to maybe
02:19 clean up the architecture a bit.
02:20 So kind of a fun picture.
02:22 Not sure what's going on here.
02:23 Just a whole bunch of boxes pointing at each other.
02:25 I guess just indicating that there's circular dependencies or something going on.
02:33 And anyway, so a little bit of a discussion about why complex architecture might be bad,
02:40 but I don't think any of us need convincing of that.
02:43 Simplifying is a good thing.
02:44 So the first part talks about maybe setting up an idea of like, maybe you could, you have
02:49 an idea of like setting your architecture into layers.
02:52 If that's what you want to do, like a user layer or an application layer plus utilities or something services.
03:01 But there's a tool called import-linter, and there's a discussion about how to set it up so that you can both configure what you want your layers to be and then test for it.
03:15 It's like a linting step to make sure that you're not importing in the wrong direction.
03:19 So the idea would be, like, you want the top-level one to be calling the lower-level ones, but not the other way around.
03:26 And import-linter is a way to test for that.
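To make the direction idea concrete, here's a minimal sketch of the kind of import a layered contract would flag. The project and module names are hypothetical, not taken from the article, and this is a two-file layout sketch rather than one runnable script.

```python
# Hypothetical layered project: ui (top) -> services -> data (bottom).

# myproject/services/orders.py
from myproject.data import orders_repo   # fine: a higher layer importing a lower one

# myproject/data/orders_repo.py
from myproject.ui import order_views     # flagged: a lower layer importing a higher one
```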
03:31 I'm not sure what these arrows mean because it doesn't quite make sense if it's not inheritance or calling.
03:36 I don't know.
03:37 Anyway, it's just an arrow.
03:38 That looks pretty interesting.
03:40 I think that means that something from that layer is importing something from the bottom layer, the lower layer.
03:45 Yeah, but yeah.
03:47 Okay.
03:48 I think the thing is, you can have imports within a layer and imports going downward, but it would be an error or a warning if, say, the data layer imported the UI layer, sort of thing.
04:01 I think that's what it's saying.
04:02 Yeah.
04:02 Oh, and it would make more sense if they had drawn the top layer at the top.
04:07 I think it makes more sense to me if we do it the other way.
04:10 Anyway, that's a drawing thing.
04:12 It doesn't really matter.
04:13 And also-
04:14 It's probably by somebody in Australia, because everything is upside down down there.
04:17 Oh, yeah.
04:17 Yeah, that makes sense.
04:18 Yeah, yeah, that was it.
04:19 Okay.
04:20 Okay.
04:20 So what happens when you run this is the recommendation is you're probably going to get a bunch of lint warnings or errors.
04:29 Or maybe not.
04:31 If not, awesome.
04:32 But if you do, the recommendation here, which I kind of thought was cool, was to kind of exclude all of them.
04:41 Like go through and add these ignore imports and just go through and ignore the ones that failed.
04:47 And I'm like, okay, well, then you're going to pass and it's just going to ignore everything.
04:52 But the idea behind it is to do it one at a time.
04:55 So it's going to be overwhelming to get a bunch of errors.
04:57 So ignore them all.
04:58 And then un-ignore one at a time and then go through and fix them.
05:04 So the reason why I picked out this article isn't really because import-linter is cool.
05:10 It might be cool.
05:11 But it's these ways to fix these import dependencies I thought were really great to walk through.
05:19 There's six of them.
05:20 First one is merging and splitting modules.
05:23 And there's an example where you've got resources that calls clusters.
05:27 Clusters calls resources.clusterutils.
05:30 And that's going the wrong direction.
05:33 But to fix it, you maybe move those clusterutils down into the lower, like a lower library, like splitting them off.
05:39 Kind of sort of straightforward, but it's good to like think about this.
05:45 So I like that part of it.
05:47 Dependency injection.
05:48 And if you're the kind of person that's kind of afraid of dependency injection, then don't even think about that term.
05:55 Because it's not actually that scary.
05:57 Just the term was scary to me for a while.
05:59 So anyway, dependency injection might help to be able to pass dependencies down into lower layers from upper layers.
06:07 So that's a good way to get around it.
06:11 The interesting, I'm glad they dug into this, is when you do the inversion of, what is it?
06:18 I forget, the inversion of control.
06:20 Inversion of control, yeah.
06:20 Yeah.
06:22 Sometimes with Python, that works great, except for type hints.
06:26 Sometimes you need to do some work to get the type hints right.
06:31 So there's a discussion about how to use Protocol from typing to get that to work right.
06:38 That discussion's in here, which is great.
06:40 I love protocol.
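As a rough sketch of what that Protocol-based dependency injection can look like in plain Python (all names here are made up for illustration, not taken from the article):

```python
from typing import Protocol


class MessageSender(Protocol):
    """Whatever the upper layer hands us only has to have this shape."""

    def send(self, to: str, body: str) -> None: ...


class SmsSender:
    """Lives in a higher layer; satisfies MessageSender structurally, no inheritance needed."""

    def send(self, to: str, body: str) -> None:
        print(f"SMS to {to}: {body}")


def notify_user(sender: MessageSender, user: str) -> None:
    """Lower-level code depends only on the Protocol and never imports the SMS module."""
    sender.send(user, "Your order shipped")


# The dependency is injected from above, so the import arrow points the right way.
notify_user(SmsSender(), "+1-555-0100")
```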
06:41 Then another discussion about using simpler dependency types, and then delaying function implementation.
06:48 And then even, what was the last one?
06:52 Look at 4.6, configuration driven.
06:54 That was interesting.
06:55 I've never really thought about doing this.
06:57 The idea is you'd have a settings file or something, and in it you'd have the string that you would import, like marketing.sendSMS.
07:08 And then later on in your calling thing, you use import string to import it.
07:14 So you still have the backwards dependency, but you don't know it at compile, at write time.
07:22 You know it at run time.
07:22 I don't know.
07:23 I don't know what I think about that.
07:24 But anyway.
07:25 Yeah, that's interesting.
07:27 It's a way to do it.
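Here's a minimal sketch of that configuration-driven idea using just the standard library. The article apparently uses an import_string helper (Django and Werkzeug both ship one); importlib gets the same effect, and the setting value below is a stand-in, not the article's example.

```python
import importlib

# Imagine this string living in a settings file instead of being hard-coded.
NOTIFY_FUNC = "json.dumps"  # stand-in for something like "marketing.send_sms"


def load_by_path(dotted_path: str):
    """Resolve 'package.module.attr' to the actual object at run time."""
    module_path, _, attr = dotted_path.rpartition(".")
    return getattr(importlib.import_module(module_path), attr)


send = load_by_path(NOTIFY_FUNC)
print(send({"to": "+1-555-0100", "body": "hi"}))  # resolved at run time, not at import time
```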
07:28 And then the last one is replace function calls with event driven approaches.
07:34 So like callbacks and stuff like that.
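And a tiny sketch of the event-driven version: instead of the lower layer importing the upper layer to call it back, it publishes an event and whoever cares subscribes. Again, all names are made up for illustration.

```python
from collections import defaultdict
from typing import Callable

# A very small in-process event bus.
_subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)


def subscribe(event: str, handler: Callable[[dict], None]) -> None:
    _subscribers[event].append(handler)


def publish(event: str, payload: dict) -> None:
    for handler in _subscribers[event]:
        handler(payload)


# The upper layer registers interest; the lower layer never imports it.
subscribe("order.shipped", lambda payload: print("email the customer:", payload))

# The lower layer just announces what happened.
publish("order.shipped", {"order_id": 42})
```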
07:37 So these are great architecture things to think about when you're cleaning up architecture.
07:41 Because it's not always trivial to just say, just don't do that.
07:45 Don't do those imports.
07:46 Well, how do you get around it if you've designed it that way?
07:48 So I think a decent discussion about software architecture here.
07:52 I do think that's pretty interesting.
07:54 Interesting.
07:55 And also, just more broadly, I like the idea of ignoring all the issues you might run into
08:01 with linters and then slowly turning the screws to make it a little tighter is good advice.
08:05 Because if you have existing code of any significant size and you've never done linting on it,
08:11 and then you turn on a linter?
08:12 Yeah.
08:13 You feel bad about yourself.
08:14 You're like, I've done all that wrong?
08:17 93 errors?
08:18 Great.
08:18 Great.
08:18 Well, we'll turn that back off because I got work to do.
08:21 You know what I mean?
08:21 Yeah.
08:22 Well, I'm kind of going through that with Ruff right now.
08:26 So I've got some projects where default Ruff works fine, but Ruff has like 800 rules or something you can add.
08:33 And so I'm trying to turn on some of those extra features.
08:37 I have just a few at a time.
08:39 See, testing the waters.
08:40 See how many failures I get and whether I want to clean them up.
08:44 So, yeah.
08:44 Yeah.
08:45 Interesting.
08:45 Just interesting timing.
08:48 I'm going to be speaking with Charlie Marsh on Talk Python tomorrow.
08:51 Oh, awesome.
08:52 About 24 hours from now, whatever that makes it for you.
08:55 Yeah.
08:56 Out there listening.
08:56 Yeah.
08:57 And we're going to talk mostly about UV, but I'm sure we'll talk a bit about Ruff as well.
09:00 Cool.
09:00 Well, I'm going to release a episode with Charlie Marsh today.
09:04 Well, how about that?
09:06 Awesome.
09:07 Awesome.
09:07 Do you guys talk about UV as well?
09:09 Yeah.
09:10 We talked about Flake8, or not Flake8.
09:13 We talked about Ruff and Astral and UV, most of the conversation was around UV and some of the discussion around that.
09:21 Nice.
09:22 Yeah.
09:23 That'll be fun.
09:23 All right.
09:24 On to the next one here by Pierce Freeman.
09:28 That's kind of a cool name, isn't it?
09:29 It could be like a 00 sort of agent.
09:31 Yeah.
09:32 009.
09:33 I don't know what that is.
09:34 Not James, but something like that.
09:36 Anyway.
09:36 Pierce Freeman to the rescue.
09:38 Exactly.
09:39 To bring you React and FastAPI and Python.
09:43 In fact, with this framework called Mountaineer.
09:45 So Mountaineer is a batteries included web framework for Python and React.
09:50 So this, it plays in a similar space as FastUI from Samuel Colvin and the Pydantic crew.
10:00 I don't know enough about them to know how truly similar they are.
10:05 I think this is coming from a different angle.
10:08 I think, you know, FastUI is more about maybe you don't really have to do the
10:12 React side unless you want.
10:14 And it kind of brings that stuff together.
10:15 Whereas this kind of like you're embracing TypeScript, you're embracing React, but you also get some
10:22 really nice Python integration.
10:23 And so if you look down here, what does it say?
10:26 It says it lets you easily build web apps in Python and React.
10:30 And if you are familiar with this, it should sound familiar to you and should basically seem like what you're used to.
10:36 And if not, then maybe not.
10:39 But it says each framework like Flask or FastAPI or Django or whatever has its trade-offs and features.
10:45 And for this one, it focuses on developer productivity above all else, with production speed a close second.
10:51 So type hints up and down the stack, front-end, back-end database.
10:54 Trivially easy client-server communication.
10:57 So you don't find yourself creating a bunch of APIs so that your React stuff can talk to the API so it can actually get its data and all that, which is pretty interesting.
11:07 It comes, this is kind of cool.
11:09 So one of the things you can run into, I'm sure you've seen this, Brian, it drives me absolutely bonkers.
11:14 You'll go to a website and it'll have something on the screen and then like half a second later, it'll shift around and stuff will all come into existence.
11:23 You know, it'll be like the footer is touching the top, and then it'll bump out, there'll be a spinner, and then stuff.
11:29 You know, it's like, what is it doing?
11:31 Well, especially if you start reading and then a picture pops in and what you read pops off the screen.
11:37 Yeah.
11:38 Another thing that drives me crazy is if you paste something into an input, but you don't type it, sometimes that won't trigger the data binding.
11:47 And, you know, like if it's a password manager type thing or something, or you just paste something, it'll say, oh, this is invalid.
11:55 What is wrong with this?
11:56 Oh, if you put a space and delete the space.
11:57 Oh, oh, it's valid now.
11:59 It's like, uh-huh, I see.
12:00 So something that would be really nice is if you just could ship straight HTML, right?
12:05 Yeah.
12:06 Or the first view of that was normal.
12:08 And so what this comes with is they actually bundle the V8 engine.
12:12 So on the server side, it can render what your browser would do for you as final HTML and it delivers.
12:19 So, optimized server-side rendering for better accessibility and SEO.
12:24 That's pretty cool.
12:25 Yeah, that is cool.
12:26 As a Python thing.
12:27 Like normally you hear that as a node thing or something, right?
12:29 Yeah.
12:30 It also does static analysis for validations of like links and CSS and so on.
12:35 And like I said, skip the API or Node.js server just for front-end clients.
12:41 Okay.
12:41 So all of these things are pretty cool.
12:43 If, let me give you a warning and a disclaimer.
12:46 One, if you don't know React real well, I don't think you'll really appreciate the benefits here that much.
12:51 Like this is really for React people 100%.
12:53 Second, that's not me.
12:55 So I'm going to do my best to like tell you why you might want this.
12:59 I think I can kind of get there.
13:00 So first of all, it has a scaffolding type thing called Create Mountaineer app.
13:06 And they suggest pipx.
13:08 And I'm loving to see more pipx come along for these kinds of tools, right?
13:12 Like you run this once.
13:13 It's not part of your app.
13:15 You just pipx install it and everything gets going.
13:17 It's great.
13:17 Also uses poetry pretty heavily.
13:19 So what it does is it creates the Python bits and then some TypeScript stuff for your front-end.
13:27 And it comes with a CLI as well, which is nice.
13:29 It has built-in Docker containers for like managing Postgres databases and so on if you want to use that,
13:35 but you don't have to.
13:36 And let's see.
13:37 So another interesting integration is it uses SQLModel.
13:41 So that's the typing and the data layer aspect.
13:44 So SQLModel is basically Pydantic plus SQLAlchemy, by Sebastian Ramirez.
13:49 So that's cool.
13:50 And then you go down here and you create controllers.
13:54 So it's kind of a class-based type of thing.
13:56 At first, it seems a little unnecessary, but as you interact with it and expose more features to the React layer,
14:03 you'll see how that's relevant there.
14:05 So you just say, I'm going to render, say, my database things and some other pieces of data.
14:10 And then down somewhere, you've got your React code and your TSX, your React component there.
14:18 And if you've written React stuff, you should know pretty well how it works.
14:22 But it manages passing all that data over like server state dot to-dos.
14:26 Then you can just work with that, which is interesting.
14:28 But it gets more interesting later if you say, let's add an async function like add one of my to-dos.
14:35 And they have a to-do example, right?
14:36 So if you put this at side effect decorator on it, here's where the React integration comes in.
14:41 That's pretty wild.
14:42 So it automatically generates typed,
14:46 let's just say typed TypeScript because, I mean, you can't have non-typed TypeScript.
14:50 Generally, TypeScript is typed.
14:52 It generates TypeScript versions of the functions that you write in Python, on the JavaScript side.
14:58 So you immediately can just start calling it and using those features, right?
15:02 So it kind of has this really tight integration with React, like as you would expect.
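To give a feel for the shape of it, here's a very rough sketch pieced together from this description. Treat the imports and names (ControllerBase, RenderBase, sideeffect) as assumptions about Mountaineer's API rather than verified code, and check the project's docs for the real thing.

```python
# Sketch only -- class and decorator names are assumed, not verified against Mountaineer's docs.
from mountaineer import ControllerBase, RenderBase, sideeffect


class TodoRender(RenderBase):
    # Whatever render() returns shows up as typed server state on the React side.
    todos: list[str]


class TodoController(ControllerBase):
    url = "/"
    view_path = "/app/todo/page.tsx"  # the React/TSX component this controller pairs with

    async def render(self) -> TodoRender:
        return TodoRender(todos=["record the show", "publish the notes"])

    @sideeffect  # gets a typed TypeScript wrapper generated for the React side
    async def add_todo(self, text: str) -> None:
        ...
```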
15:06 What else?
15:07 I think that's pretty much it.
15:08 But if you're a heavy React shop and you want a nice Python backend and you want a tight data integration between those, this is probably worth a look.
15:18 Yeah.
15:18 Neat.
15:19 Yeah.
15:19 Yeah.
15:20 It looks pretty neat to me too.
15:21 I'm not, like I said, I'm not really a React person.
15:23 So I'm not sure that I'm necessarily going to use it.
15:26 But if I were to start adopting React, I may well decide that that's what I want.
15:31 Speaking of cool things, Brian, how about Scout APM?
15:36 Let me tell you real quick about Scout APM.
15:40 They're big supporters of Python Bytes.
15:42 So we appreciate that very much.
15:44 So if you are tired of spending hours trying to find the root cause of issues impacting your performance, then you owe it to yourself to check out Scout APM.
15:53 They're a leading Python application performance monitoring tool, APM, that helps you identify and solve performance abnormalities faster and easier.
16:02 Scout APM ties bottlenecks such as memory leaks, slow database queries, background jobs, and the dreaded N plus one queries that you can end up with if you do lazy loading in your ORM.
16:12 And then you say, oh, no, why is it so slow?
16:15 Why are you doing 200 database queries for what should be one?
16:18 So you can find out things like that.
16:19 And it links it back directly to source code so you can spend less time in the debugger and tailing logs, and just find the problems and move on.
16:27 And you'll love it because it's built for developers by developers.
16:30 It makes it easy to get set up.
16:31 Seriously, you can do it in less than four minutes.
16:34 So that's awesome.
16:35 And the best part is the pricing is straightforward.
16:38 You only pay for the data that you use with no hidden overage fees or per seat pricing.
16:43 And I just learned this, Brian.
16:46 They also provide the pro version for free to all open source projects.
16:51 So if you're an open source maintainer and you want to have Scout APM for that project, just shoot them a message or something on their pricing page about that.
16:58 So you can start your free trial and get instant insights today.
17:02 Visit pythonbytes.fm/scout.
17:05 The link is in your podcast player show notes as well.
17:07 And please use that link.
17:09 Don't just search for them because otherwise they don't think you came from us.
17:13 And then they'd stop supporting the show.
17:14 So please use our link pythonbytes.fm/scout.
17:17 Check them out.
17:18 It really supports the show.
17:20 Awesome.
17:20 Awesome.
17:21 What's next, Brian?
17:22 Well, what's next is Guido van Rossum.
17:25 Oh, yeah.
17:26 So he's blogging a little bit lately.
17:30 And he put up a post called Why Python's Integer Division Floors.
17:37 And I think this is just an interesting little bit of history.
17:43 And this was a difference between Python 2 and 3.
17:47 I guess, no, integer division always did flooring.
17:51 But there was a kind of change in what one slash meant.
17:55 So if you do two slashes, it's always integer division.
18:00 If you do one slash, and both operands were integers in Python 2, it would be integer division, so you'd possibly not get the floating point result.
18:09 But in Python 3, one slash will generate a float.
18:13 Like, for instance, one divided by three results in a float.
18:18 But if you do integer division, it's not.
18:21 It's something else.
18:22 Is that why the 2 to 3 conversion was so hard?
18:24 I'm just kidding.
18:25 What was that?
18:26 It was strings.
18:27 Everyone knows it's strings.
18:28 So is that why the Python 2 to 3 story was so hard?
18:32 No, I'm just joking.
18:37 Yeah.
18:37 So if you do five, two slashes, two, so integer division of five divided by two, you get two.
18:46 And which is normal.
18:49 That's everywhere.
18:52 But if you do it, like, say, negative five, you don't get negative two.
18:57 You get negative three.
18:58 So there's a question.
19:00 So Guido said that he had a question about that, of why it's different, why it's like that.
19:06 And so what it's doing is, it does the division,
19:09 and then it goes to the closest integer toward negative infinity.
19:16 And there's reasonings behind that.
19:19 But it's different than C.
19:21 Apparently, I completely forgot what C did.
19:23 C does it closer to zero.
19:26 So negative five divided by two, you'd end up with negative two.
19:31 So the history there is that there was a choice to make.
19:37 And it works with the modulo operator as well.
19:44 Modulo will give you the remainder, where integer division gives you the integer quotient.
19:52 So A divided by B equals Q with remainder R, such that R is between zero and B.
20:00 There's math here.
20:03 But you just have a choice as to what you want to do, whether you want R, the remainder, to possibly be negative, or if you want R to always be positive.
20:14 And basically, Guido chose the one that looks nicer in math.
20:22 So that's why we do a floor division instead.
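A quick worked example of what that choice means in Python, where // floors toward negative infinity and % gives a remainder that matches the sign of the divisor:

```python
# Floor division rounds toward negative infinity, not toward zero.
print(5 // 2)    # 2
print(-5 // 2)   # -3  (C-style truncation toward zero would give -2)

# The matching modulo keeps 0 <= r < b for a positive divisor b...
print(-5 % 2)    # 1

# ...so the identity a == (a // b) * b + (a % b) always holds.
a, b = -5, 2
assert a == (a // b) * b + (a % b)
```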
20:27 The interesting take on this, I thought, was why did C choose the other way?
20:34 And that's the part that I thought was really interesting.
20:37 And his answer is, or his guess is that, well, C was doing it way before Python was.
20:44 And C was doing it on hardware that it may have been easier to do division closer to zero instead of floor division.
20:54 And part of that reason might be because some of the early hardware architectures were using sign plus magnitude rather than two's complement, which, I didn't know that.
21:05 Or if I did know it from CS, I forgot it.
21:10 I definitely do remember two's complement, though.
21:12 But anyway, some interesting history there.
21:14 It's interesting.
21:16 Also, it's kind of a silly little article to bring up.
21:20 But one of the reasons I wanted to bring it up is because a lot of new Python people will forget about integer division and just assume that division is division.
21:34 But integer division is really handy for a lot of cases.
21:37 So don't forget about it.
21:38 That's true, actually.
21:40 Yeah, I find myself sometimes doing int of some float result.
21:43 Maybe I could just double slash it and not have to.
21:46 So comment from the chat, which I'm not sure how to take this.
21:51 I'm thinking that they're bored with this topic, but.
21:53 I'm not sure either.
21:56 Anyway.
21:58 No comment.
21:59 All right.
22:01 We're going to bring the hatchet out on this one, Brian.
22:03 Bring in the hatchet.
22:04 So there's this thing I ran across called hatchet, which is a distributed task queue for more resilient web apps.
22:13 Now, I don't recall exactly what it is written in.
22:17 I don't think it's written in Python.
22:20 It's unclear.
22:21 Its primary language is Python, but it also has Go.
22:24 Anyway, it doesn't matter.
22:26 It has a Python SDK.
22:26 Its first SDK, its first integration, is with Python.
22:32 So here's the idea.
22:34 You've got some work.
22:36 It's going to take a little while to run.
22:39 You know, your web app says, hey, I want you to ship this thing or I want you to run some analytics.
22:46 And if those analytics take 10, 15 seconds to run, maybe they're computational and they should run out on another computer rather than on your main web front end.
22:56 Or they shouldn't block it for 15 seconds or whatever.
22:58 When you were talking about how do you break up import dependencies across the wrong layers and stuff, maybe one way to fix that is to move some of the compute work completely to its own place.
23:09 Right.
23:09 Like, this whole queuing mechanism is super fascinating for creating a truly scalable, multi-user thing.
23:17 Right.
23:17 So if the majority of your work is in some place and you kind of don't really need the answer to give a response, you can just throw it onto one of these background queues and let it go.
23:27 So one of the problems that you run into, though, is what if something goes wrong or how do I see what is running?
23:35 Something fails.
23:36 Can I resume it?
23:36 Maybe it was really important, like ship this thing to the customer.
23:39 But when I put that on the queue and it finally got around to being run, the API at UPS was down for who knows, whatever reason.
23:48 Right.
23:48 Things like that.
23:49 And you can't ship it.
23:50 Does it then just go into the ether as an error, or can you resume it?
23:53 Another problem you run into is fairness.
23:56 Right.
23:57 If there's a ton of work coming in faster than the processor can handle.
24:01 Is there something where maybe only the new ones get worked on and the older ones get almost abandoned?
24:07 Right.
24:07 There's all these interesting things.
24:09 So this thing called hatchet is in the realm of many different things that attempt to solve this problem.
24:14 Right.
24:15 It's interesting.
24:16 It's interesting.
24:16 It's Y Combinator backed.
24:18 It's a company that presumably will have a price.
24:22 But there's also just an open source, self-hosted, you know, take-it-for-yourself-and-run-with-it version over on GitHub.
24:29 So 100% open source.
24:30 2000 or 2200 GitHub stars.
24:33 Pretty interesting.
24:34 But I think the business model is it says request cloud access.
24:37 Like they can run it for you or you can run your own.
24:40 Right.
24:40 Yeah.
24:41 So the website has a bunch of these little animations, which are kind of cool.
24:45 Talks about fairness, batch processing.
24:48 And like as you click on them, see how it like has these little fun, fun animations.
24:52 Like what does this mean?
24:52 Oh, I see.
24:53 Oh, I love it.
24:54 I do too.
24:55 I do too.
24:56 So talks about fairness, batch processing, workflow and event stuff.
25:02 It's engineered for scaling challenges, which is pretty awesome.
25:06 So low latency, 25 milliseconds on average, which means if you put something into the queue, there are ways in which you can sort of make callbacks to check on the progress of the work, and it shouldn't take all that long.
25:19 Bunch of rules about rate limiting.
25:21 Also durable.
25:23 So you can replay events.
25:25 You can do cron jobs and say, hey, every every morning, just drop this into the queue and run it at 7 a.m. or whatever.
25:32 You can schedule one time jobs.
25:34 It has ability to avoid being destroyed by spikes.
25:38 Like if for some reason a whole bunch of work comes in all at once, maybe you've got IoT things that do a bunch of work when people come into the office, and at first it sees a bunch of stuff and then it chills out.
25:49 You could smooth that out.
25:51 All kinds of stuff.
25:51 But like I said, it supports three technologies.
25:54 Python number one, TypeScript number two, and Go number three.
25:58 And it's really easy to do.
26:00 You just go down here and you just put a, you know, when this event happens, I want you to run this class.
26:06 And then it has a bunch of functions methods.
26:08 You say, here's a step.
26:09 Step one, do this work.
26:11 Step two.
26:11 So you just basically put some decorators on a class and then plug it in and off it goes.
26:17 Super easy.
26:17 And to run it, you just say, push an event, whatever the name is, with the data, and it'll go put it off in the background and run it.
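Here's roughly what that looks like. I'm going from memory of the Hatchet Python SDK's README here, so treat the exact names (hatchet_sdk, hatchet.workflow, hatchet.step, the event push call) as assumptions and check their docs before copying anything.

```python
# Sketch from memory of the Hatchet Python SDK -- names and signatures may not be exact.
from hatchet_sdk import Hatchet

hatchet = Hatchet()


@hatchet.workflow(on_events=["report:requested"])  # hypothetical event name
class ReportWorkflow:
    @hatchet.step()
    def gather_data(self, context):
        return {"rows": 1234}

    @hatchet.step(parents=["gather_data"])  # runs after gather_data finishes
    def render_report(self, context):
        return {"status": "done"}


# Elsewhere, e.g. in the web app: push the event and let a worker pick it up in the background.
hatchet.client.event.push("report:requested", {"customer_id": 42})
```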
26:23 So, yeah, it also has like nice visualizations.
26:26 Like here you can see there's this on hatchet.run.
26:29 You can see there's this like live view of how is the work running through the system and all kinds of stuff.
26:35 So it looks pretty neat to me.
26:37 Open source.
26:38 People can check it out.
26:39 It's worth knowing about.
26:40 Yeah, it's pretty cool.
26:41 Neat.
26:42 Yep.
26:42 Indeed.
26:43 Indeed.
26:44 That's it for our items.
26:46 How are you feeling?
26:47 How extra are you feeling?
26:48 I have only got one extra, and it's one that I pretty much mentioned already.
26:52 So I'll do it quickly.
26:53 So Python test is at 215 right now as we look.
26:57 But the most recent episode that will come out probably today is 216, which will be Charlie Marsh talking about UV.
27:06 So check that out also.
27:08 Awesome.
27:09 Yeah, you did mention that.
27:10 That's really good, though.
27:11 How about you?
27:12 Got any extras?
27:13 I'm feeling somewhat extra.
27:14 I got two exciting announcements.
27:16 One, I have a free new course over at Talk Python Training that covers a bunch of awesome technologies.
27:23 But the core idea, the title is build an audio app with AI, with Python and Assembly AI.
27:31 So the idea is what would you do if you, say, had access to, I don't know, a podcast's worth of data that's been going for many years?
27:41 Like Python Bytes or Talk Python.
27:43 Honestly, the thing we build lets you access a whole library of podcasts.
27:48 And you can go in there and do things like, hey, create a transcript, which seems kind of straightforward.
27:52 But once you have transcript data, you could get really cool search, like building your own custom search engine, not just over the title and the show notes and stuff, but also all the spoken words, which is kind of neat.
28:03 So you could create a summary.
28:07 What are the key moments of this?
28:09 And actually, what if I could just have a Q&A with like you and me around what we said in the show?
28:15 So kind of creating an LLM ChatGPT type of thing, but where it knows about any given episode out on the Internet.
28:24 So really fun.
28:25 People will learn FastAPI.
28:27 They learn Pydantic.
28:28 They learn HTMX.
28:30 They learn Beanie.
28:31 And they learn Assembly AI and build a cool thing.
28:34 And the whole course is like a four-hour free course.
28:36 So they can check that out.
28:37 Wow, neat.
28:37 Sounds fun.
28:38 Thanks.
28:39 And next to another new course, Rock Solid Python with Python Typing.
28:45 So this one is not a free one, but it basically shows you not just the how, but the why and when of Python typing.
28:54 A bunch of different examples.
28:56 Obviously, the language, but things like FastAPI and Pydantic, how do they use it?
29:02 What you talked about protocol before.
29:05 What is protocol?
29:05 Where does it fit?
29:07 A bunch of design patterns and guidance for Python typing and how to think about how you should use it.
29:13 So people should also check this one out.
29:15 I'm really proud of this one.
29:16 And it is also around four hours, 4.4 hours.
29:19 And Pradeep asks out there, are these courses paid or free?
29:22 Yes.
29:23 The Build an Audio app course is 100% free.
29:27 You just have to create an account.
29:28 The Rock Solid Python course is 49 bucks.
29:31 Yeah, but that's like 10 bucks an hour.
29:33 That's almost free.
29:34 Yes.
29:36 It's certainly not a lot of money, you know, compared to other ways you might go learn about things like this.
29:42 So anyway, both of these courses are awesome.
29:44 People should check them out.
29:45 So those are the two big announcements.
29:48 I also have a couple of interesting things I just want to give a quick shout out to.
29:51 Previously, we spoke about Dokku.
29:54 Dokku?
29:54 I'm going to go with Dokku because I think it's based on Docker.
29:56 I don't know.
29:56 Yeah.
29:57 Dokku, a platform as a service alternative to Heroku.
30:04 We already spoke about that.
30:05 But when we did, Ray out there on Mastodon, thank you, Ray, said, hey, you guys, love the episode on Dokku.
30:12 Haven't tried it myself.
30:14 I'm a big fan of Heroku.
30:15 However, I set up Coolify.
30:18 Coolify.
30:19 I haven't talked about this, have I?
30:21 I don't think so.
30:22 Yeah, I don't think so.
30:23 Okay.
30:23 So Coolify is kind of the same.
30:27 So it's pretty similar, but it has a nice GUI to configure everything and keep an eye on the status
30:32 and all those things.
30:33 So Coolify is self-hosting with superpowers.
30:36 It's a self-hosted alternative to not just Heroku, but also Netlify and Vercel.
30:42 So Netlify for static sites.
30:44 You basically set this thing up, get it going.
30:47 It'll run any language on basically any server.
30:51 You just push to some Git branch, you deploy it.
30:54 It does automatic SSL certificates.
30:57 So say I just had a static site,
30:59 and I just want it to run over SSL on the internet.
31:01 Boom, done.
31:02 It just does Let's Encrypt automatically as part of creating the app up there and gets it going.
31:08 So it's probably a little bit of setup to get it going and get it running on Docker and stuff.
31:13 But once you do, it just becomes the substrate for all of your apps that you want to put out there.
31:19 And you don't have to think about anything but Git basically.
31:21 That's pretty cool actually.
31:23 Yeah, it looks really, really nice.
31:24 So people can check this out.
31:27 Yeah.
31:28 It has a paid cloud version and a self-hosted version with 17,000 or more people using it self-hosted.
31:37 So that's pretty cool.
31:38 It's interesting that they give stats there.
31:39 But that's my last one.
31:41 I think this is really neat as well.
31:42 So Ray, thanks for sharing extra details.
31:45 Yeah, nice.
31:46 Cool.
31:47 Shall we close it off with a joke?
31:48 Yeah, let's.
31:50 Yeah.
31:50 Okay.
31:51 So speaking of, I want to run my stuff on production.
31:55 How do I do it?
31:56 This is a great, a great one.
31:57 Back to workchronicles.com.
32:00 And so I don't know how you feel about this, Brian, but I think it's pretty true.
32:04 There's two engineers talking.
32:05 He says, oh no, I broke production.
32:07 Will I get fired?
32:12 The more senior developer looks over.
32:12 If we fire engineers who break production, we will need to fire everyone eventually.
32:17 Yeah.
32:19 Yeah.
32:20 Also, if you fix it, you'll get a promotion or a raise or something.
32:25 So there you go.
32:27 Yeah.
32:27 Maybe you just need to keep your fix rate above your destroy rate.
32:32 Yeah.
32:33 Yeah.
32:33 There you go.
32:34 Well, that's what I brought for us today.
32:38 Cool.
32:38 I like it.
32:39 Yeah.
32:40 Excellent.
32:40 Okay.
32:41 Well, nice to kick off the week with a little Python Bytes.
32:45 Happy to do it early.
32:46 Yeah.
32:46 Good to see you.
32:47 And thanks to everyone who joined.