Brought to you by Michael and Brian - take a Talk Python course or get Brian's pytest book

#465: Stack Overflow is Cooked

Published Mon, Jan 12, 2026, recorded Mon, Jan 12, 2026
Watch this episode on YouTube
Play on YouTube
Watch the live stream replay

About the show

Sponsored by us! Support our work through:

Connect with the hosts

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 11am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form, add your name and email to our friends of the show list; we'll never share it.

Michael #1: port-killer

  • A powerful cross-platform port management tool for developers.
  • Monitor ports, manage Kubernetes port forwards, integrate Cloudflare Tunnels, and kill processes with one click.
  • Features:
    • 🔍 Auto-discovers all listening TCP ports
    • ⚡ One-click process termination (graceful + force kill)
    • 🔄 Auto-refresh with configurable interval
    • 🔎 Search and filter by port number or process name
    • ⭐ Favorites for quick access to important ports
    • 👁️ Watched ports with notifications
    • 📂 Smart categorization (Web Server, Database, Development, System)
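For context, the "port already in use" problem it solves is easy to reproduce with nothing but the standard library. This is just an illustrative sketch of detecting an occupied port, not how port-killer itself is implemented:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already accepting connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # connect_ex returns 0 on success, i.e. when a server is listening there.
        return s.connect_ex((host, port)) == 0

# Demo: occupy a port, then check it.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0 -> let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
print(port_in_use(port))        # True while the server holds the port
server.close()
```

A tool like port-killer goes further by mapping the port back to the owning process so you can terminate it.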

Brian #2: How we made Python's packaging library 3x faster

  • Henry Schreiner
  • Some very cool graphs presenting the benchmark data.
  • And then details about the various speedups
    • each being 2-37% faster
    • the total adding up to about a 3x speedup, i.e. shaving off 2/3 of the time
  • These also include nice write-ups about why the speedups were chosen.
  • If you are trying to speed up part of your system, this would be a good article to check out.

Michael #3: AI’s Impact on dev companies

  • On TailwindCSS: via Simon Willison
    • Tailwind is growing faster than ever and is bigger than it has ever been
    • Its revenue is down close to 80%.
    • 75% of the people on our engineering team lost their jobs here yesterday because of the brutal impact AI has had on our business.
    • “We had 6 months left”
    • Listen to the founder: “A Morning Walk”
    • Super insightful video: Tailwind is in DEEP trouble
  • On Stack Overflow: See video.
    • SO launched in 2008; its first month had 3,749 questions
    • In December 2025, SO had 3,862 questions asked
    • Most of its life it had 200,000 questions per month
    • That is a 53x drop!

Brian #4: CodSpeed

  • “CodSpeed integrates into dev and CI workflows to measure performance, detect regressions, and enable actionable optimizations.”
  • Noticed it while looking through the GitHub workflows for FastAPI
  • Free for small teams and open-source projects
  • Easy to integrate with Python by marking tests with @pytest.mark.benchmark
  • They’ve released a GitHub Action to incorporate benchmarking in CI workflows
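A minimal example of what that marking looks like (this assumes the pytest-codspeed plugin; with plain pytest the unregistered mark just produces a warning and the test runs normally):

```python
import pytest

def fib(n: int) -> int:
    """Iterative Fibonacci: the function we want to benchmark."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# With pytest-codspeed installed and running in CI, this test is also
# measured as a benchmark; without the plugin it is just an ordinary test.
@pytest.mark.benchmark
def test_fib() -> None:
    assert fib(10) == 55
```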

Extras

Brian:

  • Part 2 of Lean TDD released this morning, “Lean TDD Practices”, which has 9 mini chapters.

Michael:

Joke: Check out my app!

Episode Transcript


00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.

00:05 This is episode 465, recorded January 12, 2026, and I'm Brian Okken.

00:12 And I am Michael Kennedy.

00:13 And this episode is sponsored by us and you.

00:18 So thanks to everybody that does support us through Patreon supporters

00:24 and also through grabbing the pytest course or the Lean TDD book

00:30 or so many of the massively cool courses at Talk Python Training

00:35 and also Michael's fun book on deployment and stuff like that.

00:40 So if you'd like to, we're recording this for the podcast,

00:45 but also broadcasting it live.

00:47 If you'd like to join the live, you can join us at YouTube,

00:50 but you can just go to pythonbytes.fm/live and there's a link on how to get that

00:56 and what the timing is and all that.

00:58 Also, you don't need to take notes.

01:01 You can just sign up for our newsletters.

01:03 So go to pythonbytes.fm, grab the newsletter and we don't do very many spammy things.

01:09 We just let you know what we talked about and with all the links and a little bit of extra information.

01:14 Yeah, and we love growing that list.

01:17 It grows by a little bit every week and it's so fun to see.

01:23 And with that, Michael, what do you got for us first?

01:26 Let's talk about PortKiller.

01:28 So PortKiller is an interesting Mac app.

01:32 And if you're not on a Mac, I apologize.

01:33 I know it's on Windows as well.

01:34 It's on Windows as well.

01:36 So if you're on Mac or Windows, which is a good chunk of us, here is a cool application.

01:41 So let me just lay out the problem for you, Brian.

01:44 Either you've got 20 terminal tabs open or in one of your four IDE windows,

01:52 You've got your API running or you've got your web app running or something like that.

01:58 And you go try to run it again, not knowing that fact.

02:01 And it says, sorry, that port is already in use.

02:04 You're like, oh man, where is it now?

02:06 And that's on you to know where you ran it.

02:09 You know, I mean, there's a button to say run in your IDE versus running the terminal.

02:12 And so maybe just press the button.

02:13 You forgot, oh yeah, that thing was like one of the running things.

02:17 But since 2025, there's another reason this is a pain is some of these agentic AI things.

02:23 They're like, oh, let me start the server for you so I can test it.

02:27 And it'll like temporarily fire up the web server and make a request and like lose track of it and not kill it.

02:33 And it's not in your terminal.

02:35 It's not in your IDE.

02:36 It's just there somewhere in the background, hidden.

02:39 You can't get to it to shut it down.

02:41 You go to Task Manager or Activity Monitor and you see Python is, like, four things. Like, hmm.

02:47 Well, three of those are supposed to be running.

02:49 Which one do I kill, you know?

02:50 So it's just a real hassle.

02:51 And so this port killer solves that problem, not just for Python, but for applications

02:56 in general, right?

02:56 So it's a powerful cross-platform port management tool for developers.

03:01 It monitors ports, manages Kubernetes port forwarding, and even does integrated Cloudflare

03:08 tunnels.

03:09 So it's really, really slick, actually.

03:11 So what you do is, it's got a little notification section menu bar thing that drops down, and you can have multiple Cloudflare tunnels, which, we've talked about ngrok and I've talked about Rathole, like, Cloudflare tunnels are a thing like that.

03:25 Like put this on the internet so other people can get to it.

03:27 So maybe I can help debug or show off something, right?

03:30 Like if you're trying to debug a webhook, the only way to do that is online, right?

03:36 Because some other server is trying to get to your dev machine.

03:39 So these tunnels are super helpful.

03:40 We talked about that.

03:41 So that will actually manage these things.

03:44 Like, for example, you can go to, say, your Flask app, and I'm pretty sure you can, I don't see the UI exactly, but I think you can go to it and say, oh, expose this Flask application as a Cloudflare tunnel.

03:56 Boom.

03:57 It's off to the races.

03:58 That's pretty cool.

03:59 Yeah.

03:59 Yeah.

04:00 But it also shows you all the local ports that are open and you can go kill them.

04:05 So what you can do is you can actually go to either search for a process or a port.

04:09 You're like, it says that port, you know, 5000 or 8000.

04:13 Let's go with 8000.

04:14 It's a little safer.

04:15 8000 is taken.

04:17 You can't run your app.

04:18 Why?

04:18 Just go in there and type 8000, find the thing, and press kill.

04:21 'Cause you're like, stupid Claude Code left that thing running and I can't find it to kill it.

04:25 So I'll do it here.

04:26 You know what I mean?

04:27 Or you can just see what ports are open, what's running, all sorts of stuff.

04:30 And I think it's super neat.

04:31 You install it on macOS via homebrew, or you can just download a DMG.

04:37 So that's pretty cool.

04:39 And yeah, I mean, that's pretty much what I got to say about it, but it is nice.

04:43 That's cool.

04:44 Yeah.

04:44 I don't... obviously, I'm not even sure why I'm commenting on this,

04:50 but I mostly have like one or two ports open when I'm working on a web app

04:56 because I'm not a heavy-duty web developer, but yeah, it looks neat.

05:00 Well, the thing is you might not have very many, but you can only have one app on that one.

05:04 And if it gets lost, especially if the little AI thing goes off

05:09 and then leaves it just dangling. You're like, oh man. Yeah, you've got a rogue agent you've got to hunt down.

05:14 Yeah, exactly. So even if you're just doing one, it's still valuable. Yeah, yeah, it's

05:19 pretty cool. Neat. Well, what am I talking about? Fine question. I'm going to talk about making

05:26 things faster and profiling. So Henry Schreiner sent this to us, and he's a

05:33 friend of the show and we're fans of Henry Schreiner's work. So he's a core developer. The article is

05:40 How We Made Python's Packaging Library Three Times Faster, and this is cool because packaging

05:48 is one of the heaviest-hit packages on PyPI, even though a lot of people

05:54 might not pip install it themselves. It's used by a lot of other stuff. And, let's see,

06:02 I often reach for packaging if I'm trying to compare versions.

06:07 At runtime, I'll say if I'm testing with something at 3.15

06:13 versus a different one or something like that, I'll often use packaging for grabbing versions.

06:17 Anyway, there's a bunch of fun stuff in the packaging package.
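That version-comparison use case, sketched with packaging's `Version` class (naive string comparison gets version ordering wrong):

```python
from packaging.version import Version

# Proper version comparison; plain strings compare character by character,
# so "3.9" sorts AFTER "3.15", which is not what you want.
print(Version("3.15") > Version("3.9"))    # True
print("3.15" > "3.9")                      # False: lexicographic comparison
print(Version("1.0rc1") < Version("1.0"))  # True: pre-releases sort before the release
```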

06:23 So how did they make it three times faster?

06:25 The article talks about graphing and plotting and how he's figuring this out for a while.

06:30 And I'm going to scroll past all of that and talk about the actual updates.

06:38 One of the things I love about this is we often hear about premature optimization is bad.

06:47 But there are times where you see stuff that's taking a while and you're like, I really wish this was faster.

06:52 And so there is a time for optimization.

06:56 And often it goes along something like this.

06:59 You put some profiling together, you try to measure the different parts of it, and then measure it and try to reproduce those, make sure they're consistent times so that they're not random.

07:10 But then you just do like 10% at a time.

07:12 So this is an interesting one of like 10% speed up of just stripping zeros out of stuff.

07:20 They had a reversed list, an itertools dropwhile to reverse it,

07:29 to strip the zeros.

07:30 And they just made it a little different, actually more readable and less elegant,

07:38 but it's nice.

07:39 So instead of one line, there's five lines of code with a for loop,

07:44 but it sped up the whole thing by 10%.
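The shape of that change, roughly (an illustrative reconstruction of stripping trailing zeros from a version tuple, not the library's exact code):

```python
from itertools import dropwhile

release = (3, 14, 0, 0)

# One-liner style: reverse, drop the leading (originally trailing) zeros,
# then reverse back.
stripped = tuple(reversed(list(dropwhile(lambda x: x == 0, reversed(release)))))

# Plain-loop style: more lines, easier to read, and (per the article's
# measurements) faster -- just walk backwards past the trailing zeros.
end = len(release)
while end > 0 and release[end - 1] == 0:
    end -= 1
stripped_loop = release[:end]

print(stripped, stripped_loop)  # (3, 14) (3, 14)
```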

07:48 And that's awesome.

07:50 And then, okay, so that's a little thing, 10% speed up.

07:52 And then a faster regex, 10% to 17% faster on 3.11 and above only.

07:59 And that's another thing is sometimes newer versions of Python have more tools at your disposal.

08:05 So you might have to have two implementations, but why not make it faster for the newest?
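Python 3.11's re module added atomic groups and possessive quantifiers (e.g. `\d++`), which never backtrack; that's the kind of version-gated speedup being described. An illustrative sketch with a fallback for older interpreters (not packaging's actual regex):

```python
import re
import sys

if sys.version_info >= (3, 11):
    # Possessive quantifiers (new in 3.11) match greedily without backtracking.
    version_re = re.compile(r"(?P<release>\d++(?:\.\d++)*+)")
else:
    # Equivalent pattern for older Pythons, minus the speedup.
    version_re = re.compile(r"(?P<release>\d+(?:\.\d+)*)")

m = version_re.match("3.14.2rc1")
print(m.group("release"))  # 3.14.2
```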

08:12 So that's pretty cool.

08:14 Removing, I don't actually get this.

08:17 Removing single disk.

08:18 Oh, yeah.

08:19 Removing single dispatch, 7% faster.

08:23 Remove duplicate version creation, 37% faster.

08:26 Awesome.

08:27 Removing named tuples, 20% faster.

08:31 Just all these different little tiny speedups, and they're like 10% to 20%.

08:35 And in the end, it's a lot faster.

08:38 So I really like write-ups where people just show little befores and afters.

08:45 And that doesn't mean to say if you've got something that looks like the before code that you should speed it up.

08:51 You should measure and make sure that it's readable.

08:54 I mean, I think the first goal is readability.

08:57 Next, measure to see if it's too slow or something.

09:00 Yeah.

09:01 Write it in the form of readable, understandable, and then figure out how to make it fast.

09:06 Yeah.

09:06 If needed.

09:07 Yeah, but there are tools available if you are ready to try to make things faster.

09:13 Indeed.

09:14 All right.

09:15 All right.

09:16 I have a, I don't know how to phrase this.

09:18 I think it's not good.

09:19 Let's call this a not good story, but it's quite the story.

09:22 So this is AI's impact on dev companies, open source.

09:27 We start with a quote from Simon Willison's sort of link blog, which itself is a quote.

09:33 Okay.

09:33 So Adam Wathan is the CEO of Tailwind Labs, which is the maker of Tailwind CSS.

09:40 And the story about what's happening right now with Tailwind CSS is bonkers.

09:46 Like you would have never predicted it.

09:48 But once you hear the story of like, ah, yes.

09:50 Okay.

09:51 So from Adam, he actually did this 30 minute morning walk where he like sort of walks and

09:58 talks and it's kind of like a journal, but public or something like that, I guess.

10:02 And he says, we had six months left.

10:06 He says, I just laid off some of the most talented people I've ever worked with.

10:10 and it, frankly, sucks.

10:11 The direct written version is, this all actually came from this thing called LLMs.txt,

10:18 which actually is pretty interesting.

10:20 Like robots.txt, but for LLMs, I'm thinking about putting this on Python bytes and stuff

10:24 to like help LLMs be more efficient working with what we produce.

10:29 So if somebody wants to ask a question like, hey, what did Python bytes say about this?

10:33 Like what did Michael and Brian cover here and what was their advice, right?

10:37 If we put that as a, hey, LLM, you could technically scan the entire system

10:42 and try and understand that, or you could use our search for these keywords

10:45 and then here's how you access the transcript and then parse just, you know what I mean?

10:48 You could give like a little advice.
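For reference, the llms.txt proposal is just a Markdown file served at /llms.txt with a title, a short summary, and annotated links. A hypothetical sketch for a podcast site (the paths here are invented for illustration) might look like:

```text
# Python Bytes

> Weekly Python news podcast; every episode page has show notes,
> links, and a full transcript.

## Episodes

- [Episode index](/episodes/all): all episodes, newest first
- [Search](/search): keyword search over show notes and transcripts

## For agents

- Prefer the per-episode transcript pages over crawling the whole site.
```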

10:50 So someone had proposed as a PR to put that onto Tailwind

10:54 and said, "Could you please put that onto the Tailwind documentation

10:58 so that we'll be more efficient talking to Tailwind or asking questions of Tailwind from our AI agents?"

11:04 And people lost it, people lost it.

11:07 It wasn't good.

11:08 And Adam said something to the effect of, like, this actually is a pretty decent request,

11:14 but the reality is that 75% of the people in our engineering team lost their jobs yesterday

11:19 because of the brutal impact of AI.

11:21 And if I add this feature, it's going to make it worse.

11:24 All right, so here's the insane aspect.

11:28 If you look at traffic to the, if you look at downloads, sorry, on NPM,

11:33 I guess that's traffic of a form.

11:34 If you look at the downloads of Tailwind, they're up six times year over year.

11:40 Wow.

11:41 Six times more usage.

11:43 Don't you think that would make the product more viable?

11:46 No.

11:46 Because if you go to the Tailwind documentation, it's all free, Tailwind's open source, but there's a thing that says get Tailwind Plus.

11:52 You see that in the docs.

11:54 However, even though there's a 6x increase in usage, there's a 40% drop in traffic to the documentation and revenue is down five times or down 80%.

12:06 Why? Because instead of going to the docs to get help, you just say, hey, AI, install Tailwind and do this.

12:12 Or you don't even say that. You say, make it pretty. And like Tailwind is the default styling mode of AI.

12:18 That's pretty sad, huh?

12:20 Yeah, it definitely is. I don't. Yeah, it sucks. I don't know what the solution is.

12:27 I don't either. Avoiding LLMs.txt is probably not the answer, but it's also not a solution for sure.

12:33 I mean, it might need a different business model. You know what I mean?

12:37 Yeah. I mean, we've both seen this as well. I was faced with higher server costs on my blog. And then I put in, we talked about this earlier, I put in monitoring to see who it was, and it was mostly robots. And then I cut robots out, and we got zero traffic.

13:00 So I'm like, not zero.

13:02 That's not the answer either, though, is it?

13:04 It's like saying, I do not want to appear in Google because I hate the traffic.

13:07 I mean, I know it's not exactly the same, so don't email us.

13:10 But with Google, they said, if you don't let our robots in to scrape your site for our LLMs,

13:15 then we're not going to scrape your site for search traffic either.

13:20 I think that's the thing.

13:23 And that's lame.

13:24 I think that you should be able to say, no, I don't want somebody to scrape my site for AI.

13:29 but please do keep looking at it for search.

13:32 Let's just take a moment and just say that there are really good alternatives

13:37 to Google, like Startpage, like Kagi.

13:41 Yeah, but a lot of them are all built off of the same data that Google's selling.

13:45 And Startpage is built off the stuff.

13:47 So it is in a sense, really, really the same.

13:50 Yeah, that's a good point.

13:51 Okay, I'm not even done with the story, Brian.

13:53 Okay, let's hear the rest of the drama.

13:55 Okay, so that's half the story.

13:56 There's a really good video by this guy.

13:58 I just came across Maximilian Schwarzmüller.

14:00 He does a bunch of great developer videos.

14:03 It's more of like a-

14:04 - And an awesome name.

14:05 - I know it is an awesome name.

14:06 He's more like a JavaScripty guy, but he does a lot of just like thought pieces

14:12 on programming in general, and he's good.

14:14 So he did an 11-minute, 55-second video on this, Tailwind

14:18 is in deep trouble, talks about it, actually shows a lot of graphs and data.

14:22 So it's pretty interesting.

14:23 Then the other is a video from the Primeagen.

14:26 And this is the other side of, this is another company in the same theme

14:32 and Stack Overflow is cooked.

14:33 It is insane.

14:36 Let me see if I can find a spot here.

14:38 Let me see if I can find a spot here with this graph.

14:39 So there's an insane graph if you watch that video, but I can just tell you the words.

14:44 I'll tell it to you more, Brian.

14:45 So in, I don't have my show notes up, so I'm about to do this from memory,

14:48 but I believe in the first month of Stack Overflow's life,

14:52 there were 3,800 questions asked.

14:55 Yeah, in 2009.

14:56 I've got the notes up.

14:57 Yeah, okay.

14:58 Almost nothing.

14:59 I mean, it's just getting going.

15:00 It's like a forum.

15:01 Like, no, of course, there's going to be almost no questions.

15:03 It goes up and it stays at 200,000 new questions asked per month almost through the pandemic.

15:11 And it starts going down.

15:12 And this is not purely from AI.

15:15 It's going down like pretty precipitously for two years before ChatGPT comes out.

15:22 When it does in like mid-2022, it drops even harder.

15:26 So here's an insane stat.

15:28 In 2016, there were 200,000 new questions asked at Stack Overflow.

15:32 In 2026, well, technically last month in December 2025, there were 3,740-something questions asked.

15:41 The same as the first month they opened.

15:44 200,000 to three or whatever.

15:46 3,000.

15:47 It's insane.

15:48 That is insane.

15:50 So actually, I think the Primeagen's take is to prepare your goodbyes.

15:55 It's pretty much it.

15:56 Now, it does seem to still be going strong, like selling this data, this pure data in a sense, to AI companies.

16:03 But for how long is that going to last?

16:05 That's like, we don't have any cows left and we can't get milk.

16:09 But we got a lot of cheese that's really good in the freezer.

16:12 So we can sell that for a while until it's gone.

16:15 Without new questions, there's not new information there.

16:19 And who's answering, even if there are new questions, who's answering them?

16:23 Yeah, I bet you half of those 3,000 or 4,000 are like AI generated.

16:27 Honestly, I can't believe this.

16:28 I knew that they were going down, but 200,000 to a couple thousand is an insane level.

16:34 Yeah, and they were actually one of the better ones as far as ads and sponsors and stuff.

16:41 I mean, Stack Overflow, we make fun of it and stuff, but it's a part of software,

16:46 and it wasn't really that spammy.

16:48 It just was, it didn't have a bunch of pop-ups and stuff.

16:52 So I actually kind of liked Stack Overflow.

16:55 I know.

16:56 I mean, I don't know.

16:57 I don't know exactly how to feel, but it even went so far as we had the Stack Overflow keyboard

17:01 that only had three keys.

17:03 Yeah.

17:04 Right?

17:04 It had, instead of the Control or Command key, the Stack Overflow logo and then a C and a V.

17:10 It's a beautiful joke.

17:12 Just, you know, that's how important that used to be.

17:15 Yeah, you take an error message and just throw it into Google search and you get a Stack Overflow question answering it.

17:21 That's how you figure stuff out.

17:25 Exactly.

17:26 There was a period where we read books and then we copied and pasted from Stack Overflow.

17:30 And now I guess we copy and we just take it from AI.

17:33 But I think this is noteworthy in a big way in the developer world, right?

17:39 This is big.

17:40 Yeah.

17:41 And I mean, I know that the people behind Stack Overflow, there's a lot of money there.

17:46 And I think that the top of the chain, they're probably fine.

17:50 But everybody that got laid off, that's lame.

17:55 But Tailwind has always been a lightweight business model,

17:59 even though they're doing some amazing things.

18:02 And like you said, it's being used by almost everyone.

18:04 So it's crazy.

18:05 Yeah.

18:06 Kiva out in the audience points out, it's insane how symmetric this graph is.

18:10 I agree.

18:11 It is insane.

18:12 It's like almost a perfect trapezoid.

18:14 Not quite, but almost.

18:15 Yeah.

18:16 While I'm pulling up stuff from the comment, Henry points out that he's just a PyPA member

18:21 and builds many, many Python things, but not quite a core developer.

18:25 Oh, okay.

18:25 Setting the record straight a little bit.

18:27 Okay.

18:28 Well, Henry Schreiner is a core asset in my learning journey.

18:33 I guess I'll put it that way.

18:35 Over to you.

18:35 Okay.

18:37 I wanted to talk about Henry Schreiner's art.

18:41 No.

18:42 Similar.

18:43 So we talked about Henry Schreiner's article on Python packaging and making it faster.

18:48 And it reminded me of something that I wanted to talk about for a while and I just forgot about.

18:53 So thanks, Henry, for the reminder.

18:55 And it's around profiling.

18:59 So I was looking through not the packaging library.

19:03 I was looking through FastAPI once, looking at their... I kind of do this.

19:09 I kind of trawl sometimes,

19:11 go through GitHub workflows of different packages to see what sort of tricks they're using in their

19:16 workflows. And I remember... It's like the view source of the modern day.

19:20 Yeah, workflows. And it's one of the wonderful things about open source packages is you can do

19:25 this. So FastAPI, plus the folks at FastAPI are really great about trying to stay cutting edge

19:32 and helping other people. So looking through here, I saw CodSpeed benchmarks. And I'm like,

19:39 what is this? So that's my topic right now, the CodSpeed tool. And it's probably, maybe

19:47 it's "code speed," but it's spelled cod, so I'm going to pronounce it CodSpeed. It could be measured

19:53 in, like, how fast does a cod swim in the ocean. And is it, is it a laden cod or a

20:01 heavily laden cod? Bite your kneecaps off! Get back here!

20:07 So I'll talk about the actual tool, but as we're looking through FastAPI, there's also in the merge requests for new features, there's a performance report that is linked and you can go and take a look at it.

20:23 If you click on that, you go and see, like, you can go and poke around at that particular

20:28 merge request.

20:29 I'm looking at the overview.

20:30 The FastAPI project on codspeed.io has this graph, basically not very noisy, of like

20:39 the different times they've measured things and how the metrics, the performance metrics

20:45 have gone up and down.

20:45 And it's really kind of cool.

20:47 So I was like, can I use this?

20:49 Yes.

20:50 So I'm going to try this out.

20:51 I haven't tried it yet.

20:52 but at codspeed.io.

20:54 And there's a nice rabbit logo, which is nice.

20:57 Anyway, it integrates into dev and CI workflows to measure performance, detect regressions.

21:04 And if it was just a paid tool, I wouldn't probably be covering it.

21:10 But it's one of those awesome tools that's actually not that expensive per month

21:16 if you're actually paying for it for your project, for a commercial product.

21:20 But it's free for open source projects.

21:23 So I love those sorts of things.

21:25 They're given back to the community with free tools.

21:28 It's great.

21:29 So what I wanted to talk about is, if we go back to the homepage, it's pretty cool.

21:37 One of the things it talks about is catching performance and not noisy metrics.

21:43 So I'm not sure how they're doing this, but they're saying that the traditional metrics

21:47 kind of jump around a lot, but that CodSpeed has got a small variance.

21:51 It's possible that they run this on dedicated hardware.

21:55 So you're not subjected to other things on the VPC.

21:58 Other noise.

21:59 Yeah.

22:01 So they're doing flame graphs to pinpoint problems and speed ups and slow downs.

22:06 But one of the things I was really impressed with is right there on the front page,

22:10 they tell you that you can design performance tests, integrate with your CI workflow and get reports.

22:16 And there's like a Python button you can click on.

22:19 And it's like, oh, you just take a, do a pytest.

22:23 You just mark whatever test with @pytest.mark.benchmark.

22:26 And those are the ones that are used in your benchmarks.

22:28 It's easy.

22:29 You're already using them for testing your code.

22:31 So you can just grab some of your, maybe some of your high level tests that test the whole system

22:37 and try to, you know, grab a handful of them that hit most of your system.

22:41 Might be good.

22:42 You can throw them on all of them.

22:43 Maybe, I don't know.

22:45 And then what do you do in CI?

22:47 So once you try to integrate it in CI, there's tabs for GitLab and GitHub.

22:54 But for GitHub Actions, they have their own action that you can use.

23:00 And you just say, use that, and then run whatever your code is.

23:04 So I went back to, what, FastAPI's workflow to see what that was.

23:08 And their metrics, when they're running, they're just running pytest.

23:12 It looks like they give it a cod speed flag.

23:15 I don't know what that does, but maybe it does some cod speed stuff.

23:19 So just fun.

23:20 I thought it'd be good to give a shout out to these folks.

23:23 And I'll try it out on one of my projects and maybe report back how it's going.

23:28 But you can look directly how FastAPI is using it.

23:30 It's pretty cool.

23:31 Yeah, two quick real-time follow-ups.

23:33 I'm going to be interviewing Sebastian Ramirez and his team about FastAPI cloud tomorrow on Talk Python.

23:42 Nice.

23:43 So maybe if I can keep it in my mind, I'll ask them about this as well.

23:47 It's pretty interesting.

23:48 Yeah.

23:49 But if people want to watch that live, that's tomorrow morning.

23:51 And then Henry points out that Astral uses CodSpeed.

23:56 And that was mentioned when people are asking about proper benchmarking for packaging.

24:00 That might have been where I saw it.

24:03 All right.

24:04 Well, those are our topics.

24:07 Do you have any extras for us?

24:09 Let's see what I have.

24:10 I indeed do have some extras.

24:13 So remember how I talked about DevOps and

24:18 Python supply chain security made easy and all that.

24:21 And I gave some examples using pip-audit, how to set up cached build-time checks

24:27 so that your Docker images won't build if there's a CVE detected.
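The build-time gate described here can be sketched as a Dockerfile stage (the base image tag and file names are placeholders; pip-audit exits non-zero when it finds known vulnerabilities, which fails the image build):

```dockerfile
# Hypothetical audit stage: the build stops here if a CVE is found.
FROM python:3.13-slim AS audit
COPY requirements.txt .
RUN pip install --no-cache-dir pip-audit && \
    pip-audit -r requirements.txt
```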

24:32 Oh, yeah.

24:33 Well, boy, oh, boy.

24:34 Are there more CVEs in the PyPI space than I realized?

24:38 Because just middle of last week, I couldn't ship Python bytes

24:42 because there's a vulnerability in here.

24:44 We're not going to let it out.

24:45 So here's the cool thing.

24:47 It absolutely caught it.

24:48 It wouldn't let me release a new one, but it is a hassle.

24:51 And I've also run into a little bit of a bigger hassle,

24:54 not a bigger, an unforeseen hassle.

24:56 I said, okay, well, what we're going to do is we're going to delay updating to the new thing

25:00 by one week to make sure that that stuff gets fixed.

25:03 Well, turns out when they find a problem, they fix it and they release it that day.

25:07 But if I apply that technique, it doesn't get rolled into my stuff for a week.

25:12 So got to be a little bit careful to just sort of pin some stuff for a little bit or not.

25:17 Just don't update the dependencies until you get that.

25:20 Or if it's not, you know, whatever you're going to do about it, right?

25:22 But yeah, it works because I couldn't release the website

25:24 till I like manually put different versions in to make it skip those issues.

25:28 That's good. It's a good thing.

25:29 All right, on to the next one.

25:31 You probably have heard, Brian, that you should, anytime you're doing security, you need to have a decent password.

25:37 You know this as a user, but if you build anything that has user auth,

25:41 well, you want to have some level of complexity. I feel like there are so many web builders out there that

25:47 just suck so bad, you're like, how do they let you behind a keyboard? For example, one of my banks, this

25:52 is an international bank, limits my password length, not minimum, maximum, to like 12 characters. That's

26:00 insane. That is just the stupidest. I'm like, are you putting that straight in the database? Because

26:04 if you're putting that, like, if that's because that's a database constraint, you better not be

26:07 putting my password straight in the database.

26:08 You know what I mean?

26:09 Really bad.

26:10 But in the extreme end--

26:13 Does the length affect what the hash length looks like?

26:16 No, the hash is the same length from one character to a million characters.

26:20 That's why I'm sus of it.

26:22 But there are actually some nuanced aspects.

26:26 So if you're using MD5 to hash your passwords, no.

26:30 Don't do it.

26:31 Please don't do it.

26:31 But there's some really nice things you can do.

26:35 you can use more modern ones.

26:37 And especially something that you can do that's really powerful is to use

26:41 what are called memory hard problems.

26:43 So cracking passwords is an embarrassingly parallel problem

26:47 in that like I've got a hash and I'm gonna try a bunch of different stuff against it.

26:51 And if I got a GPU, maybe I run like 4,000 variations in parallel.

26:55 And if any of those match, then we call it good, you know, and use all the cores.

26:58 That's because GPUs scale compute super well, but you know what they don't scale well?

27:02 Memory.

27:03 So one of the new-ish algorithms is called Argon2.

27:08 Really highly recommended because it is not subject to this brute force attack.

27:14 Because instead of using a lot of compute, it uses a lot of memory.

27:17 So you can't parallelize that as easily, right?

27:19 You're like, well, each hash takes, I don't know, 15 megs to do an attempt.

27:24 So, you know, how many of those can you parallelize before you run out of memory sort of thing, right?
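The memory-hard idea can be sketched with the standard library. Python doesn't ship Argon2 (that's the third-party argon2-cffi package), but `hashlib.scrypt` is a memory-hard KDF that is built in, and its memory cost per hash is roughly 128 * n * r bytes, so the parameters below cost about 16 MB per attempt:

```python
import hashlib
import os

# scrypt is a memory-hard KDF in the standard library.
# Memory cost per hash is roughly 128 * n * r bytes:
# n=2**14, r=8 -> about 16 MB per attempt, which is what
# makes massively parallel GPU cracking expensive.
salt = os.urandom(16)
key = hashlib.scrypt(b"correct horse battery staple",
                     salt=salt, n=2**14, r=8, p=1)

print(len(key))  # 64-byte derived key by default
```

Turning up `n` raises the per-attempt memory bill for an attacker, exactly the property the hosts describe for Argon2.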

27:28 Well, it turns out that this has become a bit of a problem.

27:33 So hackers have started using, as a distributed denial of service type of thing, very long passwords.

27:41 So, like, one megabyte's worth of password text.

27:44 Because the Argon2 and the memory hard ones, the bigger the password is, the bigger the text is, the more memory they use.

27:52 So if you just jam like an insane amount of text, all of a sudden it becomes not just memory hard for hackers, but even for your server.

28:00 It's like, okay, this is off the charts, right?

28:03 So I just want to put this out there on people's radar.

28:05 Like Prudus out there says, my bank also restricts password length.

28:09 Boo banks, they suck.

28:10 But maybe they shouldn't allow a million characters; limit it to 100 or 50.

28:16 Something way bigger than you're going to do, but like not unbounded, right?

28:20 Don't just say less than.

28:21 It's a super easy check, but this is definitely something to keep on your radar.
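That check really is one line. A minimal sketch, where the 1,000-character cap is a hypothetical number, just something far beyond any real passphrase but small enough that a memory-hard hash can't be turned into a denial-of-service vector:

```python
# Hypothetical upper bound: generous for humans and password
# managers alike, but it stops someone from submitting
# megabytes of "password" to blow up a memory-hard hash.
MAX_PASSWORD_LENGTH = 1_000

def validate_password_length(password: str) -> None:
    """Reject absurdly long passwords before hashing them."""
    if len(password) > MAX_PASSWORD_LENGTH:
        raise ValueError("password too long")

validate_password_length("hunter2" * 10)  # fine, well under the cap
# validate_password_length("x" * 2_000_000) would raise ValueError
```

The point is to run this before the expensive hash, not after.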

28:26 Character restrictions are terrible also.

28:29 The ones that say like, you can't use some special characters.

28:31 Like, why would you not be able to use special characters?

28:34 Exactly.

28:35 I honestly think like a good chunk of this, and maybe this is the bank thing as well,

28:39 is like it's a tech support thing.

28:41 Somebody who is like 78, doesn't really know what they're doing,

28:45 can't type in their password.

28:47 They're like, I called tech support, and they're going to help me type it in or something.

28:50 I don't even know.

28:51 But I feel like it's, you know, restrict the characters used

28:54 so they're more legible.

28:56 I don't know.

28:56 But why would you allow an L and a 1?

28:58 I don't know.

28:58 It's all messed up.

29:01 Or an O and a zero. Like, you can't use zeros, you can only use those, or vice versa. Well, the ones that

29:05 require you to use a number also just seem weird. It's like these particular 10

29:12 characters are special, and of all the possible characters, we need one of these

29:18 10. That's a really good point. It makes me so frustrated. I have 1Password generate

29:23 these ones that are easier to type, if you've got to type them on your phone or something, but they're

29:27 like 30 characters and capital words, lowercase words, dashes, and it says, you don't have

29:33 a number or a special character.

29:35 I'm like, do you really think it's going to make a difference if there's an exclamation

29:38 mark in this 30-character random string?

29:40 Like, no.

29:41 Anyway, I am here with you.

29:43 All right.

29:43 Last extra, I sold my computer and bought a new one at this place called Swappa.

29:50 So I kind of want to talk about Swappa really quick, because I just think this is a

29:54 good alternative to buying computers from Apple directly.

29:57 It's kind of like eBay, but specifically for computers.

30:01 So you don't buy it from the company.

30:03 They facilitate you buying it from another person, which I think is kind of interesting.

30:07 Okay.

30:08 So I had this Mac mini super minimum version that I had bought with some trade-ins that

30:13 effectively I paid like 360 bucks.

30:16 I sold it for $600 on here and then turned around and bought a maxed out Mac Mini Pro version,

30:24 which has got like 14 cores and lots more memory, like a lot of upgrades for like $1,700 instead of $3,000.

30:31 And you apply the $600 and it gets like, hey, that's almost nothing in price.

30:36 So someone was saying like, oh, hey, I don't have an Apple Silicon thing and I need to do builds for it.

30:41 And what a hassle that is.

30:42 So, like, even almost brand-new Mac Minis are for sale for like four or five hundred bucks.

30:48 But if you look and you're like, I just need an Apple Silicon thing,

30:52 you go to the 2020 ones and it's like $280 for a, not new, but mint-ish computer,

30:58 right?

30:58 Like a decent M1 if you need to grab one.

31:01 Anyway, I thought that just throw that out there because I had a nice experience both

31:04 buying and selling my computers there.

31:06 And now I got like a really maxed out one, which I used to make my unit test go twice

31:10 as fast.

31:10 Nice.

31:11 That was fun.

31:12 Yeah.

31:12 Nice segue there. So as an extra, I've got... actually, you all know I've been

31:21 working on a new book, Lean TDD, and I'm kind of excited about this update because

31:29 I was working on just the next chapter, and it was getting big and I was writing a lot, so I

31:35 decided to break it up. I split the book into three parts. There's Foundations,

31:40 that's talking about lean and TDD.

31:43 And I think I want to expand the TDD.

31:45 I've got some questions on test-driven development from people that were new to it.

31:48 I just sort of assumed everybody knew about TDD.

31:52 So I'll probably expand that later.

31:54 But so I've got foundations.

31:57 And then what I released today, this morning, was, what is it, one, two, three, four, five, six, seven, eight,

32:04 nine new chapters on part two, which is lean TDD practices.

32:10 And so instead of dumping this all in one big chapter, I split it all up, and some of

32:16 these are, to be fair, I mean in PDF form, some of these are a couple pages, so there'll be like

32:21 three or four pages in your EPUB. But it's a lean book as

32:27 well, so I think the total right now, I'm up to like 70 pages. After this, I do want to try

32:34 to wrap this up by the end of this month still. So I got a couple of weeks left. I've got the last

32:39 part is considerations. I want to cover the role of test engineers, if you have them on your team,

32:44 and also talk about test coverage and monitoring. But so that's only a few topics I got left,

32:52 and that'll wrap up the first draft. So excited about, I've been really excited about the flow and

32:59 getting this in here. It is funny, though: things that you think you just know in your head, when you

33:05 try to write it down in a sequential form, it's like, wait, how should I phrase this? I thought

33:11 I could just talk about this easily, but trying to get it out of your head is, yeah, sometimes tricky.

33:16 So that's my thing. Also, I appreciate... so I had set up a GitHub repo that doesn't

33:24 have the book, but it's just a place for stuff. It's the Lean TDD book repo under okken, for feedback.

33:32 And not very many people were using it, but just recently, I want to shout out to Timothy Malahy.

33:39 He gave me a bunch of stuff, like, he looked through and found a bunch of typos. I really appreciate that.

33:46 And then some other ideas about maybe expanding on some sections. So I like the feedback.

33:52 Awesome. Congrats on the progress too.

33:54 Ah, thanks. Yeah, that's my extra. How about something funny?

33:59 Yeah, let's do something funny over here. So this one is about how amazing agentic coding is,

34:05 even if you don't even know anything about programming. You don't know anything about

34:08 the web, Brian. You can do it. Here's a quote. I believe this is on Reddit.

34:12 Claude Code is blanking insane. I know literally nothing, all caps, about coding. Zero, all caps.

34:19 and I built a fully functioning web app in minutes.

34:23 Check it out at localhost:3000.

34:30 Yeah.

34:31 Nope.

34:31 Nope.

34:31 We're not going to be checking it out at 3000 because you know nothing.

34:36 That's funny.

34:37 I think that captures the end of 2025 vibes is just like, it's amazing and terrible.

34:44 Yeah.

34:45 Well, I mean, when do we get AIs that can deploy for you?

34:49 Well, are we there yet?

34:50 I don't think we're that far away.

34:51 I just had Claude Code helping me set up a special web socket,

34:56 HTTP to web socket to secure web socket conversion on Nginx,

35:02 and it was like, here, boom, perfect.
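For reference, the kind of Nginx setup being described looks roughly like this. A hedged sketch only: the server name, location path, and upstream port are placeholders, not the actual config:

```nginx
# Hypothetical server terminating TLS and upgrading plain HTTP
# connections to WebSocket for an app on localhost:8000.
server {
    listen 443 ssl;
    server_name example.com;  # placeholder

    location /ws/ {
        proxy_pass http://127.0.0.1:8000;
        proxy_http_version 1.1;
        # These two headers perform the HTTP -> WebSocket upgrade.
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```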

35:03 But you got to know to ask, right?

35:05 That's the thing.

35:06 Yeah.

35:08 Yeah.

35:08 But these are weird times.

35:10 They're very weird times.

35:11 They're weird times.

35:12 But we're still hanging in there.

35:14 And if you're hanging in there too and hanging in with us,

35:17 I really appreciate everybody listening to the episode, listening to Python Bytes,

35:23 and supporting us through all of the different means.

35:27 And yeah, we're going to do this as long as we can.

35:30 So thanks, Michael.

35:31 Yeah.

35:32 Thank you, Brian.

35:32 See you later.

35:33 Bye, everyone.

35:33 Bye.


Want to go deeper? Check our projects