Brought to you by Michael and Brian - take a Talk Python course or get Brian's pytest book

#467: Toads in my AI

Published Mon, Jan 26, 2026, recorded Mon, Jan 26, 2026
Watch this episode on YouTube
Play on YouTube
Watch the live stream replay

About the show

Sponsored by us! Support our work through:

Connect with the hosts

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 11am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form, add your name and email to our friends of the show list. We'll never share it.

Michael #1: GreyNoise IP Check

  • GreyNoise watches the internet's background radiation—the constant storm of scanners, bots, and probes hitting every IP address on Earth.
  • Is your computer sending out bot or other bad-actor traffic? What about the myriad devices and IoT things on your local network?
  • Heads up: If your IP has recently changed, it might not be you (false positive).

Brian #2: tprof: a targeting profiler

Michael #3: TOAD is out

  • Toad is a unified experience for AI in the terminal
  • Front-end for AI tools such as OpenHands, Claude Code, Gemini CLI, and many more.
  • Better TUI experience (e.g. @ for file context uses fuzzy search and dropdowns)
  • Better prompt input (mouse, keyboard, even colored code and markdown blocks)
  • Terminal within terminals (for TUI support)

Brian #4: FastAPI adds Contribution Guidelines around AI usage

Extras

Brian:

Michael:

Joke: A date

  • via Pat Decker

Episode Transcript


00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.

00:05 This is episode 467, recorded January 26, 2026.

00:10 Ooh, that's a lot of 26s.

00:12 I'm Brian Okken.

00:13 I'm Michael Kennedy.

00:14 This episode is sponsored by all of us, all of you, and the products that we have going at Python Bytes,

00:22 and I guess not at Python Bytes, but Talk Python Training, and at pythontest.com.

00:28 We've got the pytest course and Lean TDD, but Michael's got courses and books and all sorts of fun stuff.

00:34 And also, of course, Patreon supporters.

00:36 We still love you.

00:38 And yeah, if you'd like to connect with us and send us topics, maybe topics for the show, we love that.

00:44 You can just send us through the contact form at pythonbytes.fm, but you can also find us on the socials.

00:51 And the links are in the show notes.

00:53 We're on Bluesky and Mastodon.

00:55 Yeah, and if you'd like to see what we look like or watch us live or tape recorded,

01:03 you can go to pythonbytes.fm/live and yeah, check out that.

01:08 And it tells you when we're going to show next time.

01:10 Yeah, they can even check out a glimpse of the Python staff of power just behind you there.

01:15 Yeah, that's right.

01:17 Yeah, there it is.

01:19 Yeah, and I'm still on the fence about going to PyCon this year.

01:23 And if I go to PyCon, I'll definitely bring the staff of power.

01:26 My fence has been built and decided upon.

01:29 I bought my tickets and I got actually a pretty amazing deal on Expedia.

01:34 I was able to get four days hotel right near a block or two from the convention center and the water and a round trip flight for 600 bucks total.

01:43 So I'm going.

01:44 Oh, nice.

01:45 Yeah.

01:45 I'm going to try to ask work if I can get away with not having to take vacation time to go.

01:52 Yeah, wouldn't that be nice?

01:53 Hey, they talk about pytest there.

01:54 I know you're doing a lot with that.

01:55 Yeah.

01:56 So, all right.

01:57 Well, should we do some topics for the show?

01:59 Let's talk.

02:00 Oh, I have the toad up.

02:02 That is not what I want to talk about.

02:03 I want to talk about this thing called the GreyNoise IP Check.

02:06 Okay.

02:07 So GreyNoise is a company that does research on cybersecurity breaches,

02:14 et cetera, et cetera, right?

02:15 Yeah.

02:16 So they have published this thing called the GreyNoise IP Check.

02:20 And if you just go to check.labs.greynoise.io or way simpler, click the link in the show notes.

02:26 It will take you to this place that says your IP address is whatever your IP address is.

02:31 And your IP address is clean.

02:33 Your IP has not been observed scanning the internet or contained in the common business services data set.

02:41 So what is this for?

02:43 Like in the day of 30 things connected to your internet, your Wi-Fi, with your kids installing random junk off the internet.

02:51 Like my kid is pretty good.

02:52 She says, dad, is it okay if I install this?

02:55 Yes, it's okay or no.

02:56 And she actually completely reformatted her Windows 11 machine so she could play games but not worry about what was installed on it.

03:02 So, you know, not knocking the kid or anything.

03:04 But kids install stuff.

03:06 Other people install stuff.

03:08 Your TV, your smart power thing that goes to that lamp in the corner.

03:14 You know, there's a crap ton of stuff that is on your network.

03:18 And even if you run some sort of virus check on your computer,

03:21 having that come up empty doesn't mean that there's not a problem

03:24 with things that you are in control of, right?

03:26 So basically the idea of this thing is you can go here

03:29 and it will tell you if something on your network has been being bad.

03:32 Okay.

03:33 Yeah.

03:34 And I don't know, for me, I thought that was pretty cool.

03:36 So I threw it in as one of the topics so that other people can click on this and see, I guess, a couple caveats.

03:42 If your IP address just changed recently, it's very plausible that you could get a warning.

03:49 And it was whoever had it before.

03:50 You know what I mean?

03:51 Like we had a power outage for just a couple hours.

03:55 Some sort of, I think I talked about this last time because my mic had gotten all reset.

03:59 Because everything got powered 100% down because some sort of transformer thing exploded.

04:04 We got a new IP address from then.

04:06 Right. But, you know, a couple of weeks ago.

04:08 And so it turns out that I guess things are still good.

04:11 That's good to know.

04:12 But if you just had your IP address change, then it could be picking up issues from whoever had it before.

04:18 Right. Kind of like you keep getting spam with that new phone number you got.

04:21 Yeah. But these, I don't know about you, but my IP address is pretty much rock solid stable.

04:26 Unless something goes wrong with the internet or with power,

04:29 then it may change.

04:30 But other than that, it's the same for quite a while.

04:33 I'm slightly embarrassed to say I usually don't know my IP address.

04:38 And I actually don't quite get how, I mean, I get how IP addresses work.

04:42 But like for a house, is your whole house like the same IP address?

04:47 Or I don't know how that works.

04:51 Also, like, does this, can you use this?

04:52 Like if you're at a cafe to check to see if there's other like bad actors there?

04:57 Yeah, I guess you probably could actually.

04:58 Yeah.

04:58 So, yeah.

04:59 Anyway.

05:00 Interesting.

05:01 Okay.

05:01 Well, I'll have to check it out.

05:02 And I know mine because I have a lot of my stuff gated behind IP addresses, like the server.

05:09 You can't even SSH to the server unless your IP address is on a whitelist, right?

05:14 So periodically when it changes, I'm like, oh, what is my IP address again?

05:17 So then I go copy.

05:18 I don't really pay too much attention, but because of that, I kind of got to pay a little attention.

05:23 Yeah, I'm like a very data in the cloud sort of person.

05:28 So I guess.

05:28 Perfect.

05:29 All right.

05:29 Well, I'm very excited about this next topic that you have because I also saw it and said, that is killer.

05:35 It is killer.

05:36 Yeah.

05:37 So what we're talking about is TPROF.

05:39 This is from, or TPROF?

05:41 I think it's TPROF.

05:42 It's the targeting profiler from Adam Johnson.

05:45 And we've got a link to the PyPI page.

05:50 But it just mostly says, yeah, it's a targeting profiler.

05:54 But we're also going to link to his article introducing it.

05:59 Introducing tprof, a targeting profiler. So, yeah, it almost speaks for itself.

06:07 What he's talking about is: if you run a profiler on your program, it gives you everything.

06:13 And you do want to do that at first, to see all the hot spots, where

06:20 you're spending your time. That's what you use profilers for, to see where you

06:24 could maybe optimize some of your code. But once you've pinpointed what

06:31 you want to try to fix, then you want to measure just that. And his

06:37 pain point was running it over the whole program again and again and again, and you

06:42 don't really need to. So his is a targeting profiler, and you give it exactly the function that you're

06:47 caring about, or multiple functions, and it'll report what the times are, like the min and max and

06:53 stuff. And then he went further and said, there's a lot of times where you're just going to want to

06:58 do like a before-and-after function. And it probably won't be named before and after, but you can

07:04 have, like, my search routine old and my search routine new, or something like that. And you can have those

07:12 be targeted, run the results, and compare those. Nice. It's like a showdown, right? Instead of

07:18 trying to remember the results, you just say, here's the old way, here's the new way. How'd that

07:23 turn out, right? And he's kind of got two ways to do it. You can

07:28 pass the functions that you want to look at on the command line. So you don't even

07:33 have to actually change anything in your code to do this. You can just target a couple of

07:39 functions and call those. But you can also do a dash X for a compare, and it'll do

07:48 a delta time, like a percent faster or a percent slower, which is pretty cool.

07:52 Or you can not have to do the command line thing too much

07:56 and just use it as a context manager.

07:59 You can say which ones you're going to compare and run those before and after with tprof.

08:06 All those are, I probably use all of those at different times, but I'm really excited to try this

08:12 to use, to optimize some code that I've been working on.

08:15 So it's pretty neat.

08:17 Apparently uses the new Python 3.12 profiling API, which is really nice.

08:22 So it's using the modern, lower-touch profiling API as well.

08:26 Yeah.

08:26 So of course, you have to run it with 3.12 or above, but that doesn't mean that the code that you're testing against

08:36 is necessarily just 3.12 and above.

08:38 It's just when you're running it, it'll be 3.12.

08:41 You know what? 3.12 is almost obsolete.

08:43 I'm serious.

08:44 Like how insane is that?

08:45 What?

08:46 I mean, 3.10 is the oldest that is even allowed.

08:50 Like 3.9 has gone into, like, we don't even do security patches anymore.

08:54 Yeah, yeah, that's true.

08:56 3.10 is on the way out, and then, like, it's a few years until all of that's gone.

08:59 So it's not that old.

09:01 And we don't even mention 2.7 anymore, the legacy.

09:05 So good.

09:06 So good to be past those days, right?

09:09 I would also like to point out that this is just so good for profiling any real application, like any real application, because so much of what happens if you just say, profile my particular app, is all the time spent loading modules, all the time connecting to the database, and just all of that, you know, setting up the logging and all, like all the stuff that happens before you get to the one function where you want to see what's going on.

09:38 and then it stops and you're like, the thing only takes 50 milliseconds,

09:41 but I've got profiling for like two seconds.

09:44 And what is going on?

09:46 It's just so lost in the noise.

09:48 And then this one, you can just say, run it, let all that, just ignore all that junk.

09:53 When you get to this function, start, when you're done with the function, stop.

09:56 You know what I mean?

09:56 Yeah, and that's actually, oh, that part of it's overwhelming to me.

10:00 Every time I've had to reach, I don't have to, you know, it's like a lot of debugging tools.

10:04 You don't have to reach for them all the time.

10:06 But when you do, it's a little overwhelming to look at all that profiling output.

10:12 And I still haven't got my head around flame graphs.

10:15 But, you know.

10:16 Yeah.

10:17 Catch them.

10:17 They're hot.

10:18 No, honestly, like the fact you just say, just profile this function, and that actually happens on the CLI, my profiling is back, baby.

10:27 Like, it's just, I'm like, is it really worth trying to dig through all that overhead and junk and decide, like, probably no.

10:32 But now maybe it really is.

10:34 Thanks, Adam.

10:35 Yeah, thank you, Adam.

10:36 And then one more thing is like a lot of times some things are slower the very, very first

10:41 times they run.

10:42 So it might be worth writing a function that actually like runs it a bunch of times or

10:48 runs it once before you get to it and then profile.

10:51 You know what I mean?

10:52 Like just like, because it might be the module loading that actually overwhelms the compute,

10:56 but that only happens once, or, you know, the .pyc compilation?

11:00 The context manager one would be great for that.

11:02 then you could just, you could call both, like both the new and the old function once.

11:06 Exactly.

11:07 And then do a loop and do like a hundred times to the other two.

11:13 Yeah, that would be totally perfect.

11:14 That would be really, really good.
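
A rough sketch of the targeted idea discussed above, using only the standard library. This is not tprof's actual API (see Adam's article for the real command line options and context manager); profile_only, search_old, and search_new are made-up names, and the old-versus-new comparison is only illustrative.

```python
# Stdlib sketch of "targeted" profiling: collect data only inside the block you
# care about, so imports, database connections, and other startup noise never
# show up in the report.
import cProfile
import pstats
from contextlib import contextmanager

@contextmanager
def profile_only(sort="cumulative", limit=10):
    prof = cProfile.Profile()
    prof.enable()            # start collecting only for this block
    try:
        yield
    finally:
        prof.disable()       # everything outside the block is ignored
        pstats.Stats(prof).sort_stats(sort).print_stats(limit)

def search_old(data, target):
    return target in data    # linear scan

def search_new(data, target):
    import bisect
    i = bisect.bisect_left(data, target)   # binary search on sorted data
    return i < len(data) and data[i] == target

data = list(range(200_000))
with profile_only():         # "old" showdown entry
    for _ in range(50):
        search_old(data, 199_999)
with profile_only():         # "new" showdown entry
    for _ in range(50):
        search_new(data, 199_999)
```

The point is the same as tprof's: collection happens only inside the block, so the two reports can be compared side by side without wading through the rest of the program.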

11:16 You know what we don't talk enough about though, Brian?

11:18 Honestly, I mean, toads.

11:20 Let's talk about toads.

11:21 I know, and we live in Oregon.

11:22 We got them all over the place.

11:24 We do.

11:25 We do got them all over the place.

11:26 Or frogs at least.

11:27 I don't know if we have toads.

11:28 I don't, you know what.

11:30 So with Toad, this comes from Will McGugan.

11:33 It's kind of his next big thing that he's been working on

11:36 since Textual.

11:38 And the idea is it's kind of like a UI for Claude Code type

11:43 things.

11:43 It actually works with something called OpenHands.

11:45 It works with Claude Code.

11:47 It works with Gemini CLI, probably others that I don't know about.

11:51 When I first saw this, I'm like, OK, well, I already have a terminal.

11:53 I can do this.

11:54 Like, what-- I'm not really entirely sure what I need this

11:58 for.

11:58 But the more I looked into it, it looks really quite neat.

12:01 So if you go through, it obviously renders pictures.

12:05 Apparently not there.

12:06 It renders pictures.

12:07 Like one of the pictures is the Mandelbrot set drawn.

12:10 Cool.

12:11 And it has really nice, basically better input.

12:16 One of the challenges with things like Claude Code and friends

12:19 is you're just in the terminal.

12:20 And so can you use the mouse to select part of your text

12:24 and cut it?

12:25 No, it's the terminal, right?

12:28 And the arrow keys work right and all those kinds of things.

12:31 So it has a really nice, I'll just put this little video in the background while it's

12:36 going here.

12:36 I can get it to play and it doesn't want to play.

12:39 Well, it has a really nice input for that kind of behavior, right?

12:45 Has a nice little web server that runs.

12:49 Let's see, a few more.

12:51 It uses fuzzy search.

12:53 So something you do all the time or should do all the time.

12:56 And if you're not, you're totally missing out.

12:58 adding files a specific context to what you're working on.

13:03 So you might say, hey, I'd like to work on the login page.

13:07 And here are the notes that I, you know, use the notes that I took.

13:10 How well is it going to work?

13:12 We don't know.

13:12 You could say, I'd like to work on at login.html.

13:16 And please see at login requirements.md, right?

13:22 And that will actually pop up a select.

13:26 And so this uses really nice fuzzy search and UI dropdowns in the terminal.

13:31 It says better prompt input.

13:33 It has support for terminals within terminals.

13:37 So if you ask it to run a command that is like complex terminal output,

13:41 it will actually embed that, which is really sweet.

13:43 So people should give this a look.

13:45 I think it's pretty nice.

13:47 You know, pip install, I believe.

13:49 I'm going to check it out.

13:50 No, curl.

13:51 Curl install it.

13:52 Get it.

13:53 That's the new way, right?

13:54 UV is showing us the way.

13:55 But you can also uv tool install it if you prefer.

13:58 And that's probably the way I would install it because I already have scripts that basically manage all of my uv tools, check for updates and so on.

14:05 So check out Toad if you're doing, like, CLI Claude Code-like things.

14:11 Yeah, cool.

14:12 Yeah.

14:14 Batrachian.

14:15 Batrachian.

14:16 I don't know.

14:16 The curl is pulling it from a URL called B-A-T-R-A-C-H-I-A-N.

14:23 Yeah.

14:24 That's.

14:25 Is that him?

14:26 That's where he's got it.

14:26 Yeah, that's him.

14:27 Nice.

14:28 Cool.

14:29 You've even got your animated Zoom.

14:32 Oh, yeah.

14:33 It's got the codex as well.

14:34 I'm going to scroll.

14:35 Unified experience for your terminal.

14:36 Nice.

14:37 Nice.

14:37 Yeah, so this is, I don't know how this relates back to Toads, but it must in ways that I don't

14:43 know.

14:43 Well done.

14:44 Well done, Will.

14:46 It was the inkblot.

14:47 What do you see here?

14:48 I see a Toad.

14:49 Exactly.

14:50 Well, I want to talk about something that sort of AI related as well.

14:56 So FastAPI just made, I'm linking to a merge request, and we'll link to the actual page too.

15:04 But in the contributing to FastAPI page, there was a new change to talk about contribution instructions

15:13 about LLM generated code and comments and automated tools for PRs.

15:18 So I'm guessing that all FastAPI at least, but probably all very popular projects are having a problem with people doing PRs that they really haven't spent that much time on the PR, but they want to get their name in or something.

15:33 I don't know.

15:35 But there's some really good highlight here.

15:37 And so I'm bringing this up not just because FastAPI is awesome, and it is, but because it's just sort of an interesting thing that we're having these sorts of discussions.

15:49 And this is a nice, concise verbiage if you want to add some of this pros to your own project to say what is allowed and what's not.

15:58 And it's, so we'll link to the contributing guideline, developing, contributing, and there's an automated code and AI section.

16:06 And the gist is you're encouraged to use AI tools to do things, you know, whatever tools you have at hand to do things efficiently, but also know that there is human effort on the other end for the pull request.

16:20 And they just basically want to make sure that you're not doing less effort than they have to do to just check it.

16:28 So please put a person in the middle.

16:31 So there's things like they will automatically, they will close things that look like automated pull requests from bots and whatever.

16:41 And there is a, there's, they have a section on human effort denial of service.

16:47 You know, using automated tools and AI to submit PRs or comments that we have to carefully review and handle would be an equivalent of a denial of service attack on our human effort.

16:58 be very little effort for a person submitting the PR that generates a large amount of effort on our

17:04 side. Please don't do that. So I think this is completely fair. And it said, we will block

17:10 accounts that spam us with repeated automated PRs or comments. Use tools wisely. And a nice,

17:17 I think, what is it, a Spider-Man quote: with great power, I mean tools, comes great

17:24 responsibility. So I just think this was good on their part to throw this in of, yes, use tools,

17:32 but know that there's a human on the other side having to deal with it.

17:36 Yeah. Keep it focused, right? Don't submit a 7,000 line PR and say, I made it better.

17:44 There was probably some small little part that needed changing, so stay on target, right?

17:49 Yeah. Now that you bring that up, it is very easy now to say, oh, I want to refactor

17:54 this function to be a new, like even with, without AI, I want to refactor this function to be a new,

17:59 a new function name. And it's going to change tons of code. That's still okay to do if it's focused,

18:05 even if it hits like hundreds of files, if it's just that one thing, like separate those up so

18:10 that, so that code reviewers can go, oh yeah, you just changed that function name. That's fine. But

18:14 if you do that plus, oh, plus I, you know, formatted with black and plus I like, you know,

18:20 optimize this one function.

18:22 It's terrible to combine those together.

18:23 And, you know, let's see.

18:24 And AI, it's very common for it to just churn through that kind of stuff

18:29 and generate a bunch of changes.

18:30 The curl team actually has blocked all AI contributions, period, for the same reason.

18:36 It's like, this is out of control.

18:37 We're not doing it.

18:38 I got a weird request for a private repository.

18:43 When I was starting the pytest course, I kicked around the idea of having a GitHub group

18:50 or whatever, like, whatever those are, an organization, for testing code. I decided not to, but that

18:57 organization still has some of the private code I used for the course.

19:02 And I had somebody request me to give Claude access to my private notes. No, I'm not going to do

19:08 that. No. But anyway, I'll share one really quick, weird Claude-plus-GitHub thing while we're on the

19:14 topic, then we'll move on. I am working on this project, and I was doing something with Claude Code

19:19 on this feature. And I created a GitHub issue with a lot of notes for myself about it on GitHub,

19:24 but I didn't connect Claude code to it or anything. And then I was like, hey, Claude,

19:29 let's just brainstorm about what it would look like in this code base to add this feature.

19:32 And it takes a second and goes, I see that you've considered this, this, and this. And now that

19:38 I've got that background, I think this. I'm like, how do you know that? What it had done is,

19:43 I had the GitHub, the GH CLI installed on my computer.

19:47 And so it used GH to go explore the GitHub repository

19:50 and find the issue.

19:52 Issue 1,274 or something.

19:54 Yeah, issue 1,274 said you wanted to do it this way.

19:57 I'm like, okay, that is insane that it just got in there

20:00 and I didn't ask it to, you know what I mean?

20:03 Yeah, anyway.

20:04 Do I want GH on my computer?

20:08 I'm like, how did it figure that out?

20:09 It shouldn't have access to that information, but it's okay.

20:12 How about we move on?

20:13 Extra time.

20:14 Yeah.

20:15 Do you have any extras?

20:16 Yeah, sure.

20:17 I got a couple, and then we can flip over to you.

20:20 So first of all, remember, Henry Schreiner pointed out that you can pass --compile-bytecode as a flag to uv.

20:28 And the uv docs point out that, by default, uv does not compile .py files to bytecode (the __pycache__ .pyc files).

20:37 Instead, that is lazily done at module import.

20:41 And that really got my attention because all of the Docker deployments that we do, for example, Python Bytes runs on Docker.

20:49 So when I deploy a new version of Python Bytes.fm, what happens?

20:52 It compiles the code, installs the dependencies with uv, and then it starts, you know, there's other stuff, but effectively,

21:00 then it just starts Granian running Python Bytes, right, as a core to app.

21:06 And from the time that it actually starts starting until it's all the way started, like the website is unavailable, right?

21:12 Yeah.

21:13 Because it's not some super complicated Kubernetes thing.

21:15 It just restarts the Docker container.

21:17 And for me, for the extra simplicity of like having one to two seconds of downtime per couple times a week is totally fine.

21:25 But what I realized when Henry said that, and you talked about uv not compiling these things, is that every time I start the Docker container, literally every time, it has to compile the .pyc files for every library that it uses.

21:42 And it uses like well over 50.

21:44 There's a lot of libraries for these web apps, right?

21:46 Yeah.

21:46 And I'm like, wow.

21:47 Okay.

21:47 So there's no scenario where it will ever not have to generate those on AppStart, right?

21:53 because I don't ever shut down the Docker container and start it again.

21:56 The only time, the only reason the Docker container will ever shut down

21:59 is because it needs a new version of code or dependencies.

22:04 So it rebuilds the container from scratch using layers

22:07 and then it will bring that back up.

22:09 So I added --compile dash bytecode to all of the uv installs

22:16 because I don't care if the build time is one second slower,

22:20 if that means the actual launch time, which involves downtime, is one second faster.

22:25 Okay, so does this pre-compile thing happen then at the time you're building the Docker image?

22:33 Yes, which has nothing to do with the uptime or anything.

22:37 It's not until you say restart the Docker container with the new image that it actually shuts down the old one

22:42 and starts the new one.

22:42 So I make the build time one second slower, but the launch time one second faster, which is awesome.

22:48 And it probably makes, like, what, the Docker image a little bit bigger?

22:52 Probably, yeah. Yeah, probably. But, you know, I guess it does make the image a little

22:57 bit bigger, but I'm not shipping it to Docker Hub and back, so I don't really care. Yeah, I mean,

23:02 even if you were, it's that one second faster to do the flip-over. It's great.

23:08 I think it is, and all it is, is just include --compile-bytecode on your uv install and stuff.
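
Roughly speaking, --compile-bytecode has uv do ahead of time what CPython otherwise does lazily on first import. Here is a standard-library sketch of the same effect, not what uv literally runs, and the sysconfig lookup is just one way to locate the installed packages:

```python
# Rough illustration of ahead-of-time bytecode compilation: write the
# __pycache__/*.pyc files now, at build time, so the first import at
# container start doesn't have to do it during the downtime window.
import compileall
import sysconfig

site_packages = sysconfig.get_paths()["purelib"]   # where the installed libraries live
compileall.compile_dir(site_packages, quiet=1)     # quiet=1 suppresses per-file output
```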

23:14 Yeah, sweet. So thanks, everyone, you and Henry and whoever else. Okay, two things real quick. I talked

23:19 about the MCP server for Talk Python and how that works, but I did a nice little write-up about it

23:25 over on the Talk Python blog. So I just want to point people to that proper write-up, which has

23:30 some background on the MCP server and the llms.txt information there. And another thing, Brian,

23:37 people sent me a message and said, so let me get this right. You just recounted the story of

23:43 Tailwind CSS getting destroyed by AI and then you purposefully added AI stuff to Talk Python? Are

23:52 you insane? Well, okay, maybe, but I don't think so. So I also wrote a blog post on my personal blog

24:00 about why I think, you know, hiding from AI, blocking AI crawlers is probably not going to

24:07 serve you in the long term, and why I added these things. So I sort of gave my background on

24:12 why I thought that was worth doing.

24:13 So people are like, like Michael's kind of got

24:17 two contradicting thoughts in his mind at the same time.

24:20 Well, I don't think it's contradictory though.

24:22 I think it would be, it would be like you,

24:25 you know, doing AI, having an AI generated thing

24:30 on all of the content of Talk Python training.

24:34 Yeah.

24:35 And you weren't, you wouldn't do that because that's,

24:37 that's what you make money off of.

24:39 So.

24:39 Exactly.

24:40 This stuff is, basically I'm saying if you have content you would like in Google,

24:45 you probably want it in AI indexes as well.

24:49 If it's something you don't want publicly available on Google,

24:51 then you probably don't want it here as well, right?

24:54 But this is like assuming you want to show up in search results

24:57 for regular search engines, you probably do with these.

25:00 And also just really quick while we're on it, like putting in the MCP sort of stuff here means it turns questions like,

25:07 hey, what guests were on the show?

25:09 Or what was this episode about?

25:10 or maybe I could find the transcripts for a page or whatever.

25:13 It turns that from scraping my website and hitting tons of code

25:18 to database index single queries for a small fragment of text.

25:23 So if the AI is going to ask questions anyway, this is way less harm on my server, if you will.

25:30 Well, hopefully they do that instead of doing both, though.

25:33 Yeah, I know.

25:34 The other thing is, so is this a process now?

25:38 Do you have to keep this updated on a weekly basis or?

25:41 No, it's all just driven by the database.

25:43 Okay.

25:44 So, yeah.

25:44 And if you look at the, the, my personal post version,

25:47 you can actually see some pictures of like how Claude.ai, the chatbot is using.

25:53 So I'll ask it like, what are the last five episodes?

25:56 You can see it's actually calling the API endpoint, getting recent episodes.

26:00 And then, or if you ask it more, it'll like search.

26:03 And then based on the search results, it'll get the details and so on.

26:06 Yeah.

26:07 One of my next projects is to, or in the near future, is to try to build one of these for internal stuff

26:13 so the internal tools can see internal APIs.

26:16 Yeah, it's very neat, and it's not that hard.

26:20 Also, you don't need a whole framework like Fast MCP or one of these things.

26:25 It's just a couple of simple web requests.

26:29 It's not like implementing your own web sockets or something.

26:32 You can just add it to any website.

26:34 It's super easy.
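
The "couple of simple web requests" point boils down to the assistant calling small JSON endpoints instead of scraping whole pages. A hypothetical sketch of what such a tool call amounts to; the URL and response shape are invented for illustration, not Talk Python's real endpoints:

```python
# Hypothetical sketch: a "what are the recent episodes?" tool call is just a
# small JSON request against the site's API, not a crawl of the HTML pages.
# BASE and the response fields are stand-ins, made up for illustration.
import requests

BASE = "https://example.com/api"

def recent_episodes(count=5):
    resp = requests.get(f"{BASE}/episodes/recent", params={"count": count}, timeout=10)
    resp.raise_for_status()
    return resp.json()  # e.g. a list of {"number": ..., "title": ...} records

for ep in recent_episodes():
    print(ep["number"], ep["title"])
```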

26:35 Cool. All right.

26:37 Any other extras? Nope, over to you. Okay, let's see, what do I got? I got a few. Do you remember

26:44 Digg? I do. Digg is back, and I'm linking to a TechCrunch article. There's a bunch

26:53 of articles on it, but I did a quick skim. Basically, they're trying to

27:01 be what Digg used to be, but not sucky. And also, some interesting people,

27:09 Kevin Rose, and one of the co-founders from Reddit, are putting it together.

27:15 Interesting. Yeah, anyway, I'll be interested to watch to see if it becomes

27:22 something that we should care about. But anyway, interesting article. Also, there's a Python

27:28 community in there, just started. There's 99 members. Is that 99 members of Python in the world?

27:34 No, there's more, but that's on the Digg community. And it was put together by somebody,

27:40 actually the same person that sent it to us and said, hey. So this Wigging person

27:46 sent it to us, which is great, and also talked about us, the Python Bytes podcast. Yay. So yeah, thank you.

27:52 Another interesting article I ran across was from, I think, I'm sorry, Marieke: why lightweight websites may one day save your life. But it's basically just an ode to lightweight websites.

28:06 And this is a lot to do with if the intent is for people to be able to use your website, even if they're on a cell phone, even if they have like a bad connection, then lightweight websites are a must.

28:17 That's just basically it.

28:20 Think about your target audience.

28:21 If your target audience is on the move, make it light and fast.

28:25 Obvious.

28:26 Last, a shout-out to Wiz for an article called How to Parameterize Exception Testing in pytest.

28:37 And I'm glad people are still writing articles about pytest.

28:43 And I'm going to let it slide that he did the capitalization wrong.

28:49 No capitals in pytest.

28:50 Anyway.

28:51 It's fine.

28:51 You can fix it.

28:52 If you just right-click on that and say inspect, you can fix this website.

28:54 You didn't know if you could edit other people's sites, but.

28:56 You just pay.

28:58 Really?

28:58 It doesn't last right now.

28:59 Well, for you, but then if you reload or anyone else, it doesn't change.

29:03 The trick here, which is cool, is he's parameterizing whether or not something raises an exception.

29:10 And he's using a thing from contextlib called nullcontext, importing it and changing its name to does_not_raise.

29:19 And this is brilliant because this is very clear code to say, yeah, these cases, that shouldn't raise an exception.

29:27 Other cases should be a zero division error or a type error or whatever.

29:31 Obviously, you're not going to write a test for test division exceptions.

29:35 But to test your own code to make sure that it's raising the right exceptions.

29:39 Yeah.

29:40 And throwing in a couple of cases for when it doesn't raise this,

29:43 this is good, clean code and short article.

29:45 So I like it.

29:46 It's a very clean code.

29:47 I would have not seen that coming.

29:49 That's pretty good.

29:50 Yeah.

29:50 It's very, very good.
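
The pattern from the article looks roughly like this; the divide function and the specific parameter values are illustrative rather than copied from the post:

```python
# The trick: parametrize the *context manager* itself, so one test covers both
# the cases that should raise and the happy path that should not.
from contextlib import nullcontext as does_not_raise

import pytest

def divide(a, b):
    return a / b

@pytest.mark.parametrize(
    "a, b, expectation",
    [
        (6, 2, does_not_raise()),                  # happy path: no exception expected
        (6, 0, pytest.raises(ZeroDivisionError)),  # error case: must raise
        (6, "x", pytest.raises(TypeError)),        # wrong type: must raise
    ],
)
def test_divide(a, b, expectation):
    with expectation:
        divide(a, b)
```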

29:52 Well, that's it.

29:53 Do we have something funny?

29:54 This last thing here, this joke is really here basically for you,

30:00 Brian.

30:01 This was sent in by Pat Decker, I believe.

30:03 And I think it's good.

30:04 I think you'll appreciate this coming from a testing perspective.

30:07 Okay.

30:08 Okay.

30:08 So this was on Reddit.

30:10 And the idea here is two developers talking or maybe project manager.

30:15 I don't know.

30:15 It says, your new date picker widget has crashed.

30:18 Really?

30:19 That's impossible.

30:19 I've tested it with negative numbers, special characters, null.

30:23 What have you put in it?

30:25 A date?

30:27 Yes.

30:27 Oops.

30:29 Forgot the base case.

30:30 So busy testing all the edge cases and the error conditions that I forgot to see if it even worked.

30:35 Yeah.

30:37 Yeah.

30:37 That's a weird thing.

30:38 I think that so many people think that testing is just about the complicated edge cases,

30:44 but you've got to get the happy path first and make sure those work also.

30:49 Yeah, absolutely.

30:50 That's funny.

30:51 Pretty good one.

30:53 You've heard that tester walks into the bar, right?

30:55 I think so, but give me your variant.

30:58 a bad memory, but: a tester walks into a bar and orders

31:08 half a beer, and one and a half beers, and a negative beer, and a million beers, to see

31:14 what happens, and everything works fine. I thought that bar caught on fire. An

31:20 actual customer comes into the bar, orders one beer, and the bar catches on fire. There you go. There

31:25 you go. Like that. It's beautiful. It's a beautiful thing. It's like a fable that tells

31:33 the moral story of testing. Yeah. And then my dad joke version, of course: a guy walks into a bar

31:39 and says, ow. All right, it does hurt. A different kind of bar. Sorry, spoiled your joke, but.

31:47 All right. That's another fun episode. Talk to you guys next week. Bye, y'all.


Want to go deeper? Check our projects