Brought to you by Michael and Brian - take a Talk Python course or get Brian's pytest book


Transcript #353: Hatching Another Episode

Recorded on Tuesday, Sep 19, 2023.

00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.

00:05 This is episode 353, recorded September 18th, 2023. I am Brian Okken.

00:11 And I am Michael Kennedy.

00:12 This episode is brought to you by us. So please check out Talk Python Training and the new Python

00:20 Testing with pytest course that's in progress, but already awesome. And also thank you to our

00:24 supporters. And if you want to connect to any of us, you've got Python Bytes and Brian Okken and

00:31 M Kennedy all on Fosstodon and the links are in the show notes. So let's kick it off with our first

00:37 AI topic, Michael. Let's ask the AI what our first topic should be. Let's go to Stack Overflow and ask

00:44 them what our first AI topic should be. No, seriously. So Stack Overflow has announced Overflow AI,

00:51 which is not, I'm sure they have, you know, they've got Stack Exchange with all sorts of different

00:55 things like stuff for Linux, stuff for Mac, DevOps, you know, you name it. Yeah. But this

01:02 is not another AI-focused Stack Exchange type thing in that sense. This is adding generative AI to Stack Overflow.

01:09 All right. Okay. So the best way to kind of get a sense of what's going on here is there's a little

01:15 video, a three minute video that seems like it could be about 45 seconds given how much content is in it.

01:20 But you know, spend the three minutes with it and you'll get the idea. So the most relevant portion,

01:25 there's some stuff that we don't care about here, but the most relevant portion is when you go to

01:30 Stack Overflow, you search for an answer. It gives you a list of results, right? Those results are based on

01:36 traditional search, right? You know, keyword matches. And honestly, it does surprisingly well,

01:42 when you go search there. But they're changing their search, and maybe it's out now. I have a hard

01:49 time telling if it's out now or is a preview you can sign up for, but you go to Stack Overflow and you

01:55 search for something now you get like a ChatGPT like response if you want. And it'll say, here are

02:01 some of the possible answers, but here's also just a, you know, like if you search Google or a Kagi, you

02:07 might get a little snippet that just shows you the answer from Stack Overflow without taking it there.

02:12 It's kind of like that. It'll say, and here's a little bit of an answer. And then you can actually

02:16 start a conversation about it. Like you would with ChatGPT, like that's cool, but it doesn't quite

02:20 apply. How would you, you know, I don't, I'm not using Selenium for this. I'm using Playwright. Could you

02:26 give me an example of the same process, but with Playwright, you know, it'll try to give you the

02:32 answer. Right. So that's all pretty cool. And they're integrating generative AI into the public

02:37 platform into Stack Overflow for teams. I didn't know Stack Overflow for teams existed, but if I had a

02:43 team, it would look really cool to use, but not, not in a huge corporation, not a ton of people

02:48 working on stuff. So they're also planning to add, at the moment, VS Code integration for Stack Overflow

02:56 and the Stack Overflow AI. So you can just, like, you know, you've got your source control section,

03:01 your file section. It's just got a Stack Overflow section where you can start talking to it and get

03:06 generative AI answers out of Stack Overflow. So I find there's a little bit of irony here, Brian.

03:12 In that Stack Overflow was banning answers from ChatGPT. Now they're adding a feature that lets you

03:20 have a conversation and query it using generative AI, which seems normal. However, one of the features

03:27 they highlight in the video is you can say, none of this applies to me. AI helped me draft a question.

03:32 So the answers can't be generative AI, but the questions can be generative AI.

03:37 Interesting.

03:38 From the right source of AI, right? The right origin, right? OverflowAI versus ChatGPT. So this is in private alpha,

03:48 I guess, is the current form. But I went there, the way I discovered it is I went to search for

03:54 something on Stack Overflow. It said, do you want to try out the new Overflow AI search? I'm like, yes.

03:58 What is this? This looks very interesting. And in my profile under settings, I have the ability to

04:03 have that turned on. It is turned on, but when I go to do search stuff, it doesn't seem to use it.

04:07 So I think it's like fading in and out as they're making changes to it. But yeah, people can check

04:11 that out. And then finally, VS Code extension looks cool. Where's our PyCharm one? Come on,

04:16 bring it on, bring it on. I want this in PyCharm too. PyCharm also added its own little AI

04:21 assistant, but I'm, I'm waiting for it to get better. Yeah, you can. Yeah, it's pretty cool.

04:26 But it's, it's not something I'm totally embracing yet. I'd like it, when it rolls

04:32 out completely, to be able to know, did this come from a person or from an AI?

04:38 Or was it some mix of the two or something? I don't know. Right. Kind of like with, and I,

04:45 and I kind of agree with them for not allowing people to just throw the question

04:54 into ChatGPT and paste the answer into Stack Overflow. That's, that's not, that's not what

04:58 people are expecting. So. Right. And if you want that, just put your question straight in ChatGPT,

05:03 like leave out the Stack Overflow middleman aspect. I'm sure the real concern was, you know, there's a lot

05:12 of reputation stuff. There's a lot of like, this person is really good at answering questions and

05:16 here's how much you've contributed. And if you just, you know, you could easily game that with AI

05:21 results and I'm sure it is being gamed with AI results just with a little more work, but you know,

05:28 such is the world. Yeah. The times, they are a-changing. So they are, that's what ChatGPT

05:35 told me anyway. How about you? What's your next one? Well, I, I was going to talk about

05:40 switching to hatch. So, for packaging for Python packaging and also setting up virtual

05:46 environments and stuff, I guess. So hatch, okay, so for packaging, we had setuptools

05:52 and then we had lots of stuff. We had like flit and hatchling and poetry and all sorts of things.

05:59 So, hatch, there's two parts to hatch. There's hatchling, which is the build backend that helps

06:06 you build packages and stuff. And then there's hatch, which

06:12 has hatchling as part of it, but anyway, let's go through this. There's an article from Oliver

06:17 Andrich talking about switching to hatch. And great graphic, by the way, there's a graphic

06:23 that says, I don't know where I'm going from here, but I promise it won't be boring. That's

06:28 pretty cool. anyway, he was using poetry, looked at PDM and now he's thinking about hatch.

06:36 It's, possibly his new love. We don't know. It's a question mark. So, there's, there's a

06:41 sentence here that bothers me and we can get back to it, but it says some prominent projects are

06:46 enjoying using hatch. I don't know if that's true, but let's go on. He does show some

06:52 really cool things with, with hatch, which I, I didn't know it could do. For instance, when you

06:57 say hatch new project, it creates a new pyproject.toml file. One of the things it does in that is it

07:03 creates these entries for environments, like virtual environments. And they're

07:09 separate ones for things like there's a default one. And then there's a test one that has like your

07:15 pytest and plugins and stuff. Then there's linting that has like pyright and black and ruff and things

07:22 in it. His question really was, why wouldn't I have just a dev

07:27 virtual environment? But we're using a lot of extra tools now, and they might have dependencies that clash.

07:33 So if you do have dependencies that clash, maybe a separate virtual environment for each type of tool

07:39 chain might be interesting. You are speaking my language now. I didn't really think about that,

07:44 but there's about a 50, 50 chance. If I say update all the dependencies for Talk Python Training, which I

07:52 think is about 50, there's the runtime ones. And then there's the reporting dev ones. And that includes

07:57 some things like Jupyter notebooks for graphs and stuff. Yeah. There's about a 50, 50 chance that I'll

08:03 get an error saying cannot find any solution to these requirements that you're asking for because something in the

08:09 data science stack forces something to be less than or equal to, and something in the other stack is greater than or equal to.

08:16 So they're not intersecting, and you're like, well, yeah, it still runs fine. And I'm sure it'll be okay. But it's like, I'm constantly

08:26 hitting this. So a dev environment and a dev tools environment and a runtime environment is actually a pretty, pretty cool idea.

08:44 You can also do scripts. So each environment can have its own set of scripts. So you could have linting scripts and testing scripts and docs scripts and things like that. Like, for instance, one of the linting ones, you could say, run black, and it has all of your black arguments or something.

08:57 You can have that. That's pretty, it's kind of neat. No need to use tox. This surprised me.

09:15 Apparently, there's a matrix ability for hatch. So you can do a test matrix, test matrices, within hatch. I don't know if you can do others; why would you do a docs matrix, though?

09:28 You know, so test matrix makes sense. I don't know if it makes sense anywhere else. But that's kind of neat. And then the last bit I thought was pretty cool was, well, the way it has scripts, but I don't know if it's very convenient.

09:41 So like, say you have a cov script for coverage within your test environment, you would run it by saying hatch run test:cov. It's kind of a mouthful still. So I'm not sure.

09:53 It is. Yeah.
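
For reference, the per-tool environments and scripts being described might look roughly like this in pyproject.toml. This is a sketch based on the hatch documentation as I remember it, not taken from the article, so double-check the exact table names; the TOML is embedded in a Python string and parsed with the standard library just to show the shape.

    # A sketch of hatch-style environments and scripts (Python 3.11+ for tomllib).
    # The [tool.hatch.envs.*] tables, "dependencies", and "scripts" keys follow the
    # hatch docs from memory; the dependency lists themselves are illustrative.
    import tomllib

    PYPROJECT_SNIPPET = """
    [tool.hatch.envs.test]
    dependencies = ["pytest", "pytest-cov"]

    [tool.hatch.envs.test.scripts]
    cov = "pytest --cov mypackage"

    [tool.hatch.envs.lint]
    dependencies = ["black", "ruff"]

    [tool.hatch.envs.lint.scripts]
    style = "black --check ."
    """

    config = tomllib.loads(PYPROJECT_SNIPPET)
    for name, env in config["tool"]["hatch"]["envs"].items():
        print(name, env.get("dependencies", []), env.get("scripts", {}))

With a layout like this, hatch run test:cov is the "test colon cov" invocation mentioned above: run the cov script inside the test environment.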

09:54 Optional dependencies are kind of neat. Most tools have optional dependencies, but apparently hatch has kind of a neat way to do it. So let's say, in the example, you had two different optional dependencies for MySQL or Postgres, depending on what

10:10 database you want to install. A default could be just saying MySQL and it would pull in all of the requirements for MySQL.

10:17 So there's an easy way to do sort of a transitive default set. So it's kind of cool. Anyway, nice. The thing I wanted to come back to was the comment of, like, everybody's using it. I don't think everybody's using it. If you look at the hatch website, it does say all these different projects are using hatch. But some of them, it's obvious they're using hatchling. Like most of my projects use hatchling. That's not the same as using hatch.

10:45 Yeah, it could just be the build backend. Yeah, just for building the wheels. Right?

10:48 Exactly. That build backend is rock solid and I recommend it. Hatch, the top-level tool, is sort of a workflow tool, similar to Poetry. But still cool. I kind of like that. Poetry, at least the last time I tried it, sort of had you take all of the tools with it. But hatch is more of a use-the-tools-you-want sort of tool. So anyway. Yeah.
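
On the optional dependencies point from a moment ago, the standard pyproject.toml mechanism looks roughly like this; the package name, the extras, and the self-referencing "databases" extra are all made-up examples of the kind of transitive default set being described, not something from the article.

    # Illustrative PEP 621 optional dependencies; "myapp" and the extra names are
    # hypothetical. Python 3.11+ for tomllib.
    import tomllib

    EXTRAS_SNIPPET = """
    [project]
    name = "myapp"
    version = "0.1.0"

    [project.optional-dependencies]
    mysql = ["mysqlclient"]
    postgres = ["psycopg"]
    # An extra that pulls in other extras of the same package; recent installers
    # support this self-referencing form.
    databases = ["myapp[mysql]", "myapp[postgres]"]
    """

    extras = tomllib.loads(EXTRAS_SNIPPET)["project"]["optional-dependencies"]
    print(extras["databases"])  # ['myapp[mysql]', 'myapp[postgres]']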

11:10 Kind of cool. So nice. I just checked while we were talking. You did. What did I say 48 or something for the number of dependencies? Yeah, it's insane. I don't know how I got to where 232 packages to run Talk Python Training with the reporting. I can't easily separate the dev versus runtime.

11:28 Yeah, that'd be like more work. But yes, that's a lot of packages. And it's like, are you sure you're not using, like, Go or something else that uses tons of dependencies? JavaScript? Yeah, yeah, yeah. I'm not saying it's a problem. It's just, with 232, there's a reasonable chance that a good bunch of those come from the non-runtime stuff. There's a good chance that there's a clash between them. So this idea of multiple virtual environments. Yeah. Yeah. Yeah.

11:55 I'm assuming most of those are transitive. You're using some tool that's using some other tool. Yeah, exactly. Exactly. Cool. Okay, on to the next. Hold on. Comment from Mark in the audience says, yeah, I feel like hatchling but not hatch is a fairly common pattern. It was mentioned on Talk Python To Me. Yeah. Awesome. Speaking of, speaking of, I just want to point out that episode 408 earlier this year,

12:24 I had Ofek on, the creator of hatch, to talk about hatch and its benefits and all those different things. So cool.

12:32 People can check that out if they want, but let's talk formatting code. Formatting code equals Black. Yes.

12:38 Yeah. And others. And also Ruff. However, Ruff checks your code for correctness. And we've, we've discussed how fast Ruff is, right? To the point where it's like, hmm.

12:50 Did I actually check the code? Did I enter the wrong directory and it just found no contents, right? That kind of thing. Right.

12:57 Well, this was sent over to us from Sky. So thank you, Sky, for sending it in. Charlie Marsh, creator of Ruff, has announced the Ruff formatter.

13:08 So not just telling you what's wrong and checking for errors, but formatting your code based on convention, similar, but not identical, to

13:17 Black. So that's pretty cool. Let me read a few things that Sky sent over here, because I think their experience is worthwhile.

13:24 So Charlie says, first of all, the formatter is designed to be a drop in replacement for black, but with an excessive focus on performance and direct integration with rough.

13:34 That's pretty cool. Right.

13:35 Right. So Sky says, I can't find any benchmarks that have been released yet, but I did some extremely unscientific testing, caveat there, and found the Ruff formatter to be five to 10 times faster than Black when running on already formatted code or on a small code base.

13:49 So five to 10 times faster, but 75 times faster when running on a large code base of unformatted code.

13:55 However, they point out that the second outcome is not that relevant because how many times do you format huge projects that are not formatted?

14:03 No, normally it's incremental, right?

14:05 So the smaller bits is maybe worth paying attention to more there.

14:10 Yeah.

14:11 So I almost missed this announcement because Ruff already had some go ahead and fix it

14:16 if you can features.

14:18 Yeah.

14:18 It did have a few fix-it things.

14:19 Yeah, exactly.

14:20 And like I said, I think that was about a violation rather than a convention, right?

14:26 It's not a violation necessarily to say I write generally in single quotes, or I write in double quotes for all my strings.

14:33 Or I might sometimes have a single quote or sometimes have a double quote where like you might have a coding convention that says all of our strings are the same.

14:41 They all use single quotes or they all use double quotes.

14:43 And I don't, there's no reason for like in the same function to have two kinds of strings unless, unless you're in that situation where like I'm saying it's possible.

14:53 So I don't want to use a single quote, so I don't have to escape the apostrophe. I'll use a double quote for that one.

14:58 Right.

14:59 But other than those like sort of weird cases, you shouldn't mix.

15:03 Right.

15:03 I think that's more the point of what the Ruff formatter is addressing, and Black as well.

15:07 Yeah.
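
To make the quote example concrete, here is a tiny before-and-after sketch of what a consistent double-quote convention implies; the escaping exception is how I understand Black's documented behavior, so treat the "after" as illustrative rather than the formatter's exact output.

    # Before formatting: quote styles are mixed for no particular reason.
    name = 'world'
    greeting = "hello"
    contraction = 'it\'s fine'   # single quotes force escaping the apostrophe

    # After a formatter that prefers double quotes (illustrative result):
    name = "world"
    greeting = "hello"
    contraction = "it's fine"    # double quotes make the escape unnecessary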

15:08 Well, Ruff has a few differences from Black as well.

15:13 Yes.

15:14 And they call out that the formatter is intended to emit near-identical output when run over Black-formatted code.

15:20 This is interesting.

15:21 When run over extensively Black-formatted projects like Django and Zulip, it was 99.9% the same.

15:28 However, it says somewhere, when run over non-Black-formatted code, it might make different decisions than Black has made.

15:36 Yeah.

15:37 And I kind of like some of the decisions that they're making.

15:40 Just.

15:40 I do too.

15:41 I do actually like them quite a bit.

15:43 So I'm, I'm a fan of some of these things.

15:46 It doesn't have as many features yet as Black does in terms of, like, controlling certain things, you know, but they're working on it.

15:55 I was talking to Charlie just an hour ago. By talking,

15:58 I just mean submitting a GitHub issue.

16:00 And quickly we were having a back and forth there.

16:02 So that's awesome.

16:03 Yeah.

16:04 I like, so it talks about,

16:05 for instance, the line endings. I think they have a cool way to deal with it.

16:09 Yeah.

16:10 So.

16:10 The line feed versus carriage return line feed, \n versus \r\n.

16:16 That's a windows versus non windows.

16:18 Yeah.

16:19 Challenge.

16:20 Right.

16:20 And so I guess if you're on Windows, you don't want it to keep like unraveling that for you.

16:25 I suppose.

16:25 Yeah.

16:26 Actually, one of the things I thought I read was about comments at the end

16:31 of the line.

16:32 Black would often put the comment somewhere completely different, so your comment might not match up with

16:39 what you actually commented against.

16:41 Yeah.

16:42 Yeah.

16:42 Yeah.

16:42 Ruff is trying to be a little bit better about that.

16:44 So.

16:45 Yeah.

16:45 The other area where this is supposed to be intentionally different, it says frequently Black

16:51 will suggest... the same or different.

16:55 Let's see.

16:56 There's some places where it's specifically different versus Black.

17:01 It talks about, I don't know.

17:03 I haven't read it all.

17:04 I don't want to misquote it here, but it talks about, there's a whole bunch

17:08 of sections on the variations and so on.

17:11 Yeah.

17:12 Anyway, I'm, I'm excited to try it.

17:14 And, and I think it's cool that this is happening in this space.

17:18 Indeed, Jeff out there asks, is there also a Ruff daemon like Black has?

17:24 I had no idea about the Black daemon and I definitely don't know about a Ruff daemon.

17:27 But what I use is, I just have the IDE integration for Ruff.

17:35 So it will automatically be running in the background, right?

17:39 You get that for PyCharm and VS Code.

17:41 I, I imagine it makes sense to have it run and just constantly checking.

17:46 So I don't know, not, not sure.

17:48 Right.

17:49 This is still an alpha.

17:50 So probably not.

17:51 So, and also good to note, this isn't a separate thing.

17:55 Well, it is kind of a separate tool, but it's part of Ruff.

17:58 So, if you say like ruff format, yes, exactly.

18:01 So yeah.

18:03 Ruff format.

18:04 And I actually was thinking, I can't find my little example that I was running earlier,

18:10 but there's a couple of, yeah.

18:11 So you can do things like, you can say like ruff format --line-length,

18:16 --respect-gitignore.

18:19 So it'll say, if it's ignored in git, don't format it, please.

18:24 Yeah.

18:25 You know, things like packages that are installed in a virtual environment.

18:28 Don't go messing with stuff that like, I don't care about.

18:31 Right.

18:31 If it's ignored in git, it probably is only going to cause me trouble by messing with it.

18:38 so just leave it alone.

18:39 Right.

18:39 It's kind of the way I see it.

18:41 I never considered that.

18:43 Does black change like virtual environments?

18:46 this is a question.

18:48 I don't know.

18:48 Okay.

18:48 Anyway, probably, probably not, but I don't want it to.

18:51 It does.

18:52 Yeah.

18:52 Anyway.

18:52 Cool.

18:53 That, that, the ruff format, ruff space format.

18:55 Very cool.

18:56 well, we're kind of going into the, like the, the inside baseball on this episode,

19:03 but that's all right.

19:03 Next up, we've got a suggestion from Will McGugan.

19:08 thought this might be good.

19:10 A good one to discuss on the show.

19:12 what's wrong with Toml?

19:13 I'm like, I don't think anything's wrong with Toml.

19:17 So let's take a look.

19:18 So there's an article, I forget.

19:22 So it's hard to find, but Colm O'Connor, cool name.

19:25 what's wrong with Toml?

19:27 And the gist of this really is Toml is, is great for smallish things.

19:33 And even considering pyproject.toml is smallish, but interesting quote from, apparently from

19:41 Martin Vejnár, author of pytoml, said TOML is a bad file format.

19:47 It looks good at first glance and for really, really trivial things.

19:51 It's probably good.

19:52 But once I started using it and the configuration schema became more complex, I found the syntax

19:57 ugly and hard to read.

19:58 Not sure what he was doing with it, but anyway.

20:01 So there, apparently there's some, some funkiness with big things.

20:06 And I'm like, well, what is big things?

20:08 And what are they comparing it to?

20:09 One of the comparisons is against a thing called strict YAML, which I didn't know what that

20:14 was.

20:15 And strict YAML is YAML compliant.

20:18 It's the YAML that won't let you go out at night.

20:20 Your curfew is like 9:30.

20:22 It's, it's really, it's oppressive.

20:24 apparently it's YAML with, some of the features taken away.

20:28 So I'm not, it's not a standard yet, but apparently it's in the process.

20:34 So like, what are they doing with it?

20:35 That Toml is a problem.

20:37 And, it's around strict YAML also is built for, what is it built for?

20:43 readable story tests.

20:46 I'm like, what's a readable story test?

20:48 I want to see.

20:48 Here's some examples.

20:50 We've got, this is a, strict YAML readable story.

20:55 mappings with defined keys.

20:58 Anyway, this is sort of readable, but there's a lot of keywords in here that, I'm not

21:03 going to say this is readable.

21:04 I don't think this is that great.

21:06 Now compared to the, Toml version.

21:10 Yeah, this is weird.

21:11 You've got like these weird brackets with lots of, I don't think this is necessarily a problem.

21:16 My take on it is I wouldn't use either of these for this purpose.

21:20 Exactly.

21:21 Like I would use Python, probably to describe stories, but anyway, I don't want to bash

21:28 on him.

21:28 I guess something to think about: if you're using TOML for really large things,

21:34 maybe it's a problem.

21:35 I'd be curious to know, well, if, well, if you're listening, what do you think?

21:40 Is it, are you using some wacky large Toml files that are becoming a problem?

21:44 I don't know.

21:45 So, I just wanted to throw that out there.
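
To ground the "fine for small things, awkward for big things" complaint, here is a small sketch using the standard library TOML parser; the config is invented, and the point is just that each extra level of nesting becomes another dotted table header.

    # Python 3.11+: tomllib is the standard library's read-only TOML parser.
    import tomllib

    # A hypothetical config: nested data turns into repeated dotted headers,
    # which is where larger TOML files start to feel verbose.
    DOC = """
    [service.web]
    port = 8000

    [[service.web.routes]]
    path = "/"
    handler = "home"

    [[service.web.routes]]
    path = "/about"
    handler = "about"
    """

    config = tomllib.loads(DOC)
    for route in config["service"]["web"]["routes"]:
        print(route["path"], "->", route["handler"])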

21:49 Awesome.

21:50 Got any extras to throw out there?

21:51 my extras are totally self-serving.

21:55 but, like I said at the beginning of the episode.

22:00 Yeah.

22:00 So last, last week I announced, I think, that I had part one of

22:05 my pytest course all buttoned down and ready.

22:08 And that was kind of a problem because my intro video was like, hey, I'm starting

22:14 this course, but I've already started it.

22:16 So what I did is, I did a few things.

22:18 I redid the video, the intro video to just sort of describe where, where this course fits

22:23 in with everything else.

22:24 and so it's like a few minutes to check it out.

22:27 The other thing I did was, I had some feedback from people that said Teachable is sort of

22:33 easy to use, but some people might not understand it.

22:36 Maybe you should do a little intro video.

22:37 So I did a, a little intro video for how to use, how to use teachable.

22:42 and it's a few minutes also.

22:45 And one of the things I like about this is I learned some things that I didn't know.

22:48 So you can, my favorite is you can adjust the speed so you can listen to me, at like

22:53 1.25 or 1.5 speed and it'll go faster.

22:56 The other thing is you can add notes.

22:59 You can add notes for different video places, and then when you

23:04 click on it... so I even demo it in the explainer video.

23:07 You can grab a note.

23:09 Hi.

23:10 and then later you can go back and click on the note and it takes you back to

23:14 that part of the video.

23:15 So if you want to keep some notes, that's kind of neat.

23:18 So that's, that's really all my extras, that's what's going on.

23:21 Yeah.

23:21 That's a very useful feature.

23:22 Excellent.

23:23 Excellent.

23:24 Look at you, making progress on your course. Nice to see.

23:26 Nice to see.

23:27 Thanks.

23:27 I also have a course announcement.

23:30 So Christopher Trudeau and I teamed up to create a Django version of the HTMX course at

23:38 Talk Python.

23:39 Now we have HTMX + Django: Modern Python Web Apps, Hold the JavaScript.

23:43 So this is a two hour course that shows Django developers, how to work with HTMX, how to

23:49 build up like a pretty, pretty realistic, pretty complicated, but not overly complicated, but

23:55 you know, not toy type of application that they get to build throughout the course.

23:59 So check that out.

24:00 It's still on the early bird special.

24:02 So if people get to it by the 23rd, September 23rd, got a few more days to

24:08 save 10%.

24:09 If you're thinking about getting it might as well do that now.

24:11 And it has the sister flask course.

24:14 If you don't Django, but you flask.

24:15 So those are our two sides of the same coin there.

24:19 And HTMX is just awesome.

24:20 So check that out.

24:21 If people are interested.
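
For anyone curious what the hold-the-JavaScript pattern looks like, here is a minimal sketch using Flask, since the sister Flask course is mentioned above; the routes and markup are invented for illustration, not taken from either course.

    # Minimal HTMX-style partial update with Flask; names and markup are illustrative.
    from flask import Flask

    app = Flask(__name__)

    PAGE = """
    <html>
      <head><script src="https://unpkg.com/htmx.org"></script></head>
      <body>
        <button hx-get="/fragment" hx-target="#out">Load more</button>
        <div id="out"></div>
      </body>
    </html>
    """

    @app.route("/")
    def index():
        # Full page on the first request.
        return PAGE

    @app.route("/fragment")
    def fragment():
        # htmx swaps this server-rendered fragment into #out, no custom JavaScript.
        return "<p>Rendered on the server, swapped in by htmx.</p>"

    if __name__ == "__main__":
        app.run(debug=True)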

24:22 Looking forward to that.

24:23 Yeah.

24:24 Thanks.

24:24 And then if you happen to be coding in Rust, JetBrains just released a new IDE called Rust

24:32 Rover.

24:32 A funny name.

24:35 I'm not really sure the origin of it, but it's based on the same foundation as PyCharm.

24:40 So if you're already using PyCharm or, you know, something like it, WebStorm, whatever,

24:44 and you have the muscle memory for those hotkeys and basically the way it looks and feels, but

24:48 you also want to do Rust.

24:49 Rust Rover, Rust Rover, come on over.

24:52 You know, let's do it.

24:53 Nice.

24:54 I haven't tried this out.

24:54 I don't do any Rust.

24:55 So people who do Rust can check it out and let us know what they think.

24:59 Nice.

25:00 I use the C++ version also.

25:02 The sea lion?

25:04 Yeah.

25:04 It's sort of a funny name for it.

25:06 It is.

25:07 They got good names.

25:08 Also, Skylar Costco, who is the Sky who sent in the Ruff formatter item, says,

25:15 it looks like --respect-gitignore is the default behavior in the Ruff formatter.

25:20 You should only need to pass this flag

25:23 if you do want to format .gitignored files, via --no-respect-gitignore.

25:27 No respect for the .gitignore.

25:30 Got no respect around here.

25:32 Yeah.

25:33 Some Rodney Dangerfield programming right there.

25:35 You dissing the .gitignore.

25:38 Come on now.

25:39 New Talk Python episode I just released like right before we jumped on here.

25:43 Delightful Machine Learning Apps with Gradio.

25:46 If you want to take a machine learning model you've created and put it into an interactive UI

25:52 on the web that you can share.

25:53 Super easy.

25:54 Check this out.

25:55 Open source.

25:55 Very cool.

25:56 They even have some hosting options.

25:57 Nice.

25:58 Both free and paid.
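
As a sense of how little code Gradio needs, a hello-world style app looks roughly like this; the classify function is a stand-in for a real model, so the whole thing is illustrative.

    # Minimal Gradio app: gr.Interface wires a Python function to a shareable web UI.
    import gradio as gr

    def classify(text: str) -> str:
        # Stand-in for a real machine learning model's prediction.
        return "positive" if "good" in text.lower() else "negative"

    demo = gr.Interface(fn=classify, inputs="text", outputs="text")

    if __name__ == "__main__":
        demo.launch()  # serves a local web UI you can open in the browser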

25:59 All right.

26:00 So that is it.

26:01 Oh, I guess one more piece of follow up.

26:03 Since you asked, Will McGugan says, re TOML:

26:08 mostly more hypothetical issues.

26:10 I think there were some good points, but haven't faced them yet.

26:13 Okay.

26:13 Thanks, Will.

26:14 I figured you were listening.

26:15 Yeah.

26:16 Brian, are you willing to face a joke?

26:20 Well, yeah, I think.

26:21 Yeah.

26:21 Okay.

26:22 So sometimes, sometimes.

26:23 I'm worried now.

26:24 Well, I'm sure in school you probably studied like the five stages of grief or something like

26:31 that.

26:31 Okay.

26:31 Yeah.

26:32 You know, that's not something you really want.

26:33 Why?

26:34 Well, there's not open the image.

26:35 I can't do it.

26:35 Anyway, maybe it's, it's just best.

26:38 No, it's not.

26:38 But, I've got to try to read, it's very small here.

26:40 Hold on.

26:41 So this is the five stages of debugging.

26:44 Okay.

26:44 Let me find a way to make this bigger.

26:46 So I agree.

26:47 There we go.

26:48 So the five stages.

26:49 Okay.

26:49 Number one, denial.

26:51 This stage is often characterized by phrases such as what?

26:57 That's impossible.

26:58 Or I know this is right.

26:59 And a strong sign of denial is recompiling without changing any code.

27:04 Just in case.

27:05 Funny.

27:06 All right.

27:07 Stage two, bargaining and self-blame.

27:10 Several programming errors are uncovered and the programmer feels stupid and guilty of

27:14 having made them.

27:15 Bargaining is common.

27:16 If I fix this, will you please compile?

27:18 Also, I have only 14 errors to go.

27:20 Stage three is anger.

27:21 Cryptic error messages send the programmer into rage.

27:24 This stage is accompanied by hours-long diatribes about the limitations of the language, directed

27:32 at whomever will listen.

27:35 Stage four, it's getting serious: depression.

27:37 Following the outburst, the programmer becomes aware

27:40 that hours have gone by unproductively.

27:43 And there's still no solution in sight.

27:45 The programmer becomes listless.

27:47 The posture often deteriorates.

27:51 And you can see all the graphics are screaming and banging on the computer or staring at the sky.

27:58 The depression one is just sunk in the chair.

28:00 And acceptance, the chair, like the wheelie chair,

28:04 It's turned around and it's gone.

28:05 There's no one at the computer anymore.

28:06 Yeah.

28:07 The final stage is acceptance.

28:08 The programmer finally accepts the situation, declares the bug a feature, and goes to play some Quake.

28:13 Yeah.

28:14 So I just, yeah.

28:16 There's tons of stages missing.

28:19 Yes, this is funny.

28:20 I know it's supposed to be a joke.

28:21 But, like, get up.

28:23 Go talk to somebody else.

28:25 Don't do, like, leave the computer.

28:27 That should be one of the first things you do is go, like, take a shower or take a nap or something like that and come back.

28:32 Yeah.

28:32 That's very productive.

28:33 I also think somewhere in there, there should be searching Stack Overflow.

28:36 Yeah.

28:37 And there should be another stage where you go to ChatGPT and see if it can help you.

28:41 Yeah.

28:42 Or duck, what, rubber ducking?

28:44 Yeah, exactly.

28:45 So I never really could get into the rubber duck completely of explaining the problem to a rubber duck or some inanimate object.

28:52 But explaining it to a non-technical person, I'll, like, try to explain the problem to, like, one of my kids or something.

28:59 And I'm often, while I'm explaining it, I'm like, wait, I think the problem's there.

29:04 Anyway.

29:05 So, yeah.

29:06 That's funny.

29:06 I find time away often is the most important thing.

29:09 Yeah.

29:09 Go record a podcast or something.

29:11 Exactly.

29:12 Or go for a walk.

29:14 Go for a motorcycle ride, bicycle ride.

29:16 Like, just get away from the computer for a little bit.

29:18 Yep.

29:18 Yeah.

29:18 Cool.

29:19 Well, speaking of getting away from the computer.

29:21 Exactly.

29:23 Let's get away from this podcast.

29:24 Thank you, everyone.

29:25 Thank you.

29:25 Bye.
