Transcript #353: Hatching Another Episode
00:00 Hello and welcome to Python Bytes where we deliver Python news and headlines directly to your earbuds. This is episode 353, recorded September 18th, 2023. I am Brian Okken.
00:11 And I am Michael Kennedy.
00:12 This episode is brought to you by us. So please check out Talk Python Training and the new Python Testing with pytest course that's in progress, but already awesome. And also thank you to our Patreon supporters.
00:25 And if you want to connect with any of us, you've got Python Bytes and Brian Okken and mkennedy, all on Fosstodon, and the links are in the show notes. So let's kick it off with our first AI topic, Michael. Let's ask the AI what our first topic should be. Let's go to Stack Overflow and ask them what our first AI topic should be. No, seriously. So Stack Overflow has announced OverflowAI, which is not, I mean, I'm sure they have, you know, they've got Stack Exchange with all sorts of different things, like stuff for Linux, stuff for Mac dev, you name it.
01:00 Yeah.
01:00 So this is not another separate AI overflow-type thing in that sense; this is adding generative AI to Stack Overflow itself.
01:09 All right.
01:10 Okay.
01:10 So the best way to kind of get a sense of what's going on here is there's a little video, a three minute video.
01:16 That seems like it could be about 45 seconds given how much content is in it, but you know, spend three minutes with it and you'll get an example.
01:23 So the most relevant portion, there's some stuff that we don't care about here, but the most relevant portion is when you go to stack overflow, you search for an answer, it gives you a list of results, right?
01:34 Those results are based on traditional search, right?
01:38 You know, keyword matches.
01:40 And honestly, it does surprisingly well, right?
01:42 When you go search there. But they're changing their search, and I have a hard time telling if it's out now or a preview you can sign up for, but you go to Stack Overflow and you search for something, and now you get a ChatGPT-like response, if you want.
01:59 And it'll say, here are some of the possible answers, but here's also just, you know, like if you search Google or Kagi, you might get a little snippet that just shows you the answer from Stack Overflow without taking you there.
02:11 It's kind of like that.
02:12 It'll say, and here's a little bit of an answer.
02:13 And then you can actually start a conversation about it.
02:18 Like you would with ChatGPT. Like, that's cool, but it doesn't quite apply.
02:21 How would you, you know, I'm not using Selenium for this.
02:24 I'm using Playwright.
02:26 Could you give me an example of the same process, but with Playwright? You know, it'll try to give you the answer.
02:33 Right. So that's all pretty cool.
02:35 And they're integrating generative AI into the public platform and into Stack Overflow for Teams.
02:40 I didn't know Stack Overflow for Teams existed, but if I had a team, it would look really cool to use, but not in a huge corporation with a ton of people working on stuff.
02:49 So they're also planning to add at the moment VS Code integration for Stack Overflow and the Stack Overflow AI.
02:58 So you can just like, you know, you've got your source control section, your file section.
03:02 It's just got a Stack Overflow section where you can start talking to it and get generative AI answers out of Stack Overflow.
03:10 So I find there's a little bit of irony here, Brian, in that Stack Overflow was banning answers from ChatGPT.
03:18 Now they're adding a feature that lets you have a conversation and query it using generative AI, which seems normal.
03:25 However, one of the features they highlight in the video is that you can say, none of this applies to me, and have AI help you draft a question.
03:32 So the answers can't be generative AI, but the questions can be generative AI.
03:37 Interesting.
03:38 For the right source of AI, right?
03:41 The right origin, right?
03:42 The OverflowAI one, not ChatGPT.
03:44 So this is in private alpha, I guess, is the current form, but I went there, and the way I discovered it is I went to search for something on Stack Overflow.
03:54 It asked, do you want to try out the new OverflowAI search?
03:57 I'm like, yes.
03:58 What is this?
03:59 This looks very interesting.
04:00 And in my profile under settings, I have the ability to have that turned on.
04:04 It is turned on, but when I go to do search stuff, it doesn't seem to use it.
04:07 So I think it's like fading in and out as they're making changes to it.
04:10 But yeah, people can check that out.
04:12 And then finally, VS Code extension looks cool.
04:14 Where's our PyCharm one?
04:16 Come on, bring it on, bring it on.
04:17 I want this in PyCharm too. PyCharm also added its own little AI assistant, but I'm waiting for it to get better.
04:24 Yeah, you can.
04:25 Yeah.
04:26 It's, it's pretty cool, but it's, it's not something I'm totally embracing yet.
04:29 When it rolls out completely, I'd like to be able to know: did this come from a person or from an AI, or was it some mix of the two or something?
04:42 I don't know.
04:43 Right.
04:44 Kind of like, and I kind of agree with them for not allowing AI, like people just throwing the question into ChatGPT and pasting the answer into Stack Overflow.
04:56 That's not what people are expecting.
04:59 So.
04:59 Right.
05:00 And if you want that, just put your question straight into ChatGPT, like leave out the Stack Overflow middleman.
05:07 I'm sure the real concern was, you know, there's a lot of reputation stuff.
05:13 There's a lot of like, this person is really good at answering questions and here's how much you've contributed.
05:17 And if you just, you know, you could easily game that with AI results.
05:22 And I'm sure it is being gamed with AI results, just with a little more work, but you know, such is the world.
05:29 Yeah.
05:29 The times, they are a-changin'.
05:32 So they, they are.
05:33 That's what ChatGPT told me anyway.
05:35 How about you?
05:36 What's your next one?
05:37 >> Well, I was going to talk about switching to Hatch.
05:42 For Python packaging and also setting up virtual environments and stuff, I guess.
05:48 Okay, so for packaging, we had setuptools and then we had lots of stuff.
05:54 We had like Flit and Hatchling and Poetry and all sorts of things.
06:00 There's two parts to Hatch.
06:03 There's Hatchling, which is the build backend that helps you build packages and stuff. And then there's Hatch, which has Hatchling as part of it. But anyway, let's go through this. There's an article from Oliver Andrich talking about switching to Hatch. And great graphic, by the way. There's a graphic that says, I don't know where I'm going from here, but I promise it won't be boring. That's pretty cool.
06:29 Anyway, he was using poetry, looked at PDM, now he's thinking about Hatch.
06:36 It's possibly his new love?
06:38 Don't know, it's a question mark.
06:39 So there's a sentence here that bothers me, and we can get back to it, but it says some prominent projects enjoy using it, or are using Hatch.
06:49 I don't know if that's true, but let's go on.
06:51 He does show some really cool things with Hatch, which I didn't know it could do.
06:56 For instance, when you say Hatch new project, it creates a new pyproject.toml file.
07:01 One of the things it does in that is it creates these entries for environments, like virtual environments, and they're separate ones for things: there's a default one, and then there's a test one that has your pytest and plugins and stuff.
07:18 Then there's a linting one that has Pyright and Black and Ruff and things in it.
07:22 His question really was, why wouldn't I have just a dev virtual environment?
07:29 But we're using a lot of extra tools now, they might have dependencies that clash.
07:33 So if you do have dependencies that clash, maybe a separate virtual environment for each type of tool chain might be interesting.
07:41 - You are speaking my language now.
07:43 I didn't really think about that, but there's about a 50/50 chance, if I say update all the dependencies for Talk Python Training, which I think is about 50 of them.
07:53 There's the runtime ones, and then there's the reporting dev ones, that include some things like Jupyter notebooks for graphs and stuff.
08:00 Yeah.
08:00 There's about a 50/50 chance that I'll get an error saying, cannot find any solution to these requirements that you're asking for, because something in the data science stack forces something to be less than or equal to, and something in the other stack is greater than or equal to, and those are non-intersecting.
08:17 And I'm like, well, it still runs fine and I'm sure it'll be okay, but it's like, I'm constantly fighting that.
08:24 So actually having a dev environment and a dev tools environment and a runtime environment is actually a pretty cool idea.
08:33 >> Yeah. One of the examples is Docs.
08:35 They're using mkdocs-material, so it's probably got a bunch of dependencies, or you might be using some other document generator tool.
08:43 You don't really care if that collides with your testing chain or something because they're different.
08:49 Okay. Anyway, neat idea.
08:51 I didn't know Hatch did that. It's cool.
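A minimal sketch of what those separate environments can look like in a pyproject.toml, using Hatch's [tool.hatch.envs.<name>] tables; the package lists here are illustrative, not copied from Oliver's actual file:

```toml
# One environment per tool chain, so clashing dependencies never have to
# share a single virtual environment.
[tool.hatch.envs.default]
# the project itself is installed into each environment by default
dependencies = []

[tool.hatch.envs.test]
dependencies = ["pytest", "pytest-cov"]

[tool.hatch.envs.lint]
dependencies = ["black", "ruff", "pyright"]

[tool.hatch.envs.docs]
dependencies = ["mkdocs-material"]
```

Hatch creates and manages a separate virtual environment for each of those definitions.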
08:53 You can also do scripts.
08:55 Each environment can have its own set of scripts.
08:59 You can have linting scripts and testing scripts and docs scripts and things like that.
09:04 For instance, one of the linting ones is, you could say run Black, and it has all of your Black arguments or something.
09:10 >> Nice.
09:10 >> That's neat. No need to use tox.
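Roughly what that looks like; the script name and the exact Black flags here are invented for illustration:

```toml
# Scripts belong to an environment, so the lint tools only live in the lint env.
[tool.hatch.envs.lint.scripts]
style = "black --check --diff ."
# run it with: hatch run lint:style
```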
09:14 This surprised me.
09:16 Apparently, there's a matrix ability for Hatch.
09:19 So you can do test matrix, test matrices within Hatch.
09:24 I don't know if you can do other, why would you do a docs matrix though?
09:28 You know, so test matrix makes sense.
09:30 I don't know if it makes sense anywhere else, but that's kind of neat.
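A Hatch test matrix is declared like this; the Python versions are just an example:

```toml
# Expands the single test environment into one environment per listed
# Python version, similar to what you might otherwise set up in tox.
[[tool.hatch.envs.test.matrix]]
python = ["3.10", "3.11", "3.12"]
```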
09:34 And then the last bit I thought was pretty cool was that, well, for one, it has scripts, but I don't know if it's very convenient.
09:41 So like, say you have a cov script for coverage within your test environment, you would run it by saying "hatch run test:cov". It's kind of a mouthful still, so I'm not sure.
09:53 - It is, yeah.
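For context, that incantation maps onto a script entry like this, where cov is just a name defined inside the test environment and the pytest flags are illustrative:

```toml
[tool.hatch.envs.test.scripts]
# "test" is the environment, "cov" is the script name, hence: hatch run test:cov
cov = "pytest --cov"
```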
09:55 - Optional dependencies are kind of neat.
09:58 Most tools have optional dependencies, but apparently Hatch has kind of a neat way to do it. So let's say, in the example, you had two different optional dependencies, for MySQL or Postgres, depending on what database you want to install.
10:11 A default could be just saying MySQL, and it would pull in all of the requirements for MySQL.
10:17 There's an easy way to do sort of a transitive default set.
10:22 So it's kind of cool.
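One way that kind of transitive default can be spelled is an extra that just points at another extra. This is a guess at the shape of the example, assuming a project named my-package; the article's actual configuration may differ:

```toml
# Hypothetical extras: pick a database backend, plus a "default" extra that
# pulls one of the others in via a self-referencing extra.
[project.optional-dependencies]
mysql = ["mysqlclient"]
postgres = ["psycopg"]
default = ["my-package[mysql]"]
```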
10:24 Anyway, the thing I wanted to come back with was the comment of like, everybody's using it.
10:29 I don't think everybody's using it.
10:31 If you look at the Hatch website, it does say all these different projects are using Hatch, but some of them, it's obvious they're using Hatchling.
10:41 Like most of my projects use Hatchling.
10:43 That's not the same as using Hatch.
10:45 - Yeah, it could just be the build backend, just for building the wheels, right?
10:48 - Exactly.
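Concretely, "using Hatchling" often just means a project declares it as the build backend in pyproject.toml, with no other Hatch involvement:

```toml
# Standard PEP 517 build backend declaration; no Hatch workflow tooling implied.
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```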
10:49 That build backend is rock solid and I recommend it.
10:52 Hatch, the top-level tool, is the workflow tool; it's similar to Poetry.
10:57 But still cool.
10:58 I kind of like that. Poetry, at least last time I tried it, sort of had you take all of the tools with it.
11:06 But Hatch is more of a use-the-tools-you-want sort of tool.
11:09 So anyway, kind of cool.
11:11 - Nice.
11:12 - I just checked while we were talking.
11:14 - You did?
11:14 - What did I say, 48 or something for the number of dependencies?
11:17 - Yeah.
11:18 - It's insane, I don't know how I got to where I am.
11:19 232 packages to run Talk Python Training with the reporting.
11:24 I can't easily separate the dev versus runtime without more work, but that's a lot of packages.
11:32 - And that's, it's like, are you sure you're not using like Go or something else that uses tons of dependencies?
11:41 JavaScript?
11:42 Yeah, yeah, yeah.
11:43 I'm not saying it's a problem.
11:44 It's just, with 232, there's a reasonable chance that a good bunch of those come from the non-runtime stuff.
11:52 There's a good chance that there's a clash between them.
11:54 So this idea of multiple virtual environments is cool.
11:56 - Yeah. - Yeah.
11:57 - But I'm assuming most of those are transitive.
12:00 You're using some tool that's using some other tool.
12:02 - Yeah, yeah, exactly, exactly.
12:04 - Cool.
12:06 - Okay, on to the next.
12:08 Hold on.
12:09 Comment from Mark in the audience says, yeah,
12:11 I feel like Hatchling, but not Hatch, is a fairly common pattern
12:15 when mentioned on Talk Python To Me.
12:17 Yeah.
12:17 Awesome.
12:18 Speaking of, speaking of, I just want to point out that on episode 408 earlier this year, I had Ofek on, the creator of Hatch, to talk about Hatch and its benefits and all those different things.
12:32 So people can check that if they cared, but let's talk formatting code.
12:36 Formatting code equals black.
12:37 Yes.
12:38 Yeah.
12:38 And others.
12:39 And others.
12:41 Ruff, however, checks your code for correctness.
12:44 And we've discussed how fast Ruff is, right?
12:48 To the point where it's like, hmm, did I actually check the code?
12:52 Did I enter the wrong directory and it just found no contents, right?
12:56 That kind of thing.
12:57 Right.
12:57 Well, this was sent over to us from Sky.
13:00 So thank you, Sky, for sending it in.
13:02 Charlie Marsh, creator of Ruff, has announced the Ruff formatter.
13:08 So not just telling you what's wrong and checking for errors, but formatting your code based on convention, similar, but not identical, to Black.
13:17 So that's pretty cool.
13:19 Let me read a few things that Sky sent over here, because I think it's their experience is worthwhile.
13:24 So Charlie says, first of all, the formatter is designed to be a drop-in replacement for Black, but with an excessive focus on performance and direct integration with Ruff.
13:34 That's pretty cool, right?
13:35 So Sky says, I can't find any benchmarks that have been released yet, but I did some extremely unscientific testing, caveat there, and found the Ruff formatter to be five to ten times faster than Black when running on already formatted code or in a small code base.
13:49 So five to ten times faster, but 75 times faster when running on a large code base of unformatted code.
13:55 However, they point out that the second outcome is not that relevant because how many times do you format huge projects that are not formatted?
14:03 >> Normally it's incremental, right?
14:05 So the smaller bits is maybe worth paying attention to more there.
14:10 >> Yeah.
14:10 >> Yeah.
14:11 >> I almost missed this announcement because Ruff already had some go ahead and fix it if you can features.
14:18 >> Yeah, it did have a few fix it things.
14:19 >> Yeah.
14:20 >> Yeah, exactly. Like I said, I think that was about violations rather than conventions.
14:26 It's not a violation necessarily.
14:28 You just say, I generally write all my strings in single quotes, or I write in double quotes. Or I might sometimes have a single quote and sometimes a double quote, whereas you might have a coding convention that says all of our strings are the same, they all use single quotes or they all use double quotes, and there's no reason, in the same function, to have two kinds of strings, unless you're in that situation where, like I'm saying, it's possible.
14:53 So I don't want to use a single quote, so I don't have to escape the apostrophe.
14:57 So I'll use a double quote for that one.
14:58 Right.
14:58 But other than those like sort of weird cases, you shouldn't mix.
15:03 Right.
15:03 I think that's more what the point of the rough formatter is addressing and black as well.
15:07 Yeah.
15:07 Well, rough has a few and a few differences from, from black as well.
15:13 Yes.
15:14 And they, they call out as, as the formatters intended to emit near identical output when run over black formatted code.
15:20 This is interesting.
15:21 When run over extensively black formatted code projects like Django and Zulip, Zulip, it was 99.9% the same.
15:28 However, it says somewhere, "When run over non-black formatted code, it might make different decisions than black is made." >> Yeah. I like some of the decisions that they're making.
15:41 >> I do actually like them quite a bit.
15:43 So I'm a fan of some of these things.
15:47 It doesn't have as many features yet as black does in terms of controlling certain things.
15:54 But they're working on it.
15:55 I was talking to Charlie just an hour ago.
15:57 By talking, I meant submitting a GitHub issue.
16:00 (laughing)
16:00 It was quickly, we're having a back and forth there.
16:02 So that's awesome.
16:03 Yeah, everybody talks about--
16:05 - For instance, the line endings, I think are a cool way to deal with it.
16:10 - Yeah.
16:11 The line feed versus carriage return line feed, \n versus \r\n.
16:16 That's a Windows versus non-Windows challenge, right?
16:20 And so I guess if you're on Windows, you don't want it to keep unraveling that for you, I suppose.
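For what it's worth, those knobs live under a [tool.ruff.format] table; a small sketch (option names as of the early releases, so they may have shifted since):

```toml
[tool.ruff.format]
# one consistent quote style for strings
quote-style = "double"
# "auto" matches whatever line ending the file already uses (LF vs CRLF)
line-ending = "auto"
```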
16:26 >> Yeah, actually, one of the things I thought I read was things about comments at the end of the line.
16:32 Black would often put the comment on a completely different line, so your comment might not match up with what you actually commented against, but Ruff is trying to be a little bit better about that.
16:44 >> Yeah, the other area where this is supposed to be intentionally different, because frequently Black will suggest the same or something different.
16:55 Yeah, let's see.
16:56 There's some places where it's specifically different versus Black.
17:01 And it talks about, I haven't read it all.
17:04 I don't want to misquote it here.
17:06 But there's a whole bunch of sections about the variations and so on.
17:11 Yeah.
17:12 Anyway, I'm excited to try it.
17:14 And I think it's cool that this is happening in this space.
17:18 >> Indeed. Jeff out there asks, is there also a Ruff daemon like Black has?
17:24 I had no idea about the Black daemon, and I definitely don't know about the Ruff daemon.
17:27 But what I use is, I just have the IDE integration for Ruff.
17:35 So it will automatically be running in the background.
17:39 You get that for PyCharm and VS Code.
17:41 I imagine it makes sense to have it run and just constantly checking.
17:46 So I don't know, not sure.
17:49 This is still in alpha, so probably not.
17:52 >> Also good to note, this isn't a separate tool; it's part of Ruff.
17:58 So you just say ruff format.
18:00 >> Yes, exactly. Yeah.
18:03 Ruff format, and I actually was thinking, I can't find my little example that I was running earlier.
18:10 But there's a couple of, yeah.
18:11 So you can do things like, you can say ruff format, --line-length, --respect-gitignore.
18:19 So it'll say, if it's ignored in Git, don't format it, please.
18:24 You know, things like packages that are installed in a virtual environment.
18:28 Don't go messing with stuff that like I don't care about.
18:31 Right.
18:31 If it's ignored in Git, it's probably only going to cause me trouble by messing with it.
18:38 so just leave it alone.
18:39 Right.
18:39 It's kind of the way I see it.
18:41 I never considered that.
18:43 Does black change like virtual environments?
18:46 So this is a question.
18:48 I don't know.
18:48 Okay.
18:48 Anyway, probably, probably not, but I don't want it to.
18:51 If it does anyway.
18:52 Cool.
18:53 That, that's ruff, space, format.
18:55 Very cool.
18:56 Well, we're kind of going into the inside baseball on this episode, but that's all right.
19:03 Next up we've got a suggestion from Will McGugan.
19:08 He thought this might be a good one to discuss on the show.
19:12 What's wrong with TOML?
19:14 I'm like, I don't think anything's wrong with TOML, so let's take a look. There's an article, I forget, it's hard to find, but it's by Colm O'Connor, cool name.
19:25 What's wrong with TOML?
19:27 The gist of this really is TOML is great for smallish things, and even pyproject.toml is considered smallish.
19:38 But there's an interesting quote, apparently from Martin Vejnár, author of pytoml, who said, "TOML is a bad file format.
19:47 It looks good at first glance and for really, really trivial things, it's probably good.
19:52 But once I started using it and the configuration schema became more complex, I found the syntax ugly and hard to read." Not sure what he was doing with it, but anyway.
20:02 Apparently, there's some funkiness with big things.
20:06 I'm like, "Well, what are big things and what are they comparing it to?" One of the comparisons is against a thing called StrictYAML, which I didn't know what that was.
20:15 Strict YAML is YAML compliant.
20:18 >> It's the YAML that won't let you go out at night.
20:20 Your curfew is like 930.
20:22 It's really oppressive.
20:24 >> Apparently, it's YAML with some of the features taken away.
20:29 It's not a standard yet, but apparently it's in the process.
20:34 What are they doing with it?
20:36 such that TOML is a problem? And StrictYAML is built for, what is it built for?
20:44 Readable story tests.
20:46 I'm like, what's a readable story test?
20:48 I wanna see.
20:49 Here's some examples.
20:50 We've got, this is a strict YAML readable story.
20:55 Mappings with defined keys.
20:58 Anyway, this is sort of readable, but there's a lot of keywords in here that, I'm not gonna say this is readable.
21:04 I don't think this is that great.
21:06 Now, compared to the Toml version, yeah, this is weird.
21:11 You've got like these weird brackets with lots of, I don't think this is necessarily a problem.
21:16 My take on it is I wouldn't use either of these for this purpose.
21:20 - Exactly.
21:22 - I would use Python probably to describe stories.
21:25 But anyway, I don't want to bash on him.
21:28 I guess something to think about if Toml is, if you're using Toml for really large things, Maybe it's a problem.
21:35 I'd be curious to know, Will, if you're listening, what do you think?
21:40 Is it, are you using some wacky large TOML files that are becoming a problem?
21:45 So that's, just wanted to throw that out there.
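To give a feel for the complaint, here is an invented example (not from the article) of the kind of repeated, nested structure where TOML's table headers start to pile up:

```toml
# Every nested item needs its own dotted header, which gets noisy compared
# to an indentation-based format.
[[tests]]
name = "login works"

[tests.given]
browser = "firefox"

[[tests.steps]]
action = "open"
url = "/login"

[[tests.steps]]
action = "click"
target = "submit"
```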
21:50 - Awesome.
21:51 Got any extras to throw out there?
21:52 - My extras are totally self-serving.
21:55 But I--
21:58 - See the beginning of the episode?
22:00 - Yeah, so last week I announced, I think, that I had part one of the pytest course all buttoned down and ready.
22:09 And that was kind of a problem because my intro video was like, "Hey, I'm starting this course, "but I've already started it." So what I did is I did a few things.
22:18 I redid the video, the intro video, to just sort of describe where this course fits in with everything else.
22:25 And so it's like a few minutes, so check it out.
22:28 The other thing I did was I had some feedback from people that said, "Teachable is sort of easy to use, "but some people might not understand.
22:36 "Maybe you should do a little intro video." So I did a little intro video for how to use Teachable.
22:42 And it's a few minutes also.
22:45 And one of the things I like about this is I learned some things that I didn't know.
22:49 So my favorite is you can adjust the speed.
22:52 So you can listen to me at like 1.25 or 1.5 speed and it'll go faster.
22:57 The other thing is you can add notes.
22:59 You can add notes for different video places and say, and then when you click on it, so I'll even do it with a explainer video.
23:08 You can grab a note, hi, and then later you can go back and click on the note and it takes you back to that part of the video.
23:15 So if you wanna keep some notes, that's kinda neat.
23:18 So that's really all my extras is that's going on.
23:21 - Yeah, that's a very useful feature.
23:22 Excellent, excellent.
23:24 - What you got?
23:25 - Making progress on your course, nice to see.
23:27 - Thanks.
23:28 I also have a course announcement.
23:31 So Christopher Trudeau and I teamed up to create a Django version of the HTMX course at Talk Python.
23:38 Now we have HTMX + Django: Modern Python Web Apps, Hold the JavaScript.
23:43 So this is a two hour course that shows Django developers how to work with HTMX, how to build up like a pretty realistic, pretty complicated, but not overly complicated, but you know, not toy type of application that they get to build throughout the course.
23:59 So check that out.
24:00 It's still on the early bird special.
24:02 So if people get to it by the 23rd, September 23rd, got a few more days to save 10%, if you're thinking about getting it, might as well do that now.
24:11 And it has the sister flask course, if you don't Django, but you flask.
24:16 So those are our two sides of the same coin there.
24:19 And HTMX is just awesome.
24:20 So check that out.
24:21 If people are interested.
24:23 Looking forward to that.
24:23 Yeah.
24:24 Thanks.
24:24 And then if you happen to be coding in Rust, JetBrains just released a new IDE called RustRover, a funny name.
24:35 I'm not really sure the origin of it, but it's based on the same foundation as PyCharm.
24:39 So if you're already using PyCharm or, you know, something like it, WebStorm, whatever, and you have the muscle memory for those hot keys and basically the way it looks and feels, but you also want to do Rust, Rust Rover, Rust Rover, come on over.
24:52 You know, let's, let's do it.
24:53 I haven't tried this out.
24:54 I don't do any Rust, so people who do Rust can check it out and let us know what they think.
24:59 - Nice.
25:00 I use the C++ version also.
25:03 - CLion?
25:04 - Yeah, it's sort of a funny name for it, but.
25:07 - It is, they got good names.
25:08 Also, Skyler Cosco, who is the guy who submitted the Ruff formatter item, says, "It looks like --respect-gitignore is the default behavior in the Ruff formatter.
25:20 You should only need to pass a flag if you do want to format gitignored files, via --no-respect-gitignore." No respect, no respect for the gitignore.
25:30 Got no respect around here.
25:32 Yeah.
25:33 There's some Rodney Dangerfield programming right there.
25:36 You, you dissing the gitignore.
25:38 Come on now.
25:39 A new Talk Python episode
25:41 I just released, like right before we jumped on here: Delightful Machine Learning Apps with Gradio.
25:47 If you want to take an ML, machine learning, model you've created and put it into an interactive UI on the web that you can share super easily.
25:54 Check this out, open source, very cool.
25:56 They even have some hosting options.
25:57 >> Nice.
25:58 >> Both free and paid. All right.
26:00 So that is it.
26:02 Oh, I guess one more piece of follow-up.
26:03 Since you asked, Will McGugan says, "Re: TOML, mostly hypothetical issues.
26:10 I think there were some good points, but haven't faced them yet." >> Okay. Thanks, Will. I figured you were listening.
26:15 >> Yeah. Brian, are you willing to face a joke?
26:19 >> Yeah. Well, yeah, I think.
26:21 >> Yeah. Okay. So sometimes.
26:23 >> I'm worried now.
26:24 >> Well, I'm sure in school, you probably studied like the five stages of grief or something like that.
26:31 >> Okay. Yeah.
26:32 >> That's not something you really want.
26:33 Why is it not open there? I can't zoom it.
26:35 Anyway, maybe it's just best.
26:38 No, it's not. I got to try to read very small here.
26:40 Hold on. So this is the five stages of debugging.
26:44 Okay. I thought we'd find a way to make this bigger.
26:46 So there we go.
26:48 So the five stages.
26:49 Okay.
26:50 Number one, denial.
26:52 This stage is often characterized by phrases such as, "What? That's impossible." Or, "I know this is right." And a strong sign of denial is recompiling without changing any code just in case.
27:05 Funny.
27:06 All right.
27:07 Stage two, bargaining and self-blame.
27:10 Several programming errors are uncovered and the programmer feels stupid and guilty of having made them.
27:15 Bargaining is common.
27:16 If I fix this, will you please compile?
27:18 Also, I have only 14 errors to go.
27:20 Now stage three is anger.
27:21 Cryptic error messages send the programmer into rage.
27:24 This stage is accompanied by hours-long diatribes about the limitations of the language, directed at whomever will listen.
27:35 Stage four, it's getting serious, depression.
27:37 Following the outburst, the programmer becomes aware that the hours have gone by unproductively and there is still no solution in sight.
27:46 The programmer becomes listless; the posture often deteriorates.
27:50 And you could see like all the graphics are like screaming and banging on the computer or like staring at the sky.
27:58 The depression one is just sunk in the chair, and for acceptance,
28:01 the chair is, like, the wheelie chair turned around.
28:04 They're gone.
28:05 There's no one at the computer anymore.
28:06 Yeah.
28:07 The final stage is acceptance.
28:08 The programmer finally accepts the situation, declares the bug a feature, and goes to play some Quake.
28:12 Yeah.
28:14 >> I just, yeah, there's tons of stages missing.
28:19 Yes, this is funny. I know it's supposed to be a joke, but get up, go talk to somebody else.
28:25 Don't do, leave the computer.
28:27 That should be one of the first things you do is go take a shower, take a nap or something like that and come back.
28:32 >> Yeah, that's very productive.
28:33 I also think somewhere in there, there should be searching Stack Overflow, and there should be another stage where you go to ChatGPT and see if it can help you.
28:41 >> Yeah. What? Rubber ducking?
28:44 - Yeah, exactly.
28:46 - So I never really could get into the rubber duck completely of explaining the problem to a rubber duck or some inanimate object, but explaining it to a non-technical person, I'll try to explain the problem to one of my kids or something, and while I'm explaining it, I'm like, "Wait, I think the problem's there." Anyway, so that's funny.
29:06 - I find time away often is the most important thing.
29:09 - Yeah, go record a podcast or something.
29:12 Exactly.
29:13 Or go for a walk, go for a motorcycle ride, bicycle ride, like just get away from the computer for a little bit.
29:18 Yeah.
29:18 Cool.
29:19 Well, speaking of getting away from the computer, exactly.
29:23 Let's get away from this podcast.
29:24 Thank you everyone.
29:25 Thank you.
29:25 Bye.