#473: A clean room rewrite?
About the show
Sponsored by us! Support our work through:
- Our courses at Talk Python Training
- The Complete pytest Course
- Patreon Supporters
Connect with the hosts
- Michael: @mkennedy@fosstodon.org / @mkennedy.codes (bsky)
- Brian: @brianokken@fosstodon.org / @brianokken.bsky.social
- Show: @pythonbytes@fosstodon.org / @pythonbytes.fm (bsky)
Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 10am PT. Older video versions available there too.
Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form, add your name and email to our friends of the show list. We'll never share it.
Michael #1: chardet, AI, and licensing
- Thanks Ian Lessing
- Wow, where to start?
- A bit of legal precedent research.
- Chardet dispute shows how AI will kill software licensing, argues Bruce Perens on the Register
- Also see this GitHub issue.
- Dan Blanchard, maintainer of a Python character encoding detection library called chardet, released a new version of the library under a new software license. (LGPL → MIT)
- Dan is allowed to make this change because v7 is a complete “clean room” rewrite using AI
- BTW, v7 is WAY better:
- The result is a 48x increase in detection speed for a project that lives in the hot loops of many projects. That will lead to noticeable performance increases for literally millions of users (the package gets ~130M downloads per month).
- It paves a path towards inclusion in the standard library (assuming they don’t institute policies against using AI tools).
- Thread-safe detect() and detect_all() with no measurable overhead; scales on free-threaded Python 3.13t+
- An individual claiming to be Mark Pilgrim, the original creator of the library, opened an issue in the project's GitHub repo arguing that Blanchard had no right to change the software license, citing the LGPL requirement that the license remain unchanged.
- A 'complete rewrite' is irrelevant, since they had ample exposure to the originally licensed code (i.e. this is not a 'clean room' implementation).
- Blanchard disagreed, citing how version 7.0.0 and 6.0.0 compare when subjected to JPlag, a library for detecting plagiarism.
- Blanchard told The Register he had wanted to get chardet added to the Python standard library for more than a decade since it’s a core dependency to most Python projects.
Brian #2: refined-github
- Suggested by Matthias Schöttle
- A browser plugin that improves the GitHub experience
- A sampling
- Adds a build/CI status icon next to the repo’s name.
- Adds a link back to the PR that ran the workflow.
- Enables tab and shift tab for indentation in comment fields.
- Auto-resizes comment fields to fit their content so they no longer show scroll bars.
- Highlights the most useful comment in issues.
- Changes the default sort order of issues/PRs to Recently updated.
- But really, it’s a huge list of improvements
Michael #3: pgdog: PostgreSQL connection pooler, load balancer and database sharder
- PgDog is a proxy for scaling PostgreSQL.
- It supports connection pooling, load balancing queries and sharding entire databases.
- Written in Rust, PgDog is fast, secure and can manage thousands of connections on commodity hardware.
- Features
- PgDog is an application layer load balancer for PostgreSQL
- Health Checks: PgDog maintains a real-time list of healthy hosts. When a database fails a health check, it's removed from the active rotation and queries are re-routed to other replicas
- Single Endpoint: PgDog can detect writes (e.g. INSERT, UPDATE, CREATE TABLE, etc.) and send them to the primary, leaving the replicas to serve reads
- Failover: PgDog monitors Postgres replication state and can automatically redirect writes to a different database if a replica is promoted
- Sharding: PgDog is able to manage databases with multiple shards
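To make the routing ideas above concrete, here's a toy Python sketch of the primary/replica policy the feature list describes. It isn't PgDog code (PgDog is a Rust proxy that parses actual Postgres protocol traffic); the keyword check and host names are illustrative only:

```python
import itertools

# Toy router: writes go to the primary, reads are round-robined across
# healthy replicas, and a failed health check removes a replica from
# rotation, mirroring the feature list above.
WRITE_KEYWORDS = ("insert", "update", "delete", "create", "alter", "drop")

class Router:
    def __init__(self, primary, replicas):
        self.primary = primary
        self.replicas = list(replicas)
        self.healthy = set(self.replicas)
        self._rr = itertools.cycle(self.replicas)

    def mark_down(self, replica):
        # Health check failed: take the replica out of the active rotation.
        self.healthy.discard(replica)

    def route(self, query: str) -> str:
        if query.lstrip().lower().startswith(WRITE_KEYWORDS):
            return self.primary
        # Skip unhealthy replicas; fall back to the primary if none are left.
        for _ in range(len(self.replicas)):
            candidate = next(self._rr)
            if candidate in self.healthy:
                return candidate
        return self.primary

r = Router("pg-primary", ["pg-r1", "pg-r2"])
r.route("INSERT INTO users VALUES (1)")  # routed to "pg-primary"
r.route("SELECT * FROM users")           # routed to a replica
```

A real deployment would configure all of this declaratively (PgDog is set up via a TOML file) and parse statements properly instead of keyword-sniffing.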
Brian #4: Agentic Engineering Patterns
- Simon Willison
- So much great stuff here, especially
- Anti-patterns: things to avoid
- And 3 sections on testing
Extras
Brian:
- <code>uv python upgrade</code> will upgrade all versions of Python installed with uv to the latest patch release
- suggested by John Hagen
- Coding After Coders: The End of Computer Programming as We Know It
- NY Times Article
- Suggested by Christopher
- Best quote: “Pushing code that fails pytest is unacceptable and embarrassing.”
Michael:
- Talk Python Training users get a better account dashboard
- Package Managers Need to Cool Down
- Will AI Kill Open Source, article + video
- My Always activate the venv is now a zsh-plugin, sorta.
Joke: Ergonomic keyboard
Also pretty good and related:
Links
- legal precedent research
- Chardet dispute shows how AI will kill software licensing, argues Bruce Perens
- this GitHub issue
- citing
- JPlag
- refined-github
- Agentic Engineering Patterns
- Anti-patterns: things to avoid
- Red/green TDD
- First run the test
- Agentic manual testing
- <code>uv python upgrade</code>
- Coding After Coders: The End of Computer Programming as We Know It
- Suggested by Christopher
- a better account dashboard
- Package Managers Need to Cool Down
- Will AI Kill Open Source
- Always activate the venv
- now a zsh-plugin
- Ergonomic keyboard
- <code>Claude Code Mandated</code>
- claude-mandated.png
- blobs.pythonbytes.fm/keyboard-joke.jpeg?cache_id=a6026b
Episode Transcript
Collapse transcript
00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.
00:05 This is episode 473, recorded March 16th, 2026.
00:11 And I am Brian Okken.
00:12 And I'm Michael Kennedy.
00:13 And as often lately, this episode is sponsored by you and us.
00:19 For everybody that supports the show through Patreon or through, mostly through a lot of our offerings, like the courses at Talk Python Training and PythonTest.com.
00:30 And books.
00:31 We've got lovely books coming along.
00:34 I might bring up a book later in the show, but we'll talk about it.
00:37 Anyway, thanks a lot for everybody supporting us.
00:40 It keeps us going.
00:42 And also thank you to everybody that sends in topic ideas, either by going to pythonbytes.fm and submitting something through the contact form,
00:50 or by just sending it to us on socials.
00:54 So we're at BlueSky and at Mastodon.
00:58 And all those links are in the show notes or at pythonbytes.fm.
01:02 You also can watch the show if you'd like, either real time or after the fact.
01:08 You can join us at pythonbytes.fm/live and be part of the audience.
01:12 And one of the fun things about that is while we're recording this, you can add comments and we might comment back or highlight your comment.
01:20 That's fun.
01:21 Anyway, the last thing I want to bring up is that you don't have to take any notes while we're talking
01:26 because all the stuff is in the show notes, the links.
01:31 But if you'd like that delivered right to your inbox, plus a little background information,
01:35 some extra stuff, especially helpful for if we're covering a topic that you're slightly thinking about,
01:41 but maybe not, we'll send you some extra information.
01:44 And you just sign up to be a friend of the show at pythonbytes.fm and say, join the newsletter.
01:51 With that, what do we have to start with?
01:53 Well, I've got a doozy.
01:54 Oh.
01:55 As they say, a doozy.
01:57 And this one comes to us from Ian Lessing.
01:59 So thank you for sending it in to your point about sending us ideas about the show.
02:03 This one somehow missed my radar, but shouldn't have.
02:06 So yeah, it's a big one.
02:09 So this, you probably have seen chardet, as in character-det.
02:13 Maybe it's char-det.
02:14 I don't know.
02:15 I'm always, you know, sidebar.
02:17 It's always funny to think about abbreviations like lib or lib, you know?
02:22 It's, if you pronounce it L-I-B, I think it goes lib.
02:25 But if it's an abbreviation of library, shouldn't it be lib?
02:28 I don't know.
02:28 So anyway.
02:29 No, because that's weird.
02:30 I know it is weird.
02:30 So chardet is a library that I believe was originally done, originally created by Mark Pilgrim, but now it's maintained by Dan Blanchard.
02:43 So I think this is, this alone makes it interesting because there's something happening with this project.
02:48 And a lot of people are pushing back saying the maintainer can't make a change to it, but they're the maintainer.
02:54 Like not, I don't want you to, but you can't.
02:56 So here's the headline.
02:58 Yeah.
02:58 The chardet dispute shows how AI will kill software licensing, argues Bruce Perens.
03:04 Perens.
03:05 And the subtitle, this comes from The Register.
03:07 Alarm bells are ringing in the open source community, but commercial licensing is also at risk.
03:12 So told you it's a doozy.
03:14 What is going on?
03:15 So earlier this week, Dan Blanchard, and I want to point out, the maintainer of the library, released a new version of the library under a new software license.
03:22 It was LGPL, and he released it under MIT.
03:26 In doing so, he may have killed copy left because, well, MIT is a do whatever you want, just don't sue me about it.
03:32 You know what I mean?
03:33 License, which almost all the stuff that I do is MIT as well.
03:36 I just want people to just, I'd rather have people just have access to do whatever.
03:40 Like, I don't want people to just, hey, could I get an exception to use this for this thing like that?
03:44 Like nothing I'm doing is that important, you know?
03:46 Yeah, and I think of the MIT stuff as like a good license if you don't care if people use it in their commercial product.
03:54 Yes.
03:55 And look, if it's okay, I was not familiar with what this is.
03:59 I still don't quite get.
04:00 What does chardet do?
04:02 So it is a character detection library that's used by millions of projects.
04:09 Let me see.
04:09 It has 130 million downloads a month.
04:12 So it's used by a lot of things.
04:16 Okay.
04:17 Character encoding detector.
04:18 Yes.
04:18 Yeah, exactly.
04:19 Like UTF, I think.
04:21 You know, is this UTF or Unicode or is it, how do I have bytes.
04:25 What am I going to do, you know?
04:26 So it takes a guess, right?
04:28 Right.
04:28 You're getting bytes and you're not really sure.
04:30 It's like wasn't declared or whatever.
04:32 So previously this was an LGPL project.
04:36 Dan Blanchard wanted two things from what I can read between the lines, putting words into their mouth.
04:42 One, wanted to dramatically improve this library to make it better.
04:46 Check.
04:47 You'll see he did that.
04:48 Two, wanted to set the stage such that this can just be part of Python.
04:54 Like could this just be part of the standard library?
04:56 So previously, like there's this move to say we should have less in Python and I agree.
05:01 But detecting whether or not something is Unicode or ASCII or whatever.
05:06 Maybe that does belong in the library.
05:07 Anyway, that was the goal.
05:08 It's like, could we put it in there?
05:09 Well, LGPL says no.
05:11 It would change the license of Python, I believe.
05:14 Right.
05:14 So as long as it's a GPL based license, you can't move this library into the standard library.
05:21 I don't know if that the core developers or even if Dan is a core developer was interested in this, but that was one of the goals.
05:27 Right.
05:27 So no problem.
05:29 We're going to change it.
05:29 Well, an individual claiming to be Mark Pilgrim, because you can't verify people on the internet for sure.
05:36 The original creator of the library.
05:37 So it's a little bit like Flask where Armin Ronacher created it.
05:40 And then now David Lord maintains it.
05:42 And David Lord gets to do whatever he wants with it.
05:44 It's his project now as the maintainer.
05:46 But I'm sure Armin still has influence over the community's opinion if he were to take a strong position one way or the other.
05:53 Right.
05:54 Neither of them chimed into this as far as I know.
05:56 I don't think.
05:57 Maybe Armin did.
05:57 I can't remember.
05:58 There's a lot.
05:59 There's a lot of chat about it.
06:00 It's kind of like giving somebody a puppy and then telling them where they have to take it, what vet they have to take it to.
06:08 Yeah, exactly.
06:09 I'm leaning on the side of Dan Blanchard here, just setting the state.
06:14 I have a slight.
06:15 There's a lot of complexity in this.
06:16 I'm not like totally just saying this is how it is.
06:19 But let's keep setting the stage.
06:21 So Mark says, you can't do that.
06:23 You can't change an LPGL.
06:26 I believe that's the typo.
06:27 LGPL license requirement.
06:29 It requires that the license remain unchanged.
06:32 License code, when modified, must be released under the same GPL license.
06:37 But I get that when somebody gets it from the source, they make a change.
06:43 It must be released under the same license.
06:45 As the owner of the project, I thought you could change the license on new code.
06:49 I don't know.
06:50 It's your software, effectively.
06:52 If you want to change the license of it, I don't know.
06:55 This is a little bit of shaky ground here to say that you can't change the license as the owner of the license.
07:01 You know what I mean?
07:02 Anyone else in the world should not.
07:04 They have to follow what I just read.
07:06 But as the owner of the license, is that true?
07:08 Well, so here's what Dan did.
07:09 Dan said, I'm going to create a new better version.
07:13 I'm going to rewrite this entire project from scratch, not using any of its source code,
07:18 and re-release it into the same package channel as the old one.
07:22 Now, one of the problems under that is, as the maintainer, he's deeply familiar with how it works.
07:28 And one of the challenges is, if you know how it works, your idea, it's like hard to do a fresh from scratch rewrite if it's burned into your mind how it works.
07:38 You know what I mean?
07:39 So what he did is he just gave the specification to Claude and said, Claude, write this so that the test pass.
07:46 And Claude wrote it.
07:47 And it wrote it extremely differently.
07:50 There's a plagiarism detection algorithm.
07:54 So it's probably more for English, but whatever.
07:56 It said it is only 1% similar to version 6.
08:00 Version 7 is only 1% similar to version 6.
08:03 So that means it's pretty different.
08:05 Dan also said it's like structurally the files are not named the same.
08:08 They're not organized the same.
08:10 It is basically not at all the same thing.
08:12 Nothing.
08:13 The only thing that's that 1% is like argparse structure and stuff because you have the same arguments, you know.
08:18 And so they believe there's nothing here.
08:22 This is a new project.
08:23 And this is what gets the MIT license.
08:26 Now, to be clear, this is a mega improvement.
08:29 It results in a 48 times improvement in detection speed.
08:34 It now supports multi-threading for Python 3.13t.
08:38 So you can do free-threaded Python and it supports that.
08:40 There's a lot of benefits to this new version.
08:43 So I don't think anyone is saying you've messed up the library.
08:47 It's like clearly a better library.
08:49 It's only this we hate AI or AI is theft or like there's a lot of these different angles that are like focusing in like a laser onto this change.
08:59 You know what I mean?
08:59 Yeah.
09:00 So Dan says, I was just trying to accomplish these goals with the tools and times I had available.
09:05 I'd never been paid to work on this and I have a full-time job, you know.
09:09 Software licensing and the laws around it haven't been tested a lot in this new world of AI-assisted development, and as a long-time open source developer,
09:15 I'm also curious how this is going to shake out.
09:18 Yeah.
09:18 But somewhere it says, yeah, after maintaining this library for years, I've wanted to make these improvements, but I couldn't.
09:26 Claude gave me the ability to do this in roughly five days, right?
09:29 So I think this is also really interesting.
09:32 But why change the license?
09:33 Because he wants to put it into Python.
09:35 Oh, okay.
09:36 Yeah.
09:37 Or maybe he just wants people to be able to more freely use it and he just doesn't care about copyleft.
09:41 I don't really know, but I believe the article implies he wants to put it into Python, and LGPL makes that not possible.
09:49 And Ronacher actually did, or Armin actually, Ronacher actually did post about this saying that he welcomed the license change and he's wanted it for years.
09:56 Kind of what I would expect that he wanted to say as well.
09:59 Now, there's a issue that has been created that version 7 presents unacceptable legal risk to users due to the copyright controversy.
10:08 There's so much.
10:10 There is so much going on here.
10:13 I don't know, because to me, the license only goes from more restrictive to more permissive.
10:20 And if it turns out that it's the old version, you know, you're back to where you started.
10:25 So I don't know.
10:25 I'm telling you, there's a lot of, this may be the bigger issue is there's issue 327 filed by Mark Pilgrim.
10:33 Hi, I'm Mark Pilgrim.
10:34 The title is No Right to Relicense This Project.
10:37 And it's absolutely toxic is the word.
10:40 So I don't know.
10:41 It's very interesting here.
10:43 I mean, version 6 is still there.
10:45 People can just keep using that or fork it if they want, if they still want the old license.
10:49 I think what's happening is this is becoming a lightning rod for the debate of licensing intersecting with agentic AI.
10:57 I mean, how many people actually care that much about character detection?
11:01 You know what I mean?
11:01 I mean, it's a utility.
11:02 Apparently a lot of people.
11:03 I know a lot of people.
11:04 It's extreme.
11:05 This is like, there are just pages of stuff to go through on all of this, like page after page.
11:10 It's crazy.
11:12 So I understand the sensitivity of it.
11:15 I forgot my popcorn.
11:17 I know.
11:17 I know.
11:18 And I get it that AI ingesting the world's work and then turning that into automation.
11:25 I'm not even sure where that sits legally.
11:28 At the time, it felt like a lot of theft.
11:31 I'm not sure if it's a good tradeoff or not.
11:33 I don't get how that relates to this project, though.
11:36 Okay.
11:37 So let me add in one more detail.
11:38 Like I said, there's a lot going on here.
11:39 We'll wrap this up pretty soon.
11:40 But it's a super interesting discussion.
11:42 I think, so one of the reasons, Dan said, well, I did it with Claude Code.
11:46 They said, well, it doesn't matter.
11:47 Claude Code trained on GitHub.
11:49 Therefore, it trained on the original source code of chardet.
11:52 Therefore, it's not a clean room re-implementation.
11:55 So I asked.
11:55 I don't know if that matters because I can leave one company, go work at another company.
12:00 As long as I don't take the source code, if I just take what I remember and do similar work, I'm allowed to do that.
12:06 Yes.
12:06 You're a human being that gets to interact in the world.
12:10 Yeah, exactly.
12:10 Yeah.
12:10 It's not like, well, I saw, I saw a picture once and it was of a tree and it was copyrighted.
12:15 So I can never, ever create a picture of a tree again.
12:17 Like, cause I looked at it.
12:18 Right.
12:18 Yeah.
12:18 I, I, that's why I said I'm on the side of, I'm on, I'm done.
12:21 I feel like I'm on Dan's side here.
12:23 So I, let me look.
12:24 So I pull, I wrote a little, I didn't write.
12:26 I asked Claude for some research on like, well, what is the legal precedence of this?
12:31 Here's the situation.
12:32 At least in the U.S., are there rules, rulings that have come down previously.
12:37 So I put a little document up for people to look at, but it says the closest precedent is this Thomson Reuters versus Ross Intelligence.
12:45 Where somebody, I can't remember here, they took a bunch of Westlaw head notes for legal advice and then did their own custom AI training on it and built an exact tool for legal research.
12:58 That turned out to be a violation.
13:00 But here, this is, this is interesting.
13:02 The existing copyright framework requires two things to prove infringement: access to the original work.
13:07 Check.
13:08 Claude did have access to the original work and substantial similarity in output.
13:13 No, not even close.
13:14 Not even, not even 1%.
13:16 I know it's at 1.3%, but like that was like the structure of argparse.
13:20 You know what I mean?
13:21 That's argparse structure.
13:23 That's not chardet structure effectively.
13:25 So I think this strongly fails.
13:28 Those are the two criteria that you have to have to prove similarity.
13:33 But there's other stuff.
13:35 The emerging judicial consensus that is developing is that training a general-purpose AI model is highly transformative, therefore is fair use.
13:46 But there were some specific examples where it wasn't.
13:48 The copyright, the US Copyright Office's position is that using copyright materials for AI model development may constitute prima facie infringement.
13:57 And what's really crazy, Brian, is if this, if things like that said, no, this is copyright infringement.
14:02 Like what happens to everything created by AI, period?
14:06 You know what I mean?
14:07 And I don't know how that's going to shake out.
14:08 Not forever.
14:09 I mean, let's, let's take like the extreme case and you go, well, you know what?
14:13 All the current models have been trained on licensed stuff.
14:16 So let's just like not really just start over.
14:20 It's going to cost a ton of money to retrain a model, but do it right.
14:25 Yeah, that's true.
14:26 Only train on the stuff that's available license wise.
14:29 Right.
14:29 Look, you could just look and say, is it a GPL license?
14:31 We're not training on that.
14:32 Is it an MIT license?
14:33 It's on.
14:34 You know what I mean?
14:35 There's probably plenty of information still out there to build out your models.
14:39 Anyway, it's, it's pretty wild.
14:41 I think people can have a look.
14:42 I would certainly say the folks who took the time to comment are very much against this.
14:48 There's a lot of toxicity, and Dan, support going out to you, just, mentally.
14:53 Cause I, I've been on the receiving end of these types of things and they're not fun.
14:57 but I'm, I kind of, I think Dan has a point here.
15:01 However, this could all be solved if he just said, okay, version seven is chardet2, a new
15:06 project, and just put like a strong moratorium in.
15:09 Like we will, we will never change chardet 1 ever again, except for security patches.
15:14 And then all the things that depend upon it go fine.
15:16 We'll just take this one.
15:17 Like I want 48 times faster and multi-threaded sounds better to me.
15:20 Let's just do that.
15:21 Yeah.
15:21 And if like we push it too hard though, one option is he just stops maintaining it
15:26 and doesn't transfer maintainership to anybody else.
15:29 and we don't want that either.
15:31 So, yeah.
15:32 Yeah.
15:33 I certainly, I think this, this debate has far, far outgrown character encoding concerns.
15:39 It's its own special lightning rod.
15:41 Like I said.
15:41 Yeah.
15:42 All right.
15:43 How are you?
15:44 Can we talk about, I got just a small tool, that I, this is a small tool
15:51 suggested by Matthias, refined.
15:54 Well, it's not a small tool, but it's, it's quick to cover, refined-github.
15:58 And this is, this is awesome.
16:01 I didn't know about this.
16:02 So, this is a, this is a web page or website, a web browser plugin, browser extension, browser
16:10 extension.
16:11 Thank you.
16:11 that does some cool stuff.
16:13 If you work with GitHub a lot and, and I, you know, looking through this, I, I'm like,
16:18 what, what's wrong with GitHub right now?
16:21 Well, there's a bunch of stuff.
16:23 The highlights, there's some highlights at the top, makes white space characters visible.
16:27 That's cool.
16:28 So you can, I mean, that's a cool enough to get this, but there's a lot more coming,
16:34 tells you whether you're looking at the latest version of the repository, if there's,
16:38 any unreleased commits, that's kind of neat.
16:41 The, shows how far behind a PR's head branch is, tells you its base commit.
16:47 There's a bunch of stuff here.
16:48 I'm going to highlight down to some of the stuff that, one of the nice things,
16:53 there's lots of features, but they put fire beside things that you might care about.
16:57 Like adds a build CI status icon next to the repo name.
17:01 Love that.
17:01 adds a link back to the PR that ran the workflow.
17:05 That's cool.
17:06 the, this one I'm, I installed it just for this one feature enables tab and shift tab
17:14 for indentation in comment fields.
17:16 Because, you know, if you're in a web browser, you hit tab, it goes to the next field.
17:21 I just want to put a tab in, in the field.
17:24 anyway.
17:25 Yeah.
17:26 So for Python people, it might not matter that much, but if you're doing C++ or something,
17:30 you don't want to make.
17:31 Well, I still hit tab.
17:32 I just, I just expect it to add four spaces.
17:35 But, anyway.
17:38 let's see.
17:39 Auto-resizes the comment field.
17:41 add reaction, and reaction avatars showing who reacted to a comment.
17:46 That's interesting.
17:47 The, one, the other one that I want to highlight just to, because I think it's
17:51 cool is highlights the most useful comment in an issue.
17:55 So it'll, you know, if there's a lot of people talking about a comment or whatever,
17:59 it'll, you know, highlight that.
18:01 So, you know, just scroll around.
18:02 and actually I haven't really noticed.
18:05 I've, I've turned this on and it just sort of stays out of the way.
18:09 There's just more features and more.
18:11 It's just a nicer experience.
18:12 So, yeah, kudos to them.
18:15 This is an absolutely mega.
18:16 So what's notable about this is you wouldn't look at your UI and know anything is different,
18:22 but there's like a hundred little changes, right?
18:24 Yeah.
18:24 So, yeah.
18:25 Anyway, I, I, I'm always nervous to install browser extensions.
18:29 I have maybe five or six that I really love that from places I trust, but go to the top.
18:33 See how many, stars this has, 30,000.
18:36 Yeah.
18:37 You know, at that level, I think it's all right.
18:39 It's probably totally trustworthy, right?
18:41 So let's, yeah, you know, it's, I think it's good.
18:44 I think it's good.
18:44 I would probably install it.
18:46 I'd have to look and see if it'll inspire me, but.
18:49 Yeah.
18:50 I don't know.
18:50 I'll play with it for a while.
18:52 I'll see, see, see if my entire computer blows up, but.
18:56 Yeah.
18:56 If you, if your computer gets.
18:58 It's also been around for a while.
18:59 It looks like nine years or so.
19:01 Wow.
19:02 Really?
19:02 No kidding.
19:03 Well, at least in the front, the front top, there's the editor config is nine years
19:09 ago.
19:09 So at least there's some commits from nine years.
19:12 Yeah, exactly.
19:13 I would imagine it is.
19:14 Yeah, that's very, very wild.
19:15 Awesome.
19:16 Okay.
19:16 Let's move on to talk about databases and in particular Postgres.
19:21 So this project I want to talk about, I want to feature everyone across, and I think it's
19:25 been around for a little while, but it's called PG dog.
19:27 Okay.
19:28 Okay.
19:28 And what it is, is it's a performance enhancing layer for Postgres.
19:32 So if you're using, you know, maybe you're using MySQL, not MySQL, using SQLite in
19:39 dev, but then in production, you're using Postgres, right?
19:41 Something like that.
19:42 And, and it's starting to outgrow its performance.
19:45 Okay.
19:45 So either it needs better uptime, the database is getting too large or something like that.
19:51 Postgres doesn't have certain features like, connection pooling and other stuff that
19:54 could be better high performance, right?
19:57 So you don't have to reconnect as much.
19:58 This thing handles a whole bunch of those.
20:01 So we go down here to their repo.
20:04 It, by the way, has 4,000 stars and its age is, a year, two years.
20:09 It looks like last year is all, it's probably it's most recent things.
20:12 So there's been other projects like this as well.
20:15 For example, PG bouncer is a friend, a colleague, it's a software, I guess, another thing that
20:22 does the same thing.
20:22 So what this is, is it's a proxy for scaling Postgres and it does connection pooling, load
20:27 balancing for queries, and it does sharding of databases, which sounds bad, but it's actually
20:33 a potentially a good thing.
20:34 So you just create a toml file to set it up and then off it goes.
20:37 I got a bunch of notes here for all these little things that kind of spread around.
20:40 So, so for starters, it's a load balancer across Postgres.
20:44 So you can run Postgres in a replica network configuration.
20:49 So I can have a Postgres database, but then I can have, let's say four other Postgres databases
20:54 that are all copies of that same data and they stay in sync.
20:57 Okay.
20:58 And from a read perspective, you could read from all five of them if they all have the
21:01 same data.
21:02 And that basically five X's your database query performance.
21:06 Okay.
21:07 So I'm just going to run a database, right?
21:08 Okay.
21:09 Just by simply sending them to different machines with exactly the same data, the same database.
21:13 Yeah.
21:14 But the problem is the consistency, right?
21:16 So it knows which one is the primary database and it can do writing to that and make sure
21:21 that it propagates to the others before it tells you that it's committed, which is kind
21:25 of the magic of replicas.
21:27 Cause if you write to it and then immediately do another read, but it happens to have gone
21:31 to this time it's round robin to a different database server.
21:35 That's bad.
21:36 Cause it might not be there.
21:37 Right?
21:38 Like I saved the database, I queried and it wasn't there and I went to test why it wasn't
21:41 there.
21:42 Then it was there.
21:43 I don't get it.
21:44 What's going on with the world.
21:45 Right?
21:46 So you want it to definitely manage that kind of stuff.
21:47 It also does health checks.
21:48 And if you've got this read primary replica configuration that I'm talking about, if one
21:54 of them goes down, it will just take it out of its rotation.
21:58 And if it's the primary one, it'll pick another primary, I believe.
22:01 So it has a single endpoint behavior, which I talked about.
22:04 So you can, you know, it understands the Postgres structure, like the basically T-SQL.
22:10 And so it updates, it knows if it sees an update or insert or create table and things like
22:14 that and sends that to the primary and then leaves the other ones chilling to do their thing.
22:19 As the failover I talked about in, it has sharding, which is really cool.
22:22 And it does a bunch of stuff to manage and keep that in sync.
22:25 You can even have different sets, different clusters of database and say, keep this one
22:30 in sync with that one.
22:31 So for example, imagine you've got an e-commerce site and it's starting to go too slow.
22:37 People do a request for, I don't know, let me, let me give you an example that probably resonates
22:42 more with people, a health provider database.
22:45 I don't know about yours, but whenever I go to figure out something with my next doctor appointment
22:49 or something, it's like the page slowly loads in and then it spins, it says checking records,
22:54 checking records, checking records.
22:55 And like five seconds later, chunk will come in and more of, and like, what is going on with this?
22:59 Why is this so slow?
23:01 You know?
23:02 And it's probably just some huge database with a bunch of insane joins and weird queries
23:08 and stuff just to tell me that my appointment is at 10 o'clock.
23:11 So what you could do is you could say, okay, your health record ID is going to be the shard key
23:16 and we're going to have 20 different servers, right?
23:19 Running our cloud setup.
23:20 And for that, we're going to somehow determine which database it goes to.
23:25 So maybe we're going to say, take the hash of the health ID and use the first two letters
23:31 to figure out which database it actually goes in.
23:33 So like AA through B, whatever, right?
23:38 Goes to the first database server and the second, the third and fourth and so on.
23:41 So when you do a query, you say, I want the thing for this user.
23:44 It just goes, okay, great.
23:45 Well, that means I only query that one server.
23:48 Instead of trying to query a hundred million records, you query, what did I say?
23:52 25, so you query 4 million, which is way, way faster, right?
23:56 On any given server.
23:57 So that's a really cool aspect.
23:59 And one of its main features is the sharding capability.
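The "hash of the health ID" idea sketches out like this in Python. The shard count and function name are hypothetical; the one real constraint is to use a stable digest like SHA-256, since Python's built-in `hash()` is randomized per process and can't serve as a shard key.

```python
import hashlib

NUM_SHARDS = 20  # hypothetical: 20 database servers in the cluster

def shard_for(health_record_id: str) -> int:
    """Map a record ID to a shard index using a stable hash."""
    digest = hashlib.sha256(health_record_id.encode("utf-8")).digest()
    # Interpret the first 8 bytes of the digest as an integer and
    # reduce it into the range of available shards.
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS
```

The same ID always lands on the same shard, so a lookup only touches one server's slice of the data instead of the whole table.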
24:02 Okay.
24:03 Yeah.
24:04 Pretty neat.
24:05 Pretty neat.
24:06 Well, but if you're really trying to find out, like, health information,
24:09 the hash might be the problem.
24:11 Stop doing hash, man.
24:13 I don't know why these systems are so bad.
24:15 They're so bad.
24:16 Bad joke.
24:17 Sorry.
24:18 Yes.
24:19 Yeah, that's true.
24:20 I get it now.
24:21 I get it now.
24:22 I get it.
24:23 All right.
24:24 Awesome.
24:25 Over to you.
24:26 Okay.
24:27 Well, this is partly a public service announcement.
24:32 Maybe.
24:33 I want to cover Simon Willison.
24:36 So we know Simon Willison has been playing with AI and agents and stuff since, like, since
24:41 they came out or something.
24:43 And I appreciate all of Simon's work.
24:46 I've been watching here and there, and I appreciate learning from him and
24:52 not having to do all the experimentation that he's doing. And he's really great at explaining
24:57 it.
24:58 He's got this sort of book-like thing
25:01 together that we're going to link to, called Agentic Engineering Patterns.
25:06 And this is a series of blog posts, but they're fairly concise and short.
25:11 And it's really good writing as well.
25:13 And I think it might be useful for nearly everybody,
25:20 but especially people with teams.
25:22 It'd be good to make sure that everybody's kind of on the same page.
25:25 I think the information here is right.
25:28 Good for everybody.
25:29 So there's principles, getting started, like some intro on how agents work,
25:35 and testing and QA.
25:37 There are three posts about that, which I love.
25:40 Understanding code: using agents to walk through code
25:45 and stuff.
25:46 Even these, I didn't notice these when I was looking at it the other day: an
25:51 appendix of prompts he uses, which might be interesting, but also a GIF animation tool using WebAssembly
25:56 and Gifsicle, with annotated prompts.
26:00 Oh, that might be fun, but maybe not appropriate for everybody.
26:03 But the one that I love right here is anti-patterns.
26:10 So in principles, there's some anti-patterns.
26:12 Well, everything in the principles, definitely go read it.
26:16 There's "writing code is cheap now,"
26:18 what agentic engineering is, and things like,
26:24 for instance, making tools and doing snippets, doing little
26:30 tools, having those available, not only for you to remember, but you can also tell an agent,
26:35 hey, I already kind of solved this over here in this project.
26:39 So use that, but apply it to this other project here.
26:43 Super cool idea.
26:44 And also this one: AI should help us produce better code.
26:49 So if you're having AI produce your code, I think it should be better code than
26:54 you would produce by yourself.
26:56 Not worse.
26:57 I don't like this notion of people not reading their code at all.
27:02 and I think that's going to blow up on us.
27:06 And, especially if you're working in teams, a bunch of anti patterns to watch out for.
27:11 And the top one is about inflicting unreviewed code on your collaborators.
27:16 This anti pattern is common and deeply frustrating, both in open source.
27:20 And I'm dealing with it at work myself.
27:22 Don't file pull requests with code
27:25 you haven't reviewed yourself.
27:26 I'm tired of reviewing reams and reams and reams of code that I know nobody actually
27:32 read.
27:33 And why?
27:34 Why do they expect me to read it?
27:36 So anyway, great resource here.
27:38 I love the cheat sheet on red, green, refactor; it's pretty great,
27:43 also.
27:44 And I promised to highlight that, since testing is kind of my thing.
27:47 Right.
27:48 And this is what he has on testing: the phrase "use red/green
27:54 TDD" is a pleasingly succinct way to get better results out of your coding
28:00 agent.
28:01 And tell your coding agent to do this and it will know to write a test first and then
28:07 make changes until it's green.
28:10 What's interesting is normally we think of TDD as red, green, refactor.
28:13 The refactor part, that's when you need to get involved.
28:16 So you can have the agent do the red, green part, which is: come up with a test that
28:21 describes what you want to do.
28:22 Write code until you have code that does that.
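As a concrete sketch of that loop (with a made-up `slugify` function, not an example from Simon's posts): the test is written first and fails, red; then just enough implementation is added to make it pass, green; the review and refactor step is where the human gets involved.

```python
def slugify(title: str) -> str:
    """Green step: just enough code to satisfy the test below."""
    # Keep letters, digits, and spaces; then join words with hyphens.
    cleaned = "".join(c for c in title if c.isalnum() or c == " ")
    return "-".join(cleaned.lower().split())

def test_slugify():
    # Red step: this test is written before slugify exists and fails;
    # the implementation above is then grown until it passes.
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  Agentic   Engineering ") == "agentic-engineering"
```

Once the test is green, the human review happens: is the implementation something you'd want to keep, or do you ask the agent to refactor it?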
28:25 Now you go and review that code, and you can talk to the agent.
28:28 You don't necessarily have to change it yourself.
28:30 You can talk to the agent and go, this part of the code is weird.
28:33 Can we change it to a different pattern?
28:35 Or, is there some way to clean this up?
28:37 And I've had really good results with that actually to just say kind of good, but this
28:42 part, why did we do that?
28:44 And it's surprising to me to have the agent come back and say, oh yeah, that's weird.
28:49 And change it to what I would expect.
28:51 Why didn't it just do that in the first place?
28:53 But it doesn't.
28:54 And maybe it will in the future, and the future might be next week.
28:59 Who knows?
29:00 But for right now, these are great engineering patterns, great things to watch.
29:04 So thanks Simon.
29:05 And I trust him to like, keep these up to date.
29:08 So anyway.
29:09 Yeah, this looks super interesting.
29:10 I definitely want to check it out.
29:11 I've already spread this around at work.
29:14 And especially to the people that have sent me code reviews where I'm like, you didn't read
29:18 this.
29:19 I know you didn't.
29:21 So.
29:22 I think that's part of the pushback as well, like people are lazy or they don't know
29:27 what they're doing.
29:28 And they just, here's 2,000 lines of code that fixes what I was asking for.
29:31 You're like, no, go away.
29:32 Yeah.
29:33 Or they spent some time,
29:34 and you're like, actually, can you narrow this down to a 10-line change?
29:37 This is all I want.
29:38 Please don't go do other things.
29:39 Like, just help me understand this
29:40 and why this needs to change.
29:42 And then I think we're still learning how all this stuff works, and there are engineering
29:47 practices, but the stakes are so low for getting started.
29:51 You know, normally you're like, okay, we're going to set up our build tool chain and then
29:54 we're going to learn the language and the syntax and the structures and the keywords.
29:57 And now it's just like, oh, just use regular English to just tell it stuff.
30:00 And it'll probably figure it out.
30:01 Right.
30:02 That gives the sense that you don't need to learn this as a skill, but you do.
30:06 Yeah.
30:07 I also think that we're getting a lot of advice about how to utilize agents from startups, and
30:13 startups have a different situation.
30:15 Startups are greenfield for the most part.
30:18 They're writing new code.
30:19 Whereas a lot of software jobs are maintaining existing code bases that have been around for
30:25 decades, possibly, or at least years.
30:28 And, and you can't just not care what goes into it.
30:33 Yeah.
30:34 You've been handed this thing that is making your company money.
30:38 You can't make it worse just because the agent decided to rewrite everything.
30:42 So yeah.
30:43 Yeah.
30:44 Anyway, for sure.
30:45 Well, do we have any extras?
30:46 I got a few extras.
30:47 Why don't you go first?
30:48 Okay.
30:49 A couple ones.
30:50 This, the first one comes from John Hagan.
30:52 Thanks for mentioning this because I almost made this a top level story, but there's not
30:57 much to say about it other than this is awesome.
31:00 Upgrading Python versions with uv.
31:02 So we know that to get any new features
31:08 from uv, you have to say uv self update.
31:11 I think, is that right?
31:12 I think self update.
31:13 Yeah.
31:14 uv self update.
31:15 Yep.
31:16 But after you've done that, now you can say uv python upgrade.
31:22 You can give it a specific one.
31:23 So for instance, if you say uv python upgrade 3.12, it updates 3.12 to the latest
31:31 version, the latest dot release, which is cool.
31:35 But if you leave that off, which is what I do, it just looks at all of the
31:40 Python versions that you have installed on your computer through uv and updates them
31:45 all to the most recent, like, bug fix release.
31:49 And why, like, why not?
31:51 We should be doing that all the time.
31:52 I'm going to set this up as a cron job or something.
31:54 I don't know.
31:55 Yeah.
31:56 So it's cool.
31:57 And yeah.
31:58 So thanks, uv, for making things easier once again.
32:00 Awesome.
32:01 Awesome job.
32:02 I've already incorporated it into my little updater scripts that I run periodically.
32:05 Next is also something that's suggested by a reader, and I understand New York Times
32:11 Magazine is behind a paywall.
32:14 But for some reason I was able to read this fine.
32:18 Maybe, I don't know.
32:19 I do have a New York Times newspaper subscription.
32:22 So maybe that's it.
32:23 Anyway, "Coding After Coders: The End of Computer Programming as We Know It." This is
32:29 basically talking about whether or not
32:34 AI is the end of coding jobs.
32:37 You know, we don't think it is.
32:39 The conclusion here is it's not, but it's more than that.
32:44 It's talking about, basically, kind of some different changes.
32:48 And it also talks about, I believe, the difference in
32:53 percentage of efficiency improvement between greenfield and legacy code.
33:00 Whereas a lot of startups say they're a hundred times faster,
33:05 Amazon has said it's on average 10% faster, but that's not nothing.
33:09 You should still get excited about 10% faster, but don't expect the people
33:14 maintaining your code to be a hundred times faster.
33:17 The reason why it was passed to me was because there's this great line.
33:21 Let me see if I can find it: "Pushing code that fails pytest is unacceptable
33:26 and embarrassing."
33:27 Apparently this is an instruction that somebody has in their
33:33 markdown files to instruct Claude to always run pytest and be embarrassed if
33:40 the tests don't pass.
33:41 I love it.
33:42 This is good.
33:43 but anyway, those are my extras.
33:46 I actually think this is a well-written article for somebody that doesn't understand this space.
33:50 Apparently the author has been covering the tech world for a while.
33:55 So, nice.
33:56 And also pytest got into the New York Times.
33:59 Yeah, that's pretty cool.
34:01 Yeah.
34:02 What have you got extra for us?
34:03 All right.
34:04 Well, I've got a few. Let's start with Talk Python Training. Per request from
34:09 one of the users, they said, hey, it would be really great if I could, when I log into my
34:13 account, have more information.
34:14 So I updated it for the people who have accounts there.
34:18 If you go log into your account, it will show you all the courses you are actively learning.
34:23 I have 48 of them.
34:25 I haven't finished a bunch of them.
34:26 People might be like, Michael, you have courses on the website you haven't finished.
34:30 By the time it gets to the website, I've watched the videos two to three times.
34:33 I don't have to watch them a fourth time in sequence and like have the system record
34:38 me watching them.
34:39 So no, they're not all done, but it'll show you things like the ones you're working on,
34:42 how far are you through?
34:43 And when did you last watch it?
34:44 And when did you start?
34:45 Apparently this matters for things like submitting it as training evidence for your employer,
34:51 knowing when you started, when you finished, how far along you are, and so on.
34:55 And there's also a whole bunch more.
34:56 It shows you completed ones.
34:57 I'm going to be generating certificates for people.
35:00 I'm just, it's easy enough for me to make PDF downloads, but I want to make stuff that you
35:04 could say, post to your LinkedIn profile as an accomplishment.
35:07 You know, like, I've done the FastAPI course at Talk Python, as part of your LinkedIn record
35:13 and other places you can put those kinds of things.
35:15 So it's not as simple as just a PDF, but hopefully stuff like that comes.
35:19 Anyway, this was fun to build.
35:20 I think it looks really neat.
35:21 I think it's especially if somebody is buying the bundle, they have access to a ton of courses
35:26 and they might not remember like what course was I taking last month?
35:30 Yeah, exactly.
35:31 You didn't even buy it, but you took it.
35:33 Then you forgot which one you're doing.
35:34 This totally solves that problem.
35:35 Yeah.
35:36 Yeah, exactly.
35:37 Yeah.
35:38 That's what the request was like.
35:39 I know I took a course.
35:40 I don't remember which one I was working on in which order would, you know, help me get
35:43 back to that.
35:44 And then of course, when you're in, in a course, it has a resume button.
35:46 So you just click that to resume where you left off, but it doesn't have a cross-cutting
35:50 resume.
35:51 You know what I mean?
35:52 I do like how you split it up, so that, like you said, if I took the whole
35:56 course and there's something I want to go back and review, I can just look through and go watch those.
35:59 It's labeled well.
36:00 Yeah.
36:01 Thanks a bunch.
36:02 All right.
36:03 I talked about using latency to increase security for supply chain stuff, right?
36:10 Like, hey, if I do a uv pip update or upgrade sort of thing, or similarly with sync and add and so on, just doing like an exclude-newer or whatever: give it seven days, a week, however you do it.
36:24 There's this article by Andrew Nespin that says package managers need to chill.
36:29 And right at the top, we have this post requested by Seth Larson, the security guy at the PSF.
36:35 So yeah.
36:36 Anyway, it talks about all the different ways you make your dependency manager chill.
36:41 Like, uv has an exclude-newer, which I've been using, and it's mostly awesome, except for when a vulnerability appears in a package and you get a notification that you've got to fix it, but the fix just came out.
36:51 So you don't want to exclude it.
36:53 But in general, it's, I think, a better thing than not.
36:58 What?
36:59 Like, remind me why?
37:00 Why would you want to exclude newer stuff?
37:02 Because for popular packages, if somebody uploads a virus inside the package, like they take over the build chain or they phish the person who created it. Say they phish Dan, they get access to his GitHub, and they install a subtle thing that downloads some rootkit or info stealer to your account.
37:23 That usually gets found within the first couple of days.
37:26 And if you're always just going update, update, update, give me the latest, give me the latest, you know, the chances that you hit that are pretty high, right?
37:34 Because they won't get found in the first hour.
37:36 Even if it's found in the first hour, will people be able to react and communicate within the first hour to deal with it?
37:41 But if you just say, give it a week, like probably most of the popular ones, if there was something wrong, it would have been found out by then.
37:48 But okay, what if it got found out and got fixed, and the week boundary is there, and I, like, download the week-old one that has the bug? Or do they remove it completely from the...
38:00 If there was a virus, they remove it from PyPI.
38:02 Okay, it's not even there, even if somebody picks an old one.
38:05 Exactly.
38:06 I knew that, I was just sort of playing along.
38:08 Yeah, yeah, yeah, yeah, exactly. Yeah.
38:11 Okay.
38:12 So basically, just more people singing the same message, but this is a nice cross technology.
38:16 Are you in .NET? Are you in Ruby? Are you in JavaScript? Here's how you make it chill.
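For reference, uv's version of this lives in `pyproject.toml` under `[tool.uv]`. One caveat worth knowing: `exclude-newer` takes an absolute RFC 3339 timestamp or date, not a relative duration, so getting the trailing-one-week behavior Michael describes means bumping the date periodically (the date below is just a placeholder).

```toml
[tool.uv]
# Resolve as if the package index were frozen at this moment:
# anything published after it is invisible to the resolver.
exclude-newer = "2026-01-01T00:00:00Z"
```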
38:19 Okay, so back to AI real quick.
38:22 Paul Everett and I did this video debate, although it was not that much of a debate, but it was more of a conversation, but in kind of debate format about will AI kill open source?
38:33 Not the licensing part of it, but just will it make open source unnecessary?
38:38 Will it just stop using open source and so on?
38:40 We don't think so, but we had a really nice chat and did a little quick write up, but mostly the...
38:45 write up links to the video.
38:47 So check out the video.
38:48 I also did a write up called always activate the venv, a shell script.
38:52 So I talked about this before, I believe.
38:54 This is not the thing.
38:55 This is the lead into the thing I want to talk about.
38:57 And so as I change directories around my computer with just the terminal, it automatically finds and activates virtual environments.
39:05 But this was a thing, like in direnv.
39:08 They said, well, we can't do this.
39:10 What if somebody malicious commits a folder called venv into the repo, and it runs the activate script?
39:21 What if that activate script is malicious?
39:23 You know, that kind of thing.
39:24 So with the nice feedback from Scott H, I made a much more secure version that whitelists them.
39:32 And if it's not whitelisted, it says, hey, do you really trust this thing or do you not?
39:36 Cause you might just open up a folder and go, oh my gosh, there was a virtual environment somewhere in there.
39:40 And it activated and ran something that I didn't know was going to happen.
39:43 All that I think is super polished and really nice.
39:45 And I'm loving it.
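The whitelist approach can be sketched roughly like this, assuming a function wired into the shell's cd hook. The function names and allow-list location here are invented for illustration; Michael's actual script differs in the details.

```shell
# Allow-list file: one absolute directory path per line.
VENV_ALLOWLIST="${VENV_ALLOWLIST:-$HOME/.venv_allowlist}"

venv_is_trusted() {
    # Exact-match a directory path against the allow-list file.
    [ -f "$VENV_ALLOWLIST" ] && grep -Fxq "$1" "$VENV_ALLOWLIST"
}

maybe_activate_venv() {
    # Meant to be called from a cd hook (e.g. chpwd in zsh).
    if [ -f "$PWD/venv/bin/activate" ]; then
        if venv_is_trusted "$PWD"; then
            . "$PWD/venv/bin/activate"
        else
            echo "Untrusted venv in $PWD; add it to $VENV_ALLOWLIST to auto-activate." >&2
        fi
    fi
}
```

The key security property is the exact-line match against an explicit allow-list: a malicious venv committed into a freshly cloned repo is never sourced until you opt the directory in.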
39:46 So here's the news.
39:47 Viraj Kenwande or Kenwand, said I wrote the antidote for ZSH plugin management and dah, dah, dah, dah, dah, dah.
39:56 I ran across Michael's security-aware virtual environment activator script, which was pretty awesome.
40:01 So this is now a Z shell plugin, or Oh My Zsh plugin; zsh safe venv auto is what it's called, which I thought was pretty awesome.
40:11 That's pretty cool.
40:12 All right.
40:13 That's it for my extras.
40:14 Cool.
40:15 We each got a joke, right?
40:16 yeah.
40:17 I took mine down though.
40:18 so I'm going to have to rely on you to bring it into your thing.
40:22 So, all right, I'll find it.
40:23 No worries.
40:24 So this one is so good, and it follows this AI theme that we've been on.
40:29 I remember the Stack Overflow keyboard, and this is exactly the same vibe as the Stack Overflow keyboard.
40:35 The Stack Overflow keyboard was like the coder's keyboard, and it had a Control and a C and a V, for the joke of just copy and pasting from Stack Overflow.
40:43 Yeah.
40:44 Well, if you've done anything with Claude Code, it often asks permission to make changes, and it says, do you want to allow this once?
40:51 Allow this always, or do you want to reject this change?
40:54 Yeah.
40:55 And so it's the super fancy Apple-looking keyboard that just says allow once, allow always, or reject.
41:00 So this is funny on its own.
41:02 You all have to check out the picture.
41:03 Yeah.
41:04 I put it in the show notes.
41:05 It may or may not show up in your podcast player.
41:07 I don't know.
41:08 Maybe I can, I'll just make it the poster art.
41:10 But also, there's two too many buttons.
41:14 I think you just need allow always.
41:17 I know.
41:18 Well, let's, let's review the comments because, oh my gosh, they're so good.
41:22 There's 223 comments.
41:24 Yeah, exactly.
41:25 Issue says, waste of two buttons.
41:27 A truly productive agent should only have allow all.
41:30 This is like the, remember the joke last week that was like, so you're new to sarcasm?
41:39 The person that looks like an AI generated image.
41:41 Yeah, exactly.
41:42 It didn't obviously.
41:43 And there's the secret button dangerously skip permissions.
41:48 Somebody added it to their stream deck for real, that it actually allows it.
41:52 Yeah.
41:53 Matt says, too many buttons.
41:54 But if we go down, oh my gosh, there's...
41:59 This is the one, the actual one, Brian.
42:02 There's a used version.
42:04 It says "update after day one," and it shows the same picture, but the allow always key is, like, cracked and smashed.
42:10 It's just been hit, like, brutally.
42:13 Just used.
42:14 Yes.
42:15 Oh, this is really good.
42:18 And so someone says, you've got to be safe,
42:20 and they create a little, like, Rube Goldberg machine that just automates hitting allow once, but forever.
42:25 Yeah.
42:26 It's just a little bobbing thing that just hits it all the time.
42:30 That's funny.
42:31 Devin says, no, no, no.
42:32 We need... you know, Claude Code
42:34 will say, I've got to ask you a few questions.
42:36 Here's three options.
42:37 Do you want one, two, or three? Or sometimes there's four, or you've got to choose other.
42:40 So there's one that has like a second row that says one, two, three, four other.
42:44 I would actually use that.
42:45 I know.
42:46 It's so good.
42:47 Another one has the three, the allow once, allow always, reject.
42:51 Plus it has a microphone button to dictate to it.
42:54 These are so good.
42:55 You got to look at the comments.
42:57 That's funny.
42:58 Anyway, that's good.
43:00 Yeah.
43:01 That's my joke.
43:02 But my favorite is the one where it's, like, crushed, the "after day one" one.
43:05 Yeah.
43:06 So, well, I'll, let me try to get mine up.
43:09 Let's see.
43:10 Yeah.
43:11 Let's see.
43:12 I can pull it up for you if you don't have it.
43:13 Okay.
43:14 Yeah.
43:15 Just go ahead and pull that link up or something or the picture.
43:17 So this was submitted, along with something else, by Paul Cutler, who has some news about
43:22 AI too.
43:23 Are you getting it?
43:24 You want me to?
43:25 I get it.
43:26 It's just slow.
43:27 Okay.
43:28 It's just slow loading.
43:29 There we go.
43:30 So he sent this over on Mastodon.
43:32 Paul Cutler: today it was mandated at work that we install Claude Code because, as they
43:38 said, it has built in PowerPoint creation capabilities.
43:43 What a reason.
43:44 FML.
43:45 Yeah.
43:46 Cause you know what's coming next hour long meetings with lots of PowerPoint.
43:51 You know, I thought this was super funny at first, but also, like, it kind
43:56 of drives me nuts.
43:57 Cause, you know, I'm a coder.
43:59 So if I have to write a PowerPoint presentation, it's unusual.
44:02 So this probably is a good idea, that I could save some time and not waste time on
44:08 creating PowerPoint.
44:09 So yeah, no kidding.
44:10 Well, actually, it's a pretty neat integration.
44:12 It's not just that it knows how to do PowerPoint, but it, if you open up the Claude desktop
44:17 app, the same one that does co-work.
44:19 Yeah.
44:20 It has, like, a little what's new button, and you click it and it says install in
44:24 PowerPoint.
44:25 And it actually adds, like, a Claude section inside of your PowerPoint presentation.
44:30 Why?
44:31 So you could, like, highlight a picture and say, could we get a different picture for this?
44:35 Or highlight the text
44:36 and say, could you animate this in from the left?
44:38 Oh, like not while you're presenting though?
44:41 Like, no, it's during the building time.
44:43 Okay.
44:44 That makes more sense.
44:45 You got like format picture animation tab, and then you've got Claude now.
44:48 Yeah.
44:49 It's actually pretty neat, as opposed to just, read my PowerPoint file and do this, you know?
44:53 Sorry, Paul.
44:54 I want it to be during the presentation.
44:56 So when you're presenting, you can go, hey Claude, did anybody in the audience
45:00 stop at Starbucks before they got here?
45:03 Or something like that.
45:04 Exactly.
45:05 I forgot what I'm talking about, Claude.
45:06 Please tell the audience what this means.
45:08 Yeah.
45:09 Anyway.
45:10 Yeah.
45:11 Awesome.
45:12 Well, fun talking with you as always.
45:14 And I don't know if we need to change the name of the podcast to like Claude Bites.
45:21 Probably not.
45:22 I don't think so.
45:23 But I mean, honestly, it's a good point.
45:28 But as a meta comment for the audience out there, it's really challenging to cover this
45:33 stuff because so much of the energy in software development and tech in general is in AI.
45:38 Yeah.
45:39 But we obviously realize that there's plenty of stuff that's not really AI at all.
45:43 At the same time, it's transforming the industry like basically like the web when the web came
45:48 around.
45:49 And it's like, well, now we have the web, but we don't talk about it because it's, you know,
45:52 I don't know.
45:53 It's a balance.
45:54 Also, I just am aware that there's people that care about Python, but also they have
46:00 to care about this right now, whether they want to or not.
46:02 So it's something I'm willing to cover as well.
46:05 So, yeah.
46:06 Yeah.
46:07 And it's wild.
46:08 May we live in interesting times.
46:09 Bye.
46:10 Later, Brian.




