Transcript #347: The One About Context Managers
00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your
00:04 earbuds. This is episode 347, recorded August 8th, 2023. And I'm Brian Okken.
00:11 And I'm Michael Kennedy.
00:12 Well, we have lots of great topics today. I'm pretty excited to get to them. This episode,
00:18 of course, is, well, not of course, but is sponsored by us. So if you'd like to support
00:24 the show, you can support us on Patreon or check out one of Michael's many courses or
00:30 my other podcasts or you know how to support us.
00:34 They know the deal. Brian, let me throw one more in there for people.
00:36 Okay.
00:37 If you work for a company and that company is trying to spread the word about a product or
00:41 service, pythonbytes.fm/sponsor, they can check that out as well. Recommend that to their
00:46 marketing team.
00:47 Definitely. And if you are listening and would like to join the show live sometimes,
00:53 just check out pythonbytes.fm/live. And there's info about it there. Why don't you kick
00:58 us off, Michael, with the first topic?
01:00 Here we go. Let's talk. Let's do a lead in here to basically all of my things.
01:06 Ready? I believe it was Freddy. The folks behind Litestar, L-I-T-E star: it's an async framework
01:13 for building APIs in Python. It's pretty interesting. Similar but not the same as FastAPI. They kind
01:19 of share some of the same zen. Now, I'm not ready to talk about Litestar.
01:23 It's not actually my thing. I will at some point, probably. It's pretty popular. 2.4 thousand
01:28 stars. Which is cool. But I'm like, huh, let me learn more about this. Like, let me see what
01:32 this is built on. And so I started poking through, what did I poke through? Not the requirements,
01:37 but the Poetry lock file and the pyproject.toml and all that stuff. And came across two projects
01:44 that are not super well known, I think. And I kind of want to shine a light on them by way of
01:48 finding them through Litestar.
01:49 So the first one I want to talk about is async timeout. And I know you have some stuff you want to talk about
01:55 with context managers. And this kind of lines right up there. So this is an asyncio compatible, as in async and await keywords,
02:03 timeout class. And it is itself a context manager. Not the only way you could possibly use it, I suppose, but it's a context manager. And the idea is you say async with timeout. And then whatever you do inside of that block, that context manager, that with block, if it's asynchronous and it takes longer than the timeout you've specified, it will cancel it and raise an
02:13 exception saying this took too long, right? Maybe you're trying to talk to a database and you're not sure it's on, or you're trying to call an API and you don't want to wait more than two seconds for the API to respond, or whatever it is you're after. That's what you do: you just say async with timeout, and then it manages all of the
02:30 the nested asyncio calls. And if something goes wrong there, it just raises an exception and cancels it. That's really pretty cool. Isn't that cool? There's another way: in Python 3.11, I believe it was, they added the ability to create a task group, and then you can do certain things.
02:47 I believe you've got to use the task group itself to run the work. Okay, I'm pretty sure that's how you do it, it's been a while since I thought about it, it'd be something like task_group.create_task and you await it, something along those lines, right?
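For reference, this is roughly the shape of the Python 3.11 TaskGroup API being recalled here; a minimal sketch, with the fetch coroutine standing in for real work:

```python
import asyncio

async def fetch(n: int) -> int:
    await asyncio.sleep(0.1)  # stand-in for real async work
    return n * 2

async def main() -> None:
    # Python 3.11+: the task group itself creates and owns the tasks.
    async with asyncio.TaskGroup() as tg:
        first = tg.create_task(fetch(1))
        second = tg.create_task(fetch(2))
    # Leaving the block waits for every task created on the group.
    print(first.result(), second.result())

asyncio.run(main())
```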
03:08 And there, you've got to be really explicit, not just in the parts within that, but all the stuff that's doing async and await deep down in the guts, right? They all kind of have to know about this task group deal, if I'm remembering it correctly. And with this one, you don't have to do that, right? You just run async stuff within this with block. And if it takes too long, that's it. So in the example here, it says,
03:38 we have an await inner, and this is like all the work that's happening. I don't see why that has to be just one thing; it could be multiple things. Yeah. And it says if it executes faster than the timeout, it just runs as if nothing happened. Otherwise, the inner work is canceled internally by sending an asyncio.CancelledError into it. But from the outside, that's transformed into a TimeoutError that's raised outside the context manager scope. Pretty cool. Yeah, that's handy. Yeah, there's another way you can specify it: you can say timeout
04:08 at, like, now plus 1.5 seconds, if you'd rather do that than just say 1.5 seconds. So if you want to capture a time at some point, and then later you want to say that time plus some bit of time. You can also access things like the expired property on the context manager, which tells you whether or not it expired or whether it ran successfully. Inside the context manager, you can ask for the deadline, so you know how long you have. And you can update the timeout as it runs. You're like, oh, this part took too
04:38 long, or under some circumstance something happened, so we need to do more work. Like maybe we're checking the API for a user, but actually there isn't one, so we've got to create the new user and we've got to send them an email, and that might take more time than in the other scenario. So you can say shift by or shift to for time. So you can say, hey, we need to add a second to the timeout within this context manager. Interesting. So basically reschedule it. Yeah. Oh, that's pretty cool. Yeah. So that's
05:08 one thing. And then there's one other bit in here: wait_for, right. It says this is useful when asyncio.wait_for is not suitable, but it's also faster than wait_for because it doesn't create a separate task, which asyncio.wait_for itself does. So it's not totally unique functionality in Python, but it's a neat way to look at it. And I think this is a nice little library.
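Here is a minimal sketch of that basic usage, following the async-timeout README described above; the slow_call coroutine is just a placeholder for a database or API call, and treat the details as approximate:

```python
import asyncio
from async_timeout import timeout

async def slow_call() -> str:
    await asyncio.sleep(5)  # pretend this is a slow database or API call
    return "payload"

async def main() -> None:
    try:
        # Cancel whatever runs inside the block if it takes longer than 1.5 seconds.
        async with timeout(1.5) as cm:
            print(await slow_call())
    except asyncio.TimeoutError:
        print("gave up after 1.5 seconds")
    print("expired?", cm.expired)  # True if the deadline was hit

asyncio.run(main())
```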
05:34 Yeah, I like that. The interface to it is pretty clean as well.
05:37 Yeah. A good little API there because it's a context manager, huh?
05:41 Yeah. Well, let's reorder my topics a little bit. Let's talk about context managers.
05:46 Did I change your order? Sorry.
05:48 That's all right. So Trey Hunner has written an article called Creating a Context Manager in Python.
05:54 And it's, as you've just described: context managers are really the things that you use a with block with.
06:03 And there's a whole bunch of them. Like there's open: if you say with open and then a file name as file, then the context manager automatically closes it afterwards.
06:14 So really, this article is about: this is pretty awesome, but how do we do it ourselves? And so he kind of walks through it. He's got a bunch of detail here, which is great.
06:27 It's not too long of an article, though, and a useful one. The example I thought was awesome is having a context manager that changes an environment variable just within the with block.
06:39 And then it goes back to the way it was before. And the code for this is just a class; it's not inheriting from anything.
06:48 And the context manager class is a class that has dunder enter and dunder exit functions in it.
06:57 And then he talks about all the stuff you have to put in here.
07:00 And then in your example before, you said as, like with the timer as cm or something, so that you could access that to see, you know, values afterwards.
07:13 So Trey talks about how you get the as functionality to work.
07:19 And really, it's just that you have to return something from dunder enter.
07:22 And then there's enter and exit functions.
07:25 And there's yeah.
07:29 How do you deal with all of those?
07:31 It's just a great little article.
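A minimal sketch of the kind of environment variable context manager the article describes; the class name set_env here is illustrative, not necessarily what Trey uses:

```python
import os

class set_env:
    """Temporarily set an environment variable, restoring it on exit."""

    def __init__(self, name: str, value: str) -> None:
        self.name = name
        self.value = value

    def __enter__(self):
        self.original = os.environ.get(self.name)  # remember the old value, if any
        os.environ[self.name] = self.value
        return self  # whatever you return here is what the "as" variable gets

    def __exit__(self, exc_type, exc_value, traceback):
        if self.original is None:
            os.environ.pop(self.name, None)
        else:
            os.environ[self.name] = self.original
        return False  # don't swallow exceptions from the block

with set_env("APP_MODE", "debug"):
    print(os.environ["APP_MODE"])  # "debug" only inside the block
```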
07:33 I love using context managers, and I think it makes sense to practice writing a couple of these, because it helps to know how to use one in the context of your own code.
07:45 There are frequently times where you have to do something and, you know, you're going to have to clean up afterwards.
07:50 Or there's some final thing that you have to do.
07:52 You don't really want to have that littered all over your code, especially if there are multiple exit points or return points, and a context manager is a great way to deal with that.
08:01 I did want to shout out to pytest a little bit.
08:04 So the environment variable example is a great, useful one for normal code, if you ever want to change the environment outside of testing.
08:14 But if you're doing it in testing, I recommend making sure that you... oh, I scrolled to the wrong spot.
08:20 There's a monkeypatch thing within pytest.
08:23 So if you use fixtures, there's monkeypatch, and there is a setenv portion of monkeypatch.
08:32 So within a test, that's how you change an environment variable.
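In a test, that looks roughly like this; monkeypatch is pytest's built-in fixture and it undoes the change automatically when the test finishes:

```python
import os

def test_reads_app_mode(monkeypatch):
    # pytest reverts this environment change after the test runs.
    monkeypatch.setenv("APP_MODE", "debug")
    assert os.environ["APP_MODE"] == "debug"
```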
08:35 But outside of a test, why not create your own context manager?
08:39 Oh, you're muted.
08:40 So the environment variable only exists while you're in the context block, right?
08:44 That's cool.
08:45 Yeah.
08:45 With block.
08:46 Yeah.
08:46 Or you're changing it.
08:47 Like if you wanted to add a path, add something to the PATH or something.
08:51 Sure.
08:52 There's other ways to do the path.
08:54 But let's say it's a, I don't know, some other Windows environment variable or something.
08:58 But yeah.
08:59 Yeah.
08:59 These things are so cool.
09:00 So if you ever find yourself writing, try finally, and the finally part is unwinding something
09:06 like it's clearing some variable or deleting a temporary file or closing a connection, that's
09:13 a super good chance to be using a context manager instead.
09:16 Because you just say with the thing, and then it goes, I'll give two examples that I think
09:20 were really fun and that people might connect with.
09:22 So prior to SQLAlchemy 1.4, the session, which is the unit of work design pattern object
09:30 in SQLAlchemy, the idea of those is: I start a session, I do some queries, updates, deletes,
09:37 inserts, more work.
09:38 And then I commit all of that work in one shot.
09:40 Like that thing didn't used to be a context manager.
09:43 And so what was really awesome was I would create one, like a wrapper class that would
09:49 say in this block, create a session, do all the work.
09:52 And then if you look at the dunder exit, it gets passed whether or not there was an exception.
09:56 And so my context manager, you could say, when you create it, do you want it to auto-commit the
10:00 transaction
10:01 if it succeeds, and auto-roll it back if there's an error?
10:04 Yeah.
10:04 And so you just say in the exit: is there an error? Roll back the session.
10:08 If there are no errors, commit the session.
10:11 And then you just, it's like beautiful, right?
10:13 You don't have to juggle that.
10:13 There's no try/finally.
10:14 It's awesome.
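A sketch of the wrapper being described, where the commit or rollback decision lives entirely in dunder exit; session_factory here is a stand-in for however you build SQLAlchemy sessions, not a specific API:

```python
class session_scope:
    """Commit the unit of work on success, roll it back if the block raised."""

    def __init__(self, session_factory, auto_commit: bool = True) -> None:
        self.session_factory = session_factory
        self.auto_commit = auto_commit

    def __enter__(self):
        self.session = self.session_factory()
        return self.session

    def __exit__(self, exc_type, exc_value, traceback):
        try:
            if exc_type is not None:
                self.session.rollback()   # an exception escaped the block
            elif self.auto_commit:
                self.session.commit()     # everything succeeded
        finally:
            self.session.close()
        return False  # let any exception keep propagating

# Usage, assuming session_factory is something like a SQLAlchemy sessionmaker:
# with session_scope(session_factory) as session:
#     session.add(new_user)
```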
10:16 Another one, to put it in something sort of out of the normal scope maybe, where the
10:21 database one might be something people already think of, is colors.
10:25 Yeah.
10:25 Colorama.
10:26 So if you're using something like Colorama, where you're like, I want to change the color of the
10:31 text for this block, right?
10:33 So you, there's all sorts of colors and cool stuff.
10:36 It's like a lightweight version of Rich, but just for colors. You can do things like print
10:40 foreground dot red, and every bit of text that comes after that
10:45 will be red or whatever.
10:46 So you can create a context block that is like a colored block of output.
10:51 And then there's a reset all, style dot reset all, you can do.
10:54 So you just, in the open, you pass in the new color settings.
10:57 You do all your print statements and whatever deep down.
11:00 And then on the exit, you just say print style dot reset all out of Colorama.
11:04 And it's undone, like the color vanishes, or you capture what it was and then you reset
11:09 it to the way it was before, something along those lines.
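Something like this, as a sketch; Fore.RED and Style.RESET_ALL are real Colorama names, but the context manager class itself is just an illustration:

```python
import colorama
from colorama import Fore, Style

class colored_output:
    """Print everything inside the block in one color, then put it back."""

    def __init__(self, color: str = Fore.RED) -> None:
        self.color = color

    def __enter__(self):
        colorama.init()
        print(self.color, end="")       # switch the terminal foreground color
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        print(Style.RESET_ALL, end="")  # always reset, even if the block raised
        return False

with colored_output(Fore.GREEN):
    print("everything in here comes out green")
print("back to normal")
```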
11:12 Anyway, this is, I really like this, that kind of stuff, right?
11:15 People maybe don't think about color as a context manager, but.
11:18 But it kind of is because you always have to do the thing afterwards.
11:22 You always have to do this.
11:23 And put it back.
11:24 It's so annoying.
11:25 Yeah.
11:25 Anything where you have to put it back, any other data structures that you may
11:30 have left dirty.
11:31 You've got queues sitting around that you want to clean up afterwards.
11:34 Those are great for context managers.
11:36 Absolutely.
11:37 Brandon Brainer notices that and points out that there's also contextlib for
11:43 making them.
11:44 And I'm glad he brought that up.
11:46 I was going to bring that up.
11:47 Contextlib is great, especially for quickly doing context managers.
11:53 And I think the documentation for it is pretty good.
11:57 You can use the contextmanager decorator and then you can use a yield for it.
12:01 But I really like the notion of, I guess you should understand both.
12:05 I think people should understand how to write them with just Dunder methods and how to write
12:10 them with the context manager and context lib.
12:13 I think both are useful.
12:14 But to mentally understand how the enter and exit and all that stuff works, I think, is important.
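For comparison, the contextlib version of the same environment variable idea from earlier; everything before the yield plays the role of dunder enter, everything after it plays dunder exit:

```python
import os
from contextlib import contextmanager

@contextmanager
def set_env(name: str, value: str):
    original = os.environ.get(name)
    os.environ[name] = value
    try:
        yield  # the body of the with block runs here
    finally:
        if original is None:
            os.environ.pop(name, None)
        else:
            os.environ[name] = original

with set_env("APP_MODE", "debug"):
    print(os.environ["APP_MODE"])
```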
12:20 So thanks, Brian.
12:21 Yes.
12:21 And let's tie the thing that I opened with and this one a little bit tighter together.
12:26 Brian, there's an a-enter and an a-exit for async with blocks.
12:32 Right.
12:33 So if you want an async-enabled version, you just create an async def a-enter,
12:38 then an async def a-exit.
12:40 And now you can do async and await stuff in your context manager, which is sort of the async equivalent
12:47 of the enter and exit.
12:49 Okay.
12:49 And the context lib also has these async context manager options, a enter and a exit.
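A minimal sketch of that async flavor; the sleeps stand in for awaiting a connection being opened and closed:

```python
import asyncio

class managed_connection:
    """Async context manager: __aenter__/__aexit__ instead of __enter__/__exit__."""

    async def __aenter__(self):
        await asyncio.sleep(0.1)  # stand-in for awaiting connection setup
        return self

    async def __aexit__(self, exc_type, exc_value, traceback):
        await asyncio.sleep(0.1)  # stand-in for awaiting a clean shutdown
        return False

async def main() -> None:
    async with managed_connection():
        print("doing async work inside the block")

# contextlib.asynccontextmanager gives the decorator-and-yield equivalent.
asyncio.run(main())
```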
12:57 Cool.
12:58 Yeah.
12:58 Perfect.
12:58 Yeah.
12:59 Exactly.
12:59 Very nice.
13:01 Very nice.
13:01 All right.
13:02 Let's go to the next one.
13:03 Yeah.
13:03 So server sent events, let's talk about server sent events, server sent events.
13:09 People probably, well, they certainly know what request response is for the web, because
13:14 we do that in our browsers all the time.
13:16 I enter URL, the page comes back.
13:19 I click a button.
13:20 It does another request.
13:21 It pulls back a page.
13:22 Maybe I submit a form, it posts it, and then it pulls back a page, right?
13:25 Like that's traditional web interchange, but that is a stateless kind of one time.
13:32 And who knows what happens after that sort of experience for the web.
13:35 And so there were a bunch of different styles of like, what if the web server and the client
13:41 could talk to each other type of thing, right?
13:43 In the early days, there was what's called long polling.
13:46 This works, but it is bad for your server, because what you do is you make a request and the server
13:52 doesn't respond right away.
13:54 It just says, this request is going to time out in five minutes and then it'll wait.
13:58 And if it has any events to send during that time, it'll respond.
14:02 And then you start another long poll event cycle, right?
14:06 But the problem is, for everything that might be interested, you've got an open
14:10 socket just waiting.
14:11 Tied up, like, in the process, a request queue sort of thing.
14:16 It's not great.
14:16 And then web sockets were added and web sockets are cool because they create this connection
14:21 that is bi-directional, like a binary bi-directional socket channel from the web server to the client,
14:27 which is cool.
14:28 Not great for IoT things, mobile devices are not necessarily super good for web sockets.
14:35 It's kind of heavyweight.
14:36 It's like a very sort of complex, like we're going to be able to have the client talk to
14:40 the server, but also the server, the client, they can respond to each other.
14:44 So a lighter weight, simpler version of that would be server sent events.
14:48 Okay.
14:49 Okay.
14:50 So what server sent events do is it's the same idea.
14:53 Like I want to have the server without the client's interaction, send messages to the client.
14:58 So I could create like a dashboard or something, right?
15:00 The difference with server sent events is it's not bi-directional.
15:04 Only the server can send information to the client, but often for like dashboard type things,
15:09 that's all you want.
15:09 Like I want to pull up a bunch of pieces of information and if any of them change, let
15:13 the server notify me, right?
15:15 Oh yeah.
15:16 I want to, I want to create a page that shows the position of all the cars in F1, their
15:20 last pit stop, their tires, like all of that stuff.
15:23 And like, if any of them change, I want the server to be able to let the browser know, but
15:28 there's no reason the browser needs to like make a change, right?
15:31 It's like, it's a watching, right?
15:33 If you, so if you have this watching scenario, server sent events are like a simpler, more
15:37 lightweight, awesome way to do this.
15:38 Okay.
15:39 We all know what SSE, server sent events are.
15:41 Okay.
15:42 So if you want that in Python, there's this cool library, which is not super well known,
15:48 but equally cool is HTTPX.
15:51 So HTTPX is kind of like requests, sort of maybe the modern-day version of requests, because it
15:57 has a really great.
15:57 Async and a wait story going on.
16:00 So there's this extension called HTTPX-SSE for consuming server-sent events with
16:08 HTTPX.
16:09 Oh, okay.
16:10 Yeah.
16:10 So if you want to be a client to one of these things in Python, to some server that's sending
16:15 out these notifications and these, these updates.
16:18 Well, HTTPX is an awesome way to do it because you can do async and await.
16:21 So just a great client in general.
16:23 And then here you plug this in and it has a really, really clean API to do it.
16:27 So what you do is you would get the connect_sse function out of it.
16:32 And you just, with HTTP X, you just create a client and then you say, connect the SSE to
16:38 that client to some place, gives you an event source.
16:40 And then you just iterate, you just say for each event, and it just blocks until the
16:44 server sends you an event.
16:45 And it'll, I think, raise an exception if the socket's closed is what happens.
16:49 So you just like loop over the events that the server's sending you when they happen.
16:53 Okay, cool.
16:54 Isn't that cool?
16:55 So yeah, so you could like in my F1 example, you could subscribe to the changes of the race
17:00 and when anything happens, you would get like, there's a new tire event and here's the data
17:04 about it and the ID of the event session and all those different things just streaming to
17:11 you.
17:11 And it's like literally five lines of code, sorry, six lines of code with the import statement.
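The synchronous client version is roughly this, following the httpx-sse README; the URL is just a placeholder:

```python
import httpx
from httpx_sse import connect_sse

with httpx.Client() as client:
    # connect_sse wraps a normal HTTPX request and exposes the event stream.
    with connect_sse(client, "GET", "http://localhost:8000/sse") as event_source:
        for sse in event_source.iter_sse():
            # Blocks until the server pushes the next event.
            print(sse.event, sse.data, sse.id)
```

The async flavor is the same shape with the async client, an async with, and an async for over the events.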
17:16 So what does it look like on the server then?
17:19 I guess that's not what this project is about.
17:21 It's not your problem.
17:22 However, they do say you can create a server, sorry, a Starlette server here, and they
17:29 have below an example you can use.
17:30 So it's cool.
17:31 They've got a Python example for both ends.
17:33 Yeah.
17:34 So what you do on the server is you create an async function, and here's an async function
17:39 that just yields a series of numbers.
17:43 It's kind of a really cheesy example, but it sleeps for about an async second.
17:47 It's like a New York second, like a New York minute, but one sixtieth of it.
17:51 And it doesn't block stuff.
17:52 So you, for an async second, you sleep and then it yields up the data.
17:57 Right.
17:57 And then you can just create one of these event source responses, which come out of the
18:05 sse-starlette package, which is not related to this project, I believe, but it's kind of the server-side implementation.
18:10 And then you just set that as an endpoint.
18:11 So in order to do that, they just connect to that.
18:14 And then they just get these numbers just streaming back every second.
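The server side they describe is built on the sse-starlette package; a sketch of that shape, with the route path made up here:

```python
import asyncio
from sse_starlette.sse import EventSourceResponse
from starlette.applications import Starlette
from starlette.routing import Route

async def numbers():
    for number in range(1, 6):
        await asyncio.sleep(1)       # one "async second" between events
        yield {"data": str(number)}  # each yield becomes one SSE message

async def sse_endpoint(request):
    return EventSourceResponse(numbers())

app = Starlette(routes=[Route("/sse", endpoint=sse_endpoint)])
# Run with an ASGI server, e.g.: uvicorn this_module:app
```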
18:18 That's pretty cool.
18:19 Yeah.
18:20 I mean, all of this, like if I, if I hit command minus one time, all of the, both the server
18:26 and the client fit on one screen of code.
18:28 Yeah.
18:28 Yeah.
18:29 Yeah.
18:29 That's pretty neat.
18:30 What else do I have to say about it?
18:32 It has an async way to call it and a synchronous way to call it because that's HTTPX's style.
18:38 It shows how to do it with the async.
18:40 Here's your async with block.
18:41 I mean, it's full of context managers this episode, and it shows you all the different
18:44 things that you can do.
18:45 It talks about how to handle reconnects and, you know, all of these little projects
18:51 and all these things we're talking about are, there's sort of a breadcrumbs through
18:56 the trail of Python.
18:57 So it says, look, if there's an error, what you might do about that?
19:01 Like if you disconnect, you might want to just let it be disconnected, or you might want
19:05 to try to reconnect or who knows, right?
19:07 What you need to do is not really known by this library.
19:10 So it just says they're just going to get an exception, but it does provide a way to resume
19:15 by holding onto the last event ID.
19:17 So you can say like, Hey, you know, that generator you were sending me before, like, let's keep
19:21 doing that, which is kind of cool.
19:24 And it'll just pick up, but here's the breadcrumbs.
19:26 It says, here's how you might achieve this using stamina.
19:28 And it has the operations here, and it gives a decorator that says at retry on
19:35 httpx.ReadError.
19:36 And then it says how to retry it again.
19:39 And how often.
19:39 So stamina is a project by Hynek that allows you to do asynchronous retries and all sorts
19:47 of cool stuff.
19:48 So maybe something fun to have.
19:50 We talked about stamina before.
19:51 I don't believe we have.
19:52 I don't think we have.
19:53 We'll have to.
19:54 I don't remember it either.
19:54 But anyway, yeah, there's a lot of cool stuff in here, right?
19:57 Yeah.
19:58 And yeah, so people can go and check this out.
20:01 But here's the retrying version.
20:03 You can see an example of that where it just automatically will continue to keep going.
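The retrying version has roughly this shape; stamina.retry takes the exception type to retry on and how many attempts to make, and the function body here is just an illustration, not the library's own example:

```python
import httpx
import stamina

# Retry the request a few times if the connection read fails mid-stream.
@stamina.retry(on=httpx.ReadError, attempts=3)
def fetch(url: str) -> httpx.Response:
    response = httpx.get(url)
    response.raise_for_status()
    return response
```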
20:08 So pretty cool little library here.
20:11 HTTPX-SSE.
20:13 It has 51 GitHub stars.
20:15 I feel like it deserves more.
20:16 So people can give it a look.
20:18 Yeah.
20:19 Well, speaking of cool projects on Python, cool projects in Python, you'll probably grab them
20:25 from PyPI, right?
20:26 Of course.
20:27 Do a pip install.
20:28 And let's take a look at stamina, for instance.
20:31 In a lot of projects, one of the things you can do, you can go down and on the left-hand
20:36 side, there's project description, release history, download files.
20:40 Everybody has that.
20:41 All of them have that.
20:42 But then there's project links.
20:43 And these change.
20:45 They're different on different projects.
20:46 So stamina's got a change log and documentation and funding and source.
20:51 And they all have icons associated with it.
20:54 So I don't know what we have.
20:57 If we go to source, it goes to GitHub, looks like.
20:59 Funding, it's a GitHub sponsors.
21:02 That's pretty cool.
21:03 Documentation.
21:04 I'm looking at the bottom of my screen.
21:05 Documentation links to stamina.hynek.me.
21:08 Okay.
21:09 Interesting.
21:10 Change log.
21:11 Anyway, these links are great on projects.
21:14 Let's take a look at, but they're different.
21:16 So textual just has a homepage.
21:18 Okay.
21:19 HTTPX has change log homepage documentation.
21:25 pytest has a bunch also.
21:26 Also, it has a tracker.
21:28 That's kind of neat.
21:29 And Twitter.
21:30 A bug in there, yeah.
21:30 So how do you get these?
21:33 So if you have a project, it's really helpful to put these in here.
21:37 And so there's Daniel Roy Greenfeld, who wrote a blog post called PyPI Project URLs Cheat Sheet.
21:46 So basically figured all this stuff out.
21:48 It's in, it's not documented really anywhere except for here, but it's in the warehouse code.
21:53 And the warehouse is the software that runs PyPI.
21:56 And I'm not going to dig through this too much, but basically it's trying to figure out, from
22:02 the name that you put in for a link, which icon to use.
22:08 So there's a bunch of different icons that are available.
22:11 And anyway, we don't need to look at that too much because Daniel made a cheat sheet for us.
22:16 So he shows a handful of them on his post, also a link to where they all are.
22:23 But then what it is, is you've got project URLs in your pyproject.toml file.
22:29 And it just lists a bunch of them that you probably want, possibly like homepage, repository, changelog.
22:36 Anyway, this is a really cool cheat sheet of things that you might want to use and what names to give them.
22:42 So it's a name equals string with the URL and the names on the left can be anything.
22:50 But if they're special things, you get an icon.
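In pyproject.toml, that section looks roughly like this; the URLs are placeholders, and the names on the left are what PyPI matches against when it picks an icon:

```toml
[project.urls]
Homepage = "https://example.com"
Documentation = "https://example.com/docs"
Repository = "https://github.com/you/yourproject"
Changelog = "https://github.com/you/yourproject/blob/main/CHANGELOG.md"
```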
22:53 So nice.
22:54 Anyway, and there's even a Mastodon one now.
22:57 So that's cool.
22:58 Yay.
22:58 You have to change the Twitter one.
23:00 Twitter.
23:02 Oh, it's Twitter or X.
23:03 Interesting.
23:04 Yeah.
23:05 Think how much math that's going to break.
23:06 It has to be called X everywhere now.
23:07 No more algebra for you.
23:09 Yeah.
23:10 What a dumpster fire.
23:12 Okay.
23:12 Mike out in the audience points out the icons are courtesy of Font Awesome.
23:18 And indeed they are.
23:19 If you're not familiar with Font Awesome, check that out.
23:21 So like we could come over here and search for, wait for it, GitHub.
23:25 And you get all these icons here.
23:27 One of them is the one that shows up.
23:30 I don't remember which one of these it would be.
23:32 But if, you know, so it shows you the code that you need.
23:36 It's just fa brands space fa dash GitHub for the icon there.
23:41 But if for some reason you're like, what if there was a merge one?
23:44 I want to merge.
23:46 But there's no merge that's there like on your other project, right?
23:49 Then there's, there's, I don't know how many icons are in Font Awesome.
23:51 Like 6,000.
23:52 Yeah.
23:53 6,444 in total.
23:55 And maybe.
23:56 No, I take that back.
23:57 Cause there's new 12,000 new ones.
23:59 So there's a, there's a lot.
24:00 Let's just say there's a lot here.
24:01 Well, the top said 26,000.
24:04 So that's.
24:04 There we go.
24:05 Yeah.
24:06 Awesome.
24:06 Yeah.
24:07 So.
24:07 Oh, there's a fire one.
24:09 There's so many good ones.
24:10 That'd be a good one for Twitter now.
24:12 By the way, if you go to Python bytes and you would be, I would be, you go to the bottom,
24:17 like all these little icons.
24:18 These are all Font Awesome.
24:19 Even the little heart about made in Portland.
24:21 Is Font Awesome a free thing or do you got to pay for it?
24:25 You know?
24:25 Yes and no.
24:26 So Font Awesome is there's like, if you, if I search for GitHub again, you see that some
24:31 say pro and some don't.
24:32 Yeah.
24:33 Oh, okay.
24:34 Pro.
24:34 The ones that don't say pro are free.
24:36 The ones that say pro are pro.
24:37 They cost like a hundred dollars a year subscription, but I have a, I bought a subscription to it
24:43 and just canceled it because.
24:44 You got the icons you need.
24:46 I got the icon.
24:47 If I'm just locked at version six for a good long while, that's fine.
24:50 Maybe someday I'll buy more, but yeah.
24:51 So.
24:52 Okay.
24:52 There you go.
24:53 Nice.
24:53 So yeah, that's, that's awesome.
24:55 But it's cool how you, or how you pointed out, Danny related that to the pyproject.toml.
25:01 I had no idea that that's how those went together.
25:02 It's cool.
25:03 Nice.
25:04 All right.
25:05 All right.
25:05 Well, I've got my screen up.
25:06 I'm off to the next one, huh?
25:07 Yeah.
25:07 We're done with them, aren't we?
25:09 That was, I have no more items, no more items to cover other than, other than extras.
25:13 Okay.
25:14 Well, I have a couple extras.
25:16 So I, a couple.
25:20 More people.
25:20 More people.
25:21 More people.
25:22 More people in Python people.
25:23 What did I want to say?
25:25 Oh, just that I had some great feedback.
25:28 So I love starting something new.
25:31 It's good to provide feedback for people.
25:33 And I got some wonderful feedback that the music that I stole from Test and Code is annoying
25:38 on Python people because it's a completely different tone.
25:41 And fair enough.
25:42 So I'm going to go through and rip out all the intro music out of Python people.
25:47 So, and also the next episode's coming out this week.
25:50 It'll be Bob Belderbos from PyBites.
25:52 It's a good episode.
25:53 So it should be out later this week.
25:55 Do you have any extras?
25:56 I do.
25:57 I do.
25:58 I do.
25:58 I have some cool announcements and some extras and all of those things.
26:08 First of all, physicists achieve fusion with net energy gain for the second time.
26:08 So, you know, the holy grail of energy is fusion, not fission, right?
26:13 Just squishing stuff together like the sun does and getting heavier, heavier particles
26:18 and tons of energy with no waste, no negative waste, really.
26:21 I mean, there's output, but like helium or something, right?
26:24 Oh, no.
26:25 We need more helium anyway.
26:26 I don't know, Brian, if you knew, but there's a helium shortage and a crisis of helium potentially.
26:30 We'll see that someday.
26:31 Anyway, the big news is the folks over at the NIF repeated this big breakthrough that they
26:39 had last year at the National Ignition Facility.
26:41 So congrats to them.
26:43 And why am I covering this here other than, hey, it's kind of cool science, is last year
26:48 after that, or actually earlier this year, I had Jay Solmanson on the show and we talked
26:54 about all the Python that is behind that project at the NIF and how they use Python to help power
27:00 up the whole national fusion breakthrough that they had.
27:03 So very cool.
27:05 If people want to learn more about that, they can listen to the episode 403 on Talk Python
27:09 to Me.
27:09 And just congrats to Jay and team again.
27:12 That's very cool.
27:13 Do they have a 1.21 gigawatt one yet?
27:17 That would be good.
27:19 They can't go back in time yet.
27:20 No.
27:20 Okay.
27:21 No.
27:22 But if you actually look, there's a video down there.
27:28 If you actually look at the project here, the machine that it goes through, this is like
27:34 a warehouse room-sized machine of lasers and coolers and mirrors and insane stuff that it
27:42 goes through until it hits like a dime-sized or small marble-sized piece somewhere.
27:48 There's like an insane...
27:49 It's not exactly what you're asking for, but there is something insane on the other side
27:54 of the devices.
27:54 Yeah.
27:55 We've got ways to get this into a car.
27:57 Yeah.
27:58 I mean, Marty McFly has got to definitely wait.
28:01 Yeah.
28:01 To save his parents' relationship.
28:03 Okay.
28:03 All right.
28:05 All right.
28:05 I have another bit of positive news.
28:07 I think this is positive.
28:07 This is very positive news.
28:08 Yeah.
28:09 The other positive news is, you know, I've kind of knocked on Facebook and Google.
28:13 Last time, I think I was railing against Google and their DRM for websites, like their ongoing
28:20 persistent premise that we must track and retarget you.
28:24 So how can we make the web better?
28:26 Like, no, no, that's not the assumption we need to start with.
28:29 No, it's not.
28:30 So I would, you know, I just want to point out maybe like a little credit, a little credit
28:34 to Facebook at this time, a little, maybe a positive shout out.
28:38 So there's a bunch of rules that I think are off the target here.
28:42 And for example, there were a bunch of attempts and like in Spain, there was an attempt to
28:49 say, if you're going to link to a news organization, you have to pay them.
28:55 Like, wait a minute.
28:56 So our big platform is sending you free traffic.
29:01 And to do that, we have to pay you, you know, because the newspapers are having a hard time
29:05 and they're important.
29:05 But maybe that's a little bit off.
29:08 Probably the most outrageous of this category of them were somewhere in Europe.
29:12 I can't remember if it was the EU in general or a particular company, a country rather, sorry.
29:16 They were trying to make companies like Netflix and Google, because of YouTube,
29:22 pay for their broadband because people consume a lot of their content.
29:27 So it uses a lot of their traffic.
29:28 It's like, wait a minute, we're paying already to like get this to you.
29:32 And then you're going to charge us to make you pay for our infrastructure.
29:35 I don't know.
29:36 I just, you're like, oh, no, no, no.
29:38 That seems really odd to say like, you know, Netflix should pay for Europe's fiber because
29:44 people watch Netflix.
29:45 I don't know.
29:46 That just, it seems super backwards to me.
29:48 Okay.
29:49 I'm going to be devil's advocate here.
29:51 I think that, that if Netflix, for example, if Netflix is taking half the bandwidth or
29:57 something like that, then all of the infrastructure costs, half of those costs are benefiting Netflix
30:03 and they're profiting off of it.
30:05 I think that's sort of legitimate.
30:06 It depends on the scale, right?
30:09 I think like, we are not taking a ton of bandwidth from Europe, so it would be weird
30:15 for us to have to pay something.
30:16 But if I'm taking a measurable percentage, that's probably maybe okay.
30:21 the other side is like, I read Google news still, even though I'm not a huge fan of
30:28 Google, but I read Google news.
30:29 There's a lot of times where that's enough.
30:31 I'm like, is there anything important happening?
30:33 I'm just reading the headlines.
30:34 I'm not clicking on the link and that, that benefit then for Google wouldn't be there if
30:40 the newspapers weren't there.
30:41 So I would say some money going to the newspapers that are providing those headlines.
30:45 I think that's fair.
30:46 So I, I, I certainly hear what you're saying with the news on that.
30:50 we still haven't got to the topic yet.
30:52 Okay.
30:52 I, no, no, but I, I totally hear you.
30:54 I think with the, the bandwidth, like the customers decide like no one's Netflix isn't projecting
31:00 stuff onto the people in Europe and they're receiving it out of it.
31:03 They, they seek it out.
31:05 Right.
31:05 So I don't know.
31:05 I feel like, but we can, yeah, that's, I, I appreciate the devil's advocate.
31:10 Yeah.
31:10 Okay.
31:11 What was the, and Google news?
31:12 So here's the news though.
31:14 Facebook and more generally meta is protesting a new Canadian law, obliging it to pay for
31:21 news that if, so if my mom shares an article, say my mom was Canadian and she shared an article
31:28 to some, some news thing, the Canadian post or whatever, then on Facebook, then Facebook
31:36 would have to pay the Canadian post.
31:38 Cause my mom put it there.
31:39 So they're protesting it by no longer having news in Canada.
31:43 News doesn't exist in Canada now.
31:46 On Facebook or.
31:47 Yeah.
31:47 So my mom tried to post it.
31:48 They were just going to like, that can't be posted.
31:50 Oh, well that's weird.
31:52 Isn't that weird.
31:53 So I, I actually kind of agree with you on the Google news bit, like where a good chunk
31:57 of it is there and it becomes almost a reader type service, but like Facebook doesn't do that.
32:02 It just says, well, here's the, here's the thumbnail and you could click on it.
32:05 But also there's a lot of, a lot of anger below it, but get their news from people sharing
32:10 it on Facebook.
32:11 They follow.
32:12 Do they click it?
32:13 That's the, do they, do they, do they often not?
32:16 yeah, possibly.
32:18 And is it free, is the bandwidth free, if, like, if I share it with a million people
32:23 and they don't click on it, does it cost the newspaper?
32:28 Possibly they might be drawing it for the headline and the image and all that stuff.
32:31 They might.
32:32 Yeah.
32:32 They probably cache it, but they might not.
32:34 I do.
32:34 So I'll leave, I'll put this out there for people to have their own opinions.
32:38 but I, I think this is something that Facebook should stand up to.
32:42 And just me not speaking for Brian.
32:44 Well done Facebook.
32:45 I don't think, I don't think this makes any sense.
32:47 Like they're protesting this law that makes them pay if my mom were Canadian and put
32:53 news into her feed.
32:54 Yeah.
32:54 And I'll just say, way to go Canada.
32:57 I like it.
32:58 Awesome.
32:59 All right.
33:00 Cool.
33:00 That's it for all the items I got.
33:02 You covered yours, right?
33:03 Yes, I did.
33:05 So let's do something funny before we get into fisticuffs.
33:08 So before, no, never.
33:10 So, well, you want to talk about fisticuffs.
33:12 So let's see the joke.
33:13 So this joke makes fun of a particular language.
33:15 The point is not to make fun of that language.
33:17 It's to make fun of AI.
33:19 Okay.
33:20 So people who are, want to support the AI, they can send me their angry messages.
33:23 People who are fans of the language I'm about to show you, please don't.
33:26 Not about that.
33:29 Okay.
33:29 So if you were working with a GitHub copilot, you know, a lot of times it tries to auto-suggest
33:36 stuff for you, right?
33:37 That didn't zoom that.
33:39 It tries to auto-suggest stuff for you.
33:41 Yeah.
33:42 And so if you say like, this is C-sharp, people know I've done C-sharp before.
33:46 I like it at all.
33:48 so not make fun of it, but it's just a slash slash day.
33:51 And then there's an auto-complete statement that the copilot is trying to write.
33:54 What does it say?
33:55 Right?
33:55 It says day one of C-sharp, and I already hate it.
33:59 So like how many people have written this in their like online journals or something?
34:06 Yes, exactly.
34:07 What in the world is going on here?
34:09 So that's, there's some, there's some fun comments, but, they're not too great
34:18 down here, but I just, I just thought like, you know, this, this weirdo, weirdo auto-complete,
34:24 like we're going to get into this, where this kind of stuff happens all the time, right?
34:27 This is kind of, the Google suggest, you know, let's see if I can get it to work here.
34:32 We go to Google and type American, Americans are, you know, what does it say?
34:39 Right?
34:39 Struggling.
34:40 Entitled.
34:41 Yeah.
34:42 Like C-sharp developers are, and then it'll give you like a list or let's do it with Python,
34:48 right?
34:49 Python, Python, right?
34:51 Who are the Python?
34:52 Why are they paid so much?
34:53 Who hired these people?
34:55 Et cetera, right?
34:56 So this is the AI equivalent, but it's going to be right where you work all the time.
35:00 That's funny.
35:01 And Mo and, Joe out there says, I wonder what it says for day one of Python.
35:06 I have no idea, but somebody had Copilot installed.
35:10 They should let us know.
35:11 And what maybe we'll point it out next time.
35:14 Yeah.
35:15 Interesting.
35:16 I haven't turned it on, but.
35:17 No, I haven't either.
35:19 All right.
35:19 All right.
35:20 Well, thanks.
35:20 Apparently many people do.
35:21 And I really enjoy it.
35:22 Like the usage numbers are kind of off the chart.
35:25 Well, so yeah, I'll just say one of the people used to not like maintaining software written
35:31 by others.
35:32 And they mostly like writing green field code.
35:35 But with Copilot, you don't have to write your first draft.
35:38 You can, you can just become a permanent maintainer of software written by something.
35:43 Exactly.
35:44 I wrote the bullet points and now I maintain what the AI wrote.
35:47 Fantastic.
35:47 Exactly.
35:48 Hope you understand it.
35:50 Yeah, exactly.
35:51 But anyway, well, thanks a lot for a great day again or a great episode.
35:58 Absolutely.
35:58 Thank you.
35:59 See y'all later.