Brought to you by Michael and Brian - take a Talk Python course or get Brian's pytest book

Transcript #273: Getting dirty with __eq__(self, other)

Recorded on Tuesday, Mar 1, 2022.

00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.

00:05 This is episode 273, recorded March 1st, 2022. And I'm Brian Okken.

00:11 I'm Michael Kennedy.

00:13 Well, welcome, Michael. It's good to have us here.

00:16 So it's great to see you, as always. It feels like spring is almost here. It's March. I can't believe it. So pretty awesome. Yeah. Fun to be talking Python with you.

00:26 Yeah. So should we kick it off with your first item?

00:29 Let's do it.

00:30 I'm a big fan of science, math, and all those things.

00:35 And I came across this article because I was reading about science, not because I was reading about Python.

00:40 But then I thought, "Oh, there has to be a Python story here.

00:43 Let's get into it and see if I can track it down." And wow, was it not easy to find.

00:47 So here's the deal.

00:48 I saw an article called "Physics Breakthrough as AI Successfully Controls Plasma in a Nuclear Fusion Experiment." That's so cool.

00:59 - That's amazing, right?

01:00 So let me put a few things together here.

01:01 Nuclear fusion, not fission, that's the kind of nuclear we want.

01:05 That is harnessing the sun with no negative effects to turn hydrogen into helium and so on, right?

01:13 If we could harness that, that's like free, super easy energy forever.

01:18 It's incredible, right?

01:19 So people have been working on this for a long time.

01:22 The way that I understand it, which is probably pretty piecemeal, is you put some kind of material like hydrogen in the middle, and then you blast it with tons of energy, and that creates this plasma.

01:39 You've got to control it with lasers and magnets to keep the pressure high enough, in addition to just the heat, to actually make the fusion work, right?

01:50 So there's been some success like, hey, we got fusion to work for a while.

01:55 It just took more energy than it put out.

01:57 So, you know, it's not a super great power plant, but it did do the science thing, right?

02:02 - Yeah.

02:03 - So, here's the deal.

02:04 This article says they've used artificial intelligence to teach it how to make instantaneous or near instantaneous adjustments to the magnetic field and the lasers in order to actually get better results with fusion, right?

02:19 So, take it farther along.

02:20 And it says, "In a joint effort, the Swiss Plasma Center and artificial intelligence research company DeepMind used deep reinforcement learning to study the nuances of plasma behavior and control inside a fusion tokamak" (that's the donut-shaped thing where the reaction happens).

02:41 And they're able to make a bunch of small adjustments really quickly in order to get better results.

02:48 And it's pretty wild that they did that with AI, isn't it?

02:51 - Yeah, there's definitely Python in there somewhere, you just know it.

02:55 Exactly.

02:55 So I'm like, all right, where is this?

02:56 So I went through, and they talk about the findings being in Nature, in some of the articles that they're referencing.

03:02 So there are some deep, as in not super engaging, scientific articles, the traditional academic style of writing that you've got to dive into, and then follow a bunch of links. But eventually in there, you will find that there is some cool science stuff going on, and Python is at the heart of it. So it's probably not worth going into too much detail of how it's actually happening on the Python side of things, but I just thought it was super cool that, look, here's one of the most exciting things happening in energy, and for the climate, and for all sorts of things, and AI and Python are pushing it forward. That's crazy. And that's what we need for a Mr. Fusion so that we can make flying cars, and time-traveling cars too.

03:47 Exactly.

03:48 I mean, Marty McFly and Doc, they go and they throw their banana peel in the back of the DeLorean, right?

03:54 You've got to have one of these tokamaks to make it roll, and you've got to have Python.

03:57 Yeah.

03:57 Car.

03:57 Come on.

03:58 Obviously.

03:59 So cool.

04:00 All right.

04:02 Well, take us back to something more concrete.

04:04 Well, okay.

04:05 So I'm pretty excited about this.

04:06 It's a minor thing, but maybe not too minor.

04:10 PEP 680 has been accepted, standards track, for Python 3.11.

04:15 PEP 680 is tomllib support.

04:19 So support for parsing TOML in the standard library.

04:23 We haven't had it yet.

04:24 >> That's awesome. We've got JSON, we've got CSV, why not?

04:28 >> Right.

04:28 >> We've got XML.

04:29 >> Well, and now the packaging PEPs use TOML for pyproject.toml.

04:38 Anyway, I think it'd be cool to have in the standard library.

04:43 I think it's fine to have other outside supports.

04:45 So what they're doing is, and if people don't know, there's some rationale here in the PEP, but you know, I just think it's easier than normal.

04:54 I like TOML because it's just, I don't know, an easy format to read; it's better than INI and some other stuff.

05:03 And for people who don't know, it feels like the .ini file style, where you've kind of got section headers and then key-value bits.

05:12 Yeah.

05:13 And often you can use Black and write a pyproject.toml file without even really knowing anything about TOML.

05:22 So it's pretty straightforward, but we didn't have a way built into standard library to just use it.

05:28 So that's what this PEP adds.

05:31 One of the interesting bits about it is it's only reading.

05:36 So it's only adding support for reading toml.

05:40 So there's a load and a loads.

05:43 So you can load a TOML file or you can load a string, and that's it.

05:48 And it outputs a dictionary.

05:51 So that makes sense.

05:55 You're just taking a TOML document and turning it into a dictionary so you can use it.

06:01 But this is built on top of tomli.

06:06 So tomli is being used as the parser; there's an open source project called tomli, which a lot of projects are using.

06:17 I think this is the one that pytest is using and quite a few projects have switched to this.

06:22 It's really fast, it's nice, but it supports like writing as well, but--

06:26 - Yeah, writing and encoding and dumps and all those things. - Yeah, right.

06:30 But that's not the part that's gonna get supported.

06:33 And I think that's fine, to just have reading built in.

06:38 - Sure, some file formats like text and CSV and whatnot, like reading and writing is super common, right?

06:46 But these are way more likely to be used as configuration files that drive app startup and like hide secrets.

06:53 You know, you put your secrets in there and don't put in Git or something like that, whatever, right?

06:57 Those are the kind of use cases I would see.

07:00 And so in that case, reading seems fine.

07:02 You could always add writing later.

07:03 You just can't take it away if you add it too soon.

07:05 - Right, right.

07:07 But also, like, I'm sure there are reasons to need to write it, but I don't.

07:16 You know, it's mostly people write it and computers read it sort of thing.

07:21 - Yeah, exactly.

07:22 Some kind of editor writes it and then you read it.

07:24 - Yeah, so.

07:25 - Fantastic.

07:26 All right, well, cool.

07:27 Very nice to see that one coming along.

07:31 Alvaro out in the audience.

07:32 Hello there.

07:33 He says, "TOML just reached version 1.0 not so long ago." So maybe that also has some kind of impact on the willingness, like, all right, the file format is stable.

07:43 Now we can actually start to support it in the library.

07:45 - That's true.

07:46 And we do support Python releases for a long time.

07:50 So it probably needed to be V1 at least.

07:53 So yeah.

07:55 - And Sam also says, "There's a lot of stylistic choices for how you write TOML files," like we need a Black for TOML: not to configure Black, but something that goes against TOML files and, you know, makes them consistent.

08:10 Yeah, maybe.

08:12 Yeah.

08:13 Yeah.

08:13 But yeah, you could bake that in.

08:14 All right.

08:16 What have I got next here?

08:17 Sticking with the internals here,

08:20 I want to talk about thread locals in Python.

08:23 Okay.

08:24 So last time we had Calvin on and I spoke about this crazy async running thing that I had built.

08:31 And boy, is it working?

08:33 Well, like I said, it is truly horrifying to think about what it's doing, but it actually works perfectly.

08:38 So there it is.

08:39 But one of the challenges that it has is it doesn't like it

08:44 if you call back into it again.

08:47 And I talked about the nest_asyncio project last time, which maybe would solve it.

08:53 I tried it and it wasn't working, but it could have been at a different iteration, before I finally realized, no, I have to go all in on this threading: isolate all that execution into one place where we can control it.

09:05 So maybe it would work, but I just wanted to talk about thread locals in Python, which I thought were pretty easy and pretty interesting.

09:13 So I've got this stuff running over there.

09:15 And one thing that would be nice is, there are different threads calling into the system to say, schedule some work for me, basically.

09:22 It puts it on a queue, the queue runs it on this controlled loop, and then it sends back the result.

09:27 The problem is if one function calls that to put in work, and then as part of doing that work, the function itself somewhere deep down wraps back around to it; it doesn't really like the recursion aspect very much.

09:39 So what I thought is, well, how do I figure out that this thread has running work?

09:43 And if it calls again, you know, raise an exception and say, like you need to adjust the way you're calling this library, it's not working right.

09:49 Instead of just like doing some weird thing.

09:51 So what I think I might do, and I'm not totally sure it will work perfectly, but the idea is certainly useful for all sorts of things, is to use a thread local variable.

10:02 Now, when I thought about thread local variables, I've used them in other languages, and I had no idea how to do them in Python.

10:08 It turns out to be incredibly easy.

10:10 You just say, go to threading, the threading module, and you say local.

10:14 That becomes like a dynamic class that you can just start assigning values to.

10:18 So in the example that I'm linking to, it says you get a my_data thing, which is a thread-local data blob, whatever.

10:24 So you could say, like, my_data.x = 1, my_data.list =

10:29 whatever.

10:30 And then that will store that data, but it will store it on a per thread basis.

10:35 So each thread sees a different value.

10:37 So for example, what I could do is say, the thread, you know, at the beginning of the call, sets "I have running work."

10:44 Yes.

10:44 At the end, you know, roll that back.

10:47 And if I ever call into "schedule some work" and the thread local says I have active work running,

10:53 Well, there's that error case that I talked about.

10:54 And I don't have to do weird things like put thread IDs into, like, a dictionary, and then check that, and then lock it.

11:02 I can just say this thread has, like, a running state for my little scenario.

11:07 What do you think?

11:08 I think that's great.

11:09 I think it's interesting.

11:10 Yeah, it is.

11:11 Right?

11:11 Yeah.

11:12 And it's right.

11:14 Not too hard.

11:14 Just create one of these little local things, interact with it in a thread, and each thread will have basically its own view into that data, which I think is pretty fantastic.

11:23 >> Like a per-thread namespace thing.

11:27 >> Yes, exactly.

11:28 It's a cool little isolation without doing locks and all weird stuff that can end up in deadlocks or slowdowns or other stuff.

11:36 Anyway, if you've got scenarios where you're doing threading, and you're like, "Oh, it would be really great if I could dedicate some data just to this particular run and not like a global thing,"

11:46 Check this out, it's incredibly nice.
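
The whole idea can be sketched in a few lines; the `worker` and `my_data` names here are made up for illustration, but `threading.local()` is the real API being described:

```python
import threading

my_data = threading.local()      # each thread gets its own attribute namespace
barrier = threading.Barrier(3)   # make sure every thread has written first
results = []

def worker(value):
    my_data.x = value            # visible only in this thread
    barrier.wait()               # all three threads have now set my_data.x
    # Still our own value, even though the other threads set "their" my_data.x:
    results.append(my_data.x == value)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(results)                # [True, True, True]
print(hasattr(my_data, "x"))  # False: the main thread never set it
```

The re-entrancy guard from the discussion would just be a boolean attribute on an object like `my_data`, set on the way into the scheduling call and cleared on the way out.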

11:49 >> Nice.

11:50 >> Let me pull up one more thing before we move on, Brian.

11:54 >> Okay.

11:55 >> How about Datadog?

11:57 >> Yes. That's also something else that's extremely easy to use.

12:02 Yeah. Thank you Datadog for sponsoring this episode.

12:05 Datadog is a real-time monitoring platform that unifies metrics, traces and logs into one tightly integrated platform.

12:13 Datadog APM empowers developer teams to identify anomalies, resolve issues, and improve application performance.

12:21 Begin collecting stack traces, visualize them as flame graphs, and organize them into profile types such as CPU, IO, and more.

12:29 Teams can search for specific profiles, correlate them with distributed traces, and identify slow or underperforming code for analysis and optimization.

12:39 Plus with Datadog's APM live search, you can perform searches across the full stream of integrated traces generated by your application over the last 15 minutes. That's cool.

12:51 Try Datadog APM with a 14-day free trial, and Datadog will send you a free t-shirt.

12:58 Visit, or just click the link in your podcast player show notes to get started.

13:06 >> Yes. Thank you, Datadog.

13:08 I love all the visibility into what's going on.

13:10 I was just dealing with some crashes and other issues on something I was trying to roll out.

13:15 Some libraries conflicting with some other library, they were fighting.

13:18 And yeah, it's great to be able to just log in and see what's going on.

13:22 Now, before we move off this ThreadLocals, quick audience question.

13:26 Sam out there says, "It might be better to use context vars if you're also working with an event loop.

13:30 As far as I know, context vars are the evolved version of thread locals that are aware of async too." That's very interesting.

13:38 I haven't done anything with context vars, but the way asyncio works is, even though there's a bunch of stuff running from different locations, there's one thread. So thread locals are useless for that.

13:48 So that's why Sam is suggesting context vars.

13:50 The side that schedules the work has nothing to do with asyncio in my world.

13:55 So that's why I was thinking thread local.

13:57 >> It's a good highlight to say if you're using async, you may need something different.

14:03 >> Absolutely. Yeah. Thanks, Sam, for that.
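
Sam's suggestion can be illustrated with the standard-library `contextvars` module; the `request_id` and `handler` names are invented for the example:

```python
import asyncio
import contextvars

# A ContextVar behaves like a thread-local, but each asyncio task also
# gets its own value, even though everything runs on a single thread.
request_id = contextvars.ContextVar("request_id", default=None)

async def handler(rid, results):
    request_id.set(rid)        # set only in this task's context
    await asyncio.sleep(0)     # yield so the other task runs in between
    results.append(request_id.get())

async def main():
    results = []
    # Each task sees its own request_id; they don't overwrite each other.
    await asyncio.gather(handler("a", results), handler("b", results))
    return results

print(asyncio.run(main()))  # ['a', 'b']
```

Each task created by `gather` copies the current context, which is why the two `set()` calls don't bleed into each other.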

14:06 >> Yeah. I'm not sure if we've really talked about it much, but I came across this article from Trey Hunner called "What is a generator function?" Like, in Python, especially with the 2-to-3 switch, even the items method to get all the dictionary elements out doesn't return a list anymore; it returns something lazy.

14:32 And maybe it always did, I don't know.

14:34 But there's a whole bunch of stuff that used to return lists that now return generators.

14:39 And they work great. You stick them in a for loop, and you're off to the races.

14:45 But a lot of people are a little timid at first to try to write their own, because it's a yield statement instead of a return, and how do you do it?

14:55 And so this is a great article by Trey to just say, here's what's going on.

15:01 It's not that complicated.

15:04 Generally, you often might have a for loop within your code.

15:08 Instead of returning all the items, you one by one yield the items.

15:14 Trey goes through some of the details of how this all works.

15:19 It's pretty interesting.

15:21 It's interesting for people to read through it and understand what's going on behind the scenes.

15:26 What happens is your function that has a yield in it, it will not return the item right away.

15:33 When somebody calls it, it returns a generator object.

15:36 And that generator object has things like next, and mostly that's what we care about; next returns the next item that you've yielded.

15:47 And then once you run out of items, it raises a StopIteration exception, and that's how it works.

15:54 But generally, we just don't care about that stuff.

15:57 We just throw them in a for loop.

15:58 But it is interesting to learn some of the details around it.
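
What the article describes, a yield instead of a return, the generator object, next, and StopIteration, fits in a tiny sketch (the `count_up_to` name is made up here):

```python
def count_up_to(n):
    # Calling this returns a generator object; the body doesn't run
    # until someone asks for a value.
    i = 1
    while i <= n:
        yield i   # hand back one item, then pause right here
        i += 1

gen = count_up_to(2)
print(next(gen))             # 1
print(next(gen))             # 2
# One more next(gen) would raise StopIteration; a for loop (or list())
# handles that for you:
print(list(count_up_to(3)))  # [1, 2, 3]
```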

16:02 >> Yeah, they do seem mysterious and tricky, but they're super powerful.

16:06 The more data that you have, the way better idea it is to not load it all into memory at once.

16:12 >> Yeah, and you can do some fun things like chunking.

16:16 Like, let's say you're metering things out to your caller; these are fun things to do with this.

16:23 So let's say you're reading from an API or from a file or from a device or something.

16:30 You read like a big chunk of things, like 20 of them or 256 or something like that, a whole bunch of data at once.

16:38 But then your caller really only wants one at a time.

16:43 Within your function, your generator function, you can do fancy stuff like read a whole bunch and then just meter those out, and then when that's empty, you go and read some more, and have intermittent reads.

16:55 And this will save time, especially since often you're not reading everything.

17:00 Sometimes the caller will break and not utilize everything.

17:02 And they're a lot more efficient on memory too.

17:08 So if you're, like you said, if it's huge amounts of things, it might be either for memory reasons or for speed reasons.

17:14 These are great.
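
That chunking pattern could look roughly like this; `read_in_chunks` and the fake data source are invented stand-ins for whatever bulk API, file, or device read you actually have:

```python
def read_in_chunks(fetch_chunk, chunk_size=256):
    # fetch_chunk(n) stands in for your bulk read (an API page,
    # a file read, a device buffer). Returns [] when exhausted.
    while True:
        chunk = fetch_chunk(chunk_size)
        if not chunk:
            return
        # Meter the bulk read out one item at a time to the caller.
        yield from chunk

# Fake data source: 10 items served in chunks of 4.
data = list(range(10))
def fake_fetch(n, _state={"pos": 0}):
    start = _state["pos"]
    _state["pos"] = start + n
    return data[start:start + n]

items = read_in_chunks(fake_fetch, chunk_size=4)
print(next(items), next(items))  # 0 1 -- only one bulk fetch has happened
```

If the caller breaks out early, later chunks are simply never fetched, which is exactly the savings described above.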

17:14 - Yeah.

17:15 Even computational: like, suppose you want a list of Pydantic objects back, and you're reading some massive CSV and turning each row into one, **-splatting the values in there somehow.

17:26 That's the actual creation of the Pydantic objects, if there were like a million of them.

17:33 Forget memory, like even just the computation is expensive.

17:35 So if you only want the first 20, you only pay the price of initializing the first 20.

17:41 So there's all sorts of good reasons, yeah.

17:43 - Okay.

17:43 - I do want to just say one thing about generators that I wish were different; maybe some kind of behavior could be added, which would be fantastic.

17:53 So generators can't be reused.

17:56 - Yeah.

17:57 - Right, so if I get a result back from a function, and I want to ask a question like, were there any items returned in here?

18:03 And then loop over them if there were.

18:05 Like you kind of broke it, right?

18:06 You pulled the first one off, and then the next thing you work with is like, index one through n rather than zero through n, which is a problem.

18:14 So sometimes you need to turn them to a list.

18:14 It'd be cool if there was like a .to_list on a generator, instead of having to call list() on it, right?

18:22 Just like a way as an expression to kind of like, I'm calling this and it's sort of a data science flow.

18:27 I want it all in one expression, and to turn this generator into this other thing that I need to pass along.

18:31 That would be fun.
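
The can't-reuse-it problem, and the lazy take-only-what-you-pay-for upside, both show up in a few lines (the `numbers` function is invented for the demo):

```python
import itertools

def numbers():
    yield from (1, 2, 3)

gen = numbers()
first = next(gen)   # "peeking" to see whether there were any items
print(first)        # 1
print(list(gen))    # [2, 3] -- the first item is gone for good

# If you need to look more than once, materialize it up front:
items = list(numbers())
print(items[0], items)  # 1 [1, 2, 3]

# Or take just the first N lazily, paying only for what you consume:
print(list(itertools.islice(numbers(), 2)))  # [1, 2]
```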

18:33 - Yeah, so there was a question out in the audience about whether the dictionary items and keys return something different, and Sam Morley says they return special kinds of generators, so yeah, thanks Sam.

18:52 Well, indeed.

18:53 Alright, well, what have I got next?

18:57 I think it's closed it now.

18:59 Would it really be an episode if we didn't talk about Will McGugan in some way or another?

19:02 So we got him on deck twice, but we're going to start with just something he recommended to us.

19:07 That's actually by Sam Colvin, who is the creator of Pydantic.

19:12 And I'm not sure if you're ready for this, Brian, but this is a little bit dirty.

19:18 It's called dirty equals.

19:20 And the idea is to abuse the dunder EQ method, mostly around unit testing, to make test cases and assertions and other things you might want to test more declarative and less imperative.

19:36 So, that all sounds like fun, but how about an example?

19:39 So, it starts out with a trivial example.

19:41 It says, okay, from this library, you can import something called IsPositive.

19:46 So then you could assert some number, like 1 == IsPositive.

19:52 That's true.

19:53 That assert passes; -2 == IsPositive fails.

19:57 Okay.

19:58 Okay.

19:58 How does that strike you, Brian?

20:00 They were building, these are building blocks.

20:03 This is like a Lego piece, not the whole X wing fighter.

20:07 Okay.

20:07 But anyway, so that's the building block, right?

20:09 Like, take something, and instead of saying yes, it's exactly equal, implement the dunder eq method in the IsPositive class to take the value, make sure it's a number, then check whether it's greater than zero, right?

20:21 That kind of thing.

20:22 I don't know if that includes zero, but anyway.

20:24 But then you can get more interesting things.

20:26 Like, so you could go to a database, and if you do a query against the database, you get, I think in the case that's up there, I think you get a tuple back.

20:34 It depends on what you set the row factory to be, I suppose, but anyway, you get a tuple back of results.

20:42 It looks like maybe this is a dictionary.

20:43 Anyway, so then you can create a dictionary that has attributes that are like the result you want.

20:50 They can either be equal or they can be things like this is positive.

20:54 So in this case, we're doing a query against the database, and then, it looks like there maybe needs to be a "first" in there; anyway, it says, all right, what we're gonna do is == that: we'll create a dictionary, id: IsPositiveInt.

21:13 And username: "Sam Colvin".

21:15 So that's an actual equality.

21:17 Like the username has to be Samuel here.

21:19 - Okay.

21:19 - Yeah.

21:20 And then the avatar is a string that matches a regular expression.

21:24 That's like some number-dot-PNG thing.

21:27 The settings has to be a JSON thing where inside the settings, it's got some JSON values that you might test for.

21:34 And created is IsNow, with some level of variation, like some level of precision that you're willing to work with, right?

21:42 because obviously you run the database query and then you get the result.

21:45 But it's like very near, nearly now, right?

21:49 It's like the approximately-equals for floats type of stuff.

21:52 That's pretty cool, right?

21:54 (laughing)

21:56 - Do I need to answer?

21:58 I mean, I could see the utility.

21:59 - Tell me, share your thoughts.

22:00 Yeah.

22:01 - But I don't know, the API is a little odd to me.

22:06 - Okay, yeah, I think it's definitely an interesting idea.

22:10 It's definitely different.

22:11 You know, Pydantic; I know this isn't Pydantic, but it's by the same creator.

22:17 Pydantic is often about: given some data that kind of matches, can it be made into that thing?

22:23 And I feel like this kind of testing is in the same vein as what you might get working with Pydantic and data.

22:30 Yeah.

22:31 Right?

22:31 Well, it's definitely terse and useful.

22:35 And I could totally get used to it; this is a pretty condensed way to compare, to see if everything

22:45 matches this protocol.
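
The underlying trick, abusing `__eq__` so a plain `==` acts as a declarative check, is easy to sketch. This is a minimal homemade version, not the dirty-equals API itself; the class names, the row, and the username are all made up, and the real library ships polished versions of these checks:

```python
import re

class IsPositive:
    # "x == IsPositive()" is True whenever x is a number greater than zero.
    def __eq__(self, other):
        return isinstance(other, (int, float)) and other > 0

class IsStrMatching:
    # "s == IsStrMatching(pattern)" is True when s fully matches the regex.
    def __init__(self, pattern):
        self.pattern = re.compile(pattern)
    def __eq__(self, other):
        return isinstance(other, str) and bool(self.pattern.fullmatch(other))

row = {"id": 42, "username": "samcolvin", "avatar": "123.png"}

# Dict equality compares values with ==, so the checks compose naturally:
assert row == {
    "id": IsPositive(),
    "username": "samcolvin",          # exact equality
    "avatar": IsStrMatching(r"\d+\.png"),
}
```

When the left side is, say, an int, `int.__eq__` returns NotImplemented and Python falls back to the reflected `IsPositive.__eq__`, which is what makes `42 == IsPositive()` work in either order.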

22:48 Yeah.

22:49 Yeah.

22:49 So Sergey in the audience has sort of the alternative perspective: you could just write multiple assert statements. Instead of creating a dictionary that represents everything, you could get the record back, pull the first value out and assert on it, then get the username out and assert, then get the avatar and assert on it, and so on.

23:10 It's an intermediate story, where you use the testing library's classes, but more explicitly.

23:19 >> Right. There's a couple of reasons why to not use more than one assert.

23:26 Because if you were to have multiple asserts, the first one to fail stops the check.

23:31 It's possible that this will tell you everything that's wrong, not just the first thing that's wrong.

23:36 >> Yes, exactly.

23:38 >> Then some people are just opposed to multiple asserts per test.

23:43 >> Yeah.

23:45 >> I don't know. A similar thing: so I have a plugin called pytest-check, which uses checks instead of asserts, so that you can have multiple checks per test.

24:00 but it does come up.

24:02 So this is interesting.

24:04 I'll definitely check it out and play with it.

24:06 >> Yeah. Another benefit of being able to construct one of these prototypical documents or dictionaries, one that represents the declarative behavior or state that you're supposed to be testing for, is you could create one of these and then use it in different locations.

24:22 Like, okay, when I insert a record and then I get it back out, it should be like this. But also if I call the API, and it gives me something back, it should also still pass the same test.

24:32 Like, you could have different parts of my app;

24:34 They all need to look like this.

24:35 - Yeah.

24:36 - As opposed to having a bunch of tests over and over that are effectively the same.

24:40 And Will is here who recommended this, suggests one of the benefits of dirty equals is that pytest will generate useful diffs from it.

24:49 - Yeah, definitely.

24:52 pytest being a reason to use something? I'm on board then, yeah, sure.

24:56 - Yeah, check it out.

24:59 If you do play with it, give us a report how you feel about it.

25:01 >> One more question from Sam, that is, Sam Morley.

25:05 pytest already has something a bit like this with approx, except it's for floats, etc.

25:12 Except approx is not "etc."; it's just for floats.

25:16 You can only use approx with floats.

25:18 >> We have approximate now and stuff like that.

25:23 >> I'll try it, especially if Will likes it, It's gotta be good.

25:28 >> Exactly.

25:30 >> Awesome. All right. What's the final one you got for us here?

25:34 >> Okay. This is more of a question; I'm not saying this is awesome, but I ran across this.

25:39 Actually, I clicked on a listicle.

25:45 Mike, I think there's a self-help group for that.

25:48 >> Yeah. Well, we're definitely prone to clicking on the top listicles.

25:54 >> Yeah. So my name is Brian.

25:55 >> Awesome. That's awesome.

25:56 >> I clicked on a listicle.

25:58 The listicle was a top 10, where we at?

26:01 It was 10 tools I wish I knew when I started working with Python.

26:06 Actually, it's a good list.

26:08 I just knew about most of them already, is all.

26:10 We'll link to it anyway.

26:12 >> It's got the sound of music, it's got Jackie Chan, it's got Office Space.

26:15 Come on, this is a pretty solid listicle.

26:17 >> Well, then I got down to number 7 and 8, and I'm like, what are these things?

26:21 I've never heard of them: Commitizen and semantic-release.

26:26 So I tried committing with this.

26:32 Commitizen is a thing where, if you install it, you can either brew install it globally or you can put it in a virtual environment. That's cool.

26:41 But instead of just committing, you use this to commit and it asks you questions.

26:48 >> Right. Instead of typing git space commit, you type cz space commit.

26:52 >> Yeah. Then it asks you a whole bunch of stuff.

26:55 Was this a bug fix?

26:57 Was it a feature?

26:58 And so on; then it follows up depending on what you answered.

27:02 If you had a bug fix or a feature, is it a breaking change?

27:07 Basically, it's doing a whole bunch of stuff, but it's trying to produce these conventional commits.

27:15 We've got a link to this too.

27:18 Then, since you've got all this formatting, it ends up formatting your commit message to a consistent format, so that when you're reading the history and stuff, you can do a whole bunch of... it's easier, I guess.
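
The consistent format being described is the Conventional Commits style. Roughly, a message in that style looks like the following; the commit text itself is a made-up example:

```
<type>(<optional scope>): <short description>

<optional body explaining the change>

BREAKING CHANGE: <optional footer, present only for breaking changes>
```

For instance, a hypothetical bug-fix commit might read `fix(parser): handle empty config tables`, which is what lets tooling later derive version bumps and changelogs from the history.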

27:31 And then this listicle also commented that you've got semantic-release, which is a Python package that I haven't gotten through much, but it can take all this information from these commits and give you better control of your semantic release notes, or release...

27:52 I don't know if it's release notes or just the release version.

27:54 I haven't got that far into it, but yeah, Commitizen asks.

27:57 Is this like categorizing a change corresponding to semantic versioning, such that if it should be a major change, it looks like it'll increment the version and stuff like that as well?

28:06 Yeah, yeah. So in the About for Commitizen, it says it's a "command line utility to create commits with your rules," and apparently you can

28:17 specify some special rules, which is good,

28:21 display information about your commits, bump the version automatically, and generate a changelog.

28:28 That's cool. That might be helpful.

28:31 My questions out to the audience and everybody listening, have you used something like this?

28:37 Is it useful?

28:38 Is there something different than this that you recommend?

28:41 Also, what size of project would this make sense for? A small or medium project?

28:45 >> That's cool. Yeah, let us know on Twitter or at the bottom of the YouTube live stream is the best place.

28:49 - Yep, so.

28:51 - Yeah, very cool.

28:52 Now, before you go on, I also have a question out to you.

28:55 You can be the proxy for the audience here.

28:57 - Okay.

28:58 - Notice at the bottom it says requirements 3.6 and above.

29:01 - Yeah.

29:01 - Python, that's not, I don't feel like that's very controversial as 3.6 is not even supported anymore, right?

29:07 - Right.

29:08 - So this is like every possibly supported version of Python 3 this works for.

29:13 What would you think if I said the requirement is just "Python 3," not "Python 3.6," knowing, or implying, that that means supported, shipping, real versions of Python, not Python 3.1?

29:29 Right.

29:30 Because obviously Python 3.1 is no longer supported, but neither is 3.5, even. Like, could you say f-strings are just in "Python 3" now without worrying about the version, or do you still need to say 3.6+? 3.6, 3.7? Like, should this be updated to be 3.7?

29:44 You know what I mean?

29:44 You kind of have to, you think so?

29:47 I, I don't know.

29:48 >> I know when I say something is on Python 3, actually I don't even say that anymore.

29:56 What do you think?

29:58 >> Okay. Well, I used it in the sense like, yeah, you need Python 3 for this thinking, well, any version that's supported these days.

30:05 People are like, well, there's older versions that don't support this thing.

30:07 Like, well, obviously, I'm not talking about the one that was not supported five years ago.

30:12 At some point, Python 3 is the supported version of Python.

30:18 I don't know.

30:19 Oh, that's true.

30:19 Yeah.

30:20 Okay.

30:21 So that's a bit of a diversion there, but I went down that route.

30:24 Hold on.

30:24 It's like, I really don't know which way I should go, but I feel like there's a case to be made that, when you talk about Python 3, you're not talking about old unsupported versions.

30:32 Everything that's, like, modern, 3.7 and above, should just be like an alias for Python 3.

30:39 I don't know.

30:39 When we were just saying Python 3, what we meant was like 3.1.

30:43 So I know we got to get used to that.

30:45 that there's no Python 2 really to worry about.

30:49 All right, well, that will definitely bring us to our extras, won't it?

30:53 - Yeah, yeah.

30:54 - All right, you want me to kick it off since I got my screen up?

30:57 - Yeah, go ahead.

30:58 - All right, so Will, like I said, he gets two appearances and also his comments.

31:02 So thank you for that.

31:03 And this is like in the same vein of what I was just talking about.

31:06 Like what is this convention that we want to have, right?

31:08 So the Walrus operator came out in 3.8 and it was kind of an interesting big deal, right?

31:15 there's a lot of debate around whether or not that should be in the language.

31:19 Honestly, I think it's a pretty minor thing; it's not a huge deal.

31:22 But the idea is you can both test for a variable and use its value in the same place that you create it.

31:32 So instead of saying u equals get_user, then if u is not None, or if u, you could just say if u colon-equals get_user, do the true thing.

31:43 Otherwise then it's not set right.
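The pattern Michael describes can be sketched as a short snippet; get_user here is a hypothetical stand-in that returns a user name (or would return None if nobody were logged in):

```python
# Hypothetical stand-in for illustration: returns a user name, or None.
def get_user():
    return "brian"

# Before the walrus operator: assign on one line, test on the next.
u = get_user()
if u is not None:
    print(f"Hello, {u}")

# With the walrus operator (Python 3.8+): assign and test in one expression.
if (w := get_user()) is not None:
    print(f"Hello, {w}")  # w is bound and usable inside the block
```

Note the parentheses around the assignment expression: they're required when you compare its result, as in `(w := get_user()) is not None`.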

31:46 And so Will is suggesting that we pronounce the walrus operator as "becomes": u becomes the value.

31:58 So, like, x colon-equals seven reads as "x becomes seven."

31:58 What do you think?

31:59 Are you behind this?

32:00 - Okay, so you'd be like when you're reading your code to yourself I guess.

32:03 - How do you say it?

32:04 Like if you say like the lambda expression, like how do you define like the variables of the lambda?

32:10 Like, there are terms around there that make it a little bit hard to say without just reciting the syntax.

32:15 He's proposing, like, "becomes" is the way we verbalize the walrus operator.

32:22 I like it. I'm going to give it a thumbs up.

32:25 >> It's interesting, but how is that different from assignment though?

32:28 What do you say with assignment?

32:30 I don't say like x.

32:32 >> Equals? I don't know.

32:33 >> Equals, assign, becomes works.

32:37 >> I will put it out there if people can think about it.

32:40 And there's a nice Twitter thread here with lots of comments,

32:43 so folks can jump in.

32:46 Or you can just say "walrus," just talk "X walrus five."

32:50 oh yeah.

32:51 Well, what do walruses do?

32:53 I mean, is there like a cool action that would, is like particular to walruses?

32:58 Well, there probably is, but it doesn't apply to this.

33:01 It's not very colloquial, is it?

33:03 Is X.

33:05 Yeah.

33:05 And then John Sheehan out in the audience says, "In my brain, I use assigned to," and he must know what's coming because he's up next.

33:14 [LAUGH]

33:16 Hey, John.

33:17 So the other thing I want to talk about is, did you know, I learned through John, that string startswith will take, it says a tuple, but I suspect it might even be an iterable, of substrings.

33:31 And if any of them match, it will test out to be true.

33:35 So to, like, ABCDEF, you say startswith a tuple: AB or CD or EF.

33:42 I've never used this.

33:44 >>I didn't know that that was a thing.

33:45 >>I would always just do that as like X starts with AB or X starts with CD or X starts with EF.

33:51 No, you apparently can do that all in one go.
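The one-call version John pointed out looks like this. One thing worth noting: the prefixes do have to be a tuple specifically, not any iterable; passing a list raises a TypeError:

```python
text = "CD-ROM drive"

# One call with a tuple of prefixes: True if ANY of them matches.
print(text.startswith(("AB", "CD", "EF")))  # True, it starts with "CD"

# Equivalent to the chained version Michael describes:
print(text.startswith("AB") or text.startswith("CD") or text.startswith("EF"))
```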

33:53 >>What's the two for?

33:55 >>I have no idea.

33:56 I was just thinking that as well.

33:57 There's a two and I don't know what it's for.

34:02 Yeah, anyway, that's a super quick one, but I thought pretty interesting there.

34:06 So that's all I got.

34:08 How about you?

34:09 I just have one thing, we don't need to put it up, but my extra is this book.

34:15 You have your physical 2.0 book in hand.

34:20 Yes, I've got it.

34:21 Oh, yeah.

34:22 And for the people not watching, I've got a stack of them here. It's funny.

34:28 My daughter uses my Amazon account too.

34:31 So UPS said, hey, there's a package arriving yesterday.

34:34 And I said, I didn't order anything.

34:37 So I said, I told my daughter, hey, you probably have a package showing up.

34:41 She's like, I didn't order anything.

34:43 And then this box arrives with five copies of my book, which is great.

34:49 That's awesome. Yeah. Yeah. Congratulations. Thanks. Very cool.

34:53 We abuse our Amazon account badly.

34:56 Like there's a lot of people that log into Amazon.

34:59 We end up getting stuff shipped wrong places because somebody shipped it to their house last time, and then we just hit reorder again.

35:04 And like, why do you have our shampoo?

35:06 I don't know.

35:07 Yeah.

35:08 Yeah.

35:09 So John adds that the two is the starting position.

35:14 Yeah.

35:15 I figured it had something to do with that.

35:16 I wasn't sure how many characters to compare on.

35:18 But, well, I also didn't know that you could pass a starting position to startswith.
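The optional start (and end) arguments work like slice bounds, so a two there just shifts where the comparison begins:

```python
s = "ABCDEF"

print(s.startswith("CD"))        # False: s begins with "AB"
print(s.startswith("CD", 2))     # True: comparison begins at index 2
print(s.startswith("CD", 2, 4))  # True: compared against s[2:4]
```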

35:23 That's cool.

35:25 Yeah, there's a lot going on here.

35:27 - Almost starts with.

35:29 - Yeah, nearly starts with.

35:30 Yeah, what's the right way?

35:33 So I want to close this out with a joke as always, but there's the joke we talked about a while ago, where Sebastian Ramirez, creator of FastAPI, saw an ad hiring a FastAPI developer, and he said, "Oh, it looks like I can't apply for this job.

35:51 "It requires four years of experience with FastAPI, but I can't possibly have that, 'cause I only created it two years ago."

35:57 Right.

35:58 Yeah.

35:58 So it's a little bit in that vein.

36:00 So here we have somebody

36:03 tweeting, and it says, here's a conversation between the recruiter and them.

36:07 It says, recruiter, do you have a CS background?

36:10 Yes, absolutely.

36:11 My CS background.

36:17 And this is a screenshot from the game Counter-Strike, which is often referred to as just CS.

36:17 Yeah, of course I got a CS background.

36:19 Are you kidding me?

36:19 That's pretty good.

36:23 I love it.

36:24 Yeah.

36:25 >> Yeah, that's a good one.

36:26 >> Well, just a question though.

36:28 If you did FastAPI instead of eight hours a day, if you did it 16 hours a day for two years, would that constitute four years of experience?

36:38 >> That probably is about the same amount of experience.

36:41 >> Yeah.

36:41 >> So what a slacker that Sebastian is.

36:44 >> Does he have to eat or something?

36:46 Does he have family? What's going on?

36:48 >> Come on.

36:49 >> Well, always fun with hanging out with you and talking Python.

36:53 >> You bet.

36:54 Thanks to everybody that listens to it on their podcast player or watches us on YouTube.

37:02 Yeah, absolutely.

Back to show page