Brought to you by Michael and Brian - take a Talk Python course or get Brian's pytest book


Transcript #321: A Memorial To Apps Past

Recorded on Monday, Jan 30, 2023.

00:00 Hello and welcome to Python Bytes, where we deliver news and headlines directly to your earbuds.

00:04 This is episode 321, recorded January 30th, almost the end of January.

00:10 And I am Brian Okken.

00:11 And I am Michael Kennedy.

00:13 Hey, Michael.

00:14 Hey, hey.

00:14 Excited to be here today again.

00:16 Absolutely.

00:17 Before we jump too far into it, I want to thank Microsoft for Startups Founders Hub.

00:22 Please listen to their spot later in the show.

00:24 How are we going to start the show?

00:26 What do you have for us, Michael?

00:27 You may wonder, some folks have publicly expressed the bewildering thought that maybe we live in a simulation.

00:34 I don't think so.

00:35 Do you think we live in a simulation, Brian?

00:37 Sometimes, yeah.

00:39 No, I don't.

00:40 When I'm playing a game, maybe.

00:42 But what if you were working on Git and you wanted to see how things were working, simulate some operations,

00:49 and try to understand how Git works without actually making those changes?

00:53 Because there's always the, you know, Git is full of good jokes, right?

00:58 Like, in case of fire: git commit, git push, run.

01:01 You know, things like that.

01:03 Yeah.

01:03 Those jokes.

01:03 But the other one is, you know, you don't need to know Git that well.

01:07 If you mess it up, you just delete the repository and clone it again and start over, right?

01:12 So, you know, ideally, you would be able to run some operations to help you understand what Git is going to do without consequence.

01:19 Okay.

01:20 And so I introduce to you this tool called git-sim.

01:23 And git-sim will visually simulate Git operations in your repos with a single command.

01:28 So what it is, is instead of saying, like, git merge branch, you would say git-sim merge branch.

01:36 Now, how best to explain what's going to happen?

01:40 Like, if it just says, we would have merged this branch into that branch with seven changes.

01:45 You're like, okay, maybe that's fine for merge.

01:47 But there are many other things that are more complicated.

01:50 And so, as you and I are fans of, this will simulate, it will show you the visual behavior changes that are going to happen.

01:58 Isn't that cool?

01:59 Yeah.

01:59 So, by default, you get a JPEG image.

02:02 And the top one you see here, you can see all the commits, their SHAs and their messages.

02:09 And you can see two branches.

02:11 It'll show where head and main are and where dev is.

02:13 And it'll show you, if you do a commit or actually a merge, you're going to take these changes from dev, push them forward, right?

02:19 And the resulting shape or behavior of the repository.

02:22 Yeah.

02:23 So, that one's pretty straightforward.

02:25 I'll show you some really cool ones in a minute.

02:27 So, use cases include visualizing Git commands to understand them.

02:31 It's kind of what I was talking about.

02:33 Also, my joke, prevent unexpected working directory and repository states by trying it out first.

02:39 But there's also a whole, I'm creating blog posts, tutorials, courses, whatever.

02:46 So, sharing visualizations of your Git commands with your team, maybe for documentation, right?

02:52 In our wiki, like, this is our workflow.

02:54 You probably don't understand what this weird Git thing is that we're doing because it's non-standard.

02:58 Please watch this little animation so you know why we're doing it or something like that, right?

03:03 Yeah.

03:04 Internal documentation, that's a great use for that.

03:08 Yeah, absolutely.

03:09 So, basically, the supported commands at the moment are log, status, add, restore, commit, stash, branch, tag, reset, revert, merge, rebase, and cherry-pick.

03:19 And then sub commands of those, right?

03:21 Install.

03:22 It's got some steps to install it, some ways to run it.

03:26 But then you can see, if you scroll down far enough, you'll start to see some of the examples.

03:31 So, there's a bunch of examples that are pictures, like a git-sim log.

03:35 It'll simulate the output showing the most recent five commits on the active branch.

03:40 Yeah?

03:40 And it sort of shows you the tags and various things.

03:43 You can see status, and it does like a Rich-style, not a tree, a table view.

03:50 Really nice there.

03:52 You can see a lot of the different, almost Kanban flow of your files going from untracked to tracked and modified locally and staged and all that.

04:02 Which is, actually, I think it's pretty helpful.

04:04 Even with little arrows showing, you know, it moved from here in this one of those columns to this other column.

04:10 Yeah.

04:11 I don't think I've ever used git restore.

04:13 I don't think so.

04:14 I don't make mistakes, so it's fine.

04:15 Isn't that delete and reclone?

04:18 Yeah, exactly.

04:20 Oh, my God.

04:20 Don't commit that.

04:21 Don't push that.

04:22 Please.

04:22 Just delete it.

04:23 Let's see.

04:24 But if we go down further, you'll get more interesting examples.

04:27 The merge one is there.

04:29 Let's keep going.

04:30 But we get to the video one.

04:31 This is where it gets to be pretty awesome and pretty unique.

04:34 Animate.

04:35 We could do a git-sim reset.

04:37 So, check this out.

04:38 So, over here, you have this visual diagram showing how this stuff is changing over time.

04:45 So, it's like, all right, what we're going to do is we're going to reset this and you'll see, like, the head pointer and the branch pointer move over.

04:50 And what do you think, Brian?

04:52 Isn't this cool?

04:53 This is really neat.

04:54 Yeah.

04:54 Yeah.

04:55 I'll just pick one or two more.

04:56 So, we got merge.

04:57 That's pretty straightforward.

04:58 Rebase.

04:59 Let's animate a cherry pick.

05:01 So, you'll see it coming along here, building up the git repo status.

05:06 And then what's going to happen?

05:08 It's going to show us a branch.

05:09 And we want to take some of those changes and cherry pick them over to the main.

05:13 Yeah.

05:14 Anyway.

05:14 If you're lost on that one, that's okay.

05:18 I am too.

05:19 That's not a great animation, to be honest.

05:20 Anyway, it shows you a bunch of examples, a bunch of cool things.

05:25 I think this is really nice.

05:27 It's like I said, primarily documentation internally, like your internal wiki, your onboarding docs, or for, say, blog posts.

05:34 You want to talk about what something looks like, then run this.

05:38 And it's not just what does a git merge look like.

05:40 It is, what does the git merge look like on this repo in this state, right?

05:44 Yeah.

05:44 It applies to your working repo, which is cool.

05:46 Yeah.

05:46 The applying to the working one, that's really cool.

05:50 I mean, you said that merges are pretty easy, but actually, I think I'll probably use this for merges the most.

05:55 Because there's a lot of times where I have a mental model of what my repo looks like.

06:00 And a merge shows a conflict or something.

06:03 And I'm like, why would it be a conflict?

06:05 That's a good point, actually.

06:06 And it's probably because my mental image of what the repo looks like right now is wrong.

06:11 That something has moved forward since I branched off of it or something.

06:15 Yeah.

06:16 Yeah.

06:16 Very good point.

06:17 Actually, I end up confused sometimes.

06:19 That should have been a clean merge.

06:22 No problem.

06:22 And now I'm in some situation where it's asking me to describe the changes, and I actually don't know what they are.

06:28 So let's start digging in.

06:30 And then sometimes you've got to like, manual merge?

06:32 What is this?

06:33 1980?

06:34 Exactly.

06:35 What are we using?

06:36 CVS?

06:37 Come on.

06:37 Let's go.

06:37 Anyway, this is what I got.

06:40 Git.

06:41 git-sim.

06:42 Nice.

06:42 Well, so I guess we're kind of doing a tools thing, at least for now.

06:46 I'd like to talk about Nox.

06:48 So have you used Nox?

06:51 I have not.

06:52 Okay.

06:52 So there's Nox, and then there's Tox.

06:56 And I have used Tox a lot.

06:59 So both of them, I guess, they do have different things that you can use them for and stuff.

07:04 What do I use them for?

07:05 I use them primarily to run pytest on multiple versions.

07:09 So the general, one of the workflows that works on both of them is I want to create a virtual

07:14 environment with, like, Python 3.10, 3.11, 3.9, a bunch of different Pythons, create a virtual

07:20 environment, install my package that I am trying to test, and then run that, and then all the

07:25 dependencies, and then run that test suite within that environment.

07:30 So that's kind of a standard thing.

07:33 So my first thought when I saw Nox was, so one of the benefits of Nox, Tox uses

07:41 INI files for the settings.

07:43 You can also use, it supports TOML now, I think, maybe.

07:47 I think you can do it in pyproject.toml also.

07:49 If not, sorry.

07:52 But Nox uses just a Python file.

07:57 So you have a Python file, I think it's noxfile.py or something like that.

08:02 But it just uses, I could use the example.

08:06 Anyway, it does similar things.

08:09 So Hynek, I'm going to get it wrong, sorry.

08:12 Hynek wrote an article called Why I Like Nox, and he specifically calls out, like, I'm not

08:19 bashing Tox.

08:20 Tox is still awesome, a great team supporting it, and I agree.

08:23 I know a lot of the people that support it.

08:25 But Nox might be for you as well.

08:27 So here's a person that likes both tools, comparing them, and it's refreshing.

08:33 So first off, it's the file format.

08:36 Tox uses INI files, Nox uses Python.

08:39 And I got to admit, even for a simple example like this, then the example I'm showing is

08:44 running Python 3.10 and 3.11 and being able to pass in arguments to pytest.

08:49 Both are not terrible, but I think maybe the Nox one's a little bit more readable just because

08:56 it's Python.

08:56 It's definitely more flexible because you could run arbitrary Python code in addition to

09:00 some sort of setup, teardown, beyond pytest itself.

09:05 Similar.

09:05 And then he gets into another example, which is a little bit more involved, which is I've

09:12 got a test matrix, but I also, different Pythons, but I want to be able to run the oldest attrs

09:18 version against whatever Python environment I'm running.

09:23 And he claims, and I haven't tried this out, that with Tox, he doesn't know why

09:29 it isn't working, but it's just, it doesn't work.

09:31 And I'm, you know, I can't help him out there.

09:35 But then he switches to Nox and it's a lot longer example, but it works great.

09:43 And the longer-ness I kind of like, and Hynek points out, in terms of

09:50 number of lines, it's longer than the Tox equivalent, but that's because it's more explicit.

09:55 And anyone with a passing understanding of Python can deduce what's happening here, including

10:00 myself a year from now.

10:01 Explicit can be good actually.

10:03 So I kind of like that, that it's okay that it's longer.

10:07 You're not reading it all the time and having it more verbose might help.

10:11 So I like that.

10:13 And then of course you brought this up, the power of the snake.

10:16 You can run, you can run anything you want.

10:19 It's Python code.

10:20 So that's nice.

10:21 And then one bonus thing is apparently it's a little easier to specify versions, in that Nox has

10:27 a --python flag and you can pick the version you want to use like that.

10:31 And it just looks normal.

10:31 You can do that with Tox too, but the normal way to do it is to say, like, py310, which

10:38 you just have to know the syntax.

10:40 It's not terrible, but whatever.

10:41 Yeah.

10:42 So good, good overview of Nox.

10:45 Yeah.

10:46 It didn't really, I guess I didn't realize that Nox was plain Python.

10:50 I'm sure that I knew that at one point, but forgot about it.

10:52 That looks, that is an interesting advantage.

10:54 Yeah.

10:55 Yeah.

10:55 So I think I want to play with it a little bit more.

10:58 And he points out also that he's not switching completely over to Nox,

11:03 but he does have some projects using Tox and some using Nox.

11:06 It's good.

11:06 There's two.

11:07 Yeah.

11:07 And they rhyme.

11:08 All right.

11:08 How about our sponsor this week?

11:10 Oh, yes.

11:11 Thank you to Python.

11:13 This episode of Python Bytes is brought to you by Microsoft for Startups.

11:17 Microsoft for Startups has built Founders Hub to help startups be successful.

11:22 Founders Hub provides founders at any stage with free resources to help solve startup challenges.

11:27 The digital platform provides technology benefits, access to expert guidance, skilling resources, mentorship, networking connections, and so much more.

11:36 It is truly open to all.

11:37 Along with free access to GitHub and Microsoft Cloud, with the ability to unlock credits over time, Founders Hub has also partnered with other innovative companies to provide exclusive benefits and discounts, including OpenAI.

11:50 And we've heard from one of our listeners that he's taken advantage of this already, and the discounts are awesome.

11:56 You'll also have access to their mentorship network, giving you access to a pool of hundreds of mentors across a range of disciplines.

12:02 You'll be able to book a one-on-one meeting with the mentors, many of whom are former founders themselves.

12:08 Make your ideas a reality today with the critical support you'll get from Microsoft for Startups, Founders Hub.

12:14 To join the program, visit pythonbytes.fm/foundershub2022.

12:19 The link is in your show notes.

12:20 Ooh.

12:21 Indeed.

12:21 Indeed, indeed.

12:22 Thank you, Microsoft.

12:23 All right.

12:23 Ready for the next one?

12:24 Yes.

12:25 Not that one.

12:26 So this comes from Tom's corner of the internet.

12:30 Tom's got his own corner.

12:31 Yeah, got his very own corner.

12:33 There's a lot of corners of the internet, to be honest.

12:36 I don't know how many dimensions that is, but many, many corners exist in the internet.

12:40 And here's one of them from Tom.

12:42 And Tom says, I don't think the tools he's using here are exactly about Python, but what he is

12:49 applying them to certainly is.

12:51 It says, I scanned every package on PyPI and found 57 live AWS keys.

12:58 Not just, oh, that looks like a string that could be an AWS key.

13:01 He logged in as that person.

13:03 Oh.

13:05 This is not on GitHub.

13:06 I want to emphasize, this is on PyPI.

13:08 So pip install.

13:09 Hey, look, thanks for shipping me a version of your keys.

13:13 Weird.

13:14 Weird indeed.

13:15 So it says, after I inadvertently found that Infosys leaked AWS keys on PyPI, I thought,

13:22 well, if it's once, it's probably many times, right?

13:26 They're probably not the only one.

13:27 So after scanning, get this, all 430,000 published.

13:32 Well, no, actually, I think that's releases.

13:35 There's 430,000 packages, but there's 4.1 million releases.

13:40 So I think he scanned all the version history as well.

13:44 Okay.

13:44 Somebody found them and took them out.

13:45 Anyway, after scanning those, I found 57 valid access keys from, you know, organizations.

13:52 I'm sure that they're new at working with the cloud and especially AWS.

13:56 It is tricky.

13:56 So these organizations may not be familiar with the rules, but Amazon leaked their own AWS keys.

14:06 That was the joke.

14:08 The rest, not so much.

14:09 But Intel, Stanford, Portland, and Louisiana universities, keeping it local.

14:13 The Australian government.

14:14 Thank you for that.

14:16 General Atomics Fusion Development, Teradata, Data Lake.

14:20 And yes, your gloves too have been leaked.

14:23 Top Glove, the world's largest glove manufacturer.

14:26 I love the emoji.

14:27 A little glove.

14:29 There's a glove emoji at the end of that title.

14:31 So here, like, check this out.

14:32 If I click on Australian government, it takes us to inspector.pypi.io, which I didn't really know anything about.

14:40 Then it links to datacube-ows version 1.8.6.

14:46 And pulls it down into WSGI local PyPI.

14:50 Brian, what does the comment say?

14:53 Do not commit.

14:54 AWS keys, do not commit.

14:56 Not only are they committed.

14:58 You know what?

14:59 Here's the thing that's interesting, okay?

15:01 They may not be committed to GitHub, but they may have forgotten to take them out when they did the build step to build the wheel.

15:10 And then they comment them out or somehow remove them from going to GitHub.

15:13 And that's...

15:14 Oh, I could totally see how this could be easily done.

15:16 Because you have to go through an extra step to push from GitHub to PyPI.

15:22 A more natural beginner state is you publish to GitHub and you publish to PyPI.

15:28 Yeah, exactly.

15:29 It's very...

15:31 If you haven't set up full end-to-end automation that does the publish for you, which I think a lot of people haven't.

15:36 Yeah.

15:36 You know, it's easy to have this happen.

15:39 All right.

15:39 So let's go through this real quick here.

15:40 So how do we do this?

15:41 Detecting AWS keys is actually pretty simple.

15:44 Did you know that there's a regular expression that is a valid match for AWS keys?

15:50 I thought it was kind of random business, but no.

15:52 There's a certain format that they take.

15:55 So you can tell this is not just A key.

15:57 It is an AWS key ID.

15:59 Oh, cool.

16:00 So now I know how to search for them in other people's repos.

16:03 I feel like this would be a pretty awesome pre-commit hook.
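As a sketch of the pattern being described: AWS access key IDs are 20 characters, a known four-letter prefix (`AKIA` for long-lived keys, `ASIA` for temporary STS keys) followed by 16 uppercase alphanumerics. The article's scanner uses a more complete prefix list; this is a deliberately minimal illustration:

```python
import re

# Access key IDs: a known 4-char prefix plus 16 more characters.
# Real scanners match more prefixes (AROA, AIDA, ...); [A-Z0-9]{16} is
# a deliberately loose tail for illustration.
AWS_KEY_RE = re.compile(r"\b(?:AKIA|ASIA)[A-Z0-9]{16}\b")

# AKIAIOSFODNN7EXAMPLE is AWS's own documented example key ID.
text = 'aws_access_key_id = "AKIAIOSFODNN7EXAMPLE"  # AWS keys, do not commit'
print(AWS_KEY_RE.findall(text))  # -> ['AKIAIOSFODNN7EXAMPLE']
```

Something like this pattern is what makes the pre-commit-hook idea plausible: it's a cheap textual check you can run before anything leaves your machine.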

16:06 And, you know, there's tools like Twine and others that people use to build their packages that get shipped to PyPI.

16:16 Yeah.

16:16 And PyPI itself.

16:18 All of those could start applying checks for this kind of stuff, right?

16:22 Because GitHub access keys have a certain pattern now.

16:25 I don't remember.

16:26 There's like, there's some prefix that they seem to have that looks like it's predictable.

16:30 I feel like maybe, maybe this could be put into the supply chain pipeline, as they call it.

16:36 But anyway, there's a regular expression you can run against to find them.

16:40 And here we go.

16:41 So we can use the amazing ripgrep to search packages for this pattern.

16:45 And look at that.

16:46 Here we go.

16:47 You pull down this, this gzip file, and then you ripgrep it.

16:52 And boom, out comes the access keys.

16:54 Whoopsie.

16:55 Apparently Amazon pay at this point here.

16:58 But just because the keys are present, are they valid?

17:02 I don't know.

17:03 So the next step shows you how to execute the AWS CLI command to get the caller identity to see if it's actually valid.

17:10 Right?

17:11 Okay.

17:11 So it says, now the devil's in the details.

17:13 The -z flag doesn't support searching zips.

17:16 So let's go.

17:18 Let's go and tear this up.

17:20 And points out, you can get the entire thing over at github.com/orf/pypi-data.

17:28 You get the entire static dump of PyPI data.

17:30 Oh, nice.

17:31 Did you know this existed?

17:31 I had no idea.

17:32 So I'm like, wait a minute.

17:33 Let's go check this out.

17:34 pypi-data.

17:35 This is automatically updated.

17:37 PyPI API data available in bulk.

17:40 So the contents of the entire.

17:43 How big is this?

17:43 It's not small.

17:46 See the shallow checkout, perhaps.

17:50 The contents of the entire PyPI JSON API for all packages updated every 12 hours.

17:57 Wow.

17:57 Yeah.

17:58 So it says, for example, here's the JSON for Django.

18:03 So anyway, I didn't know that exists.

18:05 That's pretty awesome.

18:07 Then he set up a GitHub action to pull those down.

18:11 Then GitHub actions.

18:13 Let's see.

18:13 GitHub secret scanning service will kick in and let AWS know that the keys are leaked.

18:20 This will cause AWS to open a support ticket with you to notify that your keys are leaked,

18:24 which is kind of an interesting chain of events that happens here.

18:28 But it talks about how old the keys might be.

18:32 The oldest one is from 10 years old from 2013.

18:35 And different reasons this happens.

18:38 It's hard, for example, to test against AWS.

18:42 Another reason that they say is like there's legitimate and quote uses.

18:46 One of the things they talk about is, you know, why is this happening?

18:49 And Python being super heavily used in data science and ML, a lot of folks come from that

18:55 side of the world without super strong software engineering practices.

18:59 And so maybe, you know, coming from economics and being an economist, you're like, oh, I got

19:05 this thing working.

19:06 Let me publish this up to help people.

19:07 Right.

19:08 It's really easy to not think about some of these things.

19:12 Right.

19:12 But basically, don't put your secrets in your source code.

19:15 Don't put them in GitHub.

19:16 And, you know, by the transitive property, don't

19:20 put them in PyPI either.

19:21 Yeah.

19:22 Yeah.

19:22 So anyway, what do you think?

19:24 I think it's a head shaker, but interesting.

19:28 We need like stickers made up for laptops.

19:31 Do you know where your keys are?

19:33 Exactly.

19:34 It's 10 p.m.

19:35 Do you know where your keys are?

19:37 Yes.

19:39 They said they're sleeping at their friend's house.

19:41 They're actually at a frat party.

19:42 Okay.

19:44 Yeah.

19:44 So the article is like missing one step, and that's how to set up a Bitcoin miner

19:49 on all these keys that you.

19:50 That's left as an exercise to the user.

19:54 By the way, nice little Hugo website here.

19:58 Got to give a little shout out to that.

19:59 And I know we both like our Hugo.

20:01 Okay.

20:01 That's it for this one.

20:02 Okay.

20:03 What's your final one for us?

20:05 I've got a hypothesis.

20:06 So I get actually asked this a lot.

20:09 I do like hypothesis, but it's a little overwhelming.

20:11 I get asked, so what do you think about hypothesis or something like that?

20:15 Or do whatever?

20:16 Yes.

20:17 I use hypothesis.

20:17 I do like it, but it is, it can be overwhelming.

20:20 So we're going to take a look at an article called getting started with property based testing

20:25 in Python with hypothesis and pytest.

20:28 And this is from Rodrigo and I'm not going to try the rest of the name.

20:33 Rodrigo.

20:34 Maybe.

20:35 Yeah, I'm not.

20:36 Sorry.

20:36 Cool name.

20:37 Serrão.

20:39 Yeah.

20:40 Anyway, it's on the Semaphore blog.

20:42 And there's a lot that I like about this article.

20:46 And, well, first off, what I really like about property-based testing is not that, I

20:54 mean, it can find some bugs in your code and that's kind of what it's for.

20:58 And it's good, but it also makes you think about it.

21:00 So thinking about a few examples to test your code and corner cases and all that stuff is

21:06 good to say, you know, how do I know if my code's working. But with property

21:10 based testing especially, I think a good place to focus this on is algorithmic

21:15 stuff.

21:16 So you've got like some type of algorithm inside a function, and you really want

21:21 to make sure that that's just solid, no matter what you throw at it.

21:25 And so that's a great place for property based testing.

21:27 But what you do is, you have to think about what properties are true,

21:32 because what hypothesis is going to do is it's going to throw a bunch of input at your function.

21:36 So you have to think, how do I tell, if I don't know what the input is, if the answer

21:41 is correct?

21:42 Because if you know the input, you can, like, calculate it yourself, whether the answer

21:47 is correct or not or something.

21:49 But without that, you're thinking in properties.

21:52 And so I love this article.

21:54 It goes through two examples.

21:58 The first example is a greatest common divisor math problem of, like, thinking about, I mean,

22:05 you can just like have some known problems that you know the answer to and pull those out.

22:11 That's great.

22:12 But how would you test like for every number?

22:14 And so going through a couple, thinking about what to test is great.

22:19 What did he talk about?

22:21 For greatest common divisor, your answer is going to be positive.

22:25 And the answer needs to divide both of the numbers, right?

22:31 That's kind of the point of it.

22:32 But then how do you know if it's the right one?

22:35 Well, no other number larger than your answer is going to be able to divide N and M.

22:40 So you're going to end up doing kind of an exhaustive search a little bit.

22:44 But that's OK.

22:45 It's source code.

22:47 It'll run.

22:48 Shouldn't be too long.

22:49 And the other thing hypothesis does, which I didn't know at first, is it's pretty good

22:54 at picking numbers that will probably break your code.

22:56 And by default, it only picks 100 numbers.

22:59 It only runs 100 test cases.

23:01 And the limit is important because often your sample size that you could test is infinite.

23:09 So you don't want it to just run forever.

23:11 You want there to be some constraints on it.

23:15 So it goes through this example, coming up with how to test that.

23:20 So it writes a test, and then, how do you plug hypothesis into it?

23:24 So you've got given, and strategies is often used.

23:28 And so you put a decorator on your test, say, given the integers strategy for both of the inputs,

23:36 n and m, to make sure.

23:39 And then you run your function and then test stuff around it.

23:43 And the test is listed up higher in the code.
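The test being described looks roughly like this. It's a sketch: it uses math.gcd from the standard library as the function under test (the article tests its own implementation), and it encodes the three properties Brian lists, positive, divides both, and nothing larger divides both:

```python
from math import gcd
from hypothesis import given, strategies as st

# Sticking to positive inputs for now, to sidestep the zero/zero question.
@given(st.integers(min_value=1, max_value=100),
       st.integers(min_value=1, max_value=100))
def test_gcd_properties(n, m):
    d = gcd(n, m)
    assert d > 0                                  # the answer is positive
    assert n % d == 0 and m % d == 0              # it divides both inputs
    # the "exhaustive search" part: no larger k divides both n and m
    assert all(n % k or m % k for k in range(d + 1, max(n, m) + 1))

test_gcd_properties()  # hypothesis generates ~100 (n, m) pairs
```
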

23:46 And then quickly, you're going to find problems.

23:50 And I like the greatest common divisor because there are test cases that don't

23:55 work, which is great.

23:56 Like zero-zero, you know, by definition, if both of them are zero, it's undefined.

24:03 And if one of them is zero, it's defined to be the other one, which I actually didn't know.

24:08 I'm like, really, is that true?

24:10 I looked it up.

24:10 And I didn't think about that either.

24:12 Yeah.

24:12 So apparently, if you the greatest common divisor of zero and five is five, which who knew?

24:19 But aside from that, so coming up with edge cases is probably good for algorithmic type

24:25 of code anyway.

24:26 And then the example of zero zero, how do you get rid of that?

24:30 So in this, I guess this is one of my caveats about this article.

24:34 He talks about limiting the range, which it's good.

24:38 It's a good example, because you're going to want to do this in a lot of your test cases

24:41 is limit the range.

24:42 So you can put a min and max on different things.

24:45 And there's a lot more than numbers.

24:46 You can do text and you can do all sorts of stuff with hypothesis.

24:49 But I think starting with numbers is a good one.

24:51 I just don't like the solution that he came up with.

24:54 The solution he came up with is limit one of them from one to 100 so that you're never

25:00 going to have zero zero.

25:01 And I'm like, I personally would have used a different mechanism.

25:06 So my recommendation is, there's a strategy called,

25:11 so, oh, not strategies, making assumptions.

25:15 So there's a thing in hypothesis called assume.

25:18 And you can say within a test, you can say assume something.

25:23 Assume is just like an assert, but it doesn't fail your test.

25:26 If it fails, it rejects the test case, is how it works.

25:31 So for the zero-zero case, you can say assume

25:39 not (n == 0 and m == 0).

25:41 It's hard to do this without code.

25:44 But you can make an assumption there so that it'll kick that one out.

25:47 That's how I would have done that.
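A sketch of what Brian is describing, again with math.gcd standing in for the article's function; the assume() call is the real hypothesis API, the rest is illustrative:

```python
from math import gcd
from hypothesis import assume, given, strategies as st

@given(st.integers(min_value=0, max_value=100),
       st.integers(min_value=0, max_value=100))
def test_gcd_skips_zero_zero(n, m):
    # assume() quietly rejects the generated case instead of failing the test
    assume(not (n == 0 and m == 0))
    d = gcd(n, m)
    assert d > 0 and n % d == 0 and m % d == 0

test_gcd_skips_zero_zero()
```
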

25:49 But other than that, it's a really great introduction to how to work with property-based testing.

25:55 So I give it a thumbs up.

25:57 Then there's a second example, which is nice, too.

25:59 So that's cool.

26:00 I didn't know about assume.

26:02 So it's very good to know about that.

26:03 The other thing that I think is a good thing to know about is example.

26:08 So like the example zero zero.

26:10 We specifically don't want to test that because we know it's broken or it's not defined.

26:14 But there are lots of cases where you're like, you know, somebody you're doing property based

26:19 testing on something and you get a defect of like, well, if I run these numbers, it fails.

26:23 And you're like, oh, well, we want to make sure we always run that.

26:26 So with example, you can say to hypothesis,

26:29 you get to come up with the examples, except always run this one also.

26:33 So and you can just kind of stack them up.

26:36 That's good.

26:36 Yeah.

26:36 And so for people listening, example is a decorator you put on your test, @example, and you put

26:41 a certain set of parameters that get called there.

26:44 Yeah, I just I kind of don't like that.

26:46 So example is a decorator that you put on the outside to say, run this one always.

26:51 I'd like the reverse also to say this particular example, don't run it.

26:56 Because I know it's broken.

26:57 And, I mean, we get around it with the assume part, but it would be cool if there

27:02 was like a don't run this example.
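And the @example decorator stacks the way Brian describes; here's a sketch, again using math.gcd as a stand-in for the function under test:

```python
from math import gcd
from hypothesis import example, given, strategies as st

@given(st.integers(min_value=1, max_value=100),
       st.integers(min_value=1, max_value=100))
@example(n=12, m=18)   # a known-interesting case: always run it
@example(n=7, m=7)     # stack as many as you like
def test_gcd_with_examples(n, m):
    d = gcd(n, m)
    assert n % d == 0 and m % d == 0

test_gcd_with_examples()
```

There's no built-in inverse ("never run this case"); as discussed, assume() inside the test body is the usual way to exclude known-bad inputs.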

27:04 Yeah.

27:05 Yeah.

27:05 It looks very helpful.

27:06 And I learned some things.

27:07 So excellent.

27:07 How about you?

27:08 What's next?

27:09 We got extras, extras, extras.

27:11 Are we done with our normal ones?

27:13 Wow.

27:13 Yeah.

27:14 Time flies when you're having fun, you know?

27:15 Yeah.

27:16 All right.

27:16 When we go first, you want to do yours?

27:18 I got a short one.

27:20 All right.

27:21 Go for it.

27:21 OK.

27:21 So my let's let's get rid of that.

27:23 We don't need that.

27:25 My example, my extra is just that this came in the mail and I'm really excited about

27:29 it.

27:29 So it's the Japanese version of Python testing with pytest.

27:33 It's been out for a little while in, I assume, Japan.

27:37 But it got translated.

27:42 I was in touch with the translator and they asked me a few questions.

27:48 Very respectful, dude.

27:50 And I'm glad.

27:51 And I'm like, can I get a copy?

27:52 And they sent me a few copies, actually.

27:54 So pretty exciting.

27:55 Fantastic.

27:56 Oh, that's really cool.

27:57 It's neat to see it.

27:58 It's neat to be reminded of the global reach.

28:01 Yeah.

28:01 And not just the cover.

28:02 The insides are there, too.

28:04 Amazing.

28:06 So you're going to learn Japanese?

28:09 Are you going to figure this out?

28:10 I do have a friend that speaks Japanese, so I'm going to go and see.

28:14 But they don't code.

28:15 Yeah.

28:16 It's all right.

28:17 I'm sure they'll find it riveting anyway.

28:19 Yeah, that's awesome.

28:22 What are your extras?

28:24 Well, a couple of things.

28:25 So it seems like I have survived the very first Python bytes on my new Mac Mini.

28:31 I just got, you know, Apple released the Mac Mini Pro M2, and I got that.

28:35 And so far, super, super neat.

28:38 I can recommend it.

28:38 It's a lot faster than the previous Mini.

28:40 So I mentioned that I used to have money, and then I had a Mini on order.

28:45 And now I actually have a Mini.

28:46 A new Mini.

28:47 It looks identical to the other, but it goes way faster, which is great.

28:52 Maybe I'll have more to say about that later.

28:53 All right.

28:54 So have you heard that Twitter is going through some turmoils?

28:57 I'm not sure.

28:58 I think something's going on over there.

29:00 The latest turmoil is that they decided to unceremoniously, unprofessionally cancel all the third-party Twitter apps.

29:10 That's just insane.

29:12 It's pretty insane.

29:14 Honestly, I think it's within Twitter's rights to say, look, we don't want to have third-party apps.

29:21 We have a business model that doesn't work well.

29:23 There's not third-party Facebook apps.

29:25 There's not third-party Instagram apps, are there?

29:27 I don't think so.

29:28 Anyway, I think it's fine.

29:30 But the way that it was done was, oh, we're just going to cut them off.

29:35 And then in a few days, maybe somebody will say something.

29:37 What they said was, the reason we cut off things like TweetBot and other ones is because they violated the terms of service.

29:45 Like, wait, we've been doing this for 10 years.

29:47 What do you mean?

29:47 The reason they violated them is they went back and updated the terms of service to say we don't allow third-party apps.

29:53 That's the only way they violated it.

29:55 I mean, it was just really weird.

29:57 Anyway, I want to just direct people who want to, you know, enjoy the moment technically and socially to tapbots.com slash TweetBot, where they put up a memorial to TweetBot.

30:09 Brian, isn't this a fantastic picture?

30:11 It really is.

30:12 There's this little elephant.

30:14 But it's all, like, 3D-looking, like claymation or something.

30:17 Yeah, it does look a little claymation-y.

30:20 And, you know, the Mastodon elephant is at the gravestone of Tweetbot.

30:25 And it has the lifespan, April 2011 to January 2023, on the gravestone.

30:34 And it's just, anyway, it's pretty interesting.

30:36 Part of the reason I bring this up, not just for the picture, is that if you're into Mastodon, the same company decided, well, we're doubling down on Mastodon and creating Ivory.

30:46 There's been a ton of talk about Ivory being a really cool app if you want something better than, say, the progressive web app for Mastodon.

30:54 I know there are others out there as well, but a lot of people are really big fans of Tweetbot and Tapbots, the company.

31:00 And so you can now, this is now publicly available.

31:03 So there's that.

31:05 And anyway, I started using it.

31:07 I like it really well.

31:08 I don't use it exclusively over just using the progressive web app because the progressive web app has the advanced view where you have multiple columns.

31:16 You can create searches for hashtags and pin those as columns.

31:19 And it's really nice.

31:20 But this is quite a nice, nice app if you're not kind of doing that advanced view.

31:25 Oh, and by the way, Christopher Tyler has caught something incredible.

31:29 Yeah.

31:30 I have missed this.

31:32 This gravestone in the memorial happens to be on Mars, I think.

31:37 Yeah, it kind of looks like that.

31:38 Right.

31:38 That's awesome.

31:39 Why do you think it's on Mars?

31:40 I don't know.

31:41 Same as the simulation thing.

31:42 We're not sure about that.

31:43 Anyway, check out Ivory.

31:45 People can try that.

31:46 If you're in iOS, that's pretty good.

31:48 One more quick extra.

31:50 PyCon, PyCon, PyCon.

31:52 Yay.

31:52 Yay.

31:53 I'm looking forward to this.

31:54 I got my tickets, Brian.

31:56 I'm going to be there for a week.

31:58 I'm going to try to be cruising around the sprints.

32:01 I might even take a day and try to go skiing.

32:03 I haven't decided.

32:04 We'll see what the weather's like out there.

32:05 But it's in Salt Lake City, of course.

32:08 However, I have news for you.

32:09 I haven't even told you this officially.

32:11 Okay.

32:12 But I have gotten us an official time and place at PyCon to do a live Python Bytes show.

32:20 Yay.

32:20 Awesome.

32:21 Yeah.

32:21 So we've previously kind of run around the first day and looked for somewhere where we

32:26 might be able to do something.

32:27 But we're supposed to have an official room and a time where we could actually live stream

32:32 it.

32:32 We can talk to people ahead of time if they really want to, if they're going to be there and

32:35 they want to come.

32:36 So we should be able to have a cool event at PyCon.

32:38 That's why I bring this up.

32:39 Yeah.

32:40 And it won't be on Tuesday then.

32:41 No, it won't be on Tuesday because it's Thursday night, Friday, Saturday, Sunday is the conference.

32:48 So it's going to be one of those days.

32:49 Awesome.

32:50 I'm looking forward to that.

32:51 I'll be there too.

32:52 Neither of us are speaking, but we'll do the live event.

32:55 And then Michael's going to probably interview absolutely everybody at the conference.

32:58 Yeah.

33:00 I'm trying something different this year.

33:02 You know, Talk Python, primarily for the courses side of things, has had a booth on the expo

33:08 hall floor where I'll set up and meet people and show off the things we're doing.

33:13 And, you know, Brian, you've come a couple of years and hung out there.

33:16 We talked about Python Bytes and that booth as well, which is fantastic.

33:20 This year, I'm not doing that.

33:21 I just want to try to be around, interact with more people, and maybe do

33:27 some more on-the-spot shows and some other stuff like that.

33:31 Right.

33:31 So absolutely, we're going to be at PyCon.

33:33 We're going to be doing fun stuff, just not at a booth this time.

33:37 I'm going to try some variations on it this year.

33:39 Yeah.

33:40 And I think a lot of people should hit us up.

33:43 So especially if you thought about maybe asking to be on one of our shows, either yours or mine

33:48 or this one.

33:49 But if you're a little nervous, then this is a great opportunity.

33:52 One, you don't have to be.

33:53 You could just contact us anyway.

33:55 But at PyCon, you can hit us up and say, hey, I was curious if this would fit.

34:00 And in person, sometimes it's easier to talk.

34:03 And I'll be bringing stickers, of course, to promote the book.

34:07 And I'm excited.

34:09 And also, I'll be at PyCascades before PyCon.

34:12 PyCascades is there.

34:14 It's coming up in March.

34:16 And I'm speaking there, too.

34:18 That's in Vancouver, right?

34:20 Yep.

34:20 Vancouver, BC.

34:21 Lovely.

34:22 I've been to the one in Vancouver.

34:24 I think the inaugural one was there.

34:26 And it's really, really nice there.

34:29 So excellent, excellent conference.

34:29 I told my daughter I'm going and she's like, what's the big deal?

34:32 Vancouver's like 20 minutes away.

34:34 No, different Vancouver.

34:36 For those of you who don't know, the people who explored, the Europeans who explored the

34:42 Pacific Northwest, they didn't have a lot of creativity.

34:45 There are multiple Vancouvers.

34:47 There's like one just by Portland.

34:49 There's one up in BC.

34:50 Mount Hood, one of the most awesome mountains around here.

34:53 It's just named for the friend of some guy back in England who was never even here or looked

34:57 upon the mountain.

34:58 Really?

34:58 Oh, yeah.

35:00 It's awesome in a bad way.

35:02 Okay.

35:05 But yeah, the other, the northern, the Canadian Vancouver is a really nice place to go.

35:09 All right.

35:10 You ready for our joke, Brian?

35:11 Yeah.

35:11 So I feel like you and I can relate to this given our age here.

35:15 There's a post here that says from somebody named Mark the Cat Whisperer, but re-shared

35:21 by Rob Isaac.

35:22 Mark says, I'm a Gen Xer, so I adapt to new technology like a millennial, but I get angry about it like

35:29 a boomer.

35:30 I get that.

35:32 I get that too.

35:33 I'm definitely in the Gen X space and, oh my gosh, I have more than once yelled

35:39 at my computer.

35:40 I find personally the way, like the reason I connected with this joke so well is I get

35:45 mad at other people's technology because I'm like, I know this could be better.

35:49 Why have you not put an index here?

35:50 Why have you not auto-filled this?

35:52 You know, like, just like, I know you could make it better so easily.

35:55 What is wrong with you?

35:56 And then I guess the boomer side.

35:58 Yeah.

35:59 But the joke is Rob says, I didn't come here to be called out like this.

36:03 Funny.

36:05 All right.

36:06 Well, that's what I got for us.

36:07 Nice.

36:08 Well, thanks a lot, Michael, for joining us again today.

36:12 And on this 232, wait, 321st episode.

36:16 Wow.

36:17 Yeah.

36:17 It's like the amazing countdown.

36:19 We just don't know to what.

36:20 Three, two, one.

36:21 Contact.

36:22 It's the future.

36:23 Well, thanks everybody for joining.

36:25 And thanks, Michael.

36:27 And talk to everybody next week.
