Brought to you by Michael and Brian - take a Talk Python course or get Brian's pytest book


Transcript #321: A Memorial To Apps Past

Recorded on Monday, Jan 30, 2023.

00:00 Hello and welcome to Python Bytes where we deliver news and headlines directly to your earbuds.

00:05 This is episode 321, recorded January 30th, almost the end of January.

00:10 And I am Brian Okken.

00:12 And I am Michael Kennedy.

00:13 Hey, Michael.

00:14 Hey, hey.

00:14 Excited to be here today again.

00:16 Absolutely.

00:18 Before we jump too far into it, I want to thank Microsoft for Startup Founders Hub.

00:22 Please listen to their spot later in the show.

00:24 How are we going to start the show?

00:26 What do you have for us, Michael?

00:27 You may wonder, some folks have publicly expressed the bewildering thought that maybe we live in a simulation.

00:34 I don't think so.

00:35 You think we live in a simulation, Brian?

00:37 sometimes.

00:38 Yeah.

00:39 No, I don't.

00:40 When I'm playing a game, maybe.

00:42 But what if you were working with Git and you wanted to see how things were working, simulate some operations and try to understand how Git works without actually making those changes? Because, you know, Git is full of good jokes, right?

00:58 Like in case of fire, git commit, git push, run.

01:02 Things like that, those jokes.

01:04 But the other one is, you don't need to know git that well.

01:07 If you mess it up, you just delete the repository and clone it again and start over, right?

01:12 So ideally, you would be able to run some operations to help you understand what git is gonna do without consequence.

01:20 And so I introduce to you this tool called git-sim.

01:23 And git-sim will visually simulate Git operations in your repos with a single command.

01:28 So what it is, is instead of saying, like, git merge branch, you would say git-sim merge branch.
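
Roughly, the pairing looks like this, with dev standing in for whatever branch name you'd actually use (a made-up example, not from the show):

    git merge dev        # actually performs the merge
    git-sim merge dev    # only simulates it and renders an image of the result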

01:36 Now, how best to explain what's gonna happen?

01:40 Like if it just says, we would have merged this branch into that branch with seven changes, you're like, okay, maybe that's fine for merge, but there are many other things that are more complicated.

01:50 And so as you and I are fans of, this will simulate, it'll show you the visual behavior changes that are going to happen.

01:58 Isn't that cool?

01:59 >> Yeah.

02:00 >> So by default, you get a JPEG image.

02:02 The top one you see here, you can see all the commits, their SHAs, and their message.

02:09 You can see two branches.

02:11 It'll show you where HEAD and main are, and where dev is, and it'll show you, if you do a commit or actually a merge, you're going to take these changes from dev, push them forward into the resulting shape or behavior of the repository.

02:23 - Yeah.

02:24 - So that one's pretty straightforward.

02:26 I'll show you some really cool ones in a minute.

02:28 So use cases include visualizing Git commands to understand them.

02:32 It's kind of what I was talking about.

02:33 Also my joke, prevent unexpected working directory and repository states by trying it out first.

02:39 But there's also a whole, I'm creating blog posts, tutorials, courses, whatever.

02:46 So sharing visualizations of your Git commands with your team, maybe for documentation, right?

02:52 in our wiki, like this is our workflow.

02:54 You probably don't understand what this weird Git thing is that we're doing because it's non-standard.

02:59 Please watch this little animation so you know why we're doing it or something like that.

03:03 - Yeah, internal documentation, that's a great use for that.

03:08 - Yeah, absolutely.

03:09 So basically the supported commands at the moment are log, status, add, restore, commit, stash, branch, tag, reset, revert, merge, rebase, and cherry-pick, and then sub-commands of those.

03:21 It's got some steps to install it, some ways to run it.

03:26 But then you can see, if you scroll down far enough, you'll start to see some of the examples.

03:31 So there's a bunch of examples that are pictures, like a git-sim log.

03:35 It'll simulate the output showing the most recent five commits on the active branch.

03:40 Yeah, and it sort of shows you the tags and various things.

03:43 You can see status and it does like a Rich-style, sort of tree, not tree, a table view.

03:50 Really nice there, the different, almost Kanban, flow of your files going from untracked to tracked and modified locally and staged and all that, which I actually think is pretty helpful.

04:04 Even with little arrows showing it moved from here in this one of those columns to this other column.

04:11 Or I don't think I've ever used Git restore.

04:13 I don't make mistakes, so it's fine.

04:16 >> Isn't that delete and reclone?

04:18 - Yeah, exactly.

04:20 Oh my God, don't commit that.

04:21 Don't push that, please just delete it.

04:24 Let's see, but if we go down further, you'll get more interesting examples.

04:28 The merge one is there, let's keep going.

04:30 But we get to the video ones.

04:32 This is where it gets to be pretty awesome and pretty unique.

04:34 - Animate.

04:35 - We could do a git-sim reset, so check this out.

04:38 So over here, you have this visual diagram showing how this stuff is changing over time.

04:45 So it's like, all right, what we're gonna do is we're gonna reset this and you will see like the head pointer and the branch pointer move over.

04:51 What do you think, Brian? Isn't this cool?

04:53 >> This is really neat. Yeah.

04:54 >> Yeah. I'll just pick one or two more.

04:56 So we got merge, that's pretty straightforward.

04:58 Rebase, let's animate a cherry pick.

05:01 So you'll see it coming along here, building up the git repo status.

05:06 Then what's going to happen?

05:08 It's going to show us a branch and we want to take some of those changes and cherry pick them over to the main.

05:14 - Yeah, anyway, if you're trying to--

05:16 - I think I'm so lost on that one, but that's okay.

05:18 - I am too, that's not a great animation to be honest.

05:21 (laughing)

05:23 Anyway, it shows you a bunch of examples, a bunch of cool things.

05:26 I think this is really nice.

05:27 Like I said, primarily documentation internally, like your internal wiki, your onboarding docs, or for say blog posts, you wanna talk about what something looks like, then run this.

05:38 And it's not just what does a Git merge look like, it is what does the Git merge look like on this repo in this state, right?

05:44 It applies to your working repo, which is cool.

05:46 >> Yeah, applying to the working one, that's really cool.

05:50 You said that merges are pretty easy, but actually, I think I'll probably use this for merges the most because there's a lot of times where I have a mental model of what my repo looks like, and a merge shows a conflict or something.

06:03 I'm like, why would it be a conflict?

06:05 >> That's a good point, actually.

06:06 >> It's probably because my mental image of what the repo looks like right now is wrong.

06:12 that something has moved forward since I branched off of it or something.

06:15 >> Yeah. Very good point.

06:17 Actually, I end up confused sometimes by, that should have gotten a clean merge, no problem.

06:23 Now, I'm in some situation where it's asking me to describe the changes and I actually don't know what they are.

06:28 So let's start digging in.

06:30 >> Then sometimes you got to like manual merge.

06:32 What is this? 1980?

06:35 >> Exactly. What are we using?

06:36 CVS? Come on, let's go.

06:38 Anyway, this is what I got.

06:40 >> git-sim.

06:42 >> Nice. I guess we're doing a tools thing, at least for now. I'd like to talk about Nox.

06:49 Have you used Nox?

06:51 >> I have not.

06:52 >> Okay. There's Nox and then there's tox.

06:56 I have used tox a lot.

06:59 Both of them, I guess, they do have different things that you can use them for and stuff.

07:04 What do I use them for?

07:05 I use them primarily to run pytest on multiple versions.

07:09 So one of the workflows that works on both of them is: I want to create a virtual environment with, like, Python 3.10, 3.11, 3.9, a bunch of different Pythons, install my package that I am trying to test and all the dependencies, and then run that test suite within that environment.

07:30 And that's kind of a standard thing.

07:33 So my first thought when I saw Nox was, so one of the benefits of Nox: tox uses INI files for the settings.

07:43 You can also use, it supports TOML now, I think, maybe; I think you can do it in pyproject.toml also.

07:50 If not, sorry.

07:52 But Nox uses just a Python file.

07:57 So you have a Pyth-- a Nox--

07:59 I think it's noxfile.py or something like that.

08:02 But it just uses--

08:04 I could use the example.

08:05 But anyway, it does similar things.

08:09 So Hynek-- I'm going to get it wrong, sorry--

08:12 Hynek wrote an article called "Why I Like Nox." And he specifically calls out, like, I'm not bashing tox.

08:20 tox is still awesome, with a great team supporting it.

08:22 And I agree.

08:23 I know a lot of the people that support it.

08:25 But Nox might be for you as well.

08:28 So here's a person that likes both tools comparing them.

08:32 And that's refreshing.

08:33 So first off, it's the file format.

08:36 So tox uses INI files.

08:38 Nox uses Python.

08:39 And I've got to admit, even for a simple example like this, the example I'm showing is running Python 3.10 and 3.11 and being able to pass in arguments to pytest.

08:50 Both are not terrible.

08:51 But I think maybe the Nox one's a little bit more readable just because it's Python?

08:56 It's definitely more flexible because you could run arbitrary Python code in addition to some sort of setup, tear down, beyond pytest itself.
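
To make that concrete, here's a minimal sketch of what a noxfile.py along those lines can look like (my own illustration, not the exact code from the article):

    # noxfile.py
    import nox

    @nox.session(python=["3.10", "3.11"])
    def tests(session):
        # Build a fresh virtual environment per Python version with the package under test plus pytest.
        session.install(".", "pytest")
        # Anything after "--" on the nox command line is passed straight through to pytest.
        session.run("pytest", *session.posargs)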

09:05 Similar, and then he gets into another example, which is a little bit more involved, which is: I've got a test matrix with different Pythons, but I also want to be able to run the oldest attrs version against whatever Python environment I'm running.

09:23 And he claims, and I haven't tried this out, that with tox he doesn't know why it isn't working, but it just doesn't work.

09:31 And I, you know, I can't help him out there.

09:35 But then he switches to Nox, and it's a lot longer example, but it works great.
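
A rough sketch of what that kind of matrix can look like with nox.parametrize; the attrs pin here is a made-up placeholder, not the version from his article:

    import nox

    @nox.session(python=["3.9", "3.10", "3.11"])
    @nox.parametrize("attrs_version", ["21.3.0", None])  # hypothetical oldest pin vs. whatever is latest
    def tests(session, attrs_version):
        session.install(".", "pytest")
        if attrs_version:
            session.install(f"attrs=={attrs_version}")  # force the old attrs on top of the normal install
        session.run("pytest", *session.posargs)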

09:42 And the longerness, the longerness, I kind of like.

09:47 And he points out, in terms of number of lines, it's longer than the tox equivalent, but that's because it's more explicit, and anyone with a passing understanding of Python can deduce what's happening here, including myself a year from now. Explicit can be good, actually. So I kind of like that, that it's okay that it's longer. You're not reading it all the time, and having it more verbose might help. So I like that. And then, of course, you brought this up: the power of the snake.

10:16 You can run anything you want. It's Python code. So that's nice. And then one bonus thing is, apparently, it's a little easier to specify versions. Nox has a --python flag, and you can pick the version you want to use like that. And it just looks normal.

10:32 You can do that with tox too, but the normal way to do it is to say, what, like, py310, which you just have to know the syntax.
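
For example, picking an interpreter from the command line looks roughly like this (nox's flag is the one Hynek calls out; the tox spelling is the usual environment-name style):

    nox --python 3.11    # run only the sessions for Python 3.11
    tox -e py311         # the tox equivalent, via environment names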

10:40 It's not terrible, but whatever.

10:42 >> Yeah.

10:42 >> So good overview of Nox.

10:45 >> Yeah, I didn't realize that Nox was plain Python.

10:50 I'm sure that I knew that at one point, but forgot about it.

10:52 That is an interesting advantage.

10:55 >> Yeah. So I think I want to play with it a little bit more.

10:58 He points out also that he's not switching completely over to Nox, but he does have some projects using tox and some using Nox. It's good there's two.

11:07 - Yeah, and they rhyme.

11:08 All right, how about our sponsor this week?

11:11 - Oh yes, thank you to Python.

11:13 This episode of Python Bytes is brought to you by Microsoft for Startups.

11:17 Microsoft for Startups has built Founders Hub to help startups be successful.

11:22 Founders Hub provides founders at any stage with free resources to help solve startup challenges.

11:27 The digital platform provides technology benefits, access to expert guidance, skilling resources, mentorship, networking connections, and so much more.

11:36 It is truly open to all.

11:38 Along with free access to GitHub and Microsoft Cloud, with the ability to unlock credits over time, Founders Hub has also partnered with other innovative companies to provide exclusive benefits and discounts, including OpenAI.

11:50 And we've heard from one of our listeners that he's taken advantage of this already, and the discounts are awesome.

11:56 You'll also have access to their mentorship network, giving you access to a pool of hundreds of mentors across a range of disciplines.

12:03 You'll be able to book a one-on-one meeting with the mentors many of whom are former founders themselves.

12:08 Make your ideas a reality today with the critical support you'll get from Microsoft for Startups Founders Hub.

12:14 To join the program, visit pythonbytes.fm/foundershub2022.

12:19 The link is in your show notes.

12:21 Ooh.

12:22 - Indeed, indeed, indeed.

12:22 Thank you, Microsoft.

12:23 All right, ready for the next one?

12:24 - Yes.

12:25 - Not that one.

12:26 So this comes from Tom's corner of the Internet.

12:30 >> Tom's got his own corner.

12:32 >> Yeah, his very own corner.

12:34 There's a lot of corners of the Internet to be honest.

12:36 I don't know how many dimensions that is, but many, many corners exist in the Internet.

12:41 Here's one of them from Tom.

12:42 Tom says, "Don't think the tools he's using here are exactly about Python, but what he is applying them to certainly is." The article says, "I scanned every package on PyPI and found 57 live AWS keys." Not just, "Oh, that looks like a string that could be an AWS key." He logged in as that person.

13:03 >> Oh.

13:04 >> This is not on GitHub.

13:06 I want to emphasize, this is on PyPI.

13:08 pip install, "Hey, look, thanks for shipping me a version of your keys." >> Weird.

13:14 >> Weird indeed. It says, after I inadvertently found that Infosys leaked AWS keys on PyPI, I have thought, well, if it's once, it's probably many times, right?

13:26 They're probably not the only one.

13:27 So after scanning, get this, all 430,000 published,

13:32 well, no, actually I think that's releases; there are 430,000 packages, but there's 4.1 million releases.

13:40 So I think he scanned all the version history as well.

13:44 Okay.

13:44 Somebody found them and took them out.

13:45 Anyway, after scanning those, I found 57 valid access keys from, you know, organizations. I'm sure some of them are new at working with the cloud, and especially AWS, it is tricky.

13:57 So these organizations may not be familiar with the rules, but Amazon? Like, those are their own AWS keys.

14:06 That's that was the joke.

14:08 The rest, not so much, but Intel, Stanford, Portland and Louisiana universities, keeping it local.

14:13 The Australian government.

14:15 Thank you for that.

14:15 General Atomics fusion development, Teradata, Data Lake, and, yes, your gloves too have been leaked: Top Glove, the world's largest glove manufacturer.

14:26 I love the emoji of the little glove.

14:29 There's a glove emoji at the end of that title.

14:31 So here, like, check this out.

14:32 If I click on, Australian government, it takes us to inspector.pypi.io, which I didn't really know anything about.

14:40 Then it links to a datacube package, version 1.8.6, and there's the key pulled out, right in the file.

14:49 Yeah.

14:50 And what does the comment say?

14:53 Do not commit.

14:54 AWS keys do not commit.

14:56 Not only are they committed, you know what?

14:59 Here's here, here's the thing that's interesting.

15:01 Okay.

15:01 They may not be committed to GitHub, but they may have forgotten to take them out when they did the build step to build the wheel, and then they comment them out or somehow remove them from going to GitHub.

15:13 >> I could totally see how this could be easily done because you have to go through an extra step to push from GitHub to PyPI.

15:22 A more natural beginner state is you publish to GitHub and you publish to PyPI.

15:28 >> Yeah, exactly. If you haven't set up full end-to-end automation that does the publish for you, which I think a lot of people haven't.

15:36 >> Yeah.

15:36 >> It's easy to have this happen.

15:39 All right. So let's go through this real quick here.

15:40 So how do we do this?

15:42 Detecting AWS keys is actually pretty simple.

15:45 Did you know that there's a regular expression that is a valid match for AWS keys?

15:50 I thought they were just random, but no, there's a certain format that they take.

15:55 So you can tell this is not just a key, it is an AWS key ID.
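
As a minimal sketch of that in Python (not the article's exact pattern, and the key below is Amazon's documented example key, not a real leak):

    import re

    # AWS access key IDs have a short prefix (AKIA for long-lived keys, ASIA for temporary
    # ones, among others) followed by 16 uppercase letters and digits.
    AWS_KEY_ID = re.compile(r"\b(?:AKIA|ASIA)[A-Z0-9]{16}\b")

    sample = 'aws_access_key_id = "AKIAIOSFODNN7EXAMPLE"'
    print(AWS_KEY_ID.findall(sample))  # ['AKIAIOSFODNN7EXAMPLE']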

15:59 >> Cool. So now I know how to search for them in other people's repos.

16:03 I feel like this would be a pretty awesome pre-commit hook.

16:06 And there's tools like Twine and others that people use to build their packages that get shipped to PyPI.

16:16 >> Yeah.

16:16 >> And PyPI itself, all of those could start applying checks for this kind of stuff, right?

16:22 Because GitHub access keys have a certain pattern now.

16:25 I don't remember. There's some prefix that they seem to have that looks like it's predictable.

16:30 I feel like maybe, maybe this could be put into the supply chain pipeline as they call it.

16:36 But anyway, there's a regular expression you can run against to find them.

16:40 And here we go.

16:41 So we can use the amazing ripgrep to search packages for this pattern.

16:45 And look at that.

16:46 Here we go.

16:46 You pull down this gzip file and then you ripgrep it and boom, out come the access keys.

16:54 Whoopsie.

16:55 Apparently Amazon Pay, at this point here.

16:59 But just because the keys are present, are they valid?

17:02 I don't know. So the next step shows you how to execute the AWS CLI command to get the caller identity to see if it's actually valid.
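
That CLI command is aws sts get-caller-identity; a rough boto3 equivalent of the same check, with obviously placeholder credentials, looks like this:

    import boto3

    sts = boto3.client(
        "sts",
        aws_access_key_id="AKIA................",          # the key ID found in the package
        aws_secret_access_key="........................",  # and its secret
    )
    # If the pair is still live, this returns the AWS account ID and ARN it belongs to.
    print(sts.get_caller_identity())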

17:10 Okay. So it says, now the devil's in the details: the -z flag doesn't support searching zips.

17:17 So let's go and tear this apart, and he points out you can get the entire thing over at github.com/orf/pypi-data.

17:27 You get the entire static dump of PyPI data.

17:30 >> Oh, nice.

17:30 >> Did you know this existed? I had no idea.

17:32 So I'm like, wait a minute, let's go check this out.

17:34 PyPI-data, this is automatically updated PyPI API data available in bulk.

17:40 So the contents of the entire-

17:42 >> How big is this?

17:44 >> It's not small.

17:47 Use the shallow checkout, perhaps.

17:50 The contents of the entire PyPI JSON API for all packages updated every 12 hours.

17:57 >> Wow.

17:58 >> Yeah. So it says, for example, here's the JSON for Django.

18:03 Anyway, I didn't know that exists.

18:05 That's pretty awesome.

18:07 Then he set up a GitHub Action to pull those down.

18:10 Then GitHub, let's see, GitHub's secret scanning service will kick in and let AWS know that the keys are leaked.

18:20 This will cause AWS to open a support ticket with you to notify that your keys are leaked, which is kind of an interesting chain of events that happens here.

18:28 But it talks about how old the keys might be.

18:32 The oldest one is 10 years old, from 2013.

18:36 And the different reasons this happens, it's hard, for example, to test against AWS.

18:42 Another reason that they give is, like, there are "legitimate," quote-unquote, uses.

18:46 One of the things they talk about is, you know, why is this happening?

18:49 And Python being super heavily used in data science and ML, a lot of folks come from that side of the world without super strong software engineering practices.

18:59 And so maybe, you know, coming from economics and being an economist, you're like, oh, I got this thing working.

19:06 Let me publish this up to help people.

19:07 Right.

19:08 It's really easy to not really think about some of these things.

19:12 Right.

19:12 But basically don't put your secrets in your source code.

19:15 Don't put them in GitHub.

19:17 And, you know, don't, by the transitive property, put them in PyPI either.

19:21 Yeah.

19:22 So anyway, what do you think?

19:24 I think it's, it's a head shaker, but interesting.

19:28 we need, we need like stickers made up for laptops.

19:31 Do you know where your keys are?

19:33 It's 10 PM.

19:35 Do you know where your keys are?

19:37 Yes.

19:39 They said they're sleeping at their friend's house.

19:41 They're actually at a frat party.

19:42 Okay.

19:43 Yeah.

19:44 So the article is like missing one step and that's how to set up a Bitcoin miner on all these keys that you...

19:50 That's left as an exercise to the user. By the way, nice little Hugo website here.

19:58 Got to give a little shout out to that. And I know we both like our Hugo. Okay, that's it for this one.

20:02 Okay. What's your final one for us? I've got a hypothesis. So I get actually asked this a lot.

20:09 I do like Hypothesis, but it's a little overwhelming. I get asked, "So what do you think about Hypothesis?" or something like that, or whatever.

20:16 Yes, I use hypothesis.

20:18 I do like it, but it can be overwhelming.

20:21 We're going to take a look at an article called Getting Started with Property-Based Testing in Python with Hypothesis and pytest.

20:28 This is from Rodrigo, and I'm not going to try the rest of the name.

20:33 Rodrigo, maybe, yeah, I'm not.

20:36 Sorry. Cool name.

20:37 >> Serrão.

20:38 >> Serrão.

20:39 >> Yeah.

20:40 >> Anyway, it's on the Semaphore blog.

20:43 There's a lot that I like about this article.

20:47 Well, first off, what I really like about property-based testing is not just that it can find some bugs in your code; that's kind of what it's for, and it's good.

20:58 But it also makes you think about it.

21:00 So thinking about a few examples to test your code and corner cases and all that stuff is good to say, you know, how do I know if my code's working?

21:09 But with property-based testing, especially, And I think a good place to focus this on is algorithmic stuff.

21:16 So you've got some type of algorithm inside your function, and you really want to make sure that that's just solid, no matter what you throw at it.

21:25 And so that's a great place for property-based testing.

21:27 But what you do is you think about, you have to think about what properties are true, because what hypothesis is going to do is it's going to throw a bunch of input at your function.

21:36 And so you have to think, how do I tell, if I don't know what the input is, whether the answer is correct?

21:42 Because if you know the input, you can calculate it yourself whether the answer is correct or not or something.

21:48 But without that, you're thinking in properties. And so I love this article. It goes through two examples. The first example is greatest common divisor, a math problem. Thinking about it, I mean, you can have some known problems that you know the answer to, and listing those out is great, but how would you test it for every number? So going through and thinking about what to test is great. What does he talk about for greatest common divisor? Your answer is gonna be positive, and the answer needs to divide both of the numbers, right? That's kind of the point of it. But then how do you know if it's the right one?

22:35 Well, no other number larger than your answer is going to be able to divide both n and m.

22:40 So it's kind of, you're going to end up doing kind of an exhaustive search a little bit.

22:44 But that's okay. It's source code. It'll run, shouldn't be too long.

22:49 And the other thing Hypothesis does, which I didn't know at first, is it's pretty good at picking numbers that will probably break your code.

22:56 and by default, it only picks 100 numbers.

22:59 It only runs 100 test cases.

23:02 The limit is important because often your sample size that you could test is infinite.

23:09 You don't want it to just run forever; you want there to be some constraints on it.

23:15 It goes through this example coming up with how to test that, so it writes a test, and then how do you plug hypothesis into it?

23:24 You've got given, and strategies is often used.

23:28 So you put a decorator on your test, say given with the integers strategy for both of the inputs n and m, and then you run your function and test stuff around it, and the test is listed up higher in the code.
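
Something like this minimal sketch, using math.gcd as a stand-in for the function under test (not the article's exact code):

    from math import gcd

    from hypothesis import given, strategies as st

    @given(st.integers(), st.integers())
    def test_gcd_properties(n, m):
        d = gcd(n, m)
        assert d > 0          # the result should be positive
        assert n % d == 0     # and it should divide both inputs
        assert m % d == 0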

23:47 Then quickly you're going to find problems.

23:50 I like the greatest common divisor because there are test cases that don't work, which is great.

23:56 Like zero.

23:57 Zero, you know, by definition, we don't, if both of them are zero, it's undefined.

24:03 And if one of them is zero, it's defined to be the other one, which I actually didn't know.

24:08 I'm like, really, is that true?

24:10 I looked it up and...

24:11 - Yeah, I didn't think about that either.

24:12 - Yeah.

24:12 So apparently the greatest common divisor of zero and five is five, which, who knew?

24:18 But aside from that, so coming up with edge cases is probably good for algorithmic type of code anyway.

24:26 And then the example of 0, 0, how do you get rid of that?

24:30 So in this, I guess this is one of my caveats about this article.

24:34 He talks about limiting the range, which is a good example because you're going to want to do this in a lot of your test cases: limit the range, so you can put a min and a max on different things.

24:45 And there's a lot more than numbers.

24:46 You can do text and you can do all sorts of stuff with hypothesis.

24:49 But I think starting with numbers is a good one.

24:51 I just don't like the solution that he came up with.

24:54 The solution he came up with is to limit one of them from 1 to 100 so that you're never going to have 0, 0.
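
In Hypothesis terms, that amounts to bounding one of the strategies, roughly like this (a sketch, not his exact code):

    from hypothesis import given, strategies as st

    @given(st.integers(min_value=1, max_value=100), st.integers())
    def test_gcd_properties(n, m):
        ...  # same property checks as before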

25:01 I'm like, "Eh." I personally would have used a different mechanism.

25:07 My recommendation is there's a strategy called, oh, not strategies, making assumptions.

25:15 There's a thing in hypothesis called assume.

25:18 and you can say within a test, you can say assume something.

25:23 Assume, it's just like an assert, but it doesn't fail your test.

25:26 If it fails, it rejects the test case; that's how it works.

25:31 You can say for the 0, 0 case, you can say assume not n equals m equals 0.

25:42 It's hard to do this without code, but you can make an assumption there so that it'll kick that one out.
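
Since it's hard to describe without code, here's roughly what that suggestion looks like as a sketch:

    from math import gcd

    from hypothesis import assume, given, strategies as st

    @given(st.integers(), st.integers())
    def test_gcd_properties(n, m):
        assume(not (n == 0 and m == 0))  # discard only the undefined 0, 0 input instead of failing
        d = gcd(n, m)
        assert d > 0
        assert n % d == 0 and m % d == 0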

25:47 That's how I would have done that case.

25:49 But other than that, it's a really great introduction to how to work with property-based testing.

25:55 So I give it a thumbs up.

25:57 Then there's a second example, which is nice too.

25:59 So--

26:00 - That's cool.

26:01 I didn't know about assume, so it's very good to know about that.

26:04 - The other thing that I think is a good thing to know about is example.

26:08 So like the example zero zero.

26:10 We specifically don't wanna test that 'cause we know it's broken or it's not defined, but there are lots of cases where you're doing property-based testing on something and you get a defect of, like, well, if I run these numbers, it fails.

26:23 And you're like, oh, well, we want to make sure we always run that.

26:26 So with example, you can say, hypothesis, you get to come up with the examples, except for always run this one also.

26:34 So, and you can just kind of stack them up.

26:36 That's good.

26:37 - Yeah, and so for people listening, example is a decorator you put on your test.

26:40 Say @example and you put a certain set of parameters that get called there.
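
A minimal sketch of that decorator in use; the specific pair here is just a hypothetical regression case:

    from math import gcd

    from hypothesis import example, given, strategies as st

    @given(st.integers(min_value=1), st.integers(min_value=1))
    @example(n=12, m=18)  # always run this exact pair, in addition to the generated inputs
    def test_gcd_known_case(n, m):
        d = gcd(n, m)
        assert d > 0 and n % d == 0 and m % d == 0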

26:44 - Yeah, I just, I kind of don't like that.

26:46 So example is a decorator that you put on the outside to say, run this one always.

26:51 I'd like the reverse also to say, this particular example, don't run it, because I know it's broken.

26:58 We get around it with the assume part, but it would be cool if there was a don't run this example.

27:04 >> Yeah. It looks very helpful.

27:06 I learned some things. So excellent.

27:07 >> How about you? What's next?

27:09 >> We got extras.

27:10 >> Are we done with our normal ones?

27:13 >> Yeah.

27:13 >> That's fast.

27:14 >> Time flies when you're having fun.

27:16 All right, you want me to go first or you want to do yours?

27:18 I got a short one.

27:20 All right, go for it.

27:21 Okay, so my, let's get rid of that. We don't need that.

27:24 My example, my extra is just that this came in the mail and I'm really excited about it.

27:29 So it's the Japanese version of Python testing with pytest.

27:33 It's been out for a little while in, I assume, Japan.

27:37 But it got translated.

27:42 I was in touch with the translator and they asked me a few questions.

27:48 Very respectful, dude.

27:50 And I'm glad and I'm like, can I get a copy?

27:52 And they sent me a few copies, actually.

27:54 So very exciting. Fantastic.

27:56 Oh, that's really cool.

27:57 That's neat to see it.

27:58 It's neat to be reminded of the global reach.

28:01 Yeah. And not just the cover.

28:03 Insides are there, too.

28:04 But amazing.

28:06 So you're going to learn Japanese.

28:09 You can figure this out.

28:10 I do have a friend that speaks Japanese.

28:12 So I'm going to show it to them and see. They don't code, but yeah, it's all right.

28:17 I'm sure they're going to find it riveting anyway.

28:19 yeah, that's awesome.

28:22 What are your extras?

28:23 Well, a couple of things.

28:25 So it seems like I have survived the very first Python Bytes on my new Mac mini.

28:31 I just got it, you know, Apple released the M2 Pro Mac mini and I got that.

28:35 And so far super, super neat.

28:38 I can recommend it; it's lots faster than the previous mini.

28:41 So I mentioned that I used to have money and then I had a mini on order and now I actually have a mini, a new mini.

28:47 It looks identical to the other, but it goes way faster, which is great.

28:52 Maybe I'll have more to say about that later.

28:54 All right, so have you heard that Twitter is going through some turmoils?

28:57 I'm not sure.

28:58 - I think, yeah.

28:59 - I think something's going on over there.

29:01 The latest turmoil is that they decided to unceremoniously, unprofessionally cancel all the third-party Twitter apps.

29:11 That's just insane.

29:13 It's pretty insane.

29:14 Like, I think it's honestly, I think it's fine.

29:17 And within Twitter's right to say, look, we don't want to have third-party apps.

29:21 We have a business model that doesn't work well there.

29:23 There's not third-party Facebook apps.

29:25 There's not third-party Instagram apps.

29:27 Are there?

29:27 I don't think so.

29:28 Anyway, I think it's fine, but the way that it was done was, oh, we're just going to cut them off.

29:35 And then in a few days, maybe somebody will say something.

29:38 What they said was the reason we cut off things like tweetbot and other ones is because they violated the terms of service.

29:45 Like, wait, we've been doing this for 10 years.

29:47 What do you mean?

29:47 They, the reason they violated them is they went back and updated the terms of service to say, we don't allow third-party apps.

29:53 I mean, it was just really weird.

29:57 Anyway, I want to just direct people who want to, you know, enjoy the moment technically and socially, to tapbots.com/tweetbot, where they put up a memorial to Tweetbot.

30:09 Brian, isn't this a fantastic picture?

30:11 >> It really is. There's a little elephant, but it's all 3D looking like claymation or something.

30:17 >> It does look a little claymation-y.

30:21 The Mastodon elephant is at the gravestone of Tweetbot, and it has the lifespan, April 2011 to January 2023, on the gravestone of Tweetbot.

30:34 Anyway, it's pretty interesting.

30:36 The reason part of that, I bring this up, not just for the picture, is if you're into Mastodon, the same company decided, well, we're doubling down on Mastodon and creating Ivory, which there's been a ton of talk about Ivory being a really cool app for if you want something better than say the progressive web app for Mastodon.

30:54 I know there are others out there as well, but a lot of people are really big fans of Tweetbot and Tapbots, the company.

31:00 And so you can now, this is now publicly available.

31:03 - Okay.

31:04 - So there's that.

31:05 And anyway, I started using it.

31:07 I like it really well.

31:09 I don't use it exclusively over just using the Progressive Web App because the Progressive Web App has the advanced view where you have multiple columns.

31:16 You can create searches for hashtags and pin those as columns and it's really nice.

31:20 But this is quite a nice app if you're not kind of doing that advanced view.

31:25 Oh, and by the way, Christopher Tyler has caught something incredible.

31:30 - Yeah.

31:30 - I had missed this: the gravestone in the memorial

31:35 happens to be on Mars, I think.

31:37 - Yeah, it kind of looks like that.

31:39 That's awesome.

31:39 - Why do you think it's on Mars?

31:40 I don't know.

31:41 Same thing in the simulation.

31:42 We're not sure about that.

31:43 Anyway, check out Ivory.

31:45 People can try that.

31:46 If you're in iOS, that's pretty good.

31:48 One more quick extra, PyCon, PyCon, PyCon.

31:53 Yay, I'm looking forward to this.

31:55 I got my tickets, Brian.

31:56 I'm gonna be there for a week.

31:58 I'm gonna try to be cruising around the sprints.

32:01 Might even take a day and try to go skiing.

32:03 I haven't decided.

32:04 We'll see what the weather's like out there, but it's in Salt Lake City, of course.

32:08 However, I have news for you.

32:09 I haven't even told you this officially, but I have gotten us an official time and place at PyCon to do a live Python Bytes show.

32:20 - Yay, awesome.

32:21 - Yeah, so we've previously kind of run around the first day and looked for somewhere where we might be able to do something, but we're supposed to have an official room and a time where we could actually live stream it.

32:32 We can talk to people ahead of time who are going to be there and want to come.

32:36 We should be able to have a cool event at PyCon.

32:38 That's why I bring this up.

32:39 >> Yeah. It won't be on Tuesday then.

32:42 >> No, it won't be on Tuesday because it's Thursday night, Friday, Saturday, Sunday is the conference.

32:48 It's going to be one of those days.

32:49 >> Awesome. I'm looking forward to that.

32:51 I'll be there too. Neither of us are speaking, but we'll do the live event and then Michael's going to probably interview absolutely everybody at the conference.

32:59 >> Yeah. I'm trying something different this year.

33:02 So Talk Python, primarily for the courses side of things, has had a booth in the expo hall where I'll set up and meet people and show off the things we're doing.

33:13 And you know, Brian, you've come a couple of years and hung out there, and we talked about Python Bytes at that booth as well, which is fantastic.

33:20 This year I'm not doing that.

33:21 I just want to try to be around, interact with more people, and try to maybe do some more on-the-spot shows and some other stuff like that.

33:31 Right.

33:32 - Absolutely, we're gonna be at PyCon, we're gonna be doing fun stuff, just not at a booth this time.

33:37 I'm gonna try some variations on it this year.

33:40 - Yeah, and I think a lot of people, I think they should hit us up.

33:43 So, especially if you thought about maybe asking to be on one of our shows, either yours or mine or this one, but you're a little nervous, then this is a great opportunity.

33:52 One, you don't have to be, you could just contact us anyway.

33:55 But at PyCon, you can hit us up and say, hey, I was curious if this would fit.

34:00 and in person sometimes it's easier to talk.

34:04 And I'll be bringing stickers, of course, to promote the book, and I'm excited.

34:09 And also, I'll be at PyCascades before PyCon.

34:13 PyCascades is coming up in March, and I'm speaking there too.

34:19 - That's in Vancouver, right?

34:20 - Yep, Vancouver, BC.

34:21 - Lovely, I've been to the one in Vancouver.

34:24 I think the inaugural one was there, and it's really, really nice there.

34:28 So, excellent, excellent conference.

34:30 I told my daughter I'm going and she's like, what's the big deal?

34:32 Vancouver's like 20 minutes away.

34:34 No, different Vancouver.

34:36 - For those of you who don't know, the people who explored, the Europeans who explored the Pacific Northwest, they didn't have a lot of creativity.

34:46 There are multiple Vancouvers; there's one just by Portland, there's one up in BC.

34:50 Mount Hood, one of the most awesome mountains around here, is just named for the friend of some guy back in England who never even was here or looked upon the mountain.

34:58 - Really?

34:59 - Yeah, it's awesome in a bad way.

35:03 Okay, but yeah, the other, the northern, the Canadian Vancouver is a really nice place to go.

35:10 All right, you ready for our joke, Brian?

35:11 - Yeah.

35:12 - So I feel like you and I can relate to this given our age here.

35:16 There's a post here that says from somebody named Mark the Cat Whisperer, but re-shared by Rob Isaac.

35:23 Mark says, "I'm a Gen Xer, so I adapt to new technology like a millennial, but I get angry about it like a boomer."

35:30 I get that.

35:32 I get that too.

35:33 I'm definitely in the Gen X space and, oh my gosh, I have more than one time yelled at my computer.

35:40 I find personally the way, like the reason I connected this joke so well is I get mad at other people's technology because I'm like, I know this could be better.

35:49 Why have you not put an index here?

35:50 Why have you not auto-filled this?

35:52 Like, you know, like, which is like, I know you could make it better so easily.

35:55 What is wrong with you?

35:56 And then I guess the boomer side.

35:58 >> Yeah.

35:59 >> But the joke is Rob says, "I didn't come here to be called out like this." >> Funny.

36:06 >> All right. Well, that's what I got for us.

36:07 >> Nice. Well, thanks a lot, Michael, for joining us again today.

36:11 On this 321st episode, wow.

36:17 >> Yeah. It's like the amazing countdown, we just don't know to what.

36:20 >> 3, 2, 1, contact.

36:22 It's the future.

36:23 >> Indeed it is.

36:24 >> Well, thanks everybody for joining, and thanks, Michael.

36:27 And talk to everybody next week.

36:29 Bye.
