Brought to you by Michael and Brian - take a Talk Python course or get Brian's pytest book


Transcript #333: Live From PyCon

Recorded on Saturday, Apr 22, 2023.

00:00 Hey, welcome everybody and welcome. We've got a whole room full of people. We're recording this live.

00:05 How about we get a live from PyCon?

00:07 Shout out.

00:08 There we go. Thank you all.

00:11 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.

00:16 This is episode 333, half of a beast, recorded April 21st, 2023.

00:24 And I am Brian Okken.

00:25 And I'm Michael Kennedy.

00:26 And we are live at PyCon US 2023.

00:30 Yeah, it's excellent to be here. Thanks everyone for coming. It's awesome. It's a real honor.

00:35 So, shall we kick off the show?

00:43 Let's kick off the show. I think first I'd just like maybe to get your thoughts real quick on PyCon, the Expo, people.

00:49 How's it feeling this year?

00:50 Well, it's the first time I've been back from the pandemic. I didn't show up last year, so I'm pretty excited.

00:55 It's really good to see people. There's people that I haven't seen in person since 2019.

00:59 So, it's pretty awesome. How about you?

01:01 Same. It's really great to reconnect with a bunch of people and see folks I haven't seen for a long time.

01:05 Really nice.

01:06 Yeah, it's been great. Plus, we've got the Python staff this year.

01:09 Yeah, the Python staff of Python things.

01:14 So, I want to talk about something that's all the rage.

01:17 And I do want to put that on the screen for the live stream people as well.

01:20 And that is more AI chat things.

01:25 What do you all think about ChatGPT and all these things?

01:28 Is this scary or is it awesome?

01:31 I'm like, oh, we're getting...

01:33 Yeah, all right.

01:34 So, honesty.

01:35 I think that represents how I've felt.

01:37 I've felt like everyone out there at different stages in this whole journey.

01:40 Yeah.

01:42 What we saw is a whole bunch of thumbs up and a lot of sideways.

01:46 So, we're not quite sure yet what we think of it.

01:48 We're not quite sure yet.

01:49 It's one of those things that's kind of here.

01:52 The cat is out of the bag.

01:53 We can either rail against it or find good uses for it to take advantage of it.

01:57 So, Microsoft has found a way to use the large language models behind OpenAI, the stuff that powers ChatGPT, to help security defenders, they say.

02:10 Like, if I'm on the blue team trying to stop people from breaking into my company, I could use a little bit of help with that.

02:16 And you can already use ChatGPT for all sorts of crazy programming and security type of things, right?

02:25 You can say, hey, dear chat, I would love if you could help me write a phishing email that looks convincing.

02:31 Or I would like you to help me identify things I might test for in misconfigured Nginx files and what I might do with that, you know?

02:39 Those are all bad things.

02:41 But this project here is called Microsoft Security Copilot.

02:46 It says, empowering defenders at the speed of AI.

02:49 And so, basically what this is, is it's ChatGPT.

02:52 But instead of using a general-purpose language model, it's using a cybersecurity-focused large language model that understands things like, don't let me get hacked, buffer overflows, configuration files, that kind of stuff.

03:05 So, if you're in the space of cybersecurity, which Python is one of the most popular languages out there for cybersecurity, right?

03:14 Both sides of it, the good and the bad.

03:16 But, yeah.

03:18 So, basically, you give it a prompt.

03:20 You ask it a question about configuration file or some kind of environment.

03:26 And it will allow, it'll go and use that large language model.

03:29 And it doesn't always get it right.

03:31 This is one of the big challenges.

03:33 Maybe some of the thumbs down from you all were like, you know, this large language model made up something about the world or whatever.

03:41 But it was real confident.

03:42 It was certain it was right.

03:43 But it wasn't.

03:44 So, this has a feedback loop.

03:45 You can say, no, no, that's actually not misconfigured, Security Copilot.

03:49 That was okay.

03:50 And here's why.

03:51 And so, you can have this loop that you would have with, you know, maybe with like a junior cybersecurity researcher or whatever.

03:59 And another thing that I don't really know how all these large language models work and all this AI stuff works.

04:06 Much of it seems to be, we're going to go find a bunch of other people's work and then take that.

04:12 We'll have a really cool system with this cool data, right?

04:14 Like we're going to scan repos and maybe it doesn't matter if it's GPL.

04:19 If we filter the GPL out through some kind of neural net or, you know, get all the Getty images.

04:25 And now we can create really cool pictures if you ask for it.

04:27 But the Getty wasn't on board with that.

04:29 So, this data story is kind of a little suspicious for these.

04:34 But with this one, they explicitly say your data does not get shared back.

04:39 It doesn't go anywhere.

04:41 This is like, you can even lock down how other people are allowed to access it.

04:44 So, that's kind of cool.

04:46 And yeah, they're basically trying to help people go through log files and other things on the server where people are trying to hide their tracks,

04:55 behaving normally but not really, and pull those things out.

04:59 Now, I have no experience with this.

05:00 But I know I interviewed some folks on Talk Python who are astronomers looking for exoplanets.

05:07 And they were able to take old Kepler data and apply some machine learning and computer vision and discover 50 new exoplanets that people thought they had already analyzed.

05:17 And guess what?

05:17 They were hiding.

05:18 They couldn't be discovered by people.

05:19 But by computers, they could.

05:21 I suspect the same type of thing is true here.

05:24 They're like, there's 10 million lines of log file.

05:26 And these three are suspicious.

05:28 But nobody really noticed, you know.

05:29 So, anyway, if you're in cybersecurity, definitely give this a look.

05:33 So, next, I want – I should have thought of this ahead of time.

05:37 But we've got a bunch of people here that can't see our screens.

05:41 And I do – which is a good reminder that this is also an audio podcast.

05:46 It's not just on YouTube, apparently.

05:48 So, the next topic I'll have to be careful talking about.

05:53 But it's PEP 695 type parameter syntax.

05:58 Now, this is – this PEP is an – it's for Python version 3.12.

06:05 It's accepted.

06:06 So, I don't know if it's already in some of the alphas or betas or not.

06:11 Yeah, I don't know either.

06:11 But we've got – so, it's accepted for 3.12 type parameter syntax.

06:16 The abstract is: this PEP specifies an improved syntax for specifying type parameters within a generic class, function, or type alias.

06:26 It also introduces a new statement for declaring type aliases.

06:30 What does that mean?

06:31 Well, I like to – it has some great examples.

06:36 So, we go – if we go down to the examples, there – it's the old way.

06:40 Like, let's say I've got – one of the examples is great.

06:45 So, let's say I've got a function like that takes – it takes something.

06:49 We don't know what the type is.

06:51 But it takes something and then it returns the same type.

06:54 Or it takes something – it takes two of – it has to have two of the same typed things.

06:59 It doesn't matter what they are.

07:00 It doesn't matter what they are.

07:01 So, like two ints or two floats or two lists or two tuples.

07:06 It doesn't matter what, but it's the same thing.

07:09 The old way to do that, which is – I still think it's fairly recent.

07:13 I think this might have been 3.11 for type var.

07:15 It's pretty new, I think.

07:17 So, yeah.

07:18 Yeah.

07:18 I'm laughing because it's rolling over so quickly, right?

07:22 Yeah.

07:22 So, the – anyway, the old way to do it was from typing import TypeVar.

07:28 And I didn't even know you could do this.

07:30 And then you declare a new type using, like, as in this example, underscore t equals type var.

07:37 And then in parentheses, underscore t.

07:41 And then you can use that as the type of arguments.

07:44 And that's really kind of ugly syntax.

07:47 And the new proposed syntax is to just give a bracket, like bracket t bracket after the function to say, basically, it's a templated function.

07:56 Like all the other generic statically typed languages, like C and stuff, right?

08:01 Yeah.

08:01 So, it definitely reminds me of, like, the type – yeah, the – Templates.

08:07 Templates.

08:08 Thank you.

08:08 In C++ and stuff.
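
For anyone reading along, here is a minimal sketch of the two styles being described, roughly following the PEP's examples; the function names are made up, and the second form assumes Python 3.12 or later:

    from typing import TypeVar

    _T = TypeVar("_T")

    def pick_one_old(a: _T, b: _T) -> _T:  # old style: declare the TypeVar separately
        return a

    def pick_one_new[T](a: T, b: T) -> T:  # PEP 695 style: type parameter declared inline
        return a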

08:11 So, it's definitely easier.

08:13 I still – I'm not sure.

08:15 So, it's approved.

08:16 So, we'll get this in 3.12.

08:17 It's definitely better than the old way, but it's still – I think we might be confusing people with this.

08:22 What do you think?

08:23 I think types in Python are awesome, but I think it can also go too far.

08:27 I mean, let's ask – since you all are here, let's ask, like, how many people like typing in Python?

08:33 Almost uniformly.

08:34 Yeah.

08:35 Yeah.

08:36 Okay.

08:36 But it can get over the top sometimes, I think.

08:40 One of the things, though, is cool.

08:42 One of the bottom examples in this shows combining types.

08:46 So, like, maybe a function that takes two of the same type things, maybe that's a little weird.

08:51 But it's not too weird if you think of, like, lists of things.

08:55 If I want to say it can either be a list or a set of a certain type, but only one type.

09:02 How do you say that without these generics?
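
PEP 695 answers this with generic type aliases; this is essentially the ListOrSet example from the PEP, with a made-up function added for illustration (again assuming Python 3.12 or later):

    type ListOrSet[T] = list[T] | set[T]

    def count_unique(items: ListOrSet[str]) -> int:
        # accepts a list of strings or a set of strings, but only one element type
        return len(set(items))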

09:05 Yeah.

09:06 Yeah, I know.

09:06 Yeah, I think – It is incomplete.

09:08 And so it's the question of how far are you going to push the language to get that last couple percent?

09:12 Anyway, it is looking a lot more like C, isn't it?

09:15 I'm glad I studied that, but also glad I don't have to write it these days.

09:18 Anyway.

09:19 So something to look forward to in Python 3.12 is PEP 695.

09:24 Yeah, absolutely.

09:25 While we're riffing on types, I just want to make a quick comment.

09:29 I got a message from somebody recently on this project.

09:32 It said, Michael, I discovered a bug in your code.

09:35 It doesn't run.

09:36 I'm like, oh, really?

09:37 It seemed like it ran last time I touched it.

09:38 But, okay, what's going on?

09:39 Well, you used the lowercase l list bracket type, and only capital L list works.

09:45 Like, no, the bug is you're in Python 3.9, not 3.10.

09:49 This is a new feature.

09:50 And I think – I'm joking, kind of – but with all these changes so quickly,

09:54 like, it starts to get – you've got to be on the right version of Python or this thing won't exist.

09:59 Right?

09:59 And it's going to be an error.
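
For context, the lowercase built-in generics like list[int] arrived in Python 3.9 with PEP 585; on older versions only the typing module spelling works, which is exactly the kind of version mismatch being described. A small made-up example:

    from typing import List

    def total_old(xs: List[int]) -> int:  # typing.List works back to Python 3.5
        return sum(xs)

    def total_new(xs: list[int]) -> int:
        # built-in generic, Python 3.9+ (PEP 585); on 3.8 this def raises TypeError
        # at import time unless `from __future__ import annotations` is used
        return sum(xs)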

10:00 Yeah.

10:00 It used to be, oh, the last five versions is fine.

10:03 Now it's like, oh, the last version is fine.

10:04 We'll see.

10:05 Yeah, that – I'm starting to – I'm working with some educators.

10:10 And one of the tricky things in, like, universities is that your curriculum kind of needs to be known ahead of time.

10:21 And they kind of set that.

10:22 And so with Python moving so fast, I wonder how educators are dealing with this.

10:27 If they're teaching 3.8 or 3.11.

10:30 All right, we've got some teachers in the audience saying 3.11.

10:33 Kids, they like new shiny things anyway.

10:35 Give them that.

10:35 Give them that.

10:36 All right.

10:36 All right.

10:37 What's next here, Brian?

10:38 What's my next one?

10:39 I don't know either.

10:40 No, I do.

10:40 It has to do with AI probably.

10:41 So this one comes to us from Matt Harrison, who's here at the conference, if you want to say hi.

10:46 Obviously, there's all this GPT stuff going crazy.

10:49 But one of the challenges is you can ask it a question, and it'll give you an answer.

10:54 Right?

10:54 Like, hey, please write this code for me.

10:56 And it'll go, boom.

10:57 Here's – you don't need to hire anybody.

10:59 Just take this code and trust me.

11:01 Or whatever, right?

11:01 You ask it a question.

11:02 And you can ask it a couple of questions, but it has what's called – was it a token stack or something like that?

11:08 It only has so much memory of, like, the context of what you're asking it.

11:12 And the ability to go and ask it to do one thing and then based on its response, go do another and then a third after that, it's not quite there yet.

11:19 So there's this project called AutoGPT.

11:22 So if you have an open AI API key, basically, so if you pay for open AI or somehow have access to it, then you can plug it into this thing.

11:32 And what it does is you give it a mission.

11:34 You say, dear AI thing, what I would like you to do is go search Google for this, figure out what you find, and then get the top three most popular ones.

11:45 Go find their web pages.

11:46 Take all the information out of that and summarize them for me and then make a prediction about, like, who's going to win the Super Bowl because I'm going to bet big on it.

11:53 I don't know.

11:54 So basically that's the idea.

11:56 It says it has a couple of benefits over regular ChatGPT. For example, you can't connect ChatGPT to the Internet.

12:04 I don't know if you ever played with it, but it'll say things like, I only know up to 2021.

12:07 Sorry.

12:08 This one has Internet access.

12:09 It has long-term memory storage.

12:12 It'll store in a database so you can, like, have it go on and on for a long time.

12:16 File storage, all sorts of interesting things.

12:18 So they have a video that we'll link in the show notes you can check out here.

12:22 I'm going to mute it because I don't want to hear this person talk.

12:25 But it says, fires it up, and it says, all right, we're going to get started.

12:29 And what I want you to do, your role is an AI designed to teach me about AutoGPT, the thing that is itself, right, very meta, self-referential.

12:38 Your goals, as a list in Python, are: first, search what AutoGPT is, and then find the GitHub and figure out what it actually is from its GitHub project.

12:48 And then explain what it is and save your explanation to a file called AutoGPT.txt and then stop.
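
Roughly what that prompt boils down to, written out as the role plus the Python list of goals described in the demo; the variable names here are just illustrative, not AutoGPT's actual configuration keys:

    ai_role = "an AI designed to teach me about AutoGPT"
    ai_goals = [
        "Search the web for what AutoGPT is",
        "Find the GitHub project and figure out what it actually is",
        "Explain what it is and save the explanation to a file called AutoGPT.txt",
        "Stop",
    ]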

12:55 And it will, if you run it, you will say, okay, well, now it's gone out to Google and it's done this thing and it's pulled it in.

13:03 And now it's starting to analyze it.

13:04 And why is this interesting?

13:06 This is all Python code, right?

13:07 So this thing is created in Python.

13:09 You run it with Python.

13:10 I'm sure you can extend it in different ways with Python.

13:13 But, yeah, it's pretty nuts.

13:15 You create these little things.

13:17 You put them on a mission.

13:18 And you just say, go, you know, go get me tickets for this concert or go do this other thing.

13:23 And here's the plan I want you to follow.

13:26 You just set it loose.

13:27 So, anyway, if you want to combine some Python and some automating of the large language models, there you go.

13:35 This seems like something that could definitely easily be used for evil.

13:38 No, no way.

13:39 There's no way.

13:40 Yeah, I agree.

13:43 All right.

13:44 What do you got for the last one?

13:45 I am, so we've talked about Ruff before, I think.

13:50 So there's been an announcement that Charlie Marsh now has his own company and is hiring people.

13:58 So Charlie Marsh has formed a company called Astral.

14:02 And he's made a good start.

14:05 He's starting with $4 million of investment money.

14:09 So it's not a bad.

14:10 That is not a bad deal at all.

14:12 Bad deal to start a company.

14:13 But I'm kind of excited about it, actually.

14:15 Well, one, I'm happy for him.

14:19 Obviously, well, at least I hope it's a good thing for him.

14:21 But I just think it's neat that I guess I just wanted to highlight and say congrats, Charlie, you're doing this.

14:29 So Ruff, if you're not familiar, is kind of like a Flake8 linter sort of thing.

14:36 But it's written in Rust, and it's really, really fast.

14:39 It's so fast.

14:40 I can't.

14:41 You can barely detect it's running.

14:43 But it works.

14:44 Yeah.

14:45 How many of you all have heard of Ruff?

14:46 R-U-F-F?

14:47 Pretty much everyone.

14:48 And this thing's only been out like a year.

14:50 So that's a big deal.

14:51 Yeah.

14:51 I ran it on the Python Bytes and the Talk Python code, 20,000 lines of Python.

14:57 And you're like, did it actually run?

14:59 Did I give it the wrong files?

15:00 It might not have seen anything.

15:01 It's instant.

15:02 It's crazy.

15:02 One of the things Charlie's noticed is that it's becoming very popular.

15:07 But he's also getting a lot of requests.

15:09 So it's a very active project now.

15:12 And I'm sure it's taking a lot of time.

15:14 So he's got things like new requests.

15:17 So let's do more of the extensions of Flake8, which is completely valid.

15:25 And then also, yeah.

15:28 Well, this was a good idea of taking part of the Python tool chain and rewriting it in Rust.

15:34 What other stuff could we rewrite in Rust?

15:36 And I think that's where they're headed is making more Python things more Ruff-like or Rustifying them.

15:45 So I'm excited for it and to see what they come up with.

15:49 And he's promising that a lot of this stuff is going to be open source available to everybody.

15:54 Awesome.

15:55 Congratulations, Charlie.

15:57 That's awesome.

15:58 I would say, you know, when I got into Python nine, ten years ago, there seemed to be this really strong resistance to anything corporate.

16:06 Anytime people were trying to bring money in.

16:08 It seemed really suspicious.

16:09 Like, what is your motive here?

16:11 Are you trying to corrupt our open source environment?

16:14 And I think since then, we've kind of found a way where there can be commercial interests that don't undermine the community, but also come in and benefit.

16:24 I mean, we saw Samuel Colvin with Pydantic.

16:28 We're seeing this now.

16:30 And, you know, a lot of them seem to fall.

16:31 Textualize.

16:32 Textualize.

16:32 Absolutely.

16:33 Will McGugan.

16:35 Out with Rich.

16:36 Sorry, Will.

16:36 And a lot of them seem to fall under this what's called open core business model where, like, the essence of what they're doing, they give away for free.

16:44 Like Rich.

16:45 Like Pydantic.

16:47 But then, on top of that, there's something that is highly polished and commercial, and that's where they're kind of working.

16:54 And I, personally, am just really happy for these folks that this is happening.

16:58 I think it creates more opportunity.

16:59 It creates more opportunity for people in Python.

17:01 People who worked on these projects for so long get a lot of, it kind of pays off eventually, right?

17:06 Like, the PayPal donate button.

17:08 There's no way that that's a job; that's like a "covered my dinner once a month" sort of thing.

17:13 I also get that there's a lot of people that can't do this.

17:15 I mean, there's a lot of people that are happy with their normal job.

17:21 But they're doing something cool on the side.

17:23 We still need to figure out how to compensate those people better.

17:26 Yeah.

17:27 So we'll figure that out.

17:28 One of the things I wanted to bring up is I was talking about this announcement with somebody just yesterday.

17:32 And they said, oh, Ruff, that's kind of like black, right?

17:36 I'm like, wait.

17:37 I don't think it's quite, that's quite right.

17:39 I think of it more like Flake8, but I was curious about the overlap.

17:44 So I went up and looked in the FAQ, and the top question is, is Ruff compatible with black?

17:51 So yes, it says Ruff is compatible with black out of the box, as long as line length setting is consistent between the two,

17:59 because black has a weird line length thing.

18:02 I've had no problem with running them together.

18:05 And I was like, also, should I run them together?

18:08 And right in here, Ruff is, it says Ruff is designed to be used alongside black.

18:13 And as such, we'll defer implementing stylistic lint rules that are obviated by auto formatting.

18:22 So what does that mean?

18:24 It means that there's no point, if they're assuming that you're running black.

18:28 So if running black will do something, there's no point in Ruff checking it,

18:33 because they know that you've already done it.

18:35 Or something.

18:36 They're going to, you know.

18:37 Yeah.

18:37 Don't let them fight.

18:38 Wrap this line.

18:40 Unwrap that line.

18:40 Wrap that line.

18:41 Unwrap that line.

18:41 Well, that.

18:42 And also, like, that's not their highest priority of fixing, of checking for lint errors that black would have changed anyway.

18:49 So.
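
A made-up snippet showing that division of labor: Ruff still reports genuine lint problems, while pure formatting issues are left for Black to clean up when you run both (with matching line-length settings):

    import os  # Ruff flags this: F401, `os` imported but unused


    def greet(name):
        message="hello, "+name  # sloppy spacing: Black reformats this, so Ruff defers such stylistic rules
        return message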

18:49 Yeah.

18:49 Indeed.

18:50 All right.

18:50 Well, congrats.

18:51 That's very cool.

18:52 I think that might be it for our items, huh?

18:55 What do you think?

18:55 Oh, yeah.

18:56 For our main items.

18:57 Our main items.

18:57 You got some extras?

18:58 I do have one extra.

18:59 The one extra is, like, Feickert?

19:03 What's Matthew?

19:04 Matthew Feickert.

19:05 Okay.

19:05 Yes.

19:06 Wanted us to bring up, which, sorry, Matthew, for me forgetting your name right away.

19:10 Former Python Bytes co-host, guest, attendee.

19:13 Yeah.

19:14 So, I wanted to announce that the tickets are available.

19:18 It's now open.

19:19 You can buy tickets to SciPy 2023.

19:21 And SciPy 2023 is in Austin, Texas, on July 10th through the 16th.

19:28 So, that's open.

19:29 If anybody wants to go, it should be fun.

19:31 Yeah.

19:32 Anyone going to Austin to go to SciPy?

19:34 I know you've all used up your conference going.

19:37 There's a maybe.

19:38 Some maybes out there.

19:39 I mean, Austin would be great to visit.

19:41 SciPy will give you a different flavor of Python.

19:43 I think it'd be great.

19:44 But I can't make it.

19:46 I'm coming home from vacation on the 10th or something like that, which makes it a little

19:51 tight to get all the way to Austin.

19:52 Yeah.

19:52 All right.

19:53 Do you have any extras?

19:54 I have one extra, nothing major, kind of a follow-up here.

20:00 The mobile app, I talked about that.

20:04 The mobile app is officially out for Talk Python courses.

20:07 And I would like people to try it out.

20:08 If they find a bug, just shoot me an email rather than write a one-star review.

20:13 And trash it.

20:14 Because we're working really hard to get.

20:16 It's been two and a half months I've been working on it.

20:19 It's completely redone from scratch.

20:22 It's very nice.

20:24 But it needs a little testing across all the zillions of devices.

20:28 Android is out.

20:29 Do you notice, Brian, I did not say the Apple version is out, did I?

20:33 No.

20:33 Oh, no.

20:34 No, no, no.

20:34 Because when you submit something to Apple, what they tell you is, rejected.

20:37 Rejected.

20:38 Your app does not do X, Y, and Z.

20:40 And Android is like, yeah, sure.

20:42 That's good.

20:42 So we're now adding in-app purchasing because without it, you can't have your app.

20:49 So I'm going to work on that for the next week.

20:51 And then we'll have an Apple version y'all can test.

20:54 And it will be out, but it's just not out yet.

20:56 What are you going to sell for in-app purchases?

20:58 Courses.

20:59 I actually wrote some of them.

21:00 You know, I might even sell one of yours.

21:02 Yeah, the Pi test course.

21:03 Yes, exactly.

21:03 Nice.

21:06 Awesome.

21:07 Anyway, that's my extra.

21:08 What's Android, by the way?

21:09 Yeah, it's.

21:10 No, just kidding.

21:12 Let's not go there.

21:14 This one, I'm going to take a chance.

21:18 I'm going to take a risk here and turn my screen around.

21:21 Okay.

21:21 For everyone, because this joke is very visual.

21:25 You'll be able to see it over there.

21:27 And you can see mine.

21:28 But you know it already.

21:30 This is what it's like releasing to production.

21:32 We've got the senior dev and we've got the junior dev.

21:35 Here we go.

21:36 Here we go.

21:40 What is this, Mr. Bean?

21:43 Yeah.

21:44 Mr. Bean.

21:44 It's just people are rocking all over.

21:49 The junior dev is hanging on for life.

21:51 There's like a molten lava here in a second.

21:53 That's the database.

21:54 Some of the developers are thrown into the lava.

21:57 Scrum master.

21:57 There you go.

21:58 The scrum master was thrown into the lava, which is the database.

22:01 Anyway, what do you all think?

22:04 You ever felt that way?

22:05 No, I'd definitely throw the scrum master into the lava.

22:09 Yeah, definitely.

22:10 Definitely.

22:10 But anyway, that's what I brought for our joke.

22:13 Nice.

22:13 I like it.

22:14 And I also took you off the camera.

22:16 There you go.

22:16 That's all right.

22:17 Well, this was fun doing a live episode.

22:20 It was very fun.

22:21 And thank you all for being here.

22:22 This is really awesome.

22:23 Yeah.

22:24 Thanks to everybody, and thank you everybody online for watching and showing up.

22:27 Yeah, absolutely.

22:28 Have fun.

22:28 Bye, y'all.

22:29 Thank you.
