Brought to you by Michael and Brian - take a Talk Python course or get Brian's pytest book


Transcript #282: Don't Embarrass Me in Front of The Wizards

Recorded on Tuesday, May 3, 2022.

00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.

00:05 This is episode 282, recorded May 3rd, 2022.

00:11 I'm Michael Kennedy.

00:12 And I am Brian Okken.

00:13 It's great to have you here, Brian. It's just us, just the two of us.

00:16 Yeah, just like old times.

00:18 I know, but we have our friends out in the audience, so we're not entirely alone.

00:22 It's great. So let's kick it off.

00:25 I know you have a particularly exciting announcement.

00:29 - topic to cover here. So definitely let's, let's go do it.

00:33 - Okay. So, PyScript. So this was an announcement at PyCon US by Anaconda's CEO, Peter Wang, during a keynote. I wasn't there, but like everybody was tweeting about it, so it almost felt like I was there. But I haven't seen the presentation, so I can't wait till that goes online.

00:55 >> I know. I have not seen the videos for the presentations at PyCon out yet.

01:02 Are they out yet? I just missed it.

01:03 >> I haven't looked.

01:04 >> Is my YouTube broken?

01:05 [LAUGHTER]

01:06 >> It should be full of this stuff.

01:08 >> What's up with, is it supposed to be next day or something? I don't know.

01:11 >> I know.

01:12 >> Anyway.

01:13 >> I would have loved to live stream it, but I didn't see an option.

01:16 Anyway, I'm looking forward to watching this one in particular when it comes out because this is big news.

01:20 >> PyScript is Python in the browser.

01:23 So what does that mean? It is built on top of Pyodide, which is a port of CPython based on WebAssembly.

01:29 I'm pretty sure we've covered Pyodide before.

01:32 But so this is a pretty neat thing.

01:34 And one of the things, so pyscript.net, you go to it, and it's kind of, actually, it's like hype, and it sounds neat, and you can do Python in the browser.

01:45 Neat, with the py-script tags.

01:47 But what does that mean?

01:48 So if you go down to the bottom, there's a GitHub repo that you can go look at.

01:53 This is what I suggest.

01:55 And this will talk about--

01:57 there's a getting started guide.

01:59 But what I did is just followed this.

02:02 I cloned the repo.

02:03 And then I went in and did the--

02:05 into the JavaScript area.

02:07 And then did npm install.

02:09 And then did the npm run dev thing.

02:12 So this only took me like five minutes to get this far.

02:15 And what you have is you've got--

02:19 One of the things that it has is an examples folder, and you can just open this up now in your local browser, on localhost.

02:27 And there's all these cool demos.

02:28 Like there's a REPL where you can just do, it's kind of like a Jupyter where you can say like X equals three, let's do this.

02:36 And then X, and then if I do shift enter, it evaluates it.

02:40 How neat is that?

02:41 That's pretty neat.

02:42 - That's awesome, yeah.

02:44 - There's a to-do app here.

02:45 So make sure you listen to our podcast, buy Python Testing with pytest. We'll check that because we know you already bought that. So, and then there's an example with D3 graphics. This is neat. I don't think I've ever done this.

02:59 There's an Altair example. And this is pretty fun because you click around and it changes the above.

03:04 It's like an interactive thing. This is fun. We use Altair with a project at work. So this is neat.

03:11 The Mandelbrot set. So there's some code. So all of this code is in the repo. So you can look at the examples and see exactly how the code is done. There's an HTML file and a Python file for all of these. So you can check it out. Actually, I don't know about the Python thing. It's HTML with the Python code embedded within the HTML. So there isn't a separate file. But you can do imports and all this sort of stuff too. Oh, I went too far. But I wanted to bring up, there's also an article that we're going to link to in the show notes that is called PyScript: Unleash the Power of Python in Your Browser. This is by Eryk Lewinson, and it runs through, it's a pretty interesting little quick read of what it is if you're not familiar with WebAssembly and Pyodide. So it's nice. What do you think, Michael?

04:01 So excited. I am very excited. You know, there's been progress on the WebAssembly plus Python side on several occasions that give you a sense of what's possible, but they didn't give you a thing to build with.

04:15 You know what I mean?

04:16 - Yeah.

04:17 - So for example, Pyodide is awesome, but it's kind of like, well, if I want to sort of host a Jupyter kernel in my browser, like I can kind of do that, right?

04:26 The WebAssembly Python itself is great, but it doesn't specify a way to have a UI of your webpage interact with Python.

04:34 It's just, oh, you could execute Python over here.

04:37 Well, like, and then what?

04:38 You know what I mean?

04:39 Which is still good, but there's not something where, Like I can have a button on there that like wires up to this thing in Python and I can have this list that binds in that way and so on.

04:50 And this looks like we might be there.

04:53 Like one of the things they talk about on the page is not just running Python in the browser and the Python ecosystem as you pointed out, but really importantly, two more things, Python with JavaScript, bidirectional communication between Python and JavaScript objects.

05:08 So you can wire into like events on the page and other DOM type of things.

05:13 - Yes.

05:14 - And then visual application development ties in with that: use readily available curated UI components such as buttons, containers, text boxes, and more.

05:23 Oh yeah.

05:24 - Yeah, I mean like these are just a little quick examples but I'd love to see some bigger examples of things like that.

05:30 Like being able to connect JavaScript interaction with stuff on the Python side.

05:38 That'll be neat.

05:38 >> Yeah, it's weird to see Python written just straight in the browser.

05:42 You know?

05:43 >> Yeah. >> Like here you have like angle bracket py-script and just import antigravity, antigravity.fly.

05:49 And like wait, what?

05:51 >> Well, so this is a good example.

05:53 I picked this example for one because it does do an import.

05:57 So there's like a path thing so you can set up.

06:00 So you can put code, all your code doesn't have to be in HTML.

06:05 It can be in a Python file.

06:07 so you can debug it there, which that's where you want to debug it.

06:10 And then you can import and call it within Python.

06:12 This is probably more where I would use it, is putting most of my code somewhere else.

06:18 >> Yeah, that's what I want to see.

06:20 I would want to see just Python files and just effectively a script tag for it.

06:25 I mean, maybe you can't do it directly as a script tag, but you could do bracket, PyScript and then just import and run, right?

06:33 >> Yeah.

06:33 >> From entry point basically.

06:34 >> I haven't looked at this before.

06:35 So the antigravity.py that it's bringing in is pulling in some Pyodide stuff to be able to make it work.

06:44 - I'm seeing some from, this is Python code, from document, or sorry, from JS import document.

06:51 - Yeah. - And set interval.

06:53 Those are the things you do there.

06:56 Let's see, are there any callbacks?

06:58 I don't see any callbacks there.

07:00 Oh yeah, yeah, this set interval has a callback, self.move when the JavaScript interval fires.

07:05 So under fly, that is hooking into a timer there.

07:09 >> Yeah.

07:10 >> Timer callback.
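
For anyone following along without the video, here's a minimal sketch of the pattern being described, assuming it runs inside a py-script tag where Pyodide exposes the browser's js module; the element id, the movement logic, and the exact import path for create_proxy (it differs between Pyodide versions) are all assumptions, not the demo's actual code:

    # Wiring a JavaScript timer to a Python callback, roughly the way the
    # antigravity demo does it. Assumes a PyScript/Pyodide environment; the
    # element id "character" and the movement logic are hypothetical.
    from js import document, setInterval      # browser globals exposed by Pyodide
    from pyodide import create_proxy          # import path may differ by Pyodide version

    class Flyer:
        def __init__(self, element_id):
            self.el = document.getElementById(element_id)
            self.x = 0

        def move(self):
            # Called from JavaScript every time the interval fires.
            self.x += 5
            self.el.style.left = f"{self.x}px"

    flyer = Flyer("character")
    setInterval(create_proxy(flyer.move), 100)  # JS timer -> Python callback every 100 ms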

07:10 >> So we should check that out. So where's that?

07:13 So I should have done this ahead of time.

07:17 The anti-gravity is not linked to, but I'll just bring it up.

07:21 >> Wow.

07:25 >> Oh my gosh, this is so amazing.

07:27 People have to do this.

07:28 >> This is cool.

07:30 >> We all know import anti-gravity, and we've got to know the XKCD that comes up.

07:35 But yes, it's great.

07:37 It's alive.

07:38 It's not just as the person who says, how are you flying?

07:41 The person says, I'm playing with Python.

07:42 Like that thing is alive and cruising around.

07:45 I love it.

07:45 Yeah.

07:45 And that's based on the callback, right?

07:47 That's, that's calling Python based on the set interval, a timer callback in JavaScript.

07:51 Yep.

07:51 Yeah.

07:52 And to me, that has been the missing piece.

07:55 Like how do I wire up?

07:56 It's like great if I can just execute Python and have, you know, like a number come out.

08:00 But what I want is Vue in Python or React.

08:03 I want to build the UI in Python and just not deal with JavaScript and be able to do so many more things on the front end.

08:11 I mean, this opens up stuff like progressive web apps, which could be really amazing for the Python space, right?

08:19 Like, I'm here in Vivaldi. If I go to my email client, just in the browser, I can right-click and install.

08:25 It gets its own app that works offline. It like pulls its data down into local DB or whatever.

08:30 Theoretically, you could do this, right?

08:32 You could pull down the CPython WASM.

08:35 You could pull down the 5K PyScript file, and then just somehow use JavaScript-to-Python to talk to local DBs.

08:43 I mean, what if we get like ORMs in Python going, "Oh yeah, we have one of our backends is the web browser local DB." >> Yeah.

08:51 >> Or something. I mean, this is great.

08:53 I would love, I'm very excited for where this might go.

08:56 >> Sky's the limit, right?

08:57 That's what that little flying character is saying at least.

09:01 >> Yeah.

09:02 >> Okay. So well, good job, Anaconda folks.

09:05 I believe this was Fabio and crew.

09:08 So really, really nice. I was super psyched.

09:10 How am I going to follow that one up, Brian? Come on.

09:13 It's just, I'll give it a try.

09:16 No, I've got some good items.

09:17 They're just not flying around amazing Python in the browser, amazing.

09:21 So Bloomberg has a lot of Python going on, and Bloomberg actually has a pretty cool like tech engineering blog where they talk about some of the stuff going on at Bloomberg, right?

09:33 Yeah.

09:33 One of the really good articles I read from them was about how to really set up and run uWSGI in production, and it was like this huge, long, deep list of like, here's a bunch of flags you probably never thought about, and here's why you should care about them in Python.

09:48 Really good stuff.

09:49 So they're back with another thing that they use that is cool, called Memray.

09:55 Like memory, but Memray. It is a memory profiler for Python.

10:00 So if you want to understand the performance of your application, especially around memory, here's a pretty neat tool.

10:09 Now let me just get this right out of the way before I forget, Linux only.

10:13 So if you're not using Linux, just close your ears.

10:15 No, just kidding.

10:16 Like you could all, if you're on Windows, you could just run your Python app under WSL and then profile it and then go back to running on Windows.

10:24 or if you're on Mac, just do a VM or something, right?

10:27 Anyway, it only runs on Linux, but because Python is so similar across the platforms, I'm sure you could just test your code there, even if that's not the main use case.

10:35 All right, so you get all these different visualizations of memory usage.

10:39 It can track allocations for Python code in native extension modules, like NumPy or something like that, and even within CPython itself.

10:48 So you get sort of a holistic view of the memory, which is pretty awesome.

10:52 - Yeah.

10:53 And it'll give you different memory reports.

10:55 We'll talk about them a little bit.

10:57 And you can use it as a CLI tool, just like kind of like time it or whatever.

11:01 You can just say, memray run my app.

11:04 And then when your app exits, it's like, and here's what happened.

11:07 One of the things that's super challenging about complicated applications and web apps and stuff is you wanna focus on a particular scenario and there's so much overhead of like startup and other things.

11:19 So for example, if I just want to profile a FastAPI API call, if I just run it up and then I go hit that API, all of the infrastructure starting up, Uvicorn and FastAPI and Python, it just dwarfs whatever that little thing is usually.

11:36 So there's also a programmable API that says, you know, you could create like a context manager, like, I don't know if it actually is that way, but you could certainly build it if it doesn't exist, like with the memory profiler here, and just do a little block of code and then get an answer, which I think is pretty neat.
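
For what it's worth, Memray does ship a programmatic Tracker that works as a context manager; here's a minimal sketch, where the workload function is made up and stands in for something like a single API handler:

    # Capture allocations for just one block of code instead of the whole app.
    import memray

    def expensive_call():
        # Hypothetical allocation-heavy work.
        return [bytes(1024) for _ in range(10_000)]

    with memray.Tracker("api_call.bin"):  # only allocations in this block are recorded
        expensive_call()

    # Afterwards, generate a report from the capture file, for example:
    #   memray flamegraph api_call.bin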

11:51 Alvaro asks if it accepts an entry point.

11:53 I suspect you could call an entry point because you just do the run on the command prompt.

12:00 So you could probably pass it over.

12:02 - Whatever you run, yeah.

12:03 - Yeah, exactly.

12:04 But the problem is there's still like the startup of just CPython itself, right?

12:08 Like I always find just the imports and all that is just way more overhead than, you know, it clutters it up.

12:15 Anyway, let's hit some notable features of Memray.

12:18 It traces every function call as opposed to sampling it.

12:22 So instead of just going every millisecond, what are you doing now?

12:26 What are you doing now?

12:26 Let's just record that, right?

12:28 It actually exactly traces so you don't miss any functions being called, even if they're brief.

12:33 It handles native calls in C++ libraries.

12:36 So the entire stack is represented in the results, which is pretty cool.

12:40 - That's pretty neat.

12:41 - Yeah, that's pretty dope.

12:42 Apparently it's blazing fast.

12:44 There's some kind of character, I think it's a race car there.

12:47 It causes minimal slowdown in the app.

12:49 If you're doing Python tracing, if you do the native code stuff, it's a little bit slower.

12:53 It says, but that's optional.

12:54 You get a bunch of reports.

12:55 We'll see those in a minute.

12:56 It works on Python threads.

12:58 So you can see, I know not all of these people are watching, but check out the webpage.

13:03 There's a little thread, like a sewing thread emoji.

13:05 >> Or a Twitter thread.

13:07 >> Yeah, indeed.

13:09 So it also works on native threads like C++ threads and native extensions, which it represents as an alien plus the thread icon.

13:18 I love it.

13:18 >> Alien threads, yeah.

13:19 >> Yeah. Let's look over here real quick.

13:22 We'll look at just, I guess the reporting.

13:24 I mean, the running is super simple as I said, memray run your Python file with arguments, or memray run -m module with arguments.

13:32 These are the places you could put your entry point and so on.
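
For reference, the invocations being described look roughly like this, with a tiny made-up script as the profiling target (the commands are run from a Linux shell):

    # my_script.py -- a hypothetical workload to profile with Memray.
    # Typical invocations, shown as comments:
    #   memray run my_script.py         # capture a whole run to a results file
    #   memray run -m my_package        # or profile a module entry point
    #   memray run --live my_script.py  # watch allocations update live in the terminal
    big = [list(range(1_000)) for _ in range(1_000)]  # allocate something worth seeing
    print(len(big))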

13:36 Dean in the audience says, "We've had a Rich spotting." I haven't pulled that up yet, but very nice.

13:42 So there's different ways in which you can view it.

13:44 And the first one that I ran across is pretty interesting if you're familiar with Glances, or if you want to go old school with top or one of these things you can run in just the terminal. You don't get Rich output with top, but this gives you Rich output like Glances: you can run it in a live mode where, while it's running, it'll show you what's happening with the memory.

14:07 That is so awesome.

14:09 >> That's pretty cool.

14:09 >> Yeah.

14:10 >> Yeah.

14:10 >> Yeah. So instead of just showing you a memory graph, it's like, guess what? We're running here right now with this many allocations and so on.

14:17 Yeah, that looks super neat.

14:19 >> Yeah.

14:19 >> Just give it the --live flag.

14:20 >> If you've got something interactive, you can interact with it and watch the memory change then.

14:24 >> Yeah. You can cycle through threads.

14:27 You can sort by total memory or its own memory.

14:32 That's a common thing you do in profiling, like, this and all the stuff it calls, or just this method itself.

14:38 Sort by allocations versus memory usage, all kinds of stuff.

14:42 So that's really neat.

14:43 It will track the allocations across forks, as in process, subprocess.

14:49 >> Okay. >> Why would you care?

14:50 Because multi-processing.

14:52 If you wanna track some kind of multi-processing memory workflow, it'll actually do that.

14:56 You just do --follow-fork and it'll aggregate the stats across the different processes.

15:01 Kind of insane.

15:02 Let's see if we can get down here.

15:05 You can do, they have the summary reporter, which is kind of a nice just, this is probably what you would expect.

15:11 If I can get down here somewhere, the color and the width of these bars

15:17 will show you how significant each one is.

15:19 There's a nice tree version that'll show you the biggest 10 allocations, and then a call stack in and out with trees, and how much memory is being allocated in each one of those and so on.

15:30 >> That's nice.

15:30 >> This is a nice app, right?

15:32 Nice little utility.

15:33 >> Definitely. Cool.

15:34 >> Yeah, indeed.

15:36 If you want to track down memory leaks or you're just wondering, like why is my program using so much memory?

15:43 Fire it up, let it run for a while, see what happens.

15:46 Cool.

15:47 All right, back to you, Brian.

15:49 - Well, I wanna bring up a pytest tool.

15:52 So, I've often used pytest-xdist for parallel runs.

16:01 So xdist is the one that I heard about first for running pytest in parallel.

16:08 So you've got tons of unit tests maybe, and you want to just speed them up, you can throw a -n flag or something like that at it.

16:17 And it'll just launch different processes and run pytest in parallel on a bunch of them.

16:25 So it cuts time down, but there's overhead.

16:28 And I was recommending this to somebody on Twitter, and I think it was Bruno Oliveira who suggested a couple of alternatives, and one of them was pytest-parallel, which I know I've run across, but I haven't played with it for a while, so I tried it out.

16:45 It's actually really cool.

16:47 pytest-xdist does a lot.

16:51 One of the things it does is it's not just multi-process, but it can be on actual different computers, so you can launch tests on them.

16:59 >> Oh, nice. Like grid computing almost.

17:02 >> Yeah. You can SSH into different systems and have it run in parallel.

17:06 But I don't usually need that kind of power.

17:10 The one thing it doesn't do is threads, so it's process-based, and pytest-parallel does both.

17:16 You can give it, well, I'm going to go down to the examples.

17:24 You can give it a number of workers, and that's how many processes it'll spin up, or how many CPUs.

17:33 Now, you can also give it tests per worker, and then it'll run in multi-threading mode, and you can give it auto on both of these.

17:41 And this is extremely useful, but by default this is turned off; by default, if you just say workers equals five or something, it won't do multi-threading.

17:54 And the reason is, because you need to make sure your tests are thread safe, and many are not.

18:01 So I tried it on a couple of my--

18:03 - Even if they're isolated, they might not be thread safe, right?

18:05 - Yes.

18:07 - That's another level of consideration.
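
A made-up illustration of that point: these two tests are logically independent and pass fine in separate processes (xdist's -n), but because they share module-level state they can race if pytest-parallel runs them in threads of the same process:

    # Hypothetical tests: fine serially or per-process, racy across threads.
    counter = {"value": 0}

    def bump():
        current = counter["value"]
        counter["value"] = current + 1  # read-modify-write: not atomic across threads
        return counter["value"]

    def test_bump_once():
        counter["value"] = 0
        assert bump() == 1

    def test_bump_twice():
        counter["value"] = 0
        bump()
        assert bump() == 2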

18:09 - However, there are a lot of small tests, not really, like, system tests, but a lot of unit tests are just testing a little bit of Python code.

18:19 If you've got a lot of those, and a lot of projects do, that's a big chunk of the test load.

18:23 So being able to do multi-threading is really nice.

18:26 But you know, even with just multi-processing, I tried this on a few different projects. Like I tried it on Flask, and the parallel version, using pytest-parallel, was like three times faster than the xdist version.

18:44 >> Oh, wow.

18:47 >> There was another one that Bruno mentioned, but I think these two are really solid, XDist and parallel.

18:53 If you want to speed up your test run times, I would try both on your project and just play with them and see which one's faster on.

19:02 On many of the projects I tried, parallel was at least as fast or faster than xdist.

19:07 So it's kind of nice.

19:09 - Yeah, it's cool.

19:10 This looks great.

19:11 I like it.

19:12 And having your test run faster is always good.

19:15 Do you do anything crazy?

19:16 Like do you set up your editor to auto run tests on file change or anything like that?

19:20 - Sometimes.

19:23 One of the things that--

19:24 - I've always, I've done it a few times, but it always makes me nervous.

19:26 I'm like, ah, it's unnerving to me that it just keeps running.

19:29 >> One of the things that I really like around that was added to pytest not too long ago was stepwise.

19:37 So that's not really running it all the time, but stepwise will, and this would be a handy one to run all the time.

19:44 So what stepwise does is, you can run all your tests with stepwise, and when you run it again, it'll start at the first failing test, because it assumes you're trying to fix something.

19:56 It'll start at that and then run until it finds a failure.

19:59 So if you haven't fixed that first failure, it'll just keep running that one until you've fixed it, and then it'll go to the next one.

20:06 So I do that a lot while I'm trying to debug something.

20:10 >> That's cool.

20:11 >> Hooking that up with an auto, like a watch feature, there's a bunch of ways you can watch your code to do that.

20:17 >> Yeah.

20:18 >> Yeah, it's fun.

20:19 >> Nice. Very cool.

20:21 So let's do some real-time follow-up here.

20:23 First, Alvaro is being all mischievous asking, "I wonder what would happen if I install both plugins, both XDist and Parallel?" >> I don't know if you can run them at the same time. I should try.

20:36 I have it installed on the Flask one.

20:38 I installed both of them and then tried them both, but not at the same time. I'll have to try that.

20:43 >> Fork the forks. It's going to go so fast.

20:45 Then just going back to PyScript, there's tons of excitement about PyScript.

20:49 >> Yeah.

20:49 >> JL is excited, Brandon is excited.

20:52 And David says, "I hope someday I can say, "back in my day, you couldn't just learn Python, "you had to learn JavaScript too." - Yeah, indeed, indeed.

21:00 - Let's see, so I got one more to cover that is gonna be fun as well.

21:07 And this one comes to us from former guest co-host, Michael Feickert, sorry, Matthew Feickert.

21:15 And Matthew is a great supporter of the show, sends all sorts of interesting things in to help us out.

21:21 - Man, good ideas.

21:22 And this is yet another one coming from the data science side of things, saying, you know, one of the things you have to do often in say a Jupyter notebook is go download a file off of an API or just some link or S3 bucket or whatever, and you want to process it.

21:38 And if you use requests, wow, great.

21:41 You end up making the request, verifying that it worked, reading the stream into bytes, writing the bytes to a file, picking a file name, and then using that file name to open it, and then say, now you can process it, right?
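
The manual version being described looks something like this (the URL and filename are placeholders):

    # Download-and-open by hand with requests: request, verify, read the bytes,
    # pick a file name, write the file, then open it to actually process it.
    import requests

    url = "https://example.com/data/measurements.csv"  # hypothetical dataset
    resp = requests.get(url)
    resp.raise_for_status()      # verify the request worked
    data = resp.content          # read the stream into bytes

    fname = "measurements.csv"   # pick a file name
    with open(fname, "wb") as f:
        f.write(data)            # write the bytes to disk

    with open(fname) as f:       # now you can finally process it
        print(f.readline())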

21:53 So there's this thing called Pooch, a friend to fetch your data files.

21:57 All right, Pooch, go get my files.

21:59 Like a little friendly dog that also seems to hold a snake in its mouth.

22:04 So that's pretty cool.

22:05 Anyway, who wouldn't want a dog that can wrangle snakes to go help you with your notebooks?

22:11 Anyway, the idea is you can do all of what I described with requests, you can do that in one line of code.

22:17 - Oh, wow.

22:17 - Yeah, and you get other cool features as well.

22:19 So it says, look, you can just make this one function call and it'll save it and it'll also cache your files locally.

22:26 So some of these files that data scientists especially work with are massive, right?

22:30 You know, it's like a gig.

22:31 And every time you run the notebook, you don't want it to download the gig again.

22:34 You just want it to run more quickly.

22:36 So you can set up a location for it to cache it.

22:39 You can pass in a hash of the file to say, I wanna get this file and I expect it to be this MD5 or whatever the heck the hash is that they're using so that you can be sure it doesn't change, right?

22:51 So if you're doing like reproducible data science, you say, what you do is you download this file, then you apply this algorithm, then you get this picture.

22:58 Well, if the data changes, I bet the picture changes, right?

23:00 And so you can put it like a layer of verification that it's unchanged from the last time you decided what it should be.

23:08 That's pretty cool.

23:09 You can do multiple protocols.

23:11 So not just HTTP, HTTPS, but FTP.

23:14 Oh my gosh, SFTP.

23:15 Oh yeah, and what else, basic auth.

23:18 It'll also automatically resolve DOIs, digital object identifiers, which are used in places like Figshare and Zenodo.

23:28 This is about the reproducible science.

23:30 Like here's the file and we've been assigned an immutable ID that we can always refer back to it.

23:36 So you can just say, here's the ID and it'll actually get the file, and it'll even unzip and decompress files upon download.

23:41 >> Neat.

23:41 >> Pretty neat, huh? Yeah.

23:43 >> Yeah.

23:43 >> Pretty straightforward. Let me see if I can find an example of.

23:46 >> I like the section of learning about it.

23:50 It's called training your Pooch. That's cute.

23:53 >> Nice. I love it.

23:55 Apparently, it has progress bars, post-download actions, logging, and you can get multiple files.

24:01 But the main use case is just file equals pooch.retrieve(url), done.

24:07 That seems pretty nice.

24:08 >> Yeah, that's great.

24:10 It's my data. Here it is.
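
That one-liner looks roughly like this; the URL is a placeholder, and passing a real hash for known_hash is what gives you the verification mentioned earlier:

    # Fetch a remote file, cache it locally, and get back the local path.
    import pooch

    fname = pooch.retrieve(
        url="https://example.com/data/measurements.csv",  # hypothetical dataset
        known_hash=None,  # or e.g. "md5:..." to verify the download hasn't changed
    )

    # fname is the path to the cached copy; re-running skips the download.
    with open(fname) as f:
        print(f.readline())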

24:11 >> Cool. So Pamphile Roy in the audience says, "Hey folks, funny, we're adding this to SciPy as optional to have the SciPy datasets submodule.

24:20 Scikit-image is using this as well.

24:23 I had no idea. Very cool.

24:24 Thanks for the extra background there.

24:25 >> Cool.

24:25 >> Yeah. But I think this is great.

24:27 In fact, I know it sells itself, it bills itself as being for data science.

24:32 I also like to download files sometimes and not go through five or six lines of code. I could use this.

24:37 >> Yeah. There's a lot of stuff that data science people are doing that we can use in lots of other fields.

24:44 >> Indeed. I do think that's actually one of the really interesting aspects of Python, is we have so many people from these different areas that it's not just all CS grads doing the same thing.

24:55 >> Yeah.

24:55 >> Yeah, for sure. All right.

24:58 Well, those are my items for today, Brian.

25:01 >> Nice. I don't have any extras today.

25:05 Do you have any extra stuff?

25:07 >> I do. I do have extras.

25:10 This one I'm very, very excited about.

25:13 I have a new course that I just released called Up and Running with Git, a Pragmatic UI-Based Introduction.

25:21 So I'm really excited, I just released, I haven't really even announced it yet, but I finished getting it all public and online and turned all the GitHub repos public and all that stuff right before we jumped on the call today.

25:32 And the idea is there are tons of Git courses, so why create a Git course?

25:36 Well, I feel like so many of them are just like, okay, we're just gonna work in the terminal or the command prompt, and you're just going to assume that that's the world of Git that you live in.

25:46 >> Yeah.

25:48 >> A least common denominator approach.

25:48 While that is useful, I don't think that's how most people are working.

25:52 If you're in Visual Studio Code or PyCharm, there's great hotkeys just to do the Git stuff and see the history and whatnot, and there's other tools like Sourcetree and Tower and others.

26:01 It takes this approach of, well, let's take all the modern tools that give you the best visibility and teach you Git with that.

26:09 Super fun.

26:10 >> Which GUI tools are you using then? Which ones are you showing?

26:13 >> Visual Studio Code, PyCharm, SourceTree.

26:16 >> Okay.

26:17 >> Those are the thing. I've done a lot of work.

26:19 I've tried to take some of my experience from doing some work on YouTube where I was experimenting with setup and presentations and stuff.

26:27 I think I have a really neat polished experience for this course, with lots of cool visuals and graphics and video and stuff.

26:35 Hopefully, people will really enjoy it.

26:37 Anyway, this is my extra.

26:38 I just sent this out to the world.

26:40 >> I'm pretty excited about this. Nice. Congrats.

26:43 >> Yeah. Thanks. Thanks so much.

26:44 You have no extras. Does that mean you're ready for some humor?

26:47 >> Yes, always.

26:49 >> All right. This one, I chose this, honestly, I chose it just because of the title.

26:54 Is this Robert Downey Jr. looking at somebody in some wizard situation?

27:02 >> Yeah, this is like Endgame or something.

27:05 >> Okay. Yeah, I don't know the movie.

27:07 Apparently, I stopped watching movies at some point.

27:09 Now, I'm out of touch.

27:11 Anyway, the title is, when your code stopped working during an interview, or it could be a demo presentation or whatever.

27:18 You want to tell us what this is about, what's going on here?

27:21 >> He's looking back at Banner, who's the Hulk.

27:26 He says, "Dude, you're embarrassing me in front of the wizards." Yeah, because Banner wasn't able to become the Hulk.

27:34 >> Try to. Don't embarrass me in front of the wizards.

27:37 I love to think of programmers as the modern day wizards.

27:40 We can think of things and then they come into existence.

27:44 >> Yeah.

27:45 >> It's good. Also, while working on that Git course, I had this pretty fun experience, right while I was recording it.

27:52 >> Nice.

27:53 >> I'm just sitting there and then-

27:55 >> Git was down.

27:56 >> How often does GitHub itself go down?

27:58 But no, there's like the Octocat is falling, like with a 500 sign in his hand, which of course made me redo that section of the course.

28:09 >> Yeah. I like the expression on your face for that.

28:12 >> Exactly. People seem to really like that tweet.

28:15 I'll put it in the show notes if people want to check it out.

28:17 Anyway, dude, don't embarrass me in front of the wizards.

28:20 It's what I got for you.

28:22 >> Yeah. Good.

28:23 >> Good.

28:24 >> Well, thanks a lot again. It's a great show.

28:28 >> Yeah, sure was. Thanks, Brian.

28:30 Thanks for everyone who came.
