
Episode #18: Pushing the Limits of Lambda with Michael Hart (Part 1)

48:23
 

About Michael Hart

Michael has been fascinated with serverless, and managed services more generally, since the early days of AWS because he’s passionate about eliminating developer pain. He loves the power that serverless gives developers by reducing the number of moving parts they need to know and think about. He has written libraries like dynalite and kinesalite to help developers test by replicating AWS services locally. He enjoys pushing AWS Lambda to its limits. He wrote a continuous integration service that runs entirely on Lambda, as well as docker-lambda, which he maintains and updates regularly and which has gone on to become the underpinning of AWS SAM Local (now AWS SAM CLI).

Transcript

Jeremy: Hi, everyone. I'm Jeremy Daly, and you're listening to Serverless Chats. This week, I'm chatting with Michael Hart. Hey, Michael. Thanks for joining me.

Michael: G’day, Jeremy, mate. How’s it going? You having a good day? Is everything going alright so far?

Jeremy: I love the Australian. I love the Australian accent. You don't actually talk like that, but that was...

Michael: I don't know what you're talking about. I talk like this all the time. Yeah, I do. I do wonder. I feel like if I did speak like that all the time, people might find me charming, but I don't think they'd have a clue what I was saying.

Jeremy: Exactly. Yeah. No, I actually thought Australians spoke English until I met a bunch of Australians, and I said I don't know if that's English, but anyway, so it's awesome to have you here. You’re the VP of Research Engineering at Bustle. You're also an AWS serverless hero. Why don't you tell the listeners a little about your background, what you do, and what's going on at Bustle?

Michael: Sure. So a little bit of my background: I have started a couple of companies, co-founded a couple of companies, been CTO before, in Australia. Then I moved to New York, did a bit of consulting, and then joined Bustle as the VP of Research Engineering. So I, you know, do a bunch of interesting research things there. Bustle is a digital media company. We have a bunch of sites, mainly targeted at sort of millennial women, although we've recently been expanding that market.

Jeremy: Awesome.

Michael: And we have, just in the last year or two, I think we've sort of acquired or started about nine other sites, so yeah, growing.

Jeremy: And you guys are using serverless up and down your entire stack.

Michael: Yes, serverless across the board. Yeah. Been pretty early on that, yeah.

Jeremy: Awesome. Alright, so I have had a number of conversations with you. We were out in Seattle. We were out in New York City the other day. We've had a ton of conversations about serverless and Lambda and all these things that it can do. I would have recorded the conversations, but usually we're in a bar drinking Old Fashioneds or just being, you know, whatever, and the audio quality wouldn't be that good. So anyways, I want to talk to you about all these cool things that you do with Lambda functions, because I have talked to tons of people and I capture use cases in my newsletter every single week, and, you know, they're interesting things, but I don't think I've met anybody who has pushed Lambda to the limits like you have. And, I mean, not just like one thing, like multiple things. So I want to get into all of that stuff. But maybe we could just start by talking about, you know, in case people don't know, what a Lambda function actually is. The Lambda function itself is actually an execution environment. There's an Amazon Linux runtime underneath there, or operating system underneath there. So you know this inside and out, and it will become abundantly clear that you probably know more about this than some of the AWS engineers as we go through this. But let's start with that: what is a Lambda function? What is it made up of?

Michael: Sure. Yeah, so you're absolutely right. The environment that your function is running in is sitting on Amazon Linux. Until very recently, until the Node.js 10 runtime, that was all Amazon Linux 1, which is getting pretty old now. And then the runtimes themselves would sit on top of that, just in a directory in the operating system, and each runtime would have, you know, whether it's running on Python, then the Python binary and all the libraries, and if it’s Node, then the Node binary and other libraries. That'd sort of be the only difference between those two runtimes; the underlying operating system’s still the same. I mean, and these are launched very quickly. Now it's on Firecracker, which is Amazon’s sort of new VM-type technology that provides isolation. But essentially, you know, these isolated environments spin up very quickly, and they're running an operating system that runs a runtime that then invokes your function, which is also sitting on the file system.
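
A minimal sketch, not from the episode, of what you might run inside a handler to see this layout for yourself. The environment variables and paths used below (LAMBDA_TASK_ROOT, LAMBDA_RUNTIME_DIR, and /opt for layers) are standard in the Lambda environment, but the exact contents vary by runtime:

```python
import os
import platform

def handler(event, context):
    # Your code and the runtime both live on the local file system.
    # LAMBDA_TASK_ROOT points at your deployed package; LAMBDA_RUNTIME_DIR
    # points at the runtime's own directory. Both are set by Lambda.
    task_root = os.environ.get("LAMBDA_TASK_ROOT", "/var/task")
    runtime_dir = os.environ.get("LAMBDA_RUNTIME_DIR", "/var/runtime")

    return {
        "platform": platform.platform(),                  # the Amazon Linux kernel string
        "task_root_contents": os.listdir(task_root),      # your function's files
        "runtime_dir_contents": os.listdir(runtime_dir),  # the runtime's files
        "layers_mounted": os.path.isdir("/opt"),          # layers unpack under /opt
    }
```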

Jeremy: Alright. So let's get into a little bit more of the details, though. I mean, in terms of things that are installed and ready to go. I mean, it's more than just the runtime, right? I mean, there's other libraries, and other things...

Michael: No, you're absolutely right. So, unlike — I'm trying to think of a good example — unlike if anyone's played with Cloudflare Workers or something like that, that's just running JavaScript. There's no sort of file system that you have access to or anything like that. It's just sort of a JavaScript environment with all of the things that you would have access to if you were in the browser, for example; you know, most of, or a lot of, those sorts of APIs. In Lambda, there is a file system in there. It's a Linux operating system running a process and then including your JavaScript file or Python file and running that. So as well as your file and the runtime itself, there are, you know, a bunch of base operating system binaries sitting there that you can access as well, and Amazon’s pretty cagey about what guarantees they give you about what binaries will be there. They say, “You know, if you want to compile something native that your Lambda uses, compile it for an Amazon Linux environment,” you know, and that's pretty vague because obviously, you could have an Amazon Linux environment with a whole bunch of binaries or dynamic libraries installed, or you could have an incredibly stripped-back operating system that has nothing installed. So in that case, you'd need to sort of bring those binaries yourself into your Lambda. So a good example is if your Lambda function wanted to call out to Bash, do you just assume that Bash is there on the operating system? That's probably a pretty fair assumption. Bash is on most Linux operating systems, or certainly the larger ones. So that might be a fine assumption. But then another example might be Perl. You know, maybe you need — maybe your function does something a little bit exotic. Maybe it's doing some cool image manipulation or video manipulation that it needs to call out to a Perl script. Do you assume that Perl is in the Lambda environment, or do you bundle it yourself or include it in a layer or some...
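
This is a minimal sketch, not from the episode, of how a handler might check for such a binary before shelling out to it, falling back to a copy bundled with the function or shipped in a layer. The bin/perl path inside the deployment package is hypothetical; /opt is where Lambda unpacks layers:

```python
import os
import shutil
import subprocess

def handler(event, context):
    # Prefer whatever the base environment provides; otherwise fall back to a
    # copy bundled in the deployment package (hypothetical bin/perl) or to one
    # shipped in a layer, which Lambda unpacks under /opt.
    candidates = [
        shutil.which("perl"),
        os.path.join(os.environ.get("LAMBDA_TASK_ROOT", "/var/task"), "bin", "perl"),
        "/opt/bin/perl",
    ]
    perl = next((p for p in candidates if p and os.access(p, os.X_OK)), None)
    if perl is None:
        raise RuntimeError("perl not found: bundle it with the function or ship it in a layer")

    # Shell out to the external binary, the same way you might call out to Bash.
    result = subprocess.run(
        [perl, "-e", "print 'hello from perl'"],
        capture_output=True, text=True, check=True,
    )
    return {"stdout": result.stdout}
```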
