The K12 Engineering Education Podcast

Engineers on Eyes

Season 3 · Episode 12

Description

What’s so great about the human eye? Can we build something just like it? When would we need to design something better? Sadhan joins the podcast again as we give our engineers’ perspectives on these questions and more.

Our closing music is “Yes And” by Steve Combs, used under a Creative Commons Attribution License.

Subscribe and leave episode reviews wherever you get your podcasts. Support Pios Labs with regular donations on Patreon, by purchasing digital teaching materials at the Pios Labs curriculum store, or by buying a copy of the reference book Engineer's Guide to Improv and Art Games by Pius Wong. You'll also be supporting educational tools and projects like Chordinates! or The Calculator Gator. Thanks to our donors and listeners for making the show possible. The K12 Engineering Education Podcast is a production of Pios Labs.

Transcript

Pius Wong  0:00 

Today we are two engineers marveling at the human eye on The K12 Engineering Education Podcast.

 

Pius Wong  0:12 

Mechanical engineer Sadhan Sathyaseelan is back to help me muse about the science and engineering related to biological systems like the eye. I'm Pius Wong, and even though my eyes are pretty bad, they got me this far. Listen to Sadhan and me chat, next.

 

Sadhan Sathyaseelan  0:30 

Hey Pius.

 

Pius Wong  0:30 

Hi, Sadhan. Thanks for joining me for today's episode.

 

Sadhan Sathyaseelan  0:34 

Absolutely. I have a question, actually, which I thought we should talk about.

 

Pius Wong  0:38 

Yeah.

 

Sadhan Sathyaseelan  0:39 

I want to talk about the human body. Okay? But not the biology of it, but the engineering of it. The way I see it, I think the most sophisticated instrument -- or you can call it a machine -- on this planet, is the human body. So obviously there's a lot of engineering principles that the human body contains. And I was hoping to see what we can come up with and organize.

 

Pius Wong  1:11 

We're just having a casual conversation about, how does engineering apply to the human body.

 

Sadhan Sathyaseelan  1:17 

Yeah.

 

Pius Wong  1:17 

Okay.

 

Sadhan Sathyaseelan  1:17 

What engineering principles can we talk about from a human body perspective? You're a biomedical engineer. I'm a mechanical engineer. And I know there's chemical engineering that goes on in there.

 

Pius Wong  1:29 

Let me tell you, yes.

 

Sadhan Sathyaseelan  1:30 

Electrical circuits that go on in there.

 

Pius Wong  1:32 

Yeah, you know? Okay. I will tell you, in my opinion, that for every technical subject that an engineer in the more traditional fields takes, there is probably a biological analog. So what I mean by that is, for example, classical engineers -- and by them, I mean, like, electrical, mechanical, civil, and chemical -- those engineers, they all have to take the class on basic circuits. You know, Ohm's Law, V = IR.

 

Sadhan Sathyaseelan  2:07 

Yeah.

 

Pius Wong  2:08 

All that stuff, and how to connect things, and resistors. Well, I would say that there is a biological application to that class, because we have neural circuits, and chemical circuits, that kind of thing. And then there's another mechanical engineering class on statics. And you can apply that to studying maybe the skeletal system when we're just sitting at rest, for example, studying how our body stays still. So that's my general opinion. For every single engineering class, there's some application to the human body. What do you want to talk about?

 

Sadhan Sathyaseelan  2:44 

Okay, yeah, that seems to be too broad. And so let's look -- Do you want to just try and narrow it down?

 

Pius Wong  2:48 

Okay.

 

Sadhan Sathyaseelan  2:49 

So I have an idea. So how about, let's talk about all the engineering products that we have come up with based on our senses, how we apply our senses. For example, if we take vision. So we sense light through our eyes. And an equivalent product that we engineered would be a camera. Right? So --

 

Pius Wong  3:20 

Cyborg design. Let's do -- Hey, no, cyborg design is totally the class that I wanted to take back in the day. So why not? That's what robotics is.

 

Sadhan Sathyaseelan  3:29 

So let's start with the first one that I thought of.

 

Pius Wong  3:32 

Cameras.

 

Sadhan Sathyaseelan  3:32 

Yeah, cameras. We're talking about vision. So how does vision work? What's the engineering principle behind it? What's the science or physics behind it?

 

Pius Wong  3:40 

Light stimulates your cells. Well, if you're talking about biology here, light goes into our eyes.

 

Sadhan Sathyaseelan  3:48 

Which is the lens.

 

Pius Wong  3:49 

Yeah, yeah. And it hits the retina, which is that special layer of cells in the back of our eyes.

 

Sadhan Sathyaseelan  3:53 

Okay, so it's inverted, too. It's like an inverted image. And it's like the screen where the image is falling.

 

Pius Wong  4:00 

Right.

 

Sadhan Sathyaseelan  4:02 

And you perceive that in your brain. I think it's like the back of your brain. I don't know what it is called.

 

Pius Wong  4:08 

The occipital part of your brain.

 

Sadhan Sathyaseelan  4:10 

Yeah, it's in the back. That's what I remember from my classes. You perceive it at the back of your brain. Okay, so that principle is like light falling on a camera, and one thing that we haven't been able to achieve in engineering is how quickly our eyes, or the lens in our eyes, changes its focal length.

 

Pius Wong  4:39 

Like it can adapt to looking far away versus looking up close.

 

Sadhan Sathyaseelan  4:44 

Or near. Yeah.

 

Pius Wong  4:45 

My eyes don't do that very well, specifically my right eye doesn't do that very well.

 

Sadhan Sathyaseelan  4:49 

Okay.

 

Pius Wong  4:49 

Yeah, like I can tell you my lens in my right eye is not changing its focal length.

 

Sadhan Sathyaseelan  4:56 

What? [laughs]

 

Pius Wong  4:56 

Like, I just took off my glasses, and my right eye vision is pretty awful. And I can tell you that if I put my hand up to my face, like, what is that? Three inches away? I can see the palm of my hand clearly three inches away, but now -- Oh, boom, gone. My lens is no longer changing.

 

Sadhan Sathyaseelan  5:12 

What? Okay?

 

Pius Wong  5:13 

Yeah, like I can't see.

 

Sadhan Sathyaseelan  5:15 

Okay, so you have short sight? Is that how you -- What is it called?

 

Pius Wong  5:19 

Myopia.

 

Sadhan Sathyaseelan  5:21 

Yeah, I don't know the technical --

 

Pius Wong  5:21 

It changes the shape of my eye so that I guess the image that goes through my pupil and through my lens, that image is not actually focused on the back of my eyes just right.

 

Sadhan Sathyaseelan  5:36 

It's a little off.

 

Pius Wong  5:37 

Yeah, my lens isn't compensating for it.

 

Sadhan Sathyaseelan  5:39 

Okay, so you can see things that are near without a problem, and it's hard for you to get sharper images when things are far away. Okay, I have the same condition. I don't know. Myopia?

 

Pius Wong  5:49 

Yeah.

 

Sadhan Sathyaseelan  5:50 

Okay, I have myopia, as well.

 

Pius Wong  5:51 

Nearsightedness.

 

Sadhan Sathyaseelan  5:52 

Okay, so let's take the example of somebody with perfect eyesight. [laughs] We are not eligible for that. So in cameras, what we usually do is, if we were trying to focus on something, we have to adjust the focus using that knob in the camera.

 

Pius Wong  6:08 

Right. It goes click, click, click, back in the day.

 

Sadhan Sathyaseelan  6:10 

Yeah, so if you want to focus on something farther away, you need to use one kind of lens. If you want to focus on something near, you call it macro and use a different lens. So there are so many lenses if you're looking at digital -- sorry, analog photography.

 

Pius Wong  6:26 

Yeah, because a lens is like a hard piece of glass or polymer, and so you can't just instantly change the shape of that lens. Our eye is different where -- well, in a perfect vision eye, the lens -- you've got this muscle connected to the lens which stretches and pulls the lens into different shapes.
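
As a physics aside here, the thin lens relation makes this trade-off explicit: for an object at distance d_o and an image formed at distance d_i behind the lens, the focal length f has to satisfy

\[ \frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i} \]

In the eye, d_i (lens to retina) is essentially fixed, so refocusing from far to near means changing f itself, which is what that muscle does by reshaping the lens. A conventional camera lens instead keeps f fixed and moves glass elements to change d_i.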

 

Sadhan Sathyaseelan  6:45 

Okay. I wonder why we haven't done that with an actual camera?

 

Pius Wong  6:51 

Yeah, well, we have a functionally equivalent version of that in our camera phones. I mean, I put my camera close up to an object, and you can see the focus change. Like the lenses.

 

Sadhan Sathyaseelan  7:03 

Yeah, so I'm saying it takes a long time, right? Compared to our eyesight.

 

Pius Wong  7:09 

Oh, like in the person with perfect vision, they can look 100 meters away and then look right up close to their book and they instantly --

 

Sadhan Sathyaseelan  7:18 

Yeah, instantly changes the focal length.

 

Pius Wong  7:20 

It's getting better, though. I would say that the phones I've seen, they're focusing pretty fast.

 

Sadhan Sathyaseelan  7:24 

Okay.

 

Pius Wong  7:25 

But you're right. There are some mathematical algorithms in there that they have to use to optimize focus. Edge detection and minimizing blurriness and all this stuff. It's really neat.
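
As a rough sketch of the "minimize blurriness" idea Pius mentions, contrast-detection autofocus scores each candidate focus setting by how much edge detail the captured frame has and keeps the sharpest one. In the Python sketch below, the camera object and its set_focus and capture methods are hypothetical stand-ins, and the sharpness score (variance of the Laplacian, via OpenCV) is just one common choice.

import cv2

def sharpness(frame_bgr):
    # Higher value means more edge detail, i.e., better focus (contrast-detection idea).
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def autofocus(camera, focus_positions):
    # Sweep candidate focus positions and keep the one giving the sharpest frame.
    best_pos, best_score = None, float("-inf")
    for pos in focus_positions:
        camera.set_focus(pos)      # hypothetical lens-control call
        frame = camera.capture()   # hypothetical frame grab returning a BGR image array
        score = sharpness(frame)
        if score > best_score:
            best_pos, best_score = pos, score
    camera.set_focus(best_pos)
    return best_pos

Real phone cameras speed this up with phase-detection pixels and smarter search strategies, but the core idea of optimizing a sharpness metric is the same.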

 

Sadhan Sathyaseelan  7:36 

Okay, so when you were talking about this, I think I started understanding why it could be. Because, for us, we spoke about how there's a muscle pulling and contracting and expanding the lens, which changes the focus. It's a different material. It's not the typical lens we use on a camera, but also we are in control, right? So we are the feedback loop. So when you see something 100 meters away, and you instantly want to look down and look at your book, the control is within you. You know exactly what you're focusing on.

 

Pius Wong  8:13 

Interesting.

 

Sadhan Sathyaseelan  8:14  

But with a camera, you need to decide what you're focusing on. And even if it's automatic, the computer doesn't know what it needs to focus on. So it needs to first capture and understand the image of what's happening before it can focus on that.

 

Pius Wong  8:27 

That's a good point, because when we make future robots -- because that apparently is going to happen -- maybe they have to design the algorithm to not just take in the visual data around them, but also have some foreknowledge of what's around them. Like, the person -- the robot, I should say, that sees better, is going to be the robot that not only takes in all the visual information around them, but takes in some internal information that they already had, knowing that there should be a tree over there already. Or they already know there's going to be a book in front of them. Because you're right, as humans, like, if I'm holding a book and I'm looking far away, I already know I have a book in my hand. So I'm not gonna -- There's no surprise. I will say, though, that if someone surprises me by shoving their hand in my face, like, there's that moment where, like, yeah, if I don't know what to focus on, then maybe --

 

Sadhan Sathyaseelan  9:18 

Interesting.

 

Pius Wong  9:19 

You can't focus. Also, people can choose to focus and unfocus their vision.

 

Sadhan Sathyaseelan  9:24 

Really?

 

Pius Wong  9:24 

Yeah, like, there's that whole -- I don't know if you've seen those Magic Eye puzzles.

 

Sadhan Sathyaseelan  9:30 

No.

 

Pius Wong  9:30 

There's like this weird visual puzzle, which used to be big when I was in elementary school, where you put your face close to the image, and you pull the image back, and you're supposed to see like a special image pop out.

 

Sadhan Sathyaseelan  9:42 

Oh, yeah, I've done that.

 

Pius Wong  9:43 

It relies on how your eyes focus and everything. But basically, like, if I'm looking at a book with my glasses on, clearly, I can choose to unfocus my eyes. And I don't know if it's that my eyes are, like, pointing in a different direction or whatever, or if it's the lens just relaxing, the muscle that's pulling my lenses, if they're just relaxing, but I mean, I know I can defocus my eyes or unfocus it. Anyway, we're going off on -- I'm going off on a tangent.

 

Sadhan Sathyaseelan  10:10 

No, no, that's exactly what I wanted to talk -- This is all pure engineering, right?

 

Pius Wong  10:14 

This is engineering inspired by biology.

 

Sadhan Sathyaseelan  10:15 

Biology. But also, I would say that the human eye is one of the most highly engineered things that we haven't been able to replicate. But that being said, we do have two visual aids that we use, that aren't cameras, that actually enhance our vision. And that would be a telescope, to see much further than what your normal eyes can, and a microscope, which lets you see much smaller things than what our eyes can.

 

Pius Wong  10:46 

Exactly. So good point. Like, the things that we engineer may not always replicate exactly what we know in our own bodies, but they can easily exceed what's possible in the human body.

 

Sadhan Sathyaseelan  10:58 

Yeah.

 

Pius Wong  10:58 

And maybe that's the irony of it all.

 

Sadhan Sathyaseelan  11:01 

Well, it's irony, but I would also say that a big telescope is dumb compared to how sophisticated my lens --

 

Pius Wong  11:10 

They are getting smarter.

 

Sadhan Sathyaseelan  11:11 

I'm going to explain what I mean.

 

Pius Wong  11:12 

It's like, all the scientists in Hawaii who look at the stars and listen to this podcast are now insulted.

 

Sadhan Sathyaseelan  11:16 

[laughs] No, I'm going to explain myself before we go. So let's take Hubble, for example. It's the first one that took the deep field image of galaxies. Those were like billions of light years away.

 

Pius Wong  11:29 

The Hubble telescope orbiting around us right now?

 

Sadhan Sathyaseelan  11:33 

Yeah, orbiting around the Earth. But the thing is, I wonder if it would be able to recognize something if you popped it right in front of it.

 

Pius Wong  11:43 

[laughs] So the equivalent of someone annoying, shoving their hand in the face of Hubble.

 

Sadhan Sathyaseelan  11:47 

Yeah. So the question is, what is the minimum focal length? Because for something like that, if you want to see that far, obviously the focal length has to be so high. Okay, so maybe this is a very drastic example.

 

Pius Wong  12:01 

It goes into the mind of whoever designed it, if they thought that was going to ever happen.

 

Sadhan Sathyaseelan  12:06 

Okay, so I'm gonna use a different example, because I should have picked a better one, obviously.

 

Pius Wong  12:11 

It's funny.

 

Sadhan Sathyaseelan  12:11 

Microscope.

 

Pius Wong  12:13 

Oh yeah.

 

Sadhan Sathyaseelan  12:14 

So the microscope -- it's awesome because you can see the tiniest cells that the human eye cannot. But take it away from that, and let's say you're using a microscope as regular sunglasses or regular glasses -- you won't be able to see anything.

 

Pius Wong  12:33 

Right, because you're focusing on the completely wrong plane.

 

Sadhan Sathyaseelan  12:36 

In that sense, I think the microscope and telescope are dumb compared to a human eye. Because they only act -- Well, maybe dumb was not the right word.

 

Pius Wong  12:49 

Yeah, I don't know. It's --

 

Sadhan Sathyaseelan  12:51 

You're right, you're right. It's not dumb exactly.

 

Pius Wong  12:54 

Because dumb implies that it had some kind of choice in how it's behaving in some ways.

 

Sadhan Sathyaseelan  13:00 

Okay, how about this. The word is not "dumb." The word is "not that sophisticated."

 

Pius Wong  13:07 

Responsive?

 

Sadhan Sathyaseelan  13:07 

It's not as sophisticated as a human eye.

 

Pius Wong  13:09 

Sure. Our eye is robust for a certain range, as well.

 

Sadhan Sathyaseelan  13:12 

Yeah.

 

Pius Wong  13:13 

And we're designing our microscopes to be robust, as well, and our telescopes.

 

Sadhan Sathyaseelan  13:17 

Okay, I take it back. I don't think the Hubble telescope is dumb. I don't think a microscope is dumb either.

 

Pius Wong  13:21 

All the NASA scientists now --

 

Sadhan Sathyaseelan  13:24 

Okay.

 

Pius Wong  13:24 

No, but you are right, though. Like, they're only as -- for lack of a better word, they're only as smart as the designers who designed them. Someone could have designed the Hubble telescope or another telescope to be adaptable to -- I mean, let's say, an asteroid, you know, floats by really close. I mean, that could be a legitimate thing that happens. And if the designer never takes that into account, or if we don't have enough money to take that into account, then yeah, it's going to be a limited device.

 

Sadhan Sathyaseelan  13:54 

So one can say that, if you take a microscope, that is the optimal design to view things in that range.

 

Pius Wong  14:03 

Right. So that's the interesting thing. There's the idea of designing products to be used in multiple scenarios. For example, the microscope that can be used to look at things at different magnifications. But every time you design something to be used for more use cases, it becomes more complicated.

 

Sadhan Sathyaseelan  14:21 

Yeah.

 

Pius Wong  14:22 

More expensive. Like the microscope that you use to look at multiple magnifications, you have to rotate the lens around, and it clicks and everything. Whereas you have the really, really simple magnifying glass. One magnification, really simple, really cheap. And I think they both have their value.

 

Sadhan Sathyaseelan  14:39 

Right.

 

Pius Wong  14:40 

It's about knowing when you need the robustness, when you need to have something that can be used in multiple situations and when you don't.

 

Sadhan Sathyaseelan  14:50 

Okay. I think I'm behind that. I definitely agree with what you're saying. So I want to touch upon one more thing about vision before we move on to the next sense. We spoke about -- We touched upon robots and their vision. Right? So this is the question I've always had. Especially when it comes to humanoid robots, we give them eyes, actual eyes that they are -- okay, not actual eyes. We give them the shape of an eye.

 

Pius Wong  15:19 

Yeah.

 

Sadhan Sathyaseelan  15:19 

Right? Now the question is, where exactly is the camera? Is it behind the --

 

Pius Wong  15:24 

[laughs] Totally. That's a great question. So, I mean, I don't know if you knew this, but I was in humanoid robotics for a while.

 

Sadhan Sathyaseelan  15:30 

Okay. So you're the perfect person to ask this.

 

Pius Wong  15:33 

So the short answer is, it totally depends on the robot.

 

Sadhan Sathyaseelan  15:36 

Yeah.

 

Pius Wong  15:37 

I mean, robots -- We don't have -- I wrote about creating a taxonomy of robots in the future. But we don't have one right now. People make robots looking like whatever. You can put an eye on the butt. Nobody will care.

 

Sadhan Sathyaseelan  15:47 

Yeah.

 

Pius Wong  15:47 

The thing is that, until we have a real taxonomy of robots where we can classify things into different groups, it's whatever. So right now, you can put the eyes on the face, and those eyes could be purely cosmetic. Or you could put real cameras in there. And there are some robots where the eyes are in there. And even when we say eyes, that doesn't even necessarily mean an optical camera. It could mean laser or infrared or something. It's some kind of sensor.

 

Sadhan Sathyaseelan  16:15 

Right.

 

Pius Wong  16:15 

Where the eyes would be. That said, there are other robots that just shove eyes on there to make people feel comfortable.

 

Sadhan Sathyaseelan  16:21 

Okay.

 

Pius Wong  16:22 

Because it's a psychological thing, to not creep out the humans around them. And then there are robots where you have, like, an iPad for a face. So the eyes are there sometimes as cartoon eyes, not to detect anything but to do some kind of social interaction, as well.

 

Sadhan Sathyaseelan  16:40 

Okay.

 

Pius Wong  16:40 

So eyes are so human. They're not necessary for a robot, not physically.

 

Sadhan Sathyaseelan  16:45 

Because what I'm imagining is like, let's say you build a robot that's only operating in a certain environment. The optimal way to do that would be to put a camera -- let's say if it's a square room, you put a camera overlooking the entire room, and --

 

Pius Wong  17:00 

Exactly. You wouldn't put it on the robot. You put it on the ceiling or something.

 

Sadhan Sathyaseelan  17:03 

Yeah. It doesn't even have to be -- That might be a better way to go about it. So that's why I was asking. Oh, you see all these cool robots, you know, with eyes and a face and a nose. And, you know, it's like, do they really place the camera behind the eyes?

 

Pius Wong  17:15 

No. Well, they might. They might.

 

Sadhan Sathyaseelan  17:16 

They might. To me, it seems like maybe that's not the optimal place to put it.

 

Pius Wong  17:19 

Well, I'll bring it back here. This isn't necessarily having to do with our five human senses. But this has to do with our emotional senses. The reason why people put faces on robots is purely for humans to not be creeped out.

 

Sadhan Sathyaseelan  17:32  

Yeah.

 

Pius Wong  17:32 

There's no other practical reason to have a cutesy face on our robot.

 

Sadhan Sathyaseelan  17:39 

I feel like the Terminator movie ruined people's perception of -- But then there came Judgment Day, part two.

 

Pius Wong  17:43 

There are studies on this.

 

Sadhan Sathyaseelan  17:44 

It's like the coolest thing.

 

Pius Wong  17:46 

I actually really liked T2.

 

Sadhan Sathyaseelan  17:48 

Yeah, it's a better film, of everything, I mean.

 

Pius Wong  17:52 

Who's that actor?

 

Sadhan Sathyaseelan  17:54 

Arnold Schwarzenegger.

 

Pius Wong  17:54 

No, no, not Schwarzenegger. The guy -- the enemy Terminator. The one with the liquid metal.

 

Sadhan Sathyaseelan  18:01 

Oh yeah, I think it's something Patrick.

 

Pius Wong  18:05 

I don't know my actors' names. In any case.

 

Sadhan Sathyaseelan  18:10 

He acted in the movie that's the most famous movie...

 

Pius Wong  18:16 

On that happy note, Sadhan and I will close out the episode there. We could have used the internet to find out that the actor in Terminator 2, who we were talking about, was Robert Patrick, by the way. You know what else you can do on the internet? You can leave us a rating and review of our show on Apple Podcasts, Stitcher, the Public Radio Exchange, or wherever you find this podcast. I appreciate it, because it helps others find out all about us. Send us a message on Twitter. You can tweet the show @K12engineering or tweet Sadhan or me directly. You can also email us, or follow the show on Facebook, SoundCloud, and everywhere else online. Find the details in the show notes or at the show website, k12engineering.net. That's k12engineering.net. More episode transcripts will be on the website shortly, thanks to awesome supporters on Patreon. If you like what you hear and want to help sponsor the show, go to patreon.com/pioslabs to donate, or find the links to Patreon from the show website.

 

Pius Wong  19:19 

Our closing music is from the song "Yes, And" by Steve Combs, used under a Creative Commons Attribution License. The K12 Engineering Education Podcast is a production of my independent studio Pios Labs in Austin, Texas, where I make software and digital media like this show. Thanks for listening, and we'll do it again soon.