

Podcast: In the AI of the Beholder



Ideas about what constitutes “beauty” are complex, subjective, and by no means limited to physical appearances. Elusive though it is, everyone wants more of it. That means big business and increasingly, people harnessing algorithms to create their ideal selves in the digital and, sometimes, physical worlds. In this episode, we explore the popularity of beauty filters, and sit down with someone who’s convinced his software will show you just how to nip and tuck your way to a better life.

We meet:

  • Shafee Hassan, Qoves Studio founder 
  • Lauren Rhue, Assistant Professor of Information Systems at the Robert H. Smith School of Business


This episode was reported by Tate Ryan-Mosley, and produced by Jennifer Strong, Emma Cillekens, Karen Hao and Anthony Green. We’re edited by Michael Reilly and Bobbie Johnson.



[Montage of songs about beauty]

Strong: Beauty has always been one of society’s greatest obsessions. And for as long as we’ve worshipped it… we’ve also found ways to change and enhance it. From makeup and clothes… to airbrushing photos… or a surgical nip and tuck. And now? AI.

[Montage of news coverage about beauty filters] 

[Sound from an Apple keynote featuring photo augmentation where women are made to smile more. Audience cheers]

Strong: You may not realize it…but this technology is right at your fingertips. In the beauty filters on your phone and social media. The tech has gotten so good at detecting where your eyes, nose, and jawline are, it’s easier than ever to adjust those features. With a simple swipe, you can tweak the arch of your eyebrow, or tune the curve of your lips and construct your ‘ideal image’.
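Under the hood, these filters rest on facial-landmark detection: a model locates key points like eye corners, nose tip, and jawline, and the filter warps pixels around them. As a rough illustration only, not any vendor's actual pipeline, here is the geometric core of an "enlarge eyes" effect, with made-up landmark coordinates:

```python
def enlarge_feature(landmarks, scale=1.3):
    """Scale a set of 2D feature landmarks (e.g. one eye's outline)
    outward from their centroid -- the geometric core of an
    'enlarge eyes' beauty filter."""
    cx = sum(x for x, _ in landmarks) / len(landmarks)
    cy = sum(y for _, y in landmarks) / len(landmarks)
    # Push each point away from the centroid by the scale factor.
    return [(cx + scale * (x - cx), cy + scale * (y - cy)) for x, y in landmarks]

# Toy eye outline as (x, y) pixel coordinates -- illustrative only.
eye = [(100, 50), (110, 45), (120, 50), (110, 55)]
bigger = enlarge_feature(eye, scale=1.5)
```

Production filters do the same thing with dense face meshes of hundreds of landmarks and smooth image warping, rather than by moving a handful of points.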

It’s possible there’ll be 45 billion cameras in the world by next year… along with ever more ways to use AI to parse, tag, edit and prioritize those images. Companies like Microsoft, NVIDIA and Face++… have all publicly released products meant to gauge beauty in some way. There are even AI-driven systems that promise to look at images of your face to tell you how beautiful you are—(or aren’t)—and what you can do about it.

Hassan: So we’re showing you what the algorithm is looking for. And if you so wish to change it, you can, you know, using these, these surgeries.

Strong: But can anyone, or any thing, be truly objective about beauty?

Rhue: Let’s just say I’ve never seen a culturally sensitive beauty AI.

Strong: And will this new wave of beauty enhancement leave our next generation with more insecurities than ever? 

Veronica: There’s like a way that the filters are kind of like detrimental to people’s like mental health and can be really crippling for some people because they’re comparing themselves to that. 

Strong: I’m Jennifer Strong and this episode we look at the role of machines in shaping our standards of beauty and how those standards shape us.


Veronica: When I’m going to use a face filter it’s because there are certain things that I want to look differently. So if I’m not wearing makeup or if I think I don’t necessarily look my best, the beauty filter sort of changes certain things about your appearance. 

Veronica: Hi, I’m Veronica. I am 19 years old and I’m from Minnesota.

Sophia: I’m Sophia, I’m 15. And I’m also from Minnesota.

Strong: They’re sisters… and avid users of social media. They use beauty filters to enhance how they look in photos. They’re showing my producer Tate Ryan-Mosley some of their favorites. 

Sophia: Do I look like that? No. Not one bit.

Tate: Describe what makes you look different in that picture?

Sophia: It has these massive lashes that make my eyes look beautiful. My lips triple the size and my nose tinier.

Veronica: My ideal filter. It is called the Naomi filter on Snapchat. It clears your skin and then makes your eyes huge. 

Tate: When did you start using them? Do you remember?

Veronica: Fifth grade? I dunno. It was more like funny at first. Like it was kind of like a joke, like people weren’t trying to look good when they use the filters

Sophia: I definitely was. Like 12 year old girls, like having access to something that makes you not look like you’re 12. Like that’s like the coolest thing ever. 

Strong: Filters are explosively popular. Some are funny… like the one that put puppy ears and a fake nose on your face. Others are branded, geotagged, and there’s artsy ones too. But hands down the most common kind are beauty filters, which change the appearance of someone in a photo in an effort to make them more attractive—often by reshaping and recoloring their features.

And the biggest fans are young girls.

For years now… these sisters have used filters almost every day… But they still aren’t sure how they feel about them.

Veronica: With social media in general. It’s impossible not to compare yourself to people. But I think that when people do use filters like that and they don’t disclose it. I feel like that can cause people to become more insecure or more affected by it than they would on just a regular photo, because you’re less appreciating, like, their natural beauty compared to the beauty that was like kind of formulated to make them look perfect.

Sophia: That’s not normal. That’s not a normal body. // We feel so pretty in them. And it’s like, why.. 

Veronica: There’s this somewhat of a validation when you’re meeting that standard? Even if it’s only for like a picture…

[Sound from TEDx talk: Epidemic of Beauty Sickness] 

Engeln: About 15 years ago, I was an eager young graduate student. And I spent a lot of time teaching.  

Strong: This is Renee Engeln, a professor of psychology at Northwestern University, giving a TEDx talk.

Engeln: And the more I listened to my female students, the more I picked up on something troubling. These bright, talented young women were spending alarming amounts of time thinking about, talking about, and trying to modify their physical appearance. 

Now our perceptions of beauty are complicated. They have deep evolutionary roots. From a scientific perspective, beauty is not just desirable, but also rare. 

Strong: She went on to study this problem, interviewing women on how they were affected by constantly seeing images of unrealistic beauty standards… and what she found was… unexpected.

Engeln: Women know that the women they see in these images, aren’t representative of the general population of women. They are very aware that in the real world, nobody, nobody actually looks like this…It doesn’t seem to matter. Knowing better isn’t enough. The same woman who said this, for example, this body type is unrealistic skinny and her ribs are showing and you’re kind of like, yeah, right on. She followed it up with, I feel like I want to be like that.

Strong: Engeln gave her talk in 2013…well before AI beauty filters. And these days we’re not just seeing Photoshopped models in magazines… but photos of ourselves and our friends that have been retouched by algorithms.

And… it’s fueling an entirely new industry… 

Hassan: We realized that there’s a demand for learning how to correctly edit faces. And from that we realized there’s also a demand in assessing faces to understand what makes a face attractive or to better understand what changes will make a face look better, essentially.

Strong: Shafee Hassan is the founder of Qoves Studio. It’s just one of a number of new companies using neural networks to recognize things in people’s faces that could be deemed unattractive. He’s a structural engineer by training… which he says informs his work.  

Hassan: And these flaws show up time and time again. And they’re very common in certain ethnicities and less common in others and a computer can detect that really accurately because the pixel values, the color values are very similar regardless of where you’re looking at it or what section of the face it’s from.

Strong: Researchers believe social media giants like Facebook, Instagram and TikTok all use algorithms that measure the attractiveness of a face.

Hassan: …determine or predetermine if a piece of content is going to be successful or not, and then further push that content to a greater population of users.

Strong: To date, none have confirmed this. What we do know (from reporting by The Intercept) is that TikTok asked its content moderators to suppress videos with people they deemed unattractive, poor, or to have a disability. A TikTok spokesperson said those rules were an “early, blunt attempt at preventing bullying and no longer in place.”

And this is where companies like Hassan’s come in. From his perspective, arguing about whether it’s right or wrong to promote and suppress images of people based on their looks?… is kind of beside the point. He says this system is the reality and facial features impact social status, professional prospects and income. But he thinks his company can make that process more transparent.

Hassan: So we’re showing you what the algorithm is looking for. And if you so wish to change it, you can, you know, using these, these surgeries. And that’s also something we provide as well. We provide ways, solutions, and it doesn’t even have to be cosmetic. Sleep can improve your under eye contours, which a beauty algorithm may penalize you by like 0.5 of a mark.

Strong: Uh-huh. You heard that right. Surgeries to help people embody what they think machines are looking for. His YouTube channel focuses on just that—with videos that get more than a million views. Like this one:

[Sound from YouTube Video featuring Hassan]

Hassan: Welcome to the first episode of defining beauty… Where I attempt to explore what makes a face attractive in the most objective way possible.

Strong: And they offer detailed reports about these perceived flaws.

Hassan: Ideally human eyes should be one eye width apart… here’s an article written about a 2008 experiment on specifically interpupillary distance between the eyes and how they influence attractiveness.
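The “one eye width apart” rule Hassan cites can be checked mechanically once a landmark detector has located the eye corners. A minimal sketch of that check, assuming we already have the x-coordinates of the four corners; all names and numbers here are illustrative:

```python
def eye_spacing_ratio(l_outer, l_inner, r_inner, r_outer):
    """Ratio of the gap between the eyes to the average eye width,
    given the x-coordinates of the four eye corners, left to right.
    The 'one eye width apart' rule of thumb corresponds to a ratio
    near 1.0."""
    left_width = l_inner - l_outer
    right_width = r_outer - r_inner
    gap = r_inner - l_inner
    return gap / ((left_width + right_width) / 2)

# Hypothetical corner x-coordinates in pixels.
ratio = eye_spacing_ratio(80, 110, 140, 170)
```

A scoring tool would then compare the ratio against whatever window its training data treats as “ideal” and flag the deviation.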

Strong:  He sees surgery as a bigger part of our future, especially as the importance of our online image grows.

Hassan: The whole point is we want to clear how people see surgery into being a more positive tool of social mobility, because your looks influence the way you’re treated, the amount of money you earn, how your socioeconomic status can move up or down. If you have a deformed jaw, I’m not going to tell you that you’re beautiful, just the way you are. And I think you should get correction on that because research has shown that a Jaw cervical angle deformity of like say 130 degrees or greater is very stringently rated as very unattractive by like the mass majority of lay-person raters. So, so like the, the idea of this political correct way of beauty, beauty is something that I kind of want to take on, even though it’s controversial. I feel like a lot of people do agree with what I’m saying. And that’s obviously why I have a platform.

Strong: I asked Hassan if he’s received much criticism for this work.  

Hassan: Funnily enough, the most harsh criticism I received were from my friends and family when I started off and never criticism from anywhere in the greater internet, uh, people were very curious as to the technology. It does raise some concerns about privacy, but obviously we do our best to keep everything as secure as possible. It does raise some concerns about, um, I suppose, an overarching sense of control, you know, telling people this is wrong with your face, blah, blah, blah 

Strong:  But beauty algorithms have come under severe criticism for perpetuating racism and ageism. For example, in 2016, Microsoft and NVIDIA hosted a beauty pageant with an AI judge. And out of 6,000 entries, almost all of the 44 winners were white.

Hassan: Well, one of the big issues with beauty algorithms is that they typically trend with Caucasian faces. And so they penalize, uh, faces with non-Eurocentric features very harshly because they’re not trained with that kind of feature. Now, one of the things, when we were developing our algorithm is train it with as many different faces as possible.  I’ve always believed that attractive people are a race of their own. And so their attractive features kind of transcend a Eurocentric or a Caucasian or an Afro-centric or whatever centric you want to look at. Sharp jaws, sharp cheekbones, lean, facial fat, like this isn’t a Eurocentric thing. This is just a biology thing.  

Strong:  And Hassan takes his ‘inspiration’ from the deeply dystopian 90s film Gattaca.

[Sounds from the theatrical trailer for Gattaca]

Hassan: So Gattaca is very impactful because a lot of people aren’t born the most genetically gifted. And this goes back to the idea of the celebrities at the top, the, the good-looking attractive people at the top being there just because they’re genetically gifted. I don’t entirely believe that’s how they got there. I think a lot of it has to do with a bit of help from surgery, a bit of help from diet, a bit of help from world-class trainers. These are things that they will never speak about, but it’s, it’s part of the illusion of being unreachable and being exalted from the everyday man. So Gattaca is the best epitome, the best representation of basically what our company is about.

Strong: While reporting this story… my producer Tate decided to try out his facial assessment tool. And watching what unfolds next makes me extremely uncomfortable. 

I had this experience at a trade show a few years back… and though I knew it was a gimmick… it still planted fears in my head. And now on this zoom screen?  It goes beyond scare tactics and overpriced face cream… this tool recommends needles and knives…  

Hassan: So, we’re on the website. And so far so good, we scroll down. So this is your, um, image. We can upload it. I’m not a robot… Here. Here. Right. Uh, and these are the flaws that the computer detects.

Hassan: Deepened nasolabial folds. These are these lines here, and that’s because you’re smiling… Under-eye contour depression, which is definitely here… the region just instantly sinks, and then it goes back up as it comes towards the cheekbones. So generally for attractive faces, the contour is in line, it’s flush with the eyes. So slight, slight dark circles. Puffy lower eyelid, which I do agree… this eyelid is definitely really puffy for whatever reason, but this one is not. So that’s what it’s picked up, at 0.5 or 0.58, which, which is decently strong. A nasojugal fat pad, uh, that’s this pad here, it’s very minor. And so it’s at 0.3, which is, I think, accurate. It’s not something I worry highly about. The computer thinks that you have an epicanthic fold, which is an Asian monolid as they call it… and that’s probably because your upper eyelid fat covers up a lot of your upper eyelid. So it basically sees it as the whole thing being one eyelid.

Strong: Let’s hit the pause button here for some context… however weird it is for me to describe my friend and colleague this way…. you can’t see Tate. So, with her permission… here we go: she’s tall, blond, has these big blue eyes, strong cheekbones, and a giant smile… she’s young too, as in double digits younger than I am… and as far as those genetics go? She’s the daughter of a pro athlete. 

But we’re hearing recommendations on what she can do to fix her supposed flaws… including different types of plastic surgery… and I can’t help but think how harshly this tool might judge the rest of us… especially someone who isn’t young and white.

Strong: We’re going to take a short break, but first…  Our friends over at the Financial Times have relaunched their podcast, Tech Tonic. Find out how a device like your fitbit might be the first to know you’ve got covid… or what antitrust laws mean for a smoked fish specialist… innovation editor John Thornhill takes us into emergency rooms, cruise ships and classrooms to explore how tech has reshaped our world… and what that means for us. 

All five episodes are available now wherever you get your podcasts… just search tech tonic. 

We’ll be back … right after this.


Strong: What does it mean to take already flawed standards of beauty… largely imposed upon us by ourselves… and instead? Hand this mess off to algorithms that are even more flawed, littered with bias, and that further reinforce eurocentric features as the definition of what’s beautiful…

Whether that’s an Instagram filter making eyes larger… skin smoother and jawlines sharper… Or software pointing out how your features miss the standardized mark…  

…and so we called up a researcher who investigates how technology impacts the choices we make.

Rhue: And I was looking at the facial recognition tools that were out there to try to better understand the pictures. And that’s when I realized that there were scoring algorithms for beauty.   

Strong: Lauren Rhue is a professor at the University of Maryland School of Business.

Rhue: And I thought that seems impossible. Beauty is completely in the eye of the beholder. There’s all these different cultural standards that have to do with beauty. How can you train an algorithm to determine whether or not someone is beautiful? 

Strong: This type of scoring is different from what Hassan does… but both apply the same technology.

Rhue: Well, you upload a picture and they, on a score of zero to 100, it’ll tell you how beautiful this person is. They actually, the paper that I’m writing, it’s looking at Face Plus Plus, and they divide it into a male score and a female score. So women think this person is beautiful, 85 out of a hundred, whereas men think maybe she’s 90 out of a hundred.

Strong: It’s mostly unclear which companies use beauty scoring algorithms… but for those that want to, they’re easily up for sale. For example, one of the largest players in this – Face Plus Plus, owned by Chinese tech unicorn, Megvii — Their beauty scoring feature is available as part of their face recognition system. Instagram and Facebook have denied using such algorithms. TikTok and Snapchat declined to comment… but Rhue says, just the recommendation algorithms themselves often end up gauging attractiveness… regardless of whether they’re intended to. 

Rhue: Well, if you look at what Instagram wants it’s going to be essentially models, right? You’re not going to see a lot of different types of facial features and expressions. And, and that’s going to perpetuate this idea of, of beauty because, um, because of the lack of diversity in what you see in Instagram, and what’s extremely popular on Instagram

Strong: In other words, the pictures judged to be most beautiful by users get the most likes… and that’s what gets recommended to others… 

Rhue: We’re narrowing the type of pictures that are available to everybody.

Strong: When you combine that with people pervasively applying those beauty filters to their photos… it’s led to something termed “the instagram face”… which is a particular aesthetic that’s prioritized and rewarded on social media. And it’s created a new idealized look that dominates the platform.

Rhue: I understand it’s more of an entertainment value as to why we have beauty filters, but our choice of beauty filters is definitely informed by the culture, right? Informed by what the beauty standards are. And a lot of times there are Eurocentric beauty standards, and you can see that with some of the facial recognition issues that have continued to crop up. So the fact that on zoom people with very dark skin can, literally, their skin gets lost. For Asian faces that their eyes weren’t originally seen by cameras. Right? And so at the fact that a lot of the beauty filters are there to make your eyes look larger. And part of it’s that that’s what people want. And that’s where I think the chicken and the egg comes in. Is it, how are you going to expand this, the idea of beauty away from just Eurocentric standards of beauty if we see these beauty filters that perpetuate certain characteristics as more attractive than others.

Strong: Social media is well-known to be exclusionary, as is the beauty industry. But so is AI.

Rhue: I became interested of course, to see if you could see these cultural biases in the algorithms. And of course you can. Let’s just say I’ve never seen a culturally sensitive beauty AI.

Strong: Rhue’s research found that women with lighter skin and hair were consistently rated as more attractive than women with darker skin and hair. And filters too, which use facial detection, are likely to have some racial bias built in. And the consequences go well beyond the digital world.

Rhue: I think we should be very careful when we think about choice in the digital space. I mean, there have been extensive studies that have shown the order in which you recommend something to somebody changes their actual preferences. So as we have all of these, uh, recommendation algorithms and these decision support tools that are helping us figure out what to buy or how to position ourselves in social media it’s changing what we think we want.

Strong: And she believes the applications of AI in beauty are largely being overlooked by the tech community.

Rhue: It’s just not something that we’re really talking about. And I think that speaks to the importance of diversity in this space. A lot of people say, Oh, well, beauty is just not important because we’re tech people and we’re objective. But of course, I mean, beauty is this huge industry… it has such an impact on people. And the idea that there isn’t more research is, is really interesting to me.

Strong: Next episode… we look to the future of digital payments.

Omar Farooq: We believe that there’s a path forward where money can be smarter itself. So you can actually program the coin and it can control who it goes to. So, that is not really possible in today’s centralized systems. That can only be done in a decentralized, smart money enabled system.  

Strong: This episode was reported by Tate Ryan-Mosley, and produced by me, Emma Cillekens, Karen Hao and Anthony Green. We’re edited by Michael Reilly and Bobbie Johnson.

Thanks for listening, I’m Jennifer Strong. 



How a tiny media company is helping people get vaccinated



More than 132 million people in the US have received at least one dose of a covid-19 vaccine, and as of this week, all Americans over 16 are eligible.

But while the US has vaccinated more people than any other country in the world, vulnerable people are still falling through the cracks. Those most affected include people who don’t speak English, people who aren’t internet-savvy, and shift workers who don’t have the time or computer access to book their own slots. In many places, community leaders, volunteers, and even news outlets have stepped in to help.

One of those groups is Epicenter-NYC, a media company that was founded during the pandemic to help neighbors navigate covid-19. Based in the Queens neighborhood of Jackson Heights, which was particularly hard hit by the virus, the organization publishes a newsletter on education, business, and other local news. 

S. Mitra Kalita, publisher of Epicenter-NYC

But Epicenter-NYC has gone further and actually booked more than 4,600 vaccine appointments for people in New York and beyond. People who want to get vaccinated can contact the organization—either through an intake form, a hotline, a text, or an email—for help setting up an appointment.

Throughout the vaccine rollout, the group has also been documenting and sharing what it has learned about the process with a large audience of newsletter readers. 

We spoke with S. Mitra Kalita, the publisher of Epicenter-NYC, who was previously a senior vice president at CNN Digital and is also the cofounder and CEO of URL Media, a network for news outlets covering communities of color. 

This interview has been condensed and edited for clarity.

Q: How did you start setting people up with vaccine appointments? 

A: It began with two areas of outreach. First, when I had to register my own parents for a vaccine and found the process to be pretty confusing, I immediately wondered how well elderly residents, their friends and neighbors, manage this process. I just started messaging them.

The second was when a restaurant [from our small business spotlight program] reached out and said, “Do you guys know how to get vaccines for our restaurant workers?” Because I had been navigating some of this for the elderly, I started to help the restaurant workers. There started to be a similar network effect. One of the workers at this restaurant has a boyfriend who is a taxi driver; when I helped her, she asked if I could help her boyfriend; then the boyfriend texted me with some of his friends; and it kept spreading in that way. 

Q: How is Epicenter-NYC filling gaps in vaccine distribution right now? What is your process like, and who are you helping?

A: We’ve had between 200 and 250 people reach out to volunteer. The outreach efforts range from putting up fliers, doing translations, and calling people to literally booking the appointments. 

I don’t care if you’re a Bangladeshi taxi driver in Queens and your cousin is in New Jersey. We’re going to help both of you. A woman on the Upper East Side who’s 102 years old who is homebound and needs a visit is absolutely going to get Epicenter’s help. 

What we’re doing now is continuing the route of connecting people to each other and opportunities. There’s a lot of matchmaking going on. We can sort through a list of about 7,500 to 8,000 people who said they need help, and then find places in proximity. We’ve become this wonderful marriage—a centralized operation that also embraces decentralized solutions.

Q: We know that vaccination rates lag in many communities that were hit the hardest. Why is that? What issues and barriers are people experiencing? 

A: Just before the latest Johnson & Johnson pause announcement, I said, “We’re at a point where everybody remaining is a special case.”

I think we’ve leapfrogged to vaccine hesitancy without solving for vaccine access. We don’t see a lot of hesitancy, but we do see a lot of concerns over some issues. Number one would be scheduling. We’re dealing with populations that are working two, maybe three jobs, and when they say “I have this window on Sunday at 3 p.m. until maybe 6 p.m., when my next shift starts,” they really mean that’s the only window.

Q: People have been asked to prove who they are, where they work, and where they live in order to qualify for a vaccine. This was especially true when eligibility was more limited. How did you help people face barriers around getting the documents they needed? 

A: New York State has been explicit in saying you can still get a vaccine even if you are undocumented. But that messaging doesn’t really match the on-the-ground reality. 



Police in Ogden, Utah and small cities around the US are using these surveillance technologies



One afternoon, I accompanied Heather West, the detective who’d been perusing gray pickups in the license-plate database, and Josh Terry, the analyst who’d spotted the kidnapper with the Cowboys jacket, to fly a drone over a park abutting a city-owned golf course on the edge of town. West was at the controls; Terry followed the drone’s path in the sky and maintained “situational awareness” for the crew; another detective focused on the iPad showing what the drone was seeing, as opposed to where and how it was flying. 

Of all the gadgets under the hood at the real time crime center, drones may well be the most tightly regulated, subject to safety (but not privacy) regulations and review by the Federal Aviation Administration. In Ogden, neighbor to a large Air Force base, these rules are compounded by flight restrictions covering most of the city. The police department had to obtain waivers to get its drones off the ground; it took two years to develop policies and get the necessary approvals to start making flights. 

Joshua Terry, an analyst who does much of the real time crime center’s mapping work, with a drone.


The police department purchased its drones with a mind to managing large public events or complex incidents like hostage situations. But, as Dave Weloth soon found, “the more we use our drones, the more use cases we find.” At the real time crime center, Terry, who has a master’s in geographic information technology, had given me a tour of the city with images gathered on recent drone flights, clicking through to cloud-shaped splotches, assembled from the drone’s composite photographs, that dotted the map of Ogden. 

Above 21st Street and Washington, he zoomed in on the site of a fatal crash caused by a motorcycle running a red light. A bloody sheet covered the driver’s body, legs splayed on the pavement, surrounded by a ring of fire trucks. Within minutes, the drone’s cameras had scanned the scene and created a 3D model accurate to a centimeter, replacing the complex choreography of place markers and fixed cameras on the ground that sometimes leave major intersections closed for hours after a deadly collision.

No one seemed to give much thought to the fact that quietly, people who were homeless had become the sight most frequently captured by the police department’s drone program.

When the region was hit by a powerful windstorm last September, Terry flew a drone over massive piles of downed trees and brush collected by the city. When county officials saw the resulting volumetric analysis—12,938 cubic yards—that would be submitted as part of a claim to the Federal Emergency Management Agency, they asked the police department to perform the same service for two neighboring towns. Ogden drones have also been used to pinpoint hot spots after wildland fires, locate missing persons, and fly “overwatch” for SWAT team raids.
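A volumetric figure like that 12,938 cubic yards comes from a photogrammetric surface model: the drone's overlapping photos are fused into a grid of heights above ground, and the volume is the sum of each cell's height times its area. A toy version of that sum, with the grid and units invented for illustration:

```python
def pile_volume(heightmap, cell_area):
    """Estimate the volume of a debris pile from a gridded surface
    model: each cell contributes (height above ground) * cell area.
    Drone photogrammetry tools compute essentially this sum over a
    much denser grid."""
    return sum(h * cell_area for row in heightmap for h in row)

# Toy 3x3 grid of heights in meters over 1 m^2 cells.
grid = [[0.0, 1.0, 0.0],
        [1.0, 2.0, 1.0],
        [0.0, 1.0, 0.0]]
volume = pile_volume(grid, cell_area=1.0)  # cubic meters
```

Real tools refine this with a ground-surface model subtracted first, but the accounting is the same.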

This flight was more routine. When I pulled into the parking lot, two officers from Ogden’s community policing unit looked on as West steered the craft over a dense stand of Gambel oak and then hovered over a triangular log fort on a hillside a couple of hundred yards away. Though they’d never encountered people on drone sweeps through the area, trash and makeshift structures were commonplace. Once the RTCC pinpointed the location of any encampments, the community service officers would go in on foot to get a closer look. “We get a lot of positive feedback from runners, hikers,” one officer explained. After one recent visit to a camp near a pond on 21st Street, he and the county social service workers who accompanied him found housing for two people they’d met there. When clearing camps, police also “try and connect [people] with services they need,” Weloth said. The department recently hired a full-time homeless outreach coordinator to help. “We can’t police ourselves out of this problem,” he said, comparing the department’s efforts to keep new camps from springing up to “pushing water uphill.”



NASA has flown its Ingenuity drone helicopter on Mars for the first time



The news: NASA has flown an aircraft on another planet for the first time. On Monday, April 19, Ingenuity, a 1.8-kilogram drone helicopter, took off from the surface of Mars, flew up about three meters, then swiveled and hovered for 40 seconds. The historic moment was livestreamed on YouTube, and Ingenuity captured the photo above with one of its two cameras. “We can now say that human beings have flown a rotorcraft on another planet,” said MiMi Aung, the Ingenuity Mars Helicopter project manager at NASA’s Jet Propulsion Laboratory, at a press conference. “We, together, flew at Mars, and we, together, now have our Wright brothers moment,” she added, referring to the first powered airplane flight on Earth in 1903.

In fact, Ingenuity carries a tribute to that famous flight: a postage-stamp-size piece of material from the Wright brothers’ plane tucked beneath its solar panel. (The Apollo crew also took a splinter of wood from the Wright Flyer, as it was named, to the moon in 1969.)

The details: The flight was a significant technical challenge, thanks to Mars’s bone-chilling temperatures (nights can drop down to -130 °F/-90 °C) and its incredibly thin atmosphere—just 1% the density of Earth’s. That meant Ingenuity had to be light, with rotor blades that were bigger and faster than would be needed to achieve liftoff on Earth (although the gravity on Mars, which is only about one-third of Earth’s, worked in its favor). The flight had originally been scheduled to take place on April 11 but was delayed by software issues. 
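A back-of-envelope calculation shows why the blades must be bigger and faster. Hover lift scales roughly with air density times rotor speed squared, and it must balance weight, which scales with gravity. Holding the rotor geometry fixed, the required tip speed therefore scales with the square root of gravity over density. Using the article's approximate ratios:

```python
import math

# Rough hover condition: lift ~ 0.5 * rho * v^2 * A * CL must balance
# weight m * g.  For the same rotor (A and CL fixed), the required tip
# speed scales as sqrt(g / rho).
g_ratio = 1 / 3      # Mars gravity vs Earth (~one-third, per the article)
rho_ratio = 0.01     # Mars atmospheric density vs Earth (~1%)

tip_speed_factor = math.sqrt(g_ratio / rho_ratio)
# A rotor on Mars must spin roughly 5-6x faster than the same rotor
# would need to on Earth -- hence Ingenuity's oversized, fast blades.
```

The thin atmosphere dominates: gravity cuts the required lift to a third, but the 100x drop in density more than erases that advantage.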

Why it’s significant: Beyond being a significant milestone for Mars exploration, the flight will also pave the way for engineers to think about new ways to explore other planets. Future drone helicopters could help rovers or even astronauts by scoping out locations, exploring inaccessible areas, and capturing images. Ingenuity will also help inform the design of Dragonfly, a car-size drone that NASA is planning to send to Saturn’s moon Titan in 2027. 

What’s next: In the next few weeks, Ingenuity will conduct four more flights, each lasting up to 90 seconds. Each one is designed to further push the limits of Ingenuity’s capabilities. Ingenuity is only designed to last for 30 Martian days, and is expected to stop functioning around May 4. Its final resting place will be in the Jezero Crater as NASA moves on to the main focus of its mission: getting the Perseverance rover to study Mars for evidence of life.

