

Podcast: when your face is your ticket




In part three of this latest series, Jennifer Strong and the team at MIT Technology Review jump on the court to unpack just how much things are changing.

We meet:

  • Donnie Scott, senior vice president of public security, IDEMIA
  • Michael D’Auria, vice president of business development, Second Spectrum
  • Jason Gay, sports columnist, The Wall Street Journal
  • Rachel Goodger, director of business development, Fancam
  • Rich Wang, director of analytics and fan engagement, Minnesota Vikings


This episode was reported and produced by Jennifer Strong, Anthony Green, Tate Ryan-Mosley, Emma Cillekens and Karen Hao. We’re edited by Michael Reilly and Gideon Lichfield. 


 [TR ID]

Strong: I’m in Queens, in the neighborhood near a massive stadium complex called Citi Field. It’s home to the New York Mets, but because it’s baseball’s offseason, everything is locked up right now and all you can really hear is rush hour traffic.

But if you look up, along the edge of the stadium where thousands of fans will, eventually, return, you can see some of the hardware that powers the team’s use of face recognition. These cameras are meant to detect faces that have been banned from the grounds – folks like ticket scalpers, people who’ve run onto the field, even people who’ve committed crimes out in the parking lot. That system is powered by one of the biggest names in face recognition – N-E-C. It’s able to measure features like ears – and it still works on people wearing masks, hats and sunglasses.

And then once you get over to the turnstiles – there’s another face system from a company that’s known for airport security – called Clear – and that’s for ticketless entry. Basically you can use your face as a ticket. When you get inside there’s a payments system in a concessions area – meaning you can buy a beer with your face, if you wish.

But it’s when you get to your seat that things get really interesting. Even before the pandemic, attendance at baseball games had been on the decline. In fact, this stadium has about 15-thousand fewer seats in it than the one it replaced. And so, on the one hand, stadiums are trying to make the experience as safe and hassle-free as they possibly can. But they’re also trying to learn as much as they can about who the people in the stands are – and that, too, is being done with face recognition. I’m Jennifer Strong, and in this latest episode of our mini-series, we look at how this and other tracking systems are changing the sports experience in the stands and on the court.


[Sound from Chicago White Sox at Milwaukee Brewers (Anchor): Ok we are back to playing ball. Two out. 1st inning. No score. And the batter will be Harold Baines with a 7-game hitting streak…]

[Sound from Chicago White Sox at Milwaukee Brewers: crowd cheers]

Strong: For decades, crowding around the TV or radio was the go-to way to consume sports.  Oftentimes, that meant tuning in for hours like this 1984 Major League Baseball game between the Chicago White Sox and the Milwaukee Brewers.

[Sound from Chicago White Sox at Milwaukee Brewers (Anchor): That’s deep in the center field. Going back.. It could be out of here. Manning looks up. It’s outta here! A home run for Harold Baines. The Sox win 7-6 in the longest game in American League history.]

Strong: The game lasted eight hours and six minutes, and it had to be completed over two days. But sports watching today looks pretty different. Human attention spans are measured in seconds, and they’re shrinking. Millions of people still tune in to watch, but about a third stream games on mobile devices. And of those who still watch on television, 80 percent do so while using a second device to search stats and live scores, message other fans, and watch related videos. Fans who attend games in person are now seen as high-value customers. And that’s another place where face ID comes in.

[Sound from CNBC newscast (Anchor): And if you were angered over Facebook invading your privacy, you may not want to attend a major sporting event.]

[Sound from CNBC newscast (Eric Chemi): New high tech cameras can now snap a high-rez photo of every person, in every seat, every minute of the game.]

Strong: Face data collected in stadiums by companies like Fancam is now being used to gain insights on fan demographics like age, gender and race. Panoramic cameras capture images in such fine detail that you can zoom in from a bird’s-eye view of a stadium, into the stands, onto an individual person, and still make out nuances like a smile, the writing on their shirt, even the texture of their jacket. And now you can also quickly calculate the percentage of people wearing masks – like in the case of the NFL’s Minnesota Vikings.

Wang: This is new for everybody. We’re still trying to work out exactly how we enforce these mask rules and how to monitor them and track them.

Strong: Rich Wang is their director of analytics & fan engagement. He’s on a Zoom call showcasing how they use computer vision. 

Wang: Also, if you look at this graph. The lowest point is that 87% of people who have their mask on at most of the time and in most of the game. People are you know behaving and enforcing the mask rule. So those are really positive storylines that will continue to support our case of increasing fans.

Goodger: Being able to utilize these stats to reopen venues and get fans back into the stadium. And then just as a safeguard as well, once fans are back in the stadium using some of these metrics in addition to the mask usage, also being able to utilize the information of section capacity. 

Strong: And this is Rachel Goodger, the director of business development at Fancam.

Goodger: So, obviously fans have a seat assigned to them when they go back into the stadium and fans are socially distanced. But what happens if fans start to move around the stadium, and one section becomes over capacity. You know, in real time us to able to notify staff and for them being able to see that information and say, ok well, we need to go break up this section a little bit. And then for teams being able to look back after every single game and say “wow we did a great job today.” Or “wow we really need to work more on mask usage in the lower goal or upper goal of this section” and things like that. I think it is data that is going to be very important for not only, as I mentioned, reopening these stadiums but keeping them open in the future. 

Strong: The company sells data back to the sports teams who use it to advance their marketing, affecting everything from what music is played at stadiums to what ads people see during and even after the game has ended.

Scott: You’re gonna start to see the data that you’re willing to share more broadly coupled with the technology used for identification to make things more predictable.

Strong: Donnie Scott is the senior vice president of public security at IDEMIA, which designs AI-driven identity and security solutions for all kinds of businesses.

Scott: And that would be everything from a digital driver’s license on your phone to a physical license, to a credit card, to an electronic payment mechanism.

Strong: They also make biometric technologies that recognize faces, fingerprints or eyes which can be used to verify identity in sports stadiums or other places like airports and theaters.

Scott: So, we would essentially embed the technology in their loyalty program but we’d add to it, the ability to link either their biometrics – face, fingerprint, iris in some countries that prefer it because of face coverings and other things, or their mobile device where you could authoritatively share your biometric information, or the fact that you’re a season ticket holder, with a piece of equipment at the venue. And therefore, you know, when you show up, they know, okay, Jennifer has tickets to this game. They’re valid at this date. She can pass through the gate.

Strong: Their goal? To be invisible. Identity data is captured by cameras concealed in what appears to be a normal turnstile. It’s all about creating what’s known as a frictionless experience.

Scott: So particularly around theme parks, um, but the same with stadiums and other concert venues, the technology is evolving from being a device that kind of stands out to being part of the normal flow and queue of the venue itself.

Strong: We already unlock smartphones with our eyes, fingers and faces, and that’s gotten us used to the idea of biometrics in our daily lives. Scott thinks that may be why the response to these services has been mostly positive.

Scott: You know, I’ve watched my kids grow up with  first opening an Apple device with their thumb print, then moving on that they felt they were very mistreated because they couldn’t unlock it with their face. And we’ve all become, you know, the last 15 years, 10 years, desensitized to the weirdness of it. I think most of society is focused on how it makes my life easier.

Strong: And in a world where confirming your identity is as easy as unlocking a phone, your biometric data could become more important than a passport, car keys or any other physical item we carry with us.

Scott: I think people are going to become really accustomed to the technology being there, how to use it, how to interact with it and what to expect from it because I think we’re going to see it in all walks of life. We’re going to see it when we travel. We’re going to see it when we do business with our government. We’re going to see it when we do business in grocery stores in you know sports and concert venues and music parks as well. So it’s going to become such a standard way of life that the access part will become a de facto normal. And then it’s what happens next.

Strong: And what happens next could mean more personalized experiences. 

Scott: I think that the next thing to come is going to be, to enable the fan experience. But after that, it becomes, how does the fan experience fit in your life? And, you know, that is a concept that is pretty big and broad, but one that once the first two pieces are enabled through technology and enabled through an acceptance by the user themselves are only natural things that come with an improved, mature use of a technology. You could think of an amusement park, head or character where kids could walk up to their favorite character and be recognized for who they are and have a custom experience specific to them. 

Strong: Which is likely to happen at scale. 

Scott: You could see a future where as you arrived to the airport or as you arrive to the sporting event, and it directs you to your parking based on recognizing your car or on sharing who you are from your phone with the airport operator or the airline or the TSA themselves.  You would have an, you know, a known time to gate, right. Which is the ideal state where it says I’ve got a five o’clock flight today based on the wait times that are predicted and where we are. I know that it’s going to take me 12 minutes to get from the front of the airport  through the checkpoint to the gate. And you’re going to have directions along the way, the same experience is going to happen for  sports venues and for concert venues, where from parking, you’re going to be directed through the shortest line, you know, that line’s going to move quickly because it’s biometrically enabled, and then it’s going to be able to guide you to where can I get my concessions that I want, how long do I have to, before I have to start walking, so I can be in my seat before it kicks off, I think those types of secondary benefits are going to come pretty quickly as the, as the venues get instrumented, to be able to recognize and identify folks.

D’Auria: I think there’s a huge opportunity to make the kind of sports fan experience, more engaging, more potent. And I just think where we’re at the early days of that. I’m Mike D’Auria and I’m the vice president of business development at Second Spectrum.

Strong: The company provides tracking data and analytics software for professional sports leagues like the NBA and Major League Soccer. A series of cameras, each no bigger than a standard security camera, provides unprecedented machine understanding of every game.

D’Auria: The kind of core of this technology is computer vision that runs on top of these camera feeds. And what this is intended to do is track the movement of every player and the ball 25 times a second. So you can kind of think over the course of one umm typical NBA basketball game, you’re able to capture millions of data points that didn’t exist before and use those to kind of, build a suite of products or experiences on top of that can really change the way that we see and interact with sports.
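As a rough back-of-the-envelope check on that “millions of data points” figure (the entity count and per-sample fields below are illustrative assumptions, not Second Spectrum’s actual data schema): ten players plus the ball, sampled 25 times a second over a regulation game, already produces hundreds of thousands of samples, each carrying several coordinate values.

```python
# Illustrative arithmetic only; entity count and per-sample fields are assumptions.
fps = 25                # samples per second, per the quote above
game_seconds = 48 * 60  # regulation NBA game: four 12-minute quarters
entities = 10 + 1       # ten players on the floor plus the ball

samples = fps * game_seconds * entities
values = samples * 3    # say, x and y court position plus a height value
print(samples, values)  # 792000 samples, 2376000 values
```

Overtime, bench substitutions, and richer per-sample attributes push the count higher still, which is consistent with the “millions of data points” claim.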

Strong: Those data points are rapidly analyzed with AI, which can spit out predictions, such as the likelihood a player will sink a three-pointer, while the play is still in progress. The company also uses this data to deliver a more personalized, interactive viewing experience for fans watching remotely.
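Second Spectrum’s actual shot-probability model is proprietary, but the general shape of such a predictor can be sketched as a logistic function over a few tracked features. Everything below — the function name, the feature set, and the weights — is invented for illustration, not fit to real NBA data:

```python
import math

def shot_probability(dist_to_hoop_ft, nearest_defender_ft, catch_and_shoot):
    """Toy logistic model: probability a shot goes in, from tracking features.
    Weights are made up for illustration only."""
    z = (1.2                            # baseline log-odds
         - 0.12 * dist_to_hoop_ft       # farther shots are harder
         + 0.25 * nearest_defender_ft   # open shots are easier
         + 0.3 * (1 if catch_and_shoot else 0))
    return 1 / (1 + math.exp(-z))

# A contested 25-foot three versus a wide-open catch-and-shoot one
print(round(shot_probability(25, 2, False), 2))  # ~0.21
print(round(shot_probability(25, 8, True), 2))   # ~0.62
```

A production system would learn the weights from millions of tracked shots rather than hand-setting them, and would update the estimate 25 times a second as the players move.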

D’Auria: In this last NBA finals, we ran what we call video augmentation essentially real time on top of the game. And so what you could do there is for example, take that shot probability model. And while the game is being played, you could integrate into 3D space in the video, a shot probability bubble over every offensive player’s head that updates in real time. We can diagram the play that’s being run as it’s unfolding. So if you’re trying to learn about the game a little bit, you can kind of, you know, have a bit of a tutorial or what would it feel like to have a coach sitting next to you. You know, Or if you just want to have fun or kind of game-ify this a little bit, you know, every time somebody dunks the ball, you can see a lightning strike on the backboard. And so each of those experiences might not be right for everybody, but I think we will move to a world where live sports can be really personalized to the way you want to view it.

Strong: And access to troves of data has transformed how coaches train their players. 

D’Auria: So if you kind of step back and think about the way data has traditionally been captured in sports, you would have people either sitting in the stands or watching the game on TV and kind of manually coding. That was a shot. That was a pass. That was a pick and roll action. And so from this kind of underlying tracking dataset you can apply machine learning to kind of automate that whole process.

Strong: That automation allows for all that data to be matched to game film. Coaches, general managers, and analysts can then sift through it with a software tool that functions like a search engine.  

D’Auria: And so for folks who work on an NBA team, you can ask very complicated questions or make very kind of detailed queries about the game. And with a few keystrokes, a few clicks of your mouse, you can get a very precise answer in data visualization and an automatically generated playlist of, you know, for example, if I wanted to look at, Anthony Davis, LeBron James, pick and roll from the right wing where the defense ices and Anthony Davis rolls and somebody tags him from the weak side. And so LeBron James takes a jump shot and makes it. You know, you can get the very precise set of every time that combination has happened in the course of these guys’ NBA careers in a matter of seconds, and then kind of use that for your coaching purposes. And now, uh, someone at a team level can spend their time saying, well, I have this video or this information, how can I help a coach implement that into his game plan? Or how can I help my players kind of learn something new on the court? And so it kind of shifts their workflow to teaching and implementation versus kind of, you know, data gathering and manual labor.

Strong: And he says, over the next couple of years, the roles of these machines in the game could shift from assistant coach to assistant referee—adding context and nuance to difficult calls.

D’Auria: I mean, we’ve seen this already in some other places where we work.  So we’ll kind of give the soccer example of you now have technology that will help with the goal, no goal call, right? You see this in tennis with computer systems being used to kind of judge, if a ball is  over line or, you know, inbounds or out of bounds and be able to do this with  precision that’s quite frankly, better than what a line judge could do or  a referee who might have a really difficult angle to see if like literally every millimeter of the ball went over. You’re starting to see this with the offsides line in soccer as well.  And so I think generally the first place this happens is to basically, um, you know, augment or assist a referee’s capabilities. So you can kind of think about providing a referee and additional data source or, you know, an additional validation of one of their decisions.

Strong: Because the system can already identify players from their jerseys, Second Spectrum doesn’t need to use facial mapping or recognition. But that kind of mapping is useful for analytics, and not just for capturing faces. Right now, players appear in the system as dots on a map. And as the camera systems improve, those dots could transform into full skeletons. Extra detail, like real-time elbow angle, could help with even more accurate shot predictions. Though not everyone is on board.

Gay: You know, a sport that I follow and find fascinating is bike racing and bike racing is a sport that is actually in a long conversation about. Removing technology. 

Strong: Jason Gay is a sports columnist for The Wall Street Journal.

Gay: Technology now in cycling can say, okay, if you want to win this race or catch up to this person, you have to put out X amount of effort for X amount of minutes. And you actually have this data right on an onboard computer, on a bicycle in front of you telling you exactly what to do. Now. That’s like an amazing thing. However, it’s also not terribly human, right? It seems to be somewhat clinical and it’s created what many people feel is a little bit of a dry style of racing where people are data-driven and they’re using their heads too much, as opposed to their hearts. The French have an expression of panache. They love to see races won with panache, which basically means our gut instinct. And so there’s been conversations about, well, what if we take away these computers from riders and make them, you know, use their heads and their hearts to cycle. Now there’s a safety consideration here that’s concurrent with this, right? Having that information often creates a safer experience for a rider, but it is fascinating that the tech has gotten so good in certain instances, in terms of maximizing effort or telling an athlete, what effort is required, that they’re starting to draw back from that.

Strong: And in sports embracing this tech, it’s changing how the game is played.

Gay: Here’s an example from baseball and we see quite often a manager will come to a mound and remove a pitcher from a game, even though the pitcher is pitching very, very well that day, the reason they remove them is that the data shows that this pitcher tends to break down at a certain point. It’s almost like a car tire or something. And they’re just saying, well this pitcher at this point of the game historically is going to stop performing at the high level we need him to. So we’re going to make that move. We’re removing sort of the gut of saying there oh well he’s rolling today, let’s just let them go. They’re relying on the numbers.

Strong: Data driven game strategies are also changing how teams recruit. Like in basketball, where players who can execute a three-point shot (once considered a gimmick by the NBA) are now deemed extremely valuable. 

Gay: The reason is that basketball teams by looking at their numbers discovered that a three-point shot is a more efficient shot. You’d rather take that three-point shot than certainly take a longer two point jump shot. And so you prioritize the three pointer in an offense. The most extreme example of this – the Houston Rockets, where you have a perennial MVP candidate in James Harden who oftentimes is taking three pointer after three pointer in a game, because it’s an efficient way for them to play.

[Sound of Houston Rockets at Los Angeles Clippers (Announcers): Harden, nobody near him, sets all the time and nails the three-pointer! Steps back, open three, got it! James Harden steps back puts up a three, It goes, bounces and drops through!]

Strong: Technology is also playing assistant coach in places like the locker room of The Dallas Mavericks. 

[Sound from video of Mark Cuban at Dallas Mavericks (Cuban): What will happen is when a player walks in, or anybody walks in, we’ll have facial recognition. It’ll take a picture of you and it will say ‘ok here comes Mark or here comes Dirk’]

Strong: Mark Cuban is the team’s owner.

[Sound from video of Mark Cuban at Dallas Mavericks (Cuban): And for any of the players or any of the staff, it’ll put coaches notes: here’s what you’re expected to do and tell you what’s going on. For anybody we don’t know it’s going to be ehh-ehh-ehh get the heck out.]

Strong: And it’s not just basketball. Using AI to find the most efficient pattern of play is growing across all sports. And there’s a role for face ID too. That same face-mapping that sees when you’re looking directly at your phone to unlock it could also help coaches see what players are focusing on during the game.

Gay: I mean, that’s an incredibly integral thing for say a football quarterback. If you could somehow be able to render what a football quarterback is looking at or more importantly not looking at, not seeing downfield. Well, you could see, you know, immediate utility for any quarterback, any football team. But it also applies to a point guard or, you know, somebody playing left tackle or somebody catching on a baseball team. There are numerous plays that if you’re able to sort of look at what an athlete is seeing on the court or not seeing again, which is probably the more essential thing, that would have enormous consequences. 

Strong: Next episode, we wrap up our miniseries with a look at how face mapping is transforming the shopping experience. And spoiler alert – it goes way beyond just identifying who’s in the store.

Guive Balooch: In order to really virtually be able to try on with augmented reality makeup, you need to detect where the eye is and where the eyebrow is. And, um, it has to be at a level of accuracy that when the product’s on there, it doesn’t look like it’s not exactly on your lip   and people’s lips are, can vary in shape, the color between your skin tone and your lip, can also be very different. And so you need to have an algorithm that can detect it and make sure it works.

Strong: This episode was reported and produced by me, Anthony Green, Tate Ryan-Mosley, Emma Cillekens and Karen Hao. We’re edited by Michael Reilly and Gideon Lichfield. Thanks for listening, I’m Jennifer Strong. 



Police in Ogden, Utah and small cities around the US are using these surveillance technologies




One afternoon, I accompanied Heather West, the detective who’d been perusing gray pickups in the license-plate database, and Josh Terry, the analyst who’d spotted the kidnapper with the Cowboys jacket, to fly a drone over a park abutting a city-owned golf course on the edge of town. West was at the controls; Terry followed the drone’s path in the sky and maintained “situational awareness” for the crew; another detective focused on the iPad showing what the drone was seeing, as opposed to where and how it was flying. 

Of all the gadgets under the hood at the real time crime center, drones may well be the most tightly regulated, subject to safety (but not privacy) regulations and review by the Federal Aviation Administration. In Ogden, neighbor to a large Air Force base, these rules are compounded by flight restrictions covering most of the city. The police department had to obtain waivers to get its drones off the ground; it took two years to develop policies and get the necessary approvals to start making flights. 

Joshua Terry, an analyst who does much of the real time crime center’s mapping work, with a drone.


The police department purchased its drones with a mind to managing large public events or complex incidents like hostage situations. But, as Dave Weloth soon found, “the more we use our drones, the more use cases we find.” At the real time crime center, Terry, who has a master’s in geographic information technology, had given me a tour of the city with images gathered on recent drone flights, clicking through to cloud-shaped splotches, assembled from the drone’s composite photographs, that dotted the map of Ogden. 

Above 21st Street and Washington, he zoomed in on the site of a fatal crash caused by a motorcycle running a red light. A bloody sheet covered the driver’s body, legs splayed on the pavement, surrounded by a ring of fire trucks. Within minutes, the drone’s cameras had scanned the scene and created a 3D model accurate to a centimeter, replacing the complex choreography of place markers and fixed cameras on the ground that sometimes leave major intersections closed for hours after a deadly collision.

No one seemed to give much thought to the fact that quietly, people who were homeless had become the sight most frequently captured by the police department’s drone program.

When the region was hit by a powerful windstorm last September, Terry flew a drone over massive piles of downed trees and brush collected by the city. When county officials saw the resulting volumetric analysis—12,938 cubic yards—that would be submitted as part of a claim to the Federal Emergency Management Agency, they asked the police department to perform the same service for two neighboring towns. Ogden drones have also been used to pinpoint hot spots after wildland fires, locate missing persons, and fly “overwatch” for SWAT team raids.
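That volumetric figure is a standard product of drone photogrammetry: the composite photos are turned into a digital surface model (a grid of heights), and the pile’s volume is its height above the ground surface summed over the grid cells. A minimal sketch of that final step, using a made-up toy grid rather than Ogden’s data:

```python
# Illustrative only: a tiny hand-made height grid stands in for a real
# drone-derived digital surface model.
cell_size_m = 0.10   # ground resolution: meters per cell side
ground_m = 0.0       # assume flat ground at zero elevation

# Heights (meters) of the debris pile in each grid cell
pile = [[0.0, 0.0, 0.0, 0.0],
        [0.0, 1.2, 1.5, 0.0],
        [0.0, 1.0, 1.3, 0.0],
        [0.0, 0.0, 0.0, 0.0]]

# Volume = sum over cells of (height above ground) x (cell area),
# ignoring any cells below the ground reference
volume_m3 = sum(max(h - ground_m, 0.0) for row in pile for h in row) * cell_size_m ** 2
volume_yd3 = volume_m3 / 0.764555  # cubic meters per cubic yard
print(volume_m3, volume_yd3)
```

A real survey works the same way at far larger scale: millions of cells from the reconstructed surface model, differenced against a pre-storm or interpolated ground surface, yield totals like the 12,938 cubic yards submitted to FEMA.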

This flight was more routine. When I pulled into the parking lot, two officers from Ogden’s community policing unit looked on as West steered the craft over a dense stand of Gambel oak and then hovered over a triangular log fort on a hillside a couple of hundred yards away. Though they’d never encountered people on drone sweeps through the area, trash and makeshift structures were commonplace. Once the RTCC pinpointed the location of any encampments, the community service officers would go in on foot to get a closer look. “We get a lot of positive feedback from runners, hikers,” one officer explained. After one recent visit to a camp near a pond on 21st Street, he and the county social service workers who accompanied him found housing for two people they’d met there. When clearing camps, police also “try and connect [people] with services they need,” Weloth said. The department recently hired a full-time homeless outreach coordinator to help. “We can’t police ourselves out of this problem,” he said, comparing the department’s efforts to keep new camps from springing up to “pushing water uphill.”



NASA has flown its Ingenuity drone helicopter on Mars for the first time




The news: NASA has flown an aircraft on another planet for the first time. On Monday, April 19, Ingenuity, a 1.8-kilogram drone helicopter, took off from the surface of Mars, flew up about three meters, then swiveled and hovered for 40 seconds. The historic moment was livestreamed on YouTube, and Ingenuity captured the photo above with one of its two cameras. “We can now say that human beings have flown a rotorcraft on another planet,” said MiMi Aung, the Ingenuity Mars Helicopter project manager at NASA’s Jet Propulsion Laboratory, at a press conference. “We, together, flew at Mars, and we, together, now have our Wright brothers moment,” she added, referring to the first powered airplane flight on Earth in 1903.

In fact, Ingenuity carries a tribute to that famous flight: a postage-stamp-size piece of material from the Wright brothers’ plane tucked beneath its solar panel. (The Apollo crew also took a splinter of wood from the Wright Flyer, as it was named, to the moon in 1969.)

The details: The flight was a significant technical challenge, thanks to Mars’s bone-chilling temperatures (nights can drop down to -130 °F/-90 °C) and its incredibly thin atmosphere—just 1% the density of Earth’s. That meant Ingenuity had to be light, with rotor blades that were bigger and faster than would be needed to achieve liftoff on Earth (although the gravity on Mars, which is only about one-third of Earth’s, worked in its favor). The flight had originally been scheduled to take place on April 11 but was delayed by software issues. 

Why it’s significant: Beyond being a significant milestone for Mars exploration, the flight will also pave the way for engineers to think about new ways to explore other planets. Future drone helicopters could help rovers or even astronauts by scoping out locations, exploring inaccessible areas, and capturing images. Ingenuity will also help inform the design of Dragonfly, a car-size drone that NASA is planning to send to Saturn’s moon Titan in 2027. 

What’s next: In the next few weeks, Ingenuity will conduct four more flights, each lasting up to 90 seconds. Each one is designed to further push the limits of Ingenuity’s capabilities. Ingenuity is only designed to last for 30 Martian days, and is expected to stop functioning around May 4. Its final resting place will be in the Jezero Crater as NASA moves on to the main focus of its mission: getting the Perseverance rover to study Mars for evidence of life.



The $1 billion Russian cyber company that the US says hacks for Moscow




The public side of Positive is like many cybersecurity companies: staff look at high-tech security, publish research on new threats, and even have cutesy office signs that read “stay positive!” hanging above their desks. The company is open about some of its links to the Russian government, and boasts an 18-year track record of defensive cybersecurity expertise including a two-decade relationship with the Russian Ministry of Defense. But according to previously unreported US intelligence assessments, it also develops and sells weaponized software exploits to the Russian government. 

One area that’s stood out is the firm’s work on SS7, a technology that’s critical to global telephone networks. In a public demonstration for Forbes, Positive showed how it can bypass encryption by exploiting weaknesses in SS7. Privately, the US has concluded that Positive did not just discover and publicize flaws in the system, but also developed offensive hacking capabilities to exploit security holes that were then used by Russian intelligence in cyber campaigns.

Much of what Positive does for the Russian government’s hacking operations is similar to what American security contractors do for United States agencies. But there are major differences. One former American intelligence official, who requested anonymity because they are not authorized to discuss classified material, described the relationship between companies like Positive and their Russian intelligence counterparts as “complex” and even “abusive.” The pay is relatively low, the demands are one-sided, the power dynamic is skewed, and the implicit threat for non-cooperation can loom large.

Tight working relationship

American intelligence agencies have long concluded that Positive also runs actual hacking operations itself, with a large team allowed to run its own cyber campaigns as long as they are in Russia’s national interest. Such practices are illegal in the western world: American private military contractors are under direct and daily management of the agency they’re working for during cyber contracts. 


Former US officials say there is a tight working relationship with the Russian intelligence agency FSB that includes exploit discovery, malware development, and even reverse engineering of cyber capabilities used by Western nations like the United States against Russia itself. 

The company’s marquee annual event, Positive Hack Days, was described in recent US sanctions as “recruiting events for the FSB and GRU.” The event has long been famous for being frequented by Russian agents. 

NSA director of cybersecurity Rob Joyce said the companies being sanctioned “provide a range of services to the SVR, from providing the expertise to developing tools, supplying infrastructure and even, sometimes, operationally supporting activities,” Politico reported.

One day after the sanctions announcement, Positive issued a statement denying “the groundless accusations” from the US. It pointed out that there is “no evidence” of wrongdoing and said it provides all vulnerabilities to software vendors “without exception.”

Tit for tat

Thursday’s announcement is not the first time that Russian security companies have come under scrutiny. 

The biggest Russian cybersecurity company, Kaspersky, has been under fire for years over its relationships with the Russian government—eventually being banned from US government networks. Kaspersky has always denied a special relationship with the Russian government.

But one factor that sets Kaspersky apart from Positive, at least in the eyes of American intelligence officials, is that Kaspersky sells antivirus software to western companies and governments. There are few better intelligence collection tools than antivirus software, which is purposely designed to see everything happening on a computer and can even take control of the machines it occupies. US officials believe Russian hackers have used Kaspersky software to spy on Americans, but Positive – a smaller company selling different products and services – has no equivalent.

Recent sanctions are the latest step in a tit for tat between Moscow and Washington over escalating cyber operations, including the Russian-sponsored SolarWinds attack against the US, which led to nine federal agencies being hacked over a long period of time. Earlier this year, the acting head of the US cybersecurity agency said recovering from that attack could take the US at least 18 months.

