
Police are flying surveillance over Washington. Where were they last week?



Nor were resources an issue. The United States Capitol Police, or USCP, is one of the best-funded police forces in the country. It is responsible for security across just 0.4 square miles of land, but that area hosts some of the most high-profile events in American politics, including presidential inaugurations, lying-in-state ceremonies, and major protests. The USCP is well staffed, with 2,300 officers and civilian employees, and its annual budget is at least $460 million—putting it among the top 20 police budgets in the US. In fact, it’s about the size of the Atlanta and Nashville police budgets combined. For comparison, the DC Metropolitan Police Department—which works regularly with the USCP and covers the rest of the District’s 68 square miles—has a budget of $546 million.

The USCP is different from state and local departments in other important ways, too. As a federal agency that has no residents inside its jurisdiction, for example, it answers to a private oversight board and to Congress—and only Congress has the power to change its rules and budgets. It is also not subject to transparency laws such as the Freedom of Information Act, which makes it even more veiled than the most opaque departments elsewhere in the country.

All of this means there is little public information about the tools and tactics that were at the USCP’s disposal ahead of the riots. 

But “they have access to some pretty sophisticated stuff if they want to use it,” says Stoughton. That includes the resources of other agencies like the Secret Service, the FBI, the Department of Homeland Security, the Department of the Interior, and the United States military. (“We are working [on technology] on every level with pretty much every agency in the country,” the USCP’s then-chief said in 2015, in a rare acknowledgment of the force’s technical savvy.)

What should have happened

With such resources at its disposal, the Capitol Police would likely have made heavy use of online surveillance ahead of January 6. Such monitoring usually involves not just watching online spaces but also tracking known extremists who had been at other violent events. In this case, those events would include the “Unite the Right” rally in Charlottesville, Virginia, in 2017 and the protest against coronavirus restrictions at the Michigan state capitol in 2020.

Exactly what surveillance was happening before the riots is unclear. The FBI turned down a request for a comment, and the USCP did not respond. “I’d find it very hard to believe, though, that a well-funded, well-staffed agency with a pretty robust history of assisting with responding to crowd control situations in DC didn’t do that type of basic intelligence gathering,” says Stoughton. 

Ed Maguire, a professor of criminal justice at Arizona State University, is an expert on protests and policing. He says undercover officers would usually operate in the crowd to monitor developments, which can be the most effective surveillance tool for managing potentially volatile situations—but that would require a degree of preparedness and planning that was perhaps lacking.

Major events of this kind would usually involve a detailed risk assessment, informed by monitoring efforts and FBI intelligence reports. These assessments determine all security, staffing, and surveillance plans for an event. Stoughton says that what he sees as inconsistency in officers’ decisions about whether to retreat, along with the lack of an evacuation plan and the clear delay in securing backup, points to notable mistakes.

This supports one of the more obvious explanations for the failure: that the department simply misjudged the risk. 

What seems to have happened

It appears that Capitol Police didn’t coordinate with the Park Police or the Metropolitan Police ahead of the rally—though the Metropolitan Police were staffed at capacity in anticipation of violence. Capitol Police Chief Steven Sund, who announced his resignation in the wake of the riots, also asserts that he requested additional National Guard backup on January 5, though the Pentagon denies this.

The USCP has also been accused of racial bias, along with other police forces. Departments in New York, Seattle, and Philadelphia are among those looking into whether their own officers took part in the assault, and the Capitol Police itself suspended “several” employees and will investigate 10 officers over their role.

But one significant factor that may have increased the volatility of the situation, Maguire says, is that police clashes with the Proud Boys in the weeks and days before the event fractured the right wing’s assumption that law enforcement was essentially on its side. Those clashes included a violent rally in Salem, Oregon, and the arrest of the white supremacist group’s leader, Henry Tarrio. On January 5, Maguire tweeted about hardening rhetoric and threats of violence as this assumption started to fall apart.




How AI is reinventing what computers are



Fall 2021: the season of pumpkins, pecan pies, and peachy new phones. Every year, right on cue, Apple, Samsung, Google, and others drop their latest releases. These fixtures in the consumer tech calendar no longer inspire the surprise and wonder of those heady early days. But behind all the marketing glitz, there’s something remarkable going on. 

Google’s latest offering, the Pixel 6, is the first phone to have a separate chip dedicated to AI that sits alongside its standard processor. And the chip that runs the iPhone has for the last couple of years contained what Apple calls a “neural engine,” also dedicated to AI. Both chips are better suited to the types of computations involved in training and running machine-learning models on our devices, such as the AI that powers your camera. Almost without our noticing, AI has become part of our day-to-day lives. And it’s changing how we think about computing.

What does that mean? Well, computers haven’t changed much in 40 or 50 years. They’re smaller and faster, but they’re still boxes with processors that run instructions from humans. AI changes that on at least three fronts: how computers are made, how they’re programmed, and how they’re used. Ultimately, it will change what they are for. 

“The core of computing is changing from number-crunching to decision-making,” says Pradeep Dubey, director of the parallel computing lab at Intel. Or, as MIT CSAIL director Daniela Rus puts it, AI is freeing computers from their boxes.

More haste, less speed

The first change concerns how computers—and the chips that control them—are made. Traditional computing gains came as machines got faster at carrying out one calculation after another. For decades the world benefited from chip speed-ups that came with metronomic regularity as chipmakers kept up with Moore’s Law. 

But the deep-learning models that make current AI applications work require a different approach: they need vast numbers of less precise calculations to be carried out all at the same time. That means a new type of chip is required: one that can move data around as quickly as possible, making sure it’s available when and where it’s needed. When deep learning exploded onto the scene a decade or so ago, there were already specialty computer chips available that were pretty good at this: graphics processing units, or GPUs, which were designed to display an entire screenful of pixels dozens of times a second. 
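To make that difference concrete, here is a minimal sketch in Python with NumPy. The array shapes are arbitrary and it runs on an ordinary CPU, so the numbers only illustrate the idea rather than any real chip’s performance: it contrasts working through a batch one example at a time with issuing a single large, half-precision matrix multiply of the kind GPUs and other AI accelerators are built to handle.

```python
# A minimal sketch, not any vendor's actual kernel: it contrasts one-at-a-time
# arithmetic with the batched, lower-precision work that GPUs handle well.
import time

import numpy as np

rng = np.random.default_rng(0)
inputs = rng.standard_normal((4096, 1024)).astype(np.float16)   # a batch of 4,096 examples
weights = rng.standard_normal((1024, 256)).astype(np.float16)   # one layer's parameters

# "Traditional" style: march through the batch one example at a time.
start = time.perf_counter()
outputs_serial = np.stack([example @ weights for example in inputs])
serial_time = time.perf_counter() - start

# Accelerator-friendly style: one big low-precision matrix multiply over the whole batch.
start = time.perf_counter()
outputs_batched = inputs @ weights
batched_time = time.perf_counter() - start

print(f"one example at a time: {serial_time:.3f} s")
print(f"single batched matmul: {batched_time:.3f} s")
# Any small difference between the two results comes from float16 rounding order.
print("largest difference:", float(np.abs(outputs_serial - outputs_batched).max()))
```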

Anything can become a computer. Indeed, most household objects, from toothbrushes to light switches to doorbells, already come in a smart version.

Now chipmakers like Intel, Arm, and Nvidia, which supplied many of the first GPUs, are pivoting to make hardware tailored specifically for AI. Google and Facebook are also forcing their way into this industry for the first time, in a race to find an AI edge through hardware.

For example, the chip inside the Pixel 6 is a new mobile version of Google’s tensor processing unit, or TPU. Unlike traditional chips, which are geared toward ultrafast, precise calculations, TPUs are designed for the high-volume but low-precision calculations required by neural networks. Google has used these chips in-house since 2015: they process people’s photos and natural-language search queries. Google’s sister company DeepMind uses them to train its AIs.
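A rough illustration of that precision trade-off, sketched in plain NumPy on a CPU with arbitrary matrix sizes (this is not TPU code): halving the number of bits per value halves the memory each value takes, while a neural-network-style calculation barely changes.

```python
# A small sketch of the precision trade-off, using arbitrary sizes: 16-bit floats
# take half the memory of 32-bit floats, and the result of a big multiply barely moves.
import numpy as np

rng = np.random.default_rng(1)
activations = rng.standard_normal((512, 512)).astype(np.float32)
weights = rng.standard_normal((512, 512)).astype(np.float32)

exact = activations @ weights                       # 32-bit reference result
approx = (activations.astype(np.float16) @ weights.astype(np.float16)).astype(np.float32)

print("bytes per value:", np.dtype(np.float32).itemsize, "->", np.dtype(np.float16).itemsize)
print("largest relative error:",
      float(np.abs(exact - approx).max() / np.abs(exact).max()))
```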

In the last couple of years, Google has made TPUs available to other companies, and these chips—as well as similar ones being developed by others—are becoming the default inside the world’s data centers. 

AI is even helping to design its own computing infrastructure. In 2020, Google used a reinforcement-learning algorithm—a type of AI that learns how to solve a task through trial and error—to design the layout of a new TPU. The AI eventually came up with strange new designs that no human would think of—but they worked. This kind of AI could one day develop better, more efficient chips.
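The trial-and-error idea itself fits in a few lines of code. The sketch below is only a toy, assuming a made-up set of three candidate “layouts” with hidden scores; it has nothing to do with Google’s actual chip-placement system, but it shows how an agent improves its choice purely from feedback.

```python
# Toy illustration of reinforcement learning only: an agent improves a choice by
# trial and error. A simple bandit over three imaginary "layouts", nothing more.
import numpy as np

rng = np.random.default_rng(42)
true_rewards = np.array([0.2, 0.5, 0.8])   # hidden quality of three candidate "layouts"
estimates = np.zeros(3)                     # the agent's running estimate of each
counts = np.zeros(3)

for step in range(2000):
    # Explore occasionally; otherwise exploit the best estimate so far.
    if rng.random() < 0.1:
        choice = int(rng.integers(3))
    else:
        choice = int(np.argmax(estimates))
    reward = true_rewards[choice] + 0.1 * rng.standard_normal()   # noisy feedback
    counts[choice] += 1
    estimates[choice] += (reward - estimates[choice]) / counts[choice]

print("learned estimates:", np.round(estimates, 2))   # drifts toward the hidden values
print("best option found:", int(np.argmax(estimates)))
```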

Show, don’t tell

The second change concerns how computers are told what to do. For the past 40 years we have been programming computers; for the next 40 we will be training them, says Chris Bishop, head of Microsoft Research in the UK. 

Traditionally, to get a computer to do something like recognize speech or identify objects in an image, programmers first had to come up with rules for the computer.

With machine learning, programmers no longer write rules. Instead, they create a neural network that learns those rules for itself. It’s a fundamentally different way of thinking. 
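A deliberately tiny contrast shows the shift. In the sketch below, the task (flagging readings above a hidden threshold) and every name in it are made up for illustration: one version encodes the rule by hand, the other lets a single artificial neuron recover an equivalent rule from labeled examples.

```python
# Two styles of telling a computer what to do. The task here is invented purely
# for illustration: flag readings above a threshold of 6.0.
import numpy as np

rng = np.random.default_rng(7)
readings = rng.uniform(0.0, 10.0, size=(1000, 1))
labels = (readings[:, 0] > 6.0).astype(float)          # ground truth the old rule encodes

# Style 1: the programmer writes the rule explicitly.
def hand_written_rule(x):
    return float(x > 6.0)

# Style 2: a one-neuron network learns an equivalent rule from labeled examples.
w, b = 0.0, 0.0
for _ in range(5000):
    logits = readings[:, 0] * w + b
    preds = 1.0 / (1.0 + np.exp(-logits))              # sigmoid
    grad_w = np.mean((preds - labels) * readings[:, 0])
    grad_b = np.mean(preds - labels)
    w -= 0.1 * grad_w                                   # gradient descent step
    b -= 0.1 * grad_b

learned = (1.0 / (1.0 + np.exp(-(readings[:, 0] * w + b))) > 0.5).astype(float)
print("hand-written rule on a reading of 7.3:", hand_written_rule(7.3))
print("learned rule matches the labels on",
      round(float(np.mean(learned == labels)) * 100, 1), "% of the examples")
```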


Decarbonizing industries with connectivity and 5G



The United Nations Intergovernmental Panel on Climate Change’s sixth assessment report—an aggregated assessment of scientific research prepared by some 300 scientists across 66 countries—has served as the loudest and clearest wake-up call to date on the global warming crisis. The panel unequivocally attributes the increase in the earth’s temperature—it has risen by 1.1 °C since the Industrial Revolution—to human activity. Without substantial and immediate reductions in carbon dioxide and other greenhouse gas emissions, temperatures will rise between 1.5 °C and 2 °C before the end of the century. That, the panel posits, will expose all of humanity to a “greater risk of passing through ‘tipping points,’ thresholds beyond which certain impacts can no longer be avoided even if temperatures are brought back down later on.”

Corporations and industries must therefore redouble their efforts to reduce and remove greenhouse gas emissions, with speed and precision—but to do this, they must also commit to deep operational and organizational transformation. Cellular infrastructure, particularly 5G, is one of the many digital tools and technology-enabled processes organizations have at their disposal to accelerate decarbonization efforts.

5G and other cellular technology can enable increasingly interconnected supply chains and networks, improve data sharing, optimize systems, and increase operational efficiency. These capabilities could soon contribute to an exponential acceleration of global efforts to reduce carbon emissions.

Industries such as energy, manufacturing, and transportation could have the biggest impact on decarbonization efforts through the use of 5G, as they are some of the biggest greenhouse-gas-emitting industries, and all rely on connectivity to link to one another through communications network infrastructure.

The higher performance and improved efficiency of 5G—which delivers multi-gigabit peak data speeds, ultra-low latency, greater reliability, and increased network capacity—could help businesses and public infrastructure providers focus on business transformation and on reducing harmful emissions. Doing so requires effective digital management and monitoring of distributed operations, with resilience and analytic insight. 5G will help factories, logistics networks, power companies, and others operate more efficiently, more consciously, and more purposefully in line with their explicit sustainability objectives, through better insight and more powerful network configurations.

This report, “Decarbonizing industries with connectivity & 5G,” argues that the capabilities enabled by broadband cellular connectivity (primarily, though not exclusively, through 5G network infrastructure) are a unique, powerful, and immediate enabler of carbon reduction efforts. They have the potential to dramatically accelerate decarbonization, as increasingly interconnected supply chains, transportation systems, and energy networks share data to increase efficiency and productivity, thereby optimizing systems for lower carbon emissions.


Surgeons have successfully tested a pig’s kidney in a human patient



The reception: The research was conducted last month and is yet to be peer reviewed or published in a journal, but external experts say it represents a major advance. “There is no doubt that this is a highly significant breakthrough,” says Darren K. Griffin, a professor of genetics at the University of Kent, UK. “The research team were cautious, using a patient who had suffered brain death, attaching the kidney to the outside of the body, and closely monitoring for only a limited amount of time. There is thus a long way to go and much to discover,” he added. 

“This is a huge breakthrough. It’s a big, big deal,” Dorry Segev, a professor of transplant surgery at Johns Hopkins School of Medicine who was not involved in the research, told the New York Times. However, he added, “we need to know more about the longevity of the organ.”

The background: In recent years, research has increasingly zeroed in on pigs as the most promising avenue to help address the shortage of organs for transplant, but it has faced a number of obstacles, most prominently the fact that a sugar in pig cells triggers an aggressive rejection response in humans.

The researchers got around this by genetically altering the donor pig to knock out the gene encoding the sugar molecule that causes the rejection response. The pig was genetically engineered by Revivicor, one of several biotech companies working to develop pig organs to transplant into humans. 

The big prize: There is a dire need for more kidneys. More than 100,000 people in the US are currently waiting for a kidney transplant, and 13 of them die every day, according to the National Kidney Foundation. Genetically engineered pigs could offer a crucial lifeline for these people, if the approach tested at NYU Langone can work for much longer periods.
