Tech

Voice games are giving kids a break from screen time


Voice games are not all about competition. Earlier this year, Nina Meehan and Jonathan Shmidt Chapman, both youth theater professionals, created the K’ilu Kit: Passover Adventure for the upcoming Jewish holiday. They realized that for the second year in a row, the pandemic would disrupt the usual gathering of families and friends for the seder, the ritual dinner at which participants retell the story of the Jewish exodus from Egypt.

“And it’s not that great on Zoom,” says Shmidt Chapman. “A lot of these biblical stories are hard to explain to a three- to eight-year-old. How do we convey this story in an age-appropriate way?” The K’ilu Kit attempts to make the exodus story meaningful, understandable, and fun for children with the help of interactive elements: a paper flame wrapped around a flashlight becomes the burning bush through which God tells Moses to lead the Israelites out of Egypt, for example.

“The audio experience guides kids into physically doing things with prompts rather than just listening,” Meehan says. “The Passover story is the story of recognizing complex topics about freedom from bondage and slavery and oppression. This is how kids can learn the Passover story. It’s not about just staring at a screen or hearing the story but the levels of importance, the understanding.”

Voice-led entertainment is uniquely capable of delivering that kind of understanding, according to Naomi Baron, professor emerita of linguistics at American University and author of How We Read Now: Strategic Choices for Print, Screen, and Audio. “The concern with screen time has not just been the hours our eyes have been glued to the screen, but the shallowness of most of the interaction,” Baron says. “You aren’t putting in mental effort.”

With audio stories and games, however, the information isn’t presented to you on a platter. Imagination is required, and it takes more focus and attention than gazing at a screen. Baron says research has shown that with this type of learning, comprehension and recall are much higher for developing readers. She adds that older listeners can benefit too, particularly if English is not their first language, their learning style is less visual, or they are visually impaired.

Whether screen time is “good” or “bad” is still debatable, and it’s too soon to tell whether the pandemic’s boom in audio and voice games will end as vaccines make it possible to hang out in person once again. Voice games aren’t perfect, either: they often misunderstand users, particularly kids who are just learning how to enunciate and interact with technology.

The Danielses, however, have doubled down on audio. The family recently bought their second Yoto, which 21-month-old baby Price has figured out how to use. “He’ll sing along to it. He loves it,” Kate says. Charlotte agrees: “I love it because it plays music and stories.”

Tech

Language models like GPT-3 could herald a new type of search engine


Now a team of Google researchers has published a proposal for a radical redesign that throws out the ranking approach and replaces it with a single large AI language model, such as BERT or GPT-3—or a future version of them. The idea is that instead of searching for information in a vast list of web pages, users would ask questions and have a language model trained on those pages answer them directly. The approach could change not only how search engines work, but what they do—and how we interact with them.

Search engines have become faster and more accurate, even as the web has exploded in size. AI is now used to rank results, and Google uses BERT to understand search queries better. Yet beneath these tweaks, all mainstream search engines still work the same way they did 20 years ago: web pages are indexed by crawlers (software that reads the web nonstop and maintains a list of everything it finds), results that match a user’s query are gathered from this index, and the results are ranked.

“This index-retrieve-then-rank blueprint has withstood the test of time and has rarely been challenged or seriously rethought,” Donald Metzler and his colleagues at Google Research write.
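
To make that blueprint concrete, here is a minimal sketch of the three steps in Python. It is a toy illustration under simplified assumptions, not how Google or any production engine works: the hand-written corpus, inverted index, and word-overlap score stand in for large-scale crawling, retrieval, and ranking machinery such as BM25 or learned rankers.

```python
# A toy illustration of the classic index-retrieve-then-rank blueprint.
# Every document and score here is made up; real engines use far more
# elaborate crawling, indexing, and ranking than this sketch suggests.
from collections import defaultdict

# 1. Index: a crawler has read the web and produced page texts keyed by URL.
corpus = {
    "example.com/a": "voice games give kids a break from screens",
    "example.com/b": "language models like gpt-3 answer questions directly",
    "example.com/c": "search engines rank the pages that match a query",
}

inverted_index = defaultdict(set)  # term -> set of URLs containing that term
for url, text in corpus.items():
    for term in text.split():
        inverted_index[term].add(url)

def retrieve(query):
    """2. Retrieve: gather every indexed page matching at least one query term."""
    return {url for term in query.split() for url in inverted_index.get(term, set())}

def rank(query, urls):
    """3. Rank: order the candidates by a crude relevance score (term overlap)."""
    terms = set(query.split())
    return sorted(urls, key=lambda url: len(terms & set(corpus[url].split())), reverse=True)

query = "how do search engines rank pages"
print(rank(query, retrieve(query)))  # -> ['example.com/c']
```

The user gets back a ranked list of pages, not an answer; that gap is exactly what the Google proposal targets.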

The problem is that even the best search engines today still respond with a list of documents that include the information asked for, not with the information itself. Search engines are also not good at responding to queries that require answers drawn from multiple sources. It’s as if you asked your doctor for advice and received a list of articles to read instead of a straight answer.

Metzler and his colleagues are interested in a search engine that behaves like a human expert. It should produce answers in natural language, synthesized from more than one document, and back up its answers with references to supporting evidence, as Wikipedia articles aim to do.  

Large language models get us part of the way there. Trained on most of the web and hundreds of books, GPT-3 draws information from multiple sources to answer questions in natural language. The problem is that it does not keep track of those sources and cannot provide evidence for its answers. There’s no way to tell if GPT-3 is parroting trustworthy information or disinformation—or simply spewing nonsense of its own making.

Metzler and his colleagues call language models dilettantes—“They are perceived to know a lot but their knowledge is skin deep.” The solution, they claim, is to build and train future BERTs and GPT-3s to retain records of where their words come from. No such models are yet able to do this, but it is possible in principle, and there is early work in that direction.
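
No published model does this yet, but retrieval-augmented setups hint at the general shape. The sketch below is a speculative illustration, not the Google proposal: a hypothetical `generate_answer` callable stands in for a language model, and the system returns the passages it conditioned on alongside the answer so the evidence can be surfaced to the user.

```python
# A toy sketch of answers that stay tied to their evidence: retrieve supporting
# passages first, then hand them to a language model and return the passages
# alongside the generated text. `generate_answer` is a hypothetical stand-in,
# not a real API; models that natively remember provenance do not yet exist.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class CitedAnswer:
    text: str            # natural-language answer shown to the user
    sources: List[str]   # URLs of the passages the answer was conditioned on

def retrieve_passages(query: str, corpus: List[Dict], k: int = 3) -> List[Dict]:
    """Crude retrieval: the k passages sharing the most words with the query."""
    terms = set(query.lower().split())
    overlap = lambda doc: len(terms & set(doc["text"].lower().split()))
    return sorted(corpus, key=overlap, reverse=True)[:k]

def answer_with_sources(query: str, corpus: List[Dict],
                        generate_answer: Callable[[str], str]) -> CitedAnswer:
    """Return an answer plus the evidence it was generated from."""
    passages = retrieve_passages(query, corpus)
    prompt = query + "\n\n" + "\n".join(p["text"] for p in passages)
    return CitedAnswer(text=generate_answer(prompt),
                       sources=[p["url"] for p in passages])
```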

There have been decades of progress on different areas of search, from answering queries to summarizing documents to structuring information, says Ziqi Zhang at the University of Sheffield, UK, who studies information retrieval on the web. But none of these technologies overhauled search because they each address specific problems and are not generalizable. The exciting premise of this paper is that large language models are able to do all these things at the same time, he says.

Yet Zhang notes that language models do not perform well with technical or specialist subjects because there are fewer examples in the text they are trained on. “There are probably hundreds of times more data on e-commerce on the web than data about quantum mechanics,” he says. Language models today are also skewed toward English, which would leave non-English parts of the web underserved.  

Still, Zhang welcomes the idea. “This has not been possible in the past, because large language models only took off recently,” he says. “If it works, it would transform our search experience.”


Tech

The world had a chance to avoid the pandemic—but blew it


The covid-19 pandemic is a catastrophe that could have been averted, says a panel of 13 independent experts tasked with assessing the global response to the crisis.

Their report, released May 12 and commissioned by the WHO, lambasts global leaders who failed to heed repeated warnings, wasted time, hoarded information and desperately needed supplies, and failed to take the crisis seriously. While some countries took aggressive steps to curb the spread of the virus, “many countries, including some of the wealthiest, devalued the emerging science, denied the disease’s severity, delayed responding, and ended up sowing distrust among citizens with literally deadly consequences,” said Helen Clark, cochair of the Independent Panel for Pandemic Preparedness and Response and former prime minister of New Zealand, on Wednesday. 

The report, "COVID-19: Make It the Last Pandemic," takes a hard look at why we failed to curb the spread of the coronavirus. It also looks to the future, highlighting strategies for ending the current crisis and avoiding future ones.

Here are five key takeaways: 

  1. We had an opportunity to avoid disaster in early 2020, and we squandered it. “The combination of poor strategic choices, unwillingness to tackle inequalities, and an uncoordinated system created a toxic cocktail which allowed the pandemic to turn into a catastrophic human crisis,” the authors write. 
  2. Vaccine supply must be boosted and shots redistributed. The report calls on rich countries to provide a billion vaccine doses to low- and middle-income countries by September 2021 and another billion by the middle of next year. It also pushes for vaccine makers to offer up licensing and technology transfer agreements. And if those agreements don’t come within three months, it calls for an automatic waiver so that production can begin where the shots are most needed.  
  3. The World Health Organization needs more power and more money. The WHO should have the authority to investigate pathogens with pandemic potential in any country on short notice, and to publish information about outbreaks without approval from national governments.  
  4. A new organization is needed to help the WHO. The report calls for the formation of a Global Health Threats Council composed of heads of state to ensure that countries stay committed to pandemic preparedness and to hold countries accountable if they fail to curb outbreaks.  
  5. The pandemic’s impact on nearly every aspect of daily life is hard to overstate. More than 3 million people have died of covid-19, including at least 17,000 health workers. The crisis provided “the deepest shock to the global economy since the Second World War and the largest simultaneous contraction of national economies since the Great Depression,” the panel writes. The crisis pushed more than a hundred million people into extreme poverty. “Most dispiriting is that those who had least before the pandemic have even less now,” they add.  


Tech

A nonprofit promised to preserve wildlife. Then it made millions claiming it could cut down trees


Clegern said the program’s safeguards prevent the problems identified by CarbonPlan.   

California’s offsets are considered additional carbon reductions because the floor serves “as a conservative backstop,” Clegern said. Without it, he explained, many landowners could have logged to even lower levels in the absence of offsets.

Clegern added that the agency’s rules were adopted as a result of a lengthy process of debate and were upheld by the courts. A California Court of Appeal found the Air Resources Board had the discretion to use a standardized approach to evaluate whether projects were additional.

But the court did not make an independent determination about the effectiveness of the standard, and was “quite deferential to the agency’s judgment,” said Alice Kaswan, a law professor at the University of San Francisco School of Law, in an email.

California law requires the state’s cap-and-trade regulations to ensure that emissions reductions are “real, permanent, quantifiable, verifiable” and “in addition to any other greenhouse gas emission reduction that otherwise would occur.”

“If there’s new scientific information that suggests serious questions about the integrity of offsets, then, arguably, CARB has an ongoing duty to consider that information and revise their protocols accordingly,” Kaswan said. “The agency’s obligation is to implement the law, and the law requires additionality.”

The recipe

On an early spring day, Lautzenheiser, the Audubon scientist, brought a reporter to a forest protected by the offset project. The trees here were mainly tall white pines mixed with hemlocks, maples and oaks. Lautzenheiser is usually the only human in this part of the woods, where he spends hours looking for rare plants or surveying stream salamanders.

The nonprofit’s planning documents acknowledge that the forests enrolled in California’s program were protected long before they began generating offsets: “A majority of the project area has been conserved and designated as high conservation value forest for many years with deliberate management focused on long-term natural resource conservation values.”

