New business models, big opportunity: Financial services

More motivated than ever, organizations in all industries are ready to cut expenses that lack a clear return on investment. So it’s no surprise that survey respondents highlight cloud computing projects—all highly measurable—as priorities in their 2021 plans. Among financial services institutions, 62% are looking to ramp up tech investments, and 62% expect to move IT and business functions to the cloud, compared with 46% across industries. In a recent report, Nucleus Research found that cloud deployments deliver four times the return on investment of on-premises deployments.

Planning beyond the pandemic

The Guardian Life Insurance Company of America is an exemplar of a progressive cloud adopter—it’s now moving many of its core financial systems to the cloud. The insurer had clear motivation: an internal study had identified several pain points, including insufficient data management, a need for lower-level data for better analytics, a lack of system integration, and manual reconciliation issues. “These pain points helped create the need for a new system,” says Marcel Esqueu, assistant vice president for financial systems transformation at Guardian. “We looked at moving to the cloud about five years ago, but we didn’t think it was ready.” Now the company deems cloud services mature enough to support the advanced functionality it requires.

Financial institutions are also looking at mergers and acquisitions as a path beyond pandemic survival. In fact, according to a Reuters report, such deals were up 80% in July, August, and September 2020 from the previous fiscal quarter to hit a whopping $1 trillion in transactions. In the MIT Technology Review Insights survey, 41% of financial services execs report that their organizations acted on a business merger or acquisition or will do so over the coming year.

“People have realized they need to consolidate to create stronger and better-equipped businesses to deal with what the world looks like going forward,” says Alison Harding-Jones, managing director at Citigroup, in the Reuters report.

Mergers and acquisitions have long been a way for an organization to expand its core business—or even gain expertise in emerging technologies. For example, while many financial institutions buy business software with built-in artificial intelligence (AI) capabilities, Mastercard acquired a Canadian AI platform company called Brighterion in 2017 to provide “mission-critical intelligence from any data source,” says Gautam Aggarwal, regional chief technology officer (CTO) at Mastercard Asia-Pacific. The company first used Brighterion’s technology for fraud detection but now puts it to work in credit scoring, anti-money laundering, and the company’s marketing efforts. “We’ve really taken Brighterion and applied it not just for the payment use case but beyond,” says Aggarwal.

Business change, outside and in

Indeed, organizations have had to innovate and respond fast to survive in the covid economy. In the survey, 81% of organizations across industries either evaluated new business models in 2020 or are planning to launch them over the next year. Among financial services institutions, improving the customer experience is paramount, with 55% reporting that they’re improving the experience they offer their customers, compared with 35% across industries.

That’s true for Jimmy Ng, group chief information officer (CIO) at Singapore-based DBS Bank. When physical branches closed during lockdowns, DBS customers—like other bank patrons the world over—did their banking online. But some of them did so only because they had to. “The question is whether this group of people will continue staying on the digital channel,” Ng says. So DBS is looking for ways to keep customers who prefer in-person service engaged, exploring technologies such as augmented and virtual reality and the 5G mobile network, which enables superfast connections. “How do we enable a joyful customer journey in this remote way of engagement?” he asks.

Download the full report

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

A nonprofit promised to preserve wildlife. Then it made millions claiming it could cut down trees

Clegern, a spokesperson for the California Air Resources Board, said the program’s safeguards prevent the problems identified by CarbonPlan.

California’s offsets are considered additional carbon reductions because the floor serves “as a conservative backstop,” Clegern said. In the absence of offsets, he explained, many landowners could have logged to even lower levels.

Clegern added that the agency’s rules were adopted as a result of a lengthy process of debate and were upheld by the courts. A California Court of Appeal found the Air Resources Board had the discretion to use a standardized approach to evaluate whether projects were additional.

But the court did not make an independent determination about the effectiveness of the standard, and was “quite deferential to the agency’s judgment,” said Alice Kaswan, a law professor at the University of San Francisco School of Law, in an email.

California law requires the state’s cap-and-trade regulations to ensure that emissions reductions are “real, permanent, quantifiable, verifiable” and “in addition to any other greenhouse gas emission reduction that otherwise would occur.”

“If there’s new scientific information that suggests serious questions about the integrity of offsets, then, arguably, CARB has an ongoing duty to consider that information and revise their protocols accordingly,” Kaswan said. “The agency’s obligation is to implement the law, and the law requires additionality.”

The recipe

On an early spring day, Lautzenheiser, the Audubon scientist, brought a reporter to a forest protected by the offset project. The trees here were mainly tall white pines mixed with hemlocks, maples and oaks. Lautzenheiser is usually the only human in this part of the woods, where he spends hours looking for rare plants or surveying stream salamanders.

The nonprofit’s planning documents acknowledge that the forests enrolled in California’s program were protected long before they began generating offsets: “A majority of the project area has been conserved and designated as high conservation value forest for many years with deliberate management focused on long-term natural resource conservation values.”

Meet Jennifer Daniel, the woman who decides what emoji we get to use

Emoji are now part of our language. If you’re like most people, you pepper your texts, Instagram posts, and TikTok videos with various little images to augment your words—maybe the syringe with a bit of blood dripping from it when you got your vaccination, the prayer (or high-fiving?) hands as a shortcut to “thank you,” a rosy-cheeked smiley face with jazz hands for a covid-safe hug from afar. Today’s emoji catalogue includes nearly 3,000 illustrations representing everything from emotions to food, natural phenomena, flags, and people at various stages of life.

Behind all those symbols is the Unicode Consortium, a nonprofit group of hardware and software companies aiming to make text and emoji readable and accessible to everyone. Part of their goal is to make languages look the same on all devices; a Japanese character should be typographically consistent across all media, for example. But Unicode is probably best known for being the gatekeeper of emoji: releasing them, standardizing them, and approving or rejecting new ones.
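
Under the hood, every emoji Unicode standardizes is simply one or more code points; multi-person emoji are “ZWJ sequences,” code points glued together with the zero-width joiner (U+200D) that compliant devices render as a single glyph. A minimal Python sketch, using two illustrative emoji, shows that decomposition:

```python
# Each emoji is one or more Unicode code points; the zero-width joiner
# (U+200D) stitches several code points into a single rendered glyph.
thumbs_up = "\U0001F44D"  # THUMBS UP SIGN: a single code point
family = "\U0001F469\u200D\U0001F469\u200D\U0001F466"  # woman + ZWJ + woman + ZWJ + boy

for emoji in (thumbs_up, family):
    code_points = " ".join(f"U+{ord(ch):04X}" for ch in emoji)
    print(emoji, "->", code_points)

# The family emoji prints five code points, even though a device that
# follows the Unicode standard displays it as one picture.
```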

Jennifer Daniel is the first woman at the helm of the Emoji Subcommittee for the Unicode Consortium and a fierce advocate for inclusive, thoughtful emoji. She initially rose to prominence for introducing Mx. Claus, a gender-inclusive alternative to Santa and Mrs. Claus; a non-gendered person breastfeeding a non-gendered baby; and a masculine face wearing a bridal veil. 

Now she’s on a mission to bring emoji to a post-pandemic future in which they are as broadly representative as possible. That means taking on an increasingly public role, whether through her popular and delightfully nerdy Substack newsletter, What Would Jennifer Do? (in which she analyzes the design process for upcoming emoji), or by inviting the general public to submit concerns about emoji and speak up if they aren’t representative or accurate.

“There isn’t a precedent here,” Daniel says of her job. And to Daniel, that’s exciting not just for her but for the future of human communication.

I spoke to her about how she sees her role and the future of emoji. The interview has been lightly edited and condensed. 

What does it mean to chair the subcommittee on emoji? What do you do?

It’s not sexy. [laughs] A lot of it is managing volunteers [the committee is composed of volunteers who review applications and help in approval and design]. There’s a lot of paperwork. A lot of meetings. We meet twice a week.

I read a lot and talk to a lot of people. I recently talked to a gesture linguist to learn how people use their hands in different cultures. How do we make better hand-gesture emoji? If the image is no good or isn’t clear, it’s a dealbreaker. I’m constantly doing lots of research and consulting with different experts. I’ll be on the phone with a botanical garden about flowers, or a whale expert to get the whale emoji right, or a cardiovascular surgeon so we have the anatomy of the heart down. 

There’s an old essay by Beatrice Warde about typography. She asked if a good typeface is a bedazzled crystal goblet or a transparent one. Some would say the ornate one because it’s so fancy, and others would say the crystal goblet because you can see and appreciate the wine. With emoji, I lend myself more to the “transparent crystal goblet” philosophy. 

Why should we care about how our emoji are designed?

My understanding is that 80% of communication is nonverbal. There’s a parallel in how we communicate. We text how we talk. It’s informal, it’s loose. You’re pausing to take a breath. Emoji are shared alongside words.

When emoji first came around, we had the misconception that they were ruining language. Learning a new language is really hard, and emoji is kind of like a new language. It works with how you already communicate. It evolves as you evolve. How you communicate and present yourself evolves, just like yourself. You can look at the nearly 3,000 emoji and it [their interpretation] changes by age or gender or geographic area. When we talk to someone and are making eye contact, you shift your body language, and that’s an emotional contagion. It builds empathy and connection. It gives you permission to reveal that about yourself. Emoji can do that, all in an image.

Product design gets an AI makeover

It’s a tall order, but one that Zapf says artificial intelligence (AI) technology can support by capturing the right data and guiding engineers through product design and development.

No wonder a November 2020 McKinsey survey reveals that more than half of organizations have adopted AI in at least one function, and 22% of respondents report that at least 5% of their companywide earnings are attributable to AI. And in manufacturing, 71% of respondents have seen a 5% or more increase in revenue with AI adoption.

But that wasn’t always the case. Once “rarely used in product development,” AI has experienced an evolution over the past few years, Zapf says. Today, tech giants known for their innovations in AI, such as Google, IBM, and Amazon, “have set new standards for the use of AI in other processes,” such as engineering.

“AI is a promising and exploratory area that can significantly improve user experience for designing engineers, as well as gather relevant data in the development process for specific applications,” says Katrien Wyckaert, director of industry solutions for Siemens Industry Software.

The result is a growing appreciation for a technology that promises to simplify complex systems, get products to market faster, and drive product innovation.

Simplifying complex systems

A perfect example of AI’s power to overhaul product development is Renault. In response to increasing consumer demand, the French automaker is equipping a growing number of new vehicle models with an automated manual transmission (AMT)—a system that behaves like an automatic transmission but allows drivers to shift gears electronically using a push-button command.

AMTs are popular among consumers, but designing them can present formidable challenges. That’s because an AMT’s performance depends on the operation of three distinct subsystems: an electro-mechanical actuator that shifts the gears, electronic sensors that monitor vehicle status, and software embedded in the transmission control unit, which controls gear shifting. Because of this complexity, it can take up to a year of extensive trial and error to define the system’s functional requirements, design the actuator mechanics, develop the necessary software, and validate the overall system.

In an effort to streamline its AMT development process, Renault turned to Simcenter Amesim software from Siemens Digital Industries Software. The simulation technology relies on artificial neural networks, AI “learning” systems loosely modeled on the human brain. Engineers simply drag, drop, and connect icons to graphically create a model. When displayed on a screen as a sketch, the model illustrates the relationship between all the various elements of an AMT system. In turn, engineers can predict the behavior and performance of the AMT and make any necessary refinements early in the development cycle, avoiding late-stage problems and delays. In fact, by using a virtual engine and transmissions as stand-ins while developing hardware, Renault has managed to cut its AMT development time almost in half.
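
The underlying idea can be sketched generically: train a neural network on the outputs of detailed simulations, then use it as a fast surrogate that predicts system behavior for new design candidates. The Python example below is only an illustration with invented inputs and targets, not Siemens’ implementation or API:

```python
# Illustrative neural-network surrogate model (invented data, not Simcenter's API).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(seed=0)

# Hypothetical design parameters: actuator force (kN), shift duration (s), oil temperature (deg C)
X = rng.uniform(low=[0.5, 0.1, 20.0], high=[2.0, 1.0, 120.0], size=(500, 3))

# Stand-in for an expensive physics simulation: a gear-shift quality score
y = (10.0
     - 3.0 * X[:, 1]                      # slower shifts score worse
     - 0.8 * np.abs(X[:, 0] - 1.2)        # force too low or too high scores worse
     - 0.001 * (X[:, 2] - 80.0) ** 2)     # temperatures far from 80 deg C score worse
y += rng.normal(scale=0.05, size=len(y))  # simulation noise

# Train the surrogate on simulated results, then query it cheaply for new designs
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
surrogate.fit(X, y)

candidate = np.array([[1.3, 0.4, 90.0]])  # a new design point to evaluate
print("Predicted shift-quality score:", surrogate.predict(candidate)[0])
```

Once trained, such a surrogate can be queried in a fraction of the time the full simulation would need, which is what lets engineers explore more design variants early in the development cycle.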

Speed without sacrificing quality

Emerging environmental standards are also prompting Renault to rely more heavily on AI. To comply with new carbon dioxide emissions standards, the automaker has been working on the design and development of hybrid vehicles. But hybrid engines are far more complex to develop than those found in vehicles with a single energy source, such as a conventional car. That’s because hybrid engines require engineers to perform complex feats like balancing the power required from multiple energy sources, choosing from a multitude of architectures, and examining the impact of transmissions and cooling systems on a vehicle’s energy performance.

“To meet new environmental standards for a hybrid engine, we must completely rethink the architecture of gasoline engines,” says Vincent Talon, head of simulation at Renault. The problem, he adds, is that carefully examining “the dozens of different actuators that can influence the final results of fuel consumption and pollutant emissions” is a lengthy and complex process, made more difficult by rigid timelines.

“Today, we clearly don’t have the time to painstakingly evaluate various hybrid powertrain architectures,” says Talon. “Rather, we needed to use an advanced methodology to manage this new complexity.”

For more on AI in industrial applications, visit www.siemens.com/artificialintelligence.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.
