
How Google is using AI to keep search safe

By IANS | Updated: March 31, 2022 16:40 IST

San Francisco, March 31: Google is using its Artificial Intelligence systems to help people get access to critical information while avoiding potentially shocking or harmful content, so that they can stay safe, both online and offline.

Google shows contact information alongside the most relevant and helpful results when people search for information on suicide, sexual assault, substance abuse and domestic violence. But to recognise when someone is in personal crisis, it relies on machine learning to understand the language they use.

The tech giant's latest AI model, the Multitask Unified Model (MUM), can automatically and more accurately detect a wider range of personal crisis searches.

MUM can better understand the intent behind people's questions and detect when a person is in need, which helps Google more reliably show trustworthy and actionable information at the right time.
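MUM itself is not publicly available, so the snippet below is a rough illustration only: it uses an open zero-shot classifier from the Hugging Face transformers library (facebook/bart-large-mnli) as a stand-in to show, in principle, how the intent behind a query could be scored against a "personal crisis" label. The query text and label set are hypothetical, and this is not Google's system.

```python
# Rough illustration only: MUM is not public, so an open zero-shot model
# (facebook/bart-large-mnli) stands in to show the general idea of scoring
# a query's intent against a "personal crisis" label.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

query = "where can I find help in a crisis"          # hypothetical query
labels = ["personal crisis", "general information"]  # hypothetical label set

result = classifier(query, candidate_labels=labels)
# result["labels"] is sorted by score, highest first
print(result["labels"][0], round(result["scores"][0], 3))
```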

"MUM not only understands language, but also generates it. It's trained across 75 different languages and many different tasks at once, allowing it to develop a more comprehensive understanding of information and world knowledge than previous models," shared Pandu Nayak, Google Fellow and Vice President of Search, in a blogpost.

"And MUM is multimodal, so it understands information across text and images and, in the future, can expand to more modalities like video and audio," he added.

Another feature that keeps users safe on Search, while also steering them clear of unexpectedly shocking results, is SafeSearch mode, which offers the option to filter explicit results.

"This setting is on by default for Google accounts of people under 18. And even when users choose to have SafeSearch off, our systems still reduce unwanted racy results for searches that aren't seeking them out," Nayak said.

Further, Google uses advanced AI technologies like BERT to better understand what an individual is looking for.

BERT has improved Google's understanding of whether searches are truly seeking out explicit content, vastly reducing the chances of encountering surprising search results.
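Google has not published the model it uses for this, but as a loose sketch of the general approach, the snippet below runs a BERT-style text classifier over a query. The checkpoint name "example-org/query-intent-bert" is a hypothetical placeholder for a model fine-tuned to label whether a query is deliberately seeking explicit content, and the label names shown are likewise assumptions.

```python
# Loose sketch of the general approach, not Google's production system.
# "example-org/query-intent-bert" is a hypothetical placeholder for a
# BERT-style checkpoint fine-tuned to label query intent.
from transformers import pipeline

detector = pipeline("text-classification", model="example-org/query-intent-bert")

for query in ["beach volleyball rules", "another ambiguous query"]:
    prediction = detector(query)[0]  # e.g. {"label": "SEEKING_EXPLICIT", "score": 0.97}
    print(query, prediction["label"], prediction["score"])
```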

Nayak said that over the last year, BERT reduced unexpected shocking results by 30 per cent.

"It's been especially effective in reducing explicit content for searches related to ethnicity, sexual orientation and gender, which can disproportionately impact women and especially women of colour," he added.

Nayak stated that Google is also working with trusted local partners to better detect personal crisis queries all over the world, and show actionable information in several more countries.

"Whatever you're searching for, we're committed to helping you safely find it," Nayak said.

