Nagesh Singh Chauhan
Dissecting Google’s MUM, The Mother of all Search Algorithms?
"The quest for understanding human speech better."

“A new AI milestone for understanding information,” MUM is designed to make it easier for Google to answer complex needs in search.
Introduction
In 2021, Prabhakar Raghavan, Senior Vice President at Google, announced the launch of a new AI model called the Multitask Unified Model (MUM) at the Google I/O event. It is one of the biggest updates the search engine giant is developing. The Google MUM update aims to give users an improved search experience by meeting modern search demands: it is designed so that users need not perform multiple searches to compare information and gain deeper insights into a particular topic.
BERT, Google’s original contextual language technology, was deployed back in 2019. It improved the search engine’s ability to figure out the context of human language. MUM, Google’s latest update, builds on what its predecessor started. The Multitask Unified Model goes beyond just context. With this latest Google Search update, it’s going to feel a lot more like you’re talking to an actual person.
What is Google’s MUM update?
The Google Multitask Unified Model (MUM) update aims to answer modern search demands by using an AI-powered algorithm to enhance online search capability. Today, when searching the internet, users are faced with the need for multiple searches, as well as geographical and language barriers, because the search engine lacks the intuition to connect related information.
Google's MUM uses the Text-To-Text Transfer Transformer (T5) framework and is 1,000 times more powerful than BERT. MUM not only understands language but also generates it. It’s trained across 75 different languages and many different tasks at once, allowing it to develop a more comprehensive understanding of information and world knowledge than previous models. And MUM is multimodal, so it understands information across text and images and, in the future, can expand to more modalities like video and audio.
What is T5, the Text-To-Text Transfer Transformer?
T5 attempts to combine all the downstream tasks into a text-to-text format.
The Text-to-Text Framework:

Unified framework for all downstream tasks. Source: Google AI Blog
Consider the example of a BERT-style architecture that is pre-trained on masked language modeling and next-sentence prediction objectives, and then fine-tuned on downstream tasks (for example, predicting a class label in classification, or the answer span of the input in QnA). Here, we separately fine-tune different instances of the pre-trained model on different downstream tasks.
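To make that concrete, here is a minimal sketch of the per-task setup, assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint (neither is named in this article):

```python
# Minimal sketch: BERT-style fine-tuning keeps a separate model instance per
# downstream task, each with its own task-specific head.
from transformers import (
    BertForSequenceClassification,  # adds a classification head on top of BERT
    BertForQuestionAnswering,       # adds a span-prediction head on top of BERT
)

# Two independent copies of the same pre-trained weights; each copy is then
# fine-tuned separately, on its own labeled data, for its own task.
classifier = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)
qa_model = BertForQuestionAnswering.from_pretrained("bert-base-uncased")
```

Every new task means another head, another training loop, and another set of weights to maintain.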
The text-to-text framework, by contrast, uses the same model, the same loss function, and the same hyperparameters across all NLP tasks. The inputs are formatted so that the model can recognize the task, and the output is simply the “text” version of the expected outcome. Refer to the animation above for a clearer view of this.
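Here is a minimal sketch of that unified approach, again assuming the Hugging Face transformers library and the public t5-small checkpoint: one model, one loss, one decoding loop, with the task signaled only by a textual prefix.

```python
# Minimal sketch: in the text-to-text framework, a single model handles every
# task; the task is encoded as a text prefix and the output is always text.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# The same model, loss, and decoding loop serve very different tasks.
prompts = [
    "translate English to German: The house is wonderful.",
    "summarize: MUM is trained across many languages and tasks at once ...",
    "cola sentence: The course is jumping well.",  # grammatical acceptability
]
for prompt in prompts:
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=40)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The prefixes shown (“translate English to German:”, “summarize:”, “cola sentence:”) are conventions T5 was actually trained with; supporting a new task is a matter of choosing a new prefix and expressing the targets as text.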
What is a multimodal model in machine learning?
Multimodal learning is a composite machine learning technique that compares and blends information from multiple sources to form a single response. The “modal” in multimodal refers to a modality: a distinct type of data within media, such as visual data from images and video, language data from text documents, and audio data from music and sound recordings. These modalities are integrated into the training dataset for machine learning models. Multimodal sentiment analysis, for example, can examine various combinations of text, audio, and visual data to evaluate the sentiment towards an event or occurrence. With MUM, Google is treating media as modalities to enhance the user experience with its search.
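As an illustration of the general idea (a toy sketch with hypothetical encoders and dimensions, not MUM’s actual architecture), a late-fusion multimodal sentiment classifier in PyTorch might encode each modality separately and concatenate the embeddings before a shared classification head:

```python
# Toy sketch of late fusion for multimodal sentiment analysis (hypothetical
# architecture for illustration only, not how MUM works internally).
import torch
import torch.nn as nn

class MultimodalSentiment(nn.Module):
    def __init__(self, text_dim=768, image_dim=512, audio_dim=128, n_classes=3):
        super().__init__()
        # Stand-ins for real per-modality encoders (a text transformer, an
        # image CNN, an audio network, ...), reduced here to projections.
        self.text_proj = nn.Linear(text_dim, 256)
        self.image_proj = nn.Linear(image_dim, 256)
        self.audio_proj = nn.Linear(audio_dim, 256)
        self.classifier = nn.Sequential(nn.ReLU(), nn.Linear(3 * 256, n_classes))

    def forward(self, text_emb, image_emb, audio_emb):
        # Fuse modalities by concatenating their projected embeddings.
        fused = torch.cat(
            [self.text_proj(text_emb),
             self.image_proj(image_emb),
             self.audio_proj(audio_emb)],
            dim=-1,
        )
        return self.classifier(fused)  # sentiment logits

model = MultimodalSentiment()
logits = model(torch.randn(1, 768), torch.randn(1, 512), torch.randn(1, 128))
```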
So, back to the original discussion: Google’s MUM will remove the need to carry out the multiple searches that users presently perform in order to compare information and gain deeper insights. It has the ability to understand and surface answers based not just on textual content but also on an understanding of images, videos, and podcasts, in a way that was never attainable before.
It understands 75 different languages, which means it can pool knowledge across languages and serve results that give users the most holistic and exhaustive search experience, answering even the most complicated queries.
Google MUM will redefine search relevance, changing the way people access and use information across the internet. This, however, needs to be taken with a pinch of salt: not all content can be trusted, and in the end it boils down to user discretion.
The MUM update means searches will serve information that provides helpful, related insights, and will reach further for these sources than any other search engine update before it.
Google believes that the MUM update is the answer.
Although the algorithm is in its early days and will continue to see iterations, this certainly looks to be an exciting move that Google is committed to building on. How?
Google intends to follow the below steps in order to ensure they can make it “the world’s best MUM” and remove any machine learning biases:
Human feedback from raters, using the Search Quality Rater Guidelines, will help Google understand how people find information
Similar to the 2019 BERT update, MUM will undergo the same testing process Google applies to its search models
Applying learnings from Google’s latest research on how to reduce the carbon footprint of large neural network training systems, to ensure search continues to function as efficiently as possible
How will search change with the use of MUM?
The objective of MUM is to answer complicated search queries and to fit more of a buyer's journey onto a single SERP. Here are the features we know about so far.
New search options in Google Lens
You’ll soon be able to use Google Lens to snap a photograph and ask a question about what’s in it. Google gives an example where you can take a picture of a shirt and find socks with the same pattern:

Or you can snap a photo of a bike part and ask Google how to repair it or where to buy a replacement. This actually sounds useful to the end user, and I can imagine trying it from time to time.
Larger images in search
Certain types of queries, like searching for inspiration, ideas, and apparel, will trigger image universal blocks in search results:

New recommendation features
Google is adding a new SERP feature called Things to know, and it’s basically an advanced recommendation system. Its function is similar to the People also ask feature: to refine your search and take you further on your buyer/learner journey.
The key difference is that the People also ask feature usually takes you just one or two steps further in your search, while the Things to know feature aims to take you all the way.

Furthermore, Google is adding features called Broaden/Refine this search:

This is broadly equivalent to the existing feature called Related searches, except, again, the recommendations are supposed to be more on point and to offer more directions for additional queries.
No more ten blue links
Well, to be honest, there haven’t been ten blue links for a while now. Search engine results pages (SERPs) are already loaded with rich components that draw attention away from regular snippets. If you search for how to ride a motorcycle, it will take you three full scrolls before you get to the first “organic” search result:

So, even though ten blue links are still present in the SERP, they are no longer the main course.
With the introduction of MUM, the importance of conventional search results will be diminished even further. The upper part of the SERP will become much more engaging, and even fewer clicks will trickle down to the regular snippets.
Conclusion
Are we genuinely headed toward an internet-driven world without barriers? While Google’s MUM seeks to understand more about what we might be looking for than any search engine ever has before, will this open up the search-scape to a truly more worldly experience? We can’t answer all the questions, and there are many still to be asked as the rollout gathers pace. Only time will tell how Google improves MUM in the future. After all, technology and innovation never stand still for long.
References
https://blog.google/products/search/introducing-mum/
https://www.link-assistant.com/news/google-mum-update.html
https://towardsdatascience.com/rip-bert-googles-mum-is-coming-cb3becd9670f
https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html