With the MUM update, Google Lens will now allow you to perform visual searches and ask questions about the image.
Google has been innovating heavily in its search algorithms, and the company has now announced a redesign and new productivity features for Lens. The Multitask Unified Model (MUM), announced at I/O in May this year, brought significant improvements to Search, and that technology is now being applied to Google Lens. Google says the feature is in an early beta phase and will arrive in Google Lens “early next year.”
Google Lens is the company’s image recognition technology, present on almost every Android smartphone. It can simplify everyday tasks such as identifying objects, extracting text from images (OCR), finding similar items, and translating languages in real time.
For example, you can point your camera at an item, tap the Lens icon, and either search for similar products to buy or ask questions about it, such as how to fix it. Google aims to use these AI advances to make its products more useful, and by making the phone’s camera part of Search, it hopes to stay relevant in a market where many of its core use cases are shifting to other platforms. Google says the Lens update will roll out in the months ahead, noting that it still needs to go through rigorous testing and evaluation.