Google marks 25 years with a Doodle: Fun facts about the search giant

Google’s journey over the past two and a half decades reflects the lasting impact it has had on the way we use and engage with technology

Marisha Singh

Alphabet-owned Google marks 25 years on September 27. The search engine is celebrating its anniversary with its users by – you guessed it – a Doodle that takes you 25 years down memory lane, along with a screen full of confetti.

Go ahead and type ‘Google 25 years’ or even ‘happy birthday’ to get showered in confetti coloured in the search platform’s blue, red, yellow and green. Google’s CEO Sundar Pichai wrote earlier this month, “It’s an enormous privilege to reach this milestone — and we haven’t done it alone. We may be a technology company, but Google is what it is today because of people: Our employees, our partners, and most importantly, all the people who use our products.”

History

It was over 25 years ago that Larry Page and Sergey Brin launched Google Search from a small garage in a California suburb. The search platform was incorporated in 1998 and has since grown into one of the world’s biggest technology companies.

Its parent company Alphabet reached a $1tn valuation in 2020, becoming the fourth US company to achieve this milestone.

Today, Google is a tech behemoth with offices and data centers on six continents, in over 200 cities. Additionally, it commands over 86 per cent of the world’s search traffic on the internet, according to Statista.

The company has shared some of the most pivotal moments in its journey over the past two and a half decades.

Here are some of the most interesting facts about Google

2001: Google Images

When Jennifer Lopez attended the 2000 Grammy Awards, her daring Versace dress became an instant fashion legend — and the most popular query on Google at the time. Back then, search results were just a list of blue links, so people couldn’t easily find the picture they were looking for. This gave birth to Google Images.

2001: “Did you mean?”
“Did you mean,” which offers suggested spelling corrections, was one of Google’s first applications of machine learning. Previously, if a search had a misspelling, Google would serve up pages with the same misspelt word. Over the years, Google has integrated new AI-powered techniques to make its search more powerful and accurate.

2002: Google News
During the tragic events of September 11, 2001, people struggled to find timely information in Search. The need for real-time news helped Google create Google News the following year, with links to a diverse set of sources for any given story.

2003: Easter eggs
Googlers have developed many clever Easter eggs hidden in Search over the years. In 2003, one of the first Easter eggs gave the answer to life, the universe and everything. One of Google’s earliest Easter eggs is still available on Search.

2004: Autocomplete
Google’s Autocomplete was launched to help match searches to what people were thinking. First introduced as “Google Suggest”, it automatically predicts queries in the search bar as you start typing. Google says Autocomplete reduces typing by 25 per cent on average and saves an estimated 200-plus years of typing time per day.

2004: Local information
People used to rely on traditional phone books, or ‘Yellow Pages’, for business information. The web paved the way for local discovery, like “pizza in Chicago” or “haircut 75001.” In 2004, Google Local added relevant information to business listings, like maps, directions and reviews. In 2011, a click-to-call feature was added on mobile, making it easy to get in touch with businesses. On average, local results in Search drive more than 6.5 billion connections for businesses every month, including phone calls, directions, food orders and reservations.

2006: Google Translate
Google researchers started developing machine translation technology in 2002 to tackle language barriers online. Four years later, Google Translate was launched with text translations between Arabic and English. Today, Google Translate supports more than 100 languages, with 24 added in 2022 alone.

2006: Google Trends
Google Trends was built to help understand search trends using aggregated data. It was used to create the platform’s annual Year in Search. Today, Google Trends is the world’s largest free dataset of its kind, enabling journalists, researchers, scholars and brands to learn how searches change over time.

2007: Universal Search
Google expanded the idea of search to include relevant information across formats, such as links, images, videos and local results, redesigning its systems to return all content types at once.

2008: Google Mobile App
The arrival of Apple’s App Store set the stage for the launch of the first Google Mobile App on iPhone. It had features such as Autocomplete and “My Location”, which made searching easier with fewer key presses and were especially helpful on smaller screens.

2008: Voice search
In 2008, Google introduced the ability to search by voice on the Google Mobile App, and expanded it to the desktop version in 2011. The Voice Search feature allowed people to search by voice at the touch of a button. Google says voice search is particularly popular in India, where the share of users making daily voice queries is nearly twice the global average.

2009: Emergency hotlines
Google’s emergency hotline feature came from a suggestion by a mother who had a hard time finding poison control information after her daughter swallowed something potentially dangerous. This led to a box for the poison control hotline at the top of the search results page. Since then, Google has elevated emergency hotlines for other critical moments of need, such as suicide prevention.

2011: Search by image
Google innovated further on image search with ‘Search by Image’, which allowed users to upload any picture or image URL to find out what it shows and where else that image appears on the web. This update paved the way for Lens later on.

2012: Knowledge graph
Google brought in the Knowledge Graph, a vast collection of people, places and things in the world and how they’re related to one another, to make it easier to get quick answers. Knowledge Panels, the first feature powered by the Knowledge Graph, provide a quick snapshot of information about topics like celebrities, cities and sports teams.

2015: Popular times
The Popular Times feature in Search and Maps helps people see the busiest times of day when they search for places like restaurants, stores and museums.

2016: Discover
With the launch of a personalised feed (now called Discover), Google began surfacing content tailored to each user’s interests in the Google app, without the need to search.

2017: Google Lens
Google Lens turned the phone camera into a search tool. Lens works by looking at objects in a picture, comparing them to other images, and ranking those images by their similarity and relevance to the original. Now you can search what you see in the Google app, and Lens handles more than 12 billion visual searches per month.

2018: Flood forecasting
To help people better prepare for impending floods, Google built AI forecasting models that predict when and where devastating floods will occur. These efforts began in India and now provide flood warnings in 80 countries.

2019: BERT
A big part of what makes Search helpful is its ability to understand language. In 2018, Google introduced and open-sourced BERT (Bidirectional Encoder Representations from Transformers), a neural network-based technique for training language understanding models. BERT makes Search more helpful by considering the full context of a word. After rigorous testing in 2019, Google applied BERT to more than 70 languages.

2020: Shopping graph
Online shopping became a whole lot easier and more comprehensive when Google made it free for any retailer or brand to show their products on Google. The company also introduced the Shopping Graph, an AI-powered dataset of constantly updating products, sellers, brands, reviews and local inventory that today consists of 35 billion product listings.

2020: Hum to search
Google launched Hum to Search in its app, to help find a song based on the tune stuck in your head. The machine learning feature identifies potential song matches after you hum, whistle or sing a melody.

2021: About this result
To help people make more informed decisions about which results will be most useful and reliable for them, Google added “About this result” next to most search results. It explains why a result is being shown and gives more context about the content and its source, based on best practices from information literacy experts. About this result is now available in all languages where Search is available.

2022: Multisearch
To help uncover the information you’re looking for, Google created an entirely new way to search with text and images simultaneously through Multisearch. First launched in the US, Multisearch is now available globally on mobile, in all languages and countries where Lens is available.

2023: Search Labs and Search Generative Experience (SGE)
Every year in Search, Google conducts hundreds of thousands of experiments to figure out how to make the platform more helpful for users. With Search Labs, users can test early-stage experiments and share feedback directly with the teams working on them. The first experiment, SGE, brings the power of generative AI directly into Search. You can get the gist of a topic with AI-powered overviews, pointers to explore more and natural ways to ask follow-ups.

The search engine has evolved over the past two and a half decades to help billions of users.

