Google is giving free access to two of Gemini’s best AI features

Gemini Advanced on the Google Pixel 9 Pro Fold.
Andy Boxall / Digital Trends

Google’s Gemini AI has steadily made its way to the best of its software suite, from native Android integrations to interoperability with Workspace apps such as Gmail and Docs. However, some of the most advanced Gemini features have remained locked behind a subscription paywall.

That changes today. Google has announced that Gemini Deep Research will now be available for all users to try, alongside the ability to create custom Gem bots. You no longer need a Gemini Advanced (or Google One AI Premium) subscription to use these tools.

The best of Gemini as an AI agent

Deep Research is an agentic tool that handles web research on your behalf, saving you the hassle of visiting one web page after another in search of relevant information. You simply enter a natural language query and, if needed, specify the sources it should draw from.

Using Gemini Deep Research on a smartphone.
Nadeem Sarwar / Digital Trends

Deep Research breaks the query down into multiple stages and seeks approval of its final research plan before jumping into action. After completing the research, which usually takes a few minutes, it presents a neatly formatted document organized with headings, tables, bullet points, and other relevant stylistic elements.

It’s a fantastic tool for conducting research as a student, journalist, financial planner, academic, and more. I have used this feature extensively for digging into scientific papers, and it has been so helpful that I pay for a Gemini Advanced subscription solely to access Deep Research.

“Now anyone will be able to try it across the globe and in 45+ languages,” writes Dave Citron, Senior Director of Product Management for the Gemini app. Aside from giving free access to all users, Google is also upgrading the underlying infrastructure to the more advanced Gemini 2.0 Flash Thinking Experimental AI model.

Response provided by Gemini Deep Research.
Gemini serves your answers in a report that looks like this. Nadeem Sarwar / Digital Trends

Do keep in mind that you won’t get unlimited access, since Deep Research is a very compute-intensive process. Google says free users can try it a “few times per month.”

The strategy is not too different from what Perplexity offers with its own Deep Research tool. OpenAI chief Sam Altman has also confirmed that free ChatGPT users will be able to launch Deep Research queries twice a month.

Creating custom versions of Gemini

Another freebie Google announced today is Gems. These are essentially custom chatbots that can be set up to perform a specific task, from drafting detailed email responses with a simple “yes” or “no” as input to acting as a coding assistant. Users can create one that best suits their workflow.

Interacting with a custom Gem created with Gemini.
Screenshot Google

The best part is that you don’t need any coding knowledge to create a personalized Gem for your daily use, as all the operational instructions can be given in natural language sentences. Until now, the ability to create Gems has been limited to paying users.

Now, Gems are rolling out widely to all Gemini users, with no subscription required. Gems can be used for free in the Gemini mobile app, but to create them, you need to visit Gemini on the desktop web. The behavior of a Gem can also be tweaked later on.

Just like the regular Gemini assistant, Gems can also process data based on files uploaded by users. I have created a handful of Gems, which take the drudgery out of boring tasks and save me a lot of time.
