Chrome’s Gemini Nano AI Model Uses Up to 4 GB Locally, Can Be Disabled to Free Space
Chrome may download a 4 GB Gemini Nano AI model for on‑device features; users can turn it off in Settings to reclaim storage space.
TL;DR
Chrome can download a 4 GB Gemini Nano AI model for on‑device assistance, but disabling the feature in Settings frees the space.
Chrome now downloads a local AI model when users enable on‑device features. The model, known as Gemini Nano, stores its weights in a file called weights.bin, which can occupy as much as 4 GB of disk space.
The Gemini Nano model powers three core functions: writing assistance, autocomplete suggestions, and fraud protection. By running locally, the model keeps user data off the cloud, which improves privacy but adds a storage cost.
The model appears in Chrome’s system files under an OptGuideOnDeviceModel folder. Users can confirm it is installed by browsing to that folder in a file manager.
If the storage impact is undesirable, Chrome offers a simple reversal. Navigating to Settings > System and turning off the on‑device AI toggle stops the download and removes the weights.bin file, instantly reclaiming up to 4 GB of space.
What it means
For users with limited SSD capacity, the hidden 4 GB can be significant. The ability to disable the feature gives control back to the user without sacrificing Chrome’s core browsing capabilities. Enterprises concerned about data residency may also appreciate the option to keep AI processing local while still having a quick way to remove it.
Developers should note that the on‑device model is optional; the browser will continue to function with cloud‑based AI services if the local model is disabled. This flexibility allows Chrome to cater to both privacy‑focused users and those who prioritize storage efficiency.
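The fallback decision above can be sketched as a small feature-detection routine. This assumes the shape of Chrome’s experimental built‑in Prompt API (a global `LanguageModel` with an `availability()` method), which is behind flags or origin trials and has changed across releases, so treat the API surface here as an assumption to verify against current Chrome documentation.

```javascript
// Pure helper: map an availability status to a backend choice.
// Only "available" means the local Gemini Nano weights are downloaded
// and ready; anything else falls back to a cloud service.
function chooseBackend(availability) {
  return availability === "available" ? "on-device" : "cloud";
}

// Feature-detect the on-device model. The `LanguageModel` global is an
// assumption based on Chrome's experimental Prompt API; it is absent
// entirely when on-device AI is disabled or unsupported.
async function getBackend() {
  if (typeof LanguageModel === "undefined") return "cloud";
  return chooseBackend(await LanguageModel.availability());
}
```

Because the check degrades to `"cloud"` whenever the global is missing, the same code path works for users who have disabled the feature to reclaim storage.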
Looking ahead
Watch for updates from Google on whether future AI models will require less local storage or offer granular control over which components are installed.