
Run Gemini Nano AI Locally in Chrome Canary: Step‑by‑Step Guide

This tutorial walks you through downloading Chrome Canary, enabling the Gemini Nano Prompt API and optimization-guide flags, installing the ~1.5 GB on-device model, and using the window.ai JavaScript API to create sessions and send prompts. It also covers performance, limitations, and the benefits of offline AI.


Setup

Download Chrome Canary from the official website.

Enable the Prompt API flag: open chrome://flags, search for "prompt API", and set "Prompt API for Gemini Nano" to Enabled.

Enable the optimization guide flag: open chrome://flags, search for "optimization guide on", and set "Enables optimization guide on device" to "Enabled BypassPerfRequirement".

Install the model file (≈1.5 GB): go to chrome://components/, find "Optimization Guide On Device Model", and click "Check for update" to start the download.

Restart Chrome Canary to apply the changes.

Using window.ai

Open DevTools (F12) and switch to the Console tab. Verify the API is available by typing window.ai and checking for the object.

canCreateGenericSession and canCreateTextSession indicate whether the required features are ready.

createGenericSession and createTextSession create sessions for interacting with the model.

defaultGenericSessionOptions and defaultTextSessionOptions return the default option objects for the two creation functions.
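The functions above can be combined into a small readiness check; the sketch below (a hypothetical helper, assuming the experimental window.ai API in Chrome Canary) gates session creation on the capability status and reuses the default options. Return values and option names may change between builds.

```javascript
// Sketch: create a text session only when the on-device model is ready,
// reusing the API's own defaults. Assumes the experimental window.ai API.
async function createSessionIfReady() {
  const status = await window.ai.canCreateTextSession();
  // Observed values in current builds include 'readily', 'after-download', and 'no'.
  if (status !== 'readily') return null;
  const defaults = await window.ai.defaultTextSessionOptions();
  console.log('defaults:', defaults); // e.g. sampling options such as topK/temperature
  return window.ai.createTextSession({ ...defaults });
}
```

If the check returns "after-download", revisit the chrome://components step above and wait for the model download to finish.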

To start a session:

const chatSession = await window.ai.createTextSession();

Then send a prompt:

const result = await chatSession.prompt('hi, what is your name?');

Remember to use await for all asynchronous calls. Response time varies from milliseconds to seconds depending on hardware and prompt complexity.
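You can measure that variation yourself. The sketch below (timedPrompt is a hypothetical helper, assuming a session created with window.ai.createTextSession() as above) logs the round-trip time for a single prompt:

```javascript
// Sketch: time one prompt round-trip against a session object.
// Hypothetical helper; pass a session created with window.ai.createTextSession().
async function timedPrompt(session, text) {
  const t0 = performance.now();
  const reply = await session.prompt(text);
  console.log(`answered in ${Math.round(performance.now() - t0)} ms`);
  return reply;
}
```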

For convenience, you can define a helper function:

async function askLocalGPT(promptText) {
  if (!window.chatSession) {
    console.log('starting chat session');
    window.chatSession = await window.ai.createTextSession();
    console.log('chat session created');
  }
  return console.log(await window.chatSession.prompt(promptText));
}

Invoke it in the console with askLocalGPT("your prompt"). Note that the current API does not retain conversation context, and parameters such as systemPrompt and initialPrompts appear to have no effect.
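Since context is not retained, one workaround is to replay the transcript in every prompt. The sketch below (chatWithMemory and buildPrompt are hypothetical helpers, assuming the experimental window.ai API) keeps a history array and prepends it to each request:

```javascript
// Sketch: emulate multi-turn memory by replaying the transcript in every prompt,
// since the current experimental API does not retain context between calls.
const history = [];

// Build a single prompt string from the transcript plus the new user turn.
function buildPrompt(history, userText) {
  return [...history, `User: ${userText}`, 'Assistant:'].join('\n');
}

async function chatWithMemory(userText) {
  const session = await window.ai.createTextSession();
  const reply = await session.prompt(buildPrompt(history, userText));
  history.push(`User: ${userText}`, `Assistant: ${reply}`);
  return reply;
}
```

Keep in mind the model's context window is small, so a long transcript will eventually need to be truncated or summarized.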

Evaluation

The local AI runs reasonably well, with average response speed. It is not a replacement for cloud models like Claude or ChatGPT, but it is excellent for experimentation and offline use. Key benefits:

Local processing of sensitive data.

Potentially faster results without server round‑trips.

Offline capability.

Reduced API costs for web developers.

Why Local AI Matters

When AI becomes universally available in browsers, new possibilities emerge:

AI‑enhanced content consumption : summarization, Q&A, classification, and tagging.

AI‑assisted content creation : writing assistance, proofreading, grammar correction, and rephrasing.
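As a concrete example of AI-enhanced content consumption, the sketch below (summarizePage is a hypothetical helper, assuming window.ai is available with the flags above enabled) summarizes the visible text of the current page:

```javascript
// Sketch: summarize the current page's text with the local model.
// Hypothetical helper; assumes window.ai in Chrome Canary and a DOM.
async function summarizePage(maxChars = 4000) {
  // Truncate to keep the prompt within the small on-device context window.
  const text = document.body.innerText.slice(0, maxChars);
  const session = await window.ai.createTextSession();
  return session.prompt(`Summarize the following page in three bullet points:\n\n${text}`);
}
```

Because everything runs on-device, the page content never leaves the machine, which is exactly the privacy benefit noted above.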

Tags: javascript, Local AI, Chrome Canary, Gemini Nano, Prompt API, window.ai
Written by

Code Mala Tang

Read source code together, write articles together, and enjoy spicy hot pot together.
