How I Used Gemini CLI to Turn a Figma URL into a Next.js Project

I’ve been exploring ways to build a web app from Figma without doing everything manually. My goal was simple: find an automated workflow that doesn’t lock me into expensive subscriptions or heavy UI tools.

I tried a few common approaches first:

  • Figma plugins – most of the decent ones require a paid plan.
  • v0.dev – not bad, but Figma import is limited to premium users, and it only supports one page at a time.

That got me thinking:
👉 What if I just give the Figma file directly to an AI CLI and let it generate the project?

Using Gemini CLI Instead of UI Tools

I didn’t want to burn paid API credits, so I decided to try Gemini CLI with the Gemini 3 model.

Here’s what I did:

  1. Pasted the Figma URL directly into Gemini CLI
  2. Gemini told me it couldn’t access the file without permission
  3. I generated a Figma personal access token and provided it
  4. Gemini successfully read the design and generated all pages, not just one

To my surprise, it worked end to end.
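Under the hood, handing over that token almost certainly means the CLI reads the file through Figma's REST API. Here's a minimal sketch of the first step: extracting the file key from a pasted URL and building the authenticated request headers. The function names are my own; the `X-Figma-Token` header and the `/file/`–`/design/` URL shapes are Figma's.

```typescript
// Sketch of how a tool can turn a pasted Figma URL into an API call.
// The file key is the path segment right after /file/ or /design/.
function figmaFileKey(url: string): string {
  const m = url.match(/figma\.com\/(?:file|design)\/([A-Za-z0-9]+)/);
  if (!m) throw new Error(`Not a Figma file URL: ${url}`);
  return m[1];
}

// Figma's REST API authenticates personal access tokens
// via the X-Figma-Token request header.
function figmaHeaders(token: string): Record<string, string> {
  return { "X-Figma-Token": token };
}

// Usage: GET https://api.figma.com/v1/files/<key> with these headers
// returns the full document tree that the model can then read.
const key = figmaFileKey("https://www.figma.com/design/AbC123xyz/landing-page");
console.log(key); // → AbC123xyz
```

With the full document tree in hand, the model can see every page in the file at once, which is why it wasn't limited to one page the way v0.dev's import is.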

Results

The output isn’t perfect, but for a free CLI-based workflow, it’s honestly impressive. Layouts, structure, and page separation were all handled automatically.

Thoughts

This experiment convinced me that:

  • AI CLIs are becoming a real alternative to Figma-to-code plugins
  • Giving the model direct design access (via tokens) makes a huge difference
  • Even with a free model, the results are already usable as a starting point

I suspect that running the same workflow with a stronger model like Opus 4.6 or GPT-5.3 would give even better results: cleaner components, better semantics, and fewer manual fixes.

For now, though, this feels like a solid, low-cost way to bootstrap a Next.js project straight from Figma.

If you’re trying to automate design-to-code without paying for yet another SaaS tool, this approach is definitely worth a try.

One more pleasant surprise: the original Figma design was desktop-only, but Gemini still generated a layout that works well on mobile screens too 📱✨
That responsive behavior wasn’t something I asked for explicitly, so it felt like a nice bonus 🎁. The model seems to have inferred reasonable breakpoints and layout adjustments on its own 🤖💡
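To make "inferred breakpoints" concrete, the generated output contained roughly this kind of pattern. This is an illustrative reconstruction, not the actual generated code; the class name and the 768px breakpoint are my stand-ins:

```css
/* Desktop design: a three-column feature grid. */
.features {
  display: grid;
  grid-template-columns: repeat(3, 1fr);
  gap: 1rem;
}

/* The model added a collapse to one column on narrow screens,
   even though the Figma file only specified the desktop layout. */
@media (max-width: 768px) {
  .features {
    grid-template-columns: 1fr;
  }
}
```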