Hands‑On Review of LM Studio: Install, Run, and Evaluate Open‑Source LLMs on Windows

This article walks through installing LM Studio on a Windows PC, downloading models from Hugging Face, using the AI Chat interface (including a Codellama‑generated Snake game), measuring resource usage, exploring the built‑in OpenAI‑compatible API, and summarizing its strengths and limitations.

Installation on Windows

Because the author’s MacBook has an Intel CPU rather than Apple Silicon, the setup was performed on a Windows desktop instead. LM Studio is cross‑platform and needs only the application itself; no additional language runtimes have to be installed.

Downloading Models

All of the models are pulled from Hugging Face, so users in regions where the site is blocked will need a VPN or similar workaround. Several models were downloaded for testing, as shown in the accompanying screenshots.

Running the Application

After launching LM Studio, click the AI Chat button and select a model to start chatting. The interface is smooth and responsive, noticeably faster than a comparable local Ollama setup on the author’s MacBook, with minimal perceived latency.

Switching to the codellama model, the author prompted it to write a Snake game, and the generated code was displayed in the UI screenshots.
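Since the screenshots are not reproduced here, the sketch below approximates the kind of terminal Snake game that a model like codellama typically produces for this prompt. It is not the exact code generated in the test; it assumes Python's standard curses module (on Windows this additionally requires the third-party windows-curses package).

import curses
import random


def main(stdscr):
    # Basic terminal setup: hide the cursor and make getch() wait at most 120 ms per tick.
    curses.curs_set(0)
    stdscr.timeout(120)

    height, width = stdscr.getmaxyx()
    # Snake starts near the middle, three segments long, moving right; snake[0] is the head.
    snake = [(height // 2, width // 4 + 2), (height // 2, width // 4 + 1), (height // 2, width // 4)]
    direction = curses.KEY_RIGHT
    food = (height // 2, width // 2)
    score = 0

    for y, x in snake:
        stdscr.addch(y, x, "#")
    stdscr.addch(food[0], food[1], "*")

    while True:
        key = stdscr.getch()
        # Only arrow keys change direction; any other key (or no key) keeps the current one.
        if key in (curses.KEY_UP, curses.KEY_DOWN, curses.KEY_LEFT, curses.KEY_RIGHT):
            direction = key

        head_y, head_x = snake[0]
        if direction == curses.KEY_UP:
            head_y -= 1
        elif direction == curses.KEY_DOWN:
            head_y += 1
        elif direction == curses.KEY_LEFT:
            head_x -= 1
        else:
            head_x += 1
        new_head = (head_y, head_x)

        # Game over when the snake hits the screen edge or itself.
        if head_y in (0, height - 1) or head_x in (0, width - 1) or new_head in snake:
            break

        snake.insert(0, new_head)
        if new_head == food:
            # Food eaten: grow by one and place new food on a free cell away from the edges.
            score += 1
            while True:
                food = (random.randint(1, height - 2), random.randint(1, width - 2))
                if food not in snake:
                    break
            stdscr.addch(food[0], food[1], "*")
        else:
            # Normal move: erase the tail cell so the snake keeps the same length.
            tail_y, tail_x = snake.pop()
            stdscr.addch(tail_y, tail_x, " ")

        stdscr.addch(new_head[0], new_head[1], "#")

    # Show the final score and wait for a key press before exiting.
    stdscr.timeout(-1)
    stdscr.addstr(height // 2, max(0, width // 2 - 10), f"Game over! Score: {score}")
    stdscr.getch()


if __name__ == "__main__":
    # On Windows, install the windows-curses package first: pip install windows-curses
    curses.wrapper(main)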

Resource Consumption

While the model was generating, CPU and RAM consumption were fairly high while GPU utilization stayed low, as shown in the resource‑monitor screenshot, which suggests inference was running mostly on the CPU.

Local Service API

LM Studio can also expose the loaded model through a local server that implements an OpenAI‑compatible API, and the app provides example snippets that can be copied for quick local calls. This lets developers point existing OpenAI‑style code at the locally hosted models and integrate them into their own applications.
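The snippets shown inside the app are not reproduced here; the sketch below is a typical call against an OpenAI-compatible local endpoint using the official openai Python client. The port (1234 is the default LM Studio usually suggests), the placeholder API key, and the model name are assumptions; use whatever the local server panel in your installation actually reports.

# Minimal sketch of calling a locally hosted model through LM Studio's
# OpenAI-compatible server. Assumptions: the server is running on
# localhost:1234 (the usual default) and a model is already loaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # point the client at the local server, not api.openai.com
    api_key="lm-studio",                  # the local server ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model="local-model",  # placeholder name; the server answers with whichever model is loaded
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what GGUF quantization is in two sentences."},
    ],
    temperature=0.7,
)

print(response.choices[0].message.content)

Because the endpoint mirrors OpenAI's chat-completions schema, existing code written against the hosted OpenAI API can usually be redirected by changing only the base URL.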

Multi‑Model Mode

The tool also supports a “Multi Model” mode, allowing multiple models to run simultaneously; however, the author did not test this due to RAM constraints.

Pros

Cross‑platform installation with a clean, user‑friendly interface.

No need to install separate language runtimes; the app works out‑of‑the‑box.

Fast response speed; automatically recommends model sizes suitable for the hardware.

Access to a wide range of models from Hugging Face, provided they are in GGUF format.

Cons

Requires VPN access to download models from abroad, which can be inconvenient for users in regions with restricted internet.

Only models hosted on the Hugging Face Hub are supported; integration with domestic model hubs is lacking.

Missing advanced features such as ingesting PDFs, Markdown, TXT, or video files to build knowledge bases; the tool currently offers only chat capabilities.

Geared toward beginners; advanced users may prefer scripting with Python for custom functionality.

Conclusion

Overall, LM Studio provides a straightforward way to experiment with open‑source large language models locally, especially for users seeking a quick, GUI‑based experience. While it has some limitations regarding model sources and advanced features, it remains a valuable tool for initial exploration.

Tags: model deployment, Windows, open-source LLM, AI chat, Hugging Face, LM Studio
Written by Eric Tech Circle, a backend team lead and architect with 10+ years of experience, full-stack engineer, sharing insights and solo development practice.
