Continue.dev + ModelsLab: Use Any LLM in Your IDE (PR #10620)

Adhik Joshi · 3 min read · API


Continue.dev — the open-source AI coding assistant with 31,000+ GitHub stars — now supports ModelsLab as a native LLM provider through PR #10620. This means developers using Continue in VS Code or JetBrains can connect to ModelsLab's full model library directly from their IDE.

What Is Continue.dev?

Continue is the leading open-source AI coding extension for VS Code and JetBrains IDEs. It handles code completion, chat, edit commands, and agentic coding workflows — and unlike GitHub Copilot, you choose which LLM powers it. OpenAI, Anthropic, Ollama, and dozens of other providers are supported via a plugin architecture.

With ModelsLab now added as a provider, you can run Continue against ModelsLab's OpenAI-compatible API — giving you access to fine-tuned coding models, custom deployments, and a cost-effective alternative to the major providers.

Why ModelsLab as a Continue.dev LLM Provider?

ModelsLab's API is OpenAI-compatible. The Continue integration works by extending the OpenAI provider class with ModelsLab's API base URL — meaning all OpenAI-compatible models on ModelsLab become accessible with zero additional configuration overhead.
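Because the API speaks the OpenAI wire format, a raw chat-completions call can be sketched with nothing but the standard library. The base URL below comes from this article; the `/chat/completions` suffix is the standard OpenAI-compatible route and is an assumption here, as is the model slug.

```python
import json

# Base URL from this article; the /chat/completions suffix is the standard
# OpenAI-compatible route and is an assumption, not confirmed by the PR.
API_BASE = "https://modelslab.com/api/v6/llm"

def build_chat_request(model: str, messages: list[dict], api_key: str):
    """Assemble (url, headers, body) for an OpenAI-style chat completion."""
    url = f"{API_BASE}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

url, headers, body = build_chat_request(
    "llama-3-8b-instruct",
    [{"role": "user", "content": "Write a Python hello world."}],
    "YOUR_MODELSLAB_API_KEY",
)
```

Any OpenAI-compatible client works the same way: point it at the ModelsLab base URL and everything else stays unchanged.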

Key advantages:

  • Cost — ModelsLab API pricing is typically lower than direct OpenAI or Anthropic access
  • Model variety — access fine-tuned coding models alongside general-purpose LLMs
  • Privacy — control where your code context goes by choosing self-hosted or ModelsLab-routed models
  • No vendor lock-in — swap models without changing your IDE workflow

Setup: Continue.dev + ModelsLab

Once the PR merges, configuration is straightforward. Add ModelsLab to your Continue config file (~/.continue/config.yaml or via the IDE extension settings):

models:
  - name: ModelsLab Llama 3
    provider: modelslab
    model: llama-3-8b-instruct
    apiKey: YOUR_MODELSLAB_API_KEY
    apiBase: https://modelslab.com/api/v6/llm

Or in JSON format (~/.continue/config.json):

{
  "models": [
    {
      "title": "ModelsLab Llama 3",
      "provider": "modelslab",
      "model": "llama-3-8b-instruct",
      "apiKey": "YOUR_MODELSLAB_API_KEY",
      "apiBase": "https://modelslab.com/api/v6/llm"
    }
  ]
}

Get your API key at modelslab.com/api-key.
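To catch typos before Continue loads the file, a quick stdlib sanity check of the JSON config is cheap. This is only a sketch; Continue does its own validation, and the field names simply mirror the example above.

```python
import json

# Field names mirror the JSON config example in this article.
REQUIRED_FIELDS = {"title", "provider", "model", "apiKey", "apiBase"}

def check_continue_config(text: str) -> list[str]:
    """Return a list of problems found in a Continue JSON config string."""
    problems = []
    cfg = json.loads(text)
    for i, entry in enumerate(cfg.get("models", [])):
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            problems.append(f"models[{i}] missing: {sorted(missing)}")
        if entry.get("provider") == "modelslab" and \
                "modelslab.com" not in entry.get("apiBase", ""):
            problems.append(f"models[{i}]: apiBase does not point at modelslab.com")
    return problems

sample = """
{
  "models": [
    {
      "title": "ModelsLab Llama 3",
      "provider": "modelslab",
      "model": "llama-3-8b-instruct",
      "apiKey": "YOUR_MODELSLAB_API_KEY",
      "apiBase": "https://modelslab.com/api/v6/llm"
    }
  ]
}
"""
issues = check_continue_config(sample)
```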

Available Models for Coding

ModelsLab hosts a range of models suitable for code-heavy workloads:

  • Llama 3.1 70B Instruct — strong general + code performance, good context window
  • DeepSeek Coder V2 — purpose-built coding model, strong at completion and refactoring
  • Qwen2.5 Coder 32B — excellent at code generation across 40+ languages
  • Mistral 7B Instruct — fast, lightweight, ideal for quick completions

Browse the full catalog at modelslab.com/models.
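If you want different models for different tasks, Continue's config accepts multiple entries. A sketch in the same YAML format as above (the model slugs are guessed from the list here and may differ from ModelsLab's exact identifiers):

```yaml
models:
  - name: DeepSeek Coder V2 (refactoring)
    provider: modelslab
    model: deepseek-coder-v2      # slug is an assumption; check the catalog
    apiKey: YOUR_MODELSLAB_API_KEY
    apiBase: https://modelslab.com/api/v6/llm
  - name: Mistral 7B (fast completions)
    provider: modelslab
    model: mistral-7b-instruct    # slug is an assumption; check the catalog
    apiKey: YOUR_MODELSLAB_API_KEY
    apiBase: https://modelslab.com/api/v6/llm
```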

Using Continue.dev for Code Completion and Chat

Once configured, all Continue.dev features work with your ModelsLab-powered model:

# Example: Ask Continue.dev to refactor a function
# Press Cmd+I in VS Code, then type:
# "Refactor this to use async/await and add type hints"

import requests

def fetch_user_data(user_id):
    response = requests.get(f"https://api.example.com/users/{user_id}")
    return response.json()

Continue will use your ModelsLab model to produce the refactored version inline, with the same UX you'd get from any other provider.
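For illustration, here is one plausible refactor a model might return for the function above. It sticks to the standard library so the sketch stays self-contained; an `httpx` or `aiohttp` version would be more typical in production.

```python
import asyncio
import json
import urllib.request
from typing import Any

async def fetch_user_data(user_id: int) -> dict[str, Any]:
    """Fetch a user record without blocking the event loop."""
    url = f"https://api.example.com/users/{user_id}"

    def _get() -> dict[str, Any]:
        # urllib is blocking, so it runs in a worker thread below.
        with urllib.request.urlopen(url) as resp:
            return json.loads(resp.read())

    return await asyncio.to_thread(_get)
```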

Current Status

The PR is open and under review. The implementation extends Continue's OpenAI provider class with apiBase pointing to ModelsLab's endpoint — a clean, minimal change that leverages the existing OpenAI-compatible infrastructure.

To track merge status: github.com/continuedev/continue/pull/10620

Related Integrations

ModelsLab has been expanding rapidly into developer tooling, with a growing list of integrations beyond Continue.dev.

Each integration makes ModelsLab models accessible in more developer workflows. The trend is clear: ModelsLab is becoming the API layer that powers the open-source AI tooling ecosystem.

Get Started

1. Install Continue.dev from the VS Code marketplace
2. Create a free ModelsLab account and get your API key
3. Add the config above to ~/.continue/config.json
4. Start coding with your ModelsLab-powered AI assistant

Questions? The ModelsLab docs and Discord community are good starting points.
