
Local LLM Connector Node

The Local LLM Connector node enables you to connect to external LLM API endpoints from various providers. It lets you use AI models from services such as DeepSeek, OpenAI, and Google AI Studio, or from any custom provider, by specifying their API URLs and configurations.

Local LLM Connector node


Basic Usage

Combine the Text, Local LLM Connector, and Display Text nodes in your flow to connect to external LLM API services.


Inputs

The Local LLM Connector node accepts the following inputs:

The Bot Name

  • Type: Text name for the bot
  • Mandatory: Optional
  • Works best with: Text node, Text Input

Define a custom name for your external LLM bot that will be displayed to users.

Icon Image

  • Type: Image for the bot's avatar
  • Mandatory: Optional
  • Works best with: Image Input, File Upload

Upload a custom icon or avatar image to represent your bot visually.

Bot Introduction - First Message

  • Type: Text for the bot's initial greeting
  • Mandatory: Optional
  • Works best with: Text node, Text Input

Define the first message users see when they start a conversation with the bot.

Additional System Prompt

  • Type: Text to supplement the bot's default behavior
  • Mandatory: Optional
  • Works best with: Text node, Text Input

Add instructions to guide how the bot should respond without replacing the default system prompt.

First User Message

  • Type: Text for the initial user message
  • Mandatory: Optional
  • Works best with: Text node, Text Input

Pre-populate the first message from the user to start the conversation.

Placeholder

  • Type: Text shown in the input field
  • Mandatory: Optional
  • Works best with: Text node, Text Input

Define placeholder text that appears in the user's message input field.


Outputs

Full Chat Log

  • Type: Complete conversation history
  • Works best with: Display Text, Document Download, AI General Prompt

Captures the entire conversation between the user and the external LLM bot, useful for analysis, record-keeping, or further processing.
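The exact structure of the Full Chat Log output is not specified on this page. Assuming it arrives as an OpenAI-style list of role/content messages (an assumption to verify in your own flow), a downstream step could process it like this sketch:

```python
# Sketch: summarizing a Full Chat Log, ASSUMING it is delivered as an
# OpenAI-style list of {"role": ..., "content": ...} messages.
# Check the actual output format in your flow before relying on this.

def summarize_chat_log(chat_log):
    """Count messages per role and return the last assistant reply."""
    counts = {}
    last_reply = None
    for msg in chat_log:
        counts[msg["role"]] = counts.get(msg["role"], 0) + 1
        if msg["role"] == "assistant":
            last_reply = msg["content"]
    return counts, last_reply

log = [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi, how can I help?"},
]
counts, last = summarize_chat_log(log)
```

A summary like this could feed the AI General Prompt node for post-conversation analysis.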


Configuration

URL

Specify the API endpoint URL of the external LLM service:

  • Format: http:// or https:// followed by the API domain
  • Example: http://127.0.0.1, https://api.openai.com, https://api.deepseek.com
  • Description: The base URL of the external LLM API service

Common API URLs:

  • DeepSeek: https://api.deepseek.com
  • OpenAI: https://api.openai.com
  • Google AI Studio: https://generativelanguage.googleapis.com
  • Custom/Local servers: http://127.0.0.1 or your server address

PORT

Define the port number if required by your API endpoint:

  • Format: Numeric port number
  • Example: 6689, 8080, 443
  • Description: The port where your API is accessible (often not needed for standard HTTPS APIs)
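To see how URL and PORT combine into the final endpoint, here is a small sketch (the joining logic inside the node itself is an assumption; standard HTTPS APIs typically omit the port and default to 443):

```python
def build_base_url(url, port=None):
    """Join the configured URL and optional PORT into a base endpoint.
    The port is skipped when empty, matching the note above that it is
    often not needed for standard HTTPS APIs."""
    if port:
        return f"{url}:{port}"
    return url

# A local server usually needs the port; hosted HTTPS APIs usually do not.
local = build_base_url("http://127.0.0.1", 6689)     # http://127.0.0.1:6689
hosted = build_base_url("https://api.deepseek.com")  # https://api.deepseek.com
```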

MODEL

Specify the model name or identifier as required by the API:

  • Format: Text string matching the API's model name
  • Example: deepseek-chat, gpt-4, gpt-3.5-turbo, gemini-pro, llama, mistral
  • Description: The specific model name as defined by the API provider

Add Header

Add custom HTTP headers for API authentication or configuration:

  • Use case: API keys, authentication tokens, custom parameters
  • Format: Key-value pairs
  • Example: Authorization: Bearer your-api-key-here

Click + Add Header to include additional headers as needed.

Common Header Examples:

  • OpenAI: Authorization: Bearer sk-...
  • DeepSeek: Authorization: Bearer your-deepseek-api-key
  • Google AI Studio: x-goog-api-key: your-google-api-key
  • Custom headers: Content-Type: application/json
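Putting URL, MODEL, and headers together, an OpenAI-compatible chat request built from these settings looks roughly like the following. This is a sketch of the wire format, not the node's actual implementation; the `/v1/chat/completions` path and field names follow the OpenAI API convention:

```python
import json
import urllib.request

def build_chat_request(base_url, model, api_key, user_message, temperature=0.5):
    """Assemble an OpenAI-style chat completion request from the node's
    URL, MODEL, header, and Temperature settings."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "temperature": temperature,
    }
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",
    )

req = build_chat_request("https://api.deepseek.com", "deepseek-chat",
                         "your-deepseek-api-key", "Hello!")
# Sending is left to the caller, e.g. urllib.request.urlopen(req)
```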

Bot Control

Temperature

Control the creativity and randomness of bot responses:

  • Lower values (0.0-0.3): More focused, consistent, and deterministic responses
  • Medium values (0.4-0.7): Balanced between consistency and creativity
  • Higher values (0.8-1.0): More creative, varied, and exploratory responses

Adjust the slider to find the right balance for your use case.
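In an OpenAI-compatible request, temperature is simply a numeric field in the JSON body. This sketch clamps a slider value into the 0.0-1.0 range described above (the clamping itself is illustrative, not the node's documented behavior):

```python
def payload_with_temperature(model, messages, temperature):
    """Clamp temperature to the 0.0-1.0 slider range and place it in an
    OpenAI-style request body."""
    temperature = max(0.0, min(1.0, temperature))
    return {"model": model, "messages": messages, "temperature": temperature}

body = payload_with_temperature("gpt-4", [{"role": "user", "content": "Hi"}], 0.5)
```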


User Interaction Options

Math Input

  • Description: Enable mathematical expression input and rendering
  • Use case: When users need to input equations or mathematical notation
  • Default: Unchecked

Handwriting Input

  • Description: Enable handwriting recognition for user input
  • Use case: When users prefer to write messages by hand (touch/stylus devices)
  • Default: Unchecked

Example Workflows

External LLM API Bot Setup

Scenario: Create a conversational AI bot using an external LLM API service like DeepSeek, OpenAI, or Google AI Studio.

Local LLM Connector Example

Steps to Create the Flow:

  1. Start with the Start Node.

  2. Add and connect a Text node for the bot name.

    • Example text:
    Local Bot
    • Connect to The Bot Name input
  3. Add and connect a Text node for the welcome message.

    • Example text:
    Hello! Welcome from Local Bot.
    • Connect to Bot Introduction - First Message input
  4. Configure the Local LLM Connector Node:

    i. Set the URL of your external LLM API

    • Example for DeepSeek: https://api.deepseek.com
    • Example for OpenAI: https://api.openai.com
    • Example for Google AI Studio: https://generativelanguage.googleapis.com
    • Example for local server: http://127.0.0.1

    ii. Set the PORT number (if required)

    • Example: 6689 (for local servers)
    • Note: Often not needed for standard HTTPS APIs (uses default port 443)

    iii. Specify the MODEL name

    • Example for DeepSeek: deepseek-chat
    • Example for OpenAI: gpt-4, gpt-3.5-turbo
    • Example for Google: gemini-pro
    • Example for local: llama, mistral

    iv. Add Headers for API authentication

    • Click + Add Header
    • For OpenAI: Authorization: Bearer sk-your-api-key
    • For DeepSeek: Authorization: Bearer your-deepseek-key
    • For Google AI Studio: x-goog-api-key: your-google-key

    v. Adjust Temperature to control response creativity

    • Set according to your use case (e.g., 0.5 for balanced responses)

    vi. Configure User Interaction Options

    • Enable Math Input if mathematical notation is needed
    • Enable Handwriting Input for stylus/touch device support
  5. Add Display Text to show the Full Chat Log output (optional).
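Before wiring the steps above into a flow, it can help to confirm that the endpoint and key work on their own. Assuming an OpenAI-compatible server, a `GET /v1/models` call is a cheap smoke test (the path follows the OpenAI convention; the function below is defined but not executed here, since it needs a reachable server):

```python
import json
import urllib.request

def list_models(base_url, api_key=None, timeout=10):
    """Query an OpenAI-compatible /v1/models endpoint and return model ids.
    Useful to confirm URL, PORT, and headers before configuring the node."""
    req = urllib.request.Request(f"{base_url}/v1/models")
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]

# Example (requires a reachable server):
# print(list_models("http://127.0.0.1:6689"))
```

If the call succeeds, the same URL, port, and header values can be copied into the node's configuration.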

Prerequisites:

Before using this node, ensure you have:

  1. An API key from your chosen LLM provider (DeepSeek, OpenAI, Google AI Studio, etc.)
  2. The correct API endpoint URL for your provider
  3. The model name as specified by your provider's documentation
  4. Appropriate API quota or credits

Result:

Users interact with an AI chatbot powered by external LLM APIs that:

  • Connects to professional AI services
  • Provides access to state-of-the-art models
  • Offers flexibility to switch between providers
  • Supports custom API endpoints
  • Enables use of specialized models

Use Cases:

  • Multi-Provider Support: Switch between DeepSeek, OpenAI, Google AI Studio, and other providers
  • Cost Optimization: Use cost-effective providers like DeepSeek for high-volume applications
  • Custom Deployments: Connect to self-hosted or enterprise API endpoints
  • Model Flexibility: Access different models from various providers
  • Development & Testing: Test with different LLM providers without changing your workflow

Supported LLM API Providers

The Local LLM Connector works with various external LLM API providers, including:

Commercial AI Services

  • DeepSeek: Cost-effective AI models with competitive pricing
  • OpenAI: GPT-4, GPT-3.5-turbo, and other OpenAI models
  • Google AI Studio: Gemini Pro and other Google AI models
  • Anthropic: Claude models (with compatible API format)
  • Cohere: Cohere's language models
  • Together AI: Access to various open-source models

Self-Hosted Solutions

  • Ollama: Easy-to-use local LLM platform with API support
  • LM Studio: Desktop app for running local models with API endpoints
  • Text Generation WebUI: Web interface for local LLM hosting with API
  • vLLM: High-performance inference server
  • LocalAI: OpenAI-compatible local API
  • OpenRouter: Hosted aggregator offering a unified, OpenAI-compatible API for multiple LLM providers (listed here for completeness; it is not self-hosted)

Compatibility

Any API service that follows the OpenAI-compatible API format can be used with this node.
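For example, Ollama exposes an OpenAI-compatible endpoint at `http://127.0.0.1:11434/v1` by default (port per Ollama's documentation; verify for your install), so the node's settings map directly onto a request like this sketch (defined but not executed here, since it needs a running server; the `llama3` model name is an example):

```python
import json
import urllib.request

def ollama_chat(prompt, model="llama3", base="http://127.0.0.1:11434"):
    """Send a chat completion to a local Ollama server via its
    OpenAI-compatible /v1 endpoint. No API key header is needed locally."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    req = urllib.request.Request(
        f"{base}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
    return reply["choices"][0]["message"]["content"]

# Requires a running Ollama server:
# print(ollama_chat("Say hello"))
```

In the node, the same setup corresponds to URL `http://127.0.0.1`, PORT `11434`, MODEL `llama3`, and no authentication header.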