Connect

Establishes a connection to your local Ollama server and creates a client session for interacting with AI models.

tip

You can skip using Connect/Disconnect nodes by providing the Host URL directly in other Ollama nodes. This is useful for simple, one-off operations.

Common Properties

  • Name - The custom name of the node.
  • Color - The custom color of the node.
  • Delay Before (sec) - Waits in seconds before executing the node.
  • Delay After (sec) - Waits in seconds after executing the node.
  • Continue On Error - Automation will continue regardless of any error. The default value is false.
info

If the Continue On Error property is set to true, errors in this node are suppressed at runtime and will not be caught, even if a Catch node is used.

Inputs

This node does not have any inputs.

Output

  • Client ID - A unique identifier for this Ollama client session. Use this ID in other Ollama nodes to reuse the same connection.

Options

This node uses the Ollama Host credential configured in the vault. If not specified in the vault, it defaults to http://localhost:11434.
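The fallback behavior described above can be sketched as a small helper. This is an illustrative sketch, not the product's internal code; the function name `resolve_host` and the vault lookup parameter are assumptions.

```python
# Default used when no Ollama Host credential is configured in the vault.
DEFAULT_OLLAMA_HOST = "http://localhost:11434"

def resolve_host(vault_host=None):
    """Return the Ollama host from the vault credential, or the default.

    vault_host: the URL stored in the Ollama Host credential, or None
    if no credential is configured (hypothetical parameter name).
    """
    return vault_host or DEFAULT_OLLAMA_HOST
```

For a local installation this resolves to `http://localhost:11434` without any configuration.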

How It Works

The Connect node:

  1. Creates a new Ollama client instance
  2. Establishes connection to the Ollama server (local or remote)
  3. Generates a unique client ID
  4. Returns the client ID for use in subsequent Ollama operations

Usage Example

Basic Connection Flow

1. Connect
└─> Client ID: "abc123..."

2. Generate Completion (using Client ID: "abc123...")
└─> Response: "Your generated text..."

3. Disconnect (using Client ID: "abc123...")
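Under the hood, a step like Generate Completion talks to Ollama's REST API (`POST /api/generate`). The helper below only builds the request URL and JSON body so it stays self-contained; sending it with an HTTP client is left out, and the function name is an illustrative assumption.

```python
import json

def build_generate_request(host, model, prompt):
    """Build the URL and JSON body for Ollama's /api/generate endpoint."""
    url = host.rstrip("/") + "/api/generate"
    body = json.dumps({
        "model": model,    # e.g. a model pulled with `ollama pull`
        "prompt": prompt,
        "stream": False,   # ask for a single JSON response, not a stream
    })
    return url, body
```

For example, `build_generate_request("http://localhost:11434", "llama3", "Hello")` produces the URL `http://localhost:11434/api/generate` and a JSON payload ready to POST.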

Configuring Ollama Host

To connect to a custom Ollama server:

  1. Go to Vault > Credentials
  2. Add a new Ollama Host credential
  3. Enter your server URL (e.g., http://192.168.1.100:11434 for a remote server)
  4. Select this credential in your Connect node

Requirements

  • Ollama must be installed and running on the target machine
  • The Ollama service must be accessible at the specified URL
  • Default port is 11434

Common Use Cases

  • Establishing a persistent connection for multiple AI operations
  • Connecting to a remote Ollama server on your network
  • Sharing a single connection across multiple nodes in your workflow

Tips

  • Store the Client ID in a message variable to reuse across multiple nodes
  • If you only need to perform a single operation, consider using the Host URL option directly in the operation node instead of using Connect/Disconnect
  • For local installations, the default settings usually work without configuration
  • Use Disconnect at the end of your workflow to properly clean up resources

Error Handling

Common errors you might encounter:

  • "Failed to create client" - The Ollama service is not running or not accessible
  • Connection refused - Check that Ollama is running on the specified host and port
  • Invalid URL - Verify the Ollama Host URL format (should start with http:// or https://)
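The "Invalid URL" check above can be expressed as a small validation helper, shown here as a sketch using Python's standard `urllib.parse`; the function name is an assumption.

```python
from urllib.parse import urlparse

def is_valid_ollama_url(url):
    """Check that a host URL starts with http:// or https:// and names a host."""
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)
```

A bare `localhost:11434` fails this check because it lacks the `http://` prefix, which matches the error described above.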