Setting Up LangSmith with a Dockerised n8n Instance

n8n
LangSmith
Author

Chris Carr

Published

May 12, 2025

Introduction

Recently, I’ve been experimenting with using LangGraph to prototype LLM applications. While other observability tools exist (Arize Phoenix and MLflow, for example), LangChain’s own offering - LangSmith - is easy to set up and therefore worth trying out.

Since I’m still using n8n as part of my development flow, it makes sense to integrate LangSmith with my self-hosted n8n instance. This is supported in n8n since its AI Agent nodes use LangChain.js under the hood.

I run a personal n8n instance on a DigitalOcean Droplet (using Docker Compose). It was set up using the instructions that can be found here.

I’m documenting how I set it up because I couldn’t find any instructions for integrating LangSmith with this common installation pattern.

Important

This guide is for self-hosted n8n instances installed with Docker Compose. LangSmith integration is not currently supported in n8n Cloud.

Instructions

Get LangSmith API Key

If you have not already done so, sign up for a LangSmith account. Note that there are two data regions to choose from - US and EU.

In LangSmith, select Set up tracing and then Generate API Key. Select the TypeScript environment variables and you should be able to see the following values: LANGSMITH_TRACING, LANGSMITH_ENDPOINT, LANGSMITH_API_KEY, LANGSMITH_PROJECT and OPENAI_API_KEY.
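
The exact snippet varies by account, but with placeholder values it should look roughly like this (the endpoint will be https://eu.api.smith.langchain.com if you chose the EU region):

LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT=https://api.smith.langchain.com
LANGSMITH_API_KEY=YOUR-LANGSMITH-API-KEY
LANGSMITH_PROJECT=YOUR-PROJECT-NAME
OPENAI_API_KEY=YOUR-OPENAI-API-KEY

Keep the API key handy - you’ll paste it into docker-compose.yml in the next step.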

Update Docker Compose YAML File

In the command line of the server hosting your n8n instance, use the following commands to change into the directory containing your Docker Compose setup and open docker-compose.yml.

cd n8n-docker-compose
nano docker-compose.yml

In this file, we need to add the last four lines (marked # add this) to the n8n environment subsection, as shown below. Make sure that LANGSMITH_ENDPOINT matches the data region that you selected.

Note

As of writing (May 2025), setting LANGSMITH_PROJECT doesn’t appear to work. However, setting LANGCHAIN_PROJECT does, so we’ll go with that.

n8n:
    image: docker.n8n.io/n8nio/n8n
    restart: always
    ports:
        - 5678:5678
    environment:
        - N8N_HOST=${SUBDOMAIN}.${DOMAIN_NAME}
        - N8N_PORT=5678
        - N8N_PROTOCOL=https
        - NODE_ENV=production
        - WEBHOOK_URL=https://${SUBDOMAIN}.${DOMAIN_NAME}/
        - GENERIC_TIMEZONE=${GENERIC_TIMEZONE}
        - LANGSMITH_ENDPOINT=https://api.smith.langchain.com # add this (use https://eu.api.smith.langchain.com if using the EU data centre)
        - LANGSMITH_API_KEY=PASTE-LANGSMITH-KEY-HERE # add this
        - LANGSMITH_TRACING=true # add this
        - LANGCHAIN_PROJECT=YOUR-PROJECT-NAME # add this

Save and exit the file. Then restart n8n with the following commands:

sudo docker compose down
sudo docker compose up -d
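
Before checking inside n8n, you can optionally confirm from the host shell that the variables made it into the container. The command below assumes the service is named n8n, as in the Compose file above.

sudo docker compose exec n8n env | grep -E 'LANGSMITH|LANGCHAIN'

If the four variables are listed with the values you set, they are visible to the n8n process.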

Verify That the Setup Was Successful

Optional Check Using n8n Workflow

I put together a small workflow that checks whether n8n can see the environment variables that were just added. To use it, open your browser’s developer tools by right-clicking somewhere on the page that isn’t the workflow canvas, selecting Inspect, and then switching to the Console tab. Then run the workflow. If the setup has been successful, you’ll see the following printed in the console (with your values):

n8n can see the environment variables! Check them below:

LANGSMITH_API_KEY=YOUR-LANGSMITH-API-KEY
LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT=YOUR-ENDPOINT
LANGCHAIN_PROJECT=YOUR-PROJECT-NAME
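
If you would rather build the check yourself, here is a minimal sketch of a Code node (JavaScript, Run Once for All Items mode) that produces similar output. It assumes your instance allows Code nodes to read environment variables via n8n’s $env helper, and the variable names match those added to docker-compose.yml above.

// Sketch of a Code node that reports whether the LangSmith variables are visible to n8n.
const values = [
  `LANGSMITH_API_KEY=${$env.LANGSMITH_API_KEY ?? 'NOT SET'}`,
  `LANGSMITH_TRACING=${$env.LANGSMITH_TRACING ?? 'NOT SET'}`,
  `LANGSMITH_ENDPOINT=${$env.LANGSMITH_ENDPOINT ?? 'NOT SET'}`,
  `LANGCHAIN_PROJECT=${$env.LANGCHAIN_PROJECT ?? 'NOT SET'}`,
];

// When a workflow is run manually, console.log output from a Code node
// appears in the browser's developer console.
if (values.some((v) => v.endsWith('NOT SET'))) {
  console.log('n8n cannot see one or more of the environment variables:');
} else {
  console.log('n8n can see the environment variables! Check them below:');
}
console.log(values.join('\n'));

// A Code node must return items, so pass the results through as well.
return [{ json: { values } }];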

Test by Calling an AI Node in n8n

Create a workflow (or use an existing one) that uses an AI node (such as AI Agent). Run the workflow. In LangSmith, you will now be able to see the trace under the project name that you chose when setting the environment variables.