Rank MCP Servers with Contextual AI Reranker
Advanced
An automation workflow in the Miscellaneous, AI RAG, and Multimodal AI domains, consisting of 16 nodes. It mainly uses the If, Code, Merge, HttpRequest, and Chat nodes, combining OpenAI GPT-4.1 with Contextual AI's reranker for dynamic MCP server selection.
Prerequisites
- Authentication credentials for the target APIs may be required (the Contextual AI key is read from the `CONTEXTUALAI_API_KEY` workflow variable)
- OpenAI API key
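Before importing, you can sanity-check the request shape that the workflow's "ContextualAI Reranker" node sends to Contextual AI's `/v1/rerank` endpoint. A minimal sketch in plain JavaScript, assuming Node 18+ with global `fetch`; the server entries below are hypothetical sample data standing in for the PulseMCP response:

```javascript
// Build the same request body the HTTP Request node assembles:
// documents are "name + description" strings, metadata carries stars/downloads.
function buildRerankRequest(query, instruction, servers) {
  const documents = servers.map(
    (s) => `MCP Server: ${s.name}\nDescription: ${s.short_description}`
  );
  const metadata = servers.map(
    (s) => `Name: ${s.name}, Stars: ${s.github_stars}, Downloads: ${s.package_download_count}`
  );
  return {
    query,
    instruction,
    documents,
    metadata,
    model: "ctxl-rerank-v2-instruct-multilingual",
  };
}

// Hypothetical sample servers (the real workflow fetches these from PulseMCP).
const servers = [
  { name: "filesystem", short_description: "Read and write local files", github_stars: 1200, package_download_count: 50000 },
  { name: "web-search", short_description: "Search the web in real time", github_stars: 800, package_download_count: 30000 },
];

const body = buildRerankRequest(
  "Find files on my disk",
  "Prefer servers with file management capabilities",
  servers
);

// Uncomment to send the actual request (requires CONTEXTUALAI_API_KEY in the env):
// const res = await fetch("https://api.contextual.ai/v1/rerank", {
//   method: "POST",
//   headers: {
//     Authorization: `Bearer ${process.env.CONTEXTUALAI_API_KEY}`,
//     "Content-Type": "application/json",
//   },
//   body: JSON.stringify(body),
// });
console.log(body.documents.length);
```

The builder mirrors the "Parse MCP Server list into documents w metadata" Code node, so a successful standalone call is good evidence the imported workflow's reranker step will work with your key.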
Export the Workflow
Copy the JSON configuration below and import it into n8n to use this workflow.
{
"id": "d1iK84AVOBn7nPRx",
"meta": {
"instanceId": "11121a0a0c6d26991d417aaff350a8e1836bf48496a817dba8b2be23aec9b053",
"templateCredsSetupCompleted": true
},
"name": "使用上下文 AI 重新排序器对 MCP 服务器进行排名",
"tags": [],
"nodes": [
{
"id": "59b497fe-1934-4183-8a17-f3b30ca0f5c4",
"name": "OpenAI 聊天模型",
"type": "@n8n/n8n-nodes-langchain.lmChatOpenAi",
"position": [
216,
-56
],
"parameters": {
"model": {
"__rl": true,
"mode": "list",
"value": "gpt-4.1-mini"
},
"options": {
"responseFormat": "json_object"
}
},
"credentials": {
"openAiApi": {
"id": "1qWYthUxPflxQXam",
"name": "OpenAi account"
}
},
"typeVersion": 1.2
},
{
"id": "a1c8a119-9b23-44ad-a1c0-2acef910beaf",
"name": "如果",
"type": "n8n-nodes-base.if",
"position": [
496,
-280
],
"parameters": {
"options": {},
"conditions": {
"options": {
"version": 2,
"leftValue": "",
"caseSensitive": true,
"typeValidation": "strict"
},
"combinator": "and",
"conditions": [
{
"id": "47fd1d36-7a24-4086-9b68-ba5b42d9a714",
"operator": {
"type": "boolean",
"operation": "true",
"singleValue": true
},
"leftValue": "={{ $json.output.parseJson().use_mcp }}",
"rightValue": ""
}
]
}
},
"typeVersion": 2.2
},
{
"id": "3cfcff90-fdee-430a-951a-d30f8f487a6e",
"name": "合并",
"type": "n8n-nodes-base.merge",
"position": [
944,
-352
],
"parameters": {},
"typeVersion": 3.2
},
{
"id": "33cdc727-eaee-4898-b583-ec57c79362af",
"name": "合并1",
"type": "n8n-nodes-base.merge",
"position": [
1616,
-352
],
"parameters": {},
"typeVersion": 3.2
},
{
"id": "07450849-96b2-40a7-a9d1-5e1925d76f6c",
"name": "便签",
"type": "n8n-nodes-base.stickyNote",
"position": [
-624,
-528
],
"parameters": {
"width": 480,
"height": 1152,
"content": "# 动态 MCP 选择"
},
"typeVersion": 1
},
{
"id": "4fc2caf6-ba03-4507-82f9-3b88d0460e57",
"name": "便签1",
"type": "n8n-nodes-base.stickyNote",
"position": [
-96,
-520
],
"parameters": {
"color": 7,
"width": 704,
"height": 608,
"content": "## 1. Determine whether MCP servers are needed\nBased on user's request, LLM determines the need for an MCP Server, provides a reason, and if needed, provides reranking instruction text which will be passed to reranker"
},
"typeVersion": 1
},
{
"id": "37386e9a-6051-4ef9-9e46-cbd4c60c7f80",
"name": "便签2",
"type": "n8n-nodes-base.stickyNote",
"position": [
672,
-520
],
"parameters": {
"color": 7,
"width": 640,
"height": 400,
"content": "## 2. Fetch MCP Server list and format them\nWe fetch 5000 MCP Servers from PulseMCP directory and parse them as documents to pass it onto the Contextual AI Reranker"
},
"typeVersion": 1
},
{
"id": "eef73a4d-eb47-4d2d-a7a9-44650e5ffc6b",
"name": "便签3",
"type": "n8n-nodes-base.stickyNote",
"position": [
1368,
-520
],
"parameters": {
"color": 7,
"width": 816,
"height": 400,
"content": "## 3. Rerank the servers and display top five results\nWe use Contextual AI's reranker to re-rank the servers and identify the top 5 servers based ont eh user query and re-ranker instruction, which is then formatted to be displayed in user friendly format.\n- You can checkout this [blog](https://contextual.ai/blog/introducing-instruction-following-reranker/) to learn more about rerankers"
},
"typeVersion": 1
},
{
"id": "b82d5e55-3ff9-4fd9-a37c-fc75c155353e",
"name": "User-Query",
"type": "@n8n/n8n-nodes-langchain.chatTrigger",
"position": [
-80,
-280
],
"webhookId": "018048be-810b-4a22-82c4-9e7ed7f05e1a",
"parameters": {
"public": true,
"options": {
"responseMode": "responseNodes",
"allowFileUploads": true
},
"initialMessages": "Try MCP Reranker using Contextual AI's Reranker v2"
},
"typeVersion": 1.3
},
{
"id": "04a2eb05-a82b-4a86-a18d-ed01094ba638",
"name": "LLM Agent for Decision-Making",
"type": "@n8n/n8n-nodes-langchain.agent",
"position": [
144,
-280
],
"parameters": {
"options": {
"systemMessage": "=Analyze this user query and decide if it requires external tools/APIs (Model Context Protocol (MCP) servers) or can be answered directly.\n Query: \"{{ $json.chatInput }}\"\n\n Consider:\n - Does it need real-time data, web search, or external APIs?\n - Does it require specialized tools (file management, databases, etc.)?\n - Is it a complex task that would benefit from external services?\n - Can it be answered with general knowledge alone?\n\n If MCP is needed, also generate a concise reranking instruction for selecting the best external tools/APIs (MCPs) for this query.\n\n The instruction should:\n - Specify the exact capabilities/features/details that an MCP server requires for this query\n - Look for domain/field specificity and functionality needs\n - Any specific requirements that the user asks for\n - Highlight the user's prioritized criteria for server selection\n\n Base the instruction only on what is explicitly stated or clearly implied in the user's query.\n Do not assume additional requirements or preferences that are not present in the query.\n\n Respond with JSON: {\"use_mcp\": true/false, \"reason\": \"brief explanation\", \"instruction\": \"reranking instruction text or null if not needed\"}"
}
},
"typeVersion": 2.2
},
{
"id": "1cfbc30b-68ef-402f-a8ad-2aad77789d08",
"name": "PulseMCP Fetch MCP Servers",
"type": "n8n-nodes-base.httpRequest",
"position": [
720,
-280
],
"parameters": {
"url": "=https://api.pulsemcp.com/v0beta/servers",
"options": {},
"sendQuery": true,
"queryParameters": {
"parameters": [
{
"name": "count_per_page",
"value": "5000"
},
{
"name": "offset",
"value": "0"
}
]
}
},
"typeVersion": 4.2
},
{
"id": "955343c1-540a-460b-a27f-84d2da2da40a",
"name": "Final Response1",
"type": "@n8n/n8n-nodes-langchain.chat",
"position": [
720,
-88
],
"parameters": {
"message": "= {{ $json.output.parseJson().reason }} Therefore, no MCP Servers are required to fulfill this request.",
"options": {},
"waitUserReply": false
},
"typeVersion": 1
},
{
"id": "a788876e-4bc7-4f6e-82aa-8617ba99cdc9",
"name": "Parse MCP Server list into documents w metadata",
"type": "n8n-nodes-base.code",
"position": [
1168,
-352
],
"parameters": {
"jsCode": "const servers = $input.first().json.servers || [];\nconst documents = [];\nconst metadata = [];\n\nfor (const server of servers) {\n documents.push(`MCP Server: ${server.name}\\nDescription: ${server.short_description}`);\n metadata.push(`Name: ${server.name}, Stars: ${server.github_stars}, Downloads: ${server.package_download_count}`);\n}\n\nconst aiOutputRaw = $('LLM Agent for Decision-Making').first().json.output;\nconst aiOutput = JSON.parse(aiOutputRaw);\n\nreturn [{\n json: {\n query: $('User-Query').first().json.chatInput,\n instruction: aiOutput.instruction, \n documents,\n metadata,\n servers\n }\n}];\n"
},
"typeVersion": 2
},
{
"id": "0b49e518-d9b6-4865-9cd4-658bb7317927",
"name": "ContextualAI Reranker",
"type": "n8n-nodes-base.httpRequest",
"position": [
1392,
-280
],
"parameters": {
"url": "https://api.contextual.ai/v1/rerank",
"method": "POST",
"options": {},
"sendBody": true,
"sendHeaders": true,
"bodyParameters": {
"parameters": [
{
"name": "query",
"value": "={{ $json.query }}"
},
{
"name": "instruction",
"value": "={{ $json.instruction }}"
},
{
"name": "documents",
"value": "={{ $json.documents }}"
},
{
"name": "metadata",
"value": "={{ $json.metadata }}"
},
{
"name": "model",
"value": "ctxl-rerank-v2-instruct-multilingual"
}
]
},
"headerParameters": {
"parameters": [
{
"name": "Authorization",
"value": "=Bearer {{$vars.CONTEXTUALAI_API_KEY}}"
},
{
"name": "Content-type",
"value": "application/json"
}
]
}
},
"typeVersion": 4.2
},
{
"id": "30cf71cc-d8cb-44af-aaab-4fd9ae0bceb5",
"name": "Format the top 5 results",
"type": "n8n-nodes-base.code",
"position": [
1840,
-352
],
"parameters": {
"jsCode": "const results = $input.first().json.results || [];\nconst servers = $('Parse MCP Server list into documents w metadata').first().json.servers || [];\n\nconst top = results.slice(0, 5).map((r, i) => {\n const server = servers[r.index] || {};\n return {\n name: server.name || \"Unknown\",\n description: server.short_description || \"N/A\",\n stars: server.github_stars || 0,\n downloads: server.package_download_count || 0,\n score: r.relevance_score\n };\n});\n\nlet message = \"Top MCP Servers \\n\\n\";\ntop.forEach((s, i) => {\n message += `${i + 1}. ${s.name} (⭐ ${s.stars}, ⬇️ ${s.downloads}, 🔎 ${s.score.toFixed(2)})\\n ${s.description}\\n\\n`;\n});\n\nreturn [{ json: { message } }];\n"
},
"typeVersion": 2
},
{
"id": "395b94c6-bba5-4585-bbf8-e3272699c2ac",
"name": "Final Response2",
"type": "@n8n/n8n-nodes-langchain.chat",
"position": [
2064,
-352
],
"parameters": {
"message": "={{ $json.message }}",
"options": {},
"waitUserReply": false
},
"typeVersion": 1
}
],
"active": true,
"pinData": {},
"settings": {
"callerPolicy": "workflowsFromSameOwner",
"executionOrder": "v1"
},
"versionId": "4fd9aecc-d9c0-4efd-87c7-3385c810fc75",
"connections": {
"If": {
"main": [
[
{
"node": "PulseMCP Fetch MCP Servers",
"type": "main",
"index": 0
},
{
"node": "Merge",
"type": "main",
"index": 1
}
],
[
{
"node": "Final Response1",
"type": "main",
"index": 0
}
]
]
},
"Merge": {
"main": [
[
{
"node": "Parse MCP Server list into documents w metadata",
"type": "main",
"index": 0
}
]
]
},
"Merge1": {
"main": [
[
{
"node": "Format the top 5 results",
"type": "main",
"index": 0
}
]
]
},
"User-Query": {
"main": [
[
{
"node": "LLM Agent for Decision-Making",
"type": "main",
"index": 0
}
]
]
},
"OpenAI Chat Model": {
"ai_languageModel": [
[
{
"node": "LLM Agent for Decision-Making",
"type": "ai_languageModel",
"index": 0
}
]
]
},
"ContextualAI Reranker": {
"main": [
[
{
"node": "Merge1",
"type": "main",
"index": 0
}
]
]
},
"Format the top 5 results": {
"main": [
[
{
"node": "Final Response2",
"type": "main",
"index": 0
}
]
]
},
"PulseMCP Fetch MCP Servers": {
"main": [
[
{
"node": "Merge",
"type": "main",
"index": 0
}
]
]
},
"LLM Agent for Decision-Making": {
"main": [
[
{
"node": "If",
"type": "main",
"index": 0
}
]
]
},
"Parse MCP Server list into documents w metadata": {
"main": [
[
{
"node": "ContextualAI Reranker",
"type": "main",
"index": 0
},
{
"node": "Merge1",
"type": "main",
"index": 1
}
]
]
}
}
}
FAQ
How do I use this workflow?
Copy the JSON configuration above, create a new workflow in your n8n instance and choose "Import from JSON", paste the configuration, then adjust the credential settings as needed.
What scenarios is this workflow suited for?
Advanced - Miscellaneous, AI RAG (retrieval-augmented generation), Multimodal AI
Is it paid?
This workflow is completely free to import and use. Note, however, that third-party services it relies on (such as the OpenAI API and Contextual AI) may incur their own charges.
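One note for anyone adapting the decision step: the If node's expression `{{ $json.output.parseJson().use_mcp }}` only works when the agent returns strict JSON, which is why the OpenAI Chat Model node sets `responseFormat` to `json_object`. A small standalone sketch of that contract, with a hypothetical sample agent reply and a safe fallback for malformed output:

```javascript
// Parse the agent's JSON reply the same way the If node's expression does,
// and fall back to "answer directly" if the model ever returns non-JSON text.
function parseDecision(raw) {
  try {
    const d = JSON.parse(raw);
    return {
      use_mcp: d.use_mcp === true,
      reason: typeof d.reason === "string" ? d.reason : "",
      instruction: d.instruction ?? null,
    };
  } catch {
    // Malformed output: skip the reranking branch rather than crash.
    return { use_mcp: false, reason: "unparseable agent output", instruction: null };
  }
}

// Hypothetical agent output for a query that needs external tools.
const decision = parseDecision(
  '{"use_mcp": true, "reason": "needs real-time data", "instruction": "Prefer web search servers"}'
);
console.log(decision.use_mcp); // true
```

The workflow itself has no such fallback, so if you swap in a model without JSON-mode support, adding a guard like this in the Code node is a cheap way to avoid runtime expression errors.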