Adaptive & Conditional AI Chat Agent - www.quantralabs.com

Advanced

This is an AI automation workflow containing 40 nodes. It primarily uses nodes such as Set, Switch, Summarize, Agent, and RespondToWebhook, combining AI technology for intelligent automation. It builds an adaptive RAG chat agent with Google Gemini and Qdrant.
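
The overall flow is: classify the incoming query, adapt it with a category-specific strategy, retrieve matching chunks from Qdrant, and generate the answer. Below is a minimal Python sketch of the classify-and-route step; the four labels and the fallback-to-Factual behaviour mirror the workflow's Switch node ("fallbackOutput": 0), while the strategy descriptions are paraphrases of the agents' system messages, not verbatim workflow prompts.

# Minimal sketch of the classify-and-route step used by this workflow.
STRATEGIES = {
    "Factual": "Rewrite the query for precision, focusing on key entities.",
    "Analytical": "Break the query into exactly 3 sub-questions, one per line.",
    "Opinion": "List exactly 3 different viewpoint angles, one per line.",
    "Contextual": "Infer the implied context behind the query.",
}

def route(classifier_output: str) -> str:
    """Map the classifier's raw output to a strategy, defaulting to Factual
    just like the Switch node's fallback output."""
    label = classifier_output.strip()
    return label if label in STRATEGIES else "Factual"

if __name__ == "__main__":
    for raw in ("Analytical\n", "Opinion", "something unexpected"):
        strategy = route(raw)
        print(f"{raw!r:>24} -> {strategy}: {STRATEGIES[strategy]}")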

Prerequisites
  • HTTP webhook endpoint (generated automatically by n8n)
  • Qdrant server connection credentials
  • Google Gemini API key
Export the workflow
Copy the following JSON configuration into n8n to import and use this workflow
{
  "id": "bU9BBKV0yadVVd30",
  "meta": {
    "instanceId": "315ec16104c52c82cc21fd9b6adb469e4bd7c2899d0990cb255788b78628ebf4"
  },
  "name": "Adaptive & Conditional AI Chat Agent - www.quantralabs.com",
  "tags": [],
  "nodes": [
    {
      "id": "6cccf7c5-9d8b-4f11-a7e1-c1bcf48bb9fe",
      "name": "Classification de Requête",
      "type": "@n8n/n8n-nodes-langchain.agent",
      "notes": "Classify a query into one of four categories: Factual, Analytical, Opinion, or Contextual.\n        \nReturns:\nstr: Query category",
      "position": [
        -660,
        880
      ],
      "parameters": {
        "text": "=Classify this query: {{ $('Combined Fields').item.json.user_query }}",
        "options": {
          "systemMessage": "You are an expert at classifying questions. \n\nClassify the given query into exactly one of these categories:\n- Factual: Queries seeking specific, verifiable information.\n- Analytical: Queries requiring comprehensive analysis or explanation.\n- Opinion: Queries about subjective matters or seeking diverse viewpoints.\n- Contextual: Queries that depend on user-specific context.\n\nReturn ONLY the category name, without any explanation or additional text."
        },
        "promptType": "define"
      },
      "typeVersion": 1.8
    },
    {
      "id": "937104bc-5756-4adb-af0b-3e8741536e44",
      "name": "Commutateur",
      "type": "n8n-nodes-base.switch",
      "position": [
        -300,
        860
      ],
      "parameters": {
        "rules": {
          "values": [
            {
              "outputKey": "Factual",
              "conditions": {
                "options": {
                  "version": 2,
                  "leftValue": "",
                  "caseSensitive": true,
                  "typeValidation": "strict"
                },
                "combinator": "and",
                "conditions": [
                  {
                    "id": "87f3b50c-9f32-4260-ac76-19c05b28d0b4",
                    "operator": {
                      "type": "string",
                      "operation": "equals"
                    },
                    "leftValue": "={{ $json.output.trim() }}",
                    "rightValue": "Factual"
                  }
                ]
              },
              "renameOutput": true
            },
            {
              "outputKey": "Analytical",
              "conditions": {
                "options": {
                  "version": 2,
                  "leftValue": "",
                  "caseSensitive": true,
                  "typeValidation": "strict"
                },
                "combinator": "and",
                "conditions": [
                  {
                    "id": "f8651b36-79fa-4be4-91fb-0e6d7deea18f",
                    "operator": {
                      "name": "filter.operator.equals",
                      "type": "string",
                      "operation": "equals"
                    },
                    "leftValue": "={{ $json.output.trim() }}",
                    "rightValue": "Analytical"
                  }
                ]
              },
              "renameOutput": true
            },
            {
              "outputKey": "Opinion",
              "conditions": {
                "options": {
                  "version": 2,
                  "leftValue": "",
                  "caseSensitive": true,
                  "typeValidation": "strict"
                },
                "combinator": "and",
                "conditions": [
                  {
                    "id": "5dde06bc-5fe1-4dca-b6e2-6857c5e96d49",
                    "operator": {
                      "name": "filter.operator.equals",
                      "type": "string",
                      "operation": "equals"
                    },
                    "leftValue": "={{ $json.output.trim() }}",
                    "rightValue": "Opinion"
                  }
                ]
              },
              "renameOutput": true
            },
            {
              "outputKey": "Contextual",
              "conditions": {
                "options": {
                  "version": 2,
                  "leftValue": "",
                  "caseSensitive": true,
                  "typeValidation": "strict"
                },
                "combinator": "and",
                "conditions": [
                  {
                    "id": "bf97926d-7a0b-4e2f-aac0-a820f73344d8",
                    "operator": {
                      "name": "filter.operator.equals",
                      "type": "string",
                      "operation": "equals"
                    },
                    "leftValue": "={{ $json.output.trim() }}",
                    "rightValue": "Contextual"
                  }
                ]
              },
              "renameOutput": true
            }
          ]
        },
        "options": {
          "fallbackOutput": 0
        }
      },
      "typeVersion": 3.2
    },
    {
      "id": "7346bf45-3f9b-4717-8cb5-52d829f0826c",
      "name": "Stratégie Factuelle - Concentration sur la Précision",
      "type": "@n8n/n8n-nodes-langchain.agent",
      "notes": "Retrieval strategy for factual queries focusing on precision.",
      "position": [
        100,
        120
      ],
      "parameters": {
        "text": "=Enhance this factual query: {{ $('Combined Fields').item.json.user_query }}",
        "options": {
          "systemMessage": "=You are an expert at enhancing search queries.\n\nYour task is to reformulate the given factual query to make it more precise and specific for information retrieval. Focus on key entities and their relationships.\n\nProvide ONLY the enhanced query without any explanation."
        },
        "promptType": "define"
      },
      "typeVersion": 1.7
    },
    {
      "id": "ac1df57d-524c-4393-a81c-fb720f19b05e",
      "name": "Stratégie Analytique - Couverture Complète",
      "type": "@n8n/n8n-nodes-langchain.agent",
      "notes": "Retrieval strategy for analytical queries focusing on comprehensive coverage.",
      "position": [
        100,
        660
      ],
      "parameters": {
        "text": "=Generate sub-questions for this analytical query: {{ $('Combined Fields').item.json.user_query }}",
        "options": {
          "systemMessage": "=You are an expert at breaking down complex questions.\n\nGenerate sub-questions that explore different aspects of the main analytical query.\nThese sub-questions should cover the breadth of the topic and help retrieve comprehensive information.\n\nReturn a list of exactly 3 sub-questions, one per line."
        },
        "promptType": "define"
      },
      "typeVersion": 1.7
    },
    {
      "id": "7df8350e-4f18-47fb-bd9a-c0238d218603",
      "name": "Stratégie d'Opinion - Perspectives Diverses",
      "type": "@n8n/n8n-nodes-langchain.agent",
      "notes": "Retrieval strategy for opinion queries focusing on diverse perspectives.",
      "position": [
        100,
        1200
      ],
      "parameters": {
        "text": "=Identify different perspectives on: {{ $('Combined Fields').item.json.user_query }}",
        "options": {
          "systemMessage": "=You are an expert at identifying different perspectives on a topic.\n\nFor the given query about opinions or viewpoints, identify different perspectives that people might have on this topic.\n\nReturn a list of exactly 3 different viewpoint angles, one per line."
        },
        "promptType": "define"
      },
      "typeVersion": 1.7
    },
    {
      "id": "0f9ef12d-7df4-4255-b5e2-27eb4e7ce982",
      "name": "Stratégie Contextuelle - Intégration du Contexte Utilisateur",
      "type": "@n8n/n8n-nodes-langchain.agent",
      "notes": "Retrieval strategy for contextual queries integrating user context.",
      "position": [
        100,
        1740
      ],
      "parameters": {
        "text": "=Infer the implied context in this query: {{ $('Combined Fields').item.json.user_query }}",
        "options": {
          "systemMessage": "=You are an expert at understanding implied context in questions.\n\nFor the given query, infer what contextual information might be relevant or implied but not explicitly stated. Focus on what background would help answering this query.\n\nReturn a brief description of the implied context."
        },
        "promptType": "define"
      },
      "typeVersion": 1.7
    },
    {
      "id": "3c04f8e8-1304-436d-86eb-d905aa1cc261",
      "name": "Chat",
      "type": "@n8n/n8n-nodes-langchain.chatTrigger",
      "position": [
        -1320,
        1020
      ],
      "webhookId": "56f626b5-339e-48af-857f-1d4198fc8a4d",
      "parameters": {
        "options": {}
      },
      "typeVersion": 1.1
    },
    {
      "id": "e1daa9fc-62d2-4664-a9f3-dcdecf9071e6",
      "name": "Invite et Sortie Factuelles",
      "type": "n8n-nodes-base.set",
      "position": [
        500,
        120
      ],
      "parameters": {
        "options": {},
        "assignments": {
          "assignments": [
            {
              "id": "a4a28ac2-4a56-46f6-8b86-f5d1a34b2ced",
              "name": "output",
              "type": "string",
              "value": "={{ $json.output }}"
            },
            {
              "id": "7aa6ce13-afbf-4871-b81c-6e9c722a53dc",
              "name": "prompt",
              "type": "string",
              "value": "You are a helpful assistant providing factual information. Answer the question based on the provided context. Focus on accuracy and precision. If the context doesn't contain the information needed, acknowledge the limitations."
            }
          ]
        }
      },
      "typeVersion": 3.4
    },
    {
      "id": "1429dfd5-709d-4065-9134-05820fad871b",
      "name": "Invite et Sortie Contextuelles",
      "type": "n8n-nodes-base.set",
      "position": [
        500,
        1740
      ],
      "parameters": {
        "options": {},
        "assignments": {
          "assignments": [
            {
              "id": "a4a28ac2-4a56-46f6-8b86-f5d1a34b2ced",
              "name": "output",
              "type": "string",
              "value": "={{ $json.output }}"
            },
            {
              "id": "7aa6ce13-afbf-4871-b81c-6e9c722a53dc",
              "name": "prompt",
              "type": "string",
              "value": "You are a helpful assistant providing contextually relevant information. Answer the question considering both the query and its context. Make connections between the query context and the information in the provided documents. If the context doesn't fully address the specific situation, acknowledge the limitations."
            }
          ]
        }
      },
      "typeVersion": 3.4
    },
    {
      "id": "01c57856-4378-4085-bb67-2edf9b1164f9",
      "name": "Invite et Sortie d'Opinion",
      "type": "n8n-nodes-base.set",
      "position": [
        500,
        1200
      ],
      "parameters": {
        "options": {},
        "assignments": {
          "assignments": [
            {
              "id": "a4a28ac2-4a56-46f6-8b86-f5d1a34b2ced",
              "name": "output",
              "type": "string",
              "value": "={{ $json.output }}"
            },
            {
              "id": "7aa6ce13-afbf-4871-b81c-6e9c722a53dc",
              "name": "prompt",
              "type": "string",
              "value": "You are a helpful assistant discussing topics with multiple viewpoints. Based on the provided context, present different perspectives on the topic. Ensure fair representation of diverse opinions without showing bias. Acknowledge where the context presents limited viewpoints."
            }
          ]
        }
      },
      "typeVersion": 3.4
    },
    {
      "id": "3cb3f5e1-c85c-4481-a147-8b3c419526ee",
      "name": "Invite et Sortie Analytiques",
      "type": "n8n-nodes-base.set",
      "position": [
        500,
        660
      ],
      "parameters": {
        "options": {},
        "assignments": {
          "assignments": [
            {
              "id": "a4a28ac2-4a56-46f6-8b86-f5d1a34b2ced",
              "name": "output",
              "type": "string",
              "value": "={{ $json.output }}"
            },
            {
              "id": "7aa6ce13-afbf-4871-b81c-6e9c722a53dc",
              "name": "prompt",
              "type": "string",
              "value": "You are a helpful assistant providing analytical insights. Based on the provided context, offer a comprehensive analysis of the topic. Cover different aspects and perspectives in your explanation. If the context has gaps, acknowledge them while providing the best analysis possible."
            }
          ]
        }
      },
      "typeVersion": 3.4
    },
    {
      "id": "34577e85-b067-49dd-90a6-048805de5118",
      "name": "Classification Gemini",
      "type": "@n8n/n8n-nodes-langchain.lmChatGoogleGemini",
      "position": [
        -680,
        1080
      ],
      "parameters": {
        "options": {},
        "modelName": "models/gemini-2.0-flash-lite"
      },
      "typeVersion": 1
    },
    {
      "id": "7257c1dd-56bb-4f50-b206-2edd55fdd7cf",
      "name": "Gemini Factuel",
      "type": "@n8n/n8n-nodes-langchain.lmChatGoogleGemini",
      "position": [
        80,
        340
      ],
      "parameters": {
        "options": {},
        "modelName": "models/gemini-2.0-flash"
      },
      "typeVersion": 1
    },
    {
      "id": "47465499-74e8-4425-913a-2efd5c5e3441",
      "name": "Gemini Analytique",
      "type": "@n8n/n8n-nodes-langchain.lmChatGoogleGemini",
      "position": [
        80,
        880
      ],
      "parameters": {
        "options": {},
        "modelName": "models/gemini-2.0-flash"
      },
      "typeVersion": 1
    },
    {
      "id": "a7273940-82c8-44a9-8890-b00c1a741015",
      "name": "Mémoire Tampon de Chat Analytique",
      "type": "@n8n/n8n-nodes-langchain.memoryBufferWindow",
      "position": [
        240,
        880
      ],
      "parameters": {
        "sessionKey": "={{ $('Combined Fields').item.json.chat_memory_key }}",
        "sessionIdType": "customKey",
        "contextWindowLength": 10
      },
      "typeVersion": 1.3
    },
    {
      "id": "6b573a7d-a6f0-4290-b3f1-3e36785bbee1",
      "name": "Mémoire Tampon de Chat Factuelle",
      "type": "@n8n/n8n-nodes-langchain.memoryBufferWindow",
      "position": [
        240,
        340
      ],
      "parameters": {
        "sessionKey": "={{ $('Combined Fields').item.json.chat_memory_key }}",
        "sessionIdType": "customKey",
        "contextWindowLength": 10
      },
      "typeVersion": 1.3
    },
    {
      "id": "9b20e2d9-9c67-45a0-9dfe-03bafb62f67f",
      "name": "Gemini Opinion",
      "type": "@n8n/n8n-nodes-langchain.lmChatGoogleGemini",
      "position": [
        80,
        1420
      ],
      "parameters": {
        "options": {},
        "modelName": "models/gemini-2.0-flash"
      },
      "typeVersion": 1
    },
    {
      "id": "70489752-18b8-4676-a700-539c7b0fecb3",
      "name": "Mémoire Tampon de Chat d'Opinion",
      "type": "@n8n/n8n-nodes-langchain.memoryBufferWindow",
      "position": [
        240,
        1420
      ],
      "parameters": {
        "sessionKey": "={{ $('Combined Fields').item.json.chat_memory_key }}",
        "sessionIdType": "customKey",
        "contextWindowLength": 10
      },
      "typeVersion": 1.3
    },
    {
      "id": "e32a0ce3-2f72-43ca-b12a-03e3cf1b7818",
      "name": "Gemini Contextuel",
      "type": "@n8n/n8n-nodes-langchain.lmChatGoogleGemini",
      "position": [
        80,
        1960
      ],
      "parameters": {
        "options": {},
        "modelName": "models/gemini-2.0-flash"
      },
      "typeVersion": 1
    },
    {
      "id": "75c4f677-4c78-4a22-b0b4-44f3882c1a4e",
      "name": "Mémoire Tampon de Chat Contextuelle",
      "type": "@n8n/n8n-nodes-langchain.memoryBufferWindow",
      "position": [
        240,
        1960
      ],
      "parameters": {
        "sessionKey": "={{ $('Combined Fields').item.json.chat_memory_key }}",
        "sessionIdType": "customKey",
        "contextWindowLength": 10
      },
      "typeVersion": 1.3
    },
    {
      "id": "31d68f85-3cfc-4c93-81f7-c27070bf7307",
      "name": "Embeddings",
      "type": "@n8n/n8n-nodes-langchain.embeddingsGoogleGemini",
      "position": [
        1020,
        1100
      ],
      "parameters": {
        "modelName": "models/text-embedding-004"
      },
      "typeVersion": 1
    },
    {
      "id": "53910aea-7326-4d59-8585-693cb05afc3e",
      "name": "Note Adhésive",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        0,
        0
      ],
      "parameters": {
        "color": 7,
        "width": 700,
        "height": 520,
        "content": "## Factual Strategy\n**Retrieve precise facts and figures.**"
      },
      "typeVersion": 1
    },
    {
      "id": "87015bf7-0bf1-490f-90b4-96346cc51b7c",
      "name": "Note Adhésive1",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        0,
        540
      ],
      "parameters": {
        "color": 7,
        "width": 700,
        "height": 520,
        "content": "## Analytical Strategy\n**Provide comprehensive coverage of a topics and exploring different aspects.**"
      },
      "typeVersion": 1
    },
    {
      "id": "b14886e0-e513-405d-92d5-d4a417280546",
      "name": "Note Adhésive2",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        0,
        1080
      ],
      "parameters": {
        "color": 7,
        "width": 700,
        "height": 520,
        "content": "## Opinion Strategy\n**Gather diverse viewpoints on a subjective issue.**"
      },
      "typeVersion": 1
    },
    {
      "id": "77cd1373-d547-462b-85cb-6799e7fbae84",
      "name": "Note Adhésive3",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        0,
        1620
      ],
      "parameters": {
        "color": 7,
        "width": 700,
        "height": 520,
        "content": "## Contextual Strategy\n**Incorporate user-specific context to fine-tune the retrieval.**"
      },
      "typeVersion": 1
    },
    {
      "id": "edfe1620-b040-466c-a8b2-a6f2aae565c5",
      "name": "Concaténer le Contexte",
      "type": "n8n-nodes-base.summarize",
      "position": [
        1400,
        880
      ],
      "parameters": {
        "options": {},
        "fieldsToSummarize": {
          "values": [
            {
              "field": "document.pageContent",
              "separateBy": "other",
              "aggregation": "concatenate",
              "customSeparator": "={{ \"\\n\\n---\\n\\n\" }}"
            }
          ]
        }
      },
      "typeVersion": 1.1
    },
    {
      "id": "059f5b2e-52db-49f6-bee8-9dfcd5fd1ea4",
      "name": "Récupérer des Documents du Vector Store",
      "type": "@n8n/n8n-nodes-langchain.vectorStoreQdrant",
      "position": [
        1040,
        880
      ],
      "parameters": {
        "mode": "load",
        "topK": 10,
        "prompt": "={{ $json.prompt }}\n\nUser query: \n{{ $json.output }}",
        "options": {},
        "qdrantCollection": {
          "__rl": true,
          "mode": "id",
          "value": "={{ $('Combined Fields').item.json.vector_store_id }}"
        }
      },
      "typeVersion": 1.1
    },
    {
      "id": "d1a2de81-c92b-459a-a2b6-bb6d171ba712",
      "name": "Définir l'Invite et la Sortie",
      "type": "n8n-nodes-base.set",
      "position": [
        840,
        880
      ],
      "parameters": {
        "options": {},
        "assignments": {
          "assignments": [
            {
              "id": "1d782243-0571-4845-b8fe-4c6c4b55379e",
              "name": "output",
              "type": "string",
              "value": "={{ $json.output }}"
            },
            {
              "id": "547091fb-367c-44d4-ac39-24d073da70e0",
              "name": "prompt",
              "type": "string",
              "value": "={{ $json.prompt }}"
            }
          ]
        }
      },
      "typeVersion": 3.4
    },
    {
      "id": "aa7dede0-8241-4428-84db-7a403935a052",
      "name": "Gemini Réponse",
      "type": "@n8n/n8n-nodes-langchain.lmChatGoogleGemini",
      "position": [
        1680,
        1100
      ],
      "parameters": {
        "options": {},
        "modelName": "models/gemini-2.0-flash"
      },
      "typeVersion": 1
    },
    {
      "id": "5a33f53d-2297-46a1-9508-54c1e4f168be",
      "name": "Réponse",
      "type": "@n8n/n8n-nodes-langchain.agent",
      "position": [
        1720,
        880
      ],
      "parameters": {
        "text": "=User query: {{ $('Combined Fields').item.json.user_query }}",
        "options": {
          "systemMessage": "={{ $('Set Prompt and Output').item.json.prompt }}\n\nUse the following context (delimited by <ctx></ctx>) and the chat history to answer the user query.\n<ctx>\n{{ $json.concatenated_document_pageContent }}\n</ctx>"
        },
        "promptType": "define"
      },
      "typeVersion": 1.8
    },
    {
      "id": "888d1b5e-151f-4fb7-a201-fa8706af6ae8",
      "name": "Mémoire Tampon de Chat",
      "type": "@n8n/n8n-nodes-langchain.memoryBufferWindow",
      "position": [
        1860,
        1100
      ],
      "parameters": {
        "sessionKey": "={{ $('Combined Fields').item.json.chat_memory_key }}",
        "sessionIdType": "customKey",
        "contextWindowLength": 10
      },
      "typeVersion": 1.3
    },
    {
      "id": "573efafe-0dca-46d9-98a9-2684277d411d",
      "name": "Note Adhésive4",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        760,
        680
      ],
      "parameters": {
        "color": 7,
        "width": 820,
        "height": 580,
        "content": "## Perform adaptive retrieval\n**Find document considering both query and context.**"
      },
      "typeVersion": 1
    },
    {
      "id": "f4219ef7-16ce-4cc6-8b03-a9480b2daf55",
      "name": "Note Adhésive5",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        1600,
        680
      ],
      "parameters": {
        "color": 7,
        "width": 740,
        "height": 580,
        "content": "## Reply to the user integrating retrieval context"
      },
      "typeVersion": 1
    },
    {
      "id": "4fe79c8d-9670-4b2a-a7c7-5a2239906a14",
      "name": "Répondre à Webhook",
      "type": "n8n-nodes-base.respondToWebhook",
      "position": [
        2080,
        880
      ],
      "parameters": {
        "options": {}
      },
      "typeVersion": 1.1
    },
    {
      "id": "ff99bc5a-bbbe-457d-b979-2ca1e04bd980",
      "name": "Note Adhésive6",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        -760,
        680
      ],
      "parameters": {
        "color": 7,
        "width": 700,
        "height": 580,
        "content": "## User query classification\n**Classify the query into one of four categories: Factual, Analytical, Opinion, or Contextual.**"
      },
      "typeVersion": 1
    },
    {
      "id": "c632a735-5a96-4a70-bf4c-6126e2e193f1",
      "name": "Lorsqu'Exécuté par un Autre Workflow",
      "type": "n8n-nodes-base.executeWorkflowTrigger",
      "position": [
        -1320,
        760
      ],
      "parameters": {
        "workflowInputs": {
          "values": [
            {
              "name": "user_query"
            },
            {
              "name": "chat_memory_key"
            },
            {
              "name": "vector_store_id"
            }
          ]
        }
      },
      "typeVersion": 1.1
    },
    {
      "id": "332b925d-0581-4b92-a7ce-83cbc8f66254",
      "name": "Champs Combinés",
      "type": "n8n-nodes-base.set",
      "position": [
        -1000,
        880
      ],
      "parameters": {
        "options": {},
        "assignments": {
          "assignments": [
            {
              "id": "90ab73a2-fe01-451a-b9df-bffe950b1599",
              "name": "user_query",
              "type": "string",
              "value": "={{ $json.user_query || $json.chatInput }}"
            },
            {
              "id": "36686ff5-09fc-40a4-8335-a5dd1576e941",
              "name": "chat_memory_key",
              "type": "string",
              "value": "={{ $json.chat_memory_key || $('Chat').item.json.sessionId }}"
            },
            {
              "id": "4230c8f3-644c-4985-b710-a4099ccee77c",
              "name": "vector_store_id",
              "type": "string",
              "value": "={{ $json.vector_store_id || \"<ID HERE>\" }}"
            }
          ]
        }
      },
      "typeVersion": 3.4
    },
    {
      "id": "eab0d609-5d9b-4794-adb5-fd8a784f34b0",
      "name": "Note Adhésive7",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        -1340,
        1300
      ],
      "parameters": {
        "width": 1280,
        "height": 1300,
        "content": "# Adaptive RAG Workflow\n\nThis n8n workflow implements a version of the Adaptive Retrieval-Augmented Generation (RAG) approach. It classifies user queries and applies different retrieval and generation strategies based on the query type (Factual, Analytical, Opinion, or Contextual) to provide more relevant and tailored answers from a knowledge base stored in a Qdrant vector store.\n\n## How it Works\n\n1.  **Input Trigger:**\n    * The workflow can be initiated via the built-in Chat interface or triggered by another n8n workflow.\n    * It expects inputs: `user_query`, `chat_memory_key` (for conversation history), and `vector_store_id` (specifying the Qdrant collection).\n    * A `Set` node (`Combined Fields`) standardizes these inputs.\n\n2.  **Query Classification:**\n    * A Google Gemini agent (`Query Classification`) analyzes the `user_query`.\n    * It classifies the query into one of four categories:\n        * **Factual:** Seeking specific, verifiable information.\n        * **Analytical:** Requiring comprehensive analysis or explanation.\n        * **Opinion:** Asking about subjective matters or seeking diverse viewpoints.\n        * **Contextual:** Depending on user-specific or implied context.\n\n3.  **Adaptive Strategy Routing:**\n    * A `Switch` node routes the workflow based on the classification result from the previous step.\n\n4.  **Strategy Implementation (Query Adaptation):**\n    * Depending on the route, a specific Google Gemini agent adapts the query or approach:\n        * **Factual Strategy:** Rewrites the query for better precision, focusing on key entities (`Factual Strategy - Focus on Precision`).\n        * **Analytical Strategy:** Breaks down the main query into multiple sub-questions to ensure comprehensive coverage (`Analytical Strategy - Comprehensive Coverage`).\n        * **Opinion Strategy:** Identifies different potential perspectives or angles related to the query (`Opinion Strategy - Diverse Perspectives`).\n        * **Contextual Strategy:** Infers implied context needed to answer the query effectively (`Contextual Strategy - User Context Integration`).\n    * Each strategy path uses its own chat memory buffer for the adaptation step.\n\n5.  **Retrieval Prompt & Output Setup:**\n    * Based on the *original* query classification, a `Set` node (`Factual/Analytical/Opinion/Contextual Prompt and Output`, combined via connections to `Set Prompt and Output`) prepares:\n        * The output from the strategy step (e.g., rewritten query, sub-questions, perspectives).\n        * A tailored system prompt for the final answer generation agent, instructing it how to behave based on the query type (e.g., focus on precision for Factual, present diverse views for Opinion).\n\n6.  **Document Retrieval (RAG):**\n    * The `Retrieve Documents from Vector Store` node uses the adapted query/output from the strategy step to search the specified Qdrant collection (`vector_store_id`).\n    * It retrieves the top relevant document chunks using Google Gemini embeddings.\n\n7.  **Context Preparation:**\n    * The content from the retrieved document chunks is concatenated (`Concatenate Context`) to form a single context block for the final answer generation.\n\n8.  
**Answer Generation:**\n    * The final `Answer` agent (powered by Google Gemini) generates the response.\n    * It uses:\n        * The tailored system prompt set in step 5.\n        * The concatenated context from retrieved documents (step 7).\n        * The original `user_query`.\n        * The shared chat history (`Chat Buffer Memory` using `chat_memory_key`).\n\n9.  **Response:**\n    * The generated answer is sent back to the user via the `Respond to Webhook` node."
      },
      "typeVersion": 1
    },
    {
      "id": "1580b44f-bd48-43f8-b9ef-dbfd58042e68",
      "name": "Note Adhésive8",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        -1100,
        680
      ],
      "parameters": {
        "color": 7,
        "width": 320,
        "height": 580,
        "content": "## ⚠️  If using in Chat mode\n\nUpdate the `vector_store_id` variable to the corresponding Qdrant ID needed to perform the documents retrieval."
      },
      "typeVersion": 1
    },
    {
      "id": "df475a3d-cec6-4a29-8d33-cfe8a1ae3d6c",
      "name": "Note Adhésive9",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        -1720,
        1300
      ],
      "parameters": {
        "color": 5,
        "width": 360,
        "height": 200,
        "content": "## Quantra Labs \nFollow Us\nhttps://www.x.com/quantralabs\n\nConnect with Us\nhttps://www.linkedin.com/company/quantra-labs\n\nwww.quantralabs.com"
      },
      "typeVersion": 1
    }
  ],
  "active": false,
  "pinData": {},
  "settings": {
    "executionOrder": "v1"
  },
  "versionId": "c562673d-bddb-4abd-adef-59c2fb61e716",
  "connections": {
    "3c04f8e8-1304-436d-86eb-d905aa1cc261": {
      "main": [
        [
          {
            "node": "332b925d-0581-4b92-a7ce-83cbc8f66254",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "5a33f53d-2297-46a1-9508-54c1e4f168be": {
      "main": [
        [
          {
            "node": "4fe79c8d-9670-4b2a-a7c7-5a2239906a14",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "937104bc-5756-4adb-af0b-3e8741536e44": {
      "main": [
        [
          {
            "node": "7346bf45-3f9b-4717-8cb5-52d829f0826c",
            "type": "main",
            "index": 0
          }
        ],
        [
          {
            "node": "ac1df57d-524c-4393-a81c-fb720f19b05e",
            "type": "main",
            "index": 0
          }
        ],
        [
          {
            "node": "7df8350e-4f18-47fb-bd9a-c0238d218603",
            "type": "main",
            "index": 0
          }
        ],
        [
          {
            "node": "0f9ef12d-7df4-4255-b5e2-27eb4e7ce982",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "31d68f85-3cfc-4c93-81f7-c27070bf7307": {
      "ai_embedding": [
        [
          {
            "node": "059f5b2e-52db-49f6-bee8-9dfcd5fd1ea4",
            "type": "ai_embedding",
            "index": 0
          }
        ]
      ]
    },
    "aa7dede0-8241-4428-84db-7a403935a052": {
      "ai_languageModel": [
        [
          {
            "node": "5a33f53d-2297-46a1-9508-54c1e4f168be",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "7257c1dd-56bb-4f50-b206-2edd55fdd7cf": {
      "ai_languageModel": [
        [
          {
            "node": "7346bf45-3f9b-4717-8cb5-52d829f0826c",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "9b20e2d9-9c67-45a0-9dfe-03bafb62f67f": {
      "ai_languageModel": [
        [
          {
            "node": "7df8350e-4f18-47fb-bd9a-c0238d218603",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "332b925d-0581-4b92-a7ce-83cbc8f66254": {
      "main": [
        [
          {
            "node": "6cccf7c5-9d8b-4f11-a7e1-c1bcf48bb9fe",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "47465499-74e8-4425-913a-2efd5c5e3441": {
      "ai_languageModel": [
        [
          {
            "node": "ac1df57d-524c-4393-a81c-fb720f19b05e",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "e32a0ce3-2f72-43ca-b12a-03e3cf1b7818": {
      "ai_languageModel": [
        [
          {
            "node": "0f9ef12d-7df4-4255-b5e2-27eb4e7ce982",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "888d1b5e-151f-4fb7-a201-fa8706af6ae8": {
      "ai_memory": [
        [
          {
            "node": "5a33f53d-2297-46a1-9508-54c1e4f168be",
            "type": "ai_memory",
            "index": 0
          }
        ]
      ]
    },
    "edfe1620-b040-466c-a8b2-a6f2aae565c5": {
      "main": [
        [
          {
            "node": "5a33f53d-2297-46a1-9508-54c1e4f168be",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "6cccf7c5-9d8b-4f11-a7e1-c1bcf48bb9fe": {
      "main": [
        [
          {
            "node": "937104bc-5756-4adb-af0b-3e8741536e44",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "34577e85-b067-49dd-90a6-048805de5118": {
      "ai_languageModel": [
        [
          {
            "node": "6cccf7c5-9d8b-4f11-a7e1-c1bcf48bb9fe",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "d1a2de81-c92b-459a-a2b6-bb6d171ba712": {
      "main": [
        [
          {
            "node": "059f5b2e-52db-49f6-bee8-9dfcd5fd1ea4",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "e1daa9fc-62d2-4664-a9f3-dcdecf9071e6": {
      "main": [
        [
          {
            "node": "d1a2de81-c92b-459a-a2b6-bb6d171ba712",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "01c57856-4378-4085-bb67-2edf9b1164f9": {
      "main": [
        [
          {
            "node": "d1a2de81-c92b-459a-a2b6-bb6d171ba712",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "6b573a7d-a6f0-4290-b3f1-3e36785bbee1": {
      "ai_memory": [
        [
          {
            "node": "7346bf45-3f9b-4717-8cb5-52d829f0826c",
            "type": "ai_memory",
            "index": 0
          }
        ]
      ]
    },
    "70489752-18b8-4676-a700-539c7b0fecb3": {
      "ai_memory": [
        [
          {
            "node": "7df8350e-4f18-47fb-bd9a-c0238d218603",
            "type": "ai_memory",
            "index": 0
          }
        ]
      ]
    },
    "3cb3f5e1-c85c-4481-a147-8b3c419526ee": {
      "main": [
        [
          {
            "node": "d1a2de81-c92b-459a-a2b6-bb6d171ba712",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "1429dfd5-709d-4065-9134-05820fad871b": {
      "main": [
        [
          {
            "node": "d1a2de81-c92b-459a-a2b6-bb6d171ba712",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "a7273940-82c8-44a9-8890-b00c1a741015": {
      "ai_memory": [
        [
          {
            "node": "ac1df57d-524c-4393-a81c-fb720f19b05e",
            "type": "ai_memory",
            "index": 0
          }
        ]
      ]
    },
    "75c4f677-4c78-4a22-b0b4-44f3882c1a4e": {
      "ai_memory": [
        [
          {
            "node": "0f9ef12d-7df4-4255-b5e2-27eb4e7ce982",
            "type": "ai_memory",
            "index": 0
          }
        ]
      ]
    },
    "c632a735-5a96-4a70-bf4c-6126e2e193f1": {
      "main": [
        [
          {
            "node": "332b925d-0581-4b92-a7ce-83cbc8f66254",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "059f5b2e-52db-49f6-bee8-9dfcd5fd1ea4": {
      "main": [
        [
          {
            "node": "edfe1620-b040-466c-a8b2-a6f2aae565c5",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "7346bf45-3f9b-4717-8cb5-52d829f0826c": {
      "main": [
        [
          {
            "node": "e1daa9fc-62d2-4664-a9f3-dcdecf9071e6",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "7df8350e-4f18-47fb-bd9a-c0238d218603": {
      "main": [
        [
          {
            "node": "01c57856-4378-4085-bb67-2edf9b1164f9",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "ac1df57d-524c-4393-a81c-fb720f19b05e": {
      "main": [
        [
          {
            "node": "3cb3f5e1-c85c-4481-a147-8b3c419526ee",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "0f9ef12d-7df4-4255-b5e2-27eb4e7ce982": {
      "main": [
        [
          {
            "node": "1429dfd5-709d-4065-9134-05820fad871b",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}
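
Once the workflow is imported and activated, it can be exercised through the Chat trigger's webhook. The sketch below is hedged: the URL is hypothetical (copy the real one from the Chat node in your instance) and the "action" field is an assumption about the chat payload convention; sessionId and chatInput are the fields the Combined Fields node actually reads. When invoking it from another workflow instead, pass user_query, chat_memory_key, and vector_store_id, as defined on the When Executed by Another Workflow trigger.

# Hedged sketch of calling the imported workflow through its Chat trigger webhook.
import requests

CHAT_WEBHOOK_URL = "https://your-n8n-host/webhook/<chat-webhook-id>/chat"  # hypothetical URL

payload = {
    "action": "sendMessage",        # assumed chat-trigger convention
    "sessionId": "demo-session-1",  # reused as chat_memory_key by Combined Fields
    "chatInput": "What are the trade-offs of remote work?",
}

resp = requests.post(CHAT_WEBHOOK_URL, json=payload, timeout=120)
resp.raise_for_status()
print(resp.json())
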
Frequently asked questions

How do I use this workflow?

Copy the JSON configuration above, create a new workflow in your n8n instance, select "Import from JSON", paste the configuration, and adjust the credential settings to your needs.
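
If you prefer to script the import instead of pasting by hand, the n8n public REST API can create the workflow. A rough sketch, assuming the public API is enabled on your instance, an API key has been generated, and the JSON above has been saved to a local file; the set of fields accepted by the create endpoint can vary between n8n versions.

# Rough sketch of importing the workflow via the n8n public REST API.
import json
import requests

N8N_HOST = "https://your-n8n-host"   # hypothetical host
API_KEY = "your-n8n-api-key"         # created under Settings > n8n API

with open("adaptive_rag_chat_agent.json") as f:   # the JSON above, saved to a file
    workflow = json.load(f)

# Keep only the fields the create endpoint accepts; id/active/tags are read-only.
body = {key: workflow[key] for key in ("name", "nodes", "connections", "settings")}

resp = requests.post(
    f"{N8N_HOST}/api/v1/workflows",
    headers={"X-N8N-API-KEY": API_KEY},
    json=body,
    timeout=30,
)
resp.raise_for_status()
print("Imported workflow id:", resp.json().get("id"))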

Which scenarios is this workflow suited to?

Advanced - Artificial Intelligence

Is there a cost?

This workflow is completely free to use. Note, however, that third-party services used in the workflow (such as the Google Gemini API) may require payment on your side.

Workflow information
Difficulty level
Advanced
Number of nodes: 40
Categories: 1
Node types: 12
Difficulty description

Suited to advanced users; a complex workflow containing 16+ nodes

Author
Brandon Crenshaw

@brandononchain

Founder & Systems Architect | Quantra Labs Engineering scalable solutions across AI, Blockchain, and Fintech.

External links
View on n8n.io
