Email News MCP Template

Advanced

This is an automation workflow in the AI Chatbot and Multimodal AI domain, containing 18 nodes. It mainly uses nodes such as GmailTool, PerplexityTool, Agent, McpTrigger, and TavilyTool. An assistant for drafting emails and researching news, integrated with OpenAI, Gmail, Tavily, and Perplexity.

Prerequisites
  • Google account and Gmail API credentials
  • OpenAI API key
Workflow preview
Interactive diagram of the connections between nodes
Export workflow
Copy the JSON configuration below into n8n to import and use this workflow, then replace the <<<REPLACE_WITH_...>>> placeholders with your own values (see the sketch after the JSON)
{
  "id": "TgpCq3JAieEaFdGJ",
  "meta": {
    "templateCredsSetupCompleted": true
  },
  "name": "Email News MCP Template",
  "tags": [],
  "nodes": [
    {
      "id": "0606f766-255e-469c-8e6c-5751537ed3ab",
      "name": "Agentee de IA",
      "type": "@n8n/n8n-nodes-langchain.agent",
      "position": [
        192,
        -160
      ],
      "parameters": {
        "options": {
          "systemMessage": "You are a helpful email assistant.\n\n##Tool\nUse attached Email MCP Tool for emails when asked\n\nUse attached Email MCP Tool for "
        }
      },
      "typeVersion": 2.2
    },
    {
      "id": "225b0350-6eae-45fc-a158-da9961b8aafe",
      "name": "Cuando se recibe un mensaje de chat",
      "type": "@n8n/n8n-nodes-langchain.chatTrigger",
      "position": [
        0,
        -160
      ],
      "parameters": {
        "options": {}
      },
      "typeVersion": 1.3
    },
    {
      "id": "80fcfcad-1310-4cf2-a4df-bf6746339cfd",
      "name": "Modelo de Chat OpenAI",
      "type": "@n8n/n8n-nodes-langchain.lmChatOpenAi",
      "position": [
        48,
        48
      ],
      "parameters": {
        "model": {
          "__rl": true,
          "mode": "list",
          "value": "gpt-4.1-mini"
        },
        "options": {}
      },
      "typeVersion": 1.2
    },
    {
      "id": "7e3db391-7ede-4e92-9593-7a1288938d80",
      "name": "Memoria Simple",
      "type": "@n8n/n8n-nodes-langchain.memoryBufferWindow",
      "position": [
        224,
        48
      ],
      "parameters": {},
      "typeVersion": 1.3
    },
    {
      "id": "1b9a577c-3401-4081-be39-d5051922df38",
      "name": "Enviar un mensaje en Gmail",
      "type": "n8n-nodes-base.gmailTool",
      "position": [
        -144,
        480
      ],
      "parameters": {
        "sendTo": "<<<REPLACE_WITH_EMAIL>>>",
        "message": "<<<REPLACE_WITH_MESSAGE>>>",
        "options": {},
        "subject": "<<<REPLACE_WITH_SUBJECT>>>"
      },
      "typeVersion": 2.1
    },
    {
      "id": "fa6ae7d8-3d4d-4bd0-a4f9-d1d295f5f14b",
      "name": "Enviar un mensaje en Gmail1",
      "type": "n8n-nodes-base.gmailTool",
      "position": [
        64,
        480
      ],
      "parameters": {
        "sendTo": "<<<REPLACE_WITH_EMAIL>>>",
        "message": "<<<REPLACE_WITH_MESSAGE>>>",
        "options": {},
        "subject": "<<<REPLACE_WITH_SUBJECT>>>"
      },
      "typeVersion": 2.1
    },
    {
      "id": "252988a9-b546-4e1a-9d6f-338618b5781b",
      "name": "Enviar un mensaje en Gmail2",
      "type": "n8n-nodes-base.gmailTool",
      "position": [
        256,
        480
      ],
      "parameters": {
        "sendTo": "<<<REPLACE_WITH_EMAIL>>>",
        "message": "<<<REPLACE_WITH_MESSAGE>>>",
        "options": {},
        "subject": "<<<REPLACE_WITH_SUBJECT>>>"
      },
      "typeVersion": 2.1
    },
    {
      "id": "722718e7-8a84-44b4-98e3-a6eb53902a7c",
      "name": "Buscar en Tavily",
      "type": "@tavily/n8n-nodes-tavily.tavilyTool",
      "position": [
        512,
        480
      ],
      "parameters": {
        "query": "={{ /*n8n-auto-generated-fromAI-override*/ $fromAI('Query', ``, 'string') }}",
        "options": {}
      },
      "typeVersion": 1
    },
    {
      "id": "0bba3a97-e1ee-46f5-abec-7713d6ff2948",
      "name": "Enviar mensaje a un modelo en Perplexity",
      "type": "n8n-nodes-base.perplexityTool",
      "position": [
        688,
        480
      ],
      "parameters": {
        "options": {},
        "messages": {
          "message": [
            {
              "content": "={{ /*n8n-auto-generated-fromAI-override*/ $fromAI('message0_Text', ``, 'string') }}"
            }
          ]
        },
        "simplify": "={{ /*n8n-auto-generated-fromAI-override*/ $fromAI('Simplify_Output', ``, 'boolean') }}",
        "requestOptions": {}
      },
      "typeVersion": 1
    },
    {
      "id": "60b275c9-9e2f-4e3c-bc11-2477fe0bc951",
      "name": "Servidor MCP de Noticias",
      "type": "@n8n/n8n-nodes-langchain.mcpTrigger",
      "position": [
        544,
        256
      ],
      "parameters": {
        "path": "<<<REPLACE_WITH_PATH>>>"
      },
      "typeVersion": 2
    },
    {
      "id": "946b0a9d-590f-4633-ac98-ce983bbb205f",
      "name": "Servidor MCP de Correo",
      "type": "@n8n/n8n-nodes-langchain.mcpTrigger",
      "position": [
        -96,
        256
      ],
      "parameters": {
        "path": "<<<REPLACE_WITH_PATH>>>"
      },
      "typeVersion": 2
    },
    {
      "id": "34bff09d-95d1-446f-88cb-1c664d1ad754",
      "name": "Cliente MCP de Correo",
      "type": "@n8n/n8n-nodes-langchain.mcpClientTool",
      "position": [
        544,
        48
      ],
      "parameters": {
        "endpointUrl": "<<<REPLACE_WITH_ENDPOINT_URL>>>",
        "serverTransport": "httpStreamable"
      },
      "typeVersion": 1.1
    },
    {
      "id": "57587695-df6b-461d-8596-6561ce295f79",
      "name": "Cliente MCP de Noticias",
      "type": "@n8n/n8n-nodes-langchain.mcpClientTool",
      "position": [
        384,
        48
      ],
      "parameters": {
        "endpointUrl": "<<<REPLACE_WITH_ENDPOINT_URL>>>",
        "serverTransport": "httpStreamable"
      },
      "typeVersion": 1.1
    },
    {
      "id": "2e931983-39af-4b1d-9a16-e30cd536ff0b",
      "name": "Buscar en Tavily1",
      "type": "@tavily/n8n-nodes-tavily.tavilyTool",
      "position": [
        848,
        480
      ],
      "parameters": {
        "query": "={{ /*n8n-auto-generated-fromAI-override*/ $fromAI('Query', ``, 'string') }}",
        "options": {}
      },
      "typeVersion": 1
    },
    {
      "id": "c8fc2868-c029-454f-b47c-6cf2a4f2fb7c",
      "name": "Nota Adhesiva",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        -1024,
        -432
      ],
      "parameters": {
        "width": 736,
        "height": 1808,
        "content": "AI Agent MCP for Email & News Research \n\nBuild a chat-first MCP-powered research and outreach agent. This workflow lets you ask questions in an n8n chat, then the agent researches news (via Tavily + Perplexity through an MCP server) and drafts emails (via Gmail through a separate MCP server). It uses OpenAI for reasoning and short-term memory for coherent, multi‑turn conversations.\n\nWatch build along videos for workflows like these on: www.youtube.com/@automatewithmarc\n\nWhat this template does\n\nChat-native trigger: Start a conversation and ask for research or an email draft.\n\nMCP client tools: The agent talks to two MCP servers — one for Email work, one for News research.\n\nNews research stack: Uses Tavily (search) and Perplexity (LLM retrieval/answers) behind a News MCP server.\n\nEmail stack: Uses Gmail Tool to generate and send messages via an Email MCP server.\n\nReasoning + memory: OpenAI Chat Model + Simple Memory for context-aware, multi-step outputs.\n\nHow it works (node map)\n\nWhen chat message received → collects your prompt and routes it to the agent.\n\nAI Agent (system prompt = “helpful email assistant”) → orchestrates tools via MCP Clients.\n\nOpenAI Chat Model → reasoning/planning for research or email drafting.\n\nSimple Memory → keeps recent chat context for follow-ups.\n\nNews MCP Server exposes:\n\nTavily Tool (Search) and Perplexity Tool (Ask) for up-to-date findings.\n\nEmail MCP Server exposes:\n\nGmail Tool (To, Subject, Message via AI fields) to send or draft emails.\n\nThe MCP Clients (News/Email) plug into the Agent, so your single chat prompt can research and then draft/send emails in one flow.\n\nRequirements\n\nn8n (Cloud or self‑hosted)\n\nOpenAI API key for the Chat Model (set on the node)\n\nTavily, Perplexity, and Gmail credentials (connected on their respective tool nodes)\n\nPublicly reachable MCP Server endpoints (provided in the MCP Client nodes)\n\nSetup (quick start)\n\nImport the template and open it in the editor.\n\nConnect credentials on: OpenAI, Tavily, Perplexity, and Gmail tool nodes.\n\nConfirm MCP endpoints in both MCP Client nodes (News/Email) and leave transport as httpStreamable unless you have special requirements.\n\nRun the workflow. In chat, try:\n\n“Find today’s top stories on Kubernetes security and draft an intro email to Acme.”\n\n“Summarize the latest AI infra trends and email a 3‑bullet update to my team.”\n\nInputs & outputs\n\nInput: Natural-language prompt via chat trigger.\n\nTools used: News MCP (Tavily + Perplexity), Email MCP (Gmail).\n\nOutput: A researched summary and/or a drafted/sent email, returned in the chat and executed via Gmail when requested.\n\nWhy teams will love it\n\nOne prompt → research + outreach: No tab‑hopping between tools.\n\nUp-to-date answers: Pulls current info through Tavily/Perplexity.\n\nEmail finalization: Converts findings into send-ready drafts via Gmail.\n\nContext-aware: Memory keeps threads coherent across follow-ups.\n\nPro tips\n\nUse clear verbs in your prompt: “Research X, then email Y with Z takeaways.”\n\nFor safer runs, point Gmail to a test inbox first (or disable send and only draft).\n\nAdd guardrails in the Agent’s system message to match your voice/tone."
      },
      "typeVersion": 1
    },
    {
      "id": "226bc7c3-d026-4dea-adec-1d8fc5a5481b",
      "name": "Nota Adhesiva1",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        -144,
        -304
      ],
      "parameters": {
        "color": 5,
        "width": 928,
        "height": 512,
        "content": "Agent & MCP Client"
      },
      "typeVersion": 1
    },
    {
      "id": "4d9280da-af9b-4eab-be1a-9c25a6258022",
      "name": "Nota Adhesiva2",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        -256,
        224
      ],
      "parameters": {
        "color": 6,
        "width": 672,
        "height": 512,
        "content": "Email MCP Server"
      },
      "typeVersion": 1
    },
    {
      "id": "f55f5515-090b-4c3d-9e60-49e0588292a4",
      "name": "Nota Adhesiva3",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        432,
        224
      ],
      "parameters": {
        "color": 7,
        "width": 672,
        "height": 512,
        "content": "News Research MCP Server"
      },
      "typeVersion": 1
    }
  ],
  "active": false,
  "pinData": {},
  "settings": {
    "executionOrder": "v1"
  },
  "connections": {
    "7e3db391-7ede-4e92-9593-7a1288938d80": {
      "ai_memory": [
        [
          {
            "node": "Agente de IA",
            "type": "ai_memory",
            "index": 0
          }
        ]
      ]
    },
    "57587695-df6b-461d-8596-6561ce295f79": {
      "ai_tool": [
        [
          {
            "node": "Agente de IA",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    },
    "34bff09d-95d1-446f-88cb-1c664d1ad754": {
      "ai_tool": [
        [
          {
            "node": "Agente de IA",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    },
    "722718e7-8a84-44b4-98e3-a6eb53902a7c": {
      "ai_tool": [
        [
          {
            "node": "60b275c9-9e2f-4e3c-bc11-2477fe0bc951",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    },
    "80fcfcad-1310-4cf2-a4df-bf6746339cfd": {
      "ai_languageModel": [
        [
          {
            "node": "Agente de IA",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "2e931983-39af-4b1d-9a16-e30cd536ff0b": {
      "ai_tool": [
        [
          {
            "node": "60b275c9-9e2f-4e3c-bc11-2477fe0bc951",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    },
    "1b9a577c-3401-4081-be39-d5051922df38": {
      "ai_tool": [
        [
          {
            "node": "946b0a9d-590f-4633-ac98-ce983bbb205f",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    },
    "fa6ae7d8-3d4d-4bd0-a4f9-d1d295f5f14b": {
      "ai_tool": [
        [
          {
            "node": "946b0a9d-590f-4633-ac98-ce983bbb205f",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    },
    "252988a9-b546-4e1a-9d6f-338618b5781b": {
      "ai_tool": [
        [
          {
            "node": "946b0a9d-590f-4633-ac98-ce983bbb205f",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    },
    "225b0350-6eae-45fc-a158-da9961b8aafe": {
      "main": [
        [
          {
            "node": "Agente de IA",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "0bba3a97-e1ee-46f5-abec-7713d6ff2948": {
      "ai_tool": [
        [
          {
            "node": "60b275c9-9e2f-4e3c-bc11-2477fe0bc951",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    }
  }
}
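
The <<<REPLACE_WITH_...>>> placeholders above (Gmail recipient, message, and subject; MCP trigger paths; MCP client endpoint URLs) must be filled in after import. The Gmail fields are easiest to set in the editor (or to leave for the agent to populate via AI fields), but the paths and endpoint URLs can be filled programmatically before importing, as in the minimal sketch below. The file names, webhook paths, and instance URL are hypothetical, and the sketch assumes the production MCP endpoint follows the https://<instance>/mcp/<path> pattern shown on the MCP Server Trigger node; copy the exact URL from the trigger node if yours differs.

import json

# Hypothetical values: replace with your own n8n instance URL and the webhook
# paths you choose for the two MCP Server Trigger nodes.
N8N_BASE_URL = "https://your-instance.example.com"
NEWS_PATH = "news-mcp"
EMAIL_PATH = "email-mcp"

with open("email-news-mcp-template.json", "r", encoding="utf-8") as f:
    workflow = json.load(f)

for node in workflow["nodes"]:
    params = node.get("parameters", {})
    if node["name"] == "Servidor MCP de Noticias":
        params["path"] = NEWS_PATH
    elif node["name"] == "Servidor MCP de Correo":
        params["path"] = EMAIL_PATH
    elif node["name"] == "Cliente MCP de Noticias":
        # Assumes the production endpoint exposed by the MCP Server Trigger follows
        # https://<instance>/mcp/<path>; use the exact URL from the node if it differs.
        params["endpointUrl"] = f"{N8N_BASE_URL}/mcp/{NEWS_PATH}"
    elif node["name"] == "Cliente MCP de Correo":
        params["endpointUrl"] = f"{N8N_BASE_URL}/mcp/{EMAIL_PATH}"

with open("email-news-mcp-template.filled.json", "w", encoding="utf-8") as f:
    json.dump(workflow, f, ensure_ascii=False, indent=2)

The filled file can then be imported as described in the FAQ below, leaving only credentials and the Gmail fields to configure in the editor.
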
Frequently asked questions

How do I use this workflow?

Copy the JSON configuration code above, create a new workflow in your n8n instance, select "Import from JSON", paste the configuration, and then adjust the credential settings as needed.
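
If you prefer to script the import instead of pasting into the editor, the sketch below uses the n8n Public API. It assumes the Public API is enabled on your instance and that you have generated an API key; the instance URL, API key, and file name are hypothetical placeholders, and the exported extras (id, meta, tags, active, pinData) are stripped because the create endpoint only expects the core workflow fields.

import json
import requests

# Hypothetical placeholders: your n8n base URL and an API key created in the UI.
N8N_BASE_URL = "https://your-instance.example.com"
API_KEY = "YOUR_N8N_API_KEY"

with open("email-news-mcp-template.json", "r", encoding="utf-8") as f:
    template = json.load(f)

# Keep only the fields the create-workflow endpoint needs; read-only or
# export-only keys from the template are dropped to avoid validation errors.
payload = {
    "name": template["name"],
    "nodes": template["nodes"],
    "connections": template["connections"],
    "settings": template.get("settings", {}),
}

response = requests.post(
    f"{N8N_BASE_URL}/api/v1/workflows",
    headers={"X-N8N-API-KEY": API_KEY},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print("Imported workflow id:", response.json().get("id"))

After a successful import, open the workflow in the editor to attach credentials and fill in any remaining placeholders.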

For which scenarios is this workflow suitable?

Advanced: AI Chatbot, Multimodal AI

Is it paid?

This workflow is completely free; you can import and use it directly. However, note that third-party services used in the workflow (such as the OpenAI API) may require payment on your own account.

Workflow information
Difficulty level: Advanced
Number of nodes: 18
Categories: 2
Node types: 10
Difficulty description

Suitable for advanced users; complex workflows with 16+ nodes

Author
Automate With Marc

@marconi

Automating Start-Up and Business processes. Helping non-techies understand and leverage Agentic AI with easy to understand step-by-step tutorials. Check out my educational content: https://www.youtube.com/@Automatewithmarc

External links
View on n8n.io
