Complete backup solution for n8n workflows and credentials (local storage and FTP)
Advanced
This is an automation workflow in the Content Creation / Multimodal AI category, containing 28 nodes. It mainly uses the Ftp, N8n, Code, Merge, and Aggregate nodes, and provides a complete backup solution for n8n workflows and credentials using local storage and FTP.
Prerequisites
- No special prerequisites beyond a running n8n instance; after importing, set the environment variables and the FTP/SMTP credentials referenced in the "Init" node (or keep its fallback defaults)
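The "Init" node reads its configuration from environment variables and falls back to template defaults. A minimal sketch of setting them on the n8n host or container, using the variable names and default values taken from the workflow's Init node (adjust the paths to your own volumes):

```shell
# Environment variables read by the Init node (template defaults shown).
# Set these before running the workflow, or edit the fallback values
# inside the Init node instead.
export N8N_ADMIN_EMAIL="youremail@world.com"        # notification recipient
export N8N_PROJECTS_DIR="/files/n8n-projects-data"  # projects root (logs/reports)
export N8N_BACKUP_FOLDER="/files/n8n-backups"       # local backup target (must exist)
export N8N_FTP_BACKUP_FOLDER="/n8n-backups"         # remote FTP backup path
export TZ="Europe/Paris"                            # timezone used for date prefixes

# The workflow builds its date-prefixed backup folder from these values:
echo "${N8N_BACKUP_FOLDER}/$(date +%F)"
```

In a Docker setup these would typically go in the `environment:` section of the n8n service rather than an interactive shell.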
Nodes used (28)
Workflow Export
Import the JSON configuration below into n8n to use this workflow.
{
"meta": {
"instanceId": "",
"templateCredsSetupCompleted": true
},
"nodes": [
{
"id": "--0",
"name": "Sticky Note",
"type": "n8n-nodes-base.stickyNote",
"position": [
-112,
-1408
],
"parameters": {
"width": 1264,
"height": 528,
"content": "## Backup Workflows & Credentials to Disk & FTP\nConfigure \"Init\" node in 2 sections.\n### Workflow Standard Configuration\n```\n// Admin email for notifications\nconst N8N_ADMIN_EMAIL = $env.N8N_ADMIN_EMAIL || 'youremail@world.com';\n \n// Workflow name (auto-detected)\nconst WORKFLOW_NAME = $workflow.name;\n \n// Projects root directory on your server (see file structure in node)\nconst N8N_PROJECTS_DIR = $env.N8N_PROJECTS_DIR || '/files/n8n-projects-data';\n \n// Project folder name for this backup workflow\nconst PROJECT_FOLDER_NAME = \"Workflow-backups\"; // ⚠️ Your project folder name\n```\n### Workflow Custom Configuration\n```\n// Base backup path using your Docker and FTP volume configuration (folder must exist)\nconst BACKUP_FOLDER = $env.N8N_BACKUP_FOLDER || '/files/n8n-backups'; // ⚠️ Change the default value for your n8n backup folder\nconst FTP_BACKUP_FOLDER = $env.N8N_FTP_BACKUP_FOLDER || '/n8n-backups';\nconst FTPName = 'Your FTP Server Name'; // FTP server name for logging (display purposes only)\n \nconst credentials_backup_folder = \"n8n-credentials\"; // Credentials backup path\n```"
},
"typeVersion": 1
},
{
"id": "-2-1",
"name": "Sticky Note2",
"type": "n8n-nodes-base.stickyNote",
"position": [
-112,
-816
],
"parameters": {
"color": 4,
"width": 624,
"height": 832,
"content": "## Initialisation"
},
"typeVersion": 1
},
{
"id": "-3-2",
"name": "Sticky Note3",
"type": "n8n-nodes-base.stickyNote",
"position": [
576,
-816
],
"parameters": {
"color": 3,
"width": 1232,
"height": 272,
"content": "## Credentials’ backup & FTP upload"
},
"typeVersion": 1
},
{
"id": "-5-3",
"name": "Sticky Note5",
"type": "n8n-nodes-base.stickyNote",
"position": [
576,
-480
],
"parameters": {
"color": 5,
"width": 1232,
"height": 496,
"content": "## Workflows’ backup & FTP upload"
},
"typeVersion": 1
},
{
"id": "-6-4",
"name": "Sticky Note6",
"type": "n8n-nodes-base.stickyNote",
"position": [
1872,
-816
],
"parameters": {
"color": 6,
"width": 816,
"height": 832,
"content": "## Finalisation"
},
"typeVersion": 1
},
{
"id": "--5",
"name": "Backup Summary",
"type": "n8n-nodes-base.code",
"onError": "continueErrorOutput",
"position": [
2112,
-544
],
"parameters": {
"jsCode": "// Generate backup summary and statistics with binary data for log file\n// ✅ Enhanced to handle both credentials and workflows FTP upload results\n// ✅ Adapted for Aggregate + Merge structure\n// ✅ Handles disabled FTP nodes gracefully\n\n// ⚠️ CONFIGURE THIS NODE WITH:\n// SETTINGS > ON ERROR > CONTINUE (USING ERROR OUTPUT)\n\nconst allItems = $input.all();\nconst summaryDataInput = $(\"Init\").first().json;\n\nconsole.log('[Backup Summary] Total items received:', allItems.length);\n\n// Separate different data types from Merge\nconst credentialsFtpItems = allItems.filter(item => \n item.json.credentialsFtpUpload !== undefined\n);\n\nconst workflowsFtpItems = allItems.filter(item => \n item.json.workflowsFtpUpload !== undefined\n);\n\nconst aggregatedWorkflowItems = allItems.filter(item => \n item.json.workflows !== undefined\n);\n\nconsole.log('[Backup Summary] Credentials FTP items:', credentialsFtpItems.length);\nconsole.log('[Backup Summary] Workflows FTP items:', workflowsFtpItems.length);\nconsole.log('[Backup Summary] Aggregated workflow items:', aggregatedWorkflowItems.length);\n\n// Get credentials FTP upload data\nconst credentialsFtpData = credentialsFtpItems.length > 0 ? \n credentialsFtpItems[0].json.credentialsFtpUpload : null;\n\n// Get workflows FTP upload data\nconst workflowsFtpData = workflowsFtpItems.length > 0 ? \n workflowsFtpItems[0].json.workflowsFtpUpload : null;\n\n// Get aggregated workflows data\nconst workflowsData = aggregatedWorkflowItems.length > 0 ? 
\n aggregatedWorkflowItems[0].json.workflows : [];\n\n// Get original workflow count from Fetch Workflows node\nconst workflows = $(\"Fetch Workflows\").all();\nconst totalWorkflows = workflows.length;\n\nconsole.log('[Backup Summary] Total workflows fetched:', totalWorkflows);\nconsole.log('[Backup Summary] Aggregated workflows found:', workflowsData.length);\nconsole.log('[Backup Summary] Credentials FTP data found:', credentialsFtpData !== null);\nconsole.log('[Backup Summary] Workflows FTP data found:', workflowsFtpData !== null);\n\n// Get credentials export result directly from the Export Credentials node\nconst credentialsResult = $(\"Export Credentials\").first()?.json || null;\n\n// Process workflow data from aggregated structure with binary data\nconst successfulWrites = [];\nconst failedWrites = [];\nlet totalSize = 0;\n\n// Get aggregated item with binary data\nconst aggregatedItem = aggregatedWorkflowItems[0];\nconst binaryData = aggregatedItem?.binary || {};\n\n// Since we have aggregated data, we assume all workflows were written successfully\n// (failed writes would have been caught by error handling in previous nodes)\nworkflowsData.forEach((workflowFile, index) => {\n const fileName = workflowFile.fileName;\n const workflowName = fileName ? fileName.split('/').pop().replace('.json', '') : `workflow_${index + 1}`;\n \n // Try to match with original workflow data to get more details\n const originalWorkflow = workflows.find(w => {\n const cleanedName = workflowName.replace(/[^a-zA-Z0-9]/g, '_');\n return w.json.cleanedFileName === cleanedName || \n w.json.name === workflowName;\n });\n \n const nodesCount = originalWorkflow ? \n (originalWorkflow.json.nodes ? 
originalWorkflow.json.nodes.length : 0) : 0;\n \n // Get real file size from binary data\n let actualSize = 0;\n const binaryKey = `data_${index}` || `data${index + 1}` || Object.keys(binaryData)[index];\n \n if (binaryData[binaryKey] && binaryData[binaryKey].data) {\n // Calculate size from base64 data (base64 is ~33% larger than original)\n const base64Data = binaryData[binaryKey].data;\n actualSize = Math.floor(base64Data.length * 0.75); // Convert base64 length to actual bytes\n }\n \n console.log(`[Backup Summary] Workflow ${workflowName}: ${actualSize} bytes`);\n \n successfulWrites.push({\n fileName: fileName,\n workflowName: workflowName,\n filePath: fileName,\n contentSize: actualSize,\n nodesCount: nodesCount\n });\n \n totalSize += actualSize;\n});\n\nconsole.log(`[Backup Summary] Processed ${successfulWrites.length} successful workflow writes`);\n\n// Use time data from Init node instead of recalculating\nconst initTimeData = summaryDataInput.timeData;\nconst endTime = new Date();\nconst startTime = new Date(initTimeData.startTime);\nconst durationSeconds = Math.round((endTime.getTime() - startTime.getTime()) / 1000);\n\n// Process credentials export result\nconst credentialsExportSuccess = credentialsResult && \n credentialsResult.exitCode !== undefined && \n credentialsResult.exitCode === 0;\n\nconst credentialsExport = {\n attempted: credentialsResult !== null,\n success: credentialsExportSuccess,\n exitCode: credentialsResult?.exitCode,\n message: credentialsExportSuccess ? 
\n 'Credentials exported successfully' : \n `Credentials export failed with exit code ${credentialsResult?.exitCode}`,\n stdout: credentialsResult?.stdout || '',\n stderr: credentialsResult?.stderr || ''\n};\n\n// Process credentials FTP upload results with disabled node detection\nconst credentialsFtpSuccess = credentialsFtpData && \n credentialsFtpData.summary && \n credentialsFtpData.summary.overallSuccess;\n\nconst credentialsFtpUpload = {\n attempted: credentialsFtpData !== null,\n disabled: credentialsFtpData === null,\n success: credentialsFtpSuccess,\n totalFiles: credentialsFtpData?.totalFiles || 0,\n successfulUploads: credentialsFtpData?.successfulUploads || 0,\n failedUploads: credentialsFtpData?.failedUploads || 0,\n message: credentialsFtpData === null ? \n '⚠️ Warning: no FTP data received (node likely disabled)' :\n (credentialsFtpSuccess ? \n credentialsFtpData.summary.summary :\n credentialsFtpData?.summary?.summary || 'Credentials FTP upload failed')\n};\n\n// Process workflows FTP upload results with disabled node detection\nconst workflowsFtpSuccess = workflowsFtpData && \n workflowsFtpData.summary && \n workflowsFtpData.summary.overallSuccess;\n\nconst workflowsFtpUpload = {\n attempted: workflowsFtpData !== null,\n disabled: workflowsFtpData === null,\n success: workflowsFtpSuccess,\n totalFiles: workflowsFtpData?.totalFiles || 0,\n successfulUploads: workflowsFtpData?.successfulUploads || 0,\n failedUploads: workflowsFtpData?.failedUploads || 0,\n message: workflowsFtpData === null ? \n '⚠️ Warning: no FTP data received (node likely disabled)' :\n (workflowsFtpSuccess ? \n workflowsFtpData.summary.summary :\n workflowsFtpData?.summary?.summary || 'Workflows FTP upload failed')\n};\n\nconsole.log('[Backup Summary] Credentials export status:', credentialsExport.success ? '✅ Success' : '❌ Failed');\nconsole.log('[Backup Summary] Credentials FTP status:', credentialsFtpUpload.disabled ? '⚠️ Disabled' : (credentialsFtpUpload.success ? 
'✅ Success' : '❌ Failed'));\nconsole.log('[Backup Summary] Workflows FTP status:', workflowsFtpUpload.disabled ? '⚠️ Disabled' : (workflowsFtpUpload.success ? '✅ Success' : '❌ Failed'));\nconsole.log(`[Backup Summary] Final counts - Successful: ${successfulWrites.length}, Failed: ${failedWrites.length}`);\n\n// Create a backupSummary structure\nconst backupSummary = {\n totalWorkflows: totalWorkflows,\n errorCount: failedWrites.length,\n errors: failedWrites.map(fail => ({\n workflowName: fail.workflowName,\n error: fail.error\n }))\n};\n\n// Determine FTP folder status icon\nconst ftpFolderStatusIcon = (credentialsFtpUpload.disabled || workflowsFtpUpload.disabled) ? \n '⚠️' : \n ((credentialsFtpUpload.success && workflowsFtpUpload.success) ? '✅' : '❌');\n\n// Create comprehensive backup log using Init variables\nconst backupLog = {\n timestamp: endTime.toISOString(),\n localTimestamp: endTime.toLocaleString('en-GB', { timeZone: initTimeData.localTimezone }),\n localTimezone: initTimeData.localTimezone,\n backupFolder: summaryDataInput.customConfig.backupFolder,\n dateFolder: initTimeData.datePrefix,\n duration: {\n seconds: durationSeconds,\n formatted: `${Math.floor(durationSeconds / 60)}m ${durationSeconds % 60}s`,\n startTime: initTimeData.startTime,\n endTime: endTime.toISOString()\n },\n statistics: {\n totalWorkflows: backupSummary.totalWorkflows,\n successfulBackups: successfulWrites.length,\n failedBackups: failedWrites.length,\n preparationErrors: backupSummary.errorCount,\n credentialsExported: credentialsExport.success,\n credentialsFtpUploaded: credentialsFtpUpload.success,\n credentialsFtpDisabled: credentialsFtpUpload.disabled,\n credentialsFtpFiles: credentialsFtpUpload.totalFiles,\n workflowsFtpUploaded: workflowsFtpUpload.success,\n workflowsFtpDisabled: workflowsFtpUpload.disabled,\n workflowsFtpFiles: workflowsFtpUpload.totalFiles,\n totalSizeMB: Math.round(totalSize / 1024 / 1024 * 100) / 100,\n averageSizeKB: successfulWrites.length > 0 ? 
Math.round(totalSize / successfulWrites.length / 1024) : 0\n },\n credentialsExport,\n credentialsFtpUpload,\n workflowsFtpUpload,\n successfulFiles: successfulWrites.map(file => ({\n fileName: file.fileName,\n workflowName: file.workflowName,\n contentSize: file.contentSize,\n nodesCount: file.nodesCount\n })),\n failedFiles: failedWrites.map(file => ({\n fileName: file.fileName,\n workflowName: file.workflowName,\n error: file.error\n })),\n preparationErrors: backupSummary.errors || [],\n status: failedWrites.length === 0 && \n backupSummary.errorCount === 0 && \n credentialsExport.success && \n (!credentialsFtpUpload.attempted || credentialsFtpUpload.success) &&\n (!workflowsFtpUpload.attempted || workflowsFtpUpload.success) ? \n 'success' : 'partial_success',\n configuration: {\n backupConfig: summaryDataInput.customConfig.backupConfig,\n workflowName: summaryDataInput.workflowConfig.WORKFLOW_NAME,\n adminEmail: summaryDataInput.workflowConfig.N8N_ADMIN_EMAIL\n },\n debug: {\n totalItemsReceived: allItems.length,\n credentialsFtpItemsFound: credentialsFtpItems.length,\n workflowsFtpItemsFound: workflowsFtpItems.length,\n aggregatedWorkflowItemsFound: aggregatedWorkflowItems.length,\n workflowsInAggregate: workflowsData.length,\n totalWorkflowsFetched: totalWorkflows,\n credentialsResultFound: credentialsResult !== null\n }\n};\n\n// Convert backup log to binary data for Write Files from Disk node\nconst logJsonContent = JSON.stringify(backupLog, null, 2);\nconst logBinaryData = Buffer.from(logJsonContent, 'utf8').toString('base64');\n\n// Create console & email output lines array using Init variables\nconst startTimeLocal = new Date(initTimeData.startTime).toLocaleString('sv-SE', { timeZone: initTimeData.localTimezone });\nconst endTimeLocal = endTime.toLocaleString('sv-SE', { timeZone: initTimeData.localTimezone });\n\n// Determine status text for FTP uploads\nconst credentialsFtpStatus = credentialsFtpUpload.disabled ? 
'⚠️ Disabled' : \n (credentialsFtpUpload.success ? '✅ Yes' : '❌ Failed');\n\nconst workflowsFtpStatus = workflowsFtpUpload.disabled ? '⚠️ Disabled' : \n (workflowsFtpUpload.success ? '✅ Yes' : '❌ Failed');\n\nconst consoleLines = [\n '=================================',\n '📁 N8N WORKFLOWS BACKUP SUMMARY',\n '=================================',\n `📅 Date: ${initTimeData.localTimeFormatted}`,\n `🌍 Timezone: ${initTimeData.localTimezone}`,\n `⏱️ Duration: ${backupLog.duration.formatted}`,\n ` • Start time (local): ${startTimeLocal}`,\n ` • End time (local): ${endTimeLocal}`,\n `📂 Backup folder: ${summaryDataInput.customConfig.backupFolder}/${initTimeData.datePrefix} ${successfulWrites.length > 0 ? '✅' : '❌'}`,\n `📤 FTP folder (${summaryDataInput.customConfig.FTPName || 'FTP Server'}): ${summaryDataInput.customConfig.FTP_BACKUP_FOLDER}/${initTimeData.datePrefix} ${ftpFolderStatusIcon}`,\n `📊 Workflows found: ${backupLog.statistics.totalWorkflows}`,\n `✅ Successfully backed up: ${backupLog.statistics.successfulBackups}`,\n `❌ Failed backups: ${backupLog.statistics.failedBackups}`,\n `⚠️ Preparation errors: ${backupLog.statistics.preparationErrors}`,\n `🔐 Credentials exported: ${credentialsExport.success ? 
'✅ Yes' : '❌ Failed'}`,\n `📤 Credentials FTP uploaded: ${credentialsFtpStatus}`,\n `📤 Workflows FTP uploaded: ${workflowsFtpStatus}`,\n `💾 Total size: ${backupLog.statistics.totalSizeMB} MB`,\n `📈 Status: ${backupLog.status.toUpperCase()}`,\n '================================='\n];\n\n// Console summary - display all lines\nconsoleLines.forEach(line => console.log(line));\n\n// Add error details to console lines if any\nif (failedWrites.length > 0) {\n console.log('❌ Failed files:');\n consoleLines.push('❌ Failed files:');\n failedWrites.forEach(fail => {\n const errorLine = ` - ${fail.fileName}: ${fail.error}`;\n console.log(errorLine);\n consoleLines.push(errorLine);\n });\n}\n\n// Add credentials export details - only show errors, not success details\nif (!credentialsExport.success) {\n console.log('🔐 Credentials export error:');\n consoleLines.push('🔐 Credentials export error:');\n consoleLines.push(` - Exit code: ${credentialsExport.exitCode}`);\n if (credentialsExport.stderr) {\n consoleLines.push(` - Error: ${credentialsExport.stderr}`);\n }\n}\n\n// Add credentials FTP upload details\nif (credentialsFtpUpload.disabled) {\n console.log('Credentials FTP upload disabled:');\n consoleLines.push('⚠️ Credentials FTP upload:');\n consoleLines.push(` - Warning: no FTP data received (node likely disabled)`);\n} else if (!credentialsFtpUpload.success && credentialsFtpUpload.attempted) {\n console.log('Credentials FTP upload errors:');\n consoleLines.push('❌ Credentials FTP upload errors:');\n consoleLines.push(` - ${credentialsFtpUpload.message}`);\n \n if (credentialsFtpData && credentialsFtpData.uploads) {\n const failedUploads = credentialsFtpData.uploads.filter(upload => !upload.success);\n failedUploads.forEach(upload => {\n const errorLine = ` • ${upload.fileName}: ${upload.error || 'Upload failed'}`;\n console.log(errorLine);\n consoleLines.push(errorLine);\n });\n }\n}\n\n// Add workflows FTP upload details\nif (workflowsFtpUpload.disabled) {\n 
console.log('Workflows FTP upload disabled:');\n consoleLines.push('⚠️ Workflows FTP upload:');\n consoleLines.push(` - Warning: no FTP data received (node likely disabled)`);\n} else if (!workflowsFtpUpload.success && workflowsFtpUpload.attempted) {\n console.log('Workflows FTP upload errors:');\n consoleLines.push('❌ Workflows FTP upload errors:');\n consoleLines.push(` - ${workflowsFtpUpload.message}`);\n \n if (workflowsFtpData && workflowsFtpData.uploads) {\n const failedUploads = workflowsFtpData.uploads.filter(upload => !upload.success);\n failedUploads.forEach(upload => {\n const errorLine = ` • ${upload.fileName || upload.workflowName}: ${upload.error || 'Upload failed'}`;\n console.log(errorLine);\n consoleLines.push(errorLine);\n });\n }\n}\n\n// Create email log file content as plain text\nconst emailLogText = consoleLines.join('\\n');\n\n// Convert execution log to binary data as plain text\nconst emailLogBinaryData = Buffer.from(emailLogText, 'utf8').toString('base64'); \n\n// Create execution log file name using Init variables\nconst emailLogFileName = `${initTimeData.localDateTime}_exec_log.txt`;\nconst emailLogFilePath = `${summaryDataInput.workflowConfig.LOGS_PATH}/${emailLogFileName}`;\n\nconst emailLog = {\n data: emailLogBinaryData,\n mimeType: 'text/plain', \n fileName: emailLogFileName,\n fileExtension: 'txt'\n}\n\nconst summaryData = {\n backupLog,\n emailLogText,\n logPathFile: summaryDataInput.workflowConfig.logPathFileName,\n emailLogPath: emailLogFilePath,\n emailLogFileName: emailLogFileName,\n backupCompleted: true,\n workflowStep: 'completed'\n};\n\n// Return with binary data for BOTH log files\nreturn [{\n json: summaryData,\n binary: {\n data: {\n data: logBinaryData,\n mimeType: 'application/json',\n fileName: summaryDataInput.workflowConfig.logFileName,\n fileExtension: 'json'\n },\n emailLog\n }\n}];"
},
"typeVersion": 2
},
{
"id": "--6",
"name": "Write Backup Log",
"type": "n8n-nodes-base.readWriteFile",
"position": [
2320,
-560
],
"parameters": {
"options": {},
"fileName": "={{ $json.logPathFile }}",
"operation": "write"
},
"typeVersion": 1
},
{
"id": "--7",
"name": "Send Email",
"type": "n8n-nodes-base.emailSend",
"position": [
2528,
-560
],
"webhookId": "",
"parameters": {
"text": "=Workflow: {{ $workflow.name }}\nWorkflow has executed successfully.\n\n{{ $json.emailLogText }}\n\nFind the exported Workflow JSON files attached to this email.",
"options": {
"attachments": "data",
"appendAttribution": false
},
"subject": "=n8n SUCCESS: {{ $workflow.name }}",
"toEmail": "={{ $env.N8N_ADMIN_EMAIL }}",
"fromEmail": "admin <admin@example.com>",
"emailFormat": "text"
},
"typeVersion": 2.1
},
{
"id": "--8",
"name": "Write Email Log",
"type": "n8n-nodes-base.readWriteFile",
"position": [
2320,
-752
],
"parameters": {
"options": {},
"fileName": "={{ $json.emailLogPath }}",
"operation": "write",
"dataPropertyName": "emailLog"
},
"typeVersion": 1
},
{
"id": "--9",
"name": "Create Date Folder",
"type": "n8n-nodes-base.executeCommand",
"position": [
368,
-560
],
"parameters": {
"command": "=mkdir -p \"{{ $json.customConfig.backupFolder }}/{{ $json.customConfig.datePrefix }}\""
},
"typeVersion": 1
},
{
"id": "--10",
"name": "Init",
"type": "n8n-nodes-base.code",
"position": [
160,
-560
],
"parameters": {
"jsCode": "// Init Workflow Variables - Local timezone version\n// ✅ All dates/times handled in local timezone\n\n// ==========================================\n// 📅 LOCAL DATE/TIME INITIALIZATION\n// ==========================================\n\nconst now = new Date();\n\n// ==========================================\n// 🌍 USER-DEFINED TIMEZONE CONFIGURATION\n// ==========================================\n// ⚠️ IMPORTANT: This code runs on the n8n SERVER, not in your browser!\n// ⚠️ Configure the timezone where you want executions to be scheduled,\n// ⚠️ regardless of where your n8n server is physically located.\n//\n// 📍 Common timezone examples:\n// - 'Europe/Paris' → Central European Time (CET/CEST)\n// - 'America/New_York' → Eastern Time (EST/EDT)\n// - 'America/Chicago' → Central Time (CST/CDT)\n// - 'America/Los_Angeles'→ Pacific Time (PST/PDT)\n// - 'Asia/Tokyo' → Japan Standard Time (JST)\n// - 'Asia/Shanghai' → China Standard Time (CST)\n// - 'Australia/Sydney' → Australian Eastern Time (AET)\n// - 'UTC' → Coordinated Universal Time\n//\n// 🔧 TO CONFIGURE FOR YOUR USE CASE:\n// 1. Uncomment and edit the line below with your desired timezone\n// 2. 
Comment out the automatic detection line\n//\n// To use user-defined timezone instead, UNCOMMENT these lines:\n// const USER_TIMEZONE = 'Europe/Paris'; // 👈 EDIT THIS for your location\n// const LOCAL_TIMEZONE = USER_TIMEZONE; // 👈 EDIT THIS for your location\n//\n// For automatic detection (uses server timezone or environment variable):\nconst LOCAL_TIMEZONE = $env.TZ || 'Europe/Paris'; // 🟢 Default fallback\n// To use user-defined timezone instead, comment out the above line\n\n// Local date in YYYY-MM-DD format\nconst localDate = now.toLocaleDateString('sv-SE', { timeZone: LOCAL_TIMEZONE });\n\n// Local datetime in YYYY-MM-DD_HH-MM-SS format\nconst localDateTime = now.toLocaleString('sv-SE', { \n timeZone: LOCAL_TIMEZONE,\n year: 'numeric',\n month: '2-digit',\n day: '2-digit',\n hour: '2-digit',\n minute: '2-digit',\n second: '2-digit'\n}).replace(' ', '_').replace(/:/g, '-');\n\n// Local formatted time for display\nconst localTimeFormatted = now.toLocaleString('en-GB', { \n timeZone: LOCAL_TIMEZONE,\n year: 'numeric',\n month: '2-digit',\n day: '2-digit',\n hour: '2-digit',\n minute: '2-digit',\n second: '2-digit'\n});\n\nconst timestampUTC = now.toISOString().replace(/[:.]/g, '-').slice(0, 19); // 2025-01-20T09-30-45\n\nconsole.log('🚀 USER-DEFINED TIMEZONE VARIABLES INITIALISED (Local timezone):');\nconsole.log(` 🌍 Timezone: ${LOCAL_TIMEZONE}`);\nconsole.log(` 📅 Local date: ${localDate}`);\nconsole.log(` 📅 Local date/time: ${localDateTime}`);\nconsole.log(` 🕐 Local time: ${localTimeFormatted}`);\n\n// Create structure for time data output\nconst timeData = {\n nowUTC: now.toISOString(), // UTC for calculations\n localDate: localDate, // YYYY-MM-DD format\n localDateTime: localDateTime, // YYYY-MM-DD_HH-MM-SS format\n localTimeFormatted: localTimeFormatted, // Human readable\n localTimezone: LOCAL_TIMEZONE, // For reference\n datePrefix: localDate,\n timestampUTC,\n startTime: now.toISOString()\n}\n\n// ==========================================\n// 📝 WORKFLOW 
STANDARD CONFIGURATION\n// ==========================================\n\nconst N8N_ADMIN_EMAIL = $env.N8N_ADMIN_EMAIL || 'youremail@world.com';\nconst WORKFLOW_NAME = $workflow.name;\nconst N8N_PROJECTS_DIR = $env.N8N_PROJECTS_DIR || '/files/n8n-projects-data'; // ⚠️ Your projects’ ROOT folder here\n// projects-root-folder/\n// └── Your-project-folder-name/\n// ├── logs/\n// ├── reports/\n// ├── ...\n// └── [other project files]\nconst PROJECT_FOLDER_NAME = \"Workflow-backups\"; // ⚠️ Your project folder name\nconst PROJECT_ROOT_PATH = `${N8N_PROJECTS_DIR}/${PROJECT_FOLDER_NAME}`;\n// const N8N_MEDIA_ROOT_PATH = $env.N8N_MEDIA_SHARED || '/files/n8n-media-shared'; // ⚠️ Your public folder, accessible from inet\n// const mediaOutputFolder = \"/output\";\n// const mediaTempFolder = \"/temp\";\n// const N8N_FILE_SERVER_PURL = $env.N8N_FILE_SERVER_PURL || 'https://files.example.com'; // File serve public url\n\nconst LOGS_PATH = `${PROJECT_ROOT_PATH}/logs`;\n// const logFileName = `${localDateTime2}-${WORKFLOW_NAME.replace(/[^a-zA-Z0-9]/g, '_')}_logs.json`; // ⚠️ Your log file name\nconst logFileName = `${localDateTime}-backup_logs.json`; // ⚠️ Your log file name\nconst logPathFileName = `${LOGS_PATH}/${logFileName}`;\n\n// Configuration for report generation\nconst REPORTS_PATH = `${PROJECT_ROOT_PATH}/reports`;\nconst reportFileName = `${localDateTime}-report.txt`;\nconst reportPathFileName = `${REPORTS_PATH}/${reportFileName}`;\n\n// Console output\nconsole.log('🧾 STANDARD WORKFLOW VARIABLES INITIALISED:');\nconsole.log(` 📁 Admin email: ${N8N_ADMIN_EMAIL}`);\nconsole.log(` 📁 Workflow name: ${WORKFLOW_NAME}`);\nconsole.log(` 📁 Projects folder: ${N8N_PROJECTS_DIR}`);\nconsole.log(` 📁 Project folder name: ${PROJECT_FOLDER_NAME}`);\nconsole.log(` 📁 Project root path: ${PROJECT_ROOT_PATH}`);\n// console.log(` 📁 Media root path: ${N8N_MEDIA_ROOT_PATH}`);\n// console.log(` 📁 File server public URL: ${N8N_FILE_SERVER_PURL}`);\nconsole.log(` 📁 Log path: 
${LOGS_PATH}`);\nconsole.log(` 📁 Log file name: ${logFileName}`);\nconsole.log(` 📁 Log file path: ${logPathFileName}`);\nconsole.log(` 📁 Reports path: ${REPORTS_PATH}`);\nconsole.log(` 📁 Report file name: ${reportFileName}`);\nconsole.log(` 📁 Report file path: ${reportPathFileName}`);\n\n// Create structure for workflow configuration output\nconst workflowConfig = {\n workflowStep: 'init',\n N8N_ADMIN_EMAIL,\n WORKFLOW_NAME,\n N8N_PROJECTS_DIR,\n PROJECT_FOLDER_NAME,\n PROJECT_ROOT_PATH,\n // N8N_MEDIA_ROOT_PATH,\n // mediaOutputFolder,\n // mediaTempFolder,\n // N8N_FILE_SERVER_PURL,\n LOGS_PATH,\n logFileName,\n logPathFileName,\n REPORTS_PATH,\n reportFileName,\n reportPathFileName\n}\n\n// ==========================================\n// 📝 WORKFLOW CUSTOM CONFIGURATION\n// ==========================================\n// ⚠️ INSERT HERE: your workflow custom configuration variables\n\n// Base backup path using your Docker and FTP volume configuration (folder must exist)\nconst BACKUP_FOLDER = $env.N8N_BACKUP_FOLDER || '/files/n8n-backups'; // ⚠️ Change the default value for your n8n backup folder\nconst FTP_BACKUP_FOLDER = $env.N8N_FTP_BACKUP_FOLDER || '/n8n-backups';\nconst FTPName = 'Your FTP Server Name'; // FTP server name for logging (display purposes only)\n\nconst credentials_backup_folder = \"n8n-credentials\"; // Credentials backup path\n\n// Configuration for the backup process\nconst backupConfig = {\n datePrefix: localDate, // Prefix for all backup files\n fileNaming: 'name_only', // Options: 'name_only', 'id_only', 'name_and_id'\n maxRetries: 3,\n timeout: 30000,\n includeCredentials: false, // Security: Don't export credential data\n compression: false // Future enhancement\n};\n\n// Console output\nconsole.log('🧾 CUSTOM WORKFLOW VARIABLES INITIALISED:');\nconsole.log(` 💾 Backup folder: ${BACKUP_FOLDER}`);\nconsole.log(' File prefix:', localDate);\nconsole.log(' Timestamp (UTC):', timestampUTC);\n\nconst customConfig = {\n backupFolder: 
BACKUP_FOLDER,\n FTP_BACKUP_FOLDER,\n FTPName,\n credentials: credentials_backup_folder,\n ...backupConfig\n}\n\n// ==========================================\n// 📊 OUTPUT DATA\n// ==========================================\n\nconst initData = {\n timeData,\n workflowConfig,\n customConfig\n};\n\nreturn [{ json: initData }];"
},
"typeVersion": 2
},
{
"id": "--11",
"name": "Daily Backup",
"type": "n8n-nodes-base.scheduleTrigger",
"position": [
-48,
-560
],
"parameters": {
"rule": {
"interval": [
{
"triggerAtHour": 4
}
]
}
},
"typeVersion": 1.2
},
{
"id": "--12",
"name": "Convert to File",
"type": "n8n-nodes-base.convertToFile",
"position": [
1040,
-336
],
"parameters": {
"mode": "each",
"options": {
"format": true,
"fileName": "={{ $json.name }}"
},
"operation": "toJson",
"binaryPropertyName": "=data"
},
"typeVersion": 1.1
},
{
"id": "--13",
"name": "Fetch Workflows",
"type": "n8n-nodes-base.n8n",
"position": [
624,
-336
],
"parameters": {
"filters": {},
"requestOptions": {}
},
"typeVersion": 1
},
{
"id": "--14",
"name": "Clean Filename",
"type": "n8n-nodes-base.code",
"onError": "continueErrorOutput",
"position": [
832,
-336
],
"parameters": {
"jsCode": "// ==========================================\n// 🧹 FLEXIBLE FILENAME CLEANER\n// ==========================================\n// Purpose: Clean filename for cross-platform compatibility\n// Usage: Can work with current input OR reference another node\n//\n// Recommended Configuration:\n// **Node Settings**:\n// * ☑️ **On Error**: Continue (using error output)\n// * ☑️ **Error Workflow**: Your notification workflow\n// The node stops at the first error and immediately triggers your notification system! 🚨\n\n// ⚙️ CONFIGURATION\nconst SOURCE_FIELD = 'name'; // Field containing the filename\nconst SOURCE_NODE = 'Fetch Workflows'; // Set to node name like 'Fetch Workflows' OR null to use current input\n // Examples: 'Fetch Workflows', 'HTTP Request', 'Set'\n\n// Get input items\nconst inputItems = $input.all();\nconst itemCount = inputItems.length;\n\nconsole.log(`Flexible Filename Cleaner: Processing ${itemCount} item(s)`);\nconsole.log(`Configuration: SOURCE_FIELD=\"${SOURCE_FIELD}\", SOURCE_NODE=${SOURCE_NODE || 'current input'}`);\n\nif (itemCount === 0) {\n console.error('No items to process');\n throw new Error('No items to process');\n}\n\nconst outputItems = [];\n\ninputItems.forEach((item, index) => {\n try {\n let originalFileName;\n let sourceDescription;\n \n // Determine where to get the filename from\n if (SOURCE_NODE) {\n // Get from specified node\n try {\n // Use the corresponding item from the source node (same index)\n const sourceItems = $(SOURCE_NODE).all();\n if (sourceItems[index]) {\n originalFileName = sourceItems[index].json[SOURCE_FIELD];\n sourceDescription = `${SOURCE_NODE}[${index}].${SOURCE_FIELD}`;\n } else {\n // Fallback to first item if index doesn't exist\n originalFileName = $(SOURCE_NODE).first().json[SOURCE_FIELD];\n sourceDescription = `${SOURCE_NODE}[0].${SOURCE_FIELD} (fallback)`;\n }\n } catch (nodeError) {\n const errorMsg = `Cannot access node '${SOURCE_NODE}': ${nodeError.message}`;\n console.error(errorMsg);\n 
throw new Error(errorMsg);\n }\n } else {\n // Get from current input item\n originalFileName = item.json[SOURCE_FIELD];\n sourceDescription = `current_input[${index}].${SOURCE_FIELD}`;\n }\n \n if (!originalFileName) {\n const errorMsg = `No filename found in ${sourceDescription}`;\n console.error(errorMsg);\n throw new Error(errorMsg);\n }\n \n console.log(`Item ${index + 1}/${itemCount}: \"${originalFileName}\" from ${sourceDescription}`);\n \n // Clean filename for cross-platform compatibility\n const cleanedFileName = originalFileName\n // Remove forbidden characters for Windows/Linux\n .replace(/[<>:\"/\\\\|?*\\x00-\\x1F\\x7F]/g, '_')\n // Replace multiple spaces/dots with single underscore\n .replace(/[\\s.]+/g, '_')\n // Remove consecutive underscores\n .replace(/_+/g, '_')\n // Remove leading/trailing underscores\n .replace(/^_+|_+$/g, '')\n // Limit length to 180 chars\n .substring(0, 180);\n \n // Fallback if name becomes empty after cleaning\n const finalCleanedFileName = cleanedFileName || `file_${index + 1}`;\n \n console.log(`✅ Cleaned: \"${finalCleanedFileName}\"`);\n \n // Pass through all original data + add cleaned filename\n outputItems.push({\n json: {\n ...item.json,\n cleanedFileName: finalCleanedFileName\n },\n binary: item.binary || {}\n });\n \n } catch (error) {\n console.error(`❌ Error processing item ${index + 1}:`, error.message);\n \n // Log error and re-throw to trigger error workflow\n throw new Error(`Failed to process item ${index + 1}: ${error.message}`);\n }\n});\n\nconsole.log(`✅ Successfully processed ${outputItems.length} item(s)`);\nreturn outputItems;"
},
"typeVersion": 2
},
{
"id": "--15",
"name": "Write Each Workflow to Disk",
"type": "n8n-nodes-base.readWriteFile",
"position": [
1264,
-240
],
"parameters": {
"options": {
"append": false
},
"fileName": "={{ $('Init').first().json.customConfig.backupFolder }}/{{ $('Init').first().json.customConfig.datePrefix }}/{{ $('Clean Filename').all()[$itemIndex].json.cleanedFileName }}.json",
"operation": "write"
},
"typeVersion": 1
},
{
"id": "--16",
"name": "エラー: バックアップサマリー",
"type": "n8n-nodes-base.stopAndError",
"position": [
2336,
-368
],
"parameters": {
"errorMessage": "={{ $json.error }}"
},
"typeVersion": 1
},
{
"id": "--17",
"name": "エラー: ファイル名をクリーンアップ",
"type": "n8n-nodes-base.stopAndError",
"position": [
1040,
-144
],
"parameters": {
"errorMessage": "={{ $json.error }}"
},
"typeVersion": 1
},
{
"id": "--18",
"name": "ディスクからファイルを読み書き",
"type": "n8n-nodes-base.readWriteFile",
"position": [
1248,
-736
],
"parameters": {
"options": {},
"fileSelector": "={{ $json.localPath }}"
},
"typeVersion": 1
},
{
"id": "--19",
"name": "認証情報ファイルを一覧表示",
"type": "n8n-nodes-base.executeCommand",
"position": [
832,
-736
],
"parameters": {
"command": "=# List all JSON files in credentials backup folder\n# Command for Execute Command node\n\nls -1 \"{{ $('Init').first().json.customConfig.backupFolder }}/{{ $('Init').first().json.customConfig.datePrefix }}/{{ $('Init').first().json.customConfig.credentials }}\"/*.json 2>/dev/null || echo \"No JSON files found\""
},
"typeVersion": 1
},
{
"id": "--20",
"name": "認証情報アイテムを出力",
"type": "n8n-nodes-base.code",
"position": [
1040,
-736
],
"parameters": {
"jsCode": "// Process file list from Execute Command and create items for each file\n// ✅ Converts file list into individual items for Read Files node\n\n// ⚠️ CONFIGURE THIS NODE WITH:\n// SETTINGS > ON ERROR > CONTINUE (USING ERROR OUTPUT)\n\nconst initData = $('Init').first().json;\nconst commandResult = $input.first().json;\nconst exportResult = $('Export Credentials').first().json;\n\nconsole.log('[Process File List] Command result:', commandResult);\nconsole.log('[Process File List] Command stdout:', commandResult.stdout);\n\n// Check if command was successful\nif (commandResult.exitCode !== 0) {\n const errorMsg = `List files command failed with exit code ${commandResult.exitCode}`;\n console.error(`[Process File List] ${errorMsg}`);\n console.error(`[Process File List] Command stderr:`, commandResult.stderr);\n throw new Error(errorMsg);\n}\n\n// Parse the file list from stdout\nconst stdout = commandResult.stdout.trim();\n\nif (!stdout || stdout === \"No JSON files found\") {\n console.log('[Process File List] No credentials files found');\n return [{ json: { workflowStep: 'no_credentials_files', message: 'No credentials files found' } }];\n}\n\n// Split file paths and filter out empty lines\nconst filePaths = stdout.split('\\n').filter(path => path.trim().length > 0);\n\nconsole.log(`[Process File List] Found ${filePaths.length} credentials file(s)`);\n\nconst outputItems = filePaths.map((fullPath, index) => {\n // Extract filename from full path\n const fileName = fullPath.split('/').pop();\n \n // Build FTP remote path\n const ftpPath = `${initData.customConfig.FTP_BACKUP_FOLDER}/${initData.customConfig.datePrefix}/${initData.customConfig.credentials}/${fileName}`;\n \n console.log(`[Process File List] File ${index + 1}: ${fileName}`);\n console.log(` Local path: ${fullPath}`);\n console.log(` FTP path: ${ftpPath}`);\n \n return {\n json: {\n fileName: fileName,\n localPath: fullPath,\n ftpPath: ftpPath,\n fileType: 'credentials_file',\n workflowStep: 
'credentials_file_read',\n fileIndex: index + 1,\n totalFiles: filePaths.length,\n exportResult: {\n exitCode: exportResult.exitCode,\n stdout: exportResult.stdout,\n stderr: exportResult.stderr\n }\n }\n };\n});\n\nconsole.log(`[Process File List] ✅ Created ${outputItems.length} items for file reading`);\n\nreturn outputItems;"
},
"typeVersion": 2
},
{
"id": "--21",
"name": "集約",
"type": "n8n-nodes-base.aggregate",
"position": [
1472,
-240
],
"parameters": {
"options": {
"includeBinaries": true
},
"aggregate": "aggregateAllItemData",
"destinationFieldName": "workflows"
},
"typeVersion": 1
},
{
"id": "--22",
"name": "マージ",
"type": "n8n-nodes-base.merge",
"position": [
1920,
-560
],
"parameters": {
"numberInputs": 3
},
"typeVersion": 3.2
},
{
"id": "-FTP--23",
"name": "認証情報をFTPにアップロード",
"type": "n8n-nodes-base.ftp",
"onError": "continueRegularOutput",
"disabled": true,
"position": [
1456,
-736
],
"parameters": {
"path": "={{ $('Output Credential Items').item.json.ftpPath }}",
"operation": "upload"
},
"typeVersion": 1
},
{
"id": "FTP--24",
"name": "FTPロガー(認証情報)",
"type": "n8n-nodes-base.code",
"disabled": true,
"position": [
1664,
-736
],
"parameters": {
"jsCode": "// FTP Upload Results Logger for Credentials Files\n// ✅ Processes FTP upload results from individual credentials files and creates comprehensive logging data\n// ✅ Enhanced error handling for FTP connection timeouts and server errors\n\n// ⚠️ CONFIGURE THIS NODE WITH:\n// SETTINGS > ON ERROR > CONTINUE (USING ERROR OUTPUT)\n\nconst allFtpResults = $input.all();\nconst initData = $('Init').first().json;\n\nconsole.log('[FTP Logger] Processing FTP upload results for credentials files...');\nconsole.log(`[FTP Logger] Received ${allFtpResults.length} FTP result item(s)`);\n\nconst ftpUploadResults = {\n timestamp: new Date().toISOString(),\n localTimestamp: new Date().toLocaleString('en-GB', { timeZone: initData.timeData.timezone }),\n totalFiles: allFtpResults.length,\n successfulUploads: 0,\n failedUploads: 0,\n uploads: []\n};\n\n// Process each FTP result\nallFtpResults.forEach((result, index) => {\n try {\n // Handle different result structures (success vs error outputs)\n let uploadData;\n let isConnectionError = false;\n let connectionErrorMessage = null;\n \n // Check if this is an error result from FTP node\n if (result.error) {\n // This comes from the error output of FTP node\n isConnectionError = true;\n connectionErrorMessage = result.error;\n uploadData = result.json || {};\n console.log(`[FTP Logger] FTP node error detected: ${connectionErrorMessage}`);\n } else if (result.json && result.json.error) {\n // Error embedded in JSON response\n isConnectionError = true;\n connectionErrorMessage = result.json.error;\n uploadData = result.json;\n console.log(`[FTP Logger] FTP JSON error detected: ${connectionErrorMessage}`);\n } else {\n // Normal success/failure response\n uploadData = result.json || {};\n }\n \n // Determine if upload was successful\n const isSuccess = !isConnectionError && !uploadData.error && uploadData.success !== false;\n \n // Extract file information with fallbacks\n const fileName = uploadData.fileName || uploadData.path || 
`credentials_file_${index + 1}`;\n const ftpPath = uploadData.ftpPath || uploadData.path || 'unknown';\n \n const uploadLog = {\n fileName: fileName,\n fileType: uploadData.fileType || 'credentials_file',\n localPath: uploadData.localPath || 'unknown',\n ftpPath: ftpPath,\n success: isSuccess,\n uploadTime: new Date().toISOString(),\n error: connectionErrorMessage || uploadData.error || null,\n ftpResponse: uploadData.response || null,\n fileSize: uploadData.fileSize || null,\n fileIndex: uploadData.fileIndex || index + 1,\n connectionError: isConnectionError\n };\n \n if (isSuccess) {\n ftpUploadResults.successfulUploads++;\n console.log(`[FTP Logger] ✅ Successfully uploaded: ${uploadLog.fileName} (${uploadLog.fileIndex}/${uploadData.totalFiles || ftpUploadResults.totalFiles})`);\n } else {\n ftpUploadResults.failedUploads++;\n if (isConnectionError) {\n console.error(`[FTP Logger] 🔌 Connection error for: ${uploadLog.fileName} - ${connectionErrorMessage}`);\n } else {\n console.error(`[FTP Logger] ❌ Failed to upload: ${uploadLog.fileName} (${uploadLog.fileIndex}/${uploadData.totalFiles || ftpUploadResults.totalFiles})`);\n }\n if (uploadLog.error) {\n console.error(`[FTP Logger] Error details: ${uploadLog.error}`);\n }\n }\n \n ftpUploadResults.uploads.push(uploadLog);\n \n } catch (error) {\n console.error(`[FTP Logger] Error processing result ${index + 1}:`, error.message);\n \n // Add error entry for processing failures\n ftpUploadResults.uploads.push({\n fileName: `unknown_file_${index + 1}`,\n fileType: 'credentials_file',\n localPath: 'unknown',\n ftpPath: 'unknown',\n success: false,\n error: `Processing error: ${error.message}`,\n uploadTime: new Date().toISOString(),\n fileIndex: index + 1,\n connectionError: false\n });\n \n ftpUploadResults.failedUploads++;\n }\n});\n\n// Create simplified summary\nconst connectionErrors = ftpUploadResults.uploads.filter(upload => upload.connectionError);\nconst hasConnectionErrors = connectionErrors.length > 0;\n\n// 
Create simplified summary\nconst ftpSummary = {\n overallSuccess: ftpUploadResults.failedUploads === 0,\n summary: hasConnectionErrors ? \n `FTP server connection failed - credentials files could not be uploaded due to server connectivity issues` :\n `${ftpUploadResults.successfulUploads}/${ftpUploadResults.totalFiles} credentials files uploaded successfully`\n};\n\nconsole.log('[FTP Logger] Credentials Upload Summary:');\nconsole.log(` Total credentials files: ${ftpUploadResults.totalFiles}`);\nconsole.log(` Successful uploads: ${ftpUploadResults.successfulUploads}`);\nconsole.log(` Failed uploads: ${ftpUploadResults.failedUploads}`);\n\nif (hasConnectionErrors) {\n console.log('[FTP Logger] 🔌 FTP server connection failed');\n}\n\n// Add summary to results\nftpUploadResults.summary = ftpSummary;\n\n// Create output data that will be merged with Backup Summary\nconst credentialsFtpLog = {\n credentialsFtpUpload: ftpUploadResults\n};\n\nconsole.log('[FTP Logger] ✅ Credentials FTP upload logging completed');\n\nreturn [{ json: credentialsFtpLog }];"
},
"typeVersion": 2
},
{
"id": "FTP--25",
"name": "FTPロガー(ワークフロー)",
"type": "n8n-nodes-base.code",
"disabled": true,
"position": [
1472,
-432
],
"parameters": {
"jsCode": "// FTP Upload Results Logger for Workflow Files\n// ✅ Processes FTP upload results from workflow files and creates summary for Backup Summary\n// ✅ Enhanced error handling for FTP connection timeouts and server errors\n\n// ⚠️ CONFIGURE THIS NODE WITH:\n// SETTINGS > ON ERROR > CONTINUE (USING ERROR OUTPUT)\n\nconst allFtpResults = $input.all();\nconst initData = $('Init').first().json;\n\nconsole.log('[FTP Workflow Logger] Processing FTP upload results for workflow files...');\nconsole.log(`[FTP Workflow Logger] Received ${allFtpResults.length} FTP result item(s)`);\n\nconst ftpUploadResults = {\n timestamp: new Date().toISOString(),\n localTimestamp: new Date().toLocaleString('en-GB', { timeZone: initData.timeData.localTimezone }),\n totalFiles: allFtpResults.length,\n successfulUploads: 0,\n failedUploads: 0,\n uploads: []\n};\n\n// Process each FTP result\nallFtpResults.forEach((result, index) => {\n try {\n // Handle different result structures (success vs error outputs)\n let uploadData;\n let isConnectionError = false;\n let connectionErrorMessage = null;\n \n // Check if this is an error result from FTP node\n if (result.error) {\n // This comes from the error output of FTP node\n isConnectionError = true;\n connectionErrorMessage = result.error;\n uploadData = result.json || {};\n console.log(`[FTP Workflow Logger] FTP node error detected: ${connectionErrorMessage}`);\n } else if (result.json && result.json.error) {\n // Error embedded in JSON response\n isConnectionError = true;\n connectionErrorMessage = result.json.error;\n uploadData = result.json;\n console.log(`[FTP Workflow Logger] FTP JSON error detected: ${connectionErrorMessage}`);\n } else {\n // Normal success/failure response\n uploadData = result.json || {};\n }\n \n // Determine if upload was successful\n const isSuccess = !isConnectionError && !uploadData.error && uploadData.success !== false;\n \n // Extract file information with fallbacks\n // For workflows, try to get the 
filename from Clean Filename node (which has both name and cleanedFileName)\n let originalWorkflowName = `workflow_${index + 1}`;\n let cleanedFileName = `workflow_${index + 1}`;\n \n try {\n const cleanFilenameItems = $('Clean Filename').all();\n if (cleanFilenameItems[index]) {\n originalWorkflowName = cleanFilenameItems[index].json.name || `workflow_${index + 1}`;\n cleanedFileName = cleanFilenameItems[index].json.cleanedFileName || `workflow_${index + 1}`;\n }\n } catch (error) {\n console.log(`[FTP Workflow Logger] Could not access Clean Filename node for item ${index + 1}, using fallback`);\n }\n \n const fileName = uploadData.fileName || uploadData.path || `${cleanedFileName}.json`;\n const ftpPath = uploadData.ftpPath || uploadData.path || 'unknown';\n \n const uploadLog = {\n fileName: fileName,\n workflowName: originalWorkflowName,\n cleanedFileName: cleanedFileName,\n fileType: 'workflow_file',\n localPath: uploadData.localPath || 'unknown',\n ftpPath: ftpPath,\n success: isSuccess,\n uploadTime: new Date().toISOString(),\n error: connectionErrorMessage || uploadData.error || null,\n ftpResponse: uploadData.response || null,\n fileSize: uploadData.fileSize || null,\n fileIndex: uploadData.fileIndex || index + 1,\n connectionError: isConnectionError\n };\n \n if (isSuccess) {\n ftpUploadResults.successfulUploads++;\n console.log(`[FTP Workflow Logger] ✅ Successfully uploaded: ${originalWorkflowName} (${uploadLog.fileIndex}/${ftpUploadResults.totalFiles})`);\n } else {\n ftpUploadResults.failedUploads++;\n if (isConnectionError) {\n console.error(`[FTP Workflow Logger] 🔌 Connection error for: ${originalWorkflowName} - ${connectionErrorMessage}`);\n } else {\n console.error(`[FTP Workflow Logger] ❌ Failed to upload: ${originalWorkflowName} (${uploadLog.fileIndex}/${ftpUploadResults.totalFiles})`);\n }\n if (uploadLog.error) {\n console.error(`[FTP Workflow Logger] Error details: ${uploadLog.error}`);\n }\n }\n \n ftpUploadResults.uploads.push(uploadLog);\n 
\n } catch (error) {\n console.error(`[FTP Workflow Logger] Error processing result ${index + 1}:`, error.message);\n \n // Add error entry for processing failures\n ftpUploadResults.uploads.push({\n fileName: `unknown_workflow_${index + 1}`,\n workflowName: `unknown_workflow_${index + 1}`,\n fileType: 'workflow_file',\n localPath: 'unknown',\n ftpPath: 'unknown',\n success: false,\n error: `Processing error: ${error.message}`,\n uploadTime: new Date().toISOString(),\n fileIndex: index + 1,\n connectionError: false\n });\n \n ftpUploadResults.failedUploads++;\n }\n});\n\n// Check for connection errors specifically\nconst connectionErrors = ftpUploadResults.uploads.filter(upload => upload.connectionError);\nconst hasConnectionErrors = connectionErrors.length > 0;\n\n// Create simplified summary\nconst ftpSummary = {\n overallSuccess: ftpUploadResults.failedUploads === 0,\n summary: hasConnectionErrors ? \n `FTP server connection failed - workflow files could not be uploaded due to server connectivity issues` :\n `${ftpUploadResults.successfulUploads}/${ftpUploadResults.totalFiles} workflow files uploaded successfully`\n};\n\nconsole.log('[FTP Workflow Logger] Workflow Upload Summary:');\nconsole.log(` Total workflow files: ${ftpUploadResults.totalFiles}`);\nconsole.log(` Successful uploads: ${ftpUploadResults.successfulUploads}`);\nconsole.log(` Failed uploads: ${ftpUploadResults.failedUploads}`);\n\nif (hasConnectionErrors) {\n console.log('[FTP Workflow Logger] 🔌 FTP server connection failed');\n}\n\n// Add summary to results\nftpUploadResults.summary = ftpSummary;\n\n// Create output data that will be merged with Backup Summary\nconst workflowsFtpLog = {\n workflowsFtpUpload: ftpUploadResults\n};\n\nconsole.log('[FTP Workflow Logger] ✅ Workflow FTP upload logging completed');\n\nreturn [{ json: workflowsFtpLog }];"
},
"typeVersion": 2
},
{
"id": "--26",
"name": "認証情報をエクスポート",
"type": "n8n-nodes-base.executeCommand",
"position": [
624,
-736
],
"parameters": {
"command": "=# Export des credentials avec backup automatique\nn8n export:credentials --backup --output={{ $('Init').item.json.customConfig.backupFolder }}/{{ $('Init').item.json.customConfig.datePrefix }}/{{ $('Init').item.json.customConfig.credentials }}"
},
"typeVersion": 1
},
{
"id": "-FTP--27",
"name": "ワークフローをFTPにアップロード",
"type": "n8n-nodes-base.ftp",
"onError": "continueRegularOutput",
"disabled": true,
"position": [
1264,
-432
],
"parameters": {
"path": "={{ $('Init').first().json.customConfig.FTP_BACKUP_FOLDER }}/{{ $('Init').first().json.customConfig.datePrefix }}/{{ $('Clean Filename').all()[$itemIndex].json.cleanedFileName }}.json",
"operation": "upload"
},
"credentials": {
"ftp": {
"id": "",
"name": "Your FTP Server Name"
}
},
"typeVersion": 1
}
],
"pinData": {},
"connections": {
"--10": {
"main": [
[
{
"node": "--9",
"type": "main",
"index": 0
}
]
]
},
"--22": {
"main": [
[
{
"node": "--5",
"type": "main",
"index": 0
}
]
]
},
"--21": {
"main": [
[
{
"node": "--22",
"type": "main",
"index": 2
}
]
]
},
"--11": {
"main": [
[
{
"node": "--10",
"type": "main",
"index": 0
}
]
]
},
"--5": {
"main": [
[
{
"node": "--6",
"type": "main",
"index": 0
},
{
"node": "--8",
"type": "main",
"index": 0
}
],
[
{
"node": "--16",
"type": "main",
"index": 0
}
]
]
},
"--14": {
"main": [
[
{
"node": "--12",
"type": "main",
"index": 0
}
],
[
{
"node": "--17",
"type": "main",
"index": 0
}
]
]
},
"--12": {
"main": [
[
{
"node": "-FTP--27",
"type": "main",
"index": 0
},
{
"node": "--15",
"type": "main",
"index": 0
}
]
]
},
"--13": {
"main": [
[
{
"node": "--14",
"type": "main",
"index": 0
}
]
]
},
"--6": {
"main": [
[
{
"node": "--7",
"type": "main",
"index": 0
}
]
]
},
"--9": {
"main": [
[
{
"node": "--13",
"type": "main",
"index": 0
},
{
"node": "--26",
"type": "main",
"index": 0
}
]
]
},
"--26": {
"main": [
[
{
"node": "--19",
"type": "main",
"index": 0
}
]
]
},
"--19": {
"main": [
[
{
"node": "--20",
"type": "main",
"index": 0
}
]
]
},
"FTP--25": {
"main": [
[
{
"node": "--22",
"type": "main",
"index": 1
}
]
]
},
"--20": {
"main": [
[
{
"node": "--18",
"type": "main",
"index": 0
}
]
]
},
"-FTP--27": {
"main": [
[
{
"node": "FTP--25",
"type": "main",
"index": 0
}
]
]
},
"FTP--24": {
"main": [
[
{
"node": "--22",
"type": "main",
"index": 0
}
]
]
},
"-FTP--23": {
"main": [
[
{
"node": "FTP--24",
"type": "main",
"index": 0
}
]
]
},
"--18": {
"main": [
[
{
"node": "-FTP--23",
"type": "main",
"index": 0
}
]
]
},
"--15": {
"main": [
[
{
"node": "--21",
"type": "main",
"index": 0
}
]
]
}
}
}
Frequently Asked Questions
How do I use this workflow?
Copy the JSON configuration above, create a new workflow in your n8n instance, choose "Import from JSON", paste the configuration, and update the credentials as needed.
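After import, the credentials branch of this workflow lists the exported JSON files with `ls` and fans them out into one item per file for upload. The core of that file-list parsing step (a Code node in the JSON above) can be sketched as a standalone function; the sample paths and date below are purely illustrative:

```javascript
// Minimal sketch of the credentials file-list parsing step:
// turn the `ls` stdout into one item per credentials file.
function buildCredentialItems(stdout, config) {
  const trimmed = stdout.trim();
  if (!trimmed || trimmed === 'No JSON files found') {
    return [{ workflowStep: 'no_credentials_files' }];
  }
  // One absolute path per line; drop empty lines
  const filePaths = trimmed.split('\n').filter(p => p.trim().length > 0);
  return filePaths.map((fullPath, index) => {
    const fileName = fullPath.split('/').pop();
    return {
      fileName,
      localPath: fullPath,
      // The FTP remote path mirrors the local date-prefixed layout
      ftpPath: `${config.FTP_BACKUP_FOLDER}/${config.datePrefix}/${config.credentials}/${fileName}`,
      fileIndex: index + 1,
      totalFiles: filePaths.length,
    };
  });
}

// Example values only; real values come from the workflow's Init node
const items = buildCredentialItems(
  '/files/n8n-backups/2024-01-01/n8n-credentials/a.json\n' +
  '/files/n8n-backups/2024-01-01/n8n-credentials/b.json\n',
  { FTP_BACKUP_FOLDER: '/n8n-backups', datePrefix: '2024-01-01', credentials: 'n8n-credentials' }
);
console.log(items);
```

Inside n8n, each returned object would be wrapped as `{ json: ... }`; the sketch omits that wrapper to stay runnable outside the Code node.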
What scenarios is this workflow suited for?
Advanced - Content Creation, Multimodal AI
Is it paid?
This workflow is completely free. However, third-party services used by the workflow (such as the OpenAI API) may incur their own charges.
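Before writing each workflow to disk, the template runs the exported name through a "Clean Filename" Code node so the result is safe on both Windows and Linux. The same chain of replacements can be sketched as a standalone function; the sample input is illustrative:

```javascript
// Standalone sketch of the Clean Filename node's sanitization chain
function cleanFileName(originalFileName, index = 0) {
  const cleaned = originalFileName
    // Remove characters forbidden on Windows/Linux
    .replace(/[<>:"/\\|?*\x00-\x1F\x7F]/g, '_')
    // Collapse runs of whitespace and dots into a single underscore
    .replace(/[\s.]+/g, '_')
    // Collapse consecutive underscores
    .replace(/_+/g, '_')
    // Trim leading/trailing underscores
    .replace(/^_+|_+$/g, '')
    // Cap the length at 180 characters
    .substring(0, 180);
  // Fallback if the name is empty after cleaning
  return cleaned || `file_${index + 1}`;
}

console.log(cleanFileName('My Backup: Workflows / v2.1'));
```

Note that the dot replacement also strips any extension from the source name; the workflow re-appends `.json` when it builds the target path.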
Workflow Information
Difficulty
Advanced
Nodes: 28
Categories: 2
Node Types: 12
Author
Florent
@florent: IT Business Analyst for 8+ years, I am finding joy in developing again, with the help of n8n and AIs 🤗
External Links
View on n8n.io →