[3/3] Anomaly detection tool (crops dataset)
This is an automation workflow with 17 nodes in the AI and SecOps categories. It mainly uses Set, Code, HTTP Request, and Execute Workflow Trigger nodes, combined with AI technology for intelligent automation.
- Credentials for the target APIs may be required (here: Voyage AI and Qdrant Cloud)
Nodes used (17)
{
"id": "G8jRDBvwsMkkMiLN",
"meta": {
"instanceId": "205b3bc06c96f2dc835b4f00e1cbf9a937a74eeb3b47c99d0c30b0586dbf85aa"
},
"name": "[3/3] Anomaly detection tool (crops dataset)",
"tags": [
{
"id": "spMntyrlE9ydvWFA",
"name": "anomaly-detection",
"createdAt": "2024-12-08T22:05:15.945Z",
"updatedAt": "2024-12-09T12:50:19.287Z"
}
],
"nodes": [
{
"id": "e01bafec-eb24-44c7-b3c4-a60f91666350",
"name": "Sticky Note1",
"type": "n8n-nodes-base.stickyNote",
"position": [
-1200,
180
],
"parameters": {
"color": 6,
"width": 400,
"height": 740,
"content": "We are working here with crops dataset: \nExisting (so not anomalies) crops images in dataset are:\n- 'pearl_millet(bajra)',\n- 'tobacco-plant',\n- 'cherry',\n- 'cotton',\n- 'banana',\n- 'cucumber',\n- 'maize',\n- 'wheat',\n- 'clove',\n- 'jowar',\n- 'olive-tree',\n- 'soyabean',\n- 'coffee-plant',\n- 'rice',\n- 'lemon',\n- 'mustard-oil',\n- 'vigna-radiati(mung)',\n- 'coconut',\n- 'gram',\n- 'pineapple',\n- 'sugarcane',\n- 'sunflower',\n- 'chilli',\n- 'fox_nut(makhana)',\n- 'jute',\n- 'papaya',\n- 'tea',\n- 'cardamom',\n- 'almond'\n"
},
"typeVersion": 1
},
{
"id": "b9943781-de1f-4129-9b81-ed836e9ebb11",
"name": "Bild einbetten",
"type": "n8n-nodes-base.httpRequest",
"position": [
680,
60
],
"parameters": {
"url": "https://api.voyageai.com/v1/multimodalembeddings",
"method": "POST",
"options": {},
"jsonBody": "={{\n{\n \"inputs\": [\n {\n \"content\": [\n {\n \"type\": \"image_url\",\n \"image_url\": $('Image URL hardcode').first().json.imageURL\n }\n ]\n }\n ],\n \"model\": \"voyage-multimodal-3\",\n \"input_type\": \"document\"\n}\n}}",
"sendBody": true,
"specifyBody": "json",
"authentication": "genericCredentialType",
"genericAuthType": "httpHeaderAuth"
},
"credentials": {
"httpHeaderAuth": {
"id": "Vb0RNVDnIHmgnZOP",
"name": "Voyage API"
}
},
"typeVersion": 4.2
},
{
"id": "47b72bc2-4817-48c6-b517-c1328e402468",
"name": "Ähnlichkeit der Medoide abrufen",
"type": "n8n-nodes-base.httpRequest",
"position": [
940,
60
],
"parameters": {
"url": "={{ $('Variables for medoids').first().json.qdrantCloudURL }}/collections/{{ $('Variables for medoids').first().json.collectionName }}/points/query",
"method": "POST",
"options": {},
"jsonBody": "={{\n{\n \"query\": $json.data[0].embedding,\n \"using\": \"voyage\",\n \"limit\": $('Info About Crop Labeled Clusters').first().json.cropsNumber,\n \"with_payload\": true,\n \"filter\": {\n \"must\": [\n { \n \"key\": $('Variables for medoids').first().json.clusterCenterType,\n \"match\": {\n \"value\": true\n }\n }\n ]\n }\n}\n}}",
"sendBody": true,
"specifyBody": "json",
"authentication": "predefinedCredentialType",
"nodeCredentialType": "qdrantApi"
},
"credentials": {
"qdrantApi": {
"id": "it3j3hP9FICqhgX6",
"name": "QdrantApi account"
}
},
"typeVersion": 4.2
},
{
"id": "42d7eb27-ec38-4406-b5c4-27eb45358e93",
"name": "Scores vergleichen",
"type": "n8n-nodes-base.code",
"position": [
1140,
60
],
"parameters": {
"language": "python",
"pythonCode": "points = _input.first()['json']['result']['points']\nthreshold_type = _('Variables for medoids').first()['json']['clusterThresholdCenterType']\n\nmax_score = -1\ncrop_with_max_score = None\nundefined = True\n\nfor center in points:\n if center['score'] >= center['payload'][threshold_type]:\n undefined = False\n if center['score'] > max_score:\n max_score = center['score']\n crop_with_max_score = center['payload']['crop_name']\n\nif undefined:\n result_message = \"ALERT, we might have a new undefined crop!\"\nelse:\n result_message = f\"Looks similar to {crop_with_max_score}\"\n\nreturn [{\n \"json\": {\n \"result\": result_message\n }\n}]\n"
},
"typeVersion": 2
},
{
"id": "23aa604a-ff0b-4948-bcd5-af39300198c0",
"name": "Sticky Note4",
"type": "n8n-nodes-base.stickyNote",
"position": [
-1200,
-220
],
"parameters": {
"width": 400,
"height": 380,
"content": "## Crop Anomaly Detection Tool\n### This is the tool that can be used directly for anomalous crops detection. \nIt takes as input (any) **image URL** and returns a **text message** telling if whatever this image depicts is anomalous to the crop dataset stored in Qdrant. \n\n* An Image URL is received via the Execute Workflow Trigger which is used to generate embedding vectors via the Voyage.ai Embeddings API.\n* The returned vectors are used to query the Qdrant collection to determine if the given crop is known by comparing it to **threshold scores** of each image class (crop type).\n* If the image scores lower than all thresholds, then the image is considered an anomaly for the dataset."
},
"typeVersion": 1
},
{
"id": "3a79eca2-44f9-4aee-8a0d-9c7ca2f9149d",
"name": "Variablen für Medoide",
"type": "n8n-nodes-base.set",
"position": [
-200,
60
],
"parameters": {
"options": {},
"assignments": {
"assignments": [
{
"id": "dbbc1e7b-c63e-4ff1-9524-8ef3e9f6cd48",
"name": "clusterCenterType",
"type": "string",
"value": "is_medoid"
},
{
"id": "a994ce37-2530-4030-acfb-ec777a7ddb05",
"name": "qdrantCloudURL",
"type": "string",
"value": "https://152bc6e2-832a-415c-a1aa-fb529f8baf8d.eu-central-1-0.aws.cloud.qdrant.io"
},
{
"id": "12f0a9e6-686d-416e-a61b-72d034ec21ba",
"name": "collectionName",
"type": "string",
"value": "=agricultural-crops"
},
{
"id": "4c88a617-d44f-4776-b457-8a1dffb1d03c",
"name": "clusterThresholdCenterType",
"type": "string",
"value": "is_medoid_cluster_threshold"
}
]
}
},
"typeVersion": 3.4
},
{
"id": "13b25434-bd66-4293-93f1-26c67b9ec7dd",
"name": "Sticky Note3",
"type": "n8n-nodes-base.stickyNote",
"position": [
-340,
260
],
"parameters": {
"color": 6,
"width": 360,
"height": 200,
"content": "**clusterCenterType** - either\n* \"is_text_anchor_medoid\" or\n* \"is_medoid\"\n\n\n**clusterThresholdCenterType** - either\n* \"is_text_anchor_medoid_cluster_threshold\" or\n* \"is_medoid_cluster_threshold\""
},
"typeVersion": 1
},
{
"id": "869b0962-6cae-487d-8230-539a0cc4c14c",
"name": "Info zu beschrifteten Crop-Clustern",
"type": "n8n-nodes-base.set",
"position": [
440,
60
],
"parameters": {
"options": {},
"assignments": {
"assignments": [
{
"id": "5327b254-b703-4a34-a398-f82edb1d6d6b",
"name": "=cropsNumber",
"type": "number",
"value": "={{ $json.result.hits.length }}"
}
]
}
},
"typeVersion": 3.4
},
{
"id": "5d3956f8-f43b-439e-b176-a594a21d8011",
"name": "Gesamtpunkte in Sammlung",
"type": "n8n-nodes-base.httpRequest",
"position": [
40,
60
],
"parameters": {
"url": "={{ $json.qdrantCloudURL }}/collections/{{ $json.collectionName }}/points/count",
"method": "POST",
"options": {},
"jsonBody": "={\n \"exact\": true\n}",
"sendBody": true,
"specifyBody": "json",
"authentication": "predefinedCredentialType",
"nodeCredentialType": "qdrantApi"
},
"credentials": {
"qdrantApi": {
"id": "it3j3hP9FICqhgX6",
"name": "QdrantApi account"
}
},
"typeVersion": 4.2
},
{
"id": "14ba3db9-3965-4b20-b333-145616d45c3a",
"name": "Anzahl je Crop",
"type": "n8n-nodes-base.httpRequest",
"position": [
240,
60
],
"parameters": {
"url": "={{ $('Variables for medoids').first().json.qdrantCloudURL }}/collections/{{ $('Variables for medoids').first().json.collectionName }}/facet",
"method": "POST",
"options": {},
"jsonBody": "={{\n{\n \"key\": \"crop_name\",\n \"limit\": $json.result.count,\n \"exact\": true\n}\n}}",
"sendBody": true,
"specifyBody": "json",
"authentication": "predefinedCredentialType",
"nodeCredentialType": "qdrantApi"
},
"credentials": {
"qdrantApi": {
"id": "it3j3hP9FICqhgX6",
"name": "QdrantApi account"
}
},
"typeVersion": 4.2
},
{
"id": "e37c6758-0556-4a56-ab14-d4df663cb53a",
"name": "Image URL hardcode",
"type": "n8n-nodes-base.set",
"position": [
-480,
60
],
"parameters": {
"options": {},
"assignments": {
"assignments": [
{
"id": "46ceba40-fb25-450c-8550-d43d8b8aa94c",
"name": "imageURL",
"type": "string",
"value": "={{ $json.query.imageURL }}"
}
]
}
},
"typeVersion": 3.4
},
{
"id": "b24ad1a7-0cf8-4acc-9c18-6fe9d58b10f2",
"name": "Execute Workflow Trigger",
"type": "n8n-nodes-base.executeWorkflowTrigger",
"position": [
-720,
60
],
"parameters": {},
"typeVersion": 1
},
{
"id": "50424f2b-6831-41bf-8de4-81f69d901ce1",
"name": "Sticky Note2",
"type": "n8n-nodes-base.stickyNote",
"position": [
-240,
-80
],
"parameters": {
"width": 180,
"height": 120,
"content": "Variables to access Qdrant's collection we uploaded & prepared for anomaly detection in 2 previous pipelines\n"
},
"typeVersion": 1
},
{
"id": "2e8ed3ca-1bba-4214-b34b-376a237842ff",
"name": "Sticky Note5",
"type": "n8n-nodes-base.stickyNote",
"position": [
40,
-120
],
"parameters": {
"width": 560,
"height": 140,
"content": "These three nodes are needed just to figure out how many different classes (crops) we have in our Qdrant collection: **cropsNumber** (needed in *\"Get similarity of medoids\"* node. \n[Note] *\"Total Points in Collection\"* -> *\"Each Crop Counts\"* were used&explained already in *\"[2/4] Set up medoids (2 types) for anomaly detection (crops dataset)\"* pipeline.\n"
},
"typeVersion": 1
},
{
"id": "e2fa5763-6e97-4ff5-8919-1cb85a3c6968",
"name": "Sticky Note6",
"type": "n8n-nodes-base.stickyNote",
"position": [
620,
240
],
"parameters": {
"height": 120,
"content": "Here, we're embedding the image passed to this workflow tool with the Voyage embedding model to compare the image to all crop images in the database."
},
"typeVersion": 1
},
{
"id": "cdb6b8d3-f7f4-4d66-850f-ce16c8ed98b9",
"name": "Sticky Note7",
"type": "n8n-nodes-base.stickyNote",
"position": [
920,
220
],
"parameters": {
"width": 400,
"height": 180,
"content": "Checking how similar the image is to all the centres of clusters (crops).\nIf it's more similar to the thresholds we set up and stored in centres in the previous workflow, the image probably belongs to this crop class; otherwise, it's anomalous to the class. \nIf image is anomalous to all the classes, it's an anomaly."
},
"typeVersion": 1
},
{
"id": "03b4699f-ba43-4f5f-ad69-6f81deea2641",
"name": "Sticky Note22",
"type": "n8n-nodes-base.stickyNote",
"position": [
-620,
580
],
"parameters": {
"color": 4,
"width": 540,
"height": 300,
"content": "### For anomaly detection\n1. The first pipeline is uploading (crops) dataset to Qdrant's collection.\n2. The second pipeline sets up cluster (class) centres in this Qdrant collection & cluster (class) threshold scores.\n3. **This is the anomaly detection tool, which takes any image as input and uses all preparatory work done with Qdrant (crops) collection.**\n\n### To recreate it\nYou'll have to upload [crops](https://www.kaggle.com/datasets/mdwaquarazam/agricultural-crops-image-classification) dataset from Kaggle to your own Google Storage bucket, and re-create APIs/connections to [Qdrant Cloud](https://qdrant.tech/documentation/quickstart-cloud/) (you can use **Free Tier** cluster), Voyage AI API & Google Cloud Storage\n\n**In general, pipelines are adaptable to any dataset of images**\n"
},
"typeVersion": 1
}
],
"active": false,
"pinData": {
"Execute Workflow Trigger": [
{
"json": {
"query": {
"imageURL": "https://storage.googleapis.com/n8n-qdrant-demo/agricultural-crops%2Fcotton%2Fimage%20(36).jpg"
}
}
}
]
},
"settings": {
"executionOrder": "v1"
},
"versionId": "f67b764b-9e1a-4db0-b9f2-490077a62f74",
"connections": {
"b9943781-de1f-4129-9b81-ed836e9ebb11": {
"main": [
[
{
"node": "47b72bc2-4817-48c6-b517-c1328e402468",
"type": "main",
"index": 0
}
]
]
},
"14ba3db9-3965-4b20-b333-145616d45c3a": {
"main": [
[
{
"node": "869b0962-6cae-487d-8230-539a0cc4c14c",
"type": "main",
"index": 0
}
]
]
},
"e37c6758-0556-4a56-ab14-d4df663cb53a": {
"main": [
[
{
"node": "3a79eca2-44f9-4aee-8a0d-9c7ca2f9149d",
"type": "main",
"index": 0
}
]
]
},
"3a79eca2-44f9-4aee-8a0d-9c7ca2f9149d": {
"main": [
[
{
"node": "5d3956f8-f43b-439e-b176-a594a21d8011",
"type": "main",
"index": 0
}
]
]
},
"b24ad1a7-0cf8-4acc-9c18-6fe9d58b10f2": {
"main": [
[
{
"node": "e37c6758-0556-4a56-ab14-d4df663cb53a",
"type": "main",
"index": 0
}
]
]
},
"47b72bc2-4817-48c6-b517-c1328e402468": {
"main": [
[
{
"node": "42d7eb27-ec38-4406-b5c4-27eb45358e93",
"type": "main",
"index": 0
}
]
]
},
"5d3956f8-f43b-439e-b176-a594a21d8011": {
"main": [
[
{
"node": "14ba3db9-3965-4b20-b333-145616d45c3a",
"type": "main",
"index": 0
}
]
]
},
"869b0962-6cae-487d-8230-539a0cc4c14c": {
"main": [
[
{
"node": "b9943781-de1f-4129-9b81-ed836e9ebb11",
"type": "main",
"index": 0
}
]
]
}
}
}
How do I use this workflow?
Copy the JSON above, create a new workflow in your n8n instance, and choose "Import from JSON". Paste the configuration and adjust the credentials as needed.
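The core logic of this tool can also be reproduced outside n8n. The Python snippet below is a minimal sketch (not part of the template) mirroring the "Embed image", "Get similarity of medoids", and "Compare scores" nodes: it embeds an image URL with the Voyage multimodal API, retrieves the medoid point of every crop cluster from Qdrant, and compares each similarity score against the per-cluster threshold stored in the medoid payload. The API keys, the cluster URL, and the use of the requests library are assumptions; the auth headers shown are the standard ones for Voyage AI and Qdrant Cloud, but verify them against your own accounts.

import requests

# Placeholders (assumptions): substitute your own keys and Qdrant Cloud cluster URL.
VOYAGE_API_KEY = "YOUR_VOYAGE_API_KEY"
QDRANT_URL = "https://YOUR-CLUSTER.cloud.qdrant.io"
QDRANT_API_KEY = "YOUR_QDRANT_API_KEY"
COLLECTION = "agricultural-crops"
CROPS_NUMBER = 29  # distinct crop classes; the workflow derives this dynamically via a facet query


def check_image(image_url: str) -> str:
    # 1) Embed the image with Voyage's multimodal model ("Embed image" node).
    embedding = requests.post(
        "https://api.voyageai.com/v1/multimodalembeddings",
        headers={"Authorization": f"Bearer {VOYAGE_API_KEY}"},
        json={
            "inputs": [{"content": [{"type": "image_url", "image_url": image_url}]}],
            "model": "voyage-multimodal-3",
            "input_type": "document",
        },
    ).json()["data"][0]["embedding"]

    # 2) Fetch the medoid (cluster centre) of every crop class ("Get similarity of medoids" node).
    points = requests.post(
        f"{QDRANT_URL}/collections/{COLLECTION}/points/query",
        headers={"api-key": QDRANT_API_KEY},
        json={
            "query": embedding,
            "using": "voyage",
            "limit": CROPS_NUMBER,
            "with_payload": True,
            "filter": {"must": [{"key": "is_medoid", "match": {"value": True}}]},
        },
    ).json()["result"]["points"]

    # 3) Compare each similarity score against that cluster's stored threshold ("Compare scores" node).
    best_score, best_crop = -1.0, None
    for point in points:
        if point["score"] >= point["payload"]["is_medoid_cluster_threshold"] and point["score"] > best_score:
            best_score, best_crop = point["score"], point["payload"]["crop_name"]

    if best_crop is None:
        return "ALERT, we might have a new undefined crop!"
    return f"Looks similar to {best_crop}"


print(check_image("https://storage.googleapis.com/n8n-qdrant-demo/agricultural-crops%2Fcotton%2Fimage%20(36).jpg"))

To use the text-anchor variant instead, swap "is_medoid" and "is_medoid_cluster_threshold" for "is_text_anchor_medoid" and "is_text_anchor_medoid_cluster_threshold", as listed in the workflow's Sticky Note3.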
Which scenarios is this workflow suited for?
Expert level - artificial intelligence, security operations
Does it cost anything?
This workflow is completely free. Note, however, that third-party services used in the workflow (such as the Voyage AI API and Qdrant Cloud) may charge for usage.