Overview
In this lesson, you will learn how to effectively use the Loop Over Items node in n8n to process multiple items in batches, implement conditional logic, and handle API rate limits. By following a practical example of automating email attachment processing and uploading images to a Notion task board, you will understand how to optimize workflows for performance and reliability.
What is Looping in n8n?
Looping allows an automation to repeat actions multiple times until a specified condition is met. In n8n, this is achieved using the Loop Over Items node, which is essential when you need to:
- Process multiple items dynamically.
- Perform batch processing with a specific batch size.
- Handle API rate limits by pacing requests.
- Execute step-by-step processing of items.
Why Use Looping?
Consider these common scenarios:
- Batch Processing: You have 100 customer records from a Google Sheet and want to process them in batches of 10 or 20 instead of all at once.
- API Rate Limits: Many APIs restrict the number of requests per second (e.g., sending only 100 emails per second). Looping lets you process records in batches with wait periods to avoid hitting these limits.
- Step-by-step Processing: Some actions require sequential execution, such as uploading files one by one.
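The pattern the Loop Over Items node implements — split items into batches, process each batch, pause, then continue — can be sketched in plain JavaScript. This is a minimal illustration, not n8n's actual implementation; `processBatch` and `delayMs` stand in for your own handler and pacing:

```javascript
// Split an array of items into batches of a fixed size.
function chunk(items, batchSize) {
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

// Process batches sequentially, pausing between them to respect rate limits.
async function processInBatches(items, batchSize, delayMs, processBatch) {
  for (const batch of chunk(items, batchSize)) {
    await processBatch(batch); // the "loop" branch work
    await new Promise((resolve) => setTimeout(resolve, delayMs)); // pacing
  }
  // Code after the loop plays the role of the "done" branch.
}
```

For 100 records with a batch size of 10, `chunk` yields ten batches that are processed one at a time with a pause between each.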
Looping often works hand-in-hand with item linking and merging strategies to combine data from different branches. For a deeper understanding of how item linking affects merging in workflows, see n8n Item Linking Explained.
Practical Example: Processing Gmail Attachments and Uploading Images to Notion
In this example, you will build a workflow to:
- Trigger on new emails in Gmail.
- Filter emails that contain attachments.
- Split attachments into individual items.
- Loop over each attachment to:
- Upload it temporarily to an external server.
- Create a page in a Notion database with the image.
- Wait between iterations to avoid API rate limits.
Step-by-Step Workflow Setup
1. Create a New Workflow
- Open your n8n workspace.
- Click Create Workflow to start with a blank canvas.
2. Add Gmail Trigger Node
- Add the `Gmail Trigger` node.
- Configure it to scan your inbox every minute for new emails.
- Select your Gmail account.
- Disable the Simplify option to access attachments.
- Under Options, enable Download Attachments.
- Leave Attachment Prefix as the default (`attachment_`).
3. Test the Gmail Trigger
- Mark a test email with attachments as unread in your Gmail inbox.
- Click Fetch Test Event in the node.
- Verify that the node returns your email with attachment data.
4. Filter Emails with Attachments
- Add an `IF` node to filter emails.
- Set the condition to check whether binary data exists:
{{ $json["binary"] !== undefined }}
This ensures only emails with attachments proceed in the workflow.
5. Split Attachments into Individual Items
- Add a `Split In Batches` or `Split Items` node.
- Use the following expression to split the attachments:
{{ $binary }}
This splits the single email item with multiple attachments into multiple items, each representing one attachment.
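The transformation this step performs can be sketched in plain JavaScript. The `{ json, binary }` item shape mirrors n8n's convention, but this is an illustration, not the node's actual code, and `attachmentKey` is a hypothetical field name added for traceability:

```javascript
// Turn one item carrying several binary attachments into one item per attachment.
// The { json, binary } shape mirrors n8n's item convention.
function splitBinaryIntoItems(item) {
  return Object.entries(item.binary).map(([key, data]) => ({
    json: { ...item.json, attachmentKey: key }, // keep the email fields, record the key
    binary: { attachment: data },               // each new item holds one attachment
  }));
}
```

An email item with `attachment_0` and `attachment_1` thus becomes two items, each with a single `attachment` binary property.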
6. Add Loop Over Items Node
- Add the `Loop Over Items` node.
- Connect the output of the split node to the Loop input of the `Loop Over Items` node.
Note: If your batch size is 1, you might not need this node since n8n processes items individually by default. However, when dealing with rate limits or batch sizes greater than 1, this node is crucial.
The Loop Over Items node has two outputs:
- Loop Branch: For processing each batch or item.
- Done Branch: Executes after all items are processed.
For more details on how to merge data after looping or branching, refer to How Branching Works in n8n.
7. Upload Attachments to a Temporary Location
Since Notion requires a publicly accessible image URL, you need to upload attachments temporarily.
- Add an `HTTP Request` node inside the loop branch.
- Configure it to POST the binary attachment to a temporary file hosting service (e.g., tempfiles.org).
Example configuration:
{
"method": "POST",
"url": "https://tempfiles.org/api/files",
"options": {
"bodyContentType": "multipart/form-data"
},
"binaryPropertyName": "attachment"
}
- Set the binary data field name to `attachment`.
- This node returns a URL for the uploaded file.
Refer to the HTTP Request Node documentation for more configuration options.
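Under the hood, that upload is a multipart POST. Here is a hedged sketch in plain JavaScript (Node.js 18+, which provides `fetch`, `FormData`, and `Blob` globally); the tempfiles.org endpoint and the `file` field name are assumptions carried over from this example, so check the hosting service's API documentation:

```javascript
// Build the fetch options for uploading one binary attachment as multipart form data.
// The tempfiles.org endpoint and "file" field name are assumptions from this tutorial.
function buildUploadRequest(bytes, filename, mimeType) {
  const form = new FormData(); // global in Node.js >= 18
  form.append("file", new Blob([bytes], { type: mimeType }), filename);
  return {
    url: "https://tempfiles.org/api/files",
    options: { method: "POST", body: form }, // fetch sets the multipart boundary itself
  };
}

// Usage (performs a network call, hence commented out):
// const { url, options } = buildUploadRequest(data, "photo.png", "image/png");
// const res = await fetch(url, options);
```

Keeping the request construction in a pure function makes it easy to inspect or test without touching the network.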
8. Create a Notion Database Page with the Image
- Add the `Notion` node after the HTTP Request node.
- Set the action to Create Database Page.
- Choose your Notion account and target database.
- Configure the page title using the Gmail email subject:
{{ $json["subject"] }}
- Add a property such as `Status` with the value `To Do`.
- Add a block of type Image and set its URL to the temporary file URL from the HTTP node:
{{ $node["HTTP Request"].json["data"]["url"] }}
Before merging data from different nodes, you might want to prepare or set specific fields using the Set Node in n8n to ensure consistent data structure.
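For orientation, the request the Notion node issues behind the scenes corresponds roughly to this payload for Notion's `POST /v1/pages` endpoint. This is a sketch: the `Name` and `Status` property names and the `select` property type are assumptions about your database schema, which you should adjust to match your own:

```javascript
// Sketch of a Notion "create page" payload: title from the email subject,
// a Status property, and an external image block pointing at the uploaded file.
// Property names/types are assumptions about your database schema.
function buildNotionPage(databaseId, subject, imageUrl) {
  return {
    parent: { database_id: databaseId },
    properties: {
      Name: { title: [{ text: { content: subject } }] },
      Status: { select: { name: "To Do" } },
    },
    children: [
      {
        object: "block",
        type: "image",
        image: { type: "external", external: { url: imageUrl } }, // must be publicly reachable
      },
    ],
  };
}
```

This is also why the temporary upload in step 7 is needed: the external image URL must be publicly reachable for Notion to render it.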
9. Add a Wait Node to Handle Rate Limits
- Add a `Wait` node after the Notion node.
- Set it to wait for 2 seconds (or another suitable delay).
- This prevents hitting API rate limits by pacing the requests.
10. Connect Loop Back and Done Branch
- Connect the `Wait` node output back to the `Loop Over Items` node input to continue looping.
- Connect the Done branch of the `Loop Over Items` node to a `NoOp` or `End` node to signify workflow completion.
Handling Dynamic Binary Data Keys
When dealing with attachments, the binary data keys are named dynamically (`attachment_0`, `attachment_1`, etc.). Hardcoding these keys causes errors.
To dynamically access the binary data:
- Use an expression to get the binary keys:
{{ Object.keys($binary)[0] }}
- Access the binary data dynamically:
{{ $binary[Object.keys($binary)[0]] }}
This ensures the workflow handles attachments regardless of their position.
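The same dynamic lookup can be written as a plain-JavaScript helper. The `{ json, binary }` shape mirrors n8n's item convention; `firstBinary` is an illustrative helper, not an n8n API:

```javascript
// Resolve the attachment regardless of its key (attachment_0, attachment_1, ...),
// instead of hardcoding a name that may only exist on the first iteration.
function firstBinary(item) {
  const key = Object.keys(item.binary)[0]; // e.g. "attachment_0"
  return { key, data: item.binary[key] };
}
```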
Common Mistakes and Troubleshooting
1. Hardcoding Binary Field Names
- Problem: Using fixed names like `attachment_0` causes failures on subsequent iterations.
- Solution: Use dynamic expressions to access binary keys, as shown above.
2. Not Adding a Wait Node
- Problem: Sending API requests too quickly results in rate limit errors.
- Solution: Insert a `Wait` node (a 1–2 second delay) between loop iterations to pace requests.
3. Uploading Unsupported File Types
- Problem: Uploading PDFs or non-image files to Notion image blocks causes errors.
- Solution: Add a filter node before the loop to process only image attachments.
Example expression to filter image MIME types:
{{ $json["mimeType"] && $json["mimeType"].startsWith("image/") }}
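The same check expressed as a plain-JavaScript filter over items (illustrative; it assumes `mimeType` is exposed on each item's `json`, as in the expression above):

```javascript
// Keep only items whose MIME type marks them as images, mirroring the
// filter expression above; PDFs and other types are dropped before the loop.
function onlyImages(items) {
  return items.filter(
    (item) =>
      typeof item.json.mimeType === "string" &&
      item.json.mimeType.startsWith("image/")
  );
}
```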
4. Misconfiguring Loop Branch Connections
- Problem: Failing to connect the last node in the loop back to the `Loop Over Items` node prevents the loop from continuing.
- Solution: Always connect the last processing node back to the loop node's input.
Additional Tips for Optimizing Workflows
- Use the `Loop Over Items` node to batch-process large datasets efficiently.
- Combine filtering with loops to avoid unnecessary processing.
- Use temporary file hosting for APIs that require public URLs.
- Monitor API limits and adjust wait times accordingly.
- Use the Done branch to perform post-processing once all items are handled.
For advanced merging techniques after looping, consult the n8n Merge node documentation.
Quick Reference Cheat Sheet
| Step | Node Type | Key Configuration |
|---|---|---|
| Trigger on new emails | Gmail Trigger | Download attachments enabled, Simplify off |
| Filter emails with attachments | IF node | Condition: `{{ $json["binary"] !== undefined }}` |
| Split attachments | Split Items | Expression: `{{ $binary }}` |
| Loop over items | Loop Over Items | Connect loop input, configure batch size if needed |
| Upload attachment | HTTP Request | POST to tempfiles.org, binary property: `attachment` |
| Create Notion page | Notion | Use email subject as title, set image URL from HTTP response |
| Wait between requests | Wait | Delay: 2 seconds |
| Loop continuation | Back to Loop Over Items node | Connect last node output to loop input |
| Workflow end | NoOp or End node | Connect to Done branch of loop node |
By following this tutorial and leveraging the Loop Over Items node, you will build robust, scalable n8n workflows that handle large datasets and API constraints gracefully. This approach is essential for professional-grade automation projects.