Introduction
Read File in Segments is a critical Oracle Integration Cloud capability for large file processing in enterprise integrations. In real-world OIC Gen 3 implementations, you rarely work with small payloads: most enterprise systems (ERP, HCM, banking, retail) generate files that can easily exceed hundreds of megabytes, or even gigabytes.
If you try to process such files in one go, you will face memory issues, timeouts, and failed integrations. This is exactly where reading files in segments (chunk processing) becomes essential.
In this article, I’ll walk you through how this works in OIC Gen 3, when to use it, how to configure it, and the common mistakes I’ve seen in real client projects.
What is Read File in Segments in OIC?
Reading a file in segments means processing a large file in smaller chunks instead of loading the entire file into memory at once.
In Oracle Integration Cloud, this is typically implemented using:
- Stage File action
- The Read File in Segments option
- A For-each loop to process each segment
Instead of:
“Load entire file → Process → Send”
You follow:
“Read chunk → Process → Repeat → Complete”
This approach improves:
- Performance
- Scalability
- Stability
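OIC performs the chunking internally through the Stage File action, but the pattern itself is easy to see outside OIC. Here is a minimal Python sketch of "read chunk → process → repeat" (the file name and segment size are illustrative, not OIC APIs):

```python
def read_in_segments(path, segment_size=200):
    """Yield lists of at most segment_size records (lines) from a file,
    so the caller never holds the entire file in memory at once."""
    segment = []
    with open(path) as f:
        for line in f:
            segment.append(line.rstrip("\n"))
            if len(segment) == segment_size:
                yield segment          # hand one chunk to the caller
                segment = []
    if segment:                        # flush the final, partial chunk
        yield segment

# Usage: each iteration sees only one chunk, never the whole file.
# for chunk in read_in_segments("input_file.csv", segment_size=200):
#     process(chunk)
```

The generator is the moral equivalent of the Stage File loop: memory usage is bounded by the segment size, not the file size.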
Real-World Integration Use Cases
1. Payroll File Processing (HCM → Third-party Payroll)
A client generates a 500 MB payroll file daily from Oracle Fusion HCM.
Problem:
- Integration fails due to payload size

Solution:

- Use segmented file reading
- Process employee records batch by batch
2. Bank Statement Reconciliation (ERP)
Bank sends large CSV files with millions of transactions.
Without segmentation:
- Parsing fails
- Memory overflow errors

With segmentation:

- Each chunk is processed independently
- Data is inserted into staging tables
3. E-commerce Order Bulk Processing
Orders from external systems come as large JSON/XML files.
Requirement:
- Process orders in batches
- Avoid duplicate processing on failure

Segment processing helps:

- Retry only the failed chunks
- Maintain transaction integrity
Architecture / Technical Flow
Here is how segmented file processing works in OIC Gen 3:
1. File is received (SFTP / Object Storage / REST upload)
2. Stage File action reads the file in segments
3. A loop iterates through each segment
4. Data is transformed and processed
5. Final aggregation or completion step
Flow Representation

File received → Stage File (Read File in Segments) → For-each loop → Transform and process segment → Aggregate / complete
Prerequisites
Before implementing this, ensure:
1. Connectivity Setup
- SFTP connection OR REST trigger
- Proper file access permissions

2. File Format Definition

- CSV / XML / JSON schema
- A sample file for the structure

3. Integration Pattern

- App Driven Orchestration (recommended)

4. Understanding of:

- Stage File actions
- For-each loops
- Data mapping
Step-by-Step Build Process
Let’s walk through a real implementation scenario.
Step 1 – Create Integration
Navigate:
OIC Console → Integrations → Create
- Select: App Driven Orchestration
- Name: Read_File_In_Segments_Demo
Step 2 – Configure Trigger
Example:
- Use the REST Adapter or SFTP Adapter
- Accept the file as input, or fetch it from a location
Step 3 – Add Stage File Action
Drag the Stage File action onto the canvas.
Select the operation:
👉 Read File in Segments
Step 4 – Configure Stage File Properties
Important fields:
| Field | Value |
|---|---|
| File Location | Staging Directory |
| File Name | input_file.csv |
| File Format | CSV / XML |
| Segment Size | Example: 200 records |
Consultant Tip:
- Always test with different segment sizes (100, 500, 1000)
- Choose based on performance and memory usage
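The same tuning exercise can be sketched outside OIC: process a fixed set of records at several candidate segment sizes and compare the timings. The record set and per-record work below are made up purely for illustration:

```python
import time

def process_segment(segment):
    # Stand-in for the real per-chunk work (mapping, API call, DB insert).
    return sum(len(r) for r in segment)

def time_segment_size(records, segment_size):
    """Time how long it takes to process all records in chunks of segment_size."""
    start = time.perf_counter()
    for i in range(0, len(records), segment_size):
        process_segment(records[i:i + segment_size])
    return time.perf_counter() - start

records = [f"record-{i}" for i in range(10_000)]
for size in (100, 500, 1000):   # candidate segment sizes to compare
    print(size, round(time_segment_size(records, size), 4))
```

In a real OIC tuning pass the "timer" is the activity stream duration of each test run, but the method is the same: fix the input, vary only the segment size, and measure.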
Step 5 – Define Schema
Upload sample file to generate schema.
This ensures:
- Proper parsing
- Accurate mapping
Step 6 – Add For-Each Loop
After Stage File:
- Add a For-Each action
- Loop over the segments
Each iteration processes:
👉 One chunk of data
Step 7 – Process Each Segment
Inside loop:
- Add a Mapper
- Transform the data
- Call the target system (ERP / HCM / DB / REST API)
Example:
- Insert into a staging table
- Call an ERP API
- Send to an external system
Step 8 – Handle Errors (Important)
Inside loop:
- Add a Scope
- Enable fault handling

If one segment fails:

- Log the error
- Continue processing the next segment
Step 9 – Save and Activate
- Validate the integration
- Activate
Testing the Technical Component
Test Scenario
Input file:
- 1000 records
- Segment size: 200

Expected behavior:

- 5 iterations
- Each iteration processes 200 records
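The expected iteration count is just a ceiling division of record count by segment size, which is worth pre-computing when designing test cases:

```python
import math

def expected_iterations(record_count, segment_size):
    """Number of chunks the Stage File loop should execute."""
    return math.ceil(record_count / segment_size)

print(expected_iterations(1000, 200))  # 5: five full segments of 200 records
print(expected_iterations(1001, 200))  # 6: the last segment holds a single record
```

Note the second case: one extra record adds a whole extra iteration, so always test a file whose size is not an exact multiple of the segment size.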
Validation Checks
- All segments processed?
- Any failures logged?
- Data consistency maintained?
Debugging Tip
Use:
- Tracking ID
- Integration Insight
- Activity Stream
Common Errors and Troubleshooting
1. File Not Found
Cause:
- Incorrect directory path

Fix:
- Verify the Stage File location

2. Schema Mismatch

Cause:
- File structure differs from the sample

Fix:
- Regenerate the schema using the latest file

3. Memory Issues Despite Segmentation

Cause:
- Segment size too large

Fix:
- Reduce the chunk size

4. Loop Not Iterating

Cause:
- Incorrect For-each configuration

Fix:
- Map the correct repeating element

5. Partial Processing

Cause:
- Failure inside the loop

Fix:
- Implement a fault-handling scope
Best Practices
1. Always Use Optimal Segment Size
- Start with 200–500 records
- Tune based on performance
2. Implement Idempotency
Ensure that reprocessing doesn't duplicate data.
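One common way to get idempotency is to key each record (or segment) by a business identifier and skip anything already handled. A minimal in-memory sketch, assuming each record carries such a key (in a real integration the "already processed" check would hit a database table or an OIC lookup, not a Python set):

```python
processed_keys = set()   # in practice: a DB table or persistent store, not memory

def process_once(record_id, payload, handler):
    """Skip records whose business key was already handled,
    so reprocessing a failed segment cannot duplicate work."""
    if record_id in processed_keys:
        return "skipped"
    handler(payload)
    processed_keys.add(record_id)    # mark only after successful handling
    return "processed"
```

Marking the key only after the handler succeeds is deliberate: if the handler fails, the record stays eligible for the retry.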
3. Log Segment-Level Details
Track:
- Segment number
- Record count
- Status
4. Use Parallel Processing Carefully
OIC supports parallel execution, but:
- Avoid overloading target systems
- Use it only when necessary
5. Externalize Configuration
Store the file name and segment size in a Lookup or in integration parameters, rather than hard-coding them in the flow.
6. Use Staging Tables
For large data:
- Insert into a database first
- Then process asynchronously
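The staging-table pattern can be sketched with SQLite standing in for the real database (the table name, columns, and status values are illustrative): each segment is landed as a batch of `NEW` rows, and a downstream job processes them asynchronously.

```python
import sqlite3

conn = sqlite3.connect(":memory:")   # stands in for the real staging database
conn.execute("""
    CREATE TABLE stg_orders (
        segment_no INTEGER,
        record_id  TEXT,
        payload    TEXT,
        status     TEXT DEFAULT 'NEW'   -- downstream job picks up NEW rows
    )
""")

def stage_segment(segment_no, records):
    """Land one chunk in the staging table in a single batch insert."""
    conn.executemany(
        "INSERT INTO stg_orders (segment_no, record_id, payload) VALUES (?, ?, ?)",
        [(segment_no, r["id"], r["payload"]) for r in records],
    )
    conn.commit()

stage_segment(1, [{"id": "ord-1", "payload": "{}"},
                  {"id": "ord-2", "payload": "{}"}])
```

Because each segment commits independently, a failure in segment 7 leaves segments 1–6 safely staged, which is exactly the partial-failure behavior you want.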
Real Consultant Insight
In one project, we processed 1.2 million records daily from a logistics system.
Initial approach:

- Single-file processing → failed consistently

After segmentation:

- 500-record chunks
- Parallel processing enabled

Result:

- Processing time reduced from 2 hours to 25 minutes
- Zero failures
Summary
Reading files in segments in Oracle Integration Cloud is not just an optimization—it is a necessity for enterprise-grade integrations.
Key takeaways:
- Avoid processing large files in a single payload
- Use the Stage File "Read File in Segments" operation
- Combine it with For-each loops
- Implement proper error handling
- Optimize the segment size
This approach ensures:
- High performance
- Scalability
- Reliable integrations
FAQs
1. What is the ideal segment size in OIC?
There is no fixed value. Typically:
- 100–500 records for complex transformations
- 500–1000 for simple processing
Always test based on your use case.
2. Can we process segments in parallel?
Yes, OIC supports parallel processing, but:
- Use it carefully
- Ensure target systems can handle the load
3. What happens if one segment fails?
If fault handling is implemented:
- Only that segment fails
- Others continue processing

Without it:

- The entire integration may fail
Additional Reference
For a deeper understanding, refer to the official Oracle Integration documentation.