Import and Load Data using HCM Data Loader (HDL)
When working in Oracle Fusion HCM, one of the most critical implementation tasks is data migration and bulk data processing. Whether you are onboarding employees, updating assignments, or loading organization structures, HCM Data Loader (HDL) is the primary tool used in real-world projects.
In almost every implementation or support engagement, consultants rely on HDL for high-volume, secure, and structured data loads. In this guide, we’ll break down HDL from a practical consultant perspective, including configurations, file structures, real use cases, and troubleshooting strategies.
What is HCM Data Loader (HDL)?
HCM Data Loader (HDL) is a file-based data import tool used in Oracle Fusion HCM to load, update, and delete business objects using structured .dat files.
Unlike spreadsheet uploads or UI-based entry, HDL allows:
- Bulk data processing
- Complex hierarchical data loads
- Automated integrations
- Incremental updates
Key Concept
HDL works using:
- Business Objects (Worker, Assignment, Job, Department, etc.)
- .dat files (pipe-delimited format)
- Metadata-driven structure
Key Features of HCM Data Loader
1. High Volume Data Handling
Supports millions of records efficiently — critical during initial data migration.
2. Object Hierarchy Support
Allows loading parent-child relationships (e.g., Worker → Assignment → Salary).
3. Incremental Updates
You can update specific attributes without reloading full records.
4. Error Handling and Validation
Detailed log files help identify issues at record level.
5. Automation Ready
Used extensively with Oracle Integration Cloud (OIC Gen 3) for integrations.
Real-World Business Use Cases
Use Case 1: Employee Data Migration During Implementation
A company migrating from a legacy HR system to Fusion loads:
- 25,000 employees
- Job structures
- Departments
- Salary data
HDL is used to ensure structured, validated bulk migration.
Use Case 2: Monthly Bulk Updates
A global company updates:
- Allowances
- Benefits
- Cost centers
Instead of manual UI updates, HDL processes thousands of records in minutes.
Use Case 3: Integration with Third-Party Systems
Payroll or recruitment systems send data to Fusion via:
- API → OIC → HDL
Example:
- New hires from recruitment system automatically created using HDL.
Architecture / Technical Flow
Understanding the HDL flow is critical for consultants.
HDL Flow Overview
- Source Data Preparation (CSV/Excel)
- Convert to .dat file format
- Zip the files
- Upload via:
- Data Exchange UI OR
- UCM (Universal Content Management)
- Submit HDL process
- Monitor logs and validate results
Prerequisites
Before working with HDL, ensure:
- Required roles:
- Human Capital Management Integration Specialist
- Application Implementation Consultant
- Access to:
- Data Exchange work area
- UCM (Content Server)
- Knowledge of:
- Business Objects
- HDL File Structure
- Load Types (MERGE, DELETE, etc.)
HDL File Structure Explained
HDL uses a pipe (|) delimited format.
Example: Worker.dat (simplified)
METADATA|Worker|SourceSystemOwner|SourceSystemId|PersonNumber|StartDate|FirstName|LastName
MERGE|Worker|LEGACY|EMP001|1001|2024/01/01|John|Doe
Important Points
- First line = METADATA, defining the attribute order for the data lines that follow
- Each data line starts with the operation (MERGE/DELETE)
- Attribute names must match the Oracle business-object definition exactly
- Default date format = YYYY/MM/DD
- This example is simplified; take the full attribute list from the Oracle-delivered templates
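The single-record example above extends naturally to hierarchical loads: one file can carry a parent record followed by its child components, each introduced by its own METADATA line. A minimal Python sketch that writes such a file (the attribute lists here are simplified placeholders, not the full Oracle-delivered template):

```python
# Write a simplified hierarchical Worker.dat: the parent Worker record
# followed by a PersonName child component. Attribute lists are
# illustrative; real files use the Oracle-delivered template attributes.
lines = [
    "METADATA|Worker|SourceSystemOwner|SourceSystemId|PersonNumber|StartDate",
    "MERGE|Worker|LEGACY|EMP001|1001|2024/01/01",
    "METADATA|PersonName|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)|FirstName|LastName",
    "MERGE|PersonName|LEGACY|EMP001_NAME|EMP001|John|Doe",
]

with open("Worker.dat", "w", newline="") as f:
    f.write("\n".join(lines) + "\n")
```

Note how the child PersonName line points back to its parent through the SourceSystemId reference, which is how HDL resolves the hierarchy without needing internal Oracle keys.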
Step-by-Step Process to Load Data using HDL
Step 1 – Prepare Data File
- Create Excel sheet
- Map fields according to HDL object
- Convert to .dat format
Tip: Always refer to HDL templates from Oracle documentation.
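The CSV-to-.dat conversion step can be scripted. A minimal sketch, assuming a source CSV whose column names already match the target attributes (the attribute list below is an illustrative placeholder; the real order comes from the Oracle-delivered template):

```python
import csv

# Illustrative attribute order -- replace with the attribute list from
# the Oracle-delivered HDL template for the object you are loading.
ATTRIBUTES = ["SourceSystemOwner", "SourceSystemId", "PersonNumber",
              "StartDate", "FirstName", "LastName"]

def csv_to_dat(csv_path, dat_path, object_name="Worker"):
    """Convert a simple source CSV into a pipe-delimited HDL .dat file."""
    with open(csv_path, newline="") as src, \
         open(dat_path, "w", newline="") as dst:
        # METADATA line declares the attribute order for the data lines.
        dst.write("METADATA|" + object_name + "|" + "|".join(ATTRIBUTES) + "\n")
        for row in csv.DictReader(src):
            values = [row.get(attr, "") for attr in ATTRIBUTES]
            dst.write("MERGE|" + object_name + "|" + "|".join(values) + "\n")
```

In a real project the mapping is rarely one-to-one; this is where your data mapping document drives column renames and transformations before the write.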
Step 2 – Zip the Files
- Place .dat files in a folder
- Zip the folder
Example: a Worker.zip archive containing Worker.dat
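This packaging step can also be scripted; a sketch using the standard library (file names are illustrative):

```python
import os
import zipfile

def package_dat_files(zip_path, dat_paths):
    """Package one or more .dat files into the zip archive HDL expects.

    Writing each file with its base name keeps the .dat files at the
    root of the archive, the safe default for Data Exchange uploads.
    """
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in dat_paths:
            zf.write(path, arcname=os.path.basename(path))
```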
Step 3 – Upload the File
Navigation Path:
Navigator → Tools → Data Exchange → Import and Load Data
- Click Import File
- Upload the .zip file
Step 4 – Submit Load Process
- Select uploaded file
- Click Load
- Choose parameters:
- Business Object
- Load Type
Step 5 – Monitor Process
Go to:
Navigator → Tools → Scheduled Processes
Check:
- Status (Succeeded / Error)
- Log files
- Error messages
Testing the HDL Load
Example Scenario: Load New Employee
Input Data
- Person Number: 2001
- Name: Ravi Kumar
- Hire Date: 2025/01/01
Expected Outcome
- Worker created successfully
- Assignment generated
- Employee visible in Person Management
Validation Checks
- Search employee in UI
- Verify assignment details
- Check logs for warnings
Common Implementation Challenges
1. Data Format Errors
- Incorrect date format
- Missing mandatory fields
Fix: Validate data before upload
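A lightweight pre-upload check catches many of these problems before the load is even submitted. A sketch (the mandatory-field and date-field lists are illustrative assumptions; derive the real ones from the object's template):

```python
import re

# HDL's default date format is YYYY/MM/DD.
DATE_RE = re.compile(r"^\d{4}/\d{2}/\d{2}$")

def validate_line(attributes, values, mandatory, date_fields):
    """Return a list of problems found in one data line of a .dat file."""
    errors = []
    record = dict(zip(attributes, values))
    for field in mandatory:
        if not record.get(field):
            errors.append("missing mandatory field: " + field)
    for field in date_fields:
        value = record.get(field, "")
        if value and not DATE_RE.match(value):
            errors.append("bad date format in " + field + ": " + value)
    return errors
```

Running checks like this over the whole file in the lower environment is far cheaper than chasing record-level errors in the HDL logs afterwards.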
2. Incorrect Business Object Usage
Example:
Using Worker object for assignment updates
Fix: Use correct object hierarchy
3. Sequence Issues
Child records loaded before their parent records
Fix: Load parent objects first, or order the records correctly within one file
4. Duplicate SourceSystemId
HDL uses SourceSystemId (together with SourceSystemOwner) as the unique identifier for externally sourced records
Fix: Ensure uniqueness across all source files
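Duplicates are easy to detect before submission. A minimal sketch, assuming SourceSystemId is the fourth pipe-delimited field of each MERGE line (the position is illustrative; it depends on your file's METADATA line):

```python
from collections import Counter

def find_duplicate_source_system_ids(dat_lines, position=3):
    """Return SourceSystemId values that occur more than once.

    `position` is the zero-based pipe-delimited field index of
    SourceSystemId, per the file's METADATA line.
    """
    ids = [line.split("|")[position]
           for line in dat_lines if line.startswith("MERGE|")]
    return [value for value, count in Counter(ids).items() if count > 1]
```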
5. Large File Failures
Uploading huge files may fail
Fix: Split into smaller batches
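When splitting, each batch file must repeat the METADATA header lines so it stays self-describing. A sketch of the batching logic (the batch size is an illustrative assumption):

```python
def split_dat(lines, batch_size=10000):
    """Split a .dat file's lines into batches of at most `batch_size`
    data lines, repeating the METADATA header lines in every batch."""
    header = [line for line in lines if line.startswith("METADATA|")]
    data = [line for line in lines if not line.startswith("METADATA|")]
    return [header + data[i:i + batch_size]
            for i in range(0, len(data), batch_size)]
```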
Best Practices from Real Projects
1. Always Use SourceSystemId
This allows:
- Easy updates
- Data traceability
2. Use Incremental Loads
Avoid full reloads; use MERGE to update only the attributes that changed
3. Validate in Lower Environment First
Never directly load into production.
4. Maintain Data Mapping Document
Track:
- Source fields
- Target fields
- Transformations
5. Use HDL with OIC for Automation
In real implementations:
- OIC receives data
- Converts to HDL format
- Uploads automatically
6. Log Analysis is Critical
Always review:
- Error files
- Business object logs
Summary
HCM Data Loader (HDL) is one of the most powerful tools in Oracle Fusion HCM for data migration, bulk updates, and integrations.
From initial implementation to ongoing support, HDL plays a central role in:
- Loading employee data
- Managing organizational structures
- Automating integrations
For consultants, mastering HDL means:
- Faster implementations
- Fewer manual errors
- Better control over data
To explore more, refer to the official Oracle documentation:
https://docs.oracle.com/en/cloud/saas/human-resources/index.html
FAQs
1. What is the difference between HDL and HSDL?
- HDL = file-based bulk data loader
- HSDL = HCM Spreadsheet Data Loader, a spreadsheet-based UI tool
HDL is preferred for large-scale implementations.
2. Can HDL be automated?
Yes. HDL is commonly integrated with:
- Oracle Integration Cloud (OIC Gen 3)
- REST APIs
3. What is MERGE in HDL?
MERGE is used to:
- Insert new records
- Update existing records
It is the most commonly used operation in HDL.