Copy Job is a lightweight, guided experience in Data Factory that lets you move data from source to destination, without needing to build full pipelines. It supports both batch and incremental copy patterns and is ideal for quick ingestion tasks, onboarding scenarios, or operational data syncs.
Key Advantages:
Simplicity: No need to build full pipelines. Just select source, destination, and go.
Incremental Copy: Supports watermark-based and CDC-based incremental loads.
Flexible Write Modes: Choose between append, overwrite, or merge (an upsert keyed on the primary key).
High Performance: Serverless, parallelized architecture for PB-scale data movement.
Connector Rich: More than 100 built-in connectors, including SQL, Lakehouse, Blob storage, Snowflake, and Salesforce.
Preview Reset: You can reset the incremental copy state per job or per table, which is useful for troubleshooting.
Monitoring: Real-time job status and logs, integrated with Fabric workspace artifacts.
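To make the incremental-copy and merge ideas above concrete, here is a minimal hand-rolled sketch of the watermark + merge pattern that Copy Job automates for you. Everything in it is an illustrative assumption, not Fabric code: the `incremental_copy` function, the `id` and `modified_at` fields, and the dict-backed destination are all hypothetical, since Copy Job itself is configured in the UI rather than written by hand.

```python
# Generic sketch of watermark-based incremental copy with a merge
# (update-or-insert by primary key) write mode. All names are hypothetical.

def incremental_copy(source_rows, destination, watermark):
    """Copy only rows changed since the last watermark; merge on 'id'."""
    new_rows = [r for r in source_rows if r["modified_at"] > watermark]
    for row in new_rows:
        # Merge write mode: the primary key decides update vs. insert.
        destination[row["id"]] = row
    # Advance the watermark so the next run skips rows already copied.
    if new_rows:
        watermark = max(r["modified_at"] for r in new_rows)
    return watermark

source = [
    {"id": 1, "modified_at": 5, "val": "a"},
    {"id": 2, "modified_at": 9, "val": "b"},
]
dest = {}
wm = incremental_copy(source, dest, watermark=0)   # copies both rows
wm = incremental_copy(source, dest, watermark=wm)  # no-op: nothing newer
```

The key design point is that the watermark only moves forward after a successful run, so re-running a job never re-copies rows it has already landed; resetting the incremental state (as in the Preview Reset feature above) amounts to setting the watermark back to its starting value.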
Check out the links to learn more.