
Azure Data Factory Interview Questions and Answers 2025

8 questions found

Updated 23 Nov 2024

Q1. How many types of triggers are there in ADF?

Ans.

Azure Data Factory supports three main trigger types: schedule triggers, tumbling window triggers, and event-based triggers.

  • Schedule triggers run pipelines on a wall-clock schedule, at specified intervals or specific times.

  • Tumbling window triggers fire over a series of fixed-size, non-overlapping time windows and support dependencies and retries.

  • Event-based triggers fire in response to events, such as the creation or deletion of a blob in Azure Storage (storage event triggers) or custom events published to Azure Event Grid (custom event triggers).

  • Triggers are used to automate the execution of pipelines in Azure Data Factory.
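As an illustration, a schedule trigger definition in ADF's JSON authoring format looks roughly like this (the trigger name, pipeline name, and times are placeholders):

```json
{
  "name": "DailyTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2025-01-01T06:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "CopySalesPipeline",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```

A storage event trigger would instead use "type": "BlobEventsTrigger", replacing the recurrence with properties such as blobPathBeginsWith and the list of events to react to (e.g. Microsoft.Storage.BlobCreated).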


Q2. How do we do a delta load using ADF?

Ans.

A delta (incremental) load in ADF is achieved by tracking a watermark and loading only the data that has changed since the last run.

  • Use a Lookup activity to retrieve the last watermark (e.g. a maximum timestamp or ID) stored alongside the target table.

  • Use a Copy activity whose source query filters the source table to rows with a timestamp or ID greater than that watermark.

  • Load only those changed records into the target table (optionally upserting via a staging table).

  • Update the stored watermark, typically with a Stored Procedure activity, so the next run continues from where this one finished.
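The Lookup-plus-Copy watermark pattern can be sketched in pipeline JSON; the table, column, and dataset names below (dbo.Sales, LastModified, dbo.WatermarkTable) are hypothetical:

```json
{
  "name": "DeltaLoadPipeline",
  "properties": {
    "activities": [
      {
        "name": "LookupOldWatermark",
        "type": "Lookup",
        "typeProperties": {
          "source": {
            "type": "AzureSqlSource",
            "sqlReaderQuery": "SELECT WatermarkValue FROM dbo.WatermarkTable WHERE TableName = 'Sales'"
          },
          "dataset": {
            "referenceName": "TargetSqlDataset",
            "type": "DatasetReference"
          }
        }
      },
      {
        "name": "CopyDelta",
        "type": "Copy",
        "dependsOn": [
          { "activity": "LookupOldWatermark", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "source": {
            "type": "SqlServerSource",
            "sqlReaderQuery": "SELECT * FROM dbo.Sales WHERE LastModified > '@{activity('LookupOldWatermark').output.firstRow.WatermarkValue}'"
          },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

A Stored Procedure activity would typically follow the copy to write the new maximum LastModified value back into dbo.WatermarkTable.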


Q3. What is the Get Metadata activity, and what parameters do we have to pass?

Ans.

The Get Metadata activity retrieves metadata about a dataset (a file, folder, or table) in Azure Data Factory.

  • The main parameters are the dataset reference and the field list, which names the metadata fields to return.

  • Common field list values include childItems, itemName, itemType, size, lastModified, exists, and structure.

  • The activity's output (e.g. a folder's list of child items) can be consumed by downstream activities such as ForEach or If Condition.

  • Example: get the childItems of a blob storage folder, then iterate over the returned files with a ForEach activity.
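For example, a Get Metadata activity reading folder metadata might look like this in pipeline JSON (the activity and dataset names are placeholders):

```json
{
  "name": "GetFolderMetadata",
  "type": "GetMetadata",
  "typeProperties": {
    "dataset": {
      "referenceName": "InputFolderDataset",
      "type": "DatasetReference"
    },
    "fieldList": [ "childItems", "lastModified", "size", "exists" ]
  }
}
```

Downstream activities then reference the result with an expression such as @activity('GetFolderMetadata').output.childItems.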


Q4. how to do performance tuning in adf

Ans.

Performance tuning in Azure Data Factory involves optimizing data flows and activities to improve efficiency and reduce processing time.

  • Identify bottlenecks using the monitoring view and the Copy activity's per-run performance details

  • Increase Data Integration Units (DIUs) and the degree of parallel copies on the Copy activity

  • Optimize data partitioning and distribution, e.g. partitioned reads from the source

  • Use appropriate data integration patterns, such as a staged copy through Blob storage when loading Synapse or SQL

  • Monitor and analyze performance metrics to verify that each change actually helps
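Several of these knobs live directly on the Copy activity. A sketch with illustrative values (the activity, dataset, and linked service names are placeholders, and the right numbers depend on the workload):

```json
{
  "name": "TunedCopy",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "AzureSqlSource" },
    "sink": { "type": "ParquetSink" },
    "dataIntegrationUnits": 16,
    "parallelCopies": 8,
    "enableStaging": true,
    "stagingSettings": {
      "linkedServiceName": {
        "referenceName": "StagingBlobStorage",
        "type": "LinkedServiceReference"
      }
    }
  }
}
```

Staging is mainly useful when the sink (e.g. Azure Synapse) can bulk-load from Blob storage faster than it can ingest a direct stream.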


Q5. How many types of integration runtime are there?

Ans.

There are three types of integration runtime: Azure, Self-hosted, and Azure-SSIS.

  • The Azure integration runtime is fully managed by Azure Data Factory and runs in the Azure cloud

  • The self-hosted integration runtime is installed on a local machine or a virtual machine inside an on-premises or private network, to reach data stores that are not publicly accessible

  • The Azure-SSIS integration runtime runs SQL Server Integration Services (SSIS) packages in Azure Data Factory


Q6. How do you copy recent files in ADF?

Ans.

Use the Copy activity's last-modified filter, or Get Metadata plus filtering, to copy only recent files.

  • The simplest approach: in the Copy activity's source settings, set the "filter by last modified" start time (modifiedDatetimeStart) so only files modified after that time are copied.

  • Alternatively, use a Get Metadata activity with childItems to list the files, then a ForEach containing a per-file Get Metadata (lastModified) and an If Condition to copy only the recent ones.

  • Configure the Copy Data activity with the source and destination datasets and any column mappings.
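A sketch of the last-modified filter on a Copy activity source, assuming Blob storage and a 24-hour window (both illustrative):

```json
{
  "name": "CopyRecentFiles",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "BinarySource",
      "storeSettings": {
        "type": "AzureBlobStorageReadSettings",
        "recursive": true,
        "modifiedDatetimeStart": {
          "value": "@addhours(utcnow(), -24)",
          "type": "Expression"
        }
      }
    },
    "sink": { "type": "BinarySink" }
  }
}
```

This avoids a separate Get Metadata step entirely when "files modified in the last N hours" is the only filter needed.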


Q7. What is an integration runtime?

Ans.

An integration runtime is the compute infrastructure Azure Data Factory uses to provide data integration capabilities across different network environments.

  • Integration runtimes can be self-hosted or Azure-hosted.

  • They are used to move data between cloud and on-premises data stores.

  • Integration runtimes provide connectivity to various data sources and destinations.

  • Examples include Azure Integration Runtime and Self-hosted Integration Runtime.


Q8. What are the 7 layers in an Azure Data Factory pipeline that accepts data from on-premises sources and pushes the processed data to the Azure cloud?

Ans.

The 7 layers in Azure Data Factory for pipelining data from on-premises to the Azure cloud:

  • 1. Ingestion Layer: Collects data from various sources such as on-premises databases, cloud storage, or IoT devices.

  • 2. Storage Layer: Stores the ingested data in a data lake or data warehouse for processing.

  • 3. Batch Layer: Processes data in batches using technologies like Azure Databricks or HDInsight.

  • 4. Stream Layer: Processes real-time data streams using technologies like Azure Stream Analytics...



Made with ❤️ in India. Trademarks belong to their respective owners. All rights reserved © 2024 Info Edge (India) Ltd.
