Accenture Azure Data Engineer Interview Questions, Process, and Tips

Updated 8 Nov 2024

Accenture Azure Data Engineer Interview Experiences

8 interviews found

Interview experience: 2 (Poor)
Difficulty level: Moderate
Process Duration: 2-4 weeks
Result: Selected

I applied via Naukri.com and was interviewed in Aug 2023. There were 2 interview rounds.

Round 1 - Technical (4 Questions)

  • Q1. What is the difference between a scheduled trigger and a tumbling window trigger?
  • Ans. 

    Both are time-based triggers in ADF, but a tumbling window trigger fires for fixed, non-overlapping windows and keeps per-window state, while a scheduled trigger simply fires on a wall-clock schedule.

    • A scheduled trigger runs pipelines on a specified schedule, such as every hour or every day, and can start multiple pipelines.

    • A tumbling window trigger fires at fixed-size, contiguous, non-overlapping intervals and passes the window start and end times to a single pipeline.

    • Tumbling window triggers support backfilling past windows, retries, concurrency limits, and dependencies on other tumbling window triggers.

    • Scheduled triggers suit routine jobs such as daily ETL; tumbling window triggers suit processing data window by window, such as hourly incremental loads (see the sketch after this round's questions).

  • Answered by AI
  • Q2. What are the control flow activities in ADF?
  • Ans. 

    Control flow activities in Azure Data Factory (ADF) define the workflow and execution order of activities in a pipeline.

    • They let you define dependencies between activities and specify conditions for their execution.

    • Commonly used control flow activities include If Condition, ForEach, Until, Switch, Wait, and Execute Pipeline.

  • Answered by AI
  • Q3. What are linked services in ADF?
  • Ans. 

    Linked services in ADF are connections to external data sources or destinations that allow data movement and transformation.

    • Linked services are used to connect to data stores such as databases, file systems, and cloud services.

    • They provide the connection information and credentials needed to establish the connection.

    • Linked services are referenced by datasets and activities, enabling operations like copying data from one store to another or transforming it.

  • Answered by AI
  • Q4. What are the types of IR?
  • Ans. 

    IR stands for Integration Runtime. There are three types of IR: Azure, Self-hosted, and Azure-SSIS.

    • Azure IR is the fully managed, serverless compute used for data movement and data flow execution between cloud data stores.

    • Self-hosted IR is used to connect to on-premises or private-network data sources and requires installation on a local machine or VM.

    • Azure-SSIS IR is a managed cluster used to run SSIS packages in Azure Data Factory.

    • All types of IR provide the compute for data movement, transformation, and activity dispatch in Azure Data Factory.

  • Answered by AI
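
For the trigger question above (Q1), here is a minimal plain-Python sketch that only illustrates the windowing semantics of a tumbling window trigger; it is not the ADF API, and the start date and interval are arbitrary examples. Each pipeline run receives one fixed, non-overlapping [window_start, window_end) interval, whereas a scheduled trigger only fires at a wall-clock time and carries no window state.

    from datetime import datetime, timedelta

    # Illustration only: how tumbling windows line up on the timeline.
    start = datetime(2024, 1, 1)      # trigger start time (example)
    interval = timedelta(hours=1)     # window size (example)

    for i in range(4):
        window_start = start + i * interval
        window_end = window_start + interval
        # ADF exposes these to the pipeline as
        # @trigger().outputs.windowStartTime / windowEndTime.
        print(f"run {i}: {window_start:%Y-%m-%d %H:%M} -> {window_end:%Y-%m-%d %H:%M}")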
Round 2 - Technical (1 Question)

  • Q1. Which IR should we use if we want to copy data from an on-premises database to Azure?
  • Ans. 

    We should use the Self-hosted Integration Runtime (IR) to copy data from an on-premises database to Azure.

    • Self-hosted IR allows data movement between on-premises networks and Azure.

    • It is installed on a local machine or virtual machine inside the on-premises network.

    • Self-hosted IR securely connects to the on-premises data source and transfers data to Azure.

    • It supports sources such as SQL Server, Oracle, MySQL, and more.

    • Self-hosted IR can be installed on multiple nodes for high availability and scale.

  • Answered by AI

Skills evaluated in this interview

Interview experience: 5 (Excellent)
Difficulty level: -
Process Duration: -
Result: -

Round 1 - Technical (2 Questions)

  • Q1. Activities used in ADF
  • Ans. 

    Activities in Azure Data Factory (ADF) are the building blocks of a pipeline and perform various tasks like data movement, data transformation, and data orchestration.

    • Activities can be used to copy data from one location to another (Copy Activity)

    • Activities can be used to transform data using mapping data flows (Data Flow Activity)

    • Activities can be used to run custom code or scripts (Custom Activity)

    • Activities can be used to orchestrate and control the flow of a pipeline (If Condition, ForEach, Execute Pipeline)

  • Answered by AI
  • Q2. DataFrames in PySpark
  • Ans. 

    DataFrames in PySpark are distributed collections of data organized into named columns.

    • Dataframes are similar to tables in a relational database, with rows and columns.

    • They can be created from various data sources like CSV, JSON, Parquet, etc.

    • Dataframes support SQL queries and transformations using PySpark functions.

    • Example: df = spark.read.csv('file.csv')

  • Answered by AI
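
Building on the answer above, a short, self-contained PySpark sketch; the file path and the column names (category, amount) are placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dataframe-demo").getOrCreate()

    # Read a CSV into a DataFrame (placeholder path, as in the example above).
    df = spark.read.csv("file.csv", header=True, inferSchema=True)

    # Column-based transformations with the DataFrame API.
    df.filter(F.col("amount") > 100) \
      .groupBy("category") \
      .agg(F.sum("amount").alias("total_amount")) \
      .show()

    # The same DataFrame can also be queried with SQL.
    df.createOrReplaceTempView("sales")
    spark.sql("SELECT category, SUM(amount) AS total_amount FROM sales GROUP BY category").show()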
Round 2 - HR (2 Questions)

  • Q1. Managerial Questions
  • Q2. About project roles and responsibilities

Skills evaluated in this interview

Interview experience: 4 (Good)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: Selected

I was interviewed in Oct 2024.

Round 1 - Coding Test 

Most of the questions were resume-based and scenario-based, along with basic SQL and PySpark questions.

Interview Preparation Tips

Topics to prepare for Accenture Azure Data Engineer interview:
  • SQL
  • PySpark
  • ADF
Interview experience: 4 (Good)
Difficulty level: Moderate
Process Duration: 2-4 weeks
Result: Selected

I applied via Naukri.com and was interviewed in Mar 2024. There was 1 interview round.

Round 1 - Technical (5 Questions)

  • Q1. Tell me about yourself
  • Ans. Talk about your background and current project work.
  • Answered Anonymously
  • Q2. What is Azure Data Factory?
  • Ans. 

    Azure Data Factory is a cloud-based data integration service that allows you to create, schedule, and manage data pipelines.

    • Azure Data Factory is used to move and transform data from various sources to destinations.

    • It supports data integration and orchestration of workflows.

    • You can monitor and manage data pipelines using Azure Data Factory.

    • It provides a visual interface for designing and monitoring data pipelines.

    • Azure...

  • Answered by AI
  • Q3. What is Azure Data Lake?
  • Ans. 

    Azure Data Lake is a scalable data storage and analytics service provided by Microsoft Azure.

    • Azure Data Lake Storage is a secure data repository that lets you store and analyze petabytes of data.

    • Azure Data Lake Analytics is an on-demand, distributed analytics service that runs jobs over data in the lake; the same data can also be processed with engines such as Apache Hadoop and Apache Spark.

    • It is designed for big data processing and analytics workloads, providing high performance and scalability.

  • Answered by AI
  • Q4. Write a query to get the 2nd highest salary from an employee table (see the sketch after this question list)
  • Q5. What is an index in a table?
  • Ans. 

    An index in a table is a data structure that improves the speed of data retrieval operations on a database table.

    • Indexes are used to quickly locate data without having to search every row in a table.

    • They can be created on one or more columns in a table.

    • Examples of indexes include primary keys, unique constraints, and non-unique indexes.

  • Answered by AI
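
For the 2nd-highest-salary question above, two equivalent PySpark approaches; the employee data is made up for the example.

    from pyspark.sql import SparkSession, Window
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("second-highest-salary").getOrCreate()

    emp = spark.createDataFrame(
        [("A", 50000), ("B", 70000), ("C", 70000), ("D", 60000)],
        ["name", "salary"],
    )
    emp.createOrReplaceTempView("employee")

    # 1) Plain SQL: the maximum salary below the overall maximum.
    spark.sql("""
        SELECT MAX(salary) AS second_highest
        FROM employee
        WHERE salary < (SELECT MAX(salary) FROM employee)
    """).show()

    # 2) Window function: dense_rank over salary descending, keep rank 2.
    w = Window.orderBy(F.col("salary").desc())
    emp.withColumn("rnk", F.dense_rank().over(w)).filter("rnk = 2").select("salary").distinct().show()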

Interview Preparation Tips

Topics to prepare for Accenture Azure Data Engineer interview:
  • ADF
Interview preparation tips for other job seekers - Just complete the basics.

Skills evaluated in this interview

Interview experience: 4 (Good)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: -

I was interviewed in May 2024.

Round 1 - Technical (2 Questions)

  • Q1. What is Copy activity?
  • Ans. 

    Copy activity is the Azure Data Factory activity used to move data between data stores.

    • It copies data between a wide range of supported sources and sinks, such as Azure Blob Storage, Azure SQL Database, and more.

    • The source and sink are described by datasets and linked services, and the copy runs on an integration runtime.

    • You define the Copy activity inside a pipeline and monitor the progress of copy runs in Azure Data Factory.

  • Answered by AI
  • Q2. Databricks architecture
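
For the Copy activity question above, a rough sketch using the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, and dataset names are placeholders, and exact model names can vary by SDK version, so treat this as an outline rather than a definitive implementation.

    # pip install azure-identity azure-mgmt-datafactory
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        PipelineResource, CopyActivity, DatasetReference, BlobSource, BlobSink,
    )

    adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # A Copy activity that reads from one blob dataset and writes to another.
    copy = CopyActivity(
        name="CopyBlobToBlob",
        inputs=[DatasetReference(reference_name="InputBlobDataset")],
        outputs=[DatasetReference(reference_name="OutputBlobDataset")],
        source=BlobSource(),
        sink=BlobSink(),
    )

    pipeline = PipelineResource(activities=[copy])
    adf_client.pipelines.create_or_update(
        "<resource-group>", "<factory-name>", "CopyPipeline", pipeline
    )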

Skills evaluated in this interview

Interview experience: 4 (Good)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: Selected

I applied via Referral and was interviewed in Sep 2023. There were 4 interview rounds.

Round 1 - Resume Shortlist 
Pro Tip by AmbitionBox: Keep your resume crisp and to the point. A recruiter looks at your resume for an average of 6 seconds, so make sure to leave the best impression.
Round 2 - Technical (2 Questions)

  • Q1. Two queries related to window functions (see the sketch after this round's questions)
  • Q2. Some theory questions on ADF
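
As referenced in Q1 above, a small PySpark window-function sketch; the sales data and column names are made up for the example.

    from pyspark.sql import SparkSession, Window
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("window-demo").getOrCreate()

    sales = spark.createDataFrame(
        [("IT", "2024-01-01", 100), ("IT", "2024-01-02", 200), ("HR", "2024-01-01", 50)],
        ["dept", "sale_date", "amount"],
    )

    # Running total per department, ordered by date.
    w = Window.partitionBy("dept").orderBy("sale_date") \
              .rowsBetween(Window.unboundedPreceding, Window.currentRow)
    sales.withColumn("running_total", F.sum("amount").over(w)).show()

    # Latest sale per department via row_number.
    w2 = Window.partitionBy("dept").orderBy(F.col("sale_date").desc())
    sales.withColumn("rn", F.row_number().over(w2)).filter("rn = 1").show()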
Round 3 - Technical (2 Questions)

  • Q1. Questions on data modelling, security, Synapse table types, and scenario-based ADF pipeline questions
  • Q2. How do you mask data in Azure?
  • Ans. 

    Data masking in Azure helps protect sensitive information by replacing original data with fictitious data.

    • Use Dynamic Data Masking in Azure SQL Database to obfuscate sensitive data in real-time

    • Leverage Azure Purview to discover, classify, and mask sensitive data across various data sources

    • Implement Azure Data Factory to transform and mask data during ETL processes

    • Utilize Azure Information Protection to apply encryption

  • Answered by AI
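
For the masking question above, a minimal sketch of applying Dynamic Data Masking to an Azure SQL Database table from Python; the connection string, table, and column names are placeholders.

    # pip install pyodbc
    import pyodbc

    # Placeholder connection string for an Azure SQL Database.
    conn = pyodbc.connect(
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=tcp:<server>.database.windows.net,1433;Database=<db>;"
        "Uid=<user>;Pwd=<password>;Encrypt=yes;"
    )
    cur = conn.cursor()

    # Dynamic Data Masking: non-privileged users see masked values,
    # while the stored data itself is unchanged.
    cur.execute(
        "ALTER TABLE dbo.Customers ALTER COLUMN Email "
        "ADD MASKED WITH (FUNCTION = 'email()')"
    )
    cur.execute(
        "ALTER TABLE dbo.Customers ALTER COLUMN CreditLimit "
        "ADD MASKED WITH (FUNCTION = 'default()')"
    )
    conn.commit()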
Round 4 - HR (1 Question)

  • Q1. Salary discussion

Skills evaluated in this interview

Interview experience: 3 (Average)
Difficulty level: -
Process Duration: -
Result: No response

Round 1 - Technical (2 Questions)

  • Q1. What is Azure IR?
  • Ans. 

    Azure IR stands for Azure Integration Runtime, the compute infrastructure that Azure Data Factory uses to run data integration activities.

    • Azure IR provides data integration capabilities across different network environments.

    • It allows data movement between cloud and on-premises data sources.

    • Azure IR can be configured to run data integration activities in Azure Data Factory pipelines.

    • It supports different kinds of work such as data movement, data flow execution, activity dispatch, and SSIS package execution.

  • Answered by AI
  • Q2. Integration runtime
Round 2 - Technical (1 Question)

  • Q1. What is SCD Type 1?
  • Ans. 

    SCD Type 1 is a method of updating data in a data warehouse by overwriting existing data with new information.

    • Overwrites existing data with new information

    • No historical data is kept

    • Simplest and fastest method of updating data

  • Answered by AI
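
For the SCD Type 1 answer above, a minimal PySpark/Delta Lake sketch that overwrites matched rows and keeps no history. The table path, key, and columns are hypothetical, and it assumes Delta Lake is available (for example on Databricks).

    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("scd1-demo").getOrCreate()

    # Incoming changes for the customer dimension (placeholder data).
    updates = spark.createDataFrame(
        [(1, "Alice", "Pune"), (4, "Dan", "Mumbai")],
        ["customer_id", "name", "city"],
    )

    # Hypothetical existing Delta table for the dimension.
    target = DeltaTable.forPath(spark, "/mnt/warehouse/dim_customer")

    # SCD Type 1: update matched rows in place, insert new ones, keep no history.
    (target.alias("t")
        .merge(updates.alias("s"), "t.customer_id = s.customer_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())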

Skills evaluated in this interview

Interview experience: 4 (Good)
Difficulty level: Moderate
Process Duration: 2-4 weeks
Result: Selected

I applied via Referral and was interviewed in Nov 2023. There was 1 interview round.

Round 1 - Technical (1 Question)

  • Q1. Given 2 datasets, explain the number of records returned by each type of join (see the sketch below).
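
A quick PySpark illustration of how the row count differs by join type; the two small datasets are made up for the example.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("join-demo").getOrCreate()

    emp = spark.createDataFrame(
        [(1, "A", 10), (2, "B", 20), (3, "C", 99)], ["emp_id", "name", "dept_id"]
    )
    dept = spark.createDataFrame(
        [(10, "HR"), (20, "IT"), (30, "Finance")], ["dept_id", "dept_name"]
    )

    # inner -> 2 rows (dept_id 10 and 20 match)
    # left  -> 3 rows (all employees; dept_id 99 gets nulls)
    # right -> 3 rows (all departments; dept_id 30 gets nulls)
    # full  -> 4 rows (2 matches + 1 unmatched employee + 1 unmatched department)
    for how in ["inner", "left", "right", "full"]:
        print(how, emp.join(dept, "dept_id", how).count())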

Interview questions from similar companies

Interview experience: 5 (Excellent)
Difficulty level: Easy
Process Duration: 2-4 weeks
Result: Selected

I was interviewed in Jan 2025.

Round 1 - Technical (2 Questions)

  • Q1. Medium-difficulty SQL questions, to be solved in PySpark in parallel (see the sketch after this round's questions).
  • Q2. Scenario-based questions on Data Factory and other Azure resources.
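
As referenced in Q1 above, the same aggregation written both as Spark SQL and as the equivalent PySpark DataFrame API; the table and column names are hypothetical.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("sql-vs-dataframe").getOrCreate()

    orders = spark.createDataFrame(
        [(1, "c1", 100), (2, "c1", 250), (3, "c2", 80)],
        ["order_id", "customer_id", "amount"],
    )
    orders.createOrReplaceTempView("orders")

    # SQL version.
    spark.sql("""
        SELECT customer_id, SUM(amount) AS total_amount
        FROM orders
        GROUP BY customer_id
    """).show()

    # Equivalent DataFrame (PySpark) version.
    orders.groupBy("customer_id").agg(F.sum("amount").alias("total_amount")).show()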
Round 2 - Technical (1 Question)

  • Q1. Project discussion with the team lead and manager
Round 3 - Client Interview (1 Question)

  • Q1. Technical and general discussion
Interview experience: 4 (Good)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: Not Selected

I applied via Naukri.com and was interviewed in Nov 2024. There were 2 interview rounds.

Round 1 - Technical (5 Questions)

  • Q1. How would you create a pipeline for ADLS to SQL data movement?
  • Q2. How would you create a pipeline from a REST API to ADLS? What if there are 8 million rows of records?
  • Q3. If data needs filtering, joining and aggregation, how would you do it with ADF?
  • Q4. Explain medallion architecture.
  • Q5. Explain medallion architecture with Databricks (see the sketch after this list).
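
For the medallion questions above, a condensed PySpark/Delta sketch of bronze → silver → gold layering. The storage paths and column names are placeholders, and it assumes Delta Lake is available (for example on Databricks).

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("medallion-demo").getOrCreate()
    base = "abfss://lake@<storageaccount>.dfs.core.windows.net"  # placeholder container

    # Bronze: raw data landed as-is from the source (e.g. files copied by ADF).
    bronze = spark.read.json(f"{base}/raw/events/")
    bronze.write.format("delta").mode("append").save(f"{base}/bronze/events")

    # Silver: cleaned, de-duplicated, typed records.
    silver = (spark.read.format("delta").load(f"{base}/bronze/events")
              .dropDuplicates(["event_id"])
              .filter(F.col("event_id").isNotNull())
              .withColumn("event_date", F.to_date("event_ts")))
    silver.write.format("delta").mode("overwrite").save(f"{base}/silver/events")

    # Gold: business-level aggregates ready for reporting.
    gold = silver.groupBy("event_date").agg(F.count("*").alias("event_count"))
    gold.write.format("delta").mode("overwrite").save(f"{base}/gold/daily_event_counts")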
Round 2 - HR (1 Question)

  • Q1. Basic questions and salary expectations.

Interview Preparation Tips

Topics to prepare for Capgemini Azure Data Engineer interview:
  • ADF
  • Databricks

Accenture Interview FAQs

How many rounds are there in Accenture Azure Data Engineer interview?
The Accenture interview process usually has 1-2 rounds. The most common rounds are Technical, HR and Coding Test.
What are the top questions asked in Accenture Azure Data Engineer interview?

Some of the top questions asked at the Accenture Azure Data Engineer interview -

  1. Which IR should we use if we want to copy data from an on-premises database to Azure?
  2. What is the difference between a scheduled trigger and a tumbling window trigger?
  3. What are the control flow activities in ADF?

Accenture Azure Data Engineer Interview Process

Based on 8 interviews, the process typically has 2 interview rounds:
  • Technical Round - 1
  • Technical Round - 2
Accenture Azure Data Engineer Salary

Based on 475 salaries: ₹4.4 L/yr - ₹16 L/yr (12% more than the average Azure Data Engineer salary in India)

Accenture Azure Data Engineer Reviews and Ratings

Based on 21 reviews: 3.0/5 overall

Rating in categories:
  • Skill development: 3.2
  • Work-life balance: 3.0
  • Salary: 2.6
  • Job security: 3.2
  • Company culture: 2.9
  • Promotions: 2.0
  • Work satisfaction: 2.6
Salaries for other popular roles at Accenture:
  • Application Development Analyst: 38.9k salaries, ₹3 L/yr - ₹12 L/yr
  • Application Development - Senior Analyst: 27k salaries, ₹6.9 L/yr - ₹17.5 L/yr
  • Team Lead: 24.3k salaries, ₹7.1 L/yr - ₹25.6 L/yr
  • Senior Software Engineer: 18.2k salaries, ₹6 L/yr - ₹19.5 L/yr
  • Software Engineer: 17.4k salaries, ₹3.6 L/yr - ₹13.4 L/yr
Compare Accenture with: TCS (3.7), Cognizant (3.8), Capgemini (3.7), Infosys (3.6)