Accenture
Proud winner of ABECA 2024 - AmbitionBox Employee Choice Awards
I was interviewed in Mar 2024.
Data proc is short for data processing, which involves transforming raw data into a more usable format for analysis.
Data proc involves cleaning, transforming, and aggregating raw data
It helps in preparing data for analysis and visualization
Examples include cleaning and formatting data from multiple sources before loading into a database
Data processing involves transforming raw data into meaningful information, while data flow refers to the movement of data between systems or components.
Data processing focuses on transforming raw data into a usable format for analysis or storage.
Data flow involves the movement of data between different systems, processes, or components.
Data processing can include tasks such as cleaning, aggregating, and analyzing data, while data flow describes how that data moves between systems and stages.
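The distinction above can be sketched in a few lines of Python: a minimal "processing" step (clean, then aggregate) over invented sample records; the field names and values are illustrative only.

```python
# Invented raw records: inconsistent city names, one missing amount.
raw = [
    {"city": " Pune ", "amount": "120"},
    {"city": "pune", "amount": "80"},
    {"city": "Mumbai", "amount": None},  # missing value to drop
]

# Clean: normalize city names, parse amounts, drop incomplete rows.
clean = [
    {"city": r["city"].strip().title(), "amount": int(r["amount"])}
    for r in raw
    if r["amount"] is not None
]

# Aggregate: total amount per city.
totals = {}
for r in clean:
    totals[r["city"]] = totals.get(r["city"], 0) + r["amount"]

print(totals)  # {'Pune': 200}
```

The "data flow" is everything around this snippet: how the raw records arrive and where the totals are delivered next.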
I applied via LinkedIn and was interviewed in Mar 2024. There were 2 interview rounds.
I led a project to develop a real-time data processing system for a retail company.
Designed data pipelines to ingest, process, and analyze large volumes of data
Implemented ETL processes using tools like Apache Spark and Kafka
Built data models and dashboards for business insights
Collaborated with cross-functional teams to gather requirements and deliver solutions
Salary expectations should be based on industry standards, experience, and location.
Research industry standards for Data Engineer salaries
Consider your level of experience and skills
Take into account the cost of living in the location of the job
Be prepared to negotiate based on the job responsibilities and benefits package
Remove duplicate characters from a string
Iterate through the string and keep track of characters seen
Use a set to store unique characters and remove duplicates
Reconstruct the string without duplicates
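The steps above can be sketched as a short function: a set tracks characters already seen, and the string is rebuilt keeping only each character's first occurrence.

```python
def remove_duplicates(s: str) -> str:
    """Return s with duplicate characters removed, preserving first-seen order."""
    seen = set()
    out = []
    for ch in s:
        if ch not in seen:
            seen.add(ch)
            out.append(ch)
    return "".join(out)

print(remove_duplicates("programming"))  # progamin
```

Iterating while checking the set keeps the original character order, which a plain `set(s)` would lose.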
SQL query to retrieve second highest salary from a table
Use the ORDER BY clause to sort distinct salaries in descending order
Use LIMIT 1 OFFSET 1 to skip the highest value and return the second row
Apply DISTINCT so ties on the top salary are not counted twice
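One way to sketch the query is against an in-memory SQLite table; the table and column names here are invented for illustration.

```python
import sqlite3

# Sample table with a tie on the highest salary.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?)",
    [("a", 50000), ("b", 70000), ("c", 70000), ("d", 60000)],
)

# DISTINCT handles ties; sort descending, skip the top value, take the next.
row = conn.execute(
    "SELECT DISTINCT salary FROM employees "
    "ORDER BY salary DESC LIMIT 1 OFFSET 1"
).fetchone()
print(row[0])  # 60000
```

Without DISTINCT, the duplicated 70000 would itself be returned as the "second" row.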
I applied via Referral and was interviewed in Feb 2024. There was 1 interview round.
External views are virtual tables that provide a way to present data from one or more tables in a database.
External views do not store data themselves, but instead provide a way to access data from underlying tables.
They can be used to simplify complex queries by presenting data in a more user-friendly format.
External views can also be used to restrict access to certain columns or rows of data for security purposes.
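The column-restriction point can be demonstrated with SQLite (table and column names invented): the view stores no data, it is a saved query that hides the sensitive column.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staff (name TEXT, dept TEXT, salary INTEGER)")
conn.execute("INSERT INTO staff VALUES ('a', 'eng', 90000)")

# A view exposing only the non-sensitive columns of the base table.
conn.execute("CREATE VIEW staff_public AS SELECT name, dept FROM staff")

print(conn.execute("SELECT * FROM staff_public").fetchall())  # [('a', 'eng')]
```

Granting users access to `staff_public` instead of `staff` hides `salary` without duplicating any data.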
BigQuery slots are units of computational capacity used to process queries in Google BigQuery.
BigQuery slots are used to allocate resources for query processing in Google BigQuery.
Each query consumes a certain number of slots based on the complexity and size of the data being processed.
Users can purchase additional slots to increase query processing capacity.
Slots are used to parallelize query execution and improve performance.
Serverless computing in Databricks allows users to run code without managing servers, scaling automatically based on workload.
Serverless computing in Databricks enables users to focus on writing code without worrying about server management.
It automatically scales resources based on workload, reducing costs and improving efficiency.
Users can run code in Databricks without provisioning or managing servers, making it easier to focus on the workload itself.
I applied via Naukri.com and was interviewed before Nov 2023. There was 1 interview round.
Databricks supports two types of clusters: Standard and High Concurrency.
Databricks supports Standard clusters for single user workloads
Databricks supports High Concurrency clusters for multi-user workloads
Standard clusters are suitable for ad-hoc analysis and ETL jobs
High Concurrency clusters are suitable for shared notebooks and interactive dashboards
Integration runtime (IR) in Azure Data Factory (ADF) is the compute infrastructure that executes data integration activities.
Azure IR handles data movement and transformation between cloud data stores.
Self-hosted IR connects ADF to data sources in on-premises or private networks.
Azure-SSIS IR runs existing SSIS packages inside ADF.
Choosing and sizing the right IR matters for performance, cost, and network access.
Triggers in Azure Data Factory (ADF) are events that cause a pipeline to execute.
Types of triggers in ADF include schedule, tumbling window, event-based, and manual.
Schedule triggers run pipelines on a specified schedule, like daily or hourly.
Tumbling window triggers fire on fixed-size, non-overlapping, contiguous time intervals and can retry past windows.
Event-based triggers execute pipelines based on events like file arrival or HTTP request.
Manual triggers require a user or an API call to run the pipeline on demand.
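A schedule trigger's definition is JSON; below is a simplified, hedged sketch of its shape as a Python dict — field names follow the ADF schema but are trimmed, and the trigger and pipeline names are invented.

```python
# Simplified sketch of an ADF schedule trigger definition; real definitions
# carry more required fields. Names are illustrative, not from a real factory.
schedule_trigger = {
    "name": "DailyTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",      # run once per day
                "interval": 1,
                "startTime": "2024-03-01T06:00:00Z",
            }
        },
        "pipelines": [
            {"pipelineReference": {"referenceName": "DailyLoadPipeline"}}
        ],
    },
}

print(schedule_trigger["properties"]["typeProperties"]["recurrence"]["frequency"])
```

Swapping `"type"` and `"typeProperties"` is what distinguishes the trigger kinds listed above.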
Activities in ADF and their uses
Data movement activities like Copy Data and Data Flow
Data transformation activities like Mapping Data Flow and Wrangling Data Flow
Data orchestration activities like Execute Pipeline and Wait
Control activities like If Condition and For Each
Integration Runtimes for executing activities in ADF
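These activities appear in a pipeline's JSON definition; here is a simplified, hedged sketch as a Python dict showing how a Copy activity and a Wait activity nest inside a pipeline — real definitions require linked services, datasets, and policies, and all names here are invented.

```python
# Simplified sketch of an ADF pipeline definition (fields trimmed).
pipeline = {
    "name": "CopySalesData",              # illustrative name
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToSql",
                "type": "Copy",           # data movement activity
                "inputs": [{"referenceName": "BlobDataset"}],
                "outputs": [{"referenceName": "SqlDataset"}],
            },
            {
                "name": "WaitBeforeNext",
                "type": "Wait",           # control activity
                "typeProperties": {"waitTimeInSeconds": 30},
            },
        ]
    },
}

print([a["type"] for a in pipeline["properties"]["activities"]])  # ['Copy', 'Wait']
```

Each activity's `"type"` field selects one of the categories listed above, and the chosen integration runtime is what actually executes it.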
Simple Python questions
Some of the top questions asked at the Accenture Data Engineer interview -
The duration of the Accenture Data Engineer interview process can vary, but it typically takes less than 2 weeks to complete.
based on 85 interviews
3 Interview rounds
Designation | Salaries reported | Salary range
Application Development Analyst | 38.9k | ₹3 L/yr - ₹12 L/yr
Application Development - Senior Analyst | 27k | ₹6.9 L/yr - ₹17.5 L/yr
Team Lead | 24.3k | ₹7.1 L/yr - ₹25.6 L/yr
Senior Software Engineer | 18.2k | ₹6 L/yr - ₹19.5 L/yr
Software Engineer | 17.4k | ₹3.6 L/yr - ₹13.4 L/yr