Ernst & Young
Create an Employee POJO in Java 8 and sort a list of employees by their salary.
Define the Employee class with fields: salary, id, and name.
Use a constructor to initialize the Employee objects.
Implement the Comparable interface or use a Comparator for sorting.
Utilize Java 8 Streams to sort the list of employees by salary.
A lambda function is a small anonymous function in Python that can take any number of arguments but contains only a single expression.
Lambda functions are defined using the lambda keyword.
They are commonly used with functions like filter(), map(), and functools.reduce().
Lambda functions can be used to create simple one-line functions.
They are often used as arguments to higher-order functions.
Lambda functions can be assigned to variables and called like regular functions.
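A minimal sketch of these points; the sample list and variable names are just for illustration:

    from functools import reduce

    numbers = [3, 1, 4, 1, 5, 9, 2, 6]

    # A lambda assigned to a variable and called like a regular function
    square = lambda x: x * x
    print(square(4))  # 16

    # Lambdas passed as arguments to higher-order functions
    evens = list(filter(lambda n: n % 2 == 0, numbers))   # [4, 2, 6]
    doubled = list(map(lambda n: n * 2, numbers))          # [6, 2, 8, 2, 10, 18, 4, 12]
    total = reduce(lambda acc, n: acc + n, numbers)        # 31
    print(evens, doubled, total)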
Hadoop is a distributed computing framework whose architecture allows for the processing of large data sets across a cluster of machines.
Hadoop consists of two main components: Hadoop Distributed File System (HDFS) and MapReduce.
HDFS is responsible for storing data across multiple nodes in a cluster.
MapReduce is responsible for processing the data stored in HDFS.
Hadoop also includes other components such as YARN, which manages resources in the cluster.
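To make the MapReduce idea concrete, here is a small in-process word-count sketch in Python; in a real Hadoop job the map and reduce steps would run as separate, distributed tasks over data stored in HDFS, so this is only a conceptual illustration:

    from itertools import groupby

    def mapper(lines):
        # Map step: emit a (word, 1) pair for every word in the input
        for line in lines:
            for word in line.split():
                yield word.lower(), 1

    def reducer(pairs):
        # Shuffle/sort then reduce: group the pairs by word and sum the counts
        for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
            yield word, sum(count for _, count in group)

    sample_lines = [
        "big data needs big clusters",
        "hadoop processes big data",
    ]
    for word, total in reducer(mapper(sample_lines)):
        print(f"{word}\t{total}")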
AWS Lambda is a serverless computing service that allows you to run code without provisioning or managing servers.
AWS Lambda is event-driven and can be triggered by various AWS services or external events
It supports multiple programming languages including Node.js, Python, Java, and C#
You only pay for the compute time that you consume, with no upfront costs or minimum fees
Lambda functions can be used for a variety of event-driven use cases.
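As an illustration, a minimal Python handler could look like the sketch below; the event fields and response shape are assumptions, since the actual payload depends on the triggering service:

    import json

    def lambda_handler(event, context):
        # 'event' carries the trigger payload; here we assume a simple JSON object with a "name" key
        name = event.get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}!"}),
        }

    # Local usage example (outside AWS), simulating an invocation:
    print(lambda_handler({"name": "EY"}, None))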
I applied via Referral and was interviewed in Sep 2024. There were 3 interview rounds.
I have 3 years of experience working in the Data Analytics team, where I analyzed large datasets to provide insights and recommendations.
Performed data cleaning and preprocessing tasks to ensure data accuracy
Utilized statistical analysis and machine learning techniques to extract valuable information
Created data visualizations and reports to communicate findings to stakeholders
Data cleansing involves identifying and correcting errors in data to improve its quality. Ways to analyze data include using statistical methods, data visualization, and machine learning algorithms.
Identify and remove duplicate records
Standardize data formats and values
Fill in missing values using imputation techniques
Use data profiling to understand data quality issues
Apply data validation rules to ensure accuracy
Util...
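A small pandas sketch of some of the cleansing steps above; the DataFrame, its column names, and the median imputation are illustrative assumptions:

    import pandas as pd

    df = pd.DataFrame({
        "customer_id": [1, 2, 2, 3],
        "city": ["Mumbai ", "delhi", "delhi", None],
        "revenue": [120.0, None, None, 90.0],
    })

    df = df.drop_duplicates()                                      # remove duplicate records
    df["city"] = df["city"].str.strip().str.title()                # standardize formats and values
    df["revenue"] = df["revenue"].fillna(df["revenue"].median())   # impute missing values
    print(df.describe(include="all"))                              # quick profiling of the cleaned data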
I would deal with the overwhelming nature of working at a Big 4 firm by prioritizing tasks, seeking support from colleagues, and practicing self-care.
Prioritize tasks based on deadlines and importance to avoid feeling overwhelmed
Seek support from colleagues or mentors to discuss challenges and brainstorm solutions
Practice self-care activities such as exercise, meditation, or hobbies to manage stress and maintain work-life balance.
Basic aptitude and coding
Denormalization is the process of adding redundant data to a database to improve read performance.
Denormalization involves duplicating data from normalized tables into a single denormalized table.
It can improve query performance by reducing the need for joins and aggregations.
Denormalization is often used in data warehousing and reporting applications.
Examples of denormalization include creating summary tables or adding redundant columns.
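A small sketch of the idea using Python's built-in sqlite3 module; the table and column names are illustrative:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Normalized source tables
    cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
    cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
    cur.execute("INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi')")
    cur.execute("INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0)")

    # Denormalized summary table: duplicates the customer name and pre-aggregates order totals,
    # so reporting queries avoid the join and the aggregation at read time.
    cur.execute("""
        CREATE TABLE customer_order_summary AS
        SELECT c.id AS customer_id, c.name AS customer_name,
               COUNT(o.id) AS order_count, SUM(o.amount) AS total_amount
        FROM customers c JOIN orders o ON o.customer_id = c.id
        GROUP BY c.id, c.name
    """)
    print(cur.execute("SELECT * FROM customer_order_summary").fetchall())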
I applied via Campus Placement and was interviewed in Jan 2024. There were 4 interview rounds.
The aptitude test was fairly moderate. Most questions were quite easy, but there were 4-5 tricky ones that decided the cutoff.
Two questions were asked: one on string manipulation and another on dynamic programming. Nothing too crazy; both were solvable.
In my group discussion round, our group was asked to introduce ourselves. Selection was based on how well you could introduce yourself; they were looking at professional, personal, and social background.
Data privacy refers to the protection of personal information from unauthorized access or disclosure.
Data privacy involves controlling who has access to personal information
It includes ensuring that data is securely stored and transmitted
Examples include encryption, access controls, and data anonymization
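As a small illustration of one of these techniques, the sketch below pseudonymizes an identifier with a salted hash in Python; the salt handling is deliberately simplified, and hashing alone is not full anonymization:

    import hashlib

    def pseudonymize(value: str, salt: str = "example-salt") -> str:
        # Replace a direct identifier (e.g. an email address) with a salted hash
        return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

    record = {"email": "user@example.com", "purchase_total": 1200}
    safe_record = {**record, "email": pseudonymize(record["email"])}
    print(safe_record)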
Cloud computing is the delivery of computing services over the internet, including storage, databases, networking, software, and more.
Allows users to access and store data and applications on remote servers instead of on their local devices
Provides scalability, flexibility, and cost-effectiveness for businesses
Examples include Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform
I applied via Naukri.com and was interviewed before Sep 2023. There were 2 interview rounds.
The aptitude test covered Snowflake and was aimed at candidates with professional experience.
I appeared for an interview before May 2024, where I was asked the following questions.
I want to join EY for its innovative culture, commitment to professional growth, and the opportunity to work on impactful projects globally.
EY's reputation for innovation aligns with my passion for leveraging technology to solve complex business challenges.
The firm's commitment to continuous learning and development, such as the EYU platform, excites me as I seek to enhance my skills.
Working on diverse projects across ...
Expectations from your side include collaboration, clear communication, and a focus on innovative solutions to meet client needs.
Collaboration: Work closely with your team to understand project goals and deliverables.
Clear Communication: Maintain open lines of communication to ensure alignment on expectations and progress.
Innovative Solutions: Provide creative and effective technology solutions tailored to client requirements.
I applied via LinkedIn and was interviewed before May 2023. There were 2 interview rounds.
Detecting and handling memory leaks in Node.js involves using tools like heap snapshots and monitoring memory usage.
Use tools like heap snapshots to identify memory leaks
Monitor memory usage over time to detect abnormal increases
Implement proper garbage collection strategies to free up memory
Avoid creating unnecessary closures or retaining references to objects
Designation | Salaries reported | Salary range
Senior Consultant | 19.3k | ₹15 L/yr - ₹27 L/yr
Consultant | 13.1k | ₹10 L/yr - ₹18 L/yr
Manager | 8k | ₹23 L/yr - ₹40 L/yr
Assistant Manager | 6.8k | ₹14.3 L/yr - ₹25.5 L/yr
Associate Consultant | 4.3k | ₹5.2 L/yr - ₹12 L/yr
Deloitte
PwC
EY Global Delivery Services ( EY GDS)
Accenture