Ernst & Young
ggplot2 is a powerful and flexible library for creating a wide range of graphs in R.
It allows for customization of almost every aspect of the graph.
Other popular graphing libraries in R include lattice and plotly.
Given an array/list of strings STR_LIST, group the anagrams together and return each group as a list of strings. Each group must contain strings that are anagrams of each other.
Group anagrams in a list of strings together and return each group as a list of strings.
Iterate through the list of strings and sort each string alphabetically to identify anagrams.
Use a hashmap to group anagrams together based on their sorted versions.
Return the values of the hashmap as the grouped anagrams.
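A minimal Java sketch of this sorted-key approach, assuming the input arrives as a List<String> (the groupAnagrams method name and the sample words are illustrative, not from the original question):

```java
import java.util.*;

public class GroupAnagrams {
    // Groups anagrams by using each word's sorted characters as the hashmap key.
    static List<List<String>> groupAnagrams(List<String> strList) {
        Map<String, List<String>> groups = new HashMap<>();
        for (String s : strList) {
            char[] chars = s.toCharArray();
            Arrays.sort(chars);                       // the sorted form is identical for all anagrams
            String key = new String(chars);
            groups.computeIfAbsent(key, k -> new ArrayList<>()).add(s);
        }
        return new ArrayList<>(groups.values());      // each value is one group of anagrams
    }

    public static void main(String[] args) {
        System.out.println(groupAnagrams(List.of("eat", "tea", "tan", "ate", "nat", "bat")));
        // e.g. [[eat, tea, ate], [bat], [tan, nat]] (group order may vary)
    }
}
```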
Your task is to design a data structure that efficiently stores a mapping of keys to values and performs operations in constant time.
1. INSERT(key, value) 2. DELETE(key) 3. SEARCH(key) 4. GET(key) 5. GET_SIZE() 6. IS_EMPTY()
Design a HashMap data structure with operations like INSERT, DELETE, SEARCH, GET, GET_SIZE, and IS_EMPTY.
Implement a hash table with efficient key-value mapping.
Ensure constant time complexity for operations.
Handle cases where key is not found or data structure is empty.
Example: INSERT('key1', 10), SEARCH('key1'), GET('key1'), DELETE('key1'), GET_SIZE(), IS_EMPTY()
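One way to sketch such a structure in Java is separate chaining over a fixed array of buckets, which gives average-case constant-time operations (the MyHashMap class name, bucket count, and method names below are illustrative):

```java
import java.util.LinkedList;

// Illustrative hash map using separate chaining; operations are average-case O(1).
public class MyHashMap<K, V> {
    private static class Entry<K, V> {
        final K key;
        V value;
        Entry(K key, V value) { this.key = key; this.value = value; }
    }

    private final LinkedList<Entry<K, V>>[] buckets;
    private int size = 0;

    @SuppressWarnings("unchecked")
    public MyHashMap(int capacity) {
        buckets = new LinkedList[capacity];
        for (int i = 0; i < capacity; i++) buckets[i] = new LinkedList<>();
    }

    private LinkedList<Entry<K, V>> bucketFor(K key) {
        return buckets[Math.floorMod(key.hashCode(), buckets.length)];
    }

    public void insert(K key, V value) {               // INSERT(key, value)
        for (Entry<K, V> e : bucketFor(key)) {
            if (e.key.equals(key)) { e.value = value; return; }   // overwrite existing key
        }
        bucketFor(key).add(new Entry<>(key, value));
        size++;
    }

    public boolean search(K key) {                     // SEARCH(key)
        for (Entry<K, V> e : bucketFor(key)) if (e.key.equals(key)) return true;
        return false;
    }

    public V get(K key) {                              // GET(key): returns null if key not found
        for (Entry<K, V> e : bucketFor(key)) if (e.key.equals(key)) return e.value;
        return null;
    }

    public void delete(K key) {                        // DELETE(key): no effect if key not found
        if (bucketFor(key).removeIf(e -> e.key.equals(key))) size--;
    }

    public int getSize() { return size; }              // GET_SIZE()
    public boolean isEmpty() { return size == 0; }     // IS_EMPTY()
}
```

With this sketch, the example above would read roughly: map.insert("key1", 10); map.search("key1"); map.get("key1"); map.delete("key1"); map.getSize(); map.isEmpty();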
Dependency injection is a design pattern in which components are given their dependencies rather than creating them internally.
Dependency injection helps in achieving loose coupling between classes.
It allows for easier testing by providing mock dependencies.
There are three types of dependency injection: constructor injection, setter injection, and interface injection.
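A small constructor-injection sketch in Java; the PaymentGateway and PaymentService names are invented purely for illustration:

```java
// The dependency is expressed as an interface, so the class is not tied to a concrete implementation.
interface PaymentGateway {
    boolean charge(double amount);
}

class PaymentService {
    private final PaymentGateway gateway;

    // Constructor injection: the dependency is handed in rather than created with `new` inside the class.
    PaymentService(PaymentGateway gateway) {
        this.gateway = gateway;
    }

    boolean pay(double amount) {
        return gateway.charge(amount);
    }
}

public class Demo {
    public static void main(String[] args) {
        // In a test, a mock or fake gateway can be injected instead of a real one.
        PaymentGateway fakeGateway = amount -> true;
        PaymentService service = new PaymentService(fakeGateway);
        System.out.println(service.pay(100.0)); // true
    }
}
```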
The @RestController annotation in Spring Boot is used to define a class as a RESTful controller.
Used to create RESTful web services in Spring Boot
Combines @Controller and @ResponseBody annotations
Eliminates the need for @ResponseBody annotation on each method
Returns data directly in the response body as JSON or XML
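A minimal controller along these lines might look as follows (the GreetingController class and the /hello endpoint are invented for illustration, assuming a standard Spring Boot web project):

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.Map;

// @RestController = @Controller + @ResponseBody, so return values are written
// straight to the response body (serialized to JSON by default).
@RestController
public class GreetingController {

    @GetMapping("/hello")
    public Map<String, String> hello() {
        return Map.of("message", "Hello, world"); // serialized as {"message":"Hello, world"}
    }
}
```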
Spring Boot is a framework that simplifies the development of Java applications by providing pre-configured setups.
Auto-configuration: Spring Boot automatically configures the application based on dependencies added to the project.
Embedded server: Spring Boot comes with an embedded Tomcat, Jetty, or Undertow server for easy deployment.
Actuator: Provides production-ready features like monitoring, metrics, and health checks.
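As a rough illustration of auto-configuration and the embedded server, a bare-bones Spring Boot entry point looks something like this (assuming the spring-boot-starter-web dependency is on the classpath; the DemoApplication name is illustrative):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// @SpringBootApplication enables auto-configuration and component scanning;
// running main() starts the embedded server (Tomcat by default with the web starter).
@SpringBootApplication
public class DemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
```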
Java 8 streams are a sequence of elements that support functional-style operations.
Streams allow for processing sequences of elements in a functional way.
They can be created from various data sources like collections, arrays, or I/O channels.
Operations like filter, map, reduce, and collect can be performed on streams.
Streams are lazy, meaning intermediate operations are only executed when a terminal operation is called.
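A short sketch of such a pipeline combining these operations (the sample numbers are invented):

```java
import java.util.List;
import java.util.stream.Collectors;

public class StreamDemo {
    public static void main(String[] args) {
        List<Integer> numbers = List.of(1, 2, 3, 4, 5, 6);

        // filter and map are lazy intermediate operations; collect is the terminal
        // operation that actually triggers the pipeline.
        List<Integer> doubledEvens = numbers.stream()
                .filter(n -> n % 2 == 0)   // keep even numbers
                .map(n -> n * 2)           // double them
                .collect(Collectors.toList());

        // reduce is another terminal operation, here summing the original numbers.
        int sum = numbers.stream().reduce(0, Integer::sum);

        System.out.println(doubledEvens); // [4, 8, 12]
        System.out.println(sum);          // 21
    }
}
```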
Java 8 program to iterate through a Stream using forEach method
Create a Stream of elements using Stream.of() or any other method
Use the forEach() method to iterate through the Stream and perform an action on each element
Example: Stream.of(1, 2, 3, 4, 5).forEach(System.out::println);
Spring Boot is a framework that simplifies the development of Java applications by providing pre-configured settings and tools.
Spring Boot eliminates the need for manual configuration by providing defaults for most settings.
It includes embedded servers like Tomcat, Jetty, or Undertow, making it easy to run applications as standalone JAR files.
Spring Boot also offers production-ready features like metrics and health checks via the Actuator.
MVC in Spring is a design pattern that separates an application into three main components: Model, View, and Controller.
Model represents the data and business logic of the application.
View is responsible for rendering the user interface based on the data from the Model.
Controller acts as an intermediary between Model and View, handling user input and updating the Model accordingly.
Spring MVC provides annotations like @Controller, @RequestMapping, and @GetMapping to wire these components together.
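A sketch of how the three pieces meet in a Spring MVC controller (the UserController, the hard-coded user list, and the "users" view name are illustrative and assume a matching view template exists):

```java
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.GetMapping;

import java.util.List;

// Controller: handles the incoming request, pulls data for the Model,
// and hands it to a View for rendering.
@Controller
public class UserController {

    @GetMapping("/users")
    public String listUsers(Model model) {
        List<String> users = List.of("Alice", "Bob");  // stand-in for data from the model/service layer
        model.addAttribute("users", users);            // data exposed to the view
        return "users";                                // logical view name resolved to a template
    }
}
```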
I applied via Referral and was interviewed in Sep 2024. There were 3 interview rounds.
I have 3 years of experience working in a Data Analytics team, where I analyzed large datasets to provide insights and recommendations.
Performed data cleaning and preprocessing tasks to ensure data accuracy
Utilized statistical analysis and machine learning techniques to extract valuable information
Created data visualizations and reports to communicate findings to stakeholders
Data cleansing involves identifying and correcting errors in data to improve its quality. Ways to analyze data include using statistical methods, data visualization, and machine learning algorithms.
Identify and remove duplicate records
Standardize data formats and values
Fill in missing values using imputation techniques
Use data profiling to understand data quality issues
Apply data validation rules to ensure accuracy
Util...
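A toy Java sketch of a few of these steps (standardizing formats, removing duplicates, and mean imputation), using invented sample values:

```java
import java.util.*;
import java.util.stream.Collectors;

public class CleanseDemo {
    public static void main(String[] args) {
        // Raw values with duplicates, inconsistent formatting, and missing entries (null).
        List<String> cities = Arrays.asList(" Delhi", "delhi", "Mumbai ", null, "MUMBAI");

        // Standardize format (trim, lower-case), drop missing values, remove duplicates.
        List<String> cleaned = cities.stream()
                .filter(Objects::nonNull)
                .map(c -> c.trim().toLowerCase())
                .distinct()
                .collect(Collectors.toList());
        System.out.println(cleaned); // [delhi, mumbai]

        // Impute missing numeric values with the mean of the observed ones.
        List<Double> amounts = Arrays.asList(10.0, null, 30.0);
        double mean = amounts.stream().filter(Objects::nonNull)
                .mapToDouble(Double::doubleValue).average().orElse(0.0);
        List<Double> imputed = amounts.stream()
                .map(a -> a == null ? mean : a)
                .collect(Collectors.toList());
        System.out.println(imputed); // [10.0, 20.0, 30.0]
    }
}
```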
I would deal with the overwhelming nature of working in a Big4 by prioritizing tasks, seeking support from colleagues, and practicing self-care.
Prioritize tasks based on deadlines and importance to avoid feeling overwhelmed
Seek support from colleagues or mentors to discuss challenges and brainstorm solutions
Practice self-care activities such as exercise, meditation, or hobbies to manage stress and maintain work-life balance.
Basic aptitude and coding
Denormalization is the process of adding redundant data to a database to improve read performance.
Denormalization involves duplicating data from normalized tables into a single denormalized table.
It can improve query performance by reducing the need for joins and aggregations.
Denormalization is often used in data warehousing and reporting applications.
Examples of denormalization include creating summary tables or adding redundant columns to avoid expensive joins.
I applied via Campus Placement and was interviewed in Jan 2024. There were 4 interview rounds.
The aptitude test was fairly moderate. Most questions were quite easy, and there were 4-5 tricky ones that decided the cutoff.
Two questions were asked: one on string manipulation and another on dynamic programming. Nothing too crazy; the questions were solvable.
In my group discussion round, our group was asked to introduce ourselves. Selection was based on how well you could introduce yourself; they were looking at professional, personal, and social background.
Data privacy refers to the protection of personal information from unauthorized access or disclosure.
Data privacy involves controlling who has access to personal information
It includes ensuring that data is securely stored and transmitted
Examples include encryption, access controls, and data anonymization
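As one small illustration of anonymization, a personal identifier can be replaced with a one-way hash before data is shared. This sketch uses SHA-256 and Java 17's HexFormat; the maskEmail helper and the sample address are invented:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HexFormat;

public class Anonymize {
    // Replaces a personal identifier with a SHA-256 hash so the raw value is not exposed downstream.
    static String maskEmail(String email) throws NoSuchAlgorithmException {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        byte[] hash = digest.digest(email.toLowerCase().getBytes(StandardCharsets.UTF_8));
        return HexFormat.of().formatHex(hash); // HexFormat requires Java 17+
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        System.out.println(maskEmail("jane.doe@example.com")); // pseudonymous token instead of the email
    }
}
```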
Cloud computing is the delivery of computing services over the internet, including storage, databases, networking, software, and more.
Allows users to access and store data and applications on remote servers instead of on their local devices
Provides scalability, flexibility, and cost-effectiveness for businesses
Examples include Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform
I applied via Naukri.com and was interviewed before Sep 2023. There were 2 interview rounds.
Snowflake aptitude test for professional experience
I appeared for an interview before May 2024, where I was asked the following questions.
I want to join EY for its innovative culture, commitment to professional growth, and the opportunity to work on impactful projects globally.
EY's reputation for innovation aligns with my passion for leveraging technology to solve complex business challenges.
The firm's commitment to continuous learning and development, such as the EYU platform, excites me as I seek to enhance my skills.
Working on diverse projects across ...
Expectations from your side include collaboration, clear communication, and a focus on innovative solutions to meet client needs.
Collaboration: Work closely with your team to understand project goals and deliverables.
Clear Communication: Maintain open lines of communication to ensure alignment on expectations and progress.
Innovative Solutions: Provide creative and effective technology solutions tailored to client requirements.
I applied via LinkedIn and was interviewed before May 2023. There were 2 interview rounds.
Detecting and handling memory leaks in Node.js involves using tools like heap snapshots and monitoring memory usage.
Use tools like heap snapshots to identify memory leaks
Monitor memory usage over time to detect abnormal increases
Implement proper garbage collection strategies to free up memory
Avoid creating unnecessary closures or retaining references to objects
| Role | Salaries reported | Salary range |
| --- | --- | --- |
| Senior Consultant | 19.4k | ₹9.1 L/yr - ₹30 L/yr |
| Consultant | 13.2k | ₹6.4 L/yr - ₹21 L/yr |
| Manager | 8k | ₹16.8 L/yr - ₹51 L/yr |
| Assistant Manager | 6.8k | ₹9.9 L/yr - ₹28.5 L/yr |
| Associate Consultant | 4.3k | ₹4.8 L/yr - ₹15 L/yr |