Deloitte
I applied via Campus Placement at SRM University (SRMU) and was interviewed in Apr 2023. There were 3 interview rounds.
Two questions were asked in the coding round; both were medium level.
I applied via Campus Placement and was interviewed in Dec 2021. There was 1 interview round.
I applied via Job Portal and was interviewed in Oct 2024. There were 2 interview rounds.
The exam was good and manageable with preparation.
I applied via Indeed and was interviewed in Jan 2024. There was 1 interview round.
It comprised 30 questions on verbal ability and quantitative aptitude.
It comprised a few coding questions related to graphs and dynamic programming (DP).
I applied via Naukri.com and was interviewed before Mar 2023. There were 2 interview rounds.
Spark is a fast and general-purpose cluster computing system.
Spark is designed for speed and ease of use in data processing.
It can run programs up to 100x faster than Hadoop MapReduce.
Spark provides high-level APIs in Java, Scala, Python, and R.
It supports various workloads such as batch processing, interactive queries, streaming analytics, and machine learning.
Spark can be used standalone, on Mesos, or on Hadoop YARN.
Hive is a data warehouse infrastructure built on top of Hadoop for providing data summarization, query, and analysis.
Hive uses a SQL-like language called HiveQL to query and manage large datasets stored in Hadoop
It allows users to write complex queries to analyze and process data
Hive organizes data into tables, partitions, and buckets for efficient querying
It is commonly used for data warehousing and data analysis.
There is no one-size-fits-all answer to which file format is best, as it depends on the specific use case and requirements.
Consider the type of data being stored or transmitted (e.g. text, images, videos, etc.)
Take into account compatibility with software and devices that will be used to access the files
Evaluate factors such as file size, compression, and quality requirements
Common file formats include PDF for documents...
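As a rough illustration of the size trade-off mentioned above, the sketch below serializes the same records as CSV and as JSON and compares the byte counts. The sample records are invented for this example:

```python
import csv
import io
import json

# Hypothetical sample records used only for this illustration.
records = [
    {"id": 1, "name": "alpha", "value": 3.14},
    {"id": 2, "name": "beta", "value": 2.72},
]

# Serialize the same data as JSON (field names repeated per record).
json_text = json.dumps(records)

# Serialize the same data as CSV (field names appear once, in the header).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "name", "value"])
writer.writeheader()
writer.writerows(records)
csv_text = buf.getvalue()

# For flat tabular data, CSV is usually smaller than JSON.
print(len(csv_text), len(json_text))
```

For nested or schema-less data the comparison flips the other way, which is why the choice depends on the use case.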
Designing ingestion pipeline involves defining data sources, data processing steps, data storage, and data delivery mechanisms.
Identify data sources such as databases, APIs, files, etc.
Define data processing steps like data extraction, transformation, and loading (ETL).
Choose appropriate data storage solutions like databases, data lakes, or data warehouses.
Implement data delivery mechanisms for downstream applications ...
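The steps above can be sketched end to end. This is a minimal illustration only: the source records are hard-coded stand-ins for an API or file read, and an in-memory SQLite database stands in for the warehouse:

```python
import sqlite3

# Extract: pull raw records from a source (a hard-coded list here,
# standing in for a database query, API call, or file read).
raw = [("2024-01-01", "sensor_a", "21.5"), ("2024-01-02", "sensor_a", "22.0")]

# Transform: cast types and drop rows with missing values.
clean = [(day, name, float(val)) for day, name, val in raw if val]

# Load: write into the storage layer (in-memory SQLite as a stand-in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (day TEXT, sensor TEXT, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", clean)

# Deliver: downstream consumers query the stored data.
avg = conn.execute("SELECT AVG(value) FROM readings").fetchone()[0]
print(avg)
```

A production pipeline would add scheduling, retries, and monitoring around these same four stages.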
I applied via Campus Placement and was interviewed in Jun 2024. There were 2 interview rounds.
Easy round; prepare basic-level topics in analytical reasoning, verbal English, and quantitative aptitude.
I applied via Referral and was interviewed in Apr 2024. There were 2 interview rounds.
ADF stands for Azure Data Factory, a cloud-based data integration service that allows you to create, schedule, and manage data pipelines.
ADF allows you to create data-driven workflows for orchestrating and automating data movement and data transformation.
It supports a wide range of data sources, including Azure Blob Storage, Azure SQL Database, and on-premises data sources.
You can use ADF to ingest data from various sources...
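For illustration, a minimal ADF pipeline definition with a single Copy activity might look like the JSON fragment below. The pipeline and dataset names here are placeholders, and real pipeline definitions carry additional properties:

```json
{
  "name": "CopyBlobToSql",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlobToSql",
        "type": "Copy",
        "inputs": [
          { "referenceName": "BlobSourceDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "SqlSinkDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      }
    ]
  }
}
```

Triggers and schedules are defined separately and attached to the pipeline to automate runs.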
Dashboard is a visual display of key metrics and trends, while a report is a detailed collection of data and analysis.
A dashboard provides a high-level overview, while a report dives deeper into specific data points.
Dashboards are interactive and allow drill-down; reports are static.
Dashboards are typically used for monitoring and decision-making; reports are used for in-depth analysis.
Example: A dashboard ...
Lead and lag are SQL functions used to access data from a previous or subsequent row in a result set.
Lead function is used to access data from a subsequent row in the result set.
Lag function is used to access data from a previous row in the result set.
Both functions can be used to compare values between rows or calculate differences.
Example: SELECT lead(column_name) OVER (ORDER BY column_name) FROM table_name;
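The same pattern can be exercised in a self-contained way against SQLite, which supports window functions from version 3.25 onward (the `sales` table and its rows are made up for this sketch):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day INTEGER, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)", [(1, 100), (2, 150), (3, 130)]
)

# LAG pulls the previous row's amount; LEAD pulls the next row's.
# Rows at the edges of the window have no neighbor, so they get NULL.
rows = conn.execute("""
    SELECT day,
           amount,
           LAG(amount)  OVER (ORDER BY day) AS prev_amount,
           LEAD(amount) OVER (ORDER BY day) AS next_amount
    FROM sales
    ORDER BY day
""").fetchall()

for row in rows:
    print(row)
```

Subtracting `prev_amount` from `amount` in the same query is the usual way to compute day-over-day differences without a self-join.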
I want to join PwC because of its reputation for providing excellent opportunities for professional growth and development.
PwC is known for its strong commitment to employee training and development.
I am impressed by PwC's diverse client base and the opportunity to work on challenging projects.
I believe PwC's collaborative work environment will allow me to learn from experienced professionals and grow in my career.
Seeking new challenges and growth opportunities in a different environment.
Looking for new challenges and opportunities for growth
Interested in exploring different work environments and cultures
Seeking a change in career path or industry
Desire for a higher level of responsibility or leadership role
Wishing to relocate to a different location for personal reasons
I applied via Campus Placement
Questions from maths and reasoning.
Two LeetCode medium questions.
Salaries at Deloitte by designation:
Consultant: 32.8k salaries | ₹6.2 L/yr - ₹23 L/yr
Senior Consultant: 20.9k salaries | ₹11 L/yr - ₹42 L/yr
Analyst: 13.9k salaries | ₹3.8 L/yr - ₹12.6 L/yr
Assistant Manager: 9.9k salaries | ₹7.6 L/yr - ₹24 L/yr
Manager: 7k salaries | ₹15.7 L/yr - ₹52 L/yr