Spark has a master-slave architecture with a cluster manager and worker nodes.
Spark has a driver program that communicates with a cluster manager to allocate resources and schedule tasks.
The cluster manager can be standalone, Mesos, or YARN.
Worker nodes execute tasks and store data in memory or on disk.
Spark can also utilize external data sources like Hadoop Distributed File System (HDFS) or Amazon S3 (see the sketch after this list).
Spark supports va...
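A minimal PySpark sketch (not part of the original answer) of how the driver, cluster manager, and worker nodes fit together; the master URL, HDFS path, and S3 bucket are placeholder values:

```python
from pyspark.sql import SparkSession

# The SparkSession lives in the driver program; the master URL tells the
# driver which cluster manager to request executors from (YARN here, but it
# could be a standalone master or Mesos).
spark = (
    SparkSession.builder
    .appName("architecture-demo")
    .master("yarn")                        # or "spark://host:7077", "local[*]"
    .config("spark.executor.memory", "4g")
    .getOrCreate()
)

# Worker nodes run the tasks behind these reads; cached data is kept in
# executor memory and spilled to disk when it does not fit.
events = spark.read.parquet("hdfs:///data/events")   # placeholder HDFS path
events.cache()

# External data sources such as Amazon S3 work the same way.
raw = spark.read.csv("s3a://example-bucket/raw/", header=True)  # placeholder bucket

print(events.count(), raw.count())
spark.stop()
```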
I applied via Naukri.com and was interviewed before Dec 2020. There were 3 interview rounds.
I have worked with tech stacks such as Hadoop, Spark, Kafka, AWS, and SQL.
Experience with Hadoop ecosystem including HDFS, MapReduce, Hive, and Pig
Proficient in Spark for data processing and analysis
Worked with Kafka for real-time data streaming
Familiar with AWS services such as EC2, S3, and EMR
Strong SQL skills for data querying and manipulation
I applied via Referral and was interviewed in Sep 2020. There were 3 interview rounds.
I applied via Campus Placement and was interviewed before Jan 2021. There were 4 interview rounds.
I have worked on various technologies including Hadoop, Spark, SQL, Python, and AWS.
Experience with Hadoop and Spark for big data processing
Proficient in SQL for data querying and manipulation
Skilled in Python for data analysis and scripting
Familiarity with AWS services such as S3, EC2, and EMR
Knowledge of data warehousing and ETL processes
I applied via Campus Placement and was interviewed before Jul 2021. There were 3 interview rounds.
This round had aptitude questions plus coding MCQs.
Here we had to write full-fledged code; there were 2 questions and they were easy.
I applied via Walk-in and was interviewed before Feb 2020. There was 1 interview round.
ADLS is Azure Data Lake Storage, a scalable and secure data lake solution in Azure.
ADLS is designed for big data analytics workloads
It supports Hadoop Distributed File System (HDFS) and Blob storage APIs
It provides enterprise-grade security and compliance features
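As a hedged illustration (not from the original answer), reading from ADLS Gen2 in a Spark or Databricks notebook might look like the sketch below; the storage account name, container, key, and path are placeholders:

```python
from pyspark.sql import SparkSession

# In a Databricks notebook `spark` already exists; elsewhere create one.
spark = SparkSession.builder.getOrCreate()

# Account-key auth for ADLS Gen2 (placeholder account name and key; in
# practice the key would come from a secret scope rather than plain text).
spark.conf.set(
    "fs.azure.account.key.mystorageacct.dfs.core.windows.net",
    "<storage-account-key>",
)

# abfss:// is the HDFS-compatible (ABFS) endpoint exposed by ADLS Gen2.
df = spark.read.json("abfss://mycontainer@mystorageacct.dfs.core.windows.net/landing/")
df.show(5)
```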
To pass parameters from ADF to Databricks, define them as base parameters on the Databricks Notebook activity in ADF and read them inside the notebook with dbutils.widgets, as in the sketch below.
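A minimal sketch of the notebook side, assuming a parameter named run_date has been configured as a base parameter on the ADF Databricks Notebook activity (the name is illustrative):

```python
# Runs inside a Databricks notebook, where `dbutils` is predefined.
# "run_date" is an example name; it must match the base parameter set in ADF.
dbutils.widgets.text("run_date", "")          # declare the widget with a default
run_date = dbutils.widgets.get("run_date")    # value supplied by ADF at run time

print(f"Processing data for {run_date}")
```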
I applied via Referral and was interviewed before Jun 2021. There were 2 interview rounds.
Salaries by designation:

GL Accountant | 191 salaries | ₹0 L/yr - ₹0 L/yr
Financial Analyst | 117 salaries | ₹0 L/yr - ₹0 L/yr
Financial Associate | 90 salaries | ₹0 L/yr - ₹0 L/yr
Data Engineer | 81 salaries | ₹0 L/yr - ₹0 L/yr
Technical Lead | 53 salaries | ₹0 L/yr - ₹0 L/yr
Accenture
IBM
TCS
Wipro