I applied via LinkedIn and was interviewed in Apr 2024. There were 2 interview rounds.
Functional programming is a programming paradigm that treats computation as the evaluation of mathematical functions and avoids changing state and mutable data.
Focuses on the evaluation of functions
Avoids changing state and mutable data
Emphasizes immutability and pure functions
Supports higher-order functions and recursion
Examples: Haskell, Scala, Clojure
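To make these points concrete, here is a minimal sketch in Python (chosen because Python appears elsewhere in these interviews; the answer itself names Haskell, Scala, and Clojure). The function and variable names are illustrative, not from the original answer.

from functools import reduce

# Pure function: the result depends only on the input, with no side effects.
def square(x: int) -> int:
    return x * x

# Immutable data: a tuple cannot be modified in place.
numbers = (1, 2, 3, 4, 5)

# Higher-order functions: map, filter, and reduce take functions as arguments.
squares = tuple(map(square, numbers))
evens = tuple(filter(lambda n: n % 2 == 0, numbers))
total = reduce(lambda a, b: a + b, numbers, 0)

# Recursion instead of mutating a loop counter.
def factorial(n: int) -> int:
    return 1 if n <= 1 else n * factorial(n - 1)

print(squares)       # (1, 4, 9, 16, 25)
print(evens)         # (2, 4)
print(total)         # 15
print(factorial(5))  # 120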
The 3 major pillars of OOP are encapsulation, inheritance, and polymorphism.
Encapsulation: Bundling data and methods that operate on the data into a single unit. Example: Class in Java.
Inheritance: Ability of a class to inherit properties and behavior from another class. Example: Subclass extending a superclass.
Polymorphism: Ability to present the same interface for different data types. Example: Method overloading in Java.
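A small illustrative Python sketch of the three pillars (class names are hypothetical; method overriding stands in for the Java overloading example, since Python has no compile-time overloading).

class Account:                          # Encapsulation: data and methods in one unit
    def __init__(self, owner: str, balance: float = 0.0):
        self.owner = owner
        self._balance = balance         # kept "private" by convention

    def deposit(self, amount: float) -> None:
        self._balance += amount

    def describe(self) -> str:
        return f"{self.owner}: {self._balance}"

class SavingsAccount(Account):          # Inheritance: reuses Account's behaviour
    def describe(self) -> str:          # Polymorphism: same interface, different behaviour
        return f"[savings] {super().describe()}"

accounts = [Account("A", 100.0), SavingsAccount("B", 200.0)]
for acc in accounts:
    print(acc.describe())               # one interface, two implementations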
Apache Spark is a fast and general-purpose cluster computing system for big data processing.
Apache Spark is designed for speed and ease of use in processing large datasets.
It provides APIs in Java, Scala, Python, and R for building parallel applications.
Spark can run on Hadoop, standalone, or in the cloud, and includes libraries for SQL, streaming, machine learning, and graph processing.
It uses in-memory computing to increase processing speed.
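A short PySpark sketch of these points, assuming a local Spark installation; the DataFrame contents and app name are made up for illustration.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Spark can run locally, standalone, on YARN/Hadoop, or in the cloud.
spark = SparkSession.builder.master("local[*]").appName("demo").getOrCreate()

sales = spark.createDataFrame(
    [("store1", 100), ("store1", 250), ("store2", 80)],
    ["store", "amount"],
)

# Spark SQL API over a distributed DataFrame.
sales.groupBy("store").agg(F.sum("amount").alias("total")).show()

# Caching keeps the dataset in memory across repeated actions.
sales.cache()
print(sales.count())

spark.stop()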
A cross join in SQL is a join operation that produces the Cartesian product of two tables.
It combines each row from the first table with every row from the second table.
It does not require any matching condition like other types of joins.
It can result in a large number of rows if the tables being joined have many rows.
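A brief PySpark illustration of a cross join (table contents are invented; a 3-row table crossed with a 2-row table yields 6 rows).

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").appName("cross_join").getOrCreate()

colors = spark.createDataFrame([("red",), ("green",), ("blue",)], ["color"])
sizes = spark.createDataFrame([("S",), ("M",)], ["size"])

# No join condition: every row pairs with every row -> 3 * 2 = 6 rows.
colors.crossJoin(sizes).show()

# Equivalent Spark SQL form.
colors.createOrReplaceTempView("colors")
sizes.createOrReplaceTempView("sizes")
spark.sql("SELECT * FROM colors CROSS JOIN sizes").show()

spark.stop()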
I applied via Referral
Pyspark, Hive, Yarn, Python
I was interviewed in Dec 2024.
The second round was with a VP; it was easy, but he did not seem okay with my expected CTC (ECTC).
I applied via Naukri.com and was interviewed in Apr 2024. There was 1 interview round.
Develop ETL pipeline using Airflow to process CSV files in S3 and load data into Snowflake.
Set up an S3 sensor in Airflow to detect when a new CSV file is dropped in the specified bucket.
Create a custom Python operator in Airflow to read the CSV file from S3, perform necessary transformations, and load data into Snowflake.
Use SnowflakeHook in Airflow to establish a connection with Snowflake and execute SQL queries to load the data.
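A hedged sketch of such a DAG, assuming a recent Airflow 2.x release with the Amazon and Snowflake provider packages installed. The bucket name, connection IDs, table name, and the two-column CSV layout are all hypothetical, and a production pipeline would more likely stage the file and use COPY INTO rather than row-by-row inserts.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook
from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

BUCKET = "my-ingest-bucket"        # hypothetical bucket name
KEY_PATTERN = "incoming/*.csv"     # hypothetical key pattern

def transform_and_load(**context):
    # Read each CSV from S3, apply a trivial transformation, load into Snowflake.
    s3 = S3Hook(aws_conn_id="aws_default")
    snowflake = SnowflakeHook(snowflake_conn_id="snowflake_default")
    for key in s3.list_keys(bucket_name=BUCKET, prefix="incoming/") or []:
        body = s3.read_key(key=key, bucket_name=BUCKET)
        rows = [line.split(",") for line in body.strip().splitlines()[1:]]  # skip header
        for order_id, amount in rows:  # assumed two-column layout
            snowflake.run(
                "INSERT INTO staging.orders (order_id, amount) VALUES (%s, %s)",
                parameters=(order_id, float(amount)),
            )

with DAG(
    dag_id="s3_to_snowflake_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    # Sensor waits until a matching CSV lands in the bucket.
    wait_for_csv = S3KeySensor(
        task_id="wait_for_csv",
        bucket_name=BUCKET,
        bucket_key=KEY_PATTERN,
        wildcard_match=True,
        aws_conn_id="aws_default",
        poke_interval=60,
    )
    load = PythonOperator(task_id="transform_and_load", python_callable=transform_and_load)

    wait_for_csv >> load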
I applied via LinkedIn and was interviewed in Jan 2024. There were 3 interview rounds.
Spark SQL and Spark scripting
Data modelling for a retail brand like DMart
An ETL pipeline that also handles streaming data
I applied via Referral and was interviewed before Jun 2022. There were 3 interview rounds.
I applied via Referral and was interviewed before Jan 2024. There was 1 interview round.
I applied via a recruitment consultant and was interviewed in Jul 2022. There were 3 interview rounds.