Bajaj Finserv
I applied via Campus Placement
Quantitative and reasoning questions
There is one coding question, which you can solve in any language you want.
Python features and implementation
A while loop is used when the number of iterations is not known beforehand, whereas a for loop is used when the number of iterations is known.
In Python, both the for loop and the while loop are entry-controlled: the condition (or the availability of the next item in the iterable) is checked before the loop body is executed.
Python has no built-in exit-controlled loop (a do-while style loop that checks the condition after the body); that behaviour has to be emulated with a while loop.
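A minimal Python sketch of the difference; the loop bounds and the random step are purely illustrative:

import random

# For loop: the number of iterations is known up front.
for i in range(5):
    print("for iteration", i)

# While loop: we keep iterating until a condition becomes false,
# without knowing in advance how many iterations that will take.
total = 0
while total < 10:                     # condition checked before each pass (entry-controlled)
    total += random.randint(1, 4)     # iteration count depends on the random values
    print("running total:", total)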
I was interviewed in May 2022.
Round duration - 90 minutes
Round difficulty - Medium
Round duration - 30 minutes
Round difficulty - Medium
Mostly asked questions related to Python and SQL, as per the requirements of the role. The timing was around 1 pm, the environment was comfortable, and the interviewer was friendly.
The overall experience of the interview was great
Round duration - 15 minutes
Round difficulty - Easy
Asked team-related questions; the interview was held on MS Teams and the HR was friendly. Overall interview experience was great
Tip 1 : practice at least 3 medium to hard DSA questions daily
Tip 2 : do 2 dynamic programming questions daily
Tip 3 : don't forget to study other subjects like operating systems and DBMS, as they are generally asked
Tip 1 : write briefly about yourself
Tip 2 : put in not only academic but also some extracurricular information
I applied via Approached by Company and was interviewed in Feb 2022. There were 5 interview rounds.
Was asked 10 MCQs and 1 coding question of easy difficulty.
I applied via Campus Placement and was interviewed in Mar 2022. There were 4 interview rounds.
It was a combined aptitude and coding round: MCQs on core subjects like OOPs, DSA, and databases, plus 1 coding question.
I applied via Campus Placement and was interviewed before Apr 2023. There was 1 interview round.
Questions on aptitude, SQL, and Python
I applied via Walk-in and was interviewed in Apr 2024. There were 3 interview rounds.
Lazy evaluation in Spark delays the execution of transformations until an action is called.
Lazy evaluation allows Spark to optimize the execution plan by combining multiple transformations into a single stage.
Transformations are not executed immediately, but are stored as a directed acyclic graph (DAG) of operations.
Actions trigger the execution of the DAG and produce results.
Example: map() and filter() are transformations, while actions such as collect() and count() trigger the actual computation.
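A minimal PySpark sketch of lazy evaluation, assuming a local SparkSession; the data and operations are illustrative:

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").appName("lazy-eval-demo").getOrCreate()

rdd = spark.sparkContext.parallelize(range(10))

# Transformations: nothing runs yet, Spark only records them in the DAG.
doubled = rdd.map(lambda x: x * 2)
evens = doubled.filter(lambda x: x % 4 == 0)

# Action: triggers execution of the whole DAG and returns results to the driver.
print(evens.collect())

spark.stop()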
MapReduce is a programming model and processing technique for parallel and distributed computing.
MapReduce is used to process large datasets in parallel across a distributed cluster of computers.
It consists of two main functions - Map function for processing key/value pairs and Reduce function for aggregating the results.
Popularly used in big data processing frameworks like Hadoop for tasks like data sorting, searching, and aggregation.
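A toy word-count sketch of the map/shuffle/reduce idea in plain Python (no Hadoop involved; the input lines are illustrative):

from collections import defaultdict
from functools import reduce

lines = ["spark and hadoop", "hadoop mapreduce", "spark streaming"]

# Map phase: emit (word, 1) pairs for every word in every line.
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle phase: group the emitted values by key.
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce phase: aggregate the values for each key.
word_counts = {word: reduce(lambda a, b: a + b, counts) for word, counts in grouped.items()}
print(word_counts)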
Skewness is a measure of asymmetry in a distribution. Skewed tables are tables with imbalanced data distribution.
Skewness is a statistical measure that describes the asymmetry of the data distribution around the mean.
Positive skewness indicates a longer tail on the right side of the distribution, while negative skewness indicates a longer tail on the left side.
Skewed tables in data engineering refer to tables with imbalanced data distribution, where a few keys or partitions hold a disproportionate share of the rows and can slow down distributed processing.
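A small Python sketch computing skewness with scipy.stats.skew; the sample data is made up for illustration:

from scipy.stats import skew

# Right-skewed sample: a long tail of large values pulls the mean above the median.
right_skewed = [1, 1, 2, 2, 2, 3, 3, 4, 10, 25]
# Roughly symmetric sample for comparison.
symmetric = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

print(skew(right_skewed))   # positive value -> longer tail on the right
print(skew(symmetric))      # close to 0 -> roughly symmetric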
Spark is a distributed computing framework designed for big data processing.
Spark is built around the concept of Resilient Distributed Datasets (RDDs) which allow for fault-tolerant parallel processing of data.
It provides high-level APIs in Java, Scala, Python, and R for ease of use.
Spark can run on top of Hadoop, Mesos, Kubernetes, or in standalone mode.
It includes modules for SQL, streaming, machine learning, and graph processing.
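A minimal PySpark sketch touching the SQL module, assuming a local standalone session; the rows and column names are illustrative:

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").appName("spark-sql-demo").getOrCreate()

# DataFrame built on Spark's distributed, fault-tolerant execution engine.
df = spark.createDataFrame([("Alice", 34), ("Bob", 45)], ["name", "age"])

# The SQL module lets the same data be queried with plain SQL.
df.createOrReplaceTempView("people")
spark.sql("SELECT name FROM people WHERE age > 40").show()

spark.stop()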
I applied via Naukri.com and was interviewed in Mar 2024. There were 3 interview rounds.
Error handling in PySpark involves using try-except blocks and logging to handle exceptions and errors.
Use try-except blocks to catch and handle exceptions in PySpark code
Utilize logging to record errors and exceptions for debugging purposes
Consider using the .option('mode', 'PERMISSIVE') method to handle corrupt records in data processing
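A hedged sketch combining the points above; the input path and logger name are hypothetical:

import logging
from pyspark.sql import SparkSession

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("etl-job")   # hypothetical logger name

spark = SparkSession.builder.appName("error-handling-demo").getOrCreate()

try:
    # PERMISSIVE mode keeps corrupt records instead of failing the whole read.
    df = spark.read.option("mode", "PERMISSIVE").json("/data/events.json")  # hypothetical path
    df.show()
except Exception as exc:
    # Log the failure for debugging instead of letting the job fail silently.
    logger.error("Failed to process input data: %s", exc)
    raise
finally:
    spark.stop()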
Assistant Manager: 1.3k salaries | ₹1.8 L/yr - ₹6.7 L/yr
Sales Officer: 1.3k salaries | ₹1 L/yr - ₹5 L/yr
Sales Executive: 1.2k salaries | ₹0.9 L/yr - ₹5.1 L/yr
Sales Manager: 1k salaries | ₹1.5 L/yr - ₹10 L/yr
Manager: 864 salaries | ₹3 L/yr - ₹11 L/yr
HDFC Bank
ICICI Bank
Axis Bank
State Bank of India