I applied via Referral and was interviewed in Sep 2024. There were 2 interview rounds.
COALESCE is a function that returns the first non-null value in a list of expressions.
It is commonly used in SQL to handle NULL values.
It takes multiple arguments and returns the first non-null one.
Example: COALESCE(column1, column2, 'default') returns the value of column1 if it is not null, otherwise column2, and 'default' if both are null.
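A minimal PySpark sketch of the same idea, assuming a toy DataFrame with columns column1 and column2 (the names and data are illustrative):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("coalesce-demo").getOrCreate()

# Toy data: column2 fills in where column1 is null, 'default' where both are null
df = spark.createDataFrame(
    [("a", None), (None, "b"), (None, None)],
    ["column1", "column2"],
)

df.select(
    F.coalesce(F.col("column1"), F.col("column2"), F.lit("default")).alias("first_non_null")
).show()
```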
Repartition is the process of redistributing data across partitions in a distributed system.
Repartitioning helps in balancing the workload and improving performance in distributed computing environments.
It involves moving data between partitions based on certain criteria such as key values or hash functions.
In Spark, repartitioning can be done with repartition() (full shuffle) or coalesce() (merges existing partitions without a full shuffle); see the PySpark sketch below.
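A brief PySpark sketch, assuming a toy DataFrame built with spark.range() (the sizes and partition counts are illustrative):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("repartition-demo").getOrCreate()

df = spark.range(1_000_000)  # toy DataFrame with a single 'id' column

# Full shuffle: redistribute the data into 8 partitions, hashed on the 'id' column
df_by_key = df.repartition(8, "id")

# Narrow operation: merge the existing partitions down to 2 without a full shuffle
df_small = df.coalesce(2)

print(df_by_key.rdd.getNumPartitions(), df_small.rdd.getNumPartitions())
```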
A non-repeating character is a character that appears exactly once in a given string.
Iterate through the string and count the frequency of each character.
Identify the characters that have a frequency of 1.
Return the first non-repeating character found (see the sketch below).
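A short Python sketch of this approach (the sample string is illustrative):

```python
from collections import Counter

def first_non_repeating_char(s):
    # Return the first character that appears exactly once in s, or None
    counts = Counter(s)          # frequency of each character
    for ch in s:                 # scan in original order
        if counts[ch] == 1:
            return ch
    return None

print(first_non_repeating_char("swiss"))  # -> 'w'
```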
Sort list without using sort method
Create a custom sorting function using loops
Compare each element with every other element to determine the correct order
Swap elements based on the comparison results, as in the bubble-sort sketch below
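A short Python sketch of this approach using bubble sort (the input list is illustrative):

```python
def bubble_sort(items):
    # Sort a list in place without sort()/sorted(), by repeated pairwise swaps
    n = len(items)
    for i in range(n):
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:  # out of order
                items[j], items[j + 1] = items[j + 1], items[j]
    return items

print(bubble_sort([5, 2, 9, 1]))  # -> [1, 2, 5, 9]
```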
I was interviewed in Sep 2024.
They asked various questions on guesstimates and a few SQL questions.
I applied via Instahyre and was interviewed in Dec 2024. There was 1 interview round.
I applied via Campus Placement
A very tough test, with a high chance of giving up at the beginning. The test was 2 hours long and covered OOPs, networks, ML, SQL, and some other topics, with a total of 8 questions.
I applied via Naukri.com and was interviewed in Sep 2023. There were 2 interview rounds.
1 Python question and 1 SQL question
I applied via Naukri.com and was interviewed in Sep 2023. There was 1 interview round.
Reading and writing parquet files in PySpark is done through the SparkSession API.
Create a SparkSession object.
Read a parquet file with the spark.read.parquet() method.
Write a DataFrame to a parquet file with the df.write.parquet() method (see the sketch below).
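A minimal PySpark sketch of those steps (the file paths are illustrative):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("parquet-demo").getOrCreate()

# Read a parquet file into a DataFrame
df = spark.read.parquet("/data/input/events.parquet")

# Write the DataFrame back out as parquet, overwriting any existing output
df.write.mode("overwrite").parquet("/data/output/events_copy.parquet")
```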
posted on 2 Aug 2024
I applied via Referral and was interviewed before Aug 2023. There were 3 interview rounds.
Window functions in SQL perform calculations across a set of table rows that are related to the current row.
They calculate values based on a window of rows related to the current row.
Unlike aggregate functions, they do not collapse rows into a single output row.
Common window functions include ROW_NUMBER(), RANK(), DENSE_RANK(), and NTILE(); see the PySpark sketch below.
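A small PySpark sketch of window functions, assuming a toy sales DataFrame (the column names and data are illustrative): each sale is ranked within its department without collapsing the rows.

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("window-demo").getOrCreate()

df = spark.createDataFrame(
    [("HR", "Asha", 300), ("HR", "Ravi", 500), ("IT", "Meera", 400), ("IT", "John", 400)],
    ["dept", "name", "amount"],
)

# Window: rows in the same department, ordered by amount (highest first)
w = Window.partitionBy("dept").orderBy(F.col("amount").desc())

df.select(
    "dept", "name", "amount",
    F.row_number().over(w).alias("row_number"),
    F.rank().over(w).alias("rank"),
    F.dense_rank().over(w).alias("dense_rank"),
).show()
```

The tied amounts in the IT rows show the difference between the three functions: row_number() always increments, rank() repeats and then skips, and dense_rank() repeats without skipping.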
I applied via Naukri.com and was interviewed in Sep 2024. There was 1 interview round.
I applied via Company Website and was interviewed in Jul 2024. There were 2 interview rounds.
AWS, Scala, and cloud concepts
I applied via Job Portal and was interviewed in Jan 2024. There was 1 interview round.
Role | Salaries reported | Salary range
Analyst | 127 | ₹3 L/yr - ₹4.6 L/yr
Junior Analyst | 92 | ₹2.8 L/yr - ₹4 L/yr
Senior Analyst | 92 | ₹3.8 L/yr - ₹5.8 L/yr
Associate | 89 | ₹3 L/yr - ₹4 L/yr
Principal Analyst | 66 | ₹4.5 L/yr - ₹6.1 L/yr