Eligarf Technologies
I applied via Referral and was interviewed before Oct 2022. There were 4 interview rounds.
The tentative date of joining is not available.
The candidate does not currently have a confirmed tentative date of joining.
The candidate should mention that the date is not confirmed yet.
The candidate can express their willingness to discuss the joining date further during the interview process.
Depends on the profile you are interviewing for.
An assignment related to Python programming and SQL.
I applied via Naukri.com and was interviewed in Apr 2024. There were 4 interview rounds.
Scenario-based questions on Databricks and Azure Data Factory.
Simple questions, easy to crack.
Yet to attend the 2nd round
I applied via Naukri.com and was interviewed in Feb 2023. There were 4 interview rounds.
SQL and Python coding rounds were conducted.
I applied via Naukri.com and was interviewed in Jun 2023. There were 4 interview rounds.
I applied via Company Website and was interviewed before Jan 2024. There were 3 interview rounds.
I have experience working as a Data Engineer in various industries, including healthcare and finance.
Developed data pipelines to ingest, process, and analyze large datasets
Implemented ETL processes to transform data into usable formats
Worked with stakeholders to understand data requirements and deliver solutions
Utilized tools like Apache Spark, Hadoop, and SQL for data processing
Ensured data quality and integrity throu...
Basic Python and SQL
I applied via Referral and was interviewed in Jun 2022. There were 5 interview rounds.
It was through a link they shared; you need to write and submit your code, no execution needed.
I applied via Naukri.com and was interviewed in Jul 2021. There was 1 interview round.
Slowly changing data handling in Spark involves updating data over time.
Slowly changing dimensions (SCD) are used to track changes in data over time.
SCD Type 1 updates the data in place, overwriting the old values.
SCD Type 2 creates a new record for each change, with a start and end date.
SCD Type 3 adds a new column to the existing record to track changes.
Spark provides functions like `from_unixtime` and `unix_timestamp` for handling the timestamps used to track changes (a minimal SCD Type 2 sketch is shown below).
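A minimal sketch of SCD Type 2 handling in PySpark, assuming a hypothetical customer dimension keyed on customer_id with start_date, end_date, and is_current columns; the table, column names, and sample rows are illustrative, not from the interview answer.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

# Current dimension table: one open row per customer (is_current = true).
current = spark.createDataFrame(
    [(1, "Alice", "Pune", "2020-01-01", None, True)],
    "customer_id INT, name STRING, city STRING, start_date STRING, end_date STRING, is_current BOOLEAN",
)

# Incoming snapshot with the latest attribute values.
updates = spark.createDataFrame(
    [(1, "Alice", "Mumbai")],
    "customer_id INT, name STRING, city STRING",
)

today = F.date_format(F.current_date(), "yyyy-MM-dd")  # dates kept as strings for simplicity

# Rows whose tracked attribute (here, city) changed.
changed = (current.alias("c")
           .join(updates.alias("u"),
                 F.col("c.customer_id") == F.col("u.customer_id"))
           .where(F.col("c.city") != F.col("u.city")))

# SCD Type 2: close the old version of each changed row...
closed = (changed.select("c.*")
          .withColumn("end_date", today)
          .withColumn("is_current", F.lit(False)))

# ...and open a new version carrying the updated values.
opened = (changed.select("u.customer_id", "u.name", "u.city")
          .withColumn("start_date", today)
          .withColumn("end_date", F.lit(None).cast("string"))
          .withColumn("is_current", F.lit(True)))

# Rows with no change pass through untouched.
changed_keys = changed.select(F.col("c.customer_id").alias("customer_id"))
unchanged = current.join(changed_keys, "customer_id", "left_anti")

scd2 = unchanged.unionByName(closed).unionByName(opened)
scd2.show()

For SCD Type 1 the same join would simply overwrite the old values in place, and Type 3 would add a previous_city column instead of a new row.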
Explanation of cumulative sum and rank functions in Spark
Cumulative sum function calculates the running total of a column
Rank function assigns a rank to each row based on the order of values in a column
Both functions can be used with window functions in Spark
Example: df.withColumn('cumulative_sum', F.sum('column').over(Window.orderBy('order_column').rowsBetween(Window.unboundedPreceding, Window.currentRow)))
Example: df...
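The one-liner above can be expanded into a runnable sketch showing both a cumulative sum and a rank over window functions; the DataFrame and column names (group_col, amount) are hypothetical.

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window_sketch").getOrCreate()

df = spark.createDataFrame(
    [("A", 10), ("A", 30), ("A", 20), ("B", 50), ("B", 40)],
    ["group_col", "amount"],
)

# Running total per group, ordered by amount.
cum_window = (Window.partitionBy("group_col")
              .orderBy("amount")
              .rowsBetween(Window.unboundedPreceding, Window.currentRow))

# Rank of each row within its group, highest amount first.
rank_window = Window.partitionBy("group_col").orderBy(F.col("amount").desc())

result = (df
          .withColumn("cumulative_sum", F.sum("amount").over(cum_window))
          .withColumn("rank", F.rank().over(rank_window)))

result.show()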
I applied via Naukri.com and was interviewed before Feb 2022. There were 4 interview rounds.
MCQ-based SQL round with easy coding questions.
Role | Salaries reported | Salary range
Data Annotation Engineer | 11 salaries | ₹3 L/yr - ₹4 L/yr
Consultant | 8 salaries | ₹3.5 L/yr - ₹4.4 L/yr
Senior Consultant | 7 salaries | ₹4.7 L/yr - ₹10.9 L/yr
Data Analyst | 5 salaries | ₹3.5 L/yr - ₹5.5 L/yr
Full Stack Developer | 5 salaries | ₹3.5 L/yr - ₹12.6 L/yr
TCS
Infosys
Wipro
HCLTech