Python and Spark online coding
Use a simple single-pass algorithm to find the best days to buy and sell a stock from a list of daily prices.
Iterate through the list of prices, keeping track of the minimum price seen so far and the maximum profit.
Calculate each day's potential profit by subtracting the minimum price seen so far from the current price.
Update the maximum profit (and the corresponding buy and sell days) whenever a higher profit is found.
Return the buy and sell days that yield the maximum profit.
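The steps above can be sketched in Python as follows (the function name and sample prices are illustrative, not from the original answer):

```python
def best_trade(prices):
    """Return (buy_day, sell_day, profit) for the best single trade.

    Single pass: track the index of the minimum price seen so far and
    update the best profit whenever selling today would beat it.
    """
    if len(prices) < 2:
        return None  # need at least two days to buy and then sell

    min_day = 0                     # day with the lowest price so far
    buy, sell = 0, 1
    max_profit = prices[1] - prices[0]
    for day in range(1, len(prices)):
        profit = prices[day] - prices[min_day]  # sell today, bought at the minimum
        if profit > max_profit:
            max_profit = profit
            buy, sell = min_day, day
        if prices[day] < prices[min_day]:
            min_day = day           # new minimum for future sell days
    return buy, sell, max_profit

# Example: buy on day 1 (price 1), sell on day 4 (price 6) for a profit of 5.
print(best_trade([7, 1, 5, 3, 6, 4]))  # (1, 4, 5)
```

Tracking the index of the minimum (not just its value) is what lets the function report the buy day as well as the profit.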
Project experience and expectations.
I applied via Naukri.com and was interviewed in Dec 2024. There were 4 interview rounds.
I applied via LinkedIn and was interviewed in Jul 2024. There were 2 interview rounds.
It was a pair-programming round where we needed to work through a couple of Spark scenarios together with the interviewer. You are given boilerplate code with some functionality to fill in, and you are assessed on writing clean, extensible code and test cases.
Python and SQL questions
I applied via Approached by Company and was interviewed in Oct 2023. There were 2 interview rounds.
I applied via Referral and was interviewed before Jul 2023. There were 3 interview rounds.
I applied via Naukri.com and was interviewed in Apr 2023. There was 1 interview round.
I applied via Referral and was interviewed before May 2023. There were 2 interview rounds.
Spark is a distributed computing framework that provides in-memory processing capabilities for big data analytics.
Spark has a master-slave architecture with a central coordinator called the Driver and distributed workers called Executors.
It uses Resilient Distributed Datasets (RDDs) for fault-tolerant distributed data processing.
Spark supports various data sources, such as HDFS, Cassandra, HBase, and S3, for input and output.
SQL techniques for handling common situations in data analysis:
Use CASE statements for conditional logic
Use COALESCE function to handle NULL values
Use GROUP BY and HAVING clauses for aggregating data
Use subqueries for complex filtering or calculations
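A minimal sketch combining all four techniques, run here against an in-memory SQLite database (the orders table, its columns, and the threshold are made-up examples, not from the original answer):

```python
import sqlite3

# Hypothetical orders table for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'north', 120.0),
        (2, 'north', NULL),
        (3, 'south', 80.0),
        (4, 'south', 200.0);
""")

# CASE for conditional logic, COALESCE for NULLs,
# GROUP BY / HAVING for aggregation, and a subquery for filtering.
query = """
    SELECT region,
           SUM(COALESCE(amount, 0)) AS total,
           CASE WHEN SUM(COALESCE(amount, 0)) >= 200
                THEN 'high' ELSE 'low' END AS bucket
    FROM orders
    WHERE region IN (SELECT DISTINCT region FROM orders)
    GROUP BY region
    HAVING COUNT(*) > 1
"""
for row in conn.execute(query):
    print(row)
```

COALESCE keeps the NULL amount from silently dropping a row's contribution to the sum, which is a common interview follow-up.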
To create a Spark DataFrame, use the createDataFrame() method.
Import the necessary libraries
Create a list of tuples or a dictionary containing the data
Create a schema for the DataFrame
Use the createDataFrame() method to create the DataFrame
| Role | Salaries reported | Salary range |
| Software Engineer | 166 | ₹2.6 L/yr - ₹9.7 L/yr |
| Software Test Engineer | 134 | ₹2.4 L/yr - ₹9.5 L/yr |
| Recruitment Analyst | 132 | ₹2 L/yr - ₹4.5 L/yr |
| Senior Software Engineer | 125 | ₹6.5 L/yr - ₹24.6 L/yr |
| Software Developer | 65 | ₹3.2 L/yr - ₹11.6 L/yr |
Infosys
Wipro
TCS
HCLTech