TCS
I applied via Naukri.com and was interviewed in Apr 2024. There was 1 interview round.
Various performance optimization techniques are available in Databricks, such as caching, data partitioning, and broadcast joins for small lookup tables.
RDD is a basic abstraction in Spark representing data as a distributed collection of objects, while DataFrame is a distributed collection of data organized into named columns.
RDD is more low-level and less optimized compared to DataFrame
DataFrames are easier to use for data manipulation and analysis
DataFrames provide a more structured way to work with data compared to RDDs
RDDs are suitable for unstructured data processing and fine-grained, low-level transformations.
I applied via Naukri.com and was interviewed in Oct 2023. There was 1 interview round.
I was interviewed in Mar 2024.
Topics covered: reading files in a notebook, configuring data, ADF triggers, the Parquet format, window functions vs GROUP BY, reading a CSV file and storing it as Parquet, Dataset vs DataFrame, transformations, and Delta Lake.
To read files in a notebook, use libraries like pandas or PySpark.
Configuration includes specifying the file path, format, and any additional options.
An ADF trigger can be used for automated data processing, but may not be necessary for every scenario.
I applied via Naukri.com and was interviewed in May 2024. There were 2 interview rounds.
The project architecture includes Spark transformations for processing large volumes of data.
Spark transformations are used to manipulate data in distributed computing environments.
Examples of Spark transformations include map, filter, reduceByKey, join, etc.
Use window functions like ROW_NUMBER() to find highest sales from each city in SQL.
Use PARTITION BY clause in ROW_NUMBER() to partition data by city
Order the data by sales in descending order
Filter the results to only include rows with row number 1
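The steps above can be run against SQLite (version 3.25+ supports window functions), which ships with Python's standard library. The table and column names are invented for the example.

```python
# Highest sale per city via ROW_NUMBER(), demonstrated on SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (city TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('Delhi', 100), ('Delhi', 250), ('Mumbai', 90), ('Mumbai', 400);
""")

# ROW_NUMBER() partitioned by city, ordered by amount descending;
# keeping row number 1 yields the highest sale in each city.
rows = conn.execute("""
    SELECT city, amount FROM (
        SELECT city, amount,
               ROW_NUMBER() OVER (PARTITION BY city ORDER BY amount DESC) AS rn
        FROM sales
    ) WHERE rn = 1
    ORDER BY city
""").fetchall()

print(rows)  # [('Delhi', 250), ('Mumbai', 400)]
conn.close()
```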
External storage (such as ADLS Gen2 or Blob Storage) is mounted in Databricks with the dbutils.fs.mount utility from a notebook.
dbutils.fs.mount takes the storage source URI, a mount point (conventionally under /mnt), and extra configs carrying the credentials, such as an access key or service-principal OAuth settings.
Once mounted, the storage is visible to all clusters in the workspace through the mount point.
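A notebook-only sketch of the mount call follows. This does not run outside a Databricks cluster, because dbutils is injected by the Databricks runtime; the container, storage account, secret scope, and mount point names are all placeholders.

```python
# Notebook-only sketch: dbutils is provided by the Databricks runtime.
# All names below (account, container, scope, key) are placeholders.
configs = {
    "fs.azure.account.key.mystorageacct.blob.core.windows.net":
        dbutils.secrets.get(scope="my-scope", key="storage-key")
}

dbutils.fs.mount(
    source="wasbs://mycontainer@mystorageacct.blob.core.windows.net/",
    mount_point="/mnt/mydata",
    extra_configs=configs,
)

# Files are then visible under the mount point:
display(dbutils.fs.ls("/mnt/mydata"))
```

Fetching the key through a secret scope, rather than pasting it into the notebook, keeps credentials out of the notebook source.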
I applied via Referral and was interviewed in May 2024. There was 1 interview round.
I was interviewed in May 2024.
Copy activity is a tool in Azure Data Factory used to move data between data stores.
It copies data between supported data stores without requiring custom code.
It supports various data sources and destinations such as Azure Blob Storage, Azure SQL Database, and more.
You can define data movement tasks using pipelines in Azure Data Factory and monitor the progress of copy activities.
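ADF pipelines are defined in JSON; a Copy activity moving delimited text from Blob Storage into Azure SQL might look roughly like the fragment below. The pipeline, activity, and dataset names are placeholders, and the referenced datasets would be defined separately.

```json
{
  "name": "CopyBlobToSqlPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlobToSql",
        "type": "Copy",
        "inputs": [ { "referenceName": "BlobCsvDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SqlTableDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

The source and sink types change with the connectors involved; monitoring of each copy run is then available in the ADF Monitor view.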
I applied via Naukri.com and was interviewed in Oct 2020. There were 4 interview rounds.
It was a SQL-related question that you had to solve hands-on.
Interview experience: based on 62 reviews
| Designation | Salaries reported | Salary range |
| --- | --- | --- |
| System Engineer | 1.1L | ₹1 L/yr - ₹9 L/yr |
| IT Analyst | 67.9k | ₹5.1 L/yr - ₹16 L/yr |
| AST Consultant | 51k | ₹8 L/yr - ₹25 L/yr |
| Assistant System Engineer | 31.3k | ₹2.2 L/yr - ₹5.6 L/yr |
| Associate Consultant | 28.6k | ₹8.9 L/yr - ₹32 L/yr |
Amazon
Wipro
Infosys
Accenture