
TCS GCP Data Engineer Interview Questions and Answers for Freshers

Updated 17 Jul 2024

TCS GCP Data Engineer Interview Experiences for Freshers

1 interview found

Interview experience
4
Good
Difficulty level
Easy
Process Duration
2-4 weeks
Result
Selected

I applied via Naukri.com and was interviewed in Jun 2024. There was 1 interview round.

Round 1 - Technical 

(3 Questions)

  • Q1. Check whether a string is a palindrome or not
  • Ans. 

    Check if a string is a palindrome or not (a combined code sketch for all three questions in this round follows this list)

    • Compare the string with its reverse to check for palindrome

    • Ignore spaces and punctuation marks when comparing

    • Examples: 'racecar' is a palindrome, 'hello' is not

  • Answered by AI
  • Q2. Create a GCS bucket using Python
  • Ans. 

    Use Python to create a GCS bucket

    • Import the necessary libraries like google.cloud.storage

    • Authenticate using service account credentials

    • Use the library functions to create a new bucket

  • Answered by AI
  • Q3. Write Python code to trigger a Dataflow job from a Cloud Function
  • Ans. 

    Python code to trigger a Dataflow job from a Cloud Function

    • Use the googleapiclient library to interact with the Dataflow API

    • Authenticate using service account credentials

    • Submit a job to Dataflow using the projects.locations.templates.launch endpoint

  • Answered by AI
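For reference, here is a minimal, hedged Python sketch of the three answers above. It assumes the google-cloud-storage and google-api-python-client packages; the project id, region, bucket name and Dataflow template path are hypothetical placeholders, not values from the interview, and the trigger is assumed to be a 1st-gen background Cloud Function.

```python
import re

from google.cloud import storage
from googleapiclient.discovery import build


def is_palindrome(s: str) -> bool:
    """Q1: compare the cleaned string with its reverse, ignoring spaces and punctuation."""
    cleaned = re.sub(r"[^a-z0-9]", "", s.lower())
    return cleaned == cleaned[::-1]


def create_gcs_bucket(bucket_name: str, project: str, location: str = "US"):
    """Q2: create a GCS bucket with the google-cloud-storage client library."""
    client = storage.Client(project=project)  # uses application-default credentials
    return client.create_bucket(bucket_name, location=location)


def trigger_dataflow_job(event, context):
    """Q3: a Cloud Function that launches a Dataflow job from a template."""
    project = "my-gcp-project"      # hypothetical project id
    region = "us-central1"          # hypothetical region
    template = "gs://dataflow-templates/latest/GCS_Text_to_BigQuery"  # example Google-provided template

    dataflow = build("dataflow", "v1b3")  # uses the runtime's default service account
    request = dataflow.projects().locations().templates().launch(
        projectId=project,
        location=region,
        gcsPath=template,
        body={
            "jobName": "job-triggered-from-cloud-function",
            "parameters": {},       # the template's required parameters would go here
        },
    )
    return request.execute()


print(is_palindrome("racecar"), is_palindrome("hello"))  # True False
```

The palindrome check follows the answer's approach of stripping spaces and punctuation and comparing the string with its reverse.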

Skills evaluated in this interview

Interview questions from similar companies

Interview experience
5
Excellent
Difficulty level
-
Process Duration
-
Result
-
Round 1 - Technical 

(5 Questions)

  • Q1. What is a window function in BigQuery?
  • Ans. 

    Window functions in BigQuery are used to perform calculations across a set of table rows related to the current row (a code sketch for this round follows the question list).

    • Window functions allow you to perform calculations on a set of rows related to the current row

    • They are used with the OVER() clause in SQL queries

    • Common window functions include ROW_NUMBER(), RANK(), and NTILE()

    • They can be used to calculate moving averages, cumulative sums, and more

  • Answered by AI
  • Q2. What types of NoSQL databases are available in GCP?
  • Ans. 

    Types of NoSQL databases in GCP include Firestore, Bigtable, and Datastore.

    • Firestore is a flexible, scalable database for mobile, web, and server development.

    • Bigtable is a high-performance NoSQL database service for large analytical and operational workloads.

    • Datastore is a highly scalable NoSQL database for web and mobile applications.

  • Answered by AI
  • Q3. Write code to find the product with the maximum count for each customer
  • Ans. 

    Code to find the product with the maximum count for each customer

    • Iterate through each customer's purchases

    • Keep track of the count of each product for each customer

    • Find the product with the maximum count for each customer

  • Answered by AI
  • Q4. Read a dataframe in Python and PySpark
  • Q5. Create a dataframe
  • Ans. 

    Creating a dataframe with pandas

    • Use the pandas library to create a dataframe

    • Provide data in the form of a dictionary or list of lists

    • Specify column names if needed

  • Answered by AI
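A hedged sketch tying together the coding parts of this round: a BigQuery window-function query (Q1), finding each customer's most-purchased product (Q3), and creating/reading dataframes in pandas and PySpark (Q4/Q5). All table names, file paths and data here are hypothetical.

```python
import pandas as pd
from google.cloud import bigquery

# Q1: a window function in BigQuery -- ROW_NUMBER() over a per-customer partition.
window_sql = """
SELECT
  customer_id,
  order_id,
  order_total,
  ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_total DESC) AS rn
FROM `project.dataset.orders`
"""
client = bigquery.Client()                 # assumes application-default credentials
rows = client.query(window_sql).result()   # iterate the rows as needed

# Q3 / Q5: create a dataframe and find the product each customer bought most often.
purchases = pd.DataFrame(
    {
        "customer": ["a", "a", "a", "b", "b"],
        "product": ["x", "x", "y", "y", "y"],
    }
)
top_product = (
    purchases.groupby(["customer", "product"]).size()  # count per (customer, product)
    .reset_index(name="cnt")
    .sort_values("cnt", ascending=False)
    .drop_duplicates("customer")                       # keep the top product per customer
)
print(top_product)

# Q4: reading a dataframe in pandas vs. PySpark (paths are placeholders).
pdf = pd.read_csv("data.csv")
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.getOrCreate()
# sdf = spark.read.option("header", True).csv("data.csv")
```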

Skills evaluated in this interview

Interview experience
4
Good
Difficulty level
-
Process Duration
-
Result
-
Round 1 - One-on-one 

(6 Questions)

  • Q1. SQL: Find keys present in table A but not in B (B is an old copy of A)
  • Ans. 

    Use SQL to find keys present in table A but not in table B (an old copy of A); sketches for this round's SQL and pipeline questions follow the question list.

    • Use a LEFT JOIN to combine tables A and B based on the key column

    • Filter the results where the key column in table B is NULL

    • This will give you the keys present in table A but not in table B

  • Answered by AI
  • Q2. SQL: 4th highest salary
  • Q3. Case Study: Using GCP's tools, build a pipeline to transfer files from one GCS bucket to another
  • Ans. 

    Use GCP Dataflow to transfer files between GCS buckets

    • Create a Dataflow pipeline using Apache Beam to read from source bucket and write to destination bucket

    • Use GCS connector to read and write files in Dataflow pipeline

    • Set up appropriate permissions for Dataflow service account to access both buckets

  • Answered by AI
  • Q4. Case Study: How would you explain the flow of the project and ownership of work to a new joiner in IT, considering my 3 years of experience?
  • Q5. Explain your project, and the reasons why you chose Airflow over other orchestration tools.
  • Q6. Discuss other orchestration tools in GCP
  • Ans. 

    Cloud Composer is another orchestration tool in GCP

    • Cloud Composer is a fully managed workflow orchestration service built on Apache Airflow

    • It allows you to author, schedule, and monitor workflows that span across GCP services

    • Cloud Composer provides a rich set of features like DAGs, plugins, and monitoring capabilities

    • It integrates seamlessly with other GCP services like BigQuery, Dataflow, and Dataproc

  • Answered by AI
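Hedged sketches for the coding parts of this round, with hypothetical project, dataset, table and bucket names throughout. First, the two SQL questions expressed as BigQuery queries run from Python:

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

# Q1: keys present in table A but not in B (B is an old copy of A) -- an anti-join.
anti_join_sql = """
SELECT a.key
FROM `project.dataset.A` AS a
LEFT JOIN `project.dataset.B` AS b
  ON a.key = b.key
WHERE b.key IS NULL
"""

# Q2: 4th highest salary, robust to ties via DENSE_RANK().
fourth_highest_sql = """
SELECT DISTINCT salary
FROM (
  SELECT salary, DENSE_RANK() OVER (ORDER BY salary DESC) AS rnk
  FROM `project.dataset.employees`
)
WHERE rnk = 4
"""

for sql in (anti_join_sql, fourth_highest_sql):
    for row in client.query(sql).result():
        print(dict(row))
```

Q3's bucket-to-bucket pipeline could look roughly like this minimal Apache Beam job run on Dataflow (for plain file copies the GCS client or a prebuilt template would also work; this is only a sketch):

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

SOURCE = "gs://source-bucket-example/input/*.csv"      # hypothetical source bucket
DEST = "gs://dest-bucket-example/output/copied"        # hypothetical destination prefix

options = PipelineOptions(
    runner="DataflowRunner",                 # use "DirectRunner" for local testing
    project="my-gcp-project",                # hypothetical project id
    region="us-central1",
    temp_location="gs://dest-bucket-example/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read from source bucket" >> beam.io.ReadFromText(SOURCE)
        | "Write to destination bucket" >> beam.io.WriteToText(DEST, file_name_suffix=".csv")
    )
```

And for Q6, a minimal Airflow DAG of the kind Cloud Composer schedules (operators and IDs are illustrative only):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_composer_dag",           # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extract")
    load = BashOperator(task_id="load", bash_command="echo load")
    extract >> load                          # run extract, then load
```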

Skills evaluated in this interview

Interview experience
2
Poor
Difficulty level
-
Process Duration
-
Result
-
Round 1 - One-on-one 

(5 Questions)

  • Q1. Technical questions
  • Q2. Tell me about yourself
  • Q3. Questions on SQL queries
  • Q4. Python questions
  • Q5. dbt/Snowflake questions

Interview Preparation Tips

Interview preparation tips for other job seekers - Despite completing all three rounds and receiving a verbal offer, they have not issued the offer letter, citing project constraints.

Interview experience
5
Excellent
Difficulty level
Hard
Process Duration
2-4 weeks
Result
-

I applied via Naukri.com and was interviewed in Sep 2024. There was 1 interview round.

Round 1 - One-on-one 

(2 Questions)

  • Q1. What is SCD type 2?
  • Ans. 

    SCD type 2 is a method used in data warehousing to track historical changes by creating a new record for each change.

    • SCD type 2 stands for Slowly Changing Dimension type 2

    • It involves creating a new record in the dimension table whenever there is a change in the data

    • The old record is marked as inactive and the new record is marked as current

    • It allows for historical tracking of changes in data over time

    • Example: If a cust...

  • Answered by AI
  • Q2. PySpark question: read CSV files from a folder, add a column to each file, and write them to a different location (a sketch follows below).
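A hedged PySpark sketch for Q2; the input and output paths are hypothetical placeholders:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("add-column-to-csvs").getOrCreate()

src = "gs://source-bucket-example/incoming/*.csv"   # hypothetical input folder
dst = "gs://dest-bucket-example/enriched/"          # hypothetical output location

df = spark.read.option("header", True).csv(src)

# Add a column to every record: a load timestamp plus the originating file name.
df = (
    df.withColumn("load_ts", F.current_timestamp())
      .withColumn("source_file", F.input_file_name())
)

df.write.mode("overwrite").option("header", True).csv(dst)
```

For the SCD type 2 idea in Q1, one common pattern (table and column names are hypothetical) is a MERGE that expires the current row when a tracked attribute changes and inserts brand-new keys; a follow-up INSERT then adds the new current version of the changed rows:

```python
scd2_merge_sql = """
MERGE `project.dataset.dim_customer` AS tgt
USING `project.dataset.stg_customer` AS src
ON tgt.customer_id = src.customer_id AND tgt.is_current = TRUE
WHEN MATCHED AND tgt.address != src.address THEN
  UPDATE SET is_current = FALSE, end_date = CURRENT_DATE()
WHEN NOT MATCHED THEN
  INSERT (customer_id, address, start_date, end_date, is_current)
  VALUES (src.customer_id, src.address, CURRENT_DATE(), NULL, TRUE)
"""
```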

Skills evaluated in this interview

Interview experience
5
Excellent
Difficulty level
Easy
Process Duration
-
Result
-

I applied via Campus Placement

Round 1 - Technical 

(1 Question)

  • Q1. ML and deep learning questions
Round 2 - Interview 

(2 Questions)

  • Q1. Projects discussion
  • Q2. ChatGPT architecture

Interview experience
5
Excellent
Difficulty level
Moderate
Process Duration
6-8 weeks
Result
Not Selected

I applied via a Recruitment Consultant and was interviewed in Feb 2024. There was 1 interview round.

Round 1 - Technical 

(1 Question)

  • Q1. What is L1 and L2 Regularization?
  • Ans. 

    L1 and L2 regularization are techniques used in machine learning to prevent overfitting by adding penalty terms to the cost function (a short scikit-learn sketch follows this answer).

    • L1 regularization adds the absolute values of the coefficients as penalty term to the cost function.

    • L2 regularization adds the squared values of the coefficients as penalty term to the cost function.

    • L1 regularization can lead to sparse models by forcing some coefficients to be exactly zero.

  • Answered by AI
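A small scikit-learn sketch on synthetic data contrasting the two penalties; it illustrates the answer above and was not part of the interview:

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
# Only the first two features matter; the remaining eight are noise.
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=200)

lasso = Lasso(alpha=0.1).fit(X, y)   # L1 penalty: sum of |w| added to the cost
ridge = Ridge(alpha=0.1).fit(X, y)   # L2 penalty: sum of w^2 added to the cost

print("Lasso:", np.round(lasso.coef_, 3))  # noise coefficients driven to exactly 0
print("Ridge:", np.round(ridge.coef_, 3))  # coefficients shrunk, but rarely exactly 0
```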

Skills evaluated in this interview

Interview experience
5
Excellent
Difficulty level
Moderate
Process Duration
Less than 2 weeks
Result
Selected

I applied via Naukri.com and was interviewed in Dec 2022. There were 6 interview rounds.

Round 1 - Coding Test 

A = "bala"
b = "Babu"
print(A + b)
# Output: balaBabu

Round 2 - Aptitude Test 

Python is a computer programming language used to build software and websites; it was created by Guido van Rossum and first appeared on 20 February 1991.
1. NumPy is a Python library for working with arrays.
2. pandas is used to read data sets.
3. Matplotlib is a library for visualisation.
4. Machine learning is divided into two parts: supervised learning and unsupervised learning.
5. Deep learning is used to create artificial neurons.
6. Advanced Excel is used for graphs and for calculating across whole ranges at once, among many other useful features.
7. MySQL is mainly used to store data.
(A small code illustration of these libraries follows this list.)
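As a small illustration of the libraries mentioned above (all data here is made up):

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

arr = np.array([1, 2, 3, 4])                                        # NumPy: work with arrays
df = pd.DataFrame({"name": ["A", "B", "C"], "age": [21, 23, 25]})   # pandas: tabular data
# df = pd.read_csv("data.csv")                                      # pandas also reads data sets

plt.plot(arr, arr ** 2)                                             # Matplotlib: visualisation
plt.xlabel("x")
plt.ylabel("x squared")
plt.show()
```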

Round 3 - Assignment 

Python, NumPy, pandas, machine learning, deep learning, basic statistics, advanced Excel, MySQL

Round 4 - Case Study 

Python, NumPy, pandas, machine learning, deep learning, basic statistics, advanced Excel, MySQL

Round 5 - One-on-one 

(3 Questions)

  • Q1. Why do we use machine learning? Machine learning is used to analyse the data so that we can make predictions; with some additional algorithms it is mainly used for prediction and AI.
  • Ans. 

    Machine learning is used for data analysis and prediction, with additional algorithms for AI.

    • Machine learning focuses on predicting outcomes based on data analysis.

    • It involves using algorithms to learn patterns and make predictions based on new data.

    • Examples include image recognition, natural language processing, and recommendation systems.

  • Answered by AI
  • Q2. Uses of deep learning: deep learning is mainly used for creating artificial neurons in a neural network; it helps the machine learn how to work, and it also supports artificial intelligence.
  • Q3. Statistics: Statistics is used for mathematical operations and it helps with prediction. Example: we wanted to calculate how many people are between 20 and 25 years of age; in that case statistics is very helpful to reduc...
Round 6 - Group Discussion 

I don't have any idea about group discussion.

Interview Preparation Tips

Interview preparation tips for other job seekers - I am a fresher in data science, so I need to start with an internship in the IT sector; for me the work comes first and the salary comes next.

Skills evaluated in this interview

Interview experience
5
Excellent
Difficulty level
-
Process Duration
-
Result
-
Round 1 - Resume Shortlist 
Round 2 - Coding Test 

Basic DP, Array Questions

Round 3 - One-on-one 

(1 Question)

  • Q1. Resume Walkthrough and Discussion, Medium level coding questions
Round 4 - One-on-one 

(1 Question)

  • Q1. Discussion with Manager
Round 5 - HR 

(1 Question)

  • Q1. Normal HR round

Interview experience
4
Good
Difficulty level
Moderate
Process Duration
Less than 2 weeks
Result
No response

I applied via Job Portal and was interviewed in Apr 2023. There were 2 interview rounds.

Round 1 - Resume Shortlist 
Round 2 - Personality Assessment 

(4 Questions)

  • Q1. Tell me about yourself
  • Ans. 

    I am a detail-oriented individual with strong data entry skills and experience in various industries.

    • I have worked as a data entry operator for 2 years at XYZ company

    • I am proficient in Microsoft Excel and have experience in data analysis

    • I have excellent typing speed and accuracy

    • I am a quick learner and can adapt to new software and systems easily

  • Answered by AI
  • Q2. About your family members
  • Q3. About our company
  • Q4. Freshers or experienced

Interview Preparation Tips

Interview preparation tips for other job seekers - Tell me about yourself.

TCS Interview FAQs

How many rounds are there in the TCS GCP Data Engineer interview for freshers?
The TCS interview process for freshers usually has 1 round. The most common round in the TCS interview process for freshers is the Technical round.
How to prepare for the TCS GCP Data Engineer interview for freshers?
Go through your CV in detail and study all the technologies mentioned in it. Prepare at least two technologies or languages in depth if you are appearing for a technical interview at TCS. The most common topics and skills that interviewers at TCS expect are GCP, Python, Spark, Big Data and Hadoop.
What are the top questions asked in the TCS GCP Data Engineer interview for freshers?

Some of the top questions asked at the TCS GCP Data Engineer interview for freshers are:

  1. Write Python code to trigger a Dataflow job from a Cloud Function
  2. Create a GCS bucket using Python
  3. Check whether a string is a palindrome or not


People are getting interviews through

Job Portal: 100% (based on 1 TCS interview; low confidence, as the data is based on a small number of responses received from candidates)
TCS GCP Data Engineer Salary

Based on 175 salaries: ₹3.6 L/yr - ₹10.6 L/yr (25% less than the average GCP Data Engineer salary in India)

TCS GCP Data Engineer Reviews and Ratings

Based on 18 reviews: 3.7/5 overall

Rating in categories:

  • Skill development: 3.3
  • Work-Life balance: 3.8
  • Salary & Benefits: 2.1
  • Job Security: 4.5
  • Company culture: 3.1
  • Promotions/Appraisal: 2.1
  • Work Satisfaction: 3.0
