
Tech Mahindra

3.6, based on 33.4k reviews


Tech Mahindra GCP Data Engineer Interview Questions, Process, and Tips

Updated 20 Dec 2024

Top Tech Mahindra GCP Data Engineer Interview Questions and Answers

Tech Mahindra GCP Data Engineer Interview Experiences

2 interviews found

Interview experience: 4 (Good)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: No response

I applied via Naukri.com and was interviewed in Nov 2024. There were 2 interview rounds.

Round 1 - One-on-one (7 Questions)

  • Q1. Explain your project
  • Ans. 

    Developed a data pipeline to ingest, process, and analyze customer feedback data for a retail company.

    • Used Google Cloud Platform services like BigQuery, Dataflow, and Pub/Sub for data processing.

    • Implemented data cleansing and transformation techniques to ensure data quality.

    • Created visualizations and dashboards using tools like Data Studio for stakeholders to easily interpret the data.

  • Answered by AI
  • Q2. GCP BigQuery architecture
  • Q3. GCP object versioning
  • Q4. GCP storage class types
  • Ans. 

    GCP offers different storage classes for varying performance and cost requirements (a configuration sketch follows this question list).

    • Standard Storage: for frequently accessed data

    • Nearline Storage: for data accessed less frequently

    • Coldline Storage: for data accessed very infrequently

    • Archive Storage: for data stored for long-term retention

  • Answered by AI
  • Q5. SQL optimisation techniques
  • Ans. 

    SQL optimization techniques focus on improving query performance by reducing execution time and resource usage (a worked example follows this question list).

    • Use indexes to speed up data retrieval

    • Avoid using SELECT * and instead specify only the columns needed

    • Optimize joins by using appropriate join types and conditions

    • Limit the use of subqueries and instead use JOINs where possible

    • Use EXPLAIN to analyze query execution plans and identify bottlenecks

  • Answered by AI
  • Q6. SQL coding questions on date formats and joins
  • Q7. Partitioning and CLUSTER BY
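
To ground the storage-class answer under Q4, here is a minimal sketch using the google-cloud-storage Python client; the project id, bucket name, and location are placeholders, and working credentials are assumed.

    from google.cloud import storage

    # Placeholder project and bucket name; assumes application-default credentials.
    client = storage.Client(project="my-project")
    bucket = storage.Bucket(client, name="my-example-bucket")

    # Pick a class to match access patterns: STANDARD, NEARLINE, COLDLINE, or ARCHIVE.
    bucket.storage_class = "NEARLINE"
    client.create_bucket(bucket, location="us-central1")
    print(bucket.name, bucket.storage_class)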
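
And the worked example promised under Q5. To stay self-contained it uses Python's stdlib sqlite3 rather than BigQuery (the table and data are invented); EXPLAIN QUERY PLAN shows a full scan turning into an index search, and the same reasoning carries over to BigQuery, where partitioning and clustering play the index's role.

    import sqlite3

    # In-memory database with a made-up orders table.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
    con.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(i, i % 100, i * 1.5) for i in range(1000)],
    )

    # Without an index, the filter is a full table scan.
    print(con.execute(
        "EXPLAIN QUERY PLAN SELECT amount FROM orders WHERE customer_id = 7"
    ).fetchall())

    # An index on the filtered column lets the engine seek instead of scan.
    con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
    print(con.execute(
        "EXPLAIN QUERY PLAN SELECT amount FROM orders WHERE customer_id = 7"
    ).fetchall())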
Round 2 - One-on-one (2 Questions)

  • Q1. Advanced SQL: recursive CTEs, and Python code using a lambda function to explode a list
  • Q2. Project explanation, daily activities, and challenges faced in the project

Interview Preparation Tips

Topics to prepare for Tech Mahindra GCP Data Engineer interview:
  • SQL
  • PySpark
  • GCP
  • Python
Interview preparation tips for other job seekers - Practice SQL and Python coding extensively; interviewers evaluate your problem-solving approach and logic, as well as your depth of knowledge.

GCP Data Engineer Interview Questions & Answers

Manisa Sarangi

posted on 8 Jun 2024

Interview experience: 4 (Good)
Difficulty level: -
Process Duration: -
Result: -
Round 1 - Technical (4 Questions)

  • Q1. Which of these two is faster: 'select * from table' or 'select * from table limit 100'?
  • Ans. 

    select * from table limit 100 is generally faster

    • 'select * from table' retrieves every row, which can be slow if the table is large

    • 'select * from table limit 100' caps the rows returned, so less data is transferred to the client

    • In BigQuery specifically, LIMIT reduces the rows returned but not the bytes scanned, so it speeds up retrieval without lowering query cost (see the dry-run sketch after this question list)

  • Answered by AI
  • Q2. Explain SCD and MERGE in BigQuery
  • Ans. 

    SCD stands for Slowly Changing Dimension and Merge is a SQL operation used to update or insert data in BigQuery.

    • SCD is used to track changes to data over time in a data warehouse

    • Merge in BigQuery is used to perform insert, update, or delete operations in a single statement

    • Example: MERGE `target_table` T USING `source_table` S ON T.id = S.id WHEN MATCHED THEN UPDATE SET T.col1 = S.col1 WHEN NOT MATCHED THEN INSERT (id, col1) VALUES (S.id, S.col1) (a fuller sketch follows this question list)

  • Answered by AI
  • Q3. Architecture of BigQuery
  • Ans. 

    BigQuery is a fully managed, serverless data warehouse that enables scalable analysis over petabytes of data.

    • BigQuery uses a columnar storage format for efficient querying.

    • It supports standard SQL for querying data.

    • BigQuery allows for real-time data streaming for analysis.

    • It integrates with various data sources like Google Cloud Storage, Google Sheets, etc.

    • BigQuery provides automatic scaling and high availability.

  • Answered by AI
  • Q4. Dataflow function to split a sentence
  • Ans. 

    In Dataflow (Apache Beam), split a sentence into words with a FlatMap or ParDo transform (a Beam sketch follows this question list)

    • Apply a FlatMap (or a ParDo) that splits each sentence and emits one element per word

    • Use regular expressions to handle punctuation and special characters

  • Answered by AI
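
The dry-run sketch promised under Q1: BigQuery's dry-run mode reports the bytes a query would scan without executing it, which makes the cost caveat visible. The project, dataset, and table names are placeholders, and application-default credentials are assumed.

    from google.cloud import bigquery

    client = bigquery.Client()
    cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

    full = client.query("SELECT * FROM `my-project.shop.orders`", job_config=cfg)
    limited = client.query("SELECT * FROM `my-project.shop.orders` LIMIT 100", job_config=cfg)

    # LIMIT trims the rows returned, not the scan: both dry runs report
    # the same total_bytes_processed.
    print(full.total_bytes_processed, limited.total_bytes_processed)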
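
The fuller MERGE sketch promised under Q2, run through the BigQuery Python client. The dataset, table, and column names are hypothetical, and for brevity this is an SCD Type 1 overwrite (update in place) rather than a history-keeping Type 2.

    from google.cloud import bigquery

    client = bigquery.Client()

    merge_sql = """
    MERGE `my-project.dw.dim_customer` T
    USING `my-project.staging.customers` S
    ON T.customer_id = S.customer_id
    WHEN MATCHED THEN
      UPDATE SET T.name = S.name, T.city = S.city
    WHEN NOT MATCHED THEN
      INSERT (customer_id, name, city) VALUES (S.customer_id, S.name, S.city)
    """
    client.query(merge_sql).result()  # blocks until the MERGE finishes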
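
And the Beam sketch promised under Q4, using the Apache Beam Python SDK (the SDK that Dataflow runs). It executes locally with the DirectRunner, and the sample sentences are invented.

    import re
    import apache_beam as beam

    with beam.Pipeline() as p:  # DirectRunner by default; DataflowRunner on GCP
        (
            p
            | "Create" >> beam.Create(["Hello, world!", "GCP data engineering."])
            | "SplitWords" >> beam.FlatMap(lambda line: re.findall(r"[A-Za-z']+", line))
            | "PrintWords" >> beam.Map(print)
        )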

Skills evaluated in this interview

GCP Data Engineer Interview Questions Asked at Other Companies

asked in Cognizant: Q1. GCP Services, What is use of Bigquery? What is Pubsub, Dataflow, cl...
asked in Accenture: Q2. what is Iam what is sa what is bigquery various optimisations joi...
asked in 66degrees: Q3. How to migrate the datawarehouse with gcp services using real tim...
asked in Capgemini: Q4. Explain Google cloud bigquery architecture?
asked in Cognizant: Q5. What is GCP Bigquery, Architecture of BQ, Cloud composer, What Is...

Interview questions from similar companies

Interview experience: 1 (Bad)
Difficulty level: -
Process Duration: -
Result: -
Round 1 - Technical (6 Questions)

  • Q1. What are the GCP services used in your project
  • Ans. 

    The GCP services used in our project include BigQuery, Dataflow, Pub/Sub, and Cloud Storage.

    • BigQuery for data warehousing and analytics

    • Dataflow for real-time data processing

    • Pub/Sub for messaging and event ingestion

    • Cloud Storage for storing data and files

  • Answered by AI
  • Q2. What is a Cloud Function?
  • Ans. 

    Cloud Functions are event-driven functions that run in response to cloud events (a minimal sketch follows this question list).

    • Serverless functions that automatically scale based on demand

    • Can be triggered by events from various cloud services

    • Supports multiple programming languages like Node.js, Python, etc.

  • Answered by AI
  • Q3. How to schedule a job to trigger every hour in Airflow
  • Ans. 

    To schedule a job to trigger every hour in Airflow, use a cron expression as the schedule interval (a DAG sketch follows this question list)

    • Define a DAG (Directed Acyclic Graph) in Airflow

    • Set the schedule_interval parameter to '0 * * * *' to trigger the job every hour

    • Example: schedule_interval='0 * * * *'

  • Answered by AI
  • Q4. BigQuery architecture
  • Q5. How to display a string in reverse using Python
  • Ans. 

    Use Python's slicing feature to display a string in reverse order.

    • Use string slicing with a step of -1 to reverse the string.

    • Example: 'hello'[::-1] will output 'olleh'.

  • Answered by AI
  • Q6. What is Pub/Sub, and where is it used in your project?
  • Ans. 

    Pub/Sub is a messaging service that allows communication between independent applications.

    • Pub/Sub is used for real-time messaging and event-driven systems.

    • It is commonly used for data ingestion, streaming analytics, and event-driven architectures.

    • Examples of Pub/Sub services include Google Cloud Pub/Sub, Apache Kafka, and Amazon SNS/SQS.

  • Answered by AI
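
A minimal sketch of the Cloud Function idea from Q2, as the code might appear in main.py; the function name, runtime, and deploy command are illustrative, not taken from the interview.

    # main.py for a hypothetical HTTP-triggered Cloud Function.
    # Deploy (illustrative):
    #   gcloud functions deploy hello_http --runtime python311 --trigger-http
    def hello_http(request):
        """Entry point; 'request' is a Flask Request object."""
        name = request.args.get("name", "world")
        return f"Hello, {name}!"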
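
And the DAG sketch promised under Q3, written against the Airflow 2.x API; the DAG id and task are placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # '0 * * * *' fires at minute 0 of every hour.
    with DAG(
        dag_id="hourly_example",
        start_date=datetime(2024, 1, 1),
        schedule_interval="0 * * * *",
        catchup=False,
    ) as dag:
        BashOperator(task_id="say_hello", bash_command="echo hello")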
Interview experience: 4 (Good)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: No response

I applied via LinkedIn and was interviewed in Oct 2024. There were 2 interview rounds.

Round 1 - Technical (2 Questions)

  • Q1. Questions on SQL Joins and Window Functions
  • Q2. GCP BigQuery and Cloud Storage questions
Round 2 - HR (2 Questions)

  • Q1. About overall IT experience
  • Q2. Project experience and services used
  • Ans. 

    I have experience working on projects involving data processing, transformation, and analysis using GCP services like BigQuery, Dataflow, and Dataproc.

    • Utilized BigQuery for storing and querying large datasets

    • Implemented data pipelines using Dataflow for real-time data processing

    • Utilized Dataproc for running Apache Spark and Hadoop clusters for data processing

    • Worked on data ingestion and transformation using Cloud Storage

  • Answered by AI
Interview experience: 5 (Excellent)
Difficulty level: Hard
Process Duration: Less than 2 weeks
Result: -

I applied via Naukri.com and was interviewed in Oct 2024. There was 1 interview round.

Round 1 - Technical (1 Question)

  • Q1. What is Iam what is sa what is bigquery various optimisations joins sql complex query what is qliksense GIThub schema routines schedules delete drop truncate GUI and terraform related spark basics file fo...
  • Ans. 

    IAM is Identity and Access Management, SA is Service Account, BigQuery is a data warehouse, QlikSense is a data visualization tool, GitHub is a version control system, Spark is a distributed computing framework, Airflow is a workflow automation tool, Bigtable is a NoSQL database, Cloud Composer is a managed workflow orchestration service, Pub/Sub is a messaging service.

    • IAM is used to manage access to resources in Googl...

  • Answered by AI

Interview Preparation Tips

Topics to prepare for Accenture GCP Data Engineer interview:
  • SQL
  • Python
  • PySpark

Skills evaluated in this interview

Interview experience: 4 (Good)
Difficulty level: Easy
Process Duration: 2-4 weeks
Result: Selected

I applied via Naukri.com and was interviewed in Jun 2024. There was 1 interview round.

Round 1 - Technical (3 Questions)

  • Q1. Check whether a string is a palindrome
  • Ans. 

    Check if a string is a palindrome or not (a sketch follows this question list)

    • Compare the string with its reverse to check for palindrome

    • Ignore spaces and punctuation marks when comparing

    • Examples: 'racecar' is a palindrome, 'hello' is not

  • Answered by AI
  • Q2. Create a GCS bucket using Python
  • Ans. 

    Use the google-cloud-storage client in Python to create a GCS bucket (a sketch follows this question list)

    • Import the necessary libraries like google.cloud.storage

    • Authenticate using service account credentials

    • Use the library functions to create a new bucket

  • Answered by AI
  • Q3. Write Python code to trigger a Dataflow job from a Cloud Function
  • Ans. 

    Python code to trigger a Dataflow job from a Cloud Function (a sketch follows this question list)

    • Use the googleapiclient library to interact with the Dataflow API

    • Authenticate using service account credentials

    • Submit a job to Dataflow using the projects.locations.templates.launch endpoint

  • Answered by AI
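
The palindrome sketch promised under Q1; it ignores spaces and punctuation, as the answer suggests.

    import re

    def is_palindrome(text: str) -> bool:
        """True if text reads the same forwards and backwards."""
        # Keep only letters and digits, lowercased, so spaces and punctuation are ignored.
        cleaned = re.sub(r"[^a-z0-9]", "", text.lower())
        return cleaned == cleaned[::-1]

    print(is_palindrome("racecar"), is_palindrome("hello"))  # True False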
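
The bucket-creation sketch promised under Q2; the project id, bucket name, and location are placeholders, and the client picks up service-account or application-default credentials.

    from google.cloud import storage

    client = storage.Client(project="my-project")  # placeholder project id
    bucket = client.create_bucket("my-unique-bucket-name", location="us-central1")
    print(f"Created bucket {bucket.name} in {bucket.location}")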
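
And the sketch promised under Q3, following the common pattern of launching a Dataflow template from a Cloud Function through the googleapiclient Dataflow v1b3 API. The project id and bucket paths are placeholders; the template shown is Google's public Word_Count template.

    from googleapiclient.discovery import build

    def trigger_dataflow(event, context):
        """Cloud Function entry point that launches a Dataflow template job."""
        service = build("dataflow", "v1b3", cache_discovery=False)
        request = service.projects().locations().templates().launch(
            projectId="my-project",
            location="us-central1",
            gcsPath="gs://dataflow-templates/latest/Word_Count",
            body={
                "jobName": "wordcount-from-function",
                "parameters": {
                    "inputFile": "gs://my-bucket/input.txt",
                    "output": "gs://my-bucket/output/results",
                },
            },
        )
        response = request.execute()
        print(response["job"]["id"])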

Skills evaluated in this interview

Interview experience: 4 (Good)
Difficulty level: -
Process Duration: -
Result: -
Round 1 - One-on-one (6 Questions)

  • Q1. SQL: Find keys present in table A but not in B (B is an old copy of A)
  • Ans. 

    Use a LEFT JOIN anti-join to find keys present in table A but not in table B (a runnable sketch follows this question list).

    • Use a LEFT JOIN to combine tables A and B based on the key column

    • Filter the results where the key column in table B is NULL

    • This will give you the keys present in table A but not in table B

  • Answered by AI
  • Q2. SQL: 4th highest salary
  • Q3. Case Study: Using GCP's tools, build a pipeline to transfer a file from one GCS bucket to another
  • Ans. 

    Use GCP Dataflow to transfer files between GCS buckets (a pipeline sketch follows this question list)

    • Create a Dataflow pipeline using Apache Beam to read from source bucket and write to destination bucket

    • Use GCS connector to read and write files in Dataflow pipeline

    • Set up appropriate permissions for Dataflow service account to access both buckets

  • Answered by AI
  • Q4. Case Study: How would you explain the flow of a project and ownership of work to a new joiner in IT, given my 3 years of experience?
  • Q5. Explain your project, and the reasons you chose Airflow over other orchestration tools.
  • Q6. Discuss other orchestration tools in GCP
  • Ans. 

    Cloud Composer is another orchestration tool in GCP

    • Cloud Composer is a fully managed workflow orchestration service built on Apache Airflow

    • It allows you to author, schedule, and monitor workflows that span across GCP services

    • Cloud Composer provides a rich set of features like DAGs, plugins, and monitoring capabilities

    • It integrates seamlessly with other GCP services like BigQuery, Dataflow, and Dataproc

  • Answered by AI
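
The runnable sketch promised under Q1, kept self-contained with Python's stdlib sqlite3 (the tables and rows are invented); the same LEFT JOIN ... IS NULL pattern works unchanged in BigQuery.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE a (id INTEGER PRIMARY KEY);
    CREATE TABLE b (id INTEGER PRIMARY KEY);
    INSERT INTO a VALUES (1), (2), (3), (4);
    INSERT INTO b VALUES (1), (2);  -- b is an older copy of a
    """)

    # LEFT JOIN keeps every row of a; rows with no match in b come back NULL.
    new_keys = con.execute("""
        SELECT a.id
        FROM a LEFT JOIN b ON a.id = b.id
        WHERE b.id IS NULL
    """).fetchall()
    print(new_keys)  # [(3,), (4,)]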
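
And the pipeline sketch promised under Q3: a minimal Apache Beam job that copies line-oriented text from one bucket to another. The bucket paths are placeholders; on GCP you would pass DataflowRunner options (--runner, --project, --region, --temp_location).

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions()  # placeholder; defaults to the local DirectRunner

    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://source-bucket/data/*.txt")
            | "Write" >> beam.io.WriteToText("gs://destination-bucket/data/out")
        )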

Skills evaluated in this interview

Interview experience: 4 (Good)
Difficulty level: Moderate
Process Duration: 6-8 weeks
Result: Selected

I applied via Company Website and was interviewed in Sep 2023. There were 3 interview rounds.

Round 1 - Technical (1 Question)

  • Q1. BigQuery architecture, scenario-based advanced SQL questions, Cloud Storage questions, Pub/Sub, Dataflow
Round 2 - Technical (1 Question)

  • Q1. Use cases for BigQuery and SQL
  • Ans. 

    BigQuery is used for analyzing large datasets and running complex queries, while SQL is used for querying databases.

    • BigQuery is used for analyzing large datasets quickly and efficiently

    • SQL is used for querying databases to retrieve specific data

    • BigQuery can handle petabytes of data, making it ideal for big data analysis

    • SQL can be used to perform operations like filtering, sorting, and aggregating data

  • Answered by AI
Round 3 - HR (1 Question)

  • Q1. Salary discussion and location discussion

Skills evaluated in this interview

Interview experience: 5 (Excellent)
Difficulty level: Moderate
Process Duration: 2-4 weeks
Result: Not Selected

I applied via Company Website and was interviewed before Mar 2023. There were 2 interview rounds.

Round 1 - Technical (2 Questions)

  • Q1. What are SQL joins? Explain, including BigQuery-related details
  • Ans. 

    SQL joins are used to combine rows from two or more tables based on a related column between them.

    • SQL joins are used to retrieve data from multiple tables based on a related column between them

    • Types of SQL joins include INNER JOIN, LEFT JOIN, RIGHT JOIN, and FULL JOIN

    • In BigQuery, joins can be performed using standard SQL syntax

    • Example: SELECT * FROM table1 INNER JOIN table2 ON table1.column = table2.column

  • Answered by AI
  • Q2. BigQuery and GCS related questions
Round 2 - One-on-one (1 Question)

  • Q1. Project-related questions

Skills evaluated in this interview

I applied via LinkedIn and was interviewed before Nov 2021. There were 3 interview rounds.

Round 1 - Resume Shortlist
Pro Tip by AmbitionBox: Keep your resume crisp and to the point. A recruiter looks at your resume for an average of 6 seconds; make sure to leave the best impression.
Round 2 - Technical (1 Question)

  • Q1. Asked about the GCP projects we did before
Round 3 - Technical (1 Question)

  • Q1. Managerial questions with salary discussion

Interview Preparation Tips

Interview preparation tips for other job seekers - Be confident and try to elaborate on your projects. It is easy to get into IBM.

Tech Mahindra Interview FAQs

How many rounds are there in Tech Mahindra GCP Data Engineer interview?
Tech Mahindra interview process usually has 1-2 rounds. The most common rounds in the Tech Mahindra interview process are One-on-one Round and Technical.
How to prepare for Tech Mahindra GCP Data Engineer interview?
Go through your CV in detail and study all the technologies mentioned in it. Prepare at least two technologies or languages in depth if you are appearing for a technical interview at Tech Mahindra. The most common topics and skills that interviewers at Tech Mahindra expect are GCP, SQL, Python, ETL and Migration.
What are the top questions asked in Tech Mahindra GCP Data Engineer interview?

Some of the top questions asked at the Tech Mahindra GCP Data Engineer interview -

  1. Which of these 2: select * from table and select * from table limit 100 is fas...
  2. GCP storage class ty...
  3. SQL optimisation techniq...


People are getting interviews through

Job Portal: 100%, based on 1 Tech Mahindra interview (low confidence: the data is based on a small number of responses from candidates)
Tech Mahindra GCP Data Engineer Salary
Based on 30 salaries: ₹7.2 L/yr - ₹22 L/yr, 59% more than the average GCP Data Engineer salary in India

Tech Mahindra GCP Data Engineer Reviews and Ratings

Based on 1 review: 5.0/5

Ratings by category: Skill development 5.0, Work-Life balance 5.0, Salary & Benefits 5.0, Job Security 5.0, Company culture 5.0, Promotions/Appraisal 5.0, Work Satisfaction 5.0
Software Engineer (26.3k salaries): ₹2 L/yr - ₹9.1 L/yr
Senior Software Engineer (21.2k salaries): ₹5.5 L/yr - ₹22.5 L/yr
Technical Lead (11.5k salaries): ₹9.2 L/yr - ₹38 L/yr
Associate Software Engineer (5.4k salaries): ₹1.8 L/yr - ₹6 L/yr
Team Lead (4.9k salaries): ₹5.1 L/yr - ₹16.8 L/yr
Compare Tech Mahindra with: Infosys (3.7), Cognizant (3.8), Accenture (3.9), Wipro (3.7)
