Bornemindz Software Solutions Gcp Data Engineer Interview Questions and Answers

Updated 20 Feb 2024

Bornemindz Software Solutions Gcp Data Engineer Interview Experiences

1 interview found

Interview experience
5
Excellent
Difficulty level
Hard
Process Duration
Less than 2 weeks
Result
Selected

I applied via Naukri.com and was interviewed in Jan 2024. There were 4 interview rounds.

Round 1 - Coding Test 

Prepare the topics you mentioned in your resume.

Round 2 - Coding Test 

SQL and Python questions.

Round 3 - Group Discussion 

In-depth project explanation and discussion.

Round 4 - HR 

(2 Questions)

  • Q1. Offer letter discussion
  • Q2. Offer letter discussion during the HR round

Interview Preparation Tips

Interview preparation tips for other job seekers - Know your resume in detail.

Interview questions from similar companies

Interview experience
5
Excellent
Difficulty level
Moderate
Process Duration
Less than 2 weeks
Result
No response

I was interviewed in Jan 2025.

Round 1 - Technical 

(8 Questions)

  • Q1. Self-introduction and project architecture
  • Q2. What data sources are used?
  • Q3. BigQuery architecture
  • Q4. Partitioning vs clustering
  • Q5. bq commands to create a table and load a CSV file
  • Q6. bq command to show the schema of a table
  • Q7. Explain leaf nodes and columnar storage.
  • Q8. How many slots are there in BigQuery?
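Q5 and Q6 ask for bq CLI commands; the exact table was not recorded, so this sketch builds the commands with hypothetical dataset, table, and bucket names:

```python
# Hedged sketch for Q5-Q6: bq CLI invocations.
# mydataset.sales and gs://my-bucket/sales.csv are hypothetical placeholders.

# Q5: create a table with an inline schema, then load a CSV file into it.
create_table = "bq mk --table mydataset.sales id:INTEGER,amount:FLOAT,sale_date:DATE"
load_csv = (
    "bq load --source_format=CSV --skip_leading_rows=1 "
    "mydataset.sales gs://my-bucket/sales.csv"
)

# Q6: show the schema of an existing table.
show_schema = "bq show --schema --format=prettyjson mydataset.sales"
```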
Interview experience
4
Good
Difficulty level
Moderate
Process Duration
Less than 2 weeks
Result
No response

I applied via Naukri.com and was interviewed in Nov 2024. There were 2 interview rounds.

Round 1 - One-on-one 

(7 Questions)

  • Q1. Explain your project
  • Q2. GCP BigQuery architecture
  • Q3. GCP object versioning
  • Q4. GCP storage class types
  • Q5. SQL optimisation techniques
  • Q6. SQL coding: date formats and join-related questions
  • Q7. Partitioning and CLUSTER BY
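The topic list gives no example, so here is a hedged sketch of BigQuery DDL that combines partitioning with CLUSTER BY (the subject of Q7); the table and column names are hypothetical:

```python
# Hedged sketch for Q7: DDL for a partitioned and clustered BigQuery table.
# mydataset.events and its columns are hypothetical.
ddl = """
CREATE TABLE mydataset.events (
  event_ts TIMESTAMP,
  country  STRING,
  user_id  STRING
)
PARTITION BY DATE(event_ts)  -- partition pruning limits the bytes scanned
CLUSTER BY country, user_id  -- co-locates related rows within each partition
"""
```

Partitioning prunes whole date partitions at scan time; clustering sorts rows within each partition so filters on the clustered columns read fewer blocks.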
Round 2 - One-on-one 

(2 Questions)

  • Q1. Advanced SQL: a recursive CTE, and Python code using a lambda function to explode a list
  • Q2. Project explanation, daily activities, and challenges faced in the project
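The exact problems behind Q1 were not recorded, so this is a minimal sketch under that assumption, using SQLite for the recursive CTE (BigQuery's WITH RECURSIVE syntax is similar) and a lambda to explode a nested list:

```python
import sqlite3
from functools import reduce

# A recursive CTE that generates the numbers 1..5.
conn = sqlite3.connect(":memory:")
rows = conn.execute("""
    WITH RECURSIVE counter(n) AS (
        SELECT 1
        UNION ALL
        SELECT n + 1 FROM counter WHERE n < 5
    )
    SELECT n FROM counter
""").fetchall()
numbers = [n for (n,) in rows]  # [1, 2, 3, 4, 5]

# "Explode" (flatten) a nested list with a lambda.
nested = [[1, 2], [3], [4, 5]]
flatten = lambda lists: reduce(lambda acc, xs: acc + xs, lists, [])
flat = flatten(nested)  # [1, 2, 3, 4, 5]
```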

Interview Preparation Tips

Topics to prepare for Tech Mahindra Gcp Data Engineer interview:
  • SQL
  • PySpark
  • GCP
  • Python
Interview preparation tips for other job seekers - Practice SQL and Python coding extensively; interviewers evaluate your problem-solving approach and logic, as well as your depth of knowledge.
Interview experience
5
Excellent
Difficulty level
Moderate
Process Duration
Less than 2 weeks
Result
-

I applied via Naukri.com and was interviewed in Oct 2024. There were 2 interview rounds.

Round 1 - Technical 

(1 Question)

  • Q1. Topics asked: advanced SQL, Spark, Dataproc, BigQuery, Qlik Sense, GitHub, Google Cloud, and any innovations or processes you optimised
Round 2 - Behavioral 

(1 Question)

  • Q1. BigQuery, Spark, Hadoop, and Hive questions on the topics above. Why are you changing jobs? What extra can you bring with you? What motivates you to join Deloitte?

Interview Preparation Tips

Topics to prepare for Deloitte Gcp Data Engineer interview:
  • SQL
  • BigQuery
  • Spark
  • Hadoop
  • Dataproc
  • GCP
Interview preparation tips for other job seekers - Prepare well.
Interview experience
4
Good
Difficulty level
Moderate
Process Duration
Less than 2 weeks
Result
No response

I applied via LinkedIn and was interviewed in Oct 2024. There were 2 interview rounds.

Round 1 - Technical 

(2 Questions)

  • Q1. Questions on SQL joins and window functions
  • Q2. GCP BigQuery and Cloud Storage questions
Round 2 - HR 

(2 Questions)

  • Q1. About overall IT experience
  • Q2. Project experience and services used
  • Ans. 

    I have experience working on projects involving data processing, transformation, and analysis using GCP services like BigQuery, Dataflow, and Dataproc.

    • Utilized BigQuery for storing and querying large datasets

    • Implemented data pipelines using Dataflow for real-time data processing

    • Utilized Dataproc for running Apache Spark and Hadoop clusters for data processing

    • Worked on data ingestion and transformation using Cloud Storage

  • Answered by AI
Interview experience
5
Excellent
Difficulty level
Hard
Process Duration
Less than 2 weeks
Result
-

I applied via Naukri.com and was interviewed in Oct 2024. There was 1 interview round.

Round 1 - Technical 

(1 Question)

  • Q1. What is IAM? What is a SA (service account)? What is BigQuery? Various optimisations; joins; complex SQL queries. What is Qlik Sense? GitHub; schemas, routines, schedules; DELETE vs DROP vs TRUNCATE; GUI and Terraform-related questions; Spark basics; file fo...
  • Ans. 

    IAM is Identity and Access Management, SA is Service Account, BigQuery is a data warehouse, QlikSense is a data visualization tool, GitHub is a version control system, Spark is a distributed computing framework, Airflow is a workflow automation tool, Bigtable is a NoSQL database, Cloud Composer is a managed workflow orchestration service, Pub/Sub is a messaging service.

    • IAM is used to manage access to resources in Googl...

  • Answered by AI

Interview Preparation Tips

Topics to prepare for Accenture Gcp Data Engineer interview:
  • SQL
  • Python
  • PySpark

Skills evaluated in this interview

Interview experience
4
Good
Difficulty level
Easy
Process Duration
2-4 weeks
Result
Selected

I applied via Naukri.com and was interviewed in Jun 2024. There was 1 interview round.

Round 1 - Technical 

(3 Questions)

  • Q1. String is palindrome or not
  • Ans. 

    Check if a string is a palindrome or not

    • Compare the string with its reverse to check for palindrome

    • Ignore spaces and punctuation marks when comparing

    • Examples: 'racecar' is a palindrome, 'hello' is not

  • Answered by AI
  • Q2. Create a GCS bucket using Python
  • Ans. 

    Use Python to create a GCS bucket

    • Import the necessary libraries like google.cloud.storage

    • Authenticate using service account credentials

    • Use the library functions to create a new bucket

  • Answered by AI
  • Q3. Write Python code to trigger a Dataflow job from a Cloud Function
  • Ans. 

    Python code to trigger a dataflow job in cloud function

    • Use the googleapiclient library to interact with the Dataflow API

    • Authenticate using service account credentials

    • Submit a job to Dataflow using the projects.locations.templates.launch endpoint

  • Answered by AI
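The palindrome check from Q1 can be sketched as follows, ignoring case, spaces, and punctuation as the answer suggests:

```python
def is_palindrome(s: str) -> bool:
    """Return True if s reads the same forwards and backwards,
    ignoring case and any non-alphanumeric characters."""
    cleaned = [ch.lower() for ch in s if ch.isalnum()]
    return cleaned == cleaned[::-1]

# Examples from the answer above:
# 'racecar' is a palindrome, 'hello' is not.
```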

Skills evaluated in this interview

Interview experience
4
Good
Difficulty level
-
Process Duration
-
Result
-
Round 1 - One-on-one 

(6 Questions)

  • Q1. SQL: Find keys present in table A but not in B (B is an old copy of A)
  • Ans. 

    Use SQL to find keys present in table A but not in table B (old copy of A).

    • Use a LEFT JOIN to combine tables A and B based on the key column

    • Filter the results where the key column in table B is NULL

    • This will give you the keys present in table A but not in table B

  • Answered by AI
  • Q2. SQL: 4th highest salary
  • Q3. Case Study: Using GCP tools, build a pipeline to transfer a file from one GCS bucket to another
  • Ans. 

    Use GCP Dataflow to transfer files between GCS buckets

    • Create a Dataflow pipeline using Apache Beam to read from source bucket and write to destination bucket

    • Use GCS connector to read and write files in Dataflow pipeline

    • Set up appropriate permissions for Dataflow service account to access both buckets

  • Answered by AI
  • Q4. Case Study: How would you explain the project flow and ownership of work to a new joiner in IT, considering my 3 years of experience?
  • Q5. Explain your project and the reasons you chose Airflow over other orchestration tools.
  • Q6. Discuss other orchestration tools in GCP
  • Ans. 

    Cloud Composer is another orchestration tool in GCP

    • Cloud Composer is a fully managed workflow orchestration service built on Apache Airflow

    • It allows you to author, schedule, and monitor workflows that span across GCP services

    • Cloud Composer provides a rich set of features like DAGs, plugins, and monitoring capabilities

    • It integrates seamlessly with other GCP services like BigQuery, Dataflow, and Dataproc

  • Answered by AI
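Q1 and Q2 can be sketched with SQLite standing in for the interview database (table names and data are hypothetical; BigQuery accepts the same standard SQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical data: B is an old copy of A, so A has extra keys.
cur.execute("CREATE TABLE a (id INTEGER PRIMARY KEY)")
cur.execute("CREATE TABLE b (id INTEGER PRIMARY KEY)")
cur.executemany("INSERT INTO a VALUES (?)", [(1,), (2,), (3,), (4,)])
cur.executemany("INSERT INTO b VALUES (?)", [(1,), (2,)])

# Q1: keys in A but not in B -- LEFT JOIN, then keep rows with no match.
new_keys = cur.execute("""
    SELECT a.id FROM a
    LEFT JOIN b ON a.id = b.id
    WHERE b.id IS NULL
""").fetchall()  # keys 3 and 4

# Q2: 4th highest distinct salary via ORDER BY ... LIMIT/OFFSET.
cur.execute("CREATE TABLE emp (salary INTEGER)")
cur.executemany("INSERT INTO emp VALUES (?)",
                [(100,), (200,), (300,), (400,), (500,)])
fourth = cur.execute("""
    SELECT DISTINCT salary FROM emp
    ORDER BY salary DESC
    LIMIT 1 OFFSET 3
""").fetchone()[0]  # 200
```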

Skills evaluated in this interview

Interview experience
4
Good
Difficulty level
Moderate
Process Duration
6-8 weeks
Result
Selected

I applied via Company Website and was interviewed in Sep 2023. There were 3 interview rounds.

Round 1 - Technical 

(1 Question)

  • Q1. BigQuery architecture, scenario-based advanced SQL questions, Cloud Storage questions, Pub/Sub, Dataflow
Round 2 - Technical 

(1 Question)

  • Q1. Use cases of BigQuery and SQL
  • Ans. 

    BigQuery is used for analyzing large datasets and running complex queries, while SQL is used for querying databases.

    • BigQuery is used for analyzing large datasets quickly and efficiently

    • SQL is used for querying databases to retrieve specific data

    • BigQuery can handle petabytes of data, making it ideal for big data analysis

    • SQL can be used to perform operations like filtering, sorting, and aggregating data

  • Answered by AI
Round 3 - HR 

(1 Question)

  • Q1. Salary and location discussion

Skills evaluated in this interview

Interview experience
5
Excellent
Difficulty level
Moderate
Process Duration
2-4 weeks
Result
Not Selected

I applied via Company Website and was interviewed before Mar 2023. There were 2 interview rounds.

Round 1 - Technical 

(2 Questions)

  • Q1. What are SQL joins? Explain, including BigQuery-related aspects.
  • Ans. 

    SQL joins are used to combine rows from two or more tables based on a related column between them.

    • SQL joins are used to retrieve data from multiple tables based on a related column between them

    • Types of SQL joins include INNER JOIN, LEFT JOIN, RIGHT JOIN, and FULL JOIN

    • In BigQuery, joins can be performed using standard SQL syntax

    • Example: SELECT * FROM table1 INNER JOIN table2 ON table1.column = table2.column

  • Answered by AI
  • Q2. BigQuery and GCS-related questions
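The join types listed in the Q1 answer above can be demonstrated with SQLite (hypothetical tables; BigQuery uses the same standard SQL join syntax):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical tables to illustrate INNER vs LEFT JOIN.
cur.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")
cur.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Asha"), (2, "Ravi")])
cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [(10, 1), (11, 1), (12, 3)])  # customer 3 does not exist

# INNER JOIN keeps only rows with a match on both sides.
inner = cur.execute("""
    SELECT o.id, c.name FROM orders o
    INNER JOIN customers c ON o.customer_id = c.id
""").fetchall()  # order 12 is dropped

# LEFT JOIN keeps every order; unmatched customers come back as NULL.
left = cur.execute("""
    SELECT o.id, c.name FROM orders o
    LEFT JOIN customers c ON o.customer_id = c.id
""").fetchall()  # order 12 pairs with NULL
```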
Round 2 - One-on-one 

(1 Question)

  • Q1. Project-related questions

Skills evaluated in this interview

Bornemindz Software Solutions Interview FAQs

How many rounds are there in Bornemindz Software Solutions Gcp Data Engineer interview?
The Bornemindz Software Solutions interview process usually has 4 rounds. The most common rounds are Coding Test, Group Discussion, and HR.


Bornemindz Software Solutions Gcp Data Engineer Interview Process

based on 1 interview

Interview experience: 5 (Excellent)

Bornemindz Software Solutions Gcp Data Engineer Reviews and Ratings

based on 1 review

5.0/5

Rating in categories:
  • Skill development: 5.0
  • Work-life balance: 5.0
  • Salary: 3.0
  • Job security: 5.0
  • Company culture: 4.0
  • Promotions: 5.0
  • Work satisfaction: 5.0

Software Engineer (34 salaries): ₹3 L/yr - ₹7 L/yr

Devops Engineer (17 salaries): ₹3.6 L/yr - ₹7 L/yr

Salesforce Developer (9 salaries): ₹3.8 L/yr - ₹7 L/yr

Data Engineer (8 salaries): ₹4.3 L/yr - ₹6.2 L/yr

Software Developer (6 salaries): ₹3.4 L/yr - ₹4.5 L/yr
Compare Bornemindz Software Solutions with:
  • TCS: 3.7
  • Accenture: 3.8
  • Wipro: 3.7
  • Cognizant: 3.8