Insight Global Technologies Azure Data Engineer Interview Questions, Process, and Tips

Updated 2 Oct 2024

Insight Global Technologies Azure Data Engineer Interview Experiences

1 interview found

Interview experience: 4 (Good)
Difficulty level: Moderate
Process duration: Less than 2 weeks
Result: Selected

I applied via LinkedIn and was interviewed in Sep 2024. There were 2 interview rounds.

Round 1 - Technical (6 Questions)

  • Q1. Set up an ETL flow for data present in a Lakehouse using Databricks
  • Ans. 

    Set up an ETL flow for data in a Lakehouse using Databricks

    • Connect Databricks to the Lakehouse storage (e.g., Azure Data Lake Storage)

    • Define the ETL process using Databricks notebooks or jobs

    • Extract data from the Lakehouse, transform it as needed, and load it into the target destination

    • Monitor and schedule ETL jobs for automated data processing

  • Answered by AI
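In Databricks itself these steps would be PySpark reading Delta files from ADLS; as a minimal language-agnostic sketch of the same extract-transform-load steps (all table names and sample records below are hypothetical, with sqlite3 standing in for the target destination):

```python
import sqlite3

def extract():
    # In Databricks: spark.read.format("delta").load("abfss://...")
    return [
        {"order_id": 1, "amount": "100.50", "status": "ok"},
        {"order_id": 2, "amount": "bad", "status": "ok"},
        {"order_id": 3, "amount": "75.00", "status": "cancelled"},
    ]

def transform(rows):
    # Keep valid, non-cancelled orders and cast amount to float.
    out = []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real job would route bad records to a quarantine table
        if r["status"] == "ok":
            out.append((r["order_id"], amount))
    return out

def load(rows, conn):
    # In Databricks: df.write.format("delta").mode("append").saveAsTable(...)
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (1, 100.5)
```

The scheduling and monitoring steps map to Databricks Jobs (or an ADF trigger calling a Databricks activity) rather than anything in this sketch.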
  • Q2. How did you handle failures in ADF pipelines?
  • Ans. 

    I handle failures in ADF Pipelines by setting up monitoring, alerts, retries, and error handling mechanisms.

    • Implement monitoring to track pipeline runs and identify failures

    • Set up alerts to notify when a pipeline fails

    • Configure retries for transient failures

    • Use failure dependency paths (on-failure activities) to handle exceptions, since ADF has no literal Try/Catch construct

    • Utilize Azure Monitor to analyze pipeline performance and troubleshoot issues

  • Answered by AI
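In ADF, retries are configured per activity (retry count and retry interval in the activity policy) rather than in code; the pattern that policy implements can be sketched in plain Python (the `copy_activity` stand-in below is hypothetical):

```python
import time

def run_with_retries(activity, max_retries=3, interval_s=0.01):
    """Re-run a flaky activity, as ADF's per-activity retry policy does."""
    last_err = None
    for attempt in range(1 + max_retries):
        try:
            return activity()
        except Exception as err:  # transient failure
            last_err = err
            time.sleep(interval_s)
    raise last_err  # surfaced to monitoring/alerts once retries are exhausted

# A flaky activity that succeeds on its third attempt.
calls = {"n": 0}
def copy_activity():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient HTTP 503")
    return "copied"

print(run_with_retries(copy_activity))  # copied
```

Only exceptions that survive all retries reach alerting, which matches ADF's behaviour of marking a run failed (and firing alerts) after the retry budget is spent.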
  • Q3. Have you worked on any Data Validation Framework?
  • Ans. 

    Yes, I have worked on developing a Data Validation Framework to ensure data accuracy and consistency.

    • Developed automated data validation scripts to check for data accuracy and consistency

    • Implemented data quality checks to identify and resolve data issues

    • Utilized tools like SQL queries, Python scripts, and Azure Data Factory for data validation

    • Worked closely with data stakeholders to define validation rules and requirements

  • Answered by AI
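The core of such a framework is a set of named rules run against incoming rows, producing a failure report. A minimal sketch, with hypothetical rule names and sample data:

```python
# Hypothetical rule-based validation: each rule flags the rows that fail it.
rules = {
    "amount_non_negative": lambda r: r["amount"] < 0,
    "customer_id_present": lambda r: not r.get("customer_id"),
}

def validate(rows):
    """Return {rule_name: [failing rows]} for every rule that caught something."""
    failures = {}
    for name, is_bad in rules.items():
        bad = [r for r in rows if is_bad(r)]
        if bad:
            failures[name] = bad
    return failures

data = [
    {"customer_id": "C1", "amount": 10.0},
    {"customer_id": "", "amount": -5.0},
]
report = validate(data)
print(sorted(report))  # ['amount_non_negative', 'customer_id_present']
```

In a pipeline, an empty report lets the load proceed, while a non-empty one fails the run or routes the offending rows aside; the same shape works whether the rules are SQL checks or Python predicates.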
  • Q4. Write a SQL query to fetch the top 3 revenue-generating products from a Sales table
  • Ans. 

    SQL query to fetch Top 3 revenue generating Products from Sales table

    • Use the SELECT statement to retrieve data from the Sales table

    • Use the GROUP BY clause to group the data by Product

    • Use the ORDER BY clause to sort the revenue in descending order

    • Use the LIMIT clause to fetch only the top 3 revenue generating Products

  • Answered by AI
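The steps above can be run end to end; a minimal sketch using Python's built-in sqlite3 with hypothetical sample data (note that T-SQL, as in Synapse or SQL Server, would use `SELECT TOP 3` instead of `LIMIT`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Sales (product TEXT, revenue REAL);
INSERT INTO Sales VALUES
  ('A', 100), ('B', 300), ('A', 50), ('C', 400), ('D', 20), ('B', 10);
""")

# Aggregate revenue per product, sort descending, keep the top 3.
top3 = conn.execute("""
    SELECT product, SUM(revenue) AS total_revenue
    FROM Sales
    GROUP BY product
    ORDER BY total_revenue DESC
    LIMIT 3
""").fetchall()
print(top3)  # [('C', 400.0), ('B', 310.0), ('A', 150.0)]
```

If ties at the third position matter, a window function (`DENSE_RANK() OVER (ORDER BY SUM(revenue) DESC)`) is the usual follow-up answer.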
  • Q5. Write a SQL query to fetch customers who have made no transactions in the last 30 days but did transact before that
  • Ans. 

    SQL query to fetch customers who have not transacted in the last 30 days but did before

    • Use a subquery to find customers who transacted more than 30 days ago

    • Use NOT IN or NOT EXISTS to exclude customers who transacted in the last 30 days

  • Answered by AI
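The NOT IN approach can be demonstrated with sqlite3 and hypothetical sample rows (SQLite's `date('now', '-30 days')` plays the role of `DATEADD(day, -30, GETDATE())` in T-SQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Transactions (customer_id TEXT, txn_date TEXT);
INSERT INTO Transactions VALUES
  ('C1', date('now', '-40 days')),   -- old activity only -> should match
  ('C2', date('now', '-40 days')),   -- old and recent    -> excluded
  ('C2', date('now', '-5 days')),
  ('C3', date('now', '-5 days'));    -- recent only       -> excluded
""")

query = """
    SELECT DISTINCT customer_id
    FROM Transactions
    WHERE txn_date <  date('now', '-30 days')
      AND customer_id NOT IN (
          SELECT customer_id FROM Transactions
          WHERE txn_date >= date('now', '-30 days')
      )
"""
print(conn.execute(query).fetchall())  # [('C1',)]
```

NOT EXISTS with a correlated subquery gives the same result and is safer when `customer_id` can be NULL, since `NOT IN` against a set containing NULL returns no rows.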
  • Q6. What is Dynamic Content in ADF, and how did you use it in previous projects?
  • Ans. 

    Dynamic Content in ADF allows for dynamic values to be passed between activities in Azure Data Factory.

    • Dynamic Content can be used to pass values between activities, such as passing output from one activity as input to another.

    • Expressions can be used within Dynamic Content to manipulate data or create dynamic values.

    • Dynamic Content can be used in various ADF components like datasets, linked services, and activities.

  • Answered by AI
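A typical use is building a sink folder path from a pipeline parameter and the run date. A sketch of such an expression (the parameter name `sourceSystem` and path layout are hypothetical; `concat`, `pipeline().parameters`, `formatDateTime`, and `utcnow()` are standard ADF expression functions):

```json
{
  "name": "SinkFolder",
  "value": "@concat('raw/', pipeline().parameters.sourceSystem, '/', formatDateTime(utcnow(), 'yyyy/MM/dd'))"
}
```

The same expression syntax also pulls outputs from earlier activities, e.g. `@activity('LookupConfig').output`, which is how values flow between activities at runtime.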
Round 2 - One-on-one (4 Questions)

  • Q1. What optimization techniques have you applied in projects using Databricks?
  • Ans. 

    I have applied optimization techniques like partitioning, caching, and cluster sizing in Databricks projects.

    • Utilized partitioning to improve query performance by limiting the amount of data scanned

    • Implemented caching to store frequently accessed data in memory for faster retrieval

    • Adjusted cluster sizing based on workload requirements to optimize cost and performance

  • Answered by AI
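In Databricks, partitioning is `df.write.partitionBy("date")` plus a filter on the partition column, which lets Spark skip whole partitions. The effect can be sketched in plain Python (the partition keys and rows below are hypothetical):

```python
# Hypothetical partitioned layout: data grouped by date, so a query filtered
# on one date scans only that partition instead of the full dataset.
partitions = {
    "2024-09-01": [("C", 400)],
    "2024-09-02": [("A", 100), ("B", 300)],
    "2024-09-03": [("A", 50)],
}

def scan(date_filter=None):
    """Return (rows, number_of_partitions_scanned) for an optional date predicate."""
    if date_filter is None:
        keys = list(partitions)            # full scan: every partition read
    elif date_filter in partitions:
        keys = [date_filter]               # pruned: only the matching partition
    else:
        keys = []                          # pruned to nothing
    rows = [r for k in keys for r in partitions[k]]
    return rows, len(keys)

rows, scanned = scan("2024-09-02")
print(rows, scanned)  # [('A', 100), ('B', 300)] 1
```

Caching (`df.cache()`) and right-sizing clusters are orthogonal levers: the first trades memory for repeated-read speed, the second trades cost for parallelism.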
  • Q2. What is a distributed table in Synapse? How do you choose a distribution type?
  • Ans. 

    A distributed table in Synapse is a table whose rows are spread across multiple nodes for parallel processing.

    • Distributed tables in Synapse are divided into distributions to optimize query performance.

    • There are three distribution types: hash, round-robin, and replicated.

    • Hash distribution is ideal for joining large tables on a common key, round-robin spreads rows evenly when there is no good distribution key, and replicated tables copy the full table to every compute node (best for small dimension tables).

  • Answered by AI
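The difference between hash and round-robin placement can be sketched in a few lines (a dedicated SQL pool always uses 60 distributions; Python's `hash` below is a hypothetical stand-in for Synapse's internal deterministic hash):

```python
N_DISTRIBUTIONS = 60  # fixed in a Synapse dedicated SQL pool

def hash_distribution(rows, key):
    """Same key -> same distribution, so joins on the key avoid data movement."""
    placement = {}
    for row in rows:
        d = hash(row[key]) % N_DISTRIBUTIONS
        placement.setdefault(d, []).append(row)
    return placement

def round_robin_distribution(rows):
    """Rows dealt out evenly in turn; fast to load, but no co-location."""
    placement = {}
    for i, row in enumerate(rows):
        placement.setdefault(i % N_DISTRIBUTIONS, []).append(row)
    return placement

orders = [{"cust": "C1"}, {"cust": "C2"}, {"cust": "C1"}]
by_hash = hash_distribution(orders, "cust")
# Under hash distribution, both C1 rows land in the same distribution;
# under round-robin, the three rows land in three different distributions.
```

That co-location is why hash is chosen for large fact tables joined on a stable, high-cardinality key, while round-robin suits staging tables and replicated suits small dimensions.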
  • Q3. How do you load data into Synapse that is available in Databricks?
  • Ans. 

    You can load data from Databricks to Synapse using PolyBase or Azure Data Factory.

    • Use PolyBase to load data from Databricks to Synapse by creating an external table in Synapse pointing to the Databricks data location.

    • Alternatively, use Azure Data Factory to copy data from Databricks to Synapse by creating a pipeline with Databricks as source and Synapse as destination.

    • Ensure proper permissions and connectivity between Databricks and Synapse, including access to any staging storage account

  • Answered by AI
  • Q4. Have you worked on any real-time data processing projects?
  • Ans. 

    Yes, I have worked on real-time data processing projects using technologies like Apache Kafka and Spark Streaming.

    • Implemented real-time data pipelines using Apache Kafka for streaming data ingestion

    • Utilized Spark Streaming for processing and analyzing real-time data

    • Worked on monitoring and optimizing the performance of real-time data processing systems

  • Answered by AI

Interview Preparation Tips

Interview preparation tips for other job seekers - Be confident and answer questions precisely. It is also good to brush up on the basics.


Insight Global Technologies Interview FAQs

How many rounds are there in Insight Global Technologies Azure Data Engineer interview?
The Insight Global Technologies interview process usually has 2 rounds. The most common rounds are Technical and One-on-one.
What are the top questions asked in Insight Global Technologies Azure Data Engineer interview?

Some of the top questions asked at the Insight Global Technologies Azure Data Engineer interview:

  1. Write a SQL query to fetch customers who have made no transactions in the last 30 days but did transact before that
  2. What is a distributed table in Synapse? How do you choose a distribution type?
  3. What is Dynamic Content in ADF, and how did you use it in previous projects?


Insight Global Technologies Azure Data Engineer Interview Process

Based on 3 interviews, the overall interview experience is rated 4.7 (Excellent).
Salaries at Insight Global Technologies:

  • HR Recruiter (3 salaries): ₹1.5 L/yr - ₹2.8 L/yr
  • Senior Data Engineer (3 salaries): ₹34 L/yr - ₹42.4 L/yr
Compare Insight Global Technologies with: TCS (3.7), Wipro (3.7), Infosys (3.6), HCLTech (3.5)