LTIMindtree Junior Data Analyst Interview Questions and Answers

Updated 1 Jul 2024

LTIMindtree Junior Data Analyst Interview Experiences

1 interview found

Interview experience: 4 (Good)
Difficulty level: -
Process Duration: -
Result: -
Round 1 - Technical 

(2 Questions)

  • Q1. What is dual axis
  • Ans. 

    Dual axis is a feature in data visualization where two different scales are used on the same chart to represent two different data sets.

    • Dual axis allows for comparing two different measures on the same chart

    • Each measure is assigned to its own axis, allowing for easy comparison

    • Commonly used in tools like Tableau for creating more complex visualizations

  • Answered by AI
  • Q2. What is scatter plot
  • Ans. 

    A scatter plot is a type of data visualization that displays the relationship between two numerical variables through dots on a graph.

    • Scatter plots are used to identify patterns and relationships between variables.

    • Each dot on the plot represents a single data point with the x-axis representing one variable and the y-axis representing the other variable.

    • The pattern of the dots can indicate the strength and direction of the relationship between the two variables. (A short plotting sketch of both charts follows this round's questions.)

  • Answered by AI
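
The two answers above can be made concrete with a small plotting sketch. This is a minimal illustration, not part of the original answers: the library (matplotlib) and the sample data are assumptions. The first chart puts two measures with very different scales on one chart via a secondary y-axis; the second is a plain scatter plot of two numerical variables. In Tableau, the same dual-axis idea is achieved by placing a second measure on the opposite axis and selecting Dual Axis.

    # Minimal sketch of a dual-axis chart and a scatter plot.
    # Assumes matplotlib is installed; the sample data is made up for illustration.
    import matplotlib.pyplot as plt

    months = ["Jan", "Feb", "Mar", "Apr", "May"]
    revenue = [120, 150, 170, 160, 200]                      # measure 1 (large scale)
    conversion_rate = [0.021, 0.025, 0.024, 0.028, 0.031]    # measure 2 (small scale)

    # Dual axis: two different scales on the same chart, shared x-axis.
    fig, ax1 = plt.subplots()
    ax1.plot(months, revenue, color="tab:blue", marker="o", label="Revenue")
    ax1.set_ylabel("Revenue", color="tab:blue")

    ax2 = ax1.twinx()                                        # secondary y-axis
    ax2.plot(months, conversion_rate, color="tab:red", marker="s", label="Conversion rate")
    ax2.set_ylabel("Conversion rate", color="tab:red")
    ax1.set_title("Dual axis: revenue vs. conversion rate")

    # Scatter plot: each dot is one data point (x = ad spend, y = sales).
    ad_spend = [10, 14, 18, 22, 26, 30]
    sales = [35, 42, 55, 58, 70, 78]
    fig2, ax3 = plt.subplots()
    ax3.scatter(ad_spend, sales)
    ax3.set_xlabel("Ad spend")
    ax3.set_ylabel("Sales")
    ax3.set_title("Scatter plot: relationship between two numerical variables")

    plt.show()
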
Round 2 - One-on-one 

(2 Questions)

  • Q1. What is blending
  • Ans. 

    Blending is the process of combining multiple data sources or datasets to create a unified view.

    • Blending involves merging data from different sources to gain insights or make decisions.

    • It helps in creating a comprehensive dataset by combining relevant information from various sources.

    • Blending can be done using tools like Tableau, Power BI, or Python libraries like Pandas.

    • For example, blending sales data from a CRM with customer data from another system. (A small pandas sketch follows this round's questions.)

  • Answered by AI
  • Q2. What is joining
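
Because the answer names Pandas as one way to blend data, here is a minimal sketch of the idea; the DataFrames, column names, and values are invented for illustration. It combines sales records from one source with customer records from another, which also previews the "joining" question above.

    # Minimal sketch of blending/joining two data sources with pandas.
    # The DataFrames stand in for, e.g., CRM sales data and a separate customer master.
    import pandas as pd

    sales = pd.DataFrame({
        "customer_id": [1, 2, 2, 3],
        "amount": [250, 400, 150, 90],
    })
    customers = pd.DataFrame({
        "customer_id": [1, 2, 4],
        "region": ["North", "South", "East"],
    })

    # A join (merge) keyed on the common column; 'left' keeps every sale
    # even when the customer is missing from the second source.
    blended = sales.merge(customers, on="customer_id", how="left")
    print(blended)

    # In Tableau terms, blending links a primary and a secondary data source
    # on a shared field at query time rather than physically merging the rows.
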

Skills evaluated in this interview

Interview questions from similar companies

I applied via Company Website and was interviewed before Feb 2021. There were 5 interview rounds.

Round 1 - Coding Test 

It was a technical MCQ test with 60 questions, based on Spark, Hive, Python, and ML.

Round 2 - Coding Test 

Coding scenarios such as reading XML and JSON using PySpark, flattening nested XML/JSON, and basic data transformation scenarios.

Round 3 - Technical 

(1 Question)

  • Q1. Window functions, optimization techniques etc.
Round 4 - Technical 

(1 Question)

  • Q1. Core technical concepts of Databricks and PySpark: architecture, the internal working of functions, and optimization techniques.
Round 5 - HR 

(1 Question)

  • Q1. What are your salary expectations?

Interview Preparation Tips

Topics to prepare for Accenture Senior Data Engineer interview:
  • Python
  • Spark
  • Cloud Computing
  • Data Warehousing
Interview preparation tips for other job seekers - You need a good understanding of the core concepts of big data processing technologies and of data engineering with Spark, Python, Databricks, and data warehousing.

I applied via Naukri.com and was interviewed before Jul 2021. There were 2 interview rounds.

Round 1 - One-on-one 

(1 Question)

  • Q1. Details about current project and the issues faced
Round 2 - HR 

(1 Question)

  • Q1. How to handle production issues, plus some common HR questions

Interview Preparation Tips

Interview preparation tips for other job seekers - Apply through the Accenture website or ask anyone working at Accenture to refer you.
Interview experience: 1 (Bad)
Difficulty level: -
Process Duration: -
Result: -
Round 1 - Technical 

(1 Question)

  • Q1. ADF, Databricks, Spark - basic and easy questions. Rude interviewer with limited knowledge.

I applied via Naukri.com and was interviewed in Jul 2021. There were 4 interview rounds.

Interview Questionnaire 

1 Question

  • Q1. As it was a developer role, they asked about performance tuning practices across Hadoop tools like Hive, Sqoop, Spark, etc.

Interview Preparation Tips

Interview preparation tips for other job seekers - There were 2 technical rounds: the 1st was 45 minutes and the 2nd was 30 minutes. As I am a Hadoop data engineer, they asked questions on different Hadoop tools like Spark, Hive, and Scala. My suggestion is to prepare whichever tools/technologies you have mentioned in your resume.
Interview experience: 3 (Average)
Difficulty level: Easy
Process Duration: 2-4 weeks
Result: Selected

I applied via Recruitment Consultant and was interviewed before Jul 2023. There was 1 interview round.

Round 1 - Technical 

(2 Questions)

  • Q1. SQL basic joins and window functions
  • Q2. PySpark basic operations and Spark basics
Interview experience: 4 (Good)
Difficulty level: Easy
Process Duration: 2-4 weeks
Result: Selected

I appeared for an interview in Feb 2025.

Round 1 - Technical 

(2 Questions)

  • Q1. Explain SCD and how you would achieve it
  • Ans. 

    SCD (Slowly Changing Dimensions) manages historical data changes in data warehouses.

    • SCD Type 1: Overwrite old data (e.g., updating a customer's address without keeping history).

    • SCD Type 2: Create new records for changes (e.g., adding a new row for a customer's address change).

    • SCD Type 3: Store current and previous values in the same record (e.g., adding a 'previous address' column).

    • Implementation can be done using ETL tools.

  • Answered by AI
  • Q2. If your source has multiple inputs how will you handle
  • Ans. 

    Handling multiple inputs in data sources requires effective integration, transformation, and validation strategies.

    • Use ETL (Extract, Transform, Load) processes to consolidate data from various sources.

    • Implement data validation checks to ensure data quality from each input source.

    • Utilize data orchestration tools like Apache Airflow to manage workflows and dependencies.

    • Consider using a message queue (e.g., Kafka) for real-time ingestion. (A PySpark sketch covering both answers follows this round's questions.)

  • Answered by AI
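
A minimal PySpark sketch tying the two answers above together; the table layout, column names, and the pure-DataFrame approach are assumptions (real pipelines often use a warehouse MERGE or Delta Lake instead). Two input feeds are consolidated and validated, then applied to a small dimension table as SCD Type 2: changed current rows are expired and new versions are appended with an open-ended end date.

    # Minimal sketch (assumed schema and data): consolidate two input feeds,
    # then apply the result to a customer dimension as SCD Type 2.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

    # Two input sources carrying the same kind of records (e.g. regional feeds).
    src_a = spark.createDataFrame([(1, "Alice", "Pune")], ["customer_id", "name", "city"])
    src_b = spark.createDataFrame([(2, "Bob", "Delhi")], ["customer_id", "name", "city"])

    # Consolidate and validate: align columns by name, drop duplicates and
    # records missing the business key.
    incoming = (src_a.unionByName(src_b)
                     .dropDuplicates(["customer_id"])
                     .filter(F.col("customer_id").isNotNull()))

    # Existing dimension with the usual SCD2 housekeeping columns.
    dim = spark.createDataFrame(
        [(1, "Alice", "Mumbai", "2023-01-01", "9999-12-31", True)],
        ["customer_id", "name", "city", "start_date", "end_date", "is_current"],
    )

    # 1) Business keys whose current attributes differ from the incoming batch.
    current = dim.filter(F.col("is_current")).alias("d")
    inc = incoming.alias("i")
    changed_keys = (current.join(inc, F.col("d.customer_id") == F.col("i.customer_id"))
                           .where((F.col("d.city") != F.col("i.city")) |
                                  (F.col("d.name") != F.col("i.name")))
                           .select(F.col("d.customer_id").alias("customer_id")))

    # 2) Expire the changed current rows; keep everything else untouched.
    to_expire = dim.filter(F.col("is_current")).join(changed_keys, "customer_id", "leftsemi")
    kept = dim.exceptAll(to_expire)
    expired = (to_expire.withColumn("end_date", F.date_sub(F.current_date(), 1).cast("string"))
                        .withColumn("is_current", F.lit(False)))

    # 3) Insert new versions for changed keys and for brand-new keys,
    #    open-ended with end_date '9999-12-31'.
    new_keys = changed_keys.unionByName(
        incoming.select("customer_id").join(dim.select("customer_id"), "customer_id", "left_anti"))
    new_rows = (incoming.join(new_keys, "customer_id", "leftsemi")
                        .withColumn("start_date", F.current_date().cast("string"))
                        .withColumn("end_date", F.lit("9999-12-31"))
                        .withColumn("is_current", F.lit(True)))

    updated_dim = kept.unionByName(expired).unionByName(new_rows)
    updated_dim.orderBy("customer_id", "start_date").show()
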
Interview experience: 4 (Good)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: Selected

I applied via Naukri.com and was interviewed in Oct 2022. There were 2 interview rounds.

Round 1 - Resume Shortlist 
Round 2 - Technical 

(2 Questions)

  • Q1. Big data: Hadoop architecture and HDFS commands to copy and list files in HDFS; Spark architecture; Transformation and Action questions; what happens when we submit a Spark program; Spark DataFrame coding questions...
  • Ans. 

    Questions on big data, Hadoop, Spark, Scala, Git, project and Agile.

    • Hadoop architecture and HDFS commands for copying and listing files in HDFS

    • Spark architecture and Transformation and Action question

    • What happens when we submit a Spark program

    • Spark DataFrame coding question

    • Scala basic program on List

    • Git and Github

    • Project-related question

    • Agile-related

  • Answered by AI
  • Q2. What happens when we submit a Spark program; Spark DataFrame coding question; Scala basic program on List; Git and GitHub; project-related and Agile-related questions. (A short PySpark sketch of lazy transformations vs. actions follows.)
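
As a small illustration of the "what happens when we submit a Spark program" topic above (the session setup and data are assumptions, not from the interview): transformations only build a logical plan on the driver, and it is the action that makes Spark schedule stages and tasks on the executors.

    # Minimal PySpark sketch: transformations are lazy, actions trigger execution.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("lazy_eval_sketch").getOrCreate()

    df = spark.createDataFrame([(1, "a"), (2, "b"), (3, "a")], ["id", "grp"])

    # Transformations: nothing runs yet, Spark only records a logical plan.
    filtered = df.filter(F.col("id") > 1)
    aggregated = filtered.groupBy("grp").count()

    # Inspect the plan the driver has built so far.
    aggregated.explain()

    # Action: the driver turns the plan into stages and tasks, the cluster
    # manager allocates executors, and the results come back to the driver.
    print(aggregated.collect())
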

Interview Preparation Tips

Interview preparation tips for other job seekers - Practice Spark-related questions and also practice Scala or Python coding questions.

Skills evaluated in this interview

I applied via Referral and was interviewed in Mar 2022. There was 1 interview round.

Round 1 - One-on-one 

(1 Question)

  • Q1. Basics of the projects and also the technical details

Interview Preparation Tips

Interview preparation tips for other job seekers - Prepare the basics well and have a proper understanding of the domain knowledge.
Interview experience: 4 (Good)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: -

I applied via LinkedIn and was interviewed in Feb 2024. There were 3 interview rounds.

Round 1 - Technical 

(2 Questions)

  • Q1. How to work with nested JSON using PySpark
  • Ans. 

    Working with nested JSON using PySpark involves using the StructType and StructField classes to define the schema and then using the select function to access nested fields.

    • Define the schema using StructType and StructField classes

    • Use the select function to access nested fields

    • Use dot notation to access nested fields, for example df.select('nested_field.sub_field'). (A runnable sketch follows this round's questions.)

  • Answered by AI
  • Q2. How to implement SCD2 step by step
  • Ans. 

    Implementing SCD2 involves tracking historical changes in data over time.

    • Identify the business key that uniquely identifies each record

    • Add effective start and end dates to track when the record was valid

    • Insert new records with updated data and end date of '9999-12-31'

    • Update end date of previous record when a change occurs

  • Answered by AI
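
A minimal sketch of the nested-JSON answer above; the JSON shape and field names are invented. It shows schema inference on read, dot notation into a struct, and explode() to unnest an array of structs, as the answer describes.

    # Minimal sketch: reading and flattening nested JSON with PySpark.
    # The JSON structure and field names are made up for illustration.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("nested_json_sketch").getOrCreate()

    raw = [
        '{"id": 1, "customer": {"name": "Alice", "city": "Pune"},'
        ' "orders": [{"sku": "A1", "qty": 2}, {"sku": "B7", "qty": 1}]}'
    ]
    # spark.read.json can infer the nested schema; an explicit StructType schema
    # (as the answer suggests) is safer for production pipelines.
    df = spark.read.json(spark.sparkContext.parallelize(raw))

    flat = (df
            .select(
                "id",
                F.col("customer.name").alias("customer_name"),   # dot notation into a struct
                F.col("customer.city").alias("customer_city"),
                F.explode("orders").alias("order"))              # unnest the array of structs
            .select("id", "customer_name", "customer_city",
                    "order.sku", "order.qty"))

    flat.show()
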
Round 2 - Technical 

(2 Questions)

  • Q1. Write a SQL query to select data from table 2 where data exists in table 1
  • Ans. 

    Use a SQL query to select data from table 2 where data exists in table 1

    • Use a JOIN statement to link the two tables based on a common column

    • Specify the columns you want to select from table 2

    • Use a WHERE clause to check for existence of data in table 1

  • Answered by AI
  • Q2. After performing joins, how many records would be retrieved for inner, left, right, and outer joins?
  • Ans. 

    The number of records retrieved after performing joins depends on the type of join - inner, left, right, or outer.

    • Inner join retrieves only the matching records from both tables

    • Left join retrieves all records from the left table and matching records from the right table

    • Right join retrieves all records from the right table and matching records from the left table

    • Outer join retrieves all records from both tables, filling in nulls where there is no match. (A sketch demonstrating both questions follows this round's questions.)

  • Answered by AI
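
A minimal sketch covering both questions in this round; the table names and rows are invented, and the SQL runs through spark.sql so the example stays in PySpark. The first part selects rows from table2 that exist in table1 using EXISTS; the second prints the row counts produced by the four join types on the same keys.

    # Minimal sketch: "rows from table2 that exist in table1" plus join-type counts.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("join_sketch").getOrCreate()

    t1 = spark.createDataFrame([(1,), (2,), (3,)], ["id"])
    t2 = spark.createDataFrame([(2, "x"), (3, "y"), (4, "z")], ["id", "val"])
    t1.createOrReplaceTempView("table1")
    t2.createOrReplaceTempView("table2")

    # SQL: select from table2 only where the key also exists in table1.
    spark.sql("""
        SELECT t2.*
        FROM table2 t2
        WHERE EXISTS (SELECT 1 FROM table1 t1 WHERE t1.id = t2.id)
    """).show()

    # Join-type row counts: with ids {1,2,3} in table1, {2,3,4} in table2 and
    # 2 keys in common, inner -> 2, left -> 3, right -> 3, full outer -> 4.
    for how in ["inner", "left", "right", "outer"]:
        print(how, t1.join(t2, "id", how).count())
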
Round 3 - HR 

(1 Question)

  • Q1. About previous company and reason for leaving

Interview Preparation Tips

Interview preparation tips for other job seekers - Don't be afraid of giving interviews. Prepare well and attend confidently; if you clear it, it's an opportunity, and if you don't, it's an experience!

Skills evaluated in this interview

LTIMindtree Interview FAQs

How many rounds are there in LTIMindtree Junior Data Analyst interview?
The LTIMindtree interview process usually has 2 rounds. The most common rounds are Technical and One-on-one.
What are the top questions asked in LTIMindtree Junior Data Analyst interview?

Some of the top questions asked at the LTIMindtree Junior Data Analyst interview -

  1. What is dual axis
  2. What is scatter plot
  3. What is blending


LTIMindtree Junior Data Analyst Interview Process

Based on 1 interview

Interview experience: 4 (Good)
LTIMindtree Junior Data Analyst Salary

Based on 7 salaries: ₹3 L/yr - ₹5.5 L/yr (13% less than the average Junior Data Analyst salary in India)

Salaries for other roles at LTIMindtree:

  • Senior Software Engineer (21.5k salaries): ₹5 L/yr - ₹19 L/yr
  • Software Engineer (16.2k salaries): ₹2 L/yr - ₹10 L/yr
  • Technical Lead (6.4k salaries): ₹9.4 L/yr - ₹36 L/yr
  • Module Lead (5.9k salaries): ₹7 L/yr - ₹25.5 L/yr
  • Senior Engineer (4.4k salaries): ₹4.2 L/yr - ₹16.8 L/yr
Compare LTIMindtree with:

  • Cognizant (3.7)
  • Capgemini (3.7)
  • Accenture (3.8)
  • TCS (3.7)