
NTT Data

Rated 3.9, based on 3.6k reviews


NTT Data Data Executive Interview Questions and Answers

Updated 15 May 2024

NTT Data Data Executive Interview Experiences

1 interview found

Interview experience: 3 (Average)
Difficulty level: Easy
Process Duration: Less than 2 weeks
Result: Selected

I was approached by the company and interviewed before May 2023. There was 1 interview round.

Round 1 - One-on-one 

(2 Questions)

  • Q1. About MS Office
  • Q2. Tell me about MS Office

Interview questions from similar companies

Interview experience: 5 (Excellent)
Difficulty level: Easy
Process Duration: Less than 2 weeks
Result: Selected

I applied via Naukri.com and was interviewed in Nov 2024. There were 2 interview rounds.

Round 1 - Aptitude Test 

The aptitude test assesses mathematical and logical reasoning abilities.

Round 2 - Technical 

(6 Questions)

  • Q1. What is VLOOKUP?
  • Q2. Some IF/ELSE questions in Excel
  • Q3. What does your day in your previous organization look like?
  • Q4. Could you share the technical skills you possess?
  • Q5. Can you explain what a Pivot Table is?
  • Q6. Find the highest-paid employee in each department along with their salary and department name. (A sketch follows after this list.)
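
A minimal sketch for Q6, run through SQLite from Python purely so it is self-contained; the table name, column names, and sample rows are assumptions, and in an interview the same logic would usually be written directly in SQL (window functions need SQLite 3.25+):

    import sqlite3

    # Hypothetical schema: employees(emp_name, department, salary).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employees (emp_name TEXT, department TEXT, salary REAL)")
    conn.executemany(
        "INSERT INTO employees VALUES (?, ?, ?)",
        [("Asha", "Sales", 70000), ("Ravi", "Sales", 90000),
         ("Meera", "HR", 60000), ("John", "HR", 65000)],
    )

    # Rank employees by salary within each department and keep the top rank;
    # ties are all returned.
    query = """
        SELECT department, emp_name, salary
        FROM (
            SELECT department, emp_name, salary,
                   RANK() OVER (PARTITION BY department ORDER BY salary DESC) AS rnk
            FROM employees
        ) AS ranked
        WHERE rnk = 1
    """
    for row in conn.execute(query):
        print(row)   # e.g. ('Sales', 'Ravi', 90000.0) and ('HR', 'John', 65000.0)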

Interview Preparation Tips

Topics to prepare for Nagarro Data Analyst interview:
  • SQL
  • Excel
  • Problem Solving
  • PowerBI
  • SQL Queries
Interview preparation tips for other job seekers - Practice common interview questions and scenarios, especially for your role.
Be prepared to discuss past challenges and how you overcame them.

Interview experience: 3 (Average)
Difficulty level: Moderate
Process Duration: 2-4 weeks
Result: Not Selected

I was approached by the company and interviewed in Dec 2024. There was 1 interview round.

Round 1 - Technical 

(2 Questions)

  • Q1. Give a high-level overview of a Data Guard installation.
  • Q2. What are your daily tasks, and what do you handle in your team?

Interview experience: 3 (Average)
Difficulty level: -
Process Duration: -
Result: No response
Round 1 - Technical 

(4 Questions)

  • Q1. What is the architecture of Apache Spark?
  • Ans. 

    Apache Spark architecture includes a cluster manager, worker nodes, and driver program.

    • Apache Spark architecture consists of a cluster manager, which allocates resources and schedules tasks.

    • Worker nodes execute tasks and store data in memory or disk.

    • Driver program coordinates tasks and communicates with the cluster manager.

    • Spark applications run as independent sets of processes on a cluster, coordinated by the SparkContext in the driver program (see the first sketch after this list).

  • Answered by AI
  • Q2. What is the difference between the reduceBy and groupBy transformations in Apache Spark?
  • Ans. 

    reduceByKey aggregates the values for each key, while groupByKey only groups the values for each key.

    • reduceByKey combines the values of each key using an associative, commutative function, merging values within each partition before the shuffle.

    • groupByKey groups the values for each key and returns a grouped data set.

    • reduceByKey is therefore more efficient for aggregation, since it reduces the data before shuffling, while groupByKey shuffles every value across the network (see the second sketch after this list).

  • Answered by AI
  • Q3. What is the difference between RDD (Resilient Distributed Datasets) and DataFrame in Apache Spark?
  • Ans. 

    RDD is a low-level abstraction representing a distributed collection of objects, while DataFrame is a higher-level abstraction representing a distributed collection of data organized into named columns.

    • RDD is more suitable for unstructured data and low-level transformations, while DataFrame is more suitable for structured data and high-level abstractions.

    • DataFrames provide optimizations like query optimization and code generation, which RDDs do not get.

  • Answered by AI
  • Q4. What are the different modes of execution in Apache Spark?
  • Ans. 

    The different modes of execution in Apache Spark include local mode, standalone mode, YARN mode, and Mesos mode.

    • Local mode: Spark runs on a single machine with one executor.

    • Standalone mode: Spark runs on a cluster managed by a standalone cluster manager.

    • YARN mode: Spark runs on a Hadoop cluster using YARN as the resource manager.

    • Mesos mode: Spark runs on a Mesos cluster with Mesos as the resource manager.

  • Answered by AI
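
For the Spark questions above, two hedged PySpark sketches follow. This first one relates to Q1 and Q4: the script itself is the driver program, and the master URL selects the execution mode, i.e. which cluster manager supplies the executors. It assumes pyspark is installed; the app name is made up.

    from pyspark.sql import SparkSession

    # This script is the driver program. The master URL chooses the execution
    # mode, i.e. which cluster manager supplies the executors:
    #   "local[*]"           - local mode, all cores of one machine
    #   "spark://host:7077"  - standalone cluster manager
    #   "yarn"               - Hadoop YARN (usually via spark-submit --master yarn)
    #   "mesos://host:5050"  - Apache Mesos
    spark = (
        SparkSession.builder
        .appName("architecture-demo")
        .master("local[*]")   # swap for a real cluster manager URL in production
        .getOrCreate()
    )

    # The driver splits work into tasks; the cluster manager schedules them on
    # executors running on worker nodes, and results come back to the driver.
    print(spark.range(1_000_000).selectExpr("sum(id) AS total").collect())
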
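The second sketch relates to Q2 and Q3: the same word count is computed with reduceByKey (partial sums per partition before the shuffle), with groupByKey (every value shuffled), and then as a DataFrame, which gets named columns and Catalyst/Tungsten optimization. The sample data and names are invented.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("rdd-vs-dataframe").getOrCreate()

    words = ["spark", "rdd", "spark", "dataframe", "rdd", "spark"]
    pairs = spark.sparkContext.parallelize(words).map(lambda w: (w, 1))

    # reduceByKey: partial sums are computed map-side, so less data is shuffled.
    counts_reduce = pairs.reduceByKey(lambda a, b: a + b)

    # groupByKey: every (key, value) pair crosses the shuffle, then gets summed.
    counts_group = pairs.groupByKey().mapValues(sum)

    print(sorted(counts_reduce.collect()))  # [('dataframe', 1), ('rdd', 2), ('spark', 3)]
    print(sorted(counts_group.collect()))   # same answer, more shuffle traffic

    # The same aggregation as a DataFrame: a structured, named-column API that
    # goes through the Catalyst optimizer, which raw RDD code does not.
    df = spark.createDataFrame([(w,) for w in words], ["word"])
    df.groupBy("word").count().show()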

Data Engineer Interview Questions & Answers

Sashikanta Parida (Genpact), posted on 17 Dec 2024

Interview experience: 5 (Excellent)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: Not Selected

I applied via a recruitment consultant and was interviewed in Nov 2024. There were 2 interview rounds.

Round 1 - Technical 

(3 Questions)

  • Q1. What are the different types of joins available in Databricks? (See the first sketch after this round's questions.)
  • Ans. 

    Different types of joins available in Databricks include inner join, outer join, left join, right join, and cross join.

    • Inner join: Returns only the rows that have matching values in both tables.

    • Outer join (full outer join): Returns all rows from both tables, with NULLs where there is no match.

    • Left join: Returns all rows from the left table and the matched rows from the right table.

    • Right join: Returns all rows from the right table and the matched rows from the left table.

  • Answered by AI
  • Q2. How do you make your data pipeline fault tolerant?
  • Ans. 

    Implementing fault tolerance in a data pipeline involves redundancy, monitoring, and error handling.

    • Use redundant components to ensure continuous data flow

    • Implement monitoring tools to detect failures and bottlenecks

    • Set up automated alerts for immediate response to issues

    • Design error handling mechanisms to gracefully handle failures

    • Use checkpoints and retries to ensure data integrity

  • Answered by AI
  • Q3. What is AutoLoader?
  • Ans. 

    Auto Loader is a Databricks feature that incrementally and automatically ingests new files from cloud object storage as they arrive.

    • Automates incremental file ingestion from sources such as S3, ADLS, or GCS

    • Reduces manual effort and avoids reprocessing files that were already loaded

    • Can run continuously or be triggered on a schedule

    • Exposed in Structured Streaming as the cloudFiles source (see the second sketch after this round's questions)

  • Answered by AI
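
Two hedged PySpark sketches for this round. The first covers the join types from Q1; the DataFrames, column names, and values are invented, and on Databricks the spark session already exists (the builder line is only there so the snippet runs elsewhere too).

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("join-demo").getOrCreate()

    emp = spark.createDataFrame(
        [(1, "Asha", 10), (2, "Ravi", 20), (3, "Meera", 99)],
        ["emp_id", "name", "dept_id"],
    )
    dept = spark.createDataFrame(
        [(10, "Sales"), (20, "HR"), (30, "IT")],
        ["dept_id", "dept_name"],
    )

    emp.join(dept, "dept_id", "inner").show()  # only matching dept_id rows
    emp.join(dept, "dept_id", "left").show()   # all employees, NULLs for dept 99
    emp.join(dept, "dept_id", "right").show()  # all departments, NULLs for IT
    emp.join(dept, "dept_id", "full").show()   # all rows from both sides
    emp.crossJoin(dept).show()                 # cartesian product, 3 x 3 = 9 rows
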
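The second sketch touches Q2 and Q3 together: Auto Loader's cloudFiles source picks up new files incrementally, and the checkpoint location is what lets the stream restart after a failure without reprocessing old files. The paths, schema, and table name are placeholders, and this assumes a Databricks runtime where spark is predefined.

    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    schema = StructType([
        StructField("order_id", StringType()),
        StructField("amount", DoubleType()),
    ])

    # Auto Loader: incrementally ingest new JSON files landing in cloud storage.
    stream = (
        spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .schema(schema)
        .load("/mnt/raw/orders/")              # placeholder input path
    )

    # Fault tolerance: the checkpoint records which files were already processed,
    # so a restarted job resumes where it left off instead of reloading everything.
    (
        stream.writeStream
        .option("checkpointLocation", "/mnt/checkpoints/orders/")  # placeholder
        .trigger(availableNow=True)            # process what is available, then stop
        .toTable("bronze_orders")              # placeholder Delta table
    )
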
Round 2 - Technical 

(2 Questions)

  • Q1. How do you connect to different services in Azure?
  • Ans. 

    To connect to different services in Azure, you can use Azure SDKs, REST APIs, Azure Portal, Azure CLI, and Azure PowerShell.

    • Use Azure SDKs for programming languages like Python, Java, C#, etc.

    • Utilize REST APIs to interact with Azure services programmatically.

    • Access and manage services through the Azure Portal.

    • Leverage Azure CLI for command-line interface interactions.

    • Automate tasks using Azure PowerShell scripts.

  • Answered by AI
  • Q2. What are linked Services?
  • Ans. 

    Linked Services are connections to external data sources or destinations in Azure Data Factory.

    • Linked Services define the connection information needed to connect to external data sources or destinations.

    • They can be used in Data Factory pipelines to read from or write to external systems.

    • Examples of Linked Services include Azure Blob Storage, Azure SQL Database, and Amazon S3.

  • Answered by AI
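
A hedged Python sketch for Q1 above: one common pattern is the Azure SDK with DefaultAzureCredential, which works the same way across services (the Linked Services from Q2 are the Data Factory-side equivalent: stored connection definitions that pipelines reference). The account name, vault name, and secret name are placeholders.

    # Assumes: pip install azure-identity azure-storage-blob azure-keyvault-secrets
    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobServiceClient
    from azure.keyvault.secrets import SecretClient

    # DefaultAzureCredential tries environment variables, managed identity,
    # Azure CLI login, etc. in turn, so the same code runs locally and in Azure.
    credential = DefaultAzureCredential()

    # Blob Storage (placeholder account name)
    blob_service = BlobServiceClient(
        account_url="https://<storage-account>.blob.core.windows.net",
        credential=credential,
    )
    for container in blob_service.list_containers():
        print(container.name)

    # Key Vault (placeholder vault and secret names)
    secrets = SecretClient(
        vault_url="https://<vault-name>.vault.azure.net",
        credential=credential,
    )
    print(secrets.get_secret("example-secret").value)
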
Interview experience: 5 (Excellent)
Difficulty level: Easy
Process Duration: Less than 2 weeks
Result: Selected

I applied via a recruitment consultant and was interviewed in Nov 2024. There were 2 interview rounds.

Round 1 - HR 

(2 Questions)

  • Q1. Can you provide an overview of your background, including your past experiences and daily activities, as well as the tools you use in your routine?
  • Ans. 

    I have a background in data analysis with experience in using tools like Python, SQL, and Tableau.

    • I have a degree in Statistics and have worked as a Data Analyst for 3 years.

    • My daily activities include cleaning and analyzing data, creating visualizations, and presenting insights to stakeholders.

    • I use Python for data manipulation and analysis, SQL for querying databases, and Tableau for creating interactive dashboards.

    • I...

  • Answered by AI
  • Q2. What are the concepts of advanced Excel and Power BI projects, and how are they utilized within a company or for clients?
  • Ans. 

    Advanced Excel and Power BI are tools used for data analysis and visualization in companies and for clients.

    • Advanced Excel allows for complex data manipulation, analysis, and visualization using features like pivot tables, macros, and VBA programming.

    • Power BI is a business analytics tool that provides interactive visualizations and business intelligence capabilities, connecting to various data sources.

    • These tools are u...

  • Answered by AI
Round 2 - One-on-one 

(2 Questions)

  • Q1. Can you explain your project experience related to Advanced Excel and Power BI?
  • Ans. 

    I have extensive experience in using Advanced Excel and Power BI for data analysis projects.

    • Created complex formulas and macros in Excel to automate data processing tasks

    • Designed interactive dashboards in Power BI to visualize and analyze data trends

    • Integrated data from multiple sources into Power BI for comprehensive analysis

    • Used Power Query and Power Pivot in Excel to manipulate and analyze large datasets

    • Provided dat...

  • Answered by AI
  • Q2. What are the concepts of credit and operations, particularly in relation to Know Your Customer (KYC) procedures and the privacy of client data?
  • Ans. 

    Credit and operations concepts in relation to KYC procedures and client data privacy.

    • Credit refers to the extension of money or resources to a client based on their financial history and ability to repay.

    • Operations involve the day-to-day processes and procedures within a financial institution to ensure smooth functioning.

    • KYC procedures are used to verify the identity of clients to prevent fraud and money laundering.

    • Pri...

  • Answered by AI

Interview Preparation Tips

Interview preparation tips for other job seekers - If your resume is shortlisted, then there is a higher chance that you will be selected.

Interview experience: 4 (Good)
Difficulty level: -
Process Duration: -
Result: -
Round 1 - Technical 

(3 Questions)

  • Q1. What are the optimization techniques used in Apache Spark?
  • Q2. Two SQL queries, one PySpark coding problem, and one Python coding problem.
  • Q3. Two to three scenario-based questions on ADF and Databricks.

Interview experience: 1 (Bad)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: No response

I applied via Job Fair and was interviewed in Nov 2024. There were 2 interview rounds.

Round 1 - Technical 

(2 Questions)

  • Q1. DAX-related syntax and code
  • Q2. Data modelling, SQL, Python
Round 2 - Technical 

(1 Question)

  • Q1. No response from HR after being informed of selection following Round 1

Interview experience: 5 (Excellent)
Difficulty level: -
Process Duration: -
Result: -

I was interviewed in Dec 2024.

Round 1 - Technical 

(2 Questions)

  • Q1. Window function-related questions.
  • Q2. Join-related questions.
Round 2 - Technical 

(2 Questions)

  • Q1. Join-related questions.
  • Q2. Subquery-related questions.

Interview experience: 4 (Good)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: Not Selected

I applied via Company Website and was interviewed in Sep 2024. There were 2 interview rounds.

Round 1 - Coding Test 

Platform - HackerRank
Duration - 2 Hours
Topics - Spark and SQL

Round 2 - Technical 

(3 Questions)

  • Q1. What are the common file formats used in data storage? Which one is best for compression? (See the first sketch after this list.)
  • Ans. 

    Common file formats used in data storages include CSV, JSON, Parquet, Avro, and ORC. Parquet is best for compression.

    • CSV (Comma-Separated Values) - simple and widely used, but not efficient for large datasets

    • JSON (JavaScript Object Notation) - human-readable and easy to parse, but can be inefficient for storage

    • Parquet - columnar storage format that is highly efficient for compression and query performance

    • Avro - efficie...

  • Answered by AI
  • Q2. SQL problem - Given an employee attendance table, write a query to print the employees who were absent for more than 10 consecutive days during their tenure. (See the second sketch after this list.)
  • Q3. Given a list of words, write a Python program to print the most repeating substring across all the words. (See the third sketch after this list.)
  • Ans. 

    Python program to find the most repeating substring in a list of words.

    • Iterate through each word in the list

    • Generate all possible substrings for each word

    • Count the occurrences of each substring using a dictionary

    • Find the substring with the highest count

  • Answered by AI
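
Three hedged sketches for this round. The first writes the same DataFrame as CSV, JSON, and Parquet to illustrate the format comparison in Q1; the output paths are placeholders, and snappy is only one of several Parquet codecs.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("formats-demo").getOrCreate()
    df = spark.range(1000).withColumnRenamed("id", "order_id")

    df.write.mode("overwrite").csv("/tmp/orders_csv", header=True)  # row-based text
    df.write.mode("overwrite").json("/tmp/orders_json")             # row-based text
    df.write.mode("overwrite").parquet("/tmp/orders_parquet",
                                       compression="snappy")        # columnar, compressed
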
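The second sketch is the usual gaps-and-islands approach to Q2, run through SQLite from Python only so it is self-contained; the attendance schema (one row per employee per day, with an 'A' status marking an absence) is an assumption, and window functions need SQLite 3.25+.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE attendance (emp_id INTEGER, day DATE, status TEXT)")
    # ... insert rows here; 'A' marks an absence in this hypothetical schema ...

    # Gaps-and-islands: for consecutive absent days, (day - row_number) stays
    # constant, so grouping on that difference isolates each unbroken streak.
    query = """
        WITH absences AS (
            SELECT emp_id,
                   julianday(day) - ROW_NUMBER() OVER (
                       PARTITION BY emp_id ORDER BY day
                   ) AS streak_key
            FROM attendance
            WHERE status = 'A'
        )
        SELECT DISTINCT emp_id
        FROM absences
        GROUP BY emp_id, streak_key
        HAVING COUNT(*) > 10
    """
    for (emp_id,) in conn.execute(query):
        print(emp_id)
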
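The third sketch follows the brute-force steps listed in the answer to Q3; "most repeating substring" is read here as the substring of length 2 or more that occurs most often across all the words, which is one interpretation of the prompt.

    from collections import Counter

    def most_repeating_substring(words, min_len=2):
        """Return the substring (length >= min_len) that occurs most often
        across all words, counting overlapping occurrences."""
        counts = Counter()
        for word in words:
            for i in range(len(word)):
                for j in range(i + min_len, len(word) + 1):
                    counts[word[i:j]] += 1
        return counts.most_common(1)[0] if counts else None

    print(most_repeating_substring(["banana", "bandana", "cabana"]))  # ('an', 5)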

Interview Preparation Tips

Interview preparation tips for other job seekers - Prepare well in SQL, Spark and Python coding problems.


NTT Data Interview FAQs

How many rounds are there in the NTT Data Data Executive interview?
The NTT Data Data Executive interview process usually has 1 round. The most common round is a One-on-one round.


Interview Questions from Similar Companies

• TCS - 3.7 (10.2k interviews)
• Accenture - 3.9 (8k interviews)
• Infosys - 3.7 (7.5k interviews)
• Wipro - 3.7 (5.5k interviews)
• Cognizant - 3.8 (5.5k interviews)
• Capgemini - 3.8 (4.7k interviews)
• Tech Mahindra - 3.6 (3.7k interviews)
• HCLTech - 3.5 (3.7k interviews)
• Genpact - 3.9 (3k interviews)
• IBM - 4.1 (2.3k interviews)
NTT Data Salaries
• Software Engineer (932 salaries): ₹2.8 L/yr - ₹11 L/yr
• Senior Associate (762 salaries): ₹1.2 L/yr - ₹9.3 L/yr
• Network Engineer (647 salaries): ₹1.8 L/yr - ₹10 L/yr
• Software Developer (615 salaries): ₹2.5 L/yr - ₹13 L/yr
• Senior Software Engineer (512 salaries): ₹6.5 L/yr - ₹24 L/yr

Compare NTT Data with
• Tata Communications - 4.1
• Bharti Airtel - 4.0
• Reliance Communications - 4.0
• Vodafone Idea - 4.1
