NTT Data
ABAP Dictionary is a central repository for data definitions used in SAP applications. ALV report events are user actions in an ALV grid. OOPs ABAP is object-oriented programming in ABAP.
ABAP Dictionary is used to define and manage data definitions in SAP applications
ALV report events include user actions like clicking on a row or column header
OOPs ABAP allows for object-oriented programming concepts like classes and inheritance (a generic sketch follows below)
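These OO concepts are general; as a rough illustration (not ABAP syntax, which would use CLASS ... DEFINITION and INHERITING FROM), here is a minimal Python sketch with made-up class names showing a class, inheritance, and a method override.

```python
# Illustrative only: generic OOP concepts (class, inheritance, method override).
# The class names are hypothetical; ABAP declares these with CLASS ... DEFINITION
# and INHERITING FROM instead.

class Vehicle:
    def __init__(self, name: str):
        self.name = name

    def describe(self) -> str:
        return f"{self.name} is a vehicle"


class Car(Vehicle):  # inheritance: Car reuses and extends Vehicle
    def describe(self) -> str:  # override: the subclass redefines the behaviour
        return f"{self.name} is a car"


if __name__ == "__main__":
    for v in (Vehicle("Generic"), Car("Sedan")):
        print(v.describe())
```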
Aptitude: 15 MCQs, reasoning: 15 MCQs, verbal: 15 MCQs
Some general topics, such as life with and without mobile phones, and whether our country is developed, developing, or not developing.
I applied via Recruitment Consultant and was interviewed in Jul 2021. There was 1 interview round.
Coding test followed by technical discussion
Governor limits are restrictions set by Salesforce to prevent code from consuming too many resources.
Governor limits are restrictions set by Salesforce to ensure efficient use of resources.
They include limits on CPU time, heap size, SOQL queries, DML statements, etc.
Exceeding these limits can result in exceptions like 'Apex CPU time limit exceeded' (see the bulkification sketch below).
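The usual way to stay inside these limits is to bulkify, i.e. issue one query or DML statement per batch instead of one per record. Apex code isn't shown in the original, so the sketch below illustrates the idea in Python, with a hypothetical query counter standing in for Salesforce's per-transaction SOQL limit.

```python
# Illustrative sketch of "bulkification": the counter and records here are
# hypothetical stand-ins for Salesforce's per-transaction governor limits.

MAX_QUERIES = 100  # Salesforce caps SOQL queries per synchronous transaction at 100

query_count = 0

def run_query(ids):
    """Pretend data access; each call counts against the query limit."""
    global query_count
    query_count += 1
    if query_count > MAX_QUERIES:
        raise RuntimeError("Too many queries: limit exceeded")
    return [{"id": i} for i in ids]

record_ids = list(range(250))

# Anti-pattern: one query per record quickly exhausts the limit.
# for rid in record_ids:
#     run_query([rid])          # would raise after 100 iterations

# Bulkified pattern: a single query for the whole batch stays well under it.
rows = run_query(record_ids)
print(len(rows), "rows fetched with", query_count, "query")
```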
Reverse a given string by iterating through it backwards and appending each character to a new string.
Create a new empty string to store the reversed string
Iterate through the original string from the last character to the first character
Append each character to the new string
Return the new string as the reversed string (a Python version is sketched below)
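A direct translation of these steps, sketched in Python (the function name is arbitrary):

```python
def reverse_string(s: str) -> str:
    """Reverse a string by walking it from the last character to the first."""
    reversed_s = ""                      # step 1: new empty string
    for i in range(len(s) - 1, -1, -1):  # step 2: iterate backwards
        reversed_s += s[i]               # step 3: append each character
    return reversed_s                    # step 4: return the reversed string

print(reverse_string("NTT Data"))  # -> "ataD TTN"
```

In practice most languages have a shortcut (Python's s[::-1] does the same in one step), but interviewers usually want the explicit loop above.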
I applied via Walk-in and was interviewed in Feb 2024. There were 2 interview rounds.
Coding round with manager or lead
GD on common topics like Gen AI, a 'tell me about yourself' introduction, and speaking with the team
Azure Databricks is a unified analytics platform that combines big data processing and machine learning.
Collaborative environment for data scientists, data engineers, and business analysts
Integrated with Azure services for data storage, processing, and analytics
Supports popular programming languages like Python, Scala, and SQL
Provides tools for data visualization and machine learning model development (a minimal usage sketch follows below)
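As a rough illustration of that Python-plus-SQL support, here is a minimal PySpark sketch; in a Databricks notebook the spark session is predefined, and the file path and column names below are hypothetical.

```python
from pyspark.sql import SparkSession

# In a Databricks notebook `spark` already exists; creating it explicitly
# keeps the sketch runnable elsewhere too.
spark = SparkSession.builder.appName("demo").getOrCreate()

# Hypothetical input path and columns, for illustration only.
df = spark.read.csv("/tmp/sales.csv", header=True, inferSchema=True)

# The same data can be queried with the DataFrame API or with SQL.
df.createOrReplaceTempView("sales")
spark.sql("SELECT region, SUM(amount) AS total FROM sales GROUP BY region").show()
```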
Cache in Databricks is a mechanism to store intermediate results of computations for faster access.
Cache in Databricks is used to store intermediate results of computations in memory for faster access.
It helps in reducing the time taken to recompute the same data by storing it in memory.
Data can be cached at different levels such as DataFrame, RDD, or table.
Example: df.cache() will cache the DataFrame 'df' in memory for faster access (sketched below)
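A minimal sketch of that caching pattern, assuming a hypothetical DataFrame built with spark.range(); persist() and unpersist() are the standard Spark calls for finer control and cleanup.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cache-demo").getOrCreate()

# Hypothetical DataFrame; in Databricks this would typically come from a table or file.
df = spark.range(1_000_000).withColumnRenamed("id", "value")

df.cache()        # mark the DataFrame for in-memory caching
df.count()        # first action materializes the cache

# Subsequent actions reuse the cached data instead of recomputing it.
print(df.filter(df.value % 2 == 0).count())

df.unpersist()    # release the cached data when it is no longer needed
```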
Math questions based on logic
National anthem: pros and cons
Example coding questions on logic and algorithms
Role | Salaries reported | Salary range
Software Engineer | 935 | ₹2.8 L/yr - ₹11 L/yr
Senior Associate | 761 | ₹1.2 L/yr - ₹7.3 L/yr
Network Engineer | 654 | ₹1.8 L/yr - ₹10 L/yr
Software Developer | 615 | ₹2.5 L/yr - ₹13 L/yr
Senior Software Engineer | 510 | ₹6.5 L/yr - ₹25.5 L/yr
Tata Communications
Bharti Airtel
Reliance Communications
Vodafone Idea