KPI Partners
I applied via LinkedIn and was interviewed in Nov 2024. There was 1 interview round.
I applied via Naukri.com and was interviewed in Nov 2023. There were 3 interview rounds.
CONCAT and the TRIM function in SQL
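A quick way to try these two functions is from Python's built-in `sqlite3` module. Note that SQLite concatenates with the `||` operator, while MySQL and SQL Server provide a `CONCAT()` function; `TRIM` behaves the same in all major dialects. The sample strings are made up.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# TRIM strips leading/trailing spaces; || concatenates (SQLite's
# equivalent of CONCAT in other dialects).
row = conn.execute(
    "SELECT TRIM('  John  ') || ' ' || TRIM('Doe  ') AS full_name"
).fetchone()
full_name = row[0]
print(full_name)  # John Doe
conn.close()
```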
Tableau is a data visualization tool used to build interactive charts and dashboards.
Connect to data source
Drag and drop dimensions and measures
Choose chart type
Customize colors, labels, and formatting
Add filters and parameters
Create calculated fields
Build interactive dashboards
Informatica is a data integration tool used for ETL (Extract, Transform, Load) processes in data engineering.
Informatica is used for extracting data from various sources like databases, flat files, etc.
It can transform the data according to business rules and load it into a target data warehouse or database.
Informatica provides a visual interface for designing ETL workflows and monitoring data integration processes.
It ...
Datastage is an ETL tool used for extracting, transforming, and loading data from various sources to a target destination.
Datastage is part of the IBM Information Server suite.
It provides a graphical interface to design and run data integration jobs.
Datastage supports parallel processing for high performance.
It can connect to a variety of data sources such as databases, flat files, and web services.
Datastage jobs can be ...
General Aptitude test
The company approached me, and I was interviewed before May 2023. There were 2 interview rounds.
Our tech stack includes Python, SQL, Apache Spark, Hadoop, AWS, and Docker.
Python is used for data processing and analysis
SQL is used for querying databases
Apache Spark is used for big data processing
Hadoop is used for distributed storage and processing
AWS is used for cloud infrastructure
Docker is used for containerization
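As a tiny illustration of the "Python for data processing" item above (standard library only; the sales data is invented, and in practice the input would come from a database or object store rather than an inline string):

```python
import csv
import io
from collections import defaultdict

# Hypothetical CSV extract -- stands in for a file read from S3/HDFS.
raw = "region,amount\nnorth,10\nsouth,20\nnorth,5\n"

# Aggregate the amount per region.
totals = defaultdict(float)
for row in csv.DictReader(io.StringIO(raw)):
    totals[row["region"]] += float(row["amount"])

print(dict(totals))  # {'north': 15.0, 'south': 20.0}
```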
posted on 9 May 2022
The company approached me, and I was interviewed in Nov 2021. There was 1 interview round.
Normalization is a process of organizing data in a database to reduce redundancy and improve data integrity.
Normalization involves breaking down a table into smaller tables and defining relationships between them.
It helps in reducing data redundancy and inconsistencies.
Views are virtual tables that are created based on the result of a query. They can be used to simplify complex queries.
Stored procedures are precompiled SQL routines stored in the database that encapsulate logic and can be executed repeatedly.
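The normalization and view ideas above can be demonstrated with SQLite from Python. The schema and data are hypothetical, and SQLite is used only because it ships with Python (note it does not support stored procedures, so only the first two concepts are shown):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A denormalized table would repeat the customer's city on every order row.
# Normalization splits it into two tables linked by customer_id.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
cur.execute("""CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(id),
    amount REAL)""")
cur.execute("INSERT INTO customers VALUES (1, 'Asha', 'Pune')")
cur.executemany("INSERT INTO orders VALUES (?, 1, ?)", [(1, 100.0), (2, 250.0)])

# A view hides the join behind a simple virtual table.
cur.execute("""
    CREATE VIEW customer_orders AS
    SELECT c.name, c.city, o.amount
    FROM customers c JOIN orders o ON o.customer_id = c.id
""")
rows = cur.execute(
    "SELECT name, amount FROM customer_orders ORDER BY amount"
).fetchall()
print(rows)  # [('Asha', 100.0), ('Asha', 250.0)]
conn.close()
```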
| Designation | Salaries reported | Salary range |
| --- | --- | --- |
| Data Engineer | 73 | ₹3.2 L/yr - ₹12.7 L/yr |
| Senior Data Engineer | 61 | ₹12.5 L/yr - ₹28 L/yr |
| Senior Consultant | 45 | ₹8 L/yr - ₹22 L/yr |
| Lead Data Engineer | 38 | ₹21.5 L/yr - ₹33 L/yr |
| Senior Data Analyst | 24 | ₹10.4 L/yr - ₹24 L/yr |