KPI Partners
The minimum and maximum cement content in concrete depend on the type of mix and its intended use.
Minimum cement content is typically around 300 kg/m3 for general use
Maximum cement content is usually limited to prevent excessive heat generation and cracking
For example, in high strength concrete mixes, the maximum cement content may be around 450 kg/m3
Water-cement ratio in M-30 grade concrete is typically between 0.45 and 0.55.
The water-cement ratio in M-30 grade concrete is crucial for its strength and durability.
A lower water-cement ratio (e.g. 0.45) results in stronger concrete but may be difficult to work with.
A higher water-cement ratio (e.g. 0.55) makes the concrete easier to work with but may reduce its strength.
Maintaining the correct water-cement ratio is im...
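As a rough worked example (the cement figure is assumed purely for illustration): with 400 kg of cement per cubic metre at a water-cement ratio of 0.45, the mix would use about 0.45 × 400 = 180 kg of water per cubic metre.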
Mix design code refers to the process of determining the proportions of materials in concrete mixtures. K-7 and K-9 refer to wire mesh grades used in construction.
Mix design code is used to determine the proportions of materials in concrete mixtures.
Wire mesh code refers to the grade of wire mesh used in construction, with K-7 and K-9 being common grades.
The difference between K-7 and K-9 wire mesh grades lies in their...
I applied via Naukri.com and was interviewed in Mar 2024. There was 1 interview round.
Facts and dimensions are key concepts in data warehousing. Facts are numerical data that can be measured, while dimensions are descriptive attributes related to the facts.
Facts are quantitative data that can be aggregated, such as sales revenue or quantity sold.
Dimensions are descriptive attributes that provide context to the facts, such as product category, customer name, or date.
Facts are typically stored in fact tab...
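A minimal sketch of the idea, using SQLite in Python with hypothetical dim_product and fact_sales tables: the dimension holds descriptive attributes, the fact table holds the measures that get aggregated.

```python
# Minimal star-schema sketch (hypothetical tables and sample data).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive context (product category)
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
# Fact table: numeric, aggregatable measures keyed to the dimension
cur.execute("CREATE TABLE fact_sales (product_id INTEGER, quantity INTEGER, revenue REAL)")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Electronics"), (2, "Furniture")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 3, 300.0), (1, 1, 100.0), (2, 2, 500.0)])

# Facts are aggregated; dimensions slice the aggregation
for row in cur.execute("""
    SELECT d.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category"""):
    print(row)  # ('Electronics', 400.0), ('Furniture', 500.0)
```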
I was approached by the company and interviewed in Sep 2024. There was 1 interview round.
Informatica is a data integration tool used for ETL (Extract, Transform, Load) processes in data engineering.
Informatica is used for extracting data from various sources like databases, flat files, etc.
It can transform the data according to business rules and load it into a target data warehouse or database.
Informatica provides a visual interface for designing ETL workflows and monitoring data integration processes.
It ...
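Not Informatica code itself (its mappings are built in a visual designer), but a minimal Python sketch of the same extract-transform-load pattern, using a hypothetical in-memory source and an SQLite target:

```python
# Generic ETL sketch: extract rows, apply business rules, load a target.
import sqlite3

# Extract: rows pulled from a source system (hypothetical sample data)
source_rows = [
    {"id": 1, "name": " alice ", "amount": "100.5"},
    {"id": 2, "name": "BOB",     "amount": "200.0"},
]

# Transform: apply business rules (trim/normalise names, cast amounts)
transformed = [
    (r["id"], r["name"].strip().title(), float(r["amount"]))
    for r in source_rows
]

# Load: write the cleaned rows into the target table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, amount REAL)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", transformed)
conn.commit()
print(conn.execute("SELECT * FROM customers").fetchall())
```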
Datastage is an ETL tool used for extracting, transforming, and loading data from various sources to a target destination.
Datastage is part of the IBM Information Server suite.
It provides a graphical interface to design and run data integration jobs.
Datastage supports parallel processing for high performance.
It can connect to a variety of data sources such as databases, flat files, and web services.
Datastage jobs can b...
I applied via a recruitment consultant and was interviewed before Nov 2023. There was 1 interview round.
posted on 7 Jan 2025
I was approached by the company and interviewed before Jan 2024. There were 3 interview rounds.
Basics of SQL and Python
Experience-based questions on SQL and Python
I have worked on projects involving building data pipelines, optimizing data storage, and developing machine learning models.
Built data pipelines using Apache Spark and Airflow
Optimized data storage by implementing partitioning and indexing strategies
Developed machine learning models for predictive analytics
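To illustrate the pipeline point, here is a minimal Apache Airflow DAG sketch (assumes Airflow 2.x; the DAG id and the extract/transform functions are hypothetical, not from the original interview):

```python
# Minimal daily pipeline DAG: run extract, then transform.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")

def transform():
    print("clean and aggregate the extracted data")

with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # extract runs before transform
```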
I applied via Referral and was interviewed in Jul 2024. There were 2 interview rounds.
1 hour; time, speed and distance problems
1 hour; SQL, Python, algebra, averages
Databricks is a unified data analytics platform that includes components like Databricks Workspace, Databricks Runtime, and Databricks Delta.
Databricks Workspace: Collaborative environment for data science and engineering teams.
Databricks Runtime: Optimized Apache Spark cluster for data processing.
Databricks Delta: Unified data management system for data lakes.
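A minimal PySpark sketch of writing and reading a Delta table (the path and column names are hypothetical; assumes a Databricks cluster, or a local Spark session with the delta-spark package configured):

```python
# Write a small DataFrame as a Delta table, then read it back.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta_example").getOrCreate()

df = spark.createDataFrame([(1, "alice"), (2, "bob")], ["id", "name"])

# Delta adds ACID transactions on top of data-lake storage
df.write.format("delta").mode("overwrite").save("/tmp/users_delta")

# Read the Delta table back and display it
spark.read.format("delta").load("/tmp/users_delta").show()
```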
To read a JSON file, use a programming language's built-in functions or libraries to parse the file and extract the data.
Use a programming language like Python, Java, or JavaScript to read the JSON file.
Import libraries like json in Python or json-simple in Java to parse the JSON data.
Use functions like json.load() in Python to load the JSON file and convert it into a dictionary or object.
Access the data in the JSON fi...
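A minimal sketch with Python's built-in json module (the file name and keys are hypothetical):

```python
# Read a JSON file from disk and access its contents.
import json

with open("config.json") as f:      # assumes a JSON file on disk
    data = json.load(f)             # parsed into dicts/lists

print(data["name"])                 # access a top-level key
for item in data.get("items", []):  # iterate a nested array safely
    print(item)
```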
To find the second highest salary in SQL, use the MAX function with a subquery or the LIMIT clause.
Use the MAX function with a subquery to find the highest salary first, then use a WHERE clause to exclude it and find the second highest salary.
Alternatively, order the salaries in descending order and use LIMIT with an OFFSET (or DISTINCT with LIMIT) to select the second highest salary directly.
Make sure to handle cases where there may be ties for the highest salary.
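A sketch of both approaches, run against a hypothetical employees table in SQLite from Python (the same SQL works in most databases, though LIMIT/OFFSET syntax varies slightly by engine):

```python
# Find the second highest salary two ways, handling ties on the top salary.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
conn.executemany("INSERT INTO employees VALUES (?, ?)",
                 [("a", 90000), ("b", 120000), ("c", 120000), ("d", 80000)])

# 1) MAX with a subquery: excludes the top salary, so ties are handled
second = conn.execute("""
    SELECT MAX(salary) FROM employees
    WHERE salary < (SELECT MAX(salary) FROM employees)
""").fetchone()[0]
print(second)  # 90000

# 2) DISTINCT + ORDER BY + LIMIT/OFFSET: skip the highest, take the next
second = conn.execute("""
    SELECT DISTINCT salary FROM employees
    ORDER BY salary DESC LIMIT 1 OFFSET 1
""").fetchone()[0]
print(second)  # 90000
```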
Spark cluster configuration involves setting up memory, cores, and other parameters for optimal performance.
Specify the number of executors and executor memory
Set the number of cores per executor
Adjust the driver memory based on the application requirements
Configure shuffle partitions for efficient data processing
Enable dynamic allocation for better resource utilization
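A minimal sketch of the settings listed above, applied through a SparkSession builder (the values are illustrative, not tuned; in practice these are often passed via spark-submit or the cluster configuration instead):

```python
# Example Spark configuration covering executors, memory, cores,
# shuffle partitions and dynamic allocation.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("tuned_job")
    .config("spark.executor.instances", "4")             # number of executors
    .config("spark.executor.memory", "8g")               # memory per executor
    .config("spark.executor.cores", "4")                 # cores per executor
    .config("spark.driver.memory", "4g")                 # driver memory
    .config("spark.sql.shuffle.partitions", "200")       # shuffle partitions
    .config("spark.dynamicAllocation.enabled", "true")   # dynamic allocation
    .getOrCreate()
)
```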
| Role | Salaries reported | Salary range |
|------|-------------------|--------------|
| Data Engineer | 85 | ₹3 L/yr - ₹12 L/yr |
| Senior Data Engineer | 56 | ₹12.5 L/yr - ₹28 L/yr |
| Lead Data Engineer | 52 | ₹19 L/yr - ₹31 L/yr |
| Senior Consultant | 45 | ₹8 L/yr - ₹22 L/yr |
| Senior Data Analyst | 24 | ₹10.4 L/yr - ₹24 L/yr |