LTIMindtree
Canara HSBC Life Insurance Interview Questions and Answers
Q1. What are the 7 layers in Azure Data Factory for building a pipeline that accepts data from on-premises sources and pushes the processed data to the Azure cloud?
The 7 layers in Azure Data Factory for pipelining data from on-premises to the Azure cloud are:
1. Ingestion Layer: Collects data from various sources such as on-premises databases, cloud storage, or IoT devices.
2. Storage Layer: Stores the ingested data in a data lake or data warehouse for processing.
3. Batch Layer: Processes data in batches using technologies like Azure Databricks or HDInsight.
4. Stream Layer: Processes real-time data streams using technologies like Azure Stream Analytics. …
Q2. What are the steps to convert a normal file to a flat file in Python?
To convert a normal file to a flat file in Python, you can read the file line by line and write the data to a new file with a delimiter.
Open the normal file in read mode
Read the file line by line
Split the data based on the delimiter (if applicable)
Write the data to a new file with a delimiter
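The steps above can be sketched in Python; this assumes a whitespace-separated input file and a comma-delimited output, and the file names and the to_flat_file helper are illustrative:

```python
def to_flat_file(src: str, dest: str, delimiter: str = ",") -> None:
    """Read src line by line, split each line into fields,
    and write the fields to dest joined by the delimiter."""
    with open(src, encoding="utf-8") as fin, \
         open(dest, "w", encoding="utf-8") as fout:
        for line in fin:
            fields = line.split()              # split on any whitespace
            if fields:                         # skip blank lines
                fout.write(delimiter.join(fields) + "\n")

# Example: a space-separated "normal" file becomes a comma-delimited flat file.
with open("input.txt", "w", encoding="utf-8") as f:
    f.write("id name city\n1 Asha Mumbai\n2 Ravi Delhi\n")
to_flat_file("input.txt", "output.csv")
print(open("output.csv", encoding="utf-8").read())
```

If the source file already uses a different delimiter (for example tabs), split on that delimiter instead of whitespace before rejoining.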
Q3. What is lambda architecture?
Lambda architecture is a data processing architecture designed to handle massive quantities of data by using both batch and stream processing methods.
Combines batch processing layer, speed layer, and serving layer
Batch layer processes historical data in large batches
Speed layer processes real-time data
Serving layer merges results from batch and speed layers for querying
Example: Apache Hadoop for batch processing, Apache Storm for real-time processing
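The three layers can be illustrated with a toy Python sketch for a hypothetical event-count workload; the master_dataset, recent_events, and view functions here are illustrative stand-ins, not a real batch/stream stack:

```python
from collections import Counter

master_dataset = ["click", "view", "click", "view", "view"]  # historical events
recent_events = ["click", "view"]                            # not yet batched

def batch_view(events):
    # Batch layer: full recomputation over all historical data
    return Counter(events)

def speed_view(events):
    # Speed layer: incremental counts over recent, unbatched events
    return Counter(events)

def serve(batch, speed):
    # Serving layer: merge batch and speed views to answer queries
    return batch + speed

merged = serve(batch_view(master_dataset), speed_view(recent_events))
print(merged["view"])   # 3 historical + 1 recent = 4
```

In a real deployment the batch view would be recomputed periodically (e.g. by Hadoop or Databricks) while the speed layer (e.g. Storm or Stream Analytics) covers the gap since the last batch run.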
Q4. What is Data Vault?
Data Vault is a modeling methodology for designing highly scalable and flexible data warehouses.
Data Vault focuses on long-term historical data storage
It consists of three main components: Hubs, Links, and Satellites
Hubs represent business entities, Links represent relationships between entities, and Satellites store attributes of entities
Data Vault allows for easy scalability and adaptability to changing business requirements
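A minimal sketch of the three component types as SQLite tables, assuming a hypothetical customer/order model (the table and column names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer (
    customer_hk   TEXT PRIMARY KEY,   -- hash key of the business key
    customer_id   TEXT NOT NULL,      -- business key
    load_date     TEXT NOT NULL,
    record_source TEXT NOT NULL
);
CREATE TABLE hub_order (
    order_hk      TEXT PRIMARY KEY,
    order_id      TEXT NOT NULL,
    load_date     TEXT NOT NULL,
    record_source TEXT NOT NULL
);
CREATE TABLE link_customer_order (    -- relationship between the two hubs
    link_hk       TEXT PRIMARY KEY,
    customer_hk   TEXT REFERENCES hub_customer(customer_hk),
    order_hk      TEXT REFERENCES hub_order(order_hk),
    load_date     TEXT NOT NULL,
    record_source TEXT NOT NULL
);
CREATE TABLE sat_customer (           -- descriptive attributes, historized
    customer_hk   TEXT REFERENCES hub_customer(customer_hk),
    load_date     TEXT NOT NULL,
    name          TEXT,
    city          TEXT,
    PRIMARY KEY (customer_hk, load_date)
);
""")
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)
```

New attributes land in new Satellite rows (keyed by hash key and load date), so history accumulates without restructuring the Hubs or Links.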
Q5. Explain Azure Data Factory.
Azure Data Factory is a cloud-based data integration service that allows you to create, schedule, and manage data pipelines.
Azure Data Factory is used to move and transform data from various sources to destinations.
It supports data integration processes like ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform).
You can create data pipelines using a visual interface in Azure Data Factory.
It can connect to on-premises and cloud data sources such as SQL Server, Azure …
Q6. Window function coding test
Window function coding test involves using window functions in SQL to perform calculations within a specified window of rows.
Understand the syntax and usage of window functions in SQL
Use window functions like ROW_NUMBER(), RANK(), DENSE_RANK(), etc. to perform calculations
Define the window using PARTITION BY and ORDER BY clauses (and a frame clause such as ROWS BETWEEN when needed)
Practice writing queries with window functions to get comfortable with their usage
Q7. Do you have onsite exposure?
Yes, I have onsite exposure in previous roles.
I have worked onsite at various client locations to gather requirements and implement solutions.
I have experience collaborating with cross-functional teams in person.
I have conducted onsite training sessions for end users on data architecture best practices.
I have participated in onsite data migration projects.
I have worked onsite to troubleshoot and resolve data-related issues.