Baxter International
I was approached by the company and interviewed in Apr 2023. There were 3 interview rounds.
As a Data Steward in my current org, my roles and responsibilities include managing data quality, ensuring compliance with data policies, and providing data governance support.
Managing data quality by identifying and resolving data issues
Ensuring compliance with data policies and regulations
Providing data governance support by creating and maintaining data dictionaries, data lineage, and data classification
Collaborating...
I have worked in various domains including finance, healthcare, and retail.
Worked as a data steward in a finance company, ensuring data accuracy and compliance with regulations.
Managed data for a healthcare organization, ensuring patient privacy and security.
Worked with retail data to analyze customer behavior and improve sales strategies.
We use Collibra as our Data Governance tool.
Collibra is a popular Data Governance tool used by many organizations.
It helps in managing data assets, data quality, and data privacy.
Collibra provides a centralized platform for data governance and collaboration.
It also offers features like data lineage, data cataloging, and data stewardship.
Collibra integrates with various data sources and tools, such as Tableau and Informatica.
Implementing a Data Governance framework involves defining policies, procedures, and roles to manage data assets.
Identify stakeholders and their roles in data governance
Define policies and procedures for data management
Establish data quality standards and metrics
Implement data security and privacy measures
Create a data catalog and inventory
Monitor and enforce compliance with data governance policies
Continuously review and refine the framework
I ensure proper data management through a systematic approach.
Identify data sources and stakeholders
Establish data quality standards and policies
Implement data governance framework
Ensure data security and privacy
Regularly monitor and audit data
Continuously improve data management processes
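The governance steps above start with registering who is accountable for each dataset. As a minimal sketch (all dataset and team names are invented for illustration), a compliance audit can flag any registered dataset that is missing an owner, a steward, or a sensitivity classification:

```python
from dataclasses import dataclass

# Hypothetical illustration: every dataset under governance must carry
# an owner, a steward, and a sensitivity classification.

@dataclass
class DatasetRecord:
    name: str
    owner: str = ""
    steward: str = ""
    classification: str = ""  # e.g. "public", "internal", "confidential"

REQUIRED_FIELDS = ("owner", "steward", "classification")

def audit(records):
    """Return {dataset_name: [missing governance fields]} for non-compliant datasets."""
    issues = {}
    for rec in records:
        missing = [f for f in REQUIRED_FIELDS if not getattr(rec, f)]
        if missing:
            issues[rec.name] = missing
    return issues

datasets = [
    DatasetRecord("customers", owner="sales", steward="a.kumar", classification="confidential"),
    DatasetRecord("clickstream", owner="marketing"),  # missing steward + classification
]
print(audit(datasets))  # {'clickstream': ['steward', 'classification']}
```

Running such an audit on a schedule is one concrete way to "regularly monitor" the framework rather than relying on one-off reviews.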
We use Apache NiFi for our data processing pipelines.
Apache NiFi is an open-source tool for automating and managing data flows between systems.
It provides a web-based interface for designing, building, and monitoring data pipelines.
NiFi supports a wide range of data sources and destinations, including databases, Hadoop, and cloud services.
It also has built-in security and data provenance features.
Some examples of our N...
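NiFi flows are built in its web UI rather than written as code, so the following is only an illustrative plain-Python sketch of the pattern NiFi automates: records flow from a source through a transform to a sink, and a provenance event is recorded at every step (processor names are invented):

```python
import json
import time

# Illustrative only -- real NiFi flows are configured graphically and keep
# provenance in a dedicated repository. Here a list stands in for it.

provenance = []

def record_event(processor, detail):
    provenance.append({"processor": processor, "detail": detail, "ts": time.time()})

def source():
    for raw in ['{"id": 1, "value": " ok "}', '{"id": 2, "value": "FAIL"}']:
        record_event("GetRecords", raw)
        yield json.loads(raw)

def transform(records):
    for rec in records:
        rec["value"] = rec["value"].strip().lower()  # basic cleansing
        record_event("CleanValue", rec["id"])
        yield rec

def sink(records):
    kept = [r for r in records if r["value"] == "ok"]
    record_event("PutRecords", len(kept))
    return kept

result = sink(transform(source()))
print(result)           # [{'id': 1, 'value': 'ok'}]
print(len(provenance))  # 5 events: 2 reads, 2 transforms, 1 write
```

The per-step event log is the key idea: it is what lets a tool like NiFi answer "where did this record come from and what touched it".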
Setting up a Data Quality workflow involves defining standards, identifying data sources, and implementing data cleansing processes.
Define data quality standards and metrics
Identify data sources and assess their quality
Implement data cleansing processes
Establish data governance policies and procedures
Monitor and measure data quality over time
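The workflow above can be sketched in a few lines: quality standards become explicit rules, the rules are applied to incoming rows, and the pass rate becomes the metric that is tracked over time. The field names and thresholds below are invented for illustration:

```python
import re

# Hypothetical quality standards expressed as per-field rules.
RULES = {
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "age":   lambda v: v is not None and 0 <= v <= 120,
}

rows = [
    {"email": "a@example.com", "age": 34},
    {"email": "not-an-email",  "age": 34},
    {"email": "b@example.com", "age": -5},
]

def assess(rows):
    """Return (list of (row_index, failed_field), overall quality score)."""
    failures = []
    for i, row in enumerate(rows):
        for field_name, rule in RULES.items():
            if not rule(row.get(field_name)):
                failures.append((i, field_name))
    quality = 1 - len(failures) / (len(rows) * len(RULES))
    return failures, quality

failures, quality = assess(rows)
print(failures)          # [(1, 'email'), (2, 'age')]
print(f"{quality:.0%}")  # 67%
```

Recording the score per run gives the time series needed to "monitor and measure data quality over time".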
Data catalog and lineage are done through metadata management and tracking data flow.
Create a metadata repository to store information about data sources, data types, and data lineage.
Track data flow through the use of data lineage tools and techniques such as data mapping and data profiling.
Ensure data quality by implementing data governance policies and procedures.
Regularly update the metadata repository to reflect changes.
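At its core, a catalog-plus-lineage setup is a metadata store with upstream edges between assets. Real tools such as Collibra persist this in a proper repository; the sketch below uses a plain dict, and every asset name is invented:

```python
# Illustrative metadata repository: each entry records an asset's type
# and the upstream assets it was derived from.
catalog = {}

def register(name, asset_type, upstream=()):
    catalog[name] = {"type": asset_type, "upstream": list(upstream)}

def lineage(name):
    """Walk upstream edges to list every ancestor of an asset."""
    ancestors = []
    for parent in catalog.get(name, {}).get("upstream", []):
        ancestors.append(parent)
        ancestors.extend(lineage(parent))
    return ancestors

register("raw_orders", "table")
register("raw_customers", "table")
register("orders_clean", "view", upstream=["raw_orders"])
register("sales_report", "dashboard", upstream=["orders_clean", "raw_customers"])

print(lineage("sales_report"))  # ['orders_clean', 'raw_orders', 'raw_customers']
```

Tracing a dashboard back to its raw tables this way is exactly the question lineage tooling answers during impact analysis.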
I was approached by the company and interviewed before Oct 2023. There were 2 interview rounds.
The break-even point is the point at which total revenue equals total costs, resulting in neither profit nor loss.
At that sales volume a company neither makes a profit nor incurs a loss.
It helps in determining the level of sales needed to cover all costs.
Formula: Break-Even Point (units) = Fixed Costs / (Selling Price per Unit - Variable Costs per Unit)
Example: A company ...
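The formula can be checked with a quick worked example; the figures below are invented for illustration:

```python
def break_even_units(fixed_costs, price_per_unit, variable_cost_per_unit):
    """Units to sell so that total revenue equals total costs."""
    margin = price_per_unit - variable_cost_per_unit  # contribution per unit
    if margin <= 0:
        raise ValueError("price must exceed variable cost to ever break even")
    return fixed_costs / margin

# Fixed costs of 50,000; sells at 25 per unit; variable cost 15 per unit.
units = break_even_units(50_000, 25, 15)
print(units)  # 5000.0 -- at 5,000 units, revenue (125,000) equals total costs
```

Each unit contributes 10 toward fixed costs, so 5,000 units are needed to cover the 50,000 of fixed costs.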
Confidence level
Pressure handling
Team building attitude
Identification of Goal
I applied via Walk-in and was interviewed before Sep 2019. There were 3 interview rounds.
I manage myself in a hectic environment by prioritizing tasks, staying organized, and practicing stress management techniques.
Prioritize tasks based on urgency and importance
Break down complex tasks into smaller, manageable steps
Use tools like to-do lists, calendars, and reminders to stay organized
Practice time management techniques, such as setting deadlines and allocating specific time slots for different tasks
Take short breaks to stay focused and avoid burnout
Time intelligence functions can be used for YTD (Year-to-Date) and QTD (Quarter-to-Date) analysis in data analytics.
YTD analysis can be done using DAX functions like TOTALYTD() or DATESYTD() to calculate values from the beginning of the year up to the selected date.
QTD analysis can be done using DAX functions like TOTALQTD() or DATESQTD() to calculate values from the beginning of the quarter up to the selected date.
Thes...
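TOTALYTD and TOTALQTD are DAX functions (Power BI), but the underlying idea is just a running total that resets at each year or quarter boundary. The plain-Python sketch below shows that mechanic; the sample sales figures are invented:

```python
from datetime import date

sales = [
    (date(2023, 1, 15), 100),
    (date(2023, 2, 10), 150),
    (date(2023, 4, 5),  200),  # new quarter: QTD resets, YTD keeps running
    (date(2024, 1, 20), 300),  # new year: both totals reset
]

def running_totals(rows):
    """Return [(iso_date, ytd_total, qtd_total)] in date order."""
    ytd = qtd = 0
    last_year = last_quarter = None
    out = []
    for d, amount in sorted(rows):
        quarter = (d.year, (d.month - 1) // 3 + 1)
        if d.year != last_year:
            ytd = 0
        if quarter != last_quarter:
            qtd = 0
        ytd += amount
        qtd += amount
        last_year, last_quarter = d.year, quarter
        out.append((d.isoformat(), ytd, qtd))
    return out

for row in running_totals(sales):
    print(row)
# ('2023-01-15', 100, 100)
# ('2023-02-10', 250, 250)
# ('2023-04-05', 450, 200)
# ('2024-01-20', 300, 300)
```

In DAX the filter context and calendar table handle the resets automatically; here they are made explicit so the behaviour is easy to see.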
Import brings data into Power BI for further analysis, while direct query connects to data source in real-time.
Import: Data is loaded into Power BI for analysis, improving performance but requiring periodic refreshes.
Direct Query: Connects to data source in real-time, allowing for up-to-date information but potentially slower performance.
Import is suitable for smaller datasets, while Direct Query is better for large datasets.
I applied via Campus Placement and was interviewed in Sep 2022. There were 3 interview rounds.
The test was easy, but the tight time limit was the main challenge.
The coding test was hard and hosted on the HackerEarth platform; no cameras were used, but a plagiarism checker was active.
I appeared for an interview before Mar 2024, where I was asked the following questions.
Tableau offers various types of filters including quick filters, context filters, and data source filters.
Quick filters allow users to easily filter data by selecting values from a list.
Context filters are applied before other filters, affecting the data visible to subsequent filters.
Data source filters limit the data available to Tableau by filtering data at the data source level.
Other types of filters include extract filters and table calculation filters.
There are two types of refresh in Tableau: Extract Refresh and Data Source Refresh.
Extract Refresh: Refreshes the data in Tableau extract, which is a snapshot of data from the original data source.
Data Source Refresh: Refreshes the data directly from the original data source.
Examples: Extract Refresh is useful when working with large datasets to improve performance, while Data Source Refresh ensures real-time data updates.
An extract connection imports data into the Tableau workbook, while a live connection connects directly to the data source.
Extract connection creates a static snapshot of data in Tableau workbook.
Live connection directly queries data from the source in real-time.
Extract connection is useful for large datasets or when offline access is needed.
Live connection is beneficial for real-time data analysis and dynamic updates.
Example: Extra...
Salaries at Baxter International (selected roles):

Role               | Salaries | Range
Senior Engineer    | 82       | ₹10.1 L/yr - ₹40 L/yr
Associate          | 82       | ₹4 L/yr - ₹10.9 L/yr
Senior Executive   | 75       | ₹3.4 L/yr - ₹10 L/yr
Principal Engineer | 69       | ₹17.8 L/yr - ₹45 L/yr
Senior Associate   | 53       | ₹6 L/yr - ₹16.5 L/yr
Similar companies: Apollo Hospitals, GeBBS Healthcare Solutions, UnitedHealth, Max Healthcare