Pega is a leading software platform for customer engagement and operational excellence. It is a low-code platform that enables businesses to build and deploy enterprise applications quickly, providing tools for process automation, case management, customer service, and decision management. Pega's platform uses a model-driven approach, allowing users to visually design and configure applications, and it also offers AI capabilities.
I applied via Recruitment Consultant and was interviewed before Jan 2022. There were 2 interview rounds.
I have worked on optimizing data pipeline performance by implementing parallel processing, caching, and optimizing queries.
Implemented parallel processing to increase throughput
Utilized caching to reduce data retrieval time
Optimized queries to reduce database load
Used compression techniques to reduce data transfer time
Implemented load balancing to distribute workload
Used indexing to improve query performance
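As an illustration (not from the interview itself), a minimal Python sketch of two of the techniques above, parallel processing and caching, might look like this; `transform` is a hypothetical stage standing in for an expensive step such as a lookup or query:

```python
from concurrent.futures import ThreadPoolExecutor
from functools import lru_cache

@lru_cache(maxsize=None)
def transform(record: int) -> int:
    # Placeholder for an expensive transformation; repeated
    # inputs are served from the cache instead of recomputed.
    return record * 2

def run_pipeline(records):
    # Fan records out across worker threads to increase throughput.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(transform, records))

print(run_pipeline([1, 2, 3, 2, 1]))  # → [2, 4, 6, 4, 2]
```

Here the repeated records (`2` and `1`) hit the `lru_cache` rather than re-running the transformation, while the thread pool processes independent records concurrently.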
Best practices for a data warehouse (DWH)
Design a scalable and flexible architecture
Ensure data quality and consistency
Implement proper security measures
Use ETL tools for data integration
Create a data dictionary for easy understanding
Regularly monitor and optimize performance
Implement disaster recovery and backup plans
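To make the "ensure data quality and consistency" point concrete, here is a minimal sketch (my own illustration, with hypothetical column names) of a pre-load validation step such as an ETL pipeline might run:

```python
def check_quality(rows, required=("id", "amount")):
    """Flag rows that violate basic data-quality rules before loading."""
    bad = []
    for i, row in enumerate(rows):
        if any(row.get(col) is None for col in required):
            bad.append((i, "missing value"))
        elif row["amount"] < 0:
            bad.append((i, "negative amount"))
    return bad

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
    {"id": 3, "amount": -5.0},
]
print(check_quality(rows))  # → [(1, 'missing value'), (2, 'negative amount')]
```

Rejected rows would typically be routed to an error table for review rather than silently dropped, so the warehouse stays consistent and the failures stay auditable.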
I applied via Naukri.com and was interviewed in Jul 2024. There was 1 interview round.
I applied via Recruitment Consultant and was interviewed before Dec 2021. There were 3 interview rounds.
Different use cases require different solutions. It's important to understand the problem before proposing a solution.
Identify the problem and its root cause
Research and analyze possible solutions
Evaluate the pros and cons of each solution
Choose the best solution based on the requirements and constraints
Implement and test the solution
Monitor and optimize the solution over time
Optimization can be achieved through various methods such as improving algorithms, reducing data size, and using efficient hardware.
Improving algorithms to reduce time complexity
Reducing data size to improve memory usage
Using efficient hardware to improve processing speed
Caching frequently used data to reduce database queries
Parallel processing to improve performance
Using compression techniques to reduce data transfer
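As a small illustration of the compression point above (my own sketch, using Python's standard `zlib` module), repetitive tabular data compresses well, which directly cuts transfer time:

```python
import zlib

# Highly repetitive CSV-like payload, typical of denormalized exports.
payload = b"row,value\n" + b"1,same\n" * 1000

compressed = zlib.compress(payload, level=6)
ratio = len(compressed) / len(payload)
print(f"{len(payload)} -> {len(compressed)} bytes ({ratio:.1%})")
```

The trade-off is CPU time on both ends, so compression pays off mainly when the network, not the processor, is the bottleneck.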
Technical challenges and ways to resolve them.
I was interviewed in Apr 2021.
I applied via Recruitment Consultant and was interviewed before Oct 2020. There were 3 interview rounds.
I applied via Recruitment Consultant and was interviewed in Sep 2020. There were 3 interview rounds.
| Role | Salaries reported | Salary range |
| --- | --- | --- |
| Senior Consultant | 192 | ₹7.5 L/yr - ₹30 L/yr |
| Consultant | 139 | ₹3.8 L/yr - ₹15 L/yr |
| Lead Consultant | 99 | ₹12 L/yr - ₹28 L/yr |
| Pega Developer | 38 | ₹4 L/yr - ₹14.7 L/yr |
| Business Architect | 38 | ₹10 L/yr - ₹16.5 L/yr |