Data Engineer - Python/PySpark (5-11 yrs)
Confidis Advisory Services
posted 21hr ago
About the Job :
Domain : IT Services & Consulting.
Position : GCP Data Engineer.
Experience : 5-11 Years.
Location : Bangalore, Hyderabad, Pune, Mumbai, Gurgaon & Chennai.
Your Team :
You are invited to work with a top-tier organization that's been in the game for 50+ years, partnering with some of the world's biggest businesses.
As India's largest multinational business group, this enterprise boasts a workforce of highly skilled consultants spread across 60+ countries, operating at the forefront of the financial markets and data industry and delivering exceptional services in Data & Analytics, Capital Markets, and Post Trade.
Your Job :
- Design, code, and develop new features; fix bugs; add enhancements.
- Analyze and improve the efficiency, scalability and stability of various system resources.
- Lead and mentor junior engineers and drive a culture of technical excellence.
- Drive creative and innovative solutions to complex problems, exemplifying good technical discernment.
- Drive improvements and new approaches to address systemic pain points and technical debt; anticipate and avoid problems.
- Take a hands-on approach in developing prototypes, independently and with others, to establish design decisions and/or technical feasibility.
- Evaluate, install, set up, maintain, and upgrade Data Engineering, Machine Learning, and CI/CD infrastructure tools hosted on cloud (GCP/AWS).
- Drive the CI/CD infrastructure tooling-related work in collaboration with various internal teams to get the user stories, epics, and goals to closure.
- Propose, participate, and implement architecture-level enhancements/changes strategically through Dev/Stage/Prod environments.
- Design, evangelize, and deliver comprehensive best practices and efficient usage of available tooling resources/capabilities to run high-performance systems.
- Provide innovative and strategic solutions, along with cost and risk analysis, to improve the stability, scalability, and performance of the tools' infrastructure.
- Perform troubleshooting, analysis, and resolution of environmental issues.
- Innovate and automate processes/tasks to improve operational efficiency.
- Document and maintain application setup, runbooks, administration, and troubleshooting guides.
Your Capabilities :
- Extensive experience with Google Cloud Platform (GCP) data services such as BigQuery, Cloud Storage, and Dataflow.
- Expertise in ETL (Extract, Transform, Load) processes and data integration on GCP.
- Strong SQL and database query optimization skills on GCP.
- Experience with data warehousing and data architecture on GCP.
- Programming Skills : Algorithms, design, refactoring, debugging, and unit testing.
- Database : SQL databases (Oracle, MySQL, MSSQL, etc.), NoSQL databases, Data Lake, Snowflake.
Qualifications :
- B.E / B.Tech / M.Tech / MBA.
Skills :
- GCP, SQL, Python, Java, BigQuery, PySpark.
Functional Areas: Software/Testing/Networking