Your work will directly contribute to PayPal's overarching mission of revolutionizing customer engagement globally.
By building, enhancing, and scaling the back-end data pipelines that underpin our marketing technology experiences, you will be a key player in enabling seamless and innovative ways to engage our customers worldwide.
Your efforts in developing high-quality, secure, and performant software solutions will not only improve user experiences but also drive the inclusion and flexibility that are critical in today's digital economy.
Your role goes beyond coding; it's about making a tangible impact on the lives of millions.
Your day to day
Build highly scalable, high-throughput back-end data pipelines in GCP using BigQuery and Dataproc.
Write and maintain pipelines using Python frameworks.
Schedule jobs using UC4 and Airflow, as shown in the sketch after this list.
Independently work on multiple product features, utilizing your technical expertise to propose innovative solutions for both new and existing functionalities, informed by a growing understanding of our products and the business domain.
Manage your own project deliverables, timelines, and priorities, effectively balancing multiple tasks to meet project deadlines and performance targets.
Actively engage in design and code reviews, providing constructive feedback to peers and incorporating feedback into your own work to maintain high standards of code quality and functionality.
Share your knowledge and experience with new team members to help onboard them onto the team quickly and efficiently, fostering a culture of learning and continuous improvement.
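For illustration only, here is a minimal sketch of the kind of scheduled pipeline described above: an Airflow DAG that runs a daily BigQuery transformation. It assumes the apache-airflow-providers-google package is available, and every project, dataset, table, and DAG name below is hypothetical.

    # Minimal illustrative sketch: a daily Airflow DAG running a BigQuery rollup.
    # All project, dataset, table, and DAG names are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="daily_marketing_events_rollup",  # hypothetical DAG name
        schedule_interval="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        rollup = BigQueryInsertJobOperator(
            task_id="rollup_events",
            configuration={
                "query": {
                    # Aggregate yesterday's raw events into a daily summary table.
                    "query": """
                        SELECT event_date, campaign_id, COUNT(*) AS event_count
                        FROM `my-project.marketing.raw_events`
                        WHERE event_date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
                        GROUP BY event_date, campaign_id
                    """,
                    "destinationTable": {
                        "projectId": "my-project",
                        "datasetId": "marketing",
                        "tableId": "daily_campaign_rollup",
                    },
                    "writeDisposition": "WRITE_APPEND",
                    "useLegacySql": False,
                }
            },
        )

In practice an equivalent job could be triggered from UC4; the sketch only shows the Airflow variant.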
What do you need to bring?
A bachelor's degree in Computer Science or an equivalent combination of technical education and work experience.
3+ years of ETL expertise, i.e., managing data extraction, transformation, and loading from various sources using advanced SQL and Jupyter Notebooks/Python.
Working knowledge of big data, GCP cloud databases, and streaming integrations.
Experience designing and building highly scalable distributed applications capable of handling very high volumes of data in GCP using BigQuery and Python (see the sketch after this list).
Strong conceptual knowledge of data warehouses, data marts, distributed data platforms and data lakes, data modeling, schema design, and CI/CD.
Experience working on SaaS platform(s); Adobe RTCDP experience is a plus.
Experience using Atlassian JIRA, ServiceNow, and Atlassian Confluence.
Experience delivering projects using Agile methodology.
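For illustration only, a minimal sketch of the kind of Python/BigQuery ETL step referenced in the requirements above. It assumes the google-cloud-bigquery client library, and every project, dataset, and table name below is hypothetical.

    # Minimal illustrative ETL sketch using the google-cloud-bigquery client.
    # All project, dataset, and table names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # hypothetical project

    # Transform-and-load: run a SQL transformation over yesterday's data and
    # write the result to a destination table, replacing its previous contents.
    sql = """
        SELECT customer_id,
               SUM(amount) AS total_spend,
               COUNT(*)    AS txn_count
        FROM `my-project.payments.transactions`
        WHERE DATE(created_at) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
        GROUP BY customer_id
    """

    job_config = bigquery.QueryJobConfig(
        destination="my-project.marketing.daily_customer_spend",
        write_disposition="WRITE_TRUNCATE",
    )

    client.query(sql, job_config=job_config).result()  # block until the job finishes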