The PayPal Marketing Technology team is dedicated to creating a best-in-class platform. We are looking for highly talented, professional, and motivated engineers to join our team. As a Data Engineer 3 on our Marketing Technology Platform, you will be at the forefront of designing and developing backend data pipelines in GCP (BigQuery, Bigtable, and Dataproc) using Python. As part of your responsibilities, you'll engage in all facets of data pipeline development, including design, coding, security, testing, and overseeing production releases. You will be responsible for delivering new features, enhancements, and fixes for our data pipelines within the MarTech platform.
Meet our team
At the PayPal Marketing Technology Platform, we are a supportive, forward-thinking community of customer-centric technologists. We celebrate our successes, learn from our challenges, and always keep pushing forward. Whether we're brainstorming the next big feature, tackling complex technical challenges, or sharing insights from our latest project, there's a shared sense of purpose and excitement for what we're building. Together, we share a common goal: to build seamless, secure, and scalable solutions that empower individuals and businesses around the globe.
Your way to impact
Your work will directly contribute to PayPal's overarching mission of revolutionizing customer engagement globally. By building, enhancing, and scaling the back-end data pipelines that underpin our marketing technology experiences, you will be a key player in enabling seamless and innovative ways for our customers worldwide to engage. Your efforts in developing high-quality, secure, and performant software solutions will not only improve user experiences but also drive the inclusion and flexibility that are critical in today's digital economy. Your role goes beyond coding; it's about making a tangible impact on the lives of millions.
Your day to day
Build highly scalable, high-throughput backend data pipelines in GCP using BigQuery and Dataproc.
Write and maintain data pipelines in Python.
Schedule jobs using UC4 and Airflow.
Independently work on multiple product features, utilizing your technical expertise to propose innovative solutions for both new and existing functionalities, informed by a growing understanding of our products and the business domain.
Manage your own project deliverables, timelines, and priorities, effectively balancing multiple tasks to meet project deadlines and performance targets.
Actively engage in design and code reviews, providing constructive feedback to peers and incorporating feedback into your own work to maintain high standards of code quality and functionality.
Share your knowledge and experience with new members to help onboard them onto the team quickly and efficiently, fostering a culture of learning and continuous improvement.
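To give a flavor of the day-to-day scheduling work described above, here is a minimal sketch of an Airflow DAG that runs a daily BigQuery transform. It assumes Airflow 2.x with the Google provider package installed; the DAG name, schedule, project, tables, and query are all hypothetical placeholders, not a real PayPal pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="martech_daily_aggregation",  # hypothetical pipeline name
    schedule="0 6 * * *",                # run daily at 06:00 UTC
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Run a BigQuery SQL transform; project and table names are placeholders.
    aggregate_clicks = BigQueryInsertJobOperator(
        task_id="aggregate_clicks",
        configuration={
            "query": {
                "query": (
                    "SELECT campaign_id, COUNT(*) AS clicks "
                    "FROM `my_project.martech.click_events` "
                    "GROUP BY campaign_id"
                ),
                "destinationTable": {
                    "projectId": "my_project",
                    "datasetId": "martech",
                    "tableId": "daily_campaign_clicks",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```

In practice the same job could be triggered from UC4 instead; the operator-based DAG above is simply the Airflow-native way to express the schedule and the BigQuery job configuration together.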
What do you need to bring
A bachelor's degree in Computer Science or an equivalent combination of technical education and work experience.
3+ years of ETL expertise, i.e., managing data extraction, transformation, and loading from various sources using advanced SQL and Python (including Jupyter Notebooks).
Working knowledge of big data, GCP cloud databases, and streaming integrations.
Experience designing and building highly scalable distributed applications in GCP capable of handling very high volumes of data, using BigQuery and Python.
Strong conceptual knowledge of data warehouses, data marts, distributed data platforms, data lakes, data modeling, schema design, and CI/CD.
Experience working on SaaS platforms; Adobe RTCDP experience is a plus.
Experience using Atlassian Jira, ServiceNow, and Atlassian Confluence.
Experience delivering projects using Agile methodology.
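The ETL expertise called for above follows a standard extract/transform/load shape. As an illustration only, here is a self-contained Python sketch of that pattern; the in-memory rows and dict "warehouse" are stand-ins for the BigQuery sources and sinks a real pipeline would use, and all field names are hypothetical.

```python
from collections import defaultdict

def extract(raw_rows):
    """Extract: pull rows from a source, dropping malformed records
    (here the source is an in-memory list standing in for BigQuery)."""
    return [r for r in raw_rows if r.get("event_id") is not None]

def transform(rows):
    """Transform: deduplicate by event_id, then count clicks per campaign."""
    seen = set()
    counts = defaultdict(int)
    for row in rows:
        if row["event_id"] in seen:
            continue
        seen.add(row["event_id"])
        if row["event_type"] == "click":
            counts[row["campaign"]] += 1
    return dict(counts)

def load(counts, sink):
    """Load: write aggregates to a sink (a dict here; a warehouse table in practice)."""
    sink.update(counts)
    return sink

raw = [
    {"event_id": 1, "campaign": "spring", "event_type": "click"},
    {"event_id": 1, "campaign": "spring", "event_type": "click"},      # duplicate
    {"event_id": 2, "campaign": "spring", "event_type": "impression"},
    {"event_id": 3, "campaign": "summer", "event_type": "click"},
    {"event_id": None, "campaign": "summer", "event_type": "click"},   # malformed
]
warehouse = load(transform(extract(raw)), {})
print(warehouse)  # {'spring': 1, 'summer': 1}
```

In a production pipeline each stage would typically be a separate scheduled task with its own monitoring, but the separation of concerns is the same.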