Our Data Analytics group works with business owners and stakeholders from Sales, Marketing, People, GCS, Infosec, Operations, and Finance to solve complex business problems that directly impact the metrics defined to showcase the progress of Palo Alto Networks. We leverage the latest technologies from the cloud big data ecosystem to improve business outcomes and create value through prototyping, proof-of-concept projects, and application development. We are looking for a Staff IT Data Engineer with extensive experience in data engineering, SQL, cloud engineering, and business intelligence (BI) tools. The ideal candidate will be responsible for designing, implementing, and maintaining scalable data transformations and analytical solutions that support our business objectives. This role requires a strong understanding of data engineering principles, as well as the ability to collaborate with cross-functional teams to deliver high-quality data solutions.
Your Impact
Design, develop, and maintain data pipelines to extract, transform, and load (ETL) data from various sources into our data warehouse or data lake environment.
Collaborate with stakeholders to gather requirements and translate business needs into technical solutions.
Optimize and tune existing data pipelines for performance, reliability, and scalability.
Implement data quality and governance processes to ensure data accuracy, consistency, and compliance with regulatory standards.
Work closely with the BI team to design and develop dashboards, reports, and analytical tools that provide actionable insights to stakeholders.
Mentor junior members of the team and provide guidance on best practices for data engineering and BI development.
Your Experience
Bachelor's degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering, with a focus on building and maintaining data pipelines and analytical solutions.
Expertise in SQL programming and database management systems.
Hands-on experience with ETL tools and technologies (e.g., Apache Spark, Apache Airflow).
Familiarity with cloud platforms such as Google Cloud Platform (GCP), and experience with relevant services (e.g., Dataflow, Dataproc, BigQuery, stored procedures, Cloud Composer).
Experience with big data tools such as Spark and Kafka.
Experience with object-oriented/functional scripting languages such as Python or Scala.
Experience with BI tools and visualization platforms (e.g., Tableau) is a plus.
Experience with SAP HANA, SAP BW, SAP ECC, or other SAP modules is a plus.
Strong analytical and problem-solving skills, with the ability to analyze complex data sets and derive actionable insights.
Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.