Avalara - Data Engineer/Lead - Cloud Platform (8-10 yrs)
Role : Lead Cloud Data Engineer - Snowflake & AWS | Remote
What you will do :
The Data Science Engineering team is looking for a Lead Cloud Data Engineer to build the data infrastructure for Avalara's core data assets, empowering us with accurate data to make data-backed decisions. As a Lead Cloud Data Engineer, you will help develop our data and reporting infrastructure using Snowflake, SSRS, Python, AWS services, Airflow, DBT modeling, and automation.
You will influence the implementation of technologies and solutions to solve real challenges. You have deep SQL experience, an understanding of modern data stacks and technologies, broad hands-on experience across the data lifecycle, and experience guiding a team through technical and design challenges. You will report to the Sr. Manager, Cloud Software Engineering, and be part of the larger Data Engineering team.
Responsibilities :
- Collaborate with teams to understand data requirements and translate them into technical solutions
- Work with data analysts and data scientists to provide them with clean and structured datasets for analysis and modeling
- Be proficient in database design and apply visualization best practices
- Apply data modeling and reporting techniques, including ad hoc report generation
- Prepare low-level designs so projects can proceed to implementation, and support the final go-live
- Maintain comprehensive documentation of the reporting infrastructure architecture, configurations, and processes; create regular reports on the performance of the pipeline, data quality, and incidents detected.
Qualifications :
- Bachelor's or master's degree in Computer Science or equivalent
- 8+ years of experience in the data engineering field, with deep SQL knowledge
- Proficiency in Snowflake, Python, AWS services, advanced SQL, and SQL Server Reporting Services (SSRS) is a must
- 4+ years working with Snowflake and Python
- 1+ years working with automation, Docker, Terraform, containers, CI/CD, and Kubernetes
- Hands-on experience working with SQL Server Reporting Services (SSRS)
- Experience with common container and orchestration technologies such as Docker, Terraform, CI/CD, and Kubernetes
- Familiarity with cloud platforms such as AWS or GCP, and experience with cloud-based data solutions
- Experience communicating updates and resolutions to customers and other partners (verbal and written), delivering technical insights and interpreting data reports for clients
- Ability to understand and serve clients' requirements and to create technical documentation
- Important traits required: technical soundness, leadership, clear communication with customers and partners, problem solving, accountability, collaboration, and a data-driven mindset
Good to have :
- Snowflake certification is a plus
- Relevant certifications in data warehousing or cloud platforms
- Hands-on experience with Grafana and Prometheus
- Experience architecting complex data marts leveraging DBT and Airflow.
Technologies you are likely to be working with : Snowflake, Python, AWS cloud services, RDBMS, automation, SQL Server Reporting Services (SSRS), and MongoDB.
Good to have : Airflow; orchestration technologies such as Docker, CI/CD, and Kubernetes; GCP; and reporting/presentation layers.
Note : If shortlisted, you will be invited for initial rounds on 1st March'25 (Saturday) in Bengaluru
Functional Areas: Software/Testing/Networking