Airflow Developer - ETL/SQL (4-5 yrs)
Acelucid Technologies
posted 7d ago
Flexible timing
Job Description:
We are seeking a skilled Airflow Developer with 4-5 years of experience to design, implement, and maintain efficient, scalable, and reliable workflows using Apache Airflow. The ideal candidate will have a strong understanding of data pipelines, ETL processes, and orchestration techniques. You will collaborate with cross-functional teams to ensure seamless integration and optimization of workflows in our data ecosystem.
Key Responsibilities:
- Design and develop workflows, DAGs (Directed Acyclic Graphs), and custom operators using Apache Airflow.
- Collaborate with data engineers, analysts, and stakeholders to understand requirements and translate them into scalable workflows.
- Monitor, troubleshoot, and optimize Airflow pipelines to ensure reliability and performance.
- Integrate Airflow with various data sources, storage systems, and APIs.
- Implement best practices for code quality, version control, and deployment processes.
- Write clear documentation for workflows, configurations, and processes.
- Ensure data pipeline scalability, security, and maintainability by implementing appropriate solutions.
- Manage and maintain the Airflow environment, including upgrades and plugin configurations.
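To give candidates a concrete sense of the day-to-day work, the responsibilities above can be sketched as a minimal Airflow DAG: a daily extract-transform-load pipeline with a linear task dependency. The DAG id, schedule, and task logic below are illustrative placeholders, not part of this posting.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(ti):
    # Pull rows from a source system (stubbed here for illustration).
    ti.xcom_push(key="rows", value=[{"id": 1, "amount": 42}])


def transform(ti):
    # Filter the extracted rows; real pipelines would apply business logic.
    rows = ti.xcom_pull(task_ids="extract", key="rows")
    ti.xcom_push(key="rows", value=[r for r in rows if r["amount"] > 0])


def load(ti):
    # Write the transformed rows to a target store (stubbed here).
    rows = ti.xcom_pull(task_ids="transform", key="rows")
    print(f"loading {len(rows)} rows")


with DAG(
    dag_id="daily_sales_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency: extract -> transform -> load.
    t1 >> t2 >> t3
```

Custom operators (also mentioned above) follow the same pattern but subclass `BaseOperator` and implement `execute()` instead of passing a callable.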
Required Skills and Qualifications:
- 4 to 5 years of experience in building and managing workflows with Apache Airflow.
- Proficiency in Python programming for Airflow DAGs and custom operator development.
- Strong understanding of ETL/ELT processes and data pipeline orchestration.
- Experience working with SQL and databases, both relational (e.g., PostgreSQL, MySQL) and NoSQL.
- Familiarity with cloud platforms (AWS, GCP, or Azure) and their data services.
- Knowledge of CI/CD pipelines and version control systems (e.g., Git).
- Strong problem-solving skills and attention to detail.
Preferred Qualifications:
- Experience with containerization tools like Docker and orchestration systems like Kubernetes.
- Familiarity with big data technologies such as Spark, Hadoop, or Kafka.
- Understanding of data warehousing solutions like Snowflake, Redshift, or BigQuery.
- Knowledge of monitoring and alerting tools (e.g., Prometheus, Grafana).
Functional Areas: Software/Testing/Networking
Locations: Kolkata, Mumbai, New Delhi +4 more