Data Engineer - ETL (5-8 yrs)
Xander Consulting And Advisory
Job Summary:
The Data Engineer will work with a cross-functional team to design, implement, and maintain data infrastructure and pipelines, ensuring data is accessible, reliable, and timely for analysis and decision-making. This role requires expertise in data processing, transformation, and integration, with a focus on optimizing data flow and performance.
Key Responsibilities:
- Design, build, and maintain scalable and reliable data pipelines to process and transform large datasets.
- Collaborate with data scientists, analysts, and other stakeholders to ensure data is delivered in a usable and accessible format.
- Optimize and fine-tune existing data infrastructure for improved performance and cost-efficiency.
- Ensure data quality, integrity, and consistency across the system.
- Work with cloud platforms (AWS, Azure, Google Cloud) and big data technologies (Spark, Hadoop, etc.).
- Implement ETL (Extract, Transform, Load) processes for both batch and real-time data workflows.
- Build and maintain data models, databases, and data storage solutions.
- Monitor and troubleshoot data systems, ensuring minimal downtime and high availability.
- Document processes and workflows to enable efficient collaboration across teams.
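To make the batch ETL responsibility above concrete, here is a minimal sketch of an extract-transform-load flow in plain Python. The data, table name, and data-quality rule are hypothetical illustrations, not part of the role description; in practice tools like Spark or Airflow would orchestrate this at scale.

```python
import csv
import io
import sqlite3

# Hypothetical raw input; real pipelines would extract from files,
# APIs, or message queues.
RAW_CSV = """order_id,amount,currency
1,100.50,usd
2,,usd
3,75.00,USD
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: drop rows missing an amount, normalize currency codes."""
    out = []
    for row in rows:
        if not row["amount"]:
            continue  # basic data-quality rule: skip incomplete records
        out.append((int(row["order_id"]), float(row["amount"]),
                    row["currency"].upper()))
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write the cleaned rows into a target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(order_id INTEGER, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 175.5
```

The same extract/transform/load separation applies whether the target is SQLite, Redshift, BigQuery, or Snowflake; only the connectors change.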
Required Skills and Qualifications :
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Strong experience in data engineering, with hands-on expertise in SQL, Python, and/or Java.
- Proficiency in building ETL pipelines using tools like Apache Spark, Airflow, or similar.
- Experience with cloud data services (AWS, GCP, or Azure) and related technologies.
- Solid understanding of relational database management systems (e.g., MySQL, PostgreSQL) and NoSQL databases.
- Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake).
- Ability to work with both structured and unstructured data.
- Strong problem-solving skills and ability to troubleshoot data issues.
- Excellent communication skills, both written and verbal.
Preferred Skills:
- Knowledge of machine learning frameworks and data science workflows.
- Experience with containerization tools like Docker or Kubernetes.
- Familiarity with Apache Kafka, Flink, or similar tools for stream processing.
- Exposure to DevOps tools for deployment and automation (e.g., Jenkins, Terraform).
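As a sketch of the stream-processing concept mentioned above, the following plain-Python generator computes tumbling-window sums over timestamped events. It is purely illustrative: Kafka or Flink would provide the actual transport, state management, and runtime, and the event data here is invented.

```python
from collections import defaultdict
from typing import Iterable, Iterator

def tumbling_window_sums(events: Iterable[tuple[int, float]],
                         window_secs: int) -> Iterator[tuple[int, float]]:
    """Group (timestamp, value) events into fixed-size windows and emit
    (window_start, sum) pairs, assuming timestamps arrive in order."""
    sums: dict[int, float] = defaultdict(float)
    current = None
    for ts, value in events:
        start = ts - ts % window_secs  # align to window boundary
        if current is not None and start != current:
            # A new window began, so the previous one is complete.
            yield current, sums.pop(current)
        current = start
        sums[start] += value
    if current is not None:
        yield current, sums[current]  # flush the final window

events = [(0, 1.0), (5, 2.0), (12, 4.0), (13, 1.0), (21, 3.0)]
print(list(tumbling_window_sums(events, 10)))
# [(0, 3.0), (10, 5.0), (20, 3.0)]
```

Real stream engines add what this sketch omits: out-of-order events, watermarks, and fault-tolerant state.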
Functional Areas: Software/Testing/Networking