ADA GLOBAL is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey.
We are seeking a highly skilled and certified Data Engineer with expertise in Databricks to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining data pipelines, ETL processes, and data integration solutions using the Databricks platform. You will work closely with data scientists, analysts, and other stakeholders to ensure our organization's data is accessible, accurate, and ready for analysis.
Responsibilities:
Design, build, and maintain scalable and efficient data pipelines using Databricks, ensuring the timely and reliable flow of data from various sources to data storage and analytical platforms.
Develop ETL processes to clean, transform, and enrich raw data into usable formats for data analysis, reporting, and machine learning.
Collaborate with cross-functional teams to integrate data from multiple sources, ensuring data consistency, quality, and integrity.
Contribute to the design and maintenance of data architecture, including data lakes, data warehouses, and data marts on the Databricks platform.
Identify and resolve performance bottlenecks in data pipelines and ETL processes, making necessary optimizations for enhanced data processing speed.
Implement data quality checks and monitoring to ensure that data is accurate and reliable for analytics and reporting purposes.
Follow best practices for data security and ensure that data handling complies with relevant data protection regulations and industry standards.
Create and maintain documentation for data pipelines, ETL processes, and data integration workflows to ensure clarity and knowledge transfer within the team.
Provide support for data-related issues and troubleshoot problems that may arise in data pipelines or Databricks workflows.
Work closely with data scientists, data analysts, and other stakeholders to understand their data needs and deliver solutions that enable data-driven decision-making.
Requirements:
Bachelor's degree in Computer Science, Information Technology, or a related field.
Databricks Professional Certification is required.
5+ years of overall data engineering experience is required.
Proven experience in data engineering, ETL, and data integration, with a strong focus on Databricks.
Proficiency in programming languages such as Python, Scala, or Java.
Strong knowledge of SQL and experience with data warehousing concepts.
Familiarity with cloud platforms like AWS, Azure, or GCP.
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills to work effectively in a team.
Experience with data modelling and schema design.
Knowledge of best practices in data governance and data security.
Employment Type: Full Time, Permanent
Functional Areas: Analytics & Business Intelligence