Data Engineer - ETL/Data Warehousing (3-7 yrs)
SCALEMonks Technologies
Job Description:
Function: Software Engineering - Big Data / DWH / ETL
We are seeking a skilled and motivated Data Engineer with 3+ years of experience to join our dynamic data team.
The ideal candidate will have strong expertise in data integration, ETL processes, and data pipeline management.
As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data systems, enabling data-driven decision-making across the organization.
Responsibilities:
- Design, build, and maintain efficient, scalable, and reliable data pipelines to support various business needs.
- Collaborate with data scientists, analysts, and other teams to understand data requirements and ensure the seamless flow of data.
- Develop ETL (Extract, Transform, Load) processes to ensure data is collected, processed, and made available for analysis (a minimal sketch of this kind of pipeline follows this list).
- Optimize data storage and retrieval for performance and scalability using appropriate data architecture tools.
- Monitor and troubleshoot data pipelines and workflows to ensure data integrity, quality, and availability.
- Work with big data technologies (e.g., Hadoop, Spark, Kafka) and cloud platforms (e.g., AWS, Azure, GCP) for large-scale data processing.
- Implement data security measures and ensure compliance with data governance policies.
- Develop and maintain documentation for data pipelines, processes, and architecture.
- Continuously evaluate new data technologies and tools to improve data engineering workflows.
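To illustrate the kind of pipeline work described above, here is a minimal ETL sketch in Python (one of the languages named in the requirements below). It is a toy example under stated assumptions, not this team's actual stack: it uses the standard-library sqlite3 module so it runs without external services, and the table and column names (raw_orders, orders_clean, amount_cents) are hypothetical placeholders.

import sqlite3

def extract(conn: sqlite3.Connection) -> list[tuple]:
    # Extract: pull raw rows from the (hypothetical) operational source table.
    return conn.execute(
        "SELECT order_id, amount_cents, country FROM raw_orders"
    ).fetchall()

def transform(rows: list[tuple]) -> list[tuple]:
    # Transform: convert cents to a decimal amount and normalize country codes.
    return [
        (order_id, amount_cents / 100.0, country.strip().upper())
        for order_id, amount_cents, country in rows
    ]

def load(conn: sqlite3.Connection, rows: list[tuple]) -> None:
    # Load: idempotent upsert into a warehouse-style target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders_clean "
        "(order_id INTEGER PRIMARY KEY, amount REAL, country TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO orders_clean VALUES (?, ?, ?)", rows
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("example.db")
    # Seed a tiny source table so the sketch runs end to end.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS raw_orders "
        "(order_id INTEGER PRIMARY KEY, amount_cents INTEGER, country TEXT)"
    )
    conn.execute("INSERT OR REPLACE INTO raw_orders VALUES (1, 1999, ' in ')")
    load(conn, transform(extract(conn)))
    print(conn.execute("SELECT * FROM orders_clean").fetchall())

In production this pattern typically scales out to an orchestrator (e.g., Airflow) and a distributed engine (e.g., Spark), but the extract/transform/load separation stays the same.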
Requirements:
- Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience as a Data Engineer or in a similar role.
- Strong knowledge of SQL and experience with relational and non-relational databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra).
- Proficient in programming languages like Python, Java, or Scala for data engineering tasks.
- Hands-on experience with ETL tools such as Apache NiFi, Talend, or Informatica.
- Familiarity with cloud data services (e.g., AWS Redshift, Google BigQuery, Azure Synapse).
- Experience with data processing frameworks such as Apache Hadoop, Apache Spark, or Apache Kafka.
- Strong understanding of data modeling, data warehousing, and database optimization techniques.
- Experience with version control systems like Git.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and the ability to work collaboratively with cross-functional teams.
Functional Areas: Software/Testing/Networking