Data Analyst - ETL (6-8 yrs)
Adept Global
About the Role:
We are seeking a highly skilled and experienced Data & Analytics Analyst to design and develop our data warehouse, including the critical user discovery & data exploration system.
This role will be instrumental in defining the architecture and implementation of our data infrastructure, ensuring scalability, performance, and reliability as we grow.
Key Responsibilities:
Data Warehouse Architecture:
- Design and implement a robust and scalable data warehouse architecture to support our business intelligence and analytics needs.
- Evaluate and select appropriate data warehousing technologies and tools, including data lakes, data marts, and data pipelines.
- Define data models, schemas, and ETL/ELT processes for optimal data ingestion, transformation, and storage (see the sketch after this list).
- Ensure data quality, consistency, and security across the entire data pipeline.
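For illustration, a minimal sketch of the kind of ETL/ELT job described above, assuming PySpark; the source path and table names (raw_events, warehouse.daily_views) are hypothetical:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-views-etl").getOrCreate()

    # Extract: read raw JSON events landed in the data lake (path is illustrative).
    raw = spark.read.json("s3://data-lake/raw_events/")

    # Transform: keep view events and aggregate watch activity per user per day.
    daily_views = (
        raw.filter(F.col("event_type") == "view")
           .withColumn("view_date", F.to_date("event_ts"))
           .groupBy("user_id", "view_date")
           .agg(F.count("*").alias("view_count"))
    )

    # Load: write a partitioned warehouse table queryable from Hive/Trino.
    (daily_views.write
        .mode("overwrite")
        .partitionBy("view_date")
        .saveAsTable("warehouse.daily_views"))

Partitioning by date keeps incremental loads and time-bounded queries cheap, which matters at the scale this role targets.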
User Discovery / Data Exploration System:
- Design and develop a sophisticated user discovery / data exploration system to understand user preferences, behaviour, and viewing habits.
- Implement algorithms and machine learning models to personalize content recommendations and improve user engagement.
- Integrate user data with other relevant data sources to gain deeper insights into user behaviour.
- Perform exploratory data analysis (EDA) to identify patterns, trends, anomalies, and relationships within the data (a brief example follows this list).
- Utilize statistical and machine learning techniques to analyse data and generate insights.
- Develop and maintain data visualizations (dashboards, reports) to communicate findings effectively.
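As a flavour of the EDA work, a minimal pandas sketch; the exported events file and its columns (user_id, genre, watch_minutes) are assumptions for illustration:

    import pandas as pd

    # Illustrative export from the warehouse; the schema is assumed.
    events = pd.read_parquet("events.parquet")

    # Viewing habits: total watch time per user and genre.
    habits = (events.groupby(["user_id", "genre"])["watch_minutes"]
                    .sum()
                    .reset_index())

    # Simple anomaly check: flag users whose total watch time sits more
    # than three standard deviations above the mean.
    totals = habits.groupby("user_id")["watch_minutes"].sum()
    outliers = totals[totals > totals.mean() + 3 * totals.std()]
    print(outliers.head())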
Technology Stack:
Selection and implementation of open-source technologies for the data warehouse, such as the following (a short integration sketch follows the list):
- Data warehousing platforms (e.g., Apache Hive, Presto, Trino)
- Data streaming platforms (e.g., Apache Kafka, Apache Pulsar)
- Data processing frameworks (e.g., Apache Spark, Apache Flink)
- Cloud-native technologies (e.g., AWS, GCP)
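To make the stack concrete, a minimal sketch of two of these pieces working together, Spark Structured Streaming reading from Kafka; the broker address, topic name, and paths are assumptions, and the spark-sql-kafka connector must be on the classpath:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

    # Subscribe to a raw events topic; Kafka delivers key/value as binary.
    stream = (spark.readStream
                   .format("kafka")
                   .option("kafka.bootstrap.servers", "broker:9092")
                   .option("subscribe", "user_events")
                   .load())

    # Land raw payloads as Parquet so Hive/Trino can query them downstream.
    query = (stream.selectExpr("CAST(value AS STRING) AS payload")
                   .writeStream
                   .format("parquet")
                   .option("path", "s3://data-lake/raw_events/")
                   .option("checkpointLocation", "s3://data-lake/_chk/raw_events/")
                   .start())
    query.awaitTermination()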
Qualifications:
Experience:
- Bachelor's or Master's degree in Computer Science or a related field.
- 6-8 years of experience in data warehousing and data engineering.
- Proven experience in designing and implementing large-scale data platforms.
- Hands-on experience with open-source technologies (e.g., Hadoop, Spark, Kafka).
- Experience with cloud-native technologies (AWS, GCP, or Azure) is a plus.
Technical Skills:
- Strong understanding of data modelling, data warehousing principles, and ETL/ELT processes.
- Expertise in SQL and scripting languages (e.g., Python, Scala).
- Experience with data visualization and business intelligence tools (e.g., Druid, Imply).
- Familiarity with machine learning concepts and their application to data analysis.
- Expertise in Python development.
Benefits:
- Opportunity to work on a cutting-edge AOSP-based platform.
- Be part of a dynamic and growing team.
- Work with open-source technologies and contribute to the open-source community.