Senior/Lead Data Engineer - Big Data Technologies (5-9 yrs)
TalentXO
posted 11hr ago
Flexible timing
Role & Responsibilities:
Our client is growing rapidly and seeking a strong Data Engineer to be a key member of the Data and Business Intelligence organization, with a focus on deep data engineering projects. You will join as one of the first data engineers on the data platform team in their Bengaluru office, with the opportunity to help define the technical strategy and data engineering team culture in India.
You will design and build data platforms and services while managing our data infrastructure in cloud environments that fuel strategic business decisions across their products.
A successful candidate will be a self-starter who drives excellence, is ready to jump into a variety of big data technologies and frameworks, and is able to coordinate and collaborate with, as well as mentor, other engineers on the team.
What You'll Be Doing:
- Build highly scalable, available, fault-tolerant distributed data processing systems (batch and streaming) that ingest hundreds of terabytes of data every day and feed a petabyte-scale data warehouse and Elasticsearch cluster.
- Build quality data solutions and refine existing diverse datasets to simplified models encouraging self-service.
- Build data pipelines that optimize for data quality and are resilient to poor-quality data sources.
- Own the data mapping, business logic, transformations, and data quality.
- Low-level systems debugging, performance measurement & optimization on large production clusters.
- Participate in architecture discussions, influence product roadmap, and take ownership and responsibility over new projects.
- Maintain and support existing platforms and evolve them to newer technology stacks and architectures.
Ideal Candidate :
- Proficiency in Python and PySpark.
- Deep understanding of Apache Spark, including Spark tuning, creating RDDs, and building DataFrames.
- Experience with big data technologies such as HDFS, YARN, MapReduce, Hive, Kafka, Spark, Airflow, and Presto.
- Must have familiarity with data formats such as Parquet and Avro, and with NoSQL databases.
- Must have worked for product companies or analytics-heavy companies.
- Experience in building distributed environments using any of Kafka, Spark, Hive, Hadoop, etc.
- Good understanding of the architecture and functioning of distributed database systems.
- Experience working with various file formats like Parquet, Avro, etc., for large volumes of data.
- Experience with AWS or GCP.
- 5+ years of professional experience as a data or software engineer.
Functional Areas: Software/Testing/Networking