5-7 years
Chennai
Senior Data Engineer - Big Data Technologies (5-7 yrs)
Trinity Technology Group
Posted 3 months ago
Flexible timing
We're seeking a highly skilled and experienced Senior Data Engineer to join our team! You will play a pivotal role in designing, developing, and deploying robust data pipelines and data warehouse solutions leveraging Google Cloud Platform (GCP).
You will leverage your expertise in big data technologies like Hadoop, Spark, and Airflow to build scalable and efficient data processing pipelines for large datasets.
Responsibilities :
- Design, develop, and implement data pipelines using GCP services (e.g., Dataflow, Cloud Functions).
- Architect and optimize data warehouse solutions using BigQuery to ensure efficient data storage and retrieval.
- Utilize big data technologies (Hadoop, Spark, Hive, Pig) to process, transform, and analyze large datasets.
- Write clean, maintainable, and well-documented code using Python and other relevant programming languages.
- Implement data quality checks and data validation processes to ensure data integrity.
- Manage and monitor data pipelines using Airflow and CI/CD tools (e.g., Jenkins, Screwdriver) for automation.
- Collaborate with data analysts and data scientists to understand data requirements and translate them into technical solutions.
- Leverage expertise in data formats like Avro, Parquet, and ORC for efficient data storage and processing.
- Stay up-to-date with the latest advancements in big data technologies and cloud computing platforms.
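For illustration, the data quality checks described above could be sketched as a minimal row-level validation step in Python (the schema and field names here are hypothetical; in a real pipeline this logic would typically run inside a Dataflow or Spark transform, with invalid records routed to a dead-letter sink):

```python
# Minimal sketch of a row-level data quality check (hypothetical schema).
# In practice this logic would run inside a Dataflow/Spark transform.

REQUIRED_FIELDS = {"event_id", "user_id", "timestamp"}


def validate_row(row: dict) -> list[str]:
    """Return a list of validation errors for one record (empty list = valid)."""
    errors = []
    missing = REQUIRED_FIELDS - row.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "timestamp" in row and not isinstance(row["timestamp"], (int, float)):
        errors.append("timestamp must be numeric (epoch seconds)")
    return errors


def split_valid_invalid(rows):
    """Partition records into (valid, invalid-with-errors) for a dead-letter sink."""
    valid, invalid = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            invalid.append((row, errs))
        else:
            valid.append(row)
    return valid, invalid
```

Keeping the failed records together with their error messages, rather than silently dropping them, is what makes downstream data-integrity monitoring possible.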
Technical Skills & Qualifications :
- 5+ years of experience in big data engineering or a related field.
- In-depth knowledge of Google Cloud Platform (GCP) services for data processing and storage (e.g., Dataflow, BigQuery, Cloud Storage).
- Expertise in big data technologies like Hadoop, Spark, Hive, Pig, and HQL for data processing and analysis.
- Strong proficiency in Python and experience with other scripting languages (e.g., Bash, Shell).
- Solid understanding of data warehousing concepts and data modeling techniques.
- Experience with workflow orchestration and CI/CD tools like Airflow, Jenkins, or Screwdriver for automation.
- Familiarity with data quality tools and techniques for data validation.
- Experience with data visualization tools (Looker, Tableau, Business Objects) is a plus.
- Strong understanding of Unix/Linux operating systems.
- Excellent problem-solving, analytical, and critical thinking skills.
- Effective communication and collaboration skills.
- Ability to work independently and as part of a cross-functional team.
Functional Areas: Software/Testing/Networking