Data Engineer - Data Pipeline (5-7 yrs)
Risah Careers
We are hiring for a US-based company; the position is remote.
Location: Hyderabad (remote, but candidates should come to the office twice a week based on client requirements)
Job Description:
We are seeking a candidate with 3+ years of experience as a software engineer (or equivalent) designing large, data-heavy distributed systems and/or high-traffic web apps.
Primary Skills: GCP, Python (coding is a must), SQL coding skills, BigQuery, Airflow and Airflow DAGs.
Requirements:
- Hands-on experience designing & managing large data models, writing performant SQL queries, and working with large datasets and related technologies.
- Experience working with cloud platforms such as GCP, including BigQuery.
- Strong analytical, problem-solving, and interpersonal skills; a hunger to learn; and the ability to operate in a self-guided manner in a fast-paced, rapidly changing environment.
Must have: Experience in pipeline orchestration (i.e., Airflow).
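For context on what this orchestration requirement typically looks like in practice, here is a minimal sketch of an Airflow DAG running a BigQuery rollup, assuming Airflow 2.4+ with the Google provider installed; the DAG id, project, dataset, and query are hypothetical placeholders, not part of this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_events_rollup",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Single task: run a daily rollup query in BigQuery (standard SQL).
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                # Illustrative query; project/dataset/table are placeholders.
                "query": """
                    SELECT DATE(event_ts) AS event_date, COUNT(*) AS events
                    FROM `my-project.analytics.events`
                    GROUP BY event_date
                """,
                "useLegacySql": False,
            }
        },
    )
```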
Key Responsibilities:
- Develop and maintain scalable and reliable data pipelines to ingest data from various APIs into the AWS ecosystem (a minimal sketch follows this list).
- Manage data storage solutions using S3 buckets, ensuring best practices in data organization and security.
- Utilize AWS Redshift for data warehousing tasks, optimizing data retrieval and query performance.
- Configure and use AWS Glue for ETL processes, ensuring data is clean, well-structured, and ready for analysis.
- Utilize EC2 instances for custom applications and services that require compute capacity.
- Implement data lake and warehousing strategies to support analytics and business intelligence initiatives.
- Collaborate with cross-functional teams to understand data needs and deliver solutions that align with business goals.
- Ensure compliance with data governance and security policies.
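As a rough illustration of the API-to-S3 ingestion responsibility above: a minimal sketch assuming boto3 and requests are available; the API URL, bucket name, and key prefix are hypothetical placeholders.

```python
import json
from datetime import datetime, timezone

import boto3
import requests


def ingest_to_s3(api_url: str, bucket: str, prefix: str) -> str:
    """Fetch one page of records from an API and land it raw in S3."""
    resp = requests.get(api_url, timeout=30)
    resp.raise_for_status()

    # Partition raw landings by date so downstream ETL can prune cheaply.
    now = datetime.now(timezone.utc)
    key = f"{prefix}/dt={now:%Y-%m-%d}/records_{now:%H%M%S}.json"

    boto3.client("s3").put_object(
        Bucket=bucket,
        Key=key,
        Body=json.dumps(resp.json()).encode("utf-8"),
    )
    return key


# Example usage (all values illustrative):
# ingest_to_s3("https://api.example.com/v1/orders", "my-data-lake", "raw/orders")
```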
Qualifications:
- Solid experience in AWS services, especially S3, Redshift, Glue, and EC2.
- Proficiency in data ingestion and integration, particularly with APIs.
- Strong understanding of data warehousing, ETL processes, and cloud data storage.
- Experience with scripting languages such as Python for automation and data manipulation.
- Familiarity with infrastructure as code tools for managing AWS resources.
- Excellent problem-solving skills and ability to work in a dynamic environment.
- Strong communication skills for effective collaboration and documentation.
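To make the S3-to-Redshift qualification concrete, a minimal sketch of a COPY load, assuming psycopg2 and an IAM role already attached to the cluster; the cluster endpoint, table, bucket, and role ARN are all hypothetical placeholders.

```python
import psycopg2

# Redshift COPY pulls partitioned JSON straight from S3 in parallel.
COPY_SQL = """
    COPY analytics.orders
    FROM 's3://my-data-lake/raw/orders/dt=2024-01-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load'
    FORMAT AS JSON 'auto';
"""

with psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="loader",
    password="***",  # placeholder; use a secrets manager in practice
) as conn:
    with conn.cursor() as cur:
        cur.execute(COPY_SQL)
```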
Functional Areas: Software/Testing/Networking