Data Pipeline Development: Design, develop, and maintain efficient, scalable, and reliable ETL (Extract, Transform, Load) pipelines to handle large datasets, supporting analytics and business intelligence needs.
Data Integration: Work with data from a variety of sources, including databases, APIs, and files, to create unified datasets optimized for analytics and machine learning.
Database Management: Oversee and optimize data storage solutions, such as Snowflake, Redshift, and relational databases like MySQL and PostgreSQL, ensuring high performance and data availability.
Data Quality & Governance: Implement data quality checks with tools like Great Expectations to maintain data accuracy, integrity, and consistency, and ensure compliance with data governance practices.
Performance Optimization: Enhance system performance, scalability, and efficiency by optimizing data processing systems.
Cross-Functional Collaboration: Partner with data scientists, analysts, and software engineers to translate business needs into robust data solutions.
Documentation: Maintain documentation for data architecture, workflows, and systems to support scalability and ease of maintenance.
What Cowbell needs from you:
Educational Background: A Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a closely related field.
Professional Experience: 4+ years in a Software Developer, Data Engineer, or similar role.
Technical Skills:
Expertise in Python and Pandas, API development (FastAPI, REST, GraphQL), and version control (GitHub).
Proficiency in SQL and NoSQL databases, including MySQL and PostgreSQL.
Familiarity with big data tools like Hadoop, Spark, and data warehousing platforms like Redshift and Snowflake.
Extensive experience with AWS cloud services, including S3, Glue, Redshift, EMR, Kinesis, Athena, DynamoDB, RDS, Lambda, MWAA, CloudWatch, EC2, and other relevant AWS tools.
Competency with data pipeline orchestration tools such as Apache Airflow.
Strong understanding of data modeling, data architecture, and best practices in data governance.
Soft Skills:
Problem-solving abilities with a keen attention to detail.
Capable of working both independently and within a team setting.
Strong communication and collaboration skills.
Flexibility to adapt to new technologies and a commitment to continuous learning.
Preferred Qualifications:
Experience with real-time data streaming technologies (e.g., Kafka, Flink).
Familiarity with containerization (Docker) and orchestration tools (Kubernetes).
Understanding of data security, compliance, cyber security, and privacy best practices.
Familiarity with DevOps practices, CI/CD pipelines, and Infrastructure as Code (IaC) tools such as Terraform or CloudFormation as applied to data engineering projects.
What Cowbell brings to the table:
Employee equity plan for all and wealth enablement plan for select customer-facing roles.
Comprehensive wellness program, meditation app subscriptions, lunch and learn, book club, happy hours and much more.
Professional development and the opportunity to learn the ins and outs of cyber insurance and cyber security while continuing to build your professional skills in a team environment.