Cogoport - Senior Data Engineer - ETL (6-10 yrs)
Cogoport is on a mission to bridge the $3.4 trillion Trade Knowledge and Execution Gap, empowering businesses worldwide by simplifying global trade. As a leading Global Trade Platform, we help companies connect with trade partners, optimize logistics, and improve cash flow. Recognized as an Asia-Pacific High-Growth Company (Financial Times) and an Indian Growth Champion (Economic Times), we are expanding across India, China, Vietnam, Singapore, Thailand, and Indonesia.
Why Join Us?
At Cogoport, you'll work with some of the brightest minds in the industry, driving digital transformation in logistics. We foster an entrepreneurial culture, where innovation, impact, and career growth go hand in hand.
- Impact: Work on real-world logistics challenges that directly optimize global trade and supply chain efficiencies.
- Innovation: Develop AI-driven logistics solutions that improve freight booking, shipment tracking, and cost optimization.
- Collaboration: Work with cross-functional teams across data science, engineering, and product to deliver impactful solutions.
- Growth: Be part of a fast-growing company and scale next-gen logistics platforms using advanced data engineering and AI.
Join us in revolutionizing digital freight and logistics!
About the Role
We are looking for a Senior Data Engineer (SDE 3) to build scalable, high-performance data solutions that drive actionable insights for logistics, freight pricing, shipment tracking, and operational efficiencies.
Key Responsibilities:
1. Scalable Data Pipeline Development:
- Design and develop real-time and batch ETL/ELT pipelines for structured and unstructured logistics data (freight rates, shipping schedules, tracking events, etc.); a minimal pipeline sketch follows this list.
- Optimize data ingestion, transformation, and storage for high availability and cost efficiency.
- Ensure seamless integration of data from global trade platforms, carrier APIs, and operational databases.
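To make the first responsibility concrete, here is a minimal batch ETL sketch in Python (pandas with a pyarrow-backed Parquet writer). It is an illustrative example only: the input file, column names, and output location are hypothetical, not Cogoport's actual schema or infrastructure.

import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Read a raw freight-rate export, e.g. a daily dump from a carrier API.
    return pd.read_csv(path, parse_dates=["quoted_at"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Drop rows missing a rate, normalize currency codes,
    # and derive a date column to partition the output by.
    df = df.dropna(subset=["rate_usd"])
    df["currency"] = df["currency"].str.upper()
    df["quote_date"] = df["quoted_at"].dt.date
    return df

def load(df: pd.DataFrame, out_dir: str) -> None:
    # Partitioning by quote date lets downstream queries prune files.
    df.to_parquet(out_dir, partition_cols=["quote_date"])

if __name__ == "__main__":
    load(transform(extract("freight_rates.csv")), "freight_rates_parquet")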
2. Data Architecture & Infrastructure:
- Architect scalable, cloud-native data platforms using AWS (S3, Glue, EMR, Redshift), GCP (BigQuery, Dataflow), or Azure.
- Build and manage data lakes, warehouses, and real-time processing frameworks to support analytics, machine learning, and reporting needs.
- Optimize distributed databases (Snowflake, Redshift, BigQuery, Apache Hive) for logistics analytics.
3. Real-time & Predictive Analytics:
- Develop streaming data solutions using Apache Kafka, Pulsar, or Kinesis to power real-time shipment tracking, anomaly detection, and dynamic pricing; see the consumer sketch after this list.
- Enable AI-driven freight rate predictions, demand forecasting, and shipment delay analytics.
- Improve customer experience by providing real-time visibility into supply chain disruptions and delivery timelines.
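As a hedged illustration of the streaming work above, the sketch below consumes shipment-tracking events with the kafka-python client and flags a simple delay anomaly. The topic name, broker address, and event schema are hypothetical examples, not a description of Cogoport's actual systems.

import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "shipment-tracking-events",              # hypothetical topic name
    bootstrap_servers="localhost:9092",      # hypothetical broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="latest",
)

for message in consumer:
    event = message.value
    # Flag a simple anomaly: an ETA pushed back by more than 48 hours.
    if event.get("eta_delay_hours", 0) > 48:
        print(f"Possible disruption on shipment {event['shipment_id']}")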
4. Performance Optimization & Data Governance:
- Ensure high availability, fault tolerance, and data security compliance (GDPR, CCPA) across the platform.
- Improve query performance for large-scale logistics datasets through indexing, partitioning, and caching techniques; a partitioning sketch follows this list.
- Implement data quality checks, lineage tracking, and metadata management for transparency and governance.
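One common form the partitioning technique above takes is PostgreSQL declarative range partitioning, sketched here from Python via psycopg2. The connection string, table, and columns are hypothetical, chosen purely to illustrate the idea.

import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS shipment_events (
    shipment_id TEXT NOT NULL,
    event_time  TIMESTAMPTZ NOT NULL,
    status      TEXT
) PARTITION BY RANGE (event_time);

-- One partition per quarter; queries filtered on event_time
-- only touch the relevant partitions.
CREATE TABLE IF NOT EXISTS shipment_events_2024_q1
    PARTITION OF shipment_events
    FOR VALUES FROM ('2024-01-01') TO ('2024-04-01');

CREATE INDEX IF NOT EXISTS idx_shipment_events_time
    ON shipment_events (event_time);
"""

with psycopg2.connect("dbname=logistics") as conn:  # hypothetical DSN
    with conn.cursor() as cur:
        cur.execute(DDL)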
5. Collaboration & Leadership:
- Work closely with product managers, data scientists, and logistics experts to translate business needs into scalable data solutions.
- Partner with business teams to deliver revenue-impacting insights on freight, shipments, and trade patterns.
Required Qualifications:
- 6+ years of experience in data engineering, working with large-scale distributed systems.
- Strong proficiency in Python, Java, or Scala for data processing.
- Expertise in SQL and NoSQL databases (PostgreSQL, Cassandra, Snowflake, Apache Hive, Redshift).
- Experience with big data processing frameworks (Apache Spark, Flink, Hadoop).
- Hands-on experience with real-time data streaming (Kafka, Kinesis, Pulsar) for logistics use cases.
- Deep knowledge of AWS/GCP/Azure cloud data services like S3, Glue, EMR, Databricks, or equivalent.
- Familiarity with Airflow, Prefect, or Dagster for workflow orchestration; a minimal DAG sketch follows this list.
- Strong understanding of logistics and supply chain data structures, including freight pricing models, carrier APIs, and shipment tracking systems.
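For the orchestration point above, here is a minimal Airflow DAG sketch (assuming Airflow 2.4+, where the schedule argument is available). The DAG id, schedule, and task bodies are hypothetical placeholders.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_rates():
    print("pull freight rates from carrier APIs")  # placeholder task body

def load_warehouse():
    print("load cleaned rates into the warehouse")  # placeholder task body

with DAG(
    dag_id="freight_rates_daily",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_rates", python_callable=extract_rates)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)
    extract >> load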
Preferred Qualifications:
- Experience in building predictive analytics solutions for freight rates, demand forecasting, and risk assessment.
- Experience in machine learning model deployment and MLOps practices.
- Exposure to real-time analytics and OLAP systems.
- Certifications in AWS/GCP Data Engineering.
Functional Areas: Software/Testing/Networking