Data Engineer - ETL/SQL/Python (2-6 yrs)
Queuesphere Tech
posted 5d ago
Role Overview:
Data engineers at Queuesphere Tech design next-generation, AI-powered data platforms that deliver real-time insights, high-throughput processing, and analytics to support business and product needs.
Key Responsibilities:
- Build and optimize real-time, AI-powered data pipelines using tools such as Apache Flink, Kafka Streams, or Spark Structured Streaming (see the sketch after this list).
- Implement vector storage and search systems (e.g., Pinecone, Weaviate) to support ML applications.
- Develop automated data quality checks and anomaly detection pipelines with AI-driven observability.
- Collaborate with AI/ML teams to provide well-structured datasets for model training and inference.
- Ensure compliance with modern data privacy laws (GDPR, CCPA) using AI-powered data governance tools.
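For illustration only (not part of the posting): a minimal PySpark Structured Streaming sketch of the kind of real-time pipeline described above. The Kafka topic name, broker address, event schema, file paths, and Delta output format are all assumptions, and the Kafka and Delta Lake Spark packages are assumed to be available on the cluster.

```python
# Minimal sketch: read JSON events from Kafka, parse them, append to a Delta table.
# Topic, schema, broker, and paths are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("realtime-events").getOrCreate()

event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

# Read the raw Kafka stream (placeholder broker and topic).
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .load()
)

# Parse the JSON payload into typed columns.
parsed = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append continuously to a Delta table (placeholder paths).
query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")
    .outputMode("append")
    .start("/tmp/delta/events")
)
query.awaitTermination()
```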
Technological Expectations:
- Frameworks: Expertise in Delta Lake, Apache Iceberg, and Snowflake for modern data lake and lakehouse architectures.
- AI Tooling: Experience with dbt Cloud (with AI plugins), Datafold, or Soda for smarter ETL pipeline monitoring.
- Languages: Mastery of Python (PySpark) and SQL for high-performance data transformation.
- Workflow Management: Proficiency with Prefect or Dagster for orchestrating complex workflows (see the sketch after this list).
- Cloud Platforms: Deep experience with AWS EMR, GCP BigQuery, or Azure Synapse Analytics.
- Data Security: Experience implementing automated data masking and access control with AI-driven solutions.
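For illustration only (not part of the posting): a minimal Prefect 2.x sketch of an orchestrated ETL flow, as referenced in the Workflow Management item above. Task names, retry settings, and step bodies are illustrative placeholders for whatever the real pipeline would do.

```python
# Minimal Prefect 2.x sketch of an extract-transform-load flow with retries.
from prefect import flow, task


@task(retries=2, retry_delay_seconds=30)
def extract() -> list[dict]:
    # Placeholder: pull records from a source system.
    return [{"id": 1, "amount": 42.0}]


@task
def transform(records: list[dict]) -> list[dict]:
    # Placeholder: apply business transformations.
    return [{**r, "amount_cents": int(r["amount"] * 100)} for r in records]


@task
def load(records: list[dict]) -> None:
    # Placeholder: write the transformed records to the warehouse.
    print(f"loaded {len(records)} records")


@flow(log_prints=True)
def daily_etl():
    records = extract()
    load(transform(records))


if __name__ == "__main__":
    daily_etl()
```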
Cultural Expectations:
- Drive an AI-first approach for ETL and data enrichment processes.
- Foster a culture of proactive monitoring with alert systems powered by AI models.
- Promote team collaboration with knowledge-sharing tools like Notion AI or Confluence with AI extensions.
Functional Areas: Software/Testing/Networking