Senior Data Architect - Big Data (1-2 yrs)
Simpplr
posted 19hr ago
Flexible timing
The Opportunity:
We are seeking a highly hands-on Analytics Technical Architect with experience in designing, implementing, and scaling big data and analytics ecosystems.
This role demands deep technical expertise, a strong problem-solving mindset, and the ability to architect and build scalable data platforms using AWS, Snowflake, and distributed data processing frameworks (PySpark or Scala).
You will play a critical role in hands-on development, building real-time and batch data pipelines, integrating embedded analytics and BI tools (Sisense, ThoughtSpot, GoodData, Omni.co, Reveallab.io, Yellowfin, Tableau, Power BI), and ensuring data-driven decision-making across the organization.
You will be responsible for delivering cutting-edge solutions, supporting real-time insights, and ensuring scalable analytics infrastructure that aligns with business objectives. If you thrive on writing code, optimizing performance, and getting deep into data infrastructure, this is the role for you.
Key Responsibilities:
Big Data & Analytics Platform Architecture:
- Design, develop, and optimize scalable big data architectures using AWS-native services (AWS EMR, AWS Redshift, AWS Glue, S3) and Snowflake.
- Write production-level code in PySpark, Scala, or SQL for big data processing, ETL, and analytics workflows.
- Implement high-performance distributed computing solutions that scale to large data volumes and real-time insights.
- Optimize query performance and resource utilization on Snowflake, Redshift, and Spark-based architectures.
BI & Embedded Analytics:
- Integrate BI tools and embedded analytics platforms such as Sisense, ThoughtSpot, GoodData, Omni.co, Reveallab.io, Yellowfin, Tableau, and Power BI with Snowflake and Redshift.
- Optimize BI performance, ensuring efficient querying, caching, and dashboard responsiveness for end-users.
- Build custom analytics solutions that embed directly into applications for seamless data visualization.
- Develop self-service analytics frameworks, enabling non-technical users to extract insights without engineering dependency.
Data Governance, Security & Compliance:
- Enforce data governance frameworks, ensuring data quality, integrity, and security across the entire platform.
- Implement AWS IAM, KMS, CloudTrail, and Snowflake security features to manage access control, encryption, and auditing.
- Define and implement policies for data lineage, metadata management, and regulatory compliance (e.g., GDPR, CCPA).
Big Data Processing & Streaming:
- Develop and optimize real-time data pipelines using AWS Kinesis, Kafka, and AWS EMR with Apache Spark.
- Write and debug streaming jobs to handle millions of real-time events per second, ensuring low-latency analytics.
- Design fault-tolerant and scalable streaming architectures for event-driven analytics applications.
Data Pipelines & ETL/ELT Workflows:
- Architect and develop robust ETL/ELT workflows using AWS Glue, Lambda, Apache Kafka, dbt, and PySpark/Scala.
- Own end-to-end data flow from ingestion to transformation and analytics, ensuring low-latency, high-throughput pipelines.
- Implement automated data pipeline monitoring, alerting, and performance tuning.
Collaboration & Leadership:
- Work hands-on with engineers, data scientists, and analysts, guiding them in best practices for scalable analytics development.
- Conduct code reviews, architecture discussions, and deep dives into performance bottlenecks, optimizing for cost, speed, and efficiency.
- Partner with business leaders to understand analytics needs and translate them into practical, scalable data solutions.
- Lead cloud-based data platform initiatives, ensuring high availability, fault tolerance, and cost optimization.
What Makes You a Great Fit for Us?
- 5+ years of experience in data architecture, analytics, and big data processing.
- Expertise in BI and analytics platforms such as Sisense, ThoughtSpot, GoodData, Omni.co, Reveallab.io, Yellowfin, Tableau, and Power BI, and in integrating them with Snowflake to deliver actionable insights.
Deep Knowledge of Modern Data Technologies:
- Big Data & Analytics: Spark, Kafka, Hadoop, Druid, ClickHouse, Presto, Snowflake, EMR, Redshift, Glue, S3.
- Databases: PostgreSQL, MongoDB, Elasticsearch.
- Deep expertise in coding with PySpark, Scala, and SQL for big data processing and analytics workflows.
- Proven ability to design, build, and scale real-time and batch data pipelines using AWS Glue, Kafka, dbt, and Lambda.
- Deep knowledge of real-time data streaming (AWS Kinesis, Kafka, Spark Streaming).
- Strong problem-solving ability with a focus on hands-on coding, debugging, and performance tuning.
- Microservices & Event-Driven Architecture: Understanding of real-time event processing architectures.
- Strategic Thinking: Ability to design and implement long-term data strategies aligned with business goals.
- Problem-Solving & Optimization: Strong analytical skills with a deep understanding of performance tuning for large-scale data systems.
- Visionary Leadership: Ability to think strategically and drive engineering excellence within the team.
- Communication Skills: Strong interpersonal and communication skills to collaborate effectively across teams.
- Attention to Detail: An eye for detail with the ability to translate ideas into tangible, impactful outcomes.
- Agility: Comfortable managing and delivering work in a fast-paced, dynamic environment.
Preferred Skills (Good to Have):
- Hands-on experience with AWS Public Cloud.
- Experience with Hadoop, HBase, and legacy big data architectures.
- Knowledge of machine learning pipelines and MLOps in big data environments.
- AWS certifications (e.g., AWS Solutions Architect, AWS Big Data).
- Experience designing self-service analytics solutions for non-technical users.
- Hands-on experience with Kubernetes, Terraform, and Infrastructure-as-Code (IaC) for data platforms.
- Experience with data security, encryption, and access control mechanisms.
- Experience in Event/Data Streaming platforms.
- Experience in risk management and compliance frameworks.
Functional Areas: Other