Database Architect - Big Data/Hadoop (13-20 yrs)
FxConsulting
posted 11d ago
As a Data Architect, you will play a pivotal role in designing and building the data infrastructure that supports our high-scale, fast-moving operations.
Given our rapid growth as a quick-commerce leader, your expertise will be crucial in architecting systems that can handle massive volumes of real-time data from millions of transactions, dynamic inventory updates, delivery routing, and customer interactions, all while ensuring low latency and high availability.
You will be responsible for creating scalable, robust, and efficient data systems that not only support day-to-day operations but also enable predictive analytics, demand forecasting, and advanced data-driven decision-making. With a focus on real-time data processing and high-frequency transactional data, this role offers a unique opportunity to shape the future of data systems in an industry where speed, precision, and scalability are paramount.
What would you be doing / what is expected from this role?
- Develop a long-term database management plan, including architecture to support multi-tenancy and infrastructure scaling.
- Monitor database health, storage growth, and predict potential performance issues.
- Conduct query cost analysis and optimize performance.
- Collaborate with DB Managed Services vendors to identify and mitigate performance issues.
- Work with AWS for regular database operational and performance reviews.
- Develop and maintain BigBasket-specific database standards and ensure compliance.
- Analyze and optimize database and team costs, ensuring timely setup and renewal of capacity reservations to reduce spend.
- Collaborate with development teams to review and optimize DB queries.
- Plan and manage regular database maintenance activities.
- Oversee DB Managed Services vendor performance and evaluate monitoring setups, alerts, and thresholds.
- Perform regular database upgrades and ensure database hardening to eliminate vulnerabilities.
- Address data security requirements, including encryption and sanitization during non-prod DB refresh activities.
- Handle user security and audit activities.
- Evaluate new tools and capabilities.
- Develop and maintain best practices for development teams.
Who are we looking for?
- Bachelor's degree in computer science or equivalent practical experience.
- At least 9-12 years of experience in data architecture, data engineering, or related fields, with a focus on designing and implementing large-scale data solutions.
- Strong experience in leading data architecture initiatives, including building data warehouses, data lakes, and cloud-based data platforms.
- Proven track record in designing and optimizing data systems for high availability, scalability, and performance.
- Expertise in relational and NoSQL databases (e.g., PostgreSQL, MySQL, Cassandra, Timescale).
- Extensive experience with cloud data platforms (e.g., AWS) and cloud-native data services (e.g., Redshift, BigQuery, Snowflake, Databricks).
- Strong expertise in data modeling, ETL/ELT processes, and data integration tools (e.g., Apache Kafka, Talend, Informatica, dbt).
- Proficiency in data pipeline automation and orchestration tools (e.g., Airflow, Apache NiFi).
- Hands-on experience with big data processing frameworks (e.g., Hadoop, Spark).
- Proficient in SQL, Scala, and/or Java for data management and automation.
- Strong leadership and mentorship abilities, with a proven track record of leading technical teams.
- Excellent communication skills, with the ability to engage with both technical and non-technical stakeholders.
- Ability to influence and drive change within a cross-functional team, advocating for best practices in data management.
- Analytical and problem-solving mindset with the ability to think strategically and execute tactically.
- Self-motivated and adaptable to changing business needs and evolving technologies.
Functional Areas: Software/Testing/Networking