Neysa.ai - Database Administrator (7-12 yrs)
Nirdesa Networks
Key Responsibilities:
- Data Pipeline Management: Design, implement, and manage robust data pipelines to ensure seamless data flow between systems for AI models and machine learning workloads.
- Database Reliability & Optimization: Oversee the reliability, availability, and performance of databases and data stores across the platform. Tune and optimize queries, schemas, and configurations for improved performance.
- Data Security & Compliance: Implement data security measures to protect sensitive information, ensuring compliance with data privacy regulations and company policies.
- Collaboration with AI Teams: Work closely with Data Science and ML teams to ensure data is accessible, clean, and in the right format for model training and inference.
- Monitoring & Troubleshooting: Continuously monitor the health of data pipelines and databases. Quickly troubleshoot and resolve performance bottlenecks or system failures.
- Automation & Optimization: Develop automated solutions for data pipeline operations, reducing manual intervention and ensuring high scalability and low latency.
- Documentation: Maintain clear documentation of database architecture, data pipelines, and troubleshooting processes for cross-team collaboration.
Must-have skills:
- Database Administration: 4-8 years of experience managing relational databases (PostgreSQL, MySQL, or similar), including backup, recovery, performance tuning, and high-availability solutions.
- Data Pipeline Development: Solid experience building and managing scalable data pipelines using tools like Apache Kafka, Apache Airflow, or other ETL frameworks.
- Elasticsearch Management: Experience managing Elasticsearch deployments.
- Cluster Operations: Ability to configure and run PostgreSQL, Kafka, and Elasticsearch clusters.
- Security Practices: Expertise in implementing data encryption, access controls, and security policies to ensure the safety of sensitive data within databases and pipelines.
- SQL & Query Optimization: Proficiency in writing complex SQL queries and optimizing database performance for large datasets and high-velocity transactions.
- Problem-Solving & Troubleshooting: Strong analytical and troubleshooting skills to diagnose and resolve system and data pipeline issues in real time.
Good to have:
- Graph & Time-Series Databases: Familiarity with graph and time-series databases such as Neo4j or TimescaleDB.
- NoSQL Databases: Familiarity with NoSQL databases such as MongoDB or Redis for handling unstructured data.
- AI/ML Collaboration: Experience working closely with AI/ML teams to support data requirements for model training and production environments.
- Big Data Technologies: Familiarity with big data tools like Apache Hadoop, Spark, or related technologies for handling large-scale data processing.
- Scripting: Proficiency in scripting languages like Python, Bash, or similar for automating tasks and developing custom solutions.
Technical Skills:
- Databases: PostgreSQL, MySQL, Elasticsearch
- Data Pipeline Tools: Apache Kafka or similar ETL frameworks
- Security: Data encryption, IAM (Identity and Access Management), access control policies
- Programming/Scripting: SQL, Python, Bash, or similar languages for automation
- Monitoring & Logging: Prometheus, Grafana, ELK Stack, CloudWatch
- NoSQL: MongoDB, Neo4j, TimescaleDB, Redis
- Big Data: Apache Spark, Hadoop, Flink
Functional Areas: Software/Testing/Networking