Data Scientist - Machine Learning (3-12 yrs)
HR Works Consultancy
Experience: 3-12 years
Locations: Bangalore / Bengaluru, Hyderabad / Secunderabad, Chennai + 2 more
Job Description :
The ideal candidate is a hands-on technology developer with experience building scalable applications and platforms, is at ease working in an agile environment with little supervision, and is self-motivated with a passion for problem-solving and continuous learning.
Designation :
Project Tech. Lead (4A) : 5 - 7 Years
Project Manager/Architect (4B/5A) : 7 - 10 Years
Role and responsibilities :
- Project Management (50%)
- Front Door (Requirements, Metadata collection, classification & security clearance)
- Data pipeline template development
- Data pipeline monitoring development & support (operations)
- Design, develop, deploy, and maintain production-grade, scalable data transformation, machine learning, and deep learning code and pipelines; manage data and model versioning, training, tuning, serving, and experiment and evaluation tracking dashboards (a rough PySpark pipeline sketch follows this list).
- Manage the ETL and machine learning model lifecycle: develop, deploy, monitor, maintain, and update data and models in production.
- Build and maintain tools and infrastructure for data processing in AI/ML development initiatives.
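As a rough illustration of the data pipeline work described above, the sketch below shows a minimal PySpark batch transformation; the paths, column names, and aggregation logic are hypothetical and would vary with the actual platform:

from pyspark.sql import SparkSession, functions as F

# Hypothetical job name and input/output paths
spark = SparkSession.builder.appName("orders-daily-aggregation").getOrCreate()

orders = spark.read.parquet("/data/raw/orders/")

# Basic transformation: filter completed orders, derive a date column, aggregate
daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_revenue"),
         F.count("*").alias("order_count"))
)

# Write the curated output partitioned by date
daily_revenue.write.mode("overwrite").partitionBy("order_date") \
    .parquet("/data/curated/daily_revenue/")

spark.stop()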
Technical skills requirements :
The candidate must demonstrate proficiency in the following :
- Experience deploying machine learning models into a production environment.
- Strong DevOps, data engineering, and ML background with cloud platforms
- Experience in containerization and orchestration (such as Docker, Kubernetes)
- Experience with ML training/retraining, model registries, and ML model performance measurement using open-source MLOps frameworks.
- Experience building/operating systems for data extraction, ingestion and processing of large data sets
- Experience with MLOps tools such as MLFlow and Kubeflow (see the tracking sketch after this list)
- Experience in Python scripting
- Experience with CI/CD
- Fluency in Python data tools, e.g., Pandas, Dask, or PySpark
- Experience working on large-scale, distributed systems
- Python/Scala for data pipelines
- Scala/Java/Python for micro-services and APIs
- HDP, Oracle, and SQL skills; Spark, Scala, Hive, and Oozie; DataOps (DevOps, CDC)
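To make the MLOps expectations above concrete, the sketch below shows a minimal MLflow experiment-tracking and model-registry flow; the tracking URI, experiment name, and registered model name are hypothetical, and the example assumes a scikit-learn classifier and a reachable MLflow tracking server:

import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical tracking server and experiment name
mlflow.set_tracking_uri("http://mlflow.example.internal:5000")
mlflow.set_experiment("churn-model-experiments")

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params).fit(X_train, y_train)

    # Track hyperparameters and evaluation metrics for the run
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))

    # Log the trained model and register it in the Model Registry
    mlflow.sklearn.log_model(model, artifact_path="model",
                             registered_model_name="churn-classifier")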
Nice-to-have skills :
- Jenkins, K8S
- Google Cloud certification
- Unix or Shell scripting
Functional Areas: Other