We seek a candidate with strong collaboration skills and the ability to derive meaningful insights from complex data sets to address critical questions. In this role, you'll thrive in a dynamic, fast-paced learning environment and contribute to solving challenges for some of the world's largest organizations, including Fortune 500 companies. You'll work closely with internal business teams and clients to tackle data-driven problems, including dashboard creation, automation, reporting, and predictive and prescriptive analytics.
**Key Responsibilities:**
**Model Development and Deployment:**
Design, implement, and deploy scalable machine learning models using Azure ML Workbench/Amazon SageMaker and Azure Cognitive Services (e.g., OpenAI, NLQ).
**Algorithm Expertise:**
Leverage advanced knowledge of machine learning algorithms, including linear regression, logistic regression, decision trees, random forests, and time series forecasting, to solve complex business challenges and enhance existing models.
**Collaborative Development:**
Partner with data science and engineering teams to integrate models into cloud-based applications and services.
**Optimization:**
Continuously monitor and refine model performance to improve accuracy and efficiency.
**Continuous Learning:**
Stay current on the latest advancements in machine learning, particularly with AWS and related technologies, to ensure our solutions remain cutting-edge.
**Required Skills & Experience:**
4-8 years of hands-on experience developing ML models and using AutoML tools.
Proficiency in implementing GenAI solutions and open-source LLMs, with expertise in Natural Language Processing algorithms.
Strong skills in SQL and Python, along with familiarity with libraries and tools such as PyTorch, pandas, NumPy, Spark NLP, Hugging Face Transformers, LangChain, OpenAI, GitHub, and scikit-learn.
Practical knowledge of various machine learning and deep learning algorithms.
Proven ability to deploy scalable machine learning models in real-world scenarios.
Excellent problem-solving skills with the ability to translate business needs into analytical solutions.
Strong foundation in statistical modeling (e.g., regression, clustering, time series analysis), hypothesis testing, and experimental design.
Experience with data visualization tools (e.g., Tableau, matplotlib, ggplot) for presenting complex insights effectively.
Familiarity with big data technologies and cloud platforms: PySpark and AWS (Comprehend, Textract, EC2, SageMaker, Lambda, SNS, SQS, EventBridge).
**Preferred Qualifications:**
Bachelor's degree in Computer Science, Engineering, Operations Research, Mathematics, Economics, or a related field.
Strong expertise in SQL, Python, and Scala with hands-on experience.
Ability to work independently in a high-pressure environment with tight deadlines.
Proven track record of engaging cross-functional teams to implement project and program requirements.