Job Description:
Data is at the heart of our global financial network. In fact, the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, including our customers, operations teams and data scientists. We focus on evolving our platform to deliver exponential scale to NCR Atleos, powering our future growth.
Data Scientists at NCR Atleos experience working at one of the largest and most recognized financial companies in the world, while being part of a software development team responsible for next-generation technologies and solutions. They partner with data and analytics experts to deliver high-quality analytical and derived data to our consumers.
We are looking for a Data Engineer who likes to innovate and seeks out complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. Design is an iterative process, whether for UX, services, or infrastructure. Our goal is to modernize and improve application capabilities.
Responsibilities
As a Data Engineer, you will join our Data Engineering Modernization team, transforming our global financial network and improving the data products and services we provide to our internal customers. This team will leverage cutting-edge data engineering modernization techniques to develop scalable solutions for managing data and building data products. In this role, you are expected to:
Be involved from the inception of projects to understand requirements, then architect, develop, deploy, and maintain data solutions.
Work in a multi-disciplinary, agile squad, partnering with program and product managers to expand the product offering based on business demands.
Focus on speed to market, getting data products and services into the hands of our stakeholders; a passion for transforming the financial industry is key to success in this role.
Maintain a positive and collaborative working relationship with teams across the NCR Atleos technology organization, as well as with the wider business. Creative and inventive problem solving that reduces turnaround times is required, valued, and a major part of the job.
Location: Gurugram, Haryana
The ideal candidate would have:
BA/BS in Computer Science or equivalent practical experience.
Experience applying machine learning and AI techniques to modernize data and reporting use cases.
2+ years of overall experience on Data Analytics or Data Warehousing projects.
At least 2 years of experience as a Data Scientist on medium- to large-scale solutions, including tasks such as data migration.
At least 2 years of cloud experience on Azure.
Microsoft Azure, ADF, Synapse.
Programming in Python and PySpark, with experience using pandas, ML libraries, etc.
Data streaming with Flink or Spark Structured Streaming (see the sketch after this list).
Orchestration frameworks such as dbt, Airflow, and ADF.
Open-source data ingestion frameworks like Airbyte and Debezium.
Experience migrating from traditional on-prem OLTP/OLAP databases to cloud-native DBaaS and/or NoSQL databases like Cassandra, Neo4j, MongoDB, etc.
Deep expertise operating in a cloud environment and with cloud-native databases like Cosmos DB, Couchbase, etc.
Proficiency in various data modeling techniques, such as ER, Hierarchical, Relational, or NoSQL modeling.
Excellent design, development and tuning experience with SQL (OLTP and OLAP) and NoSQL databases.
Experience with modern database DevOps tools like Liquibase, Redgate Flyway, or DBmaestro.
Deep understanding of data security and compliance, and of the related architecture.
Deep understanding of, and strong administrative experience with, distributed data processing frameworks such as Hadoop and Spark.
Experience with programming languages like Python, Java, and Scala, and with machine learning libraries.
Experience with DevOps tools like Git, Maven, Jenkins, GitHub Actions, and Azure DevOps.
Experience with Agile development concepts and related tools.
Ability to tune and troubleshoot performance issues across the codebase and database queries.
Excellent problem-solving skills, with the ability to think critically and creatively to develop innovative data solutions.
Excellent written and verbal communication skills, with the ability to effectively convey complex technical concepts to a diverse audience.
Passion for learning and a proactive mindset, with the ability to work independently and collaboratively in a fast-paced, dynamic environment.
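For illustration, the sketch below shows the kind of PySpark Structured Streaming work named in the list above: a windowed aggregation over a stream of transaction records. It is a minimal, hypothetical example; the schema, input path, and window sizes are invented for the sketch and do not describe NCR Atleos's actual pipelines.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn-stream-sketch").getOrCreate()

# Read a stream of JSON transaction files from a hypothetical landing folder.
txns = (
    spark.readStream
    .schema("txn_id STRING, atm_id STRING, amount DOUBLE, ts TIMESTAMP")
    .json("/data/landing/txns")  # hypothetical path
)

# Tumbling one-minute windows of transaction volume per ATM, with a watermark
# so late-arriving events older than five minutes are dropped.
volume = (
    txns
    .withWatermark("ts", "5 minutes")
    .groupBy(F.window("ts", "1 minute"), "atm_id")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("txn_count"))
)

# Write incremental results to the console; a real pipeline would target a
# durable sink such as Delta, Kafka, or a cloud-native database.
query = volume.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```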
Additional Skills:
Leverage machine learning and AI techniques to operationalize data pipelines and build data products. Provide data services using APIs (a minimal sketch follows). Containerize data products and services using Kubernetes and/or Docker.
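As a hypothetical illustration of providing data services using APIs, the sketch below serves a pre-computed data product over HTTP with Flask. The dataset, file path, and endpoint name are invented for the example; a service like this is what would then be containerized with Docker and deployed to Kubernetes.

```python
from flask import Flask, jsonify
import pandas as pd

app = Flask(__name__)

# Hypothetical curated data product, loaded once at startup
# (pd.read_parquet requires pyarrow or fastparquet to be installed).
atm_summary = pd.read_parquet("/data/products/atm_summary.parquet")

@app.route("/atm-summary/<atm_id>")
def get_atm_summary(atm_id: str):
    # Filter the pre-computed summary for one ATM and return it as JSON.
    rows = atm_summary[atm_summary["atm_id"] == atm_id]
    if rows.empty:
        return jsonify({"error": "unknown atm_id"}), 404
    return jsonify({"data": rows.to_dict(orient="records")})

if __name__ == "__main__":
    # Development server only; a containerized deployment would front this
    # with a production WSGI server such as gunicorn.
    app.run(host="0.0.0.0", port=8080)
```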