We are looking for a skilled Senior Technical Consultant with 3-5 years of experience and expertise in SQL, SSIS, Python, and PySpark to join our team. The ideal candidate will be proficient in building scalable interfaces, performance tuning, and data cleansing and validation strategies, leveraging the defined tech stack for data processing and data movement.
What you'll do:
Advanced Execution & Data Management: Oversee and manage intricate project tasks, providing insight and direction on advanced data ingestion, transformation, validation, and publishing
Review and analyse the data provided by the customer along with its technical/functional intent and interdependencies
Engage proactively with functional teams, ensuring a thorough understanding of end-to-end data flows as they relate to the technical integration
Build data ingress and egress pipelines, handle large volumes of data, and develop data transformation functions using tools and languages such as SSIS, Python, PySpark, and SQL
Integrate various data sources such as Teradata, SAP ERP, SQL Server, Oracle, Sybase, ODBC connectors, and flat files via API or batch
Production Deployment and Hypercare: Assist with production deployment tasks; assist with issue triage, testing, and root-cause identification; respond to and resolve batch automation disruptions in a timely manner to meet customer SLAs with accurate, on-time results
Technical Leadership & Coding Oversight: Guide and review the code developed by junior consultants, ensuring alignment with best practices
Incorporate o9 ways of working and embed industry standards for smoother project execution
What you should have:
3+ years of experience in data architecture, data engineering, or a related field, with a strong focus on data modelling, ETL processes, and cloud-based data platforms
Hands-on experience with SSIS packages, Python, PySpark, and SQL, along with workflow management tools such as Airflow and SSIS
Experience working with Parquet, JSON, RESTful APIs, HDFS, and Delta Lake, as well as query frameworks such as Hive and Presto
Advanced working knowledge of SQL, experience with relational databases and query authoring, and familiarity with a variety of database systems
Working experience with version control platforms, e.g. GitHub, Azure DevOps
Familiarity with Agile methodology
A proactive mindset and the right attitude to learn with agility
Excellent verbal and written communication skills
Good to have:
Hands-on Experience with Delta Lake
Experience with supply chain planning applications
Experience with Amazon Web Services (AWS), Azure, or Google Cloud infrastructure
What we'll do for you:
Competitive salary with stock options for eligible candidates
Flat organization: with a very strong entrepreneurial culture (and no corporate politics)
Great people and unlimited fun at work
Possibility to make a difference in a scale-up environment
Opportunity to travel onsite in specific phases depending on project requirements
Support network: Work with a team you can learn from every day
Diversity: We pride ourselves on our international working environment
Feel part of A team: https://youtu.be/QbjtgaCyhes?feature=shared
How the process works:
Apply by clicking the button below
You'll be contacted by our recruiter, Reddy Babu, who'll fill you in on all things o9, give you some background about the role, and get to know you. They'll contact you via video call or phone call, whichever you prefer
During the interview phase, you will meet with technical panels for 60 minutes. The recruiter will contact you after each interview to let you know whether we'd like to progress your application
There will be two rounds of technical interviews, followed by a hiring manager interview
Our recruiter will let you know if you're a successful candidate. Good luck!