Refine existing data engineering tools and best practices and introduce new ones
Contribute significantly to the product development strategy
Evaluate the performance of data pipeline systems against business objectives
Write readable code, document existing code, and refactor previously written code for readability
Job Requirements:
Bachelor's or Master's degree in Engineering or Computer Science (or equivalent experience)
At least 5 years of relevant data engineering experience
Expertise in Python, Go, HTTP, RESTful or gRPC service design, HTML, and JavaScript/TypeScript/Angular
Ability to function well in a team environment
Expertise with techniques and tools for extracting and processing web data (e.g., Puppeteer, Selenium, UiPath, SQL, scraping APIs, proxy services, or other leading RPA solutions)
Experience with version control, open-source practices, and code review
Experience building applications to present content for internal review, approval, and curation
Familiarity with data pipeline engineering, database design, web platform development, API design, or distributed systems
Ability to work as part of an agile, collaborative team in a fast-paced environment
Experience collaborating with Product and Engineering teams to build a high-level roadmap
Ability to set a clear vision for a team delivering against a collaboratively defined roadmap
Experience with OCR and machine learning (e.g., TensorFlow, SageMaker, or similar services) is a plus