Revionics uses data from a variety of sources, maintains business-critical data models in BigQuery and SQL Server, and deploys sophisticated, customer-facing visualizations through Looker embedded in our pricing application. We are looking for the right person to embed best-practice standards and grow the value of our data estate.
Who you are
4+ years of experience in a data-related role, e.g., data engineering or data analytics.
Bachelor's or Master's degree in a corresponding domain, e.g., statistics, mathematics, computer science, software engineering, or IT.
Excellent interpersonal and communication skills - experience working with a variety of stakeholders; able to effectively interpret, translate, and articulate business requirements into data assets.
Outstanding SQL skills with OLTP databases (e.g., MySQL, SQL Server, or Oracle) and OLAP databases (preferably BigQuery).
Experience with programming languages, especially Python.
Experience with source control management software, e.g., GitLab or GitHub.
Experience with at least one established BI visualization tool - Looker preferred, but QlikView, Tableau, or Power BI also considered.
Experience in data integration / data engineering - building and maintaining data pipelines with established tools such as AWS Glue, Talend, Informatica, or Airflow.
Experience with the design and development of cloud-based infrastructure and applications - Google Cloud Platform knowledge preferred.
Some experience in data warehousing - at minimum, an understanding of core concepts, structures, and common challenges and solutions.
Exposure to CI/CD and Terraform a plus.
What you will do
Data modeling. Model BigQuery and SQL Server data into clean, tested, and reusable datasets for analytical and other downstream purposes, and define the rules and requirements for the formats and attributes of data.
Data transformation. Apply data transformations to get data into the shape required for analytical (Looker) and other downstream uses.
Data documentation. Ensure all managed data assets are clearly and accurately documented.
Data quality. Define data quality rules, standards, and metrics, and design processes that keep quality aligned with downstream uses.
Development best practices for analytics. Apply software engineering best practices, including:
Version control to trace the history of changes in datasets and roll back to older versions if something goes wrong;
Data unit testing to examine small chunks of data transformations for quality and correctness; and
Continuous integration and continuous delivery (CI/CD) to ensure up-to-date and reliable data.
Data visualization. Work with Looker developers to design, build, maintain, and debug Looker visualizations critical to our customers' pricing workflows.
Application configuration and integration. Ensure our Looker software is set up on cloud infrastructure with the requisite connections to data sources and our pricing application, working with cloud engineers to deploy new enhancements.
Close collaboration with the wider business. Work collaboratively with all stakeholders - data engineers, business analysts, and data scientists - to align business requirements with data assets.