Develop, test, implement, evaluate, and document transformation logic for data integration patterns that bring both structured and unstructured data into the big data ecosystem
Support and maintain previously implemented big data projects, as well as provide guidance and consultation on other projects in active development as needed
Document and communicate technical complexities completely and clearly to team members and other key stakeholders
Contribute to data governance decisions related to data model design, data access patterns, and data services
Document logical data integration (ETL) strategies for data flows between disparate source/target systems, moving structured and unstructured data into a common data lake and the enterprise information repositories
Think critically to deliver innovative solutions that meet tactical and strategic business needs
Requirements:
BS Degree in Computer Science or Engineering, or related field
3+ years of experience in data warehousing
2+ years of experience with the Snowflake Data Platform
3+ years of experience with ETL. In computing, extract, transform, load (ETL) refers to a process in database usage, and especially in data warehousing, that involves: extracting data from outside sources; transforming it to fit operational needs (which can include quality levels); and loading it into the end target (a database, more specifically an operational data store, data mart, or data warehouse). A data warehouse (DWH) is a database used for reporting and data analysis.
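The three ETL steps described above can be sketched in a few lines of Python. This is a minimal illustration only; the source rows, the quality rule, and the target table name are hypothetical, not part of the role's actual systems:

```python
import sqlite3

# Extract: rows as they might arrive from an outside source (hypothetical data).
source_rows = [
    {"id": 1, "name": " Alice ", "amount": "100.50"},
    {"id": 2, "name": "Bob", "amount": "not-a-number"},  # will fail the quality check
    {"id": 3, "name": "Carol", "amount": "75.00"},
]

# Transform: trim whitespace and enforce a simple quality rule (amount must be numeric).
def transform(row):
    try:
        return {"id": row["id"], "name": row["name"].strip(), "amount": float(row["amount"])}
    except ValueError:
        return None  # reject rows that fail the quality check

clean_rows = [r for r in (transform(row) for row in source_rows) if r is not None]

# Load: write the clean rows into the end target (here, an in-memory database
# standing in for an operational data store, data mart, or data warehouse).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, name TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (:id, :name, :amount)", clean_rows)
loaded = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
print(loaded)  # 2 of the 3 source rows pass the quality check
```

In practice each step is handled by an ETL tool such as Matillion rather than hand-written code, but the extract, transform, and load stages remain the same.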
Experience with Matillion or another cloud ETL tool is preferred
Functional knowledge of CRM application processes is a plus
Excellent communication skills
Fluent English is a must
Strong experience in SQL is required
Experience in integrating complex, corporate-wide processes and data models
Advanced analytical thinking and problem-solving skills