ETL Developer - Big Data/Hadoop (3-13 yrs)
Dotflick Solutions
posted 10hr ago
Job Designation : ETL Developer
WFH / WFO
Job Description :
- Work Experience : Overall 3+ years of relevant experience working on Databases, Data Warehousing, Data Integration and BI/Reporting solutions.
Role & Responsibilities :
- Act as a Data expert/Architect in implementing solutions using Database/ETL/DWH/Data Integration/Reporting tools, with 3+ years of relevant experience in Databases, Data Warehousing, Data Integration and BI/Reporting solutions.
- Technology Leadership - Should be able to independently architect, design, implement and deliver complex Database/Data Warehouse/Data Lake, Data Integration and BI/Reporting solutions.
- Work with stakeholders to understand business objectives and develop value-added, data-driven solutions, frameworks and reporting solutions.
- Architecture/Technical Design and Development - Expertise in any 2 ETL tools (Informatica, Talend, Matillion, DataStage) and cloud-based technologies such as AWS S3 is mandatory.
- Expert knowledge of SQL, with the ability to performance-tune complex SQL queries in traditional and distributed RDBMS systems, is a must.
- Hands-on experience across Python, PySpark and Unix/Linux shell scripting.
- Logical Thinking - Able to think analytically, use a systematic and logical approach to analyse data, problems, and situations.
- Communication - Able to convey ideas and information clearly and accurately, whether in writing or verbally.
- Task/Team Management/Collaboration - Should be able to manage technical team and delegate tasks to accomplish milestones as per plan. Should be comfortable in discussing and prioritizing work items with team members in an onshore-offshore model.
Technical :
- Design and implement effective database solutions and data models to store and retrieve data.
- Hands-on experience in the design of reporting schemas and data marts, and in the development of reporting solutions.
- Prepare scalable database designs and architecture in terms of defining multi-tenant schemas, data ingestion, data transformation and data aggregation models.
- Should have expertise and working experience in at least 2 ETL tools among Informatica, SSIS, Talend & Matillion.
- Should have expertise and working experience in at least 2 DBMS/appliances among Redshift, SQL Server, PostgreSQL, Oracle.
- Should have strong Data Warehousing, Reporting and Data Integration fundamentals.
- Advanced expertise with SQL
- Experience with AWS/Azure cloud data stores and their DB/DW-related service offerings.
- Should have knowledge and experience of Big Data technologies (Hadoop ecosystem) and NoSQL databases.
- Should have technical expertise and working experience in at least 2 reporting tools among Power BI, Tableau, Jaspersoft and QlikView/Qlik Sense.
Functional Areas: Software/Testing/Networking