DataMetica Big Data Lead Job
Big Data Lead
DataMetica
posted 26d ago
Flexible timing
We at Onix Datametica Solutions Private Limited are looking for a Big Data Lead who has a passion for the cloud and knowledge of on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like. Ideal candidates should have technical experience with migrations and the ability to help customers get value from Datametica's tools and accelerators.
Check out more about us on our website below! https://www.onixnet.com/
Job Description
Experience: 6 to 10 years
Location: Pune
6+ years of overall experience in developing, testing, and implementing Big Data projects using Hadoop, Spark, and Hive.
Hands-on experience playing a lead role in Big Data projects: responsible for implementing one or more tracks within a project, identifying and assigning tasks within the team, and providing technical guidance to team members.
Experience in setting up Hadoop services, implementing ETL/ELT pipelines, and ingesting and processing terabytes of data from varied systems.
Experience working in an onshore/offshore model, leading technical discussions with customers, mentoring and guiding teams on technology, and preparing high-level and low-level design (HDD and LDD) documents.
Required Skills and Abilities:
Mandatory skills: Spark, Scala/PySpark, the Hadoop ecosystem including Hive, Sqoop, Impala, Oozie, Hue; Java, Python, SQL, Flume, and Bash (shell scripting).
Understanding of data governance concepts and experience implementing metadata capture, lineage capture, and business glossaries.
Experience implementing CI/CD pipelines and working with SCM tools such as Git, Bitbucket, etc.
Ability to assign and manage tasks for team members, provide technical guidance, and work with architects on HDDs, LDDs, and POCs.
Hands-on experience writing data ingestion and data processing pipelines using Spark and SQL, and experience implementing SCD Type 1 & 2, auditing, and exception-handling mechanisms (see the illustrative sketch after this list).
Experience implementing data warehousing projects with a Java- or Scala-based Hadoop programming background.
Proficient with development methodologies such as Waterfall and Agile/Scrum.
Exceptional communication, organization, and time management skills.
Collaborative approach to decision-making and strong analytical skills.
Good to have: certifications in any of GCP, AWS, Azure, or Cloudera.
Ability to work on multiple projects simultaneously, prioritizing appropriately.
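For context on the SCD requirement above, here is a minimal sketch of an SCD Type 2 update in PySpark. The dimension layout (customer_id, city, start_date/end_date, is_current flag) and all table and column names are assumptions for illustration only; they are not specified in this posting, and a production pipeline would typically apply the same logic as a MERGE against the warehouse table.

# Minimal SCD Type 2 sketch in PySpark (illustrative; names are assumed, not from the posting).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

# Existing dimension: current rows carry is_current = true and an open end_date.
dim = spark.createDataFrame(
    [(1, "Alice", "Pune",   "2023-01-01", None, True),
     (2, "Bob",   "Mumbai", "2023-01-01", None, True)],
    "customer_id INT, name STRING, city STRING, start_date STRING, end_date STRING, is_current BOOLEAN",
)

# Incoming batch: Bob moved to a new city, Carol is a brand-new key.
incoming = spark.createDataFrame(
    [(2, "Bob", "Delhi"), (3, "Carol", "Chennai")],
    ["customer_id", "name", "city"],
)

current = dim.filter("is_current")

# Keys whose tracked attribute changed: close the old version (Type 2 expiry).
changed = (current.alias("d")
           .join(incoming.alias("i"), "customer_id")
           .filter(F.col("d.city") != F.col("i.city")))

closed = (changed.select("d.*")
          .withColumn("end_date", F.date_format(F.current_date(), "yyyy-MM-dd"))
          .withColumn("is_current", F.lit(False)))

# New current versions: changed keys plus keys not seen in the dimension before.
new_rows = (incoming
            .join(changed.select("customer_id"), "customer_id", "left_semi")
            .unionByName(
                incoming.join(current.select("customer_id"), "customer_id", "left_anti"))
            .select("customer_id", "name", "city",
                    F.date_format(F.current_date(), "yyyy-MM-dd").alias("start_date"),
                    F.lit(None).cast("string").alias("end_date"),
                    F.lit(True).alias("is_current")))

# Untouched current rows and already-expired history pass through unchanged.
unchanged = current.join(changed.select("customer_id"), "customer_id", "left_anti")
history = dim.filter(~F.col("is_current"))

result = history.unionByName(unchanged).unionByName(closed).unionByName(new_rows)
result.orderBy("customer_id", "start_date").show()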
Employment Type: Full Time, Permanent