Data Engineer Hadoop
Quest Global
posted 26d ago
Flexible timing
Job Requirements
Responsibilities
• Work on complex enterprise-wide initiatives spanning multiple services/programs and drive resolution
• Work with business/product owners to architect and deliver on new services to introduce new products and bundles
• Drive the architectural design, including dependent services, service interactions, and policies
• Take ownership to improve the customer experience of an entire set of services/applications sold as products or bundled services
• Contribute to and lead Guild initiatives by engaging and mentoring Engineers at all levels to improve the craftsmanship of Software Engineering
• Simplify and improve the cost/benefit of a function/service/architecture strategy
• Apply judgment and experience to balance trade-offs between competing interests
• Venture beyond comfort zone to take on assignments across different areas of Software Engineering
• Take on organization-wide and public speaking engagements and publish white papers and blogs on relevant and emerging technical topics
• Consult across teams and across organization lines to identify synergies and reuse opportunities
• Participate and contribute in Principal architecture review meetings and drive resolutions to enterprise-wide challenges and regulatory needs
• Write recommendations for job promotions based on an unbiased view of one's accomplishments
• Conduct technical interviews for hiring engineering staff and raising the performance bar
• Identify and raise awareness of siloed behaviors within the organization and teams
Work Experience
• Experience/expertise in the Hadoop ecosystem: Spark, Python, Hive, Impala.
• Experience/expertise in scripting (Unix shell, Python) and OOP languages (Scala preferable).
• Experience in performance tuning of data processing in Hadoop, especially with large datasets (in the TB range).
• Understanding of Exadata/Oracle/SQL server databases.
• Exposure to NiFi/Kafka is good to have.
• Strong SQL programming skills.
• Reporting development in a Data Warehouse environment is an added advantage.
• Understanding of Business Intelligence solutions for customers.
• Exposure to and understanding of the complete SDLC.
• Experience with Agile methodology.
• Should be technically strong.
• Should have strong communication and coordination skills.
Employment Type: Full Time, Permanent