iFlow - QlikSense Developer (6-10 yrs)
Role : Qlik Sense Developer
Location : Pune
Experience : 6+ years
Must-Have Skills :
- Excellent development experience in QlikSense, VizLib Library and VizLib Collaboration.
- Experience in building QlikSense Mashups.
- Expert in data modelling and mapping, e.g. snowflake and star schema models in Qlik.
- Expertise in data extraction, transformation and loading (ETL) using Qlik script editor.
- Expertise in Qlik Section Access, Reduce and Reload techniques for data level access restrictions.
- Expertise in performant loading techniques, e.g. resident load and preceding load.
- Expertise in set analysis and advanced aggregation functions.
- IT methodology/practices knowledge and solid experience in Agile/Scrum, DevOps and ITIL principles.
- Experience using collaboration tools such as JIRA and Confluence, including various board types.
- BS/MS degree in Computer/Data Science, Engineering or a related subject.
- Excellent communication and interpersonal skills; proficiency in spoken, written and listening English is crucial.
- Enthusiastic willingness to rapidly and independently learn and develop technical and soft skills as needs require.
- Strong organisational and multi-tasking skills.
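
The loading techniques named above can be sketched in Qlik load script. This is an illustrative example with hypothetical table, field and file names: a preceding load transforms rows as they stream in from the source, and a resident load re-aggregates a table already in memory without re-reading the source.

```
// Preceding load: the unnamed LOAD on top transforms the rows
// produced by the LOAD ... FROM statement beneath it.
Sales:
LOAD
    OrderID,
    Amount,
    Year(OrderDate) as OrderYear,
    If(Amount > 1000, 'Large', 'Standard') as OrderSize;
LOAD * FROM [lib://DataFiles/orders.qvd] (qvd);  // path is illustrative

// Resident load: aggregate from the in-memory table, avoiding a
// second round trip to the source.
YearlySales:
LOAD
    OrderYear,
    Sum(Amount) as TotalAmount
Resident Sales
Group By OrderYear;
```

A matching set-analysis expression in a chart might then be `Sum({<OrderYear={2023}>} Amount)`, restricting the aggregation to one year regardless of the current selection.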
Good to Have Skills :
- Qlik Compose and Qlik Replicate.
- Experience with NPrinting.
- Proven experience in developing, refactoring and optimizing SQL/T-SQL procedures in BigQuery or equivalent cloud databases.
- Good understanding of GCP core and data products, and of architecture designs/patterns.
- Experience in web development and building UI web apps - HTML, CSS, JavaScript, etc.
- Experience developing BI/MI reports and dashboards in other industry tools such as Tableau, PowerBI, Looker, etc.
- Data preparation, wrangling and refactoring skills, for example as part of Data Science pipelines.
- Expert in the preparation, usage, visualization and editing of data in web, dashboard or other user interfaces (3-tier architecture, CRUD procedures, etc.).
- Experience with GCP-based big data/ETL solutions in a DevOps/DataOps model.
- Experience with application monitoring & Production Support.
- Experience deploying and operating DataFusion/CDAP-based solutions.
- Experience building and operating CI/CD life-cycle management tools: Git, Jenkins, Groovy, Checkmarx, Nexus, Sonar IQ, etc.
- Expertise in Java, Python and DataFlow.
- Broad experience with IT development and collaboration tools.
- An understanding of IT Security and Application Development best practice.
- Understanding of and interest in various investment products and their life cycles, and the nature of the investment banking business.
- Experience of working with infrastructure teams to deliver the best architecture for applications.
- Experience working in a global team with different cultures.
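
Since both BigQuery and Qlik scripting appear in the lists above, a typical pattern is to push the SQL down to BigQuery and let a preceding Qlik LOAD rename and shape the result. A minimal sketch, assuming a connection named `GCP_BigQuery` has already been created with the BigQuery connector; the project, dataset and column names are hypothetical:

```
// Connection name is an assumption; created beforehand in the hub.
LIB CONNECT TO 'GCP_BigQuery';

Trades:
LOAD
    TradeID,
    Desk,
    NotionalUSD;
SQL SELECT
    trade_id     AS TradeID,
    desk         AS Desk,
    notional_usd AS NotionalUSD
FROM `my-project.analytics.trades`;  -- table path is illustrative
```

Pushing filters and aggregations into the `SQL SELECT` keeps the data volume crossing into Qlik small, which matters for reload times against a warehouse-scale source.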
Responsibilities :
- Build, test and deploy QlikSense dashboard applications.
- Meet dashboard design specifications through the use of VizLib Library and/or Mashups.
- Deliver dashboards with read-write capability through writeback tables and commentary, with VizLib Collaboration extensions.
- Use Qlik best practices to build fit for purpose apps that are performant for our stakeholders.
- Connect to data from our Google BigQuery data warehouse and govern the ingest and data relationships within Qlik.
- Develop and share reusable Qlik components.
- Review and refine, interpret and implement business and technical requirements.
- Contribute to ongoing productivity and prioritisation by refining User Stories, Epics and Backlogs in Jira.
- Estimate, commit and deliver requirements to scope, quality, and time expectations.
- Protect the solution with appropriate Authorization and Authentication models, data encryption and other security components.
- Manage code artefacts and CI/CD using tools like Git, Jenkins, Google Secrets Manager, etc.
- Deliver non-functional requirements, IT standards and developer and support tools to ensure our applications are secure, compliant, scalable, reliable and cost-effective.
- Onboard new data sources; design, build, test and deploy Cloud data ingest pipelines, warehouses and data models/products (GCP DataFusion, Spark, etc.).
- Ensure a consistent approach to logging, monitoring, error handling and automated recovery.
- Write automated unit and regression tests as part of a test-centric development approach.
- Write well-commented, maintainable and self-documenting code.
- Fix defects and provide enhancements during the development period, and hand over knowledge, expertise, code and support responsibilities to the support team.
- Protect the solution with relevant Data Governance, Security, Sovereignty, Masking and Lineage capabilities.
- Maintain a good-quality, up-to-date knowledge base, wiki and admin pages for the solution.
- Peer-review colleagues' changes.
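
The authorization requirement above is usually met in Qlik with Section Access, which both lists (data-level access restrictions) and the responsibilities (Authorization models) call out. A hedged sketch with hypothetical users and regions; real deployments would load the access table from a governed source rather than inline:

```
// Section Access: rows below control who can open the app and
// which REGION values their data is reduced to. Field names in
// this section must be upper case.
Section Access;
LOAD * INLINE [
    ACCESS, USERID,           REGION
    ADMIN,  DOMAIN\ADMINUSER, *
    USER,   DOMAIN\JSMITH,    EMEA
    USER,   DOMAIN\PPATEL,    APAC
];
Section Application;

// Data model table; REGION here links to the reduction field above.
Sales:
LOAD * INLINE [
    REGION, Amount
    EMEA,   100
    APAC,   250
    AMER,   300
];
```

With reduction enabled, JSMITH would see only EMEA rows after reload, while the `*` on the ADMIN row grants all values listed in the REGION column. The USERID format (`DOMAIN\user` vs. an email-style ID) depends on the deployment, so treat it as an assumption.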
What we provide :
- Opportunities to develop and grow as an engineer. We are at the forefront of our industry, always expanding into new areas, and working with open-source and new technologies.
- A set of hardworking and dedicated peers.
- Growth and mentorship. We believe in growing engineers through ownership and leadership opportunities. We also believe mentors help both sides of the equation.
Functional Areas: Other