Data Engineer - Informatica PowerCenter 10.x or BDM
The speed of change is throwing traditional business methods into question and disrupting the relevance of entire industries. Capgemini, a global leader in consulting, digital transformation, technology and engineering services, is at the forefront of innovation and well placed to address opportunities for our clients in the evolving world of cloud, digital and platforms. Building on its strong 50-year heritage and deep industry-specific expertise, Capgemini enables organisations to realise their business ambitions through an array of services from strategy to operations.
Capgemini is driven by the conviction that the business value of technology comes from and through people. Today, it is a multicultural company with over 270,000 team members in almost 50 countries. With Altran, the Group reported 2019 combined revenues of EUR 17 billion.
Learn more about us at www.capgemini.com.
Let's talk about the team: Our Insights and Data team helps our clients make better business decisions by transforming an ocean of data into streams of insight. Our clients are among Australia's top-performing companies, and they choose to partner with Capgemini for a very good reason - our exceptional people. Due to continued growth within Capgemini's Insights & Data practice, we intend to recruit a BDM Data Engineer with extensive experience in a consulting environment.
If you are already working in a consultancy role, or have excellent client-facing skills gained within large organisations, we would like to discuss our consultant opportunities with you.

Let's talk about the role and responsibilities: The responsibilities, skills and duties of the BDM Data Engineer include:
- Performing analysis, design and development of ETL processes to support project requirements;
- Developing Informatica mappings, SQL/stored procedures, data maps for PowerExchange, and Unix shell scripts;
- Developing Hive tables and queries;
- Developing Spark jobs (in Scala, Python or Java) to stream, publish or consume data from Hadoop;
- Performing unit testing and QA, and working with business partners to resolve any issues discovered during UAT;
- Peer-reviewing mappings and workflows when required;
- Maintaining development and test data environments by populating data based on project requirements;
- Working with production control and operations as needed to promote mappings/workflows, implement schedules and resolve issues;
- Reviewing ETL performance and conducting performance tuning as required on mappings, workflows or SQL;
- Maintaining all applicable documentation pertaining to specific SDLC phases.

Let's talk about your qualifications and experience:
- Strong ETL skills, with at least 5 years' experience in Informatica PowerCenter 10.x or BDM - mandatory (or very strong PowerCenter 10.x skills combined with other big data skills such as Hive, Sqoop, Spark and Hadoop);
- Excellent knowledge of the Informatica platform and the integration among different Informatica components and services.