Job Description

5.00 to 15.00 Years
MELBOURNE [Australia]
Must have experience:
- Experience: 8+ years IT experience, with at least 4+ years in an ETL/Data Migration focused title/role
- Domains: Data migration/integration or ETL activity/outcomes
- Design: Create source/target mapping low-level designs rather than having them given to you
- ETL tools: Use of Informatica ETL tools, e.g. PowerCenter or Big Data Management (BDM)
- Languages: Unix (shell) scripting/commands, SQL
- Platform: Midrange Unix
- Databases: Oracle
- File formats: Text, CSV, Parquet, JSON, Mainframe (format types)
- Scheduling: Understand concepts and use of scheduling tools for ETL jobs, e.g. Control-M, Jenkins
- Source (version) control tool: Git
- SDLC: Experience in the full lifecycle of a program: design, development, testing, production migration & support
- Industry: Financial Services/Banking

Nice to have:
- Platform: Mainframe
- Databases: PostgreSQL, NoSQL
- Languages: Python, Java
- Big data: Understanding of concepts and components, e.g. Hive (schema), HDFS, Kafka, Beam
- Automation: Designing/creating scripts (copy/migration/ETL) for repetitive activity, using Informatica and/or Unix scripting
- Performance: Improve/tune Informatica mappings, SQL queries, batch feeds
- Real time: Experience processing real-time data feeds such as Kafka
- Activity reporting/repository: Jira, Confluence
- Microservices: Design & build jobs for AWS
- Agile: Previous exposure to Agile/Sprint/Scrum environments and delivery practices

Soft skills:
- Collaborate with application teams on subjects relating to ingestion, firewalls, networks, file orchestration, metadata, domains, scripts, and automated checking gates to ensure data is consistent with the design
- Able to collaborate in an offshore/onshore organisation/environment

Note: