- Skill Set: BIG DATA ANALYTICS
- Total Experience: 5.00 to 15.00 Years
- No of Openings: 1
- Job Post Date: 28/04/2021
- Job Expiry Date: 03/06/2021
- Domain: IT
- Location: SYDNEY [Australia]
- Worked in the Telecommunication industry, with a minimum of 10 years of experience
- Minimum 3-4 years in solution design of large-scale data warehouses
- Knowledge of the Telecom domain and data modelling
- Worked on Teradata and performed warehouse development
- Ability to build a Delta Lake architecture on a Big Data platform
- Strong experience in designing and developing Data Lakes using the Hadoop technology stack
- Experience with the Hadoop ecosystem (HDFS, MapReduce, YARN, Pig, Hive, HBase, Kafka, Impala, Flume, Sqoop, ZooKeeper, Oozie, Spark, Python, Scala, Cassandra, Kudu, Informatica BDM 10.x and shell scripting) and with Google Cloud Platform / Google Analytics
- Worked on Spark with Python and Scala
- Worked on NoSQL databases such as HBase, or query engines such as Presto
- Worked on Snowflake, GCP, AWS and Azure
- Worked on Tableau for reporting use cases
- Experience with the Cloudera distribution
- Requirement analysis, design and implementation of Hadoop / Big Data applications
- Attending client meetings and coordinating with onsite and offshore development teams
- Building and transforming data sources and developing consumption views for Tableau reporting in Teradata / BDP
- Developing and documenting Hadoop / Big Data applications
- Spark-based batch data ingestion
- Transformations using Impala or Spark
- Building data pipelines using Spark Scala/Python
- Performing analytics using R/Python in Big Data and cloud environments (AWS, Snowflake, GCP and Azure)
- Hadoop cluster management using Cloudera Manager
- Building dashboards using Tableau / Power BI
- Developing workflows using Informatica BDM 10.2.2
- Testing and support of OSS services / components
Project Management Office - ANZ CME
ITC 7 | TMEC Bangalore
Desk: 080 6780 4303
Mobile: +91 8688653032