Job Description
A Bachelor’s degree or higher is the minimum entry requirement for the position

  • Skill Set: AZURE
  • Total Experience: 5.00 to 7.00 Years
  • No of Openings: 1
  • Job Post Date: 26/04/2021
  • Job Expiry Date: 04/06/2021
  • Domain: IT
  • Location: CHENNAI [India]
  • Job Reference No:

Job Summary


MS Azure Cloud Data Architect

Apr 2021


Required skills:

• More than 12 years of overall IT industry experience, along with 2+ years of experience with cloud-based solutions.
• More than 5 years of experience with Big Data and Spark applications.
• Good understanding of Architectural Development Management practices and exposure to Enterprise Architecture frameworks (e.g. TOGAF)
• Well versed in different Cloud Architecture / Design / Integration Patterns, application migration methodologies, and implementation approaches
• Strong understanding of different multi-tenancy model implementation techniques using various Cloud Service Models (e.g. BPaaS, SaaS, PaaS, IaaS)
• Good knowledge of data lakes, staging areas, 3NF, data labs and data marts, including starflake and snowflake schemas
• Experience in migrating and integrating different types of data into a data lake within an on-premise or cloud ecosystem
• Experience with Big Data tools and technologies such as Spark, Kafka, Flume, Sqoop, Hive, HDFS, MapReduce, HBase, etc., and with the MS Azure services generally used to build data platforms, such as Azure AD, Data Catalog, Stream services, Data Factory, etc.
• Proficient in distributed computing principles and familiar with key architectures, including the Lambda and Kappa architectures, with broad experience across data stores (e.g., HDFS, Azure Data Lake Store, Azure Blob Storage, Azure SQL Data Warehouse, Apache HBase, Azure DocumentDB), messaging systems (e.g., Apache Kafka, Azure Event Hubs, Azure IoT Hub) and data processing engines (e.g., Apache Hadoop, Apache Spark, Azure Data Lake Analytics, Apache Storm, Azure HDInsight, Azure Databricks).


• The Azure Data Architect is responsible for helping to design, deploy, manage and support the systems and infrastructure required for a data processing pipeline in support of a product’s requirements.
• Primary responsibilities revolve around DevOps and include implementing ETL (extract, transform and load) pipelines and monitoring/maintaining data pipeline performance.

Project language: Business communication skills in English

Location: India

Other info:

• Starting Date: May 2021
