Varad Adya

Recruitment Consultant 
YASH IT AND HR Services
Bengaluru / Bangalore

Dear Candidate,

Greetings for the day. One of our clients, a Big 4 firm, is looking for candidates for the below-mentioned positions in Bangalore. Interviews are scheduled for 6th April in the Bangalore office. (Please ignore if already applied.)

About the Company – Deloitte India

Deloitte is one of the "Big Four" professional services firms, along with PwC, EY, and KPMG. Its global headquarters are located in the United States. Deloitte is the largest professional services network in the world by revenue and by number of professionals. Deloitte provides Audit, Tax, Consulting, Enterprise Risk, and Financial Advisory services.

Deloitte is the brand under which tens of thousands of dedicated professionals in independent firms throughout the world collaborate to provide Assurance, Consulting, Financial Advisory, Risk Management, and certain Tax services to selected clients. These firms are member firms of Deloitte Touche Tohmatsu Limited (DTTL), a UK private company limited by guarantee. Each member firm provides services in a particular geographic area and is subject to the laws and professional regulations of the country or countries in which it operates.

With approximately 240,000 professionals providing fully integrated services in over 150 countries, Deloitte member firms serve more than 80 per cent of the world's largest companies as well as large national enterprises, public institutions, and successful fast-growing companies. Deloitte offers a range of services delivering knowledge and expertise that help clients solve complex issues and make better strategic decisions. Its solutions are based on experience in specific industries and business environments.

1. Azure Data Lake Architect
· 10–12 years of experience, with 4–5 large Big Data / Data Lake implementations for large clients
· In-depth understanding of Hadoop architecture and how to architect the data ingestion, data management, and data delivery layers for insights generation
· Implementation experience on Azure
· Hands-on expertise in building Hadoop-based services such as HBase, Spark, Pig, and Hive
· Solid hands-on experience in designing relational and NoSQL database solutions
· Able to articulate solutions for business problems to various stakeholders using Azure and Azure Big Data services such as HDInsight and Databricks
· Understanding of, or exposure to, other cloud platforms such as AWS and GCP, or Big Data platforms such as Cloudera and Hortonworks
· Design specifications; develop data strategy and high-level data architecture documentation
· Develop big data solutions by designing the proposed system; define database physical structure and functional capabilities, security, backup, and recovery
· Strong working knowledge of Azure Data Factory, Azure Data Lake Store, Azure Stream Analytics, Blob Storage, SQL DB, and SQL DW
· Ability to understand business needs and translate them into technical specifications
· Experience with microservices and REST APIs
· Ensure all solutions contain proper architecture and technical components
· Train and coach a high-performance engineering and architecture team for Data Lake implementations and integrations
· Certification – AZ-300: Microsoft Azure Architect Technologies or AZ-301: Microsoft Azure Architect Design (nice to have)

2. AWS Data Lake Architect
· 10–12 years of experience, with 4–5 large Big Data / Data Lake implementations for large clients
· In-depth understanding of Hadoop architecture and how to architect the data ingestion, data management, and data delivery layers for insights generation
· Implementation experience on AWS
· Hands-on expertise in building Hadoop-based services such as HBase, Spark, Pig, and Hive
· Solid hands-on experience in designing relational and NoSQL database solutions
· Able to articulate solutions for business problems to various stakeholders using AWS and AWS Big Data services
· Understanding of, or exposure to, other cloud platforms such as Azure and GCP, or Big Data platforms such as Cloudera and Hortonworks
· Design specifications; develop data strategy and high-level data architecture documentation
· Develop big data solutions by designing the proposed system; define database physical structure and functional capabilities, security, backup, and recovery
· Strong working knowledge of Redshift, Athena, Kinesis, DynamoDB, and S3
· Ability to understand business needs and translate them into technical specifications
· Experience with microservices and REST APIs
· Ensure all solutions contain proper architecture and technical components
· Train and coach a high-performance engineering and architecture team for Data Lake implementations and integrations

3. GCP Data and ML Architect
· 10–12 years of experience, with 4–5 large Big Data / Data Lake implementations for large clients
· In-depth understanding of Hadoop architecture and how to architect the data ingestion, data management, and data delivery layers for insights generation
· Implementation experience on GCP
· Hands-on expertise in building Hadoop-based services such as HBase, Spark, Pig, and Hive
· Solid hands-on experience in designing relational and NoSQL database solutions
· Able to articulate solutions for business problems using GCP services such as Bigtable, BigQuery, Cloud SQL, etc.
· Understanding of, or exposure to, other cloud platforms such as AWS and Azure, or Big Data platforms such as Cloudera and Hortonworks
· Design specifications; develop data strategy and high-level data architecture documentation
· Develop big data solutions by designing the proposed system; define database physical structure and functional capabilities, security, backup, and recovery
· Strong working knowledge of the Google AutoML framework and Big Data modelling techniques: Data Vault 2.0, SQL DB and SQL DW, etc.
· Ability to understand business needs and translate them into technical specifications
· Strong understanding of Java/Python programming languages; knowledge of microservices and REST APIs
· Experience working with recommendation engines, data pipelines, or distributed machine learning, and experience with data analytics and data visualization techniques and software
· Ensure all solutions contain proper architecture and technical components
· Train and coach a high-performance engineering and architecture team for Data Lake implementations and integrations
· Excellent written and oral communication skills
· Certification – Google Cloud Certified Professional Cloud Architect

If this position interests you, please share your profile with the below-mentioned details.

Questionnaire:
Interested to apply for Deloitte India (Y/N):
Current location:
Interested to work from Bangalore (Y/N):
Role applied for (Azure Data Lake Architect / AWS Data Lake Architect / GCP Data Lake & ML Architect):
Available for interviews on 6th April in Bangalore (Y/N):
Have you been interviewed with Deloitte in the last six months? (Y/N):
Current company name:
Total experience:
Experience in Big Data (HBase, Spark, Pig, and Hive):
Experience in Data Lake implementation:
Experience in Machine Learning:
Experience in Azure/AWS/GCP:
Current CTC:
Expected CTC:
Notice period:

Please send us your CV if you are interested. We will call you back to discuss more about this position.

Regards,
Varad Adya
YASH IT & HR Services, Bangalore | [email protected] | www.yashonline.co.in

