Azure Cloud Architect Resume
Las Vegas
PROFESSIONAL SUMMARY:
- 12+ years of IT experience across the Software Development Life Cycle (SDLC), with a focus on Azure and data engineering.
- 5 years of cloud experience on Confidential Azure using Azure Data Factory, Azure Data Lake Analytics, Azure Databricks, Git, Azure DevOps, and Azure SQL Data Warehouse.
- Proven architecture experience across IT, including client-server technologies (J2EE, .NET), cloud technologies (Azure, AWS), security architecture (SAML 2.0, Okta), security user provisioning (CA solutions), MuleSoft, and WebSphere Message Broker (WMB/WBIMB) interfaces and MQ Series.
- Strong knowledge of security features such as AAA (authentication, authorization, auditing), encryption/decryption, digital signatures, Secure Sockets Layer (SSL) profiles, single sign-on, HTML forms, and OAuth.
- Expertise in one-way SSL, two-way SSL, and mutual authentication.
- Business knowledge across domains including retail, transportation, energy & utilities, pharmaceuticals, health insurance, and sales.
- Hands-on experience working with flat files, XML, COBOL files, and databases including Oracle, SQL Server, DB2, and Teradata.
- Exposure to Hadoop and Big Data.
- Extensively worked on scheduling and supporting Informatica PowerCenter.
- Extensive experience in error handling and problem fixing in Informatica.
- Performed unit testing to validate that data is mapped correctly, providing a qualitative check that data flows through correctly and is deposited in the target tables.
- Strong analytical and conceptual skills in database design and development using Oracle.
- Expertise in Database Programming (SQL, PL/SQL) using Oracle.
- Core competencies in Python, Spark, U-SQL, HQL, COBOL, JCL, VSAM, SQL, DB2.
- Extensive knowledge of IBM tools and utilities including File-Aid, IDCAMS, QMF, SCLM, Xpediter, Endevor, TSO, Control-M, SYNCSORT, SPUFI, Panvalet, and File Master.
- Extensively worked in Requirement Analysis, Design, Development, Testing and Implementation.
- Worked extensively in the onsite-offshore delivery model, coordinating with clients/users and understanding client requirements.
- Experience analyzing existing mainframe legacy systems and understanding their business functionality.
- Expertise in analyzing functional and technical specifications based on the requirements.
- An innovative and effective team player with good initiative.
- Excellent Communication skills, organizational skills, analytical skills and strong interpersonal skills.
TECHNICAL SKILLS:
Domain Expertise: Transportation, Energy & Utilities, Pharmacy, Health Insurance, Sales
Operating System: OS/390, MVS/ESA, WINDOWS 98/NT/XP/10, UNIX
Cloud Infrastructure: Azure Cloud, Azure Data Factory, Azure Data Lake Analytics, Azure Databricks, GIT, Azure DevOps, Azure SQL Data Warehouse
Languages: Python, Spark, HQL, U-SQL, SQL, PL/SQL, COBOL, JCL, XML, HTML
.NET Technologies: ASP.NET 3.5/4.0/4.5/4.6/4.7, ADO.NET, AJAX, LINQ, Entity Framework
Reporting Tools: SSRS, Power BI, MicroStrategy
Databases: Oracle, SQL Server, DB2, Teradata, MS Access
Scheduling Tools: Control-D, Maestro
File System: VSAM
PROFESSIONAL EXPERIENCE:
Confidential
Azure Cloud Architect
Responsibilities:
- Participate in and contribute to the Cloud Reference Architecture for the T-Mobile Cloud Governance Committee.
- Provide expertise and leadership regarding Cloud Architecture for both infrastructure and applications in Confidential Azure
- Demonstrate thought leadership in cloud computing across multiple channels including DevOps Engineering teams and become a trusted advisor to our Product Team.
- Perform Azure development and design work that may include logical design, I/O design, cloud architecture analysis and design, and systems engineering.
- Develop technical roadmaps for future Azure cloud implementations and migrations.
- Develop business case analyses for potential projects.
- Ensure security is integrated into all cloud architecture solutions.
- Perform advanced systems modeling, simulation, and analysis for TMO customer onboarding.
- Act as a subject-matter expert on Confidential Azure for other sales and solutions teams.
- Apply experience in all aspects of cloud computing (e.g., infrastructure, storage, platforms, data, and the network layer).
- Troubleshoot complex application deployment scenarios.
- Data analytics and engineering across multiple Azure platforms, such as Azure SQL, Azure SQL Data Warehouse, Azure Data Factory, and Azure Storage accounts, for source stream extraction, cleansing, consumption, and publishing across multiple user bases.
- Created Azure Data Factory pipelines to load flat file and ORC data into Azure SQL.
- Cloud-based report generation, development, and implementation using SCOPE constructs and Power BI; expert in U-SQL constructs for working with multiple source streams within Azure Data Lake.
- Performed data analysis and data quality checks and prepared the data quality assessment report.
- Designed source-to-target mapping sheets for data loads and transformations.
- Developed pipelines to transform data using activities such as U-SQL scripts on Azure Data Lake Analytics.
- Transformed data using the Hadoop Streaming activity in Azure Data Factory.
- Developed pipelines to load data from on-premises sources to the Azure cloud database.
- Loaded JSON input files into Azure SQL Data Warehouse.
- Developed pipelines in Azure Data Factory using Copy, Notebook, Hive, and U-SQL activities to load data.
- Developed pipelines in Azure Data Factory that call Databricks notebooks to transform data for reporting and analytics (see the sketch below).
- Developed Power BI reports on top of views in Azure SQL.
- Scheduled the Azure Data Factory pipelines.
Environment: Azure Data Factory, Azure SQL Server, Azure Data Lake Analytics, Azure Databricks, Python, Spark, HQL, U-SQL, SQL, HDInsight, Azure SQL Data Warehouse, Git, Azure DevOps, Power BI, JSON.
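For illustration, a minimal PySpark sketch of the kind of notebook transform these Data Factory pipelines invoked; the lake paths, table names, column names, and JDBC settings are placeholders, not details from the engagement:

```python
# Minimal PySpark sketch of a notebook-style transform called from an Azure Data
# Factory pipeline. All paths, table names, and credentials below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orc-to-azure-sql").getOrCreate()

# Read raw ORC extracts landed in the data lake by the ingestion pipeline.
raw = spark.read.orc("abfss://raw@examplelake.dfs.core.windows.net/sales/orc/")

# Light cleansing before publishing: trim keys, standardize dates, drop duplicates.
cleaned = (
    raw.withColumn("customer_id", F.trim(F.col("customer_id")))
       .withColumn("sale_date", F.to_date(F.col("sale_date"), "yyyy-MM-dd"))
       .dropDuplicates(["order_id"])
)

# Publish to an Azure SQL table that the Power BI views read from.
jdbc_url = "jdbc:sqlserver://example-server.database.windows.net:1433;database=reporting"
(cleaned.write
        .format("jdbc")
        .option("url", jdbc_url)
        .option("dbtable", "dbo.sales_curated")
        .option("user", "etl_user")        # placeholder credentials
        .option("password", "<secret>")    # normally pulled from Key Vault / secret scope
        .mode("append")
        .save())
```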
Confidential
Sr Azure Data Engineer / Lead
Responsibilities:
- Data analytics and engineering across multiple Azure platforms, such as Azure SQL, Azure SQL Data Warehouse, Azure Data Factory, and Azure Storage accounts, for source stream extraction, cleansing, consumption, and publishing across multiple user bases.
- Created Azure Data Factory pipelines to load flat file data into Azure SQL.
- Cloud-based report generation, development, and implementation using SCOPE constructs and Power BI; expert in U-SQL constructs for working with multiple source streams within Azure Data Lake.
- Performed data analysis and data quality checks and prepared the data quality assessment report.
- Designed source-to-target mapping sheets for data loads and transformations.
- Developed pipelines to transform data using activities such as U-SQL scripts on Azure Data Lake Analytics.
- Transformed data using the Hadoop Streaming activity in Azure Data Factory.
- Developed Informatica mappings to load data from on-premises sources to the Azure cloud database.
- Loaded JSON input files into Azure SQL Data Warehouse.
- Developed pipelines in Azure Data Factory using the Copy activity to load data.
- Designed and developed stored procedures in SQL Server.
- Developed pipelines in Azure Data Factory that call stored procedures to transform data for reporting and analytics (see the sketch below).
- Developed Power BI reports on top of views in Azure SQL.
- Scheduled the Azure Data Factory pipelines.
- Performed daily health checks and monitored batch jobs.
Environment: Azure Data Factory, Azure SQL Server, Azure Data Lake Analytics, Azure Databricks, Python, Spark, HQL, HDInsight, Azure SQL Data Warehouse, Git, Azure DevOps, Power BI, JSON, Informatica PowerCenter.
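For illustration, a minimal sketch of the staging-plus-stored-procedure load pattern described above, written with pyodbc; the connection string, file, table, and procedure names are placeholders:

```python
# Minimal pyodbc sketch of the load pattern above: stage JSON records into Azure SQL,
# then call a stored procedure that transforms them for reporting. The connection
# string, file name, table, and procedure names are placeholders.
import json
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server.database.windows.net;DATABASE=reporting;"
    "UID=etl_user;PWD=<secret>"
)
cursor = conn.cursor()

# Stage the raw JSON input file.
with open("claims_extract.json") as f:
    records = json.load(f)

cursor.fast_executemany = True
cursor.executemany(
    "INSERT INTO stage.claims (claim_id, member_id, amount) VALUES (?, ?, ?)",
    [(r["claim_id"], r["member_id"], r["amount"]) for r in records],
)

# Hand off to the transform stored procedure that the pipeline normally invokes.
cursor.execute("EXEC dbo.usp_transform_claims")
conn.commit()
conn.close()
```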
Confidential
Azure Data Engineer / Data Architect
Responsibilities:
- Worked in the Regulatory Compliance IT team in a Data Architect role covering data profiling, data modeling, and ETL.
- Responsible for big data initiatives and engagements, including analysis, brainstorming, and architecture; worked with Big Data on-premises and in the cloud, Master Data Management, and Data Governance.
- Transformed data by running a Python activity in Azure Databricks.
- Created Azure Data Factory pipelines to load flat file and ORC data into Azure SQL.
- Cloud-based report generation, development, and implementation using SCOPE constructs and Power BI; expert in U-SQL constructs for working with multiple source streams within Azure Data Lake.
- Designed and developed SSIS packages.
- Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per that roadmap.
- Created Hive tables, loaded and analyzed data using Hive queries, and developed Hive queries to process and generate the data.
- Developed pipelines in Azure Data Factory that call notebooks to transform data for reporting and analytics.
- Designed and developed a data lake on Hadoop for processing raw and processed data via Hive.
- Utilized Apache Spark with Python to develop and execute data processing jobs.
- Used ETL/ELT processes with Azure Data Warehouse, keeping data in Blob Storage with almost no limitation on data volume.
- Performed data modeling; designed, implemented, and deployed high-performance custom applications at scale on Hadoop/Spark; and implemented data integrity and data quality checks in Hadoop using Hive scripts (see the sketch below).
- Designed and developed T-SQL stored procedures to extract, aggregate, transform, and insert data, and developed SQL stored procedures to query dimension and fact tables in the data warehouse.
- Coordinated with the client and business analysts to understand and develop reports.
Environment: Azure Data Factory, Azure SQL Server, Azure Data Lake Analytics, Azure Databricks, Python, Spark, HQL, U-SQL, SQL, HDInsight, Azure SQL Data Warehouse, Git, Azure DevOps, Power BI, JSON.
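For illustration, a minimal PySpark sketch of the kind of data integrity and data quality checks described above, run against a Hive table; the database, table, and column names are hypothetical:

```python
# Illustrative PySpark data quality checks over a Hive table, in the spirit of the
# integrity/quality checks above. The database, table, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").enableHiveSupport().getOrCreate()

df = spark.table("compliance.transactions")

total = df.count()
null_keys = df.filter(F.col("transaction_id").isNull()).count()
dup_keys = total - df.dropDuplicates(["transaction_id"]).count()
bad_amounts = df.filter(F.col("amount") < 0).count()

# Persist the results so the data quality assessment report can be built from them.
report = spark.createDataFrame(
    [("row_count", total), ("null_transaction_id", null_keys),
     ("duplicate_transaction_id", dup_keys), ("negative_amount", bad_amounts)],
    ["check_name", "value"],
)
report.write.mode("overwrite").saveAsTable("compliance.dq_assessment")
```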
Confidential, Las Vegas
Integration Consultant- Middleware
Responsibilities:
- Enterprise solution architecture and cloud architecture for the Cognitive Analytics BI solution and the MDM solution for Advance Auto Parts.
- Led a team of BSAs and project architects for requirements capture, requirements coverage, and technical team liaison.
- ESA for the AWS-based Cognitive Analytics BI solution using AWS services including VPCs, EMR, Redshift, S3, QuickSight, Direct Connect, R, and Python.
- ESA for the MDM solution built on the Reltio COTS product, with MuleSoft-based integration and Collibra for data governance.
- ESA for service-oriented architecture solutions, including web services and REST services using HATEOAS principles.
- Architecture solutions across the AAP domain per the TOGAF and Zachman frameworks.
- Architecture solutions for high performance and high availability (HA).
- Technical solutions for performance improvement across AAP's big data, mainframe legacy, service integration, and cloud-based applications.
- Solution and Technical Architecture to standup the MDM solution for Advance Auto Parts using Reltio, Mulesoft
- Solution and Technical Architecture to standup a Cognitive Analytics Data Lake solution.
- Setting up AWS Data Pipeline for on-premises to AWS S3 data movement.
- Setting up the data visualization tool to read data from AWS Redshift.
- Setting up an AWS Glue ETL job to pull data from S3 and push it to AWS Redshift (see the sketch below).
- Security best practices to secure AWS data
- Setting up Jenkins CI pipeline
- Implementing the End to End architecture and solution for AAP MDM for Professional Customer, Location Domain using Mulesoft gateway
- End to End Delivery of AS400 Legacy data extracts to AWS S3 and Reltio MDM SaaS solution.
- Technical Lead responsible for Mulesoft Application components delivery for MDM, using S3, Reltio Connectors
- Ownership and delivery of MuleSoft performance improvement strategies and code review for MuleSoft integration in the Advance Auto Parts MDM project.
- Ownership and delivery of REST services integration (System API, Process API, Experience API) in the Advance Auto Parts MDM project.
- Responsible for end-to-end architecture delivery of Reflexis, a SaaS application replacing the in-house CDL (Customer Driven Labor) application.
- Management and configuration of REST API based integration, File based integration.
- Analysis, Vendor liaison for introduction of COTS products for Integration (Mulesoft), Data Governance (Collibra), Reporting (Yellowfin) to AAP Enterprise.
- Tech Lead responsible for end to end data delivery pipeline for data push from On-Prem Legacy system (AS400) to AWS Data Lake using In-house ETL capability (Datastage), Batch Scheduling (UC4), Linux Scripting, AWS CLI, DB2 SQL.
- As senior developer, delivered UML model artifacts.
- Designed and Developed Rest Services for Reflexis using Spring Boot.
- Implemented service layer using Spring IOC and annotations and Controllers using Spring MVC
- Involved in BI Analytics data discovery and data governance for Cognitive Analytics with in-depth Data Cataloging.
- Peer to Peer Reviews of Enterprise Architecture delivery.
- Created and laid out MuleSoft API standards for the AAP enterprise.
Environment: AWS S3, AWS CLI, Mule Server 3.7.2, Mulesoft Anypoint 6.5, UC4 (Automic), DataStage 8.5, AS400, DB2, Oracle 12c, SQL, JDK 1.8, GitHub, Jenkins, Reltio MDM, Spring Boot 1.2.3, Java 8.
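For illustration, a minimal AWS Glue (PySpark) job of the S3-to-Redshift shape described above; the Glue catalog database and table, column mappings, Redshift connection name, and S3 temp path are placeholders:

```python
# Sketch of an AWS Glue (PySpark) job that pulls a catalogued S3 extract and pushes it
# to Redshift. Catalog names, mappings, connection names, and paths are placeholders.
import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from awsglue.transforms import ApplyMapping

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the legacy extract that the on-prem pipeline dropped into S3 (via the Glue catalog).
source = glue_context.create_dynamic_frame.from_catalog(
    database="example_raw", table_name="customer_extract"
)

# Rename/retype columns to match the Redshift staging table.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("cust_id", "string", "customer_id", "string"),
        ("cust_nm", "string", "customer_name", "string"),
        ("loc_cd", "string", "location_code", "string"),
    ],
)

# Push to Redshift using a catalogued JDBC connection and an S3 staging directory.
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=mapped,
    catalog_connection="redshift-analytics",
    connection_options={"dbtable": "stage.customer", "database": "analytics"},
    redshift_tmp_dir="s3://example-bucket/glue-tmp/",
)
job.commit()
```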
Confidential, Princeton, NJ
Lead Developer
Responsibilities:
- Provided UI using C#.NET/VB.NET/ASP.NET for Rich GUI of the Application
- Used XML, XSLT, and XPath for navigating and updating XML documents.
- Developed financial reports using SSRS for store-wide sales and returns on SQL Server 2008.
- Developed the OHT application using ASP.NET, MVC4, HTML5, CSS3, JavaScript, and jQuery.
- Developed the OHT application for mobile compatibility using Kendo UI Mobile from Telerik.
- Developed the data model using Entity Framework.
- Designed and developed several pages in MVC4 with Razor.
- Consumed a Java web service in the ASP.NET application using C#.NET.
- Developed complex Stored Procedures and views to generate various Drill-down reports, Dynamic Reports and linked reports using SSRS.
- Implemented and consumed SOA services using Web Services and WCF.
- Developed Stored Procedures, Functions, and Triggers using Oracle 11g/SQL Server 2008
- Developed UI interfaces using JavaScript, jQuery, CSS, HTML5, ASP.NET, and C#.NET.
- Developed a web-based Availability application for viewing the schedules of associates working in stores, using Telerik controls (RadControls for ASP.NET AJAX) and Kendo UI Mobile controls.
- Prepared Unit Test Case documents.
- Developed scheduling jobs using Windows service.
- Involved in Test case design reviews and Code reviews.
- Used Agile Scrum development methodology for this project
Environment: Visual Studio 2010, .NET Framework 3.5/4.0, TFS, Silverlight 5.1, Web Forms, SQL Server 2008 R2, SSRS/SSIS, C#.NET 4.0, ASP.NET 4.0, MVC4, AJAX, WCF, NUnit, JavaScript, HTML/HTML5, CSS/CSS3, Telerik Controls, RIA Framework, Entity Framework 4.0, Windows 7.
Confidential, Omaha, NE
.NET Lead Developer
Responsibilities:
- Significant role in Analysis and Design stages of the project life cycle.
- Design and develop the GUI, Business Logic Layer and Data Access Layer.
- Developed the Web forms for new policy endorsement, policy cancellation, renewal, reinstatement, account correction, audit, premium calculation, coverage, policy information, billing details, policy inquiry, and work file information using ASP.NET, C#, XML, XSLT & JavaScript.
- Developed a suite of applications, including Windows services and a WinForms test harness allowing member purchasing via biometric reads, using Visual Studio 2010 (C#, .NET 3.5, LINQ), SQL Server, Castle Windsor, Confidential Application Blocks, and M2-Hamster Plus.
- Involved in Extraction, Transformation, and Loading (ETL) solutions using SQL Server Integration Services (SSIS)
- Worked on managing web services with SoapUI and on Kendo UI design.
- Created Tasks, Sites, InfoPath forms that can be used to easily create XML forms to meet each business-specific need using SharePoint Portal Services.
- Designed and Developed Master and Content Pages (Web Forms) using ASP.Net Server Controls and C# as code-behind.
- Developed web services for premium calculation and coverage.
- Deployed the 3-Tier Architecture Application with UI, Business Layer and Data Access layers using C# .NET.
- Used Confidential Message Queuing (MSMQ) technology for inter-process communication.
- Implemented role-based security with forms-based authentication.
- Developed web services and used them for data access layer and business layer.
- Implemented and consumed Web Services (SOAP, WSDL and UDDI) for automatic dump in SQL server.
- Wrote complex SQL queries with joins across multiple tables, stored procedures, and triggers; performed database backup and restore; defined roles; created database users; and used Data Transformation Services (DTS).
- Determined the security concerns (IIS security, web application security, and database security) to be addressed during application development.
- Developed the web user controls and dynamic creations of web controls.
- Used SQL Server as backend and implemented ADO.NET data objects such as Data Adapter, Data Reader, Dataset, Data table.
- Created, deployed, and managed reports using Crystal Reports.
- Used Object Oriented Programming in developing controls using interfaces.
- Used C#, data grids and XML to pull the data from the database and display it on the UI.
- Designed Logical and Physical Data Model.
- Developed application Data Flow Diagrams (DFDs) using MS Visio.
- Involved in unit testing, fixing bugs and maintenance of the product.
- Used Visual Source Safe for source code version controlling.
Environment: C#, ASP.NET 3.5, SQL Server 2008, Crystal Reports, ADO.NET, JSON, SOAP, REST, Visual Studio 2012, XML, HTML, JavaScript, OAuth, VB.NET, MS Visio, VSS.
Confidential
.NET Developer
Responsibilities:
- Involved in Requirement Analysis, Design and Development Phases of project.
- Used Telerik RadComboBox and RadTextBox controls for the UI design.
- Designed and developed user interfaces using ASP.NET MVC, .NET Framework 4.0, and C#.
- Developed the Entities needed for the Database tables using Entity Framework.
- Extensively used Entity Framework with WCF services.
- Used partial and shared views to display results produced by the business logic.
- Involved in developing WCF services using C#, to centralize the business process.
- Used Fluent Validation and jQuery to provide client-side validations.
- Used jQuery AJAX for server calls from the client.
- Involved in production activities such as merging code into TFS, creating packages, and moving the code to staging and production.
- Designed and created database tables in SQL Server 2012.
- Worked on new user stories in each sprint.
- Worked on Scrum and agile methodology.
- Preparation and Execution of Unit Test Cases.
- Created stored procedures and user-defined functions to interact with the database.
- Involved in TDD and mock-based unit testing.
Environment: C#, Visual Studio.NET 4.0, ASP.NET MVC 4, Entity Framework, SQL Server 2012, WCF, HTML, Repository and Unit of Work patterns, jQuery, Bootstrap, CSS, TFS, MOQ, SSIS.