Enterprise Data Architect Resume
SUMMARY
- Over 12 years of experience in data architecture, data modeling, and data warehouse design
- Over 16 years of experience in data integration and ETL (SQL & Informatica)
- Over 4 years of experience with Microsoft Azure Cloud architecture and Snowflake CDW
- Hands-on experience with Azure Data Lake, Azure Data Factory, Databricks, Synapse (SQL Data Warehouse), Azure Blob Storage, and Azure Storage Explorer
- Experience implementing hybrid connectivity between Azure and on-premises environments using virtual networks, VPN, and ExpressRoute
- Microsoft Certified Azure Data Engineer
- Microsoft Certified SQL Server Developer
- Informatica Certified Developer and Administrator
- Master's degree from the University of Texas at Arlington
- Handled all phases of the SDLC, including requirements gathering, analysis, design, development, unit testing, ongoing maintenance, and production support
- Experience with multiple data modeling methodologies, including star schema, snowflake schema, and 3NF
- Worked on multiple database systems, including SQL Server, Oracle, Teradata, and DB2
- Experience in data analysis, data profiling, data standardization and cleansing
- Led Six Sigma projects and participated in multiple quality and process improvement projects
- Experience in healthcare, clinical, care management, insurance, finance, and telecom domains
TECHNICAL SKILLS
Languages: T-SQL, PL/SQL, XQuery, Python, PySpark, Spark SQL
ETL: Informatica PowerCenter, SSIS, Informatica Administrator, Informatica PowerExchange, Informatica Data Services (IDS), Informatica Developer
Tools/Applications: SQL Server Management Studio, Visual Studio Code, MS Visio, Enterprise Architect (Sparx EA), SQL Developer, Toad, ER/Studio, erwin, PuTTY, WinSCP, Tectia SSH, MS Office, SharePoint, Microsoft Power BI
Code repository: Git, TortoiseSVN, Bitbucket, Bamboo, Visual SourceSafe, TFS
Ticketing systems: Atlassian Jira, ServiceNow, BMC Remedy
RDBMS: MS SQL Server, Oracle, IBM DB2, MS Access, Teradata
Scripting: PowerShell, UNIX shell scripting (Bash, Korn shell), Python
Operating Systems: Windows, DOS, UNIX, Linux, Solaris, AIX
Schedulers: Informatica Scheduler, Tidal, crontab, Control-M
Project methodologies: Waterfall, Agile (Scrum, Kanban)
PROFESSIONAL EXPERIENCE
Enterprise Data Architect
Confidential
Responsibilities:
- Architected and implemented ETL and data movement solutions using Azure Data Platform services (Azure Data Lake, Azure Data Factory, Databricks, Delta Lake)
- Built cloud solution architectures using Microsoft Azure and Snowflake CDW
- Designed and implemented migration strategies for traditional systems onto Azure platform.
- Developed conceptual solutions and created proof-of-concepts to demonstrate viability of solutions and performance.
- Designed and built multi-tenant SaaS architectures and relational data models for enterprise applications
- Defined cloud network architecture using Azure virtual networks, VPN, and ExpressRoute to establish connectivity between on-premises and cloud environments
- Developed patient-matching algorithms in Azure Databricks and Delta Lake to identify members from different source systems and tie them together as one person (see the first sketch following this list)
- Hands-on experience with Azure Databricks, Azure Data Factory, Azure SQL, and PySpark
- Worked in the Data Factory editor to create linked services, datasets, and pipelines
- Implemented Copy activities and custom Azure Data Factory pipeline activities
- Designed and configured Azure Virtual Networks (VNets), subnets, Azure network settings, security policies and routing.
- Used Terraform to deliver Infrastructure as Code (IaC) and to provision Azure VMs
- Used Git as the version control system
- Hands-on with Data Lake storage, Log Analytics management, and user and role assignment
- Implemented Azure VMs and container management solutions and automation
- Experience in Python programming using pandas and NumPy
- Developed Python scripts to cleanse data and to trigger ETL processes based on file availability (see the second sketch following this list)
- Designed reusable Informatica components to process different types of eligibility files
- Developed and automated Five-Star supplemental files for different healthcare payers
- Developed a payer gap process to help payers identify gaps in care
- Provided data for Power BI reports and dashboards
- Implemented Disaster Recovery, backup migration and Azure deployments
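Illustrative sketch (first): a minimal PySpark example of the patient-matching approach described above. The table names, columns, and match rule are simplified placeholders, not the actual project logic.

    from pyspark.sql import SparkSession, functions as F

    # Databricks-style environment assumed; all table and column names are illustrative.
    spark = SparkSession.builder.getOrCreate()

    def normalize(df, source_name):
        """Standardize the fields used as match keys."""
        return df.select(
            F.col("member_id").alias("source_member_id"),
            F.lit(source_name).alias("source_system"),
            F.upper(F.trim(F.col("first_name"))).alias("first_name"),
            F.upper(F.trim(F.col("last_name"))).alias("last_name"),
            F.to_date("dob").alias("dob"),
            F.regexp_replace("ssn", "-", "").alias("ssn"),
        )

    candidates = (
        normalize(spark.table("raw.members_system_a"), "A")
        .unionByName(normalize(spark.table("raw.members_system_b"), "B"))
    )

    # Deterministic rule: identical normalized name + DOB + SSN collapses to one person key.
    matched = candidates.withColumn(
        "person_key",
        F.sha2(F.concat_ws("|", "first_name", "last_name", "dob", "ssn"), 256),
    )

    # Persist the member-to-person crosswalk as a Delta table for downstream consumers.
    matched.write.format("delta").mode("overwrite").saveAsTable("curated.member_person_xwalk")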
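Illustrative sketch (second): a simplified Python example of the file-availability-driven cleansing script noted above, using pandas and NumPy. Paths, file patterns, and the ETL trigger are placeholders rather than production values.

    import subprocess
    from pathlib import Path

    import numpy as np
    import pandas as pd

    LANDING = Path("/data/landing")   # hypothetical landing folder
    STAGED = Path("/data/staged")     # hypothetical staging folder

    def cleanse(csv_path: Path) -> Path:
        """Basic cleansing before handing the file off to the ETL process."""
        df = pd.read_csv(csv_path, dtype=str)
        df = df.apply(lambda s: s.str.strip())            # trim whitespace
        df = df.replace({"": np.nan}).drop_duplicates()   # blanks to NaN, remove duplicates
        out = STAGED / csv_path.name
        df.to_csv(out, index=False)
        return out

    # Run the ETL trigger only when new eligibility files have landed.
    for f in sorted(LANDING.glob("eligibility_*.csv")):
        staged = cleanse(f)
        # Placeholder for the real trigger (for example, pmcmd or an ADF pipeline call).
        subprocess.run(["echo", f"trigger ETL for {staged}"], check=True)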
Data Architect / Azure Administrator
Confidential
Responsibilities:
- Designed on-premise ETL processes for enterprise data warehouse, data marts and enterprise applications using Informatica and SQL
- Migrated SQL data warehouse to Azure Data Lake.
- Experience in processing data in Azure Databricks
- Experience in setting up authentication and authorization in Azure
- Hands-on experience with Azure Databricks, Azure Data Factory, Azure SQL, and PySpark
- Designed and configured Azure Network Security Groups (NSGs), Virtual Networks (VNets) and subnets
- Strong experience and knowledge in modular design and error handling
- Handled changes affecting entire infrastructure with minimal business impact
- Used PowerShell for scripting tasks, including generating lists of files to be processed and driving file loop-through processing in Informatica
- Developed audit functionality for tracking different load processes.
- Monitored infrastructure and resolved any potential issues.
Sr. ETL Administrator / Architect
Confidential
Responsibilities:
- Installed and configured Informatica PowerCenter 9.6.1
- Applied hot fixes on Informatica PowerCenter.
- Deployed Informatica objects from Dev to QA and QA to Prod.
- Experience with native & LDAP authentication.
- Optimized Informatica code to improve performance for highly complex business logic
- Experience in XML, JSON, WSDL, and XQuery coding for data transformation and loading (see the sketch following this list)
- Created database objects for reporting and transactional processing.
- Performed data analysis, data profiling and data cleansing using Informatica and SQL.
- Worked extensively on performance tuning of Informatica workflows and SQL scripts.
- Hands-on expertise with Informatica Data Services (IDS)
- Experience in Disaster Recovery planning, execution and testing.
- Supported SAP BusinessObjects and Tableau reporting with Informatica PowerCenter and IDS
- Leadership experience in coordinating with multiple technical teams, stakeholders and senior management.
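Illustrative sketch: a small, standalone Python example of the kind of XML-to-JSON transformation referenced above. The element names and sample data are invented; the project work itself was implemented with XQuery and Informatica.

    import json
    import xml.etree.ElementTree as ET

    SAMPLE = """
    <members>
      <member id="1001"><first>JANE</first><last>DOE</last><plan>GOLD</plan></member>
      <member id="1002"><first>JOHN</first><last>ROE</last><plan>SILVER</plan></member>
    </members>
    """

    def xml_to_json(xml_text: str) -> str:
        """Flatten member elements into a JSON array for downstream loading."""
        root = ET.fromstring(xml_text)
        records = [
            {"member_id": m.get("id"), **{child.tag: child.text for child in m}}
            for m in root.findall("member")
        ]
        return json.dumps(records, indent=2)

    if __name__ == "__main__":
        print(xml_to_json(SAMPLE))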
Technical Lead
Confidential
Responsibilities:
- Responsible for end-to-end implementation and operations of the Enterprise Data Warehouse using Informatica, Oracle 11g, Microsoft SQL Server 2012, and DB2.
- Represented Informatica and SQL Server on the enterprise architecture board
- Installed and configured Informatica PowerCenter 8.6 and 9.1
- Deployed code to Informatica QA and production environments.
- Managed security and permissions of Informatica users.
- Performed business transformations and loaded data into the EDW from SQL Server, Oracle, flat files, XML, Teradata, mainframes, APIs, and the Epic Clarity system.
- Worked extensively with Informatica PowerCenter and PowerExchange.
- Designed data warehouse and data marts using Relational and Dimensional modeling.
- Developed SQL scripts to create database objects.
- Fine-tuned Informatica transformations and workflows for better performance.
- Developed a reconciliation process to validate that data was loaded correctly (see the first sketch following this list)
- Experience with Pushdown optimization (PDO)
- Extensive knowledge of healthcare, Medicaid, Medicare, and clinical data, including claims, procedure codes, diagnosis codes, and provider and member data
- Developed solutions for multiple complex care management requirements and guided the team to deliver code that met high coding and performance standards
- Designed a de-identification process to mask PII data (see the second sketch following this list)
- Created Informatica mappings using Source Qualifier, Expression, Lookup, Aggregator, Router, Filter, Update Strategy, Joiner, XML Generator, and XML Parser transformations
- Designed, created and modified SQL database objects.
- Hands-on experience with OLTP and OLAP applications
- Provided 24x7 on-call support.
- Worked on Informatica PowerExchange for Change Data Capture (CDC).
- Collaborated with remote offshore team, created requirement documents, verified coding standards and conducted code reviews.
- Developed Informatica Data Services (IDS) in the Developer tool for SOA applications.
- Developed PowerShell scripts and batch scripts for file processing and cleansing.
- Experience in integrating vendor systems with in-house applications while maintaining system integrity and compliance with HL7 standards and HIPAA regulations.
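Illustrative sketch (first): a simplified Python version of the load reconciliation idea described above, comparing control totals between source and target extracts. The real process was built with SQL and Informatica; file and column names here are placeholders.

    import pandas as pd

    # Hypothetical control extracts: one row per table with a row count and an amount total.
    source = pd.read_csv("source_counts.csv")   # columns: table_name, row_count, total_amount
    target = pd.read_csv("target_counts.csv")

    recon = source.merge(target, on="table_name", suffixes=("_src", "_tgt"))
    recon["count_ok"] = recon["row_count_src"] == recon["row_count_tgt"]
    recon["amount_ok"] = (recon["total_amount_src"] - recon["total_amount_tgt"]).abs() < 0.01

    failures = recon[~(recon["count_ok"] & recon["amount_ok"])]
    if not failures.empty:
        # In the real process a failure raised an alert and blocked downstream loads.
        print(failures[["table_name", "row_count_src", "row_count_tgt"]].to_string(index=False))
        raise SystemExit("Reconciliation failed")
    print("Reconciliation passed for all tables")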
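Illustrative sketch (second): a minimal Python example of the de-identification approach noted above, using salted deterministic hashing for identifiers plus masking of direct PII. Column names and salt handling are assumptions; the production process was implemented in Informatica and SQL.

    import hashlib

    import pandas as pd

    SALT = "replace-with-secret-salt"   # placeholder; a real salt would come from secure configuration

    def pseudonymize(value: str) -> str:
        """Deterministic salted hash so the same member always maps to the same token."""
        return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]

    def mask_pii(df: pd.DataFrame) -> pd.DataFrame:
        out = df.copy()
        out["member_id"] = out["member_id"].astype(str).map(pseudonymize)
        out["ssn"] = "XXX-XX-" + out["ssn"].astype(str).str[-4:]        # keep last 4 digits only
        out["first_name"] = out["first_name"].astype(str).str[0] + "***"
        out["last_name"] = out["last_name"].astype(str).str[0] + "***"
        return out

    # Example usage with made-up data
    sample = pd.DataFrame({"member_id": ["1001"], "ssn": ["123-45-6789"],
                           "first_name": ["Jane"], "last_name": ["Doe"]})
    print(mask_pii(sample))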