Senior Lead Data Integration Engineer/Senior Technology Architect Resume
CA
PROFESSIONAL SUMMARY:
- Seventeen (17) years of experience working as a Lead Architect/Developer Engineer across AWS, Azure, and GCP cloud services; the Software Development Life Cycle (SDLC); web apps and WebLogic; Blob Storage, Snowflake DW, Hadoop, Hive, Databricks, DevOps, Terraform, MS Dynamics CRM, Kinesis, Jenkins, Docker, Chef, Ansible, Redshift, Bigtable, HBase, EMR, S3, EC2, Lambda, Kubernetes, data integration, SageMaker, Zeppelin, migration, IaaS, PaaS, SaaS, Big Data, cloud architecture, Palo Alto network security, automation, CI/CD, Big Data Hadoop/Spark, Java, Azure Cosmos DB, Event Hubs, MS SQL Server, dashboards, CloudWatch, Elasticsearch Service, VPC, IoT, Machine Learning, Tableau, Looker, Talend, SQL, SSIS, SSAS, SSRS, UNIX/Linux, RESTful API web services, Azure DevOps, and security.
- Healthcare: HIPAA transactions 837, 835, 834, 270/271, and 276/277; HL7; FHIR; and HEDIS.
- AWS analytics services, Azure, Machine Learning, Internet of Things (IoT), DevOps with Terraform services, GCP, Bigtable, Directory, scripting, automation, Data Lake, Blob Storage, and Azure Data Factory
PROFESSIONAL EXPERIENCE
Confidential, CA
Senior Lead Data Integration Engineer/Senior Technology Architect
Responsibilities:
- Experienced with Big Data cloud platforms: Google Cloud Platform (GCP), BigQuery, Bigtable, HBase, Azure, Databricks, HDInsight (Hortonworks), and Kubernetes
- Google Cloud Platform (GCP) framework: BigQuery, Dataflow, Datalab, Pub/Sub, ML Engine, Bigtable, and Workflow
- Worked with HL7 functions, including the Version 2.x Messaging Standard, to facilitate interoperability between various health information systems.
- AWS services: EKS, ECS, EC2, RDS, Kinesis, S3, CloudWatch, CloudFormation, Secrets Manager, Elasticsearch Service, Jupyter, SageMaker, VPC, Route 53, Direct Connect, etc.
- Worked as a Databricks Engineer with a focus on data warehousing ETL development in Azure Databricks
- Strong in MS Azure Data Factory with Azure Databricks, Database, and ADLS (Python, Spark SQL, Java, JavaScript, using Parquet files); a sketch of this ETL pattern follows this list
- Working knowledge of Azure DevOps and its interaction with Databricks and Data Factory, and of Databricks Delta Lake
- Worked as a Looker Developer with Azure/AWS Databricks, IoT, and PySpark; worked with Dynamics CRM solutions, data integration, ETL, Service-Oriented Architecture (SOA), Enterprise Service Bus (ESB), and Enterprise Application Integration (EAI)
- Worked with automation tools Chef and Puppet, Azure compute services, and AWS ECS, EC2, ECR, Lambda, VPC, S3, and IoT.
- Worked on data migration from S3 to Hive and Impala, IoT, Machine Learning, Kinesis, Enterprise Application Integration (EAI), SSIS, and SSRS
- Worked with Visual Studio, .NET, SQL Server Integration Services (SSIS), SSRS, Tableau, Looker, and Databricks
- Worked on an Enterprise Data Warehouse designed as a single-source repository for all healthcare data dashboards/ad hoc reports
- Developed and designed Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) databases
- Worked on developing and designing Microsoft Azure, Databricks, and SQL Data Warehouses, and SQL Server Analysis Services (SSAS)/SSIS/SSRS.
- Built data workflows with ETL, SSIS, BI, DW, AWS EMR, Data Lake, Redshift, Hadoop, Spark, Spark SQL, Scala, and Python
- Experienced in building and delivering cloud services and cloud automation solutions; utilized Terraform services and S3 data migration
- Worked on Azure Cosmos DB, Databricks, Event Hubs API, Azure Data Factory pipelines, and the Azure Data Platform to orchestrate management tasks
- Worked with Google Cloud (GCP), BigQuery, Bigtable, HBase, Azure, Databricks, and Azure Data Factory (ADF) to compose and orchestrate Azure data services
- As a Data Cloud Architect, worked with Azure, Ansible, Jenkins, Docker, and Kubernetes; utilized Azure Data Factory to create, schedule, and manage data pipelines
- Worked with Looker, Machine Learning, IoT, Azure Data Factory pipelines, and other Azure Data Platform services to orchestrate management tasks
- Used Pentaho/PDI (Kettle), Electronic Data Interchange (EDI), and SSIS; performed batch data analysis using Hive, SQL, SSIS, SSRS, and SSAS
- Worked with Facets and healthcare data/claims in 835 and 837 formats for analytical purposes; X12 Electronic Data Interchange (EDI) and PHI
- Worked with Azure Cosmos DB, Databricks, Event Hubs, and Terraform tools and services for DevOps with Kubernetes
- Worked with Facets, EPIC, automation tools Chef and Puppet, Azure compute, networking, and storage services, and the public cloud platform.
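A minimal sketch of the Azure Databricks ETL pattern referenced above (Parquet on ADLS, a Spark SQL transformation, Delta output); the storage account, container, column, and table names are illustrative assumptions, not actual project values:

```python
# Sketch of an Azure Databricks ETL step: read Parquet from ADLS Gen2,
# apply a Spark SQL transformation, and write a Delta table.
# Assumes a Databricks (or delta-spark) runtime; all names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("claims-etl").getOrCreate()

# Read raw Parquet files landed in an ADLS Gen2 container (abfss URI scheme).
raw = spark.read.parquet(
    "abfss://raw@examplestorageacct.dfs.core.windows.net/claims/837/"
)

# Example Spark SQL transformation: keep finalized claims only.
raw.createOrReplaceTempView("claims_raw")
curated = spark.sql(
    """
    SELECT claim_id, member_id, provider_id, billed_amount, service_date
    FROM claims_raw
    WHERE claim_status = 'FINAL'
    """
)

# Persist as a Delta table for downstream Data Factory and reporting jobs
# (assumes the 'curated' schema already exists in the metastore).
curated.write.format("delta").mode("overwrite").saveAsTable("curated.claims_837")
```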
Confidential
Senior Lead Azure Architect/Data Integration ETL
Responsibilities:
- Worked on Azure architecture, design, and implementation of on-prem and hybrid cloud solutions utilizing Azure and AWS
- Experienced with DevOps and CI/CD pipeline tools: Jenkins, Azure Cosmos DB, Databricks, Event Hubs, and Bitbucket.
- Utilized Azure services and automation tools including Azure Resource Manager, Puppet, Chef, and Ansible to implement a cloud operating model
- Strong in MS Azure Data Factory with Azure Databricks, Database, and ADLS (Python, Spark SQL, using Parquet files)
- Worked on developing Microsoft Azure SQL Data Warehouses, and on developing and maintaining SQL Server Analysis Services (SSAS)/SSIS/SSRS.
- Built data workflows using GCP, HBase, Bigtable, BigQuery, AWS EMR, Spark, Spark SQL, Scala, and Python
- Worked with Azure Databricks, PySpark, Java, JavaScript, Scala, Azure Data Factory (ADF), and Databricks Delta Lake (see the upsert sketch after this list)
- Worked with automation tools Chef and Puppet as a Data Cloud Architect: Azure, Ansible, Jenkins, Docker, Kubernetes, DevOps, automation, CI/CD, Azure compute, networking, and storage services, and the public cloud platform.
- Worked with Visual Studio 2015, Team Foundation Server, .NET 4.5, SQL Server Reporting Services (SSRS), and web security.
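A hedged sketch of the Databricks Delta Lake upsert pattern behind incremental ADF-triggered loads like those described above; the staging path and table names are hypothetical:

```python
# Delta Lake MERGE (upsert) in Databricks: update matched rows from a
# staging extract, insert new ones. Assumes a Databricks/delta-spark
# runtime and an existing 'curated.members' Delta table; names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-upsert").getOrCreate()

# Incremental batch staged by an upstream Data Factory copy activity.
updates = spark.read.parquet(
    "abfss://staging@examplestorageacct.dfs.core.windows.net/members/"
)

target = DeltaTable.forName(spark, "curated.members")

# Upsert keyed on member_id: overwrite matched rows, append the rest.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.member_id = s.member_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```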
Confidential, San Francisco, CA, Dallas, TX, Seattle, WA
Senior Lead Architect/Developer Cloud AWS/Azure/GCP
Responsibilities:
- Worked on ESB (Enterprise Service Bus), Logic Apps, and APIs; reviewed and documented the existing SQL database design and proposed and implemented an architecture to migrate existing data from the data repository to an enterprise data warehouse.
- Worked on developing and designing GCP (HBase, Bigtable, BigQuery) and Microsoft Azure SQL Data Warehouses, and on developing and maintaining SQL Server 2016/SQL Server Analysis Services (SSAS)/SSIS/SSRS.
- Worked with Pentaho, ETL, SSIS, and the Talend Open Studio and Talend Enterprise platforms for data management
- Worked with Azure Data Factory pipelines and the Azure Data Platform to orchestrate management tasks using Azure Automation
- Configured and installed tools with a highly available architecture; created ETL SSIS packages in VB and C#
- Used Tableau, Looker, Pentaho, Hadoop, and Spark Streaming APIs to optimize existing algorithms in Hadoop with Spark Context and Spark SQL (an illustrative streaming job follows this list)
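An illustrative Spark Structured Streaming job of the kind the Spark Streaming work above involved; the file source, schema, and path are assumptions chosen to keep the sketch self-contained (a Kinesis or Kafka source would follow the same pattern):

```python
# Structured Streaming sketch: read JSON events as they land and compute
# tumbling one-minute counts per event type. All paths/fields are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import window, col

spark = SparkSession.builder.appName("stream-agg").getOrCreate()

# File source with an explicit schema (streaming sources require one).
events = (
    spark.readStream.format("json")
    .schema("event_id STRING, event_type STRING, event_time TIMESTAMP")
    .load("/data/events/incoming/")
)

# Tumbling one-minute window counts per event type.
counts = events.groupBy(
    window(col("event_time"), "1 minute"), col("event_type")
).count()

# Console sink for demonstration; production jobs would write to a table.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```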
Confidential, Los Angeles, CA
Lead ETL Pentaho Talend Developer
Responsibilities:
- Worked with Looker, ESB (Enterprise Service Bus), Logic Apps, APIs, AWS EMR, Kinesis, and Hadoop technologies
- Worked with HL7 and Mirth to send and receive HL7 messages between Mirth Connect and the HL7 Soup message editor (see the parsing sketch after this list).
- Worked with Azure IaaS, SaaS, and PaaS and Electronic Data Interchange (EDI) to deliver accessible, trusted, and secure data
- Performed Electronic Data Interchange (EDI), Databricks, Azure, analytics, Scala, and migration projects to migrate data from data warehouses
- Worked on Tableau dashboards, Talend, SSRS, and ETL tooling; developed and scheduled jobs in the Talend integration suite.
- Monitored and supported the Talend jobs scheduled through Talend Administration Center (TAC).
- Tuned SQL for optimal performance; dimensional modeling, OLAP, OLTP, and star and snowflake schemas
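A minimal sketch of HL7 v2 message parsing using the python-hl7 library, the kind of message Mirth Connect routes in the work described above; the ADT^A01 sample message and its field values are hypothetical:

```python
# Parse an HL7 v2 message and extract patient fields from the PID segment.
# Uses the python-hl7 library; segments are separated by carriage returns.
import hl7

# Hypothetical ADT^A01 admission message.
raw = "\r".join([
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|"
    "20240101120000||ADT^A01|MSG00001|P|2.3",
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19800101|F",
])

msg = hl7.parse(raw)

# Pull the patient identifier and name out of the PID segment.
pid = msg.segment("PID")
print("MRN:", str(pid[3]))   # -> 123456^^^HOSP^MR
print("Name:", str(pid[5]))  # -> DOE^JANE
```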
Confidential, San Francisco, CA
Lead Talend Developer/ETL BI DW Dashboard Developer
Responsibilities:
- Experienced in building and delivering cloud services and cloud automation solutions: Big Data, Hadoop, Pentaho, Talend
- Worked with FHIR, HL7 v2, HL7 v3, CDA, RESTful web services, healthcare patient data, and HIPAA compliance audits
- Established, maintained, and enforced ETL architecture design principles, techniques, standards, and best practices
- Used Informatica Cloud Data Integration and Electronic Data Interchange (EDI) for global, distributed data warehouse and analytics projects.
- Worked with data migration from S3 to Hive; cloud data warehouses; Tableau; AWS Redshift; Azure SQL Data Warehouse; SQL, Java, JavaScript, and Snowflake; and Informatica Cloud Data Integration solutions to augment performance, productivity, and connectivity to cloud and on-premises sources (a sketch of the S3-to-Hive pattern follows this list).
- Experienced in data migration from S3 to Hive, and in Azure architecture, design, and implementation of on-prem and hybrid cloud solutions utilizing Azure and AWS
- Built data warehouses using wizards, preconfigured templates, and out-of-the-box mappings
- Worked with Big Data, Electronic Data Interchange (EDI), Hadoop, Hive, Pig, Sqoop, Pentaho, and Informatica
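A sketch of the S3-to-Hive migration pattern mentioned above, assuming a Spark cluster with Hive metastore access and the s3a connector configured; the bucket, prefix, partition column, and table names are placeholders:

```python
# Migrate Parquet data from S3 into a Hive-managed table with Spark.
# Assumes Hive metastore access and s3a credentials; names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("s3-to-hive")
    .enableHiveSupport()
    .getOrCreate()
)

# Read the source data directly from S3.
orders = spark.read.parquet("s3a://example-bucket/warehouse/orders/")

# Register it in the Hive metastore as a managed table, partitioned by date,
# so downstream Hive/Impala and BI queries can reach it.
(
    orders.write.mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics.orders")
)
```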
Confidential, Los Angeles, CA
ETL Data Integration/BI Architect Developer
Responsibilities:
- Managed error handling, performance tuning, error logging, clustering, and high availability in Talend
- Implemented and utilized HL7 standards to facilitate interoperability between our health information systems.
- Worked with HL7 on all documentation and data in order to remain consistent across all systems and organizations.
- Worked with healthcare and patient data for analytics, and with business analysts to correlate business requirements to domain entities and data elements
- Analyzed and performed data integrations with Pentaho, SSIS, SSAS, and SSRS; created dashboards, data visualizations, etc.