
Senior Lead Data Integration Engineer Resume

SUMMARY

  • Experienced with healthcare systems EPIC, Cerner, and QNXT, SOX/HIPAA compliance, EDI transactions 837, 835, 834, 270/271, and 276/277, and HEDIS.
  • Made PIM repository changes, including adding new fields, enumerations, import/export field parameters, categories, and subentities.
  • Managed structure groups and features and article attributes; merged data from supplier catalogs into the master catalog; created DQ rules and advanced SQL queries for interface and reporting purposes.
  • Experienced with the two major IDQ components, IDQ Analyst and IDQ Developer, and the IDQ Options; hands-on experience with the PIM and IDQ workbench for developing global, reusable data quality rules, and strong experience in data profiling using IDQ Analyst. Good understanding of the Supplier Portal, PIM dashboards, Media Manager, and plugin customizations using the SDK and REST services.
  • Supported upgrades to the tools and technology in use.
  • Developed in Pentaho, Informatica, and Talend for big data projects, dashboard design, and data visualization. Worked with Pentaho Data Integration (PDI/Kettle), Informatica IDQ, PowerCenter, and MDM for data management, Electronic Data Interchange (EDI) data integration/quality, and data governance.
  • AWS analytics services, Azure, machine learning, Internet of Things (IoT), DevOps with Terraform, GCP, Bigtable, directory services, scripting, automation, data lakes, S3, Blob Storage, Salesforce, Azure Data Factory, Enterprise Service Bus (ESB), Logic Apps, APIs, EMR, Hive, and Sqoop.
  • Managed error handling, performance tuning, error logging, clustering, and high availability.

PROFESSIONAL EXPERIENCE

Confidential

Senior Lead Data Integration Engineer

Responsibilities:

  • Experienced with big data cloud platforms: Google Cloud Platform (GCP), BigQuery, Bigtable, HBase, Azure, Databricks, HDInsight (Hortonworks), and Kubernetes; administered, configured, monitored, debugged, and performance-tuned Hadoop applications.
  • Experienced with the Informatica GUI, workflows, models, Hub, and Data Quality; modeled and architected complex multi-domain, scalable MDM implementations using the Informatica MDM 9.x/10.x stack, EBX5, and Stibo.
  • AWS services: EKS, ECS, EC2, RDS, Kinesis, S3, CloudWatch, Secrets Manager, Elasticsearch Service, VPC, Route 53, Direct Connect, etc.
  • Experience working with Informatica MDM 10.2+, Informatica 360, real-time and batch data processing, and Product 360.
  • Extensive experience designing scalable solutions for real-time integration with downstream applications involving large volumes of data for MDM projects.
  • Worked with MS Dynamics, with hands-on CRM development, Azure integration, and app logic development experience; developed APIs for cloud-to-cloud app interfaces using C# and JavaScript.
  • Worked with healthcare, claims payment / processing systems and EDI transactions
  • Worked with claims payment systems EPIC, Facets, QNXT, and SOA architecture (WCF/WPF)
  • Worked with healthcare systems EPIC, Cerner, and QNXT, SOX/HIPAA compliance, EDI transactions 837, 835, 834, 270/271, and 276/277, and HEDIS.
  • Worked with the HIPAA testing tools Community Manager and Ramp Manager.
  • Utilized DaaS to collect, correlate, enrich, and manage core business data to produce a single view of truth (BVT) for analytical and reporting purposes.
  • Worked as a Looker developer; worked with Dynamics CRM solutions, data integration, ETL, Service-Oriented Architecture (SOA), Enterprise Service Bus (ESB), Enterprise Application Integration (EAI), SSIS, and SSRS.
  • Configured, troubleshot, and supported the hosted PIM instance; created PIM instances and set up Informatica PIM with user access.
  • Utilized Azure services, Databricks, and automation tools including Azure Resource Manager, Puppet, Chef, and Ansible to implement a cloud operating model enabling Environment-as-a-Service and DevOps.
  • Worked with the automation tools Chef and Puppet and with cloud services including Compute, ECS, EC2, ECR, Lambda, VPC, S3, and IoT.
  • Experience with HTTP, REST, JSON, and IP technologies; networking, storage, and public cloud platforms.
  • Hands-on experience supporting, automating, and optimizing mission critical deployments in AWS, leveraging configuration management, CI/CD, and DevOps
  • DevOps experience in large Healthcare/Retail environment containing PHI/PII data sets
  • Architected and designed DevOps processes for large organizations
  • Experienced with Looker; installed and configured Kubernetes, set up clustering, and managed local deployments in Kubernetes.
  • Provided consistent environment using Kubernetes for deployment scaling and load balancing to the application from development through production, easing the code development and deployment pipeline by implementing Docker containerization.
  • Experience working with Informatica MDM 10.2+, PowerCenter, BDQ, IDQ, BDM, PIM, Informatica 360, and Product 360.
  • Worked with Azure Kubernetes Service (AKS) to simplify the deployment and operation of Kubernetes; used Ansible, Jenkins, Docker, and CI/CD automation to scale application infrastructures.
  • Used PIM as a data store for multiple product data types; loaded data into PIM using out-of-the-box (OOTB) capabilities; hands-on with OOTB Informatica PIM APIs, data syndication, UI capabilities, and product data retrieval.
  • Made PIM repository changes, including adding new fields, enumerations, import/export field parameters, categories, and subentities; worked on items, variants, products, structures, and relationships.
  • Managed structure groups and features and article attributes; merged data from supplier catalogs into the master catalog; created DQ rules and advanced SQL queries for interface and reporting purposes.
  • Experienced as a Looker developer; worked with the two major IDQ components, IDQ Analyst and IDQ Developer, and the IDQ Options; hands-on experience with the PIM and IDQ workbench for developing global, reusable data quality rules, and strong experience in data profiling using IDQ Analyst.
  • Worked on data migration from S3 to Hive, Impala, IoT, machine learning, Kinesis, Octave, web services, WebLogic, MATLAB, SAS, the Supplier Portal, PIM dashboards, Media Manager, and plugin customizations using the SDK and REST services; supported upgrades to the tools and technology used in the project.
  • Worked with Visual Studio, Tableau, Looker, .NET, and Salesforce development, leveraging third-party cloud service providers (IaaS, PaaS, SaaS).
  • Worked as an Enterprise Data Warehouse Architect to design an enterprise data warehouse as a single-source repository for all healthcare data, including dashboards and ad hoc reports.
  • Provided reports supplying the information needed for improved healthcare reporting.
  • Worked in Looker, Mode admin, ESB (Enterprise Service Bus), Logic Apps, and APIs; reviewed and documented the existing SQL database design and proposed and implemented an architecture to migrate existing data from the data repository to an enterprise data warehouse.
  • Suggested design and action plans to review the existing platform and documented the baseline of the existing databases.
  • Designed an enterprise data warehouse that is scalable and accommodates the needs of the business.
  • Documented the proposed data warehouse and database architecture, including workflows, ERDs, the Data Dictionary (DD), and related artifacts.
  • Documented stored procedures in the existing data marts;
  • Performed Extract/Transform/Load (ETL) of business data into data cubes and the data warehouse using AWS Glue.
  • Designed a data warehouse that ingests data from different data sources and scales for future requirements.
  • Configured standard end-user business intelligence applications for developing reports, extracts, ad hoc queries, and dashboards.
  • Worked with ESB (Enterprise Service Bus), machine learning, Internet of Things (IoT), Octave, web services, WebLogic, MATLAB, SAS, AWS, S3, Glue, Lambda, Logic Apps, and APIs; designed enterprise data warehouses.
  • Worked with Looker, Mode admin, SQL Server Business Intelligence Architecture (BIA) top-down and bottom-up approach
  • Developed and designed Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) databases
  • Designed and implemented data extracts using replication, stored procedures, and SQL Server Integration Services (SSIS); worked with Tableau, Looker, SSRS, and Databricks.
  • Created procedures and policies for data warehouses; installed, maintained, and administered Microsoft SQL Server.
  • Created requirements documents, systems architecture and interface specifications, data models, configuration management documents, and procedural manuals.
  • Maintained database structures to support the Software Development Life Cycle (SDLC) phases.
  • Worked in Salesforce, developing and designing Microsoft Azure, Databricks, and SQL Data Warehouses; developed and maintained SQL Server 2016, SQL Server Analysis Services (SSAS), SSIS, and SSRS.
  • Designed and developed data ingestion pipeline, performed data migration and conversion activities.
  • Performed Data Curation, Data Blending and Data harmonization.
  • Developed and integrated software applications using suitable development methodologies and standards, applying standard architectural patterns and taking into account critical performance characteristics and security measures.
  • Developed, tested, and documented enhancements and extensions to the Azure Data Lake.
  • Collaborated with business analysts, architects, and senior developers to establish the physical application framework (libraries, modules, execution environment).
  • Performed end-to-end automation of the ETL/data ingestion process for various datasets being ingested into Azure; worked with Kubernetes and as a Salesforce developer.
  • Worked with Informatica, PowerCenter, BDQ, IDQ, BDM, PIM, MDM, Looker, Mode admin, big data, IBM DataStage ETL, ESB (Enterprise Service Bus), Logic Apps, APIs, BI, Tableau, and DW to help form a cloud-based engineering team delivering platform automation and security.
  • Built data workflows with ETL, SSIS, BI, DW, Salesforce, AWS EMR, Data Lake, Redshift, Hadoop, Spark, Spark SQL, Scala, and Python
  • Experience working with Informatica MDM 10.2+, Informatica 360, Product 360
  • Extensive experience as a Looker Developer designing scalable solutions for real-time integration with a downstream application involving large volumes of data
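The EDI transaction work above (837 claims, 835 remittances, and related X12 sets) boils down to splitting an interchange into segments and elements. Below is a minimal, hedged sketch in plain Python, not any specific library's API; it assumes the common "~" segment terminator and "*" element separator (real interchanges declare their delimiters in the ISA segment, which a production parser must read first). The sample data is hypothetical.

```python
# Minimal sketch of splitting an X12 EDI transaction (e.g., an 837 claim)
# into segments and elements. Assumes "~" segment terminator and "*" element
# separator; real interchanges declare delimiters in the ISA segment.

def parse_x12(raw: str, seg_term: str = "~", elem_sep: str = "*") -> list[list[str]]:
    """Return a list of segments, each a list of element strings."""
    segments = []
    for seg in raw.strip().split(seg_term):
        seg = seg.strip()
        if seg:
            segments.append(seg.split(elem_sep))
    return segments


def find_segments(segments: list[list[str]], seg_id: str) -> list[list[str]]:
    """Pick out all segments with a given segment ID, e.g. 'CLM' claims."""
    return [s for s in segments if s and s[0] == seg_id]


if __name__ == "__main__":
    # Hypothetical fragment of an 837 transaction set.
    sample = "ST*837*0001~CLM*PATIENT123*125.00~SE*3*0001~"
    segs = parse_x12(sample)
    print(find_segments(segs, "CLM"))  # [['CLM', 'PATIENT123', '125.00']]
```

A real implementation would also validate loops and qualifiers against the relevant implementation guide; this sketch only shows the delimiter structure the resume's EDI bullets refer to.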

Confidential

Senior Lead Azure Architect

Responsibilities:

  • Configured, troubleshot, and supported the hosted PIM instance; created PIM instances and set up Informatica PIM with user access; worked with Octave, web services, WebLogic, MATLAB, and SAS.
  • Used PIM as a data store for multiple product data types; loaded data into PIM using out-of-the-box (OOTB) capabilities; hands-on with OOTB Informatica PIM APIs, data syndication, UI capabilities, and product data retrieval.
  • Made PIM repository changes, including adding new fields, enumerations, import/export field parameters, categories, and subentities; worked on items, variants, products, structures, and relationships. Worked as a Looker developer and MS Dynamics developer, with hands-on CRM development, Azure integration, and app logic development experience; developed APIs for cloud-to-cloud app interfaces using C# and JavaScript.
  • Experienced with the two major IDQ components, IDQ Analyst and IDQ Developer, and the IDQ Options; hands-on experience with PIM, Salesforce development, MS CRM, and the IDQ workbench for developing global, reusable data quality rules, and strong experience in data profiling using IDQ Analyst.
  • Worked on the Supplier Portal, PIM dashboards, Media Manager, and plugin customizations using the SDK and REST services; supported upgrades to the tools and technology used in the project.
  • Experienced as an established hands-on technical leader of Cloud Software Engineering teams.
  • Worked with ESB (Enterprise Service Bus), Logic Apps, APIs, and IBM DataStage as an Enterprise Data Warehouse Architect to design an enterprise data warehouse as a single-source repository for all sales and marketing data, including dashboards and ad hoc reports for clients and stakeholders.
  • Provided reports supplying the information needed for improved healthcare reporting.
  • Worked in ESB (Enterprise Service Bus), Logic Apps, APIs, Azure Cosmos DB, Databricks, and Event Hubs; reviewed and documented the existing SQL database design and proposed and implemented an architecture to migrate existing data from the data repository to an enterprise data warehouse.
  • Suggested design and action plans to review the existing platform and documented the baseline of the existing databases.
  • Designed an enterprise data warehouse that is scalable and accommodates the needs of the business.
  • Documented the proposed data warehouse and database architecture, including workflows, ERDs, the Data Dictionary (DD), and related artifacts.
  • Documented stored procedures in the existing data marts;
  • Performed Extract/Transform/Load (ETL) of the business data into data cubes and the data warehouse.
  • Designed a data warehouse that ingests data from different data sources and scales for future requirements.
  • Recommended and configured standard end user business Intelligence applications in developing reports and extracts, ad hoc queries and dashboards
  • Worked in IBM DataStage; designed enterprise data warehouses.
  • Worked with SQL Server Business Intelligence Architecture (BIA) top-down and bottom-up approach
  • Developed and designed Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) databases
  • Designed and implemented data extracts using replication, stored procedures, and SQL Server.
  • Worked with Service Oriented Architecture (SOA), Enterprise Service Bus (ESB), and Enterprise Application Integration (EAI)
  • Worked with Visual Studio, .NET, MS SQL Server, Oracle, Java, and Python
  • Created procedures and policies for data warehouses; installed, maintained, and administered Microsoft SQL Server.
  • Created requirements documents, systems architecture and interface specifications, data models, configuration management documents, and procedural manuals.
  • Worked on IBM DataStage; maintained database structures to support the Software Development Life Cycle (SDLC) phases.
  • Developed and designed Microsoft Azure SQL Data Warehouses; developed and maintained SQL Server 2014/2016, SQL Server Analysis Services (SSAS), SSIS, and SSRS.
  • Performed data manipulations using various Pentaho/Talend components such as tMap, tJavaRow, and tJava.
  • Worked to form a cloud based big data engineering team to deliver platform automation and security.
  • Built data workflows using GCP, HBase, Bigtable, BigQuery, AWS EMR, Spark, Spark SQL, Scala, and Python.
  • Worked with Tableau, Salesforce, IBM DataStage, Azure Data Factory pipelines, and other Azure data platform services to orchestrate management tasks using Azure Automation.
  • Worked with Azure Data Factory (ADF) to compose and orchestrate Azure data services.
  • Utilized Azure Data Factory to create, schedule, and manage data pipelines; worked with Ansible, Jenkins, Docker, Kubernetes, and DevOps automation.
  • Worked with EMR provisioning and updates through the service catalog using CloudFormation.
  • Worked with Hadoop cluster setup, Electronic Data Interchange (EDI), performance fine-tuning, monitoring, and administration.
  • Worked with Hadoop Eco System, Tableau, Hive, MapReduce YARN, Tez, Presto, Beeline, Pig, Spark, Scala
  • Worked with AWS CloudFormation to create a service catalog for launching EMR clusters with the desired setup.
  • Worked with data migration from S3 to Hive, AWS EMR, Ranger, and Hadoop technologies.
  • Experienced with IBM DataStage and PWX Informatica CDC 10.x project implementations, as well as Informatica BDM, PWX, B2B DT, and DX.
  • Worked with Oracle PL/SQL, Big Data, enterprise projects implementations
  • Worked with PWX Informatica CDC to stream data in real time, using the PWXCCL remote logger with the Apache Kafka distributed platform.
  • Created ETL packages, Tableau, Looker, reports, configured and installed tools with a highly available architecture
  • Designed, reviewed and fixed security vulnerabilities at network/subnet/security groups level
  • Created security standardized templates including password management strategy and implementation
  • Installed custom software and automated installation process
  • Developed required modifications of business logic in Data Mart and transition to Data Lake
  • Created thematic heat maps using MapInfo with Tableau, Looker, LookUp, and SSRS.
  • Worked with data migration from S3 to Hive and from Hadoop to S3; used Azure Data Factory pipelines and other Azure data platform services to orchestrate management tasks using Azure Automation.
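The CDC streaming bullets above (PWX Informatica CDC feeding Kafka) ultimately deliver ordered insert/update/delete records that a consumer applies to a target store. As a hedged illustration only, here is a dependency-free Python sketch of that downstream "apply changes" step; the record format (`op`, `id`, `data` keys) is hypothetical, not the actual PWX or Kafka wire format.

```python
# Illustrative sketch of applying ordered change-data-capture (CDC) records
# to an in-memory target table, the downstream half of a CDC-to-Kafka
# pipeline. The record schema here is a hypothetical simplification.

def apply_cdc(target: dict, changes: list[dict]) -> dict:
    """Apply insert/update/delete change records, keyed by 'id', in order."""
    for change in changes:
        op, key = change["op"], change["id"]
        if op in ("insert", "update"):
            target[key] = change["data"]
        elif op == "delete":
            target.pop(key, None)  # deleting a missing row is a no-op
        else:
            raise ValueError(f"unknown op: {op}")
    return target


if __name__ == "__main__":
    stream = [
        {"op": "insert", "id": 1, "data": {"name": "widget", "qty": 5}},
        {"op": "update", "id": 1, "data": {"name": "widget", "qty": 7}},
        {"op": "insert", "id": 2, "data": {"name": "gadget", "qty": 1}},
        {"op": "delete", "id": 2},
    ]
    print(apply_cdc({}, stream))  # {1: {'name': 'widget', 'qty': 7}}
```

Ordering matters: applying the same records out of order would leave a stale row, which is why real CDC consumers rely on Kafka's per-partition ordering guarantees.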

Confidential

Senior Lead Big Data/Hadoop

Responsibilities:

  • Loaded data into PIM using out-of-the-box (OOTB) capabilities; hands-on with OOTB Informatica PIM APIs, data syndication, UI capabilities, and product data retrieval.
  • Experienced working with the Informatica MDM UI, workflows, models, Hub, and Data Quality; modeled and architected complex multi-domain, scalable MDM implementations using the Informatica MDM 9.x/10.x stack, EBX5, and Stibo.
  • Experience working with Informatica MDM 10.2+, Informatica 360, Product 360
  • Extensive experience designing scalable solutions for real-time integration with a downstream application involving large volumes of data for MDM
  • Used PIM as a data store for multiple product data types.
  • Made PIM repository changes, including adding new fields, enumerations, import/export field parameters, categories, and subentities; worked on items, variants, products, structures, and relationships.
  • Managed structure groups and features and article attributes; merged data from supplier catalogs into the master catalog; created DQ rules and advanced SQL queries for interface and reporting purposes.
  • Experienced with the two major IDQ components, IDQ Analyst and IDQ Developer, and the IDQ Options; hands-on experience with the PIM and IDQ workbench for developing global, reusable data quality rules, and strong experience in data profiling using IDQ Analyst. Good understanding of the Supplier Portal, PIM dashboards, Media Manager, and plugin customizations using the SDK and REST services; supported upgrades to the tools and technology used in the project.
  • Worked in ESB (Enterprise Service Bus), Logic Apps, and APIs; reviewed and documented the existing SQL database design and proposed and implemented an architecture to migrate existing data from the data repository to an enterprise data warehouse.
  • Suggested design and action plans to review the existing platform and documented the baseline of the existing databases.
  • Designed an enterprise data warehouse that is scalable and accommodates the needs of the business.
  • Documented the proposed data warehouse and database architecture, including workflows, ERDs, the Data Dictionary (DD), S3-to-Hive and Hadoop-to-S3 data migrations, and related artifacts.
  • Documented stored procedures in the existing data marts;
  • Performed Extract/Transform/Load (ETL) of the business data into data cubes and the data warehouse.
  • Designed a data warehouse that ingests data from different data sources and scales for future requirements.
  • Recommended and configured standard end-user business intelligence applications for developing reports, extracts, ad hoc queries, and dashboards using Looker, Tableau, LookUp, Pentaho, and Talend.
  • Worked in Designing Enterprise Data Warehouses;
  • Worked with SalesForce Development, SQL Server Business Intelligence Architecture (BIA) top-down and bottom-up approach
  • Developed and designed Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) databases
  • Worked in ESB (Enterprise Service Bus), Logic Apps, and APIs; designed and implemented data extracts using replication, stored procedures, and SQL.
  • Created procedures and policies for data warehouses; installed, maintained, and administered Microsoft SQL Server.
  • Created requirements documents, systems architecture and interface specifications, data models, configuration management documents, and procedural manuals.
  • Maintained database structures to support the Software Development Life Cycle (SDLC) phases.
  • Developed and designed GCP, HBase, Bigtable, BigQuery, and Microsoft Azure SQL Data Warehouses; developed and maintained SQL Server 2016, SQL Server Analysis Services (SSAS), SSIS, and SSRS.
  • Worked with Pentaho, ETL, SSIS, Talend Open Studio, and the Talend Enterprise platform for data management.
  • Managed error handling, performance tuning, error logging, clustering, and high availability in Talend.
  • Worked with Salesforce and Informatica Cloud Data Integration to deliver accessible, trusted, and secured data, facilitating more valuable business decisions, identifying competitive advantages, serving customers better, and building an empowered workforce.
  • Worked with IBM DataStage, Azure Data Factory pipelines, and other Azure data platform services to orchestrate management tasks using Azure Automation.
  • Worked with Azure Data Factory (ADF), a strong SaaS solution for composing and orchestrating Azure data services; utilized ADF to create, schedule, and manage data pipelines alongside Ansible, Jenkins, Docker, Kubernetes, and CI/CD automation.
  • Worked as a Data Engineer to help form a cloud based big data engineering team to deliver platform automation and security.
  • Built data workflows by using AWS EMR, Spark, Spark SQL, Scala, and Python
  • Configured and installed tools with a highly available architecture, created ETL SSIS packages, VB, C#
  • Designed, reviewed and fixed security vulnerabilities at network/subnet/security groups level
  • Worked with Informatica Cloud for its flexible, scalable transformations and advanced capabilities, seamlessly integrating growing data volumes across disparate sources in the DWH using wizards, preconfigured templates, and out-of-the-box components; developed dashboard reports using Tableau; worked with Electronic Data Interchange (EDI).
  • Experienced with Tableau and PWX Informatica CDC 10.x project implementations, as well as Informatica BDM, PWX, B2B DT, and DX.
  • Worked with Oracle PL/SQL, Big Data, enterprise projects implementations
  • Worked with PWX Informatica CDC to stream data in real time, using the PWXCCL remote logger with the Apache Kafka distributed platform.
  • Used Tableau, Looker, Pentaho, Hadoop, and Spark Streaming APIs to perform the necessary transformations and actions on the fly, building a common learner data model that receives data from Kafka in near real time and persists it into Cassandra.
  • Worked with SalesForce Developer Tools, Azure Data Factory pipelines and other Azure Data Platform to orchestrate management tasks using Azure Automation, data migration S3 to Hive
  • Used Pentaho, Hadoop, Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
  • Developed Scala scripts and UDFs using DataFrames/SQL and RDD/MapReduce in Spark 1.6 for data aggregation and queries, writing data back into the OLTP system with Sqoop. Created and managed EMR provisioning and updates through the service catalog using CloudFormation.
  • Worked with Salesforce developer tools; set up, performance-tuned, monitored, and administered Hadoop clusters.
  • Configured, deployed, and maintained multi-node Dev and Test Kafka clusters.
  • Developed Spark scripts by using Scala shell commands as per the requirement.
  • Used Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
  • Loaded the data into Spark RDD and do in memory data Computation to generate the Output response.
  • Optimized existing algorithms in Hadoop using Spark Context, Spark SQL, DataFrames, and pair RDDs.
  • Worked on migrating Map Reduce programs into Spark transformations using Spark and Scala.
  • Worked with Hadoop/Hive/Big data to architect, design and build solutions to create dashboards/Data Visualizations
  • Utilized Hadoop HiveQL (HQL) development and performance tuning on full lifecycle implementations.
  • Designed and developed POCs in Spark using Scala to compare the performance of Spark with Hive and SQL/Oracle.
  • Worked with data migration from S3 to Hive and with the ETL/Electronic Data Interchange (EDI) interfacing components of the solution design and configuration activity.
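Several bullets above describe migrating MapReduce programs into Spark transformations. The core of that migration is re-expressing mapper/reducer logic as a flatMap → map → reduceByKey pipeline. As a hedged, dependency-free illustration, the sketch below mimics that pipeline shape in plain Python for the classic word count; the helper names mirror the Spark RDD API but are local functions, not Spark itself.

```python
# Pure-Python sketch of word count in the transformation-pipeline style used
# when migrating MapReduce jobs to Spark RDD operations
# (flatMap -> map -> reduceByKey). No Spark dependency; the helpers only
# mirror the shape of the RDD API.

from collections import defaultdict


def flat_map(func, data):
    """Apply func to each record and flatten the results (like RDD.flatMap)."""
    return [item for record in data for item in func(record)]


def reduce_by_key(func, pairs):
    """Combine values sharing a key with func (like RDD.reduceByKey)."""
    acc = defaultdict(list)
    for key, value in pairs:
        acc[key].append(value)
    out = {}
    for key, values in acc.items():
        total = values[0]
        for v in values[1:]:
            total = func(total, v)
        out[key] = total
    return out


def word_count(lines):
    words = flat_map(lambda line: line.split(), lines)   # flatMap stage
    pairs = [(w.lower(), 1) for w in words]              # map stage
    return reduce_by_key(lambda a, b: a + b, pairs)      # reduceByKey stage


if __name__ == "__main__":
    print(word_count(["spark and scala", "spark and hive"]))
    # {'spark': 2, 'and': 2, 'scala': 1, 'hive': 1}
```

In actual Spark the same logic becomes `rdd.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)`, with the shuffle between the map and reduce stages handled by the cluster rather than an in-memory dict.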
