
Sr. Data Integration Engineer Resume


AZ

SUMMARY:

  • 8+ years of comprehensive and in-depth experience in Information Technology with a strong background in database development and data warehousing. Expertise in ETL processes using Informatica PowerCenter 10.2.0/9.6.x/9.5.x/8.x and PowerExchange, including Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Metadata Manager, Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor.
  • Hands-on experience with ICS, IICS, and ICRT, building web services and business processes that invoke SOAP/REST calls.
  • Used the Address Doctor web service, invoked through a Web Services Consumer connection, to perform address validation and address splitting.
  • Used Postman and SoapUI for API testing.
  • Experience with data integration using Informatica Cloud to move Oracle CRM (OCRM) data into SQL Server.
  • Experience in designing error and exception handling procedures to identify, record and report errors.
  • Extensive experience in developing and performance tuning Informatica mappings.
  • Hands-on with Informatica's Pushdown Optimization and partitioning features.
  • Worked in Agile environments, creating and managing user stories.
  • Worked with multiple RDBMSs: SQL Server, Oracle, Teradata, Greenplum, and DB2.
  • Experience in writing complex queries and stored procedures.
  • Designed and supervised overall development of Data Mart and Oracle-hosted dimensional models.
  • Integrated with Marketing Cloud through the Web Services Consumer transformation using the retrieve, insert, update, and delete methods.
  • Technical expertise in relational databases such as Teradata and DB2, and in SQL, PL/SQL, stored procedures, functions, packages, indexes, subqueries, and performance tuning. Implemented data cleansing and the necessary test plans to ensure successful execution of data loading processes.
  • Intensive experience with Address Doctor, matching, de-duplication, and standardization.
  • Data cleansing, standardization, and matching using IDQ, integrated with Informatica PowerCenter.
  • Identified and eliminated duplicates in datasets through the IDQ 9.1 components Edit Distance, Jaro Distance, and Mixed Field Matcher, enabling a single view of customers and helping control mailing-list costs by preventing duplicate mailings.
  • Performed Match and merge configurations and used Merge Manager to assign the match rules and create the data ready for the merge process.
  • Used scheduling tools CA Autosys, Goanywhere and Tidal.
  • Extensively wrote Unix scripts to automate job scheduling, file watching, and mail notifications.
  • Good understanding of Salesforce, using Developer Console and Workbench for data manipulation.

TECHNICAL SKILLS:

Data warehousing: Informatica PowerCenter 10.2.0/9.6/9.5/9.1.0/8.6, Informatica PowerExchange, ICS, IICS, ICRT, Informatica Analyst, SSIS.

Data Modeling: Ralph Kimball Methodology, Bill Inmon Methodology, Snowflake, Star Schema, FACT Tables, Dimension Tables.

Databases: Greenplum, Microsoft SQL Server, Teradata, DB2, SQL, Oracle, PL/SQL.

Environment: Windows 2007/2003, Unix

Job Scheduling: Informatica Scheduler, Control M, Tidal, Goanywhere.

Others: Microsoft Word, Microsoft Excel, Outlook, XML, HTML, Informatica Data Analyst, Informatica Data Validation Option (DVO), HP ALM, and HP Quality Center.

Languages: SQL, PL/SQL, Unix shell, Apex

PROFESSIONAL EXPERIENCE:

Confidential, AZ

Sr. Data Integration Engineer

  • Worked with IICS and ICRT for integration of SFDC with Datawarehouse.
  • Used the Address Doctor process for address validation and address splitting.
  • Created mapping tasks and data synchronization/replication tasks to sync data from SFDC to the DW.
  • Performed reverse-engineering analysis of the IICS code to diagnose production issues.
  • Worked on JIRA production defects.
  • Designed and developed linear task flows to combine various integration tasks and run them in a specified order.
  • Used saved queries and Mapplet-PC import to upload a PowerCenter mapplet and reuse its transformation logic in a synchronization task or a mapping.
  • Designed and developed views in SQL Server and performed performance tuning.
  • Created clustered and non-clustered indexes in SQL Server to improve view performance.
  • Performed update/delete/insert using Workbench and Data loader.
  • Monitored bulk jobs in both Informatica Cloud and SFDC.
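
The indexing work described above can be sketched as follows; the table, column, and index names are illustrative, not from the original project:

```sql
-- Illustrative sketch only: object names are hypothetical.
-- A clustered index orders the table's rows physically by the key,
-- so range scans on AccountId become sequential reads.
CREATE CLUSTERED INDEX IX_Account_AccountId
    ON dbo.Account (AccountId);

-- A non-clustered index on the columns a view filters and joins on
-- lets the optimizer seek instead of scanning the base table.
CREATE NONCLUSTERED INDEX IX_Account_Status_ModifiedDate
    ON dbo.Account (Status, ModifiedDate)
    INCLUDE (AccountName);  -- covering column avoids key lookups
```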

Environment: IICS, ICRT, Informatica Powercenter 9.6, Informatica Cloud, Microsoft SQL Server Management Studio 11.0.2, SQL Server 12.0, Toad, Oracle SQL Developer, Unix.

Confidential, FL

Sr. Informatica Developer

  • Worked with Data migration from OCRM to SFDC using Informatica Cloud.
  • Created data replication and synchronization tasks for data migration from SQL Server to SFDC.
  • Designed and developed linear taskflows to combine various integration tasks and run them in a specified order.
  • Used saved queries and Mapplet-PC import to upload a PowerCenter mapplet and reuse its transformation logic in a synchronization task or a mapping.
  • Designed and developed views in SQL Server and performed performance tuning.
  • Created clustered and non-clustered indexes in SQL Server to improve view performance.
  • Created views used for production fixes.
  • Used Informatica scheduler to automate powercenter jobs.
  • Developed and executed unit test cases.
  • Provided quality documentation and status updates.
  • Worked extensively on production defects and support.
  • Performed updates/deletes/inserts using Workbench and Data Loader.
  • Monitored bulk jobs in both Informatica Cloud and SFDC.

Environment: Informatica Powercenter 9.6, Informatica Cloud(ICS), Microsoft SQL Server Management Studio 11.0.2, SQL Server 12.0, Oracle SQL Developer, Unix.

Confidential, Charlotte, NC

Sr. Informatica Developer

  • SFDC integration and data migration through Informatica.
  • Integration with Microsoft Dynamics.
  • Worked with integration with marketing cloud for user access management.
  • Extensively involved in end-to-end system testing to ensure the quality of the adjustments made to accommodate the source-system upgrade.
  • Prepared thorough detailed-design documentation for the production support department to use as a reference guide for future production runs before code migration.
  • Prepared unit test plans and efficient unit test documentation, along with unit test cases for the developed code.
  • Performed Match and merge configurations and used Merge Manager to assign the match rules and create the data ready for the merge process.
  • Created data backup jobs using replication task to backup entire source data.
  • Used Informatica Developer for Data standardization, data quality and data scrubbing.
  • Worked with Address validator for both Basic and advanced model for address validations.
  • Exported reusable IDQ mappings as mapplets to powercenter and deployed IDQ applications on servers.
  • Created reusable Match mapplets usable across multiple LOBs (lines of business).
  • Used IDQ for the data load from landing to stage tables for the MDM process.
  • SFDC integration for Data migration and integrations.
  • LDAP integration sync-up with SFDC user, profiles and roles.
  • Used the Java transformation to create XML from Expression transformation output.
  • Used Unix shell scripts to automate the workflows using pmcmd/pmrep commands.

Environment: Informatica Powercenter 9.6, Informatica Developer 10.2.0/9.6.1, Microsoft SQL Server Management Studio 11.0.2, SQL Server 12.0, Oracle SQL Developer, Unix, CA Autosys.

Confidential, Denver, Colorado

Sr. Informatica Developer

Responsibilities:

  • Analyzed and designed solutions, working closely with Business Analysts on requirement gathering.
  • Created stored procedures for one-time updates.
  • Worked with Policy and Quote tables for developing SAS Mart, SAS Individual and SAS Policy.
  • Created deployment groups in Tidal and created dependencies for data load.
  • Created clustered indexes and used temporary tables to improve performance.
  • Worked with Business Analyst to identify issues and design solution.
  • Performed a one-time migration of historical data and built a re-platform process for ongoing data loads.
  • Applied dedupe logic in mappings based on the requirements.
  • Developed metadata-driven mappings and tuned them for better performance.
  • Hands on experience working with SQL server database.
  • Extensively involved in end-to-end system testing to ensure the quality of the adjustments made to accommodate the source-system upgrade.
  • Prepared thorough detailed-design documentation for the production support department to use as a reference guide for future production runs before code migration.
  • Prepared unit test plans and efficient unit test documentation, along with unit test cases for the developed code.
  • Created detailed system defect records to keep the project team informed of status throughout the process.
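
The dedupe logic mentioned above can be sketched with a common SQL Server pattern: keep the most recent row per business key using ROW_NUMBER(), staged through a temp table. All object names here are hypothetical:

```sql
-- Hypothetical sketch of a dedupe pattern: keep the latest row per
-- (PolicyId, QuoteId) business key. Names are illustrative.
;WITH ranked AS (
    SELECT PolicyId, QuoteId, LoadDate,
           ROW_NUMBER() OVER (
               PARTITION BY PolicyId, QuoteId
               ORDER BY LoadDate DESC) AS rn
    FROM stg.PolicyQuote
)
SELECT PolicyId, QuoteId, LoadDate
INTO #dedup                          -- temp table for downstream joins
FROM ranked
WHERE rn = 1;

-- Clustered index on the temp table speeds up subsequent joins.
CREATE CLUSTERED INDEX IX_dedup ON #dedup (PolicyId, QuoteId);
```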

Environment: Informatica 9.6, Microsoft SQL Server Management Studio 11.0.2, SQL Server 12.0, Tidal.

Confidential, Jessup, PA

ETL Informatica Developer

Responsibilities:

  • Analyzed and created technical specs documents for all the mappings.
  • Performed a one-time migration of historical data and built a metadata-driven migration process for ongoing data loads.
  • Worked on FACETS Data tables and created audit reports using queries. Manually loaded data in FACETS and have good knowledge on FACETS business rules.
  • Developed metadata-driven mappings and tuned them for better performance.
  • Worked with SSIS tool for developing packages. Created metadata framework for data ingestion.
  • Hands on experience working with Greenplum database.
  • Involved in loading the data into Greenplum from SQL server database and Facets.
  • Hands-on experience with Greenplum external tables created using gpfdist.
  • Created generic function to create external and stage tables.
  • Worked on functions to load data into tables and created process-log tables for delta processing.
  • Export and import data from Microsoft SQL server into Greenplum database.
  • Extensively used TortoiseSVN to check in all documents.
  • Extensively worked on Deployment and release procedures in SIT, UAT and Prod environment.
  • Analyzed existing system and developed business documentation on changes required.
  • Used PowerShell scripts for archiving and renaming files.

Environment: Facets, GemfireXD, SQuirreL SQL Client 3.6, Microsoft SQL Server Management Studio 11.0.2, pgAdmin 1.16, Greenplum, HP ALM 12.00, Active Batch.

Confidential, South Portland, ME

ETL Informatica Developer

Responsibilities:

  • Worked on Informatica Power Center tool - Source Analyzer, Data warehousing designer, Mapping & Mapplet Designer and Transformation Developer.
  • Involved in requirement analysis, ETL design and development for extracting data from the source systems like Oracle, flat files, XML files and loading into data mart.
  • Converted functional specifications into technical specifications (design of mapping documents).
  • Developed complex mappings to load data from multiple source systems like Teradata, DB2, flat files and XML files to Data Mart in DB2 database.
  • Used Control-M to schedule jobs.
  • Extensively used various types of transformations such as Expression, Joiner, Update Strategy, Aggregator, Filter, Lookup, Sorter, Rank, and Router.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Worked on complex mapping related to Slowly Changing Dimensions.
  • Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.
  • Designed and Developed pre-session, post-session routines for Informatica sessions to drop and recreate indexes and key constraints for bulk loading.
  • Responsible for monitoring all the sessions that are running, scheduled, completed and failed.
  • Used HP Quality center for defects.
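
The pre-/post-session index pattern described above can be sketched in PL/SQL; the procedure, index, and table names are hypothetical:

```sql
-- Hypothetical sketch: drop an index before a bulk load (pre-session)
-- and recreate it afterwards (post-session). Names are illustrative.
CREATE OR REPLACE PROCEDURE drop_idx (p_index_name IN VARCHAR2) AS
BEGIN
    EXECUTE IMMEDIATE 'DROP INDEX ' || p_index_name;
EXCEPTION
    WHEN OTHERS THEN
        NULL;  -- index may already be absent; keep the call idempotent
END drop_idx;
/

CREATE OR REPLACE PROCEDURE create_idx AS
BEGIN
    EXECUTE IMMEDIATE
        'CREATE INDEX idx_fact_sales_dt ON fact_sales (sale_date)';
END create_idx;
/
```

Dropping indexes before a bulk load avoids per-row index maintenance; rebuilding once afterwards is typically much cheaper.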

Environment: Informatica Powercenter 9.5/9.1 (Powercenter Designer, Workflow Manager, Workflow Monitor), Informatica Data Quality 9.1, TOAD 10, Remedy 9.5, Teradata, DB2, UNIX, Oracle.
