Sr. ETL Developer, Salesforce.com/Cloud Data Migration Specialist Resume
Minneapolis, MN
EXPERIENCE SUMMARY:
- 7+ years of IT experience in the analysis, design, development, implementation, and troubleshooting of Data Mart / Data Warehouse applications using ETL tools, primarily Informatica PowerCenter, Informatica Cloud/MDM/IDQ/PowerExchange, SSIS, Talend, and other cloud tools such as Data Loader, Pervasive, and Relational Junction.
- Worked as a senior team member on multiple Data Warehousing and cloud data migration/integration engagements as an ETL Developer using different types of ETL tools.
- Extensive experience with Banking, Healthcare, Energy and Media domains.
- Extensive experience in Salesforce.com data migration using Informatica with the SFDC connector, Informatica PowerExchange, IDQ, Informatica Cloud, and Apex Data Loader.
- Extensive hands-on experience loading large volumes of data into Data Warehouses from several legacy systems by applying profiling, Master Data Management (MDM), and cleansing techniques.
- Experience using several open-source and cloud ETL tools such as Talend, Pervasive, Relational Junction, and Data Loader to support data migrations into multiple CRM applications such as Salesforce.com, Siebel, and Microsoft Dynamics CRM.
- Develop scalable, reusable ETL jobs using the ETL tool selected for each engagement.
- Utilize Integration Services to construct enterprise-level ETL procedures, ensuring data integrity and proper data transformations when loading data into the target data structure.
- Identify data quality issues, support data governance initiatives, and develop data profiling solutions using Informatica IDQ/MDM.
- Involved in all phases of the data warehouse project life cycle. Designed and developed ETL architectures to load data from sources such as DB2, Oracle, flat files, XML files, Sybase, and MS SQL Server into Oracle, Teradata, XML, and SQL Server targets.
- Excellent knowledge of Informatica administration; involved in grid management, creation and upgrade of repository contents, and creation of folders and users and their permissions.
- Design, develop, and support application solutions with a focus on Teradata architecture and its utilities such as FastLoad, MultiLoad, FastExport, TPump, and BTEQ.
- Expertise in implementing complex business rules by creating robust mappings, mapplets, shortcuts, and reusable transformations, using transformations such as Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
- Good understanding of relational database management systems such as Oracle and SQL Server; extensively worked on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems.
- Hands-on experience identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, and sessions. Expertise in session partitioning and in tuning session and lookup/aggregator/joiner caches for performance.
- Hands-on experience performing and supporting unit testing, system integration testing, UAT, and production support for issues raised by application users.
- Strong in UNIX shell scripting. Developed UNIX scripts using the pmcmd utility and scheduled ETL loads using utilities like Autosys (see the sketch after this list).
- Excellent technical and professional client interaction skills. Interacted with technical, functional, and business audiences across different phases of the project life cycle.
- Enthusiastic about and quick to adopt new technologies, with strong technical, analytical, problem-solving, and presentation skills.
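For illustration, a minimal sketch of the kind of pmcmd wrapper script a scheduler such as Autosys would invoke; the domain, Integration Service, folder, workflow names, and the credential environment variables are hypothetical placeholders, not values from any actual engagement:

    #!/bin/sh
    # run_workflow.sh - minimal pmcmd wrapper invoked by the scheduler.
    # All names below (domain, service, folder, workflow) are placeholders.
    INFA_DOMAIN="Domain_ETL"      # hypothetical Informatica domain
    INFA_SERVICE="IS_ETL"         # hypothetical Integration Service
    INFA_FOLDER="DW_LOADS"        # hypothetical repository folder
    WORKFLOW="wf_daily_load"      # hypothetical workflow name

    # Start the workflow and block until it finishes, so the scheduler
    # can read the exit code and mark the job success or failure.
    pmcmd startworkflow \
        -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
        -uv INFA_USER -pv INFA_PASSWD \
        -f "$INFA_FOLDER" -wait "$WORKFLOW"
    rc=$?
    [ $rc -ne 0 ] && echo "Workflow $WORKFLOW failed with exit code $rc" >&2
    exit $rc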
TECHNICAL SUMMARY:
ETL Tools: Informatica PowerCenter 8.x/9.x, Informatica Cloud/IDQ/MDM, SSIS, Informatica PowerExchange, Talend, Pervasive, Salesforce Data Loader, Relational Junction, OBIEE
Reporting Tools: Cognos, Business Objects, SSRS
Programming Skills: Shell Scripting, SQL, PL/SQL
Databases: Oracle 10g/11g, SQL Server 2012/2008, Teradata V2R4/V2R5, DB2, DB2 BLU
Methodologies: Data Modeling (Logical, Physical), Dimensional Modeling (Star/Snowflake)
Operating Systems: UNIX (Sun - Solaris, HP/UX, IBM AIX), LINUX, Windows NT/XP
PROFESSIONAL EXPERIENCE:
Confidential, MINNEAPOLIS, MN
Sr. ETL Developer, Salesforce.com/Cloud Data Migration Specialist
Environment: Informatica PowerCenter 9.6.1, Informatica MDM/IDQ/Cloud, SSIS, SUSE Linux, PowerExchange for Salesforce.com with Bulk API, DB2, Oracle 11g, MS SQL Server 2012, Salesforce Data Loader, Autosys.
Responsibilities:
- Working with the business to gather requirements and prepare requirements documents per bank standards.
- Created mapping specifications and unit test cases per the requirements.
- Worked with delimited and fixed-width flat files and loaded data into Salesforce.com using direct and indirect methods.
- Developed IDQ data profiling and cleansing mappings in Informatica Developer.
- Used Informatica MDM to implement a Customer 360 view across all of the company's CRM platforms, such as Salesforce.com, Siebel, and Microsoft Dynamics CRM.
- Extensively used the Salesforce Bulk API to load 27 million customer records in less than 6 hours.
- Used the Salesforce pipeline Lookup transformation to look up directly against Salesforce objects.
- Used pass-through partitioning to improve job runtime during data normalization.
- Developed UNIX scripts for splitting/sorting files, masking data, and running workflows from the command line (see the file-preparation sketch after this list).
- Used various cloud data migration tools for quick ad-hoc loads into Salesforce.com.
- Extensively worked with different transformations such as Aggregator, Expression, Router, Filter, Lookup, and Sorter.
- Created reusable mapplets and transformations.
- Created user-defined functions and called them in multiple mappings.
- Used mapping variables to remove duplicates based on certain fields.
- Created command tasks to invoke UNIX scripts from Informatica.
- Worked with workflow variables to pass values to the Command task rather than passing hard-coded values.
- Used the upsert operation in Informatica to load data into Salesforce.com using a unique external ID.
- Created UNIX scripts to invoke the jobs using pmcmd and automated them with the Autosys scheduler.
- Made use of post-session success and post-session failure commands in the Session task to execute scripts needed for clean-up and update purposes.
- Implemented performance tuning on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
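As an illustration of the file-preparation step referenced above, a sketch that sorts a large delimited extract on its external-ID column, drops duplicate keys, and splits the result into chunks sized for parallel Bulk API batches; the file paths, key column position, and chunk size are all assumptions:

    #!/bin/sh
    # prep_extract.sh - sort, de-duplicate, and split a large extract
    # before loading. Paths, column positions, and sizes are illustrative.
    SRC=/data/in/customers.csv    # hypothetical source extract
    WORK=/data/work

    # Keep the header aside so each split file can be given one back.
    head -1 "$SRC" > "$WORK/header.csv"

    # Sort the body on the external-id column (field 1 here) and drop
    # duplicate keys before the load.
    tail -n +2 "$SRC" | sort -t',' -k1,1 -u > "$WORK/body.csv"

    # Split into ~500k-line chunks for separate Bulk API batches.
    split -l 500000 "$WORK/body.csv" "$WORK/customers_part_"

    # Re-attach the header to every chunk.
    for f in "$WORK"/customers_part_*; do
        cat "$WORK/header.csv" "$f" > "$f.csv" && rm "$f"
    done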
Confidential, DALLAS, TX
Sr. ETL Developer
Environment: Informatica PowerCenter 8.6.1, Informatica Data Analyzer/IDQ/MDM, Talend, Oracle Database 11g, MS SQL Server 2008 R2, MySQL, Linux 2.6.18.
Responsibilities:
- Prepared Mapping specifications and wrote Unit Test cases for end-to-end data validation.
- Designed and developed ETL mappings using transformation logic for extracting the data from SQL Server and Oracle.
- Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator, SQL transformations.
- Developed ETL solutions using the open-source tool Talend before the migration to Informatica.
- Used Type 1 and Type 2 slowly changing dimension methodologies to capture historical data according to business rules.
- Created user-defined functions and used them in multiple expressions in the mappings.
- Created worklets, workflows, and reusable and non-reusable tasks such as Session, Command, Email, and Assignment.
- Automated the load process using UNIX shell scripts and dynamically assigned the partition ranges to the parameter file (see the parameter-file sketch after this list).
- Identified bottlenecks in targets, sources, mappings, and sessions using performance tuning techniques.
- Involved in writing PL/SQL code in Oracle stored procedures, functions and packages to support applications.
- Developed UNIX scripts for pre-load data manipulation.
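As a sketch of the dynamic partition-range assignment mentioned above: query the current key range from Oracle, then write the bounds into an Informatica parameter file for two pass-through partitions. The table, column, folder/workflow/session names, and the connection environment variables are hypothetical:

    #!/bin/sh
    # gen_param_file.sh - write partition ranges into a parameter file
    # at run time. All object names and credentials are placeholders.
    PARAM_FILE=/infa/params/wf_daily_load.param

    # Pull the current key range from Oracle via a silent sqlplus session.
    RANGE=$(sqlplus -s "$ORA_USER/$ORA_PASSWD@$ORA_SID" <<'EOF'
    set heading off feedback off pagesize 0
    select min(order_id) || ',' || max(order_id) from stg_orders;
    exit
    EOF
    )
    MIN_ID=$(echo "$RANGE" | cut -d',' -f1)
    MAX_ID=$(echo "$RANGE" | cut -d',' -f2)
    MID_ID=$(( (MIN_ID + MAX_ID) / 2 ))

    # Emit mapping parameters for two pass-through partitions.
    cat > "$PARAM_FILE" <<EOF
    [DW_LOADS.WF:wf_daily_load.ST:s_m_load_orders]
    \$\$PART1_FROM=$MIN_ID
    \$\$PART1_TO=$MID_ID
    \$\$PART2_FROM=$(( MID_ID + 1 ))
    \$\$PART2_TO=$MAX_ID
    EOF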
Confidential, DALLAS, TX
Sr. ETL Developer
Environment: Informatica PowerCenter 8.6.1 with SFDC Connector, Apex Explorer 8.0, Apex Data Loader 17.0, Relational Junction, Pervasive, Teradata, Oracle Database 10g, MS SQL Server, Mainframe, IBM AIX 5.2, 5.3, AUTOSYS (Batch Scheduler)
Responsibilities:
- Member of the core ETL team involved in gathering requirements, performing source system analysis, and developing ETL jobs to migrate data from the source to the target DW hosted on Teradata.
- Understanding the existing Salesforce implementation and the ER model of the Salesforce schema.
- Analyzed the business requirements document and created a functional requirements document mapping all the business requirements.
- Designed and developed ETL mappings using transformation logic to extract data from various source systems.
- Used various cloud data migration tools for quick ad-hoc loads into Salesforce.com.
- Created Teradata external loader connections such as MLoad Upsert, MLoad Update, FastLoad, and TPump in the Informatica Workflow Manager while loading data into target tables in the Teradata database (see the FastLoad sketch after this list).
- Involved in the design and development of complex ETL mappings and stored procedures in an optimized manner. Used PowerExchange for mainframe sources.
- Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator, SQL transformations.
- Created mapplets and reusable transformations.
- Used mapping parameters and variables.
- Automated the load process using UNIX shell scripts and dynamically assigned the partition ranges to the parameter file.
- Used parallel processing capabilities, session partitioning, and target table partitioning utilities.
- Developed processes for error handling and automatic reloading.
- Created reusable objects in Informatica for easy maintainability and reusability.
- Extensively used debugger to trace errors in the mapping.
- Involved in developing test plans and test scripts to test the data based on the business requirements.
- Developed UNIX scripts for pre-load data manipulation.
- Supported migration of ETL code from development to QA and then from QA to production.
- Provided 24/7 on-call support for daily batch jobs.
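As an illustration of how such Teradata loads can be driven from the shell for ad-hoc reloads, a FastLoad wrapper sketch; the TDP id, credentials, table, and file names are placeholders, and error tables from a previous run are assumed to have been dropped in a separate step (not shown):

    #!/bin/sh
    # fload_accounts.sh - drive a Teradata FastLoad from the shell.
    # Logon string, table, and file names are illustrative only.
    fastload <<'EOF'
    LOGON tdprod/etl_user,etl_password;

    SET RECORD VARTEXT ",";
    DEFINE acct_id   (VARCHAR(18)),
           acct_name (VARCHAR(120))
    FILE = /data/work/accounts.csv;

    BEGIN LOADING stg_accounts
        ERRORFILES stg_accounts_err1, stg_accounts_err2;

    INSERT INTO stg_accounts (acct_id, acct_name)
    VALUES (:acct_id, :acct_name);

    END LOADING;
    LOGOFF;
    EOF
    rc=$?
    [ $rc -ne 0 ] && echo "FastLoad finished with return code $rc" >&2
    exit $rc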
Confidential
Asst. System Engineer/ETL Developer
Environment: Informatica PowerCenter 8.1.1, Cognos ReportNet 1.1, MS SQL Server 2005, Oracle 9i, Teradata, IBM AIX 4.3.3/5.1, Windows XP, UNIX.
Responsibilities:
- Understanding the existing business model and customer requirements.
- Studying the business logic for different custom objects in the Salesforce environment.
- Understanding the data architecture of the legacy system and integrating/automating the data load from the legacy system using ETL tools.
- Design, develop, and support application solutions with a focus on Teradata architecture and its utilities such as FastLoad, MultiLoad, FastExport, TPump, and BTEQ.
- Designed and developed mappings and mapplets using the Informatica Source Analyzer, Warehouse Designer, Transformation Developer, and Mapplet Designer.
- Created and developed numerous mappings using Expression, Router, Joiner, Lookup, Update Strategy, Stored Procedure, and other transformations.
- Extensively worked with different active and passive transformations, generating, modifying, and processing data.
- Extraction, transformation, and loading of data were carried out from sources such as MS Access, flat files, MS SQL Server, and Oracle.
- Created stored procedures, triggers, and functions to support the data load to the Data Warehouse.
- Involved in the design and development of a multidimensional star schema.
- Involved in the creation of various snapshots, including transaction-level, periodic, and accumulating snapshots.
- Designed and developed pre- and post-session routines and batch execution routines (see the post-session sketch after this list).
- Used session partitions, dynamic cache memory, and index caches to improve performance.
- Used the debugger and breakpoints to view transformation output and debug mappings.
- Implemented different tasks in workflows, including Session, Command, Decision, Timer, Assignment, Event-Wait, Event-Raise, and Email tasks.
- Wrote UNIX shell scripts to invoke the sessions using the pmcmd command.
- Created reports using ReportNet functionality such as cross-tabs and master-detail.
- Customized reports using breaks, filters, and sorts.
- Involved in the creation of unit and system test cases.
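As an illustration of a post-session routine of the kind referenced above, a sketch that archives processed source files and appends a row to a load-audit log; the directory layout and audit format are assumptions:

    #!/bin/sh
    # post_session.sh - post-session success routine: archive processed
    # source files and record them in a load-audit log. Paths are placeholders.
    SRC_DIR=/data/in
    ARCH_DIR=/data/archive
    AUDIT_LOG=/data/logs/load_audit.log

    STAMP=$(date +%Y%m%d_%H%M%S)
    for f in "$SRC_DIR"/*.csv; do
        [ -f "$f" ] || continue    # nothing to archive
        gzip -c "$f" > "$ARCH_DIR/$(basename "$f").$STAMP.gz" && rm "$f"
        echo "$STAMP|$(basename "$f")|archived" >> "$AUDIT_LOG"
    done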