
Informatica Developer Resume


Reston, VA

PROFILE SUMMARY:

  • Around 9+ years of experience in the development of data warehousing solutions using Informatica PowerCenter 7.1/8.1/8.6/9.1/9.6 (ETL tool), Informatica PowerExchange and Informatica Data Quality, with Oracle, SQL Server, Teradata and IBM DB2 databases.
  • Worked on all phases of the data warehouse development lifecycle: analysis, effort estimation, ETL design, coding, testing, implementation, support of new/existing applications and version upgrade projects.
  • Worked extensively on OLAP data modeling (star schema, snowflake schema, Data Vault) using tools like Erwin, and on ER data modeling for OLTP database systems.
  • Extensive experience in developing Extract, Transform and Load (ETL) processes to load data from heterogeneous source systems such as flat files (fixed-width and delimited), XML files, Excel, Oracle, Mainframe, IBM DB2, Teradata and MS SQL Server into warehouse systems.
  • Extensively used transformations such as Source Qualifier, Expression, Filter, Router, Aggregator, Rank, Joiner, Lookup, Normalizer, XML Parser/Generator, Update Strategy, Java, Transaction Control and Stored Procedure to implement complex business logic.
  • Worked extensively with slowly changing dimensions.
  • Experience in using Informatica command-line utilities such as pmcmd to execute workflows from scripts.
  • Extensively involved in fine-tuning Informatica code (mappings and sessions), stored procedures and SQL to obtain optimal performance and throughput.
  • Experience in the design and implementation of Informatica Data Quality (IDQ v9.1): data profiling, matching/removing duplicate data, quality rules, and implementation patterns with cleanse, parse, standardization, validation and scorecards.
  • Experience with the Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport, TPT and SQL Assistant.
  • Experience in developing PL/SQL procedures and functions and in performance-tuning queries.
  • Developed and executed UNIX shell scripts for file transfer (FTP/SFTP), emailing, automation, creating/updating parameter files, archiving source files, purging archived files and creating indicator files.
  • Very good knowledge of and experience with B2B integration and Informatica Data Integration Hub (DIH).
  • Leadership qualities, ability to handle projects independently, detail oriented and a strong team member.
  • Extensive domain knowledge in the financial/banking and insurance domains.
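
As a concrete illustration of the pmcmd usage mentioned above, the sketch below assembles a typical workflow start command from a shell script. The service, domain, user, folder and workflow names are placeholders, not taken from any project described here.

```shell
#!/bin/sh
# Illustrative sketch: launching an Informatica workflow with pmcmd.
# All names are placeholders; the password is read from the environment
# variable named after -pv rather than passed in clear text.
INFA_SVC="IS_DEV"                 # integration service (placeholder)
INFA_DOMAIN="Domain_DEV"          # domain name (placeholder)
INFA_USER="etl_user"              # repository user (placeholder)
FOLDER="DWH_LOADS"                # repository folder (placeholder)
WORKFLOW="wf_load_customer_dim"   # workflow to run (placeholder)

CMD="pmcmd startworkflow -sv $INFA_SVC -d $INFA_DOMAIN -u $INFA_USER -pv INFA_PWD -f $FOLDER -wait $WORKFLOW"
echo "$CMD"   # a scheduler wrapper would execute this command
```

With -wait, pmcmd blocks until the workflow completes and returns a nonzero exit code on failure, which lets the calling scheduler detect failed loads.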

TECHNICAL SKILLS:

Operating System: Windows XP, 2000, 2003, Linux and Sun Solaris

ETL Tools and Scripting Languages: Informatica PowerCenter 7.x, 8.1, 8.6, 9.1, 9.6 and 10.1, Informatica PowerExchange, Informatica Data Quality, SQL, UNIX Shell/Windows Batch and PL/SQL

Databases: Oracle, SQL Server, Teradata and IBM DB2

Other Languages: Core Java, C/C++ and XML

Version Control: CVS, SVN, Informatica version control.

Scheduling Tools: Informatica Scheduler, cron, AutoSys, Maestro and Control-M

Reporting Tools: MicroStrategy and Tableau

Tools: Informatica, Teradata SQL Assistant, Toad, SQL Developer, WinSCP, SQLite, PuTTY and ALM

PROFESSIONAL EXPERIENCE:

Confidential, Reston, VA

Informatica Developer

Responsibilities:

  • Analyzed user stories based on requirements using Agile methodology.
  • Actively participated in daily Scrum status meetings.
  • Worked with requirements analysts to interpret technical requirements for design and build using Agile development principles and practices.
  • Prepared solution design documents based on the new proposed system.
  • Participated in requirements gathering for data transformation between source and target.
  • Designed, developed, tested, debugged, implemented and maintained ETL workflows, mappings, scripts and stored procedures to support the data warehouse.
  • Tuned Informatica session performance for large data files by increasing block size, data cache size, sequence buffer length and the target-based commit interval.
  • Created complex mappings using Unconnected Lookup, Aggregator and Router transformations to populate target tables efficiently.
  • Created AutoSys JIL files to execute ETL (Extract, Transform, Load) jobs at specific times.
  • Provided technical guidance to team members on various ETL issues.
  • Performed in-depth data quality analysis of source-to-target mappings.
  • Tracked, reviewed and analyzed defects.
  • Created events and tasks in workflows using Workflow Manager.
  • Developed Informatica mappings and tuned them for better performance.
  • Worked on JMS, JNDI and SOAP to vend data (via TIBCO) in the form of XML to downstream applications.
  • Created and maintained documentation of the QA processes using Cucumber.
  • Prepared release documents containing all production deployment instructions.
  • Involved in unit and system testing of ETL code (mappings and workflows).
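
The AutoSys JIL usage mentioned above can be sketched as follows; the job name, machine, owner, schedule and file paths are illustrative placeholders only.

```shell
#!/bin/sh
# Illustrative sketch: generating an AutoSys JIL definition that runs an
# ETL wrapper script on a weekday schedule. All names are placeholders.
JIL_FILE=/tmp/wf_load_sales.jil
cat > "$JIL_FILE" <<'EOF'
insert_job: wf_load_sales   job_type: c
command: /apps/etl/bin/run_wf.sh wf_load_sales
machine: etlhost01
owner: etluser
start_times: "02:00"
days_of_week: mo,tu,we,th,fr
std_out_file: /apps/etl/logs/wf_load_sales.out
std_err_file: /apps/etl/logs/wf_load_sales.err
alarm_if_fail: 1
EOF
# A real deployment would load the definition with: jil < "$JIL_FILE"
echo "JIL written to $JIL_FILE"
```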

Environment: Informatica PowerCenter, Informatica MDM, AutoSys, Oracle, Netezza, PuTTY and JIRA.

Confidential, West bend, WI

Informatica Developer

Responsibilities:

  • Analyzed requirements and prepared a high-level design of the new proposed system, documented with functional information and process flows.
  • Validated data integration by developing and executing test plans and scenarios, including data design, tool design and data extract/transform.
  • Led the offshore team during the coding and testing phases of development activities.
  • Provided technical guidance to team members on various ETL issues and provided ETL monitoring support from onshore.
  • Developed various mappings to load data from various sources using transformations such as Router, Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Expression and Sequence Generator to generate a data feed for HSB.
  • Worked on the MFT B2B file transfer process.
  • Set up a new mailbox with PGP encryption using advanced options in MFT.
  • Tested file-transfer connectivity via FTP through MFT with the downstream vendor (HSB).
  • Involved in unit and system testing of ETL code (mappings and workflows).
  • Tracked, reviewed and analyzed defects.
  • Defined pre/post-migration and rollback strategies.
  • Created and documented ETL test plans, test cases, test scripts, expected results, assumptions and validations.
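
A connectivity check of the kind described above can be sketched with an sftp batch file. The host and user are placeholders, and the live call is commented out so the sketch has no network dependency.

```shell
#!/bin/sh
# Illustrative sketch: preparing an SFTP connectivity check toward a
# downstream vendor endpoint. Host and user are placeholders.
SFTP_HOST="vendor.example.com"
SFTP_USER="feed_user"
BATCH_FILE=$(mktemp)
printf 'ls\nbye\n' > "$BATCH_FILE"   # harmless commands: list and quit

# Live check (placeholder endpoint):
#   sftp -b "$BATCH_FILE" "$SFTP_USER@$SFTP_HOST" && echo "connection OK"

CHECK_CMD="sftp -b $BATCH_FILE $SFTP_USER@$SFTP_HOST"
echo "$CHECK_CMD"
rm -f "$BATCH_FILE"
```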

Environment: Informatica, SQL Server, Windows, Control-M, SQL Assistant and MFT

Confidential

Sr. Informatica Developer

Responsibilities:

  • Understood the existing environment architecture of 17 applications and performed impact analysis.
  • Prepared high-level estimates based on the impact analysis.
  • Worked on strategies for upgrading from Informatica 8.1 to 9.6.
  • Migrated configurations and data to the new environments (Informatica 9.6).
  • Prepared each application's upgrade task list as a product backlog and delegated work to team members.
  • Extracted data from various heterogeneous sources such as Oracle, SQL Server, Sybase, MS Access and flat files.
  • Developed various mappings to load data from various sources using transformations such as Router, Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Expression and Sequence Generator to generate feeds for downstream applications.
  • Migrated the FTP process to SFTP in each application's shell scripts using public/private-key passwordless authentication.
  • Extracted mainframe files using Informatica PowerExchange (data maps) and transformed them to be loaded into DW tables.
  • Extensively worked on tuning Informatica mappings to increase performance.
  • Analyzed and fixed migration and data issues.
  • Involved in unit and system testing of ETL code (mappings and workflows).
  • Tracked, reviewed and analyzed defects.
  • Defined pre/post-migration and rollback strategies.
  • Migrated Informatica PowerCenter mappings and code/folders from one environment to another as part of release management.
  • Worked closely with the production control team to schedule shell scripts, Informatica workflows and PL/SQL code in the Control-M scheduler.
  • Actively participated in daily Scrum status meetings.
  • Created and documented ETL test plans, test cases, test scripts, expected results, assumptions and validations.
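
The FTP-to-SFTP migration with passwordless key-based authentication mentioned above can be sketched as follows. The remote host, key path and feed path are placeholders, and the one-time setup commands are commented out to keep the sketch side-effect free.

```shell
#!/bin/sh
# Illustrative sketch: key-based (passwordless) SFTP transfer replacing FTP.
# Host, key path and feed path are placeholders.
REMOTE="etl_user@target.example.com"
KEY="$HOME/.ssh/id_rsa_etl"

# One-time setup:
#   ssh-keygen -t rsa -b 4096 -N '' -f "$KEY"   # passphrase-less key pair
#   ssh-copy-id -i "$KEY.pub" "$REMOTE"         # install public key remotely

# Per-run transfer via a batch file; no interactive password prompt needed.
BATCH=$(mktemp)
printf 'put /data/out/feed.dat /inbound/\nbye\n' > "$BATCH"
TRANSFER_CMD="sftp -i $KEY -b $BATCH $REMOTE"
echo "$TRANSFER_CMD"
rm -f "$BATCH"
```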

Environment: Informatica PowerCenter, Informatica PowerExchange, Oracle, UNIX, Windows, Control-M, SQL Assistant, WinSCP and PuTTY.

Confidential, Phoenix, AZ

Sr. ETL Developer

Responsibilities:

  • Analyzed requirements and prepared requirement documents per Confidential standards.
  • Prepared a high-level design of the new proposed system based on the ETL model, documented with functional information and process flows.
  • Involved in the design, development and implementation of the enterprise data warehouse (EDW) process.
  • Created Visio diagrams of the ETL process to include in the technical transformation documents.
  • Developed various mappings to load data from various sources using transformations such as Router, Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Expression and Sequence Generator to create output feeds for downstream applications.
  • Developed Informatica Type 1 and Type 2 mappings based on the requirements.
  • Involved in analyzing requirements and in designing and developing complex solutions.
  • Extracted data from mainframe systems and loaded it into Teradata.
  • Prepared HLD and LLD documents per business requirements.
  • Extensively worked with sources such as flat files, CSV files, spreadsheets and DB2 databases.
  • Prepared Sybase/Teradata stored procedures to drive the DDOEE email system based on conditions.
  • Worked with MultiLoad, BTEQ, FastExport and FastLoad to load feed data into the data warehouse.
  • Analyzed primary, secondary, PPI and join indexes, taking into consideration both planned data access and even distribution of data across all available AMPs.
  • Worked with SET and MULTISET tables for performance evaluation of the scripts.
  • Used the BTEQ and Teradata SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Wrote UNIX shell scripts to run stored procedure snippets in Sybase to make the process more flexible.
  • Performed data quality analysis on source data using the Informatica Analyst tool (IDQ) to assess completeness, conformity and consistency, creating profiles and rules to identify and prioritize data quality issues.
  • Analyzed and consolidated business requirements across a multitude of data sourcing projects and used Informatica Data Quality (IDQ) for profiling, analyzing and scorecarding data.
  • Participated in weekly meetings to discuss status, issues and defects detected during the different stages of application testing.
  • Tested all mappings and sessions in the development and UAT environments and migrated them to the production environment after successful validation.
  • Prepared a rollout plan, transferred knowledge to the production team, and supported the application during the warranty period.
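
The SET vs. MULTISET distinction mentioned above can be sketched in a BTEQ wrapper of the kind used for these loads. The logon string, schema and table names are placeholders; bteq is not invoked here, the script only generates its input as a dry run.

```shell
#!/bin/sh
# Illustrative sketch: a shell wrapper generating a BTEQ script.
# Logon string, schema and table names are placeholders; the password
# would come from the TD_PWD environment variable at run time.
BTEQ_SCRIPT=/tmp/create_stage.bteq
cat > "$BTEQ_SCRIPT" <<EOF
.LOGON tdprod/etl_user,$TD_PWD
-- MULTISET permits duplicate rows; a SET table silently enforces row
-- uniqueness, and its duplicate-row check can slow bulk loads.
CREATE MULTISET TABLE stg.sales_feed (
  sale_id  INTEGER,
  sale_amt DECIMAL(12,2),
  sale_dt  DATE
) PRIMARY INDEX (sale_id);

.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
EOF
# Real run: bteq < "$BTEQ_SCRIPT"
echo "BTEQ script written to $BTEQ_SCRIPT"
```

The .IF/.QUIT sequence surfaces a nonzero return code to the calling scheduler when the DDL fails, mirroring the conditional-abort pattern used in the load jobs above.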

Environment: Informatica PowerCenter, PowerExchange, Informatica Data Quality, Teradata, Sybase, UNIX, Windows, Mainframes, CVS, SQL Assistant and Event Engine.

Confidential

Sr. ETL Developer

Responsibilities:

  • Analyzed requirements and prepared the K245 requirement document per Confidential standards.
  • Prepared a high-level design of the new proposed system based on the ETL model, documented with functional information and process flows.
  • Prepared HLD and LLD documents per business requirements.
  • Extensively worked with sources such as flat files, CSV files, spreadsheets and DB2 databases.
  • Handled Thai and Japanese characters using Informatica encoding standards.
  • Developed various mappings to load data from various sources using transformations such as Router, Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Expression and Sequence Generator to create output feeds for downstream applications.
  • Implemented slowly changing dimensions (Type 1 and Type 2) in different mappings per the requirements.
  • Used the Debugger to validate mappings and gain troubleshooting information about data and error conditions.
  • Developed various mapplets and transformations and was responsible for validating and fine-tuning the ETL logic coded into mappings to improve data quality.
  • Involved in unit testing of the mappings and SQL code.
  • Worked on Informatica B2B Data Transformation.
  • Implementation experience with Informatica Data Transformation and Data Exchange in handling XML, HTML, PDF and flat-file data.
  • Wrote UNIX shell scripts to run workflows, back up incoming files, delete backup files older than 180 days and upload files through SFTP by calling Java code.
  • Handled PGP encryption standards using Java code in shell scripts, encrypting/decrypting incoming/outgoing files to maintain a secure process per Confidential standards.
  • Worked with production support systems that required immediate support and frequent communication with the business teams.
  • Involved in functional unit testing and integration testing.
  • Analyzed and consolidated business requirements across a multitude of data sourcing projects and used Informatica Data Quality (IDQ) for profiling data.
  • Made extensive use of IDQ to improve the quality and accuracy of important fields, writing and executing IDQ plans for standardization.
  • Involved in all aspects of the software development life cycle.
  • Actively updated all project-related documents in SharePoint.
  • Involved in support, maintenance, enhancements and development.
  • Monitored jobs on a daily basis and reran failed jobs.
  • Prepared shell scripts for scheduling jobs through the Control-M scheduler.
  • Deployed code from development to QA and from QA to production using deployment groups.
  • Understood client requirements and prepared new mappings using various transformations such as Router, Normalizer and Update Strategy.
  • Developed the interface, including shell scripts to automate the ETL process, and built mappings, sessions and workflows.
  • Maintained load statistics for all data loads running in production.
  • Reviewed Informatica scripts before moving them into the production environment.
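
The PGP handling above was done through Java code called from shell scripts; an analogous command-line approach using the GnuPG CLI can be sketched as follows. The recipient and file paths are placeholders, and the commands are only assembled, not executed, since no key ring is assumed.

```shell
#!/bin/sh
# Illustrative sketch: PGP-style encryption/decryption of feed files with
# the GnuPG CLI (an analogue of the Java-based approach described above).
# Recipient and paths are placeholders; commands are assembled only.
RECIPIENT="vendor-feeds@example.com"
FEED="/data/out/feed.dat"

ENCRYPT_CMD="gpg --batch --yes --trust-model always -r $RECIPIENT -o $FEED.pgp -e $FEED"
DECRYPT_CMD="gpg --batch --yes -o $FEED -d $FEED.pgp"
echo "$ENCRYPT_CMD"
echo "$DECRYPT_CMD"
```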

Environment: Informatica PowerCenter 7.1 and 8.6, Informatica PowerExchange, Informatica Data Quality, DB2, MS SQL Server, shell scripting, Control-M scheduler, problem management tool.

Confidential

Software Engineer

Responsibilities:

  • Design, development and unit testing of the applications.
  • As a team member, involved in collecting all requirements for building the database.
  • Involved in creating and designing the required tables, indexes and constraints for all production activities.
  • Played a key role in designing the application and migrating existing data from relational sources to the corporate warehouse using Informatica PowerCenter.
  • Involved in unit testing of the mappings and SQL code.
  • Developed mappings to load data into slowly changing dimensions.
  • Developed, executed and maintained appropriate ETL development best practices and procedures.
  • Assisted in the development of test plans for assigned projects.
  • Worked with Informatica PowerCenter 7.1 to design source definitions, target definitions and transformations to build mappings.
  • Worked with the client tools Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer and Transformation Developer.
  • Developed and customized mappings using transformations such as Filter, flat-file Lookup, Expression, Router and Update Strategy.
  • Created parameter files and executed sessions using Workflow Manager.
  • Performed unit, integration and system testing.
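
Parameter files like those mentioned above follow a simple INI-style layout of section headers and $$-prefixed mapping parameters. A generation sketch (folder, workflow, session and parameter names are illustrative only):

```shell
#!/bin/sh
# Illustrative sketch: generating an Informatica parameter file before a run.
# Folder, workflow, session and parameter names are placeholders.
PARAM_FILE=/tmp/wf_daily_load.par
RUN_DATE=$(date +%Y-%m-%d)
cat > "$PARAM_FILE" <<EOF
[DWH_LOADS.WF:wf_daily_load.ST:s_m_load_orders]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_DIR=/data/in
\$\$TGT_SCHEMA=DWH
EOF
echo "parameter file written to $PARAM_FILE"
```

The session then references the file via its parameter-file attribute (or pmcmd's -paramfile option), and mapping parameters such as $$RUN_DATE resolve at run time.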

Environment: Informatica, Oracle, UNIX, Windows, Toad
