
Sr. ETL Informatica Developer Resume

SFO, CA

PROFESSIONAL SUMMARY:

  • 7+ years of professional experience with Informatica PowerCenter in all phases of analysis, design, development, implementation and support of data warehousing applications using Informatica PowerCenter 10.x/9.x/8.x.
  • Experience in data warehouse development, data migration, data conversion and Extraction/Transformation/Loading using Informatica PowerCenter and the Informatica Cloud platform.
  • Gathered and analyzed business requirements with Business Analysts and prepared business rules.
  • Solid knowledge of the Software Development Life Cycle (SDLC): requirements analysis, design, implementation, specification review, testing and maintenance of applications under Agile methodology, including Agile Scrum ceremonies.
  • Excellent understanding of ETL design process, development and maintenance using Mapping Designer, Workflow Manager, Monitor and Repository Manager.
  • Worked with different databases and file formats - Oracle, DB2, Sybase, MySQL, SQL Server, Teradata, flat files & XML.
  • Worked on transformations such as Source Qualifier, Filter, Router, Joiner, Java, Aggregator, Expression and Lookup.
  • Expert in data modeling using Star Schema/Snowflake Schema, fact and dimension tables and Slowly Changing Dimensions (SCD Type 1, SCD Type 2); see the sketch after this list.
  • Extensive experience creating UNIX shell scripts and automating ETL processes with them.
  • Strong experience integrating Informatica with Teradata.
  • Strong experience using Teradata utilities - BTEQ, MultiLoad (MLOAD), FastLoad (FLOAD), FastExport, TPump & TPT.
  • Strong experience in writing Teradata Loader Scripts
  • Experience in Performance Tuning of Teradata BTEQs as part of production support along with Informatica Tuning.
  • Designed and developed many SSIS packages for the management/processing of daily data load files, utilizing Control Flow, Data Flow and Script tasks to cleanse, transform and load millions of rows of data from various sources.
  • Solid understanding of OLAP (Online Analytical Processing) concepts and challenges, with large data sets.
  • Experience with Informatica PowerExchange change data capture (CDC) as well as traditional CDC methods.
  • Experience in documenting High Level Design, Low level Design, STM, Unit test plan, Unit test cases and Deployment documents.
  • Experienced in working with development teams, production support teams and Business Analysts to handle critical situations and meet deadlines for successful completion of tasks/projects.
  • Experienced in debugging mappings to find problems, troubleshooting and rectifying ETL bugs.
  • Worked with multiple LOBs to set up a test data management framework for requesting, reserving and provisioning test data.
  • Experienced in the post-development cycle and in supporting applications in production.
  • Expert in Agile & Waterfall methodologies and in Agile tools such as Rally & Jira.
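
The Slowly Changing Dimension Type 2 pattern noted above can be summarized with a minimal SQL sketch. The DIM_CUSTOMER and STG_CUSTOMER tables and their columns are hypothetical; a real mapping would drive the same logic through Lookup, Expression and Update Strategy transformations, with a Sequence Generator supplying the surrogate key.

    -- Expire the current version of rows whose tracked attributes changed (hypothetical tables/columns).
    UPDATE DIM_CUSTOMER D
    SET    EFF_END_DT   = CURRENT_DATE - 1,
           CURRENT_FLAG = 'N'
    WHERE  D.CURRENT_FLAG = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   STG_CUSTOMER S
                   WHERE  S.CUSTOMER_ID   = D.CUSTOMER_ID
                     AND  S.CUSTOMER_NAME <> D.CUSTOMER_NAME);

    -- Insert a new current version for changed or brand-new customers
    -- (surrogate key from a sequence/Sequence Generator omitted for brevity).
    INSERT INTO DIM_CUSTOMER
           (CUSTOMER_ID, CUSTOMER_NAME, EFF_START_DT, EFF_END_DT, CURRENT_FLAG)
    SELECT S.CUSTOMER_ID, S.CUSTOMER_NAME, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   STG_CUSTOMER S
    LEFT JOIN DIM_CUSTOMER D
           ON  D.CUSTOMER_ID  = S.CUSTOMER_ID
           AND D.CURRENT_FLAG = 'Y'
    WHERE  D.CUSTOMER_ID IS NULL;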

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 10.1/9.6.1/9.5.1/8.6.1/8.5.1/8.1.1

Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, E-R Modeling.

RDBMS: Oracle 11g/10g/9i, Teradata 14/13/12, DB2, SQL Server 2000/2005/2008, MySQL, Sybase

Reporting Tools: Cognos, Business Objects

Scheduling Tools: Control-M, Autosys, Maestro, Cron

Languages: XML, UNIX Shell Scripting, SQL, PL/SQL

Operating Systems: Windows, Unix, Linux

Others: PuTTY, Toad, SQL Assistant

EXPERIENCE SUMMARY:

Confidential, SFO, CA

Sr. ETL Informatica Developer

Responsibilities:

  • Extensively worked with Business Analyst & Users in understanding the Source to target Mapping documentation.
  • Used Informatica Power Center to migrate the data from different source systems to ECDW.
  • Created complex mappings using Source Qualifier, Expression, Joiner, Connected/Unconnected Lookup, Sorter, Aggregator, Filter, Java, Update Strategy and Router transformations to populate target tables in an efficient manner.
  • Attended the meetings with business integrators to discuss in-depth analysis of design level issues.
  • Involved in data design and modeling by specifying the physical infrastructure, system study, design, and development.
  • Extensively involved in performance tuning of Informatica ETL mappings by using caches, overriding SQL queries and using mapping parameter files.
  • Developed complex SQL queries for interfaces that extract data at regular intervals to meet business requirements; extensively used Teradata utilities such as MultiLoad, FastLoad, TPT, BTEQ and FastExport.
  • Designed SQL Server Integration Services (SSIS) packages to extract data from JDE to SQL Server Data Warehouse following the JDE specs and SQL Server standards.
  • Followed all SSIS standards to maintain reliability and scalability of the extraction.
  • Extensively involved in both ETL Informatica & Database performance tuning.
  • Analyzed session log files on session failures to resolve errors in mapping or session configuration.
  • Wrote various UNIX shell scripts for scheduling data cleansing scripts and loading processes and for automating the execution of mappings.
  • Created mapplets and used them in different mappings.
  • Extracted data from heterogeneous sources such as flat files, XML, MySQL and Oracle.
  • Wrote PL/SQL procedures and functions.
  • Involved in the change data capture (CDC) ETL process; see the sketch after this list.
  • Implemented Slowly Changing Dimension Type I & Type II for different dimensions.
  • Involved in the Informatica, Teradata and Oracle upgrade process and tested the environment during the upgrade.
  • Worked with Informatica version control extensively.
  • Wrote unit test scripts to test the developed interfaces.
  • Managed enhancements and coordinated changes to Informatica objects with every release.
  • Provided support for the production department in handling the data warehouse.
  • Worked under Agile methodology and used the Rally tool to track tasks.
  • Wrote thorough design documents, unit test documentation, and deployment & runbook documents.
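
As a rough illustration of the timestamp-based (traditional) CDC approach mentioned above, the following is a minimal sketch assuming a hypothetical ORDERS source table with a LAST_UPDATE_TS column and an ETL_CONTROL table holding the last successful run time; in practice a query like this sits in a Source Qualifier override or a BTEQ/FastExport script.

    -- Pull only rows changed since the last successful load (hypothetical tables/columns).
    SELECT O.ORDER_ID,
           O.CUSTOMER_ID,
           O.ORDER_AMT,
           O.LAST_UPDATE_TS
    FROM   ORDERS O
    WHERE  O.LAST_UPDATE_TS > (SELECT MAX(LAST_RUN_TS)
                               FROM   ETL_CONTROL
                               WHERE  JOB_NAME = 'WF_LOAD_ORDERS');

    -- After a successful load, advance the high-water mark for the next run.
    UPDATE ETL_CONTROL
    SET    LAST_RUN_TS = CURRENT_TIMESTAMP
    WHERE  JOB_NAME = 'WF_LOAD_ORDERS';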

Environment: Informatica PowerCenter 9.6.1/9.5.1/9.1.1, Teradata 14/12, Oracle 10g, MySQL, XML, Flat Files, SQL Assistant, Toad, PL/SQL, UNIX shell scripting, Cognos, Maestro, SVN, UNIX, Windows.

Confidential, Charlotte, NC

Sr. ETL Informatica Developer

Responsibilities:

  • Worked with Business Data Analysts (BDA) to understand the requirements for Data Mart development.
  • Utilized the features of the Source Qualifier transformation such as filtering, joining, sorting and SQL override to the fullest extent at the source level; see the sketch after this list.
  • Extensively worked with different transformations such as Expression, Aggregator, Sorter, Joiner, Router, Lookup, Filter and Union in developing the mappings to migrate the data from source to target.
  • Extensively used the Lookup transformation and lookup caches to look up data from relational tables and flat files.
  • Extracted data from various heterogeneous sources like Sybase, flat files and COBOL (VSAM) using Informatica PowerCenter and loaded data into the target DB2 database.
  • Extensively worked with session logs and workflow logs for error handling and troubleshooting.
  • Developed complex SQL queries to develop the Interfaces to extract the data in regular intervals to meet the business requirements.
  • Involved in doing Unit Testing, Integration Testing and Data Validation.
  • Worked with the Control M scheduling team in scheduling Informatica Jobs as per the requirement and frequency.
  • Implemented various performance tuning techniques by finding bottlenecks at the source, target, mapping and session levels and optimizing them.
  • Involved in DWH upgrades driven by source system changes.
  • Created mapping parameters and variables and wrote parameter files.
  • Created UNIX shell scripts for various needs.
  • Worked with the Debugger Wizard in debugging the mappings.
  • Used the Normalizer transformation for COBOL (VSAM) sources.
  • Worked with External stored procedures for data cleansing purpose.
  • Worked with the Cognos team in generating various reports.
  • Involved in preparing the Migration Documents.
  • Implemented Informatica Procedures and Standards while developing and testing the Informatica objects.
  • Provided 24x7 on-call support for production databases.
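
A minimal sketch of the kind of Source Qualifier SQL override referenced above, pushing the join, filter and sort to the database instead of separate Joiner, Filter and Sorter transformations; the CLAIMS and MEMBERS tables and their columns are hypothetical.

    -- Hypothetical Source Qualifier override: join, filter and sort handled at the source.
    SELECT C.CLAIM_ID,
           C.MEMBER_ID,
           M.MEMBER_NAME,
           C.CLAIM_AMT,
           C.SERVICE_DT
    FROM   CLAIMS  C
    JOIN   MEMBERS M
           ON M.MEMBER_ID = C.MEMBER_ID
    WHERE  C.CLAIM_STATUS = 'APPROVED'      -- source-side filter instead of a Filter transformation
    ORDER BY C.MEMBER_ID, C.SERVICE_DT;     -- pre-sorted input for a downstream Aggregator/Joiner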

Environment: Informatica PowerCenter 9.5.1, Sybase, DB2, Flat Files, COBOL (VSAM), WinSQL, Oracle 10g, Toad, UltraEdit-32, SQL Advantage, Control Center, PowerDesigner SQL Modeler, CDMA, MS Visio, Cognos, UNIX, Windows.

Confidential, Dayton, OH

Sr. ETL Informatica Developer

Responsibilities:

  • Involved in Requirement analysis in support of Data Warehousing efforts
  • Maintained Data Flow Diagrams (DFDs) and ETL technical specs or lower-level design documents for all the source applications
  • Worked with source databases like Oracle, SQL Server and Flat Files
  • Extensively worked with various Active transformations like Filter, Sorter, Aggregator, Router and Joiner transformations
  • Extensively worked with various Passive transformations like Expression, Lookup, Sequence Generator, Mapplet Input and Mapplet Output transformations
  • Created complex mappings using Unconnected and Connected lookup Transformations
  • Implemented Slowly changing dimension Type 1 and Type 2 for change data capture
  • Worked with various lookup caches such as dynamic cache, static cache, persistent cache, recache from database and shared cache
  • Responsible for definition, development and testing of processes/programs necessary to extract data from client's operational databases, transform, cleanse data and load it into data marts
  • Worked with various Informatica PowerCenter objects like Mappings, transformations, Mapplet, Workflows and Session Tasks
  • Responsible for the performance tuning of the ETL process at source level, target level, mapping level and session level
  • Worked extensively with update strategy transformation for implementing inserts and updates
  • Extensively used the tasks like email task to deliver the generated reports to the mailboxes and command tasks to write post session and pre session commands
  • Extensively used debugger to test the logic implemented in the mappings
  • Performed error handling using session logs
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size and target based commit interval
  • Implemented auditing and balancing on the transactional sources per business requirements, so that every record read is either captured in the maintenance tables or written to the target tables; see the sketch after this list
  • Auditing is captured in the audit table and an end-of-day snapshot of the daily entries is sent to the distribution list to flag anything abnormal
  • Monitored workflows and sessions using the PowerCenter Workflow Monitor
  • Used Informatica Scheduler for scheduling the workflows
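
A minimal sketch of the auditing and balancing idea described above, assuming a hypothetical ETL_AUDIT table; the counts would come from session statistics (for example via a post-session command or stored procedure) and are shown here as bind-style placeholders.

    -- Run-level audit entry (hypothetical table; counts supplied by the load process).
    INSERT INTO ETL_AUDIT
           (WORKFLOW_NAME, RUN_DATE, SRC_ROW_CNT, TGT_INSERT_CNT, TGT_UPDATE_CNT, REJECT_CNT)
    VALUES (:workflow_name, CURRENT_DATE, :src_row_cnt, :tgt_insert_cnt, :tgt_update_cnt, :reject_cnt);

    -- End-of-day balancing snapshot sent to the distribution list: a non-zero result is abnormal.
    SELECT WORKFLOW_NAME,
           SRC_ROW_CNT - (TGT_INSERT_CNT + TGT_UPDATE_CNT + REJECT_CNT) AS OUT_OF_BALANCE_CNT
    FROM   ETL_AUDIT
    WHERE  RUN_DATE = CURRENT_DATE;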

Environment: Informatica PowerCenter 8.5/8.6.1/9.1, Oracle 9i/10g, SQL Server, Sybase, DB2, SQL, PL/SQL, Cognos, Cron, UNIX, Windows.

Confidential

ETL Informatica Developer

Responsibilities:

  • Involved in Requirement analysis in support of Data Warehousing efforts
  • Reviewing the data model and ETL flow with Modelers & Architects.
  • Providing the time estimation of various tasks involved in development phase.
  • Understanding client requirements during team discussions.
  • Development of Informatica Mappings as per the requirement in STM to load data into the Data Warehouse.
  • Writing SQL Scripts based on the requirement.
  • Scheduling of Informatica sessions for testing automation of loads.
  • Performance tuning of long running jobs in Informatica.
  • Updating of ETL Specification, Test Case and Reporting Metadata Documents after testing in various environments (Development, QA and Production)
  • Created functions and a few rules to ensure good data quality; see the sketch after this list.
  • Extracted data from flat files, Oracle and MySQL into the staging area and populated the DB2 data warehouse.
  • Developed and documented data Mappings/Transformations and Informatica sessions.
  • Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, Sequence Generator, look-up, Update Strategy, Rank, Joiner, and Stored procedure transformations.
  • Used Informatica Power Center Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
  • Involved in migration & testing of mappings, sessions, worklets and workflows from Informatica 7.1 to Informatica 8.6.1 and ensured the environment was upgraded properly.
  • Responsible for the performance tuning of the ETL process at source level, target level, mapping level and session level.
  • Extensively used the tasks like email task to deliver the generated reports to the mailboxes and command tasks to write post session and pre session commands.
  • Extensively used debugger to test the logic implemented in the mappings.
  • Performed error handling using session logs.
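
As an illustration of the data-quality rules mentioned above, a minimal sketch follows, assuming a hypothetical STG_ACCOUNT staging table; rows that fail such checks would typically be routed to a reject/error table by the mapping.

    -- Flag staging rows that violate basic quality rules before loading the warehouse (hypothetical table).
    SELECT ACCOUNT_ID,
           CASE
             WHEN ACCOUNT_ID IS NULL     THEN 'MISSING_KEY'
             WHEN OPEN_DT > CURRENT_DATE THEN 'FUTURE_OPEN_DATE'
             WHEN BALANCE_AMT < 0        THEN 'NEGATIVE_BALANCE'
           END AS DQ_ERROR
    FROM   STG_ACCOUNT
    WHERE  ACCOUNT_ID IS NULL
       OR  OPEN_DT > CURRENT_DATE
       OR  BALANCE_AMT < 0;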

Environment: Informatica PowerCenter 8.5.1/8.6.1/9.1, Oracle 9i/10g, MySQL, Flat Files, DB2, SQL, PL/SQL, Cognos, UNIX, Windows.

Confidential

Jr. Informatica ETL Developer

Responsibilities:

  • Understanding the Business requirements based on Functional specification to design the ETL methodology in technical specifications.
  • Responsible for developing, supporting and maintaining ETL (Extract, Transform and Load) processes using Informatica PowerCenter 8.6.1.
  • Integrated heterogeneous data sources such as Oracle, DB2, SQL Server and flat files (fixed-width & delimited) into the staging area.
  • Wrote SQL-Overrides and used filter conditions in source qualifier thereby improving the performance of the mapping.
  • Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, joiner and Rank transformations.
  • Implemented complex business rules in Informatica Power Center by creating re-usable transformations, and robust Mapplets.
  • Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.
  • Improved session performance by enabling the Incremental Aggregation property to load incremental data into the target table.
  • Worked with Functional team to make sure required data has been extracted and loaded and performed the Unit Testing and fixed the errors to meet the requirements.
  • Copied/exported/imported mappings, sessions, worklets and workflows from the development repository to the test repository and promoted them to production.
  • Used session parameters and mapping variables/parameters, and created parameter files to allow flexible workflow runs based on changing variable values; see the sketch after this list.
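
A minimal sketch of how a mapping parameter from a parameter file can drive a filtered Source Qualifier override, in line with the two points above; the TRANSACTIONS table and the $$LOAD_START_DATE parameter are hypothetical.

    -- Hypothetical Source Qualifier override; $$LOAD_START_DATE is expanded from the parameter file
    -- at run time, so each run picks up only the required date range without editing the mapping.
    SELECT T.TXN_ID,
           T.ACCOUNT_ID,
           T.TXN_AMT,
           T.TXN_DT
    FROM   TRANSACTIONS T
    WHERE  T.TXN_DT >= TO_DATE('$$LOAD_START_DATE', 'YYYY-MM-DD');

    -- Corresponding parameter file entry (hypothetical folder/workflow/session names):
    -- [DW_FOLDER.WF:wf_load_transactions.ST:s_m_load_transactions]
    -- $$LOAD_START_DATE=<load start date in YYYY-MM-DD>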

Environment: Informatica 8.5.1/8.6.1, UNIX, Oracle, DB2, Flat Files, Toad, SQL Developer
