Informatica/ETL Developer Resume Profile
Charlotte, NC
Summary:
- 7 years of total IT experience in planning, analysis, design, implementation and maintenance across the Financial, Manufacturing, Pharmaceutical, Healthcare and Retail domains.
- Extensive experience designing and developing complex data warehouse mappings using Informatica PowerCenter 9.1.0/8.6.1/8.5/8.1.1/8.0/7.x/6.x/5.x/4.x (Source Analyzer, Warehouse Designer, Mapping Designer, mapplets, transformations), PowerMart 6.1/5.1.1/5.x/4.7, Informatica Orchestration, Informatica B2B, PowerExchange, PowerAnalyzer, PowerConnect and PowerPlug for ETL against DB2, Oracle 10g/9i/8i, MS SQL Server 2005/2000 and SAP BusinessObjects XI R3/XI R2/6.x/5.x.
- Experience optimizing mappings and implementing complex business rules by creating reusable transformations, mapplets, SQL scripts, triggers and PL/SQL stored procedures.
- Assisted in Data Modeling and Dimensional Data Modeling.
- Strong business analysis experience covering data analysis, user requirement gathering and analysis, gap analysis, data cleansing, data transformation, data relationships, source system analysis and reporting analysis.
- Expertise in QA of ETL mappings, including unit, functional and integration testing.
- Experience in debugging and performance tuning of targets, sources, mappings, sessions and systems.
- Highly proficient in data modeling and strong in data warehousing concepts, including dimensional modeling with star schema and snowflake schema methodologies; complete understanding of the Ralph Kimball and Inmon approaches to data warehousing.
- Strong analytical and conceptual skills in database design and development.
- Experience in Business Objects XI R3/XI R2/6.5/6.0/5.1/4.1, Web Intelligence 2.5, Designer 5.0, Developer Suite, Set Analyzer 2.0, Cognos Series 7.0/6.0/5.x, Cognos Impromptu, Cognos IWR (Impromptu Web Reports) and Cognos PowerPlay Transformer.
- Experience with reporting tools such as Business Objects and Brio, with an understanding of complex report generation processes in BI tools.
- Experience scheduling reports in BusinessObjects InfoView and granting user access through the BusinessObjects Central Management Console.
- Experience using scheduling tools such as Cybermation, TWS (Tivoli Workload Scheduler) and AutoSys to automate daily batch runs of Informatica workflows.
- Migrated code from the development environment to the QA environment using deployment groups.
- Worked with business users to resolve issues with report data.
- Provided on-call and production support.
- Experience with PowerExchange Change Data Capture 9.1.0/8.6.1, using Oracle LogMiner to read change data from Oracle redo logs.
- Experience with the PowerExchange Listener startup and Logger shutdown processes in cold and warm start modes.
- Wrote UNIX shell scripts to automate various processes.
- Experience using Oracle 10g/9i/8i/8.x/7.x, DB2 8.0/7.0, MS SQL Server 2008/2005/2000, IBM Informix 7.2, MS Access 7.0/2000, XML, PL/SQL, SQL*Plus, SQL*Loader and Developer 2000 on Windows 3.x/95/98/2000, Windows NT 4.0 and Sun Solaris 2.x. Strong database skills in SQL, PL/SQL, SQL*Plus, SQL*Loader, DBMS packages and TOAD/ODBC interfaces to Oracle, along with PL/SQL and shell scripting.
Technical Skills:
- ETL Tools: Informatica Power Center, Power Exchange, IDS, IDQ
- Data Modeling: ERwin, Star and Snowflake Schema Modeling, Dimensional Data modeling, Fact and Dimensions tables
- Databases: Oracle 10g, 9i/8i, MS-SQL Server 7.0, MS Access, Teradata, UDB DB2
- Languages: SQL, PL/SQL, UNIX
- Tools: Business Objects, MS Excel, SQL*Loader, SQL*Plus, TOAD
- Operating Systems: Windows 95/98/NT/2000/XP, UNIX, IBM Mainframe, SunOS
Professional Experience:
Confidential
Role: Informatica/ETL Developer
Confidential (Duke Energy) is the largest electric power holding company in the United States, supplying and delivering energy to the Carolinas and parts of Confidential. The project aimed to consolidate the work management systems of the Carolinas and Florida into a new application. The purpose is to maintain a data warehouse that enables the work management systems to fetch the data the application requires. Distributed data comes from heterogeneous sources such as flat files, GIS applications, SQL Server and Oracle.
Responsibilities:
- Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
- Prepared the high-level design document for extracting data from complex relational database tables, performing data conversions and transformations, and loading into specific formats.
- Performed source system analysis and designed change data capture (CDC)/incremental loading technical specifications.
- Extracted data from flat files, SQL Server and Oracle to build an operational data store, and applied business logic to load the data into the data warehouse.
- Extensively worked with Informatica Developer client 9.1.0 transformations: Source Qualifier, Joiner, Lookup, Router, Filter, Sorter, Expression and Update Strategy.
- Worked with Informatica Data Quality (IDQ) to perform analysis, data profiling and data quality checks, setting up business rules on the columns selected for IDQ.
- Profiled the columns selected for cleansing, analyzing patterns, frequencies and percentages of occurrence using Informatica Data Quality (IDQ).
- Extensively used Slowly Changing Dimension Type 1 (SCD1) logic to handle incremental loading; a minimal sketch of the upsert pattern follows this list.
- Used debugger extensively to identify the bottlenecks in the mappings.
- Developed workflows and sessions and monitored them to ensure data was properly loaded into the target tables.
- Created and maintained documents including the data quality plan design, mapping inventory, mapping specifications, change request form, unit test plan, test case list and target-source matrix.
- Used Cybermation for scheduling the workflows, error checking, production support, maintenance and testing of ETL procedures using Informatica session logs.
- Created test cases for integration testing and UAT, and reviewed UAT with business users.
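A minimal sketch of the SCD Type 1 upsert pattern referenced above, expressed here as a SQL*Plus MERGE wrapped in a shell step rather than as the actual PowerCenter mapping; the connection variables and the dim_work_order/stg_work_order table and column names are hypothetical placeholders.

```sh
#!/bin/sh
# Illustrative SCD Type 1 upsert: overwrite changed attributes in the
# dimension, insert rows for new natural keys. Table, column and
# connection names are placeholders, not the project's actual objects.
sqlplus -s "${DB_USER}/${DB_PASS}@${DB_SID}" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
MERGE INTO dim_work_order d
USING stg_work_order s
   ON (d.work_order_nk = s.work_order_nk)
WHEN MATCHED THEN UPDATE SET
       d.status    = s.status,
       d.crew_id   = s.crew_id,
       d.update_dt = SYSDATE
WHEN NOT MATCHED THEN INSERT
       (work_order_nk, status, crew_id, load_dt, update_dt)
VALUES (s.work_order_nk, s.status, s.crew_id, SYSDATE, SYSDATE);
COMMIT;
EOF
```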
Environment: Informatica PowerCenter Designer 9.1.0, IDQ 9.1.0, Workflow Manager, Workflow Monitor, Repository Manager, Oracle 10g/9i, SQL Server 2000, Erwin 3.5, shell scripting, SQL, PL/SQL, Cybermation, Visio.
Confidential
Sr. Informatica Developer
Confidential is among the nation's largest publicly traded managed health care and health services companies. Health Net's mission is to help people be healthy, secure and comfortable. The company's POS, HMO, insured PPO, behavioral health and government contracts subsidiaries provide health benefits to more than 7 million individuals. The project aimed to integrate health care provider data from source systems into a data warehouse for business reporting purposes.
Responsibilities:
- Interacted with business analysts for requirement gathering, understanding the requirements, and explaining technical possibilities and application flow.
- Developed ETL mappings and transformations using Informatica PowerCenter 8.6.
- Extracted data from flat files provided by disparate ERP systems and loaded the data into Oracle staging using Informatica PowerCenter.
- Analyzed source data to resolve post-production issues; used MS Access to analyze source data from flat files.
- Designed and created complex source-to-target mappings using transformations including Sorter, Aggregator, Joiner, Filter, Source Qualifier, Expression and Router.
- Extensively used Lookup Transformation and Update Strategy Transformation while working with Slowly Changing Dimensions.
- Used Mapping Parameters and Mapping Variables based on business rules provided.
- Scheduled workflows on a daily basis for incremental data loading.
- Wrote PL/SQL procedures for data extraction, transformation and loading.
- Assisted in Data Modeling and Dimensional Data Modeling.
- Involved in performance tuning by determining bottlenecks at various points such as targets, sources, mappings, sessions and the system, leading to better session performance.
- Designed and developed UNIX shell scripts as part of the ETL process to automate data loads to the target (a sketch follows this list).
- Scheduled jobs using TWS to automate the Informatica sessions.
- Used TOAD and FTP to move files to and from source systems.
- Performed Unit testing for all the interfaces.
- Used Test Director to log and keep track of defects.
- Provided Production Support at the end of every release.
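A minimal sketch of the kind of shell step TWS might schedule for such a load, assuming a hypothetical inbound file (ERP_EXTRACT.dat) and PL/SQL procedure (sp_load_staging); it waits for the flat file to arrive and then invokes the load procedure through SQL*Plus.

```sh
#!/bin/sh
# Illustrative TWS-scheduled ETL step; the directory, file name and
# procedure name are placeholders, not the project's actual objects.
INBOX=/data/inbound
FILE="$INBOX/ERP_EXTRACT.dat"

# Wait up to 30 minutes for the ERP flat file to arrive.
i=0
while [ ! -f "$FILE" ] && [ "$i" -lt 30 ]; do
  sleep 60
  i=$((i + 1))
done
[ -f "$FILE" ] || { echo "ERP extract not found" >&2; exit 1; }

# Invoke the PL/SQL staging-load procedure.
sqlplus -s "${DB_USER}/${DB_PASS}@${DB_SID}" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
EXEC sp_load_staging('ERP_EXTRACT');
EXIT
EOF
```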
Environment: Informatica PowerCenter Designer 8.6.1, Workflow Manager, Workflow Monitor, Repository Manager, Oracle 10g/9i, SQL Server 2000, BO XI R2, UNIX, COBOL, Erwin 3.5, shell scripting, Rapid SQL, TOAD, SQL*Loader, SQL, PL/SQL, Visio, TWS.
Confidential
Role: Sr. Informatica/ETL Developer
Confidential helps vehicle remarketing services improve their profitability through measures such as distance-calculation logic for reaching owners, new prospects and so on. The project was to implement an enterprise data warehouse (EDW) by integrating data from different feeder systems located across various sites into a central repository of business information. The EDW provides quick access to data to enable a more informed decision-making process.
Responsibilities:
- Involved in the Informatica server installation and environment setup.
- Developed UNIX scripts for data cleansing and data archiving.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update Strategy and Router transformations to populate target tables efficiently.
- Implemented Slowly Changing Dimension Type 1 and Type 2 for change data capture using versioning.
- Created SQL packages, functions, procedures, views and database triggers.
- Developed database monitoring and data validation reports in SQL Server Reporting Services (SSRS).
- Created SSIS packages for data migration from Excel sheets and flat files into the SQL Server database.
- Migrated DTS objects to SQL Server Integration Services (SSIS).
- Configured, performance-tuned and installed Informatica; integrated various data sources such as Oracle, MS SQL Server, XML and flat files into the staging area; and designed ETL processes spanning multiple projects.
- Wrote SQL procedures and used SQL Server DTS to improve warehouse loading.
- Designed and Developed ODS to Data Mart Mappings/Sessions/Workflows.
- Created various Oracle database objects like Indexes, stored procedures, Materialized views, synonyms and functions for Data Import/Export.
- Created reusable worklets and workflows.
- Used Transformation Language functions in the mappings to produce the desired results.
- Used TOAD to run SQL queries and validate the data in warehouse and mart.
- Involved in Debugging and Troubleshooting Informatica mappings.
- Populated error tables as part of the ETL process to capture the records that failed the migration.
- Involved with Scheduling team in creating and scheduling jobs in Tivoli Workload Scheduler.
- Used CDC for moving data from Source to Target.
- Rewrote PL/SQL routines using the Netezza nzsql and nzload utilities (see the sketch after this list).
- Designed ETL processes in Informatica to load data from Oracle and flat files into the Netezza staging and warehouse layers.
- Designed ETL processes in Informatica to load data from sources to targets through data transformations.
- Wrote stored procedures and DTS packages for maintenance tasks in the production environment.
- Implemented various data transformations using Slowly Changing Dimensions.
- Developed test cases for unit, integration and system testing.
- Used a custom logger that can run a serializer to report detailed event information.
- Maintained the Repository Manager, creating repositories, user groups and folders and migrating code from Dev to Test and Test to Prod environments.
- Partitioned the Sessions for better performance.
- Designed ETL mappings for CDC (change data capture).
- Trained end users in using full client BO for analysis and reporting.
- Wrote SQL Scripts and PL/SQL Scripts to extract data from Databases
- Extensively documented the design, development, implementation, daily loads and process flow of the mappings.
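A minimal sketch of the nzload/nzsql pattern behind the PL/SQL rewrite noted above; the database, table and file names are hypothetical, and the procedural row-by-row logic is replaced by a single set-based INSERT ... SELECT.

```sh
#!/bin/sh
# Illustrative Netezza load step; database, table and file names are
# placeholders, not the project's actual objects.
DB=edw_stg
TBL=stg_auction_sales
DATAFILE=/data/outbound/auction_sales.dat

# Bulk-load the delimited flat file into the Netezza staging table.
nzload -db "$DB" -u "$NZ_USER" -pw "$NZ_PASS" \
       -t "$TBL" -df "$DATAFILE" -delim '|' || exit 1

# Run the rewritten set-based transformation that replaced the
# procedural PL/SQL loop.
nzsql -d "$DB" -u "$NZ_USER" -pw "$NZ_PASS" -c "
  INSERT INTO fact_auction_sales (sale_id, vin, sale_dt, sale_amt)
  SELECT sale_id, vin, sale_dt, sale_amt
  FROM   $TBL;" || exit 1
```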
Environment: Informatica PowerCenter 8.6.0/8.1.1, DB2, Erwin 4.0, UNIX shell scripting, Oracle 9i/10g/11g, PL/SQL, Business Objects XI R2, SQL Server 2005/2008, Teradata SQL, Korn shell scripting, Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad), Netezza, Tivoli Workload Scheduler 8.4, TOAD 9.7.2, Tibco.
Confidential
Informatica Developer
The Confidential deposit account program is one of the many benefits of the Paychex Resource Management Account (RMA). Any uninvested cash that enters the account, whether from traditional deposits or as proceeds from an investment, is automatically swept on a daily basis to an FDIC-insured deposit account at Paychex, USA or to another sweep option.
Responsibilities:
- Architected the data flow in the data mart, extensively customized the bank's traditional methodology, designed data flow diagrams and designed the best solution for the data flow.
- Involved in discussions with business analysts for requirement gathering, understanding the requirements and explaining technical possibilities to business users.
- Created a time estimate proposal document with an estimate of the hours required to complete each ETL task.
- Converted business requirements into technical documents (Business Requirement Document) and explained the business requirements in technical terms to the developers.
- Worked with data modeler and business users for designing of tables.
- Analyzed the source data with business users and developed critical mappings using Informatica PowerCenter to load the data from DB2 to Oracle.
- Extensively used Slowly Changing Dimensions (SCDs) to handle incremental loading of dimension and fact tables.
- Extensively used Source Qualifier, Normalizer, Filter, Connected Lookup, Unconnected Lookup, Update Strategy, Router, Aggregator and Sequence Generator transformations.
- Developed mappings to handle exceptions and discarded data.
- Used the external loader (SQL*Loader) for bulk loading of Oracle tables.
- Created UNIX scripts to run SQL*Loader and defined positions for each field in the flat files (see the sketch after this list).
- Involved in data modeling and design of the data warehouse using star schema methodology with conformed, granular dimensions and fact tables.
- Changed table and column structures without affecting the conceptual model, using entity-relationship modeling.
- Worked extensively with the Erwin tool on logical and physical level design.
- Created logical data models during systems analysis as part of developing new databases.
- Assisted data modelers in database design and successfully brought the database to second normal form.
- Worked with DBAs to determine where to partition tables and which columns to index.
- Unit tested the data and generated reports for business users to review special accounts.
- Developed UNIX shell scripts and used PMCMD to execute the workflows.
- Exported the workflows from Repository Manager and checked them into SVN (version control management tool).
- Prepared documents for QA and PRODUCTION migration.
- Worked with business users and QA team during testing phases.
- Created Test cases for Integration testing and UAT. Reviewed UAT with business users.
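A minimal sketch of the positional SQL*Loader setup referenced above; the staging table, data file and field positions are hypothetical placeholders, not the project's actual layout.

```sh
#!/bin/sh
# Illustrative SQL*Loader step; table name, file name and field
# positions are placeholders.
cat > deposit_sweep.ctl <<'EOF'
LOAD DATA
INFILE '/data/inbound/deposit_sweep.dat'
APPEND INTO TABLE stg_deposit_sweep
(
  account_no   POSITION(1:12)   CHAR,
  sweep_dt     POSITION(13:20)  DATE "YYYYMMDD",
  sweep_amt    POSITION(21:35)  DECIMAL EXTERNAL,
  sweep_option POSITION(36:40)  CHAR
)
EOF

sqlldr userid="${DB_USER}/${DB_PASS}@${DB_SID}" \
       control=deposit_sweep.ctl \
       log=deposit_sweep.log bad=deposit_sweep.bad || exit 1
```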
Environment: Informatica PowerCenter Designer 8.6.1, Workflow Manager, Workflow Monitor, Repository Manager, Oracle 10g/9i, SQL Server 2000, BO XI R2, UNIX, COBOL, Erwin 3.5, shell scripting, Rapid SQL, TOAD, SQL*Loader, SQL, PL/SQL, PVCS, Visio, AutoSys.
Confidential
Informatica Developer
Confidential is considered one of the country's most highly regarded regional banks. Founded more than 145 years ago in western Confidential, its parent company, Confidential Corporation, had over 55 billion in assets, is one of the 20 largest commercial bank holding companies in the Confidential and operates over 650 branches throughout Confidential. Responsible for developing the data warehouse and supporting the data mart for reporting services in a large financial organization's credit card division, supporting high-availability ad-hoc reporting.
Responsibilities:
- Analyzed business process and gathered core business requirements. Interacted with business analysts and end users.
- Prepared a handbook of standards for Informatica code development.
- Analyzed business requirements and worked closely with the various application and business teams to develop ETL procedures that are consistent across all applications and systems.
- Developed a custom metadata repository.
- Used Informatica designer for designing mappings and mapplets to extract data from SQL Server, Sybase and Oracle sources.
- Created different parameter files and changed session parameters, mapping parameters and variables at run time (a sketch follows this list).
- Extensively used Source Qualifier Transformation to filter data at Source level rather than at Transformation level. Created different transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookups, Filters, Stored Procedures, Update Strategy and Sequence Generator.
- Used Debugger to test the data flow and fix the mappings.
- Created and monitored workflows and tasks using Informatica PowerCenter Workflow Manager.
- Partitioned Sessions for concurrent loading of data into the target tables.
- Tuned the workflows and mappings.
- Involved in identifying bottlenecks and tuning to improve performance.
- Created workflows in Workflow Manager for tasks such as sending email notifications, timers that trigger when an event occurs, and sessions that run a mapping.
- Executed Workflows and Sessions using Workflow Monitor.
- Dealt with data issues in the staging flat files; after cleanup, the data was sent to the targets.
- Actively coordinated with testing team in the testing phase and helped the team to understand the dependency chain of the whole project.
- Executed the workflow using pmcmd command in UNIX.
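A minimal sketch of how a run-time parameter file might be generated and passed to pmcmd, in line with the parameter file and pmcmd usage above; the domain, integration service, folder, workflow, session and parameter names are hypothetical.

```sh
#!/bin/sh
# Illustrative run-time parameter file plus pmcmd call; the folder,
# workflow, session and parameter names are placeholders.
PARAMFILE=/infa/params/wf_daily_load.prm

# Regenerate the parameter file with today's load date.
cat > "$PARAMFILE" <<EOF
[FIN_DW.WF:wf_daily_load.ST:s_m_load_txn_fact]
\$\$LOAD_DATE=$(date +%Y-%m-%d)
\$DBConnection_SRC=ORA_SRC
\$DBConnection_TGT=ORA_DW
EOF

# Start the workflow and wait for it to complete.
pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" \
      -p "$INFA_PASS" -f FIN_DW -paramfile "$PARAMFILE" \
      -wait wf_daily_load
exit $?
```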
Environment: Informatica PowerCenter 8.0/7.1.3, Oracle 10g/9i/8i, SQL Server 2000, PL/SQL, Erwin, TOAD, SQL Plus, SQL Loader, UDB, Sybase, XML, Windows, Linux and HP-UNIX.
Confidential
Role: Database programmer /ETL Developer
Confidential is one of the leading banks in Confidential. The bank needed an analytical data warehouse to improve its decision-making. It mainly wanted information on season-wise financial analysis, target customer groups, loss-making schemes, profitable schemes, branch-wise analysis, etc. Worked as a database programmer in a client/server environment on the design and modification of various applications using Oracle and Developer 2000, and developed forms and reports for departments such as quality control, warehouse and finance.
Responsibilities:
- Provided technical support for a team, performing technical quality reviews of program code and PL/SQL blocks for optimization and adherence to standards guidelines.
- Developed database triggers to enforce complicated business logic and integrity constraints and to enhance data security at the database level.
- Involved in business analysis and technical design sessions with business and technical staff to develop the requirements document and ETL specifications.
- Designed and developed the ETL Mappings for the source systems data extractions, data transformations, data staging, movement and aggregation.
- Worked on Informatica PowerCenter 6.1. Used Source Analyzer and Warehouse Designer to import the source and target database schemas, the Mapping Designer to map sources to targets, and the Mapplet Designer and Transformation Developer.
- Developed standard and re-usable mappings and mapplets using various transformations like expression, aggregator, joiner, source qualifier, router, lookup, and filter.
- Promoted the new code to production through unit and system testing.
Environment: Informatica PowerCenter 6.1, SQL Server, Oracle 8i, DB2, SQL*Plus, Erwin, Windows XP, UNIX.
Confidential
Role: PL/SQL Developer
The main achievement of this project was reducing manufacturing cost and improving sales forecasting by identifying key factors such as total sales revenue for the week, month and quarter against the previous week, month and quarter, broken down by region, sales channel and sale amount. Also analyzed the number of satisfied and dissatisfied customer calls compared with earlier business weeks.
Responsibilities:
- Gathered requirements from the business and created functional specifications.
- Analyzed data models based on business requirements.
- Studied the current state, designed the future state in PL/SQL and derived the extract and load strategy.
- Involved in Unit Testing, Integration Testing, and System Testing.
- Wrote stored procedures for the ledger processing and bill processing modules (a sketch follows below).
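A minimal sketch, under assumed object names (sp_process_ledger, ledger, ledger_stg), of the general shape of such a ledger-processing stored procedure, compiled here through a SQL*Plus shell step; it is an illustration, not the project's actual code.

```sh
#!/bin/sh
# Illustrative skeleton of a ledger-processing procedure; all object
# names are placeholders, not the project's actual schema.
sqlplus -s "${DB_USER}/${DB_PASS}@${DB_SID}" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
CREATE OR REPLACE PROCEDURE sp_process_ledger (p_period IN VARCHAR2) AS
BEGIN
  -- Post the period's staged entries into the ledger table.
  INSERT INTO ledger (account_no, period, debit_amt, credit_amt)
  SELECT account_no, period, debit_amt, credit_amt
  FROM   ledger_stg
  WHERE  period = p_period;

  -- Clear the staged rows once they have been posted.
  DELETE FROM ledger_stg WHERE period = p_period;

  COMMIT;
END sp_process_ledger;
/
EXIT
EOF
```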