Sr. ETL/Informatica Developer Resume
Houston, TX
SUMMARY
- Seven (7) years of IT experience in the analysis, design, development, testing, and implementation of business application systems for the Health Care, Pharmaceutical, Financial, Telecom, and Manufacturing sectors.
- Strong experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI, and client/server applications.
- Strong data warehousing ETL experience using Informatica PowerCenter 9.1/8.6.1/8.5/8.1/7.1 client tools (Mapping Designer, Repository Manager, Informatica Data Quality (IDQ) Developer, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
- Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, spanning project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
- Extensive ETL testing experience using Informatica 9.1/8.6.1/8.5/8.1/7.1/6.2/5.1 PowerCenter/PowerMart (Designer, Workflow Manager, Workflow Monitor, and Server Manager), Teradata, and Business Objects.
- Strong experience in Dimensional Modeling using Star and Snow Flake Schema, Identifying Facts and Dimensions, Physical and logical data modeling using Erwin and ER-Studio.
- Expertise in working with relational databases such as Oracle 11g/10g/9i/8x, SQL Server 2008/2005, DB2 8.0/7.0, UDB, MS Access and Teradata.
- Experience with SSIS package configuration, SSRS, SSAS, and data warehousing.
- Experience using the Business Intelligence tools (SSIS, SSAS, SSRS) in MS SQL Server 2008.
- Strong experience in Extraction, Transformation and Loading (ETL) data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manger), Power Exchange, Power Connect as ETL tool on Oracle, DB2 and SQL Server Databases.
- Experience using the SAS ETL tool, the Talend ETL tool, and SAS Enterprise Data Integration Server.
- Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.
- Extensive experience developing Stored Procedures, Functions, Views, Triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
- Created SQL Server Agent jobs to schedule SSIS packages.
- Extensively worked on performance tuning of Informatica and IDQ mappings.
- Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
- Experience in all phases of Data warehouse development from requirements gathering for the data warehouse to develop the code, Unit Testing and Documenting.
- Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.
- Complete understanding of regular matching, fuzzy logic, and deduplication limitations in the IDQ suite.
- Experience with Azure and SQL Azure, including Azure web and database deployments.
- Proficient in the Integration of various data sources with multiple relational databases like Oracle11g /Oracle10g/9i, MS SQL Server, DB2, Teradata, VSAM files and Flat Files into the staging area, ODS, Data Warehouse and Data Mart.
- Experience in using Automation Scheduling tools like Autosys and Control-M.
- Worked extensively with slowly changing dimensions.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Excellent interpersonal and communication skills; experienced in working with senior-level managers, business users, and developers across multiple disciplines.
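The fuzzy-matching and deduplication work mentioned above was done in the IDQ suite; purely as an illustration of the idea (not the IDQ implementation), a minimal fuzzy dedupe can be sketched with Python's standard-library difflib:

```python
from difflib import SequenceMatcher

def dedupe(records, threshold=0.85):
    """Keep the first record of each fuzzy-duplicate cluster.

    Records whose similarity ratio meets the threshold are treated as
    duplicates -- a loose analogue of IDQ-style fuzzy matching.
    """
    survivors = []
    for rec in records:
        if not any(SequenceMatcher(None, rec.lower(), kept.lower()).ratio() >= threshold
                   for kept in survivors):
            survivors.append(rec)
    return survivors

names = ["John Smith", "Jon Smith", "Jane Doe", "john smith"]
print(dedupe(names))  # near-duplicates of "John Smith" are dropped
```

The 0.85 threshold is an illustrative assumption; production matching in IDQ uses tuned match rules rather than a single ratio cutoff.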
TECHNICAL SKILLS
Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS
ETL Tools: Informatica Power Center 9.1/8.6/8.5/8.1/7.1 (Designer, Workflow Manager, Workflow Monitor, Repository manager and Informatica Server), Informatica Data Quality (IDQ) 9.6.
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, SQL Azure, DB2 v8.1, Teradata.
Data Modeling tools: Erwin, MS Visio
OLAP Tools: Cognos 8.0/8.1/8.2/8.4/7.0, Business Objects XI R2/6.x/5.x.
Reporting Tools: Tableau, SSIS, SSAS, SSRS.
Languages: SQL, PL/SQL, UNIX, Shell scripts, C++
Scheduling Tools: Autosys, Control-M
Testing Tools: QTP, WinRunner, LoadRunner, Quality Center, Test Director
PROFESSIONAL EXPERIENCE
Confidential, Houston TX
Sr. ETL/Informatica Developer
Responsibilities:
- Responsible for Business Analysis and Requirements Collection.
- Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Parsed high-level design specification to simple ETL coding and mapping standards.
- Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
- Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
- Created mapping documents to outline data flow from sources to targets.
- Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
- Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
- Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
- Delivered a comprehensive cloud integration and data management solution for Microsoft Azure.
- Created Informatica workflows and IDQ mappings for both batch and real-time processing.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Developed mapping parameters and variables to support SQL override.
- Created mapplets to use them in different mappings.
- Developed mappings to load into staging tables and then to Dimensions and Facts.
- Used existing ETL standards to develop these mappings.
- Worked on different tasks in Workflows, such as Sessions, Event Raise, Event Wait, Decision, E-mail, Command, Worklets, Assignment, Timer, and scheduling of the workflow.
- Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
- Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Extensively used SQL*Loader to load data from flat files into Oracle database tables.
- Modified existing mappings for enhancements of new business requirements.
- Used Debugger to test the mappings and fixed the bugs.
- Wrote UNIX shell scripts and pmcmd commands to FTP files from remote servers and to back up the repository and folders.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
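The Type 2 SCD updates above were built as Informatica mappings; as a hedged sketch of the same versioning logic (column names here are hypothetical, not the actual dimension schema):

```python
from datetime import date

def apply_scd2(dimension, incoming, today):
    """Type 2 SCD: expire the current row and append a new version when
    a tracked attribute changes; insert brand-new keys directly.

    `dimension` rows are dicts with keys:
    natural_key, attr, effective_date, end_date (None = current row).
    """
    current = {r["natural_key"]: r for r in dimension if r["end_date"] is None}
    for row in incoming:
        key, attr = row["natural_key"], row["attr"]
        existing = current.get(key)
        if existing is None:
            dimension.append({"natural_key": key, "attr": attr,
                              "effective_date": today, "end_date": None})
        elif existing["attr"] != attr:
            existing["end_date"] = today  # expire the old version
            dimension.append({"natural_key": key, "attr": attr,
                              "effective_date": today, "end_date": None})
    return dimension

dim = [{"natural_key": 1, "attr": "TX",
        "effective_date": date(2010, 1, 1), "end_date": None}]
apply_scd2(dim, [{"natural_key": 1, "attr": "CA"}], date(2011, 6, 1))
print(len(dim))  # old row expired, new version appended
```

A Type 1 mapping would instead overwrite `attr` in place, keeping no history; the choice between the two was driven by whether the business needed historical attribute values.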
Environment: Informatica PowerCenter 9/8.6.1, Informatica Data Quality (IDQ) 9.6, Workflow Manager, Workflow Monitor, Informatica PowerConnect/PowerExchange, Data Analyzer 8.1, PL/SQL, Oracle 10g/9i, Erwin, Autosys, SQL Server 2005, Sybase, UNIX AIX, Toad 9.0, Cognos 8.
Confidential, Tampa FL
ETL Consultant
Responsibilities:
- Performed logical and physical data modeling with Erwin for the data warehouse database in a star schema.
- Using Informatica PowerCenter Designer, analyzed, extracted, and transformed source data from various source systems (Oracle 10g, DB2, SQL Server, and flat files), incorporating business rules through the objects and functions the tool supports.
- Using Informatica PowerCenter created mappings and mapplets to transform the data according to the business rules.
- Used various transformations such as Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression, and Update Strategy.
- Extensively used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate score cards, create and validate rules, and provide data to business analysts for rule creation.
- Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.
- Developed stored procedures and used them in the Stored Procedure transformation for data processing; also used data migration tools.
- Documented Informatica mappings in Excel spreadsheets.
- Tuned the Informatica mappings for optimal load performance.
- Used the Teradata utilities BTEQ, FastExport (FEXP), FastLoad (FLOAD), and MultiLoad (MLOAD) to export data to and load data from flat files.
- Performed ETL development using SQL Server 2005/2008 Integration Services (SSIS).
- Created and Configured Workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
- Documented all reports and DTS and SSIS packages.
- Generated reports using OBIEE 10.1.3 for future business use.
- Carried primary responsibility for problem determination and resolution for each SAP application system database server and application server.
- Worked with the UNIX team to write shell scripts that customize server job scheduling.
- Constantly interacted with business users to discuss requirements.
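The IDE/IDQ profiling and score-carding above boils down to computing per-column statistics over a source; a simplified sketch of that idea (the sample rows and metrics are illustrative, not the actual score-card rules):

```python
def profile(rows, columns):
    """Per-column profile: null count, distinct count, completeness % --
    a rough analogue of an IDQ score card."""
    stats = {}
    total = len(rows)
    for col in columns:
        values = [r.get(col) for r in rows]
        nulls = sum(v is None or v == "" for v in values)
        distinct = len({v for v in values if v not in (None, "")})
        completeness = round(100 * (total - nulls) / total, 1) if total else 0.0
        stats[col] = {"nulls": nulls, "distinct": distinct,
                      "completeness": completeness}
    return stats

rows = [{"id": 1, "city": "Tampa"},
        {"id": 2, "city": ""},
        {"id": 3, "city": "Tampa"}]
print(profile(rows, ["id", "city"]))
```

Profiles like this are what let the business analysts see, per column, how complete and how varied the source data actually is before rules are written.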
Environment: Informatica PowerCenter Designer 8.6.1, Informatica Data Quality (IDQ) 9.6, Informatica Repository Manager, Oracle 10g/9i, DB2 6.1, Erwin, TOAD, SAP 3.1.H, UNIX (SunOS), PL/SQL, SSIS, SSAS, SSRS.
Confidential, Phoenix, AZ
Sr. ETL/Informatica Developer
Responsibilities:
- Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
- Designed ETL specification documents for all the projects.
- Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.
- Extracted data from Flat files, DB2, SQL and Oracle to build an Operation Data Source. Applied business logic to load the data into Global Data Warehouse.
- Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
- Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
- Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter, and Union to develop robust mappings in the Informatica Designer.
- Extensively used the Add Currently Processed Flat File Name port to load the flat file name, and the contract number derived from it, into the target.
- Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
- Worked on different tasks in Workflow Manager like Sessions, Events raise, Event wait, Decision, E-mail, Command, Worklets, Assignment, Timer and Scheduling of the workflow.
- Extensively used workflow variables, mapping parameters and mapping variables.
- Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
- Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
- Implemented Informatica recommendations, methodologies and best practices.
- Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.
- Involved in Unit, Integration, System, and Performance testing levels.
- Written documentation to describe program development, logic, coding, testing, changes and corrections.
- Migrated code to the QA (testing) environment and supported the QA team and UAT users.
- Created detailed Unit Test Document with all possible Test cases/Scripts.
- Conducted code reviews of teammates' code before moving it into QA.
- Provided support to develop the entire warehouse architecture and plan the ETL process.
- Modified existing mappings for enhancements of new business requirements.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
- Involved in production support.
- Worked as a fully contributing team member, under broad guidance, with independent planning and execution responsibilities.
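The daily incremental loads into staging described above typically depend on a "last extracted" watermark; a minimal sketch of that pattern (the `updated_at` column name is a hypothetical stand-in for the actual source timestamp):

```python
from datetime import datetime

def incremental_extract(source_rows, last_watermark):
    """Select only rows modified since the previous run and return them
    together with the new watermark to persist for the next session."""
    fresh = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=last_watermark)
    return fresh, new_watermark

rows = [{"id": 1, "updated_at": datetime(2011, 3, 1)},
        {"id": 2, "updated_at": datetime(2011, 3, 5)}]
fresh, wm = incremental_extract(rows, datetime(2011, 3, 2))
print(len(fresh), wm)
```

In PowerCenter the same effect is usually achieved with a mapping variable in the Source Qualifier's filter, updated at session success, rather than application code.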
Environment: Informatica PowerCenter 8.6/8.1, Oracle 11g, SQL Server 2008, IBM iSeries (DB2), MS Access, Windows XP, Toad, Tidal, Cognos 8.4.1, SQL Developer.
Confidential, Roseville CA
ETL Developer
Responsibilities:
- Involved in the analysis of the user requirements and identifying the sources.
- Created technical specification documents based on the requirements, using source-to-target (S2T) documents.
- Involved in the preparation of high-level and low-level design documents.
- Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart.
- Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage, ODS and Mart.
- Followed the Ralph Kimball (bottom-up) data warehouse methodology, in which individual data marts such as the Shipment, Job Order Cost, Net Contribution, and Detention & Demurrage marts provide views into organizational data and are later combined into a Management Information System (MIS).
- Prepared a Level 2 update plan to assign work to team members and track the status of each task.
- Administered the repository by creating folders and logins for the group members and assigning necessary privileges.
- Designed and developed Informatica Mappings and Sessions based on business user requirements and business rules to load data from source flat files and oracle tables to target tables.
- Worked on various kinds of transformations like Expression, Aggregator, Stored Procedure, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
- Developed reusable Mapplets and Transformations.
- Used debugger to debug mappings to gain troubleshooting information about data and error conditions.
- Involved in monitoring the workflows and in optimizing the load times.
- Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
- Involved in writing procedures, functions in PL/SQL.
- Developed mappings in Informatica using BAPI and ABAP function calls in SAP.
- Used Remote Function Call (RFC) as the SAP interface for communication between systems.
- Implemented RFCs for the caller and called function modules running in the same system.
- Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
- Worked with SQL*Loader tool to load the bulk data into Database.
- Prepared UNIX shell scripts and scheduled them in Autosys for automatic execution at specific times.
- Used Rational ClearCase for version control of all files and folders (check-out/check-in).
- Prepared test Scenarios and Test cases in HP Quality Center and involved in unit testing of mappings, system testing and user acceptance testing.
- Tracked and reported defects using Rational ClearQuest.
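The Change Data Capture work above compares new source state against previously loaded state to emit only the changes; a simplified snapshot-diff sketch (real CDC is usually log-based, so this is an analogy, not the mechanism used):

```python
def capture_changes(previous, current):
    """Diff two keyed snapshots into insert/update/delete change sets --
    a simplified stand-in for Change Data Capture."""
    inserts = {k: v for k, v in current.items() if k not in previous}
    updates = {k: v for k, v in current.items()
               if k in previous and previous[k] != v}
    deletes = [k for k in previous if k not in current]
    return inserts, updates, deletes

prev = {1: "open", 2: "closed"}
curr = {1: "open", 2: "shipped", 3: "open"}
print(capture_changes(prev, curr))
```

Loading only these deltas, instead of re-extracting full tables, is what makes the warehouse ETL cheap enough to run on a tight batch window.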
Environment: Informatica PowerCenter 8.6/8.1, SQL*Loader, IDOC, RFC, HP Quality Center, Oracle 9i/10g, Autosys, Rational ClearCase, Rational ClearQuest, Windows XP, TOAD, UNIX.
Confidential
ETL Developer
Responsibilities:
- Analyzed the requirements and framed the business logic for the ETL process.
- Extracted data from Oracle as one of the source databases.
- Involved in JAD sessions for the requirements gathering and understanding.
- Involved in the ETL design and its documentation.
- Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system using ER-STUDIO.
- Followed Star Schema to design dimension and fact tables.
- Handled slowly changing dimensions.
- Collected and linked metadata from diverse sources, including Oracle relational databases, XML, and flat files.
- Responsible for the development, implementation and support of the databases.
- Designed and developed functions, procedures, triggers, and packages extensively in PL/SQL.
- Developed mappings in Informatica to load the data including facts and dimensions from various sources into the Data Warehouse, using different transformations like Source Qualifier, JAVA, Expression, Lookup, Aggregate, Update Strategy and Joiner.
- Developed reusable Mapplets and Transformations.
- Used a data integrator tool to support batch and real-time integration, working on the staging and integration layers.
- Optimized the performance of the mappings through various tests on sources, targets, and transformations.
- Designed and developed Informatica mappings and workflows; identified and removed bottlenecks to improve their performance.
- Reviewed existing code and led efforts to tweak and tune the performance of existing Informatica processes.
- Scheduled sessions to extract, transform, and load data into the warehouse database per business requirements.
- Scheduled the tasks using Autosys.
- Loaded the flat files data using Informatica to the staging area.
- Created generic, reusable shell scripts.
- Created high level design documents, technical specifications, coding, unit testing and resolved the defects using Quality Center 10.
- Developed unit/assembly test cases and UNIX shell scripts to run along with daily/weekly/monthly batches to reduce or eliminate manual testing effort.
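The Lookup transformation used in the mappings above resolves each fact row's natural key to a dimension surrogate key; the equivalent in-memory join, sketched in Python (key and column names are hypothetical):

```python
def lookup_enrich(fact_rows, dim_rows, default_key=-1):
    """Connected-lookup analogue: attach each fact's dimension surrogate
    key, falling back to a default key when no match is found."""
    lookup = {d["natural_key"]: d["surrogate_key"] for d in dim_rows}
    return [{**f, "dim_sk": lookup.get(f["natural_key"], default_key)}
            for f in fact_rows]

dims = [{"natural_key": "A", "surrogate_key": 101}]
facts = [{"natural_key": "A", "amount": 10},
         {"natural_key": "B", "amount": 5}]
print(lookup_enrich(facts, dims))
```

The default key (-1 here) mirrors the common warehouse convention of routing unmatched facts to an "unknown" dimension row rather than dropping them.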
Environment: Windows XP/NT, Informatica PowerCenter 8.6, UNIX, Teradata V14, Oracle 11g, Oracle Data Integrator, SQL, PL/SQL, SQL Developer, Erwin, Oracle Designer, MS Visio, Autosys, Korn Shell, Quality Center 10.