ETL/Informatica Developer Resume
Columbus, OH
SUMMARY
- 7+ years of IT experience designing, implementing, developing, testing, and maintaining ETL components for Data Warehouses and Data Marts across the Health Care, Retail, Finance, Insurance, and Banking domains.
- Experienced in all stages of the software development lifecycle (Waterfall and Agile models) for building a data warehouse.
- Well versed in Informatica PowerCenter 9.x/8.x/7.x Designer tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer), Workflow Manager tools (Task Developer, Worklet Designer, and Workflow Designer), Repository Manager, and the Admin Console.
- Troubleshot data warehouse bottlenecks and performed performance tuning: session and mapping tuning, session partitioning, and pushdown optimization.
- Extensively worked on various transformations like Lookup, Joiner, Router, Rank, Sorter, Aggregator, Expression, etc.
- Exposure to Data Warehousing, Data Architecture, Data Modeling, Data Analysis, SDLC Methods and GUI applications.
- Good understanding of, and experience with, the Ralph Kimball and Bill Inmon methodologies, creating entity-relationship and dimensional models using concepts such as star schema and snowflake schema modeling.
- Understanding & Working knowledge of Informatica CDC (Change Data Capture).
- Extensive experience with data Extraction, Transformation, and Loading (ETL) from heterogeneous sources: relational databases such as Oracle, Netezza, Teradata, DB2, SQL Server, and MS Access, as well as fixed-width, delimited, CSV, and XML flat files, integrated into a common reporting and analytical data model using Informatica.
- Experience with Teradata utilities such as FastLoad, FastExport, MultiLoad, TPump, and TPT, and in writing BTEQ scripts.
- Experience with Oracle tools and utilities such as SQL*Loader and TOAD.
- Extensively used SQL and PL/SQL for development of Procedures, Functions, Packages and Triggers.
- Experience using Informatica command-line utilities such as pmcmd to execute workflows in UNIX environments (a representative invocation is sketched at the end of this summary).
- Good programming skills in SQL, PL/SQL, and T-SQL.
- Used CA7 for job automation, and ClearCase and Subversion for version control.
- Proficient in data warehousing techniques such as slowly changing dimensions, surrogate key assignment, normalization and de-normalization, data cleansing, performance optimization, and Change Data Capture (CDC).
- Good experience in UNIX shell scripting and Windows batch scripting for parsing files and automating batch ETL jobs.
- Strong knowledge in RDBMS concepts and extensive experience in creation and maintenance of Tables, Views, Materialized Views, Stored Procedures, Packages, Synonyms, Indexes, Triggers, Bulk Load and PL/SQL programming.
- Versatile team player with excellent analytical, interpersonal, communication, and presentation skills, able to interact successfully with team members and business customers.
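A representative pmcmd invocation of the kind referenced above; this is a minimal sketch, and the domain, integration service, user, folder, and workflow names are hypothetical placeholders:

```sh
#!/bin/sh
# Minimal sketch of starting an Informatica workflow with pmcmd.
# The domain, integration service, user, folder, and workflow names are
# hypothetical placeholders; REPO_PASSWD is expected in the environment.
INFA_DOMAIN=Domain_ETL
INFA_IS=IS_ETL_PROD
REPO_USER=etl_ops

# -wait blocks until the workflow finishes, so the shell exit code
# reflects the success or failure of the run.
pmcmd startworkflow \
  -sv "$INFA_IS" -d "$INFA_DOMAIN" \
  -u "$REPO_USER" -p "$REPO_PASSWD" \
  -f FOLDER_SALES -wait wf_daily_sales_load

if [ $? -ne 0 ]; then
  echo "wf_daily_sales_load failed" >&2
  exit 1
fi
```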
TECHNICAL SKILLS
ETL TOOL: Informatica PowerCenter 9.5.1/9.1.1/8.x/7.x
DATABASES: Oracle 11g/10g, Teradata, Netezza, MS SQL Server 2005/2008, DB2 UDB, MS Access
TOOLS/UTILITIES: Teradata SQL Assistant, TOAD, PuTTY, SQL*Loader, Visio, SQL*Plus, Query Analyzer, FastLoad, FastExport, MultiLoad, BTEQ, TPT, TPump, AQT, pmcmd
OPERATING SYSTEMS: Windows XP, Windows 2003, 2005, 2008, UNIX 5.05, Linux
PROGRAMMING: C, C++, Visual Basic, UNIX Shell Scripting, Windows Batch Scripting, SQL, PL/SQL, T-SQL, Java, Perl
SCHEDULERS: WLM, Informatica Scheduler, ControlM
WEB TECHNOLOGIES: HTML, XML, DHTML
PROFESSIONAL EXPERIENCE
Confidential, Columbus, OH
ETL/Informatica Developer
Responsibilities:
- Worked with Business analysts and the DBA for requirements gathering, business analysis and designing of the data marts.
- Prepared technical specification documents for developing Informatica Extraction, Transformation, and Loading (ETL) mappings to load data into various Data Mart tables, and defined ETL standards.
- Estimated and planned development work using Agile software development practices.
- Worked within the Agile methodology and the Scrum process.
- Ensured performance metrics were met and tracked.
- Worked in Informatica PowerCenter Designer - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Developed processes to generate daily, weekly, and monthly data extracts and sent the data files to the downstream applications.
- Updated status and planned releases through Scrum meetings.
- Proficient with the Informatica PowerCenter ETL (Extract, Transform, Load) tool.
- Designed Conceptual, Logical and Physical Model based on normalization standards (Second Normal Form-2NF/Third Normal Form-3NF).
- Involved in Data modeling, E/R diagrams, normalization and de-normalization as per business requirements.
- Comfortable with both technical and functional applications of RDBMS, Data Mapping, Data management and Data transportation.
- Experience in Performance tuning of SQL queries and mainframe applications.
- Designed and Developed ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems such as Oracle, Flat files, XML, SQL Server.
- Used transformations such as connected and unconnected Lookups, Filter, Router, Joiner, Stored Procedure, and Sequence Generator.
- Made session runs more flexible using mapping parameters and variables, and used parameter files and variable functions to manipulate them.
- Created complex mappings which involved Slowly Changing Dimensions, implementation of Business logic and captured the deleted records from the source systems.
- Used Update Strategy transformations to effectively load data from source to target.
- Worked on slowly changing dimensions Type 1 and Type 2.
- Used dynamic lookups to implement CDC.
- Worked with Rational ClearCase for version control and ClearQuest for defect tracking.
- Worked on JMS integration with Informatica.
- Made web service calls through Informatica, generating and testing the incoming XML with the SoapUI tool.
- Configured tasks and workflows using workflow manager.
- Involved in the creation of Oracle Tables, Views, Materialized views and PL/SQL stored procedures and functions.
- Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and the Target Data.
- Tuned the Sessions for better performance by eliminating various performance bottlenecks.
- Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager.
- Wrote Perl scripts to load data from sources into staging tables, create indirect file lists, and generate parameter files for the respective paths (a shell-based sketch of the file-list and parameter-file generation appears below).
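A minimal sketch of the indirect file-list and parameter-file generation referenced above, written in shell for illustration (the original utilities were Perl); all paths, folder, workflow, session, and parameter names are hypothetical:

```sh
#!/bin/sh
# Minimal sketch (shell rather than the original Perl) of generating an
# indirect file list and a parameter file for an Informatica session.
# All paths and names below are hypothetical placeholders.
SRC_DIR=/data/incoming/claims
LIST_FILE=/infa/srcfiles/claims_filelist.txt
PARM_FILE=/infa/parmfiles/wf_claims_load.parm
LOAD_DATE=$(date +%Y-%m-%d)

# Indirect file list: one source file path per line; the session's
# source file type is set to "Indirect" so PowerCenter reads each file.
ls "$SRC_DIR"/claims_*.dat > "$LIST_FILE"

# Parameter file: the section header names the folder, workflow, and
# session; $$ variables are mapping parameters, $ variables are session
# parameters.
cat > "$PARM_FILE" <<EOF
[FOLDER_CLAIMS.WF:wf_claims_load.ST:s_m_load_claims]
\$\$LOAD_DATE=$LOAD_DATE
\$InputFile_SrcList=$LIST_FILE
EOF
```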
Environment: Informatica PowerCenter 9.5.1, Oracle 11g, Autosys, SQL Server, UNIX, Ruby, Perl, MS Visio.
Confidential, Denver, CO
ETL/Informatica Developer
Responsibilities:
- Worked and coordinated with Data Architects, Business Analysts & users to understand business and functional needs and implement the same into an ETL design document.
- Created technical specification documents outlining the implementation plans for the requirements and for cleansing the incoming raw data.
- Worked with Informatica PowerCenter 9.5.1 to load the Oracle 11g warehouse, reading SQL Server data and COBOL files on a UNIX AIX 5.3.8.0 platform.
- Worked on Source Analyzer, Mapping & Mapplet Designer and Transformations, Informatica Repository Manager, Workflow Manager and Workflow Monitor.
- Developed mappings with transformations such as Filter, Joiner, Sequence Generator, and Aggregator, and performed query overrides in Lookup transformations as and when required to improve mapping performance.
- Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.
- Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
- Involved in the creation of Oracle tables, table partitions, materialized views, indexes, and PL/SQL stored procedures, functions, triggers, and packages.
- Created Workflows with worklets, event wait, decision box, email and command tasks using Workflow Manager and monitored them in Workflow Monitor.
- Used the Teradata external loading utilities MultiLoad, TPump, FastLoad, and FastExport to extract from and load effectively into the Teradata database.
- Developed complex T-SQL queries and designed SSIS packages to load the data into warehouse.
- Designed SSIS packages using several transformations to perform data profiling, data cleansing, and data transformation.
- Updated numerous BTEQ/SQL scripts, made the appropriate DDL changes, and completed unit and system testing.
- Prepared BTEQ scripts to load data from the Preserve area to the Staging area (a minimal BTEQ sketch appears at the end of this section).
- Used UNIX Shell Scripting to invoke Sessions in the workflow manager.
- Ran install scripts through UNIX shell scripts to run the workflows, and used Autosys to schedule the scripts.
- Experience with FTR DPI for XML files.
- Worked on web service calls through Informatica, generating and testing the incoming XML using the SoapUI tool.
- Extensively worked with SCD Type-I, Type-II and Type-III dimensions and data warehousing Change Data Capture (CDC).
- Designed and developed the ETL architecture for the Real Property database in 3NF.
- Designed SSIS Packages to transfer data between servers, load data into database; Scheduled the jobs to do these tasks periodically.
- Created SSIS packages for application that would transfer data among servers and perform other data transformations.
- Implemented pre-session and post-session scripts for Informatica sessions.
- Implemented Slowly Changing dimensions to update the dimensional schema.
- Designed the ETL process using Informatica to load data from sources to targets and perform data transformations; TOAD was used for Oracle.
- Worked within the Waterfall methodology.
- Implemented performance tuning by identifying bottlenecks in sources, targets, transformations, mappings, and sessions, and improved performance by tuning components such as parameter files, variables, dynamic caches, and SQL queries.
- Used session and workflow logs to debug sessions during performance tuning.
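A minimal sketch of a Preserve-to-Staging BTEQ load of the kind referenced above, wrapped in a shell heredoc; the tdpid, credentials, and database/table names are hypothetical:

```sh
#!/bin/sh
# Minimal BTEQ sketch: insert-select from a Preserve-area table into a
# Staging-area table. The tdpid, credentials, and database/table names
# are hypothetical; TD_PASSWD is expected in the environment.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWD};
.SET ERROROUT STDOUT;
.IF ERRORCODE <> 0 THEN .QUIT 1;

INSERT INTO STG_DB.STG_CLAIMS
SELECT *
FROM   PRESERVE_DB.PRS_CLAIMS
WHERE  LOAD_DT = CURRENT_DATE;

-- Propagate SQL failures to the shell as a non-zero exit code.
.IF ERRORCODE <> 0 THEN .QUIT 2;
.LOGOFF;
.QUIT 0;
EOF
exit $?
```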
Environment: Informatica PowerCenter 9.5.1, Teradata TD13.10, Teradata Tools and Utilities, Oracle 11g, Oracle Utilities, SQL Server, Perl Scripting, UNIX AIX 5.3.8.0, Tableau, COBOL, IBM Mainframe, DB2
Confidential, New York City, New York
Informatica Developer
Responsibilities:
- Gathered the system requirements and created mapping document which gives detail information about source to target mapping and business rules implementation.
- Designed, developed and debugged ETL Mappings using Informatica designer tool.
- Created complex mappings using Aggregator, Expression, Joiner, Filter, Stored Procedure, Connected & Unconnected Lookup, JAVA and Update Strategy transformations using Informatica PowerCenter Designer.
- Involved in creating CA7 jobs to automate monthly runs.
- Implemented slowly changing dimensions to retain the full history of account and transaction information (an illustrative SQL equivalent of the Type 2 logic appears at the end of this section).
- Created and configured workflows, worklets, and sessions using Informatica Workflow Manager.
- Worked with SQL Server 2000, writing T-SQL code.
- Used mapping parameters and variables to make mappings generic and facilitate code reusability.
- Used Debugger to test the mappings and fixed the bugs.
- Worked with shortcuts across shared and non-shared folders.
- Monitored sessions using the workflow monitor, which were scheduled, running, completed or failed. Debugged mappings for failed sessions.
- Involved in migrating the mappings and workflows from Development to Testing and then to Production environments.
- Worked with SAP and Oracle sources to process the data
- Was involved in migrating Informatica objects from Development and Test to PROD using deployment groups
- Improved ETL Performance by using the Informatica Pipeline Partitioning.
- Created Partitions through the session wizard in the Workflow Manager to increase the performance.
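Within PowerCenter the Type 2 logic was built from Lookup and Update Strategy transformations; the shell-wrapped SQL below is only an illustrative equivalent of the close-and-insert pattern, and the table, column, and sequence names are hypothetical:

```sh
#!/bin/sh
# Illustrative SQL equivalent of SCD Type 2 "close and insert" logic.
# In the actual work this was implemented with Informatica Lookup and
# Update Strategy transformations; all names below are hypothetical.
sqlplus -s "$ORA_USER/$ORA_PASSWD@$ORA_TNS" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE ROLLBACK;

-- Expire the current dimension row when a tracked attribute changed.
UPDATE dim_account d
SET    d.eff_end_dt = TRUNC(SYSDATE) - 1,
       d.curr_flag  = 'N'
WHERE  d.curr_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_account s
               WHERE  s.account_id = d.account_id
               AND    s.account_status <> d.account_status);

-- Insert a new current row for changed and brand-new accounts.
INSERT INTO dim_account
      (account_key, account_id, account_status,
       eff_start_dt, eff_end_dt, curr_flag)
SELECT dim_account_seq.NEXTVAL, s.account_id, s.account_status,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_account s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_account d
                   WHERE  d.account_id = s.account_id
                   AND    d.curr_flag  = 'Y');

COMMIT;
EOF
```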
Environment: Informatica PowerCenter 9.1.1, TSQL, Metadata Manager, Tableau, Linux, Oracle 11g, SAP, XML Files, UNIX.
Confidential, Golden Valley, MN
ETL Developer
Responsibilities:
- Involved in the design and development of the ETL process across the full data warehouse life cycle (SDLC).
- Coding and testing of various database objects such as views, functions and stored procedures using SQL and PL/SQL
- Involved in designing the procedures for getting data from all the source systems into the data warehousing system.
- Loaded all membership/claims data per the business requirements by profiling the data, cleansing the raw source data, applying business transformations, massaging the data, and performing SQL optimization and performance tuning.
- Developed a number of complex Informatica mappings, mapplets, and reusable transformations to facilitate one-time, daily, monthly, and yearly data loads.
- Performance tuning on sources, targets, mappings, sessions and SQL queries in the transformations to improve the daily load performance.
- Designed and developed complex Aggregate, Join, Router, Look up and Update strategy transformation based on the business rules.
- Created Dynamic lookup transformation to increase the session performance.
- Experience in integration of various data sources like Oracle, DB2 UDB, Teradata, SQL server and MS access into staging area.
- Used scripting and scheduled pmcmd to interact with the Informatica server from command mode, and was involved in job scheduling, monitoring, and production support in a 24/7 environment.
- Migrated sessions, workflows, mappings, and other objects from Development to QA and Production by exporting and importing them as XML files.
- Improved the data quality by understanding the data and performing Data profiling.
- Implemented procedures and functions in PL/SQL for Stored Procedure transformations (a brief sketch appears at the end of this section).
- Used various Oracle Index techniques to improve the query performance.
- Gained working knowledge of ControlM Scheduling Tool for loading/force starting jobs, changing job status and monitoring job progress.
- Prepared BTEQ scripts to load data from Preserve area to Staging area.
- Worked with Rational ClearCase for version control and ClearQuest for defect tracking.
- Worked on Teradata SQL Assistant querying the source/target tables to validate the BTEQ scripts.
- Developed mappings according to business rules and migrated them to QA and Production, following naming conventions and mapping design standards; good knowledge of data warehouse and PL/SQL concepts, ODBC connections, etc.
- Tracked the defects using Clear Quest tool and generated defect summary reports.
- Wrote UNIX shell scripts and used them as pre- and post-session commands in the sessions (a pre-session example is sketched at the end of this section).
- Worked with different Teradata utilities such as BTEQ, FastLoad, MultiLoad, TPump, and FastExport.
- Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
- Created test cases for unit testing, system integration testing, and UAT to validate the data.
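A minimal sketch of a PL/SQL function of the kind called from a Stored Procedure transformation, as referenced above; the function, parameters, and rate table are hypothetical:

```sh
#!/bin/sh
# Minimal sketch of a PL/SQL function of the kind called from an
# Informatica Stored Procedure transformation. The function name,
# arguments, and the rate table are hypothetical placeholders.
sqlplus -s "$ORA_USER/$ORA_PASSWD@$ORA_TNS" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE;

CREATE OR REPLACE FUNCTION f_get_plan_rate (
    p_plan_cd  IN VARCHAR2,
    p_as_of_dt IN DATE
) RETURN NUMBER IS
    v_rate NUMBER;
BEGIN
    -- Look up the rate effective on the given date; Informatica maps
    -- the return value to an output port of the transformation.
    SELECT r.rate
    INTO   v_rate
    FROM   plan_rates r
    WHERE  r.plan_cd = p_plan_cd
    AND    p_as_of_dt BETWEEN r.eff_start_dt AND r.eff_end_dt;

    RETURN v_rate;
EXCEPTION
    WHEN NO_DATA_FOUND THEN
        RETURN NULL;  -- let downstream logic decide how to handle misses
END f_get_plan_rate;
/
EOF
```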
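A minimal sketch of a pre-session command script of the kind referenced above: it checks that the source file arrived and is non-empty before the session reads it; the paths and file names are hypothetical:

```sh
#!/bin/sh
# Minimal pre-session command sketch: verify the source file arrived
# and is non-empty before the session starts; a non-zero exit fails the
# session when "Fail task if any command fails" is enabled.
# The paths and file names are hypothetical placeholders.
SRC_FILE=/data/incoming/members/members_$(date +%Y%m%d).dat

if [ ! -s "$SRC_FILE" ]; then
  echo "Pre-session check failed: $SRC_FILE missing or empty" >&2
  exit 1
fi

# Keep a copy for reruns/audit before the session consumes the file.
cp "$SRC_FILE" /data/archive/members/
exit 0
```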
Environment: Informatica PowerCenter 8.6 (Designer, Server Manager, Repository Manager), ControlM Scheduler, Autosys, SQL Server 2008, DB2 UDB, Erwin, SQL, PL/SQL, Perl Scripting, UNIX, ClearCase, Putty, Windows 2007, Teradata V2R6.
Confidential, Scranton, PA
ETL Developer
Responsibilities:
- Worked with Data Architects, Business Analysts and Independent Testing Team.
- Responsible for analyzing, programming, and implementing modifications to existing systems required by changes in the business environment.
- Provided technical documentation of Source and Target mappings
- Extensively worked on creating mappings using Informatica Designer and processing tasks using Workflow Manager to configure data flows from multiple sources (flat files, HTML files, DB2, Oracle) to target persistent data stores.
- Developed various mappings using Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Source Qualifier, Expression, Stored Procedure and Update transformations
- Developed shell scripts for the initial conversion of dimensions and for validation of source files and data loading procedures; wrote multiple scripts for pre-processing the files.
- Designed and coded maps, which extracted data from existing, source systems into the data warehouse
- Designed and developed UNIX scripts and used AUTOSYS, ControlM for job scheduling.
- Applied other performance tuning techniques, such as reducing the number of transformations used in a mapping, using an unconnected lookup when the same lookup is needed from multiple places in a mapping, and performing joins at the source level (for homogeneous sources) instead of using a Joiner transformation.
- Used the Teradata external loading utilities MultiLoad, TPump, FastLoad, and FastExport to extract from and load effectively into the Teradata database (a FastLoad sketch appears at the end of this section).
- Worked on data partitioning to optimize parallel processing with multi-threaded processing.
- Involved in unit testing; interacted with the QA team for system and integration testing.
- Scheduled the Informatica sessions and batches using event-based scheduling.
- Used Debugger in troubleshooting the existing mappings
- Prepared BTEQ scripts to load data from Preserve area to Staging area.
- Involved in writing scripts for loading data to target data warehouse for BTEQ, FastLoad and MultiLoad.
- Involved in fine-tuning SQL overrides and lookup SQL overrides for performance enhancements.
- Worked on Pushdown Optimization and Partitioning at Session Level
- Extensively involved in parameterization of the workflow objects
- Executed sessions, sequential and concurrent batches for proper execution of mappings
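A minimal Teradata FastLoad sketch of the kind referenced above, loading a pipe-delimited flat file into an empty staging table; the tdpid, credentials, file, and table names are hypothetical:

```sh
#!/bin/sh
# Minimal Teradata FastLoad sketch: bulk-load a pipe-delimited flat
# file into an empty staging table. The tdpid, credentials, table, and
# file names are hypothetical; TD_PASSWD is expected in the environment.
fastload <<EOF
LOGON tdprod/etl_user,${TD_PASSWD};

BEGIN LOADING STG_DB.STG_ORDERS
    ERRORFILES STG_DB.STG_ORDERS_ERR1, STG_DB.STG_ORDERS_ERR2;

SET RECORD VARTEXT "|";

DEFINE
    order_id   (VARCHAR(18)),
    order_dt   (VARCHAR(10)),
    order_amt  (VARCHAR(18))
FILE = /data/incoming/orders/orders.dat;

INSERT INTO STG_DB.STG_ORDERS
VALUES (:order_id, :order_dt, :order_amt);

END LOADING;
LOGOFF;
EOF
exit $?
```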
Environment: Informatica 8.6, Teradata V2R6, Teradata Tools and Utilities, Netezza Bulk Reader/Writer, Data Quality, Windows Server 2008, DB2, AQT, Oracle 10g, Flat Files, Unix shell scripting