
Sr. Informatica / ETL Developer Resume

Washington, DC

SUMMARY:

  • 9+ years of experience in Information Technology, data warehousing, and ETL application development using Informatica (PowerCenter/PowerMart).
  • Implemented data warehousing methodologies for Extraction, Transformation and Loading using Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor, Repository Server Administration Console.
  • Extensively worked with Informatica ETL transformations, including Source Qualifier, Connected/Unconnected Lookup, Filter, Expression, Router, Normalizer, Joiner, and Update Strategy, and created complex mappings.
  • Strong experience in designing and developing Business Intelligence solutions in Data Warehousing/Decision Support Systems using Informatica PowerCenter 9.6/9.1/8.6/8.1/7.1.
  • Good Experience in developing Unix Shell Scripts for automation of ETL process.
  • Worked on exception-handling mappings for data quality, data profiling, data cleansing, and data validation.
  • Expertise in creating databases, users, tables, triggers, views, stored procedures, functions, Packages and indexes
  • Implemented Slowly Changing Dimension (SCD) Type 1 and Type 2 methodologies for preserving the full history of accounts and transaction information.
  • Involved in understanding requirements and in modeling activities for the attributes identified from different source systems such as Oracle, Teradata, SQL Server, and flat files.
  • Expertise in Data modeling techniques like Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
  • Able to read XML sources and create XML files using the XML Generator transformation.
  • Good understanding of relational database management systems such as Oracle, Teradata, DB2, and SQL Server; extensively worked on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems and from mainframe COBOL and VSAM files.
  • Experienced in mapping performance optimization in Informatica.
  • Experienced in using the Oracle SQL*Loader utility for bulk loading data from flat files into database tables.
  • Involved in Relational database design and development of data warehouse data feeds.
  • Involved in Complete Software Development Lifecycle Experience (SDLC) from Business Analysis to Development, Testing, Deployment and Documentation
  • Very strong in relational databases (RDBMS) and data modeling; designed and built data warehouses and data marts using star and snowflake schemas for faster, more effective querying.
  • Excellent team player; self-motivated, with good communication skills.
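
The Type 2 SCD work listed above is, at its core, an expire-then-insert sequence against the dimension table. A minimal sketch, assuming hypothetical table, column, and sequence names (dim_customer, stg_customer, dim_customer_seq) that stand in for any real schema:

```shell
#!/bin/sh
# Sketch only: write the two statements of a Type 2 SCD load to a file.
# All object names below are illustrative placeholders.
cat > scd2_load.sql <<'EOF'
-- Step 1: expire the current row when a tracked attribute changed
UPDATE dim_customer d
   SET d.end_date    = SYSDATE,
       d.current_flg = 'N'
 WHERE d.current_flg = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.cust_id = d.cust_id
                  AND s.address <> d.address);

-- Step 2: insert the new version as the current row
INSERT INTO dim_customer
       (cust_key, cust_id, address, start_date, end_date, current_flg)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.address, SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.cust_id     = s.cust_id
                      AND d.current_flg = 'Y'
                      AND d.address     = s.address);
EOF
echo "wrote scd2_load.sql"
```

In PowerCenter the same logic is typically expressed with a Lookup on the dimension plus an Update Strategy (DD_UPDATE/DD_INSERT) rather than raw SQL; the sketch shows the equivalent set-based shape.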

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 10.1.0/9.6/9.1/8.6/8.1/7.1 (PowerCenter Repository, Informatica Designer, Workflow Manager, Workflow Monitor), Informatica Developer (IDQ) and Analyst (IDE).

Databases: MS SQL Server 2005/2008, Oracle 8i/9i/10g/11g/12c, Teradata, IBM DB2, Sybase

Data Modeling Tools: ERWIN 4.0/3.4, Star Schema Modeling, Snow Flake Modeling, Fact and Dimensions tables, Entities, Attributes

Database Skills: Cursors, Stored Procedures, Functions, Views, Triggers and Packages

Client-Side Skills: SQL, T-SQL, PL/SQL, UNIX Shell Scripting, XML, PuTTY

Web Servers: IIS v5.0/6.0, Apache Tomcat

Operating Systems: Windows 2000/2003/XP/Vista/7, UNIX, Linux.

Methodologies: Master Data Modeling Logical/ Physical, Star/ Snowflake Schema, FACT& Dimension Tables, ETL, OLAP.

PROFESSIONAL EXPERIENCE:

Confidential, Washington, DC

Sr. Informatica / ETL Developer

Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
  • Involved in building Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL override.
  • Created mapplets to use them in different mappings.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Extensively used SQL*Loader to load data from flat files into Oracle database tables.
  • Modified existing mappings for enhancements of new business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Analyzed the source systems for integrity issues in the data residing in them and defined the corresponding Systems and Trust configuration.
  • Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
  • Created a mapping in the MDM Hub to load data into the MDM Landing, Staging, and Base Object (BO) tables.
  • Worked with Business to create a composite key into the MDM Landing table.
  • Executed Stage load jobs in MDM Hub.
  • Involved in creating, monitoring, modifying, & communicating the project plan with other team members.
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
  • Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and backup of repository and folder.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
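
Automation like the pmcmd scripting above is usually a thin wrapper around `pmcmd startworkflow`. A hedged sketch; the service, domain, folder, and workflow names are placeholders, and the script only echoes the command when pmcmd is not installed:

```shell
#!/bin/sh
# Placeholder connection details; a real script would source these from a config.
INT_SERVICE="IS_DEV"
DOMAIN="DOM_DEV"
FOLDER="EDW_LOADS"
WORKFLOW="wf_load_customer_dim"

# -uv/-pv make pmcmd read the user and password from environment
# variables, so credentials never sit inside the script itself.
CMD="pmcmd startworkflow -sv $INT_SERVICE -d $DOMAIN -uv INFA_USER -pv INFA_PASSWD -f $FOLDER -wait $WORKFLOW"

if command -v pmcmd >/dev/null 2>&1; then
    $CMD || { echo "workflow $WORKFLOW failed" >&2; exit 1; }
else
    echo "DRY RUN: $CMD"    # pmcmd not available in this environment
fi
```

The `-wait` flag blocks until the workflow completes, so the script's exit code can drive a scheduler such as Autosys.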

Environment: Informatica PowerCenter 9.6/10.1, Workflow Manager, Workflow Monitor, Informatica PowerConnect/PowerExchange, Data Analyzer 8.1, PL/SQL, Oracle 10g/9i, Erwin, Autosys, SQL Server 2005, UNIX AIX

Confidential, Silver Spring, MD

Sr. Informatica / ETL Developer

Responsibilities:

  • Worked with Data Warehouse analysts for requirement gathering, business analysis, and translated the business requirements into Data Requirements specifications to build the Enterprise Data Warehouse and Data Model.
  • Analyzed the key functionalities of the institute and performed Data Analysis and abstracted the transactional nature of data.
  • Profiled the data using Informatica Analyst tool to analyze source data (Departments, party and address) coming from Legacy systems and performed Data Quality Audit.
  • Cleansed and scrubbed the data into uniform data types and formats using Informatica MDM and IDQ tools; loaded it into the STAGE and HUB tables, then into the EDW, and finally into the dimensions, rolling up/aggregating the data by business grain into the FACT tables.
  • Primary activities included data analysis, identifying and implementing data quality rules in IDQ, and linking the rules (including Address Doctor) into PowerCenter ETL processes for delivery to the MDM Data Hub and other data consumers.
  • Extensively worked with PowerCenter 9.6 Designer client tools, including Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Worked on the Source to Target Mapping document to make sure that all transformations were captured correctly and documented.
  • Worked on Informatica Data Quality, a.k.a. Informatica Developer (IDQ), for data profiling, data cleansing, and data validation.
  • Extensively worked on transformations such as Unconnected Lookup, Router, Joiner, Expression, SQL, and Source Qualifier in the Informatica Designer.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables efficiently.
  • Worked on parameters, worklets, and mapplets.
  • Worked Extensively on Data Profiling and Data Validation to check Data Quality before providing First Cut of Data to MDM team.
  • Created stored procedures, packages, and functions.
  • Used Informatica debugger to test the data flow and fix the mappings.
  • Involved in Data Modeling, System Design Documents and STTM
  • Performed weekly data loads to accommodate sudden data model changes under the Agile methodology, as and when bugs were identified by the testing team.
  • Modified existing and developed new ETL programs, transformations, indexes, data staging areas, summary tables, and data quality routine based upon redesign activities.
  • Migrated mappings, sessions, workflows, and mapplets from one environment to another.
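
Data-quality validation like the profiling and audit steps above usually begins with null and duplicate checks on the landing keys before the first cut of data goes to the MDM team. A sketch; stg_party and its columns are hypothetical names:

```shell
#!/bin/sh
# Sketch only: generate two basic data-quality audit queries.
# Table and column names are illustrative placeholders.
cat > dq_audit.sql <<'EOF'
-- Null check on a mandatory business key
SELECT COUNT(*) AS null_keys
  FROM stg_party
 WHERE party_id IS NULL;

-- Duplicate check on the composite key used for the MDM landing table
SELECT party_id, src_system, COUNT(*) AS dup_cnt
  FROM stg_party
 GROUP BY party_id, src_system
HAVING COUNT(*) > 1;
EOF
echo "wrote dq_audit.sql"
```

IDQ expresses the same checks as profile rules and scorecards; the SQL form is handy for spot verification against the source.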

Environment: Informatica PowerCenter 9.1/9.6, Oracle 11g/12c (Exadata), Win7, SQL*Plus, Toad, AS/400, UNIX, WinSCP, PuTTY, ERWIN

Confidential, Chicago, IL

Informatica/ETL Developer

Responsibilities:

  • Experience in performance tuning and cache optimization with Informatica and OBIEE.
  • Extensively used Informatica PowerCenter 8.6 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.
  • Based on the logic, used various transformation like Source Qualifier, Normalizer, Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator, Joiner, XML, Stored procedure transformations in the mapping
  • Collaborated with developers, stakeholders and subject-matter experts to establish technical vision and analyze trade-offs between usability and performance needs
  • Played a key role in the setting of standards for both the physical and logical database schema and table design.
  • Modified UNIX scripts as part of the upgrade, making sure they point to the correct directories.
  • Involved with data modeling using ER and dimensional modeling techniques
  • Wrote UNIX Shell Scripting for Informatica Pre-Session, Post-Session Scripts and also to run the workflows.
  • Involved in developing the OBIEE repository at three layers (Physical, Business Model, and Presentation) and interactive dashboards with drill-down capabilities, using global and local filters and security setups.
  • Used OBIEE to create RPDs and build reports and dashboards.
  • Implemented various workflows using transformations such as SQL, XML, Normalizer, Lookup, Aggregator, and Stored Procedure, and scheduled jobs for different sessions.
  • Performance Tuning of Sessions and Mappings.
  • Tuned performance of Informatica sessions for large data files by implementing pipeline partitioning and increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
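
Pre-session scripts such as those above often do nothing more than refuse to start a load when the inbound file is missing or empty. A minimal sketch; the file path is a placeholder, and the script stubs the file first so it can run standalone:

```shell
#!/bin/sh
# Placeholder inbound file; a real pre-session script would receive this path
# from the session's command task or a parameter file.
SRC_FILE="${1:-/tmp/claims_extract.dat}"

# Stub the file so this sketch is runnable on its own; remove in real use.
printf 'demo row\n' > "$SRC_FILE"

if [ ! -s "$SRC_FILE" ]; then
    echo "pre-session check failed: $SRC_FILE missing or empty" >&2
    exit 1    # a non-zero exit makes the Informatica session fail fast
fi
echo "pre-session check passed for $SRC_FILE"
```

Wiring the check in as a pre-session command means a late or empty feed aborts the session before any rows move, instead of loading an empty target.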

Environment: Informatica PowerCenter 9.1/9.6, Oracle 11g/12c (Exadata), Win7, SQL*Plus, Toad, AS/400, OBIEE, ERWIN.

Confidential, Miami, FL

Sr. Informatica/ETL Developer

Responsibilities:

  • Involved in the complete SDLC, collecting and documenting requirements; wrote TSD documents that developers could easily understand and code from.
  • Working in an onsite/offshore model, helped the team understand the scope of the project and assisted with and developed Informatica code.
  • Developed Informatica mappings to consolidate data from 10 source systems into the new data warehouse we built. All 10 data sources resided on various platforms and included fixed-width and delimited flat files, Excel files, and Oracle and SQL Server tables.
  • Creating mappings to load data using different transformations like Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Normalizer, Filter and Router transformations, Union transformations, etc.
  • Prepared Key metrics and dimensions document. In this process, analyzed the System Requirement document and identified areas which were missed by Business and IT team.
  • Generated sample data based on the data model design and loaded the data into the database using UNIX shell scripts, which was used both for data analysis and also unit testing purposes.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Involved in migration projects to move data warehouses from Oracle/DB2 to Teradata.
  • Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
  • Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant, and BTEQ.
  • Formulated and implemented Historical load strategy from multiple data sources into the data warehouse. Data concurrency, Prioritization, comprehensiveness, completeness and minimal impact to existing users are taken as key attributes for Historical data load strategy.
  • Led the compilation of dimension-table information, including dimension category, data frequency (monthly, weekly, daily, etc.), data periodicity (incremental vs. historical), and identification of CRC (or MD5) columns.
  • Participated in Logical data model review meetings and identified missing attributes, facts and key measures. Updated the Logical data model using Erwin.
  • Worked on billion-record tables (performance tuning for historical loads).
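
The FastLoad scripts mentioned above follow a fixed BEGIN LOADING / END LOADING shape. A hedged sketch with placeholder TDPID, credentials, table, and file names; the actual fastload call is skipped when the utility is absent:

```shell
#!/bin/sh
# Sketch only: generate a FastLoad script for a pipe-delimited staging load.
# Server, credentials, table, and file names are illustrative placeholders.
cat > stage_load.fld <<'EOF'
LOGON tdprod/etl_user,etl_password;
BEGIN LOADING stg.customer
      ERRORFILES stg.customer_err1, stg.customer_err2;
SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(60)),
       address   (VARCHAR(120))
FILE = /data/in/customer.dat;
INSERT INTO stg.customer VALUES (:cust_id, :cust_name, :address);
END LOADING;
LOGOFF;
EOF

if command -v fastload >/dev/null 2>&1; then
    fastload < stage_load.fld
else
    echo "DRY RUN: fastload < stage_load.fld"    # utility not installed here
fi
```

FastLoad requires an empty target table, which is why it suits staging loads; MultiLoad or TPump handle the incremental cases.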

Environment: Informatica PowerCenter 9.1/8.6, Oracle 10g/9i, Teradata, Flat Files, Vertica, Business Objects XI, Toad, pgAdmin III, WinSCP, UNIX/AIX, Windows 7, Erwin, Control-M.

Confidential, Owings Mill, MD

Informatica/ETL Developer

Responsibilities:

  • Worked as an Informatica Developer and participated in all project phases, from requirements gathering through deployment (SDLC).
  • Used full pushdown optimization (PDO) in Informatica for loading data.
  • Created mappings to load data using different transformations.
  • Troubleshot the ETL process developed for conversions and implemented various techniques to enhance performance.
  • Extensively involved in creating design documents for loading data into Data warehouse and worked with the data modeler to change/update the Data warehouse model when needed.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Wrote Teradata Macros and used various Teradata analytic functions
  • Excellent knowledge of ETL tools such as Informatica and SAP BODS for loading data to Teradata, making various connections to load and extract data to and from Teradata efficiently.
  • Developed ETL processes to load data into dimensional model from various sources using Informatica Power Center
  • Developed Mapplets and reusable transformations that were used in different mappings and across different folders
  • Designed and developed the error handling mechanism to be used for all the Informatica jobs, which load data into the data warehouse.
  • Extensively used the Warehouse Designer of Informatica PowerCenter to create different target definitions.
  • Created robust and complex workflows and worklets using Informatica Workflow Manager and troubleshot data load problems.
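
A shared error-handling mechanism like the one described above can be as simple as a wrapper that runs each load step, logs the outcome with a timestamp, and aborts the stream on the first failure. A sketch; the step names, commands, and log path are placeholders:

```shell
#!/bin/sh
# Placeholder log location; a real script would write under the project's log dir.
LOG=/tmp/etl_run.log

run_step() {
    step_name="$1"; shift
    if "$@" >>"$LOG" 2>&1; then
        echo "$(date '+%Y-%m-%d %H:%M:%S') OK $step_name" >>"$LOG"
    else
        rc=$?
        echo "$(date '+%Y-%m-%d %H:%M:%S') FAIL $step_name rc=$rc" >>"$LOG"
        exit "$rc"    # stop here so downstream loads never run on bad data
    fi
}

: > "$LOG"                       # start a fresh log for this run
run_step "stage_load" true       # 'true' stands in for the real load command
run_step "dim_load"   true
echo "all steps finished; see $LOG"
```

Stopping on the first failed step keeps the warehouse consistent: a failed stage load never cascades into a partial dimension load.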

Environment: Informatica PowerCenter 7.1 HotFix 2 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Oracle 11g, SeaQuest, HPDM, SQL Server, Teradata, UNIX, Toad, Control-M.
