Senior ETL Informatica Consultant, Washington, DC


Skills

Please use this table to list the skills noted in the Required/Desired section of the requirement. In addition, please respond with the years of experience for each skill and the last time each skill was used. Add or delete rows as necessary.


Skill | Years Used | Last Used
Experience with Extract, Transform, Load (ETL) systems using Oracle SQL*Loader and PL/SQL | 8 Years | 2012
Strong development skills in at least one report deployment solution (e.g., Cognos, LogiXML, Crystal Reports or Reporting Services) | 4 Years | 2012
Strong SQL experience | 10 Years | 2012
Oracle 9i-11g experience | 6 Years | 2012
Data Warehouse experience | 4 Years | 2012
Data Analysis | 7 Years | 2012
Promotion Change Control Methodology | 3 Years | 2012
Good oral and written communication | 12+ Years | 2012

Employment History

Confidential,
DC Government, Washington, DC    March 2006 – Present
Senior ETL Informatica Consultant

I was involved in the design and development of the W-2 matching project. This data warehouse was built to provide reports for the compliance and audit department, identifying anomalies between the tax-withheld amounts reported on individual tax returns and the amounts reported on W-2 forms by employers. These reports help auditors identify fraud and take action.

Responsibilities:

  • Involved in the design and development of a data warehousing project to improve the Account Management System.
  • Worked on Enterprise Resource Management to develop and maintain the Account Management System.
  • Coordinated with source system owners and monitored day-to-day ETL progress.
  • Involved in Analysis, Requirements Gathering, SRS (Software Requirement Specifications) & HLDD (High Level Design Document) & LLDD (Low Level Design Document).
  • Involved in analyzing existing logical and physical data models using Erwin.
  • Responsible for creating & running SQL scripts for DDL, DML operations on Oracle DB.
  • Designed the procedures for moving data from all source systems to the data warehousing system; the data was standardized to store the various business units in tables.
  • Involved in data migration to import legacy data from one system to the other.
  • Analyzed business requirements and worked closely with the various application teams and business teams to develop ETL procedures that are consistent across all applications and systems.
  • Worked with Informatica PowerExchange, which scales effectively from high-volume batch loads to low-latency complex data.
  • Very strong in data analysis and ETL solution design and development.
  • Widely used the Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer and Informatica Workflow Manager.
  • Used transformations such as Lookup, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Performed Pipeline partitioning to optimize the performance of mappings.
  • Created Mapplets and used them in different mappings.
  • Partitioned sessions for concurrent loading of data into the target tables.
  • Performed incremental aggregation to load incremental data into Aggregate tables.
  • Created Schema objects like Indexes, Views and Sequences.
  • Developed a stored procedure to check source data against warehouse data, write records not already present to a spool table, and used the spool table as a Lookup in a transformation.
  • Performed extensive bulk loading into the target using Oracle SQL*Loader.
  • Used workflow manager for session management, database connection management and scheduling of jobs.
  • Involved in writing UNIX shell scripts for Informatica ETL tool to automate sessions and cleansing source data.
  • Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings and sessions.
  • Extensively used SQL*Loader to load Data from flat files to Database tables in Oracle.
  • Created Sessions and batches to move data at specific intervals & on demand using Workflow Manager.
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades.
  • Created user interface reports for validating the data through Reporting Services.
  • Used pmcmd to run workflows and crontab to automate their schedules (see the sketch after this list).
  • Was responsible for migration/conversion of Informatica PowerCenter from Informatica 7.1.3 to Informatica 8.5.
  • Involved in upgrading and configuring Informatica Power Exchange from 8.5 to 8.6.
  • Creating Test cases for Unit Test, System Integration Test and UAT to check the data quality.
  • Involved in Promotion Change Control Methodology.
  • Developed views necessary for structured and ad-hoc reporting.
  • Involved in version control of code from development to Test and Production environments using ChangeMan.
  • Used Rational Rose to model the process using UML to create behavioral and structural diagrams.
  • Generated reports using Business Objects Report Designer.
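
For illustration, a minimal sketch of the pmcmd/crontab scheduling described above. The Integration Service, domain, repository folder, workflow, paths and mail address are hypothetical placeholders, not details of the actual project.

    #!/bin/ksh
    # run_w2_match.sh - start an Informatica workflow via pmcmd and report failures.
    # Service, domain, folder, workflow and mail values below are illustrative only.
    INFA_SVC=IS_DC_DW                 # hypothetical Integration Service
    INFA_DOMAIN=Domain_DC             # hypothetical domain
    FOLDER=W2_MATCHING                # hypothetical repository folder
    WF=wf_load_w2_match               # hypothetical workflow
    INFA_USER=etl_user
    INFA_PWD_FILE=$HOME/.infa_pwd     # password kept outside the script

    pmcmd startworkflow -sv "$INFA_SVC" -d "$INFA_DOMAIN" -u "$INFA_USER" \
          -p "`cat $INFA_PWD_FILE`" -f "$FOLDER" -wait "$WF"
    RC=$?
    if [ $RC -ne 0 ]; then
        echo "`date`: workflow $WF failed (rc=$RC)" | mailx -s "ETL failure" dw-support@example.com
    fi
    exit $RC

    # crontab entry (nightly at 01:30):
    # 30 1 * * * /opt/etl/scripts/run_w2_match.sh >> /opt/etl/logs/w2_match.log 2>&1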

Environment: Informatica PowerCenter 8.5/8.1.1/7.1.3, Informatica PowerExchange 8.5, Informatica PowerMart 7.0, Business Objects, Oracle SQL*Loader, SQL*Plus, JDBC, PL/SQL, Reporting Services, Oracle 9i/10g, TOAD, Windows XP Pro, Autosys, Erwin 4.0, flat files (delimited, fixed width, XML), UNIX shell scripting

Confidential,
NY April 2005 – February 2006
DW/ETL Analyst

MetLife, Inc., through its subsidiaries and affiliates, is a leading provider of insurance and other financial services to individual and group customers. The Claims Management System provides the technology that assists claim professionals in administering claim practices in a timely and effective manner. This application involved the design and development of a data warehouse; various business rules and business processes were applied to extract, transform and load the data into the data warehouse.

Responsibilities:

  • Prepared the Detail Design Document from the requirements specification document.
  • Analyzed business requirements and segregated them into use cases and activity diagrams using Microsoft Visio and Rational Rose according to UML methodology, thus defining the data process models and business workflow.
  • Developed various Mapplets and Transformations and was responsible for validating and fine-tuning the ETL logic coded into mappings.
  • Analyzed business requirements and worked closely with the various application teams and business teams to develop ETL procedures that are consistent across all applications and systems.
  • Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Source, Target, Mapplet and Transformation objects.
  • Created (or edited existing) Informatica mappings and sessions to accomplish data migrations; sources were Oracle, DB2 and flat files, loaded into the target.
  • Created complex mappings using transformations like Connected/Unconnected Lookup, Filter, Expression, Joiner, Aggregator, Router and Stored Procedure to populate the target tables in an efficient manner.
  • Coded the scripts for loading the data into the Oracle database from flat files using SQL*Loader (see the sketch after this list).
  • Re-designed ETL mappings to improve data quality.
  • Created and managed daily, weekly and monthly data operations, workflows and scheduling processes.
  • Responsible for validating the Informatica mappings against the pre-defined ETL design standards.
  • Performed data analysis, testing, debugging, audits, disaster recovery, and problem resolution.
  • Provided production support by monitoring the processes running daily.
  • Involved in the debugging of the mappings by creating breakpoints to gain troubleshooting information about data and error conditions.
  • Performance tuned the workflows by identifying the bottlenecks in targets, sources, mappings, sessions and workflows and eliminated them.
  • Created partitions through the session wizard in the workflow manager to increase the performance.
  • Created Informatica mappings with PL/SQL Procedures/Functions to build business rules to load data.
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
  • Created reports/dashboards from the star schema using Reporting Services.
  • Developed UNIX shell scripts for data extraction, running the pre/post processes and PL/SQL procedures for performing different database tasks; used the Informatica pmcmd command in UNIX scripts.
  • Created complex joins, transformations of all types to pass data through ETL maps.
  • Created workflows with sessions, worklets, event waits, assignments, conditional flows, email and command tasks in Workflow Manager; maintained versioning in Visual SourceSafe (VSS) and created .jil files for scheduling the workflows and sessions using Autosys jobs.
  • Recommended tuning options to source/target database DBA team to gain optimum performance.
  • Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and the Target Data.
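
A rough sketch of the SQL*Loader flat-file load referenced above. The staging table, columns, file paths and TNS alias are assumed names for illustration only; credentials are taken from the environment rather than hard-coded.

    #!/bin/ksh
    # load_claims_stg.sh - bulk-load a delimited claims extract into a staging table.
    # Table, column, path and TNS alias names are placeholders.
    CTL=/tmp/claims_stg.ctl
    cat > $CTL <<'EOF'
    LOAD DATA
    INFILE '/data/in/claims_extract.dat'
    BADFILE '/data/bad/claims_extract.bad'
    APPEND
    INTO TABLE CLAIMS_STG
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    ( claim_id, policy_no, claim_dt DATE "YYYYMMDD", claim_amt, status_cd )
    EOF

    # STG_USER / STG_PWD are supplied through the environment
    sqlldr userid=${STG_USER}/${STG_PWD}@CLAIMSDW control=$CTL \
           log=/data/log/claims_stg.log direct=true errors=100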

Environment: Informatica PowerCenter 7.1, Informatica PowerMart 6.2, Erwin 4.0/3.5.5, RUP, UML, Rational Rose, MS Visio, Teradata, Oracle 9i/8i, Reporting Services, UNIX shell scripting, SQL, PL/SQL, SQL*Loader, TOAD, Sybase, Visual SourceSafe, Sun Solaris 2.6 UNIX, Windows 2000

Confidential,

Milford, CT April 2004 – February 2005

Informatica Developer

The project was aimed at building an Investment DW, which provides Financial Statement, Capital Gain Analysis, Performance Measurement Analysis, and Portfolio Management to business managers. Informatica PowerCenter 6.2 was used as the ETL tool to extract data from source systems and to load data into target systems.

Responsibilities:

  • Involved in systems study and analysis to understand the business needs and implement them in a functional database design.
  • Defined the ETL strategy for Data Warehouse population.
  • Involved in Data Quality Analysis to determine cleansing requirements.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Performed extensive analysis of metadata to test the integrity, consistency and appropriateness of the data to be brought into the centralized City Data from various sources.
  • Installed, Maintained and Documented the Informatica Power Center setup on multiple environments.
  • Designed the procedures for moving data from all source systems to the data warehousing system; the data was standardized to store the various business units in tables.
  • Worked on Informatica PowerCenter 6.2 and created Informatica mappings with PL/SQL procedures/functions to build business rules to load data. The transformations used were mostly Source Qualifiers, Aggregators, Lookups, Filters and Sequence Generators.
  • Technically mentored other consultants in data analysis and development.
  • Created sessions and batches to move data at specific intervals & on demand using Workflow Manager.
  • Extensively worked on database triggers, stored procedures, functions and database constraints; wrote complex stored procedures and triggers and optimized them for maximum performance.
  • Analyzed and designed migration procedures for the database migration between relational and extended ERP data sources.
  • Performed Database Migration from sources to the Data Warehousing database. Worked closely with Oracle DBA for migration support.
  • Created UNIX shell scripts (scheduler utilities) for automating the backup of the database and transaction logs (see the sketch after this list).
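
A minimal sketch of such a backup scheduler utility, assuming an RMAN-based approach; the ORACLE_SID, directories and backup options shown are illustrative assumptions rather than project specifics.

    #!/bin/ksh
    # nightly_backup.sh - cron-driven database and archive-log backup (assumed RMAN approach).
    export ORACLE_SID=INVDW                                   # hypothetical SID
    export ORACLE_HOME=/u01/app/oracle/product/9.2.0
    export PATH=$ORACLE_HOME/bin:$PATH
    LOG=/u01/backup/logs/rman_`date +%Y%m%d`.log

    rman target / >> $LOG 2>&1 <<'EOF'
    RUN {
      BACKUP DATABASE FORMAT '/u01/backup/db_%U.bkp';
      BACKUP ARCHIVELOG ALL FORMAT '/u01/backup/arch_%U.bkp' DELETE INPUT;
    }
    EOF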

Environment: Informatica PowerCenter 7.1/6.2, PowerConnect for Mainframes, Oracle 9i, DB2, Oracle SQL*Loader, TOAD, SQL Navigator, SQL Server 2000, PL/SQL, XML, Windows NT 4.0.

Confidential,
MA June 2001 – March 2004
Informatica Developer

Invensys Production Management Division provides world-class information technology, automation, and process solutions to a wide range of manufacturing applications in the cement, chemical, metals & mining, oil & gas, pulp & paper, power, pharmaceutical and specialty chemicals industries. This data mart and data warehouse were constructed to stage inventory-related data for the Invensys company. Informatica was used to extract data from source systems to target systems. There are numerous sources of data coming into the warehouse, including Oracle, DB2, SQL Server, flat files and Excel spreadsheets.

Responsibilities:

  • Designed ER diagrams, the logical model (relationships, cardinality, attributes, and candidate keys) and the physical database (capacity planning, object creation and aggregation strategies) for Oracle as per business requirements using Erwin 4.0.
  • Wrote stored procedures in PL/SQL and UNIX Shell Scripts for automated execution of jobs.
  • Performed extensive bulk loading into the target using Oracle SQL*Loader.
  • Extracted, Transformed and Loaded OLTP data into the Staging area and Data Warehouse using Informatica mapping and complex transformations (Aggregator, Joiner, Lookup, Normalizer, Filter).
  • Wrote UNIX shell scripts for Informatica pre-session and post-session commands (see the sketch after this list).
  • Analyzed and created Facts and Dimension tables.
  • Involved in Performance Tuning of the mappings.
  • Experienced in database design, data analysis, development, SQL performance tuning, data warehousing ETL processes and data conversions.
  • Implemented Slowly Changing Dimension (SCD Type II) mappings.
  • Wrote validations using PL/SQL stored procedures before transferring data from temporary tables to interface tables.
  • Used Repository Manager to create Repository, User groups, Users and managed users by setting up their privileges and profile.
  • Installed, upgraded and configured Power Center 6.2 on multiple environments.
  • Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Batches and Sessions and scheduling them to run at specified time with required frequency.
  • Created Autosys Jobs for Scheduling.
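
A short sketch of the kind of pre-session and post-session command scripts referenced above; the file names, directories and archive policy are assumptions made for illustration.

    #!/bin/ksh
    # pre_session_check.sh - pre-session command: fail fast if the inventory extract
    # has not arrived or is empty. File names are illustrative placeholders.
    SRC=/data/inbound/inventory_extract.dat
    if [ ! -s "$SRC" ]; then
        echo "`date`: source file $SRC missing or empty" >&2
        exit 1                     # non-zero exit makes the session fail
    fi
    exit 0

    #!/bin/ksh
    # post_session_success.sh - post-session success command: archive the processed file.
    SRC=/data/inbound/inventory_extract.dat
    ARC=/data/archive/inventory_extract.`date +%Y%m%d%H%M`.dat
    mv "$SRC" "$ARC" && gzip "$ARC"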

Environment: Informatica PowerCenter 6.2, Informatica PowerMart 5.1, Autosys, COBOL, Oracle 8i, DB2, SQL Server 7.0, Teradata, Oracle SQL*Loader, TOAD, Erwin 4.0, HP-UX, OS/360, Windows NT, PL/SQL, UNIX shell scripting.

Confidential,
India June 1998 – May 2001
Oracle Developer

The objective of MAST (Management of Skills and Training) is to manage the skills and training procedures under Quality Management System (QMS) at CMC Center. Maintaining QMS involves recording details of personnel data of the employees, training avenues, courses organized by the training avenues, training needs identified for various employees and the accounts part of the training.

Responsibilities:

  • Worked with DBA in installing Oracle 8i on Solaris UNIX Box and Windows.
  • Reviewed application requirements and recommended Database configuration.
  • Designed various data entry forms, query screens and output reports.
  • Created primary database storage structures (tablespaces) and objects (tables, views, and indexes).
  • Managed the tablespaces, rollback segments and indexes for the applications.
  • Ran the database in cost-based optimizer mode for better performance.
  • Coded the Procedures and Triggers.
  • Configured sqlnet.ora, tnsnames.ora and listener.ora on the server side for connectivity (see the sketch after this list).
  • Coded the scripts for loading the data into the Oracle database from flat files using SQL*Loader.
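
A brief sketch of the Oracle Net connectivity setup and check described above; the alias, host, port, service name and credentials are placeholders, not actual values.

    # Append a client alias for the (hypothetical) MAST database and verify connectivity.
    cat >> $ORACLE_HOME/network/admin/tnsnames.ora <<'EOF'
    MASTDB =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = mast-db01)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = mastdb))
      )
    EOF

    lsnrctl status          # confirm the listener defined in listener.ora is up
    tnsping MASTDB          # resolve the alias through sqlnet.ora / tnsnames.ora
    # APP_USER / APP_PWD are placeholder credentials taken from the environment
    sqlplus -s ${APP_USER}/${APP_PWD}@MASTDB <<'EOF'
    SELECT 'connected as ' || USER FROM dual;
    EXIT;
    EOF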

Environment: SQL, PL/SQL, Oracle SQL*Loader, shell scripting, Oracle 8i, Sun Solaris UNIX, Windows NT.
