ETL Informatica Developer Resume
Tysons Corner, VA
SUMMARY:
- 8 years of strong experience in designing and implementing Data Warehouse applications using the ETL tools Informatica Developer Client (10.1, 10.2) and Informatica PowerCenter 9.6.1/9.5.1/8.6 (Designer/Workflow Manager), transferring data from legacy systems (mainframes) to common and different targets, along with Oracle PL/SQL.
- Experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Router, Filter, Expression, Aggregator, and Joiner; also created mappings and workflows for Confidential.
- Experience in Debugging and Performance Tuning of targets, sources, mappings and sessions.
- Proficient in using Informatica Workflow Manager, Workflow Monitor, and pmcmd (Informatica command-line utility) to create, schedule, and control workflows, tasks, and sessions.
- Extensively worked on database applications using DB2, Oracle, SQL Server, T-SQL, PL/SQL, and SQL*Loader.
- Basic knowledge of advanced programming for data transformation (Java, C++, C).
- Experience in UNIX shell scripting.
- Worked on Informatica Cloud.
- Applied the rules and profiled the source and target table's data using IDQ.
- Installed ODI and set up ODI connections with Oracle, MS SQL Server, and flat files.
- Experience with point-to-point integration of clients' front-end applications.
- Experience with PostgreSQL.
- Experience in incident management, problem management, and change management.
- Experience in developing logical and physical models and implementing them in Oracle.
- Experience in integrating data from various sources Oracle, DB2 UDB, Sybase, and SQL Server.
- Experience in creating entity-relational and dimensional data models using the Kimball methodology (star schema and snowflake schema architectures, fact/dimension tables).
- Documented design procedures, operating instructions, test procedures, and troubleshooting procedures.
- Experience with Informatica Admin console.
- Used partitioning in Informatica to improve session performance for very large data loads.
- Knowledge in designing and developing Business Intelligence reports using the BI tools Business Objects and Cognos, with working knowledge of MicroStrategy.
- Worked as On-Site coordinator for an off-shore development team in India.
- Versatile team player with excellent analytical, presentation and interpersonal skills with an aptitude to learn new technologies.
- Solid time management and multitasking skills, which help in conducting project meetings, reviews, walkthroughs, and customer interviews according to the varied needs of the people involved.
- Excellent written and oral communication skills and a results-oriented attitude.
TECHNICAL SKILLS:
ETL Tools: Informatica Developer Client (10.1.1, 10.1.2), Informatica PowerCenter 9.0.1/8.6/8.5/8.1.2, IDQ 9.1/9.5.1
Databases: Oracle 11g/9i/8i/8.0/7.0, MS SQL Server 2000/2008, DB2, PostgreSQL, MS Access 97/2000, Sybase, Teradata, Netezza.
Data Modeling: Erwin 3.5/4.0
Operating Systems: Windows 98/NT/2000/XP, UNIX (Linux, HP-UX, Solaris)
Other Software: Aginity, WinSQL, TOAD 7.3, MS Office, MS Visio, Autosys.
Scripting Languages: Korn shell (ksh), Perl, Python, UNIX shell
PROFESSIONAL EXPERIENCE:
Confidential, Tysons Corner, VA
ETL Informatica Developer
Responsibilities:
- Involved in Informatica upgrade project from 9.6 to 10.2.
- Migrated Informatica objects from the old server to the new server as part of the upgrade.
- Created, tested, and monitored DRT jobs on Informatica Cloud.
- Used REST API to access metadata information in Informatica Cloud.
- Created DSS tasks in Informatica Cloud.
- Integrated data from Salesforce into our data warehouse (DWH).
Environment: Informatica 10.2, Informatica 9.6, SQL Server Management Studio 2008 and 2012, Windows Server.
Confidential, Owings Mills, MD
ETL Informatica Developer
Responsibilities:
- Involved in Informatica upgrade project from 9.1 to 9.6.
- Migrated Informatica objects from the old server to the new server as part of the upgrade.
- Updated JIL scripts to update Autosys jobs with the new server information.
- Created Informatica Mappings to acquire assets and flows information on competitor products, both US and International.
- Worked on multiple projects using the Informatica Developer tool (IDQ), versions 9.1.0 and 9.5.1.
- Involved in migrating mappings from IDQ to PowerCenter.
- Applied the rules and profiled the source and target table's data using IDQ.
- Loaded monthly file-based Strategic Insight tables into CIP Exploration Oracle database.
- Provided limited change data capture to overwrite recently modified records.
- Provided basic exception handling for potential data issues.
- Provided new ‘Calendar’ dimension for ease of rolling period reporting.
- Involved in integrating various applications.
- Helped create interfaces between various front-end applications.
- Implemented point-to-point integration.
Environment: Informatica 9.6.1, SQL Server Management Studio 2008 and 2012, Oracle 11g, UNIX.
Confidential, Bala Cynwyd, PA
ETL Informatica Developer
Responsibilities:
- Worked with Integration Team to build intermediate DB.
- Created mappings and workflows to load data into the new tables in the intermediate DB.
- Followed Agile methodology and interacted with the Integration Team on a daily basis.
- Worked with the Spire Integration Team to help load data into the intermediate DB for easy access by various systems.
- Created mappings and workflows to load appropriate data for the Billing Center and APPS applications.
- Optimized existing code that had performance issues.
- Extensively involved in data validation to ensure that the highest levels of data quality and data integrity are maintained.
- Created Staging Tables on the intermediate DB and loaded these tables with appropriate data for different systems.
- Participated in status review team meetings on daily basis.
- Created different transformations for loading data into targets, such as Source Qualifier, Expression, Aggregator, Update Strategy, Lookup, Router, Filter, and Sequence Generator.
- Performed unit testing and was involved in tuning sessions and workflows for better performance.
Environment: Informatica 9.6.1, SQL Server Management Studio 2008 and 2012, PL/SQL, Windows Server.
Confidential, Pittsburgh, PA
ETL Informatica Developer
Responsibilities:
- Involved in a Project called POLYPATHS.
- Involved in optimization of code and tuning queries for performance.
- Involved in testing and data validation of the reports.
- Created mappings, workflows to load data from FHLB-PGH’s existing DWH into their new POLYPATHS system.
- Worked with FHLB-PGH’s trade data to analyze bond pre-trade data.
- Helped load Market values correctly into the new POLY system for trade analysis.
- Loaded flat files (OF-Generic Curve, Bond-Pre Trade) into the POLYPATHS system.
- Worked on both Outbound and Inbound interfaces (POLYPATHS).
- Redesigned and implemented FHLB-PGH’s FAS-133 system, to load INCEPTION and ONGOING outbound Flat files.
- Created deployment groups to migrate Informatica code from Dev to UAT.
- Followed Agile methodology, interacted with business users on a daily basis.
Environment: Informatica 9.5.1, SQL Server Management Studio 2008 and 2012, Autosys, PL/SQL, Windows XP, UNIX, Secure FX.
Confidential, Philadelphia, PA
ETL Informatica Developer
Responsibilities:
- Created Informatica mappings and workflows to migrate data from one environment to another and performed data enhancements.
- Involved in optimization of code and tuning queries for performance.
- Involved in testing and data validation of the reports.
- Provided post production support to end-users.
- Worked on the ESR Support and EM Support projects, which are two processes at Confidential.
- Created Autosys jobs and monitored Autosys jobs on a daily basis.
- Supported end users on a daily basis with reports and any issues concerning the balance sheet.
Environment: Informatica 9.5.1, SQL Server Management Studio 2008 and 2012, Autosys, PL/SQL, Windows XP.
Confidential, Jessup, PA
ETL Informatica Developer
Responsibilities:
- Prepared design documents and developed processes for loading data into the data warehouse.
- Designed, developed, and deployed Informatica mappings and workflows from Dev to Testing and Production environments.
- Created new staging tables in the Staging DB to store data from client files (CMS, NY Medicaid files, etc.).
- Wrote PostgreSQL queries to work with the data being loaded from external client files into the Data Warehouse.
- Worked with Shell Scripts (UNIX).
- Worked with existing Python scripts and made additions to them to load data from CMS files into the Staging Database and the ODS.
- Worked extensively on SQL Server 2008 and PostgreSQL.
- Worked extensively with Medicaid, Medicare, and CMS files and their corresponding file layouts.
- Prepared design, technical and functional documents from the business requirements gathered.
- Created mappings to move data from Oracle and SQL Server to the new Data Warehouse in Greenplum.
- Created different transformations for loading data into targets, such as Source Qualifier, Joiner, Update Strategy, Connected and Unconnected Lookup, Rank, Expression, Router, Filter, Aggregator, and Sequence Generator.
- Created and configured workflows, worklets, and sessions to transport the data to target tables using Informatica Workflow Manager.
- Parsed high-level design specs into simple ETL coding and mapping standards.
- Involved in data validation to ensure that the highest levels of data quality and data integrity are maintained.
- Used Informatica Partitioning to improve session performance.
- Performed unit testing and was involved in tuning sessions and workflows for better performance.
- Participated in status review team meetings.
Environment: Informatica 9.1, SQL Server Management Studio 2012, IDLE (Python 2.5.4), pgAdmin III (PostgreSQL), PL/SQL, Windows XP, UNIX.
Confidential, King of Prussia, PA
ETL Developer
Responsibilities:
- Involved in requirement gathering with business team for data feed and report development for the stores Ralph Lauren and Club Monaco.
- Preparation of design document, development of data feed interfaces and reports using Sagent Business Intelligent Tool.
- Design, Development and Deployment of Order, Buyer and Customer Profile data feeds.
- Identifying and resolving performance issues on long running reports and data feeds.
- Coordinating with DBAs to find optimal solutions on complex backend operations.
- Scheduling and automating interface feeds and reports using Sagent Automation Tool.
- Prepare test documents and perform testing to certify for deployment.
- Involved in User Testing, fixing issues and production deployment.
- Resolve high priority production support tickets involving data issues, historical loads and real time data traffic.
Environment: Sagent, Sagent Automation, Oracle 11g/9i, DB2, PL/SQL, Windows XP, SQL Server.
Confidential, Harrisburg, PA
ETL Informatica Developer
Responsibilities:
- Responsible for developing, supporting, and maintaining ETL (Extract, Transform, Load) processes using Informatica PowerCenter 9.0.1.
- Interfaced with various members of the technical and business team to translate the business reporting and data maintenance requirements into functional ETL code.
- Involved in migrating data from DB2 to Netezza using Informatica 9.0.1.
- Created mappings to move ETL processes from various systems like DB2, Oracle, and SQL Server to CBC’s new Data Warehouse in Netezza.
- Created and configured workflows, worklets, and sessions to transport the data to target tables using Informatica Workflow Manager.
- Worked with the DBA to modify SQL from DB2 to Netezza version for already existing code in DB2.
- Parsed high-level design specs into simple ETL coding and mapping standards.
- Extensively involved in data validation to ensure that the highest levels of data quality and data integrity are maintained.
- Experience with Informatica Admin Console.
- Created deployment groups to migrate code from one environment to another.
- Complete understanding of Pushdown Optimization Utility in Informatica.
- Used Informatica Partitioning to improve session performance.
- Used Aginity and WinSQL to run Queries, for testing and validation of data.
- Created mappings to write infusion data, CPT/HCPC codes, etc., to flat files for CareCentrix for reporting purposes.
- Used transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, and Update Strategy to meet business logic in the mappings.
- Created reusable transformations and Mapplets to use in multiple mappings.
- Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches and sessions, and scheduled them to run at specified times.
- Used mapping parameters and variables.
- Created Workflow, Worklet, Assignment, Decision, Event Wait, Event Raise, and Email tasks, and scheduled tasks and workflows based on client requirements.
- Migrated ETL code from Dev to QA and then from QA to Prod for new release cycles.
- Developed testing strategies and performed in-depth testing to ensure data quality.
Environment: Informatica Power Center 9.0.1, Oracle 11g/9i, DB2, PL/SQL, UNIX, Windows XP, Win SQL, Tidal.
Confidential, Boston, MA
ETL Informatica Developer
Responsibilities:
- Involved in data analysis and development.
- Prepared mapping specifications documents, unit testing documents for developed Informatica mappings.
- Prepared mapping documents for loading the data from the Oracle database to the staging area.
- Loaded data from staging (ODS) Schema to EXTRACT tables for reporting purpose.
- Defined Target Load Order Plan for loading data into different Target Tables.
- Worked on Informatica PowerCenter 9.0.1 tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets, and Reusable Transformations.
- Extensively used Router, Lookup, Aggregator, Expression and Update Strategy transformation.
- Worked with utilities such as BTEQ, FastLoad, MultiLoad, XML Import, and FastExport; exposure to TPump on UNIX/Windows/Mainframe environments and running batch processes for Teradata CRM.
- Used BTEQ to load data from flat files delivered on the FTP server by Summit Energy Group into target tables.
- Developed BTEQ scripts to generate reports for marketing campaign purposes.
- Extensively used Teradata analytical functions such as RANK(), SUM() OVER, AVG() OVER, etc., for developing code as per the requirements.
- Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN. Created several custom tables, views, and macros to support reporting and analytic requirements.
- Involved in the design and development of mappings from Oracle database to target database.
- Involved in performance tuning of the mappings by doing necessary modification to reduce the load time.
- Used SQL tools like TOAD to run SQL queries and validate the data.
- Defined test cases and prepared test plan for testing ODS jobs.
- Wrote scripts using Perl.
- Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.
Environment: Informatica 9.0.1, Workflow Manager, Workflow Monitor, IDQ, Erwin, Oracle 10g, Teradata, SFDC, TOAD 9.0, Tidal.
Confidential, Mason, Ohio
ETL Developer
Responsibilities:
- Read data from AS/400 systems, then wrote ETL functional design documents and ETL technical design documents for program development, logic, and coding.
- Reviewed BPDs (business process documents) and designed functional and technical design documents.
- Used EDI to exchange processable data with core FACETS tables.
- Interacted with the business, gathered requirements, and designed business process flow documents.
- Installed Pervasive Data Integrator for ETL processing.
- Worked partially on Oracle E-Business Suite.
- Mapped EyeMed broker data and client data from SFDC to FACETS.
- Mapped data from various systems to FACETS core tables.
- Helped develop the Confidential Vision claims module.
- Mapped data from core FACETS to the Hyper data warehouse.
Environment: Pervasive Data Integrator 5.0, Workflow Manager, Workflow Monitor, IDQ, Erwin, Oracle 10g, FACETS.