Informatica PowerCenter & Data Quality Architect/Lead Resume
- Over fourteen years of experience as an Informatica PowerCenter/Data Quality/Data Warehouse developer building large-scale data warehouse and client/server applications, including data extraction/conversion, database administration, client/server applications, user interfaces, data modeling, database programming, and the full software development life cycle.
- Experienced in designing and implementing data warehouse and data mart applications.
- Experienced in designing and developing stored procedures with Oracle (PL/SQL) and SQL Server (T-SQL).
- Experienced in writing UNIX shell and MS DOS Batch scripts.
- Good functional knowledge of Finance, Insurance, Media, Utilities and Health Care sectors.
- Interacted with business users to analyze business processes and made the necessary changes to schema objects to cater to their reporting needs.
- Assisted the Data Architect in the design and review of data models.
- Excellent problem solving, communication, analytical, interpersonal skills and ability to perform independently or as part of a team.
- Working knowledge of COBOL data structures, mainframes, and CICS screens.
Databases: Oracle, Teradata, SQL Server and UDB/DB2, Sybase.
Operating Systems: MS-DOS, Windows 2000/XP and Sun Solaris UNIX
Languages: C, C++, PL/SQL, T-SQL, Visual Basic, Java, HTML, XML, Shell Scripting.
Tools: Informatica PowerCenter v10.x/v9.x/v8.x, Informatica Power Exchange 9.x (STRIVA), Informatica Data Quality 9.x, Business Objects XI R2, Informatica Power Analyzer, Informatica Metadata Reporter, Micro Strategy, SQL loader, Erwin.
OLAP: COGNOS (Impromptu Administrator 5.0, Power Play Transformer 6.5, Power Play Reports 6.5, Scheduler 5.0), Business Objects 4.x.
GUI: Visual Basic 5.0/6.0 (ActiveX, COM/DCOM, COM+).
Methodologies: Star Schema, Snowflake Schema, Dimensional Modeling.
Confidential, Weehawken, NJ
Informatica PowerCenter & Data Quality Architect/Lead
- Provided estimates for ETL deliverables and oversaw progress to ensure quality ETL deliverables.
- Designed the data flow and architecture of CCRIP database.
- Created multiple mapplets in Informatica for shorthand implementation.
- Used Informatica parameter file functionality to parameterize the table names, period date and connections.
- Designed and implemented complex SQL queries and PL/SQL procedures to process data per MDRM from AXIOM back-end tables.
- Coded 3,000+ rules in Informatica PowerCenter and the Developer tool to check the data quality of the FR Y-9C, FR Y-14Q, FR Y-14A, and FR Y-14M reports.
- Heavily used Normalizer and Java transformations in Informatica mappings.
- Heavily used Parser and Standardizer transformation in IDQ.
- Used reference tables in Informatica Data Quality for shorthand definition expansion.
- Created profiles in Informatica Data Quality for identifying unique values for tables in AXIOM (reporting database).
- Performed Data Analysis and Validations on business rules provided by Data governance team.
- Designed and implemented stored procedures in Oracle to gather statistics and improve load performance.
- Created Tables, Sequences, Stored procedures and Triggers in CCRIP database.
- Designed and implemented mappings in the Developer tool and imported them into PowerCenter as mapplets.
- Created Profiles in IDQ and Analyst tool on AXIOM tables.
- Designed and implemented partitioning on Oracle tables.
- Created MS Access database and forms for Business Analysis and maintenance of RLI inventory.
Environment: Informatica PowerCenter v10.1.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica Data Quality 10.1.1 (Developer tool), Informatica Metadata Manager, Oracle 11g, MicroStrategy 10.3, UNIX, functional knowledge of the AXIOM front end.
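The parameter-file usage mentioned above (parameterizing table names, period date, and connections) follows Informatica's standard parameter-file layout. A minimal illustrative fragment, where the workflow, session, and variable names are hypothetical (only the CCRIP folder name comes from the project description):

```
[Global]
$$PERIOD_DATE=2017-12-31

[CCRIP.WF:wf_dq_checks.ST:s_m_dq_checks]
$DBConnection_Source=AXIOM_RPT
$$TGT_TABLE=DQ_RESULTS
```

Here `$$` variables are mapping parameters and `$DBConnection_...` entries override session connection objects, so one mapping can be pointed at different tables, periods, and databases per run.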
Confidential, Jersey City, NJ
Project Manager / Senior Developer
- Wrote Functional Specifications for Monitoring and reporting of Intraday Liquidity.
- Built SQL queries to retrieve data on payments made to BNP's clients via the FED (Federal Reserve).
- Performed analysis to retrieve different collateral information that is pledged at the bank.
- Wrote UNIX scripts for file date/record count validation and file statistics reporting.
- Created Informatica mappings for GCARs file processing.
- Built SQL queries to retrieve BNP's collateral pledged at the ancillary system, CHIPS.
- Created reference tables in Informatica Data Quality tool to remove repeated and common client names.
Environment: Informatica PowerCenter v9.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica Data Quality v9.6.x, Oracle 11g, Business Objects, SQL Developer, UNIX, Tableau, Business Bridge.
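The file-validation scripts noted above typically check a data file's record count against an expected control figure before loading. A minimal sketch of that check in Python (the originals were UNIX shell scripts; the control-count convention here is an assumption):

```python
import os

def validate_extract(data_path, expected_count):
    """Check that a data file exists, is non-empty, and holds the expected
    number of records. Returns (ok, actual_count)."""
    if not os.path.isfile(data_path) or os.path.getsize(data_path) == 0:
        return False, 0
    with open(data_path) as f:
        actual = sum(1 for _ in f)  # one record per line
    return actual == expected_count, actual
```

A load job would call this before triggering the workflow and route the file to a reject directory when the check fails.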
Confidential, Jersey City, NJ
Senior Informatica Data Quality / PowerCenter Lead
- Created Rules and Profiles in Informatica Data Quality tool.
- Worked with various Data stewards to gather requirements to build their Monitoring rules and profiles.
- Worked with the Informatica IDQ Standardizer, Parser, and global address validation transformations for data cleansing, data matching, data conversion, and exception handling.
- Worked extensively in Informatica PowerCenter creating mappings and mapplets to merge data from the Oracle Profile Warehouse and the Ab Initio MDR and load it into the COGNOS data mart.
- Assigned work and ran daily scrum meetings with the offshore team for status on rule deliverables.
- Updated SharePoint with work-intake status and set up alerts.
- Worked with business users (data stewards) to gather COGNOS reporting requirements and worked with COGNOS developers to implement them.
- Worked with the B2B Data Transformation tool to parse MISMO XML files.
- Profiled large MISMO Property Appraisal XML files which include 400+ hierarchies and 3000 elements.
- Staged and Profiled EBCDIC format VSAM files using Informatica Power Exchange and Power center.
- Worked with COGNOS developers to create materialized views for faster COGNOS report performance.
- Created LDOs (Logical Data Objects) and CDOs (Custom Data Objects) in the Informatica Data Quality tool.
- Implemented Slowly Changing Dimensions Type 1 and Type 2.
- Used Power Center debugger to test the mappings.
- Uploaded/migrated rule metadata into the Ab Initio Metadata Repository (MDR).
- Uploaded table/view metadata into the Ab Initio MDR metadata portal.
- Created folders in IDQ for new projects and granted privileges to developers.
- Migrated IDQ code to UAT and PROD.
- Executed various ad hoc queries in Oracle and Teradata for operational reports presented to the leadership team. Interacted with customers for requirements gathering, effective analysis, and end-user training.
- Created monthly metrics for rules and profiles by running ETL jobs, for presentation to senior management.
- Created scorecards on profiles for monitoring data quality.
- Worked with CONTROL-M for scheduling and executing Ad hoc ETL jobs.
Environment: Informatica PowerCenter v9.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica Data Quality v9.6.x, Ab Initio MDR tool V mh-18.104.22.168, Teradata, Oracle 11g, SQL Server 2005, COGNOS reporting tool, SQL Developer, UNIX, QlikView, BMC CONTROL-M, Informatica Analyst, Informatica MDM, Informatica Power Exchange.
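The Slowly Changing Dimension Type 2 work mentioned above preserves history by expiring the current dimension row and inserting a new version when attributes change. A minimal Python sketch of that merge logic, where the row layout (`key`, `attrs`, effective dates) is an illustrative assumption rather than the project's actual schema:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional open-ended effective date

def scd2_merge(dimension, incoming, today):
    """Apply SCD Type 2: expire changed rows and insert new versions.

    dimension: list of dicts with 'key', 'attrs', 'eff_from', 'eff_to'
    incoming:  dict mapping natural key -> current attribute values
    """
    current = {r["key"]: r for r in dimension if r["eff_to"] == HIGH_DATE}
    for key, attrs in incoming.items():
        row = current.get(key)
        if row is None:                      # brand-new member: insert
            dimension.append({"key": key, "attrs": attrs,
                              "eff_from": today, "eff_to": HIGH_DATE})
        elif row["attrs"] != attrs:          # changed: expire old, insert new
            row["eff_to"] = today
            dimension.append({"key": key, "attrs": attrs,
                              "eff_from": today, "eff_to": HIGH_DATE})
    return dimension
```

In PowerCenter the same decision (insert vs. expire-and-insert) is typically made with a Lookup on the current dimension row feeding an Update Strategy transformation.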
Confidential, New York, NY
Data Warehouse / ETL Developer
- Worked with Business users to gather the requirements.
- Created LLD/HLD documents and mapping documents.
- Involved in designing the project data flow.
- Developed complex SQL queries and views using Oracle's SYS_CONNECT_BY_PATH to build subscription paths.
- Worked with Architect to design the ETL architecture for flexible business analysis.
- Created mappings/sessions/workflows to move data from various source systems to reporting data warehouse.
- Designed and developed the exception process using Oracle stored procedures for Unbundling crosswords and ABC reporting projects.
- Involved in the Unit testing and System testing.
- Developed triggers for logging deletions in the production database.
- Tuned the existing mappings/sessions for better performance.
- Executed complex queries in Oracle database for real time testing.
- Developed SCD Type 1, Type 2, and Type 3 mappings in Informatica to load the data.
- Created Visio diagrams to depict Informatica workflows.
- Exposure to software development methodologies such as Scrum and Agile.
Environment: Informatica PowerCenter v8.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Oracle 11g, SQL Server 2005, Business Objects XI R2, Erwin 3.x/4.x, Toad for Oracle 10.6.13, UNIX, SQL Developer (3.1.07).
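Oracle's SYS_CONNECT_BY_PATH, used above to build subscription paths, walks a parent/child hierarchy and concatenates each ancestor's name into a delimited root-to-node path. An equivalent sketch of that traversal in Python (the row layout is an illustrative assumption):

```python
def connect_by_path(rows, sep="/"):
    """Emulate Oracle CONNECT BY ... SYS_CONNECT_BY_PATH(name, '/'):
    build the root-to-node path for every node in a parent/child table.

    rows: list of (id, parent_id, name); parent_id None marks a root.
    Returns {id: path}.
    """
    by_id = {r[0]: r for r in rows}
    paths = {}

    def path_of(node_id):
        if node_id in paths:                 # memoize shared ancestors
            return paths[node_id]
        _id, parent, name = by_id[node_id]
        prefix = "" if parent is None else path_of(parent)
        paths[node_id] = prefix + sep + name
        return paths[node_id]

    for node_id, _, _ in rows:
        path_of(node_id)
    return paths
```

The equivalent SQL starts the walk at rows `START WITH parent_id IS NULL` and joins each level with `CONNECT BY PRIOR id = parent_id`.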
Confidential, Piscataway, NJ
- Determined the data quality standards for the organization and ensured the processed data adhered to those standards.
- Ensured the data met user needs and expectations for quality and authenticity.
- Formulated the policies and procedures necessary for data management, processing, and quality assessment functions.
- Utilized Informatica toolset (Informatica Data Explorer, and Informatica Data Quality) to analyze data for data profiling.
- Documented Cleansing Rules discovered from data cleansing and profiling.
Environment: Informatica PowerCenter v9.0.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica Data Quality (IDQ) v9.0, Microsoft SQL Server 2005, Informatica Metadata Manager, IBM DB/2 UDB, Informatica Data Explorer.
Confidential, West Trenton, NJ
Senior Informatica Developer/Architect
- Worked with the Informatica Data Quality tool for address standardization, name parsing, and party matching.
- Created Data maps in Informatica Power Exchange (Striva) to pull data from mainframe sources.
- Worked with Informatica technical support to resolve Informatica upgrade issues.
- Created mappings with Informatica FirstLogic IQ Link Match/Consolidate, DataRight IQ, and ACE job files.
- Re-architected and developed the existing application with Business Objects XI R2 (Data Quality 11.7), replacing FirstLogic.
- Created complex matching rules in Informatica IDQ to consolidate customer data using customers' SSN, name, address, and DOB information.
- Created various mappings to capture Exception data to generate exception reports for users.
- Extensively used Lookup, Aggregator, Normalizer, Expression, and Filter transformations in various mappings.
- Tuned mappings/sessions for better performance.
- Managed Developer privileges for Informatica folders.
- Used deployment groups to migrate the code.
- Worked on Informatica Association matching to implement rule of Transitivity.
- Interacted with Business users and Analysts for new requirements.
- Involved in writing complex queries to support ad hoc user requests for data.
- Developed SQL Stored Procedures to replace Informatica Sequence Generators.
- Created XML targets using the Filename column.
- Extensively used Informatica debugger to analyze the issues in the application.
- Worked with XML sources to load data into SQL Server tables.
- Involved in upgrading the application from Informatica version 6.2.2 to version 8.1.1.
- Verified load status by accessing repository tables and communicated any issues to users.
- Worked closely with the data architect to design the application flow.
- Developed UNIX shell scripts for workflow execution.
- Trained and provided guidance to junior developers.
- Created reports in Informatica Power Analyzer.
Environment: Informatica PowerCenter v9.0.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica PowerCenter v8.6.1, Informatica Power Exchange 9.0 (Striva), Business Objects XI R2 (Data Quality tool 11.7), Informatica FirstLogic IQ Link (ACE Views 7.20c, DataRight IQ Views 7.10c, Match/Consolidate Views 7.30c), Informatica Data Quality (IDQ) v9.0, Informatica PDM, IBM QualityStage, Oracle 11g, Informatica Metadata Reporter, Power Analyzer, UDB Workbench, Oracle 10g, Microsoft SQL Server 2005, SQL Server Reporting Services, Red Gate SQL Compare, Quest Central for DB2 v5.2, UNIX, Tivoli, Erwin.
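The association matching with a rule of transitivity mentioned above means that if record A matches B and B matches C, all three consolidate into one party even though A and C never matched directly. That clustering can be sketched with a union-find structure in Python (record ids and match pairs here are illustrative):

```python
def consolidate(records, matches):
    """Group record ids into clusters under transitive matching.

    records: iterable of record ids
    matches: iterable of (id_a, id_b) pairwise match decisions
    Returns a list of sets, one set per consolidated party.
    """
    parent = {r: r for r in records}

    def find(x):                      # path-halving find
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for a, b in matches:              # union each matched pair
        parent[find(a)] = find(b)

    clusters = {}
    for r in records:
        clusters.setdefault(find(r), set()).add(r)
    return list(clusters.values())
```

Each resulting cluster then receives one surviving "golden" record during consolidation.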
Confidential, Stamford, CT
- Understood the business process and analyzed the source data coming from various source systems.
- Designed the ETL process to load the star schema.
- Ported the existing logic and business rules in the Visual Basic application into Informatica Mappings.
- Followed the Type 2 dimension model to maintain the history of the updates to the dimensions.
- Developed mappings to load data from MS SQL Server, MS Access, Excel spreadsheets, and CSV files into the staging area and then into Oracle targets.
- Developed mappings and mapplets to load complex hierarchical dimensions and implemented complex business rules related to customer credit ratings.
- Migrated Mappings and scripts from Informatica V5.1 to V7.1.
- Developed and implemented Error Handling Strategies.
- Extensively used Lookup, Expression, Filter, Router, Normalizer, Stored procedure, Update Strategy and Sorter transformations.
- Developed materialized views to get filtered data and use it as the source.
- Developed UNIX shell scripts to cleanse the source data.
- Developed stored procedures and used them in Stored Procedure transformation.
- Followed the Ralph Kimball methodology to load the helper table.
- Created Data Flow Diagrams in Microsoft Visio.
- Used SQL Loader, Export/Import utilities to load data from flat files and remote databases into local database.
- Developed UNIX shell scripts to start sessions and run batch jobs.
- Documented Informatica mappings in Excel spreadsheets.
- Constantly interacted with business users to discuss requirements.
- Optimized/tuned mappings for better performance and efficiency.
Environment: Informatica PowerCenter v7.1/5.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Oracle 9i/8i, PL/SQL, Erwin 3.5.2, MS Access, MS Excel, MS Visual Basic.NET, MS SQL Server, SQL*Loader, Autosys 4.5, PeopleSoft, Hyperion Essbase v7.0, Windows NT, UNIX Sun Solaris.
Confidential, Dallas, TX
Data Warehouse Developer
- Developed strategies for integrating data from various source systems and for initial and daily loads.
- Identified and analyzed source data from various source systems, both internal (DB2, SIEBEL) and external (Flat files).
- Using Informatica PowerCenter, created mappings and mapplets to transform the data according to the business rules.
- Created UNIX shell scripts to preprocess the flat files before loading and to start sessions and batches.
- Wrote several ETL functions and procedures in PL/SQL for aggregation and summation purposes.
- Tuned the Informatica mappings for optimal load performance.
- Created sessions and batches using Informatica Server Manager and monitored and scheduled the sessions and batch jobs.
- Created UNIX Shell scripts for Change data Capture (CDC) process.
- Assisted the DBAs in database tuning and in optimizing the load process.
- Designed and developed ETL processes to extract data from the source systems and load into the Data warehouse/Data mart in development environments.
- Assisted the data modelers in creating the data models based on the source systems.
- Created test plans for quality assurance and documented the test plans and load strategies.
- Created summary reports on management review details and sales using Impromptu.
- Created ad hoc slice-and-dice and drill-down reports using Cognos Impromptu.
- Mentored other team members new to Informatica.
- Constantly interacted with the business users, data architects, and DBAs.
Environment: Informatica v6.1/5.1, PowerCenter v6.1/5.1 (Repository Manager, Designer, Server Manager), Oracle 9i/8i/8, PL/SQL, Cognos Impromptu v5.0, MicroStrategy, DB2, Siebel, Erwin, SQL*Loader, Maestro, Trillium, Windows NT, UNIX Sun Solaris.
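The change data capture (CDC) scripts above reduce to diffing today's extract against yesterday's snapshot and classifying each record. A sketch of that comparison logic in Python (the originals were UNIX shell scripts; keying records on a natural key is an assumption):

```python
def capture_changes(previous, current):
    """Classify records by comparing two extracts keyed on a natural key.

    previous, current: dict mapping key -> record value(s)
    Returns (inserts, updates, deletes).
    """
    inserts = {k: v for k, v in current.items() if k not in previous}
    updates = {k: v for k, v in current.items()
               if k in previous and previous[k] != v}
    deletes = set(previous) - set(current)
    return inserts, updates, deletes
```

In a shell implementation the same effect is commonly achieved by sorting both extracts and diffing them; the delta file then drives the incremental load instead of a full refresh.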
Confidential, Piscataway, NJ
- Involved in designing the data model using Erwin and generated the DDL for the DBA to create the tables in the development and production environments.
- Extensively used ETL to load data (metadata) from MS SQL Server 7.0, Sybase, Oracle 8i, flat files, and COBOL files into the staging area and then into DB2 UDB.
- Documented the existing Mappings.
- Cleansed the source data, extracted and transformed data with business rules, and built reusable mappings using Informatica PowerMart Designer.
- Scheduled and ran extraction and load processes and monitored sessions using Informatica Server Manager.
- Developed many procedures, functions, and database triggers using PL/SQL for auditing and enforcing business rules.
- Maintained and enhanced the application as per the new requirements/enhancements.
Environment: Informatica v5.1/4.7.2, PowerMart/PowerCenter v5.1/4.7.2 (Repository Manager,