- Over seven years of IT experience in the design, development, implementation, and maintenance of data warehouse applications, in both development and administration roles.
- Experience administering and managing multi-node Vertica database clusters, and experience with the Attensity Analyze module.
- Experience in Oracle SQL and PL/SQL, covering all database objects: stored procedures, stored functions, packages, TYPE objects, triggers, cursors, REF cursors, parameterized cursors, views, materialized views, and PL/SQL collections.
- Experience with the Trillium Discovery and Data Quality modules.
- Experience scheduling Autosys jobs and writing JIL scripts.
- Scheduled cron jobs on UNIX.
- Experience in the design, development, and support of data warehouse applications using Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x.
- Experience in development, implementation, administration and support of ETL processes for large scale data warehouses using Informatica Power Center.
- Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL processes.
- Experience in the extraction, transformation, and loading of data from heterogeneous source systems such as flat files (fixed-width and delimited), XML files, COBOL/VSAM files, IBM DB2 UDB, Excel, Oracle, Sybase, MS SQL Server, and Teradata.
- Good understanding of star and snowflake schemas, dimensional modeling, relational data modeling, slowly changing dimensions, and data warehousing concepts.
- Experience using Teradata load utilities (FastLoad, MultiLoad, and TPump) to load large volumes of data into the Teradata RDBMS.
- Experience writing UNIX shell scripts and batch scripts, along with the test plans needed to ensure successful execution of the data loading process.
- Expertise in OLTP/OLAP system study, analysis, and E-R modeling, and in developing database schemas such as star and snowflake schemas.
- Experience in Data Modeling and Project Management.
- Hands on experience in all aspects of Software Development Life Cycle (SDLC).
- Exceptional problem-solving and sound decision-making capabilities, recognized by colleagues for data quality, alternative solutions, and confident, accurate decision making.
- Extensive Data Warehouse experience using Informatica Power Center / Power Mart (Source Analyzer, Repository Manager, Server Manager, Mapping Designer, Mapplet Designer, Transformation Designer) as ETL tool on Oracle Database.
- Strong Experience on Workflow Manager Tools - Task Developer, Worklet & Workflow Designer.
- Hands on experience in tuning mappings, identifying and resolving performance bottlenecks in various levels like targets, sources, mappings and sessions.
- Extensive domain knowledge in banking, insurance, health care, payroll, and finance.
- Involved in all aspects of ETL: requirement gathering, defining standard interfaces for operational sources, data cleansing, devising data load strategies, designing and developing mappings, and unit, integration, regression, and user acceptance testing.
- Working knowledge of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Strong experience supporting Informatica MDM Hub development, with detailed technical knowledge of Informatica MDM products.
- Strong experience in coding using SQL, PL/SQL Procedures/Functions, Triggers and Packages.
- Well versed in developing complex queries, unions, multi-table joins, views, and subqueries.
- Strong experience in the Analysis, design, development, testing and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP, BI, Client/Server applications.
- Team player with the ability to work independently; good interpersonal and communication skills, hardworking, and highly motivated.
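The Autosys and cron scheduling experience above can be illustrated with a minimal sketch. The job name, machine, owner, script path, and schedule below are all hypothetical placeholders, and the JIL attributes shown are the common ones for a command job:

```shell
#!/bin/sh
# Minimal sketch of defining an Autosys command job in JIL.
# All names and paths here are illustrative, not from a real system.
emit_jil() {    # emit_jil <job-name> <machine> <command>
  cat <<EOF
insert_job: $1   job_type: cmd
machine: $2
command: $3
owner: etladmin
start_times: "02:00"
std_out_file: /var/log/autosys/$1.out
std_err_file: /var/log/autosys/$1.err
alarm_if_fail: 1
EOF
}

# The equivalent cron schedule for the same nightly load would be:
#   0 2 * * * /opt/etl/bin/nightly_load.sh >> /var/log/etl/nightly.log 2>&1
emit_jil nightly_dw_load etlhost01 /opt/etl/bin/nightly_load.sh
```

The generated JIL would normally be fed to the `jil` command to register the job; here it is only printed.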
ETL Tools: Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x, SSIS, Informatica MDM 9.x
Data Modeling: Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, Facts, Dimensions), Erwin Data Modeler 7.2/4.0, Normalization, De-normalization and MS Visio
Databases: Oracle 11g/10g/9i, Vertica 6.0, MS SQL Server 2008/2005, MS Access.
Programming: SQL, PL/SQL, C, C++, SQL*Loader, UNIX, HTML, Core Java, JavaScript, VBScript
Other Tools: Attensity Analyze 6.2, Trillium Discovery 14.0, JBoss, WebSphere, WebLogic, SQL Developer, OBIEE 11g, SQL*Plus, TOAD, SQuirreL 3.5, Excel, Word, Remedy, TortoiseSVN, Outlook, Siebel Analytics 7.x, SugarCRM, Salesforce, Zen Cart, ACT-DB, ODBC
Operating Systems: UNIX, Windows (7, XP, NT/95/98/2000/2003).
Reporting Tools: MS SQL Server Reporting Services 2005/2008, Business Objects XI, SAS
Confidential, Hartford CT
Informatica Admin & Application Analyst
- Working on the Attensity Analyze module.
- Setting up the Vertica database in a 3-node cluster environment.
- Adding new nodes to the cluster for data management, K-safety, and performance.
- Setting up and configuring the Vertica Management Console for database monitoring.
- Tuning the Vertica database, and collecting and analyzing its diagnostic logs for issues.
- User and security management for the Vertica database.
- Configured SQuirreL to browse the Vertica database and wrote PL/SQL bulk collection and DML (insert/update/delete) code to validate data.
- Responsible for installing, configuring, and maintaining applications, including web servers, in support of business requirements.
- Performed software installations and upgrades to application software packages.
- Scheduled installations and upgrades and maintained them in accordance with established IT policies and procedures.
- Ensured data/media recoverability by implementing a schedule of application backups and database archive operations.
- Administered application servers such as Apache Tomcat, JBoss, WebLogic, and WebSphere, including their JVMs.
- Experience implementing the back end of enterprise web applications.
- Working with Job scheduling tools like Autosys.
- Ensure that all support requests are properly approved, documented, and communicated using the Remedy tool.
- Worked on the Trillium Discovery and Data Quality modules; responsible for repository creation and the creation of OS profiles.
- Responsible for analyzing the rules/logic in Trillium and replicating them in Informatica Data Quality.
- Responsible for Trillium Discovery Linux POC.
- As Informatica admin, responsible for managing the Development, QA, and Production environments and ensuring that code is in sync across all three.
- Worked with Siebel EIM tables and tweaked the batch file (.ifb) to process a larger number of rows at a time.
- Creation and maintenance of Informatica users and privileges.
- Responsible for outage handling: password changes, patch installation (hotfixes/software), bouncing or restarting services, and network changes.
- Extensively worked on UNIX shell scripts on automation (Auto restart of servers, Disk space utilization, Informatica Server monitoring and UNIX file system maintenance and cleanup, and infacmd and pmrep operations).
- Expert in accessibility of web-based information and applications.
- Worked on Informatica Mapping Architect for Visio to automate Mappings, Sessions and Workflow generation.
- Migration/Testing/Deployment of Informatica Mappings/Sessions/Workflows from Dev, QA to Prod environments.
- Experience integrating external business applications with the Informatica MDM Hub using batch processes, SIF, and message queues.
- Expertise with Siebel base tables and EIM workflows.
- Worked on object queries, resolving conflicts, and deleting and purging objects.
- Administered distributed systems processing large amounts of streaming data.
- Responsible for SOA, data integration, data quality, data architecture, master data management, and project life cycle phases.
- Responsible for program design, coding, testing, debugging, and documentation.
- Customized the Siebel repository, defined the physical data layer, and developed the business model and presentation layer using the Siebel Analytics Administration Tool and OBIEE.
- Experience with Informatica MDM Hub configuration: data modeling and data mappings (landing, staging, and base objects), data validation, match and merge rules, writing and customizing user exits, and customizing/configuring Business Data Director (BDD)/Informatica Data Director (IDD) applications.
- Responsible for the reliability and scalability of 24x7 platforms.
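The UNIX automation bullets in this role (disk space utilization checks, file system maintenance and cleanup) can be sketched with a small housekeeping script. The mount point, log directory, and retention period are illustrative assumptions:

```shell
#!/bin/sh
# Sketch of UNIX housekeeping automation: warn when a filesystem crosses a
# usage threshold, and purge files older than a retention window.
# Paths and thresholds below are illustrative.

check_disk() {    # check_disk <mount> <threshold-pct>
  used=$(df -P "$1" | awk 'NR==2 {sub(/%/, "", $5); print $5}')
  if [ "$used" -ge "$2" ]; then
    echo "WARN: $1 at ${used}% (threshold $2%)"
  fi
}

purge_old() {     # purge_old <dir> <days>
  find "$1" -type f -mtime +"$2" -print -exec rm -f {} \;
}

check_disk / 90
# purge_old /opt/informatica/logs 7   # e.g. keep one week of server logs
```

A cron entry would typically drive this nightly; the warning output would be mailed or appended to a monitoring log.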
Environment: Vertica 6.0, Attensity Analyze 6.2, Trillium Discovery 14.0, JBoss, WebSphere & WebLogic application servers, OBIEE 11g, Informatica PowerCenter 9.0.1/8.6.1, Informatica MDM 9.x, IDQ 9.0.1/8.6.1, PowerMart, IDE, PowerCenter (Repository Manager, Designer, Service Manager, Workflow Monitor, Workflow Manager), Oracle 11gR2/10g, SQL, PL/SQL Developer, Python, Teradata V2R6, Siebel, Java, UNIX shell scripting, Perl, Autosys, Windows XP, Red Hat Linux.
Confidential, Boston, MA
- Translated T-SQL procedures, functions, and views into Oracle PL/SQL packages, procedures, functions, and views.
- Developed performance utilization charts, optimized and tuned SQL, and designed physical databases; assisted developers with Teradata load utilities and SQL.
- Imported source/target tables from the respective SAP R/3 and BW systems, created reusable transformations (Joiner, Router, Lookup, Rank, Filter, Expression, and Aggregator) inside a mapplet, and created new mappings in the Designer module of Informatica PowerCenter to implement the business logic and load the customer healthcare data both incrementally and in full.
- Created Complex mappings using Unconnected Lookup, and Aggregate and Router transformations for populating target table in efficient manner.
- Developed custom reports/Ad-hoc queries using Answers and assigned them to application specific dashboards.
- Used Teradata utilities like MultiLoad, Tpump, and Fast Load to load data into Teradata data warehouse from Oracle and DB2 databases.
- Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Worked with the SAP functional team, using SAP Solution Manager and the service manager tool to transport programs from one level to another in the SAP landscape (Development, UAT, regression testing, and Production).
- Wrote UNIX shell scripts and pmcmd commands for FTPing files from remote servers and backing up the repository and folders.
- Worked in a WebCenter Portal position, assisting in creating the connections between Portal and SOA.
- Worked extensively on performance tuning of Informatica PowerCenter mappings, as well as tuning sessions by creating multiple sessions partitioned on primary key values.
- Optimized the mappings using various optimization techniques, and debugged existing mappings using the Debugger to test and fix them.
- Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
- Created/enhanced Teradata stored procedures to generate automated testing SQL.
- Used PL/SQL and SQL*Loader to create ETL packages for flat file loading and error capturing into log tables.
- Designed and developed a series of complex business intelligence solutions using Pentaho Report Designer.
- Created various tasks like Session, Command, Timer and Event wait.
- Migrated mappings, sessions, and workflows from Development to Testing and then to Production environments.
- Populated and refreshed Teradata tables using the FastLoad, MultiLoad, and FastExport utilities for user acceptance testing and for loading history data into Teradata.
- Worked with implicit cursors, explicit cursors, and REF cursors.
- Prepared the complete data mapping for all the migrated jobs using SSIS.
- Provided production support by monitoring the processes running daily.
- Scheduled the Jobs by using Informatica scheduler.
- Worked on the creation and configuration of Siebel dashboards, customized reports, and charts (Sales & Marketing).
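The pmcmd usage described above can be sketched as a small shell wrapper of the kind typically run from cron or Autosys. The domain, service, user, folder, and workflow names are hypothetical, and the password is passed via the `-pv` environment-variable option so it never appears on the command line:

```shell
#!/bin/sh
# Sketch of a shell wrapper around pmcmd to start an Informatica workflow.
# Connection details and workflow names are hypothetical placeholders.
INFA_DOMAIN="Domain_ETL"
INFA_SERVICE="IS_ETL"
INFA_USER="etladmin"

run_workflow() {    # run_workflow <folder> <workflow>
  set -- pmcmd startworkflow \
    -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
    -u "$INFA_USER" -pv INFA_PASSWD \
    -f "$1" -wait "$2"
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "$@"       # dry run: show the command instead of executing it
  else
    "$@"
  fi
}

DRY_RUN=1           # no Informatica server here, so only print the command
run_workflow DW_SALES wf_load_sales_fact
```

With `-wait`, the script's exit status reflects the workflow result, which makes it easy for a scheduler to detect failures.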
Environment: Informatica PowerCenter 9.0.1/8.6.1/7.1.4, IDQ 9.0.1/8.6.1, PowerMart, IDE, PowerCenter (Repository Manager, Designer, OLAP, Service Manager, Workflow Monitor, Workflow Manager), Business Objects XI/6.0, Cognos 7, Erwin 4.5, Oracle 11gR2/10g, SQL, PL/SQL Developer, Teradata V2R6, OBIEE 11g, WinSQL, MS SQL Server 2005/2000, SSIS, database systems, flat files, Siebel, SAP ABAP, Windows XP, Java, UNIX shell scripting, Perl, Autosys, Python
Confidential, Cambridge, MA
Data Warehouse Developer
- Responsible for activities related to the development, support, implementation of ETL processes for large scale data warehouse using Informatica Power Center.
- Involved in on-site Production Support and change management team and was responsible for smooth onshore-offshore co-ordination, change/release management, user interaction and production support.
- Worked with the Informatica DVO (Data Validation Option) tool to compare data before and after the Informatica upgrade.
- Created ETL mappings using Informatica PowerCenter to extract data from multiple sources such as flat files, Oracle, XML, CSV, and delimited files, transform it per business requirements, and load it into the data warehouse.
- Experience using various Oracle PL/SQL collections: VARRAYs, nested tables, and associative arrays (INDEX BY VARCHAR2).
- Well acquainted with Informatica Designer Components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet and Mapping Designer.
- Worked extensively with complex mappings using different transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected / Connected Lookups and Aggregators, Stored Procedures and Normalizer transformations.
- Experience integrating external business applications with the Informatica MDM Hub using batch processes, SIF, and message queues.
- Prepared implementation plans for moving code to production; performed system testing and monitoring.
- Developed extensions and customizations to the services (SIF) tier of the Informatica MDM Hub.
- Used Teradata utilities like MultiLoad, Tpump, and Fast Load to load data into Teradata data warehouse from Oracle and DB2 databases.
- Worked on SSIS Package, DTS Import/Export for transferring data from Database (Oracle and Text format data) to SQL Server.
- Used BULK COLLECT INTO/SELECT INTO statements to retrieve multiple rows in a single fetch, improving the speed of data retrieval.
- Conducted MDM unit tests and code reviews, and participated in MDM design reviews, including database code design reviews.
- Fixed various defects in a set of wrapper scripts that executed the Teradata BTEQ, MLOAD, and FLOAD utilities and UNIX shell scripts.
- Developed the strategy for implementing data profiling, data quality, data cleansing, and ETL metadata.
- Created translation and cleansing scripts using PL/SQL packages, procedures, functions, and SQL; used Oracle performance tools to optimize and tune SQL scripts, reducing extraction time by 70%.
- Developed PLSQL/SQL code using information from Metadata. Wrote PL/SQL, shell, SQL, and Informatica scripts as needed.
- Created batch scripts and updated the scripts as per the requirement, Developed them for inbound and outbound of the data on servers.
- Migrated the code into QA (Testing) and supported QA team and UAT (User).
- Worked on time to time enhancements, bug fixes, performance tuning of the existing mappings.
- Implemented two end-to-end Siperian MDM/Informatica MDM engagements, including Business Data Director (BDD)/Informatica Data Director (IDD).
- Maintained large data sets, combined data from various sources in varying formats to create SAS data sets and/or ASCII files.
- Designed and developed Informatica Mappings to load data from Source systems to ODS and then to Data Mart.
- Configured Informatica MDM on Oracle / JBoss
- Provided 24/7 production support.
- Assisted in batch processes using FastLoad, BTEQ, UNIX shell, and Teradata SQL to transfer, clean up, and summarize data.
- Worked along with UNIX team for writing UNIX shell scripts to customize the server scheduling jobs.
- Worked with Teradata Queryman to validate the data in the warehouse for sanity checks.
- Extracted data from Flat files, DB2, SQL and Oracle to build an Operation Data Source. Applied business logic to load the data into Global Data Warehouse.
- Extracted data from Oracle database transformed and loaded into Teradata database according to the specifications.
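The FastLoad batch loading mentioned above can be sketched as a shell function that generates a control script for a pipe-delimited flat file. The database, table, column, and file names are hypothetical, the logon line is a placeholder, and in practice the generated script would be piped to the `fastload` utility:

```shell
#!/bin/sh
# Sketch of generating a Teradata FastLoad control script.
# Names below are illustrative; the LOGON line is a placeholder that a real
# wrapper would fill in with actual credentials.
gen_fastload() {    # gen_fastload <db.table> <datafile>
  cat <<EOF
LOGON tdprod/etl_user,\${TD_PASSWD};
SET RECORD VARTEXT "|";
BEGIN LOADING $1 ERRORFILES ${1}_err1, ${1}_err2;
DEFINE cust_id (VARCHAR(18)),
       cust_name (VARCHAR(60)),
       FILE=$2;
INSERT INTO $1 VALUES (:cust_id, :cust_name);
END LOADING;
LOGOFF;
EOF
}

gen_fastload dw_stg.customer /data/in/customer.dat    # ... | fastload
```

FastLoad requires the target table to be empty and uses the two error tables to capture constraint and conversion failures, which is why the ERRORFILES clause is part of BEGIN LOADING.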
Environment: Informatica PowerCenter 8.6.1/8.6.0, Informatica MDM 9.x, Oracle 11g/10g, TOAD, JBoss, WebSphere & WebLogic application servers, Windows 2008, OBIEE 10.1, DVO Client, Informatica PowerConnect (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), PowerExchange, ETL, flat files, MS SQL Server 2008, PL/SQL, SQL*Loader, Excel.
Confidential, Parsippany, NJ
- Experienced in translating high-level design specs into simple ETL coding and mapping standards. Designed the mapping document, detailed design documents, and install and release documents, which serve as guidelines for ETL coding.
- Designed and developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Used transformations like Connected and Unconnected Lookups, Aggregator, Expression, Joiner, Update, Router and Sequence generator.
- Worked with various Informatica Power Center tools --Source Analyzer, Data warehousing designer, Mapping Designer & Mapplet, Transformations.
- Created reusable transformations and Mapplets and used them in complex mappings.
- Involved in performance tuning of the ETL processes.
- Developed and scheduled Workflows using task developer, worklet designer and workflow designer in Workflow manager and monitored the results in Workflow monitor.
- Used workflow Manager for Creating, Validating, Testing and Running the sequential and concurrent Batches and Sessions, and scheduled them to run at a specified time.
- Automated quality check tasks by creating PL/SQL procedures, cursors, functions, and dynamic SQL; enhanced them with exception handling for errors. Created triggers for auditing purposes, and created group, tabular, and form reports.
- Created Shortcuts, reusable transformations and Mapplets to use in multiple mappings.
- Migrated mappings from Development to Testing and to Production.
- Performed unit testing and Involved in tuning the Session and Workflows for better performance.
- Used Debugger to test the mapping and fixed the bugs.
- Worked with UNIX shell scripting.
- Developed technical infrastructure designs, data mappings, flows and report dissemination mechanisms by architecting Data Warehouses and Marts.
- Developed Mapplets using corresponding Source, Targets and Transformations.
- Tuned Informatica Mappings and Sessions for optimum performance.
- Used PowerExchange CDC to capture changes in the sources and load those changes into the customer dimension.
- Implemented various automated UNIX shell scripts to invoke PL/SQL anonymous blocks and stored procedures/functions/packages through SQL*Plus sessions in silent mode.
- Implemented various customized Oracle reports using different techniques in Oracle SQL/PL/SQL.
- Used Autosys for scheduling the jobs.
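The pattern of invoking anonymous PL/SQL blocks through silent SQL*Plus sessions, described above, can be sketched as follows. The connect string, table names, and refresh logic are hypothetical; when no Oracle client is on the PATH, the script simply prints the block it would have executed:

```shell
#!/bin/sh
# Sketch of running an anonymous PL/SQL block via SQL*Plus in silent mode.
# Connect string and SQL are illustrative placeholders.
plsql_block() {
  cat <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
SET SERVEROUTPUT ON
BEGIN
   -- hypothetical nightly refresh of a reporting snapshot
   DELETE FROM rpt_daily_snapshot;
   INSERT INTO rpt_daily_snapshot
      SELECT * FROM dw_sales_fact WHERE load_dt = TRUNC(SYSDATE);
   COMMIT;
   DBMS_OUTPUT.PUT_LINE('snapshot refreshed');
END;
/
EXIT
EOF
}

if command -v sqlplus >/dev/null 2>&1 && [ -n "$ORA_CONN" ]; then
  plsql_block | sqlplus -S "$ORA_CONN"    # e.g. ORA_CONN='etl/secret@DWPROD'
else
  plsql_block
fi
```

`WHENEVER SQLERROR EXIT FAILURE` makes SQL*Plus return a nonzero exit status on error, so a scheduler such as Autosys can detect a failed run.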
Environment: Informatica Power Center 8.6, Informatica Power Connect, Power Exchange, ETL, Flat files, Oracle 11g/10g, MS SQL Server 2008, PL/SQL, Autosys, Shell Programming, SQL * Loader, Toad, JAVA, Excel and Unix scripting, Windows
Confidential, White Plains, NY
Data Warehouse Informatica Developer
- Updated mappings, sessions, and workflows as part of ETL changes.
- Developed and documented data mappings/transformations and Informatica sessions per the business requirements.
- Wrote UNIX shell scripts for various purposes.
- Scheduled the workflows in the Workflow Manager.
- Called Oracle stored procedures as pre-session SQL.
- Provided technical team leadership support for the full systems development life-cycle (SDLC) Data Warehouse.
- Created reusable failure and success email tasks.
- Used Trillium software to remove duplicates from the data.
- Worked with Slowly Changing Dimensions (Types 1, 2, and 3), fact tables, star schema design, the Operational Data Store (ODS), leveling, and other data warehouse concepts.
- Worked with SQL tools such as TOAD to run SQL queries and validate the data.
- Designed and developed Oracle PL/SQL procedures and wrote SQL and PL/SQL scripts for extracting data to the system.
- Made modifications to existing ETL code and documented the changes.
- Upgraded from Informatica PowerCenter 8.1 to Informatica PowerCenter 8.6.
- Created repositories in the Development, QA, and Production environments.
Environment: Informatica PowerCenter 8.6/8.1, Informatica PowerConnect (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), PowerExchange, PowerAnalyzer, ETL, Oracle 11g/10g, Teradata V2R5, PL/SQL, ODI, UNIX commands, Trillium 11, Excel, shell scripting, MS Windows XP Professional 2002, UNIX.
- Implemented Star Schema and Snow Flake schema for the data warehouse.
- Designed and developed the data warehouse environment.
- Translated Business processes into Informatica Mappings for building Data marts.
- Importing various Sources, Targets, and Transformations using Informatica Power Center
- Responsible for Logical & Physical modeling as per business requirements.
- Developed various transformations like Source Qualifier, Joiner transformation, Update Strategy, Lookup transformation, Stored procedure Transformations, Expressions and Sequence Generator for loading the data into target Data Mart.
- Modified the existing mappings and created the new ones based on specification documents.
- Made extensive use of Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers for managing data flow into multiple targets.
- Developed user defined functions based on requirements. Developed back end interfaces using PL/SQL Stored Packages, Procedures, Functions, Collections and Triggers.
- Created PL/SQL procedures and packages.
- Developed various reusable transformations using the Transformation Developer in the Informatica Power Center Designer.
- Implemented the Slowly Changing Dimensions as per the requirement.
- Debugging & Performance Tuning of the Informatica mappings and sessions.
- Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
Environment: Informatica Power Center 7.1, SQL Server 7.0, Oracle 10g, Windows 2003, SQL, PLSQL.
- Resolved technical issues related to ETL through Informatica to maintain data integrity.
- Extensively worked on writing complex SQL queries (cursors, ref cursors, sub queries, correlated sub queries).
- Provided support for multiple business groups and managed SDLC for multiple projects to satisfy business needs.
- Developed Database Triggers for audit and validation purpose. Used pipelined functions for faster data accessibility.
- Writing validation packages using PL/SQL package.
- Involved in analysis and development of the data warehouse.
- Translation of Business Processes into data mappings for building the Data Warehouse.
- Prepared the detailed Low level design document for the bug fixing.
- Worked closely with teams to discuss and determine the scope of project phases, and documented the mappings.
- Used Informatica Power center tools to extract data from multiple sources, which include relational sources, flat files and XML files.
- Implemented performance tuning techniques at system, session, mapping, transformation, source and target levels
- Performing Independent Unit and Integrated Testing.
Environment: Informatica 7.1, Oracle 10g, SQL*Plus, Windows 2003
ETL Informatica Developer
- Involved in Business Analysis and requirement gathering.
- Designed the mappings between sources to operational staging targets.
- Used Informatica Power Center as an ETL tool for building the data warehouse.
- Employed Aggregator, Sequence Generator, and Joiner transformations in populating the data.
- Transformed data from SQL Server databases and loaded it into an Oracle database.
- Developed PL/SQL and UNIX Shell Scripts for scheduling the sessions in Informatica.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
Environment: Informatica Power Center 6.2/5.1, Oracle 9i, SQL, PL/SQL