- Over 8 years of professional IT experience in the analysis, design, development, testing, and implementation of database and data warehouse applications, including the design, development, and production support of data warehouses.
- Extensive experience in Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x to carry out Extraction, Transformation and Loading (ETL) processes.
- Thorough understanding of Data Warehouse and Business Intelligence concepts, including Master Data Management (MDM), with emphasis on Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
- Expert skills in OLAP and in developing database schemas such as Star Schema, Snowflake Schema, Conformed Dimensions, and Slowly Changing Dimensions used in relational, dimensional, and multidimensional modeling.
- Experience in integrating various data sources with multiple relational databases like Oracle, SQL Server, MS Access, DB2, and flat files using Informatica.
- Worked extensively with complex mappings using different transformations like Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, Aggregator, and reusable transformations.
- Experience in using Informatica command-line utilities like pmcmd to execute workflows in non-Windows environments.
- Experience in debugging mappings; Identified bugs in existing mappings by analyzing the data flow and evaluating transformations
- Extensively worked with Stored Procedures, Triggers, Cursors, Indexes, Functions and Packages.
- Expertise in PL/SQL development and SQL Query tuning for better performance.
- Extensive experience with Data Extraction, Transformation, and Loading (ETL) from disparate Data sources like Multiple Relational Databases. Worked on integrating data from flat files, CSV files, and XML files into a common reporting and analytical Data Model.
- Implemented complex business rules in Informatica by creating reusable transformations and robust mappings/mapplets.
- Hands-on experience with various NoSQL databases such as HBase and MongoDB.
- Worked on UNIX shell scripting for pre-session and post-session tasks and automated the scripts using a scheduling tool.
- In-depth understanding of Star Schema, Snowflake Schema, Normalization (1NF, 2NF, 3NF), Fact tables, and Dimension tables.
- Conducted numerous business analysis techniques like GAP Analysis, SWOT, Impact Analysis, Root-cause identification, Cost-Benefit analysis, KPI, Workflow analysis, Feasibility study, and Business Risk Analysis to understand the business pain points.
- Expertise in defining and documenting business process flows (UML) such as Use Case diagrams, Activity diagrams, Sequence diagrams, and Data Flow Diagrams.
- Well versed in organizational change management; conducted assessments for change requests, effectively performing both business and technical impact analysis on the project's triple constraints, with baselined documents endorsed by the Change Control Board.
- Extensive Agile/Scrum experience, utilizing JIRA, Rally, Confluence, TFS (Team Foundation Server), and Tableau for managing sprints, collecting requirements, tracking user stories and bugs, and team management and reporting.
ETL/Data Modeling Tools: Informatica PowerCenter 9.5/9.1/8.6/8.1.1/7.x/6.x (Designer, Workflow Manager & Workflow Monitor), PowerMart 6.2/5.1/4.7, PowerConnect and PowerExchange, Ascential DataStage, ERwin r8.x/r7.x/4.x
Dimensional tools: Data Modeling (Star Schema Modeling, Snowflake Schema Modeling, Fact and Dimension Tables)
BI/OLAP Tools: Business Objects XI/6.5, Cognos 8 BI, ReportNet 1.1/1.0 (Report Studio, Query Studio, Analysis Studio, Framework Manager), Cognos EP Series 5.2/6.0/7.x (Impromptu, PowerPlay, PowerPlay Transformer), Pentaho
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005/2000, MongoDB, Teradata, DB2, MS Access 2003, ANSI SQL, Transact-SQL
Languages: C, C++, SQL, PL/SQL, T-SQL, UNIX Shell Scripting, XML
OS: Windows 10/7/2003/2000/XP/NT/98, IBM AIX 5.1, HP-UX
Packages/Other: Toad 11.6/12.0/8.6.1/8.5.3, SQL*Loader, Teradata SQL Assistant, SQL Query Analyzer, Enterprise Manager, MS Office, Lotus 1-2-3, Excel, TFS, SQL Developer
Confidential, Orrville, Ohio
Sr. ETL/Informatica Developer
- Designed, developed, and implemented ETL processes to extract, transform, and load data from inbound flat files and various source systems into a Data Mart using Informatica PowerCenter.
- Developed mappings to extract data from SQL Server, Oracle, Teradata, SFDC, Siebel, and flat files and load it into Teradata using PowerCenter.
- Worked extensively on different types of transformations like Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Connected and Unconnected Lookup, Sorter, Normalizer, SQL transformation, and Sequence Generator.
- Profiled source data using the IDQ tool to understand source system data representation, formats, and data gaps.
- Worked in an Agile Software Development life cycle to successfully accommodate change in requirements.
- Created Exception handling process and worked on the best practices and standards for exception handling routines.
- Developed mappings to move data from Landing to Stage using various cleanse functions.
- Created Web services in IDQ developer tool and generated WSDL file for real time services.
- Worked on Infrastructure setup for building new applications.
- Used a metadata-driven ETL framework to generate XMLs and for batch execution control.
- Designed and developed process to handle very high volumes of data using Teradata Parallel Transporter (TPT) load protocols like LOAD and UPDATE.
- Expertise in writing BTEQ scripts on Linux. Used BTEQ scripts in pre/post load processes for inserting/updating the process activity tables. Also used Teradata Macros and Query banding.
- Used Informatica Push Down Optimization (PDO) to push the transformation processing from the PowerCenter engine into the relational database to improve performance.
- Wrote shell and Perl scripts to execute workflows, clean up files, run BTEQ scripts, and handle other batch processing.
- Used the Autosys scheduler to create, schedule, and control batch jobs.
- Wrote services to store and retrieve user data from MongoDB for the application on devices.
- Used Network Data Mover to receive/transfer files across multiple source systems.
- Assisted and worked on performance testing, data quality assessment & production deployments.
- Involved in jobs scheduling, monitoring and production support in a 24/7 environment.
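A minimal Python sketch of the wrapper logic described in the scripting bullets above, launching a workflow via pmcmd and purging stale files afterwards, might look like the following. The service, domain, folder, and workflow names are placeholders, and the original scripts were shell/Perl; this is only an illustration of the same logic.

```python
import time
from pathlib import Path

def build_pmcmd_command(service, domain, user, pwd_env, folder, workflow):
    """Assemble a pmcmd startworkflow invocation (all names are placeholders)."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service, "-d", domain,
        "-u", user, "-pv", pwd_env,   # password read from an environment variable
        "-f", folder, "-wait", workflow,
    ]

def purge_old_files(directory, days, pattern="*.dat"):
    """Delete files older than `days` days, mirroring a post-load cleanup step."""
    cutoff = time.time() - days * 86400
    removed = []
    for f in Path(directory).glob(pattern):
        if f.stat().st_mtime < cutoff:
            f.unlink()
            removed.append(f.name)
    return removed
```

In practice the command list would be handed to the scheduler or to `subprocess.run`, with the cleanup step invoked after the workflow completes.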
Environment: Informatica PowerCenter 9.6.1, Teradata 15.10/14.10, Oracle 11g, Teradata SQL Assistant, Autosys, NDM, DTS, SQL, Perl, UNIX Shell Scripts, MongoDB, Linux, Tortoise SVN
Confidential, Pittsburgh, PA
Sr. Informatica Developer
- Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
- Responsible for Data Warehouse Architecture, ETL and coding standards.
- Filtered XML claims files using filter conditions on the D9 segment and converted the filtered claims XML files back to EDI format using a serializer in B2B Data Transformation.
- Led B2B structured and unstructured transformation work, including resolving end-user problems and system failures in B2B transformations.
- Responsible for administration of Informatica PowerCenter 9.5.
- Extensively used transformations like Source Qualifier, Expression, Filter, Aggregator, Joiner, Lookup, Sequence Generator, Router, Sorter, Stored Procedure, and Java transformations; used the Debugger to test mappings and fix bugs.
- Installed and configured PowerCenter 9.5 on UNIX platform.
- Installed and configured Metadata Manager Service, Data Analyzer and Web Service Hub.
- Involved in meetings with key stakeholders of the business and discussed on strategies and roadmaps for Informatica Power Center Grid implementation.
- Created and maintained Users, Groups, Roles and Privileges.
- Provided PowerCenter performance tuning and production support.
- Performed Repository Service backup, recovery, and migration between Dev, UAT, and Prod environments.
- Created and maintained Grids and Nodes.
- Followed complete SDLC process and worked under Agile methodology.
- Created documents and manuals for Informatica installation, configuration, and migration procedures, ETL architecture, and naming standards.
- Worked with existing Python scripts and extended them to load data from CMS files into the Staging database and the ODS.
- Involved in documenting the Deployment plan and created Test Scripts for the Unit test plan.
- Conducted a series of discussions with team members to convert Business rules into Informatica mappings.
- Expertise in designing Universes for Performance Management and Reporting needs.
- Experienced in developing Web Intelligence, Desktop Intelligence and Crystal Reports using different Data Sources.
- Designed web intelligence reports for dimensional hierarchical data
- Created web reports through report template in Web Application Designer.
- Created dictionaries using Informatica Data Quality (IDQ) to cleanse and standardize data. Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
- Extensively used IDQ, which helped in debugging and reduced development time.
- Used the Match and Consolidation transformations in IDQ to reduce duplicates.
- Created several sessions and workflows in IDQ and deployed them in PowerCenter.
- Extracted data from SAP R/3 and loaded into Oracle Data Warehouse.
- Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
- Expertise in Master Data Management concepts, Methodologies and ability to apply this knowledge in building MDM solutions.
- Experience in installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server and Cleanse Adapter in Windows.
- Knowledge on implementing hierarchies, relationships types, packages and profiles for hierarchy management in MDM Hub implementation.
- Developed a stored procedure to compare source data with warehouse data, write missing records to a spool table, and use the spool table as a lookup in a transformation.
- Performed extensive bulk loading into the target using Oracle SQL*Loader.
- Developed ETL process using Pentaho PDI to extract the data from legacy system.
- Did Application tuning and Disk I/O tuning to enhance the performance of the system.
- Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
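The spool-table pattern described above (compare source against warehouse, spool the misses, reuse the spool as a lookup) can be sketched as a single set-based insert. SQLite via Python stands in here for the actual Oracle stored procedure, and all table and column names are illustrative:

```python
import sqlite3

def spool_missing_records(conn):
    """Write source keys not yet present in the warehouse to a spool table,
    so the spool can later feed a lookup transformation (names illustrative)."""
    conn.execute("""
        INSERT INTO spool (cust_id)
        SELECT s.cust_id
        FROM source_stage s
        WHERE NOT EXISTS (SELECT 1 FROM warehouse w WHERE w.cust_id = s.cust_id)
    """)
    conn.commit()
```

The set-based NOT EXISTS form avoids a row-by-row cursor loop, which is usually the faster choice for this kind of reconciliation.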
Environment: Informatica PowerCenter, IDQ, MDM, Oracle, Business Objects, PowerExchange, Flat files, Pentaho PDI 4.2.1, Agile, Pentaho Report Designer, Pentaho Schema Workbench, Windows, MS SQL Server 2008, DB2 8.0, ERwin, SSIS, WinSCP, Control-M, MS Visio, UI, Shell Script, UNIX.
Confidential, San Ramon, CA
ETL/ Informatica Developer
- Responsible for Requirement Gathering Analysis and End User Meetings
- Responsible for Business Requirement Documents (BRDs) and converting functional requirements into technical specifications
- Responsible for mentoring Developers and Code Review of Mappings developed by other developers
- Extracted data from various heterogeneous sources like Oracle, Sybase, SQL Server, and Flat Files.
- Responsible for Production Support and Issue Resolutions using Session Logs, and Workflow Logs
- Extensively used various active and passive transformations like Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank, and Aggregator transformations.
- Responsible for best practices like naming conventions, Performance tuning, and Error Handling
- Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level
- Worked with a team of developers on Python applications for risk management.
- Solid expertise in using both Connected and Unconnected Lookup transformations
- Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache
- Organized and facilitated Agile and Scrum meetings, which included Sprint Planning, Daily Scrums or Standups, Sprint Check-In, Sprint Review & Retrospective.
- Developed Re-Usable Transformations, and Re-Usable Mapplets
- Worked with Shortcuts across Shared and Non-Shared Folders
- Worked with Session Logs and Workflow Logs for error handling and troubleshooting in all environments
- Performed data manipulations using various Informatica transformations like Joiner, Rank, Router, Expression, Lookup, Aggregator, Filter, Update strategy and Sequence generator etc.
- Extensively used Dynamic Lookup transformation and Update Strategy transformation to update slowly changing dimensions.
- Extensively utilized the Debugger utility to test the mappings.
- Extensively worked on Performance Bottlenecks to improve the performance of the sessions and workflows.
- Used workflow manager for creating, validating, testing and running the Sequential and Concurrent batches and sessions and scheduling them to run at specified time with required frequency.
- Created and used different tasks like Decision, Event Wait, Event Raise, Timer and E-mail etc.
- Used Unix Scripting and Scheduled PMCMD to interact with Informatica Server from command mode.
- Wrote and executed various MySQL database queries from Python using the Python-MySQL connector and the MySQLdb package.
- Created database Triggers, Stored Procedures, Exceptions and used Cursors to perform calculations when retrieving data from the database.
- Defined Mapping parameters and variables and Session parameters per requirements and performance related issues.
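Two of the techniques in the bullets above, parameterized DB-API queries from Python and the dynamic-lookup/update-strategy pattern for changing dimensions, can be combined in a small sketch. SQLite stands in for the actual databases, and the schema and a Type 1 overwrite policy are illustrative assumptions:

```python
import sqlite3

def upsert_customers(conn, rows):
    """Type 1 upsert driven by an in-memory key cache: the same idea as a
    dynamic lookup feeding an update strategy (schema illustrative)."""
    cache = dict(conn.execute("SELECT cust_id, name FROM dim_customer"))
    for cust_id, name in rows:
        if cust_id not in cache:
            # Key missing from the lookup cache: flag for insert
            conn.execute("INSERT INTO dim_customer VALUES (?, ?)", (cust_id, name))
        elif cache[cust_id] != name:
            # Key present but attribute changed: flag for update (Type 1 overwrite)
            conn.execute("UPDATE dim_customer SET name = ? WHERE cust_id = ?",
                         (name, cust_id))
        # Mirror Informatica's dynamic lookup: the cache is refreshed as rows
        # pass through, so duplicates in the same batch are handled correctly.
        cache[cust_id] = name
    conn.commit()
```

Refreshing the cache inside the loop is what distinguishes a dynamic lookup from a static one: a second occurrence of the same key in one batch sees the first row's result instead of causing a duplicate insert.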
Environment: Informatica PowerCenter (v9.5), TDM, Data Quality, PowerExchange, Sybase, SQL Server, Oracle 10g/11g, SQL, UNIX, Autosys, Agile, Windows 7.
Confidential, Milwaukee, WI
Sr. ETL/Informatica Developer
- Created detailed Technical specifications for the ETL processes.
- Performed ILIT (Irrevocable Life Insurance Trust) implementation and replacement activities.
- Assisted the team in the development of design standards and codes for effective ETL procedure development and implementation.
- Used Informatica as the ETL tool, along with stored procedures, to pull data from source systems/files, cleanse, transform, and load data into databases.
- Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet Designer, and Transformation Developer.
- Developed Informatica mappings using various transformations, sessions, and workflows. SQL Server was the target database; the sources were a combination of flat files, Oracle tables, PeopleSoft, Excel files, CSV files, etc.
- Worked with different caches such as Index cache, Data cache, Lookup cache (Static, Dynamic, and Persistent), and Join cache while developing the mappings.
- Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
- Involved with the DBA in performance tuning of the Informatica sessions and workflows. Created reusable transformations for better performance.
- Optimized mappings and implemented complex business rules by creating reusable transformations and mapplets.
- Involved in writing UNIX shell scripts for the Informatica ETL tool to run the sessions.
- Fixed and tracked mapping defects and implemented enhancements.
- Managed post-production issues and delivered tasks/projects within specified timelines.
- Involved in the mirroring of the staging environment to production.
- Modified Actuate reports to upload and run reports on servers.
- Worked on Autosys to schedule jobs, define dependencies, etc.
- Collaborated with teams for migration and Production Deployment activities.
- Scheduled Informatica jobs and implemented dependencies as necessary using Tidal Scheduler.
- Responsible for performing SQL query optimization using Hints, Indexes and Explain plan.
- Played a vital role in requirement gathering and in the preparation of the requirement specification.
- Used Perforce as the version control tool for maintaining code revision history.
- Worked on all phases of multiple projects from initial concept through research and development, implementation, QA, to live production, by strict adherence to project timelines.
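The plan-driven tuning loop behind the Hints/Indexes/Explain-plan work above (inspect the access path, add an index, confirm the plan changed) can be illustrated with SQLite's EXPLAIN QUERY PLAN standing in for Oracle's EXPLAIN PLAN; the table and index names are made up for the example:

```python
import sqlite3

def access_paths(conn, sql):
    """Return the access-path detail strings reported by EXPLAIN QUERY PLAN.
    (SQLite stands in for Oracle here; the tuning loop is the same: inspect
    the plan, add or hint an index, then confirm the plan actually changed.)"""
    return [row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]
```

The check that the plan switched from a full scan to an index search is the step that makes this a loop rather than blind index creation.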
Environment: Informatica PowerCenter 8.1, IBM UDB DB2, SQL Server 2008, SSIS, Doc Loader, Embarcadero Rapid SQL 7.7.1, Autosys, Text Exceed 14, UNIX.
Informatica Developer/ Tester
- Designed the ETL mapping document to map source data elements to the target based on the Star-Schema dimensional model
- Performed Client Regression testing with help of Test Cases, Requirement documents and Wireframes.
- Involved in designing of star schema based data model with dimensions and facts
- Tested all the ETL processes developed for fetching data from OLTP systems to the target Market Data warehouse using complex SQL queries.
- Designed and developed complex mappings using various transformations in Designer to extract the data from relational sources like Oracle, SQL Server, and non-relational source like flat files to perform mappings based on company requirements and load into Oracle tables
- Worked extensively in Informatica Designer and Workflow Manager to create sessions and workflows, monitor the results, and validate them against the requirements
- Performed end to end system Integration testing.
- Prepared the test checklist for deployment testing.
- Worked in Informatica Repository Manager to create users and groups, assign read/write/execute privileges by assigning users to groups, and create folders
- Worked on Power Exchange CDC option to distill updates from unchanged background data and to deliver them to multiple targets without intermediate queues or staging tables
- Used different tasks (Session, Command, Decision, Timer, Email, Event-Raise, Event-Wait, Control) in the workflow
- Performance Tuned Informatica Targets, Sources, mappings and sessions for large data files by increasing data cache size, sequence buffer length and target based commit interval
- Created pre-session and post-session shell scripts and mail-notifications
- Worked with tools like TOAD to write queries and generate the result
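At their simplest, the source-to-target validation queries described above reduce to reconciling row counts between the OLTP source and the warehouse. A minimal sketch, with SQLite in place of the real databases and an illustrative table name:

```python
import sqlite3

def counts_match(src_conn, tgt_conn, table):
    """Simplest ETL completeness check: compare source and target row counts
    for one table (real validation would add checksums and column compares)."""
    src = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt = tgt_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return src == tgt
```

A count mismatch is the cheapest signal that a load dropped or duplicated rows, which is why it is usually the first query an ETL tester runs.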
Environment: Informatica PowerCenter 8.1.1 (Designer, Workflow Manager, and Workflow Monitor), Solaris, Oracle 9i, Informatica PowerExchange, SQL Server 2005, PL/SQL, Flat files, XML Files, ERwin, TOAD
- Imported various Sources, Targets, and Transformations using Informatica Power Center Server Manager, Repository Manager and Designer.
- Created and managed the global and local repositories and permissions using Repository Manager in Oracle Database.
- Responsibilities included source system analysis, data transformation, loading, validation for data marts, operational data store and data warehouse.
- Used heterogeneous files from Oracle, Flat files and SQL server as source and imported stored procedures from oracle for transformations.
- Designed and coded maps, which extracted data from existing, source systems into the data warehouse.
- Used Dimensional Modeling Techniques to create Dimensions, Cubes and Fact tables.
- Written PL/SQL procedures for processing business logic in the database. Tuned SQL queries for better performance.
- Scheduled Sessions and Batch Process based on demand, run on time, run only once using Informatica Server Manager.
- Generated completion messages and status reports using Informatica Server manager.
- Tuned ETL procedures and STAR schemas to optimize load and query Performance.
- Started sessions and batches and set up event-based scheduling
- Managed migration in a multi-vendor supported Server and Database environments.
Environment: Informatica Power Center 7.1.2, DB2 v8.0, SQL, Windows 2000, UNIX, SQL Server 2000, Oracle 8i, Flat files, SQL *Plus, Business Objects 5.1.6