Data Warehousing Resume
Summary
- Data Warehousing: 8 years of solid experience in end-to-end implementation of Data warehousing projects, including Business Requirements gathering, Analysis, System study, preparation of Functional & Technical specifications, Design (Logical and Physical model), Coding, Testing, Code migration, Implementation, System maintenance, Support, and Documentation.
- ETL: 7 years of experience in Data Warehousing and ETL using Informatica Power Center 8.6.1/7.x/6.2/5.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), PowerConnect, PowerAnalyzer, Data marts, OLAP, OLTP, SQL, T-SQL, PL/SQL, Stored Procedures, Database Triggers, Data Profiling, Materialized Views, Complex Analytical Functions, Performance Tuning, and Unix shell scripting.
- Data Modeling: Expertise in developing and building enterprise-level Data warehouses and subject-oriented Data marts using Ralph Kimball and Bill Inmon methodologies, multidimensional data modeling using Star and Snowflake schema design, Surrogate keys, Facts & Dimensions, and Erwin 4.5/4.0.
- Data Cleansing: 4 years of data cleansing experience using Trillium 7.6/6.5 (Converter, Parser & Geocoder) and First Logic 4.2/3.6.
- Business Intelligence: 5+ years of Business Intelligence experience using Cognos 8.0/7.0/6.0, MicroStrategy 8.0, iDashboard, and Seagate Crystal Reports 6.x/5.x.
- Databases: Expertise in using Oracle 10g/9i/8i/8.0/7.0, DB2 8.0/7.0/6.0, DB2/400, MS SQL Server 2005/2000/7.0/6.0, Sybase 12.x/11.x, Informix, MS Access 7.0/2000, SQL, XML, Netezza, Teradata.
Technical Skills
Data Warehousing: Informatica Power Center 8.6.1/7.x/6.2/5.1, Data mart, OLAP, OLTP, SQL*Loader, Oracle Warehouse Builder
Dimensional Data Modeling: Star & Snowflake schema design, Fact and Dimension tables, Logical & Physical design, 3NF, Erwin 4.5/4.0/3.5/3.2, Oracle Designer, Visio
Data Cleansing: Trillium 7.6/6.0 (Converter, Parser, Geocoder, Matcher), First Logic
BI Tools: Cognos ReportNet 1.01/1.1 (Report Studio & Query Studio), Cubes, Cognos Framework Manager, MicroStrategy, Business Objects XI/6.5/6.0/5.1, Tableau, iDashboard
Databases: Oracle 10g/9i/8i/8.0/7.x, IBM DB2 UDB 8.0/7.0, DB2/400, MS SQL Server 2000/7.0/6.0, Sybase 12.x/11.x, Informix, MS Access, Netezza, Teradata
Programming: C, C++, PL/SQL, HTML, XML, Unix Shell Scripting, Perl
Other Tools: WinCVS (version control), Tidal Scheduler, Quality Center 9.0/Test Director, Remedy, WinSQL, TOAD, PuTTY, WinSCP, Microsoft Project, Visio
Environment: Windows NT, Sun Solaris 5.5/5.4, IBM AIX 5.3/4.3, Red Hat Linux, Windows XP, MS-DOS
PROFESSIONAL EXPERIENCE
Confidential, PA Feb 2009 – Mar 2011
Confidential is one of the nation's leading providers of entertainment, information, and communication products and services. Comcast connects more than 24 million people nationwide to cable, digital telephone service, the Internet, and some of television's most popular programming.
Responsibilities:
- Involved in end-to-end implementation of Data warehousing projects, including Business Requirements gathering, Analysis, System study, preparation of Functional & Technical specifications, Design (Logical and Physical model), Coding, Testing, Code migration, Implementation, System maintenance, Support, and Documentation.
- Developed complex mappings using Informatica Power Center Designer to transform and load data from various source systems (DB2, DB2/400, Oracle, SQL Server, and flat files) into Staging, ODS, the Enterprise Data warehouse, and subject-oriented Data marts for the Sales, Marketing, and Finance departments.
- Extensively used various types of Informatica transformations such as Source Qualifier, Expression, Lookup (Connected and Unconnected), Aggregator, Joiner, Filter, Router, Update Strategy, Stored Procedure, Rank, Normalizer, Sequence Generator, and Sorter, along with Reusable transformations, Mapplets, and Lookup Caches (Static, Dynamic).
- Developed various Informatica mappings, sessions, and workflows; automated jobs using Unix scripts and the Tidal job scheduling tool.
- Developed PL/SQL stored programs (procedures and functions) to perform data transformations and integrated them with Informatica programs (a sketch follows this list).
- Used Session parameters, Mapping variables/parameters, and Parameter files to allow flexible workflow runs driven by changing parameter values.
- Created UNIX shell scripts for automation and to invoke stored procedures for the control process and parameter file generation.
- Extensively worked on source/target flat files and used FTP, SFTP, and SSH in Unix scripts.
- Administered the repository by creating folders and logins for group members and assigning the necessary privileges.
- Used partitioning and sub-partitioning methods in Oracle to improve the efficiency of loading and reporting the data, and used local partitioned indexes to enable parallel loading of data into the various partitions/sub-partitions (see the partitioning sketch after this list).
- Worked on bug fixes for existing Informatica mappings to produce correct output.
- Designed ETL procedures to efficiently process large volumes of data.
- Implemented complex business rules using complex transformations, Oracle packages, stored procedures, functions, triggers, and analytical functions.
- Fine-tuned ETL processes (Informatica mappings, sessions, and workflows) and Oracle SQL queries and PL/SQL procedures to improve the efficiency of the load process.
- Identified and analyzed data discrepancies and data quality issues and worked to ensure data consistency and integrity. Performed audits on ETLs and validated source data against target table loads.
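The PL/SQL stored programs noted above could look like the following minimal sketch; the procedure, table, and column names are hypothetical, not the actual production objects:

```sql
-- Hypothetical cleanup procedure called from an Informatica Stored Procedure
-- transformation (or a pre-session task): strips non-digit characters from
-- phone numbers in the current staging batch before the warehouse load.
CREATE OR REPLACE PROCEDURE stg_clean_phone (p_batch_id IN NUMBER) AS
BEGIN
  UPDATE stg_customer
     SET phone_number = REGEXP_REPLACE(phone_number, '[^0-9]', '')
   WHERE batch_id = p_batch_id
     AND phone_number IS NOT NULL;
  COMMIT;
END stg_clean_phone;
/
```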
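The Oracle partitioning approach can be illustrated with a simplified, hypothetical fact table; the LOCAL index is maintained per partition, which is what allows partitions to be loaded and indexed in parallel:

```sql
-- Hypothetical monthly range-partitioned fact table.
CREATE TABLE fact_sales (
  sale_id     NUMBER,
  customer_id NUMBER,
  sale_date   DATE,
  amount      NUMBER(12,2)
)
PARTITION BY RANGE (sale_date) (
  PARTITION p_2009_01 VALUES LESS THAN (TO_DATE('2009-02-01','YYYY-MM-DD')),
  PARTITION p_2009_02 VALUES LESS THAN (TO_DATE('2009-03-01','YYYY-MM-DD')),
  PARTITION p_max     VALUES LESS THAN (MAXVALUE)
);

-- Local (per-partition) index: each partition's index segment is built
-- independently, so parallel loads do not contend on a single global index.
CREATE INDEX ix_fact_sales_cust ON fact_sales (customer_id) LOCAL;
```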
Environment: Informatica Power Center 8.6.1, Oracle 10g/9i, DB2/400, SQL Server 2005, SQL*Plus, PL/SQL, TOAD, AS/400, Erwin 4.0, Windows NT, Putty, IBM AIX, WinSCP, Netezza, Tidal Scheduler (job scheduling), Quality Center 9.0, WinCVS (version control), Remedy, Cognos Cubes and ReportNet (Report Studio and Query Studio).
Confidential, NJ Jun 2007 - Jan 2009
Confidential offers a full line of property, casualty, and life insurance products to protect businesses, cars, homes, lives, and retirement incomes, including AIG's individual life insurance, group insurance coverage, annuities, and pensions.
Responsibilities:
- Involved in the analysis, design, development, maintenance, and support of the Data warehouse.
- Interacted with Data/Business Analysts and prepared Functional and Technical specifications.
- Created Unit Test Plan Scenarios.
- Analyzed Informatica mappings, SQL queries and Procedures and optimized them.
- Troubleshot and resolved load failures and other issues within SLA.
- Developed complex mappings using Informatica Power Center Designer to transform and load the data from various source systems like Flat files, SQL Server, Oracle to Oracle target database and flat files.
- Extensively used various types of transformations: Source Qualifier, Expression, Sorter, Aggregator, Joiner, Lookup (Connected and Unconnected), Filter, Router, Update Strategy, Stored Procedure (connected and unconnected), Rank, Normalizer, Sequence Generator, Mapplets, and Lookup Caches (Static and Dynamic).
- Used Session parameters, Mapping variables/parameters, and Parameter files to allow flexible workflow runs driven by changing parameter values.
- Created reusable Transformations and Mapplets to use in multiple Mappings where and when required. Created Joiner Transformation to join heterogeneous sources using various kinds of joins like Normal Join, Master Outer Join, Detail Outer Join, and Full Outer Join.
- Reviewed and analyzed functional requirements and mapping documents; assisted in problem solving and troubleshooting.
- Managed the transformation, delivery, and exchange of data between internal and external business systems across all business units, upstream and downstream.
- Developed test cases and plans to complete unit testing and supported system testing. Performed audits on ETLs and validated source data against target table loads. Provided on-call support.
- Worked with different application groups to architect, design, and deliver effective solutions.
- Designed and developed pre-session and post-session routines for Informatica sessions to drop and recreate indexes and key constraints for bulk loading (a sketch follows this list).
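The pre-/post-session routines above might issue SQL along the following lines; the index, table, and constraint names are illustrative only:

```sql
-- Pre-session: drop the index and disable the constraint so the bulk load
-- is not slowed by index and constraint maintenance.
DROP INDEX ix_ods_policy_cust;
ALTER TABLE ods_policy DISABLE CONSTRAINT fk_policy_customer;

-- Post-session: rebuild the index and re-enable the constraint after the load.
CREATE INDEX ix_ods_policy_cust ON ods_policy (customer_id);
ALTER TABLE ods_policy ENABLE CONSTRAINT fk_policy_customer;
```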
Environment: Informatica Power Center 8.6.1/7.1 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Worklets), Oracle 10g/9i, ANSI SQL, SQL Server, PL/SQL, Flat Files, Windows NT, IBM AIX, Control-M, WinCVS, Teradata (BTEQ, FastLoad, MultiLoad, TPump, FastExport, Parallel Transporter), Test Director 8.0/Quality Center 9.0.
Confidential, MA Feb 2007 – Jun 2007
Responsibilities:
- Involved in Dimensional Data Modeling and in populating business rules via mappings into the repository for Metadata Management.
- Designed and developed Informatica Mappings for data load and data cleansing
- Extracted and transformed data from various sources such as flat files and MS SQL Server, and loaded it into the target Oracle 9i data warehouse and flat files.
- Created Stored Procedures, Functions and Triggers as per the business requirements
- Used Update Strategy and Lookup transformations to insert, delete, update, or reject records based on business requirements. Designed and developed complex Aggregator, Joiner, and Lookup transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using the Informatica ETL tool (see the aggregation sketch after this list).
- Developed several complex Mappings, Mapplets, and Reusable Transformations to facilitate one-time, weekly, monthly, and yearly loading of data.
- Utilized the Aggregator, Joiner, Router, Lookup, and Update Strategy transformations to model various standardized business processes. Used SQL*Loader for bulk loading. Developed Informatica scripts to populate the Global repository from the local repository.
- Worked with the scheduler to run Informatica sessions on a daily basis and to send an email after loading completed. Conducted unit testing and integration testing.
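The consolidated fact/summary loading described above follows the usual aggregate-load pattern; expressed as plain SQL with hypothetical table names, it amounts to something like:

```sql
-- Hypothetical monthly summary: roll detail facts up by customer and month.
INSERT INTO fact_sales_monthly (customer_key, month_key, total_amount, txn_count)
SELECT customer_key,
       TO_CHAR(sale_date, 'YYYYMM') AS month_key,
       SUM(amount)                  AS total_amount,
       COUNT(*)                     AS txn_count
  FROM fact_sales_detail
 GROUP BY customer_key, TO_CHAR(sale_date, 'YYYYMM');
```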
Environment: Informatica Power Center 7.1, SQL*Loader, Erwin 3.5, Oracle 9i, Flat files, MS SQL Server, Windows 2000, Autosys, Test Director, ClearCase, Sun Solaris, TOAD, Putty.
Confidential, PA Aug 2006 - Feb 2007
The objective is to achieve a single point of reference for customer data held in various databases. Distributed data residing in heterogeneous data sources is consolidated onto the target enterprise data warehouse. The CHS EDW has been developed for managing and processing claims, Claims Management and Reconciliation (CMR). Patient information and history of disease and medication are collected and stored in the CMR system.
Responsibilities
- Designed the procedures for getting the data from all systems into the Data Warehousing system. The data was standardized to store the various Business Units in tables.
- Extensively Worked on Star Schema, Snowflake Schema, Data Modeling, Logical and Physical Model, Data Elements, Issue/Question Resolution Logs, and Source to Target Mappings, Interface Matrix and Design elements.
- Created Informatica mappings with T-SQL procedures to build business rules to load data.
- Wrote stored procedures and triggers in PL/SQL. Extensively worked on Performance Tuning.
- Cleansed the source data, standardized customer addresses, extracted and transformed data with business rules, and built re-usable Mapplets using Informatica Designer (see the cleansing sketch after this list).
- Created reusable Transformations and Mapplets to use in multiple Mappings wherever required.
- Extensively used ETL to load data from flat files (Excel/Access), Sybase, and XML into the Oracle database.
- Extensively used various types of transformations (Expression, Aggregator, Joiner, Filter, Update Strategy, Lookup (Connected and Unconnected), Stored Procedure (connected and unconnected), Rank, Normalizer, Router, Sequence Generator, Sorter, Source Qualifier, Mapplets, and Lookup Caches (Static and Dynamic)) to load the data.
- Extensively worked with electronic health record (EHR) systems and HL7 messaging formats.
- Wrote Unix shell scripts for regular maintenance and automation.
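The address cleansing and standardization described above typically ends with keeping a single clean record per customer; a hedged SQL sketch of that de-duplication step, with hypothetical staging and target tables:

```sql
-- Keep only the most recently updated address row per customer.
INSERT INTO cust_address_clean (customer_id, address_line, city, state, zip)
SELECT s.customer_id, s.address_line, s.city, s.state, s.zip
  FROM stg_cust_address s
 WHERE s.updated_date = (SELECT MAX(x.updated_date)
                           FROM stg_cust_address x
                          WHERE x.customer_id = s.customer_id);
```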
Environment: Informatica Power Center 7.1/6, Oracle 8, PL/SQL, Business Objects 4.0, Crystal Reports 7.0, Sybase 11.5/11.9.2, DB2, JCL, ERWIN 3.5.2, Autosys, XML, Sun Solaris 2.x, WinNT 4.0
Confidential, NJ Feb 2006 – Jul 2006
Responsibilities:
- Extensively involved in analysis, design, and modeling. Worked on Snowflake Schema, Data Modeling, Data Elements, Issue/Question Resolution Logs, Source to Target Mappings, Interface Matrix, and Design elements.
- Designed and developed logical and physical data models that utilize concepts such as Star Schema, Snowflake Schema, and Slowly Changing Dimensions (see the SCD sketch after this list).
- Involved in the development of Informatica mappings and tuned them for better performance.
- Involved in designing the procedures for getting the data from all systems into the Data Warehousing system. The data was standardized to store the various Business Units.
- Wrote stored procedures and triggers in PL/SQL. Involved in Consolidation, ETL, and customization of data. Optimized Query Performance, Session Performance and Reliability.
- Designed database schemas using the Erwin design tool and generated scripts from it.
- Extensively used Transformations for heterogeneous data joins, complex aggregations and external procedure calls.
- Provide guidance and work with the data architect in reviewing and validating the data warehouse data model. Developed Reusable Mapplets and Transformations for calculated business units.
- Involved in writing SQL Stored procedures to access data from Oracle 8.1.7, MS SQL Server 2000. Supported applications (24x7).
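The Slowly Changing Dimension (Type 2) handling mentioned above is sketched below in plain SQL with hypothetical dimension and staging tables: a changed tracked attribute expires the current row and a new versioned row is inserted.

```sql
-- Expire the current dimension row when a tracked attribute (here, address) changes.
UPDATE dim_customer d
   SET d.current_flag = 'N',
       d.effective_end_date = TRUNC(SYSDATE)
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address_line <> d.address_line);

-- Insert a new current row for any staging customer without a current dimension row
-- (covers both brand-new customers and the rows expired above).
INSERT INTO dim_customer
  (customer_key, customer_id, address_line,
   effective_start_date, effective_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address_line,
       TRUNC(SYSDATE), TO_DATE('9999-12-31', 'YYYY-MM-DD'), 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```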
Environment: Informatica Power Center 5.1/6.1 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets, Transformations), Oracle 8.1.7, Business Objects 5.0, Web-Intelligence 2.5, Business Objects Designer 5.0, BO Developer Suite, PL/SQL, SQL*Plus, SQL*Loader, SQL Navigator, MS SQL Server 2000, Erwin 3.5.2, Windows NT 4.0
Confidential, IL Jan 2005 – Feb 2006
The business loan information system handles loan amounts for various purposes such as personal loans, vehicle loans, housing loans, and consumer durable loans. The company requires different levels of analysis by loan amount, loan type, customer, type of payment schedule, and interest calculation.
Responsibilities:
- Involved in Extraction, transformation and loading of the data from different Data Sources like Flat files, Oracle to the Data Warehouse.
- Developed mappings using Informatica Power Center Designer to load data from various source systems to target databases. Responsible to design, develop and test the mappings.
- Created sessions and batches to run the mappings and set the session parameters to improve the load performance. Developed Mapplet, reusable transformations.
- Responsible for monitoring all running, scheduled, completed, and failed sessions. Debugged the mappings of failed sessions.
- Involved in identifying business requirements and delivering the solutions.
- Used various transformations like Source Qualifier, Aggregators, Connected and Unconnected Lookups, Filters, Routers, Sequence generator, Update Strategy, Expression.
Environment: Informatica Power Center 6.1, Business Objects, Oracle 8i, Windows 2000.
Confidential, India Mar 2002 – Oct 2004
Responsibilities:
- Created database tables and indexes in the Oracle RDBMS, implementing various business rules using appropriate constraints and database triggers (see the sketch after this list). Reverse engineered and reconciled the database to implement changes using Designer 2000.
- Involved in the development of front-end screens for entering problem information using Forms 4.5. Designed various reports using Reports 2.5.
- Developed packages, database triggers, procedures, and functions to monitor the business functionality.
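A small hypothetical illustration of the constraint- and trigger-based business rules described above (object names are illustrative only):

```sql
-- Declarative business rule: problem status must be one of the known values.
ALTER TABLE problem_log
  ADD CONSTRAINT ck_problem_status
  CHECK (status IN ('OPEN', 'ASSIGNED', 'RESOLVED', 'CLOSED'));

-- Trigger-based rule: stamp who last changed the row and when.
CREATE OR REPLACE TRIGGER trg_problem_log_audit
BEFORE INSERT OR UPDATE ON problem_log
FOR EACH ROW
BEGIN
  :NEW.last_updated_by   := USER;
  :NEW.last_updated_date := SYSDATE;
END;
/
```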
Environment: Oracle 7.X, PL/SQL, Developer 2000 (Forms 4.5, Reports 2.5)