Senior Data Warehouse Developer Resume
Charlotte, NC
SUMMARY
- Over 7 years of experience in Information Technology, including more than 6 years working with large Data Warehouses.
- Strong ETL experience using Informatica PowerCenter 8.x/7.x/6.x/5.x (Repository Manager, Designer, Workflow Manager, Workflow Monitor), including Mappings, Mapplets, and Transformations in Oracle and Teradata environments.
- Expertise in Data Warehouse applications; directly responsible for the Extraction, Transformation, and Loading of data from multiple sources into the Data Warehouse.
- Experience in implementation of various dimensional models for data extraction and manipulation.
- Expertise in tuning existing mappings, identifying and resolving performance bottlenecks at various stages.
- Developed Cubes and Reports using Cognos PowerPlay.
- Developed reports using Microstrategy.
- Contributed to the impact analysis of design changes on Fact and Dimension tables.
- Thoroughly proficient in SQL and PL/SQL development in Oracle 10g/9i/8i.
- Experienced in designing and writing on-demand UNIX shell scripts for process automation and job scheduling.
- Experience with Cognos ReportNet.
- Highly motivated, industrious, and self-disciplined team player with excellent communication and interpersonal skills.
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 8.x/7.x/6.x/5.x
RDBMS: Oracle 10g/9i/8i, SQL Server 2000/2005/2008, Teradata, MS Access
Database Tools: TOAD
Methodologies: E-R Modeling, Star Schema, and Snowflake Schema
Scripting languages: UNIX Shell Scripting (Korn shell)
Languages: C, C++, SQL, PL/SQL, Java
Internet Technologies: HTML and XML
Operating Systems: MS-DOS, Windows 9x/NT/2K, UNIX (Solaris)
Reporting Tools: MicroStrategy, Cognos PowerPlay, MSRS
Scheduling Tools: Maestro, Cybermation ESP
CERTIFICATION
- Brainbench Certified Informatica Powermart/PowerCenter 6.2.1
ACADEMIC QUALIFICATIONS
- M.S. in Computer Science (USA)
PROFESSIONAL EXPERIENCE
Confidential, Charlotte NC Nov '10-Present
Role: Senior Data Warehouse Developer
Environment: Informatica PowerCenter 8.6, Shell Scripts, UNIX, Oracle 10g/9i, DB2, Rapid SQL, Visio 2003.
Description:
Facilitated the integration of the Wealth Management Data Mart for the Confidential and Wachovia Bank systems. The project involved integrating Mortgage, Consumer, and Commercial loans from both the Wachovia and Confidential systems into the newly built Private Banking (PB) Data Mart. The goal was to build a centralized data mart to meet the reporting needs of the Credit and Counter Party Risk, Financial, Accounting, Compliance, and Compensation teams. Source feeds included Teradata, Oracle, VSAM/COBOL files, and flat files; target tables resided in DB2.
Responsibilities:
- Involved in the JAD Sessions with the business partners to scope and size the development effort of the project.
- Worked closely with Business Analysts and our business partners to gather the requirements.
- Analyzed the requirements provided by the BA and designed the tables.
- Created Logical and Physical Data Models for the Mortgage data by implementing snowflake-schema dimensional modeling (see the sketch after this list).
- Acted as Informatica Subject Matter Expert (SME) to the development team and the BA.
- Involved in the development and delivery of more than 150 mappings for the complete Data Mart.
- Worked with departmental database architect and application support teams to ensure implementation follows internal design standards.
- Identified development issues and apprised management; designed workarounds and implemented design changes to address them.
- Performed Unit Testing. Analyzed and responded to QA and UAT issue reports. Implemented fixes.
- Provided Production support by running scheduled jobs, diagnosing session problems, fixing failed mappings etc.
- Prepared Change Request documentation and implementation plans.
- Worked on multiple projects simultaneously, prioritizing work based on business requirement deadlines.
- Worked in a team with members spread across different geographic locations and time zones.
- Participated in daily scrums to report the status, progress, and issues of ongoing work to team members in a timely manner.
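A minimal sketch of the snowflaked dimension structure referenced above; the table and column names are hypothetical illustrations, not objects from the actual PB Data Mart.

-- Hypothetical snowflake example: the loan dimension references a separate,
-- normalized product sub-dimension instead of embedding product attributes
-- directly (as a star schema would).
CREATE TABLE dim_loan_product (
    product_key     NUMBER        PRIMARY KEY,
    product_code    VARCHAR2(20)  NOT NULL,
    product_name    VARCHAR2(100),
    product_type    VARCHAR2(30)          -- e.g. Mortgage, Consumer, Commercial
);

CREATE TABLE dim_loan (
    loan_key        NUMBER        PRIMARY KEY,
    loan_number     VARCHAR2(30)  NOT NULL,
    product_key     NUMBER        NOT NULL
                    REFERENCES dim_loan_product (product_key),
    origination_dt  DATE,
    source_system   VARCHAR2(10)          -- identifies the originating feed
);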
Confidential, Charlotte, NC Dec '07 - Dec '09
Role: Tech Lead
Environment: Informatica PowerCenter 8.1, Shell Scripts, UNIX, Oracle 10g, TOAD, Cybermation, Visio 2003, Changeman, Lotus Notes, Mercury Quality Center.
Description:
Complete Midwest Financial Migration (CMFM) Scalable Reporting 2008 focused on the ability to integrate Duke's and Cinergy's financial systems. The capabilities of the FIHUB were expanded by increasing the scope and size of the FIHUB Core Data Model and replacing the Financial Interfacing Architecture (FIA) with the Financial Interfacing System (FIHUB-FIS).
The Financial Information Hub (FIHUB) is the primary system of reference for the majority of users and systems requiring Duke Energy finance information. Financial data is imported from external systems and is controlled by the data management environment contained within, thus assuring data integrity. Core Data is a "full breadth" resource with a business-oriented data organization used specifically for reporting and downstream application feeds. Involved in the development of the following interfaces: EAM (Enterprise Asset Management), FIS (Feeder Interface Services), and Taxstream.
Responsibilities:
- Interacted with Business Users to gather the Business Requirements.
- Designed, developed and deployed new Interfaces along with modifying existing Interfaces to support additional business requirements.
- Analyzed source data coming from different systems and worked with business users and developers to develop the data model.
- Created DDL for tables, views, and other database structures.
- Developed ETL design Flow charts using Microsoft Visio.
- Wrote High-Level Design and Low-Level Design documents.
- Wrote Informatica ETL design documents, established ETL coding standards, and performed Informatica mapping reviews.
- Mentored 6-8 developers through the SDLC, from design, build, and unit testing through integration.
- Interacted with the offshore team on the development of ETL mappings and on code and unit test plan reviews on a regular basis.
- Followed a rigorous SDLC.
- Prepared SDLC documentation for any Production moves or fixes and adhered to Change Control and management procedures.
- Migrated Mappings, Workflows from Development to Integration, Integration to System Test.
- Worked in all SDLC environments: Development, Testing, and Production.
- Designed mappings to pull Daily/Weekly/Monthly/Yearly data and write it to the Data Warehouse.
- Developed many complex SQL queries to improve the performance of the process.
- Used Informatica Designer to develop mappings, using transformations including Aggregator, Update Strategy, Lookup, Expression, Filter, Sequence Generator, Router, and Joiner.
- Created reusable transformations and mapplets and used them in mappings to reduce redundancy in coding.
- Ran jobs on demand and scheduled others to load data into tables daily using the ESP scheduler.
- Performed Data cleansing prior to Transformation and Loading for heterogeneous sources.
- Followed the complete SDLC cycle of coding and testing, including test case scenarios; tested the data in QA and fixed any errors.
- Implemented performance tuning methods such as filtering data as early as the Source Qualifier and partitioning (see the sketch after this list).
- Verified ETL mappings and performed QA checks to ensure coding standards were followed, labeled the code for migration, and took part in ETL migrations from one repository to another.
- Participated in mock and production code deployments.
- Performed batch schedule testing to verify the code satisfied all test scenarios and updated Quality Center with the findings.
- Creation of Deployment Plans and Service Implementation requests (SIR) for the various projects.
- Logged defects in Quality Center throughout the project to keep track of all issues.
- Provided knowledge transfer sessions to the Production Support team, using the available documentation and covering all aspects of the ETLs developed, so the team could support them in the event of production issues.
- Gained hands-on experience with Microsoft Reporting Services.
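A minimal sketch of the source-side filtering mentioned above, written as an Oracle SQL override of the kind placed in a Source Qualifier; the table and column names are hypothetical, not actual FIHUB objects.

-- Hypothetical Source Qualifier SQL override: restrict rows at the source so
-- only the current load window flows through the mapping, rather than
-- filtering downstream in a Filter transformation.
SELECT gl.journal_id,
       gl.account_code,
       gl.posting_date,
       gl.amount
  FROM gl_journal_lines gl
 WHERE gl.posting_date >= TRUNC(SYSDATE) - 1      -- daily incremental window
   AND gl.status       = 'POSTED';                -- exclude unposted entries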
Confidential, Charlotte, NC Jun '06 - Sept '07
Role: Senior Technical developer
Environment: Informatica PowerCenter 8.1/7.1, Shell Scripts, UNIX, Oracle 8i/9i, TOAD, Maestro, Visio 2003, MicroStrategy.
Description:
The Accumulator is a centralized point of data collection and distribution. It is intended to collect, cleanse, and integrate operational data from various source systems in preparation to distribute that data to domain-specific data marts. The objective of this project is to make the Accumulator the central place for products and client information and the one and only source that feeds this information to various systems through an interface.
Responsibilities:
- Designed, developed and deployed new data marts along with modifying existing marts to support additional business requirements.
- Involved in meetings to gather information and requirements for the clients.
- Created standards documents covering the conventions to be used by Informatica developers.
- Developed product design plans and applications that meet business requirements.
- Attended JAD (Joint Application Design) sessions to meet with the business clients directly, understand their requirements, and analyze what data they expect and when they expect it.
- Developed database designs and implementations along with the Database Architecture group.
- Held High-Level Design (HLD) and Low-Level Design (LLD) discussions with end-user clients.
- Worked closely with executive sponsors and user decision makers to develop the transformation logic to be used in Informatica.
- Identified and tracked Change Data Capture in the dimensions.
- Created views on the tables so that users have no direct access to the underlying tables, for security (see the sketch after this list).
- Worked on dimensional modeling to design and develop snowflake schemas using ERwin 4.0, identifying Fact and Dimension tables.
- Created Mapping Documents field-by-field from Source to Target for all the tables involved.
- Extensively used Transformations like Router, Aggregator, Joiner, Expression, Lookup, Update Strategy and Source Qualifier.
- Wrote UNIX shell scripts for the Informatica ETL tool to run the Sessions.
- Created naming and system standards for lookup, transformation, and target tables.
- Documented user requirements, translated requirements into system solutions, and developed the implementation plan and schedule.
- Involved in code review and performance tuning of shell scripts and Informatica Mappings designed by various developers in the team - on-shore and off-shore.
- Responsible for troubleshooting, identifying, and resolving data problems; worked with analysts to determine data requirements, identify data sources, and provide estimates for task duration.
- Designed the diagrams for scheduling processes (Nightly, Near Real-Time, and Weekly) using Visio 2003.
- Created job definitions and job streams in Maestro; scheduled jobs to run and monitored them for performance and any related issues.
- Involved in unit testing, systems testing, integrated testing and user acceptance testing.
- Maintained synchronization between development progress and project documentation through the end of the project.
- Worked with the business intelligence tool MicroStrategy to develop appropriate applications that query the data warehouse.
- Migrated all projects from Informatica 7.1.3 to Informatica 8.1 (Mappings, Sessions, Workflows, Scripts, and Maestro jobs).
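A minimal sketch of the view-based access control mentioned above; the object names, the reporting role, and the omitted column are hypothetical, not taken from the Accumulator.

-- Hypothetical example: expose a view instead of the base table so report
-- users never query the table directly and sensitive columns stay hidden.
CREATE OR REPLACE VIEW client_product_v AS
SELECT client_id,
       product_code,
       effective_date,
       status
  FROM client_product;          -- base table; sensitive columns left out

-- Grant read access on the view only; no grant is issued on the base table.
GRANT SELECT ON client_product_v TO reporting_role;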
Confidential, Des Moines, IA May '05 - May '06
Role: Programmer/Analyst
Environment: Informatica PowerCenter 7.1/6.2, Shell Scripts, UNIX, Oracle 8i/9i, Windows 2000, Cognos.
Description:
Insurance division data warehouse for collecting client, provider, and third-party data for Customer Campaign Management. Various data marts for the Sales and Marketing teams are derived from this insurance data warehouse based on region and division.
Responsibilities:
- Translated business-reporting requirements into data warehouse architectural design.
- Designed, developed and deployed new data marts along with modifying existing marts to support additional business requirements.
- Worked closely with executive sponsors and user decision makers to develop the transformation logic to be used in Informatica.
- Identified and tracked the slowly changing dimensions, heterogeneous sources and determined the hierarchies in dimensions.
- Stored reformatted data from relational, flat file, and XML sources using Informatica (ETL).
- Extensively used transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, and Sequence Generator.
- Wrote UNIX shell scripts for Informatica ETL tool to run the Sessions.
- Worked on dimensional modeling to design and develop star schemas using ERwin 4.0, identifying Fact and Dimension tables.
- Created naming and system standards for lookup, transformation, and target tables.
- Analyzed data relationships graphically and changed displays using Cognos PowerPlay functionality by means of drill down, slice and dice, rank, sort, forecast, and nest information to gain greater insight into trends, causes, and effects.
- Documented user requirements, translated requirements into system solutions, and developed the implementation plan and schedule.
- Created versions of customer-related data for different divisions and departments using Cognos ReportNet.
- Involved in creation of Informatica users and repository backup using Server Manager.
- Finished the tasks within the allocated time for every release, consistently on time and on target.
- Developed schedules to automate the update processes and Informatica Sessions/Batches.
- Maintained stored definitions, transformation rules and targets definitions using Informatica repository manager.
- Generated reports for end clients using various query tools.
Confidential, Richmond, VA March '04 - April '05
Role: Data Warehouse Developer
Environment: Informatica PowerCenter 7.1/6.2/5.1, Shell Scripts, UNIX (IBM-AIX, Solaris), SQL Server 2000, Windows 2K/XP, Cognos.
Description:
Campaign management and customer loyalty points programs data warehouse project.
Responsibilities:
- Translated business-reporting requirements into data warehouse architectural design.
- Designed and maintained logical and physical enterprise data warehouse schemas
- Designed, developed and deployed new data marts along with modifying existing marts to support additional business requirements.
- Worked closely with executive sponsors and user decision makers to develop the transformation logic to be used in Informatica.
- Identified and tracked the slowly changing dimensions, heterogeneous sources and determined the hierarchies in dimensions.
- Extensively used transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, and Sequence Generator.
- Involved in writing UNIX shell scripts and Perl scripts for the Informatica ETL tool to run the Sessions.
- Provided technical and analytical expertise in responding to complex, specialized report requirements involving higher-level data analyses using Cognos PowerPlay.
- Created naming and system standards for lookup, transformation, and target tables.
- Installed Informatica software on the client systems and configured Informatica server.
- Involved in creation of Informatica users and repository backup using Server Manager.
- Finished the tasks within the allocated time for every release.
- Developed schedules to automate the update processes and Informatica Sessions/Batches.
- Maintained stored definitions, transformation rules and targets definitions using Informatica repository manager.
- Generated reports for end clients using query tools.
Confidential, Hyderabad, India May '01 - July '02
Role: Programmer
Environment: Oracle 7, Oracle PL/SQL, Oracle Forms & Reports 2.5, Windows 2000.
Description:
The system maintains order processing and dispatching of goods. It consists of order processing and invoicing, document generation, stock allocation, excise accounting, and sales analysis. It also maintains order status and shipping information. The work involved development of PL/SQL procedures and functions, development of forms per specifications, development of SQL reports, and testing of the forms and reports.
Responsibilities:
- Development of stored procedures and functions as per technical specifications (see the sketch after this list).
- Development and modification of front end forms.
- Development and modification of reports.
- Unit Testing and System Testing.
- Loading of flat files into Oracle database tables using SQL*Loader and analyzing the output.
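A minimal sketch of the kind of stored procedure referenced above; the order table, status column, and procedure name are hypothetical, not taken from the actual order-processing system.

-- Hypothetical PL/SQL procedure: mark an order as dispatched and stamp the
-- dispatch date, raising an error if the order does not exist.
CREATE OR REPLACE PROCEDURE mark_order_dispatched (
    p_order_id IN NUMBER
) IS
BEGIN
    UPDATE orders
       SET status        = 'DISPATCHED',
           dispatch_date = SYSDATE
     WHERE order_id = p_order_id;

    IF SQL%ROWCOUNT = 0 THEN
        RAISE_APPLICATION_ERROR(-20001, 'Order not found: ' || p_order_id);
    END IF;

    COMMIT;
END mark_order_dispatched;
/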
Confidential, Hyderabad, India March '00 - April '01
Role: Programmer Analyst
Environment: Oracle 7.3, Windows NT, PL/SQL, Developer 2000 (Forms 4.5 and Reports 2.5).
Description:
This project involved the development of a Human Resource Management System. The system was designed and developed to handle payroll, time tracking, etc.
Responsibilities:
- As a Programmer Analyst in a team of 4, was involved in the design and development of this application.
- Interacted with functional users to develop application specifications.
- As part of a 3-member team, was involved in programming in Developer 2000 to develop the front end of the application.
- Was also responsible for creating Reports using Oracle Report Writer 3.0.
- Performed all software activities including design, development, testing and bug fixing.
- Study and analysis of the System.
- High-Level Design (HLD) through discussions with end-user clients.
- Generation of ERD.
- Data extraction.
- Created and maintained summary tables for faster performance.
- Created various snapshots and database links (see the sketch below).
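A minimal sketch of the snapshot and database link usage mentioned above, using Oracle 7-era syntax; the remote database alias, account, table, and refresh interval are hypothetical.

-- Hypothetical example: a database link to a remote payroll database and a
-- snapshot (the Oracle 7 term for a materialized view) refreshed daily to
-- keep a local, query-friendly copy of employee data.
CREATE DATABASE LINK payroll_remote
    CONNECT TO hr_reader IDENTIFIED BY hr_reader_pwd
    USING 'PAYROLL_DB';          -- SQL*Net alias for the remote database

CREATE SNAPSHOT emp_snapshot
    REFRESH COMPLETE START WITH SYSDATE NEXT SYSDATE + 1   -- refresh daily
AS
SELECT employee_id, department_id, salary
  FROM employees@payroll_remote;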