
BI Developer Resume

Seattle, WA

PROFESSIONAL SUMMARY:

  • 6+ years of experience in the architecture, analysis, design, development, testing, integration, migration, reporting, implementation and administration of data warehouse solutions. Experience with various databases, reporting tools and UNIX system software applications. Established a consistent record of delivering successful BI solutions and tangible benefits through the innovative use of technology.
  • Business Intelligence – Well versed in DW concepts, dimensional modeling, and Star and Snowflake schemas for designing data warehouses/data marts.
  • ETL Development – Expertise in the architecture, design, development, testing, implementation and maintenance of data warehouse solutions using the ETL tool Informatica 6.1/7.1/8.6.1/9.0.1. Expert in developing complex mappings and debugging mappings/sessions to find and fix errors. Implemented best practices including multi-file processing, duplicate management, use of parameter files, partitioning, reusable components and QA processes. Expertise in Informatica performance tuning. Exposure to administering Informatica 7.1/8.1 and maintaining users/folders; installed and configured 7.1 and 8.1 in a Windows environment.
  • Database Development – Adept at developing and supporting analysis solutions, data transformations, and reports. Strong relational database background in Oracle 9i/10g, PL/SQL and Netezza 6.0. Wrote complex SQL queries and used performance tuning and hints to speed up SQL queries.
  • UNIX Scripting – Expertise in writing complex UNIX scripts featuring restartability, calls to SQL scripts/stored procedures and Informatica workflows (pmcmd), control files and data quality checks; implemented best practices in combination with Informatica. Implemented SFTP between hosts (UNIX AIX, HP, Mainframe, Linux).
  • Extensively implemented error-handling concepts.
  • Reporting – Proficient in creating Cognos reports using Report Studio, Query Studio and Analysis Studio.
  • QA – Implemented QA processes and data quality checks in Informatica and UNIX shell scripts to ensure the logic adhered to the requirements. Prepared unit and functional test plans and performed unit, functional, performance and system testing on various projects. Helped users with UAT.
  • Systems Integration and Maintenance – Managed enterprise applications and various teams to deliver scalable solutions across platforms. Successfully integrated Windows, UNIX and Mainframe systems. Provided 24x7 support for key production environments.
  • Troubleshooting Skills – Skilled in troubleshooting and debugging with the Informatica Debugger, and in diagnosing application and operating system issues.
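For illustration, the restartability pattern referenced above (checkpoint files around each step, so a failed run resumes where it stopped) can be sketched as below. All service, folder and workflow names are invented, and the demo step uses `echo` so the sketch runs anywhere; a real step would wrap a pmcmd call as shown in the comment.

```shell
#!/bin/sh
# Sketch of a restartable ETL wrapper (Korn-shell compatible).
# run_step CHECKPOINT CMD... : run CMD only if CHECKPOINT is absent, and
# create CHECKPOINT on success, so a rerun skips already-completed steps.
run_step() {
    ckpt="$1"; shift
    if [ -f "$ckpt" ]; then
        echo "skip: $ckpt already done"
        return 0
    fi
    "$@" || return 1
    touch "$ckpt"
}

ckdir=$(mktemp -d)
run_step "$ckdir/extract.done" echo "extract step ran"
run_step "$ckdir/extract.done" echo "not reached: step already checkpointed"

# A real step would wrap the usual tools, e.g. (illustrative names only):
# run_step "$ckdir/wf_load.done" pmcmd startworkflow -sv infa_svc -d dom \
#     -u "$INFA_USER" -p "$INFA_PASS" -f CM_DW -wait wf_daily_load
```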

Education:

  • Master of Computer Applications
  • Bachelor of Science in Computers

Skills/Tools:

Data Warehouse Technologies:

Informatica (PowerCenter 9.0.1/8.1.1/7.1/6.1, PowerMart 5.1.2)

Databases:

Oracle 10g/9i/8i, Netezza 6.0/4.6.5, SQL Server 2000, MS Access 7.0/97/2000

Other Software:

Appworx 5.0, iQMS 2.0, IPMS 2.0, ERWIN, Data Clean, VSS6.0

Programming Languages:

PL/SQL, Java, C, C++, UNIX Shell Scripting (Ksh/Csh) and VB

Web Technologies:

HTML, DHTML, JavaScript, VB Script, JSP, ASP

Reporting tools:

Cognos 8

Operating Systems:

Sun Solaris 8/7/2.7/2.6, HP-UX, IBM AIX 4.2/4.3, UNIX, Win NT/2000

PROFESSIONAL EXPERIENCE:

Confidential, Seattle, WA
September 2008 - Present
Project : CM Data warehouse
Position : BI Developer
Confidential, based in Seattle, WA, is a leader in online social networking. The company operates Classmates.com (www.classmates.com), connecting millions of members throughout the U.S. and Canada with friends and acquaintances from school, work and the military. Its Classmates International subsidiary also operates leading community-based networking sites in Sweden, through Klassträffen Sweden AB (www.stayfriends.se), and in Germany, through StayFriends GmbH (www.stayfriends.de). Classmates Online is a wholly owned subsidiary of United Online, Inc.

Development/Maintenance Responsibilities:

  • Designed ETL Informatica jobs to run daily, monthly, quarterly and yearly.
  • Performed impact analysis for enhancement requests and estimated level of effort (LOE) from the ETL perspective.
  • Reduced ETL load time by 3,000 hours annually through performance tuning of various legacy ETL processes (Oracle database and SQL query tuning, Informatica tuning, Netezza tuning), helping operations meet the daily SLA well ahead of schedule.
  • Delivered analytical capability to the business for 10 different types of user-generated content on the social networking site by developing and integrating ETL processes using Informatica mappings and the Netezza performance server.
  • Extracted the data from Oracle, Netezza, Flat Files into Oracle Data warehouse
  • Suggested and implemented best practices for eliminating duplicates and incrementally loading log file data into the CM data warehouse.
  • Developed various mappings, mapplets, Transformations and responsible for validating and fine-tuning the ETL logic coded into mappings.
  • Simplified development and maintenance of ETL by creating re-usable Mapplets and Transformation objects.
  • Extensively involved in performance tuning by determining various bottlenecks at various points like targets, sources, mappings and sessions
  • Designed data feeds to load Oracle tables from Oracle tables, Netezza or flat files.
  • Prepared Unit Test Plans, Functional Test Plans and performed unit testing, functional testing and regression testing at various projects.
  • Code walk-through of programs for defects, semantics and standards checking
  • Wrote complex Korn shell (ksh) scripts featuring restartability, calls to SQL scripts and Informatica workflows/sessions, control files and data quality checks.
  • Responsible for validating the Informatica mappings against the pre-defined ETL design standards.
  • Responsible for data cleansing of source data using various Informatica functions.
  • Extensively worked with both connected and unconnected Lookups.
  • 24/7 on-call support for Informatica, Unix Shell Scripts and Oracle PL/SQL programs
  • Performed RCA (Root Cause Analysis) when any defects occurred.
  • Worked on pushdown optimization mappings.
  • Worked with Salesforce objects.
  • Used the Salesforce Apex Data Loader to pull data from Salesforce.
  • Coordinated with the offshore team.
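The duplicate-elimination and incremental log load mentioned above can be sketched as a pre-load filter. This is an illustrative example only: the pipe-delimited layout (event_id|member_id|event_ts), the file names and the high-water-mark convention are all assumptions, not the actual CM warehouse design.

```shell
#!/bin/sh
# Sketch: prepare an incremental, de-duplicated delta file from a log feed.
cd "$(mktemp -d)"

# Sample feed (invented layout): event_id|member_id|event_ts
cat > events.log <<'EOF'
1|100|2008-09-01 10:00:00
2|101|2008-09-01 10:05:00
2|101|2008-09-01 10:05:00
3|102|2008-09-01 10:10:00
EOF

HWM=hwm.txt                               # high-water mark of the last load
[ -f "$HWM" ] || echo "1970-01-01 00:00:00" > "$HWM"
last_ts=$(cat "$HWM")

# Keep only rows newer than the high-water mark (incremental load) and drop
# duplicate event_ids, first occurrence wins (duplicate elimination).
awk -F'|' -v ts="$last_ts" '$3 > ts && !seen[$1]++' events.log > delta.dat

# Advance the high-water mark to the newest timestamp in this batch.
new_ts=$(awk -F'|' '$3 > m {m=$3} END {print m}' delta.dat)
[ -n "$new_ts" ] && echo "$new_ts" > "$HWM"
```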

Environment: Informatica PowerCenter 9.0.1, Oracle 11g, MicroStrategy, Netezza 6.0, UNIX, Informatica DQ 9.0.1, Informatica DX

Confidential, Seattle, WA June 2008 to September 2008
Project : Safeco-Insurance Data warehouse
Position : ETL Developer/ Administrator

Development/Maintenance Responsibilities:

  • Prepared the Detail Design Document from the requirements specification document.
  • Defined responsibilities, assigning the work from functional & technical perspectives and reviewing deliverables.
  • Responsible for writing the test cases for Informatica mappings.
  • Responsible for setting up the test data to validate different scenarios of testing.
  • Created ETL Mappings for existing Sources/Feeds to Data warehouse fields.
  • Extensively used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter, Normalizer and Update Strategy.
  • Tuned Informatica session performance for large data files by increasing block size and data cache size.
  • Worked on Teradata Fast Load, Multi Load Scripts.
  • Analyzed the transformations carried out in the sample feeds to ascertain the correct outcome or possible causes of error.
  • Extracted data from different databases such as Oracle 9i, SQL Server and flat files, loading it into staging tables first and then into Teradata.
  • Documented the new process.
  • Actively participated in the client meetings and suggested solutions for critical tasks.
  • Involved in enhancement, maintenance, and documentation support for ETL program using Informatica.
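The Teradata FastLoad work above centered on control scripts for staging loads. As a hedged sketch (the TDPID, credentials, table, columns and file names are all invented), a shell wrapper might generate such a control file before handing it to the utility:

```shell
#!/bin/sh
# Illustrative only: generate a FastLoad control file for a staging load.
cd "$(mktemp -d)"

cat > stg_policy.fld <<'EOF'
LOGON tdprod/etl_user,etl_pass;
DATABASE stg_db;
BEGIN LOADING stg_policy
    ERRORFILES stg_policy_e1, stg_policy_e2;
SET RECORD VARTEXT "|";
DEFINE policy_id   (VARCHAR(18)),
       member_id   (VARCHAR(18)),
       premium_amt (VARCHAR(18))
FILE = policy_extract.dat;
INSERT INTO stg_policy VALUES (:policy_id, :member_id, :premium_amt);
END LOADING;
LOGOFF;
EOF

# The real job would then invoke the utility:  fastload < stg_policy.fld
echo "control file ready: stg_policy.fld"
```

Generating the control file from the shell keeps table and file names in one place and lets the wrapper add restart and error-limit handling around the load.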

Administrator Responsibilities:

  • Administered the repository by creating folders and logins for group members.
  • Extensively involved in performance tuning by identifying various bottlenecks at various points.
  • Involved in Upgrading the Repository to 8.1
  • Created Relational Connections.


Confidential, Stamford, CT June 2006 to December 2007
Project : GRM-RemMaintenance Data warehouse
Position : ETL Developer/ Administrator

Project Description:
The Risk Exposure Monitoring development effort within Global Risk Management is designed to provide GRM employees at all levels with easy access to the exposure data that they need. Data will be collected from each GE Capital business globally. The Risk Exposure Monitoring project includes the creation of an Enterprise Data Warehouse for GRM, to meet specific business needs and feed a number of Data Marts. The data files that populate the Data Warehouse will be supplied by the various GEC businesses. The Informatica maps are used for the implementation of the business rules and for the creation of data used in reports.

Development/Maintenance Responsibilities:

  • Captured business requirements and analyzed the current GRM business processes and systems.
  • Prepared the Detail Design Document from the requirements specification document.
  • Defined responsibilities, assigning the work from functional & technical perspectives and reviewing deliverables.
  • Requirements gathering and Impact Analysis for Change Requests or Enhancements and testing.
  • Responsible for writing the test cases for Informatica mappings.
  • Responsible for setting up the test data to validate different scenarios of testing.
  • Performed database testing using SQL to ensure the correctness of data loading.
  • Identified the Test requirements based on application business requirements and blueprints.
  • Created ETL Mappings for existing Sources/Feeds to Data warehouse fields.
  • Extensively used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter, Normalizer and Update Strategy.
  • Worked with mapping parameters and variables.
  • Tuned Informatica session performance for large data files by increasing block size and data cache size.
  • Created and managed daily, weekly and monthly data operations, workflows and scheduling processes.
  • Automated and scheduled jobs using the Informatica scheduler.
  • Worked on Teradata Fast Load, Multi Load Scripts.
  • Analyzed the transformations carried out in the sample feeds to ascertain the correct outcome or possible causes of error.
  • Extracted data from different databases such as Oracle 9i, SQL Server and flat files, loading it into staging tables first and then into Teradata.
  • Monitored jobs and server space availability.
  • Worked with Scheduler to run the Informatica session on daily basis and to send an email after the completion of loading.
  • Documented the new process and handled knowledge transition to the support groups.
  • Worked on different operating systems, including Sun Solaris 5.8 UNIX and Windows NT 4.0.
  • Actively participated in the client meetings and suggested solutions for critical tasks.
  • Involved in enhancement, maintenance, and documentation support for ETL program using Informatica.
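The schedule-and-notify step above (run the daily session, then send a completion email) follows a simple wrapper pattern. In this sketch the workflow name, pmcmd arguments and distribution list are assumptions, and `echo` stands in for `mailx` so it runs anywhere:

```shell
#!/bin/sh
# notify_run NAME CMD... : run CMD, then report SUCCESS/FAILURE.
# In the real job the echo would be a mailx call to the on-call list.
notify_run() {
    name="$1"; shift
    if "$@"; then
        status="SUCCESS"; rc=0
    else
        status="FAILURE"; rc=1
    fi
    # real job: mailx -s "$name $status" dw-oncall@example.com < load.log
    echo "MAIL: $name finished with $status"
    return $rc
}

# Illustrative call; a real one would wrap pmcmd, e.g.:
# notify_run wf_daily_load pmcmd startworkflow -sv infa_svc -d dom \
#     -u "$INFA_USER" -p "$INFA_PASS" -f GRM -wait wf_daily_load
notify_run demo_step true
```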

Administrator Responsibilities:

  • Administered the repository by creating folders and logins for group members.
  • Extensively involved in performance tuning by identifying various bottlenecks at various points.
  • Created Relational connections.

Environment: Informatica PowerCenter 8.1/7.1/6.1, Informatica PowerConnect/PowerExchange, Teradata, Oracle 10g, MS SQL Server, T-SQL, UNIX, Teradata FastLoad and MultiLoad scripts, Cognos, Business Objects, Sun Solaris 5.8 UNIX, Windows NT 4.0.

Confidential, Stamford, CT June 2005 to May 2006
Project : GEAM Insurance Data Warehouse
Position : ETL Developer

Project Description:
Confidential is a wholly owned subsidiary of GE (General Electric) in the funds management and portfolio management business. The main objective of the GEAM data warehouse is to store all holding, security and transaction information from all businesses under GE Insurance. GEAM takes an active, bottom-up approach to managing a full range of U.S. equity, international equity, fixed income, asset allocation and money market strategies. It also offers a family of mutual funds designed to meet a wide variety of investment needs.

Developer Responsibilities:

  • Created ETL Mappings for existing Sources/Feeds to Data warehouse fields.
  • Extensively used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator and Update Strategy.
  • Designed and executed Test cases and obtained the desired results.
  • Developed Informatica Mappings for Type2 Slowly Changing Dimensions.
  • Worked with Mapping parameters and variables
  • Prepared Detail Technical Design Document from the requirement specification document.
  • Actively participated in the client meetings and suggested solutions for critical tasks.
  • Re-designed ETL mappings to improve data quality.
  • Used most of the common transformations, including Source Qualifier, Aggregator, connected and unconnected Lookups, Update Strategy, Filter and Sequence Generator.
  • Extracted data from SAP R/3 tables.
  • Created mapplets and reusable transformations using Informatica.
  • Establishing & monitoring the processes for Issue Management, Change Management and Quality Management.
  • Extensively worked on change requests (CRs).
  • Created and scheduled tasks, sessions, worklets and workflows using Workflow Manager.
  • Analyzed the transformations carried out in the sample feeds to ascertain the correct outcome or possible causes of error.
  • Performed impact analysis of the existing source systems against users' new requirements.
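The Type 2 slowly changing dimension logic above was built as Informatica mappings; purely as an illustrative sketch (the file layouts, column names and dates are invented), the same expire-and-insert behavior looks like this in awk over pipe-delimited files:

```shell
#!/bin/sh
# Sketch of Type 2 SCD: expire the changed current row, append a new version.
cd "$(mktemp -d)"

# Current dimension: member_id|tier|eff_date|end_date|current_flag
cat > dim.dat <<'EOF'
100|GOLD|2005-01-01|9999-12-31|Y
101|SILVER|2005-01-01|9999-12-31|Y
EOF
# Incoming source snapshot: member_id|tier
cat > src.dat <<'EOF'
100|PLATINUM
101|SILVER
102|GOLD
EOF

today=2006-05-01
awk -F'|' -v OFS='|' -v d="$today" '
    NR==FNR { src[$1]=$2; next }          # pass 1: load incoming rows
    {
        if ($5=="Y" && ($1 in src) && src[$1]!=$2) {
            print $1,$2,$3,d,"N"          # expire the changed current row
            ins[$1]=1                     # queue a new version
        } else print                      # unchanged rows pass through
        seen[$1]=1
    }
    END {
        for (k in src)                    # new versions + brand-new members
            if (ins[k] || !seen[k]) print k,src[k],d,"9999-12-31","Y"
    }' src.dat dim.dat > dim_new.dat
```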

Environment: Informatica PowerCenter 7.1/6.1, PowerConnect, Oracle 9i, TOAD, UNIX, SAP, ALE IDoc, SQL, PL/SQL, Sun Solaris 5.8 UNIX, Windows NT 4.0.

Confidential, April 2004 to June 2005
Project : Home Depot Datawarehouse
Position : ETL Developer

Project Description:
The Home Depot is the world's largest home improvement retailer and the second largest retailer in the United States. We help our customers build their dreams by being more than a store.
The Home Depot currently has over 1,800 warehouse-style home improvement stores in the United States, Canada and Mexico. Home Depot is committed to offering the ultimate home improvement shopping experience. With up to 40,000 different products, trademark customer service and guaranteed low prices, Home Depot stores cater to every type of home improvement customer, from novice to expert.
In addition to over 1,800 Home Depot stores, the company also boasts over 50 EXPO Design Centers, Home Depot Landscape Supply centers, APEX Supply and many other Home Depot-owned companies.

Responsibilities:

  • Analyzed Source System data.
  • Design and development of ETL mappings using Informatica
  • Involved in importing Source/Target Tables from the respective databases.
  • Extensively used ETL to load data from Oracle and Flat files to Data Warehouse.
  • Involved in Mapping of source with Data Marts for ETL program using Informatica.
  • Used Informatica Workflow Manager to create sessions, workflows to run with the logic embedded in the mappings.
  • Documentation of existing mappings as per standards.

Environment: Informatica PowerCenter 6.1, Oracle 8i, UNIX, SQL, PL/SQL, Windows NT.
