
ETL Senior Technical Developer Resume

Charlotte

SUMMARY:

  • Around 8.5 years of industry experience as a data warehousing expert and ETL developer, designing, developing, and implementing ETL processes for business applications.
  • Extensive experience in analysis, design, development & migration of data warehouse applications.
  • Worked on development, maintenance, enhancement, and support projects.
  • Over 5 years of domain experience in banking.
  • Demonstrated understanding of dimensional modeling (star and snowflake schemas) and other data warehousing concepts.
  • Worked on Informatica versions 6.1.3, 7.3.1, 8.1.1, and 9.0.1.
  • Extensive experience in implementing the ETL processes which include developing the data model, designing the change capture and disaster recovery strategies.
  • Have clear understanding of Data Warehousing concepts with emphasis on ETL and Life Cycle Development including requirement analysis, design, development, testing and implementation.
  • Experience in Data Warehouse development working with Extraction/Transformation/Loading using Informatica PowerMart/PowerCenter with Teradata & Oracle.
  • Experience in automating ETL processes for migrating and patching Informatica objects.
  • Worked with various PowerCenter/PowerMart components, including the Informatica server and client tools (Designer, Workflow Manager, Repository Manager, and Workflow Monitor).
  • Strong experience with star and snowflake schema, dimensional data modeling, Fact and dimensional tables, slowly changing dimensions.
  • Developed shell scripts for invoking Informatica workflows.
  • Involved in Informatica Performance tuning of the ETL process and SQL tuning.
  • Extensive experience in designing the architecture for Extract, Transform, Load environment and development of mappings and process using Informatica Power Center.
  • Extensive experience in formulating error handling mechanism.
  • Used command-line utilities such as pmrep and pmcmd.
  • Experience in integration of various data sources like Teradata and Oracle.
  • Worked extensively with Teradata utilities (MultiLoad, TPump, FastLoad, and FastExport) as well as Oracle stored procedures, table partitioning, and SQL queries, loading data into data warehouses/data marts using Informatica.
  • Involved in upgrading the product from PC 7.1.4 to PC 8.1.1, and applied PC 8.1.1 SP1, SP2, SP3, and SP4 on top of PC 8.1.1.
  • Extensively used third-party tools including TOAD, AutoSys, Tivoli, Maximo, CA7, and Quality Center.
  • Extensively used third-party version control tools such as SourceSafe and Subversion.
  • Excellent communication and organizational skills; self-motivated and hardworking; able to work independently or in a team; quick to learn.
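Shell wrappers for invoking Informatica workflows, as mentioned above, typically center on a single pmcmd call. A minimal sketch of such a wrapper follows; the service, domain, folder, and workflow names are illustrative placeholders, not from any actual project, and the command is echoed rather than run since pmcmd exists only on an Informatica server host:

```shell
#!/usr/bin/env bash
# Sketch of a workflow-invocation wrapper around pmcmd.
# All names below (service, domain, folder, workflow) are placeholders.
set -euo pipefail

INFA_SVC="IS_DEV"                 # integration service name (assumed)
INFA_DOMAIN="Domain_DEV"          # Informatica domain (assumed)
INFA_FOLDER="ETL_FOLDER"          # repository folder (assumed)
WORKFLOW="${1:-wf_sample_load}"   # workflow to start; default for illustration

# Credentials are passed via -uv/-pv (environment variable names) so they
# never appear in plain text on the command line. Build the command as a
# string and echo it for a dry run.
CMD="pmcmd startworkflow -sv ${INFA_SVC} -d ${INFA_DOMAIN} -uv INFA_USER -pv INFA_PASS -f ${INFA_FOLDER} -wait ${WORKFLOW}"
echo "${CMD}"
```

The `-wait` flag makes pmcmd block until the workflow finishes, so the wrapper's exit code can be consumed by a scheduler such as AutoSys.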

TECHNICAL SKILLS:
ETL Tools: Informatica Power Center 6.x/7.x/8.x/9.x
Databases: Teradata V2R5 & V2R6, Oracle 8i/9i/10g.
Data Modeling Tools: Erwin 4.1.
Version Control Tools: SourceSafe & Subversion
Languages: SQL, PL/SQL, C, C++, Java
Scripting Languages: Korn Shell, Bash shell scripting
Business Intelligence Tools: Cognos 7.x (trained), Informatica PowerExchange
Scheduling tools: Autosys and Tivoli
Special Tools: TOAD, AutoSys, Tivoli, Maximo, CA7, HP Quality Center, Changeman, NDM & File-Aid
Operating Systems: Windows, Sun Solaris

Education: Bachelor of Technology in Electronics & Communications engineering.

CERTIFICATIONS:

  • Certified in Informatica 8 Mapping Design
  • Certified in Informatica 8 Advanced Mapping Design
  • Teradata Certified Professional V2R5
  • Teradata SQL Specialist V2R5
  • Certified in Banking Competencies
  • Certified in Designing and Implementing with Microsoft SQL Server 2000 EE

PROFESSIONAL EXPERIENCE

Confidential, Charlotte Apr ’07 - Present
ETL Senior Technical Developer/Tech Lead

The client is a financial services company, the largest bank holding company in the United States by assets, and the second largest bank by market capitalization. Bank of America serves clients in more than 150 countries and has a relationship with 99 percent of the U.S. Fortune 500 companies and 83 percent of the Fortune Global 500. The company is a component of the Dow Jones Industrial Average (DJIA) and a member of the Federal Deposit Insurance Corporation (FDIC).
Some of the key aspects of BOA are:

  • The bank's 2008 acquisition of Merrill Lynch made Bank of America the world's largest wealth manager and a major player in the investment banking industry.
  • The company holds 12.2% of all U.S. deposits, as of August 2009, and is one of the Big Four banks of the United States.
  • It has clients in 48 U.S. states and 38 countries around the world.
  • It had $2.25 trillion in total assets at the end of 2009 and 33 million retail customers.
  • It is the No. 1 debit card issuer in the United States, with nearly 16 million cards.
  • It has the largest proprietary ATM network in USA.

Worked on four projects at BOA in total, over a period of 5.5 years.

Project details are below, in chronological order.

Project 1: Supply chain Management Business Intelligence. Confidential, Duration: Nov’11 – Present

The SCMBI (Supply Chain Management Business Intelligence) group of Bank of America was formed to develop a data warehouse platform to store data on all suppliers of the bank. The system comprises Informatica as the ETL tool, Oracle as the database, and MicroStrategy as the reporting tool. SCMBI holds all supplier information, as well as the spend amount associated with each product that a supplier provides to the bank. The data stored in SCMBI is reported on by business partners, who make decisions based on the reported data.

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop data models, requirements document, and ETL specifications.
  • Involved in designing the system and designing the logical database design using ER Studio.
  • Involved in designing the physical database system.
  • Involved in data quality and cleansing of data source.
  • Involved in creation of database schema and capacity planning of schema for the warehouse.
  • Coding as per the mapping/D953 documents.
  • Unit testing of the mappings.
  • Responsible for migration of applications from dev environment to Test and finally to Production.
  • Developed UNIX scripts to prepare data feed files for processing.
  • Scheduled jobs in Autosys scheduler.
  • Developing the mapping documents including System of Record field names, Mapping Rules and BI Target field names.
  • Discussion of the mapping document with Architects and mapping team to appropriately map Source elements to SCMBI target model and create a D953 design document.
  • Continuous support through conference calls with mapping team and development team to provide accurate data in new environment.
  • Transmission of the System of Record files from UNIX to SCMBI environment.
  • Preparation of test cases to validate the output data generated using the mapping documents.
  • Validation of the test cases using Quality Center tool of Bank of America.
  • Defect tracking through Quality Center and SharePoint portal.
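The feed-file preparation step listed above usually amounts to sanity-checking the incoming file and normalizing it before the Informatica session picks it up. A minimal sketch under assumed names (the sample file, paths, and pipe-delimited layout are illustrative, not the project's actual feeds):

```shell
#!/usr/bin/env bash
# Illustrative feed-file preparation; layout and paths are assumptions.
set -euo pipefail

prepare_feed() {
    local infile="$1" outfile="$2"
    # Refuse empty feeds rather than silently loading nothing.
    [ -s "$infile" ] || { echo "empty feed: $infile" >&2; return 1; }
    # Strip Windows carriage returns so downstream parsing sees clean lines.
    tr -d '\r' < "$infile" > "$outfile"
}

# Stage a small sample feed to show the flow end to end.
printf 'SUP001|ACME\r\nSUP002|GLOBEX\r\n' > /tmp/supplier.dat
prepare_feed /tmp/supplier.dat /tmp/supplier.ready
```

The staged `.ready` file would then be referenced as the source file in the Informatica session or handed to a load utility.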

Environment: Informatica 9.0.1, Oracle, UNIX, Autosys Scheduler, Sub Version Control Tool.

Project 2: Confidential, (Information and Analytics Foundation). Duration: Jan’11 – Nov’11

The Information and Analytics Foundation project aims to achieve an industry best-in-class enterprise information and analytics environment. This program consolidates two of the largest data warehouses in Bank of America, “The W” and Bacardi, onto a new Teradata hardware platform in the bank’s data centers, with a less complex and fully integrated data reference model, called the Banking Warehousing model, to enable cross-customer and product-domain analytics.
Mortgage, Customer, Deposit, Channels, and Ecommerce are the various lines of business in the IAF project. We analyze the current W data and code to understand the existing mapping rules. These mapping rules are then applied to System of Record data in the new environment to load the integrated Banking Data Warehousing (BDW) model, reducing complexity and operating costs.

Responsibilities:

  • Perform analysis of existing code in mainframes or in ETL tools like Informatica to understand the mapping logic.
  • Developing the mapping documents including System of Record field names, Mapping Rules and W Target field names.
  • Perform analysis of the existing W data to understand the mapping rules in the current environment.
  • Discussion of the W mapping document with Architects and mapping team to appropriately map ‘The W’ elements to the Banking Data Warehousing (BDW) Model.
  • Continuous support through conference calls with mapping team and development team to provide accurate data in new environment.
  • Reviewing the BDW mapping document to produce the same data in new environment as in current W.
  • Transmission of the System of Record files from mainframes / UNIX to new environment.
  • Coding in new environment as per the mapping documents.
  • Preparation of test cases to validate the output data generated using the mapping documents.
  • Validation of the test cases using Quality Center tool of Bank of America.
  • Defect tracking through Quality Center and SharePoint portal.
  • Trained on the DataStage tool.

Environment: Informatica 8.6, Teradata V13, DataStage and QualityStage Designer, UNIX, Autosys Scheduler.
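Loading System of Record files into Teradata with MultiLoad (listed among the utilities in the summary) is typically driven by a small control script. The sketch below generates one; the logon string, table, and field layout are placeholders for illustration, not taken from the actual project:

```shell
#!/usr/bin/env bash
# Generate a minimal Teradata MultiLoad control script. The logon, table,
# and layout names are illustrative placeholders.
set -euo pipefail

cat > /tmp/load_supplier.mload <<'EOF'
.LOGTABLE  etl_wrk.supplier_log;
.LOGON     tdprod/etl_user,etl_pass;      /* placeholder logon */
.BEGIN IMPORT MLOAD TABLES stg.supplier;
.LAYOUT    supplier_layout;
.FIELD     supplier_id   * VARCHAR(10);
.FIELD     supplier_name * VARCHAR(50);
.DML LABEL ins_supplier;
INSERT INTO stg.supplier (supplier_id, supplier_name)
VALUES (:supplier_id, :supplier_name);
.IMPORT INFILE /tmp/supplier.ready
        FORMAT VARTEXT '|'
        LAYOUT supplier_layout
        APPLY  ins_supplier;
.END MLOAD;
.LOGOFF;
EOF

# On a Teradata client host this would be executed as:
#   mload < /tmp/load_supplier.mload
echo "wrote $(wc -l < /tmp/load_supplier.mload) control lines"
```

In production the script would be parameterized per feed and the logon read from a protected credentials file rather than embedded inline.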

Project 3: Confidential, Duration: Aug’09 – Jan’11

Data Warehouse Management System consists of the following:

  • ECOMM
  • Channels
  • Deposits
  • House Holding
  • Loans
Bank of America's data warehouse is one of the largest in the world, loading approximately 300 GB of data per month. Informatica is one of the prime tools employed by BOA for performing ETL loads, with approximately 45 applications currently in production. Functions handled by the DW Management group include delivering flat file(s) “AS IS” and/or loading them into The W according to Service Level Agreements; providing ongoing support to clients of The W by resolving problems and answering questions related to the data; and documenting and distributing the DWIS sub-routine (or program) names used during new development for special Data Warehouse (DW) column derivations and for common edits and output sub-routines. The scope of the project is to develop applications in the ECOMM domain for the Interact and ML transition portfolios.

Responsibilities:

  • Handling project meetings and performing analysis of the existing W data to understand the mapping rules in the current environment.
  • Finalizing the design, the model, and the High Level Design and Low Level Design documents.
  • Performing analysis of existing code in mainframes or in ETL tools like Informatica to understand the mapping logic.
  • Coordinating with the offshore team on coding, testing, and deployment to production.
  • Developing the mapping documents including System of Record field names, Mapping Rules and W Target field names.
  • Transmission of the System of Record files from mainframes / UNIX to new environment.
  • Coding in new environment as per the mapping documents.
  • Preparation of test cases to validate the output data generated using the mapping documents.
  • Validation of the test cases and Defect tracking using Quality Center tool.

Environment: Informatica 8.6, Teradata V2R5, UNIX, F-SECURE, Teradata SQL Assistant, Autosys Scheduler.
