
Data Warehouse Analyst Resume


Montvale, NJ

SUMMARY:

  • 8 years of IT experience in software analysis, development, manual testing of web-based applications, and implementation of business applications for the Financial, Mortgage Lending, Insurance, Healthcare, Telecom, and Sales domains.
  • 6+ years of ETL and data integration experience in developing ETL mappings and scripts using Informatica Power Center 9.1/8.6.1/8.5/8.1/7.1/6.2, Power Mart 6.2/5.1, Informatica Data Quality (IDQ) and Informatica Data Analyst (IDA) tools using Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer), Repository Manager, Workflow Manager & Workflow Monitor.
  • 5+ years of experience using Oracle 11g/9i/8i/7.x, MS SQL Server 2005/2000, Teradata V2R5/V2R4, MS Access 7.0/2000, SQL, PL/SQL, SQL*Plus, and Sun Solaris 2.x.
  • Extensive experience in creating and maintaining database objects such as tables, views, materialized views, indexes, constraints, primary keys, sequences, synonyms, and database links.
  • Utilized Informatica IDQ 8.6.1 for initial data profiling and for matching and removing duplicate data.
  • Extensive experience in developing and modifying PL/SQL scripts, anonymous blocks, triggers, procedures, functions, and packages.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica mappings.
  • Tested Informatica ETL mappings and other ETL processes.
  • Good at Data Warehouse techniques - Dimensional data Modeling, Star Schema and Snowflake Schema
  • Experienced in creating Transformations and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Experienced in Performance tuning of targets, sources, mappings and sessions.
  • Extensively worked on Informatica Designer components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
  • Involved in the creation of new objects (tables, views, triggers, indexes, keys) in Teradata and modified the existing 700+ ETLs to point to the appropriate environment.
  • Strong Experience on Workflow Manager Tools - Task Developer, Workflow & Worklet Designer.
  • Experience in preparing Test Strategy, developing Test Plan, Detailed Test Cases, writing Test Scripts by decomposing Business Requirements, and developing Test Scenarios to support quality deliverables.
  • Created test cases and developed the Traceability Matrix and test coverage reports.
  • Experienced in integrating various data sources such as Salesforce, Oracle, DB2, SQL Server, and MS Access into the staging area.
  • Extensively used the VI editor for modification of UNIX scripts.
  • Performed full SDLC activities covering data integration, unit testing, system integration testing, implementation, maintenance, and performance tuning.
  • Tested records flagged for logical deletion.
  • Managed and conducted system testing, integration testing, and functional testing.
  • Extensively wrote test scripts for back-end validations.
  • Good experience scheduling jobs using AutoSys on UNIX.
  • Expertise in SQL queries and Query Optimization, Report Testing techniques.
  • Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Knowledge on Informatica Data Explorer (IDE) and IDQ Workbench for Data Profiling.
  • Having exposure on Informatica Cloud Services.
  • Experience in Teradata, including Teradata utilities such as BTEQ, MultiLoad (MLOAD), and FastLoad (FLOAD).
  • Built reports and dashboards using Business Objects; expertise in Business Objects Universe Designer, adding objects and classes and building reports through Web Intelligence.
  • Involved in testing, debugging, bugs fixing and documentation of the system.
  • Participated in design and code reviews and verified compliance with the project’s plan.
  • Experienced in Informatica Data quality management suite (IDQ and IDE) to identify and merge customers and addresses.
  • Expert in resource and team management, with excellent written and verbal communication skills, a clear understanding of business procedures, and the ability to work both individually and as part of a team.

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.6.1/9.1/9.0/8.6/8.5/7.x, Informatica Power Exchange, SAP BODS 4.x, SSIS, Reporting Services, IDQ, IDE, Data Quality

Databases: Oracle 11g/10g/9i, DB2 8.0/7.0, MS SQL Server 2008/2005, Sybase, Teradata, SAP R/3, Salesforce, MS Access 7.0/97/2000.

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling, Erwin 4.0/3.5.2/3.x.

GUI: TOAD, Visual Basic 5.0/6.0, FrontPage 97/98/2000.

Programming: Visual Basic 6.0/5.0, PowerBuilder 6.0, C, PL/SQL, JavaScript, PERL, VBScript, HTML, XML, UNIX shell scripting, DHTML.

Design Tools: Erwin 4.5/4.0, Oracle Designer 2000.

Environment: Windows 95/98/2000/XP/Vista, UNIX AIX 5.2/5.3, Linux, Windows NT 4.0, MS-DOS.

PROFESSIONAL EXPERIENCE:

Confidential, Montvale, NJ

Data Warehouse Analyst

Responsibilities:

  • Involved in preparing the High Level Design and Low Level Design documents.
  • Worked on BW Objects in Vistex Price Book, Price Request, Agreement, IP module (Bill backs, Customer Rebates, Sales Incentives, Chargebacks and Purchase Rebates) both transactional and composite processing, Transaction Register, and Claims related data modeling and reporting.
  • Developed Informatica mappings for data extraction and loading, working with Expression, Lookup, Filter, Sequence Generator, and Aggregator transformations to load data from source to target.
  • Changed the source tables coming from ECC to BW in the mappings as per the requirement, making the changes per the mapping specification document.
  • Involved in test strategy and test case design.
  • Worked with the end customer to finalize requirements.
  • Migrated ETL code across environments (Development, Test, Production).
  • Developed UNIX scripts to run Informatica workflows in real time and was responsible for fixing defects during development, SIT, UAT, and post-production.
  • Created and maintained the project plan, updating it in a weekly review with the IM and the onsite project coordinator.
  • Worked on performance tuning.
  • Involved in unit and system testing to verify that data extracted from different source systems was loaded into the target accurately, per user requirements.
  • Used SQL, PL/SQL, and TOAD to validate the data going into the data warehouse; a minimal validation sketch follows this list.
  • Used Quality Center to create and document test plans and test cases and to register the expected results.
  • Shared knowledge within the team whenever required.
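
A minimal sketch of the kind of SQL validation described above, assuming hypothetical staging and warehouse table names (stg_sales_src, dw_sales_fact, dw_customer_dim); the actual checks depended on the project's real schema.

  -- Compare source and target row counts for one load date
  SELECT 'SRC' AS side, COUNT(*) AS row_cnt
    FROM stg_sales_src
   WHERE load_dt = TO_DATE('2016-01-31', 'YYYY-MM-DD')
  UNION ALL
  SELECT 'TGT', COUNT(*)
    FROM dw_sales_fact
   WHERE load_dt = TO_DATE('2016-01-31', 'YYYY-MM-DD');

  -- Fact rows whose customer key has no matching dimension row (should return no rows)
  SELECT f.sales_id, f.customer_key
    FROM dw_sales_fact f
    LEFT JOIN dw_customer_dim d ON d.customer_key = f.customer_key
   WHERE d.customer_key IS NULL;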

Environment: Informatica Power Center 9.6.1/9.5, Power Mart, Oracle 11g, PL/SQL, BI 4.1, BODS 4.1, IDT, WEBI, Lumira, SAP BW 7.4 SP 11, SAP ECC 6.0, SLT, Linux, Oracle 10g, BPC 10.1, Erwin, Windows XP.

Confidential, St. Louis, MO

Informatica Developer

Responsibilities:

  • Extensively used Informatica Power Center 9.6.1 for ETL (extraction, transformation, and loading) of data from relational tables.
  • Used Protegrity functions to tokenize and de-tokenize the data.
  • Walked through the Informatica and Teradata code to identify references to protected-information columns such as SSN, Medicaid number, last name, and first name.
  • Worked on creating sessions in the workflows based on the requirement.
  • Worked with data architecture teams to augment and define new structures.
  • Extensively worked on making changes to the parameter files if needed. All the ETL code is in Linux scripts.
  • Extensively used Tidal for scheduling the jobs when needed.
  • Involved in upgrading Informatica Power Center 9.5.1 to 9.6.1.
  • Wrote SQL queries to check whether the data was tokenized; a minimal sketch follows this list.
  • Worked on query banding to pull all the data required for tokenization from the repository.
  • Deployed the code or changes made from Development to Test.
  • Extensively coordinated with other departments of the company to make desired changes to the workflows.
  • Involved in testing all the sessions, workflows, to check if the desired changes were made.
  • Created export scripts using the Teradata FastExport utility.
  • Involved in unit and system testing to verify that data extracted from different source systems was loaded into the target accurately, per user requirements.
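
A minimal Teradata sketch of the tokenization check mentioned above, assuming a hypothetical member_dim table with an ssn column and Teradata 14 or later (for REGEXP_SIMILAR); the real queries were written against the project's actual protected columns.

  -- Tag the session with a query band so the work is identifiable downstream
  SET QUERY_BAND = 'Project=Tokenization;Job=ssn_check;' FOR SESSION;

  -- Values that are still nine raw digits are likely untokenized
  SELECT COUNT(*) AS untokenized_cnt
    FROM member_dim
   WHERE ssn IS NOT NULL
     AND REGEXP_SIMILAR(TRIM(ssn), '[0-9]{9}') = 1;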

Environment: Informatica Power Center 9.6.1/9.5, Power Mart, Teradata SQL Assistant, PL/SQL, Tidal, Tableau, SAP BO, SSRS, Windows, IDQ, IDE, SDLC, UNIX shell scripting, AGILE, Erwin, TOAD, Oracle, Autosys, Business Objects, Windows XP.

Confidential, Denver, CO

Informatica Developer

Responsibilities:

  • Analyzed business documents and created system requirement specification.
  • Extensively used Informatica Power Center 9.5 for ETL (extraction, transformation, and loading) of data from relational tables and flat files.
  • Extensively worked on complex mappings, mapplets, and workflows to meet the business needs and ensured transformations were reusable to avoid duplication.
  • Designed and developed star schema and snowflake schema models and created fact tables and dimension tables for the warehouse and data marts using Erwin.
  • Implemented Joiner, Expression, Aggregator, Rank, Lookup, Update Strategy, Filter, and Router transformations in mappings.
  • Utilized the Informatica Data quality management suite (IDQ and IDE) to identify and merge customers and addresses.
  • Responsible for creating complete test cases, test plans, test data, and reporting status ensuring accurate coverage of requirements and business processes
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mappings, build, unit testing, systems integration and user acceptance testing.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Designed and scheduled AutoSys processes to execute daily, weekly, and monthly jobs.
  • Created BTEQ (Basic Teradata Query) scripts to generate keys.
  • Performed data validation testing by writing SQL queries.
  • Used Repository Manager to create folders used to organize and store all metadata in the repository.
  • Responsible for the development, support, and maintenance of the ETL (Extract, Transform, and Load) processes using Informatica Power Center 9.5.1.
  • Analyzed, designed, and implemented ODS, data marts, the data warehouse, and operational databases.
  • Migrated mappings from Informatica 9.1.1 to Informatica 9.5.1, which introduced grid technology.
  • Created dimensional and relational physical and logical data models, with fact and dimension tables, using Erwin.
  • Understood the domain and node configuration and used the Informatica Integration Service to run workflows in 9.5.1.
  • Design and development of the Informatica workflows/sessions to extract, transform and load the data into Target. Created database triggers for Data Security.
  • Developed Informatica mappings, reusable transformations, reusable mappings, and mapplets to load data into the data warehouse.
  • Developed shell scripts for daily and weekly loads, scheduled using the UNIX Maestro utility.
  • Created export scripts using Teradata Fast export Utility.
  • Involved in writing SQL scripts, stored procedures and functions and debugging.
  • Involved in functional testing and regression testing.
  • Responsible for providing comments for user stories within an AGILE software development SCRUM environment.
  • Created sessions and batches and tuned performance of Informatica sessions for large data files by increasing the block size, data cache size and target based commit interval.
  • Prepared test data by modifying the sample data in the source systems, to cover all the requirements and scenarios.
  • Used debugger to test mapping at designer level.
  • Developed email routines to indicate failure or successful completion of workflows.
  • Wrote stored procedures to compare source data with warehouse data and write any missing records to a spool table; a minimal sketch follows this list.
  • Used Quality Center for creating and documenting Test Plans and Test Cases and register the expected results.
  • Experienced in retesting existing test cases against different kinds of source systems and different periods of data.
  • Created, configured, and scheduled sessions and batches for different mappings using Workflow Manager and UNIX scripts.
  • Extensively used PL/SQL programming for back-end and front-end functions and procedures to implement business rules.
  • Involved in upgrading from Informatica Power Center 8.6 to 9.5.1.
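
A minimal PL/SQL sketch of the reconciliation procedure described above, assuming hypothetical table names (src_policy, dw_policy_fact, recon_spool); the real procedure followed the project's actual schema and logging standards.

  CREATE OR REPLACE PROCEDURE spool_missing_records AS
  BEGIN
    -- Any source record not present in the warehouse is written to the spool table
    INSERT INTO recon_spool (record_key, spooled_dt)
    SELECT s.record_key, SYSDATE
      FROM src_policy s
     WHERE NOT EXISTS (SELECT 1
                         FROM dw_policy_fact w
                        WHERE w.record_key = s.record_key);
    COMMIT;
  END spool_missing_records;
  /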

Environment: Informatica Power Center 9.5/8.6, Power Mart, Oracle 11g/10g, PL/SQL, Cognos, SAP BO, SSRS, Windows, IDQ, IDE, SDLC, UNIX shell scripting, AGILE, Erwin, TOAD, Teradata, Autosys, Mercury Quality Center, Business Objects, Windows XP.

Confidential, NJ

ETL/ Informatica Developer

Responsibilities:

  • Designed/Enhanced the Entire ETL Architecture for Basel II.
  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
  • Gate authoring and versioning of Business Event Group (BEG) documents - Definition, Scope, Description, Requirement tracing to Functional & Technical Specifications.
  • Responsible for On Shore and Off Shore resources and planning for Basel Projects
  • Analyzing/Setting the timelines for the ETL and DB work.
  • Responsible for the development, support, and maintenance of the ETL (Extract, Transform, and Load) processes using Informatica Power Center 9.5.1.
  • Analysis of the End user requirements and involved in Modeling the ETL Schema.
  • Developing Automated Test Scripts to perform Functional Testing, Performance Testing, Integration Testing, Stress Testing, System Testing, User Acceptance Testing, Regression Testing and Volume testing of the application using LoadRunner
  • Detailed Analysis of the Data provided by the respective source systems and filling in the gaps in the mapping specs.
  • Data modeling and design of the data warehouse and data marts using star schema methodology with dimension and fact tables.
  • Profiled the data using Informatica Data Explorer (IDE) and performed Proof of Concept for Informatica Data Quality (IDQ).
  • Worked on several data marts over one terabyte in size.
  • Involved in implementing the data quality and data security framework.
  • Used SQL, PL/SQL, and TOAD to validate the data going into the data warehouse.
  • Implementing the standards of creating ETL Workflows/Mappings.
  • Creating ETL Workflows/Mappings for Basel Project
  • Implemented standards for naming conventions, mapping documents, technical documents, and migration forms.
  • Involved in upgrading from Informatica Power Center 8.6 to 9.5.1.
  • Responsible for providing comments for user stories within an AGILE software development SCRUM environment.
  • Created Defect Management flows for Factory (Pre-production) and Production issues.

Environment: Oracle 9i/10g, Informatica 8.6.1, SDLC, SQL Server, Teradata, Sybase, Solaris, Windows XP, HP Quality Center, PL/SQL, UNIX shell scripting, IDQ, IDE, AGILE, Autosys 4.0

Confidential, NY

ETL/ Informatica Developer

Responsibilities:

  • Responsible for ETL setup for data transfer and data integration across different applications, RDBMS, flat files and mainframe systems.
  • Installed and configured the Power Center server (7.1), repositories, and their services.
  • Extracted data from flat files and loaded them into Oracle and Sybase tables.
  • Used SQL, PL/SQL, and TOAD to validate the Data going in to the Data Ware House.
  • Generated fixed width files which were named dynamically in the format specified by Fortent.
  • Developed an automated process to load watch list files sent by Compliance on an ad hoc basis.
  • Utilized transformations, mapplets, and worklets to design and implement a superior ETL process.
  • Developed stored procedures that would populate certain tables in the Data Hub.
  • Used optimizer hints, EXPLAIN PLAN, and similar tools while developing and modifying stored procedures, materialized views, and other database objects; a minimal sketch follows this list.
  • Utilized the Informatica Data quality management suite (IDQ and IDE) to identify and merge customers and addresses
  • Created data marts for both CRM and financial applications and used effective querying and formatting tools to present the data to the end users.
  • Design and development of the Informatica workflows/sessions to extract, transform and load the data into Target. Created database triggers for Data Security.
  • Created dimensional and relational physical and logical data models, with fact and dimension tables, using Erwin.
  • Developed an audit system and maintained hash reports for ETL processes.
  • Performed administrative tasks related to environment setup, migration, access Control, as well as system backup and maintenance.
  • Setting up Alert Notification using Informatica's email task and UNIX utilities.
  • Developed Power Center startup scripts as well as shell scripts for dynamic file name creation, audit checks, file system backup (TAR), file transmission, etc.
  • Designed and scheduled AutoSys processes to execute daily, weekly, and monthly jobs.
  • Documented the entire Fortent application process per the bank's audit requirements.
  • Upgraded and migrated Power Center from version 7.1 to 8.5.1.
  • Analyzed the Service Oriented Architecture (SOA) and its capability in a HA (High Availability) environment.
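
A minimal Oracle sketch of the hint/EXPLAIN PLAN workflow noted above, assuming a hypothetical data_hub_txn table and txn_acct_dt_idx index; the actual tuning targeted the Data Hub's real objects.

  -- Explain the query, then review the access path before and after adding a hint
  EXPLAIN PLAN FOR
  SELECT /*+ INDEX(t txn_acct_dt_idx) */ t.account_id, SUM(t.amount)
    FROM data_hub_txn t
   WHERE t.txn_dt >= ADD_MONTHS(TRUNC(SYSDATE), -1)
   GROUP BY t.account_id;

  SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);  -- inspect join order, access path, and cost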

Environment: SQL Server, Oracle 9i/10g, Teradata, IDQ, IDE, Informatica (Power Center 7.1), UNIX shell Scripting (Ksh), SDLC, PL/SQL, Autosys

Confidential

Developer Tester

Responsibilities:

  • Developed and supported the Extraction, Transformation, and load process (ETL) for data migration using Informatica power center.
  • Responsible for developing Source to Target Mappings.
  • Extensively used Informatica Client tools- Source Analyzer, Warehouse Designer, Mapping Designer.
  • Developed Informatica mappings for data extraction and loading, working with Expression, Lookup, Filter, Sequence Generator, and Aggregator transformations to load data from source to target.
  • Conceptualized and developed initial and incremental data loads in Informatica using the Update Strategy transformation; a sketch of the incremental-load pattern follows this list.
  • Sourced data from Teradata to Oracle using FastExport and Oracle SQL*Loader.
  • Developed mappings and sessions for relational and flat-file sources and targets.
  • Developed single and multiple dashboards and scorecards using Business Objects.
  • Imported data from various sources (Oracle, flat files, XML), transformed the data, and loaded it into targets using Informatica.
  • Written SQL queries to access the data in the Mainframe DB2 database.
  • Monitored Workflows and Sessions
  • Developed Unit test cases for the jobs.
  • Identified the facts and dimensions and designed the relevant dimension and fact tables
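
A minimal SQL sketch of the initial/incremental-load pattern mentioned above, written as an Oracle MERGE with hypothetical table names (stg_customer, dw_customer_dim, etl_load_control); in the project itself this logic was implemented with Informatica's Update Strategy transformation rather than in the database.

  MERGE INTO dw_customer_dim tgt
  USING (SELECT customer_id, customer_name, address, updated_dt
           FROM stg_customer
          WHERE updated_dt > (SELECT NVL(MAX(last_load_dt), DATE '1900-01-01')
                                FROM etl_load_control
                               WHERE table_name = 'DW_CUSTOMER_DIM')) src
     ON (tgt.customer_id = src.customer_id)
   WHEN MATCHED THEN
     UPDATE SET tgt.customer_name = src.customer_name,
                tgt.address       = src.address,
                tgt.updated_dt    = src.updated_dt
   WHEN NOT MATCHED THEN
     INSERT (customer_id, customer_name, address, updated_dt)
     VALUES (src.customer_id, src.customer_name, src.address, src.updated_dt);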

Environment: Informatica 8.6, Oracle 9i, Erwin 4.0, IDQ, IDE, Teradata, PL/SQL, UNIX shell scripting, SQL Server 2005, Business Objects 6.0, DB2, Autosys, UNIX and Windows NT
