
Business/Data Warehouse Analyst Resume


Cincinnati, OH

SUMMARY:

  • 7+ years of IT experience in software analysis, development, manual testing of web-based applications, and implementation of business applications for the Financial, Mortgage Lending, Insurance, Healthcare, Telecom, and Sales domains.
  • 6+ years of ETL and data integration experience developing ETL mappings and scripts using Informatica Power Center 9.1/8.6.1/8.5/8.1/7.1/6.2, Power Mart 6.2/5.1, Informatica Data Quality (IDQ), and Informatica Data Analyst (IDA), working with Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer), Repository Manager, Workflow Manager, and Workflow Monitor.
  • 5+ years of experience using Oracle 11g/9i/8i/7.x, MS SQL Server 2005/2000, Teradata V2R5/V2R4, MS Access 7.0/2000, SQL, PL/SQL, SQL*Plus, and Sun Solaris 2.x.
  • Hands-on experience with JIRA-based Scrum project management processes, including business analysis, prototyping, implementation, verification and validation, release, and documentation work.
  • Proficient in JIRA/Confluence administration, including but not limited to issue types, workflows, screens, custom fields, permission schemes, and notification schemes.
  • Good working knowledge of JQL, version control tools, and project reporting dashboards.
  • Extensive experience in Informatica Cloud services and in the creation and maintenance of database objects such as tables, views, materialized views, indexes, constraints, primary keys, sequences, synonyms, and database links.
  • Utilized Informatica IDQ 8.6.1 for initial data profiling and for matching and removing duplicate data.
  • Extensive experience in developing and modifying PL/SQL scripts, anonymous blocks, triggers, procedures, functions, and packages (a minimal sketch appears after this list).
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica mappings.
  • Tested the ETL Informatica mappings and other ETL Processes.
  • Executed, scheduled workflows using Informatica Cloud tool to load data from Source to Target.
  • Good grasp of data warehouse techniques: dimensional data modeling, star schema, and snowflake schema.
  • Experienced in creating Transformations and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Experienced in Performance tuning of targets, sources, mappings and sessions.
  • Extensively worked with Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
  • Involved in the creation of new objects (tables/views, triggers, indexes, keys) in Teradata and modified the existing 700+ ETLs to point to the appropriate environment.
  • Strong Experience on Workflow Manager Tools - Task Developer, Workflow & Worklet Designer.
  • Experience in preparing Test Strategy, developing Test Plan, Detailed Test Cases, writing Test Scripts by decomposing Business Requirements, and developing Test Scenarios to support quality deliverables.
  • Created test cases and developed Traceability Matrix and test coverage reports.
  • Experienced in integrating various data sources such as Salesforce, Oracle, DB2, SQL Server, and MS Access into the staging area.
  • Extensively used the VI editor for modification of UNIX scripts.
  • Performed the full SDLC for data integration: unit testing, system integration testing, implementation, maintenance, and performance tuning.
  • Experience working with other ETL tools such as Pentaho Kettle and DataStage.
  • Tested records with logical deletes using flags.
  • Managed and conducted System testing, Integration testing and Functional testing.
  • Extensively wrote test scripts for back-end validations.
  • Good experience scheduling jobs using AutoSys on UNIX.
  • Expertise in SQL queries and Query Optimization, Report Testing techniques.
  • Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Proficient in project implementations using various Software Development Life Cycle (SDLC) methodologies like Waterfall, Agile (SCRUM) and RUP.
  • Exposure to Informatica Cloud Services.
  • Experience in Teradata; worked with Teradata utilities such as BTEQ, MLOAD, and FLOAD.
  • Built reports and dashboards using Business Objects; expertise in Business Objects Universe Designer, adding objects and classes and building reports through Web Intelligence.
  • Involved in testing, debugging, bugs fixing and documentation of the system.
  • Participated in design and code reviews and verified compliance with the project’s plan.
  • Experienced in Informatica Data quality management suite (IDQ and IDE) to identify and merge customers and addresses.
  • Expert in resource and team management. Excellent written and verbal communication skills and have clear understanding of business procedures and ability to work as an individual and as a part of a team.
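
A minimal sketch of the kind of PL/SQL procedure described above, assuming a simple staging-to-dimension load; all object names (stg_customer, dim_customer) are hypothetical:

    -- Hypothetical staging-to-dimension upsert; table and column names
    -- are illustrative only.
    CREATE OR REPLACE PROCEDURE load_dim_customer IS
    BEGIN
      MERGE INTO dim_customer d
      USING (SELECT customer_id, customer_name, address FROM stg_customer) s
      ON (d.customer_id = s.customer_id)
      WHEN MATCHED THEN
        UPDATE SET d.customer_name = s.customer_name,
                   d.address       = s.address
      WHEN NOT MATCHED THEN
        INSERT (customer_id, customer_name, address)
        VALUES (s.customer_id, s.customer_name, s.address);
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
    END load_dim_customer;
    /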

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.6.1/9.1/9.0/8.6/8.5/7.x, Informatica Multidomain MDM, IBM InfoSphere (DataStage, QualityStage), Informatica Power Exchange, SSIS, SSRS (Reporting Services), IDQ, IDE, Data Quality

Databases: Oracle 12c/11g/10g/9i, DB2 8.0/7.0, MS SQL Server 2008/2005, Sybase, Teradata, SAP R/3, Salesforce, MS Access 7.0/97/2000, SAS

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling, Erwin 4.0/3.5.2/3.x.

GUI: TOAD, Visual Basic 5.0/6.0, FrontPage 97/98/2000.

Programming: Visual Basic 6.0/5.0, PowerBuilder 6.0, C, PL/SQL, JavaScript, PERL, VBScript, HTML, XML, UNIX shell scripting, DHTML.

Design Tools: Erwin 4.5/4.0, Oracle Designer 2000.

Environment: Windows 95/98/2000/XP/Vista, UNIX AIX 5.2/5.3, Linux, Windows NT 4.0, MS-DOS.

PROFESSIONAL EXPERIENCE:

Confidential, Cincinnati, OH

Business/Data Warehouse Analyst

Responsibilities:

  • Served as Business Analyst/Data Analyst in the Enterprise Data Warehouse (EDW) for the commercial side of the bank.
  • Acted as liaison among stakeholders to define needs and recommend solutions that deliver value to stakeholders and enable the organization to achieve its goals.
  • Used the data integration tool Pentaho to design ETL jobs in the process of building data warehouses and data marts.
  • Involved in various phases of Software Development Life Cycle (SDLC) of the application like requirement gathering, Design, Analysis and Code development.
  • Produced and managed documentation on release activities to support problem resolution and post-release analysis.
  • Well versed with Requirement gathering, Requirement Management, Use Case writing, Use case diagrams/modeling and other Business Analysis skills/methodologies in general.
  • Designed and developed Use Cases, Activity Diagrams, Sequence Diagrams, Object Oriented Analysis and Design (OOAD) using UML and Business Process Modeling.
  • Involved in data analysis and mapping to track all data elements used in the application, from the user interface through different interfaces to the target databases in which they are stored.
  • Used Rational Rose to design and develop use case scenarios, use case models, activity diagrams, sequence diagrams, and state chart diagrams (Object Oriented Analysis and Design (OOAD) using UML).
  • Prepared test data from user stories and wrote and executed test cases in JIRA.
  • Wrote business requirements document (BRD) and business/system process flows to meet the needs of all stakeholders.
  • Used JIRA for agile project management, creating product backlog, sprint backlog and bug tracking.
  • Experience in data extraction, transformation, and loading (ETL) from Excel, SQL Server, Oracle, DB2, and flat files; involved in transformation of OLTP data to the Enterprise Data Warehouse.
  • Worked on JIRA for Issue tracking and project planning.
  • Performed task decomposition, delegated tasks, and monitored project milestones using JIRA.
  • Extensively used JIRA for request management and workload prioritization.
  • Proficient in functionality testing, system integration testing, and regression testing, including effective defect tracking and reporting using HP Quality Center, JIRA, and Application Lifecycle Management (ALM).
  • Extensively used Informatica Client tools- Source Analyzer, Warehouse Designer, Mapping Designer.
  • Developed Informatica mappings for data extraction and loading worked with Expression, Lookup, Filter and Sequence generator and Aggregator Transformations to load the data from source to target.
  • Conceptualized and developed initial and incremental data loads in Informatica using Update strategy transformation.
  • Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
  • Received daily and weekly data warehouse feed files and loaded them into Oracle tables using the SQL*Loader tool.
  • Analyzed existing source systems with the help of data profiling and source-system data models, creating individual data models for various domains/subject areas for the proposed data warehouse solution.
  • Worked on the development of Data Warehouse, Business Intelligence architecture that involves data integration and the conversion of data from multiple sources and platforms.
  • Developed ETL mappings in Pentaho Kettle for validation purposes.
  • Used Pentaho as a tool for testing data movement.
  • Designed ETL processes using ETL tools (Informatica and Pentaho) and implemented data movement, error capturing and reporting, and initial and delta loads; implemented a Change Data Capture methodology.
  • Experience with requirement management and test management tools, primarily JIRA.
  • Functioned as liaison between multiple business lines, IT, and Operations to analyze business and user needs, document requirements, and resolve complex system problems throughout the project cycle.
  • Assisted the IT Infrastructure team in defining and implementing its business plan and goals to support its strategy; centralized the data collection and distribution of all infrastructure and financial data.
  • Provided complex and detail-oriented analysis and evaluation of financial and infrastructure data.
  • Partnered with different business units, including Finance, Commercial Loans, Mortgage, Annuities, Trust, and Accounting, to gain a thorough knowledge base of the various business lines, business systems, and industry requirements, including the business plan, products, processes, and revenue streams.
  • Reviewed operational procedures and methods and recommended changes for improvement, with an emphasis on automation and efficiency.
  • Worked with multiple business units, coordinating efforts across multiple projects to research and analyze business requirements.
  • Provided guidance and context in prioritizing and determining the complexity of multiple problems and requests.
  • Managed monthly and ad hoc business analysis and related activities while creating management-level presentations.
  • Ensured data process enhancements followed the appropriate IT guidelines, met or exceeded user requirements, and were completed in a timely fashion.
  • QA responsibilities:
  • Worked with in-house and third-party resources to identify data process enhancements, document business needs, and ensure development work was completed to specification.
  • Created functional and technical specification documents for the requirements.
  • Worked with SQL queries to test and dig into the data (see the sketch after this list).
  • Consolidated CRM data and co-located it in one area for consumption.
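
A minimal sketch of the kind of SQL used to test and dig into the data, here a DB2-flavored source-to-target row-count reconciliation; the tables (stg_commercial_loans, edw_commercial_loans) and the load_ts column are hypothetical:

    -- Compare daily row counts between staging and the warehouse target;
    -- all names are illustrative only.
    SELECT s.load_date,
           s.src_rows,
           COALESCE(t.tgt_rows, 0) AS tgt_rows,
           s.src_rows - COALESCE(t.tgt_rows, 0) AS row_diff
    FROM (SELECT DATE(load_ts) AS load_date, COUNT(*) AS src_rows
          FROM stg_commercial_loans
          GROUP BY DATE(load_ts)) s
    LEFT JOIN (SELECT DATE(load_ts) AS load_date, COUNT(*) AS tgt_rows
               FROM edw_commercial_loans
               GROUP BY DATE(load_ts)) t
      ON s.load_date = t.load_date
    ORDER BY s.load_date;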

Environment: IBM Datastage, IBM DB2, PL/SQL, Tableau, Pentaho, SAP BO, SDLC, UNIX shell scripting, AGILE, Erwin, Autosys, Business objects, Windows XP.

Confidential, Montvale, NJ

Data Warehouse Analyst

Responsibilities:

  • Worked on a data warehouse developed to integrate operational systems into one central database structure; informational queries run against the warehouse identify customer behavior patterns.
  • Contributed to the High-Level Design and Low-Level Design documents.
  • Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
  • Created reports from the data warehouse using SSRS, including drill-down, drill-through, sub-reports, charts, and tables.
  • Involved in documentation of Data Mapping and ETL specifications for Data warehouse and Interacted with QA team in their testing of Data warehouse.
  • Develop, test and implement complex ETL flows for the Data Warehouse using PL/SQL, Oracle warehouse builder and Oracle Data Integrator.
  • Involved in test strategy and test case design.
  • Designed and developed ICS tasks and custom integration between Transaction system and Salesforce.com CRM.
  • Gathered requirements and designed and developed ETL jobs and transformations using Pentaho.
  • Good working experience with Informatica data integration tools such as Repository Manager, Designer, Workflow Manager, and Workflow Monitor; scheduled workflows using Workflow Manager.
  • Worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.
  • Extensively worked on developing ETL processes to support data extraction, transformation, and loading using Informatica Power Center.
  • Built Mapplet / Template to be used within the ICS DSS Synchronization Tasks.
  • Extensively worked on complex mappings, mapplets, and workflows to meet business needs, ensuring transformations were reusable to avoid duplication.
  • Worked on various kinds of transformations like Expression, Aggregator, Stored Procedure, Java, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
  • Worked with end customers to finalize requirements.
  • Created BTEQ (Basic Teradata Query) scripts to generate keys (a minimal sketch appears after this list).
  • Migrated ETL code across environments (Development, Test, Production).
  • Developed UNIX scripts to run Informatica workflows in real time; responsible for fixing defects during development, SIT, UAT, and post-production.
  • Created and maintained a project plan, updating it in a weekly review with the IM and the project's onsite coordinator.
  • Worked on performance tuning.
  • Shared knowledge within the team whenever required.
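
A minimal sketch of a BTEQ script that generates surrogate keys, assuming a staging table and a target dimension; the logon values and all object names are placeholders:

    .LOGON tdprod/etl_user,etl_password
    /* Hypothetical: assign surrogate keys to customers not yet in the
       dimension, continuing from the current maximum key. */
    INSERT INTO edw.dim_customer (customer_key, customer_id, customer_name)
    SELECT mx.max_key + ROW_NUMBER() OVER (ORDER BY s.customer_id),
           s.customer_id,
           s.customer_name
    FROM stg.customer s
    CROSS JOIN (SELECT COALESCE(MAX(customer_key), 0) AS max_key
                FROM edw.dim_customer) mx
    WHERE s.customer_id NOT IN (SELECT customer_id FROM edw.dim_customer);
    .IF ERRORCODE <> 0 THEN .QUIT 1
    .LOGOFF
    .QUIT 0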

Environment: Informatica Power Center 9.6.1/9.5, Informatica Cloud ICS, Oracle 12c, PL/SQL, SAP BW, Tableau, Teradata, SAP BO, SSRS, Windows, IDQ, IDE, SDLC, Pentaho, UNIX shell scripting, AGILE, Erwin, TOAD, Oracle, Autosys, Business Objects, Windows XP.

Confidential, St. Louis, MO

ETL Data Warehouse Analyst/Developer

Responsibilities:

  • Extensively used Informatica Power Center 9.6.1 for ETL (extraction, transformation, and loading) of data from relational tables.
  • Developed a Financial Systems Datamart and a Customer Information System (CIS) Datamart as components of the data warehouse.
  • Evaluated the tools for data cleanup and data warehouse build-up (Informatica and Pentaho)
  • Experience in using SQL queries, performing ETL process in Data Migration from databases to Data Warehouse.
  • Used Protegrity functions to tokenize and de-tokenize the data.
  • Walked through the Informatica and Teradata code to identify references to protected-information columns such as SSN, Medicaid number, last name, and first name.
  • Worked on creating sessions in the workflows based on the requirement.
  • Worked with data architecture teams to augment and define new structures.
  • Extensively worked on making changes to the parameter files as needed; all the ETL code is driven by Linux scripts.
  • Extensively used Tidal to schedule jobs when needed.
  • Involved in upgrading Informatica Power Center from 9.5.1 to 9.6.1.
  • Wrote SQL queries to check whether the data was tokenized (see the sketch after this list).
  • Worked on query banding to pull all the data required for tokenization from the repository.
  • Deployed the code or changes made from Development to Test.
  • Extensively coordinated with other departments of the company to make desired changes to the workflows.
  • Involved in testing all the sessions, workflows, to check if the desired changes were made.
  • Created export scripts using the Teradata FastExport utility.
  • Involved in Unit testing, System testing to check whether the data loads into target are accurate, which was extracted from different source systems according to the user requirements.
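
A minimal sketch of the kind of tokenization check described above, in Teradata SQL and assuming REGEXP_SIMILAR is available (Teradata 14+); the member_master table and ssn column are hypothetical:

    /* Count rows whose SSN column still looks like clear text rather
       than a token; names and patterns are illustrative only. */
    SELECT COUNT(*) AS untokenized_rows
    FROM member_master
    WHERE ssn LIKE '___-__-____'                    /* formatted raw SSN */
       OR REGEXP_SIMILAR(ssn, '[0-9]{9}', 'c') = 1; /* nine plain digits */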

Environment: Informatica Power Center 9.6.1/9.5, Informatica Cloud, Java, Teradata SQL assistant, PL/SQL, Tidal, Tableau, SAP BO, SSRS, Windows, IDQ, IDE, SDLC, UNIX shell scripting, AGILE, Erwin, TOAD, Oracle, Autosys, Business objects, Windows XP.

Confidential, Denver, CO

Informatica Developer

Responsibilities:

  • Analyzed business documents and created system requirement specification.
  • Extensively used Informatica Power Center 9.5 for ETL (extraction, transformation, and loading) of data from relational tables and flat files.
  • Extensively worked on complex mappings, mapplets, and workflows to meet business needs, ensuring transformations were reusable to avoid duplication.
  • Designed and developed star and snowflake schemas and created fact tables and dimension tables for the warehouse and data marts using Erwin.
  • Implemented Join, Expressions, Aggregator, Rank, Lookup, Update Strategy, Filter and Router transformations in mappings.
  • Utilized the Informatica Data quality management suite (IDQ and IDE) to identify and merge customers and addresses.
  • Responsible for creating complete test cases, test plans, test data, and reporting status ensuring accurate coverage of requirements and business processes
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mappings, build, unit testing, systems integration and user acceptance testing.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Designed and scheduled the Autosys process to execute daily, weekly, and monthly jobs.
  • Created BTEQ (Basic Teradata Query) scripts to generate keys.
  • Performed data validation testing by writing SQL queries.
  • Used Repository Manager to create folders used to organize and store all metadata in the repository.
  • Responsible for developing, supporting, and maintaining the ETL (extract, transform, and load) processes using Informatica Power Center 9.5.1.
  • Analyzed, designed, and implemented the ODS, data marts, data warehouse, and operational databases.
  • Migrated mappings from Informatica 9.1.1 to Informatica 9.5.1, which includes grid technology.
  • Created dimensional and relational physical and logical data models, including fact and dimension tables, using Erwin.
  • Worked with the Informatica domain and nodes and used the Integration Service to run workflows in 9.5.1.
  • Design and development of the Informatica workflows/sessions to extract, transform and load the data into Target. Created database triggers for Data Security.
  • Developed Informatica mappings, reusable transformations, reusable mappings, and mapplets to load data into the data warehouse.
  • Developed shell scripts for daily and weekly loads, scheduled using the UNIX Maestro utility.
  • Created export scripts using Teradata Fast export Utility.
  • Involved in writing SQL scripts, stored procedures and functions and debugging.
  • Involved in functional testing and regression testing.
  • Responsible for providing comments for user stories within an AGILE software development SCRUM environment.
  • Created sessions and batches and tuned performance of Informatica sessions for large data files by increasing the block size, data cache size and target based commit interval.
  • Prepared test data by modifying the sample data in the source systems, to cover all the requirements and scenarios.
  • Used debugger to test mapping at designer level.
  • Developed email routines to indicate failure or successful completion of workflows.
  • Wrote stored procedures to compare source data with warehouse data and write records missing from the warehouse to a spool table (a minimal sketch appears after this list).
  • Used Quality Center to create and document test plans and test cases and to record the expected results.
  • Retested existing test cases against different kinds of source systems and different periods of data.
  • Created, configured, and scheduled sessions and batches for different mappings using Workflow Manager and UNIX scripts.
  • Extensively used PL/SQL programming in backend and front end functions, procedures to implement business rules.
  • Involved in upgrading from Informatica Power Center 8.6 to 9.5.1.
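
A minimal sketch of the spool-table stored procedure described above, in PL/SQL; all object names (src_txn, dw_txn, spool_missing_txn) are hypothetical:

    -- Write source records missing from the warehouse to a spool table;
    -- names are illustrative only.
    CREATE OR REPLACE PROCEDURE spool_missing_txns IS
    BEGIN
      INSERT INTO spool_missing_txn (txn_id, txn_date, amount)
      SELECT s.txn_id, s.txn_date, s.amount
      FROM src_txn s
      WHERE NOT EXISTS (SELECT 1 FROM dw_txn d WHERE d.txn_id = s.txn_id);
      COMMIT;
    END spool_missing_txns;
    /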

Environment: Informatica Power Center 9.5/8.6, IBM InfoSphere DataStage, Power Mart, Oracle 11g/10g, PL/SQL, Cognos, SAP BO, SSRS, Windows, IDQ, IDE, SDLC, UNIX shell scripting, AGILE, RUP, UML, Erwin, TOAD, Teradata, Autosys, Mercury Quality Center, Business Objects, Windows XP.

Confidential, NJ

ETL/Informatica Developer

Responsibilities:

  • Designed and enhanced the entire ETL architecture for Basel II.
  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
  • Authored and versioned gate documents for Business Event Groups (BEGs): definition, scope, description, and requirement tracing to functional and technical specifications.
  • Responsible for onshore and offshore resources and planning for Basel projects.
  • Analyzed and set the timelines for the ETL and DB work.
  • Responsible for developing, supporting, and maintaining the ETL (extract, transform, and load) processes using Informatica Power Center 9.5.1.
  • Analyzed end-user requirements and was involved in modeling the ETL schema.
  • Developed automated test scripts to perform functional, performance, integration, stress, system, user acceptance, regression, and volume testing of the application using LoadRunner.
  • Performed detailed analysis of the data provided by the respective source systems and filled in the gaps in the mapping specs.
  • Performed data modeling and design for the data warehouse and data marts using star schema methodology with dimension and fact tables.
  • Profiled the data using Informatica Data Explorer (IDE) and performed Proof of Concept for Informatica Data Quality (IDQ).
  • Worked on several data marts over one terabyte in size.
  • Involved in implementing the data quality and data security framework.
  • Used SQL, PL/SQL, and TOAD to validate the data going into the data warehouse (see the sketch after this list).
  • Implemented the standards for creating ETL workflows/mappings.
  • Created ETL workflows/mappings for the Basel project.
  • Implemented standards for naming conventions, mapping documents, technical documents, and migration forms.
  • Involved in upgrading from Informatica Power Center 8.6 to 9.5.1.
  • Responsible for providing comments for user stories within an AGILE software development SCRUM environment.
  • Created Defect Management flows for Factory (Pre-production) and Production issues.
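
A minimal sketch of the kind of validation query run in TOAD against the warehouse, here a duplicate-key check; the fact_exposure table and its columns are hypothetical:

    -- Flag natural-key duplicates in a Basel exposure fact table;
    -- names are illustrative only.
    SELECT counterparty_key,
           as_of_date,
           COUNT(*) AS dup_rows
    FROM fact_exposure
    GROUP BY counterparty_key, as_of_date
    HAVING COUNT(*) > 1;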

Environment: Oracle 9i/10g, Informatica 8.6.1, Informatica MDM, SDLC, SQL Server, Teradata, Sybase, Solaris, Windows XP, HP Quality Center, PL/SQL, UNIX shell scripting, IDQ, IDE, AGILE, Autosys 4.0

Confidential, NY

ETL/Informatica Developer

Responsibilities:

  • Responsible for ETL setup for data transfer and data integration across different applications, RDBMS, flat files and mainframe systems.
  • Installed and configured the PowerCenter Server (7.1), its repositories, and their services.
  • Extracted data from flat files and loaded them into Oracle and Sybase tables.
  • Used SQL, PL/SQL, and TOAD to validate the data going into the data warehouse.
  • Generated fixed-width files, named dynamically in the format specified by Fortent.
  • Developed an automated process to load watch-list files sent by Compliance on an ad hoc basis.
  • Utilized transformations, mapplets, and worklets to design and implement a superior ETL process.
  • Developed stored procedures that would populate certain tables in the Data Hub.
  • Used optimizer hints, EXPLAIN PLAN, and similar techniques while developing and modifying stored procedures, materialized views, and other database objects (see the sketch after this list).
  • Utilized the Informatica data quality management suite (IDQ and IDE) to identify and merge customers and addresses.
  • Created data marts for both CRM and financial applications and used effective querying and formatting tools to present the data to end users.
  • Design and development of the Informatica workflows/sessions to extract, transform and load the data into Target. Created database triggers for Data Security.
  • Created Dimensional and Relational Physical & logical data modeling fact and dimensional tables using Erwin.
  • Developed an audit system and maintained hash reports for ETL processes.
  • Performed administrative tasks related to environment setup, migration, and access control, as well as system backup and maintenance.
  • Set up alert notifications using Informatica's email task and UNIX utilities.
  • Developed power center startup scripts as well as shell scripts for dynamic file name creation, audit checks, file system backup (TAR), file transmission etc.
  • Designed and scheduled the Autosys process to execute daily, weekly, and monthly jobs.
  • Documented the entire Fortent application process per the bank's audit requirements.
  • Upgraded and migrated PowerCenter from version 7.1 to 8.5.1.
  • Analyzed the Service Oriented Architecture (SOA) and its capability in an HA (High Availability) environment.
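
A minimal sketch of the hint/EXPLAIN PLAN work described above, in Oracle SQL; the data_hub_txn table, txn_acct_ix index, and :acct bind are hypothetical:

    -- Capture the access path for a Data Hub query, steering the
    -- optimizer toward a hypothetical index; names are illustrative only.
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(t txn_acct_ix) */ t.txn_id, t.amount
    FROM data_hub_txn t
    WHERE t.account_no = :acct;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);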

Environment: SQL Server, Oracle 9i/10g, Teradata, IDQ, IDE, Informatica (Power Center 7.1), UNIX shell Scripting (Ksh), SDLC, PL/SQL, Autosys

Confidential

Developer/ Tester

Responsibilities:

  • Developed and supported the extraction, transformation, and load (ETL) process for data migration using Informatica Power Center.
  • Responsible for developing Source to Target Mappings.
  • Sourced data from Teradata to Oracle using FastExport and Oracle SQL*Loader.
  • Developed mappings and sessions for relational and flat-file sources and targets.
  • Developed single and multiple dashboards and scorecards using Business Objects.
  • Imported data from various sources (Oracle, flat file, XML) and transformed and loaded it into targets using Informatica.
  • Wrote SQL queries to access data in the mainframe DB2 database (a minimal sketch appears after this list).
  • Monitored workflows and sessions.
  • Developed unit test cases for the jobs.
  • Identified the facts and dimensions and designed the relevant dimension and fact tables.
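
A minimal sketch of the kind of DB2 query used against the mainframe database; the schema, table, and columns are hypothetical:

    -- Pull recent active policies for an extract; names are
    -- illustrative only.
    SELECT POLICY_NO, POLICY_STATUS, EFF_DATE
    FROM PRODSCH.POLICY_MASTER
    WHERE POLICY_STATUS = 'A'
      AND EFF_DATE >= CURRENT DATE - 30 DAYS
    FETCH FIRST 1000 ROWS ONLY;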

Environment: Informatica 8.6, Oracle 9i, Erwin 4.0, IDQ, IDE, Teradata, PL/SQL, UNIX shell scripting, SQL Server 2005, Business Objects 6.0, DB2, Autosys, UNIX, and Windows NT
