Business Data Analyst Resume
Minneapolis, MN
SUMMARY
- Confidential is an experienced Data Analyst and Data Modeler with hands-on expertise in data mapping, validation, and requirements gathering in data warehousing environments.
- He is proficient in analyzing source systems - the content, structure, and internal relationships of legacy data sources such as flat files, Excel, Oracle, Sybase, SQL Server, XML, and DB2 databases.
- He has worked in various verticals such as Banking, Health Care, Pharmaceutical, Financial, Retail, Beverages, and Education.
- Confidential has been involved in all phases of the software development life cycle (SDLC), from requirements definition through implementation.
- He has designed Star schemas (identifying facts and dimensions) and Snowflake schemas, among others, for modeling data warehouses.
- Confidential is well versed in the Ralph Kimball and Bill Inmon approaches to Data Warehousing.
- He facilitated and participated in Joint Application Development (JAD) sessions and whiteboard sessions to keep executive staff and team members apprised of goals, project status, and issue resolution.
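The star-schema work mentioned above can be sketched in miniature: one fact table joined to its dimensions. This is an illustrative example with invented table and column names (using SQLite as a stand-in database), not taken from any engagement listed here.

```python
import sqlite3

# Hypothetical star schema: one fact table keyed to two dimensions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date  (date_key INTEGER PRIMARY KEY, full_date TEXT);
CREATE TABLE dim_store (store_key INTEGER PRIMARY KEY, store_name TEXT);
CREATE TABLE fact_sales (
    date_key   INTEGER REFERENCES dim_date(date_key),
    store_key  INTEGER REFERENCES dim_store(store_key),
    units_sold INTEGER,
    revenue    REAL
);
INSERT INTO dim_date  VALUES (20240101, '2024-01-01');
INSERT INTO dim_store VALUES (1, 'Minneapolis');
INSERT INTO fact_sales VALUES (20240101, 1, 10, 250.0);
""")
# Typical star-schema query: join the fact to its dimensions.
row = cur.execute("""
    SELECT s.store_name, d.full_date, f.revenue
    FROM fact_sales f
    JOIN dim_date  d ON d.date_key  = f.date_key
    JOIN dim_store s ON s.store_key = f.store_key
""").fetchone()
print(row)  # ('Minneapolis', '2024-01-01', 250.0)
```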
TECHNICAL SKILLS
Database systems: Oracle 10g/9i/8i/7.x, SQL Server (2005/2000/7.0), DB2, Teradata V2R5, MS Access, Sybase.
Data modeling tools: ERwin, PowerDesigner, Oracle Designer, ER/Studio, Rational Rose, MS Visio, Embarcadero.
BI/Reporting tools: MicroStrategy 9, Crystal Reports, Cognos, and SSRS.
ETL tools: Informatica 8.1/7.1.3/7.1.1/6.x/5.x and SSIS.
Programming languages: C, C++, Java, Unix shell, Visual Basic, COBOL.
Operating systems: Windows (NT/XP/2003), Linux, and Unix.
Other tools: MS Office (Word, Excel, PowerPoint), MS Project.
PROFESSIONAL EXPERIENCE
Confidential, Minneapolis, MN
Business Data Analyst
Responsibilities:
- Successfully implemented business team deliverables throughout the SDLC process for the TMS application. Activities spanned Project Definition, Business Requirements Definition, Functional Design Review, User Feedback Sessions and Iterations, Testing, Deployment, and Support.
- Documented wireframes and system logic for customer access and enrollment paths - Quick Pages.
- Gathered business requirements from business users, SME (Hardware Asset & Software asset) and stakeholders to understand their needs and requirements.
- Performed Source-to-Target Data Mapping and Data Lineage Analysis and documentation using the Enterprise Metadata Repository for SORs such as REMEDY, P2K, BDANT, and Reverse Logistics.
- Used SQL queries for organizing and extracting data from MS Access databases; created reports and forms in MS Access.
- Developed use cases, process flow diagrams (As Is and To Be flows) and UI mockups/wireframes to represent business requirements. Was part of the Reporting team for Technology Asset TMV (Total Market Value) and Asset FA (Fund Accounting).
- Gathered, analyzed, drafted, and finalized the business requirement documents (BRDs). Provided key input while working with users to define project and system requirements.
- Wrote the data requirements for each client at the custodian level; multiple clients had multiple custodians. Worked with a phased approach.
- Performed database analysis, tested data integrity and data mapping during development.
- Met regularly with the user community to capture, validate, and refine user stories and requirements.
- Worked with the Semantics teams and the ETL teams to ensure a smooth transition from the design phase to the implementation phase.
- Involved in the project cycle plan for the source XML data extraction process, data analysis, transformation, and ETL loading strategy design.
- Documented requirements in PowerDesigner V16/V15.
- Deliverables included BRDs, FSDs, and Schema Mapping documents.
- Validated test data and production data on Teradata using SQL queries.
- Played a major role in identifying source-to-target elements by implementing SQL queries.
- Designed the dimensional data model of the data warehouse using PowerDesigner.
- Worked with business users to develop Subject Area and Metadata Reporting requirements such as schemas, required hierarchies, and data sources.
- Created Data Flow/Data Lineage Diagrams to depict the Source-to-Target Mapping and Data Lineage Analysis.
- Coordinated with QA teams to document test cases for the Functional and UAT test phases; involved in testing the application functionality using manual test cases from HP Quality Center.
- Worked with QA and technical leads from systems development; participated in defect triage, provided business priority and input on functionality, and delivered requirements for defect correction and/or redesign.
- Used Business Objects free-hand SQL to generate ad-hoc reports and tested them in Teradata for QA.
- Assisted Design, Development and Test Team in resource planning and time management.
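The SQL-based source-to-target validation described above typically reduces to comparing row counts and column aggregates between a staging table and its warehouse target. A toy sketch with invented table names, using SQLite in place of Teradata:

```python
import sqlite3

# Hypothetical staging vs. target tables for a reconciliation check.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_assets (asset_id INTEGER, cost REAL);
CREATE TABLE tgt_assets (asset_id INTEGER, cost REAL);
INSERT INTO src_assets VALUES (1, 100.0), (2, 250.5);
INSERT INTO tgt_assets VALUES (1, 100.0), (2, 250.5);
""")
# Compare row counts and a column total between source and target.
src_count, src_sum = cur.execute("SELECT COUNT(*), SUM(cost) FROM src_assets").fetchone()
tgt_count, tgt_sum = cur.execute("SELECT COUNT(*), SUM(cost) FROM tgt_assets").fetchone()
assert src_count == tgt_count and src_sum == tgt_sum
print("counts and totals match:", src_count, src_sum)
```

In practice, such checks are run per mapping rule; a mismatch points back to the Source-to-Target document for correction.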
Confidential, Eden Prairie, MN
Data Analyst/ Business Analyst
Responsibilities:
- Documented, coached, and elicited business requirements from subject matter experts by writing user stories, resulting in a clearer, more detailed, and more complete understanding of project deliverables.
- Partnered with portal users/clients and the development team to understand requirements and gain a better understanding of the source data from a business and technical perspective.
- Conducted design meetings to determine the appropriate design for the migration process.
- The old portal source data was provided in XML format; after analysis, we decided to stage the XML data on an intermediate SQL Server and then move the data to the Salesforce application.
- Performed data migration analysis for 30+ clients, including Caterpillar, Morgan Stanley, Union Bank, Apple, Accenture, and McDonald (MCD100). Clients were divided into Due Diligence and 3rd Party categories.
- For 3rd Party clients, helped create the questionnaires on the new compliance portal, which were sent to vendors, and migrated the old data onto the new portal with user/client input.
- Migrated business data, statistics, and questionnaire scores using mapping rules and logic.
- Built the logic for creating the questionnaires on the new portal using a single questionnaire Salesforce object/table for all clients.
- Created DFDs, mapping, and testing criteria for client-specific Orders, Profile, and User data migration from the old system to the new Salesforce portal.
- Converted XML data types and data lengths to accommodate data moving from the old portal to the new portal.
- Created reports for business users on the migration process and data.
- Created client data access/portal permission mapping rules for users on the new portal.
- Worked in a team on creating a logical model linking Accounts, Orders, Profiles, Attachments, Users, and User permissions.
- Reviewed migrated data in different environments on the old and new portals with sample data, and coordinated with the testing and Informatica development teams to correct issues or update the DFDs/mappings.
- Led JAD sessions on specific portal enhancements for clients such as Apple, Apollo, Nike, KWISDD, and KrollDD.
- Performed data validation on the target, working with ETL to resolve defects and making mapping changes where required.
- Involved in the project cycle plan for the source XML data extraction process, data analysis, transformation, and ETL loading strategy design.
- Performed data profiling to analyze and support data quality controls (valid value ranges, relationships between data elements).
- Queried databases, wrote SQL test validation scripts, and performed system testing.
- Worked with developers to build SQL triggers and procedures to check for any new orders on the old portal after a specific client's migration, before Salesforce went live for that client.
- Developed Use Cases using MS Visio.
- Enforced standards to ensure that data elements and objects were properly named in Salesforce and in the Source-to-Target mapping. Created fully fledged Source-to-Target Mapping documents and documented business and transformation rules.
- Reviewed test results and coordinated with the testing and development teams to correct issues.
Environment: Salesforce, SQL Server 2008, Team Foundation Server (TFS), Power Designer, Microsoft Test Manager (MTM)
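The XML-staging step described above (parse portal XML, load it into an intermediate relational table before the downstream migration) can be sketched as follows. The XML shape and table name are invented for illustration; SQLite stands in for the staging SQL Server.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical portal export: a small XML document of orders.
xml_doc = """
<orders>
  <order id="101"><client>Acme</client><status>open</status></order>
  <order id="102"><client>Globex</client><status>closed</status></order>
</orders>
"""
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, client TEXT, status TEXT)")
# Shred each <order> element into a staging row.
for node in ET.fromstring(xml_doc).findall("order"):
    cur.execute(
        "INSERT INTO stg_orders VALUES (?, ?, ?)",
        (int(node.get("id")), node.findtext("client"), node.findtext("status")),
    )
rows = cur.execute("SELECT order_id, client FROM stg_orders ORDER BY order_id").fetchall()
print(rows)  # [(101, 'Acme'), (102, 'Globex')]
```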
Confidential, Minneapolis, MN
Data Analyst/ Data Modeler
Responsibilities:
- Interacted with Business Users, Business Analysts, and technical teams to understand the requirements and gain a better understanding of the source data from a business and technical perspective.
- Performed preliminary analysis on multiple origination and servicing systems, such as the Lending Information System (LIS), LPS, and AS400, and assisted the sourcing team with the creation of Data Requirement Documents.
- Worked on multiple subject areas; involved in detailed identification and analysis of the source systems that needed to be moved over to the MIDE staging and integrated layers.
- Created fully fledged Source-to-Target Mapping documents, documented business and transformation rules, and participated in working sessions, ensuring full business participation throughout the process.
- Supported and adhered to WFHM Integrated Methodology, operational procedures, architectural standards and review processes by following the required processes and standards.
- Attended JAD sessions with Data Architects to build up subject area specific models.
- Worked to create Physical Data Designs/ First Cut Data Models for various projects/contracts.
- Optimized Indexes for better reporting analysis.
- Worked with the Semantics teams and the ETL teams to ensure a smooth transition from the design phase to the implementation phase.
- Documented requirements in PowerDesigner V16/V15 and ERwin, and created Requirements Traceability Matrices (RTMs).
- Generated DDLs and provided them to the ETL team for FSD support and to the DBA for database implementation.
- Worked with different data formats such as flat files, SQL files, databases, XML schemas, and CSV files.
- Provided Semantics requirements to the developers/semantics models.
- Created tables, views, and indexes using PowerDesigner V16/V15.
- Used the Teradata SQL Assistant tool extensively to profile data and check mapping accuracy.
- Created DFD (Data Functional Design) artifacts that incorporated the process flow Visio, S-T Mapping document and all the specifications for proper ETL implementation.
- Conducted internal and final DFD reviews with the ETL team (EIS), the Data Quality and Metadata Management team, data architects, and business and application data stewards.
- Involved in the project cycle plan for the data warehouse: source data analysis, data extraction process, transformation, and ETL loading strategy design.
- Involved in quality assurance to ensure the quality, validity, and accuracy of data across the servers.
- Enforced standards to ensure that data elements and attributes were properly named in the data model and the Source-to-Target mapping.
- Worked within the constructs of agile development concepts such as Unit Testing, Continuous Build and Integration, and Highly Collaborative Development approaches.
Environment: PowerDesigner, Ab Initio, Informatica, Teradata, Informatica Data Quality, Oracle 9i, SQL Server, SQL, PL/SQL, MS Office, Windows 2003.
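The DDL handoff described above (model-generated DDL provided to the ETL team and DBA) can be illustrated with a small generator. In the actual work the DDL came from PowerDesigner/ERwin; this hand-rolled sketch with a hypothetical table spec only shows the shape of the output.

```python
def build_ddl(table, columns):
    """Render a CREATE TABLE statement from (name, sql_type, nullable) tuples."""
    lines = [
        f"    {name} {sql_type}{'' if nullable else ' NOT NULL'}"
        for name, sql_type, nullable in columns
    ]
    return f"CREATE TABLE {table} (\n" + ",\n".join(lines) + "\n);"

# Hypothetical loan table spec, not from any actual model.
ddl = build_ddl("loan_master", [
    ("loan_id", "INTEGER", False),
    ("orig_date", "DATE", True),
    ("balance", "DECIMAL(12,2)", True),
])
print(ddl)
```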
Confidential, Minneapolis, MN
Business Data Analyst
Responsibilities:
- Created Data Flow/Data Lineage Diagrams (DFD/DLD) to depict the Source-to-Target Mapping and Data Lineage Analysis.
- Proactively identified non-standard offerings and informed project teams of cost/effort implications early in the process.
- Followed and met Key Performance Expectations (KPEs).
- Worked in a team environment to create DFDs for sending out postcards and checks to 4.2 million customers across 13 banks/servicers for the Independent Foreclosure Review.
- Facilitated and led JAD sessions aimed at functional requirements for Non-Responder Checks and requests for Split Checks.
- Identified QICO records ('IN CARE OF' and 'C/O') before sending out initial checks; identified QICO records were sent to banks/servicers for updated names and addresses.
- Built logic to load the corrected QICO data from servicers into a SQL Server database for around 400K records.
- Reviewed test results and coordinated with the testing and development teams to correct issues.
- Identified solutions that reduce IT expenditures using a consultative approach between IT and project teams.
- Worked with developers to build SSIS packages to send records to the Damasco team for TIN matching.
- Created weekly/bi-weekly reports and extracts for regulators at the Office of the Comptroller of the Currency (OCC).
- Involved in risk assessment for a few sub-projects (QICO & Splits).
- Worked in a team to send out COA (Change of Address) and COP (Change of Payee) forms, and created an SSIS package to load the incoming COA and COP files.
- Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirements.
Environment: SQL Server 2008, MicroStrategy 9, SQL Developer, Windows XP, MS Visio, MS Access, and MS Excel.
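The QICO screen described above amounts to flagging payee-name records containing 'IN CARE OF' or 'C/O' before checks go out. A minimal sketch; the sample records are fabricated for illustration.

```python
import re

# Match 'IN CARE OF' or 'C/O' as whole tokens, case-insensitively.
QICO_PATTERN = re.compile(r"\b(IN CARE OF|C/O)\b", re.IGNORECASE)

def is_qico(payee_name):
    """Flag a payee name that routes through a third party."""
    return bool(QICO_PATTERN.search(payee_name))

# Invented sample records, not real payee data.
records = ["JOHN SMITH", "JANE DOE C/O ACME TRUST", "IN CARE OF ESTATE OF R. LEE"]
flagged = [r for r in records if is_qico(r)]
print(flagged)  # ['JANE DOE C/O ACME TRUST', 'IN CARE OF ESTATE OF R. LEE']
```

Flagged records would then be routed to the servicer for an updated name and address before the check run.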
Confidential, Minneapolis, MN
Data Analyst
Responsibilities:
- Worked with the business community and Source of Record IT analysts to gather the information needed to perform the data modeling (using PowerDesigner) and to understand the data flow from source to target.
- Conducted design discussions and meetings with the data architects to arrive at the appropriate design for the data warehouse (Source Layer, Integration, and Semantics).
- Created databases for OLAP metadata tables and worked with various databases and schemas.
- Created logical dimensional data models and physical data models for the data warehouse using Sybase PowerDesigner 15 and ERwin.
- Created fully fledged Source-to-Target Mapping documents and documented business and transformation rules.
- Enforced standards to ensure that data elements and attributes were properly named in the data model and the Source-to-Target mapping.
- Performed performance tuning on Teradata using the best index combinations.
- Involved in the project cycle plan for the data warehouse: source data analysis, data extraction process, transformation, and ETL loading strategy design.
- Collected data from various heterogeneous databases/files such as MS SQL Server, Oracle, flat files, MS Access, and Excel sheets, and performed complex data conversions to the target.
- Performed data profiling to analyze and support data quality controls (valid value ranges, relationships between data elements, business rules, and reasonability checks).
- Responsible for creating the Source Detail Design, Source-to-Teradata Mapping, and Data Flow Design documents.
- Conducted internal and final DFD reviews with the ETL team (EIS), Data Quality, Metadata Management team, data architects, and business and application data stewards.
- Converted all data types to Teradata standards.
- Participated in ETL Functional Specification Document (FSD) reviews and responded to questions related to the mappings, modeling, and processing logic flows. Supported the ETL Ab Initio/Informatica development process.
- Involved in quality assurance to ensure the quality, validity, and accuracy of data across the servers.
- Performed data analysis on source and target data to understand problems/PAC tickets/issues in the data warehouse (MIDE).
- Performed data validation on the target, working with ETL to resolve defects and making mapping changes where required.
Environment: PowerDesigner, Ab Initio, SQL Server, SQL, PL/SQL, MS Office, Windows 2003.
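The data-profiling checks described above (null rates, distinct counts, valid-value-range screening per column) can be sketched as a small helper. The sample data is invented.

```python
def profile_column(values, valid_range=None):
    """Profile one column: null count, distinct count, and optional range check."""
    non_null = [v for v in values if v is not None]
    report = {
        "null_count": len(values) - len(non_null),
        "distinct": len(set(non_null)),
    }
    if valid_range is not None:
        lo, hi = valid_range
        report["out_of_range"] = sum(1 for v in non_null if not lo <= v <= hi)
    return report

# Hypothetical 'age' column with a null and an impossible value.
ages = [34, 41, None, 250, 28]
print(profile_column(ages, valid_range=(0, 120)))
# {'null_count': 1, 'distinct': 4, 'out_of_range': 1}
```

Profiling output like this feeds the data quality controls and reasonability checks before the ETL mappings are finalized.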
Confidential, NY
Data Analyst/ Data Modeler
Responsibilities:
- Involved in the projects from requirement analysis onward to better understand the requirements and support the development team with a better understanding of the data.
- Used Rational Requisite Pro to document technical requirements and business user requirements.
- Designed logical and physical data models for multiple OLTP applications.
- Redesigned some of the subject areas and introduced some new entities and attributes as per the requirements.
- Developed Use Cases using MS Visio, and a detailed project plan with emphasis on deliverables.
- Used Rational Data Architect (RDA) to transform data requirements into data models.
- Performed Reverse Engineering of the current application using Rational Data Architect, and developed Logical and Physical data models for Central Model consolidation.
- Designed the logical model, ensuring that it follows normalization rules and has the best possible traceability pattern.
- Supported the DBA in the physical implementation of the tables in both Oracle and DB2 databases.
- Documented the design conventions used for performing database modeling.
- Ensured that the Rational toolset information accurately reflects the physical FPA structure, and the data dictionary, DDL, models, and any support documents produced from the toolset would correctly represent the physical structure and ensure a smooth transition if needed.
- Responsible for ensuring and providing consistent metadata across all the models in FPA.
Environment: Oracle 9i, SQL, PL/SQL, MS Office, Rational Data Architect, Windows Server 2003, UNIX-Solaris, Confidential Rational Requisite Pro, Rational Clear Quest, SQL Developer, TOAD
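The normalization rules the logical models above had to follow can be illustrated with a tiny example: splitting a repeated customer attribute out of an orders relation into its own entity (a 3NF-style decomposition). All data and names here are invented.

```python
# Denormalized rows: customer name is repeated on every order.
orders_denorm = [
    (1, "C100", "Acme Corp", 250.0),
    (2, "C100", "Acme Corp", 99.0),
    (3, "C200", "Globex", 10.0),
]

# 3NF-style split: customer name depends only on customer_id,
# so it moves to its own relation keyed by customer_id.
customers = {cust_id: name for _, cust_id, name, _ in orders_denorm}
orders = [(oid, cust_id, amt) for oid, cust_id, _, amt in orders_denorm]

print(customers)  # {'C100': 'Acme Corp', 'C200': 'Globex'}
print(orders)     # [(1, 'C100', 250.0), (2, 'C100', 99.0), (3, 'C200', 10.0)]
```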
Confidential, Boston MA
Data Analyst
Responsibilities:
- Gathered business requirements from account managers through interviews, surveys, prototyping, and observation, and from the UI (User Interface) of the existing Broker Portal system.
- Conducted JAD sessions with management, vendors, users and other stakeholders for open and pending issues to develop specifications.
- Played an active role in analyzing the business requirements, by participating in sessions with business teams and other stake holders.
- Involved in designing the data warehouse, based on the requirements analyzed
- Worked extensively with SQL scripts, Stored Procedures and T-SQL programs.
- Used Sybase Power Designer tool for dimensional data warehouse designs.
- Created database objects, installed and deployed database, and integrated high-level business rules with code.
- Wrote several SQL scripts for massive data extracts and managed the data conversion and extraction process.
- Involved in performance tuning of the database, which includes indexing and optimizing SQL statements.
- Developed a data model (star schema) for the sales data mart using ERwin tool.
- Involved in installation and configuration of SQL Server 2000.
- Identified source systems, their connectivity, and related tables and fields, and ensured data suitability for mapping.
- Used Aggregator stages to sum the key performance indicators used in the decision support system.
- Developed data mapping documents between Legacy, Production, and User Interface Systems.
- Involved in creating job schedules to automate the ETL process.
- Conducted User Acceptance Testing on the application - resolved issues from the participants, prepared and submitted Test Analysis Reports and participated in Report Review
Environment: Windows NT 4.0, UNIX, Oracle 8i, SQL Server, ERwin, PowerDesigner, Sequential Files.
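The Aggregator-stage KPI roll-up mentioned above boils down to grouping extract rows by a key and summing a measure. A toy analogue with hypothetical field names:

```python
from collections import defaultdict

def sum_kpi(rows, key_field, measure_field):
    """Group rows by key_field and sum measure_field, Aggregator-stage style."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key_field]] += row[measure_field]
    return dict(totals)

# Invented extract rows, not real sales data.
extract = [
    {"region": "East", "sales": 120.0},
    {"region": "West", "sales": 75.5},
    {"region": "East", "sales": 30.0},
]
print(sum_kpi(extract, "region", "sales"))  # {'East': 150.0, 'West': 75.5}
```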
Confidential
Business Data Analyst
Responsibilities:
- Developed views for the UI and downstream systems to send the data (both normalized and denormalized views).
- Used Teradata SQL Assistant tool extensively to profile data and check mapping accuracy.
- Worked on decreasing the skew percentage in the production environment.
- Created Source Detail Design, Source to Target Mapping documents and Data Flow Design documents (DFD).
- Created and maintained metadata for all the data models.
- Worked closely with various business teams and IT analysts to gather requirements and transform them into data structures using modeling tools.
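One common way to quantify the table skew referred to above is to compare the most heavily loaded AMP against the average rows per AMP (Teradata's own skew-factor formula may differ in detail). The per-AMP counts below are fabricated for the example.

```python
def skew_percent(rows_per_amp):
    """Skew as how far the hottest AMP exceeds the average, in percent."""
    avg = sum(rows_per_amp) / len(rows_per_amp)
    return (max(rows_per_amp) / avg - 1) * 100

# Hypothetical distribution: one hot AMP holds 3x the average.
counts = [1000, 1020, 980, 3000]
print(round(skew_percent(counts), 1))  # 100.0
```

A high value signals a poorly chosen primary index; re-distributing on a more selective column brings the percentage down.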