Sr. Data Analyst Resume
Indianapolis, IN
SUMMARY:
- Solid experience in system analysis, design, development, and implementation of client-server applications using different database technologies.
- Involved in various phases of the Software Development Life Cycle, i.e. Requirement Analysis, Design, Implementation and Testing.
- Familiar with Rational Unified Process, Agile and Iterative methodologies.
- 8+ years of experience in creating Stored Procedures, Packages, Functions, Triggers, Materialized Views, Cursors and other database objects using SQL and PL/SQL.
- Experience in creating DDL, DML and Transaction queries in SQL.
- Experience in performance tuning of SQL queries and stored procedures using partitioning, Explain Plan, and SQL Trace.
- Used Tableau to present BOB data in user-friendly dashboards with drill-down to transaction details, driven by ad-hoc user requests.
- Strong working knowledge of Oracle 11g/10g/9i/8i, SQL, PL/SQL (procedures, packages, functions, database triggers), Teradata
- Experienced with XML-related technologies such as XSL and XSLT.
- Expertise in developing SQL & PL/SQL code through procedures, functions, packages, triggers, views and indexes to implement the business logic of the database.
- Experience in working with Oracle Forms and Oracle Reports.
- Experience in loading data, troubleshooting, Debugging mappings, performance tuning of procedures and SQL queries.
- Expertise in operation of various database objects, such as Tables, Views, Indexes, Constraints, Materialized Views etc.
- Experienced in management across multiple industries with a focus on Business Intelligence, Business Analysis and Leadership
- Involved in Data modeling and design of data warehouse in star schema methodology with conformed and granular dimensions and FACT tables.
- Extensive experience in data integration using web services (WSDL, XML).
- Experience in design of Data Warehouse/Data Mart, Star Schema, Snowflake Schema, ODS, ETL processes, Data Requirement Analysis, Data Modeling & ER Diagrams, Development, Testing, Documentation and Implementation of Business Applications.
- Developed and maintained data dictionary to create metadata reports for technical and business purpose.
- Involved in the Basel II Advanced Measurement Approach (AMA), which allowed Freddie Mac to account for operational risk when assessing capital adequacy as part of the Basel II compliance project.
- Good understanding of Oracle.
- Gathered and documented Master Data Management (MDM) application, conversion and integration requirements.
- Familiar with data warehousing tools; used ETL tools such as Informatica 8.x/7.x to extract and load data.
- Strong experience in design and development of ETL process from Staging to Data warehouse and to Data mart.
- Created Metadata related tables on the database.
- Experience in Data modeling using Relational (Normalized), Dimensional Data Modeling (De-Normalized), Star Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, Erwin 4.x.
- Experience in reporting using Crystal Reports 8.0/9.0/10.
- Experience in Unit Testing, System Testing and User Acceptance Testing, creating test cases using Mercury Testing Tools
- Experienced in working with OLTP database, OLAP database and data warehouses.
- Excellent communication skills, team player, quick learner and self-motivated.
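The star-schema and dimensional-modeling experience listed above can be illustrated with a minimal sketch: a conformed dimension keyed by a surrogate key, a fact table referencing it, and a drill-down aggregate query. All table and column names here are invented for illustration only, using SQLite as a stand-in for Oracle/Teradata.

```python
import sqlite3

# Minimal star-schema sketch: one conformed dimension and one fact table.
# Table/column names are hypothetical, not from any real project.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,   -- surrogate key
        customer_id  TEXT,                  -- natural/business key
        region       TEXT
    )""")
cur.execute("""
    CREATE TABLE fact_sales (
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        sale_amount  REAL
    )""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "C100", "Midwest"), (2, "C200", "East")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 150.0), (1, 50.0), (2, 75.0)])

# Typical drill-down query: aggregate facts by a dimension attribute.
cur.execute("""
    SELECT d.region, SUM(f.sale_amount)
    FROM fact_sales f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.region
    ORDER BY d.region""")
result = cur.fetchall()
print(result)  # [('East', 75.0), ('Midwest', 200.0)]
```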
TECHNICAL SKILLS:
Databases: Oracle 11g/10g, SQL Server 2000/2005, MS Access
Database Tools: SQL*PLUS, SQL*Loader, Metadata, SQL Developer, Teradata 5.0, TOAD 9.5
Data Modeling Tools: ERWIN, ER Studio, MS Visio 2003
Reporting Tools: OBIEE, Crystal Reports 8.0/9.0/10
Standard and Codes: HIPAA 4010A1/5010, ICD-10, ICD-9, ANSI X12, HL7, CPT and CMS form.
ETL Tools: Informatica Power Center 8.x, 7.1.x
Languages: C, C++, Pro*C, JAVA, COBOL, SQL, PL/SQL, T-SQL, UNIX Shell Scripting, XSLT
Testing Tools: WinRunner, LoadRunner, TestDirector, QTP, Rational ClearQuest
Version Control: Subversion, PVCS, CVS, ClearCase
PROFESSIONAL EXPERIENCE:
Confidential, Indianapolis, IN
Sr. Data Analyst
Responsibilities:
- Gathered requirements and designed technical specification documents. Documented the business requirements for the Cognos BI reports and derived Cognos report requirements from existing report templates.
- Responsible for importing, cleaning, transforming, validating and modeling data to understand or draw conclusions from it for decision-making purposes.
- Worked on maintenance of Facets 4.71 by testing and implementing Facets patches as per the COSMOS tickets that are loaded in TFS.
- Installed, configured, maintained Tableau Environments; DEV, QA, and Production.
- Extensive knowledge of Microsoft technologies including Web Services; proficient in best software design practices including Service-Oriented Architecture (SOA), Design Patterns and UML
- Extensively worked on creating presentations, graphic organizers and detailed reports on OLTP data.
- Created mapping documents, ETL technical specifications and various documents related to data migration
- Wrote complex SQL queries to test the DataStage ETL process.
- Resolved issues raised by the ETL and testing teams during the development and testing phases of the project using QC (Quality Center).
- Experience in testing integrated software web services through SoapUI and XML
- Extensively used PL/SQL in writing database packages, stored procedures, functions and triggers in Oracle 10g.
- Created mock up diagrams using MS VISIO and provided the screenshots replicating the changes to the User Interface (UI) as per the requirements.
- Defined functional requirements for the Confidential BRE/CRM call center implementation utilizing the Rational Unified Process (RUP) & use case modeling (UML) BMI methodology; mapped legacy data to Siebel & Oracle DBs
- Defined requirements for PCI compliant systems, credit card authorizations, POS, supply chain, and/or other retail systems.
- Integrated RequisitePro with ClearQuest and Rose to provide all teams visibility and maintain traceability among requirements, use cases and change requests
- Involved in Data Mining, Profiling and Data Modeling.
- Performed Data Migration from legacy SQL and Netezza tables to Teradata, and Data Mapping for easy and faster access to information.
- Worked on business and technical metadata and maintained naming standards
- Created custom SQL connections and calculated fields within Tableau to facilitate automated production of data visualizations.
- Worked extensively to socialize and execute the re-engineering of E-commerce wide requirements management structure using documentation tools such as Quality Center and Rational Requisite Pro.
- Designed Data Flow Diagrams (DFD’s), Entity Relationship Diagrams (ERD’s), and web page mock ups using modeling tools.
- Managed product Master Data Management (MDM) changes to support the standardization of downstream processes and the migration of multiple enterprise resource planning (ERP) instances to a single global instance.
- Followed RUP and AGILE development methodology and adhered to strict quality standards in requirement gathering.
- Created the conceptual model for the data warehouse using Erwin data modeling tool.
- Participate in daily SCRUM meeting and engage with development team(s) to communicate requirements, coordinate plans and progress, and assist in developing prototypes to validate user requirements, using SCRUM and Agile methodology, to define product backlog and sprint backlog.
- Supported the current implementation for Manhattan WMS-including the design, development, testing, and go live support.
- Used Teradata as a source and a target for a few mappings. Worked with Teradata loaders within Workflow Manager to configure FastLoad and MultiLoad sessions.
- Wrote SQL for data profiling and developed data quality reports on the Commerce source systems.
- Developed business test cases to test the final extract outputs.
- Responsible for meetings with users and stakeholders to identify problems, resolve issues and improve the process to ensure a stable and accurate solution.
- Investigated and resolved issues within Supply Chain module as well as Manhattan WMS (Warehouse Management System) software.
- Created a high-level requirements document to capture all requirements and define the in-scope and out-of-scope items for the project in one central document, signed off by the stakeholders.
- Developed a Project Technology Summary (PTS) to establish a roadmap for development to support Basel II compliance as it relates to the Basel II Data Mart Integration project for Commerce source systems
- Conducted a gap analysis between the fields required for Basel II and the fields available in the historical Commerce source system, then obtained sign-off on the gap analysis document from the business (Retail Risk) to avoid disturbances to the project scope
- Developed the key artifacts, the source-to-target mapping documents, and performed a source-to-target mapping analysis for each source system based on the Basel II data requirements.
- Developed packages for processing data in the staging tables according to the client's requirements.
- Generated SQL and PL/SQL scripts to create and drop database objects including tables, views, primary keys, indexes, constraints, packages, sequences and synonyms.
- Implemented Dimensional modeling in the database and created appropriate views along with DBA to acquire the Metadata for the given requirement.
- Managed supply chain and procurement of all domestic and national supplies.
- Worked on XML based web services to identify and resolve integration issues.
- Created a test environment using SQL & PL/SQL scripts and optimized critical queries to eliminate full table scans and reduce disk I/O.
- Developed PL/SQL programs, stored procedures for data loading and data validations.
- Extensively worked on dimensional, relational and object oriented modeling techniques.
- Developed strategies with QC (Quality Center) for stress testing and UAT (User Acceptance Testing)
- Facilitated JAD sessions with management, development team, users and other stakeholders to refine functional requirements.
- Involved in Data modeling and design of data warehouse in star schema methodology.
- Converted various PL/SQL statements into stored procedures, thereby reducing the number of database accesses.
- Created surrogate key tables to map the required dimensions.
- Transferred data from text files and Access databases using BCP and DTS; performed data mapping and transformations while loading data using VBScript
- Created hash tables used for referential integrity checks and other purposes while transforming the data to represent valid information.
- Created and managed project templates, use case project templates, requirement types and traceability relationships
Environment: Oracle 11g, PL/SQL, SQL*Loader, SQL Reports, Teradata, DW, ODS, DB2, Metadata, XML, XSLT, Cognos 8 BI, MS Office, MS Visio, MS PowerPoint, ERWIN, WSDL, Altova XMLSpy.
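The data-profiling work described in the bullets above (SQL profiling behind data quality reports) typically reduces to per-column row, null, and distinct counts. The sketch below shows that pattern with invented table and column names, using SQLite in place of the actual source databases.

```python
import sqlite3

# Hypothetical data-profiling sketch: row counts, null counts, and distinct
# counts per column -- the kind of SQL behind a basic data quality report.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_accounts (acct_id TEXT, balance REAL, region TEXT)")
cur.executemany("INSERT INTO src_accounts VALUES (?, ?, ?)",
                [("A1", 100.0, "East"), ("A2", None, "East"),
                 ("A3", 250.0, None), ("A1", 100.0, "East")])

profile = {}
for col in ("acct_id", "balance", "region"):
    cur.execute(f"""
        SELECT COUNT(*),
               COUNT(*) - COUNT({col}),   -- COUNT(col) skips NULLs
               COUNT(DISTINCT {col})
        FROM src_accounts""")
    rows, nulls, distinct = cur.fetchone()
    profile[col] = {"rows": rows, "nulls": nulls, "distinct": distinct}

print(profile["balance"])  # {'rows': 4, 'nulls': 1, 'distinct': 2}
```

A report like this quickly surfaces candidate quality issues (unexpected nulls, duplicate natural keys) before mapping a source into the warehouse.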
Confidential, Cincinnati, OH
Senior Data Analyst
Responsibilities:
- Led a project team in documenting the process and preparing the data mapping document; analyzed incoming feeds, conducted data profiling, and mapped relevant fields to the target repository, incorporating business validation rules to integrate the data.
- Designed and published visually rich and intuitively interactive Tableau workbooks and dashboards for executive decision making.
- Responsible for importing, cleaning, transforming, validating and modeling data to understand or draw conclusions from it for decision-making purposes.
- Extensively worked on creating presentations, graphic organizers and detailed reports on OLTP data.
- Created mapping documents, ETL technical specifications and various documents related to data migration
- Wrote complex SQL queries to test the DataStage ETL process.
- Resolved issues raised by the ETL and testing teams during the development and testing phases of the project using QC (Quality Center).
- Integrated RequisitePro with ClearQuest and Rose to provide all teams visibility and maintain traceability among requirements, use cases and change requests
- Involved in data visualizations using Tableau with high-level objectives provided by management, exercising judgment in presentation and in user experience.
- Involved in Data Mining, Profiling and Data Modeling.
- Wrote SQL for data profiling and developed data quality reports on the Commerce source systems.
- Developed business test cases to test the final extract outputs.
- Responsible for meetings with users and stakeholders to identify problems, resolve issues and improve the process to ensure a stable and accurate solution.
- Coordinated with the Data warehouse team (Government Med D IT) as part of the CMS reporting
- Code Review of mappings from Source to Staging and Staging to Warehouse. Modified the existing code of SQL queries for data warehousing tests, data extracts and executive information requests as per the new requirements.
- Designed end-to-end ETL process to load the data from client extracts into data warehouse.
- Provided technical support to multiple data warehouse development teams.
- Analyzed the integrated metadata to obtain information on data usage, end-to-end change impact analysis, and report-to-source data lineage.
- Created the conceptual model for the data warehouse using Erwin data modeling tool.
- Performed CDC (Change Data Capture) along with data replication and MDM (Master Data Management) capabilities.
- Responsible for identifying data quality issues and downstream impact to the data warehouse
- Developed the key artifacts, the source-to-target mapping documents, and performed a source-to-target mapping analysis for each source system based on the Basel II data requirements.
- Suggested measures and recommendations to improve current application performance with the aid of SCRs (Small Change Requests)
- Installed, configured, maintained Tableau Environments; DEV, QA, and Production.
- Worked on XML based web services to identify and resolve integration issues.
- Created a test environment using SQL & PL/SQL scripts and optimized critical queries to eliminate full table scans and reduce disk I/O.
- Created, documented and maintained logical and physical database models in compliance with enterprise standards and maintained corporate metadata definitions for enterprise data stores within a metadata repository.
- Developed PL/SQL programs, stored procedures for data loading and data validations.
- Extensively worked on dimensional, relational and object oriented modeling techniques.
- Developed strategies with QC (Quality Center) for stress testing and UAT (User Acceptance Testing)
- Facilitated JAD sessions with management, development team, users and other stakeholders to refine functional requirements.
- Involved in Data modeling and design of data warehouse in star schema methodology.
- Converted various PL/SQL statements into stored procedures, thereby reducing the number of database accesses.
- Created surrogate key tables to map the required dimensions.
- Transferred data from text files and Access databases using BCP and DTS; performed data mapping and transformations while loading data using VBScript
- Created hash tables used for referential integrity checks and other purposes while transforming the data to represent valid information.
- Created and managed project templates, use case project templates, requirement types and traceability relationships
Environment: Oracle 11g, PL/SQL, SQL*Loader, SQL Reports, Teradata, DB2, Metadata, Cognos 8 BI, MS Office, MS Visio, ERWIN, XML, WSDL, Altova XMLSpy.
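The CDC (Change Data Capture) bullet in the role above can be sketched as a snapshot diff: compare an incoming staging extract against the current warehouse copy to find inserts and updates. Real CDC tooling usually reads transaction logs rather than diffing tables; the names below are hypothetical and SQLite stands in for the actual databases.

```python
import sqlite3

# Simplified change-data-capture sketch: diff an incoming staging snapshot
# against the current warehouse copy to find new and changed rows.
# (Production CDC typically reads database logs; this is illustration only.)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE wh_member (member_id TEXT PRIMARY KEY, plan TEXT)")
cur.execute("CREATE TABLE stg_member (member_id TEXT PRIMARY KEY, plan TEXT)")
cur.executemany("INSERT INTO wh_member VALUES (?, ?)",
                [("M1", "HMO"), ("M2", "PPO")])
cur.executemany("INSERT INTO stg_member VALUES (?, ?)",
                [("M1", "HMO"), ("M2", "EPO"), ("M3", "PPO")])

# New rows: present in staging but not in the warehouse.
cur.execute("""
    SELECT s.member_id FROM stg_member s
    LEFT JOIN wh_member w ON w.member_id = s.member_id
    WHERE w.member_id IS NULL""")
inserts = [r[0] for r in cur.fetchall()]

# Changed rows: same key, different attribute value.
cur.execute("""
    SELECT s.member_id FROM stg_member s
    JOIN wh_member w ON w.member_id = s.member_id
    WHERE w.plan <> s.plan""")
updates = [r[0] for r in cur.fetchall()]

print(inserts, updates)  # ['M3'] ['M2']
```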
Confidential, Thousand Oaks, CA
Data Analyst
Responsibilities:
- Gathered requirements and designed technical specification documents. Documented the business requirements for the Cognos BI reports and derived Cognos report requirements from existing report templates.
- Identified Use cases from the requirements and wrote Use Case Specifications
- Created business process workflow diagrams (Activity diagrams)
- Linked business processes to organizational objectives, performed critical path analysis, and identified opportunities for business process improvement
- Created mapping documents, ETL technical specifications and various documents related to data migration
- Wrote complex SQL queries to test the Informatica ETL process.
- Developed PL/SQL programs, stored procedures for data loading and data validations.
- Involved in testing the Argus Safety system 5.0/5.1.
- Involved in testing Cognos Custom reports and Out of the box reports using Argus Insight 5.0/5.1.
- Developed complex SQL Queries to validate the data in the Cognos Custom reports against Safety Database.
- Coordinated with the offshore team in the testing process of the SDA (Signal Detection and Alerts) & CBS (Corporate Business Support) Cognos custom reports.
- Performed end to end ETL testing of Custom Tables which were developed for Cognos Custom reports.
- Coordinated the testing process for Out of the Box reports (Argus Insight reports).
- Coordinated the testing process for various periodic reports such as CTPR, NDA, IND, and PSUR.
- Developed complex SQL queries to validate the data in periodic reports as part of Aggregate Testing
- Involved in regression testing of Argus Safety (Periodic) and Insight reports (Cognos custom and out-of-the-box) to verify that the hot fix defects had been fixed in the system (Hot Fix 1 and Hot Fix 2 testing)
- Assisted in the configuration of products, licenses and studies in Argus Safety.
- Tested various BizTalk interfaces developed for the automation of product, license and study configurations.
- Familiar with EDC, eCRF, E2B and other data exchanges
- Validated the systems per GxP and SOX compliance requirements. Provided support for post-production issues, fixing and testing them.
Environment: Oracle 11g, PL/SQL, SQL Loader, SQL Reports, Teradata, DB2, Cognos 8 BI, MS Office, MS Visio, ERWIN, EDM Teams, EDM Quality, Argus Safety, Argus Insight.
Confidential, Columbus, OH
Data Analyst
Responsibilities:
- Involved in interacting with the end-user (client) to gather business requirements.
- Participated in analyzing and modeling the requirements for the logical and physical design of the database using star schema and normalization techniques.
- Involved in Data Mining, Profiling and Data Modeling.
- Developed complex mappings to load data from Source System (Oracle) and flat files
- Built re-usable mapplets using Informatica Designer.
- Used Informatica Power Center Workflow manager to create sessions and workflows to run the logic embedded in the mappings.
- Developed and modified mappings for Extraction, Staging, Slowly Changing Dimensions of Type1, Type2, Type3, Facts and Summary tables duly incorporating the changes to address the performance issues as stated in the Re-architecture specifications.
- Developed mappings for Slowly Changing Dimensions of Type1, Type2, Facts and Summary tables using all kinds of transformations.
- Worked on Informatica Power Center 7.1.1 - Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, and Reusable Transformations.
- Wrote PL/SQL stored procedures and functions for the Stored Procedure transformation in Informatica.
- Defined and represented entities, attributes and joins between the entities.
- Implemented PL/SQL scripts in accordance with the necessary Business rules and procedures.
- Generated SQL and PL/SQL scripts to create and drop database objects including tables, views, primary keys, indexes, constraints, packages, sequences and synonyms.
- Extensively developed PL/SQL Procedures, Functions, Triggers and Packages.
- Wrote UNIX shell scripts to automate loading files into the database using crontab.
- Developed batch files to automate or schedule tasks.
- Provided support for the development, pre-production and production databases.
Environment: Agile, Oracle 10g (SQL, PL/SQL), SQL*Loader, Forms 10g, TOAD 7.4, HTML/DHTML, Pro*C, JavaScript, Windows NT/2000, Crystal Reports 10, Informatica Power Center 7.1.3, Erwin 4.0, UNIX.
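The Slowly Changing Dimension mappings described in the role above (Type 2 in particular) follow one core rule: when a tracked attribute changes, expire the current dimension row and insert a new version. Informatica implements this with Lookup and Update Strategy transformations rather than hand-written code; the sketch below, with hypothetical names and SQLite as a stand-in, only conveys the logic.

```python
import sqlite3

# Hypothetical SCD Type 2 sketch: expire the current dimension row and
# insert a new version when a tracked attribute changes.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        product_id  TEXT,      -- natural key
        category    TEXT,      -- tracked attribute
        eff_date    TEXT,
        end_date    TEXT,      -- NULL marks the current row
        is_current  INTEGER
    )""")
cur.execute("INSERT INTO dim_product (product_id, category, eff_date, end_date, is_current) "
            "VALUES ('P1', 'Tools', '2010-01-01', NULL, 1)")

def apply_scd2(product_id, new_category, as_of):
    """If the tracked attribute changed, close the old version and add a new one."""
    cur.execute("SELECT product_key, category FROM dim_product "
                "WHERE product_id = ? AND is_current = 1", (product_id,))
    row = cur.fetchone()
    if row and row[1] != new_category:
        cur.execute("UPDATE dim_product SET end_date = ?, is_current = 0 "
                    "WHERE product_key = ?", (as_of, row[0]))
        cur.execute("INSERT INTO dim_product (product_id, category, eff_date, end_date, is_current) "
                    "VALUES (?, ?, ?, NULL, 1)", (product_id, new_category, as_of))

apply_scd2("P1", "Hardware", "2011-06-15")
cur.execute("SELECT category, is_current FROM dim_product "
            "WHERE product_id = 'P1' ORDER BY product_key")
history = cur.fetchall()
print(history)  # [('Tools', 0), ('Hardware', 1)]
```

Keeping the full row history this way is what lets fact tables join to the dimension version that was current when each transaction occurred.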
Confidential, Memphis, TN
PL/SQL Developer
Responsibilities:
- Interacting with the end-user (client) to gather business requirements.
- Converted and loaded data from flat files to temporary tables in Oracle database using SQL*Loader.
- Extensively used PL/SQL in writing database packages, stored procedures, functions and triggers in Oracle 10g.
- Used Ref Cursors and Collections with BULK BIND and BULK COLLECT to access complex data resulting from joins of a large number of tables when extracting data from the data warehouse.
- Developed SQL scripts involving complex joins for reporting purposes.
- Fine-tuned SQL queries and PL/SQL blocks for maximum efficiency and fast response using Oracle hints and Explain Plan.
- Used Teradata as a source and a target for a few mappings. Worked with Teradata loaders within Workflow Manager to configure FastLoad and MultiLoad sessions.
- Migrated MS Access databases to SQL Server 2005.
- Loaded data from MS Access databases to SQL Server 2005 using SSIS (creating staging tables and then loading the data).
- Developed various SQL scripts and anonymous blocks to load data into SQL Server 2005
- Created procedures, functions and views in SQL Server 2005.
- Developed ad hoc reports using Crystal Reports XI for performance analysis by business users.
- Exported reports into various formats like XML, PDF, HTML, and EXCEL using Crystal Reports XI.
- Involved extensively in Unit testing, integration testing, system testing and UAT.
- Participated in weekly end-user meetings to discuss data quality, performance issues, ways to improve data accuracy, and new requirements.
Environment: Agile, Oracle 10g, PL/SQL, SQL*Loader, TOAD 9.5, DB2, Crystal Reports 11, Teradata, SQL Server 2005, SSIS, MS Visual Studio
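The flat-file-to-staging loads mentioned in this role (SQL*Loader into Oracle temporary tables, SSIS into SQL Server staging tables) share one pattern: parse delimited records, then bulk-insert them into a staging table for validation. The sketch below shows that pattern with an invented file layout and table name, using Python's csv module and SQLite purely for illustration.

```python
import csv
import io
import sqlite3

# Sketch of a flat-file-to-staging load, analogous to what a SQL*Loader
# control file or an SSIS data flow does. File layout and names are
# hypothetical.
flat_file = io.StringIO("emp_id|name|salary\n101|Smith|55000\n102|Jones|61000\n")

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_employee (emp_id INTEGER, name TEXT, salary REAL)")

# Parse the pipe-delimited extract and convert fields to target types.
reader = csv.DictReader(flat_file, delimiter="|")
rows = [(int(r["emp_id"]), r["name"], float(r["salary"])) for r in reader]
cur.executemany("INSERT INTO stg_employee VALUES (?, ?, ?)", rows)

# Row-count and control-total check, as done when validating a load.
cur.execute("SELECT COUNT(*), SUM(salary) FROM stg_employee")
summary = cur.fetchone()
print(summary)  # (2, 116000.0)
```

Comparing the staged row count and a control total against the source file is the usual first validation step before moving data onward from staging.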