
Data Analyst Resume


Mount Laurel, NJ

SUMMARY

  • 8 years of Experience in Data Analysis, Data Validation, Data Modeling, Data Profiling, Data Verification, Data Mapping, Data Loading and Data Quality analysis
  • Good experience in the analysis, design, development, testing and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, BI, OLAP and Client/Server applications
  • Experience with Software Development Life Cycle (SDLC) with good working knowledge of testing methodologies, disciplines, tasks, resources and scheduling
  • Created Data mapping, logical data modeling coordinating with data architects, created class diagrams and ER diagrams and used SQL queries to filter data within the Oracle database
  • Experience in Data Warehouse applications testing using Informatica Power Center and Data stage on multiple platforms
  • Well versed with various software development methodologies - Waterfall, Agile, RUP, Iterative and Extreme Programming.
  • Extensive experience in reviewing Business Requirement Documents, Software Requirement Documents and preparing Test Cases, Test scripts and Execution.
  • Experience with Agile Methodology.
  • Good Knowledge on Business intelligence, OLAP, Dimensional modeling, Star and Snowflake schema, extraction, transformation and loading (ETL) process
  • Ability to develop complex SQL scripts and stored procedures for data validation testing.
  • Experience in data retrieval methods using Universes, Personal Data files, Stored Procedures, and free hand SQL. Automated and scheduled the Informatica jobs using UNIX Shell Scripting.
  • Experience in testing business reports developed in Cognos
  • Experience in Black box testing with a complete QA cycle - from testing, defect logging and verification of fixed bugs
  • Extensive experience in Functional testing, Integration/System testing, Regression testing and User Acceptance testing.
  • Well versed in GUI application testing, Database testing and Front-end testing.
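The SQL-driven data validation work listed above can be sketched minimally. The snippet below is a hypothetical illustration only: SQLite stands in for Oracle, and all table and column names are invented.

```python
# Minimal sketch of SQL-based data validation: compare source vs. target
# row counts and flag NULLs in a mandatory column. SQLite stands in for
# Oracle; tables and columns are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_customers (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE tgt_customers (id INTEGER, name TEXT)")
rows = [(1, "Ann"), (2, "Bob"), (3, None)]
cur.executemany("INSERT INTO src_customers VALUES (?, ?)", rows)
cur.executemany("INSERT INTO tgt_customers VALUES (?, ?)", rows)

# Check 1: row counts must match between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_customers").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_customers").fetchone()[0]
assert src_count == tgt_count

# Check 2: flag NULLs in a mandatory column (a data-quality rule).
null_names = cur.execute(
    "SELECT COUNT(*) FROM tgt_customers WHERE name IS NULL").fetchone()[0]
print(f"rows={tgt_count}, null names={null_names}")
```

In practice such checks (row counts, mandatory-column nulls, referential integrity) are parameterized per mapping and run after every load.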

TECHNICAL SKILLS

Data Warehousing: Informatica 9, DataStage 8.x, SSIS

Reporting Tools: Business Objects XIR3, Cognos 10.2/9.0 Suite, SSAS, SSRS, MicroStrategy

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin

Testing Tools: ALM, HP Quality Center, Rational Tools, QTP

RDBMS: Oracle 11g/10g/9i, MS SQL Server 2012/2008, UDB DB2, Sybase, Teradata 13, MS Access

Programming: UNIX Shell Scripting

Web Technologies: JavaScript, HTML, .NET, Java, J2EE, XML, XSD, XSLT

Environment: UNIX, MVS, HP-UX, IBM AIX 4.2/4.3, Hyperion, Novell NetWare, Win 3.x/95/98

PROFESSIONAL EXPERIENCE

Confidential, Mount Laurel NJ

Data Analyst

Responsibilities:

  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center 9.2
  • Involved in data governance; defined data mapping from source to target (for all three heritage systems)
  • Gathered metadata and created data analysis repository (spreadsheet), which includes following metadata information: data source, hierarchy, fact, dimension, data attribute, data integration, data model, data acquisition etc
  • Worked collaboratively with multiple business entities and successfully implemented new business processes and established data governance
  • Write complex SQL queries against the target Oracle database to generate reports and compare against legacy Access database reports
  • Extensively involved with data quality and data governance solutions, including platforms and supporting data processes
  • Defined data requirements and elements used in XML transactions.
  • Performed Data Validation with Data profiling
  • Involved in the testing of Data Mart using Power Center.
  • Identified and documented additional data cleansing needs and consistent error patterns that could be averted by modifying ETL code.
  • Extensively used Informatica Power Center for Extraction, Transformation and Loading process.
  • Extensively tested several ETL Mappings developed using Informatica.
  • Extensively used Teradata load utilities FastLoad, MultiLoad and FastExport to extract, transform and load data into the Teradata data warehouse
  • Worked in an Agile environment with Scrum.
  • Responsible for different Data mapping activities from Source systems to Teradata.
  • Queried Teradata Database and validated the data using SQL Assistant.
  • Effectively distributed responsibilities, arranged meetings and communicated with team members in all phases of the project.
  • Used import and export facilities of the application to download/upload XMLs of failed test cases so as to re-verify.
  • Scheduled jobs using Autosys, automated them to run at specific times, and automated the reports.
  • Wrote UNIX scripts to perform certain tasks and assisted developers with problems and SQL optimization.
  • Configured QuickTest Pro with Quality Center and maintained the project information in Quality Center.
  • Extensively used Autosys for automating job scheduling on a daily, weekly, bi-weekly and monthly basis with proper dependencies.
  • Wrote complex T-SQL and SQL queries using joins, subqueries and correlated subqueries
  • Performed Unit testing and System Integration testing by developing and documenting test cases in Quality Center.
  • Validated the report generated using SSRS using T-SQL queries.
  • Did Unit testing for all reports and packages.
  • Designed & developed various Ad hoc reports for different teams in Business (Teradata and Oracle SQL, MS ACCESS, MS EXCEL)
  • Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling of data.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Tested complex objects to the universe to enhance the report functionality.
  • Tested Canned/Ad-hoc reports using SSRS Reporter functionalities like Cross Tab, Master Detail and Formulas, Slice and Dice, Drill Down, variables, filters, conditions, breaks, sorting, @Functions, Alerts, Cascading Prompts and User Defined Objects.
  • Responsible for migrating the code changes from development environment to SIT, UAT and Production environments.
  • Validated cube and query data from the reporting system back to the source system.
  • Tested analytical reports using Analysis Studio.
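One typical check from this role, comparing a migrated report against its legacy counterpart (see the complex-SQL-comparison bullets above), can be sketched as follows. All names are hypothetical and SQLite stands in for Oracle/Access.

```python
# Sketch of report-reconciliation testing: run the same aggregate query
# against the legacy and migrated tables and diff the results.
# Hypothetical tables; SQLite in place of Oracle and MS Access.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE legacy_sales (region TEXT, amount REAL)")
cur.execute("CREATE TABLE target_sales (region TEXT, amount REAL)")
rows = [("East", 100.0), ("East", 50.0), ("West", 75.0)]
cur.executemany("INSERT INTO legacy_sales VALUES (?, ?)", rows)
cur.executemany("INSERT INTO target_sales VALUES (?, ?)", rows)

query = ("SELECT region, ROUND(SUM(amount), 2) FROM {} "
         "GROUP BY region ORDER BY region")
legacy = cur.execute(query.format("legacy_sales")).fetchall()
target = cur.execute(query.format("target_sales")).fetchall()

# Any differing (region, total) pair is a defect to log.
mismatches = [(l, t) for l, t in zip(legacy, target) if l != t]
print("mismatches:", len(mismatches))
```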

Environment: Informatica 9, Flat files, MS SQL Server 2012, Oracle 11g, SQL, PL/SQL, IBM DB2, AGILE, Teradata 13, Teradata SQL Assistant, SSRS, SSIS, HP ALM, HP Quality Center 10, Autosys, XML, Toad, Unix Shell Scripting

Confidential, Salt Lake City, Utah

Data Analyst

Responsibilities:

  • Involved in data governance; defined enterprise-wide data modeling and model maintenance standards; resolved a text truncation error during metadata distribution
  • Involved in creating source target mappings based on the Technical Design requirements
  • Preparation of technical specifications and Source to Target mappings
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity
  • Analyzed the source data to quantify benefits and improve ongoing data quality
  • Created test data for testing specific ETL flow.
  • Performed data analysis and data profiling using SQL and Informatica Data Explorer on various source systems, including Oracle and Teradata.
  • Understood the source systems used by State Farm Insurance to populate the data mart.
  • Worked on defining mapping requirements for the source systems used by State Farm Insurance to populate the data mart.
  • Tested and developed the mapping for extracting, cleansing, transforming, integrating, and loading data using Informatica.
  • Worked on Informatica to resolve production issues; performed testing and debugging of mappings and reports; extracted test data from tables and loaded it into SQL tables.
  • Queried Oracle using SQL tools and queried Teradata using Teradata SQL Assistant.
  • Wrote SQL queries and make-table queries to profile and analyze the data in MS Access.
  • Formulated methods to perform positive and negative testing against requirements.
  • Performed Manual Testing of the application Front-End and Functionality. Identified the critical test scripts to be automated.
  • Performed Verification, Validation, and Transformations on the Input data (Text files, XML files) before loading into target database.
  • Monitored the Informatica workflows using Power Center monitor. Checked session logs in case of aborted/failed sessions
  • Performed functional, data validation, integration, regression and user acceptance testing.
  • Used TOAD, DB Artisan tools to connect to Oracle Database to validate data that was populated by ETL applications
  • Worked with business team to test the reports developed in Cognos.
  • Involved in testing UNIX Korn shell scripts that run various ETL scripts to load data into the target database (Oracle).
  • Worked on Quality Center to log defects and track resolution till the closing of defect after retesting.
  • Created test cases, executed test scripts and track and report system defects using Quality Center.
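The positive/negative testing against requirements described above can be illustrated with a small record validator of the kind applied to text/XML input before loading. The rules and field names below are hypothetical.

```python
# Sketch of positive/negative testing: a hypothetical validator enforcing
# basic load rules (positive integer id, non-blank name), exercised with
# one passing case and two failing cases.
def validate_record(rec):
    """Return True if a record meets the basic load rules."""
    return (
        isinstance(rec.get("id"), int)
        and rec["id"] > 0
        and bool(rec.get("name", "").strip())
    )

# Positive case: a well-formed record should pass.
assert validate_record({"id": 7, "name": "Ann"})
# Negative cases: a non-positive id or blank name should be rejected.
assert not validate_record({"id": -1, "name": "Ann"})
assert not validate_record({"id": 7, "name": "  "})
print("positive and negative checks passed")
```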

Environment: Informatica 9, Java, Cognos 10.2 Series, SQL, SQL Server 2000/2005, Teradata (BTEQ), Teradata SQL Assistant, UNIX, Shell Scripting, Rumba UNIX Display, HP Quality Center 10

Confidential, NC

Data Analyst

Responsibilities:

  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Developed test plans based on the test strategy; created and executed test cases based on the test plans and the ETL mapping document.
  • Wrote complex SQL queries for querying data against different databases for the data verification process.
  • Prepared the Test Plan and Testing Strategies for Data Warehousing Application.
  • Preparation of technical specifications and Source to Target mappings.
  • Extensively used SQL programming in backend and front-end functions, procedures, packages to implement business rules and security
  • Helped the Testing team by writing SQL statements to validate the reports
  • Used multiple data providers, Master/Detail, cross tab, Charts.
  • Created pivot tables in Excel by getting data from Teradata and Oracle.
  • Created several Ad-hoc reports in Business Objects XI R2 to validate the data.
  • Tested different types of reports, like Master/Detail, Cross Tab and Charts (for trend analysis).
  • Defects identified in the testing environment were communicated to the developers using the defect tracking tool Quality Center.
  • Responsible for different Data mapping activities from Source systems to Teradata
  • Developed scripts, utilities, simulators, data sets and other programmatic test tools as required to execute test plans.
  • Used import and export facilities of the application to download/upload XMLs of failed test cases so as to re-verify.
  • Scheduled jobs using Autosys, automated them to run at specific times, and automated the reports.
  • Wrote UNIX scripts to perform certain tasks and assisted developers with problems and SQL optimization.
  • Configured QuickTest Pro with Quality Center and maintained the project information in Quality Center.
  • Extensively used Autosys for automating job scheduling on a daily, weekly, bi-weekly and monthly basis with proper dependencies.
  • Wrote complex T-SQL and SQL queries using joins, subqueries and correlated subqueries
  • Tested a number of complex ETL mappings, mapplets and reusable transformations for daily data loads.
  • Created test cases for ETL mappings and design documents for production support.
  • Set up, monitored and used the Job Control System in Development/QA/Prod.
  • Extensively worked with flat file and Excel data sources; wrote scripts to convert Excel files to flat files.
  • Scheduled and automated jobs to run in a batch process.
  • Extensively used Informatica power center for Extraction, Transformation and Loading process (ETL).
  • Worked with the ETL group on understanding mappings for dimensions and facts.
  • Worked on issues with migration from development to testing.
  • Compared the actual result with expected results. Validated the data by reverse engineering methodology i.e. backward navigation from target to source.
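The reverse-engineering validation in the last bullet, backward navigation from target to source, reduces to an orphan-row check: every target row must trace back to a source row. A hypothetical sketch, with SQLite standing in for the real databases:

```python
# Sketch of target-to-source validation: a LEFT JOIN finds target rows
# with no matching source row (orphans). Tables are hypothetical;
# SQLite stands in for Oracle/Teradata.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER)")
cur.executemany("INSERT INTO src_orders VALUES (?)", [(1,), (2,), (3,)])
# Target has an extra row (order 4) that never existed in the source.
cur.executemany("INSERT INTO tgt_orders VALUES (?)", [(1,), (2,), (3,), (4,)])

orphans = cur.execute("""
    SELECT t.order_id
    FROM tgt_orders t
    LEFT JOIN src_orders s ON s.order_id = t.order_id
    WHERE s.order_id IS NULL
""").fetchall()
print("orphan target rows:", [r[0] for r in orphans])
```

Each orphan found this way is a defect candidate: data in the warehouse that cannot be traced back to any source record.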

Environment: SAS/Macros, SAS/ETL, UNIX Shell Scripting, Informatica Power Center 8.4 (Power Center Designer, Workflow Manager, Workflow Monitor), Business Objects XIR2, Quality Center 9.2, QTP, SQL*Loader, Oracle 8i, SQL Server, Teradata

Confidential, GA

SQL / BI / ETL Test Analyst

Responsibilities:

  • Wrote SQL Statements to extract Data from Tables and to verify the output Data of the reports
  • Performed back-end testing on the Oracle database by writing SQL queries.
  • Collected, analyzed and reported testing metrics, executed SQL queries
  • Built and maintained effective QA test lab environments
  • Tested several complex reports generated by Cognos including Dashboard, Summary Reports, Master Detailed, Drill Down and Score Cards.
  • Used Query Studio to test ad hoc reports
  • Developed SQL queries in Toad and DB Artisan to achieve the data transformations
  • Participated in various meetings and discussed Enhancement Request issues
  • Conducted Regression testing after the bugs have been fixed by the development team
  • Responsible for source control, versioning, and configuration management of test scripts, test results, defects
  • Analyzed the source data to quantify benefits and improve ongoing data quality
  • Created test data for testing specific ETL flow.
  • Performed data analysis and data profiling using SQL and Informatica Data Explorer on various source systems, including Oracle and Teradata.
  • Understood the source systems used by State Farm Insurance to populate the data mart.
  • Worked on defining mapping requirements for the source systems used by State Farm Insurance to populate the data mart.
  • Tested and developed the mapping for extracting, cleansing, transforming, integrating, and loading data using Informatica.
  • Worked on Informatica to resolve production issues; performed testing and debugging of mappings and reports; extracted test data from tables and loaded it into SQL tables.
  • Developed Test Strategy, Test plan, Test cases, and Test scripts
  • Created test plans and executed test cases using Rational Test Manager
  • Conducted Functionality, Security, and End to End testing
  • Reported bugs using Quality Center and generated the defect reports for review
  • Maintained the test traceability matrix
  • Performed Regression Testing on weekly builds.
  • Monitored the workflow transformations in Informatica work flow monitor
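A test traceability matrix like the one maintained above reduces, at its simplest, to a mapping from requirements to test cases plus a coverage-gap check. All identifiers below are hypothetical.

```python
# Sketch of a requirements-to-test-case traceability matrix with a
# coverage-gap check. Requirement and test-case IDs are hypothetical.
matrix = {
    "REQ-01": ["TC-101", "TC-102"],
    "REQ-02": ["TC-201"],
    "REQ-03": [],  # gap: no test case written yet
}

# A requirement with no mapped test cases is an untested requirement.
uncovered = sorted(req for req, cases in matrix.items() if not cases)
print("uncovered requirements:", uncovered)
```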

Environment: Oracle 9i, SQL*Plus, SQL, Quality Center, Rational Clear Quest, Clear Case
