
Data Analyst Resume


Springfield, MA

SUMMARY

  • 7+ years of industry experience as a Data Analyst with a solid understanding of Data Modeling, Evaluating Data Sources, Data Warehouse/Data Mart Design, ETL, BI, OLAP, and Client/Server applications.
  • Excellent experience analyzing huge volumes of data in industries such as Finance, Healthcare, and Retail.
  • Expert in writing and optimizing SQL queries in Oracle 11g/10g and SQL Server 2008.
  • Excellent knowledge of the Software Development Life Cycle (SDLC), with good working knowledge of testing methodologies, disciplines, tasks, resources, and scheduling.
  • Excellent knowledge of Data Analysis, Data Validation, Data Cleansing, Data Verification, and identifying data mismatches.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and SQL Server.
  • Excellent experience with Teradata SQL queries, Teradata Indexes, and utilities such as MLoad, TPump, FastLoad, and FastExport.
  • Strong experience using Excel and MS Access to load and analyze data based on business needs.
  • Excellent knowledge of Perl & UNIX.
  • Experienced working with Excel Pivot Tables and VBA macros for various business scenarios.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export through the use of multiple ETL tools such as Ab Initio.
  • Experience in testing and writing SQL and PL/SQL statements - Stored Procedures, Functions, Triggers and packages.
  • Excellent knowledge of creating reports in SAP BusinessObjects, including Web Intelligence (Webi) reports with multiple data providers.
  • Excellent knowledge in preparing required project documentation and tracking and reporting regularly on the status of projects to all project stakeholders
  • Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
  • Excellent experience in data mining, querying and mining large datasets to discover transaction patterns and examine financial data.
  • Experience in testing Business Intelligence reports generated by various BI Tools like Cognos and Business Objects
  • Excellent knowledge of creating DML statements for underlying databases.
  • Extensive ETL testing experience using Informatica 8.6.1/8.1 (Power Center/ Power Mart) (Designer, Workflow Manager, Workflow Monitor and Server Manager)
  • Good exposure to working in an offshore/onsite model, with the ability to understand and/or create functional requirements in collaboration with the client, and good experience in requirement analysis and generating test artifacts from requirements documents.
  • Excellent in creating various artifacts for projects which include specification documents, data mapping and data analysis documents.
  • An excellent team player and technically strong person with the ability to work with business users, project managers, team leads, architects, and peers, thus maintaining a healthy environment on the project.
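As an illustration of the SQL-based data profiling and validation described above, the sketch below uses Python's built-in sqlite3 module against a hypothetical claims table (table and column names are invented for the example) to check row counts, NULLs, and duplicate rows:

```python
import sqlite3

# Hypothetical claims table, used only for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id INTEGER, member_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [(1, "M001", 120.0), (2, "M002", None), (3, None, 75.5), (3, None, 75.5)],
)

# Profile: total rows, NULL counts for a key column, and duplicate rows.
row_count = conn.execute("SELECT COUNT(*) FROM claims").fetchone()[0]
null_amounts = conn.execute(
    "SELECT COUNT(*) FROM claims WHERE amount IS NULL"
).fetchone()[0]
duplicates = conn.execute(
    """SELECT claim_id, member_id, amount, COUNT(*)
       FROM claims
       GROUP BY claim_id, member_id, amount
       HAVING COUNT(*) > 1"""
).fetchall()

print(row_count, null_amounts, duplicates)  # 4 1 [(3, None, 75.5, 2)]
```

The same GROUP BY / HAVING pattern carries over directly to Oracle or SQL Server; sqlite3 just keeps the sketch self-contained.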

TECHNICAL SKILLS

Data Warehousing: Informatica 9.1/8.6/7.1.2 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SSIS, Data Stage 8.x

Reporting Tools: Business Objects 6.5/XI R3, Cognos 8 Suite

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin

Testing Tools: WinRunner, LoadRunner, TestDirector, Mercury Quality Center, Rational ClearQuest

RDBMS: Oracle 11g/10g/9i/8i/7.x, MS SQL Server 2008/2005/2000, MS Access 7.0

Programming: T-SQL, PL/SQL, UNIX Shell Scripting

Environment: Windows (95, 98, 2000, NT, XP), UNIX

Other Tools: TOAD, MS-Office suite (Word, Excel, Project and Outlook)

PROFESSIONAL EXPERIENCE

Confidential, Springfield MA

Data Analyst

Responsibilities:

  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from source to target data environments.
  • Developed working documents to support findings and assigned specific tasks.
  • Worked on claims data and extracted data from various sources such as flat files, Oracle and SQL Server 2008.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and SQL Server.
  • Wrote several UNIX Korn shell scripts for file transfers, error logging, data archiving, log file checks, and the cleanup process.
  • Produced metrics reporting, data mining, and trend analysis in a helpdesk environment using MS Access.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications, utilizing data management software and tools such as Perl, Toad, MS Access, Excel, and SQL.
  • Wrote SQL scripts to test the mappings and developed a Traceability Matrix of Business Requirements mapped to Test Scripts, to ensure that any change control in requirements leads to a test case update.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
  • Worked with end users to gain an understanding of information and core data concepts behind their business.
  • Assisted in defining business requirements for the IT team and created BRD and functional specifications documents along with mapping documents to assist the developers in their coding.
  • Identified and recorded defects with the information required for issues to be reproduced by the development team.
  • Designed and developed database models for the operational data store, data warehouse, and federated databases to support client enterprise Information Management Strategy.
  • Worked flexible late hours to coordinate with the offshore team.
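The Korn shell file-handling work described above (error logging, log checks, archiving, cleanup) can be sketched in Python for illustration; the log file name and format here are hypothetical stand-ins, not the originals:

```python
import gzip
import shutil
import tempfile
from pathlib import Path

def check_and_archive_log(log_path: Path, archive_dir: Path) -> list:
    """Collect ERROR lines from a log file, then gzip it into archive_dir."""
    errors = [line for line in log_path.read_text().splitlines() if "ERROR" in line]
    archive_dir.mkdir(parents=True, exist_ok=True)
    archived = archive_dir / (log_path.name + ".gz")
    with log_path.open("rb") as src, gzip.open(archived, "wb") as dst:
        shutil.copyfileobj(src, dst)
    log_path.unlink()  # cleanup: remove the original once safely archived
    return errors

# Demo with a throwaway directory and a hypothetical ETL log.
tmp = Path(tempfile.mkdtemp())
log = tmp / "etl_20240101.log"
log.write_text("INFO job started\nERROR bad record 42\nINFO job finished\n")
errs = check_and_archive_log(log, tmp / "archive")
print(errs)  # ['ERROR bad record 42']
```

A production ksh script would typically do the same steps with grep, gzip, and mv.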

Environment: MS SQL Server 2008 (client & server), Oracle 11g, Perl, Toad, SQL, MS Office, Rational ClearQuest, ClearCase.

Confidential, Cincinnati OH

Data Analyst

Responsibilities:

  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from source to target data environments.
  • Developed working documents to support findings and assigned specific tasks.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Performed data mining on claims data using very complex SQL queries and discovered claims patterns.
  • Created DML code and statements for underlying & impacting databases.
  • Extensively used ETL methodology to support data extraction, transformation, and loading processes in a complex EDW.
  • Performed data reconciliation between integrated systems.
  • Produced metrics reporting, data mining, and trend analysis in a helpdesk environment using MS Access.
  • Wrote complex SQL queries for validating data against different kinds of reports generated by Business Objects XIR2.
  • Extensively used MS Access to pull data from various databases and integrate the data.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and SQL Server.
  • Responsible for different Data mapping activities from Source systems to SQL Server.
  • Assisted in the oversight for compliance to the Enterprise Data Standards
  • Worked on importing and cleansing high-volume data from various sources such as Oracle, flat files, and SQL Server 2008.
  • Worked with Excel Pivot Tables.
  • Created and monitored workflows using Workflow Designer and Workflow Monitor.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications, utilizing data management software and tools such as Perl, Toad, MS Access, Excel, and SQL.
  • Wrote SQL scripts to test the mappings and developed a Traceability Matrix of Business Requirements mapped to Test Scripts, to ensure that any change control in requirements leads to a test case update.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
  • Developed regression test scripts for the application; involved in metrics gathering, analysis, and reporting to the concerned teams; and validated the test programs.
  • Identified and recorded defects with the information required for issues to be reproduced by the development team.
  • Worked flexible late hours to coordinate with the offshore team.
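The data reconciliation between integrated systems described above can be illustrated with set-difference queries. The sketch below uses Python's sqlite3 with hypothetical source and target tables; in Oracle the same idea uses MINUS instead of EXCEPT:

```python
import sqlite3

# Hypothetical source and target tables for reconciliation.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE source_claims (claim_id INTEGER, amount REAL);
CREATE TABLE target_claims (claim_id INTEGER, amount REAL);
INSERT INTO source_claims VALUES (1, 100.0), (2, 250.0), (3, 75.0);
INSERT INTO target_claims VALUES (1, 100.0), (2, 250.0);
""")

# Rows present in the source but missing from the target, and vice versa.
missing_in_target = conn.execute(
    "SELECT * FROM source_claims EXCEPT SELECT * FROM target_claims"
).fetchall()
extra_in_target = conn.execute(
    "SELECT * FROM target_claims EXCEPT SELECT * FROM source_claims"
).fetchall()

print(missing_in_target)  # [(3, 75.0)]
print(extra_in_target)    # []
```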

Environment: Quality Center 9.2, MS Excel 2007, PL/SQL, Business Objects XIR2, ETL Tools, Oracle 10G, SQL Server 2008.

Confidential, Wayne NJ

Data Analyst

Responsibilities:

  • Involved in Data mapping specifications to create and execute detailed system test plans.
  • The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
  • Set up the environments to be used for testing and the range of functionalities to be tested, per the technical specifications.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Performed data mining on claims data using very complex SQL queries and discovered healthcare claims patterns.
  • Responsible for different Data mapping activities from Source systems to SQL Server.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Analyzed various heterogeneous data sources, including flat files, ASCII data, EBCDIC data, and relational data (Oracle, MS SQL Server).
  • Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).
  • Executed campaigns based on customer requirements.
  • Followed company code standardization rules.
  • Performed ad hoc analyses as needed.
  • Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
  • Responsible for creating test cases to make sure the data originating from source is making into target properly in the right format.
  • Tested several stored procedures and wrote complex SQL using CASE, HAVING, CONNECT BY, etc.
  • Involved in SQL development, unit testing, and performance tuning, and ensured testing issues were resolved using defect reports.
  • Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into various databases.
  • Created UNIX scripts for file transfer and file manipulation.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing.
  • Ensured the onsite-to-offshore transition, QA processes, and closure of problems and issues.
  • Tested the database to check field size validation, check constraints, and stored procedures, and cross-verified the field sizes defined within the application against the metadata.
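The field-size validation against metadata described above can be sketched as follows. The table, columns, and data-dictionary entries are hypothetical; the check flags values longer than the declared column size:

```python
import sqlite3

# Hypothetical column-size limits, as they might appear in a data dictionary.
metadata = {"member_id": 5, "plan_code": 3}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE enrollment (member_id TEXT, plan_code TEXT)")
conn.executemany(
    "INSERT INTO enrollment VALUES (?, ?)",
    [("M0001", "PPO"), ("M000123", "HMO")],  # second member_id exceeds 5 chars
)

# Column names come from the trusted metadata dict, so interpolating them
# into the SQL here is safe for this sketch.
violations = []
for column, max_len in metadata.items():
    n = conn.execute(
        f"SELECT COUNT(*) FROM enrollment WHERE LENGTH({column}) > ?", (max_len,)
    ).fetchone()[0]
    if n:
        violations.append((column, n))

print(violations)  # [('member_id', 1)]
```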

Environment: Oracle 9i, SQL Server 2008/2005, Quality Center 8.2, SQL, TOAD, Erwin, PL/SQL, Flat Files

Confidential, Branchville NJ

Data Analyst

Responsibilities:

  • Involved in Data mapping specifications to create and execute detailed system test plans.
  • The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
  • Set up the environments to be used for testing and the range of functionalities to be tested, per the technical specifications.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Responsible for different data mapping activities from source systems to SQL Server.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Analyzed various heterogeneous data sources, including flat files, ASCII data, EBCDIC data, and relational data (Oracle, MS SQL Server).
  • Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).
  • Executed campaigns based on customer requirements.
  • Followed company code standardization rules.
  • Performed ad hoc analyses as needed.
  • Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
  • Executed the SAS jobs in batch mode through UNIX shell scripts
  • Created remote SAS sessions to run the jobs in parallel mode, cutting the extraction time since the datasets were generated simultaneously.
  • Involved in code changes for SAS programs and UNIX shell scripts
  • Reviewed and modified SAS Programs, to create customized ad-hoc reports, processed data for publishing business reports.
  • Responsible for creating test cases to make sure the data originating from source is making into target properly in the right format.
  • Tested several stored procedures and wrote complex SQL using CASE, HAVING, CONNECT BY, etc.
  • Involved in SQL development, unit testing, and performance tuning, and ensured testing issues were resolved using defect reports.
  • Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into various databases.
  • Created UNIX scripts for file transfer and file manipulation.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing.
  • Ensured the onsite-to-offshore transition, QA processes, and closure of problems and issues.
  • Tested the database to check field size validation, check constraints, and stored procedures, and cross-verified the field sizes defined within the application against the metadata.
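The multi-format file delivery described above (tab-, comma-, and pipe-delimited text) can be sketched with Python's csv module; the rows below are illustrative sample data:

```python
import csv
import io

# Hypothetical extract to be delivered in several delimited formats.
rows = [("claim_id", "amount"), ("1001", "120.50"), ("1002", "75.00")]

def render(rows, delimiter):
    """Render rows as delimited text using the given single-character separator."""
    buf = io.StringIO()
    csv.writer(buf, delimiter=delimiter, lineterminator="\n").writerows(rows)
    return buf.getvalue()

comma = render(rows, ",")   # comma-separated text
tab = render(rows, "\t")    # tab-delimited text
pipe = render(rows, "|")    # pipe-delimited text
print(pipe)
```

The csv module also handles quoting automatically if a field happens to contain the delimiter.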

Environment: DataFlux, Oracle 9i, Quality Center 8.2, SQL, TOAD, PL/SQL, Flat Files.

Confidential, Suwanee GA

Data Analyst

Responsibilities:

  • Designed and created test cases based on the business requirements (also referred to the Source-to-Target detailed mapping document and the Transformation Rules document).
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Used SQL for Querying the database in UNIX environment
  • Developed separate test cases for ETL process (Inbound & Outbound) and reporting
  • Involved with Design and Development team to implement the requirements.
  • Developed and manually executed test scripts to verify the expected results.
  • Designed and developed ETL processes using the Informatica ETL tool for dimension and fact file creation.
  • Involved in Manual and Automated testing using QTP and Quality Center.
  • Conducted black-box testing (functional, regression, and data-driven) and white-box testing (unit and integration, with positive and negative scenarios).
  • Tracked, reviewed, and analyzed defects, and compared results using Quality Center.
  • Participated in the MR/CR review meetings to resolve issues.
  • Defined the scope for system and integration testing.
  • Prepared and submitted summarized audit reports and took corrective actions.
  • Involved in Uploading Master and Transactional data from flat files and preparation of Test cases, Sub System Testing.
  • Documented and published test results, and troubleshot and escalated issues.
  • Prepared various test documents for the ETL process in Quality Center.
  • Involved in test scheduling and milestone planning, with dependencies.
  • Performed functionality testing of email notifications for ETL job failures, aborts, and data issue problems.
  • Identified, assessed, and communicated potential risks associated with the testing scope, product quality, and schedule.
  • Created and executed test cases for ETL jobs to upload master data to repository.
  • Responsible for understanding, and training others on, the enhancements and new features developed.
  • Conducted load testing and provided input into capacity planning efforts.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing, specifically load testing using LoadRunner.
  • Created and executed test scripts, cases, and scenarios to determine optimal system performance according to specifications.
  • Modified the automated scripts from time to time to accommodate the changes/upgrades in the application interface.
  • Tested the database to check field size validation, check constraints, and stored procedures, and cross-verified the field sizes defined within the application against the metadata.
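The functionality testing of email notifications on ETL job failures, mentioned above, can be sketched with a stubbed notifier. Everything here is a hypothetical stand-in: run_etl_job, load, and notify are invented for the example, and a real job would send mail through an SMTP client instead of appending to a list:

```python
# Captured notifications; a real notifier would send email instead.
sent = []

def notify(subject):
    """Stand-in for the email step so the test can inspect what was sent."""
    sent.append(subject)

def run_etl_job(records, load):
    """Load each record; on any failure, send a notification and report FAILED."""
    try:
        for record in records:
            load(record)
        return "SUCCEEDED"
    except Exception as exc:
        notify("ETL job failed: %s" % exc)
        return "FAILED"

def load(record):
    """Toy load step that rejects records with a NULL amount."""
    if record.get("amount") is None:
        raise ValueError("NULL amount in record %s" % record["id"])

status = run_etl_job(
    [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}], load
)
print(status, sent)  # FAILED ['ETL job failed: NULL amount in record 2']
```

The test then asserts both that the job reports failure and that exactly one notification with the expected subject was produced.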

Environment: Windows XP, QTP 9.2, Test Director 7.x, Load Runner 7.0, Oracle 10g, Erwin
