Sr. Data Analyst Resume
NY
SUMMARY
- Experienced Data Analyst with 7+ years in business data analysis and testing, including analyzing and modeling business processes, requirements gathering, and expectation management
- Proficient at translating strategic IT user requirements into specific database capabilities
- Ability to write clear procedural documentation and specifications for data models to be developed
- Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch
- Strong working experience with data warehouse ETL tools (Informatica, DataStage, SSIS, Mainframe, Ab Initio, etc.) and BI reporting tools (Business Objects, Cognos, SSRS, MicroStrategy, etc.)
- Excellent SQL & PL/SQL skills with both DML and DDL commands, and demonstrated ability to write complex queries for analyzing data and/or evaluating how information needs might be translated into back-end database structures (a sample query is sketched at the end of this summary)
- Experience working with various relational database management systems such as MS Access, MS SQL Server, Oracle and Teradata
- Actively involved in design (impact analysis, data modeling, and project design), development (analysis, programming, and testing), implementation, maintenance, and administration
- Have performed several roles and have been part of full SDLC of several Development Projects
- Possesses excellent communication skills and is an active team player
- Quickly learn and effectively utilize third party or proprietary tools to reduce the product delivery time
- Excellent domain understanding of Mortgage applications (Loan Origination System and Point of Sale).
- Extensive experience in understanding the landscape of systems for claim payment, claim adjudication process and MMIS applications
- Experience in writing requirements for ETL, data warehousing, and database profiling.
- Strong knowledge in Data Warehousing concepts, Teradata utilities (BTEQ, FASTLOAD, FASTEXPORT, MULTILOAD, SQL Assistant), Data Modeling and Oracle E-Business Suite 11.5 (Oracle Applications)
- Efficient in VBA for MS Excel for planning/status reporting and creating test data.
- Highly dependable Data Analyst successful at profiling and interpreting data. Supportive and enthusiastic team player dedicated to streamlining processes and efficiently reconciling data issues.
- Solid understanding of master data management (MDM). Worked with the data architecture team to create the data model.
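A minimal, hypothetical sketch of the kind of analysis query referred to above; the loans and payments tables, their columns, and the reporting month are illustrative assumptions, not taken from any actual engagement:

    -- Illustrative only: delinquency summary by product for one reporting month
    SELECT l.product_type,
           COUNT(*)                                              AS loan_count,
           SUM(CASE WHEN p.days_past_due > 30 THEN 1 ELSE 0 END) AS delinquent_30_plus,
           ROUND(AVG(l.original_balance), 2)                     AS avg_balance
    FROM   loans l
           LEFT JOIN payments p
                  ON p.loan_id = l.loan_id
                 AND p.report_month = '2016-06'   -- assumed character period key
    GROUP  BY l.product_type
    ORDER  BY delinquent_30_plus DESC;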
TECHNICAL SKILLS
Operating Systems: Microsoft Windows 2000/NT/7/XP, UNIX and LINUX
Databases: ORACLE, MS SQL Server, DB2, Teradata.
ETL & Reporting Tools: Informatica, Ab Initio, DataStage, SSIS, OBIEE, Crystal Reports, Microsoft SSRS, Business Objects, Cognos, SAS.
Languages: SQL, PL/SQL, T-SQL, VBScript, MS SQL.
Tools & Technologies: MDM, XML/HTML, QMF, MS Office (Word, Access, Excel, PowerPoint, Project, Visio), Macros using VBA, Erwin, Visio.
Quality Management: Jira, HP QC/ALM
Process Tools: Rational RequisitePro 2001, Rational ClearCase packages
PROFESSIONAL EXPERIENCE
Confidential, NY
Sr. Data Analyst
Responsibilities:
- Analyze raw data, draw conclusions, and develop recommendations, writing SQL scripts to manipulate data for data loads and extracts.
- Analyzed ETL data mappings using Informatica for extracting, transforming, and loading data from the OLAP system to the OLTP system.
- Gathered Business requirements and performed technical analysis for Data transformations in the ETL process
- Participate in defect review meetings with the team members. Work closely with the project manager to record, track, prioritize and close bugs.
- Create all data issue design documents thoroughly, with sufficient logic detail, and keep them updated.
- Delivered Enterprise Data Governance, Data Quality, Metadata, and ETL Informatica solutions
- Formulate strategies to manage the data portion of the project, beginning with data checking and data cleaning, and work with developers and business users efficiently to provide quality deliverables.
- Execute data quality checks in the form of SQL queries to ensure data integrity and enforce business rules (a representative check is sketched after this role).
- Extensively involved in data cleansing and formatting to correct mismatches in the staging area.
- Identify and address recurring/root-cause issues that can affect data integrity.
- Analyzed the large volume of data being migrated from the legacy system (Mainframe) to the new system (SQL) and identified the gaps.
- Gather the actual requirements from the business and coordinate with business stakeholders regarding the table structure.
- Gathered business requirements pertaining to loans, members, memberships, and salary, and converted them into functional requirements and a business requirements document.
- Performed Tuning and optimization of ETL processes, by identifying bottlenecks and presenting solutions
- Designed and implemented complex SQL queries for QA testing and report/data validation. Detected defects and reported issues and results to the dev and PP teams using the issue tracking tool ITS.
- Involved in Scrum activities such as grooming, meetings, and reviews for sprint estimates.
- Coordinate with the BA and DEV teams to fix the gaps between the business, development, and implementation phases of the project.
- Created reports and graphs using Business Objects to meet the business requirements of the project. Worked closely with the BA to determine feasibility and LOEs.
- Perform requirements analysis and document deliverables under version control in internal storage.
- Perform data migration testing using SQL, checking source and target data integrity.
- Developed daily and weekly status reports.
- Performed Data Profiling and extensive data validation.
- Applied data cleansing/data scrubbing techniques to ensure consistency amongst data sets.
- Independently perform complex troubleshooting, root-cause analysis and solution development.
- Worked with Data Warehouse developers to evaluate impact on current implementation, redesign of all ETL logic.
Environment: ETL Informatica, BI Business Objects, Oracle, SQL Server, SSIS, SSRS, X-Analysis for Eclipse, MS Excel, SQL, Erwin, SharePoint, Trillium, MS Access.
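A minimal sketch of the kind of data quality check mentioned above; stg_customer and dim_customer are hypothetical staging and target tables chosen for illustration only:

    -- Rows loaded to staging that never reached the target (load gap)
    SELECT s.customer_id
    FROM   stg_customer s
           LEFT JOIN dim_customer d ON d.customer_id = s.customer_id
    WHERE  d.customer_id IS NULL;

    -- Business-rule enforcement: mandatory fields must not be null or blank
    SELECT customer_id
    FROM   stg_customer
    WHERE  email IS NULL OR LTRIM(RTRIM(email)) = '';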
Confidential, Overland Park, KS
Sr. Data Analyst
Responsibilities:
- Was responsible for analyzing raw data, drawing conclusions, and developing recommendations, writing SQL scripts to manipulate data for data loads and extracts.
- Assist product team by automating data loads, creating diagnostic scripts, transformational scripts, creating scripts used by ETL teams, etc.
- Played an active and lead role in facilitating (JAD) Joint Application Development sessions to identify business rules and requirements and then document them in a format that can be reviewed and understood by both business and technical people.
- Worked on a daily basis with lead Data Warehouse developers to evaluate the impact on the current implementation and the redesign of all ETL logic.
- Documented the Requirement traceability matrix (RTM) and created Flow Diagrams using MS Visio.
- Participated in defect review meetings with the team members. Worked closely with the project manager to record, track, prioritize and close bugs.
- Gathered business requirements pertaining to trading, equities, and fixed income (e.g., bonds), and converted them into functional requirements and a business requirement document.
- Extensively involved in Data Extraction, Transformation and Loading (ETL process) from XML to the staging area and from Staging to ODS using Informatica Power Center.
- Analyzed user requirements, attended change request meetings to document changes and implemented procedures to test changes.
- Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.
- Involved in developing the test strategy and assisted in developing test scenarios, test conditions, and test cases.
- Designed ad hoc queries in SQL Server management studio based on request. Examined reports and presented findings in Excel.
- Executed Data quality checks in the form of SQL queries to ensure data integrity and enforce business rules.
- Analyzed the business requirements provided by the client and the already existing mainframe application, and ensured that there were no missing links and no ambiguity in the requirements.
- Created Source to target Data mapping document of input /output attributes with the proper transformations which would help for the easy development of Informatica code.
- Worked with Informatica developers to debug the jobs and looked for the errors using the Informatica debugger.
- Analyzed the data stored in DB2 tables and the mainframe files required by the testing team.
- Worked closely with the testing team in testing the transformed Informatica code and provided the reviewed results to the clients.
- Coordinated the weekly in-person meetings to make sure the complete ETL team was aware of all the details involved in issues affecting the applications and the end result.
- Performed Data Profiling and extensive data validation to ensure reports matched the existing mainframe files.
- Worked extensively with the ERwin Model Mart for version control
- Worked closely with QA team and developers to clarify/understand functionality, resolve issues and provided feedback to nail down the bugs.
- Assisted the database modelers in preparing the logical and physical data models and ascertained that requirements have been met and have worked on loading the tables in the Data Warehouse.
- Wrote SQL queries using QMF to query the DB2 database on the mainframe and used SQL Server Management Studio to analyze the data in SQL Server (a representative reconciliation query pair is sketched after this role).
Environment: SQL, ETL Informatica PowerCenter, QMF, DB2, MS Excel, MS Visio, JIRA, Erwin, Windows 7, Microsoft Office
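A hedged sketch of the source-to-target reconciliation described above; the first query would run through QMF against DB2 and the second in SQL Server Management Studio, with the counts and totals compared offline. TRADE_HIST, dbo.TradeFact, and the date range are hypothetical names used only for illustration:

    -- DB2 side (run via QMF)
    SELECT COUNT(*) AS row_cnt, SUM(TRADE_AMT) AS total_amt
    FROM   TRADE_HIST
    WHERE  TRADE_DT BETWEEN '2014-01-01' AND '2014-01-31';

    -- SQL Server side (run in Management Studio)
    SELECT COUNT(*) AS row_cnt, SUM(TradeAmount) AS total_amt
    FROM   dbo.TradeFact
    WHERE  TradeDate BETWEEN '2014-01-01' AND '2014-01-31';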
Confidential, West Chester, PA
Sr. Data Analyst
Responsibilities:
- Experience in all phases of the data warehouse life cycle involving Analysis, Design, Development, and Testing as part of the Data Quality Design framework. Development of extraction and loading using ETL Informatica.
- Prepared SQL & PL/SQL queries to validate the data in both source and target databases (a minimal validation sketch follows this role).
- Applied data cleansing/data scrubbing techniques to ensure consistency amongst data sets.
- Design and develop ETL mappings in Informatica to support the reporting data mart.
- Developed process for capturing and maintaining metadata from all data repository components.
- Solely responsible for designing and migrating the data quality framework from Composite to Informatica.
- Used various transformations like Source Qualifier, Expression, Normalizer, Aggregator, Filter for Designing and optimizing the Mapping.
- Developed Workflows using task developer, worklet designer, and workflow designer in Workflow manager and monitored the results using workflow monitor. Created ETL Informatica execution scripts for automating jobs.
- Profiled various file formats (e.g., Excel files, tab-delimited text, comma-separated text, pipe-delimited text, etc.).
- Followed company code standardization rules; created and designed the logical database schema layout to prepare for ETL Informatica processes, data mining, extraction, analysis, and the reporting system.
- Worked in SQL Developer to develop queries in Oracle. Interacted with the offshore team on a daily basis and analyzed the business requirements and functional specifications.
- Developed complex reports in Cognos, including dashboards, summary reports, master-detail reports, drill-down reports, and scorecards.
- Perform small enhancements (data cleansing/data quality).
- Extracted data from the Oracle database and Teradata and applied business logic to load it into the Teradata database. Parameterized the mappings and increased their re-usability.
- Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.
- Followed Informatica recommendations, methodologies and best practices on the development activities.
- Extensively worked on Mapping Variables, Mapping Parameters and Session Parameters.
- Created complex reports in Cognos Report Studio, including drill-down reports from a DMR-modeled Framework Manager model.
- Experience in using Informatica command-line utilities such as pmcmd to control workflows in non-Windows environments. Automated the Informatica jobs using UNIX shell scripting.
- Closely worked with the reporting team to ensure that correct data was presented in the attached reports, which were sent automatically once the DQ process completed.
- Prepared low-level design documents through interaction with the team lead and manager.
- Created UNIX shell scripts to run the Informatica workflows and control the ETL flow.
- Independently perform complex troubleshooting, root-cause analysis and solution development.
Environment: ETL Informatica, Cognos, SQL, PL/SQL, UNIX, Erwin, Excel, Access, Oracle, Teradata, Java, .Net, HP Quality Center and Microsoft Office.
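A minimal sketch of the source-versus-target validation queries mentioned above, written for Oracle; src_orders and tgt_orders are hypothetical staging and warehouse tables:

    -- Rows present in the source but missing or altered in the target
    SELECT order_id, order_amount FROM src_orders
    MINUS
    SELECT order_id, order_amount FROM tgt_orders;
    -- Running the MINUS in the opposite direction catches unexpected extra target rows.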
Confidential, Bloomington, IL
Sr. Data Analyst
Responsibilities:
- Created customized and ad hoc reports for various campaign executions based on specified criteria.
- Scrubbed data to accurately generate customer pulls. Provided output files in various file formats based on customer requests.
- Reviewed specifications for feasibility of customer list pull criteria and commit date
- Designed and developed ETL mappings in Informatica to support the reporting data mart.
- Wrote the SQL and PL/SQL queries on data staging tables and data warehouse tables to validate the data results.
- Involved in requirements analysis by interacting with analysts for the creation of Data marts using Informatica.
- Delivered Enterprise Data Governance, Data Quality, Metadata, and ETL Informatica solution
- Wrote SQL queries to validate source data and created a mapping document for the target data.
- Wrote complex SQL and PL/SQL queries using CASE logic, INTERSECT, MINUS, subqueries, inline views, and UNION in Oracle.
- Performed ETL Informatica development tasks such as creating jobs using different stages, debugging, etc.
- Created and executed SQL queries to perform data integrity testing on a Teradata database, validating and testing data using TOAD (representative checks are sketched after this role).
- Experience in creating UNIX scripts for file transfer and file manipulation.
- Created Test input requirements and prepared the test data for Data Driven testing.
- Created ETL Informatica execution scripts for automating jobs.
- Created and Designed logical database schema layout structure to prepare for ETL Informatica processes, data mining, extraction, Analysis and Reporting System
- Created, documented and maintained logical and physical database models in compliance with enterprise standards and maintained corporate metadata definitions for enterprise datastores within a metadata repository.
- The incumbent data warehouse was having trouble keeping pace, and its slow response time was preventing users from getting necessary information in time to be useful; after realizing significant performance gains were achievable, we decided to migrate to Teradata.
- Generate ad-hoc or management specific reports using Business Objects and Excel.
- Responsible for data mapping and data mediation between the source data table and WMO data tables using MS Access and MS Excel.
- Analyzed existing data and tagging patterns using HttpWatch, WebTrends and Business Objects.
- Created waterfall reports to show the total lead pull and the number of records dropped as a result of each suppression or set of suppressions.
- Responsible for designing and developing the mapping document to maintain the data marts (load data, OLAP tools).
- Supported ongoing MDM governance, stewardship, and MDM visionary projects.
- Provided output by segment for each direct mail file.
Environment: SQL, PL/SQL, ETL Informatica, SAS, Business Intelligence Business Objects, Teradata, TOAD, Windows XP, MS Excel, Access, MS Visio, Oracle 11g, UNIX, HP ALM.
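Hedged examples of the Teradata integrity checks noted above (run from TOAD); cust_dim and acct_fact are hypothetical tables, and cust_nbr/cust_key are illustrative keys:

    -- Duplicate natural-key check on the dimension
    SELECT cust_nbr, COUNT(*) AS dup_cnt
    FROM   cust_dim
    GROUP  BY cust_nbr
    HAVING COUNT(*) > 1;

    -- Orphaned fact rows (referential integrity between fact and dimension)
    SELECT f.acct_id
    FROM   acct_fact f
           LEFT JOIN cust_dim c ON c.cust_key = f.cust_key
    WHERE  c.cust_key IS NULL;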
Confidential
Data Analyst
Responsibilities:
- Worked with Requirements Analysts and Subject Matter Experts to identify, understand and document business needs for the data flow.
- Worked closely with various Financial, Mortgage and Credit Consumer Group business teams in gathering the business requirements.
- Worked with Central Distribution Hub (CDH) team in developing strategies to handle data from EO (Enterprise Originations) to CDH and then CDH to downstream systems.
- Initiated and conducted JAD sessions inviting various teams to finalize the required data fields and their formats.
- Performed day to day migrations of various ETL Informatica objects using export-import option.
- Worked with Chief Data Architects to slice the data requirements into work streams and various components.
- Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries and PL/SQL stored procedures to filter data within the Oracle database (a hedged PL/SQL sketch follows this role).
- Performed ETL Informatica development tasks such as creating jobs using different stages, debugging, etc.
- Developed and maintained data dictionary to create metadata reports for technical and business purpose.
- Served as a resource for analytical services utilizing SQL Server, and TOAD/Oracle.
- Run the SQL and PL/SQL queries using TOAD and SQL Navigator.
- Identified/documented data sources and transformation rules required to populate and maintain data warehouse content.
- Delivered requirements for integrating transaction, previous day and current day information reporting, for ACH, Wires, Memos, Returns, Controlled Disbursements, Lockbox, and Data Exchange.
- Was responsible for indexing of the tables in that data warehouse.
- Reverse engineered the existing ODS into Erwin.
- Led multiple project teams of technical professionals through all phases of the SDLC using technologies including Oracle, Erwin, DataStage, Data Warehousing, WebSphere, and Cognos.
- Created job schedules to automate the ETL Datastage process.
- Worked on data modeling and produced data mapping and data definition documentation.
- Designed and implemented basic SQL queries for testing and report/data validation.
- Used Data warehousing for Data Profiling to examine the data available in an existing database.
- Highly contributed to the enterprise metadata efforts by completing the data mapping repository (DMR) information.
- Ensured the compliance of the extracts to the Data Quality Center initiatives.
- Gathered and documented the Audit trail and traceability of extract information for data quality.
- Worked with the Business and the ETL Informatica developers in the analysis and resolution of data related problem tickets and other defects.
Environment: SQL, PL/SQL, ETL DataStage, Cognos, MDM, BI, MS Excel, MS Access, SAS, MS Visio, Oracle 10g, UNIX, Windows XP.
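A minimal PL/SQL sketch of a data-filtering stored procedure of the kind referenced above; filter_loans, stg_loan, loan_extract, and p_min_amount are hypothetical names chosen for illustration:

    CREATE OR REPLACE PROCEDURE filter_loans (p_min_amount IN NUMBER) AS
    BEGIN
      -- Copy only approved loans above the requested threshold into the extract table
      INSERT INTO loan_extract (loan_id, borrower_id, loan_amount)
      SELECT loan_id, borrower_id, loan_amount
      FROM   stg_loan
      WHERE  loan_amount >= p_min_amount
        AND  status = 'APPROVED';
      COMMIT;
    END filter_loans;
    /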
Confidential
Data Analyst
Responsibilities:
- Worked with SMEs from 8 different source systems to gather, analyze, and document business requirements for the project
- Collaborated with SMEs to determine member billing calculations/definitions from 3 different source systems to produce a reconciled view for online members
- Involved in Data warehouse and Data mart design. Experience with various ETL, data warehousing tools and concepts.
- Actively participated in the Logical Data Model design to support future expansion and development of the project
- Extensively used Informatica client tools-Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager to solve day to day issues.
- Collaborated with QA team on test cases, test scripts, defect resolution and requirements clarification
- Wrote all business and non-functional requirement rules for the mapping document and delivered the data blueprint.
- Designed and implemented data integration modules for Extract/Transform/Load (ETL) functions.
- Created the rules engine for identifying the golden copy of the master data and manually profiled the data to make sure the rejects were handled correctly (a hedged golden-record query is sketched after this role).
- Updated the RTM on HP QC
Environment: ETL Informatica 8.6, SQL, HP QC, UNIX, Oracle 10g, SAS, MS Excel, MS Access, MS Visio
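A hedged sketch of a golden-record (survivorship) rule like the one described above; customer_raw, the ssn match key, and the ordering columns are illustrative assumptions:

    SELECT *
    FROM (
        SELECT c.*,
               ROW_NUMBER() OVER (
                   PARTITION BY c.ssn               -- match key for the duplicate group
                   ORDER BY c.last_updated DESC,    -- prefer the freshest record
                            c.source_priority ASC   -- then the most trusted source
               ) AS rn
        FROM customer_raw c
    ) ranked
    WHERE rn = 1;   -- rn = 1 is the surviving golden copy; rn > 1 are reject candidates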
Confidential, New York
Data Quality Analyst
Responsibilities:
- Ran queries and created SQL ad hoc reports and pivot tables according to the user specifications (a representative ad hoc query is sketched after this role).
- Performed ad-hoc queries on the database as needed to generate reports.
- Worked in the core team of ETL developers to build applications with generic functionality; reduced data
- Performed export/import activities between MySQL, MS Access, SQL Server 2005, and MS Excel.
- Provided data analysis support to Management and all company employees.
- Designed the data conversion strategy, development of data mappings and the design of Extraction, Transformation and Load (ETL) routines for migrating data from non-relational or source relational to target relational.
- Prepared reports according to third party vendor specifications.
- Generated new reports as required by Management.
- Maintained the schedule of generation of standard reports on a weekly and monthly basis.
- Prepared the mapping document for the ETL Informatica development Team and testing team
- Created the traceability matrix.
- Collected the report specifications through client meetings or discussions with the employees.
Environment: SQL, MySQL, MS Access, Informatica, Excel, SQL Server 2005, Windows Vista, SAS, and Windows XP.
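A minimal sketch of an ad hoc pivot-style report of the kind mentioned above, written for SQL Server 2005; SalesDetail, its columns, and the date range are hypothetical:

    SELECT Region,
           SUM(CASE WHEN DATEPART(quarter, SaleDate) = 1 THEN Amount ELSE 0 END) AS Q1,
           SUM(CASE WHEN DATEPART(quarter, SaleDate) = 2 THEN Amount ELSE 0 END) AS Q2,
           SUM(CASE WHEN DATEPART(quarter, SaleDate) = 3 THEN Amount ELSE 0 END) AS Q3,
           SUM(CASE WHEN DATEPART(quarter, SaleDate) = 4 THEN Amount ELSE 0 END) AS Q4
    FROM   SalesDetail
    WHERE  SaleDate >= '2008-01-01' AND SaleDate < '2009-01-01'
    GROUP  BY Region
    ORDER  BY Region;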