
Sr. Data Analyst/Data Mapping Developer Resume


  • Experienced Data Analyst with 8+ years in the analysis, design, and development of data warehouse and business intelligence applications, testing processes, requirements gathering, and expectation management.
  • Proficient at translating strategic IT user requirements into specific database capabilities.
  • Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration and Metadata Management Services and Configuration Management.
  • Able to document procedures and write specifications for data models to be developed.
  • Highly analytical and process-oriented data analyst with in-depth knowledge of database types, research methodologies, and data manipulation and visualization.
  • Excellent SQL and PL/SQL skills covering both DML and DDL, with a demonstrated ability to write complex queries for analyzing data and evaluating how information needs translate into back-end database structures.
  • Experience in creating Views, Stored Procedures, DDL/DML Triggers and User-Defined Functions to implement Business Logic and Data Protection.
  • Experience working with various relational database management systems such as MS SQL Server, Oracle, Teradata, DB2, MS Access.
  • Experience in creating KPI dashboards in Excel and Tableau, as well as Cognos reports.
  • Have performed several roles and have been part of full SDLC of several Development Projects.
  • Quickly learn and effectively utilize third-party or proprietary tools to reduce product delivery time.
  • Strong experience in interacting with stakeholders/customers, gathering requirements through interviews, existing system documentation or procedures, defining business processes using appropriate templates and analysis tools.
  • Expert in Data Integrity constraints, Performance Tuning, Query Optimization and Validation issues.
  • Good understanding of SQL Server and Oracle Architecture in terms of ETL and query processing, structure and performance and table partitions.
  • Involved in testing complex Cognos, MicroStrategy and Tableau reports, supporting the testing team, writing test cases and creating test data.
  • Experience writing requirements for ETL/data warehousing and database profiling.
  • Strong knowledge of Data Warehousing concepts, Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad, SQL Assistant), Data Modeling and Oracle E-Business Suite 11.5 (Oracle Applications).
  • Proficient in VBA for MS Excel for planning/status reporting and creating test data.
  • Solid understanding of master data management (MDM).


Operating Systems: Microsoft Windows 2000/NT/XP/7, UNIX and Linux

Databases: ORACLE, MS SQL Server, DB2, Teradata.

ETL & Reporting Tools: Informatica, Ab Initio, DataStage, SSIS, OBIEE, Crystal Reports, Microsoft SSRS, Business Objects, Cognos, Tableau.

Languages: SQL, PL/SQL, T-SQL, VBScript.

Tools & Technologies: MDM, XML/HTML, QMF, MS Office (Word, Access, Excel, PowerPoint, Project, Visio), VBA macros, Erwin, Visio, SharePoint, SQL Developer, Octopus, Git, TeamCity.

Quality Management: Jira, HP QC/ALM.



Sr. Data Analyst/Data Mapping Developer


  • Develop artifacts consumed by the data engineering team, such as source-to-target mappings, data quality rules, data transformation rules and join specifications, for the various EIM projects.
  • Involved in designing, developing and testing various artifacts, and support the entire migration process from transactional-system databases into the ODS, which is then loaded into the data warehouse.
  • Develop complex SQL scripts for SSRS reports and SSAS cubes that will eventually be redeveloped in MicroStrategy and used by the business.
  • Closely collaborate with end users in finalizing dashboard layouts and achieving desired look-and-feel of reports.
  • Involved in the design of data models, table structures, and views to support reporting requirements and end-user needs.
  • Collaborated with several business functions to identify opportunities to create new business enhancing reports and automate existing labor and time-intensive critical reports.
  • Gather requirements from the business and other stakeholders and develop requirements documents in the format required by the MicroStrategy developers.
  • Architected the MicroStrategy projects, which involved creating the attributes, facts and hierarchies.
  • Involved in Testing and Data validation of Reports.
  • Applied various Excel functions such as INDEX/MATCH and VLOOKUP to perform data validation and analysis.
  • Worked with the DBA on issues of database performance and security.
  • Worked closely with all aspects of the Business Intelligence group in order to deliver the required solutions in a timely manner.
  • Develop specifications for data integration and movement, including data profiling and analysis, identifying data-related issues with suggested resolutions, and creating source-to-target mappings along with transformation, load and data-cleansing rules.
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
  • Support development troubleshooting of data defects in the transactional and data warehouse systems.
  • Investigate an issue, review the results, and provide a suggested solution to the problem.
  • Champion data quality: investigate data quality issues and develop improvement plans.
  • Maintain metadata and support the firm’s metadata platform.
  • Develop macros, lookup functions, pivot tables, etc. using Excel functionality to reduce time spent on routine documentation work.
  • Effectively communicate and work very closely with a variety of partners including the data architects, the data engineering team, the application development team, the QA team, and business users.
  • Actively participate in various phases of a project and complete duties as assigned by manager.
  • Perform feasibility analysis based on the gathered requirements.
  • Design and implement reporting solution architecture & analytic solutions.
  • Involved in design/development life cycle including gathering requirements, user interaction and report design and development for Legacy Charter, Interim Reporting & New Charter projects.
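As an illustration of the source-to-target validation work described in this role, the basic checks can be sketched as follows. This is a minimal sketch using an in-memory SQLite database with hypothetical table and column names, not the actual SQL Server/Oracle environment:

```python
import sqlite3

# Minimal source-to-target validation sketch; SQLite stands in for the
# real source and target systems, and all names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_customer (id INTEGER, name TEXT, state TEXT)")
cur.execute("CREATE TABLE tgt_customer (id INTEGER, name TEXT, state_cd TEXT)")
cur.executemany("INSERT INTO src_customer VALUES (?, ?, ?)",
                [(1, "Ann", "New York"), (2, "Bob", "Texas")])
# Target stores the standardized state code per the mapping rules.
cur.executemany("INSERT INTO tgt_customer VALUES (?, ?, ?)",
                [(1, "Ann", "NY"), (2, "Bob", "TX")])

# Check 1: row counts must match between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_customer").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_customer").fetchone()[0]
print("row counts match:", src_count == tgt_count)

# Check 2: every source id must exist in the target (no dropped rows).
missing = cur.execute("""
    SELECT s.id FROM src_customer s
    LEFT JOIN tgt_customer t ON s.id = t.id
    WHERE t.id IS NULL""").fetchall()
print("missing ids:", missing)
```

In practice the same row-count and anti-join checks are run per mapped table, with the transformation rules from the mapping document applied on the source side of the comparison.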

Environment: SQL Server, Oracle 12c, JIRA, SQL, PL/SQL, Erwin, MicroStrategy, SharePoint, MS Excel, MS Word, MS Visio, MS Outlook, SQL Server Management Studio, SQL Server Analysis Services, Windows 7.

Confidential, Washington, DC

Sr. Data Analyst/Report Developer


  • Analyze the existing application’s information by understanding the data transition between systems.
  • Work extensively with various data sources (DB2, Oracle, XML/flat files, Excel spreadsheets); transformed and mapped non-standard data to standard codes for data integrity and loaded the target Oracle warehouse database.
  • Developed Stored Procedures/ ETL processes on Oracle as needed to Extract, Transform and Load the data from multiple source systems to the data warehouse.
  • Rewrote all reports based on the legacy process by creating packages and PL/SQL stored procedures per business requirements.
  • Involved in extensive data validation by writing several complex SQL and PL/SQL scripts; involved in back-end testing and worked on data quality issues.
  • As lead, interact with the business users to break down business requirements into technical/ETL specifications and apply those rules during development.
  • Performed smoke test in Production post migration on the critical data elements due to time restrictions.
  • Led the effort in converting the logical design to physical and data mart designs, enforcing naming standards and referential integrity across multiple subject areas.
  • Based on the technical specification documents, designed logical/ physical data models in Oracle Data Modeler tool to support the warehouse and the data marts.
  • Involved in analyzing the bugs, performance of PL/SQL Queries and provided solutions to improve the same.
  • Performed pre-migration testing and validation of all data elements being converted.
  • Conducted JAD sessions with technical and business stakeholders to determine data and interoperability requirements and to arrive at the best design for implementing the BI reports using Tableau.
  • Written SQL scripts to test the mappings and Developed Traceability Matrix of Business Requirements mapped to Test Scripts to ensure any Change Control in requirements leads to test case update.
  • Created visually impactful dashboards in Excel and Tableau for data reporting by using pivot tables and VLOOKUP.
  • Created test data and test case documentation for unit testing of each report.
  • Performed unit, integration and regression testing for each report being developed.
  • Work with users to identify the most appropriate source of record and profile the data required for reporting.
  • Created a CR and obtained approval from the change control board to change a particular production process that was creating duplicate records in the legacy database.
  • Analyzed historical documentation, supporting documentation, screen prints and e-mail conversations; presented to the business; wrote the business requirements document; and obtained electronic sign-off from the stakeholders.
  • Assisted IT-OPS to execute the PL/SQL packages in sequential manner based on dependency to disseminate the data to web users for Flexible Querying.
  • Developed PL/SQL scripts to perform end-to-end testing of the migration for each table being migrated.
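The end-to-end migration testing described above typically compares each migrated table against its source row by row; Oracle's MINUS set operator (EXCEPT in the SQLite used for this sketch) is the usual device. Table names and data below are hypothetical:

```python
import sqlite3

# Sketch of a migration comparison: any row in the legacy table that is
# absent from the migrated table (by full-row comparison) is a defect.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE legacy_loan (loan_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE migrated_loan (loan_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO legacy_loan VALUES (?, ?)",
                [(100, 2500.0), (101, 4800.0), (102, 915.0)])
# Simulate a migration defect: one row lands with a changed amount.
cur.executemany("INSERT INTO migrated_loan VALUES (?, ?)",
                [(100, 2500.0), (101, 4800.0), (102, 900.0)])

# MINUS in Oracle, EXCEPT in SQLite: rows in source but not in target.
diff = cur.execute("""
    SELECT loan_id, amount FROM legacy_loan
    EXCEPT
    SELECT loan_id, amount FROM migrated_loan""").fetchall()
print("discrepant rows:", diff)
```

Running the same comparison in both directions (source MINUS target, then target MINUS source) catches dropped rows, changed values, and spurious inserts in one pass per table.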

Environment: Oracle 11g, Oracle 12c, SQL, PL/SQL, Erwin, Tableau, SharePoint, JIRA, MS Excel, MS Word, MS Visio, MS Outlook, SQL Developer, DB2, Octopus, Git, TeamCity.

Confidential, NY

Lead Data Analyst


  • Analyze raw data, draw conclusions and develop recommendations, writing SQL scripts to manipulate data for data loads and extracts.
  • Participate in defect review meetings with the team members. Work closely with the project manager to record, track, prioritize and close bugs.
  • Create all data-issue design documents thoroughly and with sufficient logic updates.
  • Work with business unit SMEs, process owners, and other groups to obtain the relevant business rules and data information for developing data-cleanup specifications.
  • Formulate strategies to manage the data part of the project, beginning with data checking and cleaning, and work efficiently with developers and business users to provide quality deliverables.
  • Execute Data quality checks in the form of SQL queries to ensure data integrity and enforce business rules.
  • Extensively involve with Data cleansing, formatting of the data to correct the mismatch in Staging area.
  • Perform User Acceptance Testing (UAT) for both data and functionality issues.
  • Identify and address recurring/ root cause issues that can affect Data Integrity.
  • Analyzed the large volume of data moving from the legacy system (mainframe) to the new system (SQL Server) for migration and identified the gaps.
  • Gathered the actual requirements from the business and coordinated with business stakeholders regarding the table structure.
  • Gathered business requirements pertaining to loans, members and salary, and converted them into functional requirements and a business requirement document.
  • Designed and implemented complex SQL queries for QA testing and report/data validation; detected defects and reported issues and results to the dev and PP teams using the issue-tracking tool ITS.
  • Involved in Scrum ceremonies such as grooming, daily meetings and reviews for sprint estimates.
  • Coordinating with the BA and DEV teams to fix the gaps between businesses, development and implementation phases of the project.
  • Perform requirement analysis and document deliverables based on the version control within internal storage.
  • Data Migration testing using SQL and checking source and target data integrity.
  • Developed daily and weekly status reports.
  • Worked on issues with migration of data from the development to the QA environment; arranged and attended team meetings with the offshore team to meet project deadlines.
  • Performed Data Profiling and extensive data validation.
  • Applied data cleansing/data scrubbing techniques to ensure consistency amongst data sets. Independently perform complex troubleshooting, root-cause analysis and solution development.
  • Worked with Data Warehouse developers to evaluate impact on current implementation, redesign of all ETL logic.
  • Created a source-to-target data mapping document of input/output attributes with the proper transformations to facilitate development of the SSIS bridge.
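The data-profiling work mentioned in this role usually reduces to per-column statistics such as null and distinct counts, which feed the mapping and cleansing rules. A minimal sketch against a made-up table, with SQLite standing in for the real database:

```python
import sqlite3

# Profiling sketch: row count, null count, and distinct count per column.
# Table and data are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE member (member_id INTEGER, email TEXT, salary REAL)")
cur.executemany("INSERT INTO member VALUES (?, ?, ?)",
                [(1, "a@x.com", 50000.0),
                 (2, None, 62000.0),
                 (3, "c@x.com", None),
                 (4, "a@x.com", 50000.0)])

profile = {}
for col in ("member_id", "email", "salary"):
    # COUNT(col) skips NULLs, so COUNT(*) - COUNT(col) is the null count;
    # COUNT(DISTINCT col) counts distinct non-null values.
    total, nulls, distinct = cur.execute(
        f"SELECT COUNT(*), COUNT(*) - COUNT({col}), "
        f"COUNT(DISTINCT {col}) FROM member").fetchone()
    profile[col] = {"rows": total, "nulls": nulls, "distinct": distinct}

print(profile)
```

Columns where distinct equals the row count are candidate keys; unexpectedly high null counts flag the cleansing rules the mapping document needs to spell out.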

Environment: SQL Server 2012, SSIS (SQL Server Integration Services), SSRS (SQL Server Reporting Services), X-Analysis for Eclipse, SQL Server Management Studio, MS Excel, SQL, Erwin, SharePoint, Trillium, MS Access.

Confidential, Overland Park, KS

Sr. Data Analyst


  • Was responsible for analyzing raw data, drawing conclusions & developing recommendations writing T-SQL scripts to manipulate data for data loads and extracts.
  • Played an active and lead role in facilitating (JAD) Joint Application Development sessions to identify business rules and requirements and then document them in a format that can be reviewed and understood by both business and technical people.
  • Documented the Requirement traceability matrix (RTM) and created Flow Diagrams using MS Visio.
  • Participated in defect review meetings with the team members. Worked closely with the project manager to record, track, prioritize and close bugs.
  • Gathered business requirements pertaining to trading, equities and fixed incomes like bonds, converted the same into functional requirements and business requirement document.
  • Analyzed user requirements, attended change request meetings to document changes and implemented procedures to test changes.
  • Involved in developing the test strategy and assisted in developing test scenarios, test conditions and test cases.
  • Designed ad hoc queries in SQL Server management studio based on request. Examined reports and presented findings in Excel. Facilitated file sharing and correspondence tracking with SharePoint.
  • Executed Data quality checks in the form of SQL queries to ensure data integrity and enforce business rules.
  • Analyzed the business requirements provided by the client and the existing mainframe application, and ensured that there were no missing links or ambiguity in the requirements.
  • Created a source-to-target data mapping document of input/output attributes with the proper transformations to facilitate development of the Informatica code.
  • Worked with Informatica developers to debug the jobs and looked for the errors using the Informatica debugger.
  • Analyzed the data stored in DB2 tables and the mainframe files required by the testing team.
  • Worked closely with the testing team in testing the transformed Informatica code and provided the reviewed results to the clients; worked with the users on User Acceptance Testing (UAT).
  • Document all data mapping and transformation processes in the Functional Design documents based on the business requirements.
  • Extensively involved with Data cleansing, formatting of the data to correct the mismatch in Staging area.
  • Performed Data Profiling and extensive data validation to ensure report matches with existing mainframe Files.
  • Extensively involved in Data Extraction, Transformation and Loading (ETL process) from XML to the staging area and from Staging to ODS using Informatica Power Center.
  • Worked on daily basis with lead Data Warehouse developers to evaluate impact on current implementation, redesign of all ETL logic. Worked extensively with the ERwin Model Mart for version control.
  • Worked daily with the mainframe team to analyze the COBOL code and the processes running in the mainframe application for each deliverable.
  • Worked closely with QA team and developers to clarify/understand functionality, resolve issues and provided feedback to nail down the bugs.
  • Assisted the database modelers in preparing the logical and physical data models and ascertained that requirements have been met and have worked on loading the tables in the Data Warehouse.
  • Wrote SQL using QMF to query the DB2 database on the mainframe, and used SQL Server Management Studio to query SQL Server for data analysis.

Environment: SQL, ETL (Informatica), QMF, SQL Server, DB2, SQL Server Management Studio, MS Excel, MS Visio, JIRA, Erwin, Informatica PowerCenter, Windows 7, Microsoft Office.

Confidential, West Chester, PA

Sr. Business Data Analyst


  • Experience in all phases of the data warehouse life cycle, involving analysis, design, development and testing as part of the Data Quality Design framework; developed extraction and loading using Informatica.
  • Prepared SQL & PL/SQL Queries to validate the data in both source and target databases.
  • Applied data cleansing/data scrubbing techniques to ensure consistency amongst data sets.
  • Developed process for capturing and maintaining metadata from all data repository components.
  • Solely responsible for designing and migrating the Data quality framework from composite to Informatica.
  • Used various transformations like Source Qualifier, Expression, Normalizer, Aggregator, Filter for Designing and optimizing the Mapping.
  • Provided links on the dashboard to go directly to the reports for a particular metric.
  • Developed Workflows using task developer, worklet designer, and workflow designer in Workflow manager and monitored the results using workflow monitor.
  • Created reports based on the Date dimension and integrated with each of the reports for daily refresh of the dashboard without manual intervention.
  • Profiled various file formats (e.g., Excel, tab-delimited, comma-separated and pipe-delimited text) and followed company code-standardization rules.
  • Created various tasks like Session, Command, and email task.
  • Used SQL Developer to develop queries in Oracle; interacted with the offshore team daily and analyzed the business requirements and functional specifications.
  • Developed and tested several complex Cognos reports, including dashboards, summary reports, master-detail reports, drill-downs and scorecards.
  • Perform small enhancements (data cleansing/data quality).
  • Extracted data from Oracle and Teradata databases, applied business logic and loaded it into the Teradata database; parameterized the mappings to increase reusability.
  • Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.
  • Followed Informatica recommendations, methodologies and best practices on the development activities.
  • Extensively worked on Mapping Variables, Mapping Parameters and Session Parameters.
  • Generated complex reports in Cognos Report Studio, including drill-down reports from a DMR-modeled framework model.
  • Experience using Informatica command-line utilities such as pmcmd to control workflows in non-Windows environments; automated the Informatica jobs using UNIX shell scripting.
  • Worked closely with the reporting team to ensure that correct data was presented in the attached reports, which were sent automatically once the DQ process completed.
  • Prepared low-level design documents through interaction with the team lead and manager.
  • Created UNIX shell scripts to run the Informatica workflows and control the ETL flow.
  • Independently perform complex troubleshooting, root-cause analysis and solution development.
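The pmcmd-based workflow automation noted above usually lives in a small scheduler script. As a sketch, the command line can be assembled like this; the service, domain, folder, workflow, and credential values are placeholders, not real environment values:

```python
# Sketch of the pmcmd command line used to launch an Informatica workflow
# from a scheduler script. All argument values below are placeholders.
def pmcmd_startworkflow(service, domain, user, pwd, folder, workflow):
    # -wait makes pmcmd block until the workflow completes, so the calling
    # shell script can branch on the exit code.
    return ["pmcmd", "startworkflow",
            "-sv", service, "-d", domain,
            "-u", user, "-p", pwd,
            "-f", folder, "-wait", workflow]

cmd = pmcmd_startworkflow("IS_DEV", "Domain_DEV", "etl_user", "secret",
                          "EDW_LOADS", "wf_stg_load")
print(" ".join(cmd))
```

In a real script the password would come from an environment variable or pmcmd's encrypted-password option rather than being passed in plain text.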

Environment: ETL Informatica, Cognos, SQL, PL/SQL, UNIX, Erwin, Excel, Access, Oracle, Teradata, Java, .Net, HP Quality Center and Microsoft Office.

Confidential, Bloomington, IL

Sr. Business Data Analyst


  • Created customized and ad-hoc reports for various campaign executions based on specific criteria.
  • Scrubbed data to accurately generate customer pulls; provided output files in various formats based on customer requests.
  • Reviewed specifications for feasibility of customer list-pull criteria and commit dates.
  • Design and develop Informatica ETL mappings to support the reporting data mart.
  • Wrote the SQL and PL/SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Involved in requirements analysis by interacting with analysts for the creation of Data marts using Informatica.
  • Delivered Enterprise Data Governance, Data Quality, Metadata, and Informatica ETL solutions.
  • Write SQL queries to validate source data and created mapping document for the target data.
  • Writing complex SQL and PL/SQL queries using Case Logic, Intersect, Minus, Sub Queries, Inline Views, and Union in Oracle.
  • Create and execute SQL queries to perform data integrity testing on a Teradata database, validating and testing data using TOAD.
  • Experience in creating UNIX scripts for file transfer and file manipulation. Created Test input requirements and prepared the test data for Data Driven testing.
  • Created ETL Informatica execution scripts for automating jobs.
  • Designed the logical database schema layout in preparation for ETL processes, data mining, extraction, analysis and reporting.
  • Created recurring scheduling instances in Business Objects for the daily scheduling of the underlying WebI reports for the dashboard.
  • Created, documented and maintained logical and physical database models in compliance with enterprise standards and maintained corporate metadata definitions for enterprise datastores within a metadata repository.
  • The incumbent data warehouse was struggling to keep pace, and its slow response time was preventing users from getting necessary information in time to be useful; after evaluating the expected performance gains, we decided to migrate to Teradata.
  • Created tabular WebI reports with KPIs in individual tabs for the dashboards; generated ad-hoc and management-specific reports using Business Objects and Excel.
  • Responsible for data mapping and data mediation between the source data table and WMO data tables using MS Access and MS Excel.
  • Implemented aggregated data and created highly interactive, user-friendly dashboards with visual effects.
  • Involved in the functional design for the Dashboard KPI’s and provided multiple layouts for the design approval.
  • Analyzed existing data and tagging patterns using HttpWatch, WebTrends and Business Objects.
  • Created waterfall reports showing the total lead pull and the number of records dropped as a result of each suppression or set of suppressions.
  • Responsible for designing and developing mapping documents to maintain the data marts (data loads, OLAP tools).
  • Supported ongoing MDM governance, stewardship and MDM visionary projects.
  • Provided output by segment for each direct-mail file.
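The suppression waterfall reports described above count how many records each suppression removes from the initial lead pull. The logic can be sketched as follows, with hypothetical suppression rules and sample records:

```python
# Sketch of a suppression waterfall: start from the full lead pull and
# apply each suppression in order, recording how many records it drops.
# The rules and sample records below are hypothetical.
leads = [
    {"id": 1, "opt_out": False, "bad_address": False},
    {"id": 2, "opt_out": True,  "bad_address": False},
    {"id": 3, "opt_out": False, "bad_address": True},
    {"id": 4, "opt_out": False, "bad_address": False},
]

suppressions = [
    ("opt-out",     lambda r: r["opt_out"]),
    ("bad address", lambda r: r["bad_address"]),
]

# Each waterfall row: (step name, records dropped, records remaining).
waterfall = [("total pull", 0, len(leads))]
remaining = leads
for name, rule in suppressions:
    kept = [r for r in remaining if not rule(r)]
    waterfall.append((name, len(remaining) - len(kept), len(kept)))
    remaining = kept

for step, dropped, left in waterfall:
    print(f"{step}: dropped {dropped}, remaining {left}")
```

Because the suppressions are applied sequentially, the order matters: a record excluded by an earlier rule is never counted against a later one, which is exactly what a waterfall report is meant to show.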

Environment: SQL, PL/SQL, ETL (Informatica), SAS, Business Objects, Teradata, TOAD, Windows XP, MS Excel, MS Access, MS Visio, Oracle 11g, UNIX, HP ALM.


Data Analyst


  • Data Integration is one of the initiatives of the Core Phase I Data Strategy for Confidential Bank (WFB), Home and Consumer Finance Group (HCFG).
  • Worked with Requirements Analysts and Subject Matter Experts to identify, understand and document business needs for the data flow.
  • Worked closely with various Financial, Mortgage and Credit Consumer Group business teams in gathering the business requirements.
  • Worked with Central Distribution Hub (CDH) team in developing strategies to handle data from EO (Enterprise Originations) to CDH and then CDH to downstream systems.
  • Initiated and conducted JAD sessions inviting various teams to finalize the required data fields and their formats.
  • Performed day to day migrations of various ETL Informatica objects using export-import option.
  • Worked with Chief Data Architects to slice the data requirements into work streams and various components.
  • Performed data mapping and logical data modeling; created class and ER diagrams; and used SQL queries and PL/SQL stored procedures to filter data within the Oracle database.
  • Performed ETL Informatica development tasks such as creating jobs using different stages and debugging.
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
  • Served as a resource for analytical services utilizing SQL Server, and TOAD/Oracle.
  • Run the SQL and PL/SQL queries using TOAD and SQL Navigator.
  • Identified/documented data sources and transformation rules required to populate and maintain data warehouse content.
  • Created a large number of KPIs required for dashboards in Excel and Cognos using formulas, variables, standard Business Objects functions and merging between multiple universes fetching information from different underlying data targets.
  • Delivered requirements for integrating transaction, previous day and current day information reporting, for ACH, Wires, Memos, Returns, Controlled Disbursements, Lockbox, and Data Exchange.
  • Was responsible for indexing the tables in the data warehouse.
  • Reverse-engineered the existing ODS into Erwin.
  • Led multiple project teams of technical professionals through all phases of the SDLC using technologies including Oracle, Erwin, DataStage, data warehousing, WebSphere and Cognos.
  • Created job schedules to automate the ETL Informatica process.
  • Worked on data modeling and produced data mapping and data definition documentation. Configured the Data mapping between Oracle and SQL Server 2005.
  • Designed and implemented basic SQL queries for testing and report/data validation. Used Data warehousing for Data Profiling to examine the data available in an existing database.
  • Highly contributed to the enterprise metadata efforts by completing the data mapping repository (DMR) information.
  • Ensured the compliance of the extracts to the Data Quality Center initiatives.
  • Gathered and documented the Audit trail and traceability of extract information for data quality.
  • Worked with the Business and the ETL Informatica developers in the analysis and resolution of data related problem tickets and other defects.

Environment: SQL, PL/SQL, ETL (Informatica), MDM, BI, MS Excel, MS Access, SAS, MS Visio, Oracle 10g, UNIX, Windows XP, Sybase PowerDesigner 12.1.
