
Sr. Data Analyst Resume


New York City, NY

SUMMARY

  • 8+ years of professional experience as a Data Analyst, with extensive experience in the Banking, Finance, Loan and Investment Banking domains, including Master Data Management (MDM), ETL development and Data Modelling; coordinating with stakeholders at every level, writing Use Cases, gathering requirements, and supporting the end-to-end software development life cycle (SDLC).
  • Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration and Metadata Management Services and Configuration Management
  • Expertise in Data Modelling, Azure Cloud, Big Data and evaluating data sources, with a strong understanding of Data Warehouse/Data Mart design, ETL, BI, OLAP and client/server applications.
  • Diverse experience in Anti-Money Laundering (AML), Know Your Customer (KYC), Client Onboarding (Business as Usual), and Banking and Financial Services.
  • Designed and developed data warehouses for Business Intelligence data marts and reporting.
  • Designed Physical Data Models (PDM) using the IBM InfoSphere Data Architect data modelling tool and Oracle PL/SQL.
  • Extensive experience in conducting Market Research, Feasibility Studies, Data Analysis, Data Mapping, Data Profiling, Gap Analysis, Risk Identification, Risk Assessment, Risk Analysis and Risk Management.
  • Expert in Data Modelling, Data Visualization and Modern Data Warehouse concepts. Designed Various Reports/dashboards to provide insights and data visualization using BI/ETL tools like Business Objects, Tableau and Pentaho.
  • Implemented the Actimize Anti-Money Laundering (AML) system to monitor suspicious transactions and enhance regulatory compliance; involved in security testing for different LDAP roles.
  • Created Informatica mappings and workflows for loading data from different sources into Fin Master.
  • Extensive experience with SAS programming, the DATA step and various SAS procedures in Base SAS and SAS/STAT, including thorough knowledge of the SAS Macro facility.
  • Experience in development methodologies like RUP, SDLC, AGILE, SCRUM and Waterfall.
  • Expert in T-SQL DDL/DML; performed most SQL Server Enterprise Manager and Management Studio functionality using T-SQL scripts and batches.
  • Have extensive knowledge in Data flow modelling and Object modelling, case analysis and functional decomposition analysis.
  • Expertise in data analysis functions including Data Virtualization, Data Visualization and Data Transformation & Sourcing. Extensive experience with SQL, ETL (SSIS), PL/SQL and advanced Excel.
  • Experienced in handling large amounts of data (Data Cleansing, Data Profiling, Data Scrubbing and Data Blending). Worked on Data Quality, Data Integrity and Master Data Management.
  • Deep understanding and hands-on experience in data subsetting, profiling and cloning.
  • Expertise in database administration of test environment databases and data pools, and in identifying performance issues caused by inefficient database administration.
  • Experience in Business Intelligence (BI) Technologies like Microsoft Business Intelligence (SSIS, SSAS, and SSRS), Informatica, Business Objects and OBIEE
  • Extensive data warehousing experience using Informatica as an ETL tool on various databases such as Oracle, SQL Server, Teradata and MS Access.
  • Excellent in creating various artifacts for projects which include specification documents, data mapping and data analysis documents.
  • Expert-level experience in MS Excel, including Macros, Lookups and Pivot Tables; strong expertise in writing complex SQL queries; and strong experience in BI reporting, including Cognos and Spotfire.
  • Sound proficiency in analyzing and creating Use Cases, Use Case Diagrams, Swim Lane Diagrams, Activity Diagrams, Class Diagrams, Data Flow Diagrams, Business Flow Diagrams, Navigational Flow Diagrams and Sequence Diagrams using Rational Rose and MS Visio.
  • Experience in dealing with different data sources ranging from Flat files, SQL server, Oracle, MySQL, MS Access and Excel.
  • Strong experience in conducting User Acceptance Testing (UAT) and documentation of Test Cases and in designing and developing Test Plans and Test Scripts.

TECHNICAL SKILLS

Database tools: TOAD, SQL Loader, Netezza, FastLoad, MultiLoad, Netezza Manager, CLI, ODBC, Teradata, SQL Server 2012/2008/2005, DB2 UDB 8.0, MySQL, MySQL Workbench, Sybase, MS Access

Operating systems: UNIX, Windows 8/7/XP/NT

Methodologies: RUP, Joint Application Development (JAD), Unified Modelling Language (UML), Object Oriented Analysis and Design (OOAD)

Business Areas: Banking, Finance, Loans and Investment Banking, with Master Data Management (MDM), ETL development and Data Modelling experience.

Tools: Azure Cloud, Big Data, ETL, BI, OLAP, Oracle PL/SQL, IBM InfoSphere, Business Objects, Tableau, Pentaho, MS Access and Excel.

Development methodologies: RUP, SDLC, AGILE, SCRUM and Waterfall.

Testing: GUI testing, Functionality Testing, Regression testing, System Testing, Unit Testing, Integration Testing, Performance Testing, and Stress Testing.

Reporting Tools: SQL Server Reporting Services.

PROFESSIONAL EXPERIENCE

Confidential - NEW YORK CITY, NY

Sr. Data Analyst

Responsibilities:

  • Gathered requirements from remotely based business users and defined and elaborated them by holding meetings with the users (who are also Fifth Third employees).
  • Designed and developed Business Objects Universes to suit the standard, analytical and ad-hoc reporting requirements of the Business Objects users.
  • Designed and implemented basic SQL queries for QA Testing and Report / Data Validation.
  • Created SAS programs used for data validation, statistical report generation and program validation, and automated the edit-check programs using macros.
  • Analyzed historical documentation, supporting documentation, screen prints and e-mail conversations; presented findings to the business; wrote the business requirements document; and obtained electronic sign-off from the stakeholders.
  • Wrote the test cases and technical requirements and got them electronically signed off.
  • Created new reports based on requirements.
  • Utilized simple methods like PowerPoint presentations while conducting walkthroughs with the stakeholders.
  • Conducted gap analysis to analyze the variance between system capabilities and business requirements.
  • Reviewed Stored Procedures for reports and wrote test queries against the source system (SQL Server) to match the results with the actual report against the Data mart (Oracle).
  • Wrote BRD, FRD, use cases, test scenarios, test cases for testing the functional and non-functional aspects of both ETL jobs and Reporting jobs
  • Interacted with teams in AFS, ACBS and InfoLease to extract the information for the reports.
  • Involved in defining the source to target data mappings, business rules, business and data definitions
  • Created Report-Models for ad-hoc reporting and analysis.
  • Created Logical/physical Data Model in ERwin and have worked on loading the tables in the Data Warehouse
  • Worked extensively with the ERwin Model Mart for version control
  • Extensively designed Data mapping and filtering, consolidation, cleansing, Integration, ETL, and customization of data mart.
  • Worked on daily basis with the main frame team and lead data warehouse developers to evaluate impact on current implementation.
  • Performed end-to-end development of Actimize models for the project bank's trading compliance solutions.
  • Implemented Actimize Anti-Money Laundering (AML) system to monitor suspicious transactions and enhance regulatory compliance.
  • Led Business Intelligence report development efforts by working closely with the MicroStrategy, Teradata and ETL teams.
  • Applied senior-level SQL query skills (Oracle and T-SQL) in analyzing and validating SSIS ETL and data warehouse processes.
  • Involved in conducting JAD sessions to identify the source systems and data needed by Actimize-SAM (KYC/CIP).
  • Developed the systems implementation project management plan with milestones and steps from procurement of vendors to project implementation and maintenance.
  • Analyzed business requirements and employed Unified Modelling Language (UML) to develop high-level and low-level Use Cases, Activity Diagrams, Sequence Diagrams, Class Diagrams, Data-flow Diagrams, Business Workflow Diagrams, Swim Lane Diagrams, using Rational Rose
  • Worked with the Enterprise and Business Intelligence Architecture team to understand repository objects that support the business requirement and process
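The report/data validation described above typically reduces to comparing aggregates between the source system and the data mart. A minimal sketch of that reconciliation, using an in-memory SQLite database and hypothetical table names (loans_src, loans_mart):

```python
import sqlite3

# Sketch of report/data validation: compare aggregates from a "source"
# table against the corresponding "data mart" table. Table and column
# names are hypothetical stand-ins for the actual systems.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE loans_src (loan_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE loans_mart (loan_id INTEGER, amount REAL)")
rows = [(1, 1000.0), (2, 2500.0), (3, 750.0)]
cur.executemany("INSERT INTO loans_src VALUES (?, ?)", rows)
cur.executemany("INSERT INTO loans_mart VALUES (?, ?)", rows)

src_total = cur.execute("SELECT COUNT(*), SUM(amount) FROM loans_src").fetchone()
mart_total = cur.execute("SELECT COUNT(*), SUM(amount) FROM loans_mart").fetchone()

# Validation passes only if row counts and totals reconcile.
assert src_total == mart_total, f"mismatch: {src_total} vs {mart_total}"
print("counts and totals reconcile:", src_total)
```

In practice the same pair of queries would run against the SQL Server source and the Oracle data mart, with the comparison done per report column.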

Confidential - Montebello, NY

Sr. Data Analyst

Responsibilities:

  • Worked on the Data Management and Business Solutions team on data analysis, requirement gathering, gap analysis, data mapping and testing of ETL and BI reporting requirements.
  • Created source to Target data mapping document for reconciliation between different data systems.
  • Developed reports using the AXIOM Controller View tool: created Data Sources, Data Models, Shorthands, Portfolios, Aggregations, Free Form and Tabular Reports, and workflows.
  • Created Pricing Reports along with Pivot Tables for review and comparison.
  • Worked on Gap analysis (AS-IS) & (TO-BE) by identifying existing processes, technologies, and documenting the enhancements needed to meet the end state requirements.
  • Worked with users and IT to confirm manual data input templates which could be configured by AXIOM, plan Feed and Line Automation, and create Functional Requirements Document (FRD).
  • Handled several projects for the HR and Finance areas as well as Revenue Accounting.
  • Migrated and aggregated legacy infrastructures into Hadoop clusters using SQL and MapReduce techniques.
  • Deployed a tool that analyzed products, leading to productivity savings, and developed a model to forecast demand for different exchanges.
  • Involved in creating stored procedures, views and custom SQL queries to import data from SQL Server to Tableau.
  • Involved in creating source to target mappings, edit rules and validation, transformations, and business rules.
  • Analyzed client requirements and designed the ETL Informatica mapping.
  • Performed CRUD operations in SQL Server and over HTTP with RESTful APIs.
  • Operated the SWIFT messaging service for financial messages, such as letters of credit, payments and securities transactions, between member banks worldwide. Worked on data profiling and data quality assessment for the Siberian Data Warehouse, ODS and source systems using TOAD and IBM Information Analyzer.
  • Interacted with Cash Management Division and Compliance Office to review the AML (Anti Money Laundering) requirements and provide Positive Pay service to reduce check fraud
  • Data cleansing and creation of layouts and specifications for end-user reports, delivering an end-to-end Metadata Management solution in the process
  • Created use cases to depict the interaction between the various actions and the system and created data flow models, data quality analysis and performed cost/benefit analysis.
  • Awarded funding for a framework to analyze the effect of ambient conditions on the reliability of supply and demand.
  • Worked on creating an R package to pull, merge and wrangle repair data for reproducible analysis.
  • Worked on performance tuning of mappings, sessions, Target and Source.
  • Used advanced functions such as VLOOKUPs, Pivots, graphs, and the analytical and statistical tool packs in Excel.
  • Worked with senior developers to implement ad-hoc and standard reports using Informatica, Cognos, MS SSRS and SSAS.
  • Thorough understanding of various modules of AML including Watch List Filtering, Suspicious Activity Monitoring, CTR, CDD, and EDD
  • Extensively involved in Data extractions, Data mapping from source to target systems using customized PL/SQL code and Unix/Linux platform.
  • Strong experience in administration and troubleshooting of Azure IaaS components - VMs, Storage, VNET, NSG, Availability Sets and site-to-site VPN.
  • Expert knowledge of 3-tier and N-tier layered architecture.
  • Involved in creating and configuring site-to-site connections and created point-to-point connections.
  • Involved in the competitive market analysis research for the Production team to provide insight into enhancements using Power BI.
  • Facilitated the User Acceptance Testing (UAT) with Pega System Administrators and Business Users, documented any issues or defects and eventually got sign off from the right parties.
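The source-to-target mapping documents mentioned above pair each target field with its source field and a transformation rule. A minimal sketch of applying such a mapping, with hypothetical field names (CUST_NM, ACCT_OPN_DT, BAL_AMT):

```python
# Hypothetical source-to-target mapping of the kind captured in a mapping
# document: each target field names its source field and a transformation.
mapping = {
    "customer_name": ("CUST_NM", str.title),            # "JOHN DOE" -> "John Doe"
    "account_open_date": ("ACCT_OPN_DT", lambda v: v),  # pass-through
    "balance_usd": ("BAL_AMT", float),                  # string -> numeric
}

def transform(source_row):
    """Apply the mapping to one source record, producing a target record."""
    return {tgt: rule(source_row[src]) for tgt, (src, rule) in mapping.items()}

src = {"CUST_NM": "JOHN DOE", "ACCT_OPN_DT": "2015-06-01", "BAL_AMT": "2500.00"}
print(transform(src))
```

Reconciliation between the two systems then amounts to running the transform over the source extract and diffing it against the target extract.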

Confidential - Wayne, NJ

Data Analyst

Responsibilities:

  • Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
  • Prepared and analyzed AS IS and TO BE in the existing architecture and performed Gap Analysis.
  • Created workflow scenarios, designed new process flows and documented the Business Process and various Business Scenarios and activities of the Business from the conceptual to procedural level.
  • Analyzed business requirements and employed Unified Modelling Language (UML) to develop high-level and low-level Use Cases, Activity Diagrams, Sequence Diagrams, Class Diagrams, Data-flow Diagrams, Business Workflow Diagrams, Swim Lane Diagrams, using Rational Rose.
  • Gathered requirements to customize ATG-provided solutions, services and ongoing guidance to power a more relevant and personal e-commerce experience.
  • Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
  • Utilized corporation developed Agile SDLC methodology. Used Scrum Work Pro and Microsoft Office software to perform required job functions.
  • Partially involved in creating and maintaining data dictionaries for naming various objects in the finance domain.
  • Used Shared Containers and created reusable components for local and shared use in the ETL process.
  • Created and maintained sales reporting using MS Excel, SQL in Teradata, and MS Access; produced performance reports and implemented changes for improved reporting.
  • Created Excel templates with Pivot Tables and utilized VLOOKUPs with complex formulas.
  • Provided weekly, monthly & ad hoc web analytics reports using Adobe Site Catalyst & Google Analytics.
  • Worked on developing Tableau data visualization using Cross Map, Scatter Plots, Geographic Map, Pie Charts and Bar Charts, Page Trails, and Density Chart.
  • Used Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Substantial report development experience utilizing SQL Server Reporting Services (SSRS), Cognos Impromptu and Microsoft Excel.
  • Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet, and Transformations.
  • Helped the finance team with forecasting by providing accurate results via ad-hoc queries and data support, including in-depth analysis and reporting.
  • Wrote PL/SQL procedures for processing business logic in the database and tuned SQL queries for better performance.
  • Utilize complex Excel functions such as pivot tables and VLOOKUPs to manage large data sets and make information readable for other teams.
  • Involved in developing SQL-based data warehouse environments and created multiple custom database applications for data archiving, analysis, and reporting purposes.
  • Performed data mapping and logical data modelling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Oracle database.
  • Wrote specifications to implement application fees for trade-processing lifecycles from pre-trade analysis to post-trade risk management, settlement and regulatory compliance with Confidential's Capital Markets and Investment Banking solution.
  • Worked with the Enterprise Data warehouse team and Business Intelligence Architecture team to understand repository objects that support the business requirement and process
  • Assisted in Integration testing, Regression testing and analyzed data and created reports using SQL queries
  • Performed extensive requirement analysis including Data analysis and Gap analysis.
  • Responsible for maintaining versions of source code using Team Foundation Server (TFS).
  • Analyzed business requirements and segregated them into high level and low-level Use Cases, activity diagrams using Rational Rose according to UML methodology thus defining the Data Process Models.
  • Involved in designing and implementing basic SQL queries for QA Testing and Report / Data Validation.
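The VLOOKUP-style work described above has a direct equivalent outside Excel: build a lookup table keyed on the shared column, then enrich each row. A minimal sketch using the standard csv module, with the file contents inlined as hypothetical sample data:

```python
import csv
import io

# VLOOKUP-style join: look up each sale's category from a product table.
# The two "files" below are hypothetical sample data, inlined for clarity.
products = io.StringIO("sku,category\nA1,Loans\nB2,Deposits\n")
sales = io.StringIO("sku,amount\nA1,100\nB2,250\nA1,75\n")

# Build the lookup table keyed on the shared column (the VLOOKUP range).
lookup = {row["sku"]: row["category"] for row in csv.DictReader(products)}

# Enrich each sales row; unknown keys get a default, like IFERROR in Excel.
enriched = [
    {**row, "category": lookup.get(row["sku"], "UNKNOWN")}
    for row in csv.DictReader(sales)
]
print(enriched)
```

Unlike VLOOKUP, the dict lookup is constant-time per row, which matters when the data sets grow past what a worksheet handles comfortably.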

Confidential, Bridgeport, CT

Sr. Data Analyst

Responsibilities:

  • Gathered requirements from business owners, stakeholders, the Data Governance team and subject matter experts through meetings to understand the needs of the system.
  • Implemented and followed a Scrum Agile development methodology within the cross functional team and acted as a liaison between the business user group and the technical team.
  • Involved in SDLC including requirements gathering, designing, developing, testing, and release to the Production environment.
  • Used Data Warehouse Architecture and Design - Star and Snowflake Schemas, FACT and Dimension Tables, Physical/Logical Data Modelling and data flow diagrams.
  • Involved in developing and maintaining the Requirement Traceability Matrix (RTM).
  • Analyzed the Enterprise Data Warehouse in Teradata to create accurate fact and dimension tables so that they can be queried by OBIEE.
  • Participated with the team in building the Data Quality projects/solutions using Trillium Software System for customers.
  • Involved in creating impactful dashboards in Excel for data reporting using GETPIVOTDATA and VLOOKUP, helping transform raw data into meaningful, actionable information.
  • Moderate awareness of Linux Red Hat and UNIX commands, automated application performance monitoring and resource analysis, Oracle DB tools, troubleshooting, and system/application log reviews to maximize speed and accuracy.
  • Coordinated with Account Managers to discuss and approve Informatica workflow requirements and logic, and troubleshot design options for new Informatica workflows.
  • Analyzed SQL performance using Teradata viewpoint.
  • Performed extensive ETL, extracting data from different source systems such as Teradata, Salesforce, MS SQL Server, flat files and XML.
  • Performed numerous Ad-hoc requests, financial reports involving SQL scripts, UNIX, SAS and Teradata.
  • Reviewed basic SQL queries and edited inner, left, and right joins in Tableau Desktop by connecting live/dynamic and static datasets.
  • Liaison between Data Warehouse and marketing for using customer data efficiently.
  • Used SharePoint to store and manage documents in a single repository using a secure web-based system
  • Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting on the dashboard.
  • Worked on evaluating and comparing different tools for normalized and denormalized test data management with Hadoop.
  • Involved in developing and maintaining the ORS model, MDM rules and guidance.
  • Involved in unit testing; delivered and documented unit test plans.
  • Performed ETL on data from the Dev and QA environments, staging it for micro-batching to Spark.
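The Hive work above computes per-partition metrics for the dashboard. The shape of that computation (aggregate counts and totals by partition key) can be sketched in plain Python; the field names and values below are hypothetical:

```python
from collections import defaultdict

# Sketch of per-partition dashboard metrics: aggregate transaction counts
# and totals by (date, region), the way a GROUP BY over partitioned data
# would. Field names and sample rows are hypothetical.
transactions = [
    {"date": "2017-01-01", "region": "NE", "amount": 120.0},
    {"date": "2017-01-01", "region": "NE", "amount": 80.0},
    {"date": "2017-01-01", "region": "SE", "amount": 200.0},
]

metrics = defaultdict(lambda: {"count": 0, "total": 0.0})
for t in transactions:
    key = (t["date"], t["region"])
    metrics[key]["count"] += 1
    metrics[key]["total"] += t["amount"]

print(dict(metrics))
```

In Hive the same result comes from `GROUP BY` over the partition columns; partitioning and bucketing keep the scan restricted to the relevant slices of data.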

Confidential - Stamford, CT

Sr. Data Analyst

Responsibilities:

  • Worked extensively in documenting the Source to Target Mapping documents with data transformation logic.
  • Transformed requirements into data structures that can be used to efficiently store, manipulate and retrieve information.
  • Worked with different business users to develop Vision Document and Business Requirement Specifications by gathering Business and Functional Requirements.
  • Experience supporting the full Software Development Life Cycle SDLC in Agile Scrum methodology.
  • Documented the AS-IS Business Workflow adhering to UML standards. Comprehensively performed requirement gathering for enterprise reporting system-using Requisite Pro.
  • Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
  • Ensure that models conform to established best practices including normalization rules and accommodate change in a cost-effective and timely manner.
  • Wrote, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.
  • Involved in migration projects to migrate data from data warehouses on Oracle/DB2 to Teradata. Performed data integrity and balance sheet verification procedures, and contributed to process improvements in the product control function.
  • Maintained a Traceability Matrix to ensure that all Functional Requirements are addressed at the Use Case Level as well as the Test Case Level.
  • Performed Functional and GUI Testing to ensure that the user acceptance criteria are met. Created User training materials.
  • Coordinated with the SMEs to make sure that all business requirements were addressed in the application.
  • Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant and BTEQ; created new reports based on requirements.
  • Responsible for generating weekly ad-hoc reports.
  • Planned, coordinated, and monitored project levels of performance and activities to ensure project completion in time.
  • Created complex Teradata scripts to generate ad-hoc reports that supported and monitored day-to-day operations.
  • Involved in Building a specific data-mart as part of a Business Objects Universe, which replaced the existing system of reporting that was based on exporting data sets from Teradata to Excel spreadsheets.
  • Migrated three critical reporting systems to Business Objects and Web Intelligence on a Teradata platform.
  • Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems.
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
  • Used data analysis techniques to validate business rules and identify low-quality and missing data in the existing bank enterprise data warehouse (EDW).
  • Assessed the impact of low-quality and/or missing data on the performance of data warehouse clients.
  • Identified design flaws in the data warehouse, tested raw data and executed performance scripts.
  • Worked on the ETL process used by the DataStage system designers for extracting data from heterogeneous source systems and transforming and loading the data into the data warehouse/data marts.
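Identifying low-quality and missing data, as described above, usually starts with column-level null-rate profiling. A minimal sketch of such a check; the column names, sample rows and thresholds are illustrative:

```python
# Sketch of a column-level data quality check of the kind used to flag
# missing or low-quality values in a warehouse extract. Column names and
# sample rows are hypothetical.
rows = [
    {"loan_id": 1, "rate": 3.5, "state": "NY"},
    {"loan_id": 2, "rate": None, "state": ""},
    {"loan_id": 3, "rate": 4.1, "state": "CT"},
]

def null_rate(rows, column):
    """Fraction of rows where the column is None or an empty string."""
    missing = sum(1 for r in rows if r[column] in (None, ""))
    return missing / len(rows)

report = {col: null_rate(rows, col) for col in ("loan_id", "rate", "state")}
print(report)
```

Columns whose null rate exceeds an agreed threshold would then be escalated to the data owners, with the offending keys listed for remediation.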
