
Sr. Data Analyst Resume

Pasadena, CA

SUMMARY

  • 7+ years of professional experience as a Data Analyst in the Banking and Finance domain, including Master Data Management (MDM) experience.
  • Extensive experience in Retail & Commercial Banking, Consumer Servicing, Loan Originations, Check Operations, and Investment Banking.
  • Strong knowledge of Basel concepts - Credit Risk, Market Risk, Liquidity Risk.
  • Experience working with Financial Banking systems dealing in Mortgages, Collateral Management, Consumer Banking, Trading, and Market and Reference Data.
  • Extensive knowledge of various asset classes like Loans, Leases, Agreements, Deposits, Asset Backed Securities, Secured Financing Transactions, Equities, Fixed Income & Derivatives.
  • Strong experience in the Anti-Money Laundering domain well versed with AML regulations and regulatory procedures.
  • Highly experienced in all aspects of Know Your Customer (KYC), Customer Due Diligence, and Behavior Detection.
  • Experience in Banking and Financial services projects involving card services, consumer banking, business banking, wealth management, auto loans, student loans and capital markets.
  • Knowledge and clear conceptual understanding of the Investment Banking industry, including Fixed Income, Bonds and the Bond Trading Cycle, Bond Pricing, Equities, the Trade Cycle, Derivatives (Options, Futures, Swaps), and Portfolio Management.
  • Strong experience in Data Analysis, Data Profiling, Data Migration, Data Integration and Metadata Management Services.
  • Extensive experience in Informatica PowerCenter 9.x/8.x/7.x (Designer, Repository Manager, Workflow Manager, Workflow Monitor), the Informatica Data Quality tool (IDQ), and the Informatica Analyst tool.
  • Good experience in Data Modeling with expertise in creating Star & Snow-Flake Schemas, FACT and Dimensions Tables, Physical and Logical Data Modeling using Erwin.
  • Experience in Data Analysis functions including Data Virtualization, Data Visualization, and Data Transformation & Sourcing.
  • Exposure to implementation and operations of data governance, data strategy, data management and solutions.
  • Experience in understanding Stored Procedures, Stored Functions, Database Triggers, and Packages using PL/SQL.
  • Good knowledge of Data Warehousing concepts, including Metadata and Data Marts.
  • Experience in Analysis and review of Software and Business Requirement Documents.
  • Strong experience in BI reporting, including Cognos and Spotfire.
  • Experience in dealing with different data sources ranging from Flat files, SQL server, Oracle, MySQL, MS Access and Excel.
  • Expertise in data governance, data stewardship, data quality, data analytics and reference data management.
  • Experienced in Data Collection, Data Extraction, Data Cleansing, Aggregation, Data Mining, Verification, Reporting and Data Warehousing.
  • Strong understanding of ETL, reporting, business analytics (business intelligence) and data warehouse concepts.
  • Experience with reports and dashboards development for operational and analytical needs and exploring data using data visualization tools.
  • Extensive experience on business intelligence (and BI technologies) tools such as OLAP, Data warehousing, reporting and querying tools, Data mining and Spreadsheets.
  • Worked with BI/Reporting tools such as Tableau, Crystal Reports, and QuickSight, having created various visualizations using Bar/Line/Pie charts, Maps, Scatter Plots, Heat Maps, and client reports.
  • Experience in DBMS/RDBMS implementation using object-oriented concept and database toolkit.
  • Proficient in setting up VPCs, private/public subnets, security groups, and EC2 instances, creating IAM roles and permissions, configuring Redshift, using Lambda functions, and setting up AWS Glue for ETL.
  • Hands-on experience in writing ETL scripts using Informatica, SSIS, and Glue for database operations and other activities (involving extraction, transformation, and loading).
  • Extensive experience with SharePoint, Visio, Power Point, Microsoft Project and Microsoft Office tools.
  • Familiar with Agile (Scrum) and Waterfall methodologies for the SDLC.
  • Executed Test Plans, Test Scripts, and Test Cases based on Design and User Requirement documents for testing purposes.
  • Strong experience in conducting User Acceptance Testing (UAT) and documentation of Test Cases. Expertise in designing and developing Test Plans and Test Scripts.
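The star- and snowflake-schema data modeling listed above can be illustrated with a minimal sketch. All table and column names here are hypothetical, and SQLite stands in for the actual warehouse platform; this is an illustration of the pattern, not any specific project's model:

```python
import sqlite3

# Minimal star-schema sketch: one fact table joined to two dimension tables.
# Names (dim_customer, fact_transactions, etc.) are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT,
    segment       TEXT              -- e.g. Retail, Commercial
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,   -- YYYYMMDD surrogate key
    calendar_date TEXT,
    quarter       TEXT
);
CREATE TABLE fact_transactions (
    txn_id       INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
""")

cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme Corp', 'Commercial')")
cur.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 'Q1')")
cur.execute("INSERT INTO fact_transactions VALUES (100, 1, 20240115, 2500.0)")

# Typical star-schema query: fact measures sliced by dimension attributes.
cur.execute("""
SELECT d.quarter, c.segment, SUM(f.amount)
FROM fact_transactions f
JOIN dim_customer c ON f.customer_key = c.customer_key
JOIN dim_date d     ON f.date_key = d.date_key
GROUP BY d.quarter, c.segment
""")
print(cur.fetchone())  # ('Q1', 'Commercial', 2500.0)
```

The fact table holds additive measures keyed by surrogate keys; each dimension carries the descriptive attributes reports group by.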

PROFESSIONAL EXPERIENCE

Confidential, Pasadena, CA

Sr. Data Analyst

Responsibilities:

  • Analyzed the existing customer data, identified the information that needed to be captured for each existing customer, and flagged missing values and anomalies for the completion of the CDD.
  • Identified and extracted relevant data from large datasets using data reduction and mining techniques.
  • Involved in data migration of various data sources from several platforms to Oracle.
  • Worked with the Data Warehouse team on the development and execution of data conversion, data cleansing, and standardization strategies and plans, as several small tables were combined into one single data repository system, MDM (Master Data Management).
  • Extracted data from SQL Server into data marts, views, and/or flat files for Tableau workbook consumption using T-SQL.
  • Carefully examined data and reports to make sure the conversion proceeded correctly, ran test scripts with various data to see how new or customized transactions processed through the software, and verified and validated the accuracy of data through the generation of a variety of reports.
  • Reviewed the conversion results (reports, balancing of systems, balancing system conversion to associated General Ledger accounts, errors) with the customer and obtained client sign-off.
  • Responsible for technical data analysis, interpretation of client data, and the development of conversion programs specific to bank applications and their application to the general ledger.
  • Maintained large datasets and combined data from various sources in varying formats to create SAS datasets and/or ASCII files using SET and MERGE for generating reports and graphs.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Created scorecards in IDQ and helped Business Analysts define survivorship rules in MDM.
  • Wrote SQL queries for checking the Data Migration, Data Transformations, and Database Integrity in Teradata.
  • Planned and documented procedures for data processing and prepared data flow diagrams for the application.
  • Modified and developed new ETL transformations, summary tables, and data staging areas based upon redesign activities with the Informatica ETL tool.
  • Performed Data Validation to check for the proper conversion of the data and Data Cleansing to identify bad data and clean it.
  • Performed deep-dive analysis of open issues by product (security/entity type), including analysis of system architecture, data flows from the Front Office trade capture system to the GL, data models, and unique functionality based on the business migration plans (e.g., provisions functionality for the Loans product).
  • Performed data reconciliation between Tableau Dashboard results and QA queries.
  • Wrote complex SQL queries for data profiling in support of the Data ETL Integration tool.
  • Involved in Data Modeling of both Logical Design and Physical design of data warehouse and data marts in Star Schema and Snow Flake Schema methodology.
  • Part of the team conducting logical data analysis and data modeling JAD sessions; communicated data-related standards.
  • Used Data Warehousing for Data Profiling to examine the data available in an existing database.
  • Contributed significantly to the enterprise metadata efforts by completing the data mapping repository (DMR) information.
  • Responsibilities included source system analysis, data transformation, loading, validation for data marts, operational data store and data warehouse.
  • Collected business requirements to govern proper data transfer from data source to data target during the data mapping process.
  • Used Test Director and Mercury Quality Center for updating the status of all the Test Cases & Test Scripts that are executed during testing process.
  • Developed test cases for manual testing and automated them using WinRunner, Silk, LoadRunner, SilkPerformer, and QTP.
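The migration-validation queries described above typically reconcile row counts and column totals between source and target. A minimal sketch of that pattern follows; SQLite stands in for Teradata, and the table and column names are hypothetical:

```python
import sqlite3

# Post-migration reconciliation sketch: compare row count and a column
# checksum (SUM of balances) between a source table and its migrated copy.
# SQLite stands in for the actual platform; names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_accounts (acct_id INTEGER, balance REAL);
CREATE TABLE tgt_accounts (acct_id INTEGER, balance REAL);
INSERT INTO src_accounts VALUES (1, 100.0), (2, 250.5), (3, 75.25);
INSERT INTO tgt_accounts VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

def reconcile(table_a, table_b):
    """Return True when both tables agree on row count and balance total."""
    count_a, sum_a = cur.execute(
        f"SELECT COUNT(*), SUM(balance) FROM {table_a}").fetchone()
    count_b, sum_b = cur.execute(
        f"SELECT COUNT(*), SUM(balance) FROM {table_b}").fetchone()
    return count_a == count_b and abs(sum_a - sum_b) < 1e-9

print(reconcile("src_accounts", "tgt_accounts"))  # True
```

In practice the same check runs per partition or load batch, and a mismatch triggers a row-level diff rather than a blanket reload.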

Confidential, Dallas, TX

Data Analyst

Responsibilities:

  • Queried relational databases (T-SQL) and used Microsoft SQL Server Integration Services to pull and integrate data from various sources.
  • Implemented ETL processes with SSIS packages using Heterogeneous data sources (SQL Server, ORACLE, Flat Files, Excel source files, XML files, Text files, etc.) and then loaded the data into destination tables by performing different kinds of transformations using SSIS.
  • Extensively used the SSIS Import/Export Wizard for performing ETL operations. Performed ETL mappings and ETL tasks using SSIS on the credit card sales database.
  • Improved the reporting dashboards (with KPIs) through SSRS and Excel OLAP Pivot table.
  • Defined report layouts including report parameters and wrote queries for Drill down, Drill through reports as per business client's requirements using SSRS.
  • Worked on all types of report types like tables, matrix, charts, sub reports etc.
  • Translated analytic insights into concrete and actionable recommendations for business or product improvement.
  • Responsible for evaluating various RDBMS/OLTP modeling, documentation, and metadata reporting tools, including Erwin and PowerDesigner. Developed logical/physical data models using the Erwin tool across the subject areas based on the specifications and established referential integrity of the system.
  • Identified and understood the business function activities, entities, data attributes, and table metadata, and documented detailed design specifications for the new system.
  • Generated reports (SSRS/Crystal) from SQL Server Database and Cubes and included parameterized and cascading parameterized reports to populate drilldowns, drill through, and sub-report.
  • Extensively used both Star Schema and Snowflake Schema methodologies in building and designing the logical data model in Dimensional Models.
  • Responsible for creating weekly and monthly reports for management based on Cognos data.
  • Ensured production data was replicated into the data warehouse without any data anomalies from the processing databases.
  • Reviewed the logical model with application developers, the ETL team, DBAs, and the testing team to provide information about the data model and business requirements.
  • Involved in defect management, including root cause analysis, solution recommendation(s), and next-step coordination.
  • Conducted UAT for data integrity. Analyzed user requirements, attended Change Request meetings to document changes, and implemented procedures to test changes.
  • Queried the database using SQL queries to ensure data integrity and to validate the inserted and updated data.
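The data-integrity queries mentioned above usually look for two things: duplicate natural keys and orphaned child rows. A minimal sketch of both checks, with a hypothetical schema and SQLite standing in for SQL Server:

```python
import sqlite3

# Data-integrity check sketch for UAT: find duplicate keys and orphans.
# Schema and data are illustrative, not from any specific project.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers (cust_id INTEGER, name TEXT);
CREATE TABLE orders (order_id INTEGER, cust_id INTEGER);
INSERT INTO customers VALUES (1, 'A'), (2, 'B'), (2, 'B');  -- duplicated key 2
INSERT INTO orders VALUES (10, 1), (11, 99);                -- 99 has no parent
""")

# Duplicate natural keys: any cust_id appearing more than once.
dupes = cur.execute("""
    SELECT cust_id, COUNT(*) FROM customers
    GROUP BY cust_id HAVING COUNT(*) > 1
""").fetchall()

# Orphaned children: orders whose cust_id matches no customer row.
orphans = cur.execute("""
    SELECT o.order_id FROM orders o
    LEFT JOIN customers c ON o.cust_id = c.cust_id
    WHERE c.cust_id IS NULL
""").fetchall()

print(dupes)    # [(2, 2)]
print(orphans)  # [(11,)]
```

Both queries return empty result sets on clean data, which makes them easy to wire into automated UAT checks.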

Confidential, Warsaw, NY

Data Analyst

Responsibilities:

  • Collaborated with cross-functional teams in support of business case development and identifying modeling method(s) to provide business solutions. Determined the appropriate statistical and analytical methodologies to solve business problems within specific areas of expertise.
  • Built models and solved complex business problems where analyses of situations and/or data required in-depth evaluation of variable factors.
  • Generated data models using Erwin, developed a relational database system, and was involved in logical modeling using Dimensional Modeling techniques such as Star Schema and Snowflake Schema.
  • Participated in documenting the data governance framework, including processes for governing the identification, collection, and use of data to assure accuracy and validity.
  • Worked on Metrics reporting, data mining, and trends in a helpdesk environment using Access.
  • Involved in loading the data from flat files submitted by vendors into Oracle 12c External tables; extensively used ETL to load data from the Oracle database, XML files, and flat files, and also imported data from IBM Mainframes.
  • Involved with ETL team to develop Informatica mappings for data extraction and loading the data from source to MDM Hub Landing tables.
  • Involved in MDM Process including data modeling, ETL process, and prepared data mapping documents based on graph requirements.
  • Used Pivot tables, VLOOKUPs, and conditional formatting to verify data uploaded to the proprietary database and online reporting.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications by utilizing data management software programs and tools like Perl, MS Access, and Excel.
  • Coordinated with Data Architects and Data Modelers to create new schemas and views in Netezza to improve report execution times; worked on creating optimized Data Mart reports.
  • Used both Kimball and Bill Inmon methodologies for creating data warehouse and transformed data from different OLTP systems.
  • Evaluated, identified, and solved process and system issues utilizing business analysis and design best practices, and recommended enterprise and Tableau solutions.
  • Used advanced functions like VLOOKUPs, Pivots, graphs, and analytical and statistical tool packs in Excel.
  • Involved in Normalization and De-Normalization of existing tables for faster query retrieval.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Extensively used database tools to develop Oracle stored packages, functions, and procedures for Oracle database back-end validations and web application development.
  • Performed analyses such as regression analysis, logistic regression, discriminant analysis, and cluster analysis using SAS programming.
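The regression analysis described above was done in SAS on the project; as a language-neutral illustration of the underlying computation, here is ordinary least squares for a single predictor in pure Python (toy data, no claim about the actual models used):

```python
# Ordinary least squares for y = a + b*x, the core of simple regression
# analysis. Pure-Python sketch for illustration; real work used SAS
# (and would use PROC REG or similar there).

def linreg(xs, ys):
    """Fit y = intercept + slope * x by least squares; return (intercept, slope)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope = covariance(x, y) / variance(x), both unnormalized.
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Toy data generated from y = 1 + 2x, so the fit recovers those coefficients.
a, b = linreg([1, 2, 3, 4], [3, 5, 7, 9])
print(a, b)  # 1.0 2.0
```

Logistic regression, discriminant analysis, and clustering follow the same pattern of fitting parameters to minimize a loss, just with different loss functions and iterative solvers.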

Confidential, Oklahoma City, OK

Data Analyst

Responsibilities:

  • Worked with DBAs to create a best-fit Physical Data Model from the Logical Data Model using Erwin.
  • Maintained the model updates using Model Mart for the entire project team.
  • Interacted with clients to assess needs, identify key challenges, and define project scope & deliverables.
  • Involved in developing data mapping documents for integration into a central model and depicting data flow across systems.
  • Extensively used Star Schema methodologies in building and designing the logical data model into Dimensional Models.
  • Redefined many attributes and relationships in the reverse-engineered model and cleansed unwanted tables and columns as part of Data Analysis responsibilities.
  • Performed Verification, Validation, and Transformations on the input data (text files, XML files) before loading into the target database.
  • Monitored the Informatica workflows using the PowerCenter monitor.
  • Involved in performing extensive Back-End Testing by writing SQL queries and PL/SQL stored procedures to extract the data from the SQL database.
  • Used BusinessObjects for validating the metadata and testing the data of reports.
