
Business Systems Analyst Resume Profile

Calabasas, CA


  • IT professional with 10 years of experience in business analysis, testing, code development and training.
  • Demonstrated ability to build relationships with senior-level managers to understand their business problems and document them using a range of tools and methodologies.
  • Proven ability to lead discussions to broker technology solutions among key project stakeholders.
  • Strong blend of technology skills and business understanding; proven ability to pick up new technologies and business processes and to influence discussions.
  • Strong understanding of data validation, metadata, data governance, master data management, data migration, data archival, data cleansing, data extraction, data transformation and data loading.
  • SAP-certified FI Solutions Consultant: strong understanding of chart of accounts, general ledger, subsidiary ledgers, special G/L transactions, value adjustments, manual accruals, posting control, debit balance check, interest calculation, check management and lockbox.
  • Excellent understanding of the banking and financial sector: consumer banking, investment banking, mortgage, collateralized debt obligations, collateralized mortgage obligations, value-at-risk, securitization, accounting, FAS 140, FAS 133, G-fee, options, derivative accounting, repo, reverse repo and option pricing.
  • Good conceptual understanding of SOX, COBIT, COSO, CAVR, Six Sigma and data governance.
  • Advanced skills in SQL, UNIX, PL/SQL, VBA, SSIS and Perl.
  • Intermediate skills in Java; beginner skills in DataFlux, HTML, XML, Tomcat and WebLogic.
  • Solid understanding of data normalization (first, second and third normal forms), data marts, data warehouses, star schemas, facts and dimensions, object-oriented programming, source-to-target mapping, data quality, metadata, and data retention and archival.
  • Advanced skills in reporting applications such as Cognos, Business Objects and SAS.

Technical Skills:

Front-end Tools: Cognos, Business Objects, SAS, MicroStrategy

Programming/Scripting and ETL: PL/SQL, Java, Perl, VBA, VB, UNIX shell scripting, SSIS

Databases and Data Stores: MS Access, Oracle 11g, Teradata, DB2, Sybase, SharePoint

Web Technologies: HTML, DHTML, ASP, CSS, XML, Tomcat, WebLogic

Testing Tools: Rational Test Suite, LoadRunner, QuickTest Professional (QTP)

Bug Tracking Tools: DOORS, Rational ClearQuest, Quality Center

ERP: SAP, PeopleSoft, Oracle Financials



Business Systems Analyst

Confidential is the second-largest bank in the US by assets. I am currently working as a Business Systems Analyst on the Rules Standards project. The objective of the project is to move SharePoint data into SQL Server to support SSRS reports and a web portal.

  • Lead discussions with Lines of Business in requirements gathering, solution design and delivery.
  • Work with the development team to communicate business requirements and to evaluate and recommend technical solutions and testing approaches.
  • Develop key documents such as high-level design (HLD) and low-level design (LLD) documents, accelerated change documents, workflow diagrams, use cases and data flow diagrams.
  • Key participant in project planning, estimation, solution development, testing and delivery.
  • Provide analytical support to different stakeholders, analyze issues/enhancements.
  • Evaluate technical solutions and influence the design, coding, testing and delivery of technical solutions.
  • Work with the development team and subject matter experts in documenting workflow logic while migrating FPS from .NET 1.0 to .NET 4.0.
  • Develop a test plan, test strategy and test scripts for component integration testing phase.
  • Investigate the root cause of data variances between actual and expected results and communicate findings to the development team.
  • Perform ad-hoc analysis of the foreclosed loan portfolio data.


Sr. Data Archival/Master Data Management Analyst

Confidential is the financing arm of Confidential. The company is upgrading several of its legacy applications, as well as enhancing its current applications. I am currently part of the company's data archival project, responsible for leading the analysis and quality assurance of the data archival process.

  • Serve as a data steward for migrating the legacy data to archival database.
  • Develop documents to consolidate and harmonize data across multiple business applications.
  • Provide technical support to design, develop, implement and maintain data governance and master data policies and procedures to support business process.
  • Profile current data and develop validation rules so that existing master data conforms to data quality standards before data is entered into different systems of record.
  • Develop complex SQL queries to validate that the archival process accurately picks up paid, charged-off and voided customer accounts.
  • Work collaboratively with the development team in analyzing the legacy application table structures/data elements for developing a robust, scalable code for high-volume data migration.
  • Perform root-cause analysis of data variance between expected and actual results for the data migration/archival process.
  • Develop defect resolution plan and communicate changes necessary to resolve the software defects.
  • Drive the software quality assurance for data archival in terms of developing the entry and exit criteria, framework for developing quality assurance, bug fixes, regression and testing.
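An archival-eligibility validation of this kind can be sketched with a toy example. Everything here is illustrative: the `accounts` table, the status codes and the simulated archive extract are invented stand-ins for the legacy schema, with SQLite in place of the production database.

```python
import sqlite3

# Hypothetical legacy table: account IDs with a status code.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO accounts VALUES (?, ?)",
    [(1, "PAID"), (2, "ACTIVE"), (3, "CHARGE_OFF"), (4, "VOID")],
)

# Business rule: only paid, charged-off and voided accounts may be archived.
eligible = {row[0] for row in conn.execute(
    "SELECT id FROM accounts WHERE status IN ('PAID', 'CHARGE_OFF', 'VOID')")}

# Simulated archival extract, checked both ways against the rule.
archived = {1, 3, 4}
leaked = archived - eligible   # archived but not eligible -> defect
missed = eligible - archived   # eligible but not archived -> defect
```

A non-empty `leaked` or `missed` set is what would be raised to the development team as an archival defect.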


Business Analyst

Confidential is the second-largest bank in the US by assets. I am working as a Technical Business/Data Analyst on projects involving General Ledger segment reporting for Confidential.

  • Develop complex T-SQL queries to validate that segmentation for cost centers has been accurately captured for first-lien and Held-for-Sale secondary mortgages.
  • Investigate reasons for exceptions between actual and expected results and escalate findings to the development team.
  • Perform data migration by loading data from several ODS systems to target systems using SSIS.
  • Follow-up on bug-fixes and execute regression testing after code-changes.
  • Develop Program Change Requests (PCRs) based on High-Level Design (HLD) and Low-Level Design (LLD) documents and inputs from key project stakeholders.
  • Work collaboratively with the team lead in developing high-level design documents and associated project artifacts for the TDR-NCL and Clear Optimization projects.
  • Work collaboratively with the Development team in clarifying Program Change Requests.
  • Respond to ad-hoc requests for mortgage-related data analysis from business and technical heads.
  • Gather requirements for validating data and develop SQL-based test scripts for data validation.
  • Test mortgage-related data (HFS and HFI loans) to verify that segmentation has been applied correctly and that journal entries are being booked to the new cost centers.
  • Perform root-cause analysis of data variance between expected and actual results for the GLSR project.
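An expected-vs-actual comparison of booked amounts can be sketched as follows. The table layout, cost-center codes and amounts are hypothetical, and SQLite stands in for SQL Server; the join-and-compare shape is the point.

```python
import sqlite3

# Invented expected and actual result sets for a segmentation test.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE expected (cost_center TEXT, amount REAL);
CREATE TABLE actual   (cost_center TEXT, amount REAL);
INSERT INTO expected VALUES ('CC100', 500.0), ('CC200', 250.0);
INSERT INTO actual   VALUES ('CC100', 500.0), ('CC200', 240.0);
""")

# Variance query: surface every cost center where booked amounts disagree.
variances = conn.execute("""
    SELECT e.cost_center, e.amount AS expected, a.amount AS actual
    FROM expected e JOIN actual a ON e.cost_center = a.cost_center
    WHERE e.amount <> a.amount
""").fetchall()
```

Each row in `variances` becomes a root-cause investigation item to escalate to the development team.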


Data Migration Analyst

Confidential is one of the leading credit card issuers in the US. With an eye on offering a full range of banking services, Capital One has purchased two commercial banks in the last three years. At Capital One, I worked as a data analyst and Quality Analyst for the DDE group.

  • Worked closely with the AML business group to generate reports by consolidating data residing in different databases (Teradata, Oracle).
  • Generated reports in SAS/Business Objects by consolidating data residing in different databases (Teradata, SQL Server and Oracle) to help auditors verify compliance with regulations.
  • Consolidated data residing in multiple databases/platforms using SAS to generate actionable business intelligence.
  • Migrated data from multiple source systems to the development/test environments using SSIS.
  • Worked collaboratively with the application development team to support software/application testing in conformity with Capital One's Distributed Data Environment (DDE) standards.
  • Developed test cases/test scripts and other test artifacts per Capital One's DDE standards, based on the business requirements document and technical design document, to support integration testing.
  • Validated data movement from source to target and identified probable causes of data variance.
  • Identified software defects and performed regression testing to validate defect resolution.
  • Validated data transformation during data movement from source to target.
  • Prepared appropriate data for loading into the test environment for application testing.
  • Identified performance variances between actual and expected results and escalated issues to the development team for software bug fixes.
  • Performed regression testing and provided sign-off on test results for code movement to production.
  • Analyzed business requirements and evaluated the software testing methodology for different components of the CCB integration.
  • Worked collaboratively with the testing and business analytics teams in reviewing CCB integration efforts with Capital One Bank.
  • Escalated concerns/issues and provided recommendations to the audit department on the CCB integration efforts.
  • Developed documentation for managerial/audit oversight.
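The source-to-target reconciliation pattern behind this kind of migration testing can be sketched minimally. The table names (`src`, `tgt`), the column and the values are invented, and SQLite stands in for the actual platforms; row counts plus a sum-based checksum serve as a coarse first-pass signal before field-level comparison.

```python
import sqlite3

# Toy source and target tables; the INSERT ... SELECT plays the migration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER, balance REAL);
CREATE TABLE tgt (id INTEGER, balance REAL);
INSERT INTO src VALUES (1, 100.0), (2, 200.0), (3, 300.0);
INSERT INTO tgt SELECT * FROM src;  -- the data movement under test
""")

def profile(table):
    # Row count plus a sum checksum over a numeric column.
    return conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(balance), 0) FROM {table}"
    ).fetchone()

src_profile, tgt_profile = profile("src"), profile("tgt")
match = src_profile == tgt_profile  # mismatch -> investigate variance
```

A mismatch here narrows the search before drilling into row-level differences.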

Business Data Analyst

Confidential is one of the two secondary mortgage companies in the Confidential. The company's trillion-dollar portfolio is supported by a robust technology infrastructure running on various technology platforms. I was part of the FAS140 TDQ project, whose objective was to ensure that appropriate data movement controls, along the lines of the CAVR framework, were in place to comply with SOX requirements.

  • Developed requirement documents for source-to-target mapping of column fields for the relevant tables.
  • Performed data profiling, data cleansing, and data validation using SAS.
  • Performed data validation and data variance analysis using SAS.
  • Developed the test strategy and performed data preparation, data cleansing and data loading.
  • Developed test plan, test cases and test scripts for UAT testing.
  • Worked collaboratively with the SIT team in automating SIT test cases, analyzing defects and remediating them.
  • Performed root-cause analysis of data variance in data quality report generated by SAS.
  • Automated UAT test cases using VBA. Analyzed data variance in the TDQ exception report during UAT phase.
  • Worked closely with the SAS Development team and the ETL development team in fixing defects.
  • Performed manual testing of non-automated test scripts as well as regression testing once defects were fixed.
  • Updated Requirement Documents and communicated the changes to all stakeholders.
  • Documented test results and all the project artifacts for audit and managerial sign-offs.

Senior Data/ETL Analyst

Confidential is the world's leading provider of financial services to institutional investors, with 12 trillion in assets under custody and 1.4 trillion under management. I was part of a team building a new BI reporting platform on Cognos.

  • Developed source-to-target documentation for mapping legacy data with data mart layer.
  • Participated in data normalization exercises. Designed Fact, Dimension and Reject tables.
  • Analyzed the data model and suggested changes to the model to support the presentation layer in Cognos.
  • Performed root-cause analysis of data variance between source system and data mart.
  • Developed data quality rules based on inputs from business leaders.
  • Recommended strategies for improving data quality to keep bad data out of the data mart.
  • Prepared test data for application testing and loaded it to the Oracle database.
  • Developed testing strategy for new business intelligence application.
  • Developed test cases and test scripts using SQL for back-end testing.
  • Prepared test cases using UNIX shell scripting for regression testing.
  • Developed automated test cases using VB Scripts for parameterized testing in QTP.
  • Filed defects in Rational ClearQuest and assigned them severity levels.
  • Developed test cases on dimensional data with aggregation and transformation rules.
  • Served as the contact person for the offshore development and testing team during application development.
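A back-end test on dimensional data with aggregation rules can be sketched like this. The star schema, categories and amounts are invented for illustration, with SQLite standing in for the warehouse; the test verifies that a fact-table aggregate matches what the report layer should display.

```python
import sqlite3

# Toy star schema: one dimension table, one fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER, category TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'Equity'), (2, 'Fixed Income');
INSERT INTO fact_sales  VALUES (1, 10.0), (1, 15.0), (2, 20.0);
""")

# Aggregation rule under test: total amount per dimension category.
totals = dict(conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category
"""))
```

`totals` is then compared against the figures produced by the BI front end; a discrepancy points at either the ETL load or the report's aggregation logic.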


Systems/Data Analyst

Confidential is a quasi-government, shareholder-owned entity. I was part of the restatement project, where my principal responsibility was validation and user acceptance testing (UAT) using SQL queries. The job called for sound analytical skills; advanced Excel, Word and SQL; and a sound understanding of the mortgage industry, mortgage products, US GAAP (FAS 140, 144), Sarbanes-Oxley (SOX), derivatives, derivative accounting and SEC regulations. The project ran 24x7.

  • Developed functional specification document for business reports on Fannie Mae analytics.
  • Interviewed SMEs to develop the validation and transformation rules for ETL process.
  • Prepared the high-level test strategy, test plan and test scripts for SIT/UAT testing.
  • Developed test cases for regression testing in UNIX using Shell Scripting.
  • Prepared test data for application testing and loaded it to the Oracle database.
  • Analyzed mortgage-backed securities to establish the correct basis for interest-only and principal-only cash flows in line with US GAAP FAS 91 guidelines.
  • Generated business reports using Business Objects, Hyperion and Cognos.
  • Analyzed MRBs to establish the correct basis for cash flow amortization per FAS 140.
  • Analyzed Fannie Mae's Real Estate Mortgage Investment Conduit (REMIC) portfolio to establish the correct basis for cash flow amortization in line with FAS 144 US GAAP guidelines.
  • Mapped the cash flow data from the Restatement Financial Data Warehouse (RFDW) into the Securities Cost Basis Sub-Ledger (SCBSL) and the general ledger.
  • Performed application testing in SIT phase, UAT and pre-production environment.
  • Performed regression testing in UNIX after bug fixes and after every new code release.
  • Worked closely with Deloitte auditors to ensure SOX 404 compliance, risk control, data availability, data integrity and data governance in light of the COBIT and COSO frameworks.
