
Senior Data Analyst/Information Systems Analyst Resume


San Francisco, CA

SUMMARY:

  • Strong analytical ability and creativity in developing new ideas; able to provide flexible solutions, identify issues, solve problems, identify trends in data and interpret test results; good verbal and written communication skills; able to lead, coordinate and deliver solutions; solid planning/scheduling skills and sound business knowledge.
  • 20 years of experience leading teams and projects and performing data analysis, design, development, testing, release, monitoring, maintenance and data management for financial/loan and commercial applications in the Unix environment, using Oracle, Teradata, SQL Server, DB2 and Informix databases, primarily Oracle and Teradata. Work includes designing, developing, testing and troubleshooting Teradata scripts/SQL and PL/SQL stored procedures and triggers, applying optimization/tuning methods and advanced PL/SQL features, and communicating with Business Analysts to discuss new projects that fulfill reporting requirements.
  • Coordinate with project managers, DBAs, development, operations and production support groups to discuss new installs and data issues, research and uncover flaws, and find solutions. Worked on data migrations. Received many letters of recognition from management.
  • Over 18 years on financial applications covering consumer credit card, collections, housing loans, margin trading, portfolio management, check encashment, insurance, payment evaluation, offer enrollment, offer-order management, billing, online banking products, bill pay, direct pay, return item and deposit detail, online desktop deposit, fraud, wholesale, equity, lease residual and other credit risk applications.

TECHNICAL SKILLS:

  • MS-DOS/UNIX/AIX/Windows operating systems
  • Teradata: BTEQ, MLOAD, FASTLOAD, IMPORT, EXPORT
  • Oracle 10g: SQL*Plus, PL/SQL, data warehouses, data marts
  • SQL Server
  • Data warehouse SCD and CDC concepts; full/incremental load methods
  • Reporting, Excel
  • C, Pro*C, shell scripting, Perl, TCP/IP sockets
  • Informix-4GL, Informix-SQL, advanced Informix
  • Data modeling, data analysis, trend analysis, statistical analysis, data investigation, data mining, data presentation
  • SDLC, Visio, Business Objects, full life cycle projects

WORK EXPERIENCE:

Senior Data Analyst/Information Systems Analyst

Confidential, San Francisco

Responsibilities:

  • Work on the Data Management and Insights eONE initiative to support regulatory reporting for IIS and FlexCube deposit products (project postponed).
  • Map and compare data elements between source and destination database systems, identify discrepancies and gaps, and provide solutions. Understand source data for analysis and reporting. Understand the composition of regulatory line items by tracing lineage back to FSDF (Financial Services Data Foundation, an Oracle product).
  • Run SQL queries to find data issues and referential integrity problems in data loaded from source to target tables (a sample reconciliation query follows this list). Understand balances associated with instruments at 'Confidential' granularity. Tools: SQL, Oracle TOAD.
  • Understand data lineage and prepare data flow diagrams for the deposit systems using Visio.
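A minimal sketch of the kind of source-to-target reconciliation and referential integrity checks described above, assuming hypothetical Oracle table and column names (src_deposit_acct, tgt_deposit_acct, tgt_balance, acct_id):

    -- Row counts on each side of the load
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt FROM src_deposit_acct
    UNION ALL
    SELECT 'TARGET' AS side, COUNT(*) AS row_cnt FROM tgt_deposit_acct;

    -- Keys present in the source but missing from the target (swap the operands to find extras)
    SELECT acct_id FROM src_deposit_acct
    MINUS
    SELECT acct_id FROM tgt_deposit_acct;

    -- Referential integrity: balance rows whose account key has no parent account
    SELECT b.acct_id
    FROM   tgt_balance b
    LEFT JOIN tgt_deposit_acct a ON a.acct_id = b.acct_id
    WHERE  a.acct_id IS NULL;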

Senior Data Analyst/Information Systems Analyst

Confidential, San Francisco

Responsibilities:

  • A wealth management bank providing personal and business banking. Part of the New Account Opening (NAO) and Investment Policy Statement (IPS) Monitoring team, supporting data analysis and reporting. The team followed the Scrum Agile framework, working in two-week sprints with daily stand-ups to track progress. Environment: SQL Server.
  • Understand the back-end integration between FlightPath and Revenue Center and the bank's data hub.
  • Provide stories/requirements to ETL and API developers for extracting billing, account and rule associations from the FlightPath system for Revenue Center, the bank's billing system.
  • Map and compare data elements between source and destination database systems, identify discrepancies and gaps, and provide solutions. Understand source data and map it to database elements for analysis and reporting.
  • Prepare test data for developers; prepare SQL scripts for loading test data between test environments.
  • Review data for accuracy and identify gaps. Validate key data elements, identify issues or failures, create exception reports and present them to the development team for remediation.
  • Create ad hoc reports in SQL/Excel on business request for household/party/customer/account data and required metrics, accessing MDM and other databases.
  • Prepare metadata to help developers better understand the underlying data in the database.
  • Support a system that associates accounts to an IPS and manages accounts in line with clients' risk expectations. For an IPS, the tool runs a compliance-status classification hierarchy test and assigns a status based on asset class range and portfolio/strategy. Create a weekly report of IPS status, plus ad hoc reports on data gaps that identify accounts missing between custodian source systems, the account system and target systems (see the sample exception query after this list).
  • Prior to decommissioning of the FolioDynamics IPS creation system, understand and document the system.
  • Understand API calls in the IPS monitoring system and prepare a sequence diagram.
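A sketch of the kind of exception report used to flag accounts received from a custodian feed but missing from the target IPS system; the table and column names (custodian_account, ips_account, acct_nbr, custodian_cd, open_dt) are hypothetical:

    -- Accounts in the custodian source feed with no matching row in the IPS system
    SELECT c.custodian_cd,
           c.acct_nbr,
           c.open_dt
    FROM   custodian_account c
    LEFT JOIN ips_account i ON i.acct_nbr = c.acct_nbr
    WHERE  i.acct_nbr IS NULL
    ORDER BY c.custodian_cd, c.acct_nbr;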

Senior Data Analyst/Information Systems Analyst

Confidential, San Francisco

Responsibilities:

  • Design, create and maintain SQL stored procedures on a SQL Server database for the Interim Profitability Model. The process includes creating product profitability data files in upload format, uploading the product files to the SQL Server database, running the process to provide a consolidated view, and creating the profitability report (a simplified sketch follows this list).
  • Reconcile product profitability data with the GL/Oracle Essbase by line of business/AU and review year-over-year change.
  • Review data for accuracy; understand the model and compare its data with external source data. Carry out data mapping. Analyze data in external data sources and validate key data elements, identifying issues or failures, using SQL, Oracle TOAD, Teradata, SQL Server and Excel.
  • Understand source data and map it to database elements for loading, analysis and reporting. Compare data elements between source and destination database systems, identify discrepancies and gaps, and provide solutions. Correct mappings and rerun through the model.
  • Create exception reports on Profitview data and data comparisons between sources for business review and remediation. Prepare the profitability report for the business in Excel.
  • Create a data model to maintain historical data for customer data mapping.
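A simplified T-SQL sketch of the consolidation and GL reconciliation step described above; the procedure, table and column names (stg_product_profit, profit_consolidated, gl_balance) are hypothetical, not the actual model objects:

    CREATE PROCEDURE dbo.usp_consolidate_profitability
        @period_end DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Roll the uploaded product files up to a consolidated view by line of business / AU
        INSERT INTO dbo.profit_consolidated (period_end, lob_cd, au_cd, revenue, expense)
        SELECT @period_end, lob_cd, au_cd, SUM(revenue), SUM(expense)
        FROM   dbo.stg_product_profit
        WHERE  period_end = @period_end
        GROUP BY lob_cd, au_cd;

        -- Reconcile consolidated revenue against the GL extract; nonzero variances go to review
        SELECT p.lob_cd, p.au_cd, p.revenue - g.gl_revenue AS revenue_variance
        FROM   dbo.profit_consolidated p
        JOIN   dbo.gl_balance g
               ON g.lob_cd = p.lob_cd AND g.au_cd = p.au_cd AND g.period_end = p.period_end
        WHERE  p.period_end = @period_end
          AND  p.revenue - g.gl_revenue <> 0;
    END;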

Senior Data Analyst/Information Systems Analyst

Confidential, San Ramon

Responsibilities:

  • Support a regulatory framework introduced by the Federal Reserve to regulate large banks. Tasks involve analysis and validation of key data elements in the bank's loan system against a set of Federal instructions within this framework, setting up the Fed Edit process to identify Fed failures, reporting, and providing remediation. Work with Business Analysts and other groups to discuss requirements.
  • Provide data for regulatory reporting. Per business requirements, retrieve data from the mainframe source system and compare it with the bank's database to identify gaps and provide explanation and remediation.
  • Create Oracle SQL scripts to emulate Federal instructions or conditions and execute them against the quarterly loan portfolio Oracle 11g database (a sample edit-check query follows this list). Prepare SQL for business logic.
  • Prepare PowerPoint/Excel presentations for management to support results from the quarterly Fed Edit process, using Excel pivot tables and charts. Categorize failures into projects and error types for analysis and remediation.
  • Analyze data received from external data sources and validate key data elements against Fed instructions/Fed edit checks. Provide feedback on Fed-identified issues or failures using SQL, TOAD and Excel.
  • Understand source data and map it to database elements for loading, analysis and reporting. Compare data elements between source and destination database systems for discrepancies and gaps.
  • Provide null trend analysis of key data elements to understand the impact on exposure. Execute Fed Edit checks on data elements sourced by the Moody's analytical application used for spreading customer financials as part of the underwriting process. Understand allowable nulls. Work directly with Business Analysts and other required groups to discuss and assess data issues and find resolutions. Provide gap analysis.
  • Provide QA work for project implementations. Perform data comparison between source and target tables to verify that data loaded correctly and aligns with Fed Edits and other requirements.
  • Per business requirements, pull data using Ultraquest from the mainframe source system and compare it with the bank's Oracle database. Create daily CSV files for daily transactions combined with obligor/obligation data, transaction-wise summaries and transaction-wise month-to-date summaries using SQL/Oracle reporting/SQL*Plus. Identify gaps and provide remediation and explanation. Communicate with many groups and management to identify the data required for the project. Escalate issues.
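A sketch of how a Federal edit check can be emulated in Oracle SQL against the quarterly loan portfolio; the edit condition, table (loan_portfolio_qtr), columns and the :rpt_qtr bind variable are hypothetical illustrations, not the actual Fed instructions:

    -- Hypothetical edit: committed exposure must be populated and no smaller than utilized exposure
    SELECT loan_id,
           committed_amt,
           utilized_amt,
           CASE
               WHEN committed_amt IS NULL        THEN 'FAIL: committed amount missing'
               WHEN committed_amt < utilized_amt THEN 'FAIL: committed < utilized'
               ELSE 'PASS'
           END AS edit_status
    FROM   loan_portfolio_qtr
    WHERE  report_qtr = :rpt_qtr;

    -- Null trend of a key element across quarters, to gauge impact on exposure
    SELECT report_qtr,
           COUNT(*) AS total_loans,
           SUM(CASE WHEN committed_amt IS NULL THEN 1 ELSE 0 END) AS null_committed,
           ROUND(100 * SUM(CASE WHEN committed_amt IS NULL THEN 1 ELSE 0 END) / COUNT(*), 2) AS null_pct
    FROM   loan_portfolio_qtr
    GROUP BY report_qtr
    ORDER BY report_qtr;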

Senior Data Analyst/Information Systems Analyst

Confidential, San Francisco

Responsibilities:

  • Work for the Energy Efficiency group. Identify products/programs and savings setup requirements and enable them in MDSS. Carry out data analysis for data quality, data cleansing and data improvement efforts. Design and develop processes to improve reporting.
  • Create and develop SQL scripts using TOAD/SQL to perform measure or product setup in MDSS, an Oracle database.
  • Determine data needs and collect data to achieve desired data outcomes.
  • Support metrics populations based on data changes to the tables. Control the migration of product and savings data changes and reference data changes through the development life cycle.
  • Write procedures/SQL scripts to enable data validations at release time.
  • QA/QC of the data during release cycles and project implementation.
  • Data mapping - Map source to Target data to include mapping measures and services.
  • Identify data quality issues, investigate and research them, and implement solutions to correct problems. Carry out data validation, data integrity, data profiling, data auditing and data gap analyses; implement data cleansing and data improvement initiatives.
  • Design code to correct historical data. Create a routine clean-up strategy and a maintenance plan as needed.
  • Design and develop processes to validate new data and changes in processes.
  • Manage and enforce data constraints to ensure the integrity of the database tables. Analyze and validate data setup and ensure quality by checking for duplicate records, codes or attributes in key tables (see the duplicate check after this list).
  • Assist with impact analysis of any changes to the savings tables.
  • Liaise with Business Analysts and Program Managers to discuss reporting needs. Design and develop processes/SQL scripts for reporting.
  • Design and Develop Dashboard Reports to enhance customer experience.
  • Received letter of Excellence from management for Data presentation and a job well done.
  • Improve existing reporting scripts for performance and flexibility.
  • Produce relevant dashboards for the business team.
  • Create error reporting during the data quality initiative.
  • Communicate with peers inside and outside of the department, exchanging ideas and gathering information.
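A sketch of the kind of duplicate check run against key MDSS reference tables during release QA; the table (measure_ref) and its business key columns are hypothetical:

    -- Business keys appearing more than once indicate duplicate reference records
    SELECT measure_cd,
           program_cd,
           version_nbr,
           COUNT(*) AS dup_cnt
    FROM   measure_ref
    GROUP BY measure_cd, program_cd, version_nbr
    HAVING COUNT(*) > 1
    ORDER BY dup_cnt DESC;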

Senior Data Analyst

Confidential, San Francisco

Responsibilities:

  • Aptitude for testing and debugging code, analyzing data to identify data issues from external data sources and creating solutions.
  • Understand Data Models to prepare SQL scripts for Data Analysis.
  • Interpret results from testing. Prepare data quality analysis including data mapping reviews, data validation and remediation, and issue impact analysis. Validate the accuracy of the data and uncover flaws/data issues in the requirements/design from a business perspective.
  • Understand data mappings and the data dictionary to compare data elements between source and destination database systems for discrepancies and gaps. Produce source-to-target mappings.
  • Map reporting dimensions/metrics to source columns; prepare source-to-target mapping documents by mapping source columns to target columns in the data warehouse.
  • Work directly with business partners, Business Analysts and other required groups to discuss and assess data issues and find resolutions. Provide gap analysis.
  • Understand the recruiting dimensions and metrics and map them to source information for reporting. Meet with business partners to gather requirements.
  • Run SQL queries to find data issues and referential integrity issues in data loaded from source to dimension/fact tables, applying data warehouse SCD and CDC concepts and full/incremental load methods (a sample validation query follows this list).
  • Understand data mappings to compare data elements and counts between Oracle and DB2 database systems for discrepancies. Generate comparison reports using Excel to identify issues and find solutions.
  • Produced a data flow of the entire Human Resources production environment using Visio; the work was well commended by upper management.
  • Create complex SQL scripts against Oracle and DB2 databases to load data and perform data analysis, using TOAD and AQT.
  • Prepare functional specifications for source-related activities in the Talent Management Acquisition system.
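A sketch of one validation run after an incremental (CDC) load into a Type 2 slowly changing dimension; the dimension (employee_dim), source table (src_employee), the row_hash change-detection column and the :last_load_ts bind variable are hypothetical:

    -- A Type 2 dimension should carry exactly one current row per natural key
    SELECT employee_id, COUNT(*) AS current_rows
    FROM   employee_dim
    WHERE  current_flag = 'Y'
    GROUP BY employee_id
    HAVING COUNT(*) <> 1;

    -- Source rows changed since the last load that did not produce a matching current version
    SELECT s.employee_id
    FROM   src_employee s
    LEFT JOIN employee_dim d
           ON  d.employee_id  = s.employee_id
           AND d.current_flag = 'Y'
           AND d.row_hash     = s.row_hash
    WHERE  s.last_update_ts > :last_load_ts
      AND  d.employee_id IS NULL;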

Senior Data Analyst/Database Development

Confidential, San Francisco

Responsibilities:

  • Work as Senior Data Analyst/Database Developer for the Enterprise Capital Management operations group, supporting and providing data for economic and regulatory capital requirements including Basel 1, Basel 2 and Basel 3 forecasting and management reporting for lines of business, with a lead role in the Equity, Lease Residual and other work streams.
  • Work directly with business partners, Business Analysts and other required groups to identify and assess requirements and product gaps, and develop solutions providing data for capital status, risk exposure and other key estimates for the bank, using Teradata utilities (SQL, MLOAD, FastLoad, BTEQ, IMPORT, EXPORT) and Unix shell scripting.
  • Create Unix/Shell/Sql scripts to implement DDL changes in Testing and production environments.
  • Create month-end data load jobs using UNIX/shell and SQL utilities (BTEQ/MLOAD/IMPORT/EXPORT), along with SQL data processing scripts that calculate and produce data for assessing credit status (a simplified load step follows this list).
  • Aptitude for testing and debugging code, analyzing data to identify data issues caused by external data sources, and creating solutions.
  • Interpret results from testing. Prepare data quality analysis including data mapping reviews, data validation and remediation, and issue impact analysis. Validate the accuracy of the data and uncover flaws/data issues in the requirements/design from a business perspective. Understand data mappings to compare data elements between source and destination database systems for discrepancies.
  • Generate Excel reports to identify issues and find solutions. Prepare test cases, document issues and resolutions, and present them to the appropriate groups. Identify trends in data and interpret the results of testing cycles; primarily responsible for the decision support needs of a specific business area.
  • Work with the Quantitative Analyst to come up with new Model and Develop SQL criteria for Capital calculation.
  • Create complex SQL scripts for data analysis to determine the validity of data, writing scripts for large volumes of data using BTEQ, SQL, MLOAD, IMPORT and EXPORT.
  • Manage and handle the daily/monthly operations/processing of applications assigned. Open trouble tickets to resolve issues with external groups. Communicate with Business/ and other groups responsible for respective projects.
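A simplified sketch of a month-end Teradata SQL step of the kind run under BTEQ to build exposure data for capital reporting; the tables (stg_equity_position, capital_exposure_mth) and columns are hypothetical:

    -- Month-end roll-up of the latest staged equity positions into the exposure table used for capital calculations
    INSERT INTO capital_exposure_mth (as_of_dt, lob_cd, obligor_id, exposure_amt)
    SELECT as_of_dt, lob_cd, obligor_id, SUM(market_value)
    FROM   stg_equity_position
    WHERE  as_of_dt = (SELECT MAX(as_of_dt) FROM stg_equity_position)
    GROUP BY as_of_dt, lob_cd, obligor_id;

    -- Control total: staged vs. loaded amounts should match for that month
    SELECT s.staged_amt, l.loaded_amt
    FROM  (SELECT SUM(market_value) AS staged_amt
           FROM   stg_equity_position
           WHERE  as_of_dt = (SELECT MAX(as_of_dt) FROM stg_equity_position)) s
    CROSS JOIN
          (SELECT SUM(exposure_amt) AS loaded_amt
           FROM   capital_exposure_mth
           WHERE  as_of_dt = (SELECT MAX(as_of_dt) FROM stg_equity_position)) l;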

Data Analyst

Confidential, San Francisco

Responsibilities:

  • Work as Senior Data Analyst for the Internet Services group, Marketing Analytics division, carrying out data validation and providing management reporting related to financial product sales/metrics and financial forecasting.
  • Prepare weekly/monthly Excel reports and dashboards by accessing Oracle, Teradata and other sources to provide financial plans/forecasts and trends on banking products for the defined product hierarchy, including graphical representations, pivot tables and charts (a sample trend query follows this list). Validate data loads, identify issues in data and find solutions.
  • Understand the Data Models to prepare SQL scripts for Data Analysis and Reporting.
  • Understand data mappings to Compare data elements between source and destination database systems for discrepancies.
  • Create SQL scripts against Oracle/Teradata databases to access data for reporting, using TOAD, SQL*Plus and SQL Assistant.
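A sketch of the kind of weekly trend query behind those reports, written against Oracle; the table (online_product_sales) and columns are hypothetical:

    -- Weekly sales volume by product, with the prior week's volume alongside for trend review
    SELECT product_cd,
           sale_week,
           units_sold,
           LAG(units_sold) OVER (PARTITION BY product_cd ORDER BY sale_week) AS prior_week_units
    FROM  (SELECT product_cd,
                  TRUNC(sale_dt, 'IW') AS sale_week,
                  SUM(units)           AS units_sold
           FROM   online_product_sales
           GROUP BY product_cd, TRUNC(sale_dt, 'IW'))
    ORDER BY product_cd, sale_week;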

Data Analyst

Confidential, San Francisco

Responsibilities:

  • Validate the accuracy of the integrated data associated with the inclusion/exclusion and scheduling functionality (I/ Confidential Engine); uncover flaws and data issues in the requirements/design from a business perspective.
  • Carried out validation of scheduling metadata between Oracle and SQL Server databases.
  • Assigned a lead role from the Kean perspective; communicated and coordinated the effort successfully, on target, to the satisfaction of the business within a limited timeline.
  • Understand data mappings to compare data elements and counts between Oracle and SQL Server database systems for discrepancies.
  • Generate comparison reports using EXCEL to identify issues and find solutions
  • Primarily Create SQL scripts for Oracle database for Data Analysis to determine validity of data migrated.
  • Used TOAD, SQL*Plus and SQL Assistant; accessed SQL Server as required.
  • Maintain and use shell scripts with embedded SQL to run batch processes that load data into the database based on business criteria.

Software Engineer/Data Analyst

Confidential, San Francisco

Responsibilities:

  • Perform testing on data migrated to the fraud database. Validate the accuracy of the integration between Confidential and Wachovia data; uncover flaws and data issues in the requirements/design from a business perspective.
  • Generate database comparison reports to identify issues and find solutions. Prepare test cases, document issues and resolutions, and present them to the appropriate groups.
  • Understand the data models to prepare SQL scripts for data analysis.
  • Understand data mappings to compare data elements between source and destination database systems for discrepancies.
  • Create SQL scripts against Oracle/Teradata databases for data analysis to determine the validity of migrated data. Used TOAD, SQL*Plus and SQL Assistant to connect to Oracle and Teradata respectively. Wrote scripts for large volumes of data using BTEQ, SQL, IMPORT and EXPORT.
  • Set up cron jobs that run SQL scripts to retrieve temporarily stored weekend transaction data for reporting.

Software Engineer/Data Analyst

Confidential, San Francisco

Responsibilities:

  • Gathering information by communicating with Business to prepare requirement specifications for online banking products.
  • Carry out Data Analysis for Business requirements to identify data elements for reporting. Provide Business with weekly and Monthly reporting using EXCEL with graphical representations, Pivot tables, charts to monitor activities, growth and profitability of Online Banking products using Teradata and Oracle Databases.
  • Create SQL scripts for Oracle and Teradata databases for data analysis, using SQL*Plus and SQL Assistant to connect to Oracle and Teradata respectively. Write scripts for large volumes of data using the Teradata commands MLOAD, FASTLOAD, BTEQ, SQL, IMPORT and EXPORT.
  • Maintain and use shell scripts with embedded SQL to run batch processes that load data into the Teradata database based on business criteria.
  • Prepare and execute shell scripts that load data into the online banking Oracle and Teradata databases weekly and monthly.
  • Transfer data to and from an Access database.

Software Engineer/Data Analyst/Business Analyst/QA

Confidential, San Francisco

Responsibilities:

  • Perform user acceptance testing for data migrated from Oracle to a Teradata database. As part of UAT, the project validated the accuracy of the migrated data and uncovered flaws and data issues in the requirements/design from a business perspective.
  • Understand the data models to prepare SQL scripts for data analysis. Understand data mappings to compare data elements between source and destination database systems for discrepancies.
  • Create SQL scripts against Oracle/Teradata databases for data analysis to determine the validity of migrated data. Used TOAD, SQL*Plus and SQL Assistant to connect to Oracle and Teradata respectively. Wrote scripts for large volumes of data using the Teradata commands BTEQ, SQL, IMPORT and EXPORT.
  • Generate comparison reports to identify issues and find solutions. Document issues and resolutions and present them to the appropriate groups.
  • Create reports using Business Objects and SQL to view account/customer offer management data and portfolios according to the established hierarchy of household, customer and account for offer and billing data (a sample rollup query follows this list).
  • Prepared test plans, project plans and documentation on data quality. Create/enhance metadata for tables and other supporting documentation for data verification.
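A sketch of a rollup of offer and billing data along the household > customer > account hierarchy that the Business Objects reports followed; the tables (household, customer, account, offer_billing) and keys are hypothetical:

    -- Offer counts and billed amounts rolled up from account to customer within each household
    SELECT h.household_id,
           c.customer_id,
           COUNT(DISTINCT a.account_id) AS accounts,
           COUNT(ob.offer_id)           AS offers,
           SUM(ob.billed_amt)           AS billed_amt
    FROM   household h
    JOIN   customer  c  ON c.household_id = h.household_id
    JOIN   account   a  ON a.customer_id  = c.customer_id
    LEFT JOIN offer_billing ob ON ob.account_id = a.account_id
    GROUP BY h.household_id, c.customer_id
    ORDER BY h.household_id, c.customer_id;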
