Sr. Data Analyst Resume

Richmond, VA

PROFILE:

  • 14+ years of experience in the Banking, Insurance, and Healthcare domains covering data analysis, development, production support, maintenance, and enhancement in SAS, mainframe, and AWS technologies. Currently working as Sr. Data Analyst at Confidential.
  • 5 years of experience as a Senior Data Analyst at Confidential.
  • 6+ years of experience as a Technical Lead at Confidential.
  • 3+ years of experience as a Software Engineer at Confidential.
  • Worked with the Agile Scrum methodology and Continuous Integration/Delivery.
  • Created various monthly/quarterly reports to be submitted to the Federal Reserve Board for the Comprehensive Capital Analysis and Review (CCAR), the American Bankers Association (ABA), and the Operational Risk eXchange (ORX).
  • Worked on ETL process, Data Analysis, Data migration, Data preparation, Graphical Presentation, Statistical Analysis, Reporting, Validation and Documentation.
  • Played a key role in the requirement gathering, planning, development, and implementation of system enhancements and conversions that followed the appropriate IT guidelines, met the user requirements, and were completed in a timely fashion.
  • Supported the setup of data libraries for business users.
  • Worked closely with the system admin team to set up the server environment and validated the servers, once delivered, for the SAS implementation.
  • Managed users and groups and applied security based on business needs.
  • Used the SAS/ACCESS interface to various databases such as Redshift, Oracle, DB2, and Teradata.
  • Performed data quality checks and validated data before saving it in the Risk Data Warehouse.
  • Designed and supported data stewardship processes and implemented data cleansing processes to meet data governance requirements and standards.
  • Monitored ongoing data quality by performing data cleansing, data audits, and data validation against defined parameters for data integrity and compliance.
  • Communicated quality metrics, issues, and resolutions to business users; served as the primary data expert on underlying business rules and data sources.
  • Specialized in data mining: analyzing and retrieving data from large datasets/databases from different perspectives and summarizing it.
  • Analyzed issues, then developed and implemented solutions for high-priority production issues.
  • Experienced in the Software Development Life Cycle process.
  • Capable of executing a number of projects simultaneously with ease. Extremely adaptable, with the ability to be productive and efficient under high stress and in fast-paced environments.
  • Strong aptitude for learning new technologies and problem solving.

TECHNICAL SKILLS:

Languages: Python, Base SAS 9.X, SQL, PL/SQL, COBOL, JCL

Database: Redshift, Teradata, Oracle, SQL Server, DB2, VSAM, DATACOM.

SAS BI: Enterprise Guide, Information Map, SAS Management Console, OLAP Cube Studio, Data Management Studio, SAS XML Mapper

Reporting: Tableau, Teradata SQL Assistant

Cloud: AWS Tools

Other: WinSCP, PuTTY, F-Secure, Reflection X, Microsoft Visio, HP Quality Center, TeamTrack

Mainframe: SPUFI, FILE-AID, XPEDITOR, ENDEVOR, PLATINUM, ZEKE, DOCUTEXT, ABEND-AID, CHANGEMAN, DATAQUERY, IBM Debug Tool, EASYTRIEVE

Operating System: Windows, UNIX, RHEL Linux, z/OS (OS/390)

Domain: Banking, Finance, Insurance

PROFESSIONAL EXPERIENCE:

Confidential, Richmond, VA

Sr. Data Analyst

Responsibilities:
  • Created a tool to collect all fraud and loss data from various systems (such as US Card, Partnership Card, UK Card, Legal, etc.) and save it into a common repository automatically on a daily/monthly basis.
  • Created various monthly/quarterly reports to be submitted to the Federal Reserve Board for the Comprehensive Capital Analysis and Review (CCAR), the American Bankers Association (ABA), and the Operational Risk eXchange (ORX).
  • Worked independently to analyze various risk types, such as event loss data, and solve very complex operational, systems-related, technical, and customer issues.
  • Supported the setup of data libraries for business users.
  • Worked closely with the system admin team to set up the server environment and validated the servers, once delivered, for the SAS implementation.
  • Managed users and groups and applied security based on business needs.
  • Used the SAS/ACCESS interface to various databases such as Redshift, Oracle, DB2, and Teradata.
  • Created various risk model analyses based on loss data.
  • Created and scheduled SAS programs to generate various summarized tables, speeding up analysts' ability to produce more timely analysis on a monthly/quarterly basis.
  • Created a tool to compare all loss data with the General Ledger.
  • Performed ETL (Extraction/Transformation/Loading), data analysis, data migration, data preparation, graphical presentation, statistical analysis, reporting, validation, and documentation.
  • Designed source to target maps for data extraction, transformation and load sequences.
  • Designed and developed many SAS Programs/Macros to analyze financial loss/fraud data.
  • Worked on statistical analysis of files, tables, listings, and graphs, as well as ETL extraction of data from databases, and created SAS statistical analysis reports.
  • Responsible for the technical support of all reporting and forecasting functions of the Internal Loss data events.
  • Created complex SAS program to extract, transform, clean and manage large data files for regularly scheduled and ad hoc analysis reports.
  • Processed large amounts of data for statistical modeling, graphic analysis and reporting.
  • Extensively used SAS procedures such as Proc Print, Proc Sort, Proc SQL, Proc Freq, Proc Contents, Proc Export, Proc Import, Proc Datasets, and Proc Format.
  • Revised and automated daily, weekly, monthly, and quarterly reports to run more efficiently and provide more meaningful information vital to the overall strategies.
  • Gathered and understood requirements, created designs, and supported construction of analytic data; designed and supported data stewardship processes and implemented data cleansing processes.
  • Responsible for ensuring that system testing and functional testing were completed on time and met requirements as defined.
  • Worked with the data governance teams to ensure data governance requirements and standards are met.
  • Monitored ongoing data quality: defined parameters for data integrity and compliance by performing data cleansing, data audits, and/or data validation.
  • Communicated quality metrics, issues, and resolutions to business stakeholders; served as the primary data expert on underlying business rules and data sources.

Confidential, Minneapolis, MN

Technical Lead

Responsibilities:
  • Analyzed the business requirements and transformed them into functional design specifications.
  • Worked independently to analyze and solve very complex operational, systems-related and customer issues.
  • Created parameter driven reports in SAS Enterprise Guide using templates, process flows, stored processes, and multiple data sources.
  • Extracted data from a Datacom table into a flat file to insert the data into a DB2 table through SAS/COBOL programs.
  • Cleansed the data for SAS data modeling.
  • Wrote date conversion macros to convert LifeComm dates to a normal date format.
  • Wrote several SAS modules for converting data from LifeComm to wma for data feeds.
  • Converted existing SAS code into high quality SAS code that is efficient (fast), automated, maintainable, and follows industry best practices.
  • Extensively used SAS procedures such as Proc Print, Proc Sort, Proc SQL, Proc Freq, Proc Contents, Proc Export, Proc Import, Proc Datasets, and Proc Format.
  • Used Proc SQL for querying and ad-hoc report generation as per the requirements.
  • Wrote JCL code to run the SAS batch jobs and scheduled the job for automation.
  • Performed ETL (Extraction/Transformation/Loading), data analysis, data migration, data preparation, graphical presentation, statistical analysis, reporting, validation, and documentation.
  • Developed and designed SAS programs/macros to analyze financial data: statistical analysis files, tables, listings, and graphs; extracted data from databases (ETL) and created SAS statistical analysis reports.
  • Core support - ad hoc report generation.
  • Problem contact resolution.
  • Service requests - analyzing, designing, coding, and testing for minor/major enhancements with end-to-end support.
  • Program review and analysis at various stages.
  • Moved the tested code to production.
  • Led, coordinated, and reported the day-to-day offshore and onsite activities to the client.
  • Revised and automated daily, weekly, monthly and quarterly reports to run more efficiently and provide more meaningful information vital to the overall strategies.
  • Provided production on-call support (24 x 7).

Confidential

Software Engineer

Environment: COBOL II, JCL, CICS, VSAM, DB2.

Responsibilities:
  • Provided production on-call support (24 x 7).
  • Monitored batch jobs in daily, weekly, monthly and yearly run cycles.
  • Analyzed, designed, coded, and tested minor/major enhancements with end-to-end support.
  • Clarified requirements with KPIT and interacted with users to gain a better understanding of the requirements or problems reported.
  • Supported weekly payroll processing for two applications.
  • Participated in weekly KPIT user/client status meetings.
