SAS Big Data Analyst Resume

Dallas, TX

EXECUTIVE SUMMARY:

  • More than 10 years of experience providing innovative solutions in the financial and credit card industry.
  • Thorough knowledge of the banking industry, data manipulation, and statistical analysis.
  • Experience in the design of ETL and ELT systems using Oracle, Teradata, DB2, Netezza, and HDFS.
  • Working knowledge of Big Data technologies such as Hadoop, MapReduce, HDFS, HBase, and Hive.
  • Experience with migration from mainframe to Unix across various projects, including conversion of JCL to Unix shell scripts and of VSAM/PDS files on the mainframe to SAS datasets on Unix.
  • Hands-on experience in the Unix file management system, space management, scripting, and job management and scheduling.
  • Enhancement and evaluation of credit/debit card systems and financial applications.
  • Involved in all stages of the software development cycle: design, specification, coding, debugging, testing (test plans and test execution), integration and system testing, documentation, and maintenance of programs.
  • Highly successful in developing synergistic relationships to bring projects to completion on time and within budget.
  • Created complex test plans and requirements and designed test cases. Managed execution of different testing phases (unit, integration, system, interface, and compliance testing).
  • Ability to work independently as well as collaboratively as part of a team.

TECHNICAL SKILLS:

SAS Skills: Base SAS, SAS/MACRO, SAS/CONNECT, SAS/SQL, SAS/STAT, SAS/GRAPH, SAS/ODS, SAS/ACCESS, SAS/ETL

SAS Tools: SAS EG, SAS DI Studio, SAS Management Console, SAS Dataflux Manager, SAS Portal, SAS Visual Studio

Advanced SAS: Macros, Reports, Array Processing, Hash Objects, Pointers, SAS Connect

Languages: SAS, PL/SQL, Unix Korn Shell Script, VBA, JavaScript, Perl script, Python, Hive, BTEQ, COBOL, JCL, ASSEMBLER, REXX

Operating Systems: OS/390, z/OS, MVS/ESA, MS Windows, UNIX, Linux.

Databases: DB2, Oracle, Teradata, Netezza, SQL Server 2005/08, MS Access, VSAM

Big Data Environment: Hadoop, MapReduce, HDFS, HBase, Hive, Pig

CLIENT ENGAGEMENTS:

Confidential, Dallas, TX

SAS Big Data Analyst

Responsibilities:

  • Worked on requirement analysis with the business team.
  • Worked on data warehouse modeling for a seamless transition amid fluid business and technological requirements.
  • Automated and enhanced the ETL framework, pulling data from diverse sources such as DB2, Netezza, Unix flat files, and CSV and Excel files, loading them into a star schema, and building reporting on top of it.
  • Optimized SQL queries using database analytical functions, increasing the efficiency of the code.
  • Analyzed HDFS data using HiveQL.
  • Built user-defined functions (UDFs) in Hive to automate repeatable manual queries.
  • Worked in an Agile methodology to quickly deploy higher-priority changes after user acceptance testing.
  • Worked on the project plan and on creating and validating user test cases.
  • Wrote design documents, program specs, and code.
  • Involved in various testing phases (unit, system, integration, and user acceptance testing).
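As an illustration of the Hive-side analysis described above, a minimal Base SAS sketch using explicit SQL pass-through is shown below; it assumes the SAS/ACCESS Interface to Hadoop is licensed, and the server, schema, table, and column names are all hypothetical.

```sas
/* Explicit pass-through to Hive: the aggregation runs in Hive/HDFS,
   and only the summarized rows come back to SAS.
   Server, schema, table, and column names are placeholders. */
proc sql;
   connect to hadoop (server="hive-node01" schema=txn_db);
   create table work.daily_volume as
   select * from connection to hadoop (
      select txn_date,
             count(*)    as txn_count,
             sum(amount) as total_amt
      from   card_transactions
      group by txn_date
   );
   disconnect from hadoop;
quit;

/* Summarize the extracted data on the SAS side */
proc means data=work.daily_volume n mean sum;
   var txn_count total_amt;
run;
```

Pushing the GROUP BY down to Hive keeps the HDFS-sized data on the cluster and moves only the aggregate back over the wire.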

Environment: SAS 9.4, SAS Enterprise Guide, SAS DI Studio, Unix Shell Scripts, LSF, Netezza, DB2, Hive, HDFS, Python, SAS Portal

Confidential, New York

SAS Architect

Responsibilities:

  • Worked on requirement analysis with the business team.
  • Pulled data from multiple databases, including Netezza and DB2, and wrote results back to DB2.
  • Created about 13 programs that were called by a single driver program for a given region.
  • Created scheduling scripts for these programs so that they could be submitted for each region.
  • Worked on business queries during UAT.
  • Wrote design documents, program specs, and SAS code.
  • Involved in various testing phases (unit, system, integration, and user acceptance testing).
  • Used DB2 MERGE and other DB2 procedures for faster processing.
  • Used SAS procedures such as PROC SUMMARY for summarization and PROC TABULATE for reporting.
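The PROC SUMMARY / PROC TABULATE pairing mentioned above might look like this sketch; the dataset `work.accounts` and its variables are hypothetical.

```sas
/* Summarize balances by region into an output dataset
   (dataset and variable names are hypothetical) */
proc summary data=work.accounts nway;
   class region;
   var balance;
   output out=work.region_sum (drop=_type_ _freq_)
          sum=total_balance mean=avg_balance;
run;

/* Cross-tabular report of the same measures */
proc tabulate data=work.accounts;
   class region;
   var balance;
   table region, balance*(sum mean) / box='Balances by region';
run;
```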

Environment: SAS 9.4, SAS Enterprise Guide, SAS DI Studio, SAS Management Console, SAS OLAP Cube, Unix Shell Scripts, LSF, Netezza, DB2

Confidential, New York

SAS Analyst Tech Lead

Responsibilities:

  • Worked on the migration of the data mart, including the FACT and DIMENSION datasets, from one version of SAS to another.
  • Helped with testing of new functionality, such as the XLSX macro and additional PROC TEMPLATE options, on the new server.
  • Worked on the migration of SAS Stored Processes and their import into the higher version.
  • Worked on the migration of job flows from SAS DI Studio to LSF version 7.6.
  • Helped the user community move from SAS EG 4.2 to SAS EG 6.1; helped them with documentation and user queries in the new environment.
  • Helped with the project conversion from SAS 9.1 to SAS 9.4, including issues with prompts and job automation.
  • Helped with the setup of SAS/CONNECT to external databases in the new environment, and with the testing and validation of these connections.
  • Helped with the setup of SFTP connectivity for file access, as well as the modification of jobs to use Tumbleweed connectivity instead of direct SFTP between servers.
  • Worked on the project plan and on creating and validating user test cases.
  • Wrote Design documents, Program specs and SAS code.
  • Involved in various Testing phases (Unit, System, Integration and Parallel Run)

Environment: SAS 9.4, SAS/Base, SAS/Graph, SAS/Macros, SAS/ODS, SAS Enterprise Guide, SAS DI Studio, SAS Management Console, Unix Shell Scripts, LSF, SAS Portal

Confidential, New York

SAS Analyst Technical Lead

Responsibilities:

  • Initial analysis, including impact analysis and requirement analysis.
  • Code-level mapping of fields and jobs.
  • Preparation of high-level and low-level design documents.
  • Development and modification of around 55 jobs that were impacted by the migration, under very tight timelines.
  • Produced listings, reports, tabulations, and summaries of the data using procedures such as PROC FREQ, PROC MEANS, PROC UNIVARIATE, PROC SUMMARY, PROC TRANSPOSE, PROC TABULATE, and PROC REPORT.
  • Helped with the setup of Tumbleweed connectivity for file access, as well as the modification of jobs to use Tumbleweed connectivity instead of direct SFTP between servers.
  • Deployed code in development using SAS DI Studio.
  • Tested the modified jobs in unit, system, and integration testing.
  • Provided warranty support after implementation for changing requirements.

Environment: SAS V9.2, SAS/Base, SAS/Graph, SAS/Macros, SAS/ODS, SAS Enterprise Guide, SAS DI Studio, SAS OLAP Cube, SAS Information Maps, Unix Shell Scripts, LSF

Confidential, Irving, TX

SAS Analyst Programmer Lead

Responsibilities:

  • Designed a Teradata database ingesting data from VSAM, DB2, and text input, using a SAS interface running on an HP-UX machine.
  • Optimized the database design for fast updates and inserts of large volumes of data and quick selects for reporting.
  • Tested code, comparing outputs between different data systems in full-data (6M members) tests.
  • Prepared reports, maintained progress reports, drafted and published procedures, and consulted with the team.
  • Identified opportunities for cost savings by eliminating duplicate processing, adding value for the client.
  • Used SQL Assistant and SAS EG for coding.
  • Made extensive use of Autosys for job scheduling.
  • Migrated data from text files and Excel files to Teradata using SAS/ACCESS and native Teradata procedures.
  • Created complex and reusable macros, made extensive use of existing macros, and developed SAS programs.
  • Developed an automated quality control system for data validation and data integrity.
  • Wrote design documents, program specs, and SAS code.
  • Involved in various testing phases (unit, system, and integration).
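A hedged sketch of the text-file-to-Teradata migration described above, assuming the SAS/ACCESS Interface to Teradata; the server, credentials, file layout, and table names are all placeholders.

```sas
/* Teradata libref via SAS/ACCESS (connection values are placeholders) */
libname td teradata server=tdprod user=etl_user password="&tdpw"
        database=member_db;

/* Read a pipe-delimited text file into a SAS dataset (hypothetical layout) */
data work.members;
   infile '/data/in/members.txt' dlm='|' dsd truncover firstobs=2;
   input member_id :$12. join_date :yymmdd10. plan_cd :$4.;
   format join_date yymmdd10.;
run;

/* Bulk-load into Teradata; FASTLOAD=YES invokes the FastLoad facility
   (valid only for loads into an empty table) */
proc append base=td.members (fastload=yes) data=work.members;
run;
```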

Environment: SAS V9.0, SAS/ACCESS, Teradata BTEQ, SAS EG, SQL Assistant, Connect Direct, Autosys, CVS

Confidential, Irving, TX

SAS Lead Analyst Programmer

Responsibilities:

  • Created new SAS datasets mirroring the already-existing data for other partners so that uniform reporting could be developed for business analysis.
  • Created complex and reusable macros and made extensive use of existing macros.
  • Improved code efficiency and the run-time load on system resources using parallel processing, hash lookups, and compiled SAS macros.
  • Developed SAS programs for data cleaning, validation, analysis, and report generation. Tested and debugged existing macros.
  • Did CCAR stress testing analysis as a result of portfolio addition.
  • Creation of new Rexx and Easytrieve scripts for validation of new test files from Best Buy.
  • Played a major role in quick generation of Ad-hoc Business Analysis Reports and resolving data issues upon request.
  • Built decision trees and neural networks using SAS 9.1
  • Developed an Automated Quality Control System for Data Validation and data integrity.
  • Worked with DI Studio for creating user transformations and integration of data sources.
  • Developed JCL to run jobs on the MVS system.
  • Prepared documentation and statistical reports using SAS.
  • Wrote Design documents, Program specs and SAS code.
  • Played a role of coordinator between onsite and offshore for project delivery
  • Involved in various Testing phases (Unit, System and Integration)
  • Created SAS MACROS and SAS GRAPHICS. Used Proc REPORT and Proc TABULATE to generate reports.
  • Creation of new JCL jobs in OPC scheduler
  • Created a customized JCL for Job scheduling.

Environment: SAS V9.1, SAS/Base, SAS DI Studio, SAS SQL, SAS/Graph, SAS/Macros, SAS/AF, SAS/ODS, IBM z/OS, COBOL II, DB2, JCL, VSAM, ENDEVOR, Easytrieve, FILE-AID

Confidential, Dallas, TX

SAS Lead Programmer

Responsibilities:

  • SAS programming and onsite coordinator.
  • Extracted data from various sources such as flat files, Excel spreadsheets, and VSAM, and manipulated it into the format of the intermediate tables.
  • Involved in populating intermediate tables in Data warehouse for Weekly, Monthly and Yearly load.
  • Worked with the Architects to design 20 datamart tables and around 10 FACT tables in a snowflake schema.
  • Created reports using Base SAS and the REPORT procedure.
  • Generated reports using PROC REPORT and PROC TABULATE.
  • Created macros to generate reports on a daily and monthly basis.
  • Worked with DI Studio to create customized user transformations and integrate data sources.
  • Created SAS Views from tables in DB2 using SAS/Access.
  • Generation of reports and transferring the output to the SAS Output Delivery System.
  • Refreshing tables in DB2 Schemas with the SAS data sets
  • Coded SAS programs using Base SAS and macros for ad-hoc jobs.
  • Wrote VBA screen-scraping tools to scrape mainframe screens for testing and validation.
  • Interacted with the client and sent work to offshore.
  • Worked with IT support groups and management teams to define and support system and information needs.
  • Provided high-quality service to all internal division and corporate support teams as well as to all external customers.
  • Involved in preparing technical specification documents based on user requirements.
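The SAS-views-over-DB2 pattern mentioned above follows a standard SAS/ACCESS idiom; a minimal sketch, with all connection values and table names as placeholders:

```sas
/* DB2 libref via SAS/ACCESS (connection values are placeholders) */
libname db2lib db2 datasrc=DWPROD schema=MART user=rptuser password="&db2pw";

/* A SAS view over a DB2 table: rows are fetched from DB2 at run time,
   not copied into SAS storage */
proc sql;
   create view work.v_open_accts as
   select account_id, open_date, product_cd
   from   db2lib.accounts
   where  status = 'OPEN';
quit;
```

A view like this keeps downstream reporting programs current with the DB2 schema without a separate refresh step for the base data.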

Environment: SAS V9.0, SAS/Base, SAS/Graph, SAS DI Studio, SAS/Macros, DB2, MS-Excel, VBA, Unix Shell Script, Rumba Scripting

Confidential, Dallas, TX

SAS Lead Programmer

Responsibilities:

  • SAS programmer and team lead; coordinated with onshore teams and the client.
  • Produced listings, reports, tabulations, and summaries of the data using procedures such as PROC FREQ, PROC MEANS, PROC UNIVARIATE, PROC SUMMARY, PROC TABULATE, PROC REPORT, and PROC TRANSPOSE, for statistical analysis by the Marketing and MIS teams.
  • Provided input to planning documents such as Validation, Study Protocol and Statistical Analysis Plans.
  • Created and implemented statistical analysis plans and specification documents; participated in database design, data collection guidelines, and logic checks.
  • Participated in designing, coding, testing, debugging and documenting SAS Programs and Macros.
  • Developed SAS programs using Base SAS for tabulation counts, correlations, and checks of dispersion and normality using PROC MEANS, PROC TABULATE, and PROC UNIVARIATE.
  • Applied generally accepted programming standards and techniques to ensure efficient program logic and data manipulation.
  • Created SAS reports using the DATA _NULL_ technique and PROC REPORT for submission per Reg Z regulations and company standards.
  • Formatted HTML, RTF, and PDF reports using the SAS Output Delivery System (ODS).
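Routing one report to HTML, RTF, and PDF via ODS, as described above, can be sketched as follows; the output paths are placeholders, and `sashelp.class` stands in for the real reporting data.

```sas
/* Open three ODS destinations so one PROC REPORT step writes all formats */
ods html file='/reports/summary.html';
ods rtf  file='/reports/summary.rtf';
ods pdf  file='/reports/summary.pdf';

proc report data=sashelp.class nowd;
   column sex height weight;
   define sex    / group 'Sex';
   define height / mean  'Avg Height';
   define weight / mean  'Avg Weight';
run;

/* Close every open destination */
ods _all_ close;
```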

Environment: BASE SAS V9, SAS/ACCESS, SAS/CONNECT, SAS/Macro, SAS/GRAPH, SAS/STAT and UNIX.

Confidential, Dallas, TX

SAS Analyst Programmer

Responsibilities:

  • Developed SAS programs using PROC TABULATE, Base SAS, and SAS/STAT that enabled users to generate daily, weekly, monthly, quarterly, and annual reports.
  • Generated ad-hoc reports for the company's audit and inspection requirements.
  • Extracted data from databases as well as flat files.
  • Preprocessed training data and performed advanced modeling on the testing dataset.
  • Evaluated model performance using decision trees and confusion matrices.
  • Created SAS data mining and scoring datasets from source tables by programming in Base SAS.
  • Generated reports and documentation on model performance evaluation, effectively communicating the results to higher management.
  • Developed and maintained multivariate statistical models used to predict customer behavior and to profile customer potential.
  • Calculated possible risks and premiums for individual organizations.
  • Customized existing SAS programs and created new programs using SAS macros to improve the consistency of the results.
  • Compiled statistics and created detailed reports and graphs for management and customer review using SAS/GRAPH, SAS/MACRO, and PROC REPORT.
  • Correlated and presented weekly and monthly data to senior management.
  • Used PROC FREQ for frequency calculations, as well as MERGE and UPDATE for data step processing.
  • Used PROC SUMMARY for summarization.

Environment: SAS/Base, SAS/Macro, SAS/Stat, SAS/Graph, SAS/SQL, Oracle, PL/SQL, MS-Excel and HTML

Confidential, Dallas, TX

Analyst Programmer

Responsibilities:

  • Migrated data from DB2 to SAS files on Unix.
  • Created network nodes for successful transport of files.
  • Wrote new Unix shell scripts to run the jobs in place of the JCL.
  • Created new functional parameters for reusable macros for the new jobs on Unix.
  • Correlated and presented weekly and monthly data to senior management.

Environment: SAS/Base, SAS/Macro, SAS/Stat, Unix Shell Script, Perl, Unix programming
