
SAS Analyst Resume


Dayton, OH

SUMMARY

  • SAS Programmer/Analyst with 8+ years of experience in the development, implementation, and testing of applications, mostly in the financial sector
  • Over 8 years of extensive experience working at multiple levels of Metadata Management projects, including designing the metadata strategy, defining the metadata management architecture, managing the metadata project, mentoring metadata developers, and conducting training
  • Strong experience in SAS/BASE, SAS/MACRO, SAS/CONNECT, SAS/ODS, SAS/GRAPH, SAS/SQL, SAS/STAT, SAS/ACCESS in WINDOWS and UNIX environments.
  • Worked extensively on SAS Procedures and Data Step Programming to develop the process flow applications from Data Acquisition to Data Modification/Reporting.
  • Experienced in Base and Advanced SAS programming procedures such as PROC SQL, PROC REPORT, PROC ACCESS, PROC GPLOT, PROC GCHART, PROC FORMAT, PROC TABULATE, PROC APPEND, and PROC TRANSPOSE.
  • Proficiency in ETL processing on various data sources like Oracle, DB2, SQL server, Teradata, MS Access, MS Excel and text files.
  • Experience in SAS/STAT procedures such as PROC REG, PROC GLM, PROC FREQ, PROC MEANS, PROC ANOVA, PROC MIXED, and PROC UNIVARIATE
  • Experience in Teradata development and in designing ETL methodology to support data transformations and processing in a corporate-wide ETL solution using Teradata TD12.0/TD13.0, Mainframe, and Informatica PowerCenter 8.6.0/9.0.1; administered the environment, analyzed clients' business needs, developed effective and efficient solutions, and ensured client deliverables within committed deadlines
  • Working knowledge of UNICA Affinium Campaign and Teradata CRM tools and have worked on various databases like Oracle, DB2 and Teradata.
  • Experienced in generating reports with browser interfaces such as SAS Web Report Studio and the Windows client SAS Enterprise Guide, and built reports using SAS Add-In for Microsoft Office
  • Designing, documenting and creating SAS OLAP Cubes within SAS OLAP Cube Studio.
  • Administering large Teradata database system in development, staging and production.
  • Involved in various stages of Software development life cycle.
  • Strong hands-on experience using Teradata utilities (FastExport, MultiLoad, FastLoad, TPump, and BTEQ/Queryman).
  • Skilled in Data Warehousing Logical and Physical Data Modeling using Star Schema and Snowflake Schema.
  • Experience in Banking and Credit Card industry business processes.
  • Technical and functional experience in data warehouse ETL implementations using Informatica PowerCenter 9.0.1/8.6/8.1/7.1 and Teradata
  • Good knowledge of Teradata RDBMS Architecture, Tools & Utilities
  • Sourced data from disparate sources like Mainframe Z/OS, UNIX flat files, IBM DB2 and loaded into Teradata DW.
  • Strong Experience in Designing end-to-end ETL framework and strategies to handle Re-startability, Error handling, Data reconciliation, Batch processing and process automation
  • Strong experience in Data analysis, Business rules development, data mapping and translating business requirements into technical design specifications
  • Good understanding of Data Modeling (Star and Snowflake schemas), physical and logical models, and DWH concepts
  • Good experience in setting up real time Change Data Capture process using IBM Change Data Capture (CDC) tool sourcing DB2 database and targeting Netezza database
  • Good experience in Data profiling and Data analysis using IBM InfoSphere Information Analyzer
  • Good understanding of SCRUM methodology and Sprint planning to achieve the project goals and milestones effectively and efficiently
  • Experienced in developing Data warehouse/Data Marts, Star Schema, Snowflake Schema and changing Dimensions.
  • Excellent command in writing complex routines for Data Validations, Data Extraction, transformation and loading to target decision support systems using MS Excel, Pivot tables and SAS on various environments.
  • Very good knowledge of budgeting, forecasting, and financial planning; created Balance Sheet, Income Statement, and Cash Flow Statement reports with multiple-currency support using Hyperion Planning
  • Extracted source data from Mainframe z/OS into a UNIX environment using JCL scripts and SQL, and created formatted reports for business users using BTEQ scripts
  • Strong Teradata SQL, ANSI SQL coding skills.
  • Proven skills in Data Cleansing, Data Archival, Data Migration, ad-hoc reporting and coding utilizing SAS on UNIX, Windows.
  • Expertise in Report formatting, Batch processing, Data Loading and Export using BTEQ.
  • Performed performance tuning of user queries by analyzing explain plans, recreating user driver tables with the right primary index, scheduling collection of statistics, and adding secondary or various join indexes
  • Well versed with Teradata Analyst Pack including Statistics Wizard, Index Wizard and Visual Explain.
  • Extensively used derived tables, volatile tables, and global temporary (GTT) tables in many of the BTEQ scripts.
  • Involved in designing and building stored procedures, view generators and macros for the module.
  • Worked on moving data to a dedicated Analytics environment for better performance; involved in analysis, administration, security, development, and support.
  • Expertise in procedures, functions, database packages, and triggers, with experience in troubleshooting techniques, tuning SQL statements, query optimization, and dynamic SQL.
  • Good knowledge of Performance tuning, Application Development and Application Support on UNIX, MVS and WINDOWS NT Environments
  • Developed UNIX Shell scripts for Batch processing.
  • Responsible for writing Deployment/Release Notes before the project release and End User’s Manual for production team.
  • Trained in Hadoop and worked on proof of concept to convert application from Mainframe Teradata to Hadoop.
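Several of the bullets above describe driving BTEQ from UNIX shell for batch processing. A minimal sketch of such a wrapper follows, assuming a Teradata client is installed; the script name, log name, and the simplified return-code mapping are illustrative assumptions, not details from the resume:

```shell
#!/bin/sh
# Sketch of a batch BTEQ wrapper. File names are hypothetical.
SCRIPT="daily_extract.btq"   # hypothetical BTEQ script containing .LOGON/.EXPORT/.QUIT
LOG="daily_extract.log"

run_bteq() {
    # BTEQ reads commands from stdin; session output goes to the log
    bteq < "$SCRIPT" > "$LOG" 2>&1
}

# Simplified mapping of BTEQ exit codes (0 = clean, 2 = warning,
# higher values = error); treat anything unexpected as a failure.
classify_rc() {
    case "$1" in
        0) echo "OK" ;;
        2) echo "WARNING" ;;
        *) echo "ERROR" ;;
    esac
}
```

A scheduler entry (cron, or Autosys/CA7 as listed under Scheduling Tools) would call `run_bteq` and then act on `classify_rc $?`.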

TECHNICAL SKILLS

RDBMS: Teradata, MS Access, DB2

Query Tools: Teradata SQL Assistant, SQL*Plus

ETL Tools: Informatica Power Center, Mainframe JCL

Scheduling Tools: Autosys, CA7

SAS Procedures: Print, Means, Univariate, Correlation, Regression, SQL, Report, Freq, Sort, Summary, Tabulate, Format, Import, Export, Transpose, Compare, Gplot and Gchart.

Scripting: UNIX Shell Scripting

SAS BI Tools: SAS Enterprise Guide, SAS Data Integration Studio, SAS Management Console, SAS Information Map Studio, SAS Customer Intelligence Studio

Languages: SQL, JCL, COBOL, XML

SAS Expertise: SAS v9.0/9.1/9.2/9.3; SAS/BASE, SAS/MACROS, SAS/STAT, SAS/GRAPH, SAS/SQL, SAS/ACCESS, SAS/ODS, SAS/REPORTS.

BI & ETL Tools: Informatica 8.6 (PowerCenter, PowerExchange, Metadata Manager), IBM InfoSphere 8.7 (DataStage, Business Glossary, Metadata Workbench)

Reporting Tools: Hyperion Web Analysis 9.3.1, Brio 6.x, Interactive Reporting, Smart View, Hyperion SQR, Financial Reporting, Hyperion Spreadsheet Add-In, Cognos, Data Reports, Business Objects 5.0, MicroStrategy 8.2, MS Access Reports

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling, Erwin 3.5.2/3.x/6.0, TOAD

GUI: MS Office Suite, Visual Basic

Documentation: MS-Office 2003/07/10, MS-Visio, IBM Lotus Notes

Operating Systems: Windows XP, Windows NT/98/2000, UNIX

PROFESSIONAL EXPERIENCE

Confidential, Dayton, OH

SAS Analyst

Responsibilities:

  • Imported data to create SAS datasets; used SAS/SQL to join tables and merged data with DATA step MERGE and PROC APPEND as required; generated reports and analyzed the results.
  • Created SAS modified reports using the DATA _NULL_ technique and wrote ad-hoc queries to service ad-hoc report requests.
  • Used SAS Enterprise Guide (EG) to create SAS programs
  • Converted the existing SAS base programs to SAS EG
  • Converted SAS codes to SAS Stored Processes
  • Performed SAS analysis using the customer database
  • Responsible for creating ETL design specification document to load data from operational data store to data warehouse.
  • Developed FastLoad and MultiLoad scripts to load data from legacy systems and flat files through Informatica into the Teradata data warehouse.
  • Worked with business analysts to provide business performance data using Teradata, Oracle, SQL, BTEQ, and UNIX.
  • Built VBA macros for recurring jobs that pull data for specific requirements.
  • Created daily, weekly, and monthly reports using Teradata, MS Excel, and UNIX.
  • Wrote BTEQ, SQL scripts for large data pulls and ad hoc reports for analysis.
  • Per ad-hoc requests, created tables (including history tables) and views on top of the Finance data mart/production databases using Teradata, BTEQ, and UNIX.
  • Strong working experience in understanding and modifying existing SAS programs and creating new SAS programs using macros.
  • Identify data discrepancies and produce resulting datasets, tables, listings and figures to ease correction in the data.
  • Scrutinize SAS processes and prepare extensive documentation for non-technical and technical audience.
  • Created tables, indexes, views, snapshots, and database links to view the information extracted from SQL files.
  • Created different types of tables (SET, MULTISET, derived, volatile, global temporary), macros, views, and procedures using SQL scripts.
  • Automated data analysis programs with a goal of improving efficiencies. Used SAS system macros for error handling, code validation, date stamping of log files and collected files to a given directory.
  • Generated listings and reports from SAS programs using MACROS, ODS and PROC TEMPLATE /REPORT/ TABULATE.
  • Tested primary indexes and skew ratio before populating a table; used sampling techniques and the Teradata EXPLAIN plan before querying large tables with billions of records and several joins.
  • Involved in Unica project by providing the useful information pertaining to existing campaign process, standard exclusion criteria and output files to different channels.
  • Automated daily, weekly, and monthly reports on the UNIX platform; documented and tested all queries used to pull data for reporting purposes to monitor portfolio pricing.
  • Automating and Scheduling the Teradata SQL Scripts in UNIX using Korn Shell scripting.
  • Extensively worked on Unix Shell Scripts for automating the SQL scripts, checking daily logs, sending e-mails, purging the data, extracting the data from legacy systems, archiving data in flat files to Tapes, setting user access level, automating backup procedures, Scheduling the Jobs, checking Space usage and validating the data loads.
  • Responsible for analyzing business requirements and developing reports using PowerPoint and Excel to provide data-analysis solutions to business clients.
  • Used Output Delivery system (ODS) facility to direct SAS output to RTF, PDF and HTML files.
  • Used Teradata utilities like MultiLoad, Tpump, and Fast Load to load data into Teradata data warehouse from multiple sources.
  • Provided monthly reports related to Card Data quality by using Teradata, SAS EG, MS Excel, Pivot Charts.
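The automation bullets above (checking daily logs, sending e-mails, validating data loads) can be sketched as a small log-scan routine. The leading `ERROR` prefix matches SAS log conventions; the mail alias and log directory are hypothetical, and the mail step is left as a hedged comment:

```shell
#!/bin/sh
# Sketch of the daily log check: count ERROR lines and flag failing jobs.
scan_log() {
    # Print the number of lines starting with ERROR in one log file
    grep -c '^ERROR' "$1" 2>/dev/null || true
}

notify_if_failed() {
    errs=$(scan_log "$1")
    errs=${errs:-0}
    if [ "$errs" -gt 0 ]; then
        echo "FAIL $1 ($errs errors)"
        # mailx -s "job failed: $1" oncall@example.com < "$1"   # hypothetical alias
    else
        echo "PASS $1"
    fi
}

# Example driver (the log directory is illustrative):
# for f in /logs/sas/*.log; do notify_if_failed "$f"; done
```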

Environment: SAS Enterprise Guide, PC SAS, MS Excel, MS Word, PowerPoint, Teradata 13.0, Teradata SQL Assistant, Oracle 9i, VB, Tableau, BTEQ, Erwin, MVS, UNIX Shell scripts, UNICA, DB2, SQL, BOBJ, PuTTY

Confidential, Los Angeles, CA

SAS Analyst

Responsibilities:

  • Performed analysis on campaign response datasets and created reports with extensive use of BASE SAS, macros, and SAS reporting procedures, linking with DDE to produce tables and graphs in Excel.
  • Develop the code according to the LLD using tools like the SAS-RAW DATA LOAD (RDL) tool
  • Standardize the code using tools like the ALIGN JCL, ASA and QRC
  • Perform Peer Review and Code Walkthrough.
  • Performed statistical analysis and data management on study data by utilizing appropriate statistical methods using SAS and SAS tools.
  • Used base SAS and SAS/SQL to generate tables.
  • Developed various SAS reports for Survey project and involved in code reviews for all the developed reports
  • Performed impact analysis for new/changed requirements and prepared the LLD (Low-Level Design).
  • Compared client-supplied artifacts such as the BRD and HLD against the LLD to find any incompleteness.
  • Prepared the UTP and UTS.
  • The project’s objective is to provide Development, Enhancements and Maintenance to the Data Warehouse
  • Converted Excel application (for above-mentioned advertising campaign) to HTML, utilizing SAS ODS and Proc Template
  • The warehouse is a mainframe / Informatica-UNIX and Teradata based database structure, which is designed to handle Bank of America’s Customer Knowledge & Decision Support activities (CK&DS).
  • The main objective of the Warehouse is to help the Bank in taking Decisions & making Strategies by providing historical data from multiple streams and help in Planning and Decision making.
  • Integrate the business, technical and operational metadata captured from various tools and create data lineage showing end to end data flow
  • Designed and implemented the metadata change-management and deployment process for moving metadata from one environment to another
  • Migrate the existing campaign execution from SAS to Unica
  • Write UNIX Shell scripts and design batch process to automate the metadata import-export, run data lineage and metadata deployment process
  • Designed the metadata, workflow, and models in Excel with great transparency, and used SAS macros to process the Excel file and generate ETL, model-scoring, and cash-flow-analysis SAS code.
  • Used SAS ODS to create HTML, RTF and PDF outputs files in the process of producing reports.
  • Assisted in setting up SAS execution platform on UNIX system.
  • Performed SIT (System Integration Testing) and provided data to the onshore counterpart for submission to the user community.
  • Assisted in deployment and provided technical and operational support during the install.
  • Involved in post-implementation support.
  • Ensured the overall data quality of all deliverables as part of tasks done on a weekly basis.
  • Used IPMS for all project management related quality activities.
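The SIT bullet above implies a source-to-target reconciliation step before handing data to the onshore team. A minimal sketch using flat-file record counts follows; the single header line is an assumption about the extract layout, not a detail from the resume:

```shell
#!/bin/sh
# Sketch of a flat-file reconciliation check used during SIT:
# compare the record count of a source extract against the loaded target extract.
count_records() {
    # Count data records, skipping one assumed header line
    tail -n +2 "$1" | wc -l | tr -d ' '
}

reconcile() {
    src=$(count_records "$1")
    tgt=$(count_records "$2")
    if [ "$src" -eq "$tgt" ]; then
        echo "MATCH ($src records)"
    else
        echo "MISMATCH (source=$src target=$tgt)"
    fi
}
```

In practice a check like this runs after each load and its output is attached to the SIT evidence for the release.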

Environment: SAS Enterprise Guide, SAS, Teradata SQL Assistant, VB, BTEQ, Oracle 9i, UNIX Shell scripts, SQL, PowerPoint, Excel, IBM z/OS, Informatica PowerCenter, MVS & UNIX, JCL, Ab Initio

Confidential, Pasadena, CA

SAS Analyst

Responsibilities:

  • Created SAS data sets from flat files, Excel files, and Oracle; Oracle was our main database.
  • Extracted data from different sources like Oracle and Teradata and text files using SAS/Access, SAS SQL procedures and created SAS datasets.
  • Used SAS PROC SQL pass through facility to connect to Oracle tables and created SAS datasets using a variety of SQL joins such as left join, right join, inner join and full join.
  • Analyzed requirements to find any ambiguity, incompleteness, or incorrectness for enhancements and other assigned projects
  • Modified VB SQL code to execute in SAS environment; modified system interface code; wrote report programs to replicate report layout (from VB to SAS).
  • Attended technical meetings for finalization of requirements and design.
  • Involved in Development and enhancement of the Associate Data Load process set up.
  • Standardization of the code work using tool ASA (Automatic Standard Analyzer).
  • Cleaning Database by eliminating duplicate records utilizing the tool DATACLEAN before preparing the test environment.
  • Hired and trained team members on UNICA Affinium Campaign
  • Performed reporting and data analysis using Access, Excel, and Crystal Reports.
  • Wrote programs and SAS macros to assist other team members in resolving issues during conversion, and explained their use and functionality to the team.
  • Measured test coverage of the system after each enhancement using the TCA (Test Coverage Analyzer) tool, providing the percentage of code tested as well as the flows yet to be tested.
  • Worked on a mainframe-based Teradata database designed to handle customer knowledge and decision-support activities.
  • Data Warehouse/OLAP/Reporting/Dashboard development
  • Started company’s movement into OLAP and data warehousing.
  • This project involved extracting data from source files for different fleet systems (ADIM, CDS, CSDB, CAP, IMPACS, MSP) and loading the data into the warehouse
  • The process achieves the warehouse's main objective of helping the bank in its decision- and strategy-making process.
  • Converted SQL queries and report code, from Actuate VB running on NT, to SAS running on UNIX.
  • Used the SAS tool created specifically for the DW resource pool to generate the CREATE and INSERT scripts for ADIM.
  • Data is loaded using ETL and is then used to develop views and extracts and to generate reports
  • Co-ordination with Offshore for the coding work based on the approved design.
  • Carried out Unit, System Integration, Regression testing for appropriate initiatives based on the test plans/scripts.
  • Installation of components in production based on appropriate timelines.
  • Production support activities, involving production job monitoring, problem fixing, and persistent-issue resolution; served as the primary on-call for ADIM.
  • Worked on technical problem tickets from the ADIM user group, involving problem tracking, analysis, and follow-up with the source system based on ticket type.

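The production-support bullets above (job monitoring, problem fixing, on-call for ADIM) typically rely on a retry wrapper before a failure becomes a ticket. A generic sketch follows; the retry limit and the wrapped command are illustrative parameters, not details from the resume:

```shell
#!/bin/sh
# Sketch of a production-monitoring retry wrapper: re-run a failing
# step up to a given number of times before giving up and escalating.
retry() {
    max=$1; shift          # first arg: max attempts; rest: command to run
    n=1
    while ! "$@"; do
        if [ "$n" -ge "$max" ]; then
            echo "giving up after $n tries"
            return 1
        fi
        n=$((n + 1))
    done
    echo "succeeded on try $n"
}

# Example (command is hypothetical): retry 3 run_adim_load.sh
```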
Environment: SAS Enterprise Guide, Teradata SQL Assistant, BTEQ, Oracle 9i, Teradata, Teradata SQL, Windows NT, Windows XP, UNIX Shell scripts, SQL, PowerPoint, Excel, IBM Mainframe, Tableau, UNICA, MVS/ESA, JCL

Confidential

SAS Analyst

Responsibilities:

  • Used SAS and SQL skills to perform ETL from DB2 and Teradata databases and created SAS datasets, SAS macros, Proc/data steps and SAS formats as required.
  • Developed reports as per business requirements and created various reports like summary reports, tabular reports, excel reports etc.
  • Developed new/modified macros for report generation using SAS / Macros as per business requirements.
  • Created SAS datasets from various sources like Excel datasheets, flat files and created reports and files from existing SAS datasets.
  • Responsible for correcting and maintaining daily, weekly, and monthly insurance claims and payments reports using SAS.
  • Created ad hoc reports for various data analysis projects.
  • Programming in UNICA Affinium, extracting lists of card members for various marketing campaigns, utilizing PC and MVS platforms for additional programming.
  • Utilized Teradata, DB2 and SyncSort for data collection
  • Developed Rules files to upload data/metadata from financial system's database to Essbase cube
  • Migrated Metadata, Security reports using Life cycle management (LCM) from Dev to QA & to Production
  • Provide best practice recommendations and guidelines in the form of standards, checklists to all the development groups to capture the required metadata seamlessly
  • Import technical metadata of various source & target systems including Database (Netezza, DB2, Oracle, SQL Server), Files (VSAM, Sequential), Data Models (CA ERwin), Source to Target mapping spreadsheets (MS-Excel), ETL (DataStage), Data Profile Results (Information Analyzer), Reports (Cognos) into Metadata repository
  • Developed data mart for sales and financial data/SSAS OLAP cubes/SSRS reports
  • Assisted in training the off-shore SAS and IT teams in understanding company data and systems.
  • Responsible for creating and delivering recurring as well as ad hoc marketing campaign programming within strict timelines while under constantly changing requirements using SAS, SAS SQL, and Teradata SQL in a UNIX environment with Oracle, Informix, and Teradata relational databases.
  • Responsible for actively supporting the development of systems that provide monthly activity-based costing reports and profitability information to departments throughout the company, using SAS, JCL, PRF query creation, Teradata SQL, and BTEQ in a mainframe environment, along with Microsoft Excel.
  • Responsible for risk-management processing and criteria development for statement mailings each month to all current U.S. Associates consumer customers; this included qualifying and auditing over $200 million in pre-approved offers as well as selecting customers for various marketing initiatives using SAS in a UNIX environment
  • Used SAS to develop various ad hoc programs and reports as well as provide detailed monthly tracking analysis, executive summaries, and presentations to Senior Management on performance and execution of Marketing programs and trend analysis for Planning and Control issues

Environment: SAS Enterprise Guide, SAS v8/v9, PC SAS, PROC SQL, Teradata, SQL Assistant 12.0, Erwin, UNIX Sun Solaris, ClearCase, Hyperion Brio, Oracle, Tableau, QlikView, Excel, MS Access, Rational Rose, Requisite-Pro, AGILE, Documentum, MS Project 2002, MS Visio, MS Word, Test Director, Java, COGNOS, MySQL, Windows NT/2000, OLAP, PL/SQL, SQL, UNIX
