
Data Analyst Resume


Dayton, OH


  • Data Analyst with solid experience in business analysis, data analysis, application development, and reporting.
  • Strong experience in SAS, SAS/Connect, SAS/Access, SQL, PL/SQL, web-related technologies, data modeling, data warehousing concepts, and ETL techniques (extraction, transformation, and loading).
  • Clear understanding of, and participation in, all phases of the Software Development Life Cycle (SDLC), including preparing business documentation and test plans, communicating with business users, and working with large datasets.
  • Extensive experience working at multiple levels of a metadata management project, including designing the metadata strategy, defining the metadata management architecture, managing the metadata project, mentoring metadata developers, and conducting training.
  • Worked extensively on SAS Procedures and Data Step Programming to develop the process flow applications from Data Acquisition to Data Modification/Reporting.
  • Proficiency in ETL processing on various data sources like Oracle, DB2, SQL server, Teradata, MS Access, MS Excel and text files.
  • Experience in Teradata development and design of ETL methodology for supporting data transformations and processing in a corporate-wide ETL solution using Teradata TD12.0/TD13.0, Mainframe, and Informatica PowerCenter 8.6.0/9.0.1, including administration, analyzing clients' business needs, developing effective and efficient solutions, and ensuring client deliverables within committed deadlines.
  • Working knowledge of UNICA Affinium Campaign and Teradata CRM tools and have worked on various databases like Oracle, DB2 and Teradata.
  • Experienced in generating reports with browser interfaces like SAS Web Report Studio and the Windows client SAS Enterprise Guide, and built reports using SAS Add-In for Microsoft Office.
  • Designing, documenting and creating SAS OLAP Cubes within SAS OLAP Cube Studio.
  • Administering large Teradata database system in development, staging and production.
  • Involved in various stages of Software development life cycle.
  • Strong hands-on experience using Teradata utilities (FastExport, MultiLoad, FastLoad, TPump, BTEQ, and Queryman).
  • Skilled in Data Warehousing Logical and Physical Data Modeling using Star Schema and Snowflake Schema.
  • Experience in Banking and Credit Card industry business processes.
  • Technical and functional experience in data warehouse ETL implementations using Informatica PowerCenter 9.0.1/8.6/8.1/7.1 and Teradata.
  • Good knowledge of Teradata RDBMS architecture, tools, and utilities.
  • Sourced data from disparate sources like Mainframe Z/OS, UNIX flat files, IBM DB2 and loaded into Teradata DW.
  • Strong Experience in Designing end-to-end ETL framework and strategies to handle Re-startability, Error handling, Data reconciliation, Batch processing and process automation
  • Strong experience in Data analysis, Business rules development, data mapping and translating business requirements into technical design specifications
  • Good understanding of data modeling (Star and Snowflake schemas), physical and logical models, and DWH concepts.
  • Good experience in setting up real time Change Data Capture process using IBM Change Data Capture (CDC) tool sourcing DB2 database and targeting Netezza database
  • Good experience in Data profiling and Data analysis using IBM InfoSphere Information Analyzer
  • Good understanding of SCRUM methodology and Sprint planning to achieve the project goals and milestones effectively and efficiently
  • Experienced in developing data warehouses/data marts, Star Schema, Snowflake Schema, and Slowly Changing Dimensions.
  • Excellent command in writing complex routines for Data Validations, Data Extraction, transformation and loading to target decision support systems using MS Excel, Pivot tables and SAS on various environments.
  • Very good knowledge of budgeting, forecasting, and financial planning; created Balance Sheet, Income Statement, and Cash Flow Statement reports with multiple currencies using Hyperion Planning.
  • Extracted source data from Mainframe Z/OS into the UNIX environment using JCL scripts and SQL, and created formatted reports for business users using BTEQ scripts.
  • Strong Teradata SQL, ANSI SQL coding skills.
  • Proven skills in Data Cleansing, Data Archival, Data Migration, ad-hoc reporting and coding utilizing SAS on UNIX, Windows.
  • Expertise in Report formatting, Batch processing, Data Loading and Export using BTEQ.
  • Performed performance tuning of user queries by analyzing explain plans, recreating user driver tables with the right primary index, scheduling collection of statistics, and adding secondary or join indexes.
  • Well versed with Teradata Analyst Pack including Statistics Wizard, Index Wizard and Visual Explain.
  • Extensively used derived tables, volatile tables, and global temporary tables (GTTs) in many of the BTEQ scripts.
  • Involved in designing and building stored procedures, view generators and macros for the module.
  • Worked on moving data to a dedicated Analytics environment for better performance; involved in analysis, administration, security, development, and support.
  • Good understanding of Tableau architecture, design, development and end user experience
  • Expertise in procedures, functions, database packages, and triggers, with experience in troubleshooting techniques, tuning SQL statements, query optimization, and dynamic SQL.
  • Good knowledge of Performance tuning, Application Development and Application Support on UNIX, MVS and WINDOWS NT Environments
  • Developed UNIX Shell scripts for Batch processing.
  • Responsible for writing Deployment/Release Notes before the project release and End User’s Manual for production team.
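The data-reconciliation step of the ETL framework described above can be sketched as a key-set comparison between source and target extracts. This is a minimal Python illustration only, not code from any of the roles; the key name and rows are invented:

```python
def reconcile(source_rows, target_rows, key):
    """Compare source and target extracts by business key: the
    reconciliation step of an ETL framework. Keys and data here
    are illustrative, not from any actual load."""
    src = {r[key] for r in source_rows}
    tgt = {r[key] for r in target_rows}
    return {
        "missing_in_target": src - tgt,       # loaded rows that were dropped
        "unexpected_in_target": tgt - src,    # rows with no source counterpart
    }

source = [{"acct": "A-1"}, {"acct": "A-2"}, {"acct": "A-3"}]
target = [{"acct": "A-1"}, {"acct": "A-3"}, {"acct": "A-9"}]
print(reconcile(source, target, "acct"))
```

In practice the same comparison would be expressed as SQL set operations against staging and warehouse tables; the sketch just shows the logic.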


Programming Languages: SQL, PL/SQL, T-SQL, SAS 9.1/9.2/9.3/9.4, SAS Macro, Tableau

SAS: SAS Base, SAS/Macro, SAS/Connect, SAS-SQL, SAS/STAT, SAS/Graph and SAS/Access

SAS BI Tools: SAS Data Integration Studio, SAS Enterprise Guide, SAS Management Console, SAS Information Map Studio, SAS Customer Intelligence Studio

Web Technologies: HTML, XML, Macromedia Flash, Adobe Photoshop, Dream Weaver

SAS Procedures: Print, Means, Univariate, Correlation, Regression, SQL, Report, Freq, Sort, Tabulate, Format, Import, Export, Transpose, Compare, Gplot and Gchart

Databases: SQL server, Teradata, DB2, Oracle, MS Access

Operating Systems: MS Windows (95/98/2000/NT/XP), Unix, DOS

Application Software: MS Office (Word, Excel, PowerPoint, Access), Microsoft Visual Studio 6.0, MATLAB, HP Quality Center, Publisher, Lotus Notes, Outlook, SharePoint, InfoPath

Other: Subversion (SVN), Microsoft SQL Server Management Studio


Confidential, Dayton, OH

Data Analyst


  • Collaborated with a team of Business Analysts representing the Information Technology & Services and Customer Knowledge & Insights departments to create all business requirements documentation.
  • Conducted extensive gap analysis by analyzing downstream data to determine discrepancies between new and old datasets and reconcile all information in the Campaign Management Data Mart.
  • Played a key role in the creation of the Business Source to Target Mapping tool, summarizing, compiling, and analyzing complex data sets from multiple departments and assisting the campaign team in completing a matrix to enhance gap analysis.
  • Improved data collection authenticity and validity through migration from manual collection methods to an automated data collection process.
  • Imported data to create SAS datasets; used SAS/SQL to join tables and merged data with DATA step MERGE and PROC APPEND as required, then generated reports and analyzed the results.
  • Created SAS modified reports using the DATA _NULL_ technique and wrote ad-hoc queries to service ad-hoc report requests.
  • Used SAS EG to create SAS programs.
  • Converted existing Base SAS programs to SAS EG.
  • Converted SAS code to SAS Stored Processes.
  • Performed SAS analysis using the customer database.
  • Collaborated with the Customer Marketing team on the Campaign Solution Requirements Document (SRD), analyzing data and key demographics to establish targets and multi-channel requirements.
  • Used SQL to complete queries and assess tables, navigating a process-oriented system to ensure all transfer files were accurate and compliant with bank policies.
  • Responsible for creating ETL design specification document to load data from operational data store to data warehouse.
  • Developed FastLoad and MultiLoad scripts to load data from legacy systems and flat files through Informatica into the Teradata data warehouse.
  • Worked with business analysts to provide business performance data using Teradata, Oracle, SQL, BTEQ, and UNIX.
  • Designed and developed a user management platform (Python, SQL).
  • Detected data inconsistencies in large data sets (Python, SQL, Java, Bash).
  • Built VBA macros for recurring processes to pull data for specific requirements.
  • Created daily, weekly, and monthly reports using Teradata, MS Excel, and UNIX.
  • Wrote BTEQ, SQL scripts for large data pulls and ad hoc reports for analysis.
  • Created tables (including history tables) and views on top of the Finance Data Mart and production databases per ad-hoc requests, using Teradata, BTEQ, and UNIX.
  • Designed and Optimized Data Connections, Data Extracts, Schedules for Background Tasks and Incremental Refresh for the weekly and monthly dashboard reports on Tableau Server.
  • Understood and modified existing SAS programs and created new SAS programs using macros.
  • Identify data discrepancies and produce resulting datasets, tables, listings and figures to ease correction in the data.
  • Scrutinize SAS processes and prepare extensive documentation for non-technical and technical audience.
  • Created tables, indexes, views, snapshots, and database links to expose the information extracted from SQL files.
  • Created different types of tables (SET, MULTISET, derived, and volatile), macros, views, and procedures using SQL scripts.
  • Automated data analysis programs with a goal of improving efficiencies. Used SAS system macros for error handling, code validation, date stamping of log files and collected files to a given directory.
  • Generated listings and reports from SAS programs using MACROS, ODS and PROC TEMPLATE /REPORT/ TABULATE.
  • Involved in Unica project by providing the useful information pertaining to existing campaign process, standard exclusion criteria and output files to different channels.
  • Automated daily, weekly, and monthly reports on the UNIX platform to monitor portfolio pricing, and documented and tested all queries used to pull data for reporting purposes.
  • Automating and Scheduling the Teradata SQL Scripts in UNIX using Korn Shell scripting.
  • Extensively worked on Unix Shell Scripts for automating the SQL scripts, checking daily logs, sending e-mails, purging the data, extracting the data from legacy systems, archiving data in flat files to Tapes, setting user access level, automating backup procedures, Scheduling the Jobs, checking Space usage and validating the data loads.
  • Responsible for analysing business requirements and developing Reports using PowerPoint, Excel to provide data analysis solutions to business clients.
  • Used Output Delivery system (ODS) facility to direct SAS output to RTF, PDF and HTML files.
  • Tested primary indexes and skew ratio before populating tables, and used sampling techniques and EXPLAIN plans in Teradata before querying large tables with billions of records and several joins.
  • Used Teradata utilities like MultiLoad, Tpump, and Fast Load to load data into Teradata data warehouse from multiple sources.
  • Utilized Tableau web data connectors to connect to Google Sheets.
  • Provided monthly reports related to Card Data quality by using Teradata, SAS EG, MS Excel, Pivot Charts.
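The skew-ratio test mentioned above checks how evenly a candidate primary index hashes rows across AMPs before a table is populated. Here is a minimal Python sketch of the arithmetic only; the function name and row counts are invented for illustration:

```python
def skew_factor(rows_per_amp):
    """Teradata-style skew factor: 100 * (1 - avg/max).

    0 means rows hash perfectly evenly across AMPs; values
    approaching 100 mean one AMP holds most of the rows."""
    peak = max(rows_per_amp)
    if peak == 0:
        return 0.0
    avg = sum(rows_per_amp) / len(rows_per_amp)
    return 100.0 * (1.0 - avg / peak)

# Even distribution: skew factor near 0
print(skew_factor([250_000, 251_000, 249_000, 250_000]))
# A hot AMP from a poorly chosen primary index: skew factor high
print(skew_factor([900_000, 30_000, 40_000, 30_000]))
```

On a real system the per-AMP counts would come from querying DBC table statistics; a high skew factor is the signal to pick a different primary index before loading.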

Environment: SAS Enterprise Guide, PC SAS, MS Excel, MS Word, PowerPoint, Teradata 13.0, Teradata SQL Assistant, Oracle 9i, DB2, SQL, BTEQ, VB, Tableau, Erwin, MVS, UNIX Shell scripts, UNICA, BOBJ, Putty

Confidential, NY

Data Analyst


  • Interfaced with marketing management to gather information requirements; created and analyzed Business Requirements Documents (BRD) and prepared Functional Requirement Documents (FRD) and technical documents.
  • Integrate results of the Marketing Mix Analysis, Price Elasticity and other advanced analyses into business planning process.
  • Analyzed large volume of data (Millions of structured records, millions of semi-structured, unstructured documents and over 10 TB) of loan and deposit origination data and documents, customer data, core banking data, financial data and reports, HR and payroll data and reports, loan and check images.
  • Developed automated statistical reports to summarize large datasets (Python, R, SQL).
  • Work closely with different Marketing Groups in designing and analyzing the data.
  • Perform Metadata (Information Map, OLAP, Enterprise Guide) development
  • Provide support for, and automation of, routine SAS-EBI stored procedures.
  • Create Data Dictionary and migrate the data when shifting to new version of SAS.
  • Performed SAS programming required for data analysis and reporting in Unix environment
  • Extract customer data from Oracle, DB2, and Excel sources by writing SAS PROC SQL pass-through queries and using SAS/ACCESS to create customer mailing lists for direct mailing and telemarketing.
  • Involved in pulling data from Oracle and DB2 databases, carefully choosing source tables and deciding on target tables to be created in the local environment, and acquiring metadata sources for these tables to load them into SAS ETL Studio.
  • Created libraries and assigned them to the tables in ETL Studio.
  • Maintained and enhanced existing SAS reporting programs for marketing campaigns.
  • Extensively used Proc SQL for creating ad-hoc reports and graphs as per the requirements of the users.
  • Used SQL extensively to check the data integrity, data verification and validation.
  • Loaded flat files into the Teradata RDBMS using Teradata load utilities.
  • Created tables, views in Teradata, according to the requirements.
  • Involved in the risk and loss analysis.
  • Involved in updating and maintaining the databases like Sybase and Oracle using SQL.
  • Extensively wrote SQL queries to extract the data from databases.
  • Involved in the design of the data warehouse.
  • Responsible for configuring ODBC drivers to connect to the Oracle database from SAS, which facilitated the extraction of data.
  • Developed macros and views to integrate complex business rules.
  • Developed stored procedures for managing complex business logic.
  • Used SQL and PROC SQL Pass through Facility to work with Oracle, DB2.
  • Moved data set across platforms (from PC and Mainframe to UNIX and Vice Versa).
  • Developed standard and custom data listings, summary tables, graphs.
  • Created HTML reports using SAS ODS facility.
  • Used SAS Macros to write re-usable code to produce recurring weekly and monthly reports
  • Extensive usage of DATA NULL for reporting purposes.
  • Created reports and tables using SAS Procedures like Proc Report, Proc Tabulate.
  • Performed Data Analysis by running PL/SQL Queries to validate databases.
  • Used Dynamic Data Exchange (DDE) to acquire data from Excel spread sheets.
  • Involved in PL/SQL code review and modification for the development of new requirements.
  • Used Tableau to perform data migration, visual analytics, and reporting on critical-to-quality data.
  • Aggregated final data sets using a SAS aggregate script, transformed them into Teradata tables, and loaded those into Tableau, refreshed at regular intervals for the final visualization.
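The final aggregation step described above (rolling detail rows up before loading a reporting table) amounts to a group-and-sum. A minimal Python sketch, with field names invented for the example:

```python
from collections import defaultdict

def aggregate(rows, key_fields, value_field):
    """Group detail rows by key_fields and sum value_field: the
    roll-up applied to a final data set before it is loaded into
    a reporting table. Field names here are illustrative."""
    totals = defaultdict(float)
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        totals[key] += row[value_field]
    return dict(totals)

detail = [
    {"region": "OH", "month": "2024-01", "balance": 120.0},
    {"region": "OH", "month": "2024-01", "balance": 80.0},
    {"region": "NY", "month": "2024-01", "balance": 50.0},
]
print(aggregate(detail, ["region", "month"], "balance"))
```

In the actual workflow this roll-up was expressed in SAS and materialized as Teradata tables; the sketch only shows the grouping logic that any such script encodes.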


Confidential, Seattle, WA

Data Analyst


  • Used Base SAS, macros, SAS Enterprise Guide, and SAS SQL for programming.
  • Modified macros for report generation using SAS Macros as per the requirements.
  • Programmed with the SAS Enterprise Guide query builder to construct complex queries for analysis.
  • Extensively used DATA _NULL_ and SAS procedures such as PRINT, REPORT, TABULATE, FREQ, MEANS, SUMMARY, and TRANSPOSE to produce ad-hoc and customized reports and external files.
  • Worked with complex datasets to extract customized reports using PROC SQL, PROC SORT and PROC REPORT, PROC TABULATE for creating a preferred list of customers as per the given requirements from business analysts.
  • Developed, enhanced & executed complex programs and reports with proper documentation.
  • Worked with the team to identify the business reporting needs and studied current reporting practices outside of QlikView through pivot tables.
  • Very good knowledge of the features of MicroStrategy 9.4.1: MicroStrategy Visualization, MicroStrategy Dashboarding, and MicroStrategy Reporting.
  • Proficient in working in different database environments such as MS SQL Server and Teradata.
  • Comprehend production codes, undertake diagnosis, resolution of complex issues and improve the efficiency of processes and programs.
  • Developed business domain expertise, project-based business knowledge, and good theoretical knowledge of tools like SAS and MS Office; independently delivered end to end on complex projects and assisted in developing and testing new information infrastructure.
  • Worked on bug fixing and enhancements to existing Cognos reports in Report Studio and models in Framework Manager.
  • Manage end-user expectation in the delivery of projects within agreed timelines and as per agreed quality standards.
  • Independently executes standard processes, implements predefined algorithms and corrects identified problems.
  • Created SQL views for retrieving and dynamically updating data.
  • Maintaining large data sets, i.e. reading in data from various sources in various formats to create SAS data sets and/or ASCII files.
  • Documenting the process, i.e. documenting all the possible information about the application like SAS programs, DATA files and source.
  • Prepared project plan for the execution.
  • Independently interacted with business partners to understand business need and translated to technical specifications. Assisted in developing and testing new information infrastructure.
  • Generated HTML, RTF, PDF, and text reports using ODS statements, and performed data analysis using SAS Base, Teradata 12, and Microsoft Excel.
  • End to end project management and hands on delivery when required.
  • Took ownership of and actively managed project timelines. Executed standard reports, processes, and files, and implemented predefined algorithms.
  • Sets up QA/QC framework and ensures adherence and constant improvement.
  • Used SAS system macros for error handling, code validation, date stamping of log files, collected files to a given directory and scheduling.
  • Develops business and technical knowledge of team members and identifies opportunities of improvement.
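The log-handling pattern mentioned above (date-stamped log files scanned for errors after each run) can be sketched in Python. The file-name convention, message patterns, and sample log text are assumptions for illustration, not the actual SAS macros:

```python
import datetime
import re

def datestamped_log_name(job, when=None):
    """Build a date-stamped log file name, e.g. 'daily_load_20240131.log'
    (the naming convention is assumed for this example)."""
    when = when or datetime.date.today()
    return f"{job}_{when:%Y%m%d}.log"

def scan_log(text):
    """Return lines that signal failure, the way a nightly log check
    flags ERROR/WARNING messages after a batch run."""
    pattern = re.compile(r"^(ERROR|WARNING|\*\*\* Failure)", re.IGNORECASE)
    return [line for line in text.splitlines() if pattern.match(line.strip())]

log = """NOTE: 42 rows loaded.
ERROR: Table FIN.ACCT does not exist.
NOTE: step completed."""
print(datestamped_log_name("daily_load", datetime.date(2024, 1, 31)))
print(scan_log(log))
```

The same idea is what the SAS error-handling macros implement: stamp each run's log with the date, then grep it for failure markers before declaring the job done.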

Environment: SAS, Mainframe, BASE/SAS, SAS/MACRO, SAS/SQL, SAS/CONNECT, SAS/ACCESS, SAS/ODS, MicroStrategy, Cognos, Oracle, QlikView, Teradata, DB2, UNIX and Windows 2003/XP.


SAS Programmer


  • Developed and maintained programs in SAS using SAS tools for Windows in a user support environment.
  • Extensive experience with the SAS programming and in data step and with various SAS Procedures in Base SAS and SAS /STAT, including thorough knowledge of SAS Macro Language.
  • Performed data analysis and statistical analysis, and generated reports, listings, and graphs using SAS tools such as SAS/Base, SAS/Macros, SAS/Graph, and SAS/SQL.
  • Program documentation on all programs, files and variables for accurate historical record and for future reference.
  • Generated reports using PROC REPORT and PROC SUMMARY, and provided descriptive statistics using PROC MEANS, PROC FREQ, and PROC UNIVARIATE.
  • Advanced modelling methods were adopted using SAS BASE, SAS STAT, SAS ACCESS, SAS MACRO, SAS GRAPH for the Consumer Lines and Loans Processing.
  • Prepared new datasets from raw data files using import techniques and modified existing datasets using SET, MERGE, SORT, and UPDATE statements, formats, functions, and conditional statements.
  • Involved in the design and development of Ab Initio load graphs to load tables in Oracle and in documenting the complete Ab Initio graphs.
  • Involved in redesigning existing graphs in Ab Initio and documented all new and enhancement requests.
  • Implemented a credit risk analysis reporting system based on FICO scores and data management principles such as joining data sets, indexing, data aggregation, record selection, sub-setting, multiple records per case, creation and modification of views, and accessing multiple databases.
  • Created complex and reusable macros, and extensively used existing macros, in developing programs for data cleaning, validation, analysis, and report generation. Tested and debugged existing macros.
  • Created SAS customized reports using the DATA _NULL_ technique.
  • Generated graphs using SAS/GRAPH and the SAS Graphics Editor.
  • Developed routine SAS macros to create tables, graphs, and listings.
  • Developed SAS macros for data cleaning and reporting and to support routine processing.
  • Used PROC REPORT to generate reports. Developed annual reports based on existing data.
  • Imported data from Excel sheets for analysis in SAS.
  • Developed SAS programs using Base SAS for tabulation counts, correlations, and checks for dispersion.
  • Extracted data from the database using SQL queries and created SAS data sets.
  • Created RTF, PDF, and HTML reports using SAS ODS.
  • Converted MS Word documents, MS Excel sheets, and SQL tables into data sets.
  • Extensively used the SAS Macro facility to provide reusable programs that can be conveniently reused from time to time.
  • Formatted HTML and RTF reports using the SAS Output Delivery System (ODS).
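A reusable reporting macro of the kind described above is, in spirit, a listing generator parameterized by title and column list. A minimal Python analogue, with column names and data invented for the example:

```python
def recurring_report(title, rows, columns):
    """Render a simple fixed-width listing: a Python analogue of a
    reusable reporting macro parameterized by title and columns.
    Column names and values here are illustrative only."""
    # Width of each column = widest of header and all cell values
    widths = {c: max(len(c), *(len(str(r[c])) for r in rows)) for c in columns}
    header = "  ".join(c.ljust(widths[c]) for c in columns)
    lines = [title, "-" * len(header), header]
    for r in rows:
        lines.append("  ".join(str(r[c]).ljust(widths[c]) for c in columns))
    return "\n".join(lines)

rows = [{"acct": "A-1", "bal": 120}, {"acct": "B-2", "bal": 75}]
print(recurring_report("Weekly Balance Listing", rows, ["acct", "bal"]))
```

The point of the pattern, in SAS as here, is that weekly and monthly runs differ only in the parameters passed, so the report body is written once.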

Environment: SAS 9.1.3, SAS/BASE, SAS/STAT, SAS Macros, Windows


SAS Programmer/ Data Analyst


  • Report Generation and Documentation with respect to evaluation of model performance and effectively communicating the results to higher management.
  • Move data sets across platforms (from PC and Mainframe to UNIX and Vice Versa).
  • Worked directly with end users to gather business requirements.
  • Created SAS scripts and wrote SQL queries based on input from the analytics/development team.
  • Researched existing Consumer and Small Business Card, and Mortgage defect identification logic to facilitate automation of defect and dashboard reporting.
  • Performed effort estimation, project planning, coding, unit testing, delivery, and support for MS Excel and MS Access based VBA applications as a sole team member, as part of a Technology With Operations initiative.
  • Preprocessed training data and performed advanced modeling on testing dataset.
  • Worked extensively on pulling data from various files; extracted data from databases as well as flat files.
  • Developed SAS programs using Proc Tabulate, SAS Base and SAS/Stat that enabled users to generate reports on Daily/Weekly/Monthly/Quarterly/Annual reports.
  • Calculated possible risk and premiums for individual organizations.
  • Created SAS data mining and scoring data sets from source tables by programming in Base SAS.
  • Evaluated the model performance using decision trees and confusion matrices.
  • Developed and maintained multivariate statistical models used to predict customer behavior and to profile customer potential.
  • Customized existing SAS programs and created new programs using SAS/Macros to improve the consistency of the results.
  • Compiled statistics and created detailed reports and graphs for management and customer review using SAS/Graph, SAS/Macro and SAS/Report.
  • Correlating and presenting weekly or monthly data to senior management.
  • Performed data manipulation and scrubbing using UNIX, BTEQ, and KSH scripting.
  • Performed validation testing and UAT to produce clean and consolidated results.
  • Assisted the DBA with scheduling jobs and running various scripts related to space management.
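Model evaluation with a confusion matrix, as mentioned above, reduces to counting the four outcome cells and deriving summary metrics from them. A small Python sketch with made-up labels (not the actual model outputs):

```python
def confusion_matrix(actual, predicted):
    """Count TP/FP/FN/TN for a binary classifier, the tally used
    when evaluating a customer-behavior model against holdout
    data. Labels here are illustrative."""
    counts = {"tp": 0, "fp": 0, "fn": 0, "tn": 0}
    for a, p in zip(actual, predicted):
        if a and p:
            counts["tp"] += 1       # predicted positive, was positive
        elif not a and p:
            counts["fp"] += 1       # predicted positive, was negative
        elif a and not p:
            counts["fn"] += 1       # predicted negative, was positive
        else:
            counts["tn"] += 1       # predicted negative, was negative
    return counts

def accuracy(c):
    total = sum(c.values())
    return (c["tp"] + c["tn"]) / total if total else 0.0

actual    = [1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 1, 0]
cm = confusion_matrix(actual, predicted)
print(cm, accuracy(cm))
```

Precision, recall, and the decision-tree comparisons mentioned above all derive from these same four counts.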

Environment: SAS/Base, SAS/Graph, SAS/Macro, SAS/Stat, SAS/SQL, PL/SQL, MS-Excel and HTML.
