
Sr Data Analyst Resume


NJ

SUMMARY

  • SAS developer with more than 6 years of experience in analyzing, designing and developing Extraction, Transformation and Load (ETL) processes using SAS DI Studio to build data warehouses, reporting solutions and other data integration projects, creating data marts from enterprise data sources and fully integrating the SAS Business Intelligence suite.
  • Experience in handling multiple projects as a lead.
  • Experience in creating jobs using various transformations from vendor sources into the target data marts using SAS DI Studio.
  • Created user-defined transformations for custom job processes and generic requirements.
  • Experience in architecting data marts and OLAP cubes for fast and flexible reporting of pre-summarized large datasets and decision support systems for business users, using SAS Data Integration Studio, SAS Enterprise Guide, SAS OLAP Cube Studio, SAS/GRAPH, SPDS, SAS Management Console, Web Report Studio, Information Map Studio and Information Delivery Portal.
  • Experience in administration, configuration and maintenance of foundation and custom metadata repositories, user authentications, and deploying jobs for scheduling using SAS Management Console.
  • Involved in migrating metadata jobs from dev/test environments to production servers using the package facility.
  • Created reports and analyzed CRM data using SAS EBI.
  • Extensive experience in using various SAS modules - SAS/Base, SAS/Stat, SAS/Macros, SAS/Access, SAS/Connect, SAS/Graph, SAS/ODS and SAS/SQL(pass-through facility) on Windows and UNIX environments in order to accomplish financial Data Analysis and preparation of SAS datasets, multi-dimensional reports, tables, listings, summaries and graphs according to the corporate guidelines and project requirements.
  • Performed load-balance tests by tuning SAS DI Studio jobs.
  • Expertise in creating mapping models that map business models to a star-schema logical database model using SAS Information Map Studio, storing them in the metadata server so that business users can be self-sufficient in generating their reports and analytics using SAS Information Delivery Portal, Web Report Studio and portlets.
  • Created new job schedules for daily and weekly processing using Maestro.
  • Experience includes SAS STAT, OR, ETS, CONNECT, GIS, ITRM, Enterprise Miner, Marketing Automation/Campaign Management, Integration Technologies.
  • Created stored processes to help the users to access/execute the jobs remotely and generate the reports.
  • Strong experience with various database tools such as Oracle, Teradata, DB2, Siebel and SQL Server.
  • Experience in data cleansing and benchmarking metadata jobs in the test environment before release to production, using the SAS DI Studio data validation transformations and package facility.
  • Familiarity with the ERwin data modeling tool.
  • Developed standard operating procedures for resource optimization, productivity and maintainability.
  • Involved in various process improvement and data cleansing projects, automating redundant tasks across teams using scripting languages.
  • Possess outstanding analytical and problem-solving skills. Proactive, innovative and challenge-oriented, with the ability to work both individually and as part of a team, excellent communication and interpersonal skills, and the ability to handle multiple projects at one time.
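As a brief illustration of the SAS/SQL pass-through facility mentioned in the summary (the connection options, schema and table names below are placeholders, not production code):

```sas
/* Sketch of the explicit SQL pass-through facility against Oracle.  */
/* &ora_user, &ora_pass, "proddb" and the accounts table are         */
/* hypothetical names used only for illustration.                    */
proc sql;
  connect to oracle (user=&ora_user password=&ora_pass path="proddb");

  create table work.acct_summary as
    select *
    from connection to oracle
      ( select account_id,
               sum(balance) as total_balance
          from accounts
         group by account_id );

  disconnect from oracle;
quit;
```

Because the inner query runs on the database server, only the summarized rows cross the network into SAS, which is what makes pass-through efficient for large tables.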

TECHNICAL SKILLS

SAS Skills: SAS 9 Certified Base Programmer; SAS V7.x, V8.2, V9.1.3, V9.2; ACCESS engines (Oracle, ODBC, OLEDB, DB2); SAS Data Integration Studio 3.3, 3.4 and 4.2; SAS Enterprise Guide 3.3, 4.1 and 4.2; SAS EBI; SAS Management Console 9.1; SAS OLAP Cube Studio; SAS Java API; SAS Information Map Studio; SAS Enterprise Miner/data mining; SAS Web Report Studio; SAS FM; SAS AF; SAS Add-in for Microsoft Office; SAS Information Delivery Portal; SAS XML Mapper; SAS/BASE; SAS/MACROS; SAS/ACCESS; SAS/GRAPH; SAS/SQL; SAS/ODS; SAS/QC; SAS/CONNECT; PROC IMPORT/EXPORT/TRANSPORT.

Database Platforms: PL/SQL, SQL Server 2000/2005, Oracle 8x/9x/10g, WebLogic, MySQL, Siebel, SPDS, Teradata, DB2, PeopleSoft, MS Access 98/2000

Operating Systems: Unix (Sun Solaris/HP/AIX), Windows XP/2000/NT, Windows 2k/2k3 servers

Office Tools: MS-Office - Excel / Power Point / Word / Project / FrontPage, Minitab, Lotus Notes, Visio, VMWARE

Applications: ERwin, Siebel, Rational ClearCase V6.x, Maestro

PROFESSIONAL EXPERIENCE

Confidential, NJ

Sr Data Analyst

Responsibilities:

  • Interacting with business users to understand new functional and reporting requirements.
  • Proposing various technical approaches and solutions based on business requirements.
  • Interfacing with other internal data teams to ensure relevance, accuracy, and consistency of data sources.
  • Continuously research and assess the availability and relevance of alternative data fields and data sources
  • Prepare Metadata documents for key fields identified for Chase Home Finance Default Risk Analytics.
  • Design and automate a standardized process using SAS to regularly pull data used primarily for Chase Home Finance Default Risk Modelling/Analysis
  • Design and Build data repositories housing relevant data to support Chase Home Finance Default Risk Strategic Initiatives
  • Integrate external data sources (Credit Bureau, Econometrics, Hedonics data, etc.) into data repositories for further analysis
  • Analysis, Design and Development of SAS programs and PL/SQL stored Procedures, required for the Chase Home Finance Default Risk Analytics and Reporting.
  • Designed and created the HOPE MOD, HOME EQUITY marketing analysis for different MAILING waves which are done every Month.
  • Developed the foreclosure loan analysis, finding the average number of days taken for foreclosure in each state and at each stage, split by judicial (JUD) and non-judicial (NON-JUD) status.
  • Developed the SAVE OVER REO monthly analysis on monthly completed loans, showing how much was saved on completed loans through net proceeds and expenses.
  • Developed the SHORT SALE process for analyzing the number of pipeline and completed loans in short sale.
  • Created the process to calculate the TAXES and INSURANCE for the default loans depending on property valuation and zip code.
  • Created the process to get the RMV, BPO and AVM valuations for default loans.
  • Created the Pivot tables for monthly analysis reports.
  • Participate in Confidential ’s Quality Management Processes.
  • Coordinate with the Support Group to make the necessary infrastructure available for the normal functioning of the project.
  • Project Status reporting, Risk identification and monitoring using Confidential tools.
  • Supporting Project Managers in estimation and planning.
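A minimal sketch of how the foreclosure-days analysis described above might be computed in Base SAS; the dataset and variable names (work.fcl_loans, state, jud_flag, days_to_fcl) are assumptions for illustration, not the production code:

```sas
/* Average number of days to foreclosure by state and judicial type. */
proc means data=work.fcl_loans noprint nway;
  class state jud_flag;          /* JUD vs NON-JUD split within state */
  var days_to_fcl;
  output out=work.fcl_avg (drop=_type_ _freq_) mean=avg_days;
run;

proc print data=work.fcl_avg noobs;
  title 'Average Days to Foreclosure by State and Judicial Status';
run;
```

The NWAY option keeps only the fully crossed class combinations, so the output dataset contains exactly one row per state and judicial-status pair.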

Environment: (Windows and UNIX): SAS 9.2, SAS EG 4.1 & 4.2, SAS Add-in for Microsoft Office, Windows SAS, UNIX, Control-M scheduling, Business Objects 11, BO Professional Edition, Reflection X FTP, WinSCP FTP, ERwin, F-Secure SSH file transfer, Oracle 11g, DB2, SAS/ACCESS, ODBC, OLEDB, Teradata 13.

Confidential, TX

Data Analyst

Responsibilities:

  • Involved in development and testing of a client-specific SAS-based software package for the SAS Dome Migration project.
  • Worked as technical lead for the project.
  • Acted as a Business Analyst, gathering requirements from users and documenting them.
  • Designed the UNIX environment, migrated users from Windows to UNIX and created the folder structure in UNIX.
  • Modified SAS code in the Windows environment to point from SQL Server to Teradata and migrated the code to UNIX.
  • Created UNIX shell scripts to build the folder structure and to create the .profile for each user.
  • Worked on the Credit Bureau Simplification project for campaigning auto loans.
  • Designing the tools needed for SAS Dome migration for the Confidential Auto Finance Data analysts.
  • Installed and configured the PCRONTAB scheduling tool on the UNIX platform.
  • Recommended the Reflection X FTP tool to migrate data from Windows to the SAS UNIX platform.
  • Used SAS/ACCESS ODBC and OLEDB drivers and the SQL Pass-Through facility to retrieve data from SQL Server and Teradata databases.
  • Migrated data from the SQL Server database to Teradata containers to be used for analysis.
  • Resolved data quality issues during the migration of data from SQL Server to Teradata.
  • Assisted users with SAS Dome migration questions and occasionally with general PC, system and database questions.
  • Familiarity with ERwin as a data modeling tool.
  • Generated HTML, RTF, PDF and text reports using ODS statements and performed data analysis using Base SAS, Teradata 12 and Microsoft Excel.
  • Created automated FTP’s, automated Email macros using SAS and UNIX to desktop for analysis.
  • Helped various team members in setting up SAS Enterprise Guide environment.
  • Responsible for the requirements: ETL Analysis, ETL Testing and designing of the flow and logic for the Data warehouse project
  • Experience in handling multiple projects.
  • Responsible for maintaining the stability of ICM jobs running daily and for updating the reports in BO.
  • Developed and executed quality control checks on the data loaded in Teradata database by comparing the Dep2 and Dep3 data models.
  • Developed, tested and scheduled code for automated periodic reporting using SAS Management Console and verified its flow using SAS Flow Manager.
  • Proficient in working in different database environments such as MS SQL Server and Teradata.
  • Experience in handling large databases, merging tables with millions of rows and creating test cases.
  • Excellent organizational, interpersonal and presentation skills, along with the ability to work efficiently both independently and in a team environment.
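The SQL-Server-to-Teradata migration steps above could be sketched as follows; the server name, credentials, database and table names are hypothetical placeholders:

```sas
/* Assign a SAS/ACCESS libref to Teradata; the FASTLOAD data set     */
/* option invokes Teradata's bulk loader for large empty tables.     */
libname td teradata server='tdprod' user=&td_user password=&td_pass
                    database=analytics;

/* Load a SAS extract into a Teradata container for analysis.        */
data td.customer_stage (fastload=yes);
  set work.customer_extract;
run;

libname td clear;
```

FASTLOAD only applies when creating a new (empty) target table; incremental appends would typically use MULTILOAD or plain inserts instead.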

Environment: (Windows and UNIX): SAS 9.1.3, SAS EG 4.1 & 4.2, SAS Add-in for Microsoft Office, Windows SAS, SAS Management Console, UNIX, PCRONTAB scheduling, LSF Scheduler, Business Objects 11, BO Professional Edition, Reflection X FTP, WinSCP FTP, ERwin, F-Secure SSH file transfer, MS SQL, SAS/ACCESS, ODBC, OLEDB, Teradata 12 & 13.

Confidential, CA

SAS ETL Developer

Responsibilities:

  • Create new metadata objects to integrate the existing SAS ETL data transformation code into SAS Data Integration Studio 4.2 (and 3.4) using the source, target and process designers to perform various data transformations and store the datasets into the data warehouse and a datamart.
  • Experience in architecting, administration and maintenance of metadata repositories (Foundation, Custom & Project based) using SAS Management Console, data sources, target datasets, data transformations using SAS Data Integration Studio and maintenance of the same according to the data models.
  • Developed complex edit checks, packages and functions using PL/SQL.
  • Developed a slowly changing dimension table using the SCD Type 2 Loader transformation on employee data.
  • Cleansed, prepared and analyzed a dataset containing 700,000 observations using Base SAS, SAS Enterprise Guide 4.1 and SAS Enterprise Miner.
  • Performed user authentications and deployed jobs for scheduling using SAS Management Console.
  • Worked on the CDE program for campaigns marketing Disney products and coupons to customers.
  • Worked on merging of the existing SAS ETL code as Metadata job objects and stored into the custom repositories of SAS Data Integration Studio. Performed remote submission for enterprise jobs.
  • Performed complete development and unit testing of the newly created ETL code in test environment and bench mark newly created metadata jobs before integrating to the data warehouse in order to build efficient data marts.
  • Worked extensively on various SAS modules such as SAS Data Integration Studio, SAS Base, SAS Enterprise Guide V4, SAS MACRO, SAS ACCESS, PROC IMPORT, PROC EXPORT, PROC SQL, SAS SQL-Pass Through, Oracle9i, Teradata, Microsoft Excel and Microsoft Word.
  • Worked with the SAS OLAP Cube Designer to build OLAP cubes that facilitate high-performance analysis, flexible pre-summarization of data and easy reporting of large volumes of customer data, with the ability to view trends over time and other dimensions.
  • Worked with the Business Analysts to come up with better ways of doing the marketing analysis by developing models and building the reporting infrastructure.
  • Created OLAP cubes using SAS EBI Environment and created stored procedures on CRM data.
  • Used predictive modeling techniques such as logistic regression and uplift modeling.
  • Performed multidimensional data analysis and reports using the cubes to produce reports.
  • Created Pivot tables in Excel and charts/tables in PPT files using the OLAP cubes for the business users using the SAS Add-in for MS Office plug-in facility
  • Expertise in creating mapping models that map business models to the star-schema logical database model using SAS Information Map Studio.
  • Perform the tuning of the new DI studio jobs before moving into production server.
  • Worked on administration and configuration of SAS Web Report Studio and Information Delivery Portal to create new users and authorizations and configuring the Mail Servers for the same.
  • Extracted data from an Oracle database, XML files, Excel files, CSV files and flat files as input for SAS programs that create Excel spreadsheets, Access databases and hard-copy reports. Used SAS macros to minimize SAS code lines and for ease of maintenance.
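One way the file extractions above are commonly handled in Base SAS; the file paths are placeholders, and the XLS export assumes SAS/ACCESS Interface to PC Files is licensed:

```sas
/* Read a delimited feed into a SAS dataset; path is illustrative.   */
proc import datafile='/data/feeds/customers.csv'
            out=work.customers
            dbms=csv
            replace;
  getnames=yes;   /* take variable names from the header row         */
run;

/* Write the cleaned table back out for the business users.          */
proc export data=work.customers
            outfile='/data/out/customers.xls'
            dbms=xls
            replace;
run;
```

PROC IMPORT infers variable types from the data, so feeds with inconsistent columns are often read with a hand-written DATA step and INFILE instead, where informats can be stated explicitly.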

Environment: (UNIX): SAS 9.1.3 and 9.2, SAS/Macros, SAS DI Studio 3.4 & 4.2, SAS Enterprise Guide 3.4 & 4.2, SAS EBI, SAS Management Console 9.1, SAS OLAP Cube Studio, SAS Enterprise Miner 5.3, SAS Information Map Studio, SAS Web Report Studio, SAS FM, SAS Add-in for Microsoft Office, SAS Information Delivery Portal, LSF Scheduler, SAS XML Mapper, SAS/SQL, ERwin, HP Neoview, PL/SQL, T-SQL, SPDS, DB2, terabyte-scale data, MS Visio, MS Access and MS Excel.

Confidential, NJ

SAS Developer

Responsibilities:

  • Involved in building data marts from enterprise data sources spread across platforms using SAS DI Studio.
  • Created metadata objects by defining the data sources, targets and data transformations using the source, target and process designers of SAS DI Studio (also known as SAS ETL Studio).
  • Created lookup, fact and dimension tables and identified the mapping of data into specific domains as per the data model, using SAS Enterprise Guide to create the DATA and PROC steps as process tasks.
  • Involved in the administration of user authentications, metadata repositories, job scheduling, stored processes and data cleansing.
  • Experience in creating OLAP cubes for reporting of pre-summarized large datasets and decision support systems using SAS Data Integration Studio and publish the same for business users.
  • Generated SAS datasets, tables, listings, graphs and reports from analysis datasets using SAS tools like BASE, Macros, GRAPH and STAT.
  • Created reports and stored procedures using SAS BI tool.
  • Developed the data warehouse system for credit/debit OLTP process.
  • Worked on transaction data and CRM data in credit limit increase and in marketing.
  • Extracted data from Oracle views using various SAS/Access methods including Libname statements and SQL Pass through facility.
  • Created and extracted Oracle tables from SAS and within Oracle by using PL/SQL via SAS/ACCESS and SAS/CONNECT.
  • Created efficient SAS programs using SAS macros and validated them to optimize resource utilization and speed processing. Documented and maintained libraries and catalogs of SAS programs and formats.
  • Developed SAS programs for quality checks.
  • Involved in validating other programmers code.
  • Developed utility macros for standard reports and validations.
  • Produced multi-dimensional reports and tables using PROC REPORT, PROC TABULATE, PROC SQL and DATA _NULL_ steps.
  • Used INFILE statement options to control processing when reading raw data files in SAS.
  • Used the IMPORT and EXPORT procedures for importing and exporting data to MS Excel files.
  • Developed programs in SAS/Base and the SAS/SQL pass-through facility to improve efficiency and to perform complex joins against the Oracle and Siebel databases.
  • Enhanced reports through the use of labels, SAS formats, user-defined formats, titles, footnotes and SAS System reporting options.
  • Generated HTML, RTF, PDF reports using ODS statements.
  • Produced high quality graphs through SAS/GRAPH for analysis studies.
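The ODS reporting bullets above can be sketched with a short example; the output path is hypothetical, and sashelp.class stands in for a real analysis dataset:

```sas
/* Route a PROC REPORT summary to PDF; swapping the destination      */
/* (ods html / ods rtf) produces the HTML and RTF variants.          */
ods pdf file='/reports/summary.pdf';

proc report data=sashelp.class nowd;
  column sex height weight;
  define sex    / group 'Gender';
  define height / mean format=6.1 'Avg Height';
  define weight / mean format=6.1 'Avg Weight';
run;

ods pdf close;
```

Closing the destination is what finalizes the file, so the `ods pdf close;` must follow the last reporting step routed to it.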

Environment: (UNIX & Windows): SAS/Base 8.2/9.1.3, SAS DI Studio 3.3, SAS Enterprise Guide 3, SAS Management Console, SAS OLAP Cube Studio, SAS/Macros, SAS/Enterprise Miner, SAS/ACCESS, SAS/Connect, SAS/SQL, Oracle 8/9i, Teradata, PL/SQL, DB2, IBM Mainframes, Siebel, MS Excel.

Confidential

SAS Developer

Responsibilities:

  • Involved in developing SAS programs for transforming data received from different operational sites to maintain a data warehouse. Data was received in PC files such as CSV, TXT and other delimited (DSD) file formats. Performed extraction, transformation and loading of data using Base SAS programs.
  • Created SAS datasets using DATA steps with INFILE statements, using IF/ELSE statements, DO groups, WHERE statements and arrays to control processing when reading raw data files. Used SAS macros for efficient SAS code.
  • Extensive knowledge of database design and of developing Oracle databases using PL/SQL.
  • Used Import and Export procedures for specific input data feeds while connecting to remote data sources like SIEBEL and Oracle databases.
  • Produced efficient datasets by combining individual datasets using various inner and outer joins in SAS/SQL, used SAS/Macros and dataset merging techniques of SAS/BASE.
  • Modifying the new datasets using analytical statements.
  • Created, validated, documented and maintained libraries of SAS application programs, formats catalogs.
  • Generated HTML, Listings and reports for presenting the findings of various statistical procedures using procedures like REPORT, PRINT, GRAPH and FREQ.
  • Generated HTML, RTF, PDF reports using ODS statements.
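A compact sketch of the raw-file reads described above; the file path, layout and variable names are assumed for illustration only:

```sas
/* Read a comma-delimited extract with modified list input.          */
/* DSD honors quoted fields and consecutive delimiters; TRUNCOVER    */
/* prevents short records from wrapping onto the next line.          */
data work.orders;
  infile '/data/in/orders.csv' dsd firstobs=2 truncover;
  input order_id : 8.
        cust_id  : 8.
        order_dt : date9.
        amount   : comma10.2;
  format order_dt date9. amount dollar12.2;
run;
```

The colon modifier pairs each variable with an informat while still reading delimited (rather than fixed-column) input, which suits feeds whose field widths vary.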

Environment: (Windows): Base SAS, SAS/ACCESS and SAS/SQL, MS/Excel, Oracle8i, and Minitab.
