
Sr. SAS Administrator Resume


Jacksonville, Florida

OBJECTIVE:

SAS Certified Professional with over fifteen years of IT experience spanning the complete project management and software development life cycles, database/ETL management, data quality management, and production support.

SUMMARY:

  • Application development, report development, troubleshooting, estimation, architecture, design documentation, testing, promotion to higher environments, enhancements, and production support using SAS BI tools.
  • Application integration testing, training, data conversion and functional enhancements.
  • Definition of business requirements, workflow analysis, and application development.
  • Preparation of project documentation, test plans, test cases, incident reports, and sign-off documents.
  • Database administration and dataset development using SAS technology.
  • Extraction, Transformation, and Loading (ETL) using advanced SQL, PROC SQL, Base SAS, and macros (see the sketch at the end of this summary).
  • Combined expertise in development of effective data infrastructure, ETL processing, data warehouse, data modeling, and business intelligence reporting and dashboards.
  • Knowledge of Confidential's Netezza and Composite tools.
  • SAS Certified Professional (Base) with hands-on experience in SAS database technology, including SAS 8, 9.1.3, 9.2, 9.3, and 9.4.
  • Production environment management providing high availability and appropriate database troubleshooting.
  • Close coordination with development, systems and business units to implement new enhancements.
  • Preparation and maintenance of well-documented systems and conceptual design documents.
  • Developed complete project plans covering scope and schedule for data quality solutions: data problem discovery, data analysis, and development of working and production prototypes.
  • Architected, designed, and developed data quality solutions using DataFlux Data Management Studio 2.2 and 2.3, DataFlux Web Studio, and Data Management Server.
  • Presented in-action batch and real-time DataFlux jobs for profiling, standardization, integration, enrichment, and data quality methodologies to senior management and application teams.
  • Installation and configuration of DataFlux Data Management Studio along with the Quality Knowledge Base (QKB) and Data Packs.
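
A minimal sketch of the Base SAS / PROC SQL / macro ETL pattern noted above; the library paths, table name, variable names, and cutoff parameter are hypothetical:

    /* hypothetical source and target libraries */
    libname src '/data/source';
    libname tgt '/data/target';

    %macro load_orders(cutoff);
       /* extract and transform with PROC SQL, then load the target table */
       proc sql;
          create table tgt.orders_clean as
          select order_id,
                 upcase(strip(cust_name)) as cust_name,
                 order_dt
          from   src.orders
          where  order_dt >= "&cutoff"d;
       quit;
    %mend load_orders;

    %load_orders(01JAN2024)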

TECHNICAL SKILLS:

Languages: Base SAS, SQL, PL/SQL, Pro*C, C++, Java

Administration: SAS 9.1.3, 9.2, 9.3 and 9.4 on GRID.

Databases: SAS, SPDS, Netezza, SASPS, Teradata, Oracle RDBMS (11g, 10g, 9i, 8i, 8, and 7.x), SQL Server 2005.

Development Tools: SAS Base v8, v9, SAS Enterprise Guide 4.3, SAS DI Studio 4.2, SAS Management Console 4.2, SAS Information Map Studio 4.2, SAS OLAP Cube Studio 4.2, SAS Web Report Studio 4.3, SAS Information Delivery Portal 4.3, SAS BI Dashboard 4.3, SAS SQL, SAS Macros, Composite, TOAD, Oracle Workflow, Oracle Enterprise Manager, Oracle Designer 6.0, SQL Navigator, DBArtisan, SQL*Plus, SQL*Loader, Visio, and JDeveloper.

Data Quality: DataFlux Data Management Studio 2.2, 2.3, DataFlux Expression Language 2.3, DataFlux Web Studio 2.3, DataFlux Data Management Server 2.3, DataFlux dfPower Studio 8.0, and Master Data Management (MDM).

Architecture: SAS BI Platform 9.1, 9.2, 9.3 & 9.4 on SAS GRID, Client/Server, J2EE and Oracle Application Server 10g.

Front End: Oracle Forms 4.5, 5.0, 6i, 9i, and 10g; Oracle Reports 2.5, 3, and 6i; and Visual Basic 6.0

Integration: Oracle Data Integrator (ODI)

Operating Systems: Windows 98/2000/XP/Vista/7, UNIX (HP-UX, AIX, Solaris, Linux)

Others: Microsoft Office (Word, Excel, Access, PowerPoint, and Project) and WinSCP

WORK EXPERIENCE:

Sr. SAS Administrator

Confidential, Jacksonville, Florida

Responsibilities:

  • Administered and managed the GRID platform using SAS Management Console, SAS Deployment Manager, SAS Web Administration Console, the SAS Metadata Server, and application server configuration. Tasks included, but were not limited to: updating the SID file for license renewals, changing host names, rebuilding web applications, updating passwords, uninstalling SAS software from local machines, applying downloaded hot fixes to SAS software, monitoring SAS web applications through the web-based interface, managing metadata repositories with the repository manager, clearing the credentials cache, working in Server Manager and User Manager, establishing connections through odbc.ini, and using SPDS Manager. Successfully renewed licenses and applied patches and hot fixes to the platform.
  • Administered, controlled, and managed the SAS Grid environment using EGOSH service commands in PuTTY sessions and the RTM web application.
  • Developed platform administration and access management documents for the SAS Admin team.
  • Successfully performed SPDS server and security maintenance on a regular basis for a stable and efficient SPDS platform.
  • Performed SAS GRID administration tasks covering first-priority, standard, and optional setup.
  • Worked on starting, stopping, and checking the status of servers on the GRID platform.
  • Worked on monitoring the activity of SAS servers and administering logging for SAS servers using SAS Management Console, server performance counters and information fields, and the SAS OLAP Server Monitor.
  • Coded and built data quality solutions for profiling, standardization, data quality, and integration using DataFlux.
  • Built and scheduled DataFlux batch jobs to cleanse incoming data before it reaches the database.
  • Coded and built SAS/DataFlux jobs, code, and flows, and scheduled them using LSF Process Manager.
  • Used Data Management Studio to merge customer, product, and other enterprise data; unified and integrated disparate data through a variety of integration methods (batch, real time, virtual); and built jobs to verify and complete address information.
  • Coded and created DataFlux schemas to implement customized business rules for BofA.
  • Presented in-action DataFlux profiling, standardization, integration, enrichment, and data quality methodologies to senior management and application teams.
  • Used the Information riser bar to review information summarizing the DataFlux Data Management Studio implementation and monitored the status of assets in the implementation.
  • Architected, designed, and developed data quality solutions using DataFlux Data Management Studio, DataFlux Web Studio, and Data Management Server.
  • Deployed and configured the DataFlux products strictly according to SAS Institute guidelines and documentation.
  • Interviewed stakeholders and identified problems in the data.
  • Performed analysis and case studies to identify the best approach to clean data.
  • Produced conceptual designs of candidate solutions and held meetings to pick the best approach.
  • Wrote the system design document for the selected solution.
  • Tested and implemented the DataFlux production prototype.
  • Designed and developed Extract, Transform, Load (ETL) processes using Base SAS, SAS Data Integration Studio, SAS Enterprise Guide and DataFlux Data Management Studio to populate Data Warehouses and Reporting Data Marts.
  • Designed data warehouses, detailed data stores (DDS), and reporting data marts using relational and dimensional modeling techniques.
  • Successfully migrated dfPower Studio content, upgrading it to run in Data Management Studio, and Data Integration Server content, upgrading it to run on a Data Management Server.
  • Developed star schemas for effective reporting on high volumes of data.
  • Worked on administering logging for SAS servers.
  • Configured and executed job statistics reports for SAS Data Integration Studio. Worked on best practices for backing up and restoring SAS content: using operating system commands to back up the metadata server and using the Export SAS Package wizard to back up specific SAS folders. Performed SAS Metadata Server backup tasks.
  • Managed metadata server memory by setting the server's MEMSIZE parameter and balancing input and output.
  • Worked on SAS metadata repositories and folders: copying, promoting, importing, exporting, and analyzing metadata; moving a metadata repository to a new location on the same metadata server; and registering a metadata repository.
  • Used the promotion tools (the Export SAS Package and Import SAS Package wizards and the batch export/import tools) to promote individual metadata objects or groups of objects from one metadata server to another, or from one location to another on the same metadata server. Also promoted the physical files associated with the metadata.
  • Used the SAS Deployment Manager to update host name references; also worked on troubleshooting the Update Host Name References tool.
  • Performed a SAS 9.3 to SAS 9.3 migration from the test environment to production using the SAS Migration Utility: designed the migration, performed pre-migration tasks, ran the SAS Deployment Wizard to migrate SAS content, completed manual migration steps, and validated and maintained the new environment.
  • Coded SQL/PROC SQL programs related to platform support using SAS Enterprise Guide.
  • Built, deployed, and scheduled LSF Process Manager jobs for SAS flows in production processing.
  • Developed Extract, Transform, Load (ETL) processes related to code promotion and platform administration tasks using Base SAS, SAS Data Integration Studio, and SAS Enterprise Guide.
  • Designed data warehouse connection techniques for heterogeneous data sources; connected from SAS to SQL Server, Oracle, SPDS, and Teradata (see the sketch after this list).
  • Provided analysis and coding support to fix failing SAS and SQL programs processing high volumes of datasets and SPDS tables.
  • Analyzed and recommended changes to make OLAP cubes efficient to report large volumes of data, hierarchical drill downs, and slice and dice capabilities.
  • Managed SAS software depot and platform licenses.
  • Provided 24/7 support for monthly production processing of data loaded as Business Consumable Objects (BCOs).
  • Working knowledge of installing Netezza tools on SAS platform servers.
  • Knowledge of Confidential's Netezza tool for designing high-performance data warehouse applications for use in enterprise data warehousing, business intelligence, predictive analytics, and business continuity planning.
  • Knowledge of using the Composite Data platform to integrate data from multiple, disparate sources across the extended enterprise into a unified, logical, virtualized data layer for consumption by front-end business solutions including portals, reports, and applications.
  • Worked on Autosys jobs and shell scripts that execute SAS macros and programs.
  • Successfully handled work requests related to server access, user access, data access, troubleshooting, performance issues, and user profile configuration.
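
A minimal sketch of the heterogeneous connections described above, assuming hypothetical connection details, credentials, and table names; SQL Server and SPDS connect through their own LIBNAME engines in the same way:

    /* SAS/ACCESS LIBNAME engines for Oracle and Teradata; the connection
       details here are hypothetical placeholders */
    libname ora oracle   user=&ora_user password="&ora_pw" path=dwprod schema=dw;
    libname td  teradata user=&td_user  password="&td_pw"  server=tdprod;

    /* join across the two sources into a SAS dataset */
    proc sql;
       create table work.acct_join as
       select o.acct_id, o.open_dt, t.balance
       from   ora.accounts o
              inner join td.balances t
                 on o.acct_id = t.acct_id;
    quit;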

Environment: SAS Platform Administration GRID 9.2, 9.3, and 9.4, SAS Enterprise Guide 4.3, SAS GRID Platform 9.2/9.3/9.4, SAS Base, SAS Macros, SQL, SAS EBI, SAS Web Report Studio 4.2, SAS DI Studio 4.2, SAS OLAP Cube 4.2, SAS Information Map 4.2, SAS BI Dashboard 4.2, SAS Information Delivery Portal 4.2, LSF Process Manager 7.1, DataFlux Data Management Studio 2.2, 2.3, DataFlux Expression Language 2.3, DataFlux Web Studio 2.3, DataFlux Data Management Server 2.3, DataFlux dfPower Studio 8.0, Confidential's Netezza, Composite Data platform, Autosys, shell scripts, Windows and Linux operating systems.

Sr. SAS / DataFlux Consultant

Confidential, Jacksonville, Florida

Responsibilities:

  • Gather detailed requirements and conduct requirement analysis through walkthroughs and interviews with business users and SQL DBAs. Create requirement documents for building the data warehouse and data marts for analytical report and dashboard needs.
  • Document business rules for the transformation logic used in creating the data warehouse and analytical reports. Create the Mapping Specification Document, which maps source data to target datasets/reports and contains the detailed transformation/code logic used to transform the source data.
  • Conduct an inventory of data sources and build a data dictionary around subject areas for the Data Quality initiative as a baseline for DataFlux jobs.
  • For the data quality and cleansing initiative: architect and create profile jobs that produce profile reports using basic statistics, frequency distribution, and pattern frequency distribution analysis.
  • Use the DataFlux DB Viewer to view, analyze, and query source data. Use the SQL query builder to write SQL that executes against the data pulled into the DB Viewer.
  • Develop data profile reports for subject areas that identify bad data, along with analysis notes. This document is used for the data quality methodology implementation.
  • Deliver SAS code, reports, and testing results on time, per the schedule.
  • Develop unit/system test cases and perform unit/system testing on SAS datasets and reports.
  • Present the end products, with Q&A sessions, covering data warehouse datasets, Web Report Studio reports, information maps, OLAP cubes, and the Information Delivery Portal to users.
  • Extensively develop SAS code, macros, and PROC SQL to perform Extraction, Transformation, and Loading (ETL) for data warehouse and reporting tasks using the SAS BI tools Enterprise Guide, Base SAS, and DI Studio.
  • Develop efficient SAS code to crunch and transform high volumes of data quickly.
  • Create and manage metadata for source and target data using SAS Management Console (SMC). Enable libraries for consumption by EBI client applications.
  • Create source tables, pre-filters/filters, formats, and info maps on the fly for reports using Information Map Studio.
  • Write procedures and functions to perform ETL (extraction, transformation, and loading) of data to create temporary and permanent SAS datasets.
  • Develop ad-hoc reports, data sources, and data summarization functionality for the business using Web Report Studio, DI Studio, and OLAP cubes.
  • Concatenate and merge large volumes of data using SET and MERGE (see the sketch after this list).
  • Use PROC SQL and SQL commands to process data faster wherever applicable.
  • Develop SAS programs for data cleaning, validation, and analysis for ad-hoc reports. Test and debug ETL code, macros, and report code.
  • Define and implement SAS coding standards/specifications so other SAS users can read and understand the code easily.
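
A minimal sketch of the SET and MERGE pattern referenced above, with hypothetical dataset and variable names:

    /* concatenate monthly extracts (hypothetical dataset names) */
    data work.all_months;
       set work.jan work.feb work.mar;
    run;

    /* match-merge accounts with balances; both inputs must be sorted by acct_id */
    proc sort data=work.accounts; by acct_id; run;
    proc sort data=work.balances; by acct_id; run;

    data work.acct_bal;
       merge work.accounts (in=a) work.balances (in=b);
       by acct_id;
       if a and b;   /* keep only accounts present in both sources */
    run;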

Environment: SAS Platform 9.3, SAS Base, SAS Macros, SAS SQL, SAS EBI, SAS Enterprise Guide 4.3, SAS Web Report Studio 4.3, SAS DI Studio 4.3, SAS OLAP Cube 4.3, SAS OLAP Server 9.3, SAS Information Map 4.31, SAS BI Dashboard 4.3, SAS Information Delivery Portal 4.31, DataFlux dfPower Studio 8.1, SQL Server 2008, Windows and UNIX operating systems.

SAS Administrator / Principal Consultant

Confidential, Washington, DC

Responsibilities:

  • Requirement analysis, SAS dataset creation, and development of reports related to credit loss forecasting for business users.
  • Developed SAS code to perform Extraction, Transformation, and Loading (ETL) for reporting tasks and modified existing code to create the reporting SAS dataset from LFM source data using various SAS EBI tools.
  • Developed efficient SAS code to crunch and transform high volumes of data quickly. The SAS programs were deployed as stored processes called by a Java GUI to produce reports in the graphical user interface (GUI).
  • Created, managed, and administered GUI screens (from the existing GUI setup) for reports. The reports running in production are: Income Statement, Money Page, Run-off, Life of Loan, and Completion Month Forecast.
  • Created source tables, pre-filters/filters, formats, and maps for on-the-fly reports using Information Map Studio.
  • Created data quality solutions using DataFlux: architected and created profile jobs to produce profile reports using basic statistics, frequency distribution, and pattern frequency distribution analysis.
  • Used the DataFlux DB Viewer to view, analyze, and query source data. Used the SQL query builder to write SQL that executes against the data pulled into the DB Viewer.
  • Developed real-time data profiling jobs to profile, standardize, and integrate the source data.
  • Developed SAS code to perform ETL (extraction, transformation, and loading) of data to create temporary and permanent SAS datasets using Base SAS, Enterprise Guide, and SAS DI Studio.
  • Created stored processes that encapsulate SAS programs, accept parameters, and generate results as reports on the GUI screen (see the sketch after this list).
  • Developed ad-hoc reports, data sources, and data summarization functionality for the business using Web Report Studio, DI Studio, and OLAP cubes.
  • Concatenated and merged large volumes of data using SET and MERGE.
  • Summarized the data using PROC SUMMARY and OLAP cubes.
  • Packaged reusable SAS code as standalone macros that can be called by other code within the same program. Encapsulated complex logic as reusable macros.
  • Used Proc SQL and SQL commands to process the data faster wherever applicable.
  • Enhanced reports through the use of labels, SAS formats, user-defined formats and custom titles.
  • Developed SAS programs for data cleaning, validation, and analysis. Tested and debugged existing macros and report code.
  • Implemented the Dynamic Data Exchange (DDE) feature of SAS to export data from SAS to Excel.
  • Defined and implemented SAS coding standards/specifications so other SAS users can read and understand the code easily.
  • Developed reports using ODS/HTML, PROC REPORT, PROC TABULATE, PROC FORMAT, PROC FSLIST, and DATA _NULL_.
  • Developed Test Cases for Unit Testing of reports using PROC FREQ, PROC MEANS, PROC SUMMARY, PROC REPORT, PROC TABULATE and PROC TRANSPOSE.
  • Efficiently developed complex transformation logic to process incoming source data based on the rules defined by the business team.
  • Created customized logic for indenting and formatting reports.
  • Handled User Acceptance Testing (UAT) debugging and testing of SAS programs in a timely manner.
  • Delivered all SAS code, reports, and testing results on time, per the schedule.
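
A minimal sketch of a parameterized stored process of the kind described above; the *ProcessBody; marker and the %STPBEGIN/%STPEND macros are standard stored-process conventions, while the library, dataset, parameter, and column names are hypothetical:

    *ProcessBody;
    %stpbegin;   /* opens the ODS destination requested by the client */

    /* &region is a hypothetical stored-process input parameter */
    proc report data=lfm.forecast;
       where region = "&region";
       column month forecast_amt actual_amt;
       define month        / group 'Month';
       define forecast_amt / analysis sum 'Forecast';
       define actual_amt   / analysis sum 'Actual';
    run;

    %stpend;     /* closes the ODS destination */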

Environment: SAS Enterprise Guide, SAS Platform 9.2, SAS Base, SAS Macros, SAS SQL, SAS EBI, SAS Web Report Studio, SAS DI Studio, DataFlux dfPower Studio 8.0, SAS OLAP Cube, SAS Information Map, Teradata, Confidential's Rational ClearQuest, ClearCase, Hummingbird Connectivity, Windows and UNIX operating systems.

SAS Team Lead

Confidential, Hartford, Connecticut

Responsibilities:

  • Developed SAS code to perform ETL (extraction, transformation, and loading) and process flows of data to create source SAS datasets using Base SAS and SAS DI Studio.
  • Generated list reports using the PRINT and REPORT procedures in Base SAS and Web Report Studio.
  • Created summarized business data using OLAP cubes, which are in turn used by SAS programmers and the business.
  • Modified variable attributes using options and statements in the DATA step.
  • Used SAS functions to manipulate character data, numeric data, and SAS date values.
  • Used FORMATTED, LIST, and COLUMN input to read raw data files.
  • Used INFILE statement options to control processing when reading raw data files.
  • Used various components of an INPUT statement to process raw data files, including column and line pointer controls and trailing @ controls (see the sketch after this list).
  • Created source tables, pre-filters/filters, formats, and maps for on-the-fly reports using Information Map Studio.
  • Created VB scripts with the Windows job scheduler to automate the database creation process.
  • Automated several production and ad-hoc SAS applications, improving the performance of the current system and minimizing the number of ad-hoc requests.
  • Created SAS code to generate reports that dynamically process the most recent datasets to create campaign measurement reports.
  • Created the test execution plan, test cases, test scenarios, and test scripts in SAS for quality assurance testing.
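
A minimal sketch of formatted input with pointer controls and a trailing @, as referenced above; the file layout and variable names are hypothetical:

    data work.policies;
       infile 'policies.dat' missover pad;            /* hypothetical raw file */
       input @1  policy_id $8.
             @10 eff_dt    mmddyy10.
             @21 premium   10.2
             @;                                       /* trailing @ holds the line */
       if premium > 0 then input @32 agent_cd $4.;    /* conditional second read */
       format eff_dt date9.;
    run;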

Environment: SAS Enterprise Guide, SAS Platform 9.1.3 and 9.2, SAS EBI 9.2, SAS Web Report Studio, SAS DI Studio, SAS OLAP Cube, SAS Information Map, SAS Macros, SAS SQL, Teradata, and Windows XP

Sr. SAS Systems Analyst

Confidential, Novato, California

Responsibilities:

  • Extensively involved in SAS programming to create SAS datasets, including large SAS DATA steps.
  • Compiled stored SAS macros, SAS procedures, and reusable SAS include programs.
  • Developed new or modified existing SAS programs to load data from the source, apply the required transformations, and load the transformed data to the target using Base SAS, SAS macros, SAS SQL, SAS functions, and SAS procedures.
  • Developed reports using the SAS Add-In for Microsoft Excel and created stored processes and macros.
  • Developed complex reports using PROC FREQ, PROC MEANS, PROC SUMMARY, PROC REPORT, PROC TABULATE, and PROC TRANSPOSE.
  • Handled status updates and Process Change Requests (PCRs).
  • Prepared, documented, and tested SAS programs, including creation of the Conceptual Design Document, Systems Design Document, Test Plan, and Test Cases.
  • Developed and summarized reports using SAS OLAP with Multidimensional Expressions (MDX).
  • Created OLAP cubes using SAS OLAP Cube Studio, SAS OLAP Data Provider 9.1, the Source Designer wizard, and the Cube Designer wizard.
  • Provided SAS programming technical support to the data modeling group for Base SAS, SAS macros, SAS Enterprise Miner, and SAS EG, covering data transformation, data crunching, data mining tasks, and (regression) model functionality.
  • Created reports using the BI server, i.e., Web Report Studio, Enterprise Guide, and Microsoft Office integration.
  • Designed and developed physical and logical data views using SAS Information Map Studio.
  • Administered SAS Management Console for metadata creation, user profiles, configuration of client applications for SAS server connectivity, and repository creation and management.
  • Created, managed, and troubleshot the SAS Metadata Repository. Worked with senior management on repository planning, access, and space allocation.
  • Performed administrative tasks pertaining to the platform, including starting and stopping servers, checking server status, setting server logging options, administering the SAS Metadata Server, and administering SAS metadata repositories.
  • Created metadata objects used to establish connectivity to data sources and targets. Also set up shared access to SAS data.
  • Designed and developed reports using the SAS BI server, i.e., Enterprise Guide and Web Report Studio.
  • Used the SQL pass-through data access method of SAS/ACCESS to connect to various databases using ODBC (Open Database Connectivity), as shown in the sketch after this list.
  • Applied data quality solutions for profiling, standardization, data quality, and integration using DataFlux.
  • Used DataFlux batch job processing to clean incoming data before loading to the database.
  • Coded and built data quality solutions for profiling, standardization, data quality, and integration using SAS DataFlux.
  • Coded and created SAS DataFlux schemas to implement customized business rules.
  • Assisted senior management in implementing a technical project management methodology through a project management plan covering scoping, scheduling, risk management, and quality assurance.
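
A minimal sketch of the SQL pass-through method referenced above; the DSN, credentials, and table names are hypothetical:

    proc sql;
       connect to odbc (dsn=dwprod user=&user password="&pw");   /* hypothetical DSN */

       /* the inner query executes on the database; rows stream back to SAS */
       create table work.active_members as
       select *
       from connection to odbc
          ( select member_id, status, open_dt
            from   dbo.members
            where  status = 'A' );

       disconnect from odbc;
    quit;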

Environment: SAS V9.1.3 Base, SAS SQL, SAS Macros, SAS Enterprise Guide 4.1, Enterprise Miner 5.3, SAS EBI, SAS Web Report Studio, SAS DI Studio, SAS OLAP Cube, SAS Information Map, DB2 SQL, Oracle 10g, and DataFlux dfPower Studio 8.0 in a Windows/UNIX environment.

Principal SAS Consultant

Confidential, Novato, California

Responsibilities:

  • Developed and implemented new applications using SAS, UNIX, and the mainframe.
  • Monitored and troubleshot daily, weekly, and monthly jobs.
  • Handled and resolved tickets related to SAS, DB2, the mainframe, and UNIX.
  • Handled status updates and Process Change Requests (PCRs).
  • Planned and implemented Knowledge Transfer (KT) from Fireman’s Fund Insurance Co., Novato, CA, USA to Confidential, Bangalore, India.
  • Strategically planned 24/7 client support by scheduling the team's availability.
  • Performed enhancement and development work to add and edit application functionality.
  • Successfully handled and resolved P1, P2, and P3 (critical) tickets.
  • Successfully delivered technical competency training for SAS resources.
  • Monitored and troubleshot the daily jobs: AMDW, ADS, APPS Genesys, and QDW1.
  • Monitored and troubleshot the weekly jobs: Quick Screen, Experian, Equifax, CLNTSERV, and the Collection Process.
  • Monitored and troubleshot the monthly jobs: AMDW, Client Data Mart, TS 49 Data Mart, ADS Dashboard, APPS Dashboard, TS 49 Dashboard, Data Mart Billing Promo, Acquisition Data Mart, S2K Dashboard, APPS legacy mainframe extract, Edge Extract, Narex Extract, CSDM Loads, History Mon, and Monthly Code2.
  • Handled and resolved tickets related to SAS, the mainframe, and UNIX.
  • Handled status updates, business updates, and ticket status calls on a daily and weekly basis.
  • Planned and scheduled Knowledge Transfer (KT) from Accenture Corporation, Mumbai.
  • Strategically planned 24/7 client support by scheduling the team's availability.
  • Successfully decommissioned AMDW.
  • Successfully handled Emergency Response Team (ERT) calls and troubleshot the problems.
  • Handled the FICO extractions.
  • Maintained the daily checklist for (a) minutes of meeting (MOM) and (b) real-time involvement.
  • Successfully assessed technical competency of SAS resources.

Environment: SAS V8-BASE, SAS-SQL, SAS-Macros in Windows/UNIX environment, Oracle 7.x & 8 (SQL & PL/SQL - Program Units), Mainframe - MVS/ESA-JCL, ISPF & AS400

Sr. Systems Analyst

Confidential

Principal SAS Consultant

Responsibilities:

  • Once a request is received from the Business Partner (BP) in the form of a Campaign Management System (CMS) request through Lotus Notes, it is checked for the business justification and purpose of the request. The request is thoroughly checked for business compliance across the different regions and countries, and finally checked for any extra information needed, for the eligibility file, or for lack of information. For any of these items the BP is contacted, keeping the managers in the loop, and the BP is notified when the project starts.
  • The Card Member (CM) information resides in UNIX and is updated monthly, while the weekly updated information resides on the mainframe. Depending on the need, the CM information is pulled from tables in UNIX or on the mainframe using SAS DATA steps and PROC SQL.
  • The pulled data is saved in a mainframe file named ICIM01.LOY. . .Trigger. This is run through the last-minute suppression process, which creates a Listauth file (private information file) named PRDIN.PUBLIC.LISTAUTH. . . From this Listauth file, the final file is created after filtering the variables, in the form of a text or Excel file, using SAS DATA steps (see the sketch after this list).
  • The project documentation is done specific to each country, with a step-by-step explanation of the execution.
  • Once the documentation is complete, the project is sent for a Quality Assurance (QA) check. After the QA check, the final file is sent to the BP in whatever format was requested, using CMS.
  • Closed 80 projects to date with flawless programming execution; QC checks completed successfully on 40 projects.
  • No programming or logical errors found in any closed project.
  • Excellent track record: no project has failed to date.
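
A minimal sketch of the final-file extract step described above; the dataset and variable names are hypothetical:

    /* filter the variables and write the final delimited text file */
    data _null_;
       set work.listauth_final (keep=cm_id country offer_cd);   /* hypothetical */
       file 'final_extract.csv' dlm=',' dsd;
       if _n_ = 1 then put 'cm_id,country,offer_cd';            /* header row */
       put cm_id country offer_cd;
    run;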

Environment: SAS V8-BASE, SAS-SQL, SAS-Macros in Windows/UNIX environment, SQL/PLSQL - Oracle 8, Mainframe - MVS/ESA-JCL, ISPF & AS400, and Lotus Notes.

Team Lead - Business Intelligence

Confidential, Gainesville, Florida

Responsibilities:

  • Involved in a project focusing on planning, execution, and control of the complete software development life cycle. This included monitoring and identifying risks, taking corrective action, and updating the risk response plan. Defined and created the change management document and managed changes to the software version, project scope, schedule, and costs per that document. Also mentored the team to improve performance by developing team cohesiveness, training, and motivation for efficient software development and proper time management.
  • Also responsible for coding, execution, and management of a team of programmers working on data mining, data warehousing and database management, e-commerce applications, N-tier applications, and Business Objects.
  • Used FORMATTED, LIST, and COLUMN input to read raw data files.
  • Used INFILE statement options to control processing when reading raw data files.
  • Used various components of an INPUT statement to process raw data files, including column and line pointer controls and trailing @ controls.
  • Combined SAS data sets using the DATA step.
  • Created data structures.
  • Created temporary and permanent SAS data sets.
  • Created and manipulated SAS date values.
  • Used DATA step statements to export data to standard and comma-delimited raw data files.
  • Controlled which observations and variables in a SAS data set are processed and output.
  • Investigated SAS data libraries using Base SAS utility procedures.
  • Sorted observations in a SAS data set.
  • Conditionally executed SAS statements.
  • Used assignment statements in the DATA step.
  • Modified variable attributes using options and statements in the DATA step.
  • Accumulated subtotals and totals using DATA step statements.
  • Used SAS functions to manipulate character data, numeric data, and SAS date values.
  • Used SAS functions to convert character data to numeric and vice versa.
  • Processed data using DO loops.
  • Processed data using SAS arrays.
  • Generated list reports using the PRINT and REPORT procedures.
  • Generated summary reports and frequency tables using Base SAS procedures.
  • Enhanced reports through the use of labels, SAS formats, user-defined formats, titles, footnotes, and SAS System reporting options.
  • Generated HTML reports using ODS statements (see the sketch after this list).
  • Identified and resolved programming logic errors.
  • Recognized and corrected syntax errors.
  • Examined and resolved data errors.
  • Projects executed (USA): Direct MAC System, RDG Insurance System, Test Tutor, and Florida Tourism Information System (FTIS).
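
A minimal sketch of an ODS HTML summary report of the kind listed above; the dataset, output path, and variable names are hypothetical:

    ods html file='policy_summary.html';   /* hypothetical output path */

    /* crosstab of region by status */
    proc freq data=work.policies;
       tables region*status / nopercent norow nocol;
    run;

    /* premium summary statistics by region */
    proc means data=work.policies n sum mean maxdec=2;
       class region;
       var premium;
    run;

    ods html close;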

Environment: SAS V8-BASE, SAS-SQL, SAS-Macros, ActiveX, COM, MTS, SQL Server 7.0, Oracle 7.x & 8 (SQL & PL/SQL - Program Units), VB 6.0, ASP, HTML, VBScript, JavaScript, and JScript in Windows
