Principal Consultant Resume
SUMMARY:
SAS platform (EBI/EDI/Grid) administrator/developer and UNIX administrator/developer with in-depth experience in large distributed systems infrastructure administration, performance and workload management, capacity planning, statistical analysis and modeling, and efficient relational database processing.
TECHNICAL SKILLS:
Hardware: Sun; HP; Confidential RISC/6000; Intel servers; Confidential Mainframes; Teradata; Netezza; NetAppliance
Operating Systems: Solaris; HP-UX; Confidential/6000; Linux; Windows; Confidential; OSF/UNIX; MVS; VM
Languages: SAS EBI/DI Server; SAS System (all components); DS2; Grid; IML; SPDS; ITRM; SQL; Perl; C; shell scripts; MATLAB; PL/I; Fortran; APL; JCL
Technologies: DNS; Confidential; Confidential; LDAP; Kerberos; Samba; FLEXlm/Globetrotter; AutoSys; Documentum; PowerBroker; BMC Patrol & BEST/1; CA Unicenter; Concord; Sun SE Toolkit; MXG; NTSMF
PROFESSIONAL EXPERIENCE:
Principal Consultant
Confidential
Responsibilities:
- Responsible for maintenance and operation of a 14-node SAS Grid server. Designed and developed a SAS Grid workload management framework that parses and analyzes SAS Grid server logs, allowing the grid administrator to identify long-running SAS sessions and take appropriate action (see the grid-log sketch after this list).
- Built, configured and maintained OpenLDAP software for authentication and authorization of thousands of users on a green high-performance cloud computing cluster.
- Built, configured and led the administration and development of a predictive analytics environment on the SAS 9.3 platform using Enterprise Miner 7.1, Model Manager 3.1 and SAS In-Database processing for Teradata running scoring models.
- Built and optimized SAS 9.2 Web infrastructure for EBI prod/dev/test distributed environments.
- Developed a system for performance analysis and SAS workload management for a large number of Linux nodes running different SAS logical servers.
- Designed and developed a SAS Bulk Loader for RDBMS data: an automated system for monthly builds of SAS data marts from Oracle EDW dumps. The system parses the corresponding DDL files and generates SAS 9.2 DATA step, PROC and macro code that reads compressed, delimited flat files and populates any format of SAS library, or a SAS SPDS library with dynamic cluster tables on the SAS SPDS server (see the generated load-step sketch after this list).
- Designed and developed a complete monitoring, workload management and capacity planning framework for a SAS 9.2 EBI server running on a Confidential p795 server, using Perl and SAS metadata management tools and supporting hundreds of SAS users and batch processing.
- Installed, configured and maintained several instances of SAS/EBI platform and SAS/SPDS server.
- Conducted evaluation and installation of SAS/GRID environment as an alternative to SAS/EBI environment.
- Built a SAS data warehouse for efficient processing of Medicare Part A/B/D data to evaluate variability of health services across different US geographic regions.
- Deployed and customized the entire SAS EBI/EDI suite on Confidential P570 Confidential partitions dedicated to DEV/TEST/PROD environments to implement change control and change management processes. Developed a Perl parser/post-processor that splits SAS DI Studio generated code into data-independent code and SAS autoexec code containing all data definitions, so that SAS production jobs run under the Confidential ESP scheduling system (see the code-splitting sketch after this list).
- Built the framework for performance monitoring and memory capacity management of every SAS/EBI client.
- Developed a system using SAS Data Integration Studio for processing and analysis of state medical records.
- Deployed and administered the entire SAS EBI platform on Solaris 10. Developed a framework for automated stopping and restarting, in the proper sequence, of all 20 of the platform's servers; for monitoring all servers, open SAS client sessions and their corresponding workspaces; and for activity logging and other real-time system activity. The framework is written in Perl and uses the SAS Open Metadata Interface to obtain data dynamically from the SAS metadata server repositories (see the metadata-query sketch after this list). Led the development of a turn-key Customer Relationship Management (CRM) system to manage the relationship between a VC and a very large Internet betting community.
- Developed an IT cost allocation system based on resource utilization data collected from over 30 z/OS and 10 z/VM Linux systems, thousands of UNIX/Linux and Wintel servers, LAN/WAN networks, SAN/NAS and other storage and network devices, and multi-tiered business metadata. The system is built using SAS/ITRM as the warehouse and modeling environment.
- Developed an application using the SAS System for processing very large volumes of data and reliably loading them into Oracle with the sqlldr bulk loader, using SAS-generated custom control and data files (see the sqlldr sketch after this list). Also implemented a set of instrumentation routines using SAS and Perl for performance and capacity planning of the Sun Fire E25K server that this production system runs on.
- Built and configured an OpenLDAP server on a Linux system. Designed and built an LDAP directory used for user authentication and authorization in a very large hierarchical organization. Wrote a parser to convert the LDIF file into SAS metadata canonical tables and import them into the SAS metadata server (see the LDIF-parsing sketch after this list). Designed and developed an analysis and reporting system processing data from several very large Oracle instances at the Confidential.
- Upgraded ten large Confidential, Solaris and HP-UX servers from SAS 8.2 to SAS 9.1.4. Built and configured two SAS EBI servers for consolidation of the SAS enterprise applications running in this distributed environment. Defined all instances of Oracle, DB2 and ODBC in the SAS Metadata Server used by SAS users running Enterprise Guide 4.1. Built a set of stored processes, defined in the SAS Information Delivery Portal, for use with SAS Web Report Studio and Report Viewer.
- Installed and configured a very large SAS EBI server running on several UNIX systems. Implemented the security infrastructure for the SAS Web Information Delivery Portal utilizing WebSEAL, WebLogic and SAS WIK secure single sign-on. Also secured all SAS client application and server communication with SSH tunneling. SAS metadata server authorization with different authentication providers is deployed for external and internal users and applications.
- Designed and developed a large ETL system to extract voluminous data from Teradata, transform it within the SAS warehouse and load it into an Oracle RDBMS. Designed and coded a set of SAS macros that control the flow and execution of the ETL process based on a large matrix of business parameters (see the control-table sketch after this list).
- Designed and developed SAS data mart used for credit risk analysis, scorecard and forecast modeling. The data mart consists of several SAS Libraries with additional metadata for automatic validation of analysis and reporting procedures manipulating this data.
- Developed a hedge fund trading software system. The system encompasses reliable acquisition and management of Bloomberg market data, quantitative models to predict the performance of different financial instruments, automated back-testing of trading models, and signal generation for portfolio allocation. The system is implemented using the SAS System, Perl, MATLAB, Excel and an RDBMS on Linux and Windows platforms.
- Built a heterogeneous UNIX/Linux/Windows infrastructure running WebLogic, WebSphere, iPlanet, JRun and Tomcat web servers for e-commerce software development. LoadRunner scripts were used to generate thousands of transactions to predict system performance and perform capacity planning of the distributed RDBMS and web transaction software systems.
- Designed and implemented a performance monitoring and capacity planning system for large WANs consisting of over five hundred Cisco routers and switches. The system uses HP Network Node Manager to collect SNMP MIB data from each network device. SAS/ITRM (formerly SAS/ITSV) is used to manage the data and analyze performance, availability and trends in capacity usage of each business circuit. Reports are produced in tabular/HTML and graphics/GIF formats for viewing on the intranet (see the reporting sketch after this list).
- Developed a system to manage and analyze the data collected by the CA Unicenter TNG enterprise management system. A set of parser modules was written using the SAS DATA step to process CA computer performance cubes into a variable number of data sets in a SAS library. The macro facility was used to preprocess different sets of data from multiple releases of Solaris, HP-UX and Confidential, to invoke the appropriate parser module (see the parser-dispatch sketch after this list), and to create a schema and data dictionary for storing the data in the SAS/ITRM warehouse.
- Designed and developed two-tiered software system for performance management and capacity planning in very large UNIX and Windows distributed computing environments.
- In the first tier, each server's performance data is collected using standard UNIX utilities (e.g. accton, mpstat, vmstat, iostat) and the NTSMF collector for NT servers. All data is collected independently within each unit of the existing distributed computing environment hierarchy. Each data collector is enhanced with metadata identifying its place in the company hierarchy. The data is then reliably delivered to a central Sun Enterprise 4500 server for processing and analysis.
- In the second tier of the system, the SAS ITSV software was enhanced with a set of generic data collectors to transform data collected by the first tier's utilities into ITSV PDBs. An additional set of SAS macros was written to augment SAS ITSV to work on the expanded PDBs and to analyze, summarize and produce reports for viewing on the intranet.
- The data from the SAS ITSV PDBs was automatically loaded into the firm's disaster recovery Oracle database using the SAS/ACCESS Interface to Oracle (see the Oracle-load sketch after this list).
- Compiled and packaged, using the Solaris software packaging system, vendor and Internet software such as GNU compilers, GDB, JDK reference releases, Apache Web Server, Perl, CDE, RCS, CVS, Rogue Wave software, the Globetrotter license management suite of tools, Rational ClearCase, Rational Rose, NASDAQ and many others.
- Installed and configured FLEXlm/Globetrotter software for distributed license management and utilization in a very large distributed server environment.
- Compiled, installed and configured Internet software for multiple UNIX/Linux platforms, including rsync, RCS, CVS, dig, strace, lsof, nfswatch and the Solaris SE Toolkit. Integrated these products into a centrally managed, ubiquitous distributed file system.
- Integrated BMC Patrol and BEST/1 software into the company's distributed UNIX/Windows infrastructure. Administered and used it for performance monitoring and capacity planning.
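The sketches below illustrate, in simplified form, several of the mechanisms referenced in the bullets above. All file paths, libraries, table and column names, credentials and thresholds are placeholders for illustration, not details of the actual engagements.

Grid-log sketch. A minimal version of the log analysis behind the grid workload management framework, assuming a pre-extracted delimited file of job records with start/end datetimes and an invented 4-hour threshold; real grid logs require platform-specific parsing.

    filename gridlog '/sasgrid/logs/jobhist.csv';   /* hypothetical log extract */

    data long_running;
        infile gridlog dsd dlm=',' firstobs=2 truncover;
        input jobid :8. user :$32. start_dt :datetime20. end_dt :datetime20.;
        format start_dt end_dt datetime20.;
        runtime_hrs = (end_dt - start_dt) / 3600;
        if runtime_hrs > 4;              /* keep only sessions running over 4 hours */
    run;

    proc print data=long_running noobs;
        title 'SAS Grid sessions exceeding the runtime threshold';
    run;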
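Generated load-step sketch. An example of the kind of DATA step the SAS Bulk Loader generates from a parsed DDL file; the gzip pipe, library engine and column list are assumptions.

    libname mart spde '/data/sasmart';               /* or a SAS SPDS libref */
    filename src pipe 'gzip -dc /dumps/customer_2011_06.dat.gz';

    data mart.customer;
        infile src dlm='|' dsd missover lrecl=32767;
        input cust_id   :12.
              cust_name :$100.
              open_date :yymmdd10.
              balance   :best32.;
        format open_date date9.;
    run;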
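Code-splitting sketch. The production post-processor was written in Perl; this shows the same idea as a SAS DATA _NULL_ step for illustration, with invented file names and a simple rule that routes LIBNAME and %LET lines to the autoexec file.

    filename srcjob '/jobs/generated/load_claims.sas' lrecl=32767;
    filename autoex '/jobs/split/autoexec_load_claims.sas' lrecl=32767;
    filename runjob '/jobs/split/job_load_claims.sas' lrecl=32767;

    data _null_;
        infile srcjob truncover;
        input;
        line = upcase(left(_infile_));
        /* data definitions go to the autoexec; everything else to the job file */
        if line =: 'LIBNAME' or line =: '%LET' then file autoex;
        else file runjob;
        put _infile_;
    run;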
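Metadata-query sketch. The monitoring framework itself was built in Perl against the SAS Open Metadata Interface; an equivalent call from Base SAS uses PROC METADATA, shown here with a GetRepositories request and placeholder connection options.

    options metaserver='meta.example.com' metaport=8561
            metauser='sasadm@saspw' metapass='XXXXXXXX';

    filename request temp;
    filename response temp;

    data _null_;                          /* build the XML request */
        file request;
        put '<GetRepositories>';
        put '  <Repositories/>';
        put '  <Flags>0</Flags>';
        put '  <Options/>';
        put '</GetRepositories>';
    run;

    proc metadata in=request out=response;
    run;

    data _null_;                          /* echo the raw XML response to the log */
        infile response;
        input;
        put _infile_;
    run;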
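sqlldr sketch. A reduced version of the SAS-to-sqlldr flow: export a delimited data file, generate a matching control file, and invoke the Oracle bulk loader. Table, column and path names are made up, and XCMD must be allowed for the final step.

    data work.trades;                                 /* toy data for illustration */
        input trade_id symbol $ quantity price trade_dt :yymmdd10.;
        format trade_dt yymmdd10.;
        datalines;
    1001 IBM 500 131.25 2011-06-01
    1002 MSFT 200 24.10 2011-06-01
    ;
    run;

    proc export data=work.trades outfile='/staging/trades.dat' dbms=dlm replace;
        delimiter='|';
        putnames=no;
    run;

    data _null_;                                      /* generate the control file */
        file '/staging/trades.ctl';
        put "LOAD DATA";
        put "INFILE '/staging/trades.dat'";
        put "APPEND INTO TABLE trades";
        put "FIELDS TERMINATED BY '|' TRAILING NULLCOLS";
        put "(trade_id, symbol, quantity, price, trade_dt DATE 'YYYY-MM-DD')";
    run;

    /* placeholder credentials */
    x "sqlldr userid=loader/XXXX control=/staging/trades.ctl log=/staging/trades.log";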
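LDIF-parsing sketch. A minimal LDIF-to-SAS-table parse; the attribute list and file path are assumptions, and the production parser went on to map the result into the SAS metadata canonical tables.

    data ldap_users (keep=dn uid cn mail);
        infile '/exports/users.ldif' truncover end=eof;
        length dn $256 uid cn mail $128 attr $64 value $256;
        retain dn uid cn mail;
        input;
        if _infile_ = '' then do;              /* a blank line ends an LDIF entry */
            if dn ne '' then output;
            call missing(dn, uid, cn, mail);
            return;
        end;
        attr  = scan(_infile_, 1, ':');
        value = left(substr(_infile_, index(_infile_, ':') + 1));
        select (lowcase(attr));
            when ('dn')   dn   = value;
            when ('uid')  uid  = value;
            when ('cn')   cn   = value;
            when ('mail') mail = value;
            otherwise;
        end;
        if eof and dn ne '' then output;       /* flush the final entry */
    run;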
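Control-table sketch. The flow-control idea behind the ETL macros: a parameter matrix drives which subject areas run, and CALL EXECUTE generates one macro call per active row. The control table, its columns and the %load_subject macro are hypothetical (the real macro body performed the extract, transform and load).

    data work.etl_control;                     /* hypothetical parameter matrix */
        length subject $32 src_table $64 run_flag $1;
        input subject $ src_table $ run_flag $;
        datalines;
    customers stg_customers Y
    orders    stg_orders    Y
    returns   stg_returns   N
    ;
    run;

    %macro load_subject(subject=, src_table=);
        /* placeholder body; the real macro extracted from Teradata,       */
        /* transformed in the SAS warehouse, and loaded Oracle             */
        %put NOTE: loading subject &subject. from &src_table.;
    %mend load_subject;

    data _null_;                               /* one call per active control row */
        set work.etl_control;
        where run_flag = 'Y';
        call execute(cats('%load_subject(subject=', subject,
                          ', src_table=', src_table, ');'));
    run;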
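Reporting sketch. One way to produce the tabular/HTML and graphics/GIF output described above from an ITRM-style performance table; the library, data set and variables are placeholders.

    libname npm '/itrm/pdb/day';               /* placeholder PDB library */

    goptions device=gif;
    ods html body='/reports/circuit_util.html' gpath='/reports';

    proc means data=npm.circuit_interval n mean max maxdec=1;
        class circuit_id;
        var util_in util_out;
    run;

    proc gchart data=npm.circuit_interval;
        vbar circuit_id / sumvar=util_in type=mean;
    run;
    quit;

    ods html close;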
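Parser-dispatch sketch. The macro-facility dispatch used for the CA Unicenter cubes, reduced to a platform-to-module lookup; the module macros are stubs here, standing in for the real DATA step parsers.

    %macro parse_solaris_cube(cube_file=);      /* stub parser module */
        %put NOTE: parsing Solaris cube &cube_file.;
    %mend;

    %macro parse_hpux_cube(cube_file=);         /* stub parser module */
        %put NOTE: parsing HP-UX cube &cube_file.;
    %mend;

    %macro parse_cube(platform=, cube_file=);   /* dispatcher */
        %if %upcase(&platform.) = SOLARIS %then %do;
            %parse_solaris_cube(cube_file=&cube_file.)
        %end;
        %else %if %upcase(&platform.) = HPUX %then %do;
            %parse_hpux_cube(cube_file=&cube_file.)
        %end;
        %else %do;
            %put WARNING: no parser module registered for platform &platform.;
        %end;
    %mend parse_cube;

    %parse_cube(platform=solaris, cube_file=/cubes/host01_20020401.dat)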
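Oracle-load sketch. A minimal example of loading a PDB data set into Oracle through the SAS/ACCESS LIBNAME engine; connection details, schema and table names are placeholders, and site-specific options (for example BULKLOAD=YES) would apply in practice.

    libname pdb '/itsv/pdb/day';                /* detail-level PDB library */
    libname dr  oracle user=dr_loader password="XXXXXXXX"
                path=drprod schema=capacity;

    proc append base=dr.server_daily_util
                data=pdb.server_daily_util force;
    run;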
Confidential
Senior UNIX System Administrator/Developer, New York, New York and Jersey City, New Jersey
Responsibilities:
- Built and supported an enterprise-wide system management server that houses and distributes all system and namespace information for a large distributed UNIX environment.
- Major components of this system include an Ingres database, a Kerberos server, a DNS server, and file distribution to every domain's Confidential server.
- Responsible for implementation and operation of the enterprise distributed backup and disaster recovery system for over 2000 Sun systems based on Legato NetWorker software.
Confidential
Advisory Member of Technical Staff, Poughkeepsie, New York
Responsibilities:
- Developed a software system for performance analysis and capacity planning of Confidential's largest customers' installations.
- The system used SAS software to process MVS RMF and SMF data and to analyze and report on the performance of customers' workloads across the complete range of Confidential/370 and Confidential/390 hardware architectures.
- Built and supported a distributed UNIX environment across Confidential on Confidential ES/9000, Confidential on RISC System/6000, Confidential PS/2, SunOS and HP-UX systems.
- Wrote a Confidential redbook for Confidential system administrators.