
Senior Modeler/Manager Resume


St. Louis, MO

EXECUTIVE SUMMARY:

  • A senior-level production code developer, responsible at a reputable financial institution for model implementation, validation and monitoring, data QA, QC and migration, with extensive SAS, Perl and database SQL programming.
  • Expert in handling big data, ETL, parallel computing and process automation.
  • Proven ability to translate ideas into products and work efficiently to meet project deadlines.
  • Able to work independently as well as to take a leadership role in helping and training team members and model end users.
  • Excellent communicator and coordinator with modelers and business analysts; balances multiple tasks well and concentrates on priorities.

TECHNICAL EXPERTISE:

  • SAS - BASE, MACRO, STAT, SQL, GRAPH, TABULATE, ODS, EG, CONNECT, and ACCESS with many procedures, such as UNIVARIATE, ANOVA, GLM, MIXED, REG, CORR, FREQ, LOGISTIC, LIFETEST;
  • Perl - DBI, REGEX, CGI, OO model;
  • Shell scripting, Python, C and C++;
  • SQL, SQL plus and PL SQL;
  • Visual Basic (GUI and Database);
  • Java (applet and servlet);
  • HTML, JavaScript, XML and CSS;
  • Windows, UNIX, Linux, Macintosh and Mainframe;
  • Oracle, DB2, MS Access, MySQL, PostgreSQL, and Sybase;
  • Access, Excel, PowerPoint, and Word;
  • R, SPSS, Statgraphics, CVS

PROFESSIONAL EXPERIENCE:

Confidential, St. Louis, MO

Senior Modeler/Manager

Responsibilities:

  • In charge of the Risk CCAR model production code package, combining multiple models and portfolios, to perform CCAR model implementation and validation;
  • Final ‘official’ person to release the team's production code and model results (model performance, characteristic tests, and sensitivity tests) to MRM;
  • Responsible for data preparation in model development and for data mart and data warehouse migration between different SAS servers;
  • Automated and optimized the BI process by modularizing SAS code components and applying parallel computation with SAS macros (a sketch follows this list);
  • Converted a quantile regression Fortran program to a SAS macro, at a time when SAS offered neither a binary quantile regression procedure nor a standalone simulated annealing function;
  • Developed the Automated Housing Value Updating application ( Confidential ): a VB GUI sends requests to SAS through SSH, then returns the results to Excel on Windows;
  • Set up a comprehensive model backtest bed with multiple Confidential models, back tests over multiple snap dates, stress tests, sensitivity tests and characteristic analysis; the whole process uses a modularized approach and dynamically utilizes system resources for parallel processing;
  • Developed an MIS reporting system for all the production models;
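A minimal sketch of the modularized parallel approach described above, using SAS/CONNECT to fan independent portfolio runs out to concurrent sessions. The macro, the PERM library, the SCORES dataset, and the portfolio codes are hypothetical placeholders, not the production package itself:

```sas
/* Hedged sketch, not the production package: run one modular component
   per portfolio in its own asynchronous SAS/CONNECT session.           */
options autosignon sascmd="!sascmd";   /* spawn local sessions on demand */

%macro parallel_score(portfolios);     /* e.g. %parallel_score(HELOC AUTO CARD) */
  %local i j n p;
  %let n = %sysfunc(countw(&portfolios));
  %do i = 1 %to &n;
    %let p = %scan(&portfolios, &i);
    rsubmit sess&i wait=no inheritlib=(perm=perm);   /* returns immediately */
      /* &p resolves in the local session before the block is shipped */
      proc sql;
        create table perm.run_&p as
          select * from perm.scores where portfolio = "&p";
      quit;
    endrsubmit;
  %end;
  waitfor _all_ %do j = 1 %to &n; sess&j %end;;      /* block until all done */
  %do j = 1 %to &n;
    signoff sess&j;
  %end;
%mend parallel_score;
```

Each component runs in parallel and WAITFOR _ALL_ gates the downstream steps, the same pattern the backtest-bed bullet describes for multi-model, multi-snap-date runs.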

Confidential, Chattanooga, TN

Senior SAS Application Developer

Responsibilities:

  • Work with HEDIS business partners to clarify and refine business requirements; design, develop and test HEDIS technical solutions; and provide consultative technical support.
  • The job's main focus is SAS macros, SAS on the mainframe, SAS on the PC, SAS for UNIX, SAS SQL, HEDIS business knowledge, and cross-platform communication technologies (e.g., mainframe-to-UNIX communication protocols like NDM, FTP ...).
  • Developed an ETL process for pulling data from DB2 into SAS for further business analysis; the new process fully implements parallel computing to replace the current sequential ETL, greatly cutting data processing time and utilizing system resources far more efficiently (a sketch of the extraction step follows this list).
  • Used SAS SQL and SAS macros to write SAS programs that fit complex business logic requirements.
  • Worked closely with business specialists to redefine, implement and test new business rules against the related existing SAS programs.
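A minimal sketch of the DB2 extraction step, assuming the SAS/ACCESS Interface to DB2; the connection options, schema and table names, and date window are all hypothetical. Explicit SQL pass-through pushes the filtering into DB2 so only the needed rows cross the wire, and a macro loop like the one sketched earlier can issue one such pull per month in its own session to parallelize the extract:

```sas
/* Hedged sketch of one extraction unit (names and options hypothetical). */
proc sql;
  connect to db2 (database=claimsdb user=&sysuserid password=XXXX);
  create table work.hedis_extract as
    select * from connection to db2 (
      select member_id, service_dt, proc_code   /* executed inside DB2 */
        from clin.claims
       where service_dt between '2015-01-01' and '2015-12-31'
    );
  disconnect from db2;
quit;
```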

Environment: SAS, Perl, DB2 and SQL under UNIX, PC and mainframe.

Confidential, St. Louis, MO

Senior System Software Engineer

Responsibilities:

  • Provided technical support to keep the major financial data operations running for the home mortgage risk analytics and modeling teams and to cover their software needs under the SAS DI and BI environment.
  • Overhauled the system's SAS ETL data processing programs, which handle very large volumes of records with hundreds of variables spanning many years; successfully resolved several bottlenecks in the ETL steps and cut the running time from a few days to overnight in most cases (one typical fix is sketched after this list).
  • Modified system software to make the data aggregation process run more dynamically and more efficiently.
  • Developed ad hoc database update programs that let the analytics team post the most current information to the database in real time, and revised the web-interfaced data uploading programs to accommodate switching to new tables and new variables to meet business needs, with proper error tracking mechanisms.
  • Identified data processing bottlenecks and actively searched for possible solutions, new algorithms and new techniques; optimized code, performed benchmark testing and validated data.
  • Performed troubleshooting and debugging of system software and provided programming help for the analytics and modeling teams.
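One common fix for this kind of ETL bottleneck, shown as a hedged sketch with hypothetical dataset and variable names: replacing a sort-then-MERGE of a huge transaction table against a smaller lookup table with an in-memory hash lookup, which removes the most expensive PROC SORT entirely.

```sas
/* Hedged sketch: enrich a very large loan table without sorting it.
   PERM.LOANS_BIG, PERM.PROPERTY_VALUES and the variables are hypothetical. */
data work.loans_enriched;
  if 0 then set perm.property_values;       /* define lookup vars in the PDV */
  if _n_ = 1 then do;
    declare hash h(dataset: "perm.property_values");  /* small side in memory */
    h.defineKey("loan_id");
    h.defineData("appraisal_amt", "appraisal_dt");
    h.defineDone();
  end;
  set perm.loans_big;                       /* big side streams through once */
  if h.find() ne 0 then call missing(appraisal_amt, appraisal_dt);
run;
```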

Environment: SAS, Perl, DB2, shell scripting, SQL and UNIX

Confidential, St. Louis, MO

Senior Statistical Data Analyst

Responsibilities:

  • Worked closely with professors to develop several large-scale statistics and genetics programs, including coding, debugging and code optimization.
  • Most of these programs used Perl incorporated into SAS macros, combined with Sun's Gridware parallel computing.
  • Performed routine statistical and genome-wide genetic analyses.
  • Managed and coordinated multi-year, multi-center international clinical and genetic studies; developed a robust reporting system to track data collection and study progress; assisted researchers in designing study investigation forms; and developed and standardized strict QC procedures for data quality control.
  • Developed a bioinformatics database with a dynamic web interface for data uploading, downloading, updating and querying with proper data formats.
  • Edited data dictionaries and codebooks, with basic statistical descriptions, to help multi-center researchers understand how the variables were defined and in what format.
  • Provided statistical consulting to non-statistician medical researchers at WUSM for their statistical analysis needs: hypothesis testing, examining associations, prediction and power calculations (a sample power calculation is sketched after this list); prepared tables and graphs and interpreted the results.
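A minimal example of the routine power calculations mentioned above, using PROC POWER; the effect size, standard deviation and target power are purely illustrative numbers, not values from any actual study:

```sas
/* Hedged sketch: per-group sample size for a two-sample t test
   (all numbers are illustrative).                              */
proc power;
  twosamplemeans test=diff
    meandiff  = 5        /* clinically meaningful difference    */
    stddev    = 12       /* assumed common standard deviation   */
    power     = 0.8      /* target power                        */
    npergroup = .;       /* solve for the per-group sample size */
run;
```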

Environment: SAS, R, Sun's Gridware, Perl, MySQL, PostgreSQL, Solaris, Linux and shell scripting

Confidential, St. Louis, MO

Bioinformaticist

Responsibilities:

  • Provided multiple product leads based on solid bioinformatics analysis and statistical analysis.
  • Performed biotech research and development, focusing on nutrition and agriculture related traits.
  • Created ten invention disclosures and one patent for the company's intellectual property.
  • Discovered genes involved in the major nutrient components of Arabidopsis seeds by designing and analyzing gene expression profiling experiments; cluster analysis was applied to identify groups of genes with similar expression patterns that significantly contribute to the major seed components.
  • This process provides a clear understanding of the most valuable genes for seed nutrients.
  • Designed and applied an augmented design to pull out putative mutants in high-throughput mutant screening (HTS), then used a restricted design for mutant confirmation with a series of statistical hypothesis tests (t test, W test, chi-square test and F test); this resulted in the discovery of tens of transcription factor genes controlling a major nutrient component in Arabidopsis seeds (part of the test battery is sketched after this list).
  • Developed a comparative genomics approach to discover genes related to a nutrient metabolism pathway; more than twenty genomes were compared, and the final list was narrowed down to a scale workable for bench researchers, significantly reducing the research time span of the project.
  • Designed a microarray sample tracking database for tracking experimental samples and monitoring sample quality.
  • Used Visual Basic to design the database GUI, creating a user-friendly environment for easy access to and operation of the database.
  • Together these built an effective monitoring system for the microarray lab operations.
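A hedged sketch of part of the mutant-confirmation test battery named above; the dataset SEED_ASSAY and its variables are hypothetical stand-ins for the assay data:

```sas
/* Hedged sketch with a hypothetical dataset SEED_ASSAY
   (variables: line = 'mutant'/'wildtype', oil_pct, class). */
proc ttest data=seed_assay;        /* t test: mutant vs. wild type */
  class line;
  var oil_pct;
run;

proc freq data=seed_assay;         /* chi-square test on categorical class */
  tables line*class / chisq;
run;
```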

Environment: Perl, Visual Basic, Oracle, Access, UNIX and Windows NT
