
IT Specialist Resume


Washington, DC

CAREER OBJECTIVE:

  • Enterprise data management, data architecture, and data analytics.
  • Data onboarding and ETL (Extract, Transform, and Load).

PROFESSIONAL SUMMARY:

  • Performed the full life cycle of data onboarding tasks, making data available to researchers: analyzed user requirements, studied data structures, identified data anomalies and inconsistencies, and recommended best practices for transforming and loading the data.
  • Designed and implemented ETL (Extract, Transform, and Load) jobs for loading data and maintained data quality to meet research needs.
  • Automated data ETL, data profiling, and data quality analysis tasks using tools such as SAS Data Integration Studio, Talend Open Studio, SQL scripts, Python programs, Linux shell scripts, and HDFS commands.
  • Managed the end-to-end data processing flow, with source files in various formats (XML, Excel, delimited flat files, and SAS data files) and target repositories in relational databases or in HDFS on Hadoop clusters for big data; helped maintain the enterprise data architecture.
  • Collaborated with and assisted research teams in descriptive data analysis and in accessing data on various databases (SQL, Hive, etc.) with different tools. Provided best-practice query methodologies and scripts to researchers to enhance their capabilities and productivity.
  • More than 20 years of experience in data management and software development, spanning all phases of the Software Development Life Cycle (SDLC).
  • An advocate of software design review, code review, independent testing, and documentation to build robust, modular, maintainable, and future-proof applications.
  • Created utility and statistical tools to enhance the effectiveness and efficiency of researchers; built analytical data marts by joining, transforming, and imputing data to meet research needs and ensure timely delivery.

PROFESSIONAL SKILLS:

SAS: Base SAS, SAS Data Integration Studio, SAS Enterprise Guide, SAS Enterprise Miner

Talend: Open Studio (data integration and data profiling)

Data: management, architecture, analysis, profiling, and modeling

Databases: SQL Server, PostgreSQL, MySQL, Access, HDFS/Hive database

Programming Languages: SAS, Python, C/C++, Java

Hadoop/HDFS: big data processing and analytics

PROFESSIONAL EXPERIENCE:

IT Specialist

Confidential, Washington, DC

Responsibilities:

  • As the lead developer, implemented, deployed, and scheduled tens of SAS and Talend ETL jobs loading financial data to a SQL database and to an HDFS/Hive database.
  • Designed and implemented an architecture of coordinated, interdependent Python programs, Talend jobs, and SAS DI Studio jobs for data ETL and data analysis, taking full advantage of the strengths of each of the three component applications.
  • Created, deployed, and scheduled data analysis and data profiling jobs, making analysis reports available to various teams on a regular and timely basis.
  • Helped design a data transpose mechanism to handle a data set with a changing number of variables. As the sole developer, implemented the transpose mechanism in a SAS DI Studio ETL job, the first job of its kind in the OFR data management team.
  • Wrote Python scripts to transform unstructured big data into structured data, and implemented a Talend job to load the structured data into HDFS/Hive, the first job of its kind in the OFR data management team.
  • Assisted researchers in accessing and assessing big data on Hadoop/HDFS using SAS and Hue applications.
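The unstructured-to-structured transformation above can be sketched as follows. This is a minimal illustration only: the line format, field names, and regular expression are hypothetical assumptions, not the actual OFR data layout.

```python
import re

# Hypothetical raw feed: semi-structured text lines mixing a timestamp,
# a record identifier, a price, and a quantity. The format below is an
# illustrative assumption, not the real source data.
RAW_LINES = [
    "2016-03-01T09:30:00 trade id=ABC123 px=101.25 qty=500",
    "2016-03-01T09:30:01 trade id=XYZ789 px=99.80 qty=1200",
    "garbled line with no match",
]

# Named groups capture each field; non-matching lines are skipped.
LINE_RE = re.compile(
    r"^(?P<ts>\S+)\s+trade\s+id=(?P<id>\w+)\s+px=(?P<px>[\d.]+)\s+qty=(?P<qty>\d+)$"
)

def parse_lines(lines):
    """Turn matching raw lines into structured rows (dicts with typed fields)."""
    rows = []
    for line in lines:
        m = LINE_RE.match(line.strip())
        if m:
            rows.append({
                "ts": m.group("ts"),
                "id": m.group("id"),
                "px": float(m.group("px")),
                "qty": int(m.group("qty")),
            })
    return rows

rows = parse_lines(RAW_LINES)
```

The structured rows could then be written out as delimited files for a downstream Talend load into HDFS/Hive; unparseable lines are dropped here, though a production job would typically log them for review.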

ETL Specialist

Confidential, Washington, DC

Responsibilities:

  • Resolved the issue of importing data from XML files with nested data structures for loading with SAS DI Studio; the issue had been a roadblock for the OFR data management team for months until I resolved it.
  • Designed and created a SAS ETL job that handles transaction data with insert/update/delete operations directly against a relational database, the first ETL job in the OFR data management team to handle transaction data.
  • Created SAS jobs that produced data analysis and data profiling reports based on ad hoc requirements from researchers.
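The nested-XML problem above amounts to flattening parent/child elements into one row per child. A minimal Python sketch of the idea, using the standard library; the tag and attribute names are hypothetical, not the actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical nested XML: each <filing> parent holds repeated <position>
# children. All element and attribute names are illustrative assumptions.
XML_DOC = """
<filings>
  <filing id="F1" date="2015-06-30">
    <position cusip="111" value="1000"/>
    <position cusip="222" value="2500"/>
  </filing>
  <filing id="F2" date="2015-09-30">
    <position cusip="333" value="400"/>
  </filing>
</filings>
"""

def flatten(xml_text):
    """Flatten nested filing/position elements into one row per position,
    repeating the parent filing's attributes on each child row."""
    root = ET.fromstring(xml_text)
    rows = []
    for filing in root.findall("filing"):
        for pos in filing.findall("position"):
            rows.append({
                "filing_id": filing.get("id"),
                "date": filing.get("date"),
                "cusip": pos.get("cusip"),
                "value": int(pos.get("value")),
            })
    return rows

rows = flatten(XML_DOC)
```

Repeating the parent keys on every child row yields a rectangular table that a relational load (or a SAS data step) can consume directly, which is the same shape a SAS DI Studio XML import ultimately needs.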

SAS Programmer

Confidential, McLean, Virginia

Responsibilities:

  • Created SAS utilities for assisting statistical model building and model assessments.
  • Implemented SAS macros for exploratory data analysis and predictive data modeling.

SAS Programmer

Confidential, Washington, DC

Responsibilities:

  • Performed exploratory data analysis using base SAS programming.
  • Created an analytical data mart as a basis for predictive data modeling.
  • Built a binary decision tree model for predictive analysis using SAS Enterprise Miner.
