
Senior Database Engineer Resume


Pleasanton, CA

PROFESSIONAL SUMMARY:

  • Extensive knowledge of Data Warehousing and Data Mart applications.
  • Experience in evolving strategies and developing architecture for building a data warehouse.
  • Extensive experience in data modeling using the Erwin tool.
  • Expertise in data warehousing and data migration.
  • Extensive knowledge of all phases of the application development life cycle - planning, designing, coding, testing, and implementing mainframe applications.
  • Experienced in interacting with end users and analyzing business needs.
  • Strong ETL/ELT design and development skills.
  • Extensively involved in modeling STAR and SNOWFLAKE schemas (see the schema sketch after this list).
  • Expertise in metadata models.
  • Strong leadership, interpersonal, and oral/written communication skills.
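
As a minimal sketch of the STAR-schema modeling noted above, the SQL below defines a hypothetical fact table with two dimension tables; every table and column name is illustrative rather than drawn from any engagement.

    -- Dimension tables: descriptive attributes keyed by a surrogate key
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name VARCHAR(100),
        region        VARCHAR(50)
    );

    CREATE TABLE dim_date (
        date_key       INTEGER PRIMARY KEY,
        calendar_date  DATE,
        fiscal_quarter CHAR(2)
    );

    -- Fact table: foreign keys to the dimensions plus additive measures
    CREATE TABLE fact_sales (
        customer_key INTEGER REFERENCES dim_customer (customer_key),
        date_key     INTEGER REFERENCES dim_date (date_key),
        sales_amount DECIMAL(12,2),
        units_sold   INTEGER
    );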

TECHNICAL SKILLS:

OS: UNIX, MVS, Windows, OS/390

Languages: PL/SQL, SQL, COBOL, JAVA, C#, PYTHON

Databases: TERADATA, DB2, MS SQL, Access 97, ORACLE

ETL Tools: DataStage, Informatica, Ab Initio

BI Tools: MICROSTRATEGY, SAP-BO, Cognos

Software: HADOOP, HIVE, SPARK, SQL*Loader, TERADATA ADMINISTRATOR, TERADATA SQL ASSISTANT, TERADATA UTILITIES (FLOAD, MLOAD, FEXPORT, BTEQ), OLAP/OLTP, METADATA, MDM, Erwin, SHELL SCRIPTS, SYNCSORT, CONTROL-M, AUTOSYS

PROFESSIONAL EXPERIENCE:

Confidential, Pleasanton, CA

Senior Database Engineer

Responsibilities:

  • Design and develop ELT/ETL jobs using DataStage and Teradata.
  • Expertise in Teradata tools such as FastLoad, FastExport, MultiLoad, TPump, and the TPT utilities.
  • Write MapReduce jobs, HiveQL queries, and Spark applications (a HiveQL sketch follows this list).
  • Import and export data to and from HDFS, HBase, and Hive using Sqoop.
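
As an illustration of the Hive work above, the HiveQL sketch below declares an external table over an HDFS directory (for example, the landing directory of a Sqoop import) and runs a simple aggregate; the path, table, and columns are hypothetical.

    -- External table over an HDFS directory; dropping the table
    -- leaves the underlying files in place
    CREATE EXTERNAL TABLE orders_raw (
        order_id    BIGINT,
        customer_id BIGINT,
        order_total DECIMAL(12,2),
        order_date  STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/landing/orders';

    -- Simple aggregate over the raw data
    SELECT customer_id, SUM(order_total) AS lifetime_value
    FROM orders_raw
    GROUP BY customer_id;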

Confidential, Bellevue, WA

Senior Database Engineer

Responsibilities:

  • Design and develop ELT/ETL jobs using DataStage and Teradata.
  • Database tuning.
  • Expertise in Teradata tools such as FastLoad, FastExport, MultiLoad, TPump, and the TPT utilities.
  • Work on a Hadoop cluster currently sized at 56 nodes with 896 terabytes of capacity.
  • Write MapReduce jobs, HiveQL queries, and Spark applications.
  • Import and export data to and from HDFS, HBase, and Hive using Sqoop.
  • Tuned long-running Hive queries to better utilize Hadoop cluster resources (one common pattern is sketched after this list).
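
One common pattern behind the Hive tuning mentioned above is rebuilding a large table with partitions so that date-filtered queries prune partitions instead of scanning everything; the sketch below is illustrative, with hypothetical names.

    -- Partitioned, columnar copy of a large table
    CREATE TABLE orders_part (
        order_id    BIGINT,
        customer_id BIGINT,
        order_total DECIMAL(12,2)
    )
    PARTITIONED BY (order_date STRING)
    STORED AS ORC;

    -- Allow dynamic partitioning for the backfill
    SET hive.exec.dynamic.partition=true;
    SET hive.exec.dynamic.partition.mode=nonstrict;

    INSERT OVERWRITE TABLE orders_part PARTITION (order_date)
    SELECT order_id, customer_id, order_total, order_date
    FROM orders_raw;

    -- Partition-pruned query: reads one day of data, not the whole table
    SELECT COUNT(*) FROM orders_part WHERE order_date = '2016-03-01';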

Confidential, Folsom, CA

Database Engineer

Responsibilities:

  • Design and develop ETL jobs using SSIS.
  • Developed SQL scripts for SQL Server.
  • Load data into the SQL Server database (a typical staging-to-target statement is sketched after this list).
  • Develop metadata and the data model.
  • Develop reports using SSRS.
  • Responsible for tuning SQL queries.
  • Responsible for data security.
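
A minimal T-SQL sketch of the staging-to-target load referred to above, of the sort an SSIS Execute SQL task might wrap; the table and column names are hypothetical.

    -- Upsert rows from the staging table populated by the SSIS package
    MERGE dbo.Customer AS tgt
    USING dbo.Customer_Staging AS src
        ON tgt.CustomerID = src.CustomerID
    WHEN MATCHED THEN
        UPDATE SET tgt.CustomerName = src.CustomerName,
                   tgt.Region       = src.Region
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerID, CustomerName, Region)
        VALUES (src.CustomerID, src.CustomerName, src.Region);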

Confidential, Sacramento, CA

Data Manager

Responsibilities:

  • Gathering business requirements.
  • Responsible for aligning and integrating multiple sources of claims (Managed Care and Fee-for-Service) into one target database of 5 billion claims.
  • Responsible for cleansing and transformation of data (see the sketch after this list).
  • Design and development of grouper data using Symmetry Suite 8.
  • Creating data mappings between the various source systems and the data warehouse objects.
  • Responsible for Teradata 15.0 administration.
  • Developed reports using SAS.
  • Data analysis/profiling was performed using the Confidential DataStage ETL tool.
  • Confidential source data in VSAM/DB2/COBOL was loaded into Teradata, and Business Objects/SAS were used for reporting.
  • Responsible for data security, including encryption.
  • Responsible for change control and system enhancements.
  • Develop the data model using Erwin.
  • Writing technical specifications for ETL programs and modeling metadata.
  • Designing the general ETL infrastructure necessary to address common issues such as:
  • Error handling and restart/reprocessing techniques.
  • Parameterizing the DataStage jobs and building the controls necessary to feed parameter values correctly at run time.
  • Scheduling the DataStage jobs.
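
As a hedged sketch of the cleansing and transformation step described above, assuming hypothetical Teradata staging and target tables:

    -- Cleanse claims on the way from staging to target: trim stray
    -- whitespace, default missing codes, and cast the service date
    INSERT INTO claims_target (claim_id, member_id, claim_type, service_date, paid_amount)
    SELECT
        TRIM(claim_id),
        TRIM(member_id),
        COALESCE(NULLIF(TRIM(claim_type), ''), 'UNK'),        -- blank type -> 'UNK'
        CAST(service_date_txt AS DATE FORMAT 'YYYY-MM-DD'),   -- Teradata date cast
        COALESCE(paid_amount, 0)
    FROM claims_staging
    WHERE claim_id IS NOT NULL;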

Confidential, Boston, MA

Software Engineer

Responsibilities:

  • Design and develop ETL jobs using SSIS.
  • Developed PL/SQL scripts in Oracle (a minimal example follows the Environment line below).
  • Create mappings from source to target.
  • Scheduling the DataStage jobs using the AutoSys scheduler.
  • Training/guiding other developers on the methodologies and concepts used in the project.
  • Interacting with users.

Environment: Unix, DataStage, Oracle, Teradata
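
A minimal sketch, in Oracle's PL/SQL dialect, of the kind of script developed here; the source and target tables, the mapping rule, and the batch size are all hypothetical.

    -- Copy mapped source rows into the target table, committing in batches
    DECLARE
        v_rows PLS_INTEGER := 0;
    BEGIN
        FOR rec IN (SELECT src_id, src_name FROM source_accounts) LOOP
            INSERT INTO target_accounts (account_id, account_name)
            VALUES (rec.src_id, UPPER(rec.src_name));  -- simple mapping rule
            v_rows := v_rows + 1;
            IF MOD(v_rows, 1000) = 0 THEN
                COMMIT;                                -- batch commit
            END IF;
        END LOOP;
        COMMIT;
    END;
    /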

Confidential, Wilkesboro, NC

Programmer/Analyst

Responsibilities:

  • Actively participated in collecting enterprise module design requirements and in design discussions.
  • Prepared specifications for data loads and aggregations (an aggregation sketch appears below).
  • Developed many DataStage jobs for data processing and loading.
  • Tuning SQL statements.
  • Design and development of the reporting system using Cognos (Impromptu & PowerPlay).
  • Production support on a rotational basis (24x7).
  • Involved in the design of reports using MicroStrategy.

Environment: DB2, ETI Extract, Cognos, DB2 Loader, SQL, PL/SQL, DB2 UDB, Unix AIX, Erwin, JCL, OS/390
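
To illustrate the load-and-aggregation specifications mentioned above, a short SQL sketch in the DB2 dialect, with hypothetical table names: detail rows rolled up into a daily summary.

    -- Nightly aggregation: one summary row per store per day
    INSERT INTO sales_daily_agg (store_id, sale_date, total_amount, txn_count)
    SELECT store_id,
           sale_date,
           SUM(sale_amount),
           COUNT(*)
    FROM sales_detail
    WHERE sale_date = CURRENT DATE - 1 DAY
    GROUP BY store_id, sale_date;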

Confidential, Dearborn, MI

Data Warehouse Specialist

Responsibilities:

  • Involved in the design of database objects at the staging database level and at the target level.
  • Created job flow diagrams and scheduled production jobs using the CA7 scheduler.
  • Wrote queries to extract data using SQL on Teradata (an OLAP-function example appears after the Environment line).
  • Wrote COBOL programs to format data for loading into Teradata.
  • Involved in the design and loading of staging tables.
  • Responsible for production support.

Environment: MVS/ESA, JCL, SQL, TERADATA, TERADATA UTILITIES (FLOAD, MLOAD, FEXPORT), OLAP FUNCTIONS, HOLOS
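
An illustrative Teradata extraction query of the kind referenced above, using an OLAP (window) function; the table and columns are hypothetical.

    -- Keep each customer's three largest orders
    SELECT customer_id,
           order_id,
           order_amount,
           ROW_NUMBER() OVER (PARTITION BY customer_id
                              ORDER BY order_amount DESC) AS amt_rank
    FROM orders
    QUALIFY amt_rank <= 3;  -- Teradata-specific filter on the window result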

Confidential, Philadelphia, PA

Programmer

Responsibilities:

  • Production support for the PIMS online system.
  • Implement software change requests, which usually arrive as an ISR (Initiate Software Requirement) initiated by Custodial team leaders.
  • Develop COBOL and SMF/SLR programs to automate the manual process of distributing system overhead cost among the applications that accrue cost and use tools that cannot be directly attributed to a single application.

Environment: ES/9000, MVS/ESA, TSO/ISPF, VSAM, JCL, COBOL, DB2, EXPEDITER
