
Staff Data Engineer Resume


Sunnyvale, CA

SUMMARY

  • Over 10 years of Data Warehousing/Business Intelligence experience using ETL tools such as Ab Initio, DataStage, RETL, and WhereScape RED.
  • Experience in Financial, Retail and Networking domains.
  • 10 years of Database Experience using Teradata, Oracle, SQL Server, MySQL, PostgreSQL and DB2.
  • 1 year of experience with Big Data ecosystem tools: Hadoop, Spark, and Aster Data.
  • 10 years of scripting experience in UNIX shell, Python, and Perl.
  • Certified in Teradata 12 Enterprise Architecture.
  • Certified in Big Data ecosystem tools: Hadoop, Pig, Hive, HBase, and Sqoop.
  • Trained in Spark/Scala.
  • Over 7 years of experience with Business Intelligence (reporting) tools such as MicroStrategy and Tableau.
  • Extensive hands-on experience with database OLAP (ordered analytic) functions (a brief sketch follows this list).
  • Extensively used Teradata utilities such as BTEQ, MultiLoad, FastLoad, Fast Export, and Teradata Parallel Transporter (TPT) for data loading/unloading to/from Teradata.
  • Extensively used Oracle utilities such as external tables, SQL*Loader, and Oracle Data Pump for data loading.
  • Good understanding of the three levels of data modeling (conceptual, logical, and physical) as well as data modeling tools such as ER/Studio Data Architect and Erwin.
  • Good working knowledge of dimensional data modeling approaches such as star schema, snowflake schema, and normalized modeling.
  • Extensive knowledge of configuration management tools such as Subversion, IBM Rational ClearCase, CVS, and Ab Initio EME.
  • Extensive experience with scheduling tools such as Control-M, Autosys, and Crontab.
  • Extensively used change management tools such as HP OpenView Service Desk, ServiceNow, and Remedy for production implementations.
  • Exceptional analytical and problem-solving skills; flexible in learning new technologies in the IT industry toward the company's success.
  • Received the Above and Beyond award at Williams-Sonoma for excellent support of business users.
  • Received Colleague-to-Colleague Award recognition from cross-functional teams at Confidential.
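
A minimal sketch of the kind of ordered analytic (OLAP) query referenced above, issued from Python through the teradatasql DB-API driver. The table, columns, host, and credentials are hypothetical placeholders, not production objects.

    import teradatasql  # Teradata SQL Driver for Python (DB-API 2.0)

    # RANK() OVER is a Teradata ordered analytic (OLAP) function; QUALIFY filters
    # on its result. Table and column names below are illustrative only.
    TOP_STORES_SQL = """
    SELECT store_id,
           sale_dt,
           sales_amt,
           RANK() OVER (PARTITION BY store_id ORDER BY sales_amt DESC) AS sales_rank
    FROM   sales_daily
    QUALIFY sales_rank <= 10
    """

    with teradatasql.connect(host="tdprod", user="etl_user", password="********") as con:
        with con.cursor() as cur:
            cur.execute(TOP_STORES_SQL)
            for row in cur.fetchall():
                print(row)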

TECHNICAL SKILLS

ETL Tools: Ab Initio (Co>Operating System 3.0.5.1/ GDE 3.0.5.1), DataStage 7.1, RETL (Retek ETL), WhereScape RED 6.7.3

Big Data Ecosystem: Hadoop 2.2.0, Pig 0.12.0, Hive 0.13.1, HBase 0.96.2, Sqoop 1.4.4, Spark 1.4.0

BI/Reporting tools: MicroStrategy, Tableau, Business Objects/Crystal reports, SAS

DW databases: Teradata 14.10/13.10/12

OLTP Databases: Oracle, MySQL, SQL Server, PostgreSQL, AS/400 DB2

Database Utilities: BTEQ, MultiLoad, FastExport, FastLoad, TPT, SQL*Loader

Campaign Management Tools: Unica Affinium 6.4.8

Operating Systems: AIX Unix, Linux, Windows 95/98/2000/NT/XP

SQL Editors: Teradata SQL Assistant (Queryman), Toad, SQL*Plus, SQL Developer, WinSQL

Scripting Languages: T-SQL, Oracle PL/SQL, Unix shell scripting, Python 2.7, Perl, Java

Configuration Management Tools: Subversion (SVN), CVS, IBM Rational ClearCase, and Ab Initio EME

Job Scheduling tools: Control-M, Autosys, Crontab

Defect Tracking Tools: Rational ClearQuest, JIRA, GNATS, Bugzilla

Change Management Tools: HP OpenView Service Desk, ServiceNow, Remedy

PROFESSIONAL EXPERIENCE

Confidential, Sunnyvale, CA

Staff Data Engineer

Responsibilities:

  • As a Staff Data Engineer, leading a team of 3 on-site and offshore ETL/reporting developers and coordinating the ETL/reporting work between on-site and offshore.
  • Participating in data modeling, source-to-target mapping documentation, and ETL design.
  • Writing Python scripts to read JSON/XML data from source APIs (see the sketch after this list).
  • Writing Teradata stored procedures to implement ETL solutions.
  • Developing BI reports/dashboards in Tableau for end users.
  • Working with front-end dashboard developers to implement dashboard applications in Python and providing supporting SQL queries.
  • Assisting end users in writing custom SQL with OLAP/ordered analytic functions for reports.
  • Diagnosing and performance-tuning long-running Oracle/Teradata application queries and recommending indexes and statistics.
  • Serving as a Teradata DBA on a weekly rotation.
  • Participating in integration testing with cross-functional teams during the project life cycle.
  • Conducted a POC to evaluate the Big Data tools Hadoop and Spark for the engineering data warehouse.
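
As a rough illustration of the API ingestion mentioned above, the sketch below pulls JSON from a placeholder endpoint and flattens it into rows. The URL, field names, and use of the requests library are assumptions, not the actual production code.

    import json
    import requests  # assumed HTTP client; the endpoint and fields are placeholders

    API_URL = "https://api.example.com/v1/device-metrics"

    def fetch_metric_rows(url=API_URL):
        """Read JSON records from a source API and yield flattened rows for loading."""
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        for rec in resp.json().get("records", []):
            yield {
                "device_id": rec.get("id"),
                "metric_name": rec.get("metric"),
                "metric_value": rec.get("value"),
                "event_ts": rec.get("timestamp"),
            }

    if __name__ == "__main__":
        for row in fetch_metric_rows():
            print(json.dumps(row))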

Environment: Teradata 14.10, WhereScape RED 6.7.3.0, Oracle 11g, Linux, Teradata SQL Assistant 14.10, Tableau 8.2.0, MySQL, PostgreSQL, Hadoop 2.2.0, Spark 1.4.0, Hive, HBase, Pig, and Sqoop.

Confidential, San Francisco, CA

Data Warehouse Lead

Responsibilities:

  • Worked on the RDW (Retek Data Warehouse) migration project (Oracle to Teradata), converted ETL code from RETL to Ab Initio, and integrated CDW/RDW into the Enterprise Data Warehouse (EDW).
  • Involved in various projects like Event Triggered Email (ETE) and Web visits marketing projects in CDW (Customer Data Warehouse) environment.
  • Developed complex Ab Initio graphs, subgraphs, linked subgraphs & generic Ab Initio graphs.
  • Developed multiple Perl-based data assets to assist in ranking, such as distinguishing a product from an accessory, identifying words with the relevant parts of speech, classifying search terms into food/non-food, and looking up subcategories that cannot co-exist, by leveraging open-source APIs such as the Amazon Product API and the Wikipedia/Freebase API.
  • Led a development team of around 4 people and a production support team of 4 people (2 onsite and 2+ offshore).
  • Created secondary indexes and join indexes on Teradata tables to improve query performance.
  • Developed UNIX wrappers for Ab Initio graphs and Teradata stored procedures (a rough Python analogue follows this list).
  • Worked with the production support team to resolve critical defects and nightly batch failures; tuned production batches to meet critical business SLAs.
  • Organized and led ETL design & code review meetings.
  • Worked on reporting tools such as MicroStrategy and Crystal Reports to develop and tune reports.
  • Worked on the Teradata Aster platform for market basket analysis and web analytics problems.
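
The wrappers referenced above were written as UNIX shell scripts; purely as an illustration, here is a rough Python analogue that drives BTEQ to call a hypothetical Teradata stored procedure. The host, credentials, and procedure name are placeholders.

    import subprocess
    import textwrap

    # BTEQ script template; the .IF ERRORCODE check lets the wrapper fail fast
    # so the scheduler sees a non-zero return code.
    BTEQ_TEMPLATE = """
    .LOGON tdprod/etl_user,{password};
    CALL edw_etl.load_daily_sales();
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    """

    def run_bteq(password):
        """Feed the script to bteq on stdin and surface any failure to the caller."""
        proc = subprocess.run(
            ["bteq"],
            input=textwrap.dedent(BTEQ_TEMPLATE).format(password=password),
            capture_output=True,
            text=True,
        )
        if proc.returncode != 0:
            raise RuntimeError("BTEQ step failed:\n" + proc.stdout + proc.stderr)
        return proc.stdout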

Environment: Ab Initio (GDE 3.0.5.1/Co>Operating System 3.0.5.1), Teradata 13.10, Oracle 11g, AIX UNIX, Quest Toad, Teradata SQL Assistant 13, Control-M 7.0.00.100, MicroStrategy 9, Crystal Reports, ER/Studio Data Architect 9.1, Tortoise SVN 1.6.16

Confidential, Richmond, VA

Senior Ab Initio/Teradata DataWarehouse Consultant

Responsibilities:

  • Developed Ab Initio graphs, sub graphs, and linked sub graphs using various components such as Transform, Partition, Departition, Database, Datasets, Sort, FTP, and Validate
  • Involved in EDW Teradata physical design.
  • Created secondary indexes and join indexes on the tables to improve the performance of reporting queries (representative DDL follows this list).
  • Performed performance tuning of the Ab Initio ETL graphs.
  • Used phases and checkpoints to prevent deadlock and safeguard against failures.
  • Wrote UNIX wrapper scripts to run the graphs from the command-line interface.
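
Representative DDL for the secondary and join indexes mentioned above, kept as Python string constants so a deployment script could replay them. The database, table, and column names are hypothetical.

    # A NUSI (non-unique secondary index) to speed report filters on sale date,
    # and an aggregate join index that pre-aggregates sales by store and date.
    SECONDARY_INDEX_DDL = """
    CREATE INDEX rpt_nusi (sale_dt) ON edw.sales_fact;
    """

    JOIN_INDEX_DDL = """
    CREATE JOIN INDEX edw.sales_store_ji AS
    SELECT f.store_id, f.sale_dt, SUM(f.sales_amt) AS tot_sales_amt
    FROM   edw.sales_fact f
    GROUP BY f.store_id, f.sale_dt
    PRIMARY INDEX (store_id);
    """

    def apply_ddl(con):
        """Run the DDL over an open DB-API connection (for example, teradatasql)."""
        with con.cursor() as cur:
            for stmt in (SECONDARY_INDEX_DDL, JOIN_INDEX_DDL):
                cur.execute(stmt.strip().rstrip(";"))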

Environment: Ab Initio (GDE 1.14.27, Co>Operating System 2.14), Teradata V2R6, Oracle 8.1.7, UNIX, Toad, Teradata SQL Assistant, Rational ClearQuest, HP OpenView Service Desk 4.5.

Confidential, Boston, MA

ETL/DataStage Developer (Consultant)

Responsibilities:

  • Gathered business requirements, developed ETL technical design documents and DataStage ETL mappings, and performance-tuned the mappings and SQL queries.
  • Used various DataStage transform components for data cleansing.
  • Used stage variables to calculate derived fields in target tables.
  • Participated in the on-call rotation for production applications.
  • Wrote shell scripts to call DataStage mappings.
  • Worked with the job scheduling tool Autosys to automate DataStage jobs on a daily basis.
  • Wrote Autosys JIL (Job Information Language) to set up job scheduling (a sample definition follows this list).
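
A sample Autosys JIL definition of the kind described above, embedded in a small Python helper. The job name, command path, machine, and file locations are hypothetical; in practice the text is applied with the jil utility (jil < dw_nightly_load.jil).

    import textwrap

    # Hypothetical JIL for a nightly DataStage load job (job_type "c" = command job).
    NIGHTLY_LOAD_JIL = """
    insert_job: dw_nightly_load   job_type: c
    command: /apps/dw/bin/run_dsjob.sh NightlyLoad
    machine: etlhost01
    owner: etl@etlhost01
    days_of_week: all
    start_times: "02:00"
    std_out_file: /apps/dw/logs/dw_nightly_load.out
    std_err_file: /apps/dw/logs/dw_nightly_load.err
    alarm_if_fail: 1
    """

    def write_jil(path="dw_nightly_load.jil"):
        """Write the JIL text to disk so it can be loaded into Autosys with jil."""
        with open(path, "w") as fh:
            fh.write(textwrap.dedent(NIGHTLY_LOAD_JIL).strip() + "\n")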

Environment: IBM WebSphere DataStage 7.1 (EE), Oracle 9i, SQL Server 2000, Erwin 4.0, UNIX, Business Objects 6.0

Confidential 

Database Developer

Responsibilities:

  • Wrote stored procedures and functions in PL/SQL (see the sketch after this list).
  • Tuned SQL statements and wrote shell scripts.
  • Developed PL/SQL packages and database triggers to enforce business rules.
  • Developed various screens using Oracle Forms.
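
The procedures above were written directly in PL/SQL; purely as an illustration, the sketch below calls a hypothetical procedure from Python via the cx_Oracle driver. The connection string, package, and parameters are placeholders.

    import cx_Oracle  # Oracle DB-API driver; connection details are placeholders

    def apply_discount(order_id, discount_pct):
        """Call a hypothetical PL/SQL procedure that enforces a pricing rule."""
        con = cx_Oracle.connect("scott/tiger@orcl")
        try:
            cur = con.cursor()
            cur.callproc("pkg_pricing.apply_discount", [order_id, discount_pct])
            con.commit()
        finally:
            con.close()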

Environment: Oracle 7.0, Unix.
