
Sr. Data Engineer Resume

Seattle, WA

SUMMARY

  • 10+ years of IT experience, with a focus on data engineering.
  • Extensive experience designing, developing, implementing, testing, documenting, and operating large-scale, high-volume, high-performance data warehouse systems for business intelligence analytics.
  • Implemented data structures using best practices in data modeling and ETL, on Oracle, Teradata, Redshift, and other SQL technologies.
  • Experienced in using SQL with large data sets (Oracle/Teradata/SQL Server/Redshift).
  • Experience in BI/DW as a change leader providing strategic research, recommendations, and implementations.
  • Implemented various components of OLAP systems using ETL tools such as OWB, Wherescape RED, Pentaho, and Informatica PWC.
  • Experienced in leading data warehouse and analytics projects running in different environments, including AWS technologies (Redshift, S3, QuickSight) and other big data technologies.
  • Automated ETL processes using Unix/Perl/Python scripting.
  • Experienced in creating database components using Teradata utilities such as BTEQ, FastLoad, MultiLoad, FastExport, and Data Mover.
  • Experienced working with the Hadoop framework, MPP database platforms, and NoSQL databases such as MongoDB.
  • Experienced in reporting with tools such as Tableau, Business Objects, and MicroStrategy.
  • Experience with Agile, DevOps, and CI/CD frameworks.
  • Experienced in diagnosing and optimizing data warehouse performance issues using explain plans and the data dictionary.
  • Experience working with multi-terabyte data sets in relational databases (RDBMS) using SQL.
  • Experienced in data modeling for data warehouse systems using tools such as Erwin, Visio, and Wherescape 3D.
  • Developed framework scripts to automate data warehouse workflows using Unix/Perl scripting.
  • Experienced in analyzing and designing data warehouse components and maintaining artifacts in Confluence/SharePoint repositories.
  • Experienced in leading development teams and implementing projects with demanding objectives.
  • Worked with version control tools including Dimensions, VSS, ClearCase, SVN, and Git.
  • Strong organizational skills coupled with a strong work ethic.
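The ETL-automation bullets above can be illustrated with a minimal Python sketch. Everything here is a simplified, hypothetical stand-in (function names, the pipe-delimited format, the in-memory "warehouse"), not code from any of the projects listed:

```python
# Minimal ETL automation sketch: chain extract/transform/load steps.
# All names and the data format are illustrative placeholders.

def extract(rows):
    """Pretend to pull raw rows from a source system, dropping blanks."""
    return [r.strip() for r in rows if r.strip()]

def transform(rows):
    """Normalize each pipe-delimited row into a (KEY, int value) tuple."""
    out = []
    for r in rows:
        key, _, value = r.partition("|")
        out.append((key.upper(), int(value)))
    return out

def load(records, target):
    """Append records to an in-memory 'table' standing in for the DW."""
    target.extend(records)
    return len(records)

def run_pipeline(raw, target):
    """Run the three stages in order; return the number of rows loaded."""
    return load(transform(extract(raw)), target)

warehouse = []
loaded = run_pipeline(["ab|1", "  cd|2  ", ""], warehouse)
```

In a real job, `extract` and `load` would wrap database utilities or SQL calls; the point is only the stage-by-stage structure that the shell/Python wrappers automate.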

TECHNICAL SKILLS

  • Oracle
  • Teradata
  • Redshift
  • OWB
  • Informatica PWC
  • Wherescape Red
  • Pentaho
  • Tableau
  • BO
  • MicroStrategy
  • Erwin
  • Microsoft Visio
  • Wherescape 3D
  • Toad
  • Teradata SQL Assistant
  • SQL Developer
  • SQL Workbench
  • VSS
  • Clearcase
  • SVN
  • Dimensions
  • Control-M
  • Crontab
  • Autosys
  • Oracle Enterprise Manager
  • Teradata Viewpoint
  • Unix
  • Perl
  • AWS
  • Linux
  • Windows

PROFESSIONAL EXPERIENCE

Confidential, Seattle, WA

Sr. Data Engineer

Responsibilities:

  • Used an agile process for software development, with two-week sprints; all requirements are documented as user stories in Rally.
  • Responsible for building the solution approach in Confluence pages and providing high-level estimates based on the requirements.
  • Responsible for creating ETL workflows using tools such as Informatica, Wherescape RED, and Pentaho.
  • Automated ETL processes using scripting languages such as Python and Unix shell.
  • Responsible for writing optimized SQL queries against RDBMS platforms.
  • Responsible for creating database components for migration using Teradata/Oracle/Redshift utilities.
  • Documented all ETL workflows using data modeling tools such as Wherescape 3D, Erwin, and Visio.
  • Responsible for reviewing test cases from the system testing team.
  • Used database utilities (Teradata FastExport, FastLoad, MultiLoad, FastClone; Oracle exp/imp and expdp/impdp) to move data from Oracle to Teradata.
  • Created Oracle PL/SQL scripts and packages for processing data in data warehouse systems residing on Oracle.
  • Responsible for creating Teradata procedures for data processing.
  • Used the Oracle SQL*Loader utility and external tables to process flat files in the Oracle database.
  • Responsible for creating the business layer using Business Objects and Tableau.
  • Responsible for migrating the components into AWS.
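Flat-file loads like those above are typically driven by a generated FastLoad script. A hypothetical Python helper that renders one is sketched below; the table, file, and column names are placeholders, and logon/session details are omitted:

```python
def render_fastload_script(table, datafile, columns):
    """Render a minimal Teradata FastLoad script for a pipe-delimited
    flat file. Logon details and session settings are intentionally
    left out; a real job would supply them from configuration."""
    field_defs = ",\n".join(f"    {c} (VARCHAR(100))" for c in columns)
    insert_cols = ", ".join(columns)
    insert_vals = ", ".join(f":{c}" for c in columns)
    return (
        'SET RECORD VARTEXT "|";\n'
        f"DEFINE\n{field_defs}\n"
        f"FILE = {datafile};\n"
        f"BEGIN LOADING {table} ERRORFILES {table}_e1, {table}_e2;\n"
        f"INSERT INTO {table} ({insert_cols})\n"
        f"VALUES ({insert_vals});\n"
        "END LOADING;\n"
        "LOGOFF;\n"
    )

script = render_fastload_script("sales_stg", "sales.dat", ["sale_id", "amount"])
```

Generating the script from table metadata like this is what makes the load step automatable from shell/Python wrappers rather than hand-edited per table.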

Environment: Teradata, Oracle, Wherescape RED, Informatica PWC, Putty, WinSCP, Autosys, XML data, Scripting Languages (Unix/Perl), Data Transfer (SCP/FTP), Toad, Teradata SQL Assistant, Wherescape 3D, Visio, Dimensions, WinMerge, BO, Tableau, MicroStrategy.

Confidential, Seattle, WA

Sr. Data Engineer

Responsibilities:

  • Used an agile process for software development, with two-week sprints.
  • Responsible for building the solution approach in Confluence pages and providing high-level estimates based on the requirements.
  • Responsible for creating sub-tasks on the Jira board based on the requirement user story and the components needed to implement it.
  • Used Confluence pages for detailed design, linked to the corresponding Jira user story.
  • Used Wherescape 3D to create table-level data flow diagrams and created data-mapping spreadsheets to define the relationships between systems.
  • Responsible for reviewing test cases from the system testing team.
  • Responsible for creating Teradata procedures to process data from source teams and load it into dimension/fact tables using the ETL tool Wherescape RED.
  • For some interfaces, source-system data arrives as flat files, which are loaded into the data warehouse using the Teradata utilities FastLoad/MultiLoad.
  • Responsible for migrating the database into the AWS environment (S3, Redshift).
  • Responsible for creating Teradata BTEQ scripts that call the procedures from UNIX shell scripts.
  • Responsible for processing big data in a distributed environment using the Hadoop framework.
  • Created Oracle PL/SQL scripts and packages for processing data in data warehouse systems residing on Oracle.
  • Used the Oracle SQL*Loader utility and external tables to process flat files in the Oracle database.
  • Used the Oracle Data Pump utility and UTL_FILE to move data among Oracle databases.
  • Used the FastExport utility to move data from Oracle to Teradata.
  • Used the Data Mover utility to move data from the Teradata operational database to the mart database.
  • Responsible for creating daily batch jobs that call all the database components to extract, transform, and load data; jobs are created using UNIX/Perl shell scripting and Python.
  • Used Visio/Wherescape 3D to document the job flow for each interface/module.
  • Responsible for deploying components using the version control tool Dimensions and scheduling them through Control-M.
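The daily batch jobs above run a sequence of database components and must fail cleanly when a step breaks. A minimal, hypothetical Python orchestration sketch (the callables stand in for BTEQ/shell invocations; names are illustrative):

```python
import time

def run_batch(jobs, retries=2, delay=0.0):
    """Run named batch steps in order; retry a failed step up to
    `retries` times before aborting the whole batch. Each job is a
    plain callable standing in for a BTEQ or shell invocation."""
    results = {}
    for name, job in jobs:
        for attempt in range(retries + 1):
            try:
                results[name] = job()
                break
            except Exception:
                if attempt == retries:
                    raise RuntimeError(f"batch aborted at step {name!r}")
                time.sleep(delay)
    return results

# Example: the 'load' step fails once, then succeeds on retry.
state = {"tries": 0}
def flaky_load():
    state["tries"] += 1
    if state["tries"] < 2:
        raise IOError("transient failure")
    return "loaded"

out = run_batch([("extract", lambda: "ok"), ("load", flaky_load)])
```

A scheduler such as Control-M would own the calendar and dependencies between batches; a wrapper like this only sequences the steps inside one batch.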

Environment: Teradata, Oracle, Legacy Data Warehouse system RDW, Mart Database, Hadoop Framework, RMS, Putty, WinSCP, Control-M, XML data, Scripting Languages (Unix/Perl/Python), Data Transfer (SCP/FTP), Toad, Teradata SQL Assistant, Informatica PWC, AWS, Redshift, Wherescape RED, Wherescape 3D, Visio, Dimensions, WinMerge.

Confidential, Seattle, WA

Data Engineer

Responsibilities:

  • Interacted with system engineers and architects to analyze business and technical requirements.
  • Responsible for creating the solution approach to recalculate customer preferences.
  • Responsible for updating the existing design to incorporate new changes and storing it in SharePoint.
  • Created modified flow diagrams using Microsoft Visio.
  • Worked on extracting sales data from the Oracle data warehouse into the Customer Analytics data mart using the Teradata utility FastExport.
  • Created Teradata procedures to recalculate email/phone preferences using the Wherescape RED ETL tool.
  • Worked on processing flat files using the Teradata utilities FastLoad/MultiLoad.
  • Responsible for creating Teradata BTEQ scripts to call the data-processing procedures; these BTEQ scripts are embedded in shell scripts.
  • Responsible for investigating issues, defects, and system outages; determining root causes; and formulating and implementing corrective actions.
  • Actively involved in support for production, system, integration, and dev issues.
  • Responsible for building Teradata macros and views to support the Database Marketing team.
  • Responsible for building Oracle views on top of the Oracle mart database to support the Database Marketing team.
  • During performance testing, monitored job execution in the Teradata database using Teradata Viewpoint and corrected issues based on the results.
  • Responsible for deploying components using the version control tool Dimensions and scheduling them through Control-M.
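The preference recalculation described above boils down to keeping the latest opt-in/opt-out decision per customer and channel. A simplified, hypothetical Python stand-in for the Teradata procedures (event shape and names are assumptions):

```python
def recalc_preferences(events):
    """Keep the most recent opt-in/opt-out decision per customer and
    channel. Events are (customer_id, channel, opt_in, ts) tuples —
    a simplified stand-in for the recalculation procedures; the real
    logic ran as Teradata stored procedures over preference tables."""
    latest = {}
    for cust, channel, opt_in, ts in events:
        key = (cust, channel)
        if key not in latest or ts > latest[key][1]:
            latest[key] = (opt_in, ts)
    return {key: flag for key, (flag, _) in latest.items()}

events = [
    (1, "email", True, 10),
    (1, "email", False, 20),   # later opt-out wins
    (2, "phone", True, 5),
]
prefs = recalc_preferences(events)
```

In SQL the same idea is usually a `ROW_NUMBER() OVER (PARTITION BY customer, channel ORDER BY ts DESC)` picking row 1.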

Environment: Teradata, Oracle, Mart Database, Putty, WinSCP, Control-M, XML data, Scripting Languages (Unix/Perl), Data Transfer (SCP/FTP), Toad, Teradata SQL Assistant, Wherescape RED, Wherescape 3D, Visio, Dimensions, WinMerge.

Confidential

Data Engineer

Responsibilities:

  • Interacted with system engineers and architects to analyze business and technical requirements.
  • Responsible for creating the solution approach to add new sources to the existing data warehouse system.
  • Modified the existing design to accept any number of new source systems using Teradata procedures.
  • Interacted with the client and team members to ensure issue resolution.
  • Used the Oracle SQL*Loader utility and external tables to process flat files in the Oracle database.
  • Moved data from the operational database to the data mart using the Teradata utility Data Mover.
  • Used Oracle PL/SQL scripts to process data in Oracle data warehouse systems and created views on top of the mart tables for the Database Marketing team.
  • Involved in testing with users and fixing bugs in UAT, production, and maintenance.
  • Responsible for change requests and maintenance during development of the project.
  • Used database tools such as Toad and Teradata SQL Assistant to create components in Oracle/Teradata databases.
  • During performance testing, monitored job execution in the Teradata database using Teradata Viewpoint and corrected issues based on the results.
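Flat-file processing with SQL*Loader or FastLoad routes malformed records to error files rather than aborting the load. A hypothetical Python sketch of that accept/reject split (column count as the only check here; a real loader validates much more):

```python
def split_rows(lines, ncols, sep="|"):
    """Split flat-file lines into accepted and rejected rows, mimicking
    the error-file behaviour of loader utilities: rows with the wrong
    column count are rejected instead of loaded. Only the column-count
    check is modelled; datatype checks etc. are omitted."""
    good, bad = [], []
    for line in lines:
        fields = line.rstrip("\n").split(sep)
        (good if len(fields) == ncols else bad).append(fields)
    return good, bad

good, bad = split_rows(["a|1\n", "b|2|extra\n", "c|3"], ncols=2)
```

Reviewing the rejected rows (the loader's error tables/files) is typically part of the batch's post-load checks.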

Environment: Teradata, Oracle, Mart Database, Putty, WinSCP, Control-M, Scripting Languages (Unix/Perl), Data Transfer (SCP/FTP), Toad, Teradata SQL Assistant, Wherescape RED, Wherescape 3D, Visio, WinMerge.
