Sr. Data Engineer Resume

Cincinnati, OH

SUMMARY

  • 8 years of IT experience in the analysis, design, development, testing, implementation, and support of business applications using ETL tools, data warehousing applications, Big Data applications, and cloud strategies
  • Expertise in the Snowflake cloud data warehouse and cloud data ingestion
  • Expertise in Data Warehousing tools Data Stage 11.3/9.1/8.7/8.5 (Manager, Designer, Director and Administrator) and Oracle Data Integrator
  • Expertise in Big Data applications such as Hive, Impala, Oozie workflows, and Sqoop.
  • Expertise in creating tables on the Hadoop Distributed File System and executing queries on the Hadoop cluster using Hive scripts.
  • Involved in all phases of the SDLC: requirement gathering, design, development, unit testing, UAT, production roll-out, enhancements, and production support.
  • Experience in source system analysis and data extraction from various sources like Oracle 10g/11g, DB2 UDB, MS SQL Server and worked on integrating data from Flat Files, CSV files and XML files into a common reporting and analytical Data Model.
  • Strong knowledge of dimensional Star Schema, Snowflake Schema, and Data Vault methodologies for developing data marts.
  • Involved in writing complex SQL queries, PL/SQL, and UNIX shell scripts.
  • Proficient in data warehousing techniques such as Slowly Changing Dimensions, surrogate key assignment, and change data capture (see the sketch after this list).
  • Knowledge of maintaining different versions of code using version control systems such as ClearCase and Tortoise SVN.
  • Extensive experience working in high-volume data environments.
  • Expert in debugging, troubleshooting, monitoring, and performance tuning.
  • Involved in Unit Testing, Integration and User Acceptance Test (UAT) preparation for different applications.
  • Extensively worked on migration projects, migrating jobs from DataStage 8.7 to 11.3 and from 11.3 to 11.7.
  • Good team player with excellent communication skills; conceptual, goal-oriented, and a quick learner.
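
As an illustration of the Slowly Changing Dimension, surrogate key, and change data capture techniques above, here is a minimal Type 2 sketch in Oracle-style SQL. The stg_customer and dim_customer tables, the dim_customer_seq sequence, and the tracked address column are all hypothetical.

    -- Step 1: expire the current dimension row when a tracked attribute changed (CDC by comparison).
    UPDATE dim_customer d
    SET    d.effective_to = CURRENT_DATE,
           d.is_current   = 'N'
    WHERE  d.is_current = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address    <> d.address);

    -- Step 2: insert a new current version, assigning the surrogate key from a sequence.
    -- This matches both brand-new customers and those whose current row was just expired in step 1.
    INSERT INTO dim_customer
           (customer_sk, customer_id, address, effective_from, effective_to, is_current)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.is_current = 'Y');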

TECHNICAL SKILLS

ETL Tools: IBM WebSphere DataStage 11.3/9.1/8.7/8.5 (Manager, Designer, Director, Administrator), Oracle Data Integrator.

Databases: Oracle 12c/11g/10g, MS SQL Server 2000/2005, AS400, DB2 UDB, MS Access and HDFS

DB Utilities: SQL*Plus, TOAD 9.1, Oracle SQL Developer, Rapid SQL, Hive, Impala, AQT, Squirrel

Other S/W Tools: Jenkins, MS Office, Putty, ClearCase, Tortoise SVN, JIRA, HP Quality Center, Rally

Scheduling Tools: AutoSys, Tidal, Control-M, TWS, Airflow

Cloud Strategies: Snowflake Data warehouse, AWS S3, AWS Secrets Manager, Azure Data Factory

PROFESSIONAL EXPERIENCE

Confidential, Cincinnati, OH

Sr. Data Engineer

Responsibilities:

  • Worked as the Snowflake SME, mentoring and training the team on Snowflake architecture and components
  • Designed and implemented a real-time data pipeline to process financial data, integrating ~150 billion raw records from multiple data sources using SQL, SnowSQL, and Jenkins, and storing the processed data in Snowflake
  • Ingested data from disparate data sources into Snowflake and created data views for data analysts to consume in BI tools such as Tableau
  • Maintained data pipeline uptime of 99.9% while ingesting transactional data across different primary data sources
  • Led the migration from Oracle to Snowflake using SnowSQL, DataStage, and Jenkins, resulting in a 14% performance improvement
  • Worked with business users to understand business needs and translate them into actionable reports, saving 10 hours of manual work each week
  • Worked on Big Data technologies such as Hive, Impala, HDFS, and Oozie workflows, ingesting data from different sources into the audit layer and then into the harmonized layer
  • Worked on a Data Domains project, which maintains data as a single source of truth for each subject area, such as the customer, product, and billing domains
  • Designed, built, and maintained Big Data workflows to process millions of records using Oozie workflows and DataStage, and maintained HDFS tables partitioned in encrypted Parquet format (see the Hive sketch after this list)
  • Involved in design and development of multiple ETL jobs using IBM Data Stage for different initiatives
  • Implemented Data Vault modeling for the creation of the EDW and used GitHub for code versioning
  • Led migrations from IBM DataStage version 8.5 to 11.5 and from 11.5 to 11.7
  • Created a stage on an AWS S3 bucket and used it to transfer data to a different network file system (see the loading sketch after this list)
  • Stored Snowflake service accounts in AWS Secrets Manager and used them in Jenkins for daily job loads
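
A minimal sketch of the S3 stage and load pattern described above, in Snowflake SQL. The database, stage, table, and storage integration names are hypothetical, and the storage integration is assumed to already grant access to the bucket.

    -- Hypothetical external stage over the raw-data bucket.
    CREATE OR REPLACE STAGE finance_db.raw.txn_stage
      URL = 's3://example-finance-raw/transactions/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);

    -- Bulk-load the staged files; bad rows are skipped and reported in the load results.
    COPY INTO finance_db.raw.transactions
    FROM @finance_db.raw.txn_stage
    ON_ERROR = 'CONTINUE';

    -- A simple view over the loaded data for BI tools such as Tableau.
    CREATE OR REPLACE VIEW finance_db.analytics.v_transactions AS
    SELECT * FROM finance_db.raw.transactions;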
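
Likewise, a minimal Hive sketch of the partitioned Parquet layout mentioned above. The audit and harmonized table names are hypothetical, and encryption is assumed to come from an HDFS encryption zone over the table location rather than from the DDL itself.

    -- Hypothetical harmonized-layer table stored as Parquet and partitioned by load date.
    CREATE EXTERNAL TABLE harmonized.transactions (
      txn_id     BIGINT,
      account_id STRING,
      amount     DECIMAL(18,2)
    )
    PARTITIONED BY (load_dt STRING)
    STORED AS PARQUET
    LOCATION '/data/harmonized/transactions';

    -- Load partitions from the audit layer with dynamic partitioning.
    SET hive.exec.dynamic.partition=true;
    SET hive.exec.dynamic.partition.mode=nonstrict;
    INSERT OVERWRITE TABLE harmonized.transactions PARTITION (load_dt)
    SELECT txn_id, account_id, amount, load_dt
    FROM   audit.transactions_raw;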

Environment: Snowflake, AWS, Hive, Impala, Oozie Workflows, HDFS, IBM DataStage 11.5 and 11.7, Oracle 12c, SQL Developer, Informatica Axon Data Governance, Jenkins, Airflow, TWS, GitHub, SQL Server, Crontab, Agile Methodologies using Rally

Confidential, Kansas City, MO

Sr. Data warehouse Developer

Responsibilities:

  • Worked as a DataStage developer on the EDW team, which builds the enterprise data warehouse and data marts for FSA.
  • Involved in functional discussions and design reviews.
  • Built the EDW for FSA using Data Vault modeling (see the hub-load sketch after this list).
  • Created DataStage jobs that load data from source to staging and from staging to the data vault.
  • Adapted the functionality of existing Oracle Data Integrator (ODI) processes and converted them to DataStage.
  • Applied a metadata-driven approach to table loads, passing table names as parameters and loading entire tables using Runtime Column Propagation (RCP) in DataStage.
  • Involved in performance tuning of complicated SQL queries using indexes and partitioning on the database (see the partitioning sketch after this list).
  • Built Data validation jobs for analyzing and validating data.
  • Worked with high-volume data.
  • Involved in unit testing and fixing defects.
  • Involved in production monitoring and support, including on-call support.
  • Involved in troubleshooting and fixing critical production issues.
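
A minimal sketch of the staging-to-data-vault hub load described above, in Oracle-style SQL with a sequence-based surrogate key. The dv and stg schemas, table names, and record source are hypothetical.

    -- Insert business keys that are not yet present in the hub.
    -- The DISTINCT is pushed into a subquery because Oracle disallows NEXTVAL
    -- together with DISTINCT in the same query block.
    INSERT INTO dv.hub_customer (customer_sk, customer_id, load_dts, record_src)
    SELECT hub_customer_seq.NEXTVAL,
           s.customer_id,
           SYSTIMESTAMP,
           'FSA_STAGING'
    FROM  (SELECT DISTINCT customer_id FROM stg.customer) s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dv.hub_customer h
                       WHERE  h.customer_id = s.customer_id);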
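
And a minimal sketch of the index and partitioning tuning mentioned above, in Oracle 11g-style DDL; the table, columns, and partition scheme are hypothetical.

    -- Range-partition a large table by month so date-filtered queries prune partitions.
    CREATE TABLE edw.fact_disbursement (
      disb_id    NUMBER,
      award_id   NUMBER,
      disb_date  DATE,
      amount     NUMBER(18,2)
    )
    PARTITION BY RANGE (disb_date)
    INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))
    (PARTITION p0 VALUES LESS THAN (DATE '2015-01-01'));

    -- A local index on a common filter column keeps index maintenance per-partition.
    CREATE INDEX edw.ix_fact_disb_award ON edw.fact_disbursement (award_id) LOCAL;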

Environment: IBM Information Server (DataStage & QualityStage) 9.1, Oracle 11g, WebSphere DB2 UDB, AIX UNIX, Windows 7, Putty, TOAD, SQL Developer, ODI, JIRA, HP ALM

Confidential, Mclean, VA

Sr. Data Stage Developer

Responsibilities:

  • Worked as a DataStage consultant on the CDW (Corporate Data Warehouse) team, participating in all phases of the SDLC.
  • Worked on the LQA (Loan Quality Advisor) project, which evaluates existing customers' loans to determine the quality of each loan.
  • Worked extensively with DataStage components such as Designer, Director, and Administrator.
  • Involved in requirements gathering from the source team, development of jobs using DataStage Designer, and monitoring of DataStage jobs in DataStage Director.
  • Extensively used stages such as Transformer, Sort, Filter, Join, Lookup, Data Set, Sequential File, Peek, Row Generator, Column Generator, XML Input, and XML Output, along with database connector stages such as the ODBC Connector, Oracle Connector, DB2 Connector, DB2 UDB, and enterprise stages.
  • Worked on XML data, wrote XQuery expressions, and pulled XML data using the Oracle Connector stage (see the XMLTABLE sketch after this list).
  • Used node map constraints for better allocation of nodes to DataStage jobs.
  • Used appropriate partitioning techniques for better performance of DataStage jobs.
  • Created shell scripts for running the DataStage jobs from UNIX.
  • Worked with the ClearCase version control system, using ClearCase Explorer and Project Explorer to maintain the latest versions of DataStage jobs.
  • Worked with the AutoSys scheduling tool to schedule the DataStage jobs and created the required JIL files.
  • Extensively used SQL Developer for writing SQL queries and some PL/SQL.
  • Used the RAD tool for DB2 queries.
  • Performed unit testing of the DataStage jobs we developed.
  • Worked closely with SIT and UAT teams.
  • Involved in resolving and documenting the Quality Center defects raised by the SIT and UAT teams.
  • Involved in maintaining documentation such as mapping documents, migration forms, and technical design documents.
  • Interacted with the rest of the team to find better solutions and resolve critical issues.
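
A minimal sketch of the kind of XQuery used to shred XML into rows via Oracle's XMLTABLE; the loan_xml_docs table, its xml_doc column, and the element names are hypothetical.

    -- Shred a loan XML document into relational rows with an XQuery path expression.
    SELECT x.loan_id, x.borrower, x.loan_amount
    FROM   loan_xml_docs d,
           XMLTABLE('/LoanFile/Loan'
                    PASSING d.xml_doc
                    COLUMNS loan_id     VARCHAR2(20)  PATH '@id',
                            borrower    VARCHAR2(100) PATH 'Borrower/Name',
                            loan_amount NUMBER        PATH 'Amount') x;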

Environment: IBM Information Server (DataStage & QualityStage) 8.7, Oracle 11g, WebSphere DB2 UDB, AIX UNIX, Windows 7, Putty, SQL Developer, RAD, ClearCase, AutoSys Scheduler, HP Quality Center (HPQC)

Confidential, Princeton, NJ

Data Stage Developer

Responsibilities:

  • Worked as a shared resource for two teams in parallel, covering all phases of the SDLC.
  • Worked on a data warehouse project and an inbound project involving policies and claims data.
  • Extensively used the IBM DataStage components Designer, for developing parallel and server jobs, and Director, for monitoring and debugging running jobs.
  • Part of a team that developed new DataStage jobs per business user requirements; met with business users to understand their requirements.
  • Improved the design of old DataStage jobs by making required enhancements and improving their performance.
  • Created a common sequencer job that serves the same function as multiple similar sequencers by removing hard-coded parameter values and adding a common parameter set, increasing reusability.
  • Used many parallel edition stages, such as Join, Lookup, Aggregator, Remove Duplicates, Sort, Transformer, Funnel, Filter, Oracle Connector, DB2, ODBC Connector, Column Generator, Row Generator, Peek, FTP, XML Input, XML Output, Sequential File, and Data Set.
  • Worked on server jobs for the data warehouse team, using server edition stages such as Transformer, Link Collector, Aggregator, Merge, FTP Plug-in, Hashed File, and Sequential File.
  • Created shared containers in both parallel and server editions and converted existing shared containers to parallel containers.
  • Involved in converting server edition jobs to parallel edition jobs, using the BASIC Transformer in the parallel edition, which supports the server edition functions.
  • Worked with XML stages, extracting data with the XML Input stage and writing XML with the XML Output stage.
  • Created DataStage jobs for writing data to different servers using the FTP stage.
  • Worked with mainframe data using the SQuirreL database tool and created DataStage jobs to extract mainframe data using the DB2 stage.
  • Involved in writing the SQL queries required for extracting data from databases.
  • Created DB links for querying from one environment to another (see the sketch after this list).
  • Worked on Quality Center defects for the data warehouse team, performed unit testing for all those defects, and documented them.
  • Worked on UNIX scripts for running the DataStage jobs for the data warehouse team.
  • Worked on wrapper scripts for running the DataStage sequencer jobs for the inbound team and created some of those scripts.
  • Created a UNIX shell script that gets parameter values from the .OraclePasswords file containing user credentials for all databases.
  • Used the Control-M scheduler for scheduling the DataStage jobs.
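
A minimal sketch of the cross-environment DB link pattern above, in Oracle SQL; the link name, remote user, and TNS alias are hypothetical, and the password is a placeholder.

    -- Create a link to another environment (the TNS alias must exist in tnsnames.ora).
    CREATE DATABASE LINK qa_link
      CONNECT TO report_user IDENTIFIED BY "xxxxxxxx"
      USING 'QADB';

    -- Query a remote table through the link.
    SELECT COUNT(*) FROM policy_fact@qa_link;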

Environment: IBM Information Server (DataStage) 8.7, Oracle 11g, DB2 UDB, AIX UNIX, TOAD, Windows 7, Putty, SQuirreL, Control-M Scheduler, HP Quality Center (HPQC)

Confidential, Peoria, IL

Data Stage Developer

Responsibilities:

  • Worked extensively on the middleware development team, developing DataStage jobs that move all kinds of data from SAP to systems such as VPP, MRC, MSC, and MES and vice versa, and troubleshooting the issues that arose in between.
  • Involved in functional and technical meetings and worked closely with business analysts and SAP teams to gather and understand requirements.
  • Used DataStage Designer to develop DataStage jobs that extract, transform, and load data to different systems.
  • Used DataStage Director to monitor jobs and debug errors.
  • Worked on SAP packs for extracting data from SAP and loading data to SAP.
  • Responsible for developing the jobs involving IDoc Load and IDoc Extract.
  • Using the DataStage SAP Administrator, checked IDoc logs, generated IDocs by selecting IDoc types, and cleared IDoc logs when required.
  • Worked with the BAPI stage while troubleshooting issues.
  • Implemented DataStage jobs in a grid environment and got them running.
  • Created user variables, timestamps, and parameter sets for the jobs in the grid environment.
  • Made enhancements to several existing jobs per the functional requirements.
  • Created shared containers and used them across multiple jobs with the same business logic.
  • Worked with databases such as Oracle 10g/11g and IBM DB2 using the ODBC and DB2 connectors.
  • Used many stages like Transformer, Sequential file, Dataset, Remove Duplicates, Sort, Join, Merge, Lookup, Funnel, Copy, Filter, ODBC, DB2, FTP, etc.
  • Used real-time stages such as Web Services, the WebSphere MQ Connector, and XML, and used all the sequencer stages.
  • Created environment variables for all the existing DataStage jobs using the DataStage Administrator client.
  • Involved in migrating jobs from development to QA, pre-production, and production.
  • Worked with the Tidal scheduling tool, inserting new DataStage jobs, scheduling those jobs, creating job dependencies, and passing variables where required.
  • Responsible for supporting the existing jobs and working on change requests from different teams per new requirements.
  • Provided go-live support for a few weeks through the warranty period.
  • Used Remedy for troubleshooting production issues and reassigning them to the appropriate person or team.
  • Identified defects using HPQC and resolved them per the requirements.

Environment: IBM Information Server (DataStage) 8.7/8.5, Oracle 11g/10g, SQL*Loader, WebSphere MQ 6.0, DB2 UDB, AIX UNIX, TOAD, Windows 7, Tidal, SAP PI 7.3, SAP ECC, HP Quality Center (HPQC), BMC Remedy Action Request System

Confidential, Akron, OH

Web Developer (Graduate Assistant)

Responsibilities:

  • Created the table structures according to the data model
  • Developed PL/SQL procedures to load data into the tables (see the sketch after this list)
  • Also worked on student schedule management in the database
  • Designed a website for the H.K. Barker Center for Economic Education using the dotCMS content management system
  • Involved in maintaining and updating the website
  • Created shell scripts to automate groups of processes
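
A minimal sketch of the kind of PL/SQL load procedure described above; the schedule tables and column names are hypothetical.

    -- Copy new student schedule rows from a staging table, skipping rows already loaded.
    CREATE OR REPLACE PROCEDURE load_student_schedule IS
    BEGIN
      INSERT INTO student_schedule (student_id, course_id, term)
      SELECT s.student_id, s.course_id, s.term
      FROM   stg_student_schedule s
      WHERE  NOT EXISTS (SELECT 1
                         FROM   student_schedule t
                         WHERE  t.student_id = s.student_id
                         AND    t.course_id  = s.course_id
                         AND    t.term       = s.term);
      COMMIT;
    END load_student_schedule;
    /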

Environment: Oracle 10g, MS SQL Server 7.0, UNIX, Putty, dotCMS
