Data Engineer Resume

Houston, TX

PROFESSIONAL SUMMARY:

  • 11+ years of experience in database development, data warehousing, data integration and Big Data integration.
  • Demonstrated technical expertise in popular business intelligence tools such as Informatica, Datastage, Talend, Cognos and WebFOCUS.
  • Good experience in Big Data integration using Informatica BDM and Talend BDI.
  • Good experience in Salesforce API integration through Informatica.
  • Good experience with Hadoop and Hive.
  • Hands-on experience with the scheduling tools Autosys, Tivoli and Control-M.
  • Hands-on experience with the quality management tool Mercury Quality Center.
  • Hands-on experience with service desk tools such as HPSD and Remedy.
  • Hands-on experience with the asset management tool UAPM.
  • Worked with clients including Confidential, The Hartford, Confidential, Confidential Insurance Group, Confidential and Confidential.
  • Experience in the Insurance, Healthcare, Banking, Finance and Aviation domains.

TECHNICAL SKILLS:

Business Intelligence: Informatica PowerCenter, Informatica B2B DX/DT, Informatica Data Quality, Informatica MDM, IBM Datastage, Talend, WebFOCUS, Cognos, CA ADT

Big Data/Hadoop Stack: Hadoop, Hive, Spark, Blaze, Informatica BDE/BDM, Talend BDI, Sqoop

Cloud: AWS, Salesforce

Databases: Oracle, SQL Server, Teradata, HBase

Languages: SQL, PL/SQL, UNIX Shell Scripting, C, Java, Python

Tools: Autosys, Tivoli, PL/SQL Developer, TOAD, Remedy

OS/Environment: Windows, DOS, UNIX, LINUX

DevOps: GitHub, Jenkins

PROFESSIONAL EXPERIENCE:

Confidential, Houston, TX

Data Engineer

Environment: Informatica PowerCenter/Big Data Management 10.2.x, Hive, Spark, Blaze, Salesforce, Python, Oracle 12c, PL/SQL, UNIX, Autosys, Jenkins and GitHub.

Responsibilities:

  • Requirements, Analysis and Design - Involved in data analysis, preparation of mapping documents and procuring sign-offs on the design.
  • Salesforce data integration using Informatica PowerCenter - Extracted and loaded data from/to the Salesforce API using Informatica PowerCenter.
  • Used Informatica BDM to ingest data into the Everest platform, which runs on Hadoop.
  • Used Python for POCs on the Spark framework.
  • Migration activities - Migrated Informatica 9.6.1 to 10.2 and Oracle 11g to 12c as part of the new infrastructure migration.
  • Informatica administration - Responsible for all administration activities such as user/group creation and repository setup in lower environments.
  • Used GitHub and Jenkins for automated code builds and deployments.
  • Contributed to performance tuning and volume testing of the application.
  • Built a reusable framework including error handling, duplicate file checks, etc.
  • Involved in scheduling jobs and workflows with Autosys to automate the various inbound/outbound transactions.
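The duplicate file check in the reusable framework above can be sketched in Python; the checksum approach and function names here are illustrative, not the project's actual implementation:

```python
import hashlib
from pathlib import Path

def file_checksum(path: Path) -> str:
    """Return the MD5 checksum of a file, read in 8 KB chunks."""
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            md5.update(chunk)
    return md5.hexdigest()

def is_duplicate(path: Path, seen: set) -> bool:
    """Reject an inbound file whose content was already processed;
    otherwise register its checksum and let it through."""
    digest = file_checksum(path)
    if digest in seen:
        return True
    seen.add(digest)
    return False
```

A production framework would persist the checksum registry (for example in an audit table) rather than keep it in memory.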

Confidential, Grand Rapids, MI

Tech Lead/Sr. ETL Developer

Environment: Informatica PowerCenter 10.1.x, Informatica B2B Data Exchange, Oracle 12c, PL/SQL, UNIX and Tortoise SVN

Responsibilities:

  • Requirements, Analysis and Design - Involved in data analysis, preparation of mapping documents and procuring sign-offs on the ETL design.
  • Responsible for development, support and maintenance of the ETL (Extract, Transform and Load) processes using Informatica PowerCenter 10.1.x as well as PL/SQL procedures and packages.
  • Maintained B2B Data Exchange, including trading partner creation, endpoints and profiles.
  • Contributed to performance tuning and volume testing of the application.
  • Worked with the testing team to define a robust test plan and supported them during functional testing of the application.
  • Built a reusable framework including error handling, duplicate file checks, etc.
  • Developed UNIX shell scripts to automate ETL and various other processes.
  • Involved in scheduling jobs and workflows to automate the various inbound/outbound transactions.
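One inbound/outbound step such automation scripts typically cover is archiving processed files into dated folders. A minimal Python sketch (the directory layout, function name and the .csv pattern are assumptions for illustration):

```python
import shutil
import time
from pathlib import Path

def archive_processed(inbound: Path, archive: Path, pattern: str = "*.csv") -> list:
    """Move processed inbound files into a dated archive subfolder,
    mirroring the kind of file handling the shell scripts automated."""
    stamp = time.strftime("%Y%m%d")
    target = archive / stamp
    target.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in sorted(inbound.glob(pattern)):
        dest = target / f.name
        shutil.move(str(f), str(dest))
        moved.append(dest)
    return moved
```

In a scheduled setup, a job would call this after each successful load so reruns never pick up already-processed files.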

Confidential, Phoenix, AZ

Sr. ETL Developer

Environment: Informatica 9.6.1, Informatica Data Quality, Informatica MDM, Oracle 11g, PL/SQL, UNIX, Autosys

Responsibilities:

  • Requirements and Analysis - Understood the client's requirements and acted as the sole functional resource.
  • Design - Prepared the functional and technical documents and the test cases for the given application.
  • Participated in defining and implementing project-level standards and guidelines and ensured adherence to enterprise-level policies.
  • Developed robust ETL designs (functional and technical) for the ETL solution.
  • Led and mentored the offshore team in designing, developing and delivering the ETL tasks.
  • Worked with the testing team to define a robust test plan and supported them during functional testing of the application.
  • Contributed to performance tuning and volume testing of the application.

Confidential, Portland, OR

Sr. ETL Developer

Environment: Informatica 9.5.1, Oracle 11g, PL/SQL, UNIX, Tivoli

Responsibilities:

  • Gathered and understood the business and technical requirements and prepared the physical solution design documents.
  • Participated in scrum meetings per the Agile methodology to discuss daily progress and areas of concern.
  • Created the mappings, sessions and workflows for the respective application.
  • Performance tuning in Oracle and Informatica.
  • Reviewed work and coordinated with the offshore team members.
  • Implemented an audit framework for ETL loads.
  • Data modeling for the CSR data mart.
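The core idea of such an audit framework, reconciling source and target row counts for each load, can be sketched as follows (the class and field names are illustrative, not the project's actual audit schema):

```python
from dataclasses import dataclass

@dataclass
class LoadAudit:
    """One audit record per ETL load; names are illustrative only."""
    job_name: str
    source_count: int
    target_count: int

    @property
    def balanced(self) -> bool:
        """A load passes the audit when no rows were dropped or duplicated."""
        return self.source_count == self.target_count
```

In practice each workflow run would write such a record to an audit table, and downstream loads would be held until the counts balance.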

Confidential, Hartford, CT

Sr. ETL Developer

Environment: Informatica 9.1.1, Datastage 7.5, WebFOCUS, Oracle 10g, Teradata, PL/SQL, UNIX, Autosys

Responsibilities:

  • Migrated existing applications from Datastage to Informatica.
  • Worked closely with the Sr. Application Tech Lead and Architect to create the high-level design for new applications such as MyHR, Talent Suite, FAST, BRO, LDRPS, etc.
  • Participated in defining and implementing project-level standards and guidelines and ensured adherence to enterprise-level policies.
  • Developed robust ETL designs (functional and technical) for the ETL solution.
  • Assisted with task identification and effort estimates for ETL development per the Agile development methodology.
  • Led and mentored the offshore team in designing, developing and delivering the ETL tasks.
  • Worked with the testing team to define a robust test plan and supported them during functional testing of the application.
  • Contributed to performance tuning and volume testing of the application.

Confidential

ETL Developer

Environment: Informatica 8.1, CA ADT 2.2, UAPM R11.3.4, Oracle 9i, SQL Server 2005, UNIX and Mercury Quality Center

Responsibilities:

  • Requirements and Analysis - Understood the client's requirements and acted as the sole functional resource.
  • Design - Prepared the functional and technical documents and the test cases for the given application.
  • Used CA ADT 2.2 to migrate mappings that pointed to the Argis database and were originally developed with the CA ADT 2.1 tool.
  • Used Informatica 8.1 to develop new mappings as part of the upgrade from Argis to UAPM.
  • Created complex mappings using Aggregator, Expression, Joiner, Router, Update Strategy and Lookup transformations.
  • Used the verbose data tracing level to analyze data movement and troubleshoot the mappings.
  • Created the corresponding sessions and workflows to load the data.
  • Testing - Tested the application in the dev and QA environments.
  • Reconciled discovered assets through Hardware Reconciliation.

Confidential

ETL Developer

Environment: Informatica 7.1, Oracle 9i, Mercury Quality Center, PL/SQL, UNIX

Responsibilities:

  • Requirements and Analysis - Understood the client's requirements and acted as the sole functional resource.
  • Design - Prepared the functional and technical documents and the test cases for the given application.
  • Took the CI (Configuration Item) and WG (Workgroup) information from HPSD, created them as groups and teams, and assigned the team members to the teams.
  • Used Informatica 7.1 to develop mappings for the HPSD and AlarmPoint integration.
  • Created complex mappings using Aggregator, Expression, Joiner, Router, Update Strategy and Lookup transformations.
  • Created the corresponding sessions and workflows to load the data.
  • Applied Event Wait, Decision and Command tasks along with Email tasks for error notification.
  • Used the verbose data tracing level to analyze data movement and troubleshoot the mappings.
  • Identified performance bottlenecks in Informatica jobs and tuned them per Informatica standards.
