
Etl-hadoop Production Operations Engineer Resume


San Jose, CA

PROFESSIONAL SUMMARY:

  • Over nine years of experience in Analysis, Design, Development, and Production support of Data warehouse applications.
  • Experience working in the Finance and Retail domains.
  • Worked as part of a command center team handling ETL batch operations in a 24x7 production support environment to meet all defined SLAs with business partners around the world.
  • Extensive Data warehousing experience in production support and in designing and creating mappings using Informatica Power Center (9.x/8.x/7.x), Power Exchange, and Data Quality Developer, including Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Main areas of experience include Teradata, Oracle, Informatica, and UNIX, with strong knowledge of Data Warehousing concepts.
  • Expertise in implementing business rules by developing mappings and creating transformations (Expression, Joiner, Source Qualifier, Filter, Aggregator, Lookup, Router, Update Strategy, SQL transformation, etc.).
  • Solid experience in all phases of Data warehouse life cycle involving design, development, testing and production support of Data warehouses using Informatica.
  • Extensively worked on relational database systems such as Oracle 11g (including GoldenGate environments), MS SQL Server, Teradata, and DB2, and on source files such as flat files and XML files.
  • Worked with cross functional teams such as QA, DBA to deploy the code from development to QA, QA to UAT and UAT to Production and involved in production support to meet SLAs.
  • Highly competent in quantitative and qualitative analysis and critical thinking.
  • Self-reliant and quick to identify and understand business problems to resolve production issues.
  • Good team player; hardworking, quick to learn, and able to multitask.
  • Worked in end-to-end data warehousing projects.
  • Experienced dealing with outsourced technical resources and coordinating global development efforts.
  • Excellent communication and interpersonal skills; committed and motivated team player with good analytical skills.

TECHNICAL SKILLS:

ETL tools: Informatica Power Center 9.x/8.x, Power Exchange 8.6.1, Informatica Cloud, IDQ, Ab Initio

RDBMS: Oracle, Teradata, DB2, Oracle GoldenGate, Vertica, MySQL

Messaging services: Kafka, Horton, Scala, Stampy

Reporting Tools: BRIO, Cognos, Tableau

Languages: SQL, PL/SQL, C, C++, Java, Unix shell programming, Python, Hadoop

Scheduling Tools: Control-M, AutoSys, Airflow, UC4

PROFESSIONAL EXPERIENCE:

Confidential, San Jose, CA

ETL-Hadoop Production Operations Engineer

Environment: Informatica Power Center 9.6.1, Linux grid, Oracle GoldenGate, Control-M, Hadoop cluster

Responsibilities:

  • Involved in 24x7 operational support of mission-critical applications, merchant reporting, and other areas of Confidential infrastructure, working flexible rotational shifts.
  • Monitor, analyze, troubleshoot, and escalate any issues with business process flows moving $2 billion daily.
  • Work with and drive tier II groups (ETL Ops, Data Integration, DBAs, SAs, Network Security etc.) to diagnose complex problems and drive resolution as quickly as possible to meet SLAs.
  • Monitoring GoldenGate user extract processes.
  • Perform root cause analysis and document to prevent production issues from reoccurring.
  • Provide technical support for PP merchants and vendors to mitigate any payments or collection issues.
  • Participated in multiple critical projects (Bill Me Later integration, Disaster Recovery project).
  • Mentoring and training junior team members. Assist other teams with automation projects and troubleshooting of production issues.
  • Monitor and troubleshoot high performance server clusters designated for global merchant reporting process, interact with worldwide multifunctional teams and vendors.
  • Coordinating with customers and vendors on system upgrades and providing the exact procedures to follow.
  • Coordinated with the NOC team to troubleshoot environment issues.
  • Worked with Job Scheduling team to design and setup ETL flow on Workload Automation.
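The batch-monitoring and escalation duties above can be sketched as a simple SLA check. This is only an illustrative sketch: the job names, statuses, and SLA times are hypothetical, and a real environment would pull this state from the scheduler (e.g. Control-M) rather than hard-code it.

```python
from datetime import datetime

# Hypothetical job-status snapshot; a real check would query the
# workload-automation tool instead of hard-coding these records.
jobs = [
    {"name": "merchant_rpt_load", "status": "RUNNING", "sla": datetime(2015, 1, 1, 6, 0)},
    {"name": "settlement_extract", "status": "ENDED_OK", "sla": datetime(2015, 1, 1, 5, 0)},
    {"name": "gg_replicat_check", "status": "FAILED", "sla": datetime(2015, 1, 1, 4, 0)},
]

def jobs_to_escalate(jobs, now):
    """Return names of jobs that failed, or are still running past their SLA."""
    breached = []
    for job in jobs:
        if job["status"] == "FAILED":
            breached.append(job["name"])
        elif job["status"] == "RUNNING" and now > job["sla"]:
            breached.append(job["name"])
    return breached

print(jobs_to_escalate(jobs, now=datetime(2015, 1, 1, 6, 30)))
# ['merchant_rpt_load', 'gg_replicat_check']
```

A check like this would typically run on a schedule and page the on-call for each breached job.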

Confidential, San Ramon, CA

Sr. Informatica Developer

Platform: Informatica Power Center 9.5.1, Informatica Data Quality, Oracle, UNIX.

Responsibilities:

  • Working on building new data warehouse for customer data information.
  • Experienced working with team leads, Interfaced with business analysts and end users.
  • Interacted with Business Analyst to gather requirements and translated them into ETL technical specifications.
  • Worked with data modelers to understand financial data model and provide suggestions to the logical and physical data model.
  • Involved in data quality profiling, design, development, unit testing and deployments of ETL Informatica components.
  • Document unit test cases and provide QA support to make testers understand the business rules and code implemented in Informatica.
  • Extracted data from various sources such as XML files, flat files, and RDBMS tables, and loaded it into the target Oracle data warehouse.
  • Implemented error handling for invalid and rejected rows by loading them into error tables.
  • Involved in Performance Tuning of ETL code by addressing various issues during extraction, transformation and loading of data.
  • Involved in writing SQL queries to validate the data on the target database according to source to target mapping document.
  • Handled Production issues and monitored Informatica workflows in production.
  • Extensively worked on the batch framework used to schedule all Informatica jobs.
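The error-handling pattern above (routing invalid and rejected rows into error tables) is implemented in Informatica with Router and Update Strategy transformations; a minimal Python sketch of the same idea, using SQLite and hypothetical table and column names, might look like:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical staging, target, and error tables.
conn.executescript("""
    CREATE TABLE stg_customer (id INTEGER, email TEXT);
    CREATE TABLE dw_customer  (id INTEGER, email TEXT);
    CREATE TABLE err_customer (id INTEGER, email TEXT, err_reason TEXT);
""")
conn.executemany("INSERT INTO stg_customer VALUES (?, ?)",
                 [(1, "a@x.com"), (2, None), (3, "c@x.com")])

# Route valid rows to the target and invalid rows to the error table,
# mirroring a Router + Update Strategy flow.
rows = conn.execute("SELECT id, email FROM stg_customer").fetchall()
for row_id, email in rows:
    if email is None:
        conn.execute("INSERT INTO err_customer VALUES (?, ?, ?)",
                     (row_id, email, "missing email"))
    else:
        conn.execute("INSERT INTO dw_customer VALUES (?, ?)", (row_id, email))

print(conn.execute("SELECT COUNT(*) FROM dw_customer").fetchone()[0])   # 2
print(conn.execute("SELECT COUNT(*) FROM err_customer").fetchone()[0])  # 1
```

Keeping the rejected row and a reason code in the error table lets support teams reprocess fixed rows later instead of losing them.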

Confidential

Sr. ETL Developer

Platform: Informatica Power Center, Netezza, Oracle, MicroStrategy, Omniture Analytics, AWS

Responsibilities:

  • Involved in gathering the business requirements from the team and from the end users.
  • Involved with data model team to design and build relationships between different entities.
  • Involved in coding, testing, implementing, debugging and documenting the complex programs.
  • Involved in documenting High Level Design, Low Level Design, STMs (source-to-target mappings), unit test plans, unit test cases, and deployment documents.
  • Ensured that all standard requirements were met and was involved in performing technical analysis.
  • Debugged the sessions by utilizing the session logs.
  • Resolving issues and providing on-call support regarding production SLAs.
  • Prepared code for all modules according to the required specifications and client requirements, and prepared all required test plans.
  • Involved in all production issues and inquiries and provided efficient resolutions for them.

Confidential, Wilmington, DE

Onsite Operations lead (TCS)

Platform: Informatica Power Center, Power exchange, Data Quality, Control-M, Teradata, Oracle, HP ALM, UNIX

Responsibilities:

  • Making a note of the Impediments faced by the team and addressing the same with the help of the project management team.
  • Develop Technical specification documents, Unit Test Cases.
  • Development of Informatica mappings, transformations, and Mapplets using Source Qualifier, Filter, Lookup, Aggregator, Union, Router, Update Strategy, etc.
  • Reviewing the code prepared by other team members with the client and taking signoff on the same.
  • Getting the code deployed to the QA environment as per the Implementation plan provided by the team members with the help of DBA and Admin team.
  • Creating and deploying the Informatica label and Control-M XMLs in the test and Production environment.
  • Verifying the Informatica Mappings, Mapplets, Transformations and Workflows.
  • Verifying the Control-M XMLs and ordering the jobs to run the workflows.
  • Running SQL queries on the database to verify that data is loaded to the respective target tables with the expected transformations.
  • Making sure that all the failed test cases are linked to the respective defects.
  • Reviewing defects raised by the UAT team and following up with the team on critical defects to ensure they were fixed.
  • Provide daily status to the management team and weekly status to the Client for the project.
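Load-verification queries like those described above typically compare row counts and column aggregates between source and target per the source-to-target mapping document. A sketch of that check, with hypothetical tables and SQLite standing in for Teradata/Oracle:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical source and target tables after a load.
conn.executescript("""
    CREATE TABLE src_txn (id INTEGER, amount REAL);
    CREATE TABLE tgt_txn (id INTEGER, amount REAL);
    INSERT INTO src_txn VALUES (1, 10.0), (2, 20.5), (3, 5.0);
    INSERT INTO tgt_txn VALUES (1, 10.0), (2, 20.5), (3, 5.0);
""")

def validate_load(conn, src, tgt):
    """Compare row counts and amount totals between source and target."""
    q = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {}"
    src_cnt, src_sum = conn.execute(q.format(src)).fetchone()
    tgt_cnt, tgt_sum = conn.execute(q.format(tgt)).fetchone()
    return src_cnt == tgt_cnt and src_sum == tgt_sum

print(validate_load(conn, "src_txn", "tgt_txn"))  # True
```

Count-and-sum checks catch dropped or duplicated rows cheaply; column-level comparisons against the mapping document would catch transformation errors the aggregates miss.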

Confidential, Frankfurt, Germany

Onsite Lead/Production Support

Platform: Informatica 9.1.0, DB2, SQL Server 2008, UNIX, Cognos and BRIO for reporting

Responsibilities:

  • Interacting with clients on daily basis for status and requirements.
  • Met SLAs provided by the client for supporting their customers, and sent regular checkpoints to Management and Business users about the status of batch and report execution.
  • Worked as an operations engineer in a 24x7 (onsite/offshore) production support environment and escalated batch issues/failures to on-calls and SMEs for immediate resolution.
  • Involved in scheduling jobs using Control-M (batch processing), monitored production support jobs, and provided quick solutions on failures to meet SLAs on a daily basis.
  • Worked in production team to solve the issues while the sessions are running in Informatica.
  • Monitored the batch jobs that refresh the warehouse daily using the Control-M scheduler.
  • Documented the production issues and RCA for future understanding and reference.
  • Excellent team player with the ability to work consistently with the team toward attaining goals and targets.
  • Provided extra-hours support for critical issues and batch recovery efforts in case of database crashes, hardware failures, and other environment issues with the ETL servers.
  • Performed enhancement work and proposed innovative ideas to improve existing solutions.
  • Reduced issues regarding data integrity and billing inconsistencies.
  • Technical and Quality Lead for team.

Confidential

ETL Developer

Platform: Informatica Power Center 9.1/8.6, DB2, SQL Server 2008, Oracle 11g, UNIX.

Responsibilities:

  • Working with Informatica Power Center tools: Source Analyzer, Warehouse Designer, Mapping Designer, and Transformation Developer.
  • Created the Source and Target Definitions in Informatica Power Center Designer.
  • Created reusable transformations and Mapplets to use in multiple mappings.
  • Designed and developed Informatica Mapping for data load and data cleansing.
  • Developed several complex Mappings, Mapplets and Reusable Transformations to facilitate one time, Weekly, Monthly and Yearly Loading of Data.
  • Involved in fixing invalid Mappings, Unit and Integration Testing of Informatica Sessions and the Target Data.
  • Used Workflow Manager/Monitor for creating and monitoring workflows.
  • Extensively used mapping parameters, mapping variables and parameter files.
  • Involved in troubleshooting the loading failure cases, including database problems.
  • Created the Business scenario documents.

Confidential

ETL Developer

Platform: Informatica Power Center 8.6, TOAD, DB2, UNIX.

Responsibilities:

  • Involved in developing mappings and reusable transformations using Informatica Designer and performing testing using SQL.
  • Used various types of sources like RDBMS sources and external Sources like flat files and XML sources to load data into targets.
  • Conducted the peer review for mappings developed and tested.
  • Identified Bottlenecks in the mapping, and Involved in Performance tuning.
  • Involved in configuring the dependency of the workflows.
  • Configured various tasks in the workflows for dependencies using Command, Event Wait, Timer, and Email tasks.
  • Documented the test cases and results for future understanding and reference.
  • Involved in process of Unit testing and integration testing before deploy code.

Confidential

ETL Developer

Platform: Informatica Power Center 8.6/8.1, Oracle 10G, SQL Server 2005, DB2, UNIX.

Responsibilities:

  • Analyzed and understood the requirement specifications and prepared mapping documents.
  • Used mapping parameters and variables.
  • Identified Bottlenecks in the mapping, and Involved in Performance tuning.
  • Deployed Informatica deployment groups (static and dynamic) consisting of mappings, sessions, and workflows from development to QA and from QA to Production.
  • Monitored daily, monthly, and quarterly production loads.
