
Resume

ETL Design and Delivery

SUMMARY:

  • Technically sophisticated professional with over 7.5 years of experience in the IT industry.
  • Experienced in Informatica and Pentaho, with good knowledge of ETL methodology for supporting data extraction, transformation, and loading processes.
  • Worked on data warehouse and data mart projects to facilitate BI reporting and build the enterprise data warehouse.
  • Experienced in analyzing data generated by business processes, defining granularity, mapping source data elements to targets, and creating indexes and aggregate tables for data warehouse design and development.
  • Involved in data modeling: defining entities, identifying related attributes, and establishing the relationships between entities.
  • Created Pentaho transformations and jobs to extract and load data based on business rules, using steps such as Salesforce Input/Output, Joins, HTTP Client, Web Services Lookup, Database Join, Database Lookup, Modified Java Script Value, File Watch, Row Normaliser, Select Values, Set/Get Variables, Run SSH Scripts, and Execute SQL Scripts.
  • Created scripts with the Teradata utilities BTEQ, MultiLoad (MLOAD), and FastLoad (FLOAD).
  • Performed Informatica Server upgrade from V9.1.0 to 9.6.3.
  • Creation and maintenance of Informatica users and privileges.
  • Designed ETL processes and implemented data movement, error capturing and reporting, initial and delta loads, and Change Data Capture (CDC) methodology (a minimal SQL sketch of the delta-load pattern follows this list).
  • Extensive experience designing and developing complex Informatica mappings with varied transformation logic such as Unconnected and Connected Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
  • Proficient in data extraction, transformation, and mapping from flat-file and relational sources. Well versed in integrating data sources with multiple relational databases such as Oracle, Teradata, and SQL Server, as well as Salesforce.
  • Ability to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities, flexible work schedules, and good communication skills.
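
As an illustration of the delta-load / CDC pattern mentioned above, a minimal Oracle-style SQL sketch; the table, column, and job names (stg_customer, dw_customer, etl_control, CUST_LOAD) are hypothetical:

    -- Hypothetical delta load: pick up only rows changed since the last recorded run,
    -- then upsert them into the warehouse target.
    MERGE INTO dw_customer tgt
    USING (
        SELECT customer_id, customer_name, updated_at
        FROM   stg_customer
        WHERE  updated_at > (SELECT last_run_ts
                             FROM   etl_control
                             WHERE  job_name = 'CUST_LOAD')
    ) src
    ON (tgt.customer_id = src.customer_id)
    WHEN MATCHED THEN UPDATE
        SET tgt.customer_name = src.customer_name,
            tgt.updated_at    = src.updated_at
    WHEN NOT MATCHED THEN INSERT
        (customer_id, customer_name, updated_at)
        VALUES (src.customer_id, src.customer_name, src.updated_at);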

TECHNICAL SKILLS:

ETL Tools: Informatica 8.6.1, 9.6.3; Pentaho 4.2, 6.1

Job Monitoring Tools: UC4, AutoSys, Informatica Monitor.

Service Support Management Tools: BMC Remedy, HP ALM

Databases & Query Languages: SQL, SOQL, Oracle

PROFESSIONAL EXPERIENCE:

ETL Design and Delivery

Confidential

Environment: Windows XP, Pentaho 4.2.1, Pentaho 6.1 EE, Oracle, UNIX

Responsibilities:

  • Requirement gathering, ETL design, and development of jobs and transformations using Pentaho.
  • Designed jobs to extract data from source systems such as Salesforce, web services, and Azure cloud storage, and to integrate it into the target data store based on business and transformation logic.
  • Designed and developed jobs using steps such as Salesforce Input/Output, Joins, HTTP Client, Web Services Lookup, Database Join, Database Lookup, Modified Java Script Value, File Watch, Row Normaliser, Select Values, Set/Get Variables, Run SSH Scripts, and Execute SQL Scripts, and used CoSort to pull the incremental records.
  • Developed FastExport, FastLoad, and BTEQ scripts to extract and load data from Teradata sources (a minimal BTEQ sketch follows this list).
  • Upgraded Pentaho from 4.2 Enterprise Edition to 6.1 Community Edition.
  • Migrated jobs to the upgraded version and executed them there, ensuring a smooth transition and delivery.
  • Performed code and process reviews to make sure all standards were followed during development.
  • Interacted with various teams for smooth end-to-end (E2E) testing and UAT, and handed over code for deployment to production.
  • As delivery lead, coordinated with and helped peer team members resolve technical problems as well as business logic scenarios.
  • Coordinated with the offshore team to make sure deliverables were completed per the agreed timeline and SLA.
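
For illustration, a minimal BTEQ export sketch of the kind described above; the TDPID, credentials, table, and file paths (tdprod, etl_user, sales.orders, /data/out/orders_delta.txt) are hypothetical:

    .LOGON tdprod/etl_user,password;
    .EXPORT REPORT FILE = /data/out/orders_delta.txt;
    -- Pull only the previous day's changes (incremental extract)
    SELECT order_id, order_amt, updated_ts
    FROM   sales.orders
    WHERE  updated_ts >= CURRENT_DATE - 1;
    .EXPORT RESET;
    .LOGOFF;
    .QUIT;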

ETL Support Lead

Confidential

Environment: Windows XP, Informatica 9.1.0 & 9.6.1, Oracle, UNIX, Infa Admin Console

Responsibilities:

  • Ensured there was no deviation from the deliverables as promised.
  • Upgraded Informatica from 9.1.0 to 9.6.1.
  • Handled month-end (MEC), quarter-end (QEC), and year-end (YEC) close activities.
  • Led and mentored a team of 5.
  • Ensured SLAs were not breached, based on the priority of logged incidents.
  • Cleansed source data, extracted and transformed data per business rules, and built reusable components such as mapplets, reusable transformations, and sessions.
  • Performed performance tuning of mappings, transformations, and workflow sessions to optimize session performance.
  • Worked on Change Requests (CRs) for ETLs.
  • Prepared test cases and performed debugging.

ETL Support Developer

Confidential

Environment: Windows XP, Informatica 9.0.1, Oracle, UNIX, Infa Admin Console

Responsibilities:

  • Interacted with clients to understand and address their problems.
  • Ensured SLAs were not breached, based on the priority of logged incidents.
  • Worked on Change Requests (CRs) for ETLs.
  • Performed unit testing, integration testing, and performance testing of the ETLs.
  • Involved in the UAT phase and production deployment.
  • Part of the disaster recovery (DR) activity team.
  • Created Informatica users and granted privileges.
  • Troubleshot production load failures and drove them to completion.
  • Prepared unit test cases (UT) and unit tested the developed ETL mappings.
  • Handled escalated incidents.
  • Performed month-end order finalization.
  • Involved in the MEC and QEC processes.

ETL Developer

Confidential

Environment: Windows XP, Informatica 8.6.1, DB2, UNIX, Autosys

Responsibilities:

  • Creating mappings, sessions, and workflows.
  • Working on Change Requests (CRs) for ETLs.
  • Unit testing, integration testing, and performance testing of the ETLs.
  • Creating AutoSys JIL (Job Information Language) definitions for automating the ETLs (a sample JIL follows this list).
  • Preparing Unit Test Plans (UTP), run books, and AutoSys design documents.
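
A minimal JIL sketch of a command job of the kind described above; the job, machine, script, and owner names are hypothetical:

    /* Hypothetical command job: run the ETL wrapper script daily at 02:00 */
    insert_job: etl_orders_load   job_type: c
    command: /app/etl/scripts/run_orders_load.sh
    machine: etlprodbox01
    owner: etluser
    start_times: "02:00"
    days_of_week: all
    std_out_file: /app/etl/logs/etl_orders_load.out
    std_err_file: /app/etl/logs/etl_orders_load.err
    alarm_if_fail: 1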

ETL Developer

Confidential

Environment: Informatica 8.6, Oracle, MS Excel

Responsibilities:

  • Developing design documents for the ETL transformations based on the data dictionary.
  • Developing Informatica ETL packages to move data from source to staging, and from staging to the enterprise data warehouse (EDW).
  • Preparing unit test cases (UT) and unit testing the developed ETL mappings.
  • Working on performance tuning, optimization, data integrity, and statistics.
  • Analyzing data and moving it from source to staging to the EDW.
  • Self-reviewing code against coding standards and naming conventions.
