
Sr ETL Developer, Informatica ETL Resume


Woonsocket, RI

SUMMARY

  • Over 9 years of experience in software development and administration, with expertise in extracting data from heterogeneous sources using ETL mappings and scripts in Informatica 10.1/9.6.1/8.x and Talend 7.3.
  • 5+ years of experience in Master Data Management.
  • Extensive knowledge of Informatica cloud data integration using IICS.
  • Extensive knowledge of Spark, MapReduce, RDBMS, Hive/Pig, Scala, and Linux/UNIX technologies.
  • Worked with different organizations to plan, develop, deploy, and maintain successful data management solutions.
  • Extensive knowledge of end-to-end data flow in Master Data Management (MDM), from sources and back to sources via MDM inbounds (LND), staging, and BO tables, using Informatica MDM 10.1/9.7.
  • Knowledge of Informatica Product Information Management (PIM), which provides an end-to-end transparent process enabling efficient data management and integration.
  • Experience working with databases such as Teradata, Oracle, SQL Server, DB2, Netezza 3.6, and MS Access, and writing efficient, complex SQL on large volumes of data.
  • Experience in data analysis, profiling, and cleansing using Informatica Developer (IDQ) tools.
  • Working experience with Informatica PowerCenter 9.6/9.1/8.6.x, IDQ/IDE, and Informatica Metadata Manager 9.6.
  • Good skills in understanding and developing business rules for standardization, cleansing, and validation of data in various formats.
  • Designed and developed data loading processes using PL/SQL and UNIX shell scripts.
  • Expert in implementing business rules using corresponding sources, targets, and transformations such as Source Qualifier, Sequence Generator, Filter, Router, Joiner, Lookup, Expression, Update Strategy, Aggregator, Data Masking, B2B, and Web Services Consumer to populate the data.
  • Proficient in IDQ development around data profiling, cleansing, parsing, standardization, validation, matching and data quality exception monitoring and handling.
  • Developed mappings/mapplets to load data into the data mart using Slowly Changing Dimension (SCD) Type 1, Type 2, and Type 3 methodologies (see the SQL sketch at the end of this summary).
  • Involved in massive data cleansing prior to data staging.
  • Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Good knowledge of database concepts such as indexing, views, schemas, and other database objects in both Oracle and DB2 databases.
  • Experience in UNIX shell scripting and file management.
  • Worked extensively on Web services for automating mappings in MDM and Power Center by passing parameters through SoapUI.
  • Experience in scheduling ETL jobs using Informatica Scheduler, DAC, Control-M, Autosys, UC4, and Tidal.
  • Experience in developing the OBIEE repository (.rpd) at all three layers (Physical, Business Model, and Presentation), time series objects, interactive dashboards with drill-down capabilities using global & local filters, security setup (groups, access/query/report privileges), and configuring Analytics metadata objects (Subject Area, Table, Column) and Web Catalog objects (dashboards, pages, folders, reports).
  • Expertise in implementing Oracle BI Apps in all phases: implementing out-of-the-box prebuilt mappings, designing ETL, metadata management using DAC, creating the Oracle-provided data warehouse, OBIEE RPD development, and OBIEE reports and dashboards.
  • Good knowledge in developing and customizing reports using OBIEE 10.1.3.4.1/10.1.3.4.0/10.1.3.3/10.1.3.2, Oracle Reports 6i/4.5/2.5, Discoverer 10g/4i/3i, and XML Publisher 5.6.3/5.6.2.
  • Well versed with using SQL and PL/SQL for BI Publisher and XML Publisher.
  • Excellent analytical, programming, written and verbal communication skills with ability to interact with individuals at all levels.
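
A minimal SQL sketch of the SCD Type 2 pattern referenced above; all table and column names (customer_dim, customer_stg) are illustrative, not from an actual engagement, and NULL-safe comparisons are omitted for brevity:

    -- Step 1: expire the current version of any customer whose tracked attributes changed.
    UPDATE customer_dim d
       SET d.effective_end_dt = SYSDATE,
           d.current_flag     = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- Step 2: insert a new current version for changed and brand-new customers
    -- (both now lack a current row after Step 1).
    INSERT INTO customer_dim
          (customer_id, address, status, effective_start_dt, effective_end_dt, current_flag)
    SELECT s.customer_id, s.address, s.status, SYSDATE, DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');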

TECHNICAL SKILLS

Programming Languages: SQL, PL/SQL, C, Java 1.3/1.4/1.5/1.6, HTML, XML

ETL Tools: Informatica 10.1/9.6.1/8.x, Informatica MDM 10.2/9.7, B2B DX/DT v8.0

BI Tools: OBIEE 11g, 10.1.3.x, Siebel Analytics 7.x.x., BI Apps 7.9.x.x.

Database: Oracle 11g /10g/9i/8i/7.x, SQL Server, Teradata, DB2

Scheduling Tools: Dseries, DAC, Control-M, Autosys, Active Batch

Data Modelling Tools: Erwin, Toad, and Microsoft Visio

Oracle Tools: SQL Developer, TOAD, SQL * Loader, Oracle Developer 2000

Scripting Languages: Java script, UNIX shell script

Operating System: Windows Vista, Windows 95/98/2000/2003/2007/XP, Linux, UNIX

PROFESSIONAL EXPERIENCE

Confidential, Woonsocket, RI

Sr ETL Developer, Informatica ETL

Responsibilities:

  • Work independently and as a team player in an Agile/Scrum environment.
  • Prototype and develop iteratively within the systems development life cycle.
  • Analyze and recommend alternative solutions for system enhancement and problem resolution.
  • Develop ETL code in PowerCenter to process CORAM data.
  • Develop ETL code handling XML sources and CSV files that contain WorkBrain PUNCH data.
  • Prepare the batch run book for CORAM-related ETL jobs promoted to PROD.
  • Provide production support for existing ETL processes.
  • Debug failures, find the root cause of job failures, and make the necessary code changes to the job scheduler (Control-M) or the ETL job (UNIX/PowerCenter).
  • Troubleshoot load-failure and data quality issues on a day-to-day basis (see the reconciliation sketch after this list).
  • Hands-on with Python/PowerShell scripts for daily use cases involving Excel and CSV files.
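
A hedged example of the kind of reconciliation SQL used when troubleshooting load failures; the table names (stg_punch_data, punch_fact) are hypothetical:

    -- Compare source and target row counts per load date to spot partial loads.
    SELECT NVL(s.load_dt, t.load_dt)             AS load_dt,
           NVL(s.src_cnt, 0)                     AS src_cnt,
           NVL(t.tgt_cnt, 0)                     AS tgt_cnt,
           NVL(s.src_cnt, 0) - NVL(t.tgt_cnt, 0) AS diff
      FROM (SELECT TRUNC(load_ts) AS load_dt, COUNT(*) AS src_cnt
              FROM stg_punch_data GROUP BY TRUNC(load_ts)) s
      FULL OUTER JOIN
           (SELECT TRUNC(load_ts) AS load_dt, COUNT(*) AS tgt_cnt
              FROM punch_fact GROUP BY TRUNC(load_ts)) t
        ON s.load_dt = t.load_dt
     WHERE NVL(s.src_cnt, 0) <> NVL(t.tgt_cnt, 0)
     ORDER BY load_dt;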

Environment: PowerCenter 10.4, Control-M 16.0, UNIX, WinSCP, Putty, Oracle 11g

Confidential, Marlborough, MA

Solution Developer II Riversand MDM, Talend ETL

Responsibilities:

  • Architect and implement applications and systems of medium to large complexity.
  • Develop processes within a multi-domain MDM Azure platform (on-prem or cloud SaaS).
  • Develop UI configs for entity types, relationships, and outbound configuration profiles within Riversand MDM.
  • Develop Riversand business rules for data governance model.
  • Design and implement complex data integration processes in Riversand MDM.
  • Develop Talend jobs to integrate Riversand JSON files from Azure blob containers to Mainframe message queues.
  • Develop scripts to analyze data in XML and JSON formats.
  • Write complex SQL and PL/SQL scripts for data analysis (see the sample query after this list).
  • Work independently and as a team player in an Agile/Scrum environment.
  • Prototype and develop iteratively within the systems development life cycle.
  • Analyze and recommend alternative solutions for system enhancement and problem resolution on projects for the cloud platform (Azure blob store).
  • Work with web services: SoapUI and REST APIs.
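
An illustrative data-analysis query of the sort described above; item_master and its columns are placeholders, not an actual client schema:

    -- Flag business keys that appear more than once in the integrated item table.
    SELECT item_number,
           COUNT(*)        AS record_cnt,
           MIN(created_dt) AS first_seen,
           MAX(created_dt) AS last_seen
      FROM item_master
     GROUP BY item_number
    HAVING COUNT(*) > 1
     ORDER BY record_cnt DESC;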

Environment: Talend 7.3, Postman 7.21.0, UNIX, Riversand MDM SaaS platform, Azure blob store, IBM Mainframe message queue.

Confidential, Orrville, OH

Senior ETL Developer-Informatica Analyst

Responsibilities:

  • Configured BO tables in MDM and coded LND-to-STG mappings.
  • Developed several ETL mappings from sources to MDM landing, including mappings that invoke Cleanse Put and Put APIs on MDM BOs.
  • Configured PowerExchange for JMS in the PowerCenter environment to read messages from message queues.
  • Developed complex mappings using PowerExchange for JMS to read messages from the queue.
  • Coded a real-time messaging mapping that reads from a JMS message queue, parses the message, and makes an MDM SIF call to load the record into the BO tables.
  • Profiled data and created rules for validation checks before loading data into MDM landing (see the profiling query after this list).
  • Worked on MDM match rule tuning using EMI and EMO tables.
  • Coded IDQ mapplets and used them as cleanse functions in MDM (LND-to-STG mappings) and in IDD dashboards.
  • Coded IDQ mapplets for cleansing addresses without using the Address Validator.
  • Coded Informatica PowerCenter mappings using PowerExchange for JMS to write data to downstream applications that consume MDM data.
  • Developed complex mappings to invoke SIF operations such as Search and Put to load data into MDM base objects.
  • Coded PowerCenter mappings that receive and send JMS messages to JMS queues (message-oriented middleware systems).
  • Created batch groups in MDM and scheduled them in Dseries.
  • Set up batch load jobs and defined their dependencies.
  • Migrated code to TEST and PROD by checking code into Subversion and PPM tools.
  • Built a POC integrating Salesforce with Oracle EBS using Informatica Intelligent Cloud Services (IICS).
  • Worked closely on integrating data into Hadoop clusters using Blaze/Hive engines in BDM.
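
A sample profiling query of the kind run before loading MDM landing tables; lnd_customer and its columns are hypothetical, and the email pattern is intentionally crude:

    -- Column-level profile of a landing table: null rate, cardinality, and a simple format check.
    SELECT COUNT(*)                                           AS total_rows,
           COUNT(*) - COUNT(party_name)                       AS null_party_name,
           COUNT(DISTINCT party_name)                         AS distinct_party_name,
           COUNT(CASE WHEN email NOT LIKE '%@%.%' THEN 1 END) AS suspect_email_cnt  -- non-null emails failing the pattern
      FROM lnd_customer;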

Environment: Informatica PowerCenter 10.1, IDQ, Informatica Data Analyst (IDA), SoapUI 5.2.1, Oracle 11g, PL/SQL, UNIX, Dseries, TortoiseSVN, Project and Portfolio Management (PPM).

Confidential, Midland, TX

ETL Informatica Developer/IDQ/IDA/TDM

Responsibilities:

  • Worked extensively on Informatica to extract data from flat files, Teradata, and Oracle, and to load the data into the target database.
  • Informatica Data Quality (IDQ 9.6.1) was the tool used for data quality measurement.
  • Involved in admin activities such as setting up application services (Analyst Service, Metadata Manager Service, Data Integration Service, Model Repository Service, etc.) used by IDQ.
  • Worked extensively to create data profiles and score cards using Informatica Data Analyst (IDA).
  • Involved in meetings to determine Data quality rules.
  • Developed data quality rules, mapplets according to the business rules.
  • Developed many ETL mappings to integrate data from various systems such as ENERTIA, WELLVIEW, IHS, ARIES, and PROCOUNT.
  • Worked on complex SQL queries per business requirements, then coded the same logic in Informatica ETL mappings.
  • Used B2B Data Transformation to parse all the EDI Files and complex XML into CSV files.
  • Built several test cases and created queries to systemically proceed with testing.
  • Worked on building mappings in such a way that scheduling tool invokes Informatica mappings using Web services.
  • Loaded data into Teradata target tables using Teradata utilities (FastLoad, MultiLoad, FastExport). Queried the target database using Teradata SQL and BTEQ for validation (see the validation queries after this list).
  • Worked on supporting all the ETL Inbounds and Outbound of TDM in production environment.
  • Built mappings to extract data and load it to ASCII4 files.
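
Representative post-load validation queries of the kind run through BTEQ against the Teradata target; well_prod_fact and its key columns are illustrative:

    -- Total rows loaded in this run.
    SELECT COUNT(*) AS loaded_rows
      FROM well_prod_fact;

    -- Duplicate business keys that should not exist after the load.
    SELECT well_id, prod_date, COUNT(*) AS dup_cnt
      FROM well_prod_fact
     GROUP BY well_id, prod_date
    HAVING COUNT(*) > 1;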

Environment: Informatica PowerCenter 9.6.1 HotFix 3, Informatica Data Analyst (IDA), SoapUI 5.2.1, Oracle 11g, PL/SQL, SQL Server 2008, Teradata, Tableau, UNIX.

Confidential, Houston, TX

Informatica ETL/MDM Developer

Responsibilities:

  • Supported the ETL inbound for heterogeneous sources using ETL CDC mappings and scripts in Informatica PowerCenter.
  • Used IDQ to profile the project source data, define or confirm the metadata definitions, cleanse and accuracy-check the project data, check for duplicate or redundant records, and provide guidance on how to proceed with ETL processes.
  • Worked with the Informatica Data Quality (IDQ) toolkit: the analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ 9.6.1.
  • Created test data subsets for the cleansing team and was part of the team creating data cleansing rules.
  • Involved in discussions with source system users to gather requirements for new inbounds.
  • Built custom cleanse functions using IDQ for Informatica Data Analyst (IDA) to take data cleansing to the next level.
  • Designed data merging and de-duplication jobs (removal of duplicate records).
  • Built custom IDQ mapplets and imported them into PowerCenter to perform data cleansing and to raise DQs (errors).
  • Developed DQ mappings in PowerCenter using mapplets designed in IDQ.
  • Worked on building match and merge rules, and on Trust and Validation rules, to arrive at survivorship among attribute values coming from multiple sources (see the survivorship sketch after this list).
  • Worked with User Exits to provide additional functionality at various levels of batch jobs in the HUB.
  • Worked on the Informatica Web Services Consumer transformation to fetch MDM batch execution data via the SIF API WSDL, passing parameters to the PowerCenter mapping through SoapUI.
  • Gained extensive knowledge on SIF APIs in communication between external systems and MDM HUB.
  • Worked on automating the Power Center mappings using Web Services through SoapUI.
  • Interacted with users to flag and obtain corrections, ensuring WELL changes were applied appropriately across systems and preserving data integrity.
  • Supported, maintained, installed, and customized Informatica PowerCenter 9.x systems and related components.
  • Handled day-to-day migration requests within Informatica across environments, and from IDQ to Informatica.
  • Tuned performance by analyzing and comparing turnaround times between SQL and Spotfire.
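
A simplified SQL sketch of de-duplication with trust-style survivorship; stg_well and the source-system ranking are illustrative (actual MDM trust rules are configured in the HUB, not in SQL):

    -- Keep one surviving record per business key, preferring the most trusted
    -- source system and, within a source, the most recent update.
    SELECT *
      FROM (SELECT c.*,
                   ROW_NUMBER() OVER (
                       PARTITION BY well_key
                       ORDER BY CASE src_system WHEN 'WELLVIEW' THEN 1
                                                WHEN 'OWELL'    THEN 2
                                                ELSE 3 END,
                                last_update_dt DESC) AS rn
              FROM stg_well c)
     WHERE rn = 1;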

Environment: Informatica PowerCenter 9.6.1 HotFix 3, Informatica Siperian MDM 10.1, Informatica Data Quality (IDQ) 9.6.1, Informatica Data Analyst (IDA), SoapUI 5.2.1, Oracle 11g, PL/SQL, SQL Server 2008, Toad, UNIX, Spotfire.

Confidential, Houston, TX

Informatica ETL/MDM Developer

Responsibilities:

  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer, with both Informatica PowerCenter and IDQ.
  • Interacted with Business Analysts and Data Analysts for analysis of business requirements.
  • Involved in documenting the business requirements and the technical requirements based on the understanding of the existing systems.
  • Applied Change Data Capture (CDC) methodology using Informatica Power Center Designer to extract data from various systems like OWELL, TLS, WELLVIEW, SAP, ARIES (source databases- SQL server, Oracle, Teradata and flat files) into a staging area in Oracle database.
  • Extensively worked with Teradata utilities like Fast Export, Fast Load, Multi Load to export and load data to/from different source systems including flat files.
  • Applied an incremental loading technique in Informatica mappings to load data from the staging area into the ODS (Operational Data Store) in the Oracle database (see the incremental-extract sketch after this list).
  • Developed mappings/mapplets to load data into the data mart in the Oracle database using Slowly Changing Dimension (SCD) Type 1, Type 2, and Type 3 methodologies.
  • Developed Mappings for Error Handling to fix any errors captured in Datamart after they are fixed in the source system.
  • Worked extensively on reusable and non-reusable transformations in simple and complex mappings. Examples of transformations used in Designer: Source Qualifier, Expression, Filter, Normalizer, Aggregator, Lookup, Update Strategy, Sequence Generator, Joiner, Router, Rank, Data Masking, and Web Services Consumer.
  • Built mappings to mask the production data using Data masking transformation which had confidential data related to agreements made by the client.
  • Converted complicated PL/SQL procedures to Informatica mappings.
  • Used Informatica Workflow Manager to create reusable and non-reusable sessions and worklets. Examples of tasks used in Workflow Manager: Session, Event Wait, Event Raise, Email, Decision, and Worklet.
  • Responsible for scheduling and launching workflows to run at specified times and recovering the failed sessions.
  • Responsible for identifying the missed records in different stages from source to target and resolving the issue.
  • Involved in Performance Tuning of Informatica ETL mappings and sessions.
  • Developed UNIX shell scripts as part of the ETL process for loading.
  • Created test data subsets for testing team where I was a part of building test cases.
  • Supported the ETL inbound for the legacy MDM solution.
  • Debugged existing MDM outbound views and changed them according to requirements.
  • Worked on the Informatica Data Quality toolset, with proficiency in IDQ development around data profiling, cleansing, parsing, standardization, validation, matching, and data quality exception monitoring and handling.
  • Built mappings that populate the MDM landing tables, processing errors as DQ violations and re-processing them.
  • Worked on adding new DQ violation checks to the existing mappings by building IDQ mapplets for data cleansing.
  • Debugged several DQ related issues and fixed them.
  • Built mappings that fetch data from MDM HUB outbound JMS message queues, where published XML messages are processed and fed back to the sources.
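
A minimal sketch of the incremental (CDC-style) extract pattern referenced above, assuming a hypothetical etl_control watermark table and source table:

    -- Pull only rows changed since the last successful run, whose timestamp
    -- is kept in an ETL control table keyed by job name.
    SELECT w.*
      FROM src_wellview w
     WHERE w.last_update_ts > (SELECT last_extract_ts
                                 FROM etl_control
                                WHERE job_name = 'WELLVIEW_TO_STG');

    -- After a successful load, advance the watermark.
    UPDATE etl_control
       SET last_extract_ts = SYSTIMESTAMP
     WHERE job_name = 'WELLVIEW_TO_STG';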

Environment: Informatica 9.5.1, Informatica Data Quality (IDQ) 9.5.1, Informatica MDM 9.7, Toad 11.6.1.6, WinSCP, Putty, Oracle 11g, SQL server, Teradata, Netezza, UNIX, SQL

Confidential, Detroit, MI

Informatica ETL/IDQ Developer

Responsibilities:

  • Teamed with business analysts to deduce the business requirements to produce effective technology solutions
  • Responsible for understanding the business requirements and Functional Specifications document and prepared the Source to Target Mapping document following Organization standards.
  • Worked closely with Business Analyst and involved in Data Analysis to identify the business rules.
  • Extensively involved in updating PL/SQL stored procedures, functions, triggers, and packages to meet business requirements based on change requests.
  • Developed reusable objects such as mapplets and worklets, combined with user-defined functions, for use across multiple mappings to pull data into MDM from heterogeneous sources such as flat files, Oracle, SQL Server, and Epic and Cerner database tables.
  • Designed various mappings and mapplets using transformations such as Key Generator, Match, Labeler, Case Converter, Standardizer, Address Validator, Parser, and Lookup.
  • Analyzed issues and provided solutions to the dev and test teams.
  • Involved in optimization to identify bottlenecks in existing jobs.
  • Developed complex SCD-2 mappings in process of building operational data store (ODS).
  • Preparing test scripts and reviewing the test results.
  • Monitored and ran the PowerCenter workflows and the MDM stage, load, and match/merge jobs using the Batch Viewer and automation processes.
  • Involved in fine-tuning SQL overrides and lookup SQL overrides for performance; executed pre- and post-session commands on the source and target databases; optimized mappings by changing logic to reduce run time (see the override sketch after this list).
  • Handling issues raised by clients in Production Support environment.
  • Performed Unit testing and Performance Tuning testing.
  • Created tasks in the Workflow Manager, and exported IDQ mappings and executed them.
  • Worked with the offshore team and supervised their development activity.
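
An illustrative pared-down lookup SQL override of the kind used in the tuning work above; member_plan and its columns are hypothetical:

    -- Select only the ports the lookup actually needs, pre-filter inactive
    -- rows, and let the database sort the result for the lookup cache build.
    SELECT member_id,
           plan_code,
           eff_dt
      FROM member_plan
     WHERE status = 'ACTIVE'
     ORDER BY member_id, plan_code, eff_dt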

Environment: Health Insurance Data, Informatica 9.1, IDQ, MDM, Oracle 10g, Netezza, Epic and Cerner, Toad, UNIX.

Confidential, Charlotte, NC

Sr. BI Apps Consultant - Internship

Responsibilities:

  • Configured Oracle Business Intelligence Applications (BI Apps), including setting up the Oracle Business Analytics Warehouse (OBAW) using the Data Warehouse Administration Console (DAC).
  • Created Financial Analytics reports which are used to analyze Trial Balance, Intercompany Balances, Journal Inquiries, Flow Analysis, Expense Analysis, Treasury Movement of Cash analysis and Payable reports.
  • Involved in developing initial and incremental ETL processes, i.e., mappings & mapplets using various transformations, and workflows with sessions/tasks.
  • Customized many complex Informatica ETL processes and Oracle PL/SQL packages, i.e., mappings & mapplets using various transformations and workflows with sessions/tasks.
  • Developed many reports/dashboards with different Analytics views (drill-down & drill-across, pivot table, graph, view selector, and column selector) with global & local filters using Oracle BI Web.
  • Built BI Publisher reports with various templates by using BI Publisher Desktop and scheduled to users based on the requirement.
  • Used DAC to schedule and run the Full and Incremental ETL loads and to schedule the Informatica jobs in Oracle Business Analytic Warehouse.
  • Configured Intelligence dashboards by including embedded content such as links/images & HTML objects.
  • Upgraded the RPD and Web Catalogs from OBIEE 10.1.3.4 to OBIEE 11.1.1.7.
  • Involved in performance tuning by configuring cache, limiting the number of initialization blocks, limiting select table types, and pushing calculations down to the database (see the sketch after this list).
  • Provided end-user training and documentation and second-line support to power users to develop dashboards, reporting metrics and reports.
  • Involved in repository configuration, troubleshooting, migration, and server administration of the DEV, QA, and PROD environments.
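
A hedged sketch of pushing an aggregation down to the database rather than computing it in the BI server; the fact table and columns are OBAW-style placeholders, not the actual warehouse schema:

    -- One grouped query feeding the dashboard: the database does the
    -- summation instead of the BI server aggregating detail rows.
    SELECT gl.period_name,
           gl.account_segment,
           SUM(gl.accounted_dr - gl.accounted_cr) AS net_activity
      FROM gl_balance_f gl   -- hypothetical OBAW-style fact table
     GROUP BY gl.period_name, gl.account_segment;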

Environment: OBIEE 11.1.1.2, OBIEE 10.1.3.4, OBIA 7.9.6, DAC 7.5, Informatica 8.6.1, Oracle Applications R12, Siebel Analytics (Financials, Order Management, HR Analytics, Procurement & Spend, Project Analytics), Linux, UNIX, SQL Developer, TOAD.
