
ETL/Informatica Developer Resume


Grand Rapids, MI

SUMMARY

  • 8+ years of IT experience, with extensive ETL tool experience using Informatica PowerCenter 9.x/8.x and strong working knowledge of Oracle SQL and PL/SQL.
  • Responsible for all phases of the Software Development Life Cycle (SDLC), from requirement gathering and analysis through design, development, and production support, across industries including Insurance, Finance, Telecom, and Retail.
  • Strong work experience in data mart life-cycle development; performed ETL procedures to load data from sources such as SQL Server, Oracle, Teradata, DB2, COBOL files, XML files, and flat files into data marts and data warehouses using PowerCenter Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Proficient in integrating various data sources with multiple relational databases (Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata) and flat files into the staging area, ODS, data warehouse, and data marts.
  • Extensively worked with Informatica PowerCenter transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, and XML Source Qualifier.
  • Expertise in developing mappings, mapplets, sessions, workflows, worklets, and tasks using Informatica.
  • Experience with Teradata 14 and its utilities, such as FastExport, FastLoad, and BTEQ.
  • Good knowledge of the US health care insurance system, including FACETS at the database level.
  • Very strong experience with the Informatica PowerCenter suite, which includes Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Demonstrated experience designing and implementing Informatica Data Quality (IDQ v9.5) applications across the full development life cycle.
  • Expertise includes data analysis, data modeling, data cleansing, transformation, integration, data import, and data export using ETL tools including Informatica.
  • Developed logical data models and physical data models.
  • Provided EDI related maintenance and backup support for clients and internal staff, such as tracking of missing EDI files and analyzing file format issues.
  • Experience in all stages of ETL - requirement gathering, designing and developing various mappings, unit testing, integration testing and regression testing.
  • Responsible for the definition, development, and testing of processes/programs necessary to extract data from clients' operational databases, transform and cleanse the data, and load it into data marts.
  • HIPAA 4010 - 5010 Conversion Analysis - Involved in the documentation of HIPAA 5010 changes to EDI 837, 834, 835, 276, 277 Transactions.
  • Experience in writing UNIX shell scripts to support and automate the ETL process.
  • Expertise in creating complex Informatica mappings and reusable components such as reusable transformations, mapplets, worklets, and reusable control tasks that encapsulate shared business logic.
  • Experience in PowerCenter IDQ developing plans for analysis, standardization, matching and merging, Address Doctor, and consolidating data from different components.
  • Excellent track record working on various types of projects: OLTP, data warehousing, and business intelligence.
  • Automated critical processes using UNIX shell programming.
  • Excellent exposure to Oracle partitioning, including range partitions and list partitions.
  • Expertise in Dynamic SQL, Collections and Exception handling.
  • Built multiple SSIS packages for automation for movement of pricing data specific to the market.
  • Experience in migration of Data from Excel, Flat file, Oracle to MS SQL Server by using MS SQL Server DTS and SSIS.
  • Good experience in performance tuning of queries and in identifying and resolving application bottlenecks.
  • Experienced in Unix Shell Scripting and other UNIX utilities like sed, awk.
  • Expertise in unit testing of Informatica mappings using debugger.
  • Experience in Performance Tuning and Debugging of existing ETL processes.
  • Expertise working with PL/SQL Stored Procedures, Cursors, Functions and Packages.
  • Worked with UNIX shell scripting to enhance job performance.
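Several of the bullets above mention automating Informatica workflow runs with UNIX shell scripts and pmcmd. A minimal sketch of such a wrapper is shown below; the domain, integration service, user, folder, and workflow names are placeholders, not values from any actual engagement.

```shell
#!/bin/sh
# Sketch of a UNIX wrapper that starts an Informatica workflow with pmcmd.
# Domain, service, user, folder, and workflow names are placeholders.

build_pmcmd() {
    # $1 = repository folder, $2 = workflow name
    echo "pmcmd startworkflow -sv IntSvc_dev -d Domain_dev" \
         "-u etl_user -pv INFA_PWD -f $1 -wait $2"
}

run_workflow() {
    cmd=$(build_pmcmd "$1" "$2")
    # Word-splitting on $cmd is intentional: it expands into the command line.
    $cmd || { echo "workflow $2 failed" >&2; return 1; }
}
```

A scheduler such as Autosys or Control-M would call `run_workflow DW_LOADS wf_load_sales_mart`; `-pv` tells pmcmd to read the password from the INFA_PWD environment variable instead of embedding it in the script.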

TECHNICAL SKILLS

BI Tools: OBIEE (10G, 11G), Tableau

ETL Tools: Informatica PowerCenter 9.x/8.x, Informatica MDM Multi-Domain, SSIS

Scheduling Tools: Control-M, Autosys

Languages: C, Java, SQL, PL/SQL, Ruby

RDBMS: Oracle 11g/10g, DB2, Teradata 14/13, SQL Server

Tools/Utilities: Toad, Perl, UNIX Shell

Operating Systems: Windows, UNIX, Linux

Web Technologies: HTML, CSS, XML, JAVA SCRIPT

PROFESSIONAL EXPERIENCE

Confidential, Grand Rapids, MI

ETL/Informatica Developer

Responsibilities:

  • Involved in discussions of user and business requirements with business team.
  • Led and monitored the team, assigning tasks and reviewing development activity on status calls.
  • Developed complex SQL queries for interfaces that extract data at regular intervals to meet business requirements.
  • Created custom reports and calculated fields for HCM and reporting purposes.
  • Extensively used Teradata Utilities like Tpump, Fast-Load, MultiLoad, BTEQ and Fast-Export.
  • Implemented full cycle support of Workday HCM, Security and Reporting issues.
  • Designed and customized data models for a data warehouse supporting real-time data from multiple sources.
  • Developed Oracle PL/SQL packages, triggers, etc.
  • Designed complex SSIS packages with error handling, using data transformations such as Conditional Split, Fuzzy Lookup, Multicast, Data Conversion, Derived Column, Merge Join, Row Count, OLE DB Source, OLE DB Destination, Excel Source, and OLE DB Command.
  • Used Teradata Utilities (SQL Assistant, BTEQ, MultiLoad, and FastLoad) to maintain the database
  • Built a reusable staging area in Teradata for loading data from multiple source systems, using template tables for profiling and cleansing in IDQ.
  • Created UNIX KSH shell scripts to kick off Informatica workflow in batch mode.
  • Invoked Informatica with the "pmcmd" utility and Teradata with "BTEQ" from UNIX scripts.
  • Used Teradata SQL Assistant to work on the database. Wrote C# and VB.NET code for Script Tasks in SSIS 2008.
  • Designed web-based ASP.NET Internet applications linked to firm-wide SQL databases.
  • Implemented new BI solutions and converted legacy VB, VB.Net applications to SSIS packages.
  • Developed SSIS Packages for ETL process to load and transform the data from multiple sources into the data mart. Created and Published Tableau Dashboards into Tableau Server.
  • Writing SQL queries for data validation by using join conditions.
  • Worked on supporting all the ETL Inbounds and Outbound of TDM in production environment with SOA gateway.
  • Used Informatica tool for cleaning, enhancing and protecting the data.
  • Created XML mappings for objects whose data is uploaded with the Salesforce Data Loader tool.
  • Participated in implementing security to anonymize sensitive customer data.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, and Router transformations to populate target files efficiently.
  • Solution Architecting using Informatica products - PowerCenter, Data Quality and MDM.
  • Created the MDM data model by analyzing data across the source systems and profiling results to serve customer purposes.
  • Installation and Configuration of Oracle Data Integrator (ODI). Responsible for configuration of Master and Work Repositories on Oracle.
  • Installed, upgraded, and migrated MDM in single-node and multi-node environments.
  • Developed set analysis to provide custom functionality in the Tableau application.
  • Good exposure to interacting with RESTful web services and SaaS, PaaS, and IaaS offerings.
  • Wrote various UNIX shell scripts to schedule data-cleansing scripts and loading processes and to automate map execution.
  • Proficient with Software development methodologies like Agile Methodologies.
  • Designed and developed drill-down reports, complex dashboard charts, and cross-tab reports.
  • Converted the requirements into an ETL Technical Design document for ETL development using SQL Server.
  • Written Unit test scripts to test the developed mappings.
  • Developed mapplets and rules using expression, labeler and standardizer transformations using IDQ.
  • Deployed code from IDQ to power center.
  • Reporting and resolving EDI errors in production and testing.
  • Designed and developed an ELT process to move the data from OLTP to OLAP systems by using Oracle Data Integrator (ODI).
  • Responsible for moving data migration from various external systems to Oracle database.
  • Implemented data discovery and data mapping strategies.
  • Reviewed the existing system and provided suggestions on its performance pain points. Used Teradata BTEQ for writing scripts against the Teradata database.
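The bullets above mention driving Teradata BTEQ from UNIX scripts. A sketch of the usual here-document pattern follows; the host, credentials, and staging tables are placeholders invented for illustration.

```shell
#!/bin/sh
# Sketch: drive Teradata BTEQ from a UNIX shell wrapper via a here-document.
# Host, credentials, and table names below are placeholders.

bteq_script() {
    cat <<'EOF'
.LOGON tdprod/etl_user,placeholder_pwd;
DELETE FROM stg.customer_stage;
INSERT INTO stg.customer_stage
SELECT customer_id, customer_name, updated_ts
FROM   src.customer_extract;
.IF ERRORCODE <> 0 THEN .QUIT 1;
.LOGOFF;
.QUIT 0;
EOF
}

# Actual run (requires the Teradata client tools):
#   bteq_script | bteq > bteq_load.log 2>&1
```

The `.IF ERRORCODE` check makes the wrapper's exit status reflect SQL failures, so a scheduler can detect a bad load.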

Environment: Informatica PowerCenter 9.x, Oracle 11g/10g, Teradata 14, LINUX, Facets, IDQ 9.x, PL/SQL, TOAD

Confidential, Tempe, AZ

ETL/Informatica Developer

Responsibilities:

  • Interacted with the business to gather requirements and understand the business requirements and business rules.
  • Analyzed the data against those requirements and delivered to client expectations.
  • Prepared ETL Specifications to help develop mappings.
  • Developed and maintained ETL (Data Extract, Transformation and Loading) mappings to extract the Data from multiple source systems like Oracle and Flat Files and loaded into Oracle.
  • Participated in team meetings to analyze requirements of data load.
  • Closely Monitored Activity logs.
  • Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the requirements.
  • Worked on Informatica PowerCenter tools such as Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Performed performance tuning, monitoring, and index selection using PMON, Teradata Dashboard, Statistics Wizard, Index Wizard, and Teradata Visual Explain, which shows the flow of SQL queries as icons, to make join plans more effective and fast.
  • Streamlined the Teradata scripts and shell scripts migration process on the UNIX box using Autosys.
  • Implemented Teradata MERGE statements in order to update huge tables thereby improving the performance of the application.
  • Extensively used the Teradata utilities like BTEQ, Fastload, Multiload, TPump, DDL Commands and DML Commands (SQL).
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN and statistics collection.
  • Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
  • Created Primary Indexes (PI) for both planned access of data and even distribution of data across all available AMPs. Created appropriate Teradata NUPIs for smooth (fast and easy) access of data.
  • Developed SQL overrides in Source Qualifier/Lookup according to business requirements.
  • Designed and Developed mappings by using Lookup, Expression, Filter, Update Strategy, Aggregator, Router, transformations to implement requirement logics while coding a Mapping.
  • Opened tickets with Informatica support on MDM installation and configuration issues.
  • Analyzed Hub and process server logs for any issues with MDM and IDD.
  • Provided multiple IDD demo sessions to Business and IT Teams.
  • Designed and Developed IDQ mappings for address validation and data cleansing. Address Doctor service has been integrated with MDM.
  • Created pre-session and post-session scripts using UNIX/PL-SQL.
  • Used Apex Data loader to read/write CSV files to Salesforce Objects.
  • Tuned the Sources, Targets, Transformations and Mapping to remove bottlenecks for better performance.
  • Involved in scheduling the UNIX shell scripts that run Informatica workflows using Control-M jobs.
  • Documented the process for further maintenance and support.
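The UNIX pre-/post-session scripts mentioned above often handle file housekeeping around a session. Below is a sketch of a post-session step that stamps and archives the processed flat file so a rerun cannot reload the same extract; the paths and file names are hypothetical.

```shell
#!/bin/sh
# Sketch of a post-session script: stamp and archive the processed flat file
# so a rerun never reloads the same extract. Paths are hypothetical.

archive_file() {
    # $1 = processed source file, $2 = archive directory
    [ -f "$1" ] || { echo "missing source file: $1" >&2; return 1; }
    mkdir -p "$2"
    stamp=$(date +%Y%m%d_%H%M%S)
    mv "$1" "$2/$(basename "$1").$stamp"
}
```

A session's post-session success command would call, for example, `archive_file /data/in/extract.csv /data/archive`.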

Environment: Informatica Power Center 9.x, Oracle 10g, Delimited Flat Files, PL/SQL, Toad 10.6, UNIX, WinSCP, and Control-M.

Confidential, Indianapolis, IN

ETL/Informatica Developer

Responsibilities:

  • Worked closely with the client to understand business requirements, analyze data, and deliver to client expectations.
  • Used Informatica PowerCenter 9.1 for extraction, transformation, and loading (ETL) of data into the target systems.
  • HIPAA 4010 - 5010 Conversion Analysis - Involved in the documentation of HIPAA 5010 changes to EDI 837, 834, 835, 276, 277 Transactions.
  • Extracted data from different sources such as COBOL copybooks and flat files and loaded it into ORMS.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, Source Qualifier, and Stored Procedure transformations.
  • Developed mappings, reusable objects, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter Designer, since similar datatype conversions had to be handled multiple times and most of the lookups were similar.
  • Worked extensively with different caches such as index cache, data cache, and lookup cache (static, dynamic, persistent, and shared).
  • Provided EDI related maintenance and backup support for clients and internal staff, such as tracking of missing EDI files and analyzing file format issues.
  • Responsible for running batch jobs (daily and monthly) to create the EDI files from the existing XML.
  • Responsible for validating and analyzing the EDI 834 and 820 files. Created jobs to schedule multiple reports in Cognos connection.
  • Scripted in multiple languages on UNIX, Linux, and Windows: Batch, shell, Python, etc.
  • Installation and Configuration of Oracle Data Integrator (ODI)
  • Support in full legacy to ODI data conversion and Integration task.
  • Moved structured data from Oracle to MongoDB in JSON format using Pentaho Data Integrator.
  • Developed error handling & data quality checks in Informatica mappings.
  • Used Informatica PowerCenter Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.
  • Wrote Bash shell scripts for getting information about various Linux servers.
  • Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database, tuning it by identifying and eliminating bottlenecks for optimum performance.
  • Developed UNIX shell scripts for scheduling the sessions in Informatica.
  • Involved in scheduling the UNIX shell scripts that run Informatica workflows using Autosys.
  • Involved in performance tuning for sources, targets, mappings, and sessions.
  • Created Test Plans and Test Scripts to support the testing team.
  • Migrated mappings, sessions, and workflows from development to testing and then to Production environments.
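The Bash scripts for collecting Linux server information mentioned above typically look something like the following sketch; the exact metrics gathered would vary by engagement.

```shell
#!/bin/sh
# Sketch of a server-inventory script for collecting basic information from
# Linux hosts; output is one key=value line per metric.

server_report() {
    echo "host=$(uname -n)"
    echo "kernel=$(uname -r)"
    echo "uptime=$(uptime | sed 's/^ *//')"
    # Use% of the root filesystem, e.g. "42%"
    echo "disk_root=$(df -P / | awk 'NR==2 {print $5}')"
}
```

Run against a fleet over ssh, the key=value output is easy to collect into a single inventory file and grep later.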

Environment: Informatica Power Center 9.1, Oracle 10g, COBOL Copy book, Delimited Flat Files, SQL developer, UNIX Shell Programming.

Confidential, Omaha, NE

ETL/Informatica Developer

Responsibilities:

  • Involved from the initial stage of requirement gathering to understand the business requirements and business rules.
  • Analyzed the business requirements and suggested changes to the design accordingly.
  • Prepared High-Level Designs documents according to the Functional Specification document to help in development.
  • Maintained Technical specification documents according to the Source to Target mapping documents.
  • Prepared ETL Specifications to help in developing mappings.
  • Worked with PowerCenter Designer tools to develop mappings that extract data from XML files and load it into different flat-file formats.
  • The sources were in XML format and the targets were either database tables or flat files (.csv, .txt, .xml).
  • Implemented code using various Informatica transformations such as XML Source Qualifier, Expression, Filter, Router, Lookup, Normalizer, Aggregator, and Source Qualifier.
  • Developed mappings, reusable objects, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer, since similar datatype conversions had to be handled multiple times and most of the lookups were similar.
  • Used Informatica PowerCenter Workflow Manager to create sessions and batches to run the mappings.
  • Implemented different Tasks in workflows which included Session, Command, E-mail, Event-Wait etc.
  • Created pre-session and post session scripts using UNIX.
  • Implemented Unit, Functionality, and Performance testing on Mappings and created Test plans to support the Testing Team.
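A pre-session UNIX script of the kind mentioned above often validates the incoming flat file before the session starts. A minimal sketch follows; the file layout (header plus data rows) is an assumption for illustration, not taken from the project.

```shell
#!/bin/sh
# Sketch of a pre-session check: confirm a delimited extract exists and has a
# header plus at least one data row before the Informatica session reads it.
# The file layout is assumed for illustration.

check_extract() {
    # $1 = path to the delimited extract
    [ -s "$1" ] || { echo "empty or missing extract: $1" >&2; return 1; }
    rows=$(wc -l < "$1" | tr -d ' ')
    [ "$rows" -ge 2 ] || { echo "no data rows in $1" >&2; return 1; }
}
```

Wired in as the session's pre-session command, a non-zero return stops the workflow before an empty file silently truncates the target.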

Environment: Informatica Power Center 8.6, Oracle 10g, Delimited Flat Files, XML, Teradata 12 SQL Assistant, UNIX Shell Programming.

Confidential, Chicago, IL

ETL / Informatica Developer

Responsibilities:

  • Involved from the initial stage of requirement gathering to understand the business requirements and business rules.
  • Generated Entity-Relationship diagrams based on actual values and relationships held in source data.
  • Prepared ETL Specifications to help develop mapping.
  • Worked with Power Center Designer tools in developing mappings and Mapplets to extract and load the data from flat files, Oracle into Oracle database.
  • Implemented code using various Informatica Transformations like Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Source qualifier, Union, Lookup, Joiner, XML Source Qualifier and Stored procedure transformations.
  • Developed SQL overrides in Source Qualifier/Lookup according to business requirements.
  • Developed mappings, reusable objects, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer, since similar datatype conversions had to be handled multiple times and most of the lookups were similar.
  • Worked with different caches in optimizing the transformations, such as index cache, data cache, and lookup cache (static, dynamic, persistent, and shared).
  • Used Informatica PowerCenter Workflow Manager to create sessions and batches to run the mappings.
  • Implemented different Tasks in workflows which included Session, Command, E-mail, Event-Wait etc.
  • Involved in Performance Tuning of SQL Queries, Sources, Targets and sessions by identifying and rectifying performance bottlenecks.
  • Created pre-session and post session scripts using UNIX/PL-SQL.
  • Tuned the Sources, Targets, Transformations and Mapping to remove bottlenecks for better performance.
  • Migrated mappings, sessions, and workflows from development to testing and then to production environments.
  • Created deployment groups, migrated the code into different environments.
  • Worked closely with reporting team to generate various reports.

Environment: Informatica Power Center 8.6, Oracle 10g, Toad 9.5, Delimited Flat Files, Crystal Reports, UNIX Shell Programming, PL/SQL, and Erwin.

Confidential

Oracle Developer

Responsibilities:

  • Implemented triggers and stored procedures per the design and development requirements of the project.
  • Wrote stored procedures to refresh the materialized views.
  • Created PL/SQL stored procedures, functions, packages, cursors, ref cursors, and triggers for the system.
  • Tuned SQL queries to increase performance.
  • Created the shell scripts for automation.
  • Created the Indexes and constraints to increase the performance of the system.
  • Created the shell scripts to run the procedures and to create the reports and cleansing the data.
  • Scheduled the scripts using UNIX cron jobs.
  • Used the Oracle utility SQL*Loader to load data.
  • Worked extensively in ORACLE and PL/SQL creating objects like tables, views, synonyms, triggers, stored procedures, functions and packages.
  • Created and maintained multiple database environments for development, testing, etc.
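The SQL*Loader and cron bullets above can be illustrated with a short sketch. The control file, table, columns, schedule, and credentials below are all placeholders invented for illustration.

```shell
#!/bin/sh
# Sketch: generate a SQL*Loader control file; the sqlldr invocation and cron
# entry are shown as comments. Table, columns, and credentials are placeholders.

ctl_file=$(mktemp)
cat > "$ctl_file" <<'EOF'
LOAD DATA
INFILE 'customer.csv'
APPEND INTO TABLE customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(customer_id, customer_name, created_dt DATE "YYYY-MM-DD")
EOF

# Real invocation (requires an Oracle client):
#   sqlldr userid=app_user/app_pwd@devdb control="$ctl_file" log=customer.log
# Cron entry to run the wrapper nightly at 01:30:
#   30 1 * * * /opt/etl/load_customers.sh >> /var/log/load_customers.log 2>&1
```

APPEND keeps previously loaded rows; TRUNCATE or REPLACE would be used instead for a full-refresh table.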

Environment: Oracle 9i, UNIX, SQL, PL/SQL, SQL Developer, TOAD, Putty, WINSCP.
