
Senior ETL/Informatica Developer/Administrator Resume

Austin, Texas

SUMMARY:

  • 9+ years of experience in Information Technology with a strong background in database development, data warehousing, and ETL processes using Informatica PowerCenter 9.x/8.x and Informatica IDQ 9.x, including Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor.
  • Experience working with PowerCenter Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Designed ETL flows to load data into Hive tables.
  • Involved in ETL design and development in SAP BODS.
  • Proficient in developing SQL against relational databases such as Oracle, Teradata, and SQL Server, with additional exposure to NoSQL stores such as MongoDB.
  • Knowledge of full life cycle development of data warehouses.
  • Strong understanding of OLAP and OLTP concepts.
  • Able to fully understand business rules from high-level specification documents and implement the corresponding data transformation methodologies.
  • Solid experience combining Informatica and Teradata in an enterprise data warehouse environment, including heavy use of Teradata utilities such as TPT, FastLoad, MultiLoad, and BTEQ scripts.
  • Good knowledge of applying rules and policies using the ILM (Information Lifecycle Management) workbench for the Data Masking transformation and loading confidential data into targets.
  • Developed mappings in Informatica to load data from multiple sources into the data warehouse, using transformations such as Source Qualifier, Java, Expression, Lookup, Aggregator, Update Strategy, and Joiner.
  • Experienced in using the Informatica Test Data Manager (TDM) tool to create masking policies.
  • Used Informatica IDQ to complete initial data profiling and to match and remove duplicate data.
  • Good knowledge of Informatica tools such as PowerCenter, MDM, IDQ, and ILM.
  • Built logical data objects (LDOs) and developed various mappings, mapplets, and rules in Informatica Data Quality (IDQ) to profile, validate, and cleanse data according to requirements.
  • Experience resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance-tuning mappings and sessions.
  • Extensive experience writing UNIX shell scripts and automating ETL processes with them (a pmcmd-based example is sketched after this list).
  • Expertise in implementing performance-tuning techniques at both the ETL and database levels.
  • Experience using job-scheduling tools such as AutoSys, Control-M, and Maestro.
  • Experience working in both Waterfall and Agile methodologies; good interpersonal skills and a strong ability to engage with end users, clients, and colleagues. Responsible for cleansing source data using LTRIM and RTRIM operations in the Expression transformation.
  • Expertise in RDBMS concepts, with hands-on exposure to developing relational database environments using SQL, PL/SQL, cursors, stored procedures, functions, and triggers.
  • Experience working with Perl scripts to handle data arriving in flat files.
  • Strong grasp of relational database design concepts.
  • Expertise in the healthcare domain, including Medicare, Medicaid, and insurance compliance under HIPAA regulations and requirements.
  • Expertise in Change Data Capture (CDC) using Informatica PowerExchange.
  • Experience integrating various data sources such as flat files, COBOL sources, XML, Salesforce, Oracle, Teradata, SQL Server, network shares, and web services.
  • Performed data validation by Unit testing, integration testing and System Testing.
  • Experience in Test Data Management (TDM) via ILM. 
  • Expertise in data-centric testing.
  • Hands-on middleware setup for integration with SAP ECC, including replication of customizing and master data.
  • Knowledge of Informatica Data Explorer (IDE), Data Exchange Hub (DEH), Data Integration Hub (DIH), and the IDQ workbench for data profiling.
  • Extensive experience managing teams, coordinating onshore/offshore work, analyzing requirements, reviewing code, and implementing standards.
  • Good knowledge of data modeling techniques such as dimensional/star schema, snowflake modeling, and slowly changing dimensions using Erwin.
  • Flexible, enthusiastic, and project-oriented team player with solid communication and leadership skills, able to develop creative solutions for challenging client needs.
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business people, and developers across multiple disciplines.
  • Able to work independently and to collaborate proactively and cross-functionally within a team.
  • Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.
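
As an illustration of the shell-based ETL automation and scheduling mentioned above, here is a minimal sketch of a wrapper script that starts a PowerCenter workflow with pmcmd and logs the result; the service, domain, folder, workflow, and log-file names are placeholders invented for the example, and credentials are assumed to be exported by the scheduler.

    #!/bin/sh
    # Hypothetical wrapper: start a PowerCenter workflow and log the outcome.
    # INFA_USER/INFA_PWD are assumed to be set by the scheduler (e.g. AutoSys).
    INTSVC="INT_SVC_DEV"        # placeholder Integration Service name
    DOMAIN="Domain_Dev"         # placeholder domain name
    FOLDER="SALES_DW"           # placeholder repository folder
    WORKFLOW="wf_load_sales"    # placeholder workflow name
    LOG=/var/log/etl/${WORKFLOW}.log

    pmcmd startworkflow -sv "$INTSVC" -d "$DOMAIN" \
        -u "$INFA_USER" -p "$INFA_PWD" \
        -f "$FOLDER" -wait "$WORKFLOW"
    RC=$?

    if [ $RC -ne 0 ]; then
        echo "`date`: $WORKFLOW failed with return code $RC" >> "$LOG"
        exit $RC
    fi
    echo "`date`: $WORKFLOW completed successfully" >> "$LOG"

A scheduler such as AutoSys or Control-M would invoke this script at the required frequency and act on its exit status.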

TECHNICAL SKILLS:

Tools: Informatica PowerCenter 9.x/8.x, PowerExchange 9.x/8.x, Informatica IDQ 9.x, Informatica MDM, Informatica MDM Hub Console, Informatica PowerCenter Data Replication Console, Informatica ILM, Talend.

Database: Oracle 11g/10g/9i, MS SQL Server 2008/2012, MS Access, DB2, Teradata 14/13/12/V2R6/V2R5, MongoDB, Sybase & Greenplum 4.0/4.1.

Reporting Tools: Business Objects XI R2, SAP BI 7.0, OBIEE 11g/10g, Cognos & Tableau.

Languages: C, C++, SQL, PL/SQL, HTML, Java, UNIX shell scripting & Python scripting.

Other Tools: Toad, SharePoint, SCM, PuTTY, Git, MATT, AutoSys, ESP & Control-M.

Operating Systems: Linux, UNIX, Sun Solaris, Windows 7/XP/2000/98.

PROFESSIONAL EXPERIENCE:

Confidential, Austin, Texas

Senior ETL/ Informatica Developer/Administrator

Responsibilities:
  • Worked with ETL Architects and Senior developers for understanding the requirements.
  • Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
  • Provided subject matter expertise in Informatica 9.x: PowerCenter, PowerExchange, Cloud Data Integration, Informatica Data Quality (IDQ), Information Lifecycle Management (ILM; TDM 9.x, DA 6.x), Informatica Proactive Monitoring for PowerCenter Operations (2.x), and JAWS (4.x/5.x).
  • Imported source/target tables from the respective SAP R/3 and BW systems and created reusable transformations.
  • Worked on creating Unit testing documents.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Extracted data from SAP ECC 6.0 and SAP R/3 using SAP BCI extractors.
  • Applied slowly changing dimensions (Types 1 and 2) effectively to handle delta loads.
  • Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner, and Stored Procedure transformations.
  • Testing and validation of the developed Informatica mappings. 
  • Performed data-centric testing and validation using checksum and domain comparisons.
  • Worked on unit testing of the data loaded into the target.
  • Developed, tested, and staged components in SAP BODS using SQL, Query, and Target stages.
  • Developed group, tabular, and form reports, menus, and object libraries using Oracle Reports Developer.
  • Generated various statistical reports on the migrated data using Oracle Reports Builder for management review.
  • Analyzed source data coming from the Oracle ERP system to create the source-to-target data mapping.
  • Optimized SQL and HQL queries for better performance.
  • Defined business rules in Informatica Data Quality (IDQ) to evaluate data quality, created cleanse processes to monitor compliance with standards, identified data quality gaps, and assisted in resolving data quality issues.
  • Gained exposure to Informatica Cloud Services.
  • Created high-level designs (HLDs) for different requirements.
  • Used Informatica Data Quality (IDQ) for data quality, integration and profiling.
  • Used Informatica Power Exchange for loading/retrieving data from mainframe system.
  • Worked with JIRA Agile for Agile planning, management, and integration, giving each user the flexibility to plan sprints, changes, and releases.
  • Supported integration solutions across applications such as SAP ECC, MQ Series, mainframes, databases, and web technologies.
  • Used Apache Hive to work with data stored in the various databases and file systems that integrate with Hadoop.
  • Worked with data scientists to migrate traditional SAS code to Hive HQL running on the Hadoop platform with higher efficiency and in less time.
  • Used JIRA to create and track issues and to update stories and change requests.
  • Used IDQ and Informatica PowerCenter to cleanse data and load it into the target database.
  • Identified and fixed bottlenecks and tuned mappings and sessions to improve performance, tuning both the ETL processes and the databases.
  • Used Informatica PowerCenter 9.1 to modify the existing ETL mappings in each environment.
  • Performed data migration from a legacy system into a new system on the FISERV platform.
  • Created logical data objects (LDOs), customized data objects (CDOs), and physical data objects in IDQ.
  • Deployed mappings and mapplets from IDQ to PowerCenter for scheduling.
  • Provided various infacmd commands to automate and schedule IDQ jobs (a sketch follows this list).
  • Involved in creation of Data Warehouse database (Physical Model, Logical Model) using Erwin data modeling tool.
  • Involved in Agile development, project planning and release planning activities. 
  • Actively involved in production support and transferred knowledge to other team members.
  • Supported all ETL inbound and outbound TDM feeds in the production environment.
  • Developed shell scripts and PL/SQL procedures to create and drop tables and indexes for performance as part of pre- and post-session management.
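
A minimal sketch of the infacmd-style automation mentioned above, assuming an IDQ mapping deployed to a Data Integration Service as part of an application; the domain, service, application, and mapping names are placeholders invented for the example.

    #!/bin/sh
    # Hypothetical sketch: run a deployed IDQ mapping via infacmd.
    # All names are placeholders; credentials come from the environment.
    INFA_HOME=/opt/informatica/9.6.2

    "$INFA_HOME"/isp/bin/infacmd.sh ms RunMapping \
        -dn Domain_Dev -sn DIS_Dev \
        -un "$INFA_USER" -pd "$INFA_PWD" \
        -a app_DataQuality -m m_address_cleanse

    # The scheduler (AutoSys/Control-M) invokes this script at the required
    # frequency; the exit status tells the scheduler whether the run succeeded.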

Environment: Informatica PowerCenter 10/9.6.2, Informatica IDQ 9.6.2, SAP R/3, SAP ECC 6.0, EBS, JIRA, Oracle Forms/Reports, Oracle EBS 11.5.10, TOAD 9.0, Oracle 11g, UNIX, data marts, Datacentric, TDM, FTP, MS Excel, UNIX shell scripting, WinSCP.

Confidential, NJ

Senior Informatica Developer

Responsibilities:
  • Collaborated with Manager, Data Architects and Data Modelers to understand the business process and functional requirements.
  • Worked with the Informatica Data Quality 9.6 (IDQ) toolkit, covering the analysis, data cleansing, data matching, data conversion, exception handling, reporting, and monitoring capabilities of IDQ 9.6.
  • Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
  • Used Informatica Data Quality (IDQ) for data quality, integration and profiling.
  • Optimized SQL queries for better performance.
  • Profiled the data using Informatica Data Explorer (IDE) and performed Proof of Concept for Informatica Data Quality (IDQ).
  • Deployed mappings and mapplets from IDQ to PowerCenter for scheduling.
  • Identified and fixed bottlenecks and tuned mappings and sessions to improve performance.
  • Created logical data objects (LDOs), customized data objects (CDOs), and physical data objects in IDQ.
  • Tuned both the ETL processes and the databases.
  • Developed ETL jobs to load data into Hive tables and HDFS files.
  • Developed various mappings, mapplets, workflows, and transformations for flat files and XML.
  • Identified and eliminated duplicates in datasets through IDQ components (a SQL equivalent of the logic is sketched after this list).
  • Used transformations such as Lookup, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator, and Update Strategy extensively.
  • Created and configured workflows, worklets, and sessions to move data into target Netezza warehouse tables using Informatica Workflow Manager.
  • Utilized the Informatica data quality management suite (IDQ and IDE) to identify and merge customers and addresses.
  • Cleansed, standardized, labeled, and fixed data gaps in IDQ, checking against reference tables to resolve major business issues.
  • Used DVO to validate the data moving from Source to Target. 
  • Exposed IDQ mappings and mapplets as web services.
  • Designed, configured, tested, reviewed, and optimized the Informatica MDM and Informatica IDD applications.
  • Created draft technical design documents based on the functional design documents. 
  • Defined trust and validation rules and set up match/merge rule sets to get the right master records.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Used Informatica IDQ 9.6 to complete initial data profiling and to match and remove duplicate data.
  • Worked on Designer tools like Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Workflow Designer.
  • Created various rules in IDQ to satisfy completeness, conformity, integrity, and timeliness requirements.
  • Tuned Informatica PowerCenter session performance for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
  • Involved in production support and maintenance.
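
To illustrate the duplicate-elimination logic mentioned above (implemented in IDQ on the project), here is a rough SQL equivalent run from a shell script against Oracle; the connection variables and the table and column names are invented for the example.

    #!/bin/sh
    # Hypothetical sketch: SQL equivalent of IDQ duplicate removal, keeping
    # the most recent record per business key. All names are invented.
    sqlplus -s "$DB_USER/$DB_PWD@$DB_TNS" << 'EOF'
    DELETE FROM stg_customer
    WHERE rowid IN (
        SELECT rid FROM (
            SELECT rowid AS rid,
                   ROW_NUMBER() OVER (
                       PARTITION BY cust_ssn, cust_name   -- business key
                       ORDER BY last_updt_dt DESC         -- keep the newest
                   ) AS rn
            FROM stg_customer
        )
        WHERE rn > 1
    );
    COMMIT;
    EXIT;
    EOF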

Environment: Informatica PowerCenter 9.6.2, SAP BODS, Informatica IDQ 9.6, Informatica IDE, Informatica BDE, SAP ECC 6.0, Oracle 11g, UNIX, SQL Developer, FTP, Toad.

Confidential, Los Angeles, CA

ETL/ Informatica Developer

Responsibilities:
  • Designed ETL processes to extract data from COBOL files and update the EDW with patient-related clinical information, including a one-time history load and subsequent daily loads.
  • Extracted data from various heterogeneous sources such as Oracle, SQL Server, flat files, and COBOL files.
  • Created an SSIS package to download data from the respective source and push it into the loan data hub.
  • Extracted data from a wide variety of heterogeneous sources such as flat files, COBOL files, XML files, and relational databases (Oracle, SQL Server, PostgreSQL, and Netezza) and loaded it into the data warehouse.
  • Worked through the complete software development life cycle of the application.
  • Involved in monitoring and maintaining UNIX server performance.
  • Ran the same shell scripts across the different environments.
  • Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
  • Worked on PowerExchange to import source data into PowerCenter.
  • Created mappings to read COBOL source files and write them in ASCII file format, with the target file matching the source structure.
  • Created mapping documents with business rules using MATT Tool.
  • Generated complex Transact-SQL queries, subqueries, correlated subqueries, dynamic SQL queries, etc.
  • Performed Data Analysis, Data Validation and Data verification using Informatica DVO (Data Validation Option) from raw data to user acceptance.
  • Responsible for analysis, design, development, testing, and implementation of B2B PowerCenter mappings and DT services.
  • Developed and maintained database objects in Oracle Exadata and Teradata databases, writing SQL queries, DDL, and DML.
  • Worked on SSIS packages and DTS Import/Export to transfer data from databases (Oracle and text-format data) to SQL Server.
  • Developed jobs to migrate data loads from SAP BODS to Informatica 9.x.
  • Successfully migrated a database from Oracle Exadata to Teradata.
  • Implemented error-handling and recovery strategies in Informatica ETL environments and in Oracle Exadata and Teradata.
  • Created sprint plans for developer responsibilities and scheduled them in VersionOne.
  • Fixed defects in different environments for platform code.
  • Developed and deployed SSIS packages for ETL from OLTP and various sources to staging, and from staging to the data warehouse, using Lookup, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, Slowly Changing Dimension, and more; performed ETL mappings using MS SQL Server Integration Services.
  • Extracted and reviewed data from heterogeneous sources from OLTP to OLAP using MS SQL Server Integration Services (SSIS).
  • Worked with Session logs and Workflow logs for Error handling and troubleshooting in Dev environment.
  • Worked with Git Bash to check code into the different environments using Git commands (a sketch of the flow follows this list).
  • Used Spash to verify checked-in code in different environments.
  • Performed performance tuning and query tuning of stored procedures and SQL queries using SQL Profiler and the Index Tuning Wizard in SSIS.
  • Worked with different methods of logging in SSIS.
  • Worked on unit testing of the data loaded into the target.
  • Optimized SQL queries for better performance.
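
A minimal sketch of the Git Bash check-in flow referenced above; the branch name, file path, and commit message are placeholders invented for the example.

    #!/bin/bash
    # Hypothetical check-in sequence run from Git Bash; names are placeholders.
    git checkout -b feature/etl-defect-1234    # work on an isolated branch
    git add mappings/m_loan_data_hub.xml       # stage the changed ETL export
    git commit -m "Fix lookup override in loan data hub mapping"
    git push origin feature/etl-defect-1234    # push for review and merge into
                                               # the target environment branch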

Environment: Informatica MDM, SIF, Informatica PowerCenter 9.6.1/9.6.0, Oracle 11g, PowerExchange 9.6, MS SQL Server 2008, SSMS, SSIS, DB2, UNIX, MATT, data marts, Spash, VersionOne, UNIX shell scripting, data modeling, Git.

Confidential, Dallas, TX

ETL/ Informatica Developer

Responsibilities:
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner, and Stored Procedure transformations.
  • Developed various mappings, mapplets, and transformations for data marts and the data warehouse.
  • Implemented the extraction, transformation, and loading (ETL) process by pulling large volumes of data from various data sources using SSIS.
  • Created complex ETL packages using SSIS to extract data from staging tables into partitioned tables with incremental loads.
  • Built simple to complex ETL jobs using BODS transforms such as Table Comparison, Query, Case, Pivot, Reverse Pivot, and Map Operation, and functions such as lookup.
  • Involved in creation of Data Warehouse database (Physical Model, Logical Model) using Erwin data modeling tool.
  • Worked on implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
  • Loaded data from several flat-file sources into Teradata using Teradata MLOAD and FLOAD.
  • Converted SSIS code into Informatica mappings, sessions and workflows.
  • Set up batches and sessions to schedule the loads at the required frequency using PowerCenter Workflow Manager.
  • Used Teradata Manager, Index Wizard, and PMON utilities to improve performance.
  • Extensively worked on Autosys to schedule the jobs for loading data.
  • Worked with MultiLoad and BTEQ; created and modified databases, performed capacity planning, allocated space, and granted rights for all objects within databases.
  • Worked on PowerExchange for change data capture (CDC).
  • Worked on the Teradata RDBMS using the FastLoad, MultiLoad, TPump, and FastExport utilities, Teradata SQL, and BTEQ.
  • Developed FastLoad jobs to load data from various data sources and legacy systems into Teradata staging (a sketch follows this list).
  • Created and modified MultiLoad jobs for Informatica using UNIX, loading data into the IDW.
  • Designed and Developed Oracle PL/SQL and UNIX Shell Scripts, Data Import/Export.
  • Involved in Informatica data masking using Informatica ILM tools and data subset data mapping.
  • Performed conversion maintenance on the existing legacy system.
  • Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad, and Informatica.
  • Wrote Teradata Macros and used various Teradata analytic functions.
  • Identified and fixed bottlenecks and tuned mappings and sessions to improve performance, tuning both the ETL processes and the databases.
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
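
As a sketch of the FastLoad staging jobs described above, the script below loads a pipe-delimited flat file into a Teradata staging table; the TDPID, logon, file path, and all table and column names are placeholders invented for the example.

    #!/bin/sh
    # Hypothetical FastLoad job: pipe-delimited flat file -> Teradata staging.
    # tdprod, the logon, and all object names are placeholders.
    fastload << 'EOF'
    LOGON tdprod/etl_user,etl_password;
    DATABASE stg_db;
    SET RECORD VARTEXT "|";
    BEGIN LOADING stg_db.customer_stg
        ERRORFILES stg_db.customer_err1, stg_db.customer_err2;
    DEFINE cust_id   (VARCHAR(10)),
           cust_name (VARCHAR(60)),
           open_dt   (VARCHAR(10))
    FILE = /data/inbound/customer.dat;
    INSERT INTO stg_db.customer_stg (cust_id, cust_name, open_dt)
    VALUES (:cust_id, :cust_name, :open_dt);
    END LOADING;
    LOGOFF;
    EOF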

Environment: Informatica PowerCenter 9.5.1/9.1, Oracle 11g, PowerExchange 9.1, SSIS, SQL Server, Teradata 13.10, DB2, data marts, Erwin Data Modeler 4.1, SAP BODS, UNIX shell scripting, data modeling, PL/SQL, Tableau, AutoSys & UNIX (Sun Solaris 5.8/AIX).

Confidential

ETL/ Informatica Developer

Responsibilities:
  • Responsible for design and development of Sales Data Warehouse.
  • Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
  • Extracted data from heterogeneous sources such as Oracle and SQL Server.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Conducted a series of discussions with team members to convert business rules into Informatica PowerCenter mappings.
  • Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
  • Used Informatica B2B Data Transformation to read unstructured and semi-structured data and load it to the target.
  • Performed the development and testing of extraction and transformation mappings using DT Studio.
  • Tuned Informatica PowerCenter session performance for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
  • Created mapplets and used them in different mappings.
  • Involved in error handling, debugging, and troubleshooting sessions using session logs, the Debugger, and Workflow Monitor.
  • Monitored data quality, generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
  • Worked with SAP and Oracle sources to process the data. 
  • Worked on SAP data migration for Human Resources and Finance, converting various objects covering organizational structure, addresses, time, basic pay, bank details, recurring payments, tax assignment, insurance plans, payroll, etc., to generate reports from the SAP BI system.
  • Created reports such as reports by customer, reports by period, demographic reports, and comparative reports, working with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, and parsing of objects and hierarchies.
  • Created reports using Business Objects functionality such as queries, slice and dice, drill down, functions, and formulas.
  • Created reports using standard company templates; a few were created based on department requirements.
  • Made extensive use of Business Objects functionality in formatting reports.
  • Created user prompts, conditions, and filters to improve report generation; also used alerts to improve report readability.
  • Worked with pre- and post-session SQL commands to drop and recreate data warehouse indexes using the Source Qualifier transformation in Informatica PowerCenter (a sketch follows this list).
  • Created UNIX shell scripts to automate sessions and cleanse the source data.
  • Implemented pipeline-partitioning concepts such as round-robin, key-range, and pass-through techniques in mapping transformations.
  • Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
  • Worked on data masking activities for cloning the GDW for several buyers.
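
A rough sketch of the pre-/post-session index handling mentioned above: the pre-session SQL drops an index so the bulk load avoids index maintenance, and the post-session SQL rebuilds it for downstream queries. The connection variables and the index and table names are invented for the example; in practice the statements were configured as pre- and post-session SQL on the Informatica session.

    #!/bin/sh
    # Hypothetical pre/post-session index maintenance around a bulk load.
    # All names are invented placeholders.

    # Pre-session: drop the index so the load is not slowed by index upkeep.
    sqlplus -s "$DW_USER/$DW_PWD@$DW_TNS" << 'EOF'
    DROP INDEX idx_sales_fact_dt;
    EXIT;
    EOF

    # ... the Informatica session loads sales_fact here (e.g. via pmcmd) ...

    # Post-session: recreate the index for downstream query performance.
    sqlplus -s "$DW_USER/$DW_PWD@$DW_TNS" << 'EOF'
    CREATE INDEX idx_sales_fact_dt ON sales_fact (sale_dt);
    EXIT;
    EOF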

Environment: Informatica PowerCenter 8.6, Oracle 9i, Teradata V2R6, SAP, SAP BI 7.0, SQL Server, Sun Solaris, SAP Business Objects XI R2 SP2.
