Informatica ETL Warehouse Developer / Integration Engineer Resume
SUMMARY
- 12 years of IT experience in analysis, design, development and implementation of Data Integration, Data Warehouse/ETL and Business Intelligence applications.
- Skilled ETL/data warehousing professional specializing in Informatica Power Center 10/9/8/7, proficient in data analysis, data mart design, and development and implementation of ETL processes against high-volume data sources.
- Experienced in Informatica Cloud Data Integration for Salesforce (IICS for Salesforce), integrating data into Salesforce Health Cloud as well as reading Salesforce data for data warehouse loads.
- Experienced in data migration projects from legacy systems to Salesforce, SAP and Soriano systems.
- Thorough understanding of Business Intelligence, Data Integration and Data Warehousing concepts with emphasis on ETL.
- Highly proficient in data modeling, including RDBMS concepts, logical and physical data modeling, and multidimensional data model design techniques (data granularity, facts and dimensions, star and snowflake schemas, conformed dimensions). Complete knowledge of data warehouse methodologies (Ralph Kimball, Bill Inmon), ODS, EDW and metadata repositories.
- Highly skilled in SQL tuning (an illustrative tuning sketch follows this summary).
- Good understanding of dimensional and relational data modeling concepts such as star-schema modeling, snowflake modeling, and fact and dimension table design.
- Expert in data warehouse performance tuning.
- 2 years of experience building integration APIs with MuleSoft.
- Strong application integration experience using Mule ESB with connectors, transformers, routing, ActiveMQ, JMS and IBM MQ; performed data transformations using Mule ESB.
- Experienced in using MuleSoft Anypoint Studio to integrate APIs, databases, Salesforce, Sales Cloud and SaaS applications, and in deploying MuleSoft applications to on-premises servers.
- Experienced in using Mule connectors such as FTP, File and SFTP as part of integrations.
- Hands-on experience designing RAML specifications and building APIs using APIkit in Mule applications.
- Worked with various clients in the healthcare, higher education, pharmaceutical and energy industries.
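The SQL tuning noted above is illustrated by the minimal sketch below: checking the optimizer plan for a selective filter and adding a supporting index. All table, column and index names are hypothetical examples, not client objects.

```sql
-- Illustrative only: a hypothetical slow warehouse query, its optimizer plan,
-- and a supporting index. Object names are examples, not client objects.
EXPLAIN PLAN FOR
SELECT d.patient_id, SUM(f.claim_amount) AS total_claims
FROM   f_claims  f
JOIN   d_patient d ON d.patient_key = f.patient_key
WHERE  f.claim_status = 'DENIED'
AND    f.service_date >= DATE '2020-01-01'
GROUP  BY d.patient_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- A composite index on the selective filter columns lets the optimizer avoid a
-- full scan of the fact table for this predicate.
CREATE INDEX ix_f_claims_status_dt ON f_claims (claim_status, service_date);
```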
TECHNICAL SKILLS
Integration Tools: Informatica Cloud Data Integration (IICS for Salesforce, Snowflake, etc.), Informatica Power Center 10/9/8/7 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Monitor, Workflow Manager, Repository Manager), Informatica TDM, Informatica IDQ, Oracle Warehouse Builder (OWB), Postman, REST Client, Anypoint Studio.
Security Policies: OAuth 2.0, jump server private/public key enforcement.
RDBMS: Oracle Exadata 12c, SQL, PL/SQL, SQL*Plus, SQL*Loader, IBM DB2 UDB 9.1/8.1/7.0, Sybase, MS SQL Server, Progress DB, IMS-DB.
Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling, Erwin 3.5.2/3.x, Microsoft Visio.
Tools: Toad, AQT, Quest Central for DB2, Autosys, Rapid SQL, Oracle SQL Developer, WinCVS, ClearCase, PuTTY, Control-M, TIDAL Scheduler, Mercury Test Director, Remedy, Quality Center, Xpediter, File-AID, VSAM, ISPF/PDF, TSO/ISPF, InterTest, Endevor, Bitbucket, Confluence.
PROFESSIONAL EXPERIENCE:
Confidential
Informatica ETL Warehouse Developer / Integration Engineer
Responsibilities:
- Worked on both loading data from transactional systems to CRM systems and replenishing supporting detail back to the BI warehouse for reporting.
- Worked on Informatica Cloud Data Integration (IICS for Salesforce) for Data Synchronization and Data replication.
- Built a replication data source that fed the warehouse on a nightly basis; the replication datastore was also used for transactional reports that were synchronized back to Salesforce for historical trend analysis and for generating alerts for patients at risk of hospitalization and ESRD treatment complications such as peritonitis.
- Integrated HL7, FHIR, JSON and XML data parsed by Mule RTF into Salesforce through the IICS Secure Agent, using endpoint invocation.
- Integrated Master Data Management (EMPI) resources into the Salesforce load sequence by invoking EMPI assets through IICS web service calls.
- Worked on Informatica Cloud integration and PowerCenter PowerExchange integration with the Snowflake cloud warehouse.
- Worked on migrating PowerCenter Oracle Exadata warehouse ETL processing to IICS CDI Snowflake ELT loads.
- Worked on Informatica TDM to mask production PHI and PII data for non-production environments.
- Worked on converting Salesforce PowerCenter ETL to replication components in IICS.
- Created automated reconciliation reports for daily patient load audits by comparing the inbound conformance stage to the transactional replication datasets, to support the Ops team with front-end user queries (a representative reconciliation query is sketched after this list).
- Worked on extracting data for financial and patient supplies reporting from SAP using the Informatica Power Center adapter for SAP NetWeaver and IICS for SAP.
- Read data from SAP hierarchies and Z-tables using Informatica plug-ins to SAP with ABAP code generation methods.
- Loaded data from SAP sources in staging and streaming modes; used the RFC/BAPI transformation to make RFC calls from Informatica.
- Read data from SAP BW and wrote data to SAP BW using PowerConnect for SAP BW.
- Worked on data migration projects from ADABAS and DB2 legacy systems using Informatica PowerExchange for FMC when it merged with Liberty Dialysis and RAI dialysis services.
- Extensively used the debugging feature in Informatica to fix bugs in mappings.
- Used advanced SQL transformations to run SQL on the database directly in script mode, using both static and dynamic connections.
- Used pipeline lookups to cache data selected directly from a source.
- Used database techniques such as indexing on join, lookup and filter columns to obtain higher performance (session-level property tuning is covered below).
- Used the Normalizer transformation to normalize data and the Aggregator transformation to denormalize data.
- Worked on HTTP, XML parser/generator, XML Source/Target transformations in Informatica.
- Used constraint-based loading and target load ordering to efficiently load tables with PK-FK relationships in the same mapping.
- Used pushdown optimization and incremental aggregation, and analyzed performance stats, thread statistics, backward-compatible session log files, performance statistics files and reject files to improve mapping performance.
- Used the SQL transformation in both dynamic and static SQL modes, and with both dynamic and static database connection types.
- Adjusted properties such as commit interval, commit type, DTM buffer size, high precision, pushdown optimization, constraint-based load ordering, line sequential buffer length, maximum memory allowed for memory attributes, pre-build lookup cache, data cache size and index cache size in Informatica mappings and sessions to obtain higher performance; also created backward-compatible log files and performance statistics files to analyze performance.
- Worked on ingesting data from legacy systems such as DB2 and COBOL VSAM files as part of data migration projects.
- Used the update override property along with logical keys to perform insert and update operations on tables with no pre-defined primary key.
- Extensively used variable ports for better-performing lookups (a single call instead of multiple calls) and for de-duplication based on natural keys.
- Extracted data from different source systems such as Oracle, SQL Server, flat files, HL7 and XML formatted data to load into Oracle Exadata and Netezza targets.
- Worked on FHP claims processing jobs and wrote the adjustment logic for paid, reversed and denied insurance claims.
- Created workflows and worklets using tasks such as Session, Control, Event Wait, Event Raise, Command, Assignment and Decision as per the business logic.
- Used Informatica Developer (IDQ) to develop transformations for address validation, email validation and data masking, using the profiling analysis done in IDE.
- Created PL/SQL packages, stored procedures, functions and triggers to implement business rules, validations and auditing (a minimal example follows the environment line below).
- Created operation manuals and support documents for transition after go-live.
- Wrote shell scripts for SFTP, awk file processing, creating index file listings, public/private key encryption, file compression, passphrase encryption, etc.
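A sketch of the daily load-audit reconciliation described above: row counts in the inbound conformance stage are compared with the transactional replication dataset per load date. Table and column names are assumptions, not the actual schema.

```sql
-- Reconciliation sketch: flag load dates where stage and replication counts differ.
SELECT NVL(s.load_date, r.load_date)               AS load_date,
       NVL(s.stage_count, 0)                       AS stage_count,
       NVL(r.rep_count, 0)                         AS replication_count,
       NVL(s.stage_count, 0) - NVL(r.rep_count, 0) AS variance
FROM  (SELECT TRUNC(load_ts) AS load_date, COUNT(*) AS stage_count
       FROM   stg_patient_conformance
       GROUP  BY TRUNC(load_ts)) s
FULL OUTER JOIN
      (SELECT TRUNC(load_ts) AS load_date, COUNT(*) AS rep_count
       FROM   rep_patient_txn
       GROUP  BY TRUNC(load_ts)) r
  ON  s.load_date = r.load_date
WHERE NVL(s.stage_count, 0) <> NVL(r.rep_count, 0)
ORDER BY load_date;
```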
ENVIRONMENT: Informatica Power Center 10/9, Informatica IICS for Salesforce and SAP, Informatica Developer for IDQ, Data Analyst for IDE, Oracle Exadata 12c, PL/SQL, SQL Server, Autosys, XML, TOAD, Oracle SQL Developer, PostgreSQL, DB2, SQL, PuTTY, Veeva CRM.
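A minimal sketch of the PL/SQL auditing pattern referenced in the responsibilities above: an audit table plus a trigger that records status changes. The table, columns and trigger name are hypothetical, not the actual application schema.

```sql
-- Hypothetical audit table and trigger illustrating the auditing approach.
CREATE TABLE patient_status_audit (
    audit_id    NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    patient_id  NUMBER,
    old_status  VARCHAR2(30),
    new_status  VARCHAR2(30),
    changed_by  VARCHAR2(128) DEFAULT USER,
    changed_at  TIMESTAMP     DEFAULT SYSTIMESTAMP
);

CREATE OR REPLACE TRIGGER trg_patient_status_audit
AFTER UPDATE OF status ON patient
FOR EACH ROW
WHEN (OLD.status <> NEW.status)
BEGIN
    -- Record who changed the status, from what, to what, and when.
    INSERT INTO patient_status_audit (patient_id, old_status, new_status)
    VALUES (:OLD.patient_id, :OLD.status, :NEW.status);
END;
/
```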
Confidential
MuleSoft Integration Engineer
Responsibilities:
- Involved in the design, development, testing and integration of Mule applications that load and retrieve data from Salesforce.
- Configured APIs and deployed them to Runtime Fabric.
- Integrated web services and messaging systems with MuleSoft ESB.
- Worked extensively on data warehouse extraction and integration with CMS (Centers for Medicare & Medicaid Services) using MuleSoft.
- Designed and developed enterprise services using RAML and REST-based APIs; used various transformers in Mule ESB based on the use case and implemented custom transformations.
- Developed Mule ESB projects for both synchronous and asynchronous Mule flows.
- Worked on FHIR and HL7 messaging standards for data parsing from internal systems.
- Used DataWeave for data transformations and data validations within flows and sub-flows.
- Integrated Mule ESB systems using MQ Series, HTTP, file system and SFTP transports.
- Enforced security features for APIs with secured HTTP, OAuth and jump-server key-pair encryption.
- Involved in creating HTTP inbound and outbound flows, transformers, filtering and security of Mule flows.
- Created request and response transformers and custom components, and configured them in mule-config.xml.
- Created Mule flows using endpoints, connectors and component beans in Mule ESB to communicate between client and server systems.
- Mapped data from JSON to XML and vice versa using DataWeave and DataMapper, and configured the transformers in the Mule XML configuration file.
- Defined and executed pagination techniques across applications using DataWeave.
- Used EMPI master data management API calls to resolve patient identity and avoid inserting duplicates into the system.
ENVIRONMENT: Anypoint Studio 7.4.2, Bitbucket
Confidential
Sr. Informatica ETL Developer / Warehouse & Integration Expert
Responsibilities:
- Worked with the subject matter experts and tailored solutions to fit their needs.
- Attended ETL strategy meetings, discussed various approaches and formulated simple, highly maintainable solutions.
- Created an ETL architecture that was simple and easily maintainable.
- Worked on both loading data from transactional and warehouse systems to Hyperion and replenishing supporting detail back to the warehouse for reporting.
- Extensively used the debugging feature in Informatica to fix bugs in mappings.
- Used advanced SQL transformations to run SQL on the database directly in script mode, using both static and dynamic connections.
- Used pipeline lookups to cache data selected directly from a source.
- Used database techniques such as indexing, and adjusted properties such as commit interval, commit type, pushdown optimization, line sequential buffer length and maximum memory allowed for memory attributes in Informatica mappings and sessions to obtain higher performance; also created backward-compatible log files and performance statistics files to analyze performance.
- Used the Normalizer transformation to normalize data and the Aggregator transformation to denormalize data.
- Worked on extracting data from SAP using the Informatica Power Center adapter for SAP NetWeaver.
- Created both argument and aggregated data files using Informatica for ROMBI sales reporting (a sample aggregation query follows this list).
- Created operation manuals and support documents for transition after go-live.
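An illustrative sketch of the kind of aggregated extract produced for the sales reporting above. Table and column names are assumptions, not actual ROMBI objects.

```sql
-- Monthly sales aggregation over the trailing 12 months (illustrative names).
SELECT product_code,
       sales_region,
       TRUNC(invoice_date, 'MM')  AS sales_month,
       SUM(net_amount)            AS total_sales,
       COUNT(DISTINCT invoice_id) AS invoice_count
FROM   sales_detail
WHERE  invoice_date >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -12)
GROUP  BY product_code, sales_region, TRUNC(invoice_date, 'MM')
ORDER  BY sales_month, product_code;
```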
ENVIRONMENT: Informatica Power Center 9.1, Oracle 11i, PL/SQL, Tidal, XML, TOAD, Oracle SQL Developer, PuTTY.
Confidential, Southborough-MA
Sr. Informatica ETL Developer / Warehouse & Integration Expert
Responsibilities:
- Worked with the functional experts from the NRT and GRRA teams to develop the technical design specifications for the dimensions and facts.
- Worked with architects to load the inventory and material management dimensional model, which includes the Order Header, Sales Credits, Cost Center, Company and Account dimensions and the Shipments, Billings, Bookings, Backlog and RMA Issues facts.
- Attended ETL strategy meetings as part of the ETL/Informatica technical advisory team to offer opinions and guidance to managers in selecting the best approach.
- Created and revised ETL standards around migration strategy and code reusability.
- Created and loaded conversion tables using the SAP ECC PowerExchange reader to read data from transparent and cluster tables.
- Developed Slowly Changing Dimension Type 2 (SCD 2) mappings with effective date ranging (the pattern is sketched in SQL after this list).
- Read data from SAP hierarchies and Z-tables using Informatica plug-ins to SAP with ABAP code generation methods.
- Loaded data from SAP sources in staging and streaming modes; used the RFC/BAPI transformation to make RFC calls from Informatica; and moved ECC master data in real time using IDoc extraction methods ("IDoc Integration Using ALE") with IDoc ALE interpreter transformations to interpret SAP IDoc segment data.
- Read data from SAP BI using Open Hubs and PowerConnect for SAP BI.
- Used pre-session and post-session variable assignment for simulating multithreading scenarios and to transport variable (flags and counters) values across mappings, sessions and worklets.
- Used constraint-based loading and target load ordering to efficiently load tables with PK-FK relationships in the same mapping.
- Used pushdown optimization and incremental aggregation, and analyzed performance stats, thread statistics, backward-compatible session log files, performance statistics files and reject files to improve mapping performance.
- Extensively used parameter files to override mapping parameters, mapping variables, workflow variables, session parameters, FTP session parameters and source/target application connection parameters (an illustrative parameter file follows the environment line below).
- Increased ease of code migration by heavy parameterization of connection objects and Session properties when moving code between different environments.
- Used Transaction Control transformation to perform on-demand commits.
- Also used the FileName port in both flat file and XML targets with the Transaction Control transformation to generate different files depending on the requirement.
- Used pre-built lookup caches to improve performance and dynamic caches to deal with duplicates from the source, generating accurate inserts and updates while working dynamically with targets.
- Used hash partitioning for load balancing in mappings that loaded millions of records.
- Created Dynamic parameter files and also indirect file lists to be used as inputs in sessions.
- Implemented persistent, dynamic and flat file lookups as well as Source Qualifier pipeline lookups for improved performance.
- Used the SQL transformation in both dynamic and static SQL modes, and with both dynamic and static database connection types.
- Adjusted properties such as commit interval, commit type, DTM buffer size, high precision, pushdown optimization, constraint-based load ordering, line sequential buffer length, maximum memory allowed for memory attributes, pre-build lookup cache, data cache size and index cache size in Informatica mappings and sessions to obtain higher performance; also created backward-compatible log files and performance statistics files to analyze performance.
- Extensively used the debugging feature in Informatica to fix bugs in mappings.
- Used the update override property along with logical keys to perform insert and update operations on tables with no pre-defined primary key.
- Extensively used variable ports for better-performing lookups (a single call instead of multiple calls) and for de-duplication based on natural keys.
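The SCD 2 mappings above were built in Informatica; the SQL below only illustrates the equivalent effective-date-ranging pattern. Table, column and sequence names (dim_cost_center, stg_cost_center, dim_cost_center_seq) are assumptions.

```sql
-- Step 1: close out the current dimension row when a tracked attribute changed.
UPDATE dim_cost_center d
SET    d.eff_end_date = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_cost_center s
               WHERE  s.cost_center_cd   = d.cost_center_cd
               AND    s.cost_center_name <> d.cost_center_name);

-- Step 2: insert a new current version for changed or brand-new cost centers.
INSERT INTO dim_cost_center
       (cost_center_key, cost_center_cd, cost_center_name,
        eff_start_date, eff_end_date, current_flag)
SELECT dim_cost_center_seq.NEXTVAL, s.cost_center_cd, s.cost_center_name,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_cost_center s
LEFT JOIN dim_cost_center d
       ON  d.cost_center_cd = s.cost_center_cd
       AND d.current_flag   = 'Y'
WHERE  d.cost_center_cd IS NULL;
```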
ENVIRONMENT: Informatica Power Center 9.1, Informatica Power Exchange for SAP, Oracle 11i, SQL Server 2005, Greenplum DB, SAP R/3, Oracle SQL Data Modeler, PL/SQL, Control-M, XML, DB Visualizer, PuTTY, Excel Macros, APSE, HPQC, Harvest migration tool.
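The parameter files referenced in the responsibilities above follow Informatica's standard layout: a [Folder.WF:workflow.ST:session] heading, $ session parameters and $$ mapping parameters/variables. The folder, workflow, session and parameter names below are placeholders for illustration only.

```
[DW_Folder.WF:wf_daily_dim_load.ST:s_m_load_cost_center_dim]
$DBConnection_Source=Oracle_EBS_SRC
$DBConnection_Target=Oracle_DW_TGT
$InputFile_CostCenter=/data/inbound/cost_center.dat
$$Load_Date=2013-06-30
$$Batch_Id=1001
```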
Confidential, Boston-MA
Sr. Informatica ETL Developer / Warehouse & Integration Expert
Responsibilities:
- Developed ETL Objects to load Mainframe ADABAS data to Oracle stage schemas using Power Center and Power Exchange.
- Worked with the functional experts from the ATS and Security teams to develop the ADABAS specifications for the different files that were to be transported to Oracle.
- Worked with the ETL team to create the modernization, consolidation, conversion and replenishment specifications, based on the functional specs, for ETL development.
- Attended ETL strategy meetings as part of the ETL/Informatica technical advisory team to offer opinions and guidance to managers in selecting the best approach at project startup.
- Created and used source and target PowerExchange mappings with the help of ADABAS developers who understood the ADABAS data structures.
- Used the Informatica CDC option for Power Center to incrementally load data into Informatica using changes registered in the ADABAS protection logs (PLOGs).
- Used Informatica Data Explorer (IDE) to profile and cleanse data to assess the Data Quality of Historical and transactional data to develop profiling and cleansing rules for IDQ and Power Center.
- Used Informatica Developer (IDQ) to develop transformations using the profiling analysis done by IDE.
- Created Conversion files from Mainframe data which were to be used as Seed data for the SAP engine.
- Worked with the end users and analyzed the current reports to build a data warehouse model using Oracle SQL Data Modeler.
- Developed Slowly Changing Dimension Type 2 (SCD 2) mappings with effective date ranging.
- Modeled and loaded dimension tables like Demographic, Constituent College, Stipend Source, Pledge Source, ZipClub dimensions in DAR warehouse.
- Converted Oracle Warehouse Builder code to Informatica code for DAR application.
- Developed a replenishment process to load SAP transactional data back into the mainframe.
- Read data from SAP hierarchies and Z-tables using Informatica plug-ins to SAP with ABAP code generation methods.
- Created and deployed ABAP code on SAP systems to execute mappings which pull from SAP sources.
- Performed SAP transports, with the help of Basis personnel, for ABAP code used in Informatica mappings.
- Loaded data from SAP sources in both staging and streaming modes.
- Used RFC/BAPI transformation to make RFC calls to the SAP system.
- Used IDOC extraction methods in real time mode using "IDoc Integration Using ALE".
- Created IDOC ALE interpreter transformations to interpret SAP IDOC segment data.
- Loaded SAP BW Info Cubes and Info Sources using Informatica Power Exchange for SAP.
- Extensively used pre-session and post-session variable assignment for simulating multithreading scenarios for Load balancing and performance improvement.
- Used pre-session and post-session variable assignment to transport variable (flags and counters) values across sessions and worklets.
- Used constraint-based loading and target load ordering to efficiently load tables with PK-FK relationships in the same mapping.
- Designed and developed mappings with optimal performance using Aggregator, Java, Joiner, Normalizer, Rank, Sequence Generator, SQL, Transaction Control, uncached and various cached Lookups, connected/unconnected and source/target pre- and post-load Stored Procedure transformations, Update Strategy, Union, XML transformations, etc.
- Used pushdown optimization and incremental aggregation, and analyzed performance stats, thread statistics, backward-compatible session log files, performance statistics files and reject files to improve mapping performance.
- Extensively used parameter files to override mapping parameters, mapping variables, workflow variables, session parameters, FTP session parameters and source/target application connection parameters.
- Increased code reusability using shortcuts, mapplets, reusable transformations, reused mappings, reusable sessions and worklets to reduce redundancy in code.
- Used Transaction Control transformation to perform on-demand commits.
- Used pre-built lookup caches to improve performance and dynamic caches to deal with duplicates from the source, generating accurate inserts and updates while working dynamically with targets.
- Created Dynamic parameter files and also indirect file lists to be used as inputs in sessions.
- Implemented both Flat file look ups and Source qualifier Pipeline look ups for improved performance.
- Used SQL transformation to dynamically generate and execute DML statements in mappings where required.
- Used Normalizer Transformation to normalize data from ADABAS PE groups when reading data into Oracle and used Aggregators to De-normalize data while loading a PE group in the replenishment of the mainframe file.
- Adjusted properties such as commit interval, commit type, DTM buffer size, high precision, pushdown optimization, constraint-based load ordering, line sequential buffer length, maximum memory allowed for memory attributes, pre-build lookup cache, data cache size and index cache size in Informatica mappings and sessions to obtain higher performance; also created backward-compatible log files and performance statistics files to analyze performance.
- Extensively used the debugging feature in Informatica to fix bugs in mappings.
- Adjusted properties such as Stop on Errors, On Stored Procedure Error and Command Task Error to control the way the session reacts to various scenarios at run time.
- Used the update override property along with logical keys to perform insert and update operations on tables with no pre-defined primary key (a sample update override is sketched after this list).
- Extensively used variable ports for better-performing lookups (a single call instead of multiple calls) and complicated calculations; also used them in de-duplication logic when needed.
- Provided post-production support for projects after go-live to resolve issues.
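A sketch of a target update override of the kind described above, entered in the target properties in Designer. :TU references the target transformation ports; the table and the logical-key columns shown are hypothetical.

```sql
-- Update override using logical keys on a table with no defined primary key.
UPDATE stg_pledge
SET    pledge_amount = :TU.pledge_amount,
       update_ts     = :TU.update_ts
WHERE  constituent_id = :TU.constituent_id   -- logical key (no PK defined)
AND    pledge_source  = :TU.pledge_source    -- logical key (no PK defined)
```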
ENVIRONMENT: Informatica Power Center 9/8.6, Informatica Power Exchange for SAP, Informatica Power Exchange for ADABAS, Workflow Manager, Workflow Monitor, Source Analyzer, Target Designer, Transformation Developer, Mapping Designer, Informatica Data Explorer (IDE), Informatica Developer (IDQ), Oracle 10g, SQL Server 2005, SAP R/3, DB2, Oracle SQL Data Modeler, PL/SQL, Control-M, XML, IBM AIX 5.3, Toad 9.1, DB Visualizer, PuTTY, Excel Macros.
Confidential
Sr. Informatica ETL Developer & Administrator / Warehouse Consultant
Responsibilities:
- Developed file FTP, file validation, transaction validation, data load and encryption mappings to process credit card data from various sources.
- Created a common framework for transactional validations on data coming from different sources by loading data from all types of terminals into one common XML structure.
- Extensively used XML targets, XML sources, XML generator transformation and XML parser transformation.
- Loaded XML targets with multiple hierarchies and used various XML flushing options with the Transaction Control transformation to generate user-defined commits, reducing cache usage and improving performance.
- Developed mappings for Transaction, Customer, Product and Demand dimensions, implementing the ETL logic provided as part of the Technical Specifications in the Credit-Card Project.
- Created sessions and workflows to help schedule nightly loads and process data from all source terminal Data Collection points.
- Extensively used pre-session and post-session variable assignment for simulating multithreading scenarios for Load balancing and performance improvement.
- Used pre-session and post-session variable assignment to transport variable (flags and counters) values across sessions and worklets.
- Edited different kinds of Partition points and Created partitions to improve performance by activating multiple Transformation Threads for optimal use of system resources.
- Collected performance stats and thread statistics to analyze mappings and identify where performance bottlenecks existed; improved session performance by eliminating them.
- Increased code reusability using shortcuts, mapplets, reusable transformations, reused mappings, reusable sessions and worklets to reduce redundancy in code.
- Adjusted data and index cache sizes at the mapping level, DTM buffer sizes at the session level and line sequential buffer lengths for flat file loads to achieve optimal performance.
- Designed and developed mappings with optimal performance using Aggregator, Java, Joiner, Normalizer, Rank, Sequence Generator, SQL, Transaction Control, uncached and various cached Lookups, connected/unconnected and source/target pre- and post-load Stored Procedure transformations, Update Strategy, Union, XML transformations, etc.
- Used Transaction Control transformation to commit data at different intervals as per the requirements and demand of various scenarios.
- Used pre-built lookup caches to improve performance and dynamic caches to deal with duplicates from the source, generating accurate inserts and updates while working dynamically with targets.
- Implemented both direct and indirect flat file lookups and Source Qualifier pipeline lookups for improved performance.
- Executed stored procedures from source qualifiers and SQL transformations and used the return data set in the mapping.
- Used the Normalizer transformation to normalize data from COBOL sources and used Aggregators to denormalize multiple records into a single record.
- Extensively used variable port logic to implement counters and to de-duplicate records based on key field values.
- Used Incremental Aggregation technique to load data into Aggregation tables for improved performance.
- Used the Java transformation to import complex built-in Java packages and execute them as part of a mapping.
- Read data from SAP hierarchies, Z-tables and IDOCS using Informatica plug-ins to SAP.
- Used RFC/BAPI transformation in 8.6 to make RFC calls to the SAP system.
- Executed Single Stream-Multistream RFC calls by importing Custom transformations using SAP plugins in Version 8.1.
- Created and deployed ABAP code on SAP systems to execute mappings which pull from SAP sources.
- Used static and dynamic filters in the Application Source Qualifier, and also used joins on Z-tables in the Application Source Qualifier.
- Performed SAP transports, with the help of Basis personnel, for ABAP code used in Informatica mappings.
- Loaded data from SAP sources in both staging and streaming modes.
- Loaded SAP BW InfoCubes using Informatica Power Exchange for SAP.
- Extensively used parameter files to override mapping parameters, mapping variables, workflow variables, session parameters, FTP session parameters and source/target application connection parameters.
- Changed the transformation scope option in cached active transformations such as Sorter, Joiner, Aggregator and Rank to improve performance.
- Used performance improvement techniques such as dropping indexes and key constraints before running the session and recreating them after the load, bulk loading without DB logging on a few occasions, creating indexes on table fields used in ORDER BY, GROUP BY and lookup conditions, and choosing the right master/detail designation in the Joiner by studying the data coming from the incoming pipelines.
- Used update overrides, created target structures with an incomplete primary key to make partial-key updates, and used the target table name override option to update the right table.
- Used source and target pre- and post-load stored procedures to drop indexes and truncate tables, and used multiple pre- and post-session procedure transformations in a specific execution order (representative procedures are sketched after this list).
- Used pushdown optimization to push transformation logic onto the database on both the source and target sides wherever possible to improve mapping performance.
- Used UNIX performance monitoring tools to analyze CPU usage, cache usage and OS paging size to improve performance.
- Analyzed thread statistics from the session logs to identify which thread was taking the longest to complete, and applied partitioning and other performance tuning techniques to improve performance.
- Studied Backward Compatible Session Log Files and Reject files to understand the root cause of data rejection and performance deadlocks.
- Developed Slowly changing Dimension Type 2 mappings.
- Extensively worked with secure systems and databases behind firewalls for critical data security and protection.
- Expert in using the Informatica Debugger to understand the errors in mappings and used the expression editor to evaluate complex expressions and look at the transformed data to solve mapping issues.
- Worked with different kinds of databases such as Progress, Sybase, Oracle and SQL Server.
- Converted SQL server DTS packages to Informatica interfaces.
- Imported mapplets created by Informatica IDQ and used them in Informatica mappings.
- Prepared detailed design documents for ETL in conjunction with eGate developers and functional experts.
- Assisted in effort estimates at the initial phases of the Project design and planning.
- Created test plans for Unit Test, System Integration Test and UAT and tested code in all Development, QA and pre-production (UAT).
- Prepared ETL standards, Naming conventions and wrote ETL flow documentation for CCStage and CC ODS stages.
- Prepared detailed Documentation for off-shore Production Support teams.
- As an interim team lead, worked with users and clients to clarify requirements, coordinated production migration schedules and delegated work to developers as new requests came in.
- Responsible for creation of CSDOC and ISPW requests to schedule the Informatica jobs in Control M.
- Worked on an Informatica version upgrade with technical architects and SMEs; responsible for testing various applications and running diagnostics on key Informatica functionality in post-upgrade testing.
- Called stored procedures from UNIX scripts because Informatica was not using the native driver.
- Created Shell scripts to start/stop workflows which would be invoked by Control-M scheduler.
- Provided post-production support to resolve the issues.
- Used both Debugger and Test load options in the workflow level.
- Worked with Change Management for code migrations and production incidents.
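A minimal sketch of pre-/post-load procedures of the kind invoked from the pre- and post-load Stored Procedure transformations above: the pre-load procedure truncates the stage table and drops its index, and the post-load procedure recreates the index and refreshes statistics. Table, index and procedure names are placeholders, not the actual credit-card schema.

```sql
CREATE OR REPLACE PROCEDURE pre_load_cc_stage AS
BEGIN
    -- Clear the stage and drop the index so the bulk load is not slowed down.
    EXECUTE IMMEDIATE 'TRUNCATE TABLE cc_txn_stage';
    EXECUTE IMMEDIATE 'DROP INDEX ix_cc_txn_stage_card';
END pre_load_cc_stage;
/

CREATE OR REPLACE PROCEDURE post_load_cc_stage AS
BEGIN
    -- Recreate the index and gather fresh optimizer statistics after the load.
    EXECUTE IMMEDIATE
        'CREATE INDEX ix_cc_txn_stage_card ON cc_txn_stage (card_number_hash)';
    DBMS_STATS.GATHER_TABLE_STATS(USER, 'CC_TXN_STAGE');
END post_load_cc_stage;
/
```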
ENVIRONMENT: Informatica Power Center 8.6, Informatica Power Exchange for Web Services, Informatica Power Exchange for Microsoft Message Queuing, Workflow Manager, Workflow Monitor, Source Analyzer, Target Designer, Transformation Developer, Mapping Designer, Oracle 10g, Sybase, SQL Server 2005, SAP R/3, Progress 9.0, DB2, Erwin, PL/SQL, Control-M, XML, IDQ, IBM AIX 5.3, Toad 9.1, DB Visualizer, PuTTY, Excel Macros.