Sr. Informatica IICS Developer Resume
Milwaukee, WI
SUMMARY
- 9+ years of IT experience in Data Warehousing, Database Design and ETL processes across business domains including finance, telecom, manufacturing and health care.
- Highly proficient in Development, Implementation, Administration and Support of ETL processes for Large - scale Data warehouses using Informatica Power Center.
- Worked extensively on ETL processes using Informatica Power Center 10.x/9.x/8.x/7.x, Informatica Data Quality (IDQ), Informatica MDM, Data Lake, Informatica Intelligent Cloud Services, Informatica Cloud Real Time, the Snowflake Cloud Data Warehouse V2 Connector and Informatica B2B.
- Extensively used ETL methodologies for supporting Data Extraction, Transformation and Loading process, in a corporate-wide-ETL solution using Informatica Power Center.
- Extensively used Informatica Designer, Workflow Manager and Workflow Monitor to develop and manage data loads.
- Experience working with cloud computing on the Salesforce.com platform.
- Proven ability in middleware integration components preferably in iPaaS (Integration platform as a service) platform such as IICS (Informatica Intelligent Cloud Services).
- Experience in extracting/loading data into Enterprise Data Warehouse meeting data standardization requirements.
- Proficiency in Salesforce real-time integration using APIs (REST/SOAP/Bulk), Web services.
- Knowledge in developing Informatica Cloud Mappings, Mapplets, Integration Templates, Tasks, and Task flows for data loads.
- Demonstrated knowledge of enterprise level Data Warehousing using Snowflake.
- Experience in extracting data from AWS to other cloud based and on-premise applications.
- Expertise in Informatica cloud services like Data Synchronization and Data Replication for various databases and applications.
- Developed shell and Python scripts to handle incremental loads (a minimal sketch follows this list).
- Adept at Salesforce CRM Configuration, Customization, and Testing of applications.
- Proficient in creating roles, profiles, email templates, page layouts, workflows, workflow actions and approval process.
- Extensively worked with Informatica performance tuning, addressing source-level, target-level and mapping-level bottlenecks.
- Extensive experience in using various Informatica Designer Tools such as Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer.
- Extensive experience in Design, Development, Implementation, Production Support and Maintenance of Data Warehouse Business Applications in E-commerce software, Utility, Pharmaceutical, Health Care, Insurance, Financial and Manufacturing industries.
- Worked on Teradata and its utilities (TPump, FastLoad) through Informatica.
- Experience in development and maintenance of SQL, PL/SQL, Stored procedures, functions, analytic functions, constraints, indexes and triggers.
- Excellent working knowledge of C shell scripting and job scheduling on multiple platforms; experienced with the UNIX command line and Linux.
- Developed Teradata BTEQ scripts to implement business logic and exported data using Teradata FastExport.
- Used the Semantic Layer concept to guarantee correct results, improve database performance and improve user understanding and acceptance.
- Created complex Teradata Macros.
- Experience in ETL development process using Informatica for Data Warehousing, Data migration and Production support.
- Extensive experience in scheduling, monitoring and production support of Oracle PL/SQL batch jobs using Tidal and DBMS_SCHEDULER.
- Strong knowledge of data warehousing concepts, Data Marts, Star Schema and Snowflake Schema modeling, fact tables and dimension tables. Implemented Slowly Changing Dimension methodology for accessing the full history of accounts and transaction information.
- Proficiency in data warehousing techniques for data cleansing, surrogate key assignment and Change data capture (CDC).
- Experience in integration of various data sources like Oracle, DB2, Flat Files and XML Files into ODS and good knowledge on Teradata 12.0/13.0, SQL Server 2000/2005/2008 and MS Access 2003/2007.
- Expertise in implementing complex business rules by creating re-usable transformations, Mapplets and Mappings.
- Optimized solutions using various performance-tuning methods: SQL tuning, ETL tuning (optimal configuration of transformations, targets, sources, mappings and sessions) and database tuning using indexes, partitioning, materialized views, procedures and functions.
- Extensively used Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows.
- Extensive knowledge in all areas of Project Life Cycle Development.
- Strong analytical, verbal, written and interpersonal skills.
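A minimal sketch, assuming a watermark-file approach, of the incremental-load scripting noted above; the table, column and file names are illustrative rather than taken from any actual engagement.

```python
# Watermark-driven incremental load (illustrative names throughout).
# The last successful extract time is persisted between runs.
from datetime import datetime, timezone
from pathlib import Path

WATERMARK_FILE = Path("/tmp/orders_watermark.txt")  # hypothetical path

def read_watermark() -> str:
    # Default to the epoch on the first run so the initial load is a full load.
    if WATERMARK_FILE.exists():
        return WATERMARK_FILE.read_text().strip()
    return "1970-01-01 00:00:00"

def write_watermark(ts: str) -> None:
    WATERMARK_FILE.write_text(ts)

def build_delta_query(last_run: str) -> str:
    # Only rows changed since the previous run are extracted.
    return (
        "SELECT * FROM src_orders "
        f"WHERE last_update_ts > TO_TIMESTAMP('{last_run}', 'YYYY-MM-DD HH24:MI:SS')"
    )

if __name__ == "__main__":
    last_run = read_watermark()
    print(build_delta_query(last_run))
    # ... run the extract/load here, then advance the watermark ...
    write_watermark(datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S"))
```

Advancing the watermark only after a successful load keeps failed runs re-runnable without losing deltas.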
TECHNICAL SKILLS
Databases: Oracle 10g/9i/11i/R12, DB2, MS SQL Server 7.0/2000/2005/2008, MS Access 2000/2005, Teradata, MySQL
Languages: Transact-SQL, PL/SQL, HTML, C, C#, PERL, Java, Python
Operating Systems: Windows, Linux, UNIX, MS-DOS, Sun Solaris
OLAP/Reporting Tools: SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), SharePoint MOSS 2007, Business Objects 6.x, Cognos Framework Manager
ETL Tools: Informatica Power Center 10.x/9.x/8.x/7.x, Informatica Power Exchange, Informatica Data Quality Suite 9.6, Informatica MDM, IICS, ICRT, AWS, SQL Server Integration Services (SSIS)
Data Modeling Tools: Microsoft Visio
SQL Server Tools: SQL Server Management Studio, SQL Server Query Analyzer, SQL Server Mail Service, DBCC, BCP, SQL Server Profiler
Web Technologies: MS FrontPage, MS Outlook Express, FTP, TCP/IP
Other Tools: Microsoft Office, Visual Basic 6
Scheduling Tools: Tidal, Autosys, Windows Scheduler
Data Quality Tools: Informatica Analyst, Informatica Data Quality, Informatica Developer
MDM Tools: Nextgate, Informatica MDM
PROFESSIONAL EXPERIENCE
Confidential, Milwaukee, WI
Sr. Informatica IICS Developer
Responsibilities:
- Worked with the IT architect and program managers on requirements gathering, analysis and project coordination.
- Developed Data Integration Platform components/processes using Informatica Cloud Platform, Azure SQL Datawarehouse, Azure Data Lake Store and Azure Blob Storage technologies
- Analyzed existing ETL Datawarehouse process and ERP/NON-ERP Applications interfaces and created design specification based on new target Cloud Datawarehouse (Azure Synapse) and Data Lake Store
- Created ETL and Datawarehouse standards documents - Naming Standards, ETL methodologies and strategies, Standard input file formats, data cleansing and preprocessing strategies
- Created mapping documents with detailed source-to-target transformation logic and source and target column information.
- Designed, Developed and Implemented ETL processes using IICS Data integration
- Created IICS connections using various cloud connectors in IICS administrator
- Installed and configured the Windows Secure Agent and registered it with the IICS org.
- Extensively used performance tuning techniques while loading data into Azure Synapse using IICS
- Extensively used cloud transformations - Aggregator, Expression, Filter, Joiner, Lookup (connected and unconnected), Rank, Router, Sequence Generator, Sorter, Update Strategy, Union Transformations
- Extensively used cloud connectors: Azure Synapse (SQL DW), Azure Data Lake Store V3, Azure Blob Storage, Oracle, Oracle CDC and SQL Server.
- Developed Cloud Integration parameterized mapping templates (DB and table object parameterization) for Stage, Dimension (SCD Type 1, SCD Type 2, CDC and Incremental Load) and Fact load processes.
- Extensively used Parameters (Input and In-Out parameters), Expression Macros and source partitioning.
- Extensively used the Pushdown Optimization option to push processing into Azure Synapse (SQL DW) and exploit its compute power.
- Extracted data from Snowflake to push the data into Azure warehouse instance to support reporting requirements
- Performed loads into Snowflake instance using Snowflake connector in IICS for a separate project to support data analytics and insight use case for Sales team
- Created Python scripts that create on-demand Cloud Mapping Tasks and start and stop cloud tasks via Informatica Cloud REST API calls (a sketch follows this list).
- Developed a CDC load process for moving data from PeopleSoft to SQL Datawarehouse using Informatica Cloud CDC for Oracle.
- Developed complex Informatica Cloud task flows (parallel) with multiple mapping tasks and nested task flows.
- Developed Mass Ingestion tasks to ingest large datasets from on-prem to Azure Data Lake Store (file ingestion).
- Designed a Data Integration Audit framework in Azure SqlDW to track data loads, manage data platform workload and produce automated reports for SOX compliance.
- Worked with a team of 4 onshore and 6 offshore developers and prioritized project tasks.
- Involved in Development, Unit Testing, SIT and UAT phases of project
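A hedged sketch of the REST automation referenced above, using the Informatica Cloud REST API v2 login and job endpoints as documented by Informatica; the region login URL, credentials and task id are placeholders.

```python
# Start an IICS mapping task via the Informatica Cloud REST API v2 (sketch).
import requests

LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"  # region-specific

def login(username: str, password: str):
    resp = requests.post(
        LOGIN_URL,
        json={"@type": "login", "username": username, "password": password},
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    body = resp.json()
    # serverUrl is the per-org base URL for subsequent v2 calls.
    return body["serverUrl"], body["icSessionId"]

def start_task(server_url: str, session_id: str, task_id: str, task_type: str = "MTT"):
    # taskType "MTT" starts a mapping configuration task on the v2 job resource.
    resp = requests.post(
        f"{server_url}/api/v2/job",
        json={"@type": "job", "taskId": task_id, "taskType": task_type},
        headers={"icSessionId": session_id, "Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    server_url, session_id = login("svc_user@example.com", "***")
    print(start_task(server_url, session_id, "0000ABCD000000000002"))  # placeholder id
```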
Environment: Informatica Intelligent Cloud Services, Informatica PowerCenter 10.2, Informatica Power Exchange 10.2, Windows Secure Agent, Teradata v13.10, Azure Synapse (Azure SqlDW), Azure Data Lake Store, SQL Database, Tableau Server & Desktop
Confidential, Minneapolis, MN
Sr. ETL Informatica Developer
Responsibilities:
- Involved in gathering and analyzing the requirements and preparing business rules.
- Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure and other transformations to implement complex logic.
- Worked with Informatica Power Center Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract data from multiple source systems like Oracle, SQL Server and flat files and load it into Oracle.
- Developed Informatica workflows and sessions associated with the mappings using Workflow Manager.
- Performed data manipulations using various Informatica transformations like Filter, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
- Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
- Responsible for Data Cleansing and Data Quality checks using Informatica Data Quality (IDQ).
- Implemented methods using Informatica DVO to validate that data supplied by external sources was loaded correctly into the awards database.
- Developed Python scripts to separate the various record types provided by the vendor, Sedgwick (a sketch follows this list).
- Complete understanding of regular-expression matching, fuzzy logic and deduplication limitations in the IDQ suite.
- Involved in L2 production support batch processing and application support.
- Created various tasks like Session, Command, Timer and Event wait.
- Designed complex mappings involving constraint-based loading and target load order.
- Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets that provide reusability across mappings.
- Involved in creating new table structures and modifying existing tables to fit the existing data model.
- Extracted data from different databases like Oracle and external source systems like flat files using ETL tool.
- Well versed in developing complex SQL queries, unions and multi-table joins, with experience using views.
- Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
- Involved in debugging Informatica mappings, Performance and Unit testing of Informatica Sessions, Batches and Target Data.
- Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements.
- Involved in Performance Tuning of mappings in Informatica.
- Created test cases for the mappings developed and then created the integration testing document.
- Followed Informatica recommendations, methodologies and best practices
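An illustrative sketch of the record-type splitter mentioned above. The pipe-delimited layout with the record type in the first field is an assumption; the actual vendor layouts are not shown here.

```python
# Split a vendor flat file into one output file per record type (illustrative layout).
import csv

def split_by_rec_type(in_path: str) -> None:
    handles, writers = {}, {}
    try:
        with open(in_path, newline="") as src:
            for row in csv.reader(src, delimiter="|"):
                if not row:
                    continue
                rec_type = row[0]  # assumed: record type is the first field
                if rec_type not in writers:
                    handles[rec_type] = open(f"{in_path}.{rec_type}.out", "w", newline="")
                    writers[rec_type] = csv.writer(handles[rec_type], delimiter="|")
                writers[rec_type].writerow(row)
    finally:
        for fh in handles.values():
            fh.close()

split_by_rec_type("vendor_feed.txt")  # placeholder file name
```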
Environment: Informatica Data Quality (IDQ), Informatica DVO, Informatica Power Center 10.2, Oracle 11g, Python 2.5, Autosys, DB2, SQL, PL/SQL, Unix, Flat Files, Putty, WinSCP, Toad 10.6, SQL Developer.
Confidential, Minneapolis, MN
Sr. ETL Cloud (IICS) Developer
Responsibilities:
- Worked with the Informatica Data Quality 10.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 10.1.
- Worked on requirements gathering, architecting the ETL lifecycle and creating design specifications, ETL design documents.
- Designed, developed and maintained Informatica cloud data integration processes using the appropriate data load technique and performance optimization.
- Identified and eliminated duplicates in datasets through IDQ components such as Edit Distance and Jaro Distance.
- Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 10.1.
- Prepared ETL specifications and design documents to help develop mappings.
- Used Informatica Power Center for migrating data from various OLTP databases to the data mart
- Pulled data from various DBS (Dealers Business Systems) applications into SQL Server.
- Loaded/extracted data across multiple cloud applications (Salesforce, Snowflake Data Warehouse) and legacy applications (e.g., DBS).
- Developed ETL mappings in Informatica Intelligent Cloud Services (IICS) Data Integration, creating transformations such as Aggregator, Lookup, Joiner, Filter, Sequence Generator, Normalizer, Sorter, Router and Stored Procedure as appropriate.
- Migrated existing ETL process from Jitterbit into Informatica Intelligent Cloud Service (IICS).
- Designed, developed and maintained Snowflake database objects (tables, views, stored procedures, etc.) and SQL scripts.
- Defined and documented complex technical designs/requirements for integrations and data warehousing; participated in code reviews and ensured designed systems were reliable, self-recovering and required minimal support.
- Worked on a proof of concept using Informatica CIH (Cloud Integration Hub) to publish order details from the CRM application to an Orders topic.
- Created various API connections based on SOAP, REST and REST V2 in IICS.
- Created processes that conduct service calls through APIs interfacing with third-party applications in IICS.
- Managed and configured IICS administration per requirements.
- Created an Outbound Message (OBM) process between Salesforce and SQL Server to capture real-time changes in Salesforce.
- Created ICRT and IICS jobs using Snowflake Cloud Connector V2 and REST API V2 connector to send and receive data from third party vendor applications.
- Created IICS Data Synchronization Jobs to load data to Snowflake.
- Used Integration Templates to create reusable mappings in IICS.
- Parsed HL7 messages and worked with HL7 delimiter definitions (Segment Terminator, Field Separator, Component Separator, Subcomponent Separator, Repetition Separator, Escape Character) for identifying and separating HL7 data (a sketch follows this list).
- Performance tuning was done at the functional level and map level. Used relational SQL wherever possible to minimize the data transfer over the network.
- Worked with various source systems to find and fix data anomalies, and designed/implemented ETL processes using Informatica PowerCenter, Big Data and Informatica Cloud with AWS Redshift, the REST V2 connector, third-party APIs, the Informatica LDAP connector, NoSQL, Netezza and Oracle as the databases.
- Performed POC using IICS Snowflake Cloud Data Warehouse V2 Connector.
- Created Informatica mappings with PL/SQL procedures/functions to build business rules for loading data. Worked with Oracle DDL and DML scripts and established relationships between tables using constraints.
- Maintained SQL scripts and complex queries for analysis and extraction; coded and tested various database objects such as views, functions and stored procedures using SQL and PL/SQL.
- Uploaded delimited files directly into the data lake from the UI.
- Created Mappings to load data using various transformations like Source Qualifier, Sorter, Lookup, Expression, Router, Joiner, Filter, Update Strategy and Aggregator transformations.
- Developed MDM Hub Match and Merge rules, batch jobs and batch groups.
- Worked specifically with the Normalizer transformation, importing COBOL copybooks for the incoming fixed-width files and using the Normalizer to normalize the data.
- Worked with dynamic Lookup caches and the Sequence Generator cache.
- Involved in the creation of database tables, materialized views, indexes and PL/SQL stored procedures, functions, triggers and packages. Extensively used T-SQL and PL/SQL for development of procedures, functions, packages and triggers.
- Designed and developed the Semantic Layer with the help of the BI developer and data architect, preparing aggregates and the required denormalized tables for quality performance reporting using Oracle ODI and PL/SQL stored procedures.
- Created reusable transformations and mapplets for use in multiple mappings and worked with shortcuts for various Informatica repository objects.
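A minimal sketch of the HL7 v2 delimiter handling described above: the field separator and the MSH-2 encoding characters (component, repetition, escape, subcomponent) are read from the message header, then used to split segments, fields, repetitions, components and subcomponents. The sample message is synthetic and escape-sequence decoding is omitted.

```python
# Read HL7 v2 delimiters from the MSH header and split a message (sketch).
def parse_hl7(message: str):
    segments = message.strip().split("\r")      # segment terminator is a carriage return
    msh = segments[0]
    field_sep = msh[3]                          # character right after "MSH", usually "|"
    comp, rep, esc, sub = msh[4:8]              # MSH-2 encoding characters, usually "^~\&"
    parsed = {}
    for seg in segments:
        fields = seg.split(field_sep)
        parsed.setdefault(fields[0], []).append([
            [[c.split(sub) for c in rep_val.split(comp)]  # components -> subcomponents
             for rep_val in f.split(rep)]                 # field -> repetitions
            for f in fields[1:]
        ])
    return parsed  # note: esc (escape-sequence decoding) is not applied in this sketch

sample = ("MSH|^~\\&|SENDER|FAC|RECEIVER|FAC|202001011200||ADT^A01|MSG00001|P|2.3\r"
          "PID|1||12345^^^MRN||DOE^JOHN")
print(parse_hl7(sample)["PID"])
```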
Environment: Informatica Power Center 10.1, IDQ 10.1, Informatica MDM 10.0, Informatica Cloud (IICS), ICRT (Informatica Cloud Real Time), SQL Server, Azure, Jitterbit, Salesforce, Snowflake, Netezza, AWS EC2, AWS Redshift, SQL, PL/SQL, Oracle Database 11g, Toad for Oracle, Unix Shell scripts.
Confidential, Denver, CO
Sr. ETL Cloud Developer
Responsibilities:
- Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
- Worked on requirements gathering, architecting the ETL lifecycle and creating design specifications, ETL design documents.
- Analyzed the business requirements, framed the business logic for the ETL process and maintained the ETL process using Informatica Power Center.
- Worked on Informatica Power Center tools: Designer, Repository Manager, Workflow Manager and Workflow Monitor.
- Built and maintained ETL mappings and workflows using IICS.
- Created and managed automated data exchanges for file retrievals and submissions using CMI (Cloud Mass Ingestion).
- Provided real-time, end-to-end integration solutions using APIs and various databases with IICS.
- Built incremental/delta ETL solutions for the Oracle EDW and ThoughtSpot (data analytics platform) using IICS.
- Designed, built and tested MCT and ICRT services and processes according to business and application requirements.
- Extracted data from DB2 database on Mainframes and loaded it into SET and MULTISET tables in the Teradata database by using various Teradata load utilities.
- Involved in ETL code using PL/SQL to meet requirements for extraction, transformation, cleansing and loading of data from source to target data structures.
- Responsible for unit and integration testing of Informatica sessions, batches and the target data.
- Scheduled the workflows to pull data from the source databases at weekly intervals to maintain the most current and consolidated data.
- Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract data from multiple source systems like Oracle, XML, SQL Server and flat files and load it into Oracle.
- Used the Informatica Power Center Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.
- Specialized in developing SQL queries, with hands-on experience writing complex queries to stage clean data sets for front-end web applications.
- Worked on Teradata and its utilities (TPump, FastLoad) through Informatica.
- As a PL/SQL developer, created stored procedures in Oracle within the constraints of the available technology.
- Worked on design and development of Informatica mappings and workflows to load data into the staging area, data warehouse and data marts in Oracle.
- Built ad-hoc ETL solutions using disparate data sources like flat files, RDBMS, and AWS RDS and S3 buckets.
- Wrote new SQL and PL/SQL code to support enhancements to modules.
- Created MDM mapping and configured match and merge rules to integrate the data received from different sources.
- Created batch groups in the Utilities Workbench and scheduled them externally using Power Center and Autosys.
- Generated and used SQL queries to fetch the required statistics from the repository database in different environments.
- Designed the source-to-target mappings and was involved in designing the selection criteria document.
- Responsible for manually starting and monitoring production jobs based on business users' requests.
- Analyzed the business requirements and created ETL logic to extract data from flat files produced by manufacturing sites in different geographic regions and load it into the data warehouse.
- Migrated code between environments and maintained code backups.
- Worked on staging data into work tables, cleansing it, and loading it downstream into dimensions (using Type 1 and Type 2 logic) and fact tables that constitute the data warehouse.
- Implemented various Performance Tuning techniques.
- Worked with pmcmd to interact with the Informatica server from the command line and to execute shell scripts (a sketch follows this list).
- Project followed an Agile SDLC methodology with two-week software releases to business users.
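A hedged sketch of driving a PowerCenter workflow through pmcmd from Python, per the bullet above; the integration service, domain, credentials, folder and workflow names are placeholders, and pmcmd is assumed to be on the PATH.

```python
# Launch a PowerCenter workflow via pmcmd and surface its exit status (sketch).
import subprocess

def start_workflow(workflow: str, folder: str) -> int:
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC_DEV",          # integration service (placeholder)
        "-d", "Domain_Dev",            # domain (placeholder)
        "-u", "etl_user", "-p", "etl_password",
        "-f", folder,
        "-wait",                       # block until the workflow completes
        workflow,
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    return result.returncode           # 0 indicates the workflow succeeded

if __name__ == "__main__":
    raise SystemExit(start_workflow("wf_load_sales_dm", "SALES_DM"))  # placeholders
```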
Environment: Informatica Power Center 10.0, IDQ 9.6.1, Informatica MDM, IICS, ICRT, AWS, Oracle Database 11g, SQL, SQL server, PL/SQL, Toad for Oracle, Unix Shell scripts.
Confidential, Deerfield, IL
ETL Developer
Responsibilities:
- Developed mappings, reusable objects, transformations and mapplets using the Mapping Designer, Transformation Developer and Mapplet Designer in Informatica Power Center 9.6.
- Extensively used DQ transformations such as Address Validator, Exception, Parser and Standardizer; solid experience debugging and troubleshooting sessions using the Debugger and Workflow Monitor.
- Responsible for studying the existing data warehouse and migrating existing PL/SQL packages, stored procedures, triggers and functions to Informatica Power Center.
- Designed and developed complex mappings like Slowly Changing Dimensions Type 2 (time stamping) in the Mapping Designer to maintain a full history of transactions (a sketch follows this list).
- Used SQL queries and PL/SQL database programming (writing packages, stored procedures/functions and database triggers).
- Developed database applications to meet business needs using Oracle PL/SQL features; created packages, procedures, functions and triggers.
- Used Informatica Power Center for extraction, transformation and loading of data from heterogeneous source systems into the target database.
- Developed efficient PL/SQL clean-up scripts to remove older transactional data from non-partitioned tables.
- Involved in the data loading sequence and populated data into the staging area and warehouse with business rules.
- Extensively used ETL to load Flat files, XML files, Oracle and legacy data as sources and Oracle, Flat files as targets.
- Created sessions and managed the workflows using various tasks like Command, Decision, Event Wait, Control, Event Raise and Email in Workflow Manager.
- Wrote SQL queries to verify the data in source and target systems.
- Used PL/SQL to build, format and display user screens, web pages and reports, and to write code that resides in the database.
- Used TOAD to run SQL queries and validate the data in the warehouse.
- Extensively used the Informatica Debugger for debugging the mappings.
- Hands-on experience with Informatica MDM and proficient with various Informatica stages and database objects.
- Extensively worked with Korn shell scripts for parsing and moving files and for re-creating parameter files in post-session command tasks.
- Used profile files and shell scripts to recreate dynamic parameter files.
- Scheduled Informatica workflows using the Tidal scheduler.
- Migrated Informatica code from DEV to TEST environments by creating deployment groups and folders, applying labels and creating queries in the Informatica Repository Manager.
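An illustrative expire-and-insert sketch of the time-stamped SCD Type 2 pattern referenced above, written as Oracle-style SQL driven from Python through a DB-API connection; the dimension table, sequence and column names are hypothetical.

```python
# SCD Type 2 apply step: expire the current dimension row, then insert the new version.
# Table, sequence and column names are hypothetical; conn is any Oracle DB-API connection.
EXPIRE_SQL = """
UPDATE customer_dim
   SET eff_end_ts = :load_ts, current_flag = 'N'
 WHERE customer_id = :customer_id
   AND current_flag = 'Y'
"""

INSERT_SQL = """
INSERT INTO customer_dim
    (customer_key, customer_id, name, eff_start_ts, eff_end_ts, current_flag)
VALUES
    (customer_dim_seq.NEXTVAL, :customer_id, :name, :load_ts, NULL, 'Y')
"""

def apply_scd2(conn, changed_rows, load_ts):
    # changed_rows: iterable of dicts with keys customer_id and name (the changed attributes).
    cur = conn.cursor()
    for row in changed_rows:
        cur.execute(EXPIRE_SQL, {"load_ts": load_ts, "customer_id": row["customer_id"]})
        cur.execute(INSERT_SQL, {"load_ts": load_ts, **row})
    conn.commit()
```

Expiring before inserting keeps exactly one row per business key flagged current, while the start/end timestamps preserve the full change history.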
Environment: Informatica Power Center 10.1, SQL, PL/SQL, UNIX, Shell Scripting, SQL Server 2008, Sybase, Oracle 11g, DB2, Control-M.
Confidential, Atlanta, GA
ETL Developer
Responsibilities:
- Analyzed the business requirements, framed the business logic for the ETL process and maintained the ETL process using Informatica Power Center.
- Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator.
- Responsible for creating a scalable, multithreaded ETL PL/SQL framework to run batch jobs that move data from the landing zone to target tables.
- Developed Informatica ETL mappings to load data into the staging area; extracted from mainframe files and databases and loaded into the Oracle 11g target database.
- Created PL/SQL scripts to perform ETL from different data sources (mainframe, DB2, Oracle) to the Oracle database. Implemented Oracle external tables with parallel pipelined table functions and utilities like SQL*Loader.
- Worked with the Informatica Power Center ETL tool to perform custom cleansing when data is loaded to the landing tables in the MDM.
- Worked on SQL coding to override the generated SQL query in Informatica.
- Designed and developed PL/SQL packages, stored procedures, tables, views, indexes and functions; experienced dealing with partitioned tables and automating partition drop and create in the Oracle database.
- Involved in migrating the ETL application from the development environment to the testing environment.
- Performed data validation in the target tables using complex SQL to make sure all modules were integrated correctly.
- Performed data conversion/data migration using Informatica Power Center.
- Created PL/SQL API functions to generate code to perform ETL; trained junior developers through multiple workshops.
- Created UNIX shell scripts for Informatica pre/post-session operations.
- Documented and presented production support documents for the components developed when handing over the application to the production support team.
- Worked with XML targets for data coming from the SQL Server source.
- Performed query tuning and used SQL query overrides in the Source Qualifier transformation to pull historical data no earlier than a given date, i.e., change data capture (CDC) (a sketch follows this list).
- Configured and set up a secure FTP connection to the vendor using the Informatica Managed File Transfer software.
- Created complex shell scripts to automate actions such as validating the presence of indicator files.
- Pushed the generated compressed and encrypted XML files and flat files to the external vendor using MFT.
- Involved in Unit testing and system integration testing (SIT) of the projects.
- Assisted team members with the mappings developed as part of knowledge transfer.
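An illustrative sketch of the CDC-style Source Qualifier override described above; the table, column names and date format are hypothetical.

```python
# Build a Source Qualifier-style SQL override that limits the extract to rows
# changed on or after a given date (illustrative names throughout).
def build_cdc_override(last_extract_date: str) -> str:
    return (
        "SELECT o.order_id, o.status, o.last_update_dt "
        "FROM orders o "
        f"WHERE o.last_update_dt >= TO_DATE('{last_extract_date}', 'YYYY-MM-DD')"
    )

print(build_cdc_override("2024-01-01"))
```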
Environment: Informatica Power Center 8.6.1, MDM, Windows Server 2008, MS SQL Server 2005, Batch Scripting, Perl Scripting, XML Targets, Flat Files, Tidal 5.3.1, UNIX.
Confidential
ETL Developer/Analyst
Responsibilities:
- Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL specifications.
- Involved in dimensional modeling and data modeling using the Erwin tool.
- Created high-level Technical Design Document and Unit Test Plans.
- Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
- Tuned queries that made repeated PL/SQL function calls by using scalar sub-queries instead; created views embedding function logic into SQL (a sketch follows this list).
- Wrote complex SQL override scripts at the Source Qualifier level to avoid Informatica Joiners and Lookups and improve performance, as the data volume was heavy.
- Responsible for creating workflows; created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
- Developed PL/SQL stored procedures calling in-house services from the database.
- Prepared user requirement documentation for mapping and additional functionality.
- Extensively used Power Center ETL to load data from source systems like flat files into staging tables and on into the Oracle target database. Analyzed the existing systems and performed a feasibility study.
- Analyzed current system and programs and prepared gap analysis documents
- Experienced in performance tuning and optimization of SQL statements using SQL Trace.
- Involved in Unit, System integration, User Acceptance Testing of Mapping.
- Supported the process steps under development, test and production environment
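A before/after illustration of the scalar sub-query tuning mentioned above: wrapping a deterministic PL/SQL function call in a scalar sub-query lets Oracle cache the result per distinct input instead of re-executing the function for every row. The function and table names are hypothetical.

```python
# Before: the PL/SQL function runs once per row of orders.
SLOW_SQL = """
SELECT order_id, fn_customer_region(customer_id) AS region
  FROM orders
"""

# After: the scalar sub-query form is eligible for scalar sub-query caching,
# so the function runs roughly once per distinct customer_id.
TUNED_SQL = """
SELECT order_id,
       (SELECT fn_customer_region(customer_id) FROM dual) AS region
  FROM orders
"""

print(TUNED_SQL)
```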
Environment: Informatica Power Center 8.1.4/7.1.4, Oracle 10g/9i, TOAD, Business Objects 6.5/XI R2, UNIX, ClearCase.