
Sr. Informatica/IDQ Developer Resume


Kansas City, MO

SUMMARY

  • Around 8 years of experience in the IT industry with an emphasis on Data Warehousing tools, using industry-accepted methodologies and procedures.
  • Technical expertise in ETL methodologies, Informatica Data Quality (IDQ), and Informatica 9.x/8.x/7.x/6.x - Informatica Server Manager, Repository Server Manager, and Power Exchange.
  • PowerCenter client tools - Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager/Monitor - and server tools. Expertise in Informatica Data Quality; worked on Informatica MDM.
  • Expertise in Data Warehousing, Data Migration, Data Modeling, and Data Cleansing.
  • Experience in developing, optimizing, and tuning mappings using Informatica.
  • Expertise in various types of transformations such as Lookup, Update Strategy, Stored Procedure, Joiner, Filter, Aggregator, Rank, Router, Normalizer, Sorter, External Procedure, Sequence Generator, and Source Qualifier, as well as SCD Type-2 logic.
  • Expertise in implementing various data quality transformations, with experience in data profiling and generating scorecards.
  • Good experience in implementing data analysis using the Informatica Analyst tool.
  • Extensive experience with the Address Validator transformation for North America address validation. Built several reusable components in IDQ using Standardizers and reference tables that can be applied directly to standardize and enrich address information.
  • Created custom rules to validate zip codes, states and segregated address data based on country.
  • Created web services for address mapplets of different countries to integrate with SOAP UI.
  • Expertise using the Debugger, mapping wizards, Workflow Administrator, and Workflow Monitor.
  • Expertise in creating, building, running, and scheduling batch job streams, sessions, and workflows using the Workflow Manager and Server Manager.
  • Experience in writing UNIX shell scripting.
  • Experienced with Power Exchange for extracting data from SAS and VSAM source systems.
  • Strong back end experience in writing PL/SQL stored procedures, functions, packages and triggers.
  • Good knowledge of and experience with implementing Software Development Life Cycle (SDLC) practices.
  • Expertise in design and implementation of all Slowly Changing Dimension (SCD) types (a representative SQL sketch follows this summary).
  • Experienced in loading data, troubleshooting, debugging mappings, and performance tuning of Informatica sources, targets, mappings, and sessions, and fine-tuned transformations to make them more efficient in terms of session performance.
  • Demonstrated understanding of best practices in SQL queries in different RDBMS such as Oracle, SQL Server, and Netezza.
  • Experienced with Integration of data from heterogeneous sources such as Relational tables, flat files, MS Excel, and XML files.
  • Good knowledge of data migration in the Insurance and Telecom domains.
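
As a representative illustration of the SCD Type 2 pattern referenced above, the sketch below shows the expire-and-insert logic in plain Oracle SQL. It is a minimal sketch only; CUSTOMER_DIM, CUSTOMER_STG, and their columns are hypothetical names used for illustration rather than objects from any actual engagement.

    -- Minimal SCD Type 2 sketch (illustrative table and column names).
    -- 1) Expire the current dimension row when a tracked attribute changed.
    UPDATE customer_dim d
       SET d.eff_end_dt   = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- 2) Insert a new current version for changed or brand-new customers.
    INSERT INTO customer_dim (customer_id, address, status,
                              eff_start_dt, eff_end_dt, current_flag)
    SELECT s.customer_id, s.address, s.status, SYSDATE, NULL, 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id  = s.customer_id
                          AND d.current_flag = 'Y'
                          AND d.address      = s.address
                          AND d.status       = s.status);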

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter, Informatica Data Quality (IDQ), Power Exchange, IDD, SQL Server Integration Services (SSIS)

Databases: Oracle, IBM DB2, SQL Server, Teradata

Reporting Tools: MicroStrategy, Business Objects, Crystal Reports, OBIEE

Development Tools: PL/SQL, SQL, Developer 2000, Oracle Developer Suite, SQL*Plus, SSRS, Toad 8.x/9.x/10.x

OLAP Tools: Business Objects 5.0, Oracle Discoverer 4.1, Cognos Series 8, Transformer 6.5, PowerPlay 6.5

Operating Systems: UNIX (PuTTY), SunOS, AIX, Windows, Mainframe

Languages: SQL, PL/SQL, C, UNIX Shell Scripting

Databases: Oracle 8i/9i/10g, SQL Server 7.0

PROFESSIONAL EXPERIENCE

Confidential, Kansas City, MO

Sr. Informatica/IDQ Developer

Responsibilities:

  • Worked with business analysts to gather requirements. Responsible for converting functional requirements into technical specifications and design documents. Designed and developed Informatica ETL processes to integrate data from loyalty, sales, and third-party vendor systems that are being phased out.
  • Designed and configured Informatica web services to automate EID requests using the Web Services Hub.
  • Developed mappings using the Web Services Provider.
  • Worked with various sources such as flat files, relational tables, XML, and web services as part of Informatica mappings.
  • Designed and developed the dimensional model for the Customer Data Warehouse in Teradata 14 and created fact and dimension tables in the star schema for the data warehouse.
  • Implemented data cleansing, match/merge, de-duplication, and standardization using Informatica IDQ transformations like Parser, Standardizer, Classifier, Match, Merge, and Decision to create golden copies of customer records. Successfully completed data integration into MDM.
  • Analyzed and extracted data from PeopleSoft to an Oracle instance.
  • Integrated Oracle DAC with Informatica PowerCenter and configured DAC for processing.
  • Based on user requirements, debugged DAC tasks and Informatica mappings and changed DAC parameters to load multiple subject areas into the data warehouse.
  • Created profiles to analyze the structure of source data. Created and applied rules in profiles.
  • Created scorecards to visually represent data quality progress and to score value frequencies for columns.
  • Used the IDQ Analyst tool to create rules and apply them while running column profiling.
  • Worked on Designer tools: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
  • Designed and implemented Informatica Data Quality (IDQ v9.1) applications for business and technology users across the full development life cycle.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities.
  • Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator.
  • Created complex SCD Type 1, Type 2 and Type 3 mappings using Dynamic Lookup, Joiner, Router, Union, Expression transformations.
  • Created various rules in IDQ to satisfy completeness, conformity, integrity, and timeliness requirements.
  • Cleansed, standardized, labeled, and fixed data gaps in IDQ, checking against reference tables to resolve major business issues.
  • Identified issues and performance bottlenecks, and optimized the Business Intelligence dashboards and reports.
  • Exposed IDQ mappings/mapplets as web services.
  • Worked on enhancements for stored procedures and IDQ web services.
  • Created Pre-SQL and Post-SQL scripts that need to be run at the Informatica session level.
  • Developed SQL, PL/SQL, T-SQL Stored Procedures and Views in Oracle 11g, SQL Server 2012 and Teradata 14.
  • Extensively used explain plans to identify SQL query performance issues and created indexes to optimize execution time (see the sketch following this list).
  • Developed BTEQ, FastLoad and MultiLoad scripts in Teradata. Wrote UNIX Shell scripts.
  • Scheduled various daily and monthly ETL loads using Control-M.
  • Developed the automated and scheduled load processes using Autosys scheduler.
  • Populated and updated the data dictionary and standardized naming conventions. Delivered ETL and data warehouse technical design documentation.
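
A minimal Oracle sketch of the explain-plan and indexing workflow mentioned in this list; SALES_FACT and its columns are hypothetical names used only to illustrate the approach.

    -- Inspect the optimizer's plan for a slow lookup query.
    EXPLAIN PLAN FOR
    SELECT order_id, order_dt, amount
      FROM sales_fact
     WHERE customer_key = :cust_key;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- If the plan shows a full table scan on the filter column,
    -- an index typically converts it to an index range scan.
    CREATE INDEX sales_fact_cust_ix ON sales_fact (customer_key);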

Environment: Informatica PowerCenter 9.5.1, Informatica Data Quality (IDQ) 9.1, B2B DX/DT v9.1, Oracle DAC, Teradata 14, Oracle 11g, PL/SQL, T-SQL, Toad, Erwin, SOAP UI, Teradata SQL Assistant, JIRA, UNIX, Windows

Confidential, Sacramento, CA

Sr. ETL/IDQ Developer

Responsibilities:

  • Coordinated in daily team meetings, technical code review meetings and interacted with business people for better technical solutions and proposed ETL strategy.
  • Worked on all of the Informatica PowerCenter 9.5.1 tools - Repository Manager, Designer (Source/Target Analyzer, Mapping/Mapplet Designer, and Transformation Developer), Workflow Manager/Monitor, Server Manager, IDQ, etc.
  • Validated, tuned, and debugged/fixed the old mappings, tested stored procedures and functions, tested Informatica sessions, and worked out better technical solutions for the new mappings to ensure source/target compatibility across version changes.
  • Provided technical solutions for complex mappings; created various complex mappings using transformations such as Source Qualifier, Aggregator, Expression, Connected/Unconnected Lookup, Router, Filter, Update Strategy, SQL, Sequence Generator, Union, Rank, Joiner, Stored Procedure, and other transformations. Successfully implemented SCD Type 1/Type 2 for inserting, updating, and deleting in the target tables to maintain history changes.
  • Worked in the Workflow Manager/Monitor to create and schedule workflows and worklets (command, email, assignment, control, event wait/raise, conditional flow, and session tasks) and configured them according to business logic and requirements to load data from different sources to targets.
  • Identified bottlenecks in sources, targets, mappings, transformations, sessions, the database, and the network, then fixed them for performance tuning, and worked on SQL query tuning.
  • Tuned the existing mappings for better performance; used mapping and session variables/parameters, parameter files, mapplets, and reusable transformations for reuse during the development life cycle.
  • Prepared ETL technical mapping documents along with all test cases for every mapping, as well as data migration documents for smooth transfer of the project from development to testing and then to production, and uploaded them to SharePoint.
  • Prepared effective unit, integration, and system test cases of mappings for various stages to capture data discrepancies/inaccuracies and ensure successful execution of accurate data loading.
  • Created pre- and post-session UNIX scripts, functions, triggers, and stored procedures to drop and re-create indexes and to handle complex calculations on the tables. Designed and coded major change requests per the new requirements.
  • Worked with the Informatica Data Quality 9.5.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, reporting, and monitoring of the data.
  • Identified and eliminated duplicates in datasets through IDQ 9.5.1 components such as Edit Distance, Jaro Distance, and Mixed Field Matcher; this enables the creation of a single view of customers and helps control costs associated with mailing lists by preventing multiple pieces of mail (a SQL sketch of the matching idea follows this list).
  • Used IDQ tools to create mapplets/mappings with business logic, exposed via WSDL links to present the data in the MS Word and MS Excel add-on menus.
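
The fuzzy-matching idea behind the IDQ de-duplication above can be illustrated in plain Oracle SQL with the built-in UTL_MATCH package; the IDQ matchers themselves are configured in the Developer tool, and CUSTOMER_STG and its columns are hypothetical names used only for this sketch.

    -- Flag likely duplicate customer pairs using edit distance and
    -- Jaro-Winkler similarity (analogous to IDQ's Edit Distance and
    -- Jaro Distance matchers).
    SELECT a.customer_id AS id_a,
           b.customer_id AS id_b,
           UTL_MATCH.EDIT_DISTANCE(a.full_name, b.full_name)       AS name_edit_dist,
           UTL_MATCH.JARO_WINKLER_SIMILARITY(a.address, b.address) AS addr_similarity
      FROM customer_stg a
      JOIN customer_stg b
        ON a.customer_id < b.customer_id   -- compare each pair only once
       AND a.zip_code = b.zip_code         -- block on ZIP code to limit comparisons
     WHERE UTL_MATCH.EDIT_DISTANCE(a.full_name, b.full_name) <= 2
       AND UTL_MATCH.JARO_WINKLER_SIMILARITY(a.address, b.address) >= 90;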

Environment: Informatica PowerCenter 9.5.1, MS SQL Server, Oracle 11g, PL/SQL, UNIX, Toad 9.5, PuTTY, Dynamic SQL, Shell Programming, Web Services, SQL Navigator, Informatica IDQ 9.5.1, SharePoint, NetFlex, HIPAA, ILM

Confidential, Irving, TX

ETL/Informatica Developer

Responsibilities:

  • Involved in all phases of the project life cycle including requirement gathering, analysis, coding, testing and implementation
  • Developed ETL procedures consistent across all systems by analyzing business requirement and working closely with various application and business teams
  • Documented technical requirement specifications and detailed designs for ETL processes of moderate and high complexity
  • Involved in analyzing the source data coming from different Data sources such as Oracle, XML, flat files etc.
  • Developed data models/mappings between source systems and warehouse components
  • Worked on Informatica PowerCenter tools - Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, and Transformation Developer
  • Created mappings using transformations like Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Union, Rank, Sequence Generator, Stored Procedure, and XML transformations
  • Scheduled and ran extraction and load processes, and monitored tasks and workflows using the Workflow Manager and Workflow Monitor
  • Used Workflow Manager for creating workflows, worklets, email and command tasks
  • Involved in Performance Tuning at various levels including Target, Source, Mapping, and Session for large data files
  • Used SQL tools like TOAD for data analysis; wrote and executed tuned SQL queries to troubleshoot data issues and to view and validate the data loaded into the warehouse (see the validation sketch after this list)
  • Utilized QlikView to develop complex dashboards like product, orders, traffic, billing, customer and sales with multiple sheets using boxes, buttons, charts and cyclic groups.
  • Resolved dashboard data issues by modifying load scripts and eliminating synthetic keys and closed loops
  • Customized dashboards by using expressions, bookmarks, functions, dynamic charts, filters, action triggers and system variables
  • Created incremental load scripts using QlikView for ordering, billing and service monitoring
  • Connected to different ODBC and OLE DB sources (SQL Server, Oracle) and flat files to create and schedule different complex reports
  • Established a secure environment as well as implemented field and document level security using QlikView Section Access and Triggers
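
A minimal example of the kind of validation SQL run in TOAD to troubleshoot load issues, as referenced in this list; ORDERS_STG and ORDERS_FACT are illustrative names, not actual project tables.

    -- Compare daily row counts between the staging source and the
    -- warehouse target to spot missing or duplicated loads.
    SELECT NVL(s.load_dt, t.load_dt) AS load_dt,
           NVL(s.src_cnt, 0)         AS source_rows,
           NVL(t.tgt_cnt, 0)         AS target_rows
      FROM (SELECT TRUNC(load_dt) AS load_dt, COUNT(*) AS src_cnt
              FROM orders_stg
             GROUP BY TRUNC(load_dt)) s
      FULL OUTER JOIN
           (SELECT TRUNC(load_dt) AS load_dt, COUNT(*) AS tgt_cnt
              FROM orders_fact
             GROUP BY TRUNC(load_dt)) t
        ON s.load_dt = t.load_dt
     WHERE NVL(s.src_cnt, 0) <> NVL(t.tgt_cnt, 0)
     ORDER BY 1;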

Environment: Informatica PowerCenter 9.1.x, TOAD, Oracle 11g, QlikView 10.x, Flat Files, Windows

Confidential, New York, NY

Informatica Developer

Responsibilities:

  • Met with business stakeholders and other technical team members to gather and analyze application requirements.
  • Involved in development of logical and physical data models that capture the current state. Developed and tested all the Informatica data mappings, sessions, and workflows, involving several tasks.
  • Developed end-to-end ETL processes for the Trade Management Data Mart using Informatica.
  • Worked on the Source Analyzer, Target Designer, Mapping and Mapplet Designer, Workflow Manager, and Workflow Monitor.
  • Created mappings for initial load in Power Center Designer using the transformations Expression, Router and Source Qualifier.
  • Created complex mappings for full load into target in Power Center Designer using Sorter, Connected Lookup, Unconnected Lookup, Update Strategy, Router, Union etc.
  • Created mapplets to reuse sets of transformations across mappings.
  • Responsibilities included creating and scheduling the sessions.
  • Created various tasks to apply conditions in the workflows.
  • Created mappings, mapplets, and sessions for data loads and data cleansing, and enhanced existing mappings where changes were required using Informatica PowerCenter.
  • Involved in extracting data from Oracle and flat files. Developed and implemented various enhancements to the application in the form of production fixes and new production rollouts.
  • Worked on identifying facts, dimensions and various other concepts of dimensional modeling which are used for data warehousing projects.
  • Extensively worked on conformed dimensions for the purpose of incremental loading of the target database (see the incremental-load sketch after this list).
  • Worked on FastLoad, MultiLoad, TPump, and FastExport loading techniques through Informatica into Teradata.
  • Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database, thereby tuning it by identifying and eliminating bottlenecks for optimum performance.
  • Involved in fixing invalid mappings and testing of Informatica sessions, worklets, and workflows.
  • Created parameters and variables for the reusable sessions.
  • Analyzed the various bottlenecks at source, target, mapping and session level.
  • Tuned the mappings and SQL scripts for better performance.
  • Designed the ETL processes using Informatica to load data from DB2, SQL Server, and flat files into the target database.
  • Assigned work to onshore and offshore developers and was responsible for providing technical expertise for the design and execution of ETL projects.
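
The incremental loading mentioned above is commonly driven by a last-extract watermark; the sketch below shows the idea in plain SQL, assuming a hypothetical ETL_CONTROL table that records the previous successful load time (names are illustrative, not from the actual project).

    -- Pull only the rows changed since the last successful load.
    SELECT src.*
      FROM orders_src src
     WHERE src.last_update_dt >
           (SELECT ctl.last_load_dt
              FROM etl_control ctl
             WHERE ctl.subject_area = 'ORDERS');

    -- After a successful load, advance the watermark.
    UPDATE etl_control
       SET last_load_dt = SYSDATE
     WHERE subject_area = 'ORDERS';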

Environment: Informatica 8.6.1, Teradata, Oracle 10g, PL/SQL, DB2, XML, SQL*Plus, MS Excel, UNIX (AIX), UNIX Shell
