
Sr.informatica Mdm Developer Resume


Hartford, CT

SUMMARY

  • 9+ years of IT experience in Data Warehousing, Database Design and ETL processes across business domains including the finance, telecom, manufacturing and health care industries.
  • Highly proficient in Development, Implementation, Administration and Support of ETL processes for Large - scale Data warehouses using Informatica Power Center.
  • Worked extensively on ETL processes using Informatica PowerCenter 10.x/9.x/8.x/7.x, IDQ 10.2.1/9.0.1/8.6, MDM 10.x, B2B, PCF, Teradata, SQL Server and Oracle databases.
  • Extensively used ETL methodologies for supporting data extraction, transformation and loading in a corporate-wide ETL solution using Informatica PowerCenter.
  • Worked extensively in Informatica Designer, Workflow Manager and Workflow Monitor for data loads.
  • Experience with special emphasis on system analysis, design, development and implementation of ETL methodologies in all phases of the data warehousing life cycle and relational databases using IBM InfoSphere DataStage 11.5/11.3/9.1/8.5/8.1.
  • Extensive experience as an ETL developer working with IBM InfoSphere Information Server DataStage (versions 11.5, 11.3, 9.1, 8.5, 8.1, 8.0.1) and Ascential DataStage 7.5.2 Enterprise Edition (Parallel Extender) and Server Edition.
  • Hands-on experience on foundation tools like Fast Track, Metadata Work Bench, DataStage and Quality Stage, Information Analyzer etc.
  • Extensive experience with the IBM InfoSphere DataStage ETL tool.
  • Expertise in analyzing and understanding data dependencies in database tables using the corresponding metadata stored in the DataStage repository.
  • Extensive experience in SOAP UI
  • Expertise in Informatica MDM, Amazon S3 browser
  • Extensive experience in using various Informatica Designer Tools such as Source Analyzer, Transformation Developer, Informatica MDM Work Flow Manager, Mapping Designer and Mapplet Designer.
  • Extensive experience in Design, Development, Implementation, Production Support and Maintenance of Data Warehouse Business Applications in E-commerce software, Oil & Gas, Health Care, Insurance, Financial industries.
  • Experience in development and maintenance of Oracle 11g/10g/9i/8i, PL/SQL, SQL*Plus, TOAD, SQL*Loader, SQL, stored procedures, functions, analytic functions, constraints, indexes and triggers.
  • Experienced in IDQ (9.x, 9.5.1) handling LDOs, PDOs and transformations to cleanse and profile incoming data using the Standardizer, Labeler, Parser and Address Validator transformations; 8 years of experience with different versions of the Oracle database (11g/10g/9i/8i).
  • Excellent working knowledge of C shell scripting and job scheduling on multiple platforms; experience with the UNIX command line and Linux.
  • Experience in designing MDM architecture solutions, complex match rules, and complex custom survivorship rules that cannot be handled out of the box by the MDM tool, to determine the best version of truth (BVT), aka the Golden Record, and reusable frameworks.
  • Experience in creating batch scripts in DOS and Perl Scripting.
  • Experience in ETL development process using Informatica for Data Warehousing, Data migration and Production support.
  • Sound knowledge of relational and dimensional modeling techniques for data warehouse (EDS/data mart) concepts and principles (Kimball/Inmon): star/snowflake schema, SCDs, surrogate keys and normalization/de-normalization.
  • Data modeling experience in creating Conceptual, Logical and Physical Data Models using Erwin Data Modeler.
  • Experience with TOAD and SQL Developer database tools to query, test, modify and analyze data, create indexes, and compare data from different schemas.
  • Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Worked on Slowly Changing Dimensions (SCDs) Types 1, 2 and 3 to keep track of historical data.
  • Knowledge in Data Analyzer tools like Informatica Power Exchange (Power Connect) to capture the changed data.
  • Proficiency in data warehousing techniques for data cleansing, surrogate key assignment and Change data capture (CDC).
  • Experience in integration of various data sources like Oracle, DB2, flat files and XML files into an ODS, and good knowledge of Teradata 12.0/13.0, SQL Server 2000/2005/2008 and MS Access 2003/2007.
  • Expertise in implementing complex business rules by creating re-usable transformations, Mapplets and Mappings.
  • Developed Tableau visualizations and dashboards with interactive views, trends and drill-downs, along with user-level security, using Tableau Desktop.
  • Hands on experience in MDM development.
  • Optimized solutions using various performance-tuning methods: SQL tuning; ETL tuning, i.e. optimal configuration of transformations, targets, sources, mappings and sessions; and database tuning using indexes, partitioning, materialized views, procedures and functions.
  • Adept in formulating test plans, test cases, test scenarios and test approaches, and setting up test environments in conjunction with the testing team; expertise in formulating test approaches for load testing, integration testing, functional testing and user acceptance testing (UAT).
  • Extensively used Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows.
  • Extensive knowledge in all areas of the project life cycle.
  • Experience in both Waterfall and Agile SDLC methodologies.
  • Strong analytical, verbal, written and interpersonal skills.
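
The Slowly Changing Dimension work above can be sketched minimally. This is a hypothetical illustration of SCD Type 2 logic in Python with SQLite; the `dim_customer` table and its columns are invented, not the actual warehouse schema.

```python
import sqlite3

# Minimal SCD Type 2 sketch (illustrative table and column names).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    customer_sk INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    customer_id TEXT, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)""")

def apply_scd2(conn, customer_id, city, load_date):
    row = conn.execute(
        "SELECT customer_sk, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)).fetchone()
    if row is None:
        # New customer: open the first version.
        conn.execute(
            "INSERT INTO dim_customer (customer_id, city, eff_date, end_date, is_current) "
            "VALUES (?, ?, ?, '9999-12-31', 1)", (customer_id, city, load_date))
    elif row[1] != city:
        # Tracked attribute changed: expire the current row, insert a new version.
        conn.execute("UPDATE dim_customer SET end_date = ?, is_current = 0 "
                     "WHERE customer_sk = ?", (load_date, row[0]))
        conn.execute(
            "INSERT INTO dim_customer (customer_id, city, eff_date, end_date, is_current) "
            "VALUES (?, ?, ?, '9999-12-31', 1)", (customer_id, city, load_date))

apply_scd2(conn, "C1", "Hartford", "2020-01-01")
apply_scd2(conn, "C1", "Boston", "2021-06-15")
rows = conn.execute("SELECT city, is_current FROM dim_customer "
                    "ORDER BY customer_sk").fetchall()
print(rows)  # [('Hartford', 0), ('Boston', 1)]
```

Type 1 would simply overwrite the changed attribute in place; Type 2, as here, keeps the full history by closing the old row and opening a new one.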

TECHNICAL SKILLS

Databases: Oracle 10g/9i/11i/R12, DB2, MS SQL Server 7.0/2000/2005/2008, MS Access 2000/2005, Teradata, MySQL

Languages: Transact-SQL, PL/SQL, HTML, CSS, JavaScript, C, C#, Perl, Java

Operating Systems: Windows, Linux, Unix, MS-DOS, Sun Solaris.

OLAP/Reporting Tools: SQL Server Analysis Service (SSAS), SQL Server Reporting Service (SSRS), Share Point MOSS 2007, Business Objects 6.x, Cognos Framework Manager, Tableau

ETL Tools: Informatica PowerCenter 10.x/9.x/8.x/7.x, Informatica Power Exchange, Informatica Data Quality Suite 10.4, Informatica MDM, Informatica Data Director (IDD), Informatica B2B DT, Pentaho Data Integration (PDI), SQL Server Integration Services (SSIS), IBM InfoSphere DataStage 11.5/11.3/9.1/8.7/8.5/8.0

Data Modeling Tools: Microsoft Visio, Erwin r9.0/8.x/7.x, ER/Studio 9.7/9.0/8.0/7.x

SQL Server Tools: SQL Server Management Studio, SQL Server Query Analyzer, SQL Server mail service, DBCC, BCP, SQL Server Profiler

Web Technologies: MS FrontPage, MS Outlook Express, FTP, TCP/IP, SOAP UI

Other Tools: Microsoft Office, Visual Basic 6, Amazon S3 Bucket

Scheduling Tools: Informatica Scheduler, CA Scheduler (Autosys), ESP, Control-M

Data Quality Tools: Informatica Analyst, Informatica Data Quality, Informatica Developer

MDM Tools: Nextgate, Informatica MDM

PROFESSIONAL EXPERIENCE

Confidential, Hartford, CT

Sr.Informatica MDM Developer

Responsibilities:

  • Support production with incidents related to missing data, timeouts, Citrix issues, access/permissions issues, etc.
  • Creating report services, dynamic dashboards, mobile dashboards, report development, and visualization using knowledge of star/snowflake/dimensional data modeling, data warehouse, and reporting techniques.
  • Work closely with Informatica Support to resolve issues identified in the Informatica MDM tool; verifying and validating the developed application.
  • Complete builds in MicroStrategy components like MSTR Desktop/Developer, MSTR Administration, Web, Distribution Services, Transaction Services, and the SDK.
  • Complete the necessary development, including analysis, design, development, and testing, which also requires strong Teradata SQL skills and the ability to create simple to complex schema objects per the requirements.
  • Unit testing and document test results. Capturing and resolving application defects in the unit testing phase.
  • Implement SAM-controlled configuration in IDD and coordinate with other application areas during integration testing. Ensure all information is passed to offshore and resolve their queries by clarifying with the client.
  • Providing system test support; fixing defects that may arise out of the system test phase.
  • Configure hierarchies and customize IDD as per business requirements. Define and maintain metadata, data sources, and validation rules; Match/Merge setup: match paths, match columns, match rules, and properties.
  • Responsible for developing, supporting, and maintenance for the ETL (Extract, Transform and Load) processes using Informatica PowerCenter. Creating MicroStrategy schema and application objects such as logical tables, attributes, facts, hierarchies, metrics, prompts, filters, consolidations, and custom groups.
  • Designing and maintaining complex ETL mappings, sessions, and workflows.
  • Responsible for monitoring all sessions that are running, scheduled, completed, or failed; debugged the mappings of failed sessions to check the progress of the data load.
  • Participated in requirements gathering, business analysis, and user meetings, discussing the issues to be resolved and translated user inputs into ETL Design documents.
  • Experienced in troubleshooting Teradata scripts, fixing bugs, and addressing production issues and performance tuning.
  • Scheduling and running workflows on a daily, weekly, and monthly basis using the ESP Scheduling tool.
  • Extensively worked on encoding the reports as UTF-8 and then transferring the files securely using SFTP (the SSH File Transfer Protocol) and SCP (secure copy over SSH) so that downstream systems could consume our files as their feed, decoding the files after the secure transfer.
  • Performs all technical services required for the integration, installation, and maintenance of information technology and voice communication systems.
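
The UTF-8 encoding step described above can be sketched as follows. This is a hypothetical Python illustration; the source encoding (latin-1) and the file names are assumptions, not the actual feed specification.

```python
import pathlib
import tempfile

def reencode_to_utf8(src: pathlib.Path, dst: pathlib.Path,
                     src_encoding: str = "latin-1") -> None:
    # Decode with the assumed source encoding, then re-write the same
    # text as UTF-8 bytes for downstream consumers.
    dst.write_text(src.read_text(encoding=src_encoding), encoding="utf-8")

workdir = pathlib.Path(tempfile.mkdtemp())
src = workdir / "report.dat"
dst = workdir / "report_utf8.dat"
src.write_bytes("Caf\xe9".encode("latin-1"))  # simulated legacy-encoded report
reencode_to_utf8(src, dst)
print(dst.read_bytes())  # b'Caf\xc3\xa9' -- the UTF-8 bytes for "Café"
```

The actual delivery would then push `dst` over SFTP/SCP; that transport step is omitted here since it depends on site-specific credentials and hosts.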

Environment: Informatica Power Center 10.1/9.6, Power Exchange, Informatica Data Quality 9.6.1, Informatica MDM 10.1/10.2/10.3/10.4.1/10.4.2/10.4.3, UNIX, Windows NT/2000/XP, SQL Server 2008, Oracle SQL 20.4.1, Windows Server, CA ESP, Informatica Scheduler, IDQ (IDE), WinSCP, Amazon S3 Browser.

Confidential, Oriskany, NY

Sr.Informatica MDM Developer

Responsibilities:

  • Involved in Business Analysis and Requirements collection.
  • Develop complex mappings by efficiently using various transformations, Mapplets, Mapping Parameters/Variables, Mapplet Parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence generator, SQL and Web Service transformations.
  • Scheduling the Informatica workflows using the Control-M and Tivoli scheduling tools and troubleshooting the Informatica workflows.
  • Involved in extracting addresses from multiple heterogeneous sources like flat files, Oracle, SAS and SQL Server.
  • Extensively used PowerCenter/PowerMart capabilities such as target override and connected, unconnected and persistent lookups.
  • Used Informatica MDM 10.1 tool to manage Master data of EDW.
  • Extracted consolidated golden records from MDM base objects and loaded into downstream applications.
  • Experience in validating data quality & business rules by using Mapping document and FSD to maintain the data integrity.
  • Represent PowerCenter mapping detailed transformation information in the catalog (EDC).
  • Set up the Enterprise Data Catalog (EDC v10.2x) tool on the Linux platform and implemented data governance in Informatica EDC based on data validation performed in IDQ.
  • Built data lineages between Consumer Master inbound and outbound systems, and business-term and technical-term lineages in EDC and Axon for Data Stewards' better understanding; end-to-end installation and configuration of the Informatica 9.x/10.x product suite (PowerCenter (PC), Enterprise Data Catalog (EDC), Informatica Data Quality (IDQ)); preparation of unit test cases.
  • Extensively worked on UNIX shell scripts for server Health Check monitoring such as Repository Backup, CPU/Disk space utilization, Informatica Server monitoring, UNIX file system maintenance/cleanup and scripts using Informatica Command line utilities.
  • Worked with the team to convert Trillium processes into Informatica IDQ objects.
  • Extensively involved in ETL testing, Created Unit test plan and Integration test plan to test the mappings, created test data. Use of debugging tools to resolve problems & created reference tables to standardize data.
  • Used a variety of Pentaho transformation steps, including Row Normalizer, Row Denormalizer, Database Lookup, Database Join, Calculator, Add Sequence and Add Constants, and various types of inputs and outputs for data sources including tables, Access, text files, Excel and CSV files.
  • Participated in design of Staging Databases and Data Warehouse/Data mart database using Star Schema/Snowflakes schema in data modeling.
  • Worked very closely with the Project Manager to understand the requirements for the reporting solutions to be built.
  • Implemented logic with a database lookup table to maintain parent-child relationships and hierarchy.
  • Used the Pentaho Import Export utility to migrate Pentaho transformations and jobs from one environment to another (DEV/QA/PREPROD/PROD).
  • Created transformations/jobs to take daily backups of the Enterprise repository for DEV/QA/PREPROD/PROD.
  • Experience in writing SQL test cases for Data quality validation.
  • Utilized XML and SQL Server table configuration for the management and migration of SSIS packages in staging and pre-production environments.
  • Deployed packages from test environment to production environment by maintaining multiple package configurations in SSIS utilizing package and project deployment models in SSIS 2012.
  • Implemented advanced features in SSIS such as error handling, transactions, checkpoints, loggings and package configurations utilizing package deployment utility.
  • Worked on Master Data Management (MDM) hub configurations (SIF) and Data Director: extracting, transforming, cleansing and loading the data onto the tables.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Experience in end to end Data quality testing and support in enterprise warehouse environment.
  • Experience in maintaining Data Quality, Data consistency and Data accuracy for Data Quality projects.
  • Assisted with automation and deployment of SQL scripts, SSIS packages and SSRS reports, and maintained daily jobs in different environments using SQL Server Agent and Tivoli Scheduler.
  • Used Tivoli Scheduler to schedule the ETL batch jobs to load the data into EDW.
  • Provided production support to schedule and execute production batch jobs and analyzed log files in Informatica 10.1 Integration servers.
  • Involved in daily status calls with onsite Project Managers and DQ developers to update the test status and defects.
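
Extracting consolidated golden records, as above, rests on survivorship rules that pick a best version of truth per entity. The sketch below is a hypothetical Python illustration: the trust ranking, source names and fields are invented, not the actual MDM match/merge configuration.

```python
# Hypothetical survivorship sketch: prefer the most trusted source,
# then the most recent record, per customer. Ranking is an assumption.
SOURCE_TRUST = {"CRM": 2, "BILLING": 1}  # higher = more trusted

records = [
    {"customer_id": "C1", "source": "BILLING", "updated": "2021-01-05", "phone": "555-0100"},
    {"customer_id": "C1", "source": "CRM",     "updated": "2020-11-30", "phone": "555-0199"},
    {"customer_id": "C2", "source": "BILLING", "updated": "2021-02-01", "phone": "555-0200"},
]

def golden_records(records):
    best = {}
    for rec in records:
        key = rec["customer_id"]
        # Compare by (trust, recency); tuple comparison handles the tie-break.
        rank = (SOURCE_TRUST.get(rec["source"], 0), rec["updated"])
        if key not in best or rank > best[key][0]:
            best[key] = (rank, rec)
    return {k: v[1] for k, v in best.items()}

golden = golden_records(records)
print(golden["C1"]["phone"])  # 555-0199 -- CRM wins on trust despite the older date
```

Real MDM survivorship is typically configured per column rather than per record; this per-record version only shows the ranking idea.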

Environment: Informatica Power Center 10.1/9.6, Pentaho Data Integration 8.0.0 (PDI/Kettle), Power Exchange, Informatica Data Quality 9.6.1, Informatica MDM 10.1, UNIX, Windows NT/2000/XP, SQL Server 2008, SSIS, OBIEE, DB2, Control-M, SVN, Windows Server, CA ESP Workstation Scheduler, IDQ (IDE).

Confidential, Houston, TX.

Sr Informatica Developer/IDQ Developer/DataStage

Responsibilities:

  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
  • Worked on requirements gathering, architecting the ETL lifecycle and creating design specifications, ETL design documents.
  • Identified and eliminated duplicates in datasets through the IDQ Edit Distance, Jaro Distance and Mixed Field Matcher components, enabling the creation of a single view of customers and helping control mailing-list costs by preventing multiple pieces of mail.
  • Responsible for Unit and Integrating testing of Informatica Sessions, Batches and the Target Data.
  • Schedule the workflows to pull data from the source databases at weekly intervals, to maintain most current and consolidated data.
  • Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 10.0.
  • Worked on IBM InfoSphere DataStage 11.5 to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Implemented industry ETL standards, best practices and performance tuning while designing the DataStage jobs.
  • Using IBM InfoSphere DataStage software, extracted data from DB2, Oracle and flat files and loaded it into target tables.
  • Worked on DataStage v11.3 to develop ETL jobs that load the data from staging to target tables in a Teradata database.
  • Developed, tested and implemented DataStage Jobs, JIL Jobs, Ksh scripts for several projects in an Operational Data Store.
  • Configured integration between the ActiveVOS 9.2.4.1 and MDM 10.1 for creating the custom workflow process and designing the orchestration project.
  • Resolved issues related to the enterprise data warehouse (EDW) and stored procedures in the OLTP system; analyzed, designed and developed ETL strategies.
  • Worked with the Services and Portal teams on various occasions on data issues in the OLTP system.
  • Extensively worked on ETL mappings and the analysis and documentation of OLAP report requirements; solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Carried out changes into Architecture and Design of Oracle Schemas for both OLAP and OLTP systems.
  • Involved in creating IDD tasks and assigning roles to the tasks in ActiveVOS to trigger the workflows.
  • Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica ETL tool.
  • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, XML, SQL server and Flat files and loaded into Oracle.
  • Performed Informatica MDM design and implementation; experience in master data management, with Informatica Data Quality (IDQ 9.6.1) used as the tool for data quality measurement.
  • Exposure to Informatica B2B Data Exchange, which supports an expanding diversity of customers and partners and their data with capabilities that surpass the usual B2B solutions.
  • Work on design and development of Informatica mappings, workflows to load data into staging area, data warehouse and data marts in Oracle.
  • Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Created PL/SQL stored procedures, scripts, functions and packages to extract the data from the operational database into simple flat text files using the UTL_FILE package.
  • Design the Source - Target mappings and involved in designing the Selection Criteria document.
  • Wrote BTEQ scripts to transform data; used the Teradata FastLoad, MultiLoad and TPump utilities to load data.
  • Responsible for manually starting and monitoring production jobs based on business users' requests.
  • Responsible for looking into production issues and resolving them in a timely manner.
  • Developed Informatica processes to replace stored-procedure functionality and provide a time-effective, high-data-quality application to the client.
  • Formulated a comprehensive data migration plan wif different conversion strategies, detailed object and field mappings including its transformations and business rules for converting legacy Nationwide data into Oracle.
  • Represent PowerCenter mapping detailed transformation information in the catalog (EDC).
  • Set up the Enterprise Data Catalog (EDC v10.2x) tool on the Linux platform and implemented data governance in Informatica EDC based on data validation performed in IDQ.
  • Built data lineages between Consumer Master inbound and outbound systems, and business-term and technical-term lineages in EDC and Axon for Data Stewards' better understanding; end-to-end installation and configuration of the Informatica 9.x/10.x product suite (PowerCenter (PC), Enterprise Data Catalog (EDC), Informatica Data Quality (IDQ)); preparation of unit test cases.
  • Analyzed the business requirements and created ETL logic to extract data from flat files coming from manufacturing in different geographic regions and load the data into the data warehouse.
  • Prepared ETL Specifications and design documents to help develop mappings.
  • Created Mappings for Historical and Incremental loads.
  • Worked on staging the data into work tables, cleanse, and load it further downstream into dimensions using Type 1 and Type 2 logic and fact tables which constitute the data warehouse.
  • Worked with pmcmd to interact with the Informatica server from command mode and execute the shell scripts.
  • Project based on the Agile SDLC methodology, with 2-week software product releases to the business users.
  • Took part in daily standup and scrum meetings to discuss the project lifecycle and progress and plan accordingly, which is the crux of the Agile SDLC.
  • Provide post release/production support.
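
The "Mappings for Historical and Incremental loads" bullet above follows a common watermark pattern: pull only rows changed since the last successful run. A minimal sketch, assuming illustrative SQLite tables and column names rather than the actual Teradata/Oracle schema:

```python
import sqlite3

# Hypothetical incremental-load sketch using a last-run watermark.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL, updated_at TEXT)")
conn.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL, updated_at TEXT)")
conn.executemany("INSERT INTO src_orders VALUES (?, ?, ?)", [
    (1, 10.0, "2021-01-01"), (2, 20.0, "2021-03-10"), (3, 30.0, "2021-03-15")])

def incremental_load(conn, last_run: str) -> int:
    # Select only rows modified after the previous run's watermark.
    rows = conn.execute(
        "SELECT order_id, amount, updated_at FROM src_orders WHERE updated_at > ?",
        (last_run,)).fetchall()
    conn.executemany("INSERT INTO tgt_orders VALUES (?, ?, ?)", rows)
    return len(rows)

loaded = incremental_load(conn, "2021-03-01")
print(loaded)  # 2 -- only the two rows updated after the watermark
```

A historical (initial) load is the same query with the watermark set to the epoch; subsequent runs advance the watermark to the latest `updated_at` successfully loaded.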

Environment: IBM InfoSphere DataStage 11.5 (Designer, Administrator, Director), QualityStage, Informatica Power Center 10.0, IDQ 9.6.1, Informatica MDM, Informatica B2B DT, Oracle Database 11g, SQL Server, SQL*Plus, TOAD for Oracle, SQL*Loader, Tableau, Unix shell scripts, Teradata.

Confidential, San Francisco, CA

ETL-Informatica Developer/IDQ/DataStage

Responsibilities:

  • Developed mappings, reusable objects, transformations and mapplets using the Mapping Designer, Transformation Developer and Mapplet Designer in Informatica Power Center 9.6.
  • Worked with the Informatica Data Quality 9.5.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, scorecards, and the reporting and monitoring capabilities of IDQ 9.5.1.
  • Extensively used DQ transformations like Address Validator, Exception, Parser and Standardizer; solid experience in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.
  • Designed and developed Complex mappings like Slowly Changing Dimensions Type 2 (Time Stamping) in the mapping designer to maintain full history of transactions.
  • Involved in the data loading sequence and populated data into the staging area and warehouse with business rules.
  • Integrated IDQ mappings through IDQ web service applications as cleanse functions in Informatica MDM using IDQ cleanse Adapters. Extracted the source definitions from various relational sources like Oracle, XML and Flat Files.
  • Extensively used ETL to load Flat files, XML files, Oracle and legacy data as sources and Oracle, Flat files as targets.
  • Created Sessions and managed the Workflows using various tasks like Command, Decision, Event wait, counter, Event raise, Email using Workflow Manager.
  • Extensively used the Informatica Debugger for debugging the mappings.
  • Created Mappings to load data using various transformations like Source Qualifier, Sorter, Lookup, Expression, Router, Joiner, Filter, Update Strategy and Aggregator transformations.
  • Worked specifically with the Normalizer transformation, converting the incoming fixed-width files to COBOL copybooks and using the Normalizer transformation to normalize the data.
  • Worked with Lookup dynamic caches and the Sequence Generator cache.
  • Created reusable transformations and mapplets for use in multiple mappings and worked with shortcuts for various Informatica repository objects.
  • Designed, developed and tested the DataStage jobs using Designer and Director based on business requirements and business rules to load data from source to target tables.
  • Used several stages like Sequential file, Hash file, Aggregator, Funnel, Change Capture, Change Apply, Row Generator, Peek, Remove Duplicates, Copy, Lookup, Join, Merge, Filter, Datasets during the development process of the DataStage jobs.
  • Development of XSLTs and RESTful and SOAP-based web services; developed various ETL jobs, including data extractions and transformation rules based on business requirements, using IBM InfoSphere DataStage 9.1.
  • Established best practices for DataStage jobs to ensure optimal performance, reusability, and restartability.
  • Used Autosys to schedule, run and monitor Datastage jobs.
  • Identified and eliminated duplicates in datasets through IDQ components.
  • Used Teradata utilities like FastLoad and MultiLoad and Teradata SQL Assistant.
  • Worked on Master Data Management (MDM) hub configurations (SIF) and Data Director: extracting, transforming, cleansing and loading the data onto the tables.
  • Design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries and packages in MDM. Integrated IDQ process in MDM.
  • Loaded data files coming from external vendors onto the Teradata EDW using the MultiLoad and FastLoad utilities.
  • Worked on implementation of profiling, scorecards, classifier models, probabilistic models, Human Task and exception record management as part of the IDQ process.
  • Hands-on Informatica MDM experience; proficient with various Informatica stages and database objects.
  • Worked with Informatica Power Exchange to pull the changed data in the form of condense files and load it into Teradata tables using TPump import.
  • Created parameter files with global variables.
  • Integrated the Angular framework with REST services to facilitate login over Java interfacing.
  • Extensively worked with Korn shell scripts for parsing and moving files and even for re-creating parameter files in post-session command tasks.
  • Represent PowerCenter mapping detailed transformation information in the catalog (EDC).
  • Set up the Enterprise Data Catalog (EDC v10.2x) tool on the Linux platform and implemented data governance in Informatica EDC based on data validation performed in IDQ.
  • Built data lineages between Consumer Master inbound and outbound systems, and business-term and technical-term lineages in EDC and Axon for Data Stewards' better understanding; end-to-end installation and configuration of the Informatica 9.x/10.x product suite (PowerCenter (PC), Enterprise Data Catalog (EDC), Informatica Data Quality (IDQ)); preparation of unit test cases.
  • Profile files and shell scripts were used for recreation of dynamic parameter files.
  • Scheduling of Informatica workflows using Tidal Scheduler.
  • Migration of Informatica code from DEV to TEST environments in Informatica by creating deployment groups, folders, applying labels, creating queries in the Informatica Repository Manager.
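
The parameter-file recreation mentioned above can be sketched as follows. This is a hypothetical Python illustration of building an Informatica workflow parameter file at run time; the folder name (SALES), workflow name and parameters are invented for illustration.

```python
import os
import tempfile

def write_param_file(path: str, folder: str, workflow: str, params: dict) -> None:
    # Informatica parameter files use a [Folder.WF:workflow] section header
    # followed by $$Name=value lines for mapping parameters.
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"$${name}={value}" for name, value in params.items()]
    with open(path, "w") as fh:
        fh.write("\n".join(lines) + "\n")

param_path = os.path.join(tempfile.mkdtemp(), "wf_daily_load.par")
write_param_file(param_path, "SALES", "wf_daily_load",
                 {"LoadDate": "2021-06-15", "SourceDir": "/data/in"})
print(open(param_path).read())
```

In practice a post-session command task or pre-run shell script would regenerate this file with the current load date before the workflow starts.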

Environment: IBM DataStage 9.1/8.5, QualityStage, Informatica PowerCenter 9.6.1/9.5.1, Informatica IDQ 9.5.1, Informatica MDM, IDD, UNIX, Teradata 13.0, shell scripts, SQL Server.

Confidential, Charlotte, NC

Informatica/Datawarehouse Developer

Responsibilities:

  • Analyzed the business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica Power Center.
  • Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator.
  • Developed ETL Informatica mappings to load data into the staging area; extracted data from mainframe files and databases and loaded it into an Oracle 11g target database.
  • Created workflows and worklets for Informatica mappings.
  • Worked on Master Data Management (MDM) hub configurations (SIF) and Data Director: extracting, transforming, cleansing and loading the data onto the tables.
  • Design, develop, test and review & optimize Informatica MDM and Informatica IDD Applications.
  • Involved in match/merge and match rules to check the effectiveness of the MDM process on data.
  • Worked on SQL coding to override the generated SQL query in Informatica.
  • Involved in unit testing for the validity of the data from different data sources.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes and functions; experience dealing with partitioned tables and automating the process of partition drop and create in the Oracle database.
  • Involved in migrating the ETL application from the development environment to the testing environment.
  • Performed data validation in the target tables using complex SQL to make sure all modules were integrated correctly.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Performed data conversion/data migration using Informatica PowerCenter.
  • Involved in performance tuning for a better data migration process.
  • Analyzed session log files to resolve errors in mappings, and identified bottlenecks and tuned them for optimal performance.
  • Created UNIX shell scripts for Informatica pre/post-session operations.
  • Automated the jobs using CA7 Scheduler.
  • Worked on Direct Connect process to transfer the files between servers.
  • Document and present the production/support documents for the components developed, when handing-over the application to the production support team.
  • Worked with XML targets for the data coming from the SQL Server source.
  • Performed query tuning; used SQL query overrides in the Source Qualifier transformation to pull historical data from the database no earlier than a given date, i.e., change data capture (CDC).
  • Parameterized the whole process by using the parameter file for the variables.
  • Imported an XSD file to create the XML target and create the hierarchical relationship.
  • Implemented logic using the HTTP transformation to query the web server.
  • Configured and set up a secure FTP connection to the vendor using the Informatica Managed File Transfer software.
  • Created complex shell scripts for various sets of actions that automate the process of executing actions like validating the presence of indicator files.
  • Pushed the compressed and encrypted XML files and flat files generated to the external vendor using MFT.
  • Involved in Unit testing and system integration testing (SIT) of the projects.
  • Assisted the team members with the mappings developed, as part of knowledge transfer.
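
The indicator-file validation described above is a common file-based handshake: a load starts only after the sender drops both the data file and a touch file signalling it is complete. A minimal Python sketch; the `.done` suffix and file names are assumptions, not the actual convention.

```python
import os
import tempfile

def ready_to_load(data_file: str, indicator_suffix: str = ".done") -> bool:
    # A load may start only when both the data file and its indicator exist;
    # the indicator guarantees the sender has finished writing.
    return os.path.exists(data_file) and os.path.exists(data_file + indicator_suffix)

workdir = tempfile.mkdtemp()
data = os.path.join(workdir, "orders.dat")
open(data, "w").close()
print(ready_to_load(data))        # False: indicator not yet dropped by the sender
open(data + ".done", "w").close()
print(ready_to_load(data))        # True: safe to kick off the load
```

The equivalent shell check would be a guard like `[ -f "$f" ] && [ -f "$f.done" ]` before invoking the scheduler job.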

Environment: Informatica PowerCenter 8.6.1/8.1.1, Informatica IDQ 9.5.1, MDM 9.5.1, Windows Server 2008, MS SQL Server 2005, batch scripting, Perl scripting, XML targets, flat files, Tidal 5.3.1, UNIX, IDQ (IDE).
