
Informatica Developer / MDM Consultant Resume


Hoboken, NJ

SUMMARY:

  • 8+ years of ETL and Data Integration experience in developing ETL Mappings, Mapplets, Workflows, Worklets and scripts using Informatica Power Center 9.6.1/9.1/8.x/7.x, Informatica B2B Data Exchange 8.6 and Informatica B2B Data Transformation Studio 8.6.
  • 2+ years of experience in Master Data Management (MDM) concepts and methodologies, IDD and IDQ, and the ability to apply this knowledge in building MDM solutions.
  • Experience in installation and configuration of MDM Hub Console.
  • Exposure to configuration of the Hub Store, Hub Server, Cleanse Match Server and Cleanse Adapter.
  • Used the Informatica Product Availability Matrix (PAM) to identify the proper software versions for MDM.
  • Participated in requirement gathering and preparation of high-level and low-level design documents throughout the Software Development Life Cycle.
  • Experience in design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries and packages.
  • Experience in configuring Mappings, Trust and Validation rules, Match Path, Match Column, Match rules, Merge properties and Batch Group creation.
  • Knowledge of implementing hierarchies, relationship types, packages and profiles for hierarchy management.
  • Experience in testing user exits and configuring Message queues as per business requirement.
  • Experience in integrating external business applications with the Informatica MDM Hub using batch processes and the Services Integration Framework (SIF).
  • Experience working with Oracle, SQL Server and MySQL databases and writing complex queries.
  • Experience in development, configuration and management of Informatica objects, as well as design reviews and code reviews.
  • Exposure to HTML and XML, and experience working with SOAP and REST web services.
  • Familiar with JBoss, WebLogic and WebSphere application servers.
  • Experience in deploying and testing web services (JAR, EAR files) in WebSphere.
  • Good working experience with Agile Software Development Methodology.
  • Extensive ETL experience using Informatica Power Center client tools (Designer, Workflow Manager, Workflow Monitor and Server Manager).
  • Developed OLAP applications using Cognos 8 BI (Framework Manager, Cognos Connection, Report Studio, Query Studio and Analysis Studio) and extracted data from the enterprise data warehouse to support analytics and reporting for corporate business units.
  • Experienced in data warehouse reporting with Cognos 10.1 and the 8.0/8.2 series.
  • Worked on Data Warehouse and Business Intelligence projects with teams using Informatica and Talend (ETL).
  • Experience in installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server and Cleanse Adapter in Windows.
  • Good understanding of entity-relationship modeling and data models.
  • Good experience in design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries and packages.
  • Expertise in creating Mappings, Trust and Validation rules, Match Path, Match Column, Match rules, Merge properties and Batch Group creation.
  • Knowledge of implementing hierarchies, relationship types, packages and profiles for hierarchy management in MDM Hub implementations.
  • Experienced in all phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.
  • Experienced in all phases of SDLC using Waterfall, Agile methodology.
  • Experienced in working with business analysts to understand requirements and translating them into ETL code in Requirement Analysis phase.
  • Involved in creating High Level Design and Detailed Design documents in the Design phase.
  • Strong knowledge of entity-relationship concepts, fact and dimension tables, OLTP data modeling and data warehousing concepts (Star Schema and Snowflake Schema).
  • Experience in integration of various data sources like Oracle, XML, SQL Server and MS Access, and non-relational sources like flat files, into the staging area.
  • Knowledge of Teradata utility scripts like FastLoad and MultiLoad to load data from various source systems to Teradata.
  • Creating BTEQ (Basic Teradata Query) scripts to generate Keys.
  • Experience in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Hands on experience on several key areas of Enterprise Data Warehousing such as Change Data Capture (CDC), Slowly Changing Dimensions (SCD Type I and Type II).
  • Strong experience in performance tuning, debugging and error handling of mappings, sources and targets for better performance, including identifying and fixing bottlenecks.
  • Extensive experience in creating reusable Mapplets and Worklets, and in using Event Wait, Event Raise, Decision, Email and Command tasks in workflows.
  • Experience in debugging mappings using data and error conditions, and in scheduling jobs using Control-M, Maestro and the Informatica Scheduler.
  • Strong knowledge of writing efficient and complex SQL queries in Oracle 11g/10g/9i, and of PL/SQL for stored procedures, triggers, indexes and cursors.
  • Experience writing SQL queries and DDL, DML and DCL commands.
  • Experienced in UNIX work environment, file transfers (FTP, SFTP), job scheduling and error handling.
  • Hands-on experience with UNIX shell scripts for Informatica pre- and post-session operations (a minimal sketch follows this list).
  • Involved in unit testing and system testing to verify that data loaded into targets is accurate.
  • Experience in support and knowledge transfer to the production team.
  • Extensive functional and technical exposure. Experience working on high-visibility projects.
  • Assign work and provide technical oversight to onshore and offshore developers.
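
The bullet on pre- and post-session shell scripts above is the kind of work illustrated by the minimal sketch below; the shell flavor (ksh), directories and file name are assumptions, not actual project values. Either branch can be wired in as a session's pre- or post-session command so that a non-zero exit code stops the load.

#!/bin/ksh
# Illustrative pre/post-session helper; all paths and names are hypothetical.
SRC_DIR=/data/inbound            # assumed landing directory for the source feed
ARCH_DIR=/data/archive           # assumed archive directory
FEED=customer_feed.dat           # assumed flat-file name

case "$1" in
  pre)
    # Pre-session: fail fast if the feed has not arrived or is empty
    if [ ! -s "$SRC_DIR/$FEED" ]; then
      echo "ERROR: source file $FEED missing or empty" >&2
      exit 1
    fi
    ;;
  post)
    # Post-session: archive the processed feed with a timestamp
    mv "$SRC_DIR/$FEED" "$ARCH_DIR/${FEED}.$(date +%Y%m%d%H%M%S)"
    ;;
  *)
    echo "Usage: $0 {pre|post}" >&2
    exit 2
    ;;
esac
exit 0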

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.x/ 8.x/7.x, Informatica B2B DX, DT, DTA, Talend RTX 4.1

EIM Tools: Informatica MDM Multi-Domain, IDD, SIF

Data Warehousing: Informatica Power Center 9.x/8.x/7.x/6.x, Informatica Power Exchange 7.x/8.x, Repository Manager, Designer, Workflow Manager, Workflow Monitor.

Databases: Oracle 11g/10g/9i/8i, SQL Server 2008, Teradata V2R5, MS Access, MS-Excel.

RDBMS Load Tools: Toad, SQL Developer, SQL*Loader, SQL*Plus.

Design/Application Tools: Star Schema modeling, Snowflake modeling, fact and dimension tables, Erwin, Toad, SQL Navigator, ClearCase, Control-M, Maestro, Microsoft Visio.

Programming Skills: SQL, PL/SQL, UNIX shell scripting.

Data Modeling Tools: MS - Visio, Erwin

Operating Systems: UNIX (Solaris, AIX), Windows 98/2000/2007/XP/NT.

PROFESSIONAL EXPERIENCE:

Confidential, Hoboken, NJ

Informatica Developer / MDM Consultant

Responsibilities:

  • Extensively used the Informatica Power Center client tools - Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, XML, XSDs and flat files.
  • Developed complex mappings using the Informatica Power Center tool.
  • Extracted data from Oracle, flat files and Excel files, and used Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router and Update Strategy transformations to load data into the target systems.
  • Created sessions, tasks, workflows and worklets using Workflow Manager.
  • Exposure to configuring the Informatica MDM Hub Server, Cleanse Server and Address Doctor 5 in the Development and QA environments.
  • Configured the landing tables, mappings, staging tables, base objects, relationships, lookups, queries, packages, query groups and batch groups.
  • Scheduled the batch jobs.
  • Defined the Trust and Validation rules.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Took ownership of populating and validating the lookup tables associated with the data model.
  • Ran batch jobs for the customer, contacts, material, vendor and customer-material domains.
  • Defined Match rules in Match and Merge settings of the base table by creating Match Path Components, Match Columns and Rule sets.
  • Created IDD application to support the ongoing business needs to feed the data into the hub.
  • Created roles and users for those roles, and assigned privileges to the roles using the Security Access Manager.
  • Worked with the data modeler in developing star schemas.
  • Involved in performance tuning and query optimization.
  • Used TOAD, SQL Developer to develop and debug procedures and packages.
  • Involved in developing deployment groups for deploying code between environments (Dev, QA).
  • Developed and supported complex data warehouse transformations.
  • Excellent understanding of Star Schema Data Models; Type 1 and Type 2 Dimensions.
  • Created pre-SQL and post-SQL scripts that needed to be run at the Informatica session level.
  • Worked extensively with session parameters, mapping parameters, mapping variables and parameter files for incremental loading (see the sketch after this list).
  • Used Debugger to fix the defects/ errors and data issues.
  • Expertise in using both connected and unconnected Lookup Transformations.
  • Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
  • Developed Slowly Changing Dimension mappings for Type 1 and Type 2 SCDs.
  • Monitored and improved query performance by creating views, indexes, hints and subqueries.
  • Extensively involved in enhancing and managing Unix Shell Scripts.
  • Developed workflow dependencies in Informatica using Event Wait and Command tasks.
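
As a rough sketch of the parameter-file-driven incremental loading mentioned above, the wrapper below regenerates a parameter file with a last-load watermark and starts the workflow with pmcmd. The folder, workflow, session, connection and variable names are invented for illustration, and exact pmcmd options can vary by PowerCenter version.

#!/bin/ksh
# Hypothetical incremental-load wrapper; names, paths and credentials are placeholders.
PARAM_FILE=/infa/params/wf_cust_incr.par
WATERMARK=/infa/params/last_run_date.txt
LAST_RUN=$(cat "$WATERMARK" 2>/dev/null || echo "1900-01-01 00:00:00")

# Regenerate the parameter file read by the session; $$LAST_LOAD_DATE is a mapping parameter
cat > "$PARAM_FILE" <<EOF
[DEV_FOLDER.WF:wf_cust_incr.ST:s_m_cust_incr]
\$\$LAST_LOAD_DATE=$LAST_RUN
\$DBConnection_SRC=SRC_ORACLE
EOF

# Start the workflow and wait for it to finish
pmcmd startworkflow -sv INT_SVC -d DOM_DEV -u "$INFA_USER" -p "$INFA_PWD" \
    -f DEV_FOLDER -paramfile "$PARAM_FILE" -wait wf_cust_incr
RC=$?

# Only move the watermark forward when the run succeeded
[ $RC -eq 0 ] && date '+%Y-%m-%d %H:%M:%S' > "$WATERMARK"
exit $RC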

Environment: Informatica Power Center 9.5, Informatica MDM 10.0 HF2, 10.1, IDD, WebSphere, JBoss, Trillium, TFS, Oracle 11i, SQL Server 2008, MS Access 2010, FLINK, SQL*Loader, UNIX, WinSCP, PuTTY, Erwin 7.2, SQL, PL/SQL.

Confidential, Bluebell, PA

Informatica MDM Consultant

Responsibilities:

  • Working with business analysts to gather business requirements and implementing the same into a functional data warehouse design.
  • Prepared Business Requirement Documents (BRDs) based on the functional requirements provided by system users, describing the scope of work for the technical team to develop the prototype and the overall system.
  • Helped deliver insights made possible by combining data from multiple payers, increasing the number of claims available to detect new patterns and establish correlations with strong statistical validity.
  • Combining data from multiple payers and creating a consistent provider identifier across those claims streams can highlight problems that might not be obvious from just one health plan.
  • Worked with various healthcare payers, including the Centers for Medicare and Medicaid Services (CMS), which are banking on informed, engaged consumers to play an increasingly important role in driving down healthcare costs.
  • Worked on an Agile SDLC model to deploy our product and customer data.
  • Worked on data modeling, baselined the data model to accommodate all business-specific requirements, and configured the Informatica MDM Hub Server, Lookup Adapter, Cleanse Server, Cleanse Adapter and Resource Kit.
  • Implemented data and services integration with the customer registry.
  • Designed and created base objects (BOs), staging tables, mappings and transformations as per business requirements.
  • Created mappings to perform tasks such as cleansing the data and populating it into the staging tables.
  • Used small sample data sets to complete DEV match iterations.
  • Defined multiple conservative match rules for the initial data load (IDL).
  • Used multiple server properties and cleanse properties for matching.
  • Performed External match jobs.
  • Used Metadata manager for validating, promoting, importing and exporting repositories from development environment to testing environment.
  • Involved with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Responsible for creating user groups, privileges and roles to the users using Security Access Manager.
  • Deployed new MDM Hub for portals in conjunction with user interface on IDD application.
  • Worked on integrating Hierarchies created in MDM Hub Console and IDD.
  • Conducted a high-level review of SAM - discussed the use of roles, the creation of users and the assignment of users to roles.
  • Implemented Search feature, Add/Update feature and Task Assignment feature in IDD for Admin user.
  • Implemented Post Landing, Pre-Staging, Save Handler User Exits using Eclipse Mars and integrated those user exits with MDM Hub Console using User Object Registry module.
  • Performed analysis, design, development and maintenance activities associated with the database and data in support of multiple applications (including following standards/procedures to achieve integration of systems through database design).
  • Used the ActiveVOS business process management model to deploy and manage process applications that combine system and human tasks.
  • Used ActiveVOS to create BPMN 2.0-compliant process models that seamlessly integrate people, processes and systems to increase efficiency and visibility.
  • Extensively worked on Informatica client tools - Source Analyzer, Warehouse Designer, Transformation developer, Mapplet Designer, Mapping Designer, Workflow Designer, Worklet Designer and Task Developer.
  • Successfully loaded data from various source systems such as Oracle databases, flat files and XML files into the staging tables and then into the target database.
  • Created different parameter files and changed Session parameters, mapping parameters, and variables at run time.
  • Developed new and maintained existing Informatica mappings and workflows based on specifications.
  • Created Mapplets, reusable transformations, Worklets and used them in different mappings, workflows.
  • Used Source qualifier, Expression, Aggregator, Lookup, Router, Normalizer, Sorter, Stored Procedure transformations to do necessary data calculations and manipulations according to the business rules and loaded data into Target systems.
  • Performed the truncate-and-load process using a Stored Procedure transformation and a load control table to stage the data (illustrated after this list).
  • Worked on Slowly Changing Dimensions (Type 1 and Type 2) and data warehousing Change Data Capture (CDC).
  • Developed PL/SQL scripts, stored procedures, Indexes, Constraints and triggers in Oracle.
  • Worked on production issues like bug fixing, bottlenecks, data validation and report errors.
  • Expertise in creating complex reports in Cognos Report Studio such as List reports, Cross-tab reports, Drill through reports, Master-Detail Reports and Cascading Reports and involved in reports performance tuning.
  • Experience in Metadata Modeling both Relational and Dimensional model.
  • Created projects and models using Cognos Framework Manager and published packages to the Cognos server for report authoring.
  • Performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.
  • Created pass through session partitions to improve performance of reading source data.
  • Extensively used Shell scripts to automate the Pre-Session and Post-Sessions processes.
  • Prepared test Scenarios and Test cases in HP Quality Center and involved in unit testing of mappings, system testing and user acceptance testing.
  • Scheduled Informatica jobs through Control M as per the business requirement.
  • Worked with the reporting team to help them understand the user requirements for the reports and the measures on them.
  • Involved in designing and developing the reporting requirements using Business Objects XI 3.1 and Business Objects Report Designer.
  • Created and maintained several custom reports for the client using Business Objects.
  • Created List Reports and Cross Tab Reports and Drill Through reports.
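
Purely as an illustration of the truncate-and-load / load-control pattern referenced above, the sketch below clears a stage table and records the run in a control table; in the project this logic was driven through a Stored Procedure transformation, and the connection details and table names here are invented.

#!/bin/ksh
# Illustrative truncate-and-load helper; connection details and table names are placeholders.
sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_TNS" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
-- Clear the stage table before the nightly load
TRUNCATE TABLE stg_provider_claims;

-- Record the run in a load-control table so downstream jobs can check the watermark
INSERT INTO etl_load_control (job_name, load_date, status)
VALUES ('wf_provider_claims', SYSDATE, 'STARTED');
COMMIT;
EXIT
EOF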

Environment: Informatica MDM 10.1, Informatica Multi Domain Edition MDM 9.7.1 HF3, IDQ, IDD, SIF, Address Doctor 5, ActiveVOS, Informatica Power Center 9.6.1, Informatica B2B Data Exchange 8.6, Informatica B2B Data Transformation Studio 8.6, Cognos 8.x, Control-M, Oracle 11g, PL/SQL, Visio, TOAD, SQL*Plus, Java, JBoss Application Server, XML, Windows, WinSCP, UNIX, PuTTY, Business Objects XI 3.1.

Confidential, KOP, PA

Informatica MDM Consultant

Responsibilities:

  • Profiled the data using Informatica Analyst tool to analyze source data (Departments, party and address) coming from Legacy systems and performed Data Quality Audit.
  • Configured and installed Informatica MDM Hub server, cleanse, resource kit, and Address Doctor.
  • Analyzed the source systems for erroneous, duplicative, and integrity issues related to the data.
  • Planned, created and executed SSIS packages to integrate data from varied sources like Oracle, DB2, flat files and SQL databases and loaded into landing tables of Informatica MDM Hub.
  • Deployed new MDM Hub for portals in conjunction with user interface on IDD application.
  • Worked on integrating Hierarchies created in MDM Hub Console and IDD.
  • Configured Entity Objects, Entity Types, Hierarchy and Relationship Types for Contract, Product, and Party Hierarchical view in IDD.
  • Configured Supply Chain IDD Application for use by Data Stewards.
  • Configured Queries and Packages for all the domains for use in IDD.
  • Configured Subject Area Groups for Supplier, Manufacture, Product and Contract.
  • Configured Roles for Read Only, Approver and IDD specialist for respective domains.
  • Conducted a high-level review of SAM - discussed the use of roles, the creation of users and the assignment of users to roles.
  • Implemented Search feature, Add/Update feature and Task Assignment feature in IDD for Admin user.
  • Installed B2B Data Exchange, the server plug-in and the client plug-in, and configured them with Power Center.
  • Performed land process to load data into landing tables of MDM Hub using external batch processing for initial data load in hub store.
  • Involved in creating, monitoring, modifying, & communicating the project plan with other team members. Participated in all phases of development life-cycle with extensive involvement in the definition and design meetings, functional and technical walkthroughs.
  • Designed and built ETL mappings and workflows to load data from different sources Oracle, SQL Server and fixed width as well as Delimited Flat files.
  • Installed and configured Talend ETL in single-server and multi-server environments.
  • Worked on the Talend RTX ETL tool; developed and scheduled jobs in Talend Integration Suite. Extensively used ETL concepts to load data from AS400 and flat files to Salesforce.
  • Modified reports and Talend ETL jobs based on the feedback from QA testers and Users in development and staging environments.
  • Created various data marts from data warehouse and generated reports using Cognos.
  • Developed Standard Reports, List Reports, Cross-tab Reports, Charts, Drill through Reports and Master Detail Reports Using Report Studio.
  • Created Query prompts, Calculations, Conditions, Filters, Multilingual Reports Using Report Studio.
  • Good knowledge in Framework Manager, Report Studio, Query Studio, Cognos Connection, Analysis Studio.
  • Created Mapplets, Reusable transformations, Worklets and used them in different mappings and workflows.
  • Used Source qualifier, Expression, Joiner, Lookup, Filter, Router, Sequence Generator, Sorter, Stored Procedure transformations to do necessary data calculations and manipulations according to the business rules and loaded data into Target systems.
  • Implemented Variables and Parameters in the mappings.
  • Worked on loading data from SQL Server to Oracle using SQL*Loader.
  • Developed stored procedures to automate the testing process, easing QA efforts and reducing test timelines for data comparison across 5000 tables.
  • Extensively worked on writing SQL queries and DML, DCL and TCL commands.
  • Wrote UNIX shell scripts to automate SFTP file transfers, archive files and run pre-validation checks, exchanging SSH keys between UNIX servers for authentication (see the sketch after this list).
  • Expertise in documenting Mapping documents, Unit test plans, test results and Deployment plans.
  • Scheduled various daily and monthly ETL loads using Control-M
  • Analyzed change requests and performed impact analysis and estimated level of efforts.
  • Worked on advanced concepts like concurrent running of the workflow, session level partitioning techniques and pushdown optimization.
  • Scheduled workflows using pmcmd and UNIX shell scripts via crontab for daily and monthly loads.
  • Involved in production deployment activities, created the deployment guide for migration of the code to production, and prepared production run books.
  • Coordinated offshore and onsite teams for multiple projects.
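
A minimal sketch of the SFTP automation described above, assuming key-based authentication is already in place from the SSH key exchange; the host, directories and file pattern are placeholders.

#!/bin/ksh
# Hypothetical SFTP pull-and-archive script; key-based auth means no password prompt.
REMOTE=etluser@partner-sftp.example.com
REMOTE_DIR=/outbound
LOCAL_DIR=/data/inbound
ARCH_DIR=/data/archive

# Batch mode: pull the partner's outbound files into the local landing directory
sftp -b - "$REMOTE" <<EOF
cd $REMOTE_DIR
lcd $LOCAL_DIR
mget *.dat
bye
EOF
if [ $? -ne 0 ]; then
  echo "ERROR: SFTP transfer failed" >&2
  exit 1
fi

# Archive whatever arrived, stamped with the run date, before the ETL picks it up
for f in "$LOCAL_DIR"/*.dat; do
  [ -f "$f" ] && cp "$f" "$ARCH_DIR/$(basename "$f").$(date +%Y%m%d)"
done
exit 0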

Environment: Informatica Multi-Domain MDM 9.7.1, Address Doctor, IDD, SIF, Informatica Power Center 9.6.1, Control-M, Informatica B2B Data Exchange 8.6, Informatica B2B Data Transformation Studio 8.6, Cognos 8.x, Talend 4.1, SQL Developer, Java, Oracle 10g, Toad 10.6, WS02, SQL Server, Windows XP, UNIX, PuTTY & WinSCP.

Confidential, Bellevue, WA

ETL Informatica Developer

Responsibilities:

  • Involved in Requirement gathering and Business Analysis.
  • Analyzed the functional specs provided by the data architect and created technical specs documents for all the mappings.
  • Worked in Data Warehouse and Business Intelligence Projects along with the team of Informatica, Talend (ETL), Cognos 10, Impromptu and Powerplay.
  • Develop logical and physical data models that capture current state/future state data elements and data flows using Erwin.
  • Developed Teradata utility scripts like FastLoad and MultiLoad to load data from various source systems to Teradata.
  • Created BTEQ (Basic Teradata Query) scripts to generate keys (see the sketch after this list).
  • Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.
  • Developed models in Framework Manager and deployed packages to the Cognos Connection.
  • Customized data by adding filters at both the Framework Level and Report Level.
  • Published different packages from Framework Manager to Cognos Connection.
  • Designed and customized data models for the Data Mart, supporting data from multiple sources in real time.
  • Developed mappings using XML, Oracle, fixed and delimited Source files and loading into a flat file.
  • Extensively used Erwin tool in Forward and Reverse engineering, following the Corporate Standards in Naming Conventions, using Conformed dimensions whenever possible.
  • Designed and developed Informatica Mappings to load data from Source systems to ODS and then to Data Mart.
  • Extensively used Power Center to design multiple mappings with embedded business logic.
  • Creation of Transformations like Lookup, Joiner, Rank and Source Qualifier, Update strategy Transformations in Informatica Designer.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator and Router transformations for populating target table in efficient manner.
  • Created Mapplet and used them in different Mappings.
  • Extracted Incremental data (CDC) from source systems to load data to targets.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.
  • Designed and developed Oracle PL/SQL scripts for Data Import/Export.
  • Used bulk load to manage the performance by dropping indexes, truncating stage tables, rebuilding the indexes, and analyzing the tables during the process.
  • Created various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process. Maintained the batch processes using Unix Shell Scripts.
  • Worked on scheduling Informatica workflows.
  • Managed change control implementation and coordinated daily and monthly releases and reruns.
  • Provided production support for business users and documented problems and solutions for running the workflows.
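
A rough sketch of the kind of BTEQ script used for key generation, as referenced above; the logon string, databases, tables and the ROW_NUMBER-plus-current-max approach are illustrative assumptions rather than the actual project code.

#!/bin/ksh
# Illustrative BTEQ key-generation script; tdpid, credentials and table names are placeholders.
bteq <<EOF
.LOGON tdprod/etl_user,etl_password;

/* Assign surrogate keys to new customers by offsetting ROW_NUMBER with the current max key */
INSERT INTO edw.dim_customer (customer_key, customer_id, customer_name, load_dt)
SELECT  m.max_key + ROW_NUMBER() OVER (ORDER BY s.customer_id),
        s.customer_id,
        s.customer_name,
        CURRENT_DATE
FROM    stg.customer s
CROSS JOIN (SELECT COALESCE(MAX(customer_key), 0) AS max_key FROM edw.dim_customer) m
LEFT JOIN edw.dim_customer d
       ON d.customer_id = s.customer_id
WHERE   d.customer_id IS NULL;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF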

Environment: Informatica 9.5.1, Oracle 10g, Teradata, WS02, Cognos 8.2, SQL Developer, SQL*Plus, SQL Assistant, UNIX & WinSCP.

Confidential, Detroit, MI

ETL Informatica Developer

Responsibilities:

  • Involved in the requirements definition and analysis in support of Data Warehousing efforts.
  • Involved in Logical and physical database design using Erwin.
  • Worked extensively on Informatica tools - Repository Manager, Designer and Workflow Manager.
  • Involved in Extraction, Transformation and Loading (ETL) Process.
  • Created the Source and Target Definitions using Informatica Power Center Designer.
  • Developed and tested all the backend programs, Informatica mappings and update processes.
  • Created and Monitored workflows and Sessions using Informatica Power Center Server.
  • Tuned the mappings and Oracle queries to increase their efficiency and performance.
  • Used Informatica Workflow Manager to create workflows.
  • Workflow Monitor was used to monitor and run workflows.
  • Created UNIX scripts and used in Informatica with command tasks.
  • Set up Batches and Sessions to Schedule Loads at regular intervals in Informatica.

Environment: Informatica Power Center 9.5.1, Windows 2000, Oracle 9i, PL/SQL, SQL Developer, UNIX shell scripting, Erwin.

Confidential, Rio Rancho, NM

ETL Informatica Developer

Responsibilities:

  • Involved in Data transfer from OLTP systems forming the extracted sources.
  • Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
  • Analyzed the sources, transformed the data, mapped the data and loaded the data into targets using Power Center Designer.
  • Designed and developed Oracle PL/SQL Procedures.
  • Worked on Informatica Utilities - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Worked with various transformations like Source Qualifier, Expression, Filter, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence generator, Joiner transformations.
  • Involved in creating Shell Scripts to automate Pre-Session and Post-Session Processes.
  • Developed Sessions and Batches to schedule the loads at required frequency using Power Center Server Manager 5.1.
  • Involved in data quality testing.

Environment: Informatica Power Center 9.5.1, Oracle 8i, SQL, PL/SQL, TOAD and Windows NT.

Confidential

ETL Informatica Developer

Responsibilities:

  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
  • Review settled insurance claims to determine that payments and settlements have been made in accordance with company practices and procedures.
  • Report overpayments, underpayments, and other irregularities.
  • Confer with legal counsel on claims requiring litigation.
  • Create new mapping designs using various tools in Informatica Designer like Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.
  • Developed mappings using the needed transformations in the Informatica tool according to technical specifications.
  • Created complex mappings that involved implementation of business logic to load data into the staging area.
  • Used Informatica reusability at various levels of development.
  • Developed mappings/sessions using Informatica Power Center 8.6 for data loading.
  • Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
  • Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.
  • Building Reports according to user Requirement.
  • Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
  • Implemented the slowly changing dimension methodology for accessing the full history of accounts (a SQL analogy follows this list).
  • Wrote shell scripts for running workflows in the UNIX environment.
  • Performed performance tuning at the source, target, mapping and session levels.
  • Participated in weekly status meetings, and conducted internal and external reviews as well as formal walkthroughs among various teams, documenting the proceedings.
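
The slowly changing dimension logic itself was built as an Informatica mapping (Lookup plus Update Strategy); the sketch below is only a SQL analogy of Type 2 behavior, expiring changed rows and inserting new current versions, with invented table, column and sequence names.

#!/bin/ksh
# Illustrative SQL analogy of a Type 2 SCD load; the real logic lived in the Informatica mapping.
sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_TNS" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE

-- Step 1: expire the current rows whose tracked attributes changed in the staged feed
UPDATE dim_account d
   SET d.effective_end_dt = TRUNC(SYSDATE) - 1,
       d.current_flag     = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_account s
                WHERE s.account_id = d.account_id
                  AND (s.account_status <> d.account_status
                       OR s.branch_cd   <> d.branch_cd));

-- Step 2: insert a new current version for changed accounts and brand-new accounts
INSERT INTO dim_account
  (account_key, account_id, account_status, branch_cd,
   effective_start_dt, effective_end_dt, current_flag)
SELECT dim_account_seq.NEXTVAL, s.account_id, s.account_status, s.branch_cd,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_account s
  LEFT JOIN dim_account d
    ON d.account_id = s.account_id
   AND d.current_flag = 'Y'
 WHERE d.account_id IS NULL;

COMMIT;
EXIT
EOF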

Environment: Informatica 8.6.1, Oracle 11g, SQL Server 2005, HP-UX.

Confidential

ETL Developer

Responsibilities:

  • Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Migrated mappings, sessions and workflows from Development to Test and then to the UAT environment.
  • Developed Informatica Mappings and Reusable Transformations to facilitate timely Loading of Data of a star schema.
  • Developed Informatica mappings using Aggregator transformations, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Router transformations.
  • Created Sessions and extracted data from various sources, transformed data according to the requirement and loading into data warehouse.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
  • Imported various heterogeneous files using Informatica Power Center 8.x Source Analyzer.
  • Developed several reusable transformations and mapplets that were used in other mappings.
  • Prepared Technical Design documents and Test cases.
  • Involved in unit testing and resolution of various bottlenecks encountered.
  • Implemented various Performance Tuning techniques.
  • Used Teradata as a source system

Environment: Informatica 8.6.1 Power Center, Teradata, Oracle 11g, Windows NT.

Confidential

ETL Analyst

Responsibilities:

  • Extensively used ETL to load data from flat files, XML and Oracle into Oracle 8i.
  • Involved in designing the data model for the data warehouse.
  • Involved in requirement gathering and business analysis.
  • Developed data mappings between source systems and warehouse components using Mapping Designer.
  • Worked extensively on different types of transformations like source qualifier, expression, filter, aggregator, rank, update strategy, lookup, stored procedure, sequence generator, joiner, XML.
  • Setup folders, groups, users, and permissions and performed Repository administration using Repository Manager.
  • Involved in the performance tuning of the Informatica mappings and stored procedures, and the SQL queries inside the Source Qualifier.
  • Created, launched & scheduled sessions.
  • Involved in the Performance Tuning of Database and Informatica. Improved performance by identifying and rectifying the performance bottle necks.
  • Used Server Manager to schedule sessions and batches.
  • Involved in creating Business Objects Universe and appropriate reports
  • Wrote PL/SQL Packages and Stored procedures to implement business rules and validations.

Environment: Informatica 8.6.1, ORACLE 10g, UNIX, Windows NT 4.0, UNIX Shell Programming, PL/SQL, TOAD Quest Software
