
Informatica Developer Resume

CA

SUMMARY:

  • 7+ years of IT experience in the analysis, design, development, and testing of data warehousing applications using Informatica PowerCenter.
  • Expertise in the Extraction, Transformation, and Loading (ETL) process and in dimensional data modeling: Star and Snowflake schemas, fact and dimension tables, multidimensional modeling, and de-normalization techniques.
  • Experience in repository configuration, creating transformations and mappings using Informatica Designer, and processing tasks using Workflow Manager.
  • Strong experience in designing and developing Business Intelligence solutions in Data Warehouse/Decision Support Systems using ETL tools, Informatica PowerCenter 8.x/9.x, OLTP, and OLAP.
  • Experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling.
  • Solid experience in Informatica PowerCenter: mappings, mapplets, transformations, Workflow Manager, Workflow Monitor, Repository Manager, Slowly Changing Dimensions, OLTP, and OLAP.
  • Strong back-end experience writing PL/SQL stored procedures, functions, packages, and triggers.
  • Hands-on experience with MDM development.
  • Involved in performance tuning of the data warehouses, including the creation of materialized views, bitmap indexes, and partitions.
  • Experience in data integration of various data sources from Databases like MS Access, Oracle, SQL Server and formats like flat-files, CSV files and XML files.
  • Experience in creating various transformations using Aggregator, Look Up, Update Strategy, Joiner, Filter, Normalizer, Sorter, Router, XML, Stored procedure in Informatica Power Center.
  • Experienced in parameterization (active and static parameters) of worklets, sessions, and mappings, using Audit Control Execution for process automation in Informatica PowerCenter.
  • Experienced with PowerShell scripting for process automation; designed an Excel-to-CSV conversion that runs without Excel installed.
  • Good knowledge in Normalizing and De-normalizing the tables and maintaining Referential Integrity by using Triggers and Primary and Foreign Keys.
  • Extensive database experience using Microsoft SQL Server, Oracle 11g, SQL, PL/SQL, Teradata.
  • Outstanding communication and interpersonal skills, ability to learn quickly, good analytical reasoning, and quick adaptability to new technologies and tools.
  • Experience in migrating PowerCenter to IICS (Informatica Intelligent Cloud Services).
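The Slowly Changing Dimension work cited above can be illustrated with a minimal sketch of the Type 2 expire-and-insert logic such mappings implement. This is a toy illustration only; in PowerCenter the logic lives in Lookup and Update Strategy transformations, and the column names here (natural_key, attrs, eff_date, end_date, is_current) are hypothetical, not from any actual project schema.

```python
# Illustrative SCD Type 2 "expire and insert" logic. In PowerCenter this is
# implemented with Lookup + Update Strategy transformations; the column names
# (natural_key, attrs, eff_date, end_date, is_current) are hypothetical.

HIGH_DATE = "9999-12-31"

def scd2_apply(dimension, incoming, load_date):
    """Apply one incoming source row to a list of dimension rows (dicts)."""
    out, matched = [], False
    for row in dimension:
        if row["is_current"] and row["natural_key"] == incoming["natural_key"]:
            matched = True
            if row["attrs"] != incoming["attrs"]:
                # Change detected: expire the current version...
                out.append(dict(row, end_date=load_date, is_current=False))
                # ...and insert a new current version.
                out.append({"natural_key": incoming["natural_key"],
                            "attrs": incoming["attrs"],
                            "eff_date": load_date,
                            "end_date": HIGH_DATE,
                            "is_current": True})
            else:
                out.append(row)          # unchanged: pass through
        else:
            out.append(row)              # historical rows / other keys
    if not matched:                      # brand-new key: plain insert
        out.append({"natural_key": incoming["natural_key"],
                    "attrs": incoming["attrs"],
                    "eff_date": load_date,
                    "end_date": HIGH_DATE,
                    "is_current": True})
    return out
```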

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.6/8.x, Informatica PowerExchange 9.6/8.x, Informatica Data Quality 9.x

Languages: C, C++, SQL, PL/SQL, HTML, XML.

Methodology: Agile, RUP, Scrum, Waterfall

Databases: Oracle 11g/10g, SQL Server 2012/2008/2005, DB2 UDB, Teradata 14/13, Sybase

OS: Windows 2003/2007/2010, UNIX, Linux

IDEs: PL/SQL Developer, Teradata SQL Assistant

Modeling Tools: Erwin 9.1/7.2, MS Visio

Reporting: Tableau 9.2, Cognos 9/8

Other Tools: Notepad++, Teradata SQL Assistant, MS Office, SQL Developer, XML Files, Oracle ERP, Python, and SharePoint.

PROFESSIONAL EXPERIENCE:

Informatica Developer

Confidential, CA

Responsibilities:

  • Developed complex mappings by efficiently using various transformations, mapplets, mapping parameters/variables, and mapplet parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence Generator, SQL, and Web Service transformations.
  • Reviewed the ETL process design with the Business Analyst and Data Warehousing Architect.
  • Assisted in building the ETL source-to-target specification documents.
  • Communicated effectively with business users and stakeholders.
  • Wrote SQL overrides for the SQL queries generated by Informatica.
  • Involved in unit testing the validity of data from different data sources.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions; worked with partitioned tables and automated the partition drop/create process in the Oracle database.
  • Performed data validation in the target tables using complex SQL queries to make sure all modules were integrated correctly.
  • Worked extensively on PowerCenter and PowerExchange for SAP.
  • Performed data conversion/migration using Informatica PowerCenter.
  • Involved in performance tuning to improve the data migration process.
  • Analyzed session log files to resolve mapping errors, identified bottlenecks, and tuned them for optimal performance.
  • Created Informatica transformations/mapplets/mappings/tasks/worklets/workflows using Power Center to load the data from source to stage, stage to persistent (ODS), stage to reject and ODS to EDW.
  • Scheduled and ran extraction and load processes and monitored sessions using Informatica Server Manager.
  • Created UNIX shell scripts for Informatica pre/post-session operations.
  • Documented and presented the production/support documents for the components developed when handing over the application to the production support team.
  • Created Data Model for the DataMarts.
  • Used materialized views to create snapshots of the history of main tables and for reporting purposes.
  • Worked with the Informatica tech support group on unresolved problems.
  • Monitored day-to-day loads, addressed and resolved production issues promptly, and provided support for the ETL jobs running in production to meet the SLAs.
  • Used various transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy while designing and optimizing mappings.
  • Prepared SQL Queries to validate the data in both source and target databases.
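The data-validation bullets above describe checks normally run server-side with SQL against source and target tables. As a small Python sketch of the same source-to-target reconciliation idea (field names are illustrative, not from any actual schema):

```python
# Toy source-to-target reconciliation: the same checks the complex SQL
# performs server-side (key coverage and row counts). The key column name
# is illustrative.

def validate_load(source_rows, target_rows, key):
    """Compare source and target result sets on a key column."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
        "row_count_match": len(source_rows) == len(target_rows),
    }
```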

Environment: Informatica Power Center 9.6.1, Power Exchange, IDQ, Hadoop, SQL Server, TOAD, JIRA, UNIX shell scripting, MS OFFICE, SAP

Senior Informatica Powercenter & IDQ Developer

Confidential, OR

Responsibilities:

  • Extensively used Informatica transformations - Source qualifier, expression, joiner, filter, router, update strategy, union, sorter, aggregator and normalizer transformations to extract, transform and load the data from different sources into Teradata, Oracle, DB2 and SQL Server targets.
  • Built UNIX/Linux shell scripts for running Informatica workflows, data cleansing, purge/delete operations, data loading, and ELT processes.
  • Cleansed source data, extracted and transformed data with business rules, and built reusable mappings using Informatica PowerCenter Designer.
  • Created complex joins and transformations of all types as needed to pass data smoothly through ETL mappings.
  • Created mappings and mapplets in Informatica PowerCenter, defining the transformation logic according to the business rules for the loads.
  • Extensively worked on performance tuning of Informatica and IDQ mappings.
  • Worked extensively with Teradata utilities - Fastload, Multiload, Tpump and Teradata Parallel Transporter (TPT) to load huge amounts of data from flat files into Teradata database.
  • Extensively used PowerExchange for Salesforce to read data from relational sources (Oracle) and load into Salesforce objects.
  • Worked with dimensional data warehouses in Star schema; created Slowly Changing Dimension (SCD) Type 1/2 mappings using the Ralph Kimball methodology.
  • Wrote PowerShell scripts automating Excel-to-CSV conversion without Excel installed on the server.
  • Loaded data from various sources (Flat files, Oracle, Sql Server) using different Transformations like Source Qualifier, Joiner, Router, Sorter, Aggregator, Connected and Unconnected Lookup, Expression, Sequence Generator, Java, XML Source Qualifier, Union and Update Strategy to load the data into the target.
  • Created Informatica transformations/mapplets/mappings/tasks/worklets/workflows using Power Center to load the data from source to stage, stage to persistent (ODS), stage to reject and ODS to EDW.
  • Schedule and run Extraction and Load process and monitor sessions by using Informatica Server Manager.
  • Responsible for the data management and data cleansing activities using Informatica data quality (IDQ)
  • Used Jenkins to automate packaging and deployment of various ETL, UNIX components.
  • Extensively used Enterprise Manager Tool in Control-M to load the charts and run the jobs for initial load of the tables whenever a new environment is created.
  • Utilized Informatica IDQ to complete the initial data profiling and matching/removing duplicate data for the process of data migration from the legacy systems to the target Oracle Database.
  • Performed Unit testing and Integration testing on the mappings. Worked with QA Team to resolve QA issues.
  • Worked on accessing data and business services residing in the cloud and/or on premises without having to replicate data.
  • Worked on API creation and consumption, service orchestration, process automation and integration, message-based integration, business-to-business integration, data synchronization and replication, managed file transfer, bulk and batch data integration, transformation of data sets, and handling of structured and unstructured data in IICS.
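The IDQ profiling and duplicate-removal work above can be illustrated with a toy match rule. A real IDQ match/merge uses configurable fuzzy-matching algorithms, so the normalization below is a hypothetical stand-in, not the actual rule used:

```python
import re

# Toy match rule standing in for an IDQ match/merge: build a normalized
# match key, keep the first record per key, and report the rest as duplicates.

def normalize(name):
    """Crude match key: lowercase, strip punctuation, collapse whitespace."""
    s = re.sub(r"[^a-z0-9 ]", "", name.lower())
    return re.sub(r"\s+", " ", s).strip()

def dedupe(records, field="name"):
    seen, kept, dupes = set(), [], []
    for rec in records:
        key = normalize(rec[field])
        if key in seen:
            dupes.append(rec)        # flagged for review/removal
        else:
            seen.add(key)
            kept.append(rec)         # surviving "golden" record
    return kept, dupes
```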

Environment: Informatica PowerCenter, IDQ, IDE, Oracle, Teradata, DB2, Erwin, MQ Series, Load, Ingest, T-SQL, PL/SQL, RMS, Linux, SQL Server, AIX, Tidal Scheduler, Shell Scripting, DCC (Document Control Center), Putty, WinSCP, JIRA, Cognos.

Senior Informatica Developer

Confidential, PA

Responsibilities:

  • Studied the existing environment and gathered requirements by querying clients on various aspects.
  • Developed various Mappings with the collection of all Sources, Targets, and Transformations using Designer.
  • Developed complex mappings using transformations such as the Source qualifier, Aggregator, Expression, Lookups, Filter, Router, Sequence Generator, Update Strategy, and Joiner.
  • Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Implemented Informatica loading techniques such as incremental loading (Change Data Capture) and Slowly Changing Dimensions (SCD Types I and II).
  • Integrated the data from DB2 UDB 9.6/9.7, Oracle 11g, and SQL Server for populating large scale Data Mart and/or Data Warehouse.
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
  • Created E-mail notifications tasks using post-session scripts.
  • Worked on different tasks in workflows such as sessions, event-raise, event-wait, e-mail, command, and worklets, and on scheduling of the workflow.
  • Used Informatica power center for (ETL) extraction, transformation and loading data from heterogeneous source systems.
  • Designed ETL to load Snapshot fact tables used for reporting portfolios.
  • Built Informatica Velocity mapping specifications, unit test cases, and standard-procedures documents.
  • Worked with Informatica Deployment Groups in migrating the ETL code across environments.
  • Worked with the applications owners and system analysts to resolve data issues and refine transformations rules.
  • Defining the schema, staging tables, and landing zone tables, configuring base objects, foreign-key relationships, complex joins, and building efficient views.
  • Interacting with the Source Team and Business to get the Validation of the data.
  • Involved in performance tuning of the Informatica mapping and identifying the bottlenecks.
  • Created complex mappings using Unconnected Lookup, Aggregator, and Router transformations to populate target tables efficiently.
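The incremental-loading (Change Data Capture) technique mentioned above typically relies on a persisted high-watermark, e.g. a PowerCenter mapping variable used in the Source Qualifier filter. A minimal Python sketch of that pattern, with illustrative field names:

```python
# High-watermark CDC: select only rows whose timestamp is newer than the
# watermark persisted from the previous run, then advance the watermark.
# ISO-8601 date strings compare correctly as strings; field names are
# illustrative.

def incremental_filter(rows, last_watermark, ts_field="updated_at"):
    changed = [r for r in rows if r[ts_field] > last_watermark]
    new_watermark = max((r[ts_field] for r in changed), default=last_watermark)
    return changed, new_watermark
```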

Environment: Informatica Power Center 9.5.1/9.6.1, Power Exchange, Hadoop, SQL Server, TOAD, JIRA, UNIX shell scripting, MS OFFICE.

ETL Developer

Confidential, NJ

Responsibilities:

  • Developed complex mappings by efficiently using various transformations, mapplets, mapping parameters/variables, and mapplet parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence Generator, SQL, and Web Service transformations.
  • Reviewed the ETL process design with the Business Analyst and Data Warehousing Architect.
  • Assisted in building the ETL source-to-target specification documents.
  • Communicated effectively with business users and stakeholders.
  • Wrote SQL overrides for the SQL queries generated by Informatica.
  • Involved in unit testing the validity of data from different data sources.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions; worked with partitioned tables and automated the partition drop/create process in the Oracle database.
  • Performed data validation in the target tables using complex SQL queries to make sure all modules were integrated correctly.
  • Performed data conversion/migration using Informatica PowerCenter.
  • Involved in performance tuning to improve the data migration process.
  • Analyzed session log files to resolve mapping errors, identified bottlenecks, and tuned them for optimal performance.
  • Created UNIX shell scripts for Informatica pre/post-session operations.
  • Automated the jobs using CA7 Scheduler.
  • Documented and presented the production/support documents for the components developed when handing over the application to the production support team.
  • Created Data Model for the DataMarts.
  • Used materialized views to create snapshots of the history of main tables and for reporting purposes.
  • Coordinated with users on migrating code from Informatica 8.6 to Informatica 9.5.
  • Worked with the Informatica tech support group on unresolved problems.
  • Provided on-call support during weekends.
  • Monitored day-to-day loads, addressed and resolved production issues promptly, and provided support for the ETL jobs running in production to meet the SLAs.
  • Used various transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy while designing and optimizing mappings.
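The partition drop/create automation referenced in the PL/SQL bullet above amounts to computing, on each run, which monthly partitions to add and which fall outside the retention window. A hypothetical sketch of that planning step (the actual job would then issue the corresponding ALTER TABLE statements):

```python
# Plan monthly partition maintenance: keep the last `retention` months
# ending at `current` ('YYYYMM'); anything older is dropped, anything
# missing is created. The real job would emit the matching
# ALTER TABLE ... ADD/DROP PARTITION statements.

def plan_partitions(existing, current, retention):
    year, month = int(current[:4]), int(current[4:])
    keep = []
    y, m = year, month
    for _ in range(retention):
        keep.append(f"{y:04d}{m:02d}")
        m -= 1
        if m == 0:
            y, m = y - 1, 12
    to_create = sorted(set(keep) - set(existing))
    to_drop = sorted(set(existing) - set(keep))
    return to_create, to_drop
```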

Environment: Informatica 9.5/8.6, Oracle 11g, SQL server 2008 R2, SQL, T-SQL, PL/SQL, Toad 10.6, SQL Loader, OBIEE, Unix, Flat files, Teradata

ETL/ Informatica Developer

Confidential

Responsibilities:

  • Analyzed the source system and involved in designing the ETL data load.
  • Developed/designed Informatica mappings by translating the business requirements.
  • Involved in performance tuning of the Informatica mappings using various components such as parameter files and round-robin and key-range partitioning to ensure source and target bottlenecks were removed.
  • Implemented documentation standards and practices to make mappings easier to maintain.
  • Wrote, executed, and performance-tuned SQL queries for data analysis and profiling; extracted business rules and implemented business logic to extract and load SQL Server data using T-SQL.
  • Worked with Teradata utilities like FastLoad and MultiLoad.
  • Involved in automating the retail prepaid system process; created packages and dependencies for the processes.
  • Identified common issues in Cognos and published them on the NJSI Wiki page; established dashboards and business reports.
  • Supported, maintained, enhanced, and developed new interfaces for the Claim warehouse application and documented them on the Wiki page.
  • Used Autosys for scheduling various data cleansing scripts and loading processes; maintained the batch processes using UNIX Scripts.
  • Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
  • Tuned the mappings by removing the Source/Target bottlenecks and Expressions to improve the throughput of the data loads.
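Key-range partitioning, cited in the performance-tuning bullet above, splits rows across parallel partitions by comparing a key against boundary values. A simplified sketch of the assignment rule (boundaries and field names are illustrative; PowerCenter performs this inside the session, not in user code):

```python
# Key-range assignment: a row goes to partition i when
# boundaries[i-1] <= key < boundaries[i]; keys at or past the last
# boundary land in the final partition.

def key_range_partition(rows, key, boundaries):
    parts = [[] for _ in range(len(boundaries) + 1)]
    for r in rows:
        idx = 0
        while idx < len(boundaries) and r[key] >= boundaries[idx]:
            idx += 1
        parts[idx].append(r)
    return parts
```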

Environment: Informatica PowerCenter, IDE, Oracle, PL/SQL, MS SQL Server, Cognos, Autosys, and Quality Center

Informatica Developer

Confidential

Responsibilities:

  • Developed complex mappings using transformations such as the Source qualifier, Aggregator, Expression, Lookups, Filter, Router, Sequence Generator, Update Strategy, and Joiner.
  • Used Informatica Power Center Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
  • Involved in creation of Folders, Users, Deployment Group using Repository Manager.
  • Implemented and populated slowly changing dimensions to maintain current and historical information in dimension tables.
  • Worked with the applications owners and system analysts to resolve data issues and refine transformations rules.
  • Built Informatica Velocity mapping specifications, unit test cases, and standard-procedures documents.
  • Worked on Informatica PowerCenter: Source Analyzer, Warehouse Designer, Mapping Designer, Server Manager, mapplets, and reusable transformations.
  • Performed tuning and optimization on SQL queries using Analyze, Explain Plan and optimizer hints.
  • Created workflows to run sessions sequentially one after the other, and concurrent worklets to start all sessions in the workflow at once.
  • Involved in code migration from Informatica 7.1.3 to Informatica 8.5 and maintained version control to track different versions as part of Informatica change management.
  • Extensively used the Source Qualifier transformation to filter data at the source level rather than at the transformation level.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Documented technical design documents and error logics.
  • Created deployment groups, migrated the code into different environments.
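Filtering at the source level via a Source Qualifier SQL override, as described above, pushes the WHERE clause into the database instead of a downstream Filter transformation. A hypothetical sketch of assembling such an override (table and column names are placeholders):

```python
# Build a Source Qualifier-style SQL override so filtering happens in the
# database rather than in a downstream Filter transformation. Table and
# column names are placeholders.

def sq_override(table, columns, filter_cond=None):
    sql = f"SELECT {', '.join(columns)} FROM {table}"
    if filter_cond:
        sql += f" WHERE {filter_cond}"
    return sql
```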

Environment: Informatica Power Center 8.5/7.1.3, Informatica Power Exchange, Oracle 9, Flat files, VSAM, MQ Series, SQL*Plus, TOAD, PL/SQL, Erwin, UNIX, Business Objects.
