
Informatica Developer Resume Profile


PROFESSIONAL SUMMARY:

  • Over 8 years of IT experience in the analysis, design, development, testing, and implementation of business application systems for the pharmaceutical, financial, insurance, and technology sectors.
  • 2 years of experience with Hadoop big data in analysis, development, testing, and implementation.
  • Experience with Datameer, Hive, MapReduce, and Spotfire.
  • Experience with structured and unstructured data types.
  • Prepared high-level design specifications for ETL coding, Talend ETL, and mapping standards.
  • Experience in performance tuning of targets, sources, mappings, and sessions, and in Informatica ILM (Information Lifecycle Management).
  • Experience integrating various data sources such as Oracle, SQL Server, MS Access, Teradata, and flat, XML, and EBCDIC files; experience with ERwin.
  • Solid experience in the Ralph Kimball methodology, logical modeling, physical modeling, dimensional data modeling, star schemas, snowflake schemas, fact tables, dimension tables, and DataStage.
  • Experience in OLTP/OLAP system study and in developing database schemas such as star and snowflake schemas, using dimensional data modeling for relational and dimensional models and slowly changing dimensions (SCDs).
  • Performed ETL procedures to load data from different sources into data marts and the data warehouse using Power Center, ESP jobs, and BI tools.
  • Worked on slowly changing dimensions (SCDs) and their implementation to keep track of historical data.
  • Performed system analysis and QA testing, and was involved in production support and ER diagramming.
  • Worked with PeopleSoft development tools such as Application Designer, PeopleTools, PeopleCode, and web services.
  • Actively involved in performance tuning, error handling, and product support on various platforms.
  • ETL experience developing mappings and tuning existing mappings for better performance using Informatica Power Center, per the business rules.
  • Worked with Teradata 12 utilities such as BTEQ and FastLoad; conversant with Informatica and Queryman.
  • Involved in generating MultiLoad and TPump scripts to load data into Teradata tables (see the BTEQ-style sketch after this list).
  • Development experience across business areas such as finance, insurance, and healthcare.
  • Outstanding communication and interpersonal skills; a motivated, quick learner with strong analytical and problem-solving skills and the ability to adapt quickly to new technologies and tools.
  • Strong grasp of relational database design concepts.
  • Comprehensive technical, oral, and written communication skills.
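
A minimal BTEQ-style sketch of the kind of Teradata load script mentioned above; the logon values, file path, and sales_stg table are placeholder assumptions, and MultiLoad and TPump scripts follow a similar shape with their own utility commands:

    .LOGON tdpid/etl_user,password
    .IMPORT VARTEXT ',' FILE = /data/stage/sales.csv
    .SET QUIET ON
    /* VARTEXT import delivers every field as VARCHAR */
    .REPEAT *
    USING (sale_id VARCHAR(10), amount VARCHAR(12))
    INSERT INTO sales_stg (sale_id, amount)
    VALUES (:sale_id, :amount);
    .LOGOFF
    .QUIT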

TECHNICAL SKILLS

  • ETL Tools: Informatica Power Center 9.5/9.1/9.x/8.5/8.1/8.x/7.1.4 (64-bit)/6.2/5.1, Power Mart 5.1/5.0/4.7, Power Connect for DB2, DataStage, PeopleTools 8.48/8.5 (Application Designer), IDE/IDQ, ERwin 7.2
  • RDBMS: Oracle 11g/10g/9i/8i/7.x, SQL Server 2000, DB2 UDB, MS Access, Teradata
  • Data Modeling: ERwin 7.1/4.x/3.5.2/3.x, dimensional data modeling, star join schema modeling, snowflake modeling, fact and dimension tables, physical and logical data modeling, Oracle Designer
  • Operating Systems: Windows Advanced Server 2000/2003, Windows XP/2000/NT/9x, Linux, UNIX, HP-UX 10.x, Sun Solaris
  • Database Tools: Greenplum, Oracle, Datameer 2.1.5, Hive 0.9, MapReduce, Spotfire

PROFESSIONAL EXPERIENCE

Confidential

Sr. BI Data Integrator

Description:

A global leader in information solutions, they leverage one of the largest sources of consumer and commercial data, along with advanced analytics and proprietary technology, to create customized insights that enrich both the performance of businesses and the lives of consumers. They provide credit reports in many countries. The project involved migrating data to a new platform.

Responsibilities:

  • Hands-on experience using Hadoop ecosystem components such as MapReduce, HDFS, and Hive.
  • Imported and exported data to and from HDFS via Datameer.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Involved in loading data from the UNIX file system into HDFS.
  • Excellent understanding of Hadoop architecture and components such as HDFS, the JobTracker, and TaskTrackers.
  • Developed MapReduce programs to move data from storage systems and flat files into HDFS.
  • Worked with the Datameer tool to deliver a business-user-focused BI platform for big data analytics.
  • Developed scripts to load processed data from HDFS into the Greenplum database.
  • Created DDL and DML for the Greenplum database.
  • Worked with Spotfire to create business reports.
  • Developed Hive queries to analyze user activity logs and generate reports (see the HiveQL sketch after this list).
  • Implemented test scripts to support test driven development and continuous integration.
  • Created Hive tables, including external tables on top of parsed data, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs.
  • Analyzed business requirement documents written in JIRA.
  • Excellent listening, oral, and written communication/presentation skills, with a strong ability to influence people at all levels.
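
A minimal HiveQL sketch of the reporting pattern described above: an external table over parsed logs in HDFS, then an aggregate query. The user_activity table, its columns, and the HDFS path are illustrative assumptions:

    -- External table over parsed activity logs already sitting in HDFS.
    CREATE EXTERNAL TABLE IF NOT EXISTS user_activity (
      user_id    STRING,
      event_type STRING,
      event_ts   STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/parsed/user_activity';

    -- Daily activity report; Hive runs this internally as MapReduce jobs.
    SELECT to_date(event_ts) AS activity_date,
           event_type,
           COUNT(*)          AS events
    FROM   user_activity
    GROUP  BY to_date(event_ts), event_type;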

Environment: Datameer 2.1.5, Hive 0.9, Spotfire, SQL, PL/SQL, Windows XP, SQL Server, Greenplum, Unix.

Confidential

Sr. BI Data Integrator

Description:

Confidential engages in the research and development, manufacturing, and marketing of pharmaceutical products for sale principally in the prescription market, though the firm also develops over-the-counter medication. The project integrates data from different source systems and builds a strategic global data warehouse for decision support and business metrics, such as territory-, region-, division-, and area-level sales for the respective markets.

Responsibilities:

  • Involved in Business Analysis and Requirement Collection.
  • Worked with Datameer to create import and export jobs.
  • Created HDFS files using Datameer tools.
  • Developed a number of complex Informatica mappings, mapplets, and reusable transformations for different types of reports and research studies, used for monthly data loads into the data warehouse and the Oracle database.
  • Created Hive tables and loaded data into external tables.
  • Extensively worked on data extraction, data analysis, and data validation.
  • Created DDL and DML for the Greenplum database (see the sketch after this list).
  • Used Workflow Manager for workflow and session management, database connection management, and scheduling of jobs run in the batch process.
  • Worked to find missing data from the Siebel source.
  • Worked with Spotfire to generate reports.
  • Improved decision making by providing access to a complete view of business partner data using B2B Data Exchange.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
  • Optimized mappings by changing logic to reduce run time.
  • Involved in various testing activities, including database, unit, system, and performance testing; also responsible for maintaining testing metrics, production support, defect tracking, and ER diagramming tools.
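
A minimal sketch of Greenplum DDL and DML of the kind described above; the sales_fact and sales_stage tables and their columns are illustrative assumptions. Greenplum tables declare a distribution key so rows are spread evenly across segments:

    -- DDL: fact table distributed across segments by its key.
    CREATE TABLE sales_fact (
        sale_id   BIGINT,
        sale_date DATE,
        region    VARCHAR(32),
        amount    NUMERIC(12,2)
    ) DISTRIBUTED BY (sale_id);

    -- DML: load curated rows from a staging table.
    INSERT INTO sales_fact (sale_id, sale_date, region, amount)
    SELECT sale_id, sale_date, region, amount
    FROM   sales_stage
    WHERE  amount IS NOT NULL;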

Environment: Datameer 2.1.5, Hive, Greenplum, Informatica Power Center 9.5/9.1, Oracle 10g, SQL, PL/SQL, Windows XP

Confidential

Informatica Developer

Description:

Confidential planning products include Demand Manager (DM), Demand Fulfillment (DF), Master Planning (MP), Material Allocator (MA), and Performance Manager (PM). The purpose of this project is to integrate ETL data from these ABPP (Agile Business Process Platform) products and load it into the PMDW (Performance Manager Data Warehouse).

Responsibilities:

  • Analyzed business documents and created software engineering requirement specification.
  • Helped business analysts in the design, development, and implementation of the enterprise data warehouse, data analysis, DataStage, and data marts.
  • Working knowledge of Informatica CDC (change data capture), ILM, and i2.
  • Interacted with business users and analysts on requirements; developed conceptual and logical data models using ERwin, B2B, BI tools, and Teradata.
  • Extracted data from various source systems, including Oracle, SQL Server, DB2, Talend, and flat files.
  • Created the mapping specification document and walked the team through it for quality assurance.
  • Developed several mappings using transformations such as Source Qualifier, Expression, Aggregator, Joiner, Lookup, Sequence Generator, and Update Strategy in Informatica to populate data into the target systems.
  • Developed user-defined functions, reusable transformations, and mapplets, and used them in several mappings to streamline development.
  • Implemented incremental aggregation to capture only the new records from the source, which improves performance.
  • Improved mapping performance by overriding the default SQL queries.
  • Knowledge of Talend Open Studio for data integration and supply chain.
  • Developed mappings using Type 2 slowly changing dimensions to keep track of historical data (see the SQL sketch after this list).
  • Created sessions and batches in the workflow manager tool and monitored the status using the workflow monitor tool.
  • Extensively used various performance tuning techniques to improve the session performance.
  • Implemented Pushdown Optimization to reduce the burden on the Integration service and thereby increase the performance.
  • Running parallel sessions by using concurrent batches reduced the time for loading the data.
  • Partitioning the session improved the session performance by creating multiple connections to the source and target systems.
  • Tracked the defects and wrote Test Cases.
  • Used the Debugger to test data flow and fix mappings, and worked with B2B Data Exchange.
  • Extensively used PL/SQL stored procedures to build business rules, and wrote shell scripts to automate routine activities.
  • Performed code reviews with peers and created unit test plan document.
  • Created test cases for Unit test, System Integration test and UAT to check the data quality.
  • Successfully upgraded Informatica from 8.0 to 8.6 and was responsible for validating the objects in the new version.
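
A minimal SQL sketch of the Type 2 SCD pattern referenced above: expire the current row, then insert the new version. The customer_dim and customer_stage tables and the tracked address column are illustrative assumptions, written in generic Oracle-compatible SQL:

    -- Step 1: expire the current row when a tracked attribute changed.
    UPDATE customer_dim d
    SET    end_date  = CURRENT_DATE,
           curr_flag = 'N'
    WHERE  d.curr_flag = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   customer_stage s
                   WHERE  s.customer_id = d.customer_id
                     AND  s.address <> d.address);

    -- Step 2: insert the new version (changed rows were just expired,
    -- so any customer without a current row gets a fresh one).
    INSERT INTO customer_dim (customer_id, address, eff_date, end_date, curr_flag)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   customer_stage s
    LEFT JOIN customer_dim d
           ON d.customer_id = s.customer_id
          AND d.curr_flag   = 'Y'
    WHERE  d.customer_id IS NULL;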

Environment: Informatica Power Center 9.x/8.6/8.0, Power Exchange, ERwin 7.2, Oracle 11g, flat files, SQL Server 2008, DB2, UNIX scripting, SQL, PL/SQL, Autosys, Business Objects XI.

Confidential

Informatica Developer

Description:

  • A new tool and associated methods are required to allow for more efficient and rigorous maintenance of this information on the over 24,000 policies currently active at MVP. This will be implemented using new code and changes to existing software.
  • The current provision for single updates remains operational, with the addition of the Mass Update Tool (MUT). The MUT addresses the following requirements of the business users:
  • In addition to current functionality, enable the ability to apply a change to an existing phrase to some or all Benefits and related Benefit Packages using that phrase at the same time.
  • Enable the ability to associate a phrase with some or all Benefit Packages, including within a particular Product type, at the same time.

Responsibilities:

  • As an Informatica Developer, analyzed source systems and existing manual processes, designed the extract, transform, and load processes, and automated the existing manual processes.
  • Worked closely with clients to understand new requirements and implemented them efficiently.
  • Worked in a Scrum environment with ESP jobs, web services, B2B, and SSIS.
  • Developed mappings to bring in data from various sources across multiple stages: Staging and ODS to Reporting.
  • Used various transformations such as Source Qualifier, Expression, Lookup, HTTP, Sequence Generator, Aggregator, Update Strategy, and Joiner while migrating data from heterogeneous sources such as Oracle, SQL Server, and flat files to Oracle.
  • Used parameterization for better reuse of mappings and sessions, and worked with DataStage.
  • Implemented slowly changing dimensions (Types 1, 2, and 3) based on the requirements, and worked with Crystal Reports (see the Type 3 sketch after this list).
  • Involved in performance tuning by optimizing the sources, targets, mappings and sessions and eliminating bottlenecks.
  • Developed UNIX shell scripts to automate repetitive database processes.
  • Created and Monitored sessions and workflows for daily extract jobs using Informatica Power Center, Workflow Manager and Workflow Monitor.
  • Deployed objects across various environments from various developer folders in DEV/QA/PRD.
  • Worked closely with Tidal admin to schedule the jobs.
  • Documented the changes and development related to project.
  • Assisted in production Support.
  • Used Teradata as a source system.
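
A minimal SQL sketch of the Type 3 variant from the SCD bullet above, which keeps limited history by shifting the current value into a "previous" column before overwriting it; customer_dim, customer_stage, and the address columns are illustrative assumptions:

    -- Type 3: preserve one prior value, then apply the new one.
    UPDATE customer_dim d
    SET    prev_address = curr_address,
           curr_address = (SELECT s.address
                           FROM   customer_stage s
                           WHERE  s.customer_id = d.customer_id)
    WHERE  EXISTS (SELECT 1
                   FROM   customer_stage s
                   WHERE  s.customer_id = d.customer_id
                     AND  s.address <> d.curr_address);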

Environment: Informatica Power Center 9.1/9.x, Oracle 11g/10g, SQL Server 2008/2005, Flat files, PL/SQL, UNIX, Tidal, Teradata.

Confidential

Informatica Developer

Description:

Confidential provides commercial banking products and services, insurance agency services, and wealth management services for personal investing and for institutional or non-profit organizations. The project loads data from multiple applications from different banks into the bank's retention warehouse.

Responsibilities:

  • Requirements Gathering and Business Analysis.
  • Design of Physical and Conceptual models for the database with ILM and B2B.
  • Analyzed the data models of legacy implementations, identifying the sources for various dimensions and facts for the different data marts according to star schema design patterns (see the sketch after this list).
  • Analyzed and documented the level of effort for all stages of all ETL projects.
  • Studied the existing environment, validated the requirements, and gathered source data by interacting with clients on various aspects.
  • Designed the Functional Requirements and Mapping Technical Specifications.
  • Worked with Customer Master Team to get the customer data.
  • Developed and documented Informatica Mappings/Transformations, and Informatica sessions per the business requirement.
  • Designed and encouraged use of mapplets in Informatica to promote reusability, eliminate coding redundancy and ease maintenance of version control.
  • Debugging and Troubleshooting Informatica Mappings.
  • Created sessions, workflows, and worklets to run with the logic embedded in the mappings, using Power Center Designer.
  • Involved in all phases of software development: code development, unit testing using the Debugger, data analysis, integration testing, code promotion, and preparation of checklists at each phase.
  • Analyzed any issues in flat files and communicated with the concerned person to resolve them.
  • Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Batches and Sessions.
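
A minimal DDL sketch of the star schema pattern referenced above: one dimension with a surrogate key and a fact table that references it. The account_dim and txn_fact tables and their columns are illustrative assumptions, in Oracle syntax to match the environment:

    -- Dimension: descriptive attributes keyed by a surrogate key.
    CREATE TABLE account_dim (
        account_key  NUMBER PRIMARY KEY,
        account_no   VARCHAR2(20),
        branch       VARCHAR2(40),
        product_type VARCHAR2(40)
    );

    -- Fact: additive measures plus foreign keys to the dimensions.
    CREATE TABLE txn_fact (
        account_key NUMBER REFERENCES account_dim (account_key),
        date_key    NUMBER,
        txn_amount  NUMBER(12,2),
        txn_count   NUMBER
    );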

Environment: Informatica 7.1.2/8.x (Power Connect, Power Mart, Power Center, Designer, Workflow Manager, Administrator, and Repository Manager), Oracle 9i, PL/SQL, TOAD, Business Objects 6.5, UNIX.

Confidential

Informatica Developer

Description:

Confidential sources data from multiple OLTP systems and flat files. Data was extracted and transformed into the data warehouse using Informatica Power Center, and various reports were generated on a daily, weekly, monthly, and yearly basis. These reports detail the various insurance products sold; they are used to identify agents for rewards, awards, and performance, and as risk analysis reports for business development managers.

Responsibilities:

  • Worked closely with the business analyst and Data warehouse architect to understand the source data and come up with the design for the Data Warehouse and Data Mart.
  • Worked with the Informatica Power Center tools: Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, and Transformation Developer.
  • Worked with various active transformations in Informatica Power Center like Filter Transformation, Aggregator Transformation, Joiner Transformation, Rank Transformation, Router Transformation, Sorter Transformation, Source Qualifier Transformation, and Update Strategy Transformation.
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache and Persistent Cache.
  • Used Update Strategy (DD_INSERT, DD_UPDATE) to insert and update data when implementing the slowly changing dimension logic (see the MERGE sketch after this list).
  • Responsible for performance tuning at the mapping, session, source, and target levels, including for large data files.
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Developed procedures, functions, and UNIX shell scripts, and worked with PeopleSoft tools.
  • Created debugging sessions to validate transformations, and extensively ran existing mappings in debug mode, creating breakpoints and watching the Debug Monitor to identify errors.
  • Used Windows scripting and scheduled pmcmd to interact with the Informatica server from command mode.
  • Created reusable transformations and used in various mappings.
  • Developed Informatica parameter files to filter the daily source data.
  • Analyzed session log files whenever a session failed, to resolve errors in the mapping or session configuration.
  • Extensively used Informatica Scheduler to Schedule Informatica Workflows.
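
The DD_INSERT/DD_UPDATE flow above is roughly what a SQL MERGE upsert does when changes simply overwrite in place; a minimal Oracle-flavored sketch with hypothetical agent_dim and agent_stage tables:

    -- Matched rows are updated (DD_UPDATE); the rest are inserted (DD_INSERT).
    MERGE INTO agent_dim d
    USING agent_stage s
       ON (d.agent_id = s.agent_id)
    WHEN MATCHED THEN
        UPDATE SET d.agent_name = s.agent_name,
                   d.region     = s.region
    WHEN NOT MATCHED THEN
        INSERT (agent_id, agent_name, region)
        VALUES (s.agent_id, s.agent_name, s.region);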

Environment: Informatica Power Center 7.1, UNIX shell scripts, PeopleSoft tools, Business Objects 6, Oracle 9i.

Confidential

Informatica Developer

Description: Confidential warehouse project, primarily using Informatica Power Center 7.1 to develop and enhance programs that migrate data sourced from third-party customer complaint software to reporting data marts.

Responsibilities:

  • Used ETL to load data from flat files, XML, and Oracle into Oracle 8i.
  • Involved in designing the data model for the data warehouse.
  • Involved in requirements gathering and business analysis.
  • Developed data mappings between source systems and warehouse components using the Mapping Designer.
  • Worked on different types of transformations like source qualifier, expression, filter, aggregator, rank, update strategy, lookup, stored procedure, sequence generator, joiner, XML.
  • Setup folders, groups, users, and permissions and performed Repository administration using Repository Manager.
  • Involved in performance tuning of the Informatica mappings, the stored procedures, and the SQL queries inside the Source Qualifier.
  • Created, launched, and scheduled sessions.
  • Involved in performance tuning of the database and Informatica; improved performance by identifying and rectifying bottlenecks.
  • Involved in creating the Business Objects universe and appropriate reports.
  • Wrote PL/SQL packages and stored procedures to implement business rules and validations (see the sketch after this list).
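
A minimal PL/SQL sketch of the kind of validation procedure described in the last bullet; the orders table, the rule, and the error code are illustrative assumptions:

    -- Hypothetical business rule: reject non-positive order amounts.
    CREATE OR REPLACE PROCEDURE validate_order (
        p_order_id IN orders.order_id%TYPE
    ) AS
        v_amount orders.amount%TYPE;
    BEGIN
        SELECT amount
        INTO   v_amount
        FROM   orders
        WHERE  order_id = p_order_id;

        IF v_amount <= 0 THEN
            RAISE_APPLICATION_ERROR(-20001,
                'Order ' || p_order_id || ' fails the amount validation rule');
        END IF;
    END validate_order;
    /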

Environment: Informatica 7.1.3, Oracle 10g, Windows NT 4.0, PL/SQL, TOAD (Quest Software), UNIX, UNIX shell programming.

Confidential

Informatica Developer

Description:

The National Insurance data warehouse deals with different kinds of claims, categorized as Facility, Professional, and FEP. Data comes from various sources such as Oracle, SQL Server, and mainframe, and is loaded into the EDW at different frequencies per the requirements. The entire ETL process consists of source systems, a staging area, an ODS layer, the data warehouse, and data marts.

Responsibilities:

  • Used Informatica Power Center to extract, transform, and load data from heterogeneous source systems into the target database.
  • Created mappings using the Designer, extracting data from various sources and transforming it according to the requirements.
  • Involved in extracting data from flat files and relational databases into the staging area.
  • Migrated mappings, sessions, and workflows from Development to Test and then to the UAT environment.
  • Developed Informatica mappings and reusable transformations to facilitate timely loading of the data of a star schema.
  • Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers to manage data flow into multiple targets (see the override sketch after this list).
  • Created sessions, extracted data from various sources, transformed it according to the requirements, and loaded it into the data warehouse.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
  • Imported various heterogeneous files using Informatica Power Center 8.x Source Analyzer.
  • Developed several reusable transformations and mapplets that were used in other mappings.
  • Prepared Technical Design documents and Test cases.
  • Involved in unit testing and resolution of various bottlenecks encountered.
  • Implemented various Performance Tuning techniques.
  • Used Teradata as a source system.
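
A minimal sketch of the kind of Lookup SQL override mentioned above, restricting the lookup to the latest row per key so the cache holds a single current record for each member; member_dim and its columns are illustrative assumptions:

    -- Keep only the most recent row per member in the lookup cache.
    SELECT m.member_id,
           m.plan_code,
           m.eff_date
    FROM   member_dim m
    WHERE  m.eff_date = (SELECT MAX(i.eff_date)
                         FROM   member_dim i
                         WHERE  i.member_id = m.member_id)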

Environment: Oracle 10g, PL/SQL, Informatica 7.1.3.
