Informatica Developer Resume

Raritan, NJ

SUMMARY

  • 8+ years of IT experience using ETL methodologies for data extraction, data migration, data transformation, and loading with Informatica PowerCenter 9.1, 8.6, and 8.1 and Microsoft technologies.
  • Tuning of SQL queries, procedures, functions, and packages using EXPLAIN PLAN and TKPROF (a brief illustrative sketch follows this list).
  • Extensive experience in SQL, PL/SQL, and SQL*Plus, and with third-party tools such as TOAD and SQL Developer.
  • Proficient in Oracle Tools and Utilities such as SQL*Loader, Import/Export and SQL*Navigator.
  • Strong experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI, and client/server applications.
  • Good understanding of Ralph Kimball dimensional modeling using the star schema methodology and Bill Inmon's snowflake schema methodology.
  • Strong experience in Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, Metadata Manager).
  • Applied ETL across Business Intelligence, OLAP, OLTP, MOLAP, ROLAP, data mart, and data mining environments. Knowledge of Metadata Manager for metadata analysis, with a specialty in flat files, Oracle, XML, and SQL Server sources.
  • Experience across the complete software development life cycle (SDLC): design, development, implementation, and maintenance of an Enterprise Data Warehouse (EDW).
  • Worked extensively with slowly changing dimensions. Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, Joiner, and Rank transformations.
  • Responsible for design and implementation needed for loading and updating the warehouse.
  • Worked with Session logs and Workflow logs for Error handling and troubleshooting in Dev environment.
  • Efficiently worked on data modeling, data mappings and data integration.
  • Involved in Performance Tuning of existing mappings as well as new ones.
  • Good exposure to Informatica MDM, where data cleansing, de-duping, and address correction were performed.
  • In-depth technical knowledge of Microsoft Business Intelligence tools such as SQL Server 2005/08 Integration Services and SQL Server 2005/08 Reporting Services.
  • Worked on SSRS (SQL server Reporting Services) and SSIS (SQL Server Integration Services).
  • Proficient with data transformations, tasks, containers, sources, and destinations, including Derived Column, Conditional Split, Union All, Merge, Lookup, and Merge Join transformations, to load data into the Data Warehouse.
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business users, and developers across multiple disciplines.
  • Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
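
A brief sketch of the EXPLAIN PLAN-based tuning approach referenced above, assuming an Oracle environment and a hypothetical ORDERS table (the actual queries and indexes varied by project):

    -- Generate the optimizer's plan for a candidate query
    EXPLAIN PLAN FOR
      SELECT o.order_id, o.order_total
      FROM   orders o
      WHERE  o.order_date >= TO_DATE('2015-01-01', 'YYYY-MM-DD');

    -- Review the plan for full scans, costly joins, and missing indexes
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);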

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.1.0/8.6.x, Informatica MDM, Data Profiling, Data Cleansing, SSIS with SSRS

BI Tools: Business Objects R2/R3

Scheduling Tools: Control-M, Cronacle, and Autosys

Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling

Database: Oracle 11g/10g/9i, Teradata v2r6, SQL Server 2008 R2

Programming Languages: Oracle SQL, PL/SQL, UNIX Shell Scripting

Tools/Utilities: SQL*Plus, TOAD, PL/SQL Developer, SQL Navigator, SQL*Loader

Operating Systems: UNIX, Windows NT

PROFESSIONAL EXPERIENCE

Confidential, Raritan, NJ

Informatica Developer

Responsibilities:

  • Developed PL/SQL procedures, functions, and packages and used SQL*Loader to load data into the database.
  • Tuned SQL queries using explain plan generated in TOAD and SQL Navigator.
  • Wrote UNIX shell scripts for automation processes.
  • Involved in the QA, Prod Migration of Informatica Workflows.
  • Assessed the information needs of users and developed functional specifications and test cases.
  • Involved in the analysis and development of the Data Warehouse.
  • Designed various mappings using different types of transformations like Source Qualifier, Expression, Filter, Aggregator, Rank, Update strategy, Lookups (Connected, Unconnected), Stored Procedure, Sequence Generator, Joiner, and Router.
  • Analyzed the current process of sending files from the plan partners to the state.
  • Created PL/SQL packages, procedures, functions, and triggers.
  • Worked on converting flat files from the plan partner format to the state format.
  • Created PL/SQL code to convert NCPDP 2.2/4.2 files to the new DHCS format.
  • Performed tuning of the ongoing monthly automated processes using Explain Plan and the Trace utility.
  • Documented all schema packages, tables, and procedures.
  • Loaded data from the plan partners to the state using BULK COLLECT (see the sketch after this list).
  • Provided production support for all ongoing automated processes that run weekly and monthly.
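
A minimal sketch of the BULK COLLECT loading pattern mentioned above, assuming hypothetical staging and target tables (stg_plan_claims and state_claims) with the same column layout; the production code also handled the NCPDP-to-DHCS field mapping:

    DECLARE
      CURSOR c_claims IS SELECT * FROM stg_plan_claims;
      TYPE t_claims IS TABLE OF c_claims%ROWTYPE;
      l_claims t_claims;
    BEGIN
      OPEN c_claims;
      LOOP
        -- Fetch in batches to keep PGA memory usage bounded
        FETCH c_claims BULK COLLECT INTO l_claims LIMIT 1000;
        EXIT WHEN l_claims.COUNT = 0;
        -- Bulk-bind the batch into the state-format table
        FORALL i IN 1 .. l_claims.COUNT
          INSERT INTO state_claims VALUES l_claims(i);
        COMMIT;
      END LOOP;
      CLOSE c_claims;
    END;
    /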

Environment: Oracle 10g/11g, SQL*Loader, PL/SQL, TOAD 10.x, Informatica PowerCenter 9.6.1 & 9.1, BO XI R2/R3, Informatica MDM, Cronacle, UNIX

Confidential, Jersey city, NJ

Informatica Developer

Responsibilities:

  • Developed a complete data warehouse from scratch, including dimensional modeling (star schema and snowflake) and fact and dimension tables.
  • Involved in requirement gathering meetings, design and development of Informatica mappings.
  • Extensively worked in the performance tuning of ETL mappings.
  • Developed PL/SQL packages, including cursors and nested cursors in procedures, to populate the tables.
  • Involved in PL/SQL code review and modification for the development of new requirements.
  • Coded SQL stored procedures and integrated them with the application.
  • Involved in unit testing of the developed application using test cases.
  • Coded Korn shell scripts for executing PL/SQL scripts as batch jobs.
  • Developed new back-end interfaces using Oracle PL/SQL (procedures, functions, and packages) and performed performance tuning.
  • Implemented Type 2 slowly changing dimensions using the date method (see the sketch after this list).
  • Involved in creating Target databases for the Data marts using Power Center Designer.
  • Developed source-to-target mapping documents and technical documentation.
  • Involved in the QA, Prod Migration of Informatica Workflows.
  • Analyzed the source data coming from various sources such as Oracle and flat files.
  • Worked extensively with Source Analyzer, Target Designer, Transformation Developer, Mapping Designer, and Mapplet Designer.
  • Cleansed the data using MDM techniques.
  • Enabled incremental loading in fact table mappings and made required changes to the mappings to populate the production data.
  • Well versed in unit and integration testing.
  • Used Autosys to schedule various Informatica jobs and workflows.
  • Designed the ETL processes using Informatica to load data from Oracle and flat files to the staging database, and from staging to the target Teradata warehouse database using utilities such as FastLoad.
  • Excellent knowledge of ETL tools such as Informatica and SAP BODS, including configuring connections to load and extract data to and from Teradata efficiently.
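
A minimal sketch of the Type 2 (date method) logic referenced above, expressed in SQL against a hypothetical DIM_CUSTOMER dimension; in the project this logic was built with Lookup, Expression, and Update Strategy transformations rather than hand-written SQL:

    -- Expire the current version of a changed customer (natural key = customer_id)
    UPDATE dim_customer
    SET    eff_end_date = TRUNC(SYSDATE) - 1,
           current_flag = 'N'
    WHERE  customer_id  = :changed_id
    AND    current_flag = 'Y';

    -- Insert the new version with a fresh surrogate key and an open-ended effective date
    INSERT INTO dim_customer
      (customer_key, customer_id, customer_name, eff_start_date, eff_end_date, current_flag)
    VALUES
      (dim_customer_seq.NEXTVAL, :changed_id, :new_name,
       TRUNC(SYSDATE), TO_DATE('9999-12-31', 'YYYY-MM-DD'), 'Y');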

Environment: Informatica PowerCenter 9.1.0, UNIX shell scripting, Oracle 11g, Teradata V2R6, Oracle SQL, PL/SQL, TOAD, SQL*Loader, UNIX

Confidential, Columbus, OH

Informatica Developer

Responsibilities:

  • Worked with power center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Used Informatica data services to profile and document the structure and quality of all data.
  • Extensively used Informatica transformations such as Source Qualifier, Rank, SQL, Router, Filter, Lookup, Joiner, Aggregator, Normalizer, and Sorter, along with all transformation properties.
  • Extensively used various data cleansing and data conversion functions in transformations.
  • Translated business processes into Informatica mappings for building data marts using Informatica Designer, populating data into the target star schema on an Oracle 9i instance.
  • Followed the required client security policies and obtained the required approvals to move code from one environment to another.
  • Developed ETL mappings and transformations using Informatica PowerCenter 8.6.
  • Deployed the Informatica code and worked on code merges between two different development teams.
  • Extensively used PL/SQL programming in back-end and front-end functions, procedures, and packages to implement business rules.
  • Created complex Informatica mappings with PL/SQL procedures/functions to implement business rules for loading data (a brief sketch follows this list).
  • Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.
  • Created automated scripts to perform data cleansing and data loading.
  • Performed complex defect fixes in environments such as UAT and SIT to ensure proper delivery of the developed jobs into the production environment.
  • Attended daily status call with internal team and weekly calls with client and updated the status report.
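
A minimal sketch of the kind of PL/SQL business-rule function referenced above; the function name, parameters, and rule are hypothetical, and in the mappings such functions were typically invoked through Stored Procedure transformations:

    -- Hypothetical rule: derive an account status code applied during loads
    CREATE OR REPLACE FUNCTION fn_derive_status (
        p_balance    IN NUMBER,
        p_close_date IN DATE
    ) RETURN VARCHAR2
    IS
    BEGIN
      IF p_close_date IS NOT NULL THEN
        RETURN 'CLOSED';
      ELSIF NVL(p_balance, 0) <= 0 THEN
        RETURN 'DORMANT';
      ELSE
        RETURN 'ACTIVE';
      END IF;
    END fn_derive_status;
    /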

Environment: Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor), SQL Server, Oracle 10g, TOAD, UNIX Shell Scripting, Flat Files, SQL Developer, Windows XP Professional, UNIX

Confidential, Phoenix,AZ

Jr Informatica Developer

Responsibilities:

  • Involved in creating Detail design documentation to describe program development, logic, coding, testing, changes and corrections.
  • Created shortcuts for reusable source/target definitions, Reusable Transformations, mapplets in Shared folder.
  • Involved in requirement definition and analysis in support of Data Warehouse.
  • Worked extensively on different types of transformations such as Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Lookup, Sorter, Normalizer, Sequence Generator, etc.
  • Worked with XSD and XML files generation through ETL process.
  • Defined and worked with mapping parameters and variables (see the sketch after this list).
  • Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica ETL tool.
  • Performed the performance evaluation of the ETL for full load cycle.
  • Checked session and error logs to troubleshoot problems, and used the Debugger for complex mappings.
  • Parameterized all variables and connections at all levels in UNIX.
  • Created test cases for unit testing and functional testing.
  • Coordinated with testing team to make testing team understand Business and transformation rules being used throughout ETL process.
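
A minimal sketch of how a mapping parameter can drive an incremental extract through a Source Qualifier SQL override; the table, columns, and parameter name ($$LAST_EXTRACT_DATE) are hypothetical and would be supplied from a parameter file at run time:

    -- Source Qualifier override: pull only rows changed since the last successful run
    SELECT t.transaction_id,
           t.account_id,
           t.amount,
           t.updated_date
    FROM   src_transactions t
    WHERE  t.updated_date > TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS')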

Confidential

MSBI Developer

Responsibilities:

  • Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
  • Created SSIS packages with Lookups that validate master data prior to updating transaction tables, routing all invalid data to an error file.
  • Deployed and monitored SSIS packages for Dynamic ETL flow to validate, extract, transform and load data into database.
  • Developed parameterized, drill-through, and drill-down SSRS reports showing data-quality validation percentages for the sentinel feed.
  • Hands-on experience in deploying, configuring, subscribing to, and managing reports using Report Manager and Report Builder.
  • Performed DB admin tasks such as database backups, shrinking the database (transaction logs), and updating statistics after the large daily loads.
  • Created database objects such as tables, views, stored procedures, triggers, and functions using T-SQL to structure stored data and maintain the database efficiently.
  • Utilized joins and subqueries to simplify complex queries involving multiple tables.
  • Built an overnight SSIS package that picks up reviewed data, validates it, flags incomplete records, and sends notifications to the corresponding distribution group.
  • Financial calculations such as payments, leave, bonuses, and perks for active resources were handled by weekend jobs.
  • Implemented exception handling in all packages to cover exceptional scenarios; errors from resource background checks are captured and logged to flat files for investigation (a brief sketch follows this list).
  • Provided documentation about database functional specification and technical design documents for reports and SSIS Packages.
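
A minimal sketch of the exception-handling and error-logging pattern referenced above, in T-SQL; the procedure and table names (dbo.usp_LoadBackgroundChecks, dbo.EtlErrorLog, and the staging/target tables) are hypothetical:

    CREATE PROCEDURE dbo.usp_LoadBackgroundChecks
    AS
    BEGIN
        BEGIN TRY
            -- Main load step (simplified)
            INSERT INTO dbo.ResourceBackgroundCheck (ResourceId, CheckDate, Status)
            SELECT ResourceId, CheckDate, Status
            FROM   dbo.stg_BackgroundCheck;
        END TRY
        BEGIN CATCH
            -- Capture the error so it can be routed to the investigation flat file
            INSERT INTO dbo.EtlErrorLog (ProcName, ErrorNumber, ErrorMessage, LoggedAt)
            VALUES (ERROR_PROCEDURE(), ERROR_NUMBER(), ERROR_MESSAGE(), GETDATE());

            -- Re-raise so the calling SSIS task reports failure
            DECLARE @msg NVARCHAR(4000);
            SET @msg = ERROR_MESSAGE();
            RAISERROR (@msg, 16, 1);
        END CATCH
    END;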

Environment: Flat Files, MS Excel Files, MS Access, SSIS and SSRS 2008, Oracle 9i/10g, Erwin 7.3, MS SQL Server 2005/2000, PL/SQL, IBM DB2 8.0, Teradata, Mainframes, TOAD, Perl, UNIX scripting, Windows NT, Autosys, Microsoft Project Plan.
