Senior Informatica Developer Resume Profile

Professional Summary:

  • Over 8 years' experience with Informatica 9.1x/8.6x/7.x/6.1: Source Analyzer, Mapping Designer, Mapplet Designer, Transformation Developer, Warehouse Designer, Repository Manager, and Workflow Manager/Server Manager.
  • Implemented the full lifecycle of data warehouses and business data marts with Star Schema, Snowflake Schema, and SCD dimensional modeling.
  • Extensively worked with diverse data sources: non-relational sources such as flat files and XML files, and relational sources such as Oracle, Teradata, Netezza, Greenplum, SQL Server, DB2, and SAP R/3.
  • Extensively worked with files in HIPAA format and have good knowledge of serializers and parsers.
  • Well adept in planning, building, and managing successful large-scale Data Warehouse and decision support systems.
  • Comfortable in both technical and functional applications of RDBMS, Data Mapping, Data management, Data transportation and Data Staging.
  • Experience in the data mart development life cycle; performed ETL procedures to load data from different sources into data marts and the data warehouse using Informatica Power Center.
  • Extensively worked in the extraction, transformation, and loading of data from multiple sources into the data warehouse.
  • Expertise in creating mappings, mapplets, and reusable transformations using Informatica Designer.
  • Vast experience designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
  • Good knowledge in tuning the performance of SQL queries and ETL process.
  • Worked on UNIX shell scripting for pre-session and post-session tasks, and automated the scripts using a scheduling tool.
  • Experienced in repository configuration and in creating Informatica mappings, mapplets, sessions, worklets, workflows, and processing tasks using Informatica Designer/Workflow Manager to move data from multiple source systems into targets.
  • Experienced in Installation, Configuration, and Administration of Informatica Power Center 9.x/8.x.
  • Experienced in Performance tuning of Informatica sources, mappings, targets and sessions and tuning the SQL queries.
  • Hands on experience in creating Indexes and partitioning tables for performance.
  • Significant experience in dimensional and relational data modeling and Business Intelligence: Star Schema/Snowflake modeling, fact and dimension tables, cubes, dashboards, and physical/logical data modeling using Erwin.
  • Significant Experience in PL/SQL, Procedures/Functions, Triggers and Packages.
  • Expertise in Teradata tools and utilities such as FastLoad, MultiLoad, FastExport, TPump, Teradata Parallel Transporter (TPT), and BTEQ.
  • Expertise in doing Unit Testing, Integration Testing, System Testing and Data Validation for Developed Informatica Mappings.
  • Familiar with Data modeling and worked with data modeling tool Erwin.
  • Excellent analytical and communication skills, a strong team player, with the ability to communicate effectively at all levels of the development process.
  • Expertise in generating reports for end client using various Query tools like Cognos, Business Objects and OBIEE.
  • Worked closely with Business Object's reporting team in order to meet users/business needs.
  • Expertise in scheduling Informatica jobs using the Informatica scheduler, the Windows scheduler, and Unix.
  • Expertise in creating Unix shell scripts.
  • Very good hands-on knowledge of data management and the implementation of Big Data/Hadoop applications using Informatica Big Data Edition.
  • Experienced in the post-development cycle and in supporting applications in production.
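The SCD Type II modeling mentioned above (history captured via effective end dates) can be sketched as a small Python routine; the record layout and field names here are illustrative placeholders, not a specific warehouse schema.

```python
from datetime import date

def scd_type2_upsert(dimension, incoming, today=None):
    """Apply a Slowly Changing Dimension Type II update: expire the
    current version of a changed business key and append a new version.

    `dimension` is a list of dicts with keys: key, attrs, eff_start,
    eff_end (None while current). `incoming` maps business key -> attrs.
    All names are hypothetical, for illustration only.
    """
    today = today or date.today()
    for key, new_attrs in incoming.items():
        current = [r for r in dimension
                   if r["key"] == key and r["eff_end"] is None]
        if current and current[0]["attrs"] == new_attrs:
            continue  # no change: keep the current version
        if current:
            current[0]["eff_end"] = today  # expire the old version
        dimension.append({"key": key, "attrs": new_attrs,
                          "eff_start": today, "eff_end": None})
    return dimension
```

In an Informatica mapping this logic would typically be an Update Strategy transformation fed by a Lookup on the dimension; the sketch only shows the insert/expire decision itself.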

Work Experience

Sr. Informatica Developer

Confidential

Responsibilities:

  • Analyzed the source and target data and designed the Source-to-Target Data Mapping document.
  • Developed complex mappings using Informatica Power Center Designer to transform and load data from SQL Server 2012 to the target system, Salesforce, and vice versa.
  • Created mappings and loaded data between SQL Server 2012 and Salesforce using Informatica Cloud Services.
  • Worked extensively with complex mappings using different transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected / Connected Lookups and Java transformation.
  • Worked with the support team to define methods for, and implement solutions for, performance measurement and monitoring of all data movement technologies.
  • Designed and developed data profiling in Informatica Designer, using IDQ to profile the source data against business requirements.
  • Created data profiles in Profile Manager using the Auto Profile and Custom Profile options to validate documented business rules.
  • Implemented performance tuning logic on Targets, sources, mappings and sessions to provide maximum efficiency and performance.
  • Created deployment groups in Informatica to handle code migrations across DEV, QA, and Prod.
  • Developed unit tests and separate User Acceptance Test (UAT) cases to validate data against test scenarios.
  • Created stored procedures in Netezza to load large data volumes as part of the production implementation.
  • Scheduled Informatica jobs using Autosys job scheduler.
  • Generated SQL queries for Cognos reporting and validated the reports.
  • Involved in Performance Tuning of mappings in Informatica.
  • Good understanding of source to target data mapping and Business rules associated with the ETL processes.
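The UAT-style data validation described above often reduces to reconciling the source and target result sets: matching counts, finding keys missing from the target, and flagging rows whose values drifted. A minimal sketch, with hypothetical row shapes (real checks would run SQL against both systems):

```python
def reconcile(source_rows, target_rows, key):
    """Compare source and target result sets for a load-validation check:
    report keys missing from the target and rows whose values differ.
    Row dicts and the key name are illustrative assumptions.
    """
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    missing = sorted(set(src) - set(tgt))          # loaded nowhere
    mismatched = sorted(k for k in set(src) & set(tgt)
                        if src[k] != tgt[k])       # loaded but changed
    return {"source_count": len(src), "target_count": len(tgt),
            "missing_in_target": missing, "mismatched": mismatched}
```

A run that returns empty `missing_in_target` and `mismatched` lists is the pass condition for the test case.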

Environment: Informatica Power Center 9.5, Informatica IDQ 9.5, Informatica Cloud Services, Oracle 11g, Netezza 7.0.4, SQL Server 2008, Autosys, Toad, SAP Business Objects, SalesForce.com

Sr. ETL Developer

Confidential

Responsibilities:

  • Worked closely with the team implementing logical and physical data modeling with Star Schema techniques using Erwin, in the data warehouse as well as the data mart.
  • Involved in the Administration and maintenance of the Informatica Repository.
  • Extracted data from flat files, SQL Server 2005, and XML files using Informatica Power Connect, and performed massive data cleansing, applying all business rules prior to loading into Oracle staging tables.
  • Extensively used Informatica to load data from MS SQL Server, SAP R/3, and DB2 into the target Oracle database.
  • Implemented data loads from different source types, such as flat files, relational databases, and Oracle logs, using Informatica Power Exchange.
  • Performed data profiling and created User-Defined Functions (UDFs)/processes to handle null fields with default values for string, date, and number types.
  • Created packages using SSIS as an ETL tool to transfer and integrate data between databases.
  • Used PMCMD, PMREP and UNIX shell scripts for workflow automation and repository administration.
  • Created data breakpoints and error breakpoints for debugging mappings using the Debugger.
  • Implemented Slowly Changing Dimensions (Type I and II) to capture history, and created a process flow chart for insert/update using effective end dates while loading the data marts.
  • Developed shell scripts, PL/SQL scripts, T-SQL scripts, Stored Procedures for regular Maintenance and Production Support to load the warehouse in regular intervals and to perform Pre/Post Actions.
  • Extensively involved in designing the SSIS packages to export data of flat file source to SQL Server database.
  • Developed BTEQ scripts to extract, transform, and load data into the Teradata database per the Business Requirements Document.
  • Worked on Teradata utilities such as FastLoad, MultiLoad, and FastExport.
  • Extensively worked with Teradata SQL Assistant/NEXUS for testing and analyzing the data.
  • Performed Bulk Load of large volume of data by creating pre- and post-load scripts to drop indexes and key constraints before session and rebuild those indexes and key constraints after the session completes.
  • Developed BI Publisher reports as per the User requirements.
  • Familiar with batch scheduling using Autosys.
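The bulk-load pattern above (drop secondary indexes and key constraints before the session, rebuild them after) can be sketched as pre/post-load SQL generators. Table and index names are placeholders, and the exact DDL syntax varies by database (Oracle-style `DROP INDEX` is shown):

```python
def preload_sql(indexes):
    """Pre-load statements: drop secondary indexes so the bulk load
    does not pay index-maintenance cost per row. Names are hypothetical;
    real scripts would also disable foreign-key constraints."""
    return [f"DROP INDEX {ix['name']}" for ix in indexes]

def postload_sql(table, indexes):
    """Post-load statements: recreate the indexes dropped above once
    the session completes."""
    return [f"CREATE INDEX {ix['name']} ON {table} "
            f"({', '.join(ix['columns'])})" for ix in indexes]
```

In practice these statements would be wired into the session's pre-SQL and post-SQL (or pre/post-session shell scripts), so the load always runs against an index-free target.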

Environment: Informatica Power Center 8.1/8.5/8.6, Informatica Power Exchange 8.1, Informatica Power Connect, Oracle 10g, Teradata, Greenplum, DB2, SSIS, SQL Server 2005, Autosys, Toad 9.0.1, Erwin 4.2, Unix, Siebel CRM, SQL Developer, SQL, T-SQL, PL/SQL.

Informatica consultant

Confidential

Description:

Confidential is a leader in managing prescription drug benefit programs designed to drive down the cost of pharmacy health care for private and public employers, health plans, labor unions, and government agencies of all sizes, and for individuals served by the Medicare Part D Prescription Drug Program. The company serves the needs of patients with complex conditions requiring sophisticated treatment through its specialty pharmacy operation, which became the nation's largest with the 2005 acquisition of Accredo Health, Incorporated.

Responsibilities:

  • Extensive use of B2B Data Transformation for handling vendor data in EDI, unstructured, and complex structured formats (XML schemas).
  • Extensively worked with structured and unstructured data.
  • Worked with HIPAA 5010, which reduces risk and provides flexibility and complete bidirectional transaction crosswalks.
  • Involved in analysis, requirement gathering, and documenting functional and technical specifications.
  • Involved in the preparation of the mapping document for 5010 by identifying the minor changes from 4010.
  • Involved in the installation of B2B plug-ins in the machine.
  • Designed the inbound mappings to load data from HIPAA 5010 834, 270, 271, and 271U files into the database per healthcare industry standards.
  • Created the outbound mappings to generate HIPAA 5010 834, 271, and 270 files from data in the databases.
  • Upgraded Informatica Power Center from 8.1 to 8.6 on the servers.
  • Hands-on experience developing transformations such as Joiner, Aggregator, Expression, Lookup (connected and unconnected), Filter, Union, Stored Procedure, Router, XML Generator and Parser, and Unstructured Data Transformation, using best practices.
  • Configured sessions so that the Power Center server sends an e-mail when a session fails.
  • Extensive use of flat files as sources and targets depending on the inbound and outbound processes.
  • Dealt with large data files, containing up to almost 6 million members in a single file.
  • Involved in the performance tuning of the maps to reduce the runtime for the big files.
  • Extensively worked in the performance tuning of the programs, PL/SQL Procedures and processes.
  • Cleansed data using Trillium and the RTRIM and LTRIM functions.
  • Implemented Slowly Changing Dimensions (Type I and II) in different mappings, per the requirements.
  • Ran the mappings using the third-party tool Tidal and implemented concurrent runs of the workflow for different files at the same time.
  • Fixed minor issues in the built-in Java code of the parsers and serializers.
  • Built XML parser and serializer transformations from the XSD files.
  • Involved in resolving issues in the built-in Java code together with Informatica staff via GoToMeeting.
  • Passed parameters to the workflow directly from Tidal to run the mapping.
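The HIPAA 5010 files handled above (834, 270, 271) are X12 interchanges: segments are terminated by `~` and elements separated by `*`. A toy splitter illustrates the structure; it is deliberately not a compliant parser (real 834 parsing, as done with B2B Data Transformation, must also handle ISA envelopes, sub-element separators, and repetition characters):

```python
def split_x12(raw, seg_term="~", elem_sep="*"):
    """Split a raw X12 string into segments, each a list of elements.
    Illustrative only; separators are assumed, not read from the ISA."""
    segments = [s.strip() for s in raw.split(seg_term) if s.strip()]
    return [seg.split(elem_sep) for seg in segments]

def count_members(segments):
    """Count INS (member-level detail) segments in a parsed 834."""
    return sum(1 for seg in segments if seg[0] == "INS")
```

With files approaching 6 million members, a production version would stream segments rather than split the whole file in memory.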

Environment: Informatica Power Center 9.1/8.6, Data Transformation Studio, Oracle 10g, SQL Server 2005, Tidal, SQL, T-SQL, PL/SQL.

ETL Developer

Confidential

Responsibilities:

  • Designed and Developed mappings needed for enhancement of project.
  • Worked with Informatica Power Center 8.6 Mapping Designer, Workflow Manager, Workflow Monitor, and Admin Console.
  • Exported and imported mappings between different Informatica folders and repositories.
  • Responsible for Administration of Informatica Environment. Created users and groups, configured profiles and privileges.
  • Used the Informatica Debugger to test the mappings and fix bugs.
  • Provided production support for IDS DSS and created Unix shell scripts to delete data older than 3 years from fact tables.
  • Created stored procedures and SSIS/DTS packages for the ETL process.
  • Developed UNIX shell scripts to move data from external systems' flat files to the EDW staging area, and scheduled various workflows to load the data into the target system.
  • Familiar with Data Warehouse Architecture, design methodologies and Best practices.
  • Experience with server log files, editing UNIX scripts, FTP files and checking space on server.
  • Implemented performance tuning techniques on Targets, Sources, Mappings and Sessions.
  • Scheduled Informatica sessions and workflows using Informatica Scheduler.
  • Performed fine-tuning of Informatica mappings.
  • Loaded data into Datamarts on a daily basis.
  • Created new repositories and new folders within the repositories using Repository Manager.
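The 3-year retention purge scripted above comes down to computing a cutoff date and issuing a bounded DELETE. A sketch with a hypothetical fact table and date column (the shell script would feed the generated statement to the database client):

```python
from datetime import date

def purge_statement(table, date_col, asof, years=3):
    """Build a parameterized DELETE for rows older than `years` years
    as of `asof`. Table/column names are placeholders; binding the
    cutoff as a parameter avoids SQL injection and plan-cache churn."""
    cutoff = asof.replace(year=asof.year - years)
    sql = f"DELETE FROM {table} WHERE {date_col} < :cutoff"
    return sql, {"cutoff": cutoff}
```

On large fact tables the delete would normally be batched (or handled by dropping whole partitions) rather than run as a single statement.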

Environment: Informatica Power Center 8.1, Oracle 9i, Windows Server 2003, PL/SQL, SQL Plus, XML, PL/SQL Developer, SSIS, Teradata, DB2, Erwin 4.1, AppWorx 5.1.1, Windows NT/2000.