Informatica Administrator Resume

Rochester, NY

SUMMARY

  • 9+ years of experience in the design, development, administration, and implementation of data warehouses and data marts using Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x as the ETL tool.
  • Expert in extracting and transforming data (ETL) using Informatica PowerCenter and PowerExchange.
  • Experience integrating various data sources with multiple relational databases such as Oracle and SQL Server, and integrating data from fixed-width and delimited flat files.
  • Extensively worked with the Informatica tools: Admin Console, Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Expertise in data warehousing, data migration, and production support.
  • Expert in typical admin tasks such as migrations, FTP requests, performance tuning, installation of patches and hotfixes, and tool upgrades.
  • Extensive experience implementing slowly changing dimension (SCD) types using Informatica PowerCenter 9.x/8.x/7.x.
  • Expertise in dimensional and relational physical and logical data modeling using Erwin and ER/Studio.
  • Expert technical designer in Microsoft Visio 2003.
  • Expert in the analysis, design, coding, and testing of the data analysis and warehouse build phase with Xenomorph TimeScape Workbench 4.0.
  • Experience using SQL Server Integration Services (SSIS), OBIEE, and SQL Server Reporting Services (SSRS).
  • Good experience performing and supporting migrations for unit testing, system integration testing, and UAT, and providing production support for issues raised by application users.
  • Strong in UNIX Korn shell scripting; developed UNIX scripts that invoke workflows through the pmcmd utility (a wrapper sketch follows this list).
  • Scheduled ETL loads using utilities such as AppWorx, crontab, Control-M, Autosys, and ESP Scheduler.
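
A minimal sketch, in Korn shell, of the kind of pmcmd wrapper referenced above. The domain, Integration Service, folder, workflow, and log path are placeholders, and the password is assumed to arrive in an environment variable set by the calling scheduler rather than being hard-coded:

    #!/bin/ksh
    # run_wf_daily_load.ksh -- hypothetical wrapper that starts an Informatica
    # workflow with pmcmd and passes its exit status back to the scheduler.

    INFA_USER=etl_admin              # placeholder account
    INFA_PWD="${INFA_PASSWD}"        # password exported by the calling job, not hard-coded
    DOMAIN=Domain_DEV                # placeholder domain
    INT_SVC=IS_DEV                   # placeholder Integration Service
    FOLDER=EDW_LOADS                 # placeholder repository folder
    WORKFLOW=wf_daily_load           # placeholder workflow

    pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
          -u "$INFA_USER" -p "$INFA_PWD" \
          -f "$FOLDER" -wait "$WORKFLOW"
    RC=$?

    if [ $RC -ne 0 ]; then
        echo "`date`: $WORKFLOW failed with return code $RC" >> /var/log/etl/daily_load.log
    fi
    exit $RC

A crontab entry such as 0 2 * * * /opt/etl/bin/run_wf_daily_load.ksh would run the load nightly at 2 AM; AppWorx, Control-M, Autosys, or ESP Scheduler would call the same script from their own job definitions.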

TECHNICAL SKILLS

Operating Systems: Windows 8/7/Vista/2008/2003/2000/XP/NT, MS-DOS, HP-UX, UNIX, IBM AIX 6.1/4.3/4.2 and Solaris

ETL Tools: Informatica PowerCenter/PowerMart 9.x/8.x/7.x/6.x/5.x, Informatica PowerExchange 7.x/8.x, Informatica PowerConnect, DataStage 7.5.x

Languages: SQL, PL/SQL, C, C++, C#.NET.

Databases: Oracle 11g/10g/9i/8i/8.0/7.x, IBM DB2 UDB 8.0/7.0, Teradata V2R4/V2R5, MS SQL Server 2005/2000/7.0/6.0.

Database Utilities: SQL*Plus, TOAD, stored procedures, functions, exception handling.

Data Modeling Tools/Methodology: MS Visio, Erwin 4.x/3.x, Ralph Kimball methodology, Bill Inmon methodology, star schema, snowflake schema, extended star schema, physical and logical modeling.

Reporting Tools/Others: Business Objects XI R3/R2/R1/6.5/6.1/5.1, Cognos, SSRS (SQL Server Reporting Services).

PROFESSIONAL EXPERIENCE

Confidential, Rochester, NY

Informatica Administrator

Responsibilities:

  • Administered Informatica by creating user logins with appropriate roles, granting and denying login privileges, monitoring user accounts, creating groups, and granting database and application roles to users and groups.
  • Performed daily migrations across environments (EDW, Transformation, Legacy, and OBIEE) using the export/import technique with deployment groups, based on requests raised by users (a pmrep sketch follows this list).
  • Performed file transfers via FTP using various tools.
  • Installed patches and hotfixes and upgraded the tool on a regular basis.
  • Performed regular administration activities, from managing system resources, users, and databases through fine tuning, and implemented the project successfully.
  • Worked on database maintenance, backups, and recovery plans.
  • Configured, tuned, and installed Informatica; integrated data sources such as Oracle, MS SQL Server, XML, and flat files into the staging area; and designed ETL processes that span multiple projects.
  • Implemented performance tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
  • Scheduled and maintained routine jobs, tasks, and alerts using Control-M.
  • Assisted with scripting and monitoring interfaces between UNIX scripts and Oracle.
  • Monitored Informatica session logs, scheduled tasks, data activity, user counts and connections, and intent locks, and eliminated blocking and deadlocks.
  • Performed periodic database checks to maintain data consistency and integrity, and rebuilt indexes.
  • Identified, tested, and resolved database performance issues (monitoring and tuning) to ensure database optimization.
  • Performed troubleshooting and performance tuning for database systems.
  • Provided on-call production support in shifts for after-hours requests such as daily migrations and other miscellaneous requests.
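
A minimal sketch, in Korn shell, of the export/import style of migration described above. The repository, domain, folder, and object names are placeholders, and the exact pmrep switches should be confirmed against the installed PowerCenter version; promoting a whole deployment group would instead use pmrep deploydeploymentgroup with a deployment control file.

    #!/bin/ksh
    # migrate_mapping.ksh -- hypothetical export of one mapping from the DEV
    # repository and import into the TEST repository using pmrep.

    # Connect to the source repository (placeholder names; password from the environment).
    pmrep connect -r REP_DEV -d Domain_DEV -n admin_user -x "$REP_PASSWD"

    # Export a single mapping to XML.
    pmrep objectexport -n m_load_customer -o mapping -f EDW_LOADS \
          -u /tmp/m_load_customer.xml

    # Connect to the target repository and import with an import control file
    # (the control file maps the source folder to the target folder and sets
    # conflict-resolution rules such as replace or reuse).
    pmrep connect -r REP_TEST -d Domain_TEST -n admin_user -x "$REP_PASSWD"
    pmrep objectimport -i /tmp/m_load_customer.xml -c import_control.xml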

Environment: Informatica Admin Console 8.6.1/9.0.1/9.1.0 with HotFix 3, Informatica PowerExchange (CDC, Siebel CRM, SAP tools), Oracle 10g, ODBC 5.2, AIX 6.1, WS_FTP 12.3, PuTTY 0.58, WinSCP 4.3.7, QWS3270 Plus 4.0.5, Password Safe 3.14.0.2123, SQL Server 2005/2008.

Confidential, Portsmouth, NH

Informatica Architect

Responsibilities:

  • Analyzed, designed, and developed a proof of concept (POC) and tested it thoroughly.
  • Based on the feedback, implemented it in the different Oracle GoldenGate environments.
  • Responsible for mapping design and delivery of business data from existing data sources to DT Studio and the Data Quality (IDQ) MDM Hub.
  • Imported all source and target definitions for each Oracle GoldenGate environment.
  • Captured the DataVersion parameters used to filter the Admin Audit.
  • Compared the Admin Audit between two environments and captured the differences.
  • Processed the "not exists" flat file data for verification against the Admin Audit.
  • Processed the flat file data for verification against the Admin Audit (INPUT TABLE).
  • Loaded data from source to an interim staging table (GOLD -> staging table).
  • Manipulated the data from the staging table according to the requirements.
  • Loaded data from the interim table to flat files.
  • Worked on the parameters for Import tool authentication.
  • Captured the delete parameters and created a .csv file for the web services.
  • The web services update the destination environment with the changes captured in GOLD.
  • Extensively worked with Connected and Unconnected Lookup, Router, Expression, Source Qualifier, Web Service, Update Strategy, Filter, and Sequence Generator transformations.
  • Developed reusable transformations to load data from flat files and other data sources to the data warehouse.
  • Parameterized the session connections and integrated workflows with Korn shell scripts (a parameter-file sketch follows this list).
  • Sent data from flat files to the Import tool and web services, captured the responses, and updated the staging table as Processed, Pending, or Failed.
  • Prepared test cases for individual scenarios/use cases and error handling techniques.
  • Worked in an Agile environment with daily stand-up meetings for status reporting.
  • Performed administration by creating user logins with appropriate roles, granting and denying login privileges, monitoring user accounts, and creating groups.
  • Implemented performance tuning logic on targets, sources, mappings, sessions, and workflows.
  • Scheduled and maintained routine jobs, tasks, and alerts using ESP Scheduler.
  • Constructed views and stored procedures using SQL in Teradata to bring data into the database based on the design.
  • Identified, tested, and resolved production issues (monitoring and tuning) to ensure optimization.
  • Performed troubleshooting and performance tuning for database systems.
  • Worked with the offshore team on the design, configuration, and production support of daily database backups; monitored and logged backup jobs and performed data restoration when required.
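
A sketch of the session-connection parameterization and Korn shell integration mentioned above. The folder, workflow, connection, and parameter names are illustrative only; the section header follows the usual [Folder.WF:workflow] convention of PowerCenter parameter files.

    #!/bin/ksh
    # run_gg_compare.ksh -- hypothetical driver that writes a parameter file per
    # GoldenGate environment and starts the compare workflow with it.

    ENV_NAME=$1                               # e.g. GOLD_QA or GOLD_PROD (placeholders)
    PARAM_FILE=/opt/etl/param/wf_admin_audit_${ENV_NAME}.parm

    # Build the parameter file: relational connections plus a data-version filter.
    {
        echo '[EDW_LOADS.WF:wf_admin_audit_compare]'
        echo '$DBConnection_Source=ORA_'"${ENV_NAME}"
        echo '$DBConnection_Target=ORA_STAGE'
        echo '$$DATA_VERSION=20240101'
    } > "$PARAM_FILE"

    # Start the workflow against this environment's connections.
    pmcmd startworkflow -sv IS_DEV -d Domain_DEV \
          -u etl_admin -p "$INFA_PASSWD" \
          -f EDW_LOADS -paramfile "$PARAM_FILE" -wait wf_admin_audit_compare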

Environment: Oracle 11g, Oracle SQL Developer, TOAD, IDQ MDM Hub, DT Studio, SQL Server 2008/2010, Informatica PowerCenter 8.6.1/9.1, Informatica PowerPlug (Siebel CRM, SAP tools), Microsoft Visio, PuTTY, WinSCP.

Confidential, Ashburn, VA

Sr. Informatica Admin & Developer

Responsibilities:

  • Performed data analysis and gathered column metadata from source systems for requirement feasibility analysis and Data Quality (IDQ) MDM Hub solutions.
  • Created logical data models in Erwin 4.1 from the source system study according to business requirements.
  • Performed regular administration activities, from managing system resources, users, and databases through fine tuning, and implemented the project successfully.
  • Installed and configured SQL Server 2005 and 2008 for the development and production environments.
  • Worked on database maintenance, backups, and recovery plans.
  • Configured, tuned, and installed Informatica; integrated data sources such as Oracle, MS SQL Server, XML, and flat files into the staging area; and designed ETL processes that span multiple projects.
  • Administered Informatica and MS SQL Server by creating user logins with appropriate roles, granting and denying login privileges, monitoring user accounts, creating groups, and granting database and application roles to users and groups.
  • Tuned long-running queries and slow-running servers using SQL Profiler and execution plans.
  • Implemented performance tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
  • Scheduled and maintained routine jobs, tasks, and alerts using AppWorx (a status-check sketch follows this list).
  • Assisted with scripting and monitoring interfaces between UNIX scripts, DB2, and Oracle.
  • Monitored SQL error logs, scheduled tasks, data activity, user counts and connections, and locks, and eliminated blocking and deadlocks.
  • Performed periodic database checks with OBIEE to maintain data consistency and integrity, and rebuilt indexes.
  • Constructed PL/SQL using cursors, triggers, and procedures to bring data into the database based on the design.
  • Created, optimized, reviewed, and executed Teradata SQL queries and stored procedures to validate transformation rules used in source-to-target mappings/source views and to verify data in target tables.
  • Identified, tested, and resolved database performance issues (monitoring and tuning) to ensure database optimization.
  • Performed troubleshooting and performance tuning for database systems.
  • Worked extra hours providing on-call production support for daily migrations in coordination with the offshore team in India.
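
An illustration, in Korn shell, of the kind of post-load status check a scheduler job might call, as referenced above. The service, domain, folder, and workflow names are placeholders, and the exact wording of the pmcmd getworkflowdetails status line varies by version, so the grep and case patterns are assumptions.

    #!/bin/ksh
    # check_wf_status.ksh -- hypothetical post-load status check called by an
    # AppWorx job after the load chain finishes.

    WORKFLOW=$1                      # workflow name passed by the scheduler

    STATUS=$(pmcmd getworkflowdetails -sv IS_PROD -d Domain_PROD \
             -u etl_admin -p "$INFA_PASSWD" -f EDW_LOADS "$WORKFLOW" \
             | grep -i "workflow run status")

    echo "$STATUS"
    case "$STATUS" in
        *Succeeded*) exit 0 ;;       # let the downstream jobs continue
        *)           exit 1 ;;       # mark the job failed so an alert is raised
    esac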

Environment: Oracle 9i, PL/SQL, SQL Server 2005/2008, Informatica 9.1 with HotFix 1, Informatica PowerExchange (Siebel CRM, SAP tools), IBM DataStage 7.5, Data Quality, Teradata 12, Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad).

Confidential, Oaks, PA

Database Lead Developer

Responsibilities:

  • Served as technical lead in the analysis, design, coding, and testing of the data analysis and warehouse build phase with TimeScape Workbench.
  • Played a major role in the installation, administration, and configuration of TimeScape Workbench in the development environment.
  • Responsible for mapping design and delivery of business data (Bloomberg) to the data warehouse from existing data sources (Request Builder FTP site).
  • Created the database architecture using TimeScape Workbench 4.0 and Microsoft Excel 2007.
  • Separated the data flow into various categories according to the type of data.
  • Connected TimeScape to a SQL Server 2008 database for administration and DataView.
  • Created complex mappings with Task Definition Files (TDFs) using the XTFileImporter GUI to import the Bloomberg files.
  • Involved in the Informatica server and client installation and environment setup.
  • Used Informatica ETL to minimize the size of the Bloomberg files, filtering on SEI holdings (Geneva and InvestOne).
  • Involved in performance tuning of mappings, parameters, and templates, and configured sessions.
  • Developed reusable TDFs to load data from flat files and other data sources into the Data Quality (IDQ) MDM Hub.
  • Scheduled the import process for daily .DIF files in development using batch files.
  • Imported .OUT files once a month to keep the database in sync.
  • Produced extensive documentation on the design, development, implementation, daily loads, and process flow of the data.
  • Served as an SME (subject matter expert), presenting a common data model that unifies the behavior of each database into one consistent mode of operation.
  • Worked in an Agile environment with daily stand-up meetings for status reporting.
  • Customized data by adding calculations, summaries, and functions.
  • Trapped and resolved data errors in production support.
  • Assisted the operations support team (CTS) with Bloomberg data loads by developing AppWorx job schedules and handling job failures.
  • Responsible for error handling, bug fixing, session monitoring, and log analysis.
  • Helped the e-delivery team successfully create the TimeScape environment in production.
  • Skilled in understanding business requirements and extracting input from vendors.
  • Maintained positive working relationships with vendors and clients.

Environment: Xenomorph TimeScape Workbench 4.0, Informatica PowerCenter 8.6.1/9.1, MDM Hub, Ab Initio, Microsoft Excel 2007/2010, Business Objects XI R2/XI R3, QlikView, OBIEE, Bloomberg Data License Request Builder 5.0.19, XTFileImporter (GUI), SQL Server 2008/2010, MetaFrame Presentation Server Client (Citrix), Windows Server 2003, VMware vSphere Client.

Confidential, Aloha, OR.

Interface Developer

Responsibilities:

  • Involved in the Informatica server installation and environment setup.
  • Developed UNIX scripts for data cleansing and data archiving (an archiving sketch follows this list).
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update Strategy, and Router transformations to populate target tables efficiently.
  • Implemented slowly changing dimension (SCD) Type 1 and Type 2 for change data capture using version control.
  • Involved in the creation of SQL packages, functions, procedures, views, and database triggers.
  • Developed database monitoring and data validation reports in SQL Server Reporting Services (SSRS).
  • Migrated DTS objects to SQL Server Integration Services (SSIS).
  • Designed and developed ODS-to-data mart mappings, sessions, and workflows.
  • Created various Oracle database objects such as indexes, stored procedures, materialized views, synonyms, and functions for data import/export.
  • Created reusable worklets and workflows.
  • Also involved in client-server applications using the MVVM and MVC patterns and WCF, developing in ASP.NET and Silverlight 3.0.
  • Involved in debugging and troubleshooting Informatica mappings.
  • Populated error tables as part of the ETL process to capture the records that failed the migration.
  • Used CDC for moving data from source to target.
  • Designed the ETL process using the Data Quality (IDQ) MDM Hub to load from sources to targets through data transfer.
  • Worked closely with the production support team based in India.
  • Developed test cases for unit, integration, and system testing.
  • Gained retail industry experience with the manufacturing and sales of Confidential chips.
  • Maintained the Repository Manager, creating repositories, user groups, and folders, and migrating code from Dev to Test and Test to Prod environments.
  • Partitioned sessions for better performance.
  • Designed ETL mappings for CDC (change data capture).
  • Wrote SQL and PL/SQL scripts to extract data from databases.
  • Developed UNIX shell and Perl scripts using the pmcmd utility.
  • Prepared the Technical Design Document (TDD) covering the design, development, implementation, daily loads, and process flow of the mappings.
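
A minimal sketch, in Korn shell, of the kind of cleansing and archiving script mentioned above; the directory paths, the *.dat file pattern, and the 30-day retention window are assumptions.

    #!/bin/ksh
    # archive_src_files.ksh -- hypothetical cleanup of processed source files:
    # strip carriage returns from new feeds, then compress and move files
    # that have not been modified for 30 days.

    SRC_DIR=/data/etl/inbound        # placeholder landing directory
    ARC_DIR=/data/etl/archive        # placeholder archive directory

    # Basic cleansing: drop Windows carriage returns from each landed feed.
    for f in "$SRC_DIR"/*.dat; do
        [ -f "$f" ] || continue
        tr -d '\r' < "$f" > "$f.clean" && mv "$f.clean" "$f"
    done

    # Archiving: compress and relocate older files.
    find "$SRC_DIR" -name '*.dat' -mtime +30 | while read f; do
        gzip "$f" && mv "$f.gz" "$ARC_DIR/"
    done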

Environment: Informatica PowerCenter 8.6.0/8.1.1, PowerExchange, PowerPlug, Erwin 4.0, UNIX shell scripting, Oracle 9i/10g/11g, PL/SQL, IIS 7.0, SQL Server 2005/2008, Teradata SQL, Korn shell scripting, MultiLoad, Data Quality (MDM Hub), TFS, Windows 2008, TOAD 9.7.2, Tibco.

Confidential, Chicago, IL.

ETL Administrator/Developer

Responsibilities:

  • Played a major role in understanding the business requirements and in designing and loading the data into the data warehouse using ETL.
  • Used ETL (Informatica) to load data from source systems to the ODS.
  • Participated in performance tuning of ETL maps at the mapping, session, source, and target levels, and wrote complex SQL queries against the abstract data model.
  • Used techniques like source query tuning, single pass reading and caching lookups to achieve optimized performance in the existing sessions.
  • Used Informatica client tools - Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, and Transformation Developer for defining Source & Target definitions and coded the process of data flow from source system to data warehouse.
  • Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status.
  • Used Informatica for data migration to decrease risk and minimize errors.
  • Created complex Informatica mappings, reusable objects of Mapplets depending on client requirements.
  • Responsible for performance tuning for several ETL mappings, Mapplets, workflow, session executions.
  • Designed databases (relational and dimensional models) using Erwin.
  • Strong in UNIX shell and Perl scripting.
  • Used UNIX shell scripting to automate the process, invoking PL/SQL procedures and Informatica sessions (a sketch of this pattern follows this list).
  • Designed and Developed Oracle PL/SQL procedures, performed Data Import/Export, Data Conversions and Data Cleansing operations.
  • Designed the ETL process using the PowerExchange tool to load from enterprise sources to SAP tools through data transformations.
  • Developed various reports using SQL Server Reporting Services and Crystal Reports.
  • Worked in an Agile environment with daily stand-up meetings for status reporting.
  • Worked extensively on distribution, maintenance, and optimization of universes for Business Objects, QlikView, and Web Intelligence deployments.
  • Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Participated in the development of catalogs, multiple queries with complicated reports and query definition files using various join and filter conditions in Impromptu.
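
A sketch, in Korn shell, of the shell-automation pattern referenced above: call a PL/SQL procedure through SQL*Plus, and start the Informatica workflow only if it succeeds. The schema, procedure, connection string, and workflow names are placeholders.

    #!/bin/ksh
    # run_ods_load.ksh -- hypothetical chain: run a PL/SQL pre-load procedure,
    # then start the Informatica workflow only if the procedure succeeded.

    # Step 1: call the procedure through SQL*Plus; WHENEVER SQLERROR makes
    # sqlplus exit non-zero if the procedure raises an error.
    {
        echo "WHENEVER SQLERROR EXIT FAILURE"
        echo "EXEC stg_owner.prc_prepare_stage;"
        echo "EXIT"
    } | sqlplus -s "$ORA_USER/$ORA_PASSWD@ODSDB"

    if [ $? -ne 0 ]; then
        echo "`date`: pre-load procedure failed, workflow not started"
        exit 1
    fi

    # Step 2: start the ODS load workflow and wait for it to finish.
    pmcmd startworkflow -sv IS_PROD -d Domain_PROD \
          -u etl_admin -p "$INFA_PASSWD" \
          -f ODS_LOADS -wait wf_ods_load
    exit $?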

Environment: Informatica PowerCenter 8.x/7.x, DB2, Erwin 4.0, UNIX shell scripting, Oracle 9i/10g, PL/SQL, Microsoft BizTalk Server 2006, IIS 6.0, TFS, Windows 2003, SQL Server 2005/2000, Teradata SQL, Korn shell scripting, Siebel CRM, SAP tools, Cognos ReportNet MR1.1.

Confidential, Boston, MA

ETL Informatica Administrator/Developer

Responsibilities:

  • Involved in the design, development, and implementation of the ETL process in PowerCenter.
  • Responsible for managing, scheduling, and monitoring the workflow sessions.
  • Developed transformation logic to cleanse the source data of inconsistencies before loading it into the staging area, which is the source for stage loading.
  • Created complex Informatica mappings, reusable objects of Mapplets depending on client requirements.
  • Used Informatica Data Transfer and Data Exchange to integrate data safely with the internal systems.
  • Responsible for performance tuning for several ETL mappings, Mapplets, workflow session executions.
  • Used Data Analyzer to track and tune data flow performance and to enrich and speed integration.
  • Used data profiling to minimize the risk of data quality problems proliferating throughout the organization.
  • Extensively worked on Connected & Unconnected Lookups, Router, Expressions, Source Qualifier, Aggregator, Filter, Sequence Generator, Union, XML, etc.
  • Involved in conceptual, logical and physical data modeling and used star schema in designing the data warehouse.
  • Responsible for Error handling, bug fixing, Session monitoring, log analysis.
  • Implemented various data transfers using slowly changing dimensions.
  • Worked with the PowerMart client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Wrote a number of stored procedures to validate the complex mappings and automated the validation process (a validation sketch follows this list).
  • Involved in designing the logical data model using the Erwin tool.
  • Performed optimization of SQL queries in SQL Server.
  • Imported Data from MS-Access Database and XML to SQL Server Database using DTS.
  • Generated scripts for data migration and data validation as specified in the technical design.
  • Installed and configured the ETL tool Pentaho for data warehousing.
  • Wrote SQL and PL/SQL scripts to extract data from databases.
  • Developed UNIX and Perl scripts using the pmcmd utility.
  • Migrated DTS Packages to SSIS packages.
  • Strong leader with experience in SSIS training and in advising developers on SSIS tuning.
  • Involved in the optimization of SQL queries which resulted in substantial performance improvement for the conversion processes.
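
A sketch of how the validation stored procedures mentioned above might be automated from the shell: the procedure is assumed to raise an Oracle error on a mismatch, so the scheduler sees a non-zero exit code. The owner, procedure name, connection string, log path, and alert address are all placeholders.

    #!/bin/ksh
    # validate_daily_load.ksh -- hypothetical driver for a post-load validation
    # procedure; any raised error propagates as a non-zero sqlplus exit status.

    LOG=/var/log/etl/validate_$(date +%Y%m%d).log

    {
        echo "WHENEVER SQLERROR EXIT FAILURE"
        echo "SET SERVEROUTPUT ON"
        echo "EXEC dw_owner.prc_validate_daily_load(TRUNC(SYSDATE));"
        echo "EXIT"
    } | sqlplus -s "$ORA_USER/$ORA_PASSWD@DWHDB" >> "$LOG" 2>&1

    if [ $? -ne 0 ]; then
        echo "`date`: validation failed, see $LOG" \
            | mailx -s "DW validation failure" etl_oncall@example.com
        exit 1
    fi
    exit 0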

Environment: Informatica PowerCenter 8.x, Oracle 9i/10g, MS SQL Server, SQL, PL/SQL, UNIX shell scripting, SSIS, Microsoft BizTalk Server 2006, IIS 7.0, TFS, Windows 2008, Teradata, Sybase, Cognos 8.1, Tibco.

Confidential, Kansas City, MO

Informatica Admin and Developer

Responsibilities:

  • Involved in the analysis, design, coding and testing of the data analysis and warehouse build phase (ETL).
  • Responsible for the design of mapping and delivering the Business data for the data warehouse from existing data sources.
  • Created complex mappings with transformations including Expression, Lookup, Rank, Normalizer, Update Strategy, and Stored Procedure.
  • Developed Mapplets, Mappings and configured sessions.
  • Developed reusable transformations to load data from flat files and other data sources to the Data warehouse.
  • Installation and Configuration of Designer, ODS, Reports, Portal and Broadcast Server.
  • Developed the system front end using ASP.NET and HTML, with SQL Server 2000 as the back-end database.
  • Customized data by adding calculations, summaries and functions.
  • Trapped data errors.
  • Performed performance tuning of sessions and mappings.
  • Assisted the operations support team with transactional data loads by developing SQL*Loader and UNIX scripts (a loader sketch follows this list).
  • Created reports using Business Objects.
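
A minimal sketch, in Korn shell, of the SQL*Loader style of load mentioned above; the table, feed file, column list, and connection string are placeholders.

    #!/bin/ksh
    # load_txn.ksh -- hypothetical SQL*Loader call for a transactional feed.
    # Assumes a control file txn.ctl along these lines:
    #   LOAD DATA
    #   INFILE '/data/feeds/txn_daily.dat'
    #   APPEND INTO TABLE stg_transactions
    #   FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    #   (txn_id, account_id, txn_date DATE "YYYY-MM-DD", amount)

    sqlldr userid="$ORA_USER/$ORA_PASSWD@DWHDB" control=txn.ctl \
           log=txn.log bad=txn.bad errors=50
    RC=$?

    # SQL*Loader exits 0 on success and 2 when some rows were rejected.
    if [ $RC -ne 0 ]; then
        echo "`date`: sqlldr exited with $RC, check txn.log and txn.bad"
    fi
    exit $RC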

Environment: Oracle 9i, PL/SQL, SQL*Plus, Informatica PowerCenter 5.1, DB2, UNIX scripting (ksh), Business Objects 5.1, Teradata, SQL, Sybase, PowerBuilder, Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad).

Confidential

Data Warehouse and Informatica Developer

Responsibilities:

  • Analyzed the systems, architecture, met with end users and business units in order to define the requirements.
  • Involved in the requirement definition and analysis in support of Data Warehousing efforts.
  • Developed ETL mappings, transformations using Informatica PowerCenter 6.x.
  • Extensively used Informatica tools - Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Implemented various Informatica transformations such as Expression, Filter, Aggregator, Router, Rank, Update Strategy, Joiner, Lookup, Sequence Generator, Stored Procedure, and Source Qualifier to load the data from the staging area to the data warehouse.
  • Developed data Mappings between source systems and warehouse components using Mapping Designer.
  • Involved in analyzing source systems and designing the processes for extracting, transforming, and loading the data.
  • Responsible for creating DLLs for report design in HTML format and calling them from ASP pages to provide more security for Crystal Reports.
  • Involved in performance tuning.
  • Developed UNIX shell and Perl scripts using the pmcmd utility.
  • Created and ran sessions and workflows using Workflow Manager.
  • Monitored the sessions and workflows using Workflow Monitor.
  • Troubleshot connectivity problems; reviewed session, event, and error logs for troubleshooting.
  • Helped generate reports using Cognos PowerPlay.

Environment: Informatica PowerCenter 6.x, Cognos 6.x, Oracle 9i, Erwin, UNIX shell scripts, TOAD, SQL*Loader, IIS, Windows NT, Teradata.
