
Informatica Integration Developer Resume

Houston, TX

SUMMARY

  • 13+ years of professional experience as a software developer, including 9+ years designing and developing Data Warehouse, Data Migration, and Data Conversion projects as an Informatica ETL consultant, along with client/server applications.
  • Carried out the full data warehouse life cycle on projects: requirement analysis, modeling and design, database development, and end-user application development.
  • 12+ years of experience as an Informatica ETL expert, covering online analytics, system analysis, creating technical design documents and project plans, and managing Data Warehouse projects.
  • Strong understanding of Conceptual Data Modeling (CDM), Logical Data Modeling (LDM), Physical Data Modeling (PDM).
  • Experienced in creating data models, building Informatica ETL processes, dimensional data modeling (primarily star schema), performing data migration, defining user interfaces, executing large projects, and assuring quality and deployment.
  • Developed staging areas to Extract, Transform, and Load new data from OLTP databases to the warehouse.
  • Strong in Dimensional Modeling, Star/Snowflake Schema, Extraction, Transformation & Load, and Aggregates.
  • Strong in converting business requirements to project documentation and technical specifications.
  • Extensive experience using business intelligence reporting tools such as Business Objects, Cognos BI Suite, and OBIEE for warehouses and data marts.
  • Sound knowledge of Data warehousing concepts and the Informatica ETL tool.
  • Excellent communication, presentation, interpersonal, and analytical skills.

TECHNICAL SKILLS

Languages: C, Java (JDK 1.2, Servlets, JSP), J2EE, SQL, PL/SQL, T-SQL, NZ-SQL.

Data warehousing Tools: Informatica Power Center 9.6.1/9.5.1/9.1.0/8.6.1/8.5.1/7.1, Informatica Power Exchange, Informatica Metadata Manager, Informatica Data Quality, Informatica Data Explorer, DataStage Server Edition, Talend Open Studio, Oracle Data Integrator, Talend Data Profiler, Business Objects 5.1/6.5, Cognos BI Suite 6.0/7.2, ReportNet 1.1, QlikView (Netezza), OBIEE.

Data Modeling Tools: Erwin, Embarcadero.

RDBMS: Oracle Exadata, Oracle 11g/10g/9i/8i/8.0, DB2, Oracle EBS 11g, Siebel 7.8, MS SQL Server 2012/2008 R2/2005, Sybase, Teradata (BTEQ, FastLoad, MultiLoad, TPump, SQL Assistant, FastExport), Netezza TwinFin 3/6/Skimmer.

Script Languages: Perl, JavaScript, Korn shell script. Application Server: IBM WebSphere 5.

Others: Citrix, Telnet, PL/SQL Developer, Toad 9.7.2/6.5, TPump, ISQL, Aginity Workbench, Management Studio, Visio Pro, COBOL, IMS, VSAM

Operating System: Windows XP/2000, Windows NT, UNIX (AIX, HP-UX, SCO), Linux, Sun Solaris

PROFESSIONAL EXPERIENCE

Confidential, Houston, TX

Informatica Integration Developer

Responsibilities:

  • Worked on Requirement Gathering and Business Analysis
  • Analyzed the data models of legacy implementations, identifying the sources for various dimensions and facts for different data marts according to star schema design patterns.
  • Worked with multiple sources such as Relational Tables, Flat Files, Excel Sources, ERP (SAP-BW) sources for Extraction using Source Analyzer/PowerExchange
  • Extensively involved in the Analysis, Design and Modeling. Worked on Star Schema, Data Modeling, Data Elements.
  • Converted views, materialized views, stored procedures, functions from Oracle to Teradata.
  • Created Informatica mappings/sessions/workflows for data migration from Oracle to Teradata.
  • Created CDC (change data capture) sources in Power Exchange and imported that into Power Center
  • Designed the ETL process for extracting data from heterogeneous source systems, transform and load into Data Mart.
  • Worked on creating the Informatica jobs to load the data into the S3 buckets of AWS.
  • Used Informatica Power Connect for AWS Redshift to load the data.
  • Validated the Redshift target data using Aginity Workbench for Redshift.
  • Optimized the API calls to get data and load it into Redshift.
  • Created Logical and Physical models for production using Erwin 3.5.
  • Worked with Informatica Power Connect to get data from PeopleSoft XLATTABLE table and modules like General Ledger, Accounts Payable, Accounts Receivable, and Asset Management.
  • Created Informatica mappings with PL/SQL Procedures/Functions to build business rules to load data.
  • Performance tuned the workflows by identifying the bottlenecks in targets, sources, mappings, sessions and workflows and eliminated them.
  • Provided production support including error handling and validation of mappings, sessions and workflow.
  • Extensively used Debugger Process to test data and applying Break Points while Session is running.
  • Provided production support for Business Users and documented problems and solutions for running the workflow.
  • Developed UNIX scripts for scheduling the jobs (a minimal pmcmd sketch follows this list).
  • Designed and developed Oracle PL/SQL Procedures.
  • Performance tuning of Oracle PL/SQL Scripts
  • Worked on SpotFire reports for Flare Forecasting and Emissions.
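
For illustration, a minimal sketch of the kind of UNIX scheduling wrapper referenced above; the integration service, domain, folder, workflow, and log path names are placeholders, not actual project values:

    #!/bin/ksh
    # Start a workflow via pmcmd and log a failure (all names are assumed).
    INFA_USER=etl_user
    INFA_PWD=$(cat /secure/.infa_pwd)      # assumed credential file
    pmcmd startworkflow -sv IS_DEV -d Domain_DEV \
        -u "$INFA_USER" -p "$INFA_PWD" \
        -f FLD_STAGING -wait wf_stg_daily_load
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "$(date): wf_stg_daily_load failed, rc=$rc" >> /var/log/etl/wf_stg.log
        exit $rc
    fi

A wrapper like this can then be invoked from an Autosys job definition.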

Environment: Informatica Power Center 10.1.0, Amazon Web Services (AWS) EC2 instance, SoftNAS, AWS Redshift, SpotFire, Erwin 7.2, Oracle 11g, Oracle Exadata, XML, Salesforce.com (SFDC), SQL Server 2014, Teradata 15.10.1.3, Teradata SQL Assistant, Teradata Studio, SQL Server Management Studio, Sun Solaris, Windows XP, Autosys.

Confidential, Wayne, NJ

Informatica Architect/ Developer

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Worked on populating the SAP objects like Customer Master, Material Master, Vendor Master etc.
  • Created views in Oracle for SAP data pre-validations.
  • Involved in the development of Informatica mappings and tuned for better performance.
  • Created the load readiness reports (LRR) to load into SAP.
  • Created scorecards for redundant data and created validation rules using Informatica.
  • Performed data profiling for column-level and cross-table data validation.
  • Parsed the target data in IDQ using the Parser transformation.
  • Worked on data integration from various sources like JDE-AUTO, JDE-IND, Marine, Vessels, etc., into SAP.
  • Maintained contact information in SFDC using Informatica CC 360.
  • Worked on creating the master beans for contact in SFDC using Informatica CC 360.
  • Created the SAP lookup tables for cross reference in Oracle.
  • Configured Power Connect for SAP and maintained sap.ini file entries.
  • Retrieved data from SAP ECC and installed/configured ABAP code.
  • Configured LDAP connector for Informatica in administration console.
  • Created the SAP ABAP data dictionaries and mapped underlying relational tables and views.
  • Created SAP ABAP data classes and indexes.
  • Worked on address validations in SFDC in consolidated tab of Informatica CC 360.
  • Converted/managed LEADS of SFDC using Informatica CC 360.
  • Worked with transparent and pooled tables of SAP ABAP.
  • Extensive experience designing, managing, and administering MDM/DIW objects.
  • Involved in data integration keeping all success factors into consideration.
  • Success factors of the integration include historical data, the data archiving strategy, cache size calculations on the server, etc.
  • Created Mapplets and used them in different Mappings.
  • Used designer debugger to test the data flow and fix the mappings. Tuned Informatica Mappings and Sessions for optimum performance.
  • Provided detailed technical, process, and support documentation, including daily process rollback steps, detailed specifications, and a comprehensive document for each project covering the workflows and their dependencies.
  • Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
  • Retrieved AUTO data from DB2.
  • Worked extensively in PL/SQL to migrate the data from DB2 to the Oracle database (a sketch of this pattern follows this list).
  • Created Schema objects like Indexes, Views, and Sequences.
  • Extracted data from BRD files, flat files, and Oracle and loaded it through Informatica.
  • Production Support and issue resolutions.
  • Integrated data from DB2 to Oracle.
  • Validated the source queries in DB2.
  • Involved in Unit testing of Mappings, Workflows and debugging mappings for failed sessions.
  • Created partitions, SQL overrides in the source qualifier, and session partitions for improving performance.
  • Performed unit testing and system testing of the mappings developed and documented with various scenarios.
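
As a sketch of the DB2-to-Oracle migration pattern mentioned above, the following loads an extract into a stage table and merges it into the target; the table, file, and connection names are invented, and the SQL*Loader control file is assumed to exist:

    #!/bin/ksh
    # Load the extract into the stage table, then merge into the target.
    sqlldr userid=stg_user/"$STG_PWD"@ORCL control=stg_customer.ctl log=stg_customer.log

    sqlplus -s stg_user/"$STG_PWD"@ORCL <<'EOF'
    BEGIN
      MERGE INTO dw_customer t
      USING stg_customer s
      ON (t.customer_id = s.customer_id)
      WHEN MATCHED THEN UPDATE SET t.customer_name = s.customer_name
      WHEN NOT MATCHED THEN INSERT (customer_id, customer_name)
           VALUES (s.customer_id, s.customer_name);
      COMMIT;
    END;
    /
    EOF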

Environment: Informatica Power Center 9.5.1 HF2, Informatica Data Quality 9.5.1, SAP ECC 6.0, Oracle 11g, SQL Server 2014, DB2, PL/SQL, Linux, Putty, WinSCP.

Confidential, Cary, NC

Informatica CDC Consultant

Responsibilities:

  • Evaluated RailInc’s Informatica Power Center and Power Exchange for Oracle CDC architecture.
  • Created the architecture, best practices and ETL standards documents.
  • Worked on creating error and performance metrics reports from the Informatica repository (see the repository query sketch after this list).
  • Generated report to check the data synchronization into multiple systems for resync process.
  • Created the report for the Informatica CDC workflow’s performance.
  • Validated data from the DB2 source and loaded it into Oracle for 650 DB2 tables.
  • Created base tables in DB2 for multidimensional clustering table.
  • Retrieved data from PeopleSoft HR and loaded into Oracle.
  • Worked on creating the dimensional model for Rail Road Network using Ralph Kimball methodology, identifying attributes of dimensions and facts for each subject area.
  • Used ERWin tool for dimensional modeling.
  • Updated PeopleSoft HR Employee, Contingent Worker, POI with/without jobs etc.
  • Retrieved data from PeopleSoft HR for hierarchy information for Pension payees, stock board members, stock non-HR members etc.
  • Created the unit testing, STM document templates for future development.
  • Created the road map for future development: creation of versioned repositories, usage of shared folders, parameterization, etc.
  • Created the data quality road map for the source systems.
  • Worked on creating customized data quality plans for data harmonization, cleansing, and profiling using the Analyst tool.
  • Installed/configured AddressDoctor for Informatica Data Quality.
  • Configured LDAP authentication for Informatica analyst logins.
  • Used numerous Informatica Data Quality transformations, such as Match/Merge, Case Converter, Standardizer, and Labeler.
  • Automated some of the IDQ plans by incorporating them into Power Center as mapplets.
  • Created scorecards for the source data covering redundancy, duplication, etc.
  • Created exception management process flow for bad records and duplicate records.
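
The repository performance reports mentioned above can be sketched as a query against the PowerCenter MX view REP_SESS_LOG; the column list and connection details are assumptions to verify against the repository version in use:

    #!/bin/ksh
    # Report yesterday's session run times and row counts (names assumed).
    sqlplus -s infa_rep/"$REP_PWD"@REPDB <<'EOF'
    SET PAGESIZE 100 LINESIZE 200
    SELECT subject_area,
           session_name,
           successful_rows,
           failed_rows,
           ROUND((session_timestamp - actual_start) * 24 * 60, 1) AS run_minutes
    FROM   rep_sess_log
    WHERE  actual_start >= TRUNC(SYSDATE) - 1
    ORDER  BY run_minutes DESC;
    EOF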

Environment: Informatica Power Center 9.6.1, Informatica Data Quality 9.5, Informatica Power Exchange for Oracle CDC, PeopleSoft HR, IBM DB2 10.5, Oracle 11g, SQL Developer, WinSCP, Putty, Linux, shell scripting.

Confidential, Dallas, TX

Informatica Architect/Admin/Developer

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Created the ETL performance expectations document based on the source data profile results.
  • Captured data volumes, upsert/truncate-and-load strategies, etc., in the integration design document.
  • Incorporated the refresh strategy, historical data maintenance, archiving strategies for the source flat file, Audit Balance and Control (ABC), etc., in the integration design document.
  • Created the technical architecture (Hardware and Software) that will support ETL.
  • Configured Informatica Power Center GRID on Linux platform.
  • Assigned master and worker nodes to GRID in Informatica platform.
  • Created the Informatica data quality plans, created rules, applied Rules to IDQ plans and incorporated the plans as mapplets in Informatica Power Center.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Created High Level and Low-Level design document and ETL standards document.
  • Involved in Extraction, Transformation and Loading (ETL) Process.
  • Installed and configured Informatica 9.5.1 HF3 on Red Hat platform.
  • Wrote a shell script to take a repository backup weekly and archive files older than 30 days on Red Hat (a sketch follows this list).
  • Created the Visio diagram
  • Developed various T-SQL stored procedures, functions and packages.
  • Developed database objects such as SSIS Packages, Tables, Triggers, and Indexes using T-SQL, SQL Analyzer and Enterprise Manager.
  • Worked extensively on Autosys using the CA workload center and JIL Checker.
  • Scheduled Informatica jobs using Autosys.
  • Created dependencies in Autosys and inserted/updated Autosys jobs on the CA Workload Center.
  • Performed T-SQL tuning and optimized long-running report queries on SQL Server 2012.
  • Worked on Informatica BDE for retrieving data from Hadoop’s HDFS file system.
  • Solved T-SQL performance issues using Query Analyzer.
  • Optimized SQL queries, sub queries for SSRS reports.
  • Created the SSRS reports with multiple parameters.
  • Modified the data sets and data sources for SSRS reports.
  • Retrieved data from Oracle EBS and loaded into SQL Server data Warehouse.
  • Worked with the Oracle EBS tables like GL_CODE_COMBINATIONS, GL_LEDGER, GL_PERIODS, GL_JE_SOURCES_TL, AP_CHECKS_ALL, AP_INVOICES_ALL, PO_HEADERS_ALL, PO_LINES_ALL, RA_CUSTOMER_TRX_ALL, SO_LINES_INTERFACE_ALL, etc.
  • Involved in unit testing, Integration testing and User acceptance testing of the mappings.
  • Performance tuned Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and target-based commit interval.
  • Developed SSIS packages and migrated from Dev to Test and then to Production environment.
  • Created the SFDC, Flat File and Oracle connections for AWS Cloud services.
  • Optimized the T-SQL queries and converted PL/SQL code to T-SQL.
  • Standardized the T-SQL stored procedures per the organization's standards.
  • Applied try/catch blocks to the T-SQL procedures.
  • Used the merge statement in T-SQL for upserts into the target tables.
  • Made changes to SSRS financial reports based on users' input.
  • Installed/configured Teradata Power Connect for Fast Export for Informatica.
  • Involved heavily in creating customized Informatica data quality plans.
  • Worked with address and names data quality.
  • Used Proactive monitoring for daily/weekly Informatica jobs.
  • Customized the Proactive Monitoring dashboard with Informatica repository tables like OPB_SESS_TASK_LOG, etc.
  • Resolved skewness in Teradata.
  • Created jobs for Informatica Data Replication/Fast Clone to get data from Oracle and load it into Teradata.
  • Wrote Teradata BTEQ scripts extensively.
  • Installed and configured the Amazon Redshift cloud data integration application for faster data queries.
  • Created JDBC and ODBC connections in Amazon Redshift from the Connect Client tab of the console.
  • Automated administrative tasks of Amazon Redshift like provisioning, monitoring, etc.
  • Aware of the columnar storage, data compression, and zone maps of Amazon Redshift.
  • Extracted data from complex XML hierarchical schemas for transformation and load into Teradata and vice versa.
  • Resolved syntax differences from Teradata to Oracle and documented them.
  • Scheduled the workflows to pull data from the source databases at weekly intervals.
  • Used various performance enhancement techniques to enhance the performance of the sessions and workflows.
  • Created the FTP connection from Tidal to the source file server.
  • Retrieved data from XML, Excel, and CSV files.
  • Archived the source files with timestamp using Tidal Scheduler.
  • Performance tuning on sources, targets, mappings and database.
  • Worked with other teams, such as reporting, to investigate and fix data issues coming out of the warehouse environment.
  • Worked as a production support SME to investigate and troubleshoot data issues coming out of the weekly and monthly processes.
  • Worked with the business to provide a daily production status report covering issues, their priority, and business impact, along with recommended short-term and long-term solutions.
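
A minimal sketch of the weekly backup-and-retention script described above; repository, domain, user, and path names are placeholders:

    #!/bin/ksh
    # Back up the repository, then compress backups older than 30 days.
    BACKUP_DIR=/infa/backups
    STAMP=$(date +%Y%m%d)
    pmrep connect -r REP_PROD -d Domain_PROD -n admin -x "$ADMIN_PWD"
    pmrep backup -o "$BACKUP_DIR/REP_PROD_$STAMP.rep"
    find "$BACKUP_DIR" -name 'REP_PROD_*.rep' -mtime +30 -exec gzip {} \;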

Environment: Informatica Power Center 9.5.1 HF3/9.1.0 HF1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Informatica BDE, Amazon Web Services (AWS) cloud, Amazon Redshift cloud data integrator 10, Business Objects, Erwin 7.2, Oracle 11g, Oracle Exadata, XML, Salesforce.com (SFDC), SQL Server 2008 R2/2012, DB2 8.0/7.0, Team Foundation Server, SQL Server Management Studio, Sun Solaris, Windows XP, Control M.

Confidential, Houston, TX

Informatica Architect/Developer

Responsibilities:

  • Responsible for requirements gathering, analysis, and end-user meetings.
  • Responsible for Business Requirement Documents (BRDs) and converting functional requirements into technical specifications.
  • Responsible for mentoring Developers and Code Review of Mappings developed by other developers.
  • Installed/configured Teradata Power Connect for Fast Export for Informatica.
  • Used the Informatica MDM (Siperian) tool to manage master data of the EDW.
  • Extensively used Teradata utilities like Fast load, Multiload to load data into target database.
  • Pulled data from Epic Clarity, EDI and HIPAA compliance files (X12 files).
  • Loaded data to EDI system and clinical documentation.
  • Installed/configured SAP adaptor and JD Edwards (Power Exchange) for Informatica.
  • Did change data capture (CDC) by using the MD5 function of Informatica.
  • Created the RPD for OBIEE.
  • Created custom IDQ plans and incorporated it into Power Center as mapplet.
  • Worked on creating physical layer, business model, mapping and presentation layer in OBIEE.
  • Created source system containers for OBIEE.
  • Created subject areas in containers for OBIEE.
  • Created narrative reports in OBIEE.
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, and flat files and loaded it into data marts using Informatica.
  • Resolved Skewness in Teradata.
  • Optimized OBIEE dashboard queries.
  • Used Informatica Data Quality (IDQ) to format data from sources and load it into target databases according to business requirements.
  • Created jobs for Informatica data replication fast clone to get data from oracle and load it into Teradata.
  • Created the Visio diagram to show the job dependencies for Control-M jobs.
  • Inserted/updated the Informatica jobs in Control-M by invoking the pmcmd utility of Informatica.
  • Wrote Teradata BTEQ scripts extensively (a minimal BTEQ sketch follows this list).
  • Created custom plans for product name discrepancy checks using IDQ and incorporated the plan as a mapplet into Power Center.
  • Loaded data from flat files to Big Data (1010data).
  • Did bulk loading of Teradata tables using the TPump utility.
  • Used IDQ's standardized plans for address and name cleanups.
  • Extensively used various active and passive transformations like Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank, and Aggregator transformations.
  • Responsible for best practices like naming conventions, performance tuning, and error handling.
  • Responsible for maintaining data quality and data consistency before loading into the ODS.
  • Responsible for performance tuning at the source, target, mapping, and session levels.
  • Created Business Objects universes.
  • Created a denormalized BO reporting layer for BO reports.
  • Solid expertise in using both Connected and Unconnected Lookup transformations.
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
  • Responsible for determining bottlenecks and fixing them with performance tuning.
  • Used Update Strategy DD_INSERT, DD_DELETE, DD_UPDATE, and DD_REJECT to insert, delete, update, and reject items based on the requirement.
  • Worked with Session Logs and Workflow Logs for error handling and troubleshooting in all environments.
  • Responsible for Unit Testing and Integration testing of mappings and workflows.
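
A minimal BTEQ sketch of the kind referenced above; the TDPID, logon, database, and table names are placeholders:

    #!/bin/ksh
    # Check a load's row count and return a non-zero code on SQL errors.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_password;
    SELECT COUNT(*) AS loaded_rows
    FROM   edw.fact_claims
    WHERE  load_dt = CURRENT_DATE;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF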

Environment: Informatica Power Center 9.5.1/9.1.0 HF1, Informatica MDM, SAP BW, Big Data (1010data), Teradata 14, Oracle 10g, MS SQL Server 2012, SAP BW/IDoc, Control-M, TOAD, SQL, PL/SQL, BO XI, OBIEE, Windows XP, UNIX

Confidential, Houston, TX

Sr. Data Warehouse Consultant

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Used the Update Strategy Transformation to update the Target Dimension tables.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Upgraded Informatica from 9.5.1 to 9.6.1 on Linux servers for Dev/Test and Prod environments.
  • Involved in the development of Informatica mappings and tuned them for better performance.
  • Migrated data from Oracle 11g to Oracle Exadata.
  • Created the Oracle Exadata database, users, base tables, and views using a proper distribution key structure.
  • Worked on data integration from various sources into the ODS using ODI (Oracle Data Integrator).
  • Configured Power Exchange for SAP R/3.
  • Retrieved data from SAP R/3.
  • Used Informatica Power Connect for Oracle Exadata to pull data from Oracle Exadata data warehouse.
  • Calculated the KPIs and worked with the end users on OBIEE report changes.
  • Created the RPD for OBIEE.
  • Developed mapping parameters and variables to support connection for the target database as Oracle Exadata and source database as Oracle OLTP database.
  • Created Mapplets and used them in different Mappings.
  • Used designer debugger to test the data flow and fix the mappings. Tuned Informatica Mappings and Sessions for optimum performance.
  • Provided detailed technical, process, and support documentation, including daily process rollback steps, detailed specifications, and a comprehensive document for each project covering the workflows and their dependencies.
  • Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
  • Worked extensively in PL/SQL to migrate the data from Oracle to the Oracle Exadata database.
  • Created Schema objects like Indexes, Views, and Sequences.
  • Extracted data from BRD files, flat files, and Oracle and loaded it through Informatica.
  • Worked with crontab for job scheduling (sample entries follow this list).
  • Production Support and issue resolutions.
  • Involved in Unit testing of Mappings, Workflows and debugging mappings for failed sessions.
  • Created partitions, SQL overrides in the source qualifier, and session partitions for improving performance.
  • Performed unit testing and system testing of the mappings developed and documented with various scenarios.
  • Wrote UNIX shell scripts for repository backups, job scheduling via crontab, etc.
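
Sample crontab entries of the kind used for the scheduling above; the script and log paths are invented for illustration:

    # Nightly load at 2:00 AM; weekly repository backup Sunday at 1:00 AM.
    0 2 * * *   /infa/scripts/run_wf_daily_load.ksh >> /infa/logs/daily.log 2>&1
    0 1 * * 0   /infa/scripts/rep_backup.ksh        >> /infa/logs/backup.log 2>&1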

Environment: Informatica Power Center 9.6.1/9.5.1 HF2, Informatica Power Exchange 8.6.1, Informatica Data Quality 8.6.1, SAP R/3, ODI (Oracle Data Integrator), SQL Server 2008 R2, Oracle 11g, Oracle Exadata, OBIEE, PL/SQL, Linux, Putty, WinSCP.

Confidential, Houston, TX

Sr. Informatica Architect/ T-SQL Developer

Architectural Responsibilities:

  • Created the architectural design of the Informatica /ETL and Data warehouse.
  • Applied patches to Informatica Servers.
  • Worked with the SQL DBAs on a collation change on SQL Server 2012.
  • Applied the Salesforce license to the domain for the development and production environments.
  • Created Informatica best practices document, mapping/Unit testing document templates.
  • Communicated with the networking team on ETL server upgrades to space, memory and processors.
  • Maintained Informatica servers to ensure the integration services, repositories, and servers were up and running, and coordinated with networking teams for server reboots, etc.
  • Created ETL auditing reports for error handling/validations, etc., against the Informatica 'OPB' repository tables.
  • Installed/Configured Informatica 9.1.0 HotFix1 on Development server.
  • Installed/Configured Informatica HotFix3 on development/production environment.
  • Applied EBF on Informatica server for SQL Server native client 11.0.
  • Production support for the daily/weekly and monthly loads.
  • Installed/configured SAP adaptor and JD Edwards (Power Exchange) for Informatica.
  • Configured the SFDC application connection for the full-test and production environments.
  • Configured JD Edwards Power Exchange for Informatica.
  • Standardized the parameter file location for each project in BWParam folder of Informatica.
  • Deployed Informatica workflows from Development
  • Deleted old repository backup files and purged deleted objects from repository in both development and production environments.
  • Supported the development team on ETL standards, naming conventions, best practices and folder structures.
  • Installed/configured Teradata Power Connect for Fast Export for Informatica (a standalone FastExport sketch follows this list).
  • Folder migration from Development to UAT to Production.
  • Created shared/regular folders for Projects and personal development.
  • Designed & developed Informatica mappings, mapplets, worklets and workflows to create load-ready data files for loading Oracle E-Business Suite.
  • Designed and developed Reports for the user Interface, according to specifications given by the Track leader.
  • Worked on converting older ETL jobs, like Cast Iron and CA ADT, to get the data into the data warehouse.
  • Involved in performance tuning at source, target, mapping and session level.
  • Loaded Oracle tables from XML sources.
  • Configured Informatica for the SAP Connector.
  • Extracted data from SAP and loaded into Oracle EBS
  • Introduced the concept of a Data Dashboard to track the technical details, citing the continuous requirement changes and rework needed.
  • Worked on creating physical layer, business model, mapping and presentation layer in OBIEE.
  • Created source system containers for OBIEE.
  • Created subject areas in containers for OBIEE.
  • Created narrative reports in OBIEE.
  • Retrieved data from SAP using Informatica Power Exchange.
  • Supported Integration testing by analyzing and fixing the issues.
  • Created Unit Test Cases and documented the Unit Test Results.
  • Resolved Skewness in Teradata.
  • Defined Interface parameters in Informatica mappings and sessions to initiate the concurrent programs (Stored Procedures) once the Interface tables are loaded.
  • Integrated Data Quality routines in the Informatica mappings to standardize and cleanse the name, address, and contact information.
  • Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans.
  • Used Informatica web services to create work requests/work Items for the end user.
  • Successfully Integrated Multiple XML sources and created a de-normalized, flat-structured file.
  • Created Stored Procedures to validate and load the data from interface tables to the Oracle E-Business Suite internal tables.
  • Staged data in Oracle E-Business Suite stage tables using Power Center in Informatica.
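
As a companion to the FastExport configuration above, a standalone FastExport script can be sketched as follows; the logon, log table, output file, and source table are placeholders:

    #!/bin/ksh
    # Export GL detail rows to a pipe-delimited file (all names assumed).
    fexp <<'EOF'
    .LOGTABLE edw_utl.gl_fexp_log;
    .LOGON tdprod/etl_user,etl_password;
    .BEGIN EXPORT SESSIONS 4;
    .EXPORT OUTFILE /data/out/gl_extract.dat MODE RECORD FORMAT TEXT;
    SELECT TRIM(gl_account) || '|' || TRIM(CAST(gl_amount AS VARCHAR(20)))
    FROM   edw.gl_detail;
    .END EXPORT;
    .LOGOFF;
    EOF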

Environment: Informatica Power Center 9.5.1 HF3/9.1.0 HF1, Informatica Power Exchange 8.6.1, Informatica Data Quality 8.6.1, Informatica Data Explorer 8.6, SQL Server 2008 R2/2012, Oracle 10g, Oracle EBS 12.0.4/5, JD Edwards, SAP, Salesforce.com (SFDC), Informatica/Windows scheduler, Force.com, Teradata 13, OBIEE, PL/SQL, Windows Server 2008 R2 Standard.

Confidential

Sr. Informatica Administrator/Developer/Data Analyst

Responsibilities:

  • Installed and configured Informatica 8.6.1 on UNIX (SunOS 5.11), and configured the High Availability option for the same.
  • Configured the domain with various application services like repository services and integration services.
  • Purged deleted objects from the repository in the DEV and QA environments. The repository is approximately 850 MB and growing; to manage this growth, purging of deleted objects must be performed as part of normal administrative maintenance (1 hour every other day).
  • Performed Informatica PowerCenter HotFix and Tomcat upgrades to meet the organization's security standards.
  • Created and maintained users and groups for the Development and Quality Check repositories.
  • Maintained the repository, purged repository objects, and archived periodic repository backups.
  • Handled workflow errors that required a Service Request to Informatica for problem resolution. This sometimes required backing up and zipping the repository and uploading it to the Informatica FTP site; Informatica examined the repositories for inconsistencies and fixed them, after which the fixed repositories were downloaded and restored in the troubled environment.
  • Revised the standards for the ETL design.
  • Involved in gathering and analyzing business requirements for RS One, conforming to the business rules.
  • Designed and customized the business process using Informatica transformations including SQL Transformation, Source Qualifier, Lookup, Aggregator (incremental update), Expression, Joiner, Filter, Router, and Update Strategy transformations.
  • Created reusable transformation logic and Mapplets for use in multiple mappings, and worked with shared folders and shortcuts.
  • Developed an Informatica mapping to handle dynamic partitions, i.e., to create/add and archive partitions of Oracle database tables.
  • Implemented mappings, sessions, and workflows flexibly using parameter files/global parameters and by following Informatica best practices.
  • Worked with various relational and non-relational sources like flat files (direct/indirect), relational tables, and ERP systems.
  • Created SSIS packages to get the data from AS/400 (.csv files) into SQL Server 2008.
  • Used the Data Conversion transformation in SSIS to get the correct data types into the SQL Server database.
  • Loaded data into Interface tables of Oracle EBS.
  • Wrote extensive validation scripts before loading the data into Oracle EBS.
  • Configured SAP IDoc connector for Informatica.
  • Extensively used Teradata utilities like Fast load, Multiload to load data into target database.
  • Did bulk loading of Teradata table using TPump utility.
  • Retrieved data from SAP IDocs using Informatica connector.
  • Imported data from AS/400 into the dbo schema of SQL Server 2008 through the Import Wizard and stored the process as an SSIS package.
  • Created SSIS packages for importing the XRef tables for the Purchase Orders conversion.
  • Developed and configured various mappings and workflows for reading and writing data to JMS (Java Message Service) queues, using the Application Source Qualifier.
  • Significantly involved in the analysis and implementation of performance tuning techniques for SQL, transformations, mappings, and sessions.
  • Developed and tuned DML/DDL SQL to implement data modeling changes; also developed T-SQL and PL/SQL procedures to handle Informatica job metadata.
  • Provided functional and technical specifications for designing customized workflows and their automation.
  • Implemented MDM and maintained data integrity by eliminating redundancy.
  • Performed change data capture via multi-level scheduling of sessions and workflows.
  • Developed various shell scripts using the pmcmd command-line program to execute and maintain the workflow jobs, as sketched below.
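
One such pmcmd script, sketched with invented service, folder, and workflow names (the status-line format should be verified against the pmcmd version in use):

    #!/bin/ksh
    # Start the facts workflow only if the dimensions workflow succeeded.
    INFA_USER=etl_user
    INFA_PWD=$(cat /secure/.infa_pwd)
    if pmcmd getworkflowdetails -sv IS_PROD -d Domain_PROD \
          -u "$INFA_USER" -p "$INFA_PWD" -f FLD_EDW wf_load_dims |
       grep -q 'Workflow run status: \[Succeeded\]'
    then
        pmcmd startworkflow -sv IS_PROD -d Domain_PROD \
            -u "$INFA_USER" -p "$INFA_PWD" -f FLD_EDW wf_load_facts
    else
        echo "$(date): wf_load_dims did not succeed; skipping wf_load_facts"
        exit 1
    fi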

Environment: Informatica PowerCenter 8.6.1/8.1.1, Erwin Data Modeler, Teradata, SQL Server 2008 R2, SSIS, SSRS, Oracle 11g/10g, Oracle 11.X (EBS), SQL Server Management Studio, TOAD 9.6, SAP IDoc, OBIEE Reporting, Windows NT/2000, UNIX SunOS 5.11, HP LoadRunner 9.50.

Confidential

Sr. Informatica Lead/Administrator

Responsibilities:

  • Created Integration Requirements and Design Specification Document.
  • Provided architectural design for Informatica.
  • Defined and designed flow and standards for Informatica.
  • Presented all of the Informatica tools to the Client, its usage, advantages and disadvantages for them to make up their mind to proceed with specific tools of Informatica.
  • Extracted data from the SalesForce legacy system, SalesVision, and Charles River (trading platform).
  • Documented ETL requirements, translating STM business logic into ETL language.
  • Created projects and jobs in Talend Open Studio.
  • Used basic runs, debugged jobs, and used the metadata wizard, etc., in Talend Open Studio.
  • Led the offshore GDC team of ETL developers, providing them with an in-depth understanding of the architecture, ETL system design, and requirements.
  • Provided the real time solutions with Informatica mappings for the traders to instantaneously react to market opportunities.
  • Did extensive analysis for advanced trading analytics with drill-down capability (by trader, portfolio, etc.).
  • Analyzed data from commodity exchanges (ICE and NYMEX) and pricing sources (LIM and Platts).
  • Created ETL mappings for identification of arbitrage opportunities, optimize a portfolio in real-time, simulate transactions and automatically execute trade strategies with live feeds.
  • Worked closely with data population developers, multiple business units and a data solutions engineer to identify key information that will enhance business decision-making.
  • Used the Informatica Data Explorer (IDE) tool for data profiling.
  • Loaded the relational tables for trade decision support which is consumed by the dashboard for trade decisions.
  • Involved in designing the Data warehouse based on the requirement document using Informatica Power Center 8.6.1.
  • Created a stored procedure to be called from Informatica to fetch NEXTVAL from the Oracle DUAL table (a sketch follows this list).
  • Created a reusable Expression transformation for the meta columns of the Standardization area.
  • Masked data and populated it to the limited trust zone using the Data Masking transformation of Informatica.
  • Used SQL and stored procedures.
  • Used Exceed tool for scheduling the Autosys jobs.
  • Debugged and corrected mappings created by GDC.
  • Fixed numerous bugs with Testers inputs.
  • Created Visio documents for Autosys Production Schedule.
  • Created Production readiness document.
  • Created Autosys documents for one time, Daily loads of data.
  • Used Exceed to execute the Autosys Jobs.
  • Upgraded Informatica Power Center 8.1.1 SP4 to 8.6.1.
  • Configured Informatica Power Exchange add on for SAP (Power Connect)
  • Created Cognos Cubes and developed Cognos reports
  • Used Informatica Data Quality tool for Standardization by referring to the database dictionary tables and populating the flat file dictionaries.
  • Good knowledge of Oracle major upgrade from 10.2.0.4 to 11g
  • Worked with tools: Source Analyzer, Warehouse Designer, Transformation and Mapping Designer, Transformation Developer, Informatica Repository Manager, Workflow Manager, and Informatica Workflow Monitor.
  • Read CSV and tab-delimited files and worked with code pages.
  • Created .csv files from Excel spreadsheets and loaded them into the target Oracle database.
  • Worked with memory caches (static and dynamic) for better throughput of sessions containing Rank, Sorter, Lookup, Joiner, and Aggregator transformations.
  • Wrote UNIX shell scripts extensively.
  • Mentored and tutored Informatica users on the Power Center product suite.
  • Created deployment groups for each iteration releases.
  • Created labels for deployment groups for migration.
  • Managed tools and services of Informatica.
  • Managed user and folder permissions for the developers.
  • Purged old repository objects weekly.
  • Created shell script for repository backup weekly.
  • Developed data Mappings between source systems to Landing and from Standardization to warehouse components using Mapping Designer.
  • Worked extensively on different types of transformations like source qualifier, expression, filter, aggregator, rank, update strategy, lookup, stored procedure, sequence generator, joiner.
  • Did performance tuning on the mappings developed by developers.
  • Wrote PL/SQL Packages, Stored procedures to implement business rules and validations in the actuarial system.
  • Looked up and read session, event and error logs for troubleshooting.
  • Created Informatica unit testing document.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
  • Resolved issues raised by the client/actuarial users, validating the data in the database or the application functionality.
  • Worked closely with Cognos Developers for building cubes and upgrade from Cognos 8.3 to 8.4.1.
  • Optimizing/Tuning mappings/sessions, indexing and partitioning for better performance and efficiency.
  • Created the reusable transformations for better performance.
  • Design and implement data verification and testing methods for data warehouse.
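
The sequence-fetch helper mentioned earlier can be sketched as follows; the sequence, procedure, and connection names are placeholders:

    #!/bin/ksh
    # Create a procedure that returns the next surrogate key value.
    sqlplus -s etl_user/"$ETL_PWD"@ORCL <<'EOF'
    CREATE OR REPLACE PROCEDURE get_next_key (p_key OUT NUMBER) AS
    BEGIN
      -- seq_dim_customer is an invented sequence name
      SELECT seq_dim_customer.NEXTVAL INTO p_key FROM dual;
    END;
    /
    EOF

A Stored Procedure transformation can then call get_next_key to populate surrogate keys.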

Environment: Informatica Power center 8.5.1, Informatica Metadata Manager, Informatica Data Quality 8.6.2, Informatica Data Explorer, Oracle 11g/10g, SalesForce, SalesVision, Charles River, Windows 2003/2008, Sun Solaris, Red Hat Linux, SUSE Linux, Talend Open Studio 4.1, Talend Open Data Profiler, XML Sources, Cognos ePlanning 8.4.1, UNIX as Server and Citrix as Client, Exceed Autosys Tool and UNIX Shell Script.

Confidential

Sr. Informatica Developer/Administrator

Responsibilities:

  • Studied the current OLTP system to understand the existing data structures.
  • Worked with server team regarding storage, memory, event log errors, performance issue, and coordinate outage for maintenance.
  • Used Deployment group migration.
  • Troubleshot and resolved issues with workflow objects.
  • Handled workflow errors that required a service request to Informatica for problem resolution.
  • FTP uploaded the Informatica repository to the Informatica site, downloaded the fixed repository, and restored it in the troubled environment.
  • Performed BULK INSERT using the Transact-SQL statement that implements a bulk data-loading process (a sketch follows this list).
  • Participated in gathering business requirements and carried out a suitable data model for the data mart.
  • Involved in preparing technical design/specifications for data Extraction, Transformation and Loading.
  • Worked with XML sources and targets.
  • Led the team of ETL developers.
  • Provided technical guidance to the team of developers.
  • Worked extensively on Informatica Data Quality (IDQ) for the cleanup of addresses in the Customers table.
  • Have a very good understanding of Informatica version 9, including its new features and data quality.
  • Worked with full transactional drill-down capabilities using a graphical view.
  • Used Normalizer transformation to normalize and de-normalize the data.
  • Worked with Mainframe/COBOL sources using the Normalizer.
  • Involved in a major upgrade of Oracle from 9.2.0.7 to 10.2.0.3.
  • Performed Oracle DBA tasks including cold and hot backups.
  • Performed Oracle SQL tuning, troubleshooting, explain plans, etc.
  • Developed various complex mappings with parameter files, Mapplets and Transformations for migration of data from various existing systems to the new system using Informatica Designer.
  • Involved in data migration from compass3 legacy phase to EDW.
  • Used various Transformations like Expression, Filter, Joiner and Lookups for better data massaging and to migrate clean and consistent data.
  • Provide technical/user documentation and training.
  • Developed PL/SQL triggers.
  • Used Trillium data quality with Informatica.
  • Used High Availability for Informatica and pushdown optimization.
  • Defined, created, and accessed data source layouts and target data warehouse and data mart schemas through Informatica Power Mart Designer.
  • Performed various Informatica administrator tasks including the upgrade of Informatica from version 8.5.1 to 8.6.1.
  • Created new folder and assigned user permissions to the members of the group.
  • Did Data migration, Data cleansing and Aggregation while transformation.
  • Used mapplets to implement the business logic to effect easy change management.
  • Conducted Unit and Systems Tests and proven proficiency in documenting the test case results.
  • Wrote shell scripts to automate the workflows and to FTP the files.
  • Created, scheduled, and monitored workflows and sessions using the Informatica Power Center Server.
  • Involved in Performance tuning of various queries and stored Procedures using Explain plan.
  • Performed various Informatica Administrative tasks & managed Informatica Repository.
  • Performed the data extraction, transformation with business rules, maintained repositories, created users, user groups and developed business reports.
  • Generated reports using Cognos Impromptu & Power Play.
  • Analyzed and explored the reports using slice and dice and drill down.
  • Created different joins according to requirements & database functions.
  • Created Catalog Prompts, Conditions and Calculations.
  • Created and enhanced existing Power Cube as per user requirements.
  • Prepared the test cases, UTP, and review records.
  • Supported process steps under development, test and production environment.
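
A minimal sketch of the BULK INSERT pattern noted above, run here through sqlcmd; the server, database, table, and file path are assumptions:

    #!/bin/ksh
    # Bulk-load a CSV feed into a staging table (names and paths invented).
    sqlcmd -S SQLPROD01 -d StagingDB -E -Q "
    BULK INSERT dbo.stg_orders
    FROM 'D:\feeds\orders.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);"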

Environment: Informatica Power Center 7.9/8.X, Informatica Metadata Manager, Windows 2000 Server, Oracle 8.1, MS SQL Server 2005, T-SQL, Cognos BI Suite 6.0, UNIX, Perl script, Shell script.

Confidential

Sr. Informatica Administrator

Responsibilities:

  • Identified tables/scripts required for the full load, daily delta load, and bi-hourly load.
  • Led the team of data warehouse developers and provided technical and logistical guidance.
  • Documented the ETL process for the full, daily, and bi-hourly loads.
  • Developed and documented the ETL validation process.
  • Created detailed design documents and Informatica objects.
  • Very good understanding of the Novell Network.
  • Extracted data from complex XML hierarchical schemas for transformation and load into Teradata and vice versa.
  • Used XML source analyzer to extract data from XML files.
  • Used Informatica Data Quality (IDQ).
  • Performed Teradata SQL tuning for HP Neoview.
  • Hands-on experience using the SQL Server Integration Services (SSIS) 2008 ETL platform to build ETL-based applications.
  • Used SQL Server Reporting Services (SSRS) with multiple output formats.
  • Created Informatica extract scripts and executed extracts with the relational loader.
  • Used IBM Maximo Asset management for planned and unplanned activities.
  • Built Neoview mappings and loaded the GL data to HP Neoview.
  • Data migration from Teradata to HP Neoview.
  • Performed Oracle DBA tasks and performance tuning.
  • Validated data loads per plan and validated load times.
  • Used CRC incremental logic for complex fact tables.
  • Imported data from the PeopleSoft HR data source.
  • Updated the Tivoli scheduler for the daily and bi-hourly loads for HP Neoview.
  • Installed and upgraded Informatica from 8.1 SP2 to 8.1 SP4 with an EBF memory patch.
  • Upgraded Informatica from 8.1 SP4 to 8.5 on all three servers (test, development, and production).
  • Worked on Informatica fixes and FastExport.
  • Hands-on experience with DataMirror Constellar Hub and Transformation Server.
  • Resolved syntax differences from Teradata to Neoview and documented them.
  • Ran benchmarks and completed tuning.
  • Analyzed query performance and designed the loading process schedules.
  • Resolved migration issues across repositories and for individual objects using objimport and objexport tasks.
  • Worked with performance issues of Informatica mappings and tuned ETL mappings for better performance using various techniques and strategies.
  • Maintained versioning provided by the Informatica tool.
  • Developed UNIX scripts for automation of Informatica jobs.
  • Developed UNIX and PL/SQL scripts for pre- and post-session processes, creating and dropping indexes to automate the daily loads (a sketch follows this list).
  • Identified and resolved numerous technical and operational problems in the datamart design and ETL implementation.
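
A sketch of the pre-/post-session index scripts described above; index, table, and connection names are placeholders:

    #!/bin/ksh
    # Usage: idx_maint.ksh drop|create  (called pre- and post-session)
    case "$1" in
      drop)   SQL="DROP INDEX idx_fact_sales_dt;" ;;
      create) SQL="CREATE INDEX idx_fact_sales_dt ON fact_sales (sale_dt);" ;;
      *)      echo "usage: $0 drop|create"; exit 1 ;;
    esac
    echo "$SQL" | sqlplus -s dw_user/"$DW_PWD"@DWPROD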

Environment: Informatica Power Center 7.1/8.1 SP2/SP4/8.5, Informatica Power Exchange, Teradata, HP Neoview, UNIX shell scripting, Queryman, Oracle 10.2.0.3, Business Objects 6.5, Tivoli scheduler, MS SQL Server.

Confidential

Sr. Informatica Developer

Responsibilities:

  • Assisting in Business Requirements gathering and Analysis.
  • Analyzing the source data coming from Oracle and working with business users and developers to develop the Data Model.
  • Identifying and tracking SCDs, heterogeneous sources and determining dimension hierarchies.
  • Developed a number of complex ETL mappings, mapplets and reusable transformations for daily data loads.
  • Created Informatica mappings with parameter files and transformations (such as Source Qualifier, Aggregator, Filter, Router, Joiner, Sequence, Lookup, and Update Strategy) and tasks (such as Session, Command, Event Wait, Event Raise, and Decision) to build business rules.
  • Worked on IBM Information Server for a different project and have sound knowledge of DataStage.
  • Developed Ab Initio tasks and scripts for another project, the Purchasing data mart, using Ab Initio version 1.15.
  • Used Control-M tool for scheduling Jobs.
  • Used Siebel EIM to populate, update, and delete data in the Siebel database using EIM tables.
  • Loaded Siebel base tables using Siebel EIM.
  • Extracted data from PeopleSoft HR.
  • Performed Oracle DBA tasks including a major upgrade of Oracle from 9.2.0.7 to 10.2.0.3.
  • Performed Oracle performance tuning.
  • Performed Oracle hot and cold backups.
  • Hands-on experience with ColdFusion scripting for the web page connecting to Oracle.
  • Worked with IBM TDW.
  • Creating test cases for Informatica mappings and design documents for production support.
  • Designed the ETL processes using Informatica to load data from Oracle ERP, SQL Server, Flat Files, XML Files, Excel files and Legacy system extracts to target Oracle 10g Data Warehouse database.
  • Worked closely with Software Developers to isolate, track, and troubleshoot defects.
  • Identified performance bottlenecks and fine-tuned ETL mappings and workflows for performance. Created scripts to automate dropping and recreating warehouse table indexes for bulk loads.
  • Analyzed the source data and decided on an appropriate extraction, transformation, and load strategy.
  • Created batch scripts for post-load renames of standalone and legacy system extract files, parameterizing the scripts to make them generic across the board (a shell equivalent is sketched after this list).
  • Worked with business analysts for data testing and validation.
  • Changed the Universe identification per the client requirements to facilitate easy identification for the end users.
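
The post-load rename step can be sketched as a UNIX shell equivalent of those batch scripts; the directories and file pattern are invented:

    #!/bin/ksh
    # Move finished extract files into an archive folder with a timestamp.
    SRC=/data/extracts
    ARC=/data/extracts/archive
    STAMP=$(date +%Y%m%d_%H%M%S)
    for f in "$SRC"/*.dat; do
      [ -e "$f" ] || continue        # nothing to rename if no files match
      mv "$f" "$ARC/$(basename "$f" .dat)_$STAMP.dat"
    done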

Environment: Informatica PowerCenter 6.2/7.1, Informatica Power Exchange, Oracle ERP/10g, Ab Initio 1.15, Siebel 7.8, SQL Server 2003, T-SQL, Erwin 4.1, NT batch scripting, shell scripting, Toad 6.5, Windows NT 4.0, Hyperion System 9.
