
Informatica Integration Developer Resume

Houston, TX

SUMMARY:

  • 13+ years of professional experience as a software developer, including over nine years of experience designing and developing Data Warehouse, Data Migration, and Data Conversion projects as an Informatica ETL consultant, along with client/server applications.
  • Carried out the full data warehouse life cycle: requirement analysis, modeling and design, database development, and end-user application development.
  • Informatica ETL expert with 12+ years of experience in online analytics, system analysis, creating technical design documents and project plans, and managing Data Warehouse projects.
  • Strong understanding of Conceptual Data Modeling (CDM), Logical Data Modeling (LDM), Physical Data Modeling (PDM).
  • Experienced in creating data models, building Informatica ETL processes, dimensional data modeling (primarily star schema), performing data migration, defining user interfaces, executing large projects, and assuring quality and deployment.
  • Developed staging areas to extract, transform, and load new data from the OLTP database to the warehouse.
  • Strong in Dimensional Modeling, Star/Snow Flake Schema, Extraction, Transformation & Load and Aggregates.
  • Strong in converting business requirements to project documentation and technical specifications.
  • Extensive experience in using business intelligence reporting tools such as Business Objects, Cognos BI Suite, and OBIEE for warehouses and data marts.
  • Sound knowledge of Data warehousing concepts and Informatica ETL tool.
  • Excellent communication, presentation, interpersonal, and analytical skills.

TECHNICAL SKILLS:

Languages: C, Java (JDK 1.2, Servlets, JSP), J2EE, SQL, PL/SQL, T-SQL, NZ-SQL.

Data Warehousing Tools: Informatica Power Center 9.6.1/9.5.1/9.1.0/8.6.1/8.5.1/7.1, Informatica Power Exchange, Informatica Metadata Manager, Informatica Data Quality, Informatica Data Explorer, DataStage Server Edition, Talend Open Studio, Oracle Data Integrator, Talend Data Profiler, Business Objects 5.1/6.5, Cognos BI Suite 6.0/7.2, ReportNet 1.1, QlikView (Netezza), OBIEE.

Data Modeling Tools: Erwin, Embarcadero.

RDBMS: Oracle Exadata, Oracle 11g/10g/9i/8i/8.0, DB2, Oracle EBS 11g, Siebel 7.8, MS SQL Server 2012/2008 R2/2005, Sybase, Teradata (BTEQ, FastLoad, MultiLoad, TPump, SQL Assistant, FastExport), Netezza TwinFin 3/6/Skimmer.

Script Languages: Perl, JavaScript, Korn shell script. Application Server: IBM WebSphere 5.

Others: Citrix, Telnet, PL/SQL Developer, Toad 9.7.2/6.5, TPump, ISQL, Aginity Workbench, Management Studio, Visio Pro, COBOL, IMS, VSAM

Operating System: Windows XP / Windows 2000, WinNT, UNIX (AIX, HP, SCO), Linux, Sun Solaris

PROFESSIONAL EXPERIENCE:

Confidential, Houston, TX

Informatica Integration Developer

Responsibilities:

  • Worked on Requirement Gathering and Business Analysis
  • Analyzed the data models of legacy implementations, identifying the sources for various dimensions and facts for different data marts according to star schema design patterns.
  • Worked with multiple sources such as Relational Tables, Flat Files, Excel Sources, ERP (SAP-BW) sources for Extraction using Source Analyzer/PowerExchange
  • Extensively involved in the Analysis, Design and Modeling. Worked on Star Schema, Data Modeling, Data Elements.
  • Converted views, materialized views, stored procedures, functions from Oracle to Teradata.
  • Created Informatica mappings/sessions/workflows for data migration from Oracle to Teradata.
  • Created CDC (change data capture) sources in Power Exchange and imported them into Power Center.
  • Designed the ETL process for extracting data from heterogeneous source systems, transform and load into Data Mart.
  • Worked on creating Informatica jobs to load data into AWS S3 buckets.
  • Used Informatica PowerConnect for AWS Redshift to load the data.
  • Validated the Redshift target data using Aginity Workbench.
  • Optimized the API to get data and load it into Redshift.
  • Created Logical and Physical models for production using Erwin 3.5.
  • Worked with Informatica Power Connect to get data from PeopleSoft XLATTABLE table and modules like General Ledger, Accounts Payable, Accounts Receivable, and Asset Management.
  • Created Informatica mappings with PL/SQL Procedures/Functions to build business rules to load data.
  • Performance tuned the workflows by identifying the bottlenecks in targets, sources, mappings, sessions and workflows and eliminated them.
  • Provided production support including error handling and validation of mappings, sessions and workflow.
  • Extensively used the Debugger to test data, applying breakpoints while sessions were running.
  • Provided production support for Business Users and documented problems and solutions for running the workflow.
  • Developed UNIX scripts for scheduling the jobs.
  • Designed and developed Oracle PL/SQL Procedures.
  • Performance tuning of Oracle PL/SQL Scripts
  • Worked on SpotFire reports for Flare Forecasting and Emissions.
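The Redshift target-data validation above amounts to reconciling source and target row counts after a load. A minimal Python sketch of that reconciliation, using sqlite3 as a stand-in for the actual Oracle source and Redshift target connections (table and data are illustrative):

```python
import sqlite3

def rowcount(conn, table):
    """Row count for a table; the basic source/target reconciliation check."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

# Stand-in databases: in the real flow these would be the Oracle source
# and the Redshift target connections (names here are hypothetical).
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER)")
tgt.execute("CREATE TABLE orders (id INTEGER)")
src.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(100)])
tgt.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(100)])

assert rowcount(src, "orders") == rowcount(tgt, "orders")  # load is complete
```

In practice the same check is extended with checksums or column aggregates per table, but the count comparison is the first gate after every load.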

Environment: Informatica Power Center 10.1.0, Amazon Web Services (AWS) EC2 Instance, SoftNAS, AWS Redshift, SpotFire, Erwin 7.2, Oracle 11g, Oracle Exadata, XML, Salesforce.com (SFDC), SQL Server 2014, Teradata 15.10.1.3, Teradata SQL Assistant, Teradata Studio, SQL Server Management Studio, Sun Solaris, Windows XP, Autosys.

Confidential, Wayne, NJ

Informatica Architect/ Developer

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Worked on populating the SAP objects like Customer Master, Material Master, Vendor Master etc.
  • Created views in Oracle for SAP data pre-validations.
  • Involved in the development of Informatica mappings and tuned for better performance.
  • Created the load readiness reports (LRR) to load into SAP.
  • Created scorecards for redundant data and created validation rules using Informatica.
  • Performed data profiling for column-level and cross-table data validation.
  • Parsed the target data in IDQ using parser transformation.
  • Worked on data integration from various sources like JDE-AUTO, JDE-IND, Marine, Vessels, etc. into SAP.
  • Maintained contact information in SFDC using Informatica CC 360.
  • Worked on creating the master beans for contact in SFDC using Informatica CC 360.
  • Created the SAP lookup tables for cross reference in Oracle.
  • Configured Power Connect for SAP and maintained sap.ini file entries.
  • Retrieved data from SAP ECC and installed/configured ABAP code.
  • Configured LDAP connector for Informatica in administration console.
  • Created the Sap ABAP data dictionaries and mapped underlying relational tables and views.
  • Created SAP ABAP data classes and indexes.
  • Worked on address validations in SFDC in consolidated tab of Informatica CC 360.
  • Converted/managed LEADS of SFDC using Informatica CC 360.
  • Worked with transparent and pooled tables of SAP ABAP.
  • Extensive experience in designing, managing, and administering MDM/DIW objects.
  • Involved in Data integration keeping all success factors into consideration.
  • Success factors of integration include historical data, the data archiving strategy, cache size calculations on the server, etc.
  • Created Mapplets and used them in different Mappings.
  • Used designer debugger to test the data flow and fix the mappings. Tuned Informatica Mappings and Sessions for optimum performance.
  • Provided detailed technical, process and support documentation like daily process rollback and detailed specifications and very detailed document of all the projects with the workflows and their dependencies.
  • Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
  • Retrieved AUTO data from DB2.
  • Worked extensively in PL/SQL to migrate the data from DB2 to Oracle database.
  • Created Schema objects like Indexes, Views, and Sequences.
  • Extracted data from BRD files, flat files, and Oracle and loaded it through Informatica.
  • Production Support and issue resolutions.
  • Integrated data from DB2 to Oracle.
  • Validated the source queries in DB2.
  • Involved in Unit testing of Mappings, Workflows and debugging mappings for failed sessions.
  • Created partitions, SQL over ride in source qualifier, session partitions for improving performance.
  • Performed unit testing and system testing of the mappings developed and documented with various scenarios.
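The column and cross-table profiling described above boils down to per-column statistics such as null counts, distinct counts, and top values. A minimal, illustrative Python sketch of that idea (pure standard library, not the IDQ profiler itself):

```python
from collections import Counter

def profile_column(values):
    """Basic column profile: the counts used to spot nulls and duplicates."""
    non_null = [v for v in values if v is not None]
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "top": Counter(non_null).most_common(1),  # most frequent value
    }

# Sample column data (hypothetical); real profiling runs over source tables.
stats = profile_column(["A", "B", "A", None, "C"])
```

A profile like this is what drives the validation rules and scorecards mentioned above: high null ratios or unexpected duplicates become candidate rules.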

Environment: Informatica Power Center 9.5.1 HF2, Informatica Data Quality 9.5.1, SAP ECC 6.0, Oracle 11g, SQL Server 2014, DB2, PL/SQL, Linux, Putty, WinSCP.

Confidential, Cary, NC

Informatica CDC Consultant

Responsibilities:

  • Evaluated RailInc’s Informatica Power Center and Power Exchange for Oracle CDC architecture.
  • Created the architecture, best practices and ETL standards documents.
  • Worked on creating the error and performance matrix reports from Informatica repository.
  • Generated report to check the data synchronization into multiple systems for resync process.
  • Created the report for the Informatica CDC workflow’s performance.
  • Validated data from the DB2 source and loaded it into Oracle for 650 DB2 tables.
  • Created base tables in DB2 for multidimensional clustering table.
  • Retrieved data from PeopleSoft HR and loaded into Oracle.
  • Worked on creating the dimensional model for Confidential Road Network using Ralph Kimball methodology, identifying attributes of dimensions and facts for each subject area.
  • Used ERWin tool for dimensional modeling.
  • Updated PeopleSoft HR Employee, Contingent Worker, POI with/without jobs etc.
  • Retrieved data from PeopleSoft HR for hierarchy information for Pension payees, stock board members, stock non-HR members etc.
  • Created the unit testing, STM document templates for future development.
  • Created the road map for future developments, creation of versioned repositories, usage of shared folders, usage of parametrization etc.
  • Created the data quality road map for the source systems.
  • Worked on Creating Data quality customized plans for data harmonization, cleansing, profiling using Analyst tool etc.
  • Installed/configured address doctor for Informatica data quality.
  • Configured LDAP authentication for Informatica analyst logins.
  • Used numerous transformations of Informatica data quality like match/merge, case converter, standardizer, labeler etc.
  • Automated some of the IDQ plans by incorporating it into Power Center as mapplets.
  • Created scorecards for the source data for data redundancy, duplicity etc.
  • Created exception management process flow for bad records and duplicate records.
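The exception-management flow for bad and duplicate records above can be sketched as a split on the business key, with incomplete or duplicate rows routed to an exception queue for review. Illustrative Python, with hypothetical record fields:

```python
def split_exceptions(records, key):
    """Route duplicate-key and incomplete records to an exception flow."""
    seen, good, bad = set(), [], []
    for rec in records:
        k = rec.get(key)
        if k is None or k in seen:
            bad.append(rec)      # missing key or duplicate -> exception queue
        else:
            seen.add(k)
            good.append(rec)     # first occurrence with a valid key -> load
    return good, bad

# Hypothetical source records: one duplicate id, one record missing the key.
good, bad = split_exceptions(
    [{"id": 1}, {"id": 2}, {"id": 1}, {"name": "x"}], key="id")
```

The `bad` list is what would feed the exception-management tables for data stewards, while `good` continues to the target load.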

Environment: Informatica Power Center 9.6.1, Informatica Data Quality 9.5, Informatica Power Exchange for Oracle CDC, PeopleSoft HR, IBM DB2 10.5, Oracle 11g, SQL Developer, WinSCP, Putty, Linux, shell scripting.

Confidential, Dallas, TX

Informatica Architect/Admin/Developer

Responsibilities:
  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Created the ETL performance expectations document based on the source data profile results.
  • Captured data volumes, upsert/truncate-and-load strategies, etc. in the integration design document.
  • Incorporated the refresh strategy, historical data maintenance, archiving strategies for the source flat files, Audit Balance and Control (ABC), etc. in the integration design document.
  • Created the technical architecture (Hardware and Software) that will support ETL.
  • Configured Informatica Power Center GRID on Linux platform.
  • Assigned master and worker nodes to GRID in Informatica platform.
  • Created the Informatica data quality plans, created rules, applied Rules to IDQ plans and incorporated the plans as mapplets in Informatica Power Center.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Created High Level and Low-Level design document and ETL standards document.
  • Involved in Extraction, Transformation and Loading (ETL) Process.
  • Installed and configured Informatica 9.5.1 HF3 on Red Hat platform.
  • Wrote a shell script to take repository backups on a weekly basis and archive files older than 30 days on Red Hat.
  • Created the Visio diagram
  • Developed various T-SQL stored procedures, functions and packages.
  • Developed database objects such as SSIS Packages, Tables, Triggers, and Indexes using T-SQL, SQL Analyzer and Enterprise Manager.
  • Worked extensively on Autosys using the CA workload center and JIL Checker.
  • Scheduled Informatica jobs using Autosys.
  • Created dependencies in Autosys, inserted/updated jobs Autosys on CA Workload Center.
  • Performed T-SQL tuning and optimized long-running report queries on SQL Server 2012.
  • Worked on Informatica BDE for retrieving data from Hadoop’s HDFS file system.
  • Solved T-SQL performance issues using Query Analyzer.
  • Optimized SQL queries, sub queries for SSRS reports.
  • Created the SSRS reports with multiple parameters.
  • Modified the data sets and data sources for SSRS reports.
  • Retrieved data from Oracle EBS and loaded into SQL Server data Warehouse.
  • Worked with the Oracle EBS tables like GL_CODE_COMBINATIONS, GL_LEDGER, GL_PERIODS, GL_JE_SOURCES_TL, AP_CHECKS_ALL, AP_INVOICE_ALL, PO_HEADERS_ALL, PO_LINES_ALL, RA_CUSTOMER_TRX_ALL, SO_LINES_INTERFACE_ALL, etc.
  • Involved in unit testing, Integration testing and User acceptance testing of the mappings.
  • Performance tuned Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval
  • Developed SSIS packages and migrated from Dev to Test and then to Production environment.
  • Created the SFDC, Flat File and Oracle connections for AWS Cloud services.
  • Optimized the T-SQL queries and converted PL/SQL code to T-SQL.
  • Standardized the T-SQL stored procedures per the organizations standards.
  • Applied try/catch blocks to the T-SQL procedures.
  • Used merge statement in T-SQL for upserts into the target tables.
  • Made changes to SSRS financial reports with user’s input.
  • Installed/configured Teradata Power Connect for Fast Export for Informatica.
  • Involved heavily in creating customized Informatica data quality plans.
  • Worked with address and names data quality.
  • Used Proactive monitoring for daily/weekly Informatica jobs.
  • Customized the proactive monitoring dashboard with the Informatica repository tables like OPB_SESS_TASK_LOG, etc.
  • Resolved Skewness in Teradata
  • Created jobs for Informatica data replication fast clone to get data from oracle and load it into Teradata.
  • Wrote BTEQ scripts of Teradata extensively.
  • Installed and configured the Amazon Redshift cloud data integration application for faster data queries.
  • Created JDBC and ODBC connections in Amazon Redshift from the Connect Client tab of the console.
  • Automated administrative tasks of Amazon Redshift such as provisioning and monitoring.
  • Aware of the columnar storage, data compression, and zone maps of Amazon Redshift.
  • Extracted data from complex XML hierarchical schemas for transformation and load into Teradata and vice versa.
  • Resolved syntax differences between Teradata and Oracle and documented them.
  • Scheduled the workflows to pull data from the source databases at weekly intervals.
  • Used various performance enhancement techniques to enhance the performance of the sessions and workflows.
  • Created the FTP connection from Tidal to the source file server.
  • Retrieved data from XML, Excel, and CSV files.
  • Archived the source files with timestamp using Tidal Scheduler.
  • Performance tuning on sources, targets, mappings and database.
  • Worked with other teams, such as reporting, to investigate and fix data issues coming out of the warehouse environment.
  • Worked as production support SME to investigate and troubleshoot data issues coming out of Weekly and Monthly Processes.
  • Worked with the business to provide a daily production status report covering issues, their priority, and business impact, along with recommended short-term and long-term solutions.
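The T-SQL merge-statement upserts mentioned above follow the "update when matched, insert when not matched" pattern. A hedged sketch of that pattern using SQLite's equivalent `ON CONFLICT` clause (table and column names are made up for the example; the production code was T-SQL `MERGE`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO dim VALUES (1, 'old')")

# Upsert loop: existing key 1 is updated, new key 2 is inserted,
# mirroring MERGE's matched/not-matched branches.
for row in [(1, "new"), (2, "added")]:
    conn.execute(
        "INSERT INTO dim (id, name) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        row,
    )

rows = conn.execute("SELECT id, name FROM dim ORDER BY id").fetchall()
```

The same target-table semantics apply whether the engine is SQL Server, Teradata, or an Informatica update-strategy mapping: one pass handles both new and changed keys.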

Environment: Informatica Power Center 9.5.1 HF3/9.1.0 HF1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Informatica BDE, Amazon Web Services (AWS) cloud, Amazon Redshift cloud data integrator 10, Business Objects, Erwin 7.2, Oracle 11g, Oracle Exadata, XML, Salesforce.com (SFDC), SQL Server 2008 R2/2012, DB2 8.0/7.0, Team Foundation Server, SQL Server Management Studio, Sun Solaris, Windows XP, Control M.

Confidential, Houston, TX

Informatica Architect/Developer

Responsibilities:

  • Responsible for Requirement Gathering Analysis and End User Meetings
  • Responsible for Business Requirement Documents BRD's and converting Functional Requirements into Technical Specifications.
  • Responsible for mentoring Developers and Code Review of Mappings developed by other developers.
  • Installed/configured Teradata Power Connect for Fast Export for Informatica.
  • Used Informatica MDM (Siperian) tool to manage master data of the EDW.
  • Extensively used Teradata utilities like Fast load, Multiload to load data into target database.
  • Pulled data from Epic Clarity, EDI and HIPAA compliance files (X12 files).
  • Loaded data to EDI system and clinical documentation.
  • Installed/configured SAP adaptor and JD Edwards (Power Exchange) for Informatica.
  • Did change data capture (CDC) by using the MD5 function of Informatica.
  • Created the RPD for OBIEE.
  • Created custom IDQ plans and incorporated it into Power Center as mapplet.
  • Worked on creating physical layer, business model, mapping and presentation layer in OBIEE.
  • Created source system containers for OBIEE.
  • Created subject areas in containers for OBIEE.
  • Created narrative reports in OBIEE.
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, and flat files and loaded it into data marts using Informatica.
  • Resolved Skewness in Teradata.
  • Optimized OBIEE dashboard queries.
  • Used Informatica Data Quality (IDQ) to format data from sources and load it into target databases according to business requirements.
  • Created jobs for Informatica data replication fast clone to get data from oracle and load it into Teradata.
  • Created the Visio diagram to show the job dependencies for Control-M jobs.
  • Inserted/updated the Informatica jobs in Control-M by invoking the pmcmd utility of Informatica.
  • Wrote BTEQ scripts of Teradata extensively.
  • Created custom plans for product name discrepancy checks using IDQ and incorporated the plan as a mapplet into Power Center.
  • Loaded data from flat files to Big Data (1010 data).
  • Did bulk loading of Teradata table using Tpump utility.
  • Used IDQ’s standardized plans for addresses and names clean ups.
  • Extensively used various active and passive transformations like Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank, and Aggregator transformations.
  • Responsible for best practices such as naming conventions, performance tuning, and error handling.
  • Responsible for maintaining data quality and data consistency before loading into the ODS.
  • Responsible for performance tuning at the source, target, mapping, and session levels.
  • Created business objects universes.
  • Created denormalized BO reporting layer for BO reports.
  • Solid expertise in using both Connected and Unconnected Lookup transformations.
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
  • Responsible for determining thebottlenecks and fixing the bottlenecks withperformance tuning.
  • Used the Update Strategy constants DD_INSERT, DD_DELETE, DD_UPDATE, and DD_REJECT to insert, delete, update, and reject rows based on the requirement.
  • Worked with session logs and workflow logs for error handling and troubleshooting in all environments.
  • Responsible for Unit Testing and Integration testing of mappings and workflows.
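The MD5-based change data capture used in this project compares a hash of each row's column values against the previous load to flag inserts and updates. A minimal Python sketch of the idea (hashlib stands in for Informatica's MD5() expression; sample rows are hypothetical):

```python
import hashlib

def row_hash(row):
    """MD5 over the pipe-joined column values: the CDC comparison key."""
    return hashlib.md5("|".join(str(v) for v in row).encode()).hexdigest()

def detect_changes(old_rows, new_rows, key=0):
    """Compare row hashes by business key to classify inserts and updates."""
    old = {r[key]: row_hash(r) for r in old_rows}
    inserts = [r for r in new_rows if r[key] not in old]
    updates = [r for r in new_rows
               if r[key] in old and row_hash(r) != old[r[key]]]
    return inserts, updates

# Previous load vs. current extract (illustrative data).
old = [(1, "Ann", "TX"), (2, "Bob", "NJ")]
new = [(1, "Ann", "CA"), (2, "Bob", "NJ"), (3, "Cy", "NC")]
inserts, updates = detect_changes(old, new)
```

Unchanged rows (key 2 above) are skipped entirely, which is what makes hash-based CDC cheaper than full-row comparison on wide tables.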

Environment: Informatica Power Center 9.5.1/9.1.0 HF1, Informatica MDM, SAP BW, Big Data (1010data), Teradata 14, Oracle 10g, MS SQL Server 2012, SAP BW/IDoc, Control-M, TOAD, SQL, PL/SQL, BO XI, OBIEE, Windows XP, UNIX

Confidential, Houston, TX

Sr. Data Warehouse Consultant

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Used the Update Strategy Transformation to update the Target Dimension tables.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Upgraded Informatica from 9.5.1 to 9.6.1 on Linux servers for Dev/Test and Prod environments.
  • Involved in the development of Informatica mappings and tuned them for better performance.
  • Migrated data from Oracle 11g to Oracle Exadata.
  • Created Oracle Exadata database, users, base table and views using proper distribution key structure.
  • Worked on data integration from various sources into the ODS using ODI (Oracle Data Integrator).
  • Configured power exchange for SAP R3
  • Retrieved data from SAP R3.
  • Used Informatica Power Connect for Oracle Exadata to pull data from Oracle Exadata data warehouse.
  • Calculated the KPI’s and worked with the end users for OBIEE report changes.
  • Created the RPD for OBIEE.
  • Developed mapping parameters and variables to support connection for the target database as Oracle Exadata and source database as Oracle OLTP database.
  • Created Mapplets and used them in different Mappings.
  • Used designer debugger to test the data flow and fix the mappings. Tuned Informatica Mappings and Sessions for optimum performance.
  • Provided detailed technical, process and support documentation like daily process rollback and detailed specifications and very detailed document of all the projects with the workflows and their dependencies.
  • Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
  • Worked extensively in PL/SQL to migrate the data from Oracle to Oracle Exadata database.
  • Created Schema objects like Indexes, Views, and Sequences.
  • Extracted data from BRD files, flat files, and Oracle and loaded it through Informatica.
  • Worked with Crontab for job scheduling.
  • Production Support and issue resolutions.
  • Involved in Unit testing of Mappings, Workflows and debugging mappings for failed sessions.
  • Created partitions, SQL over ride in source qualifier, session partitions for improving performance.
  • Performed unit testing and system testing of the mappings developed and documented with various scenarios.
  • Wrote UNIX shell scripts for repository backups, job scheduling via crontab, etc.
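The crontab-driven repository-backup housekeeping above typically pairs a backup step with pruning of aged backup files. A sketch of the pruning half in Python (the real job was a shell script; directory layout and file names here are illustrative):

```python
import os
import pathlib
import tempfile
import time

def prune_old_backups(directory, days=30):
    """Delete files older than `days` in a flat backup directory."""
    cutoff = time.time() - days * 86400
    removed = []
    for p in pathlib.Path(directory).iterdir():
        if p.is_file() and p.stat().st_mtime < cutoff:
            p.unlink()
            removed.append(p.name)
    return removed

# Demonstration in a temp dir with one artificially aged file.
d = tempfile.mkdtemp()
old_f = pathlib.Path(d, "rep_backup_old.rep"); old_f.write_text("x")
os.utime(old_f, (time.time() - 40 * 86400,) * 2)   # backdate 40 days
new_f = pathlib.Path(d, "rep_backup_new.rep"); new_f.write_text("x")

removed = prune_old_backups(d)
```

In the shell-script version the same logic is a `find <dir> -type f -mtime +30 -delete`, scheduled weekly from crontab alongside `pmrep backup`.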

Environment: Informatica Power Center 9.6.1/9.5.1 HF2, Informatica Power Exchange 8.6.1, Informatica Data Quality 8.6.1, SAP R3, ODI (Oracle Data Integrator), SQL Server 2008 R2, Oracle 11g, Oracle Exadata, OBIEE, PL/SQL, Linux, Putty, WinSCP.

Confidential, Houston, TX

Sr. Informatica Architect/ T-SQL Developer

Responsibilities:

  • Created the architectural design of the Informatica /ETL and Data warehouse.
  • Applied patches to Informatica Servers.
  • Worked with SQL DBA’s on collation change on SQL Server 2012.
  • Applied Sales Force license to the domain for development and production environments.
  • Created Informatica best practices document, mapping/Unit testing document templates.
  • Communicated with the networking team on ETL server upgrades to space, memory and processors.
  • Maintained Informatica servers to ensure integration services, repositories, and servers were up and running, and coordinated with networking teams for server reboots, etc.
  • Created ETL auditing reports for error handling and validations against the Informatica ‘OPB’ repository tables.
  • Installed/Configured Informatica 9.1.0 HotFix1 on Development server.
  • Installed/Configured Informatica HotFix3 on development/production environment.
  • Applied EBF on Informatica server for SQL Server native client 11.0.
  • Production support for the daily/weekly and monthly loads.
  • Installed/configured SAP adaptor and JD Edwards (Power Exchange) for Informatica.
  • Configured the SFDC application connection to fulltest and production.
  • Configured JD Edwards Power Exchange for Informatica.
  • Standardized the parameter file location for each project in BWParam folder of Informatica.
  • Deployed Informatica workflows from Development
  • Deleted old repository backup files and purged deleted objects from repository in both development and production environments.
  • Supported the development team on ETL standards, naming conventions, best practices and folder structures.
  • Installed/configured Teradata Power Connect for Fast Export for Informatica.
  • Folder migration from Development to UAT to Production.
  • Created shared/regular folders for Projects and personal development.
  • Designed & developed Informatica mappings, mapplets, worklets and workflows to create load-ready data files for loading Oracle E-Business Suite.
  • Designed and developed Reports for the user Interface, according to specifications given by the Track leader.
  • Worked on converting older ETL jobs like Cast Iron and CA ADT for getting data into the data warehouse.
  • Involved in performance tuning at source, target, mapping and session level.
  • Loaded Oracle tables from XML sources.
  • Configured Informatica for the SAP Connector.
  • Extracted data from SAP and loaded into Oracle EBS
  • Introduced the concept of Data Dashboard to track the technical details sighting the continuous requirement changes and rework needed.
  • Worked on creating physical layer, business model, mapping and presentation layer in OBIEE.
  • Created source system containers for OBIEE.
  • Created subject areas in containers for OBIEE.
  • Created narrative reports in OBIEE.
  • Retrieved data from SAP using Informatica Power Exchange.
  • Supported Integration testing by analyzing and fixing the issues.
  • Created Unit Test Cases and documented the Unit Test Results.
  • Resolved Skewness in Teradata.
  • Defined Interface parameters in Informatica mappings and sessions to initiate the concurrent programs (Stored Procedures) once the Interface tables are loaded.
  • Integrated Data Quality routines in the Informatica mappings to standardize and cleanse the name, address, and contact information.
  • Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans.
  • Used Informatica web services to create work requests/work Items for the end user.
  • Successfully Integrated Multiple XML sources and created a de-normalized, flat-structured file.
  • Created Stored Procedures to validate and load the data from interface tables to the Oracle E-Business Suite internal tables.
  • Staged data in Oracle E-Business Suite stage tables using Power Center in Informatica.
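Profiling customer phone numbers for IDQ plans, as mentioned above, usually means reducing each value to a format mask so the distinct patterns can be counted before writing standardization rules. An illustrative Python sketch (sample numbers are made up):

```python
import re
from collections import Counter

def phone_pattern(value):
    """Reduce a phone string to a mask: digits -> 9, letters -> A."""
    masked = re.sub(r"[0-9]", "9", value)
    return re.sub(r"[A-Za-z]", "A", masked)

# Each distinct mask is a candidate pattern for an IDQ parsing rule.
patterns = Counter(phone_pattern(p) for p in
                   ["713-555-0100", "(832) 555-0199",
                    "713.555.0123", "555-0145"])
```

The frequency of each mask tells you which formats the cleansing plan must parse and which rare ones can be routed to exception handling.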

Environment: Informatica Power Center 9.5.1 HF3/9.1.0 HF1, Informatica Power Exchange 8.6.1, Informatica Data Quality 8.6.1, Informatica Data Explorer 8.6, SQL Server 2008 R2/2012, Oracle 10g, Oracle 12.0.4/5 EBS, JD Edwards, SAP, Salesforce.com (SFDC), Informatica/Windows scheduler, Force.com, Teradata 13, OBIEE, PL/SQL, Windows Server 2008 R2 Standard.

Confidential, Houston, TX

ETL Architect/ Informatica Lead

Responsibilities:

  • Extensively worked with the solution architect to implement the project from scratch.
  • Upgraded Informatica Power center from 8.6.1 to 9.1.
  • Converted all the Confidential-specific utilities like File Manipulator, DB Wrapper, Programme Wrapper, etc. into Informatica jobs.
  • Purged the deleted objects from the repository in the DEV and QA environments.
  • Managed the growth of repository as part of normal administration maintenance.
  • Created a shell script to remove old repository backup files.
  • Worked with versioned objects.
  • Installed and configured Informatica Power Exchange for CDC and Informatica Data Quality (IDQ).
  • Extensively handled Informatica server related issues with the development teams.
  • Created various issue tickets with Informatica professional services.
  • Created SRs with Informatica Professional Services for various workflow errors that required an SR to be opened.
  • Worked with the network support teams on both Windows and AIX servers for memory, storage, event log errors, and performance issues, and coordinated outages for maintenance.
  • Worked closely with the Cognos reports development team in configuring the cubes.
  • Created Cognos cubes and denormalized the data for faster access.
  • Customized the Cognos reports.
  • Provided upgrade and configuration support as part of administration.
  • Applied HotFix 5 on Informatica 8.6.1.
  • Created labels for each project for code deployments.
  • Created deployment groups for code migration from DEV to QA and from QA to Prod.
  • Created program wrapper/DB wrapper/File Manipulator jobs.
  • Attended POC of IDR.
  • Configured Informatica data replication (IDR) fast clone.
  • Created jobs for Informatica data replication fast clone to get data from oracle and load it into Teradata.
  • Created custom plans for product name discrepancy checks using IDQ and incorporated the plan as a mapplet into Power Center.
  • Used IDQ’s standardized plans for addresses and names clean ups.
  • Used Workflow Manager to create and maintain sessions and to monitor, edit, schedule, copy, abort, and delete them.
  • Extensively worked in performance tuning of the programs, ETL Mappings and processes.
  • Developed Interfaces using UNIX Shell Scripts to automate the bulk load & update Processes.
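The shell-script interfaces that automate bulk loads generally shell out to Informatica's pmcmd utility to start workflows. A hedged Python sketch that only builds the command line (service, domain, folder, and workflow names are hypothetical, and password handling is deliberately omitted; real scripts pass it via pmcmd's password options from a secured source):

```python
def pmcmd_startworkflow(service, domain, user, folder, workflow, wait=True):
    """Assemble a pmcmd startworkflow command line (not executed here)."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", service,    # integration service name
           "-d", domain,      # Informatica domain
           "-u", user,        # repository user
           "-f", folder]      # repository folder
    if wait:
        cmd.append("-wait")   # block until the workflow completes
    cmd.append(workflow)
    return cmd

# Hypothetical names; a scheduler would run this via subprocess.
cmd = pmcmd_startworkflow("IS_DEV", "Domain_Dev", "etl_user",
                          "FIN_DM", "wf_daily_load")
```

Wrapping pmcmd this way lets the scheduler (Autosys, Control-M, cron) treat the workflow's exit code as the job status for dependency chaining.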

Environment: Informatica PowerCenter 9.1, SQL Server 2008 R2, Oracle 11g, Sybase, SQL server Management Studio, MS Visual Studio 2010, SQL Developer for Oracle, Program Wrapper, File Manipulator, Cognos EBusiness Suite, Cognos Cubes, Teradata, Designer, SQL Server Migration Assistant (SSMA), CMS, ER-Studio, AIX, UNIX Shell Scripting, SQL, PLSQL, T-SQL, XML, Xqueries on SQL Server Database, Jira Bug tracker.

Confidential, AUSTIN, TX

Data Architect/DW Developer

Responsibilities:

  • Responsibilities include Production Implementation, Scheduling, Data Loading, Monitoring, Troubleshooting and Support for Global Operations Reporting using Informatica and Business Objects.
  • Extensively worked with the solution architect to implement the project from scratch.
  • Training team members to run and monitor workflows in Informatica.
  • Led the change management team.
  • Created extensive T-SQL procedures.
  • Created SSIS packages to get the data from the flat files and load it into Oracle 11g.
  • Created the deployment document to be followed for migration of code from one environment to another.
  • Created the metadata layer for OBIEE.
  • Created batch scripts to rename/copy and move the processed files.
  • Designed conceptual, logical and physical data model using Erwin data modeling tool.
  • Performed administration, monitoring, and support for several projects in the Staging and Production environments for Informatica and OBIEE.
  • Created ETL Mappings to ensure conformity, compliance with standards and lack of redundancy.
  • Exceptional background in analysis, design, development, implementation, and testing of data warehouse and software applications.
  • Designed and developed Informatica mappings, to load data into target tables.
  • Worked with static and dynamic cache for the better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Created Informatica Data Replication Fast Clone jobs to extract data from Oracle and load it into Teradata.
  • Created Mapplets to be re-used in the Informatica mappings.
  • Designed and developed Informatica mappings for data loads and data cleansing. Extensively worked on Informatica Designer, Workflow Manager.
  • Extensively used most of the transformations of Informatica including lookups, Stored Procedures, Update Strategy.
  • Created SSIS packages to automate data from Flat files into Oracle database.
  • Created an SSIS package that resolves dynamic source file names using the ForEach Loop container.
  • Used the Lookup, Merge, Data Conversion, Sort, and other Data Flow transformations in SSIS.
  • Used Workflow Manager to create and maintain sessions and to monitor, edit, schedule, copy, abort, and delete them.
  • Extensively worked in performance tuning of the programs, ETL Mappings and processes.
  • Developed interfaces using UNIX shell scripts to automate the bulk load and update processes.
  • Made performance improvements to the database by building Partitioned tables, Index Organized Tables and Bitmap Indexes.
  • Developed the OBIEE RPD and DAC from end users’ input.
  • Extensively used PL/SQL programming in backend and front-end functions, procedures, packages to implement business rules and security.
  • Designed and Developed Audit process to maintain the data integrity and data quality. Source to target audits were built to make sure accurate data is loaded to the warehouse and Internal Audits for checking the integrity of the data within the data warehouse.
  • Tested the data and data integrity among various sources and targets. Worked with the production support team on various performance-related issues.
  • Strong knowledge in Oracle Business Intelligence Enterprise Edition.
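The batch scripts that rename, copy, and move processed files (mentioned above) can be sketched along these lines. The directory layout and the `.csv` extension are assumptions for illustration, not the project's actual conventions.

```shell
#!/bin/sh
# Sketch: move processed flat files into an archive folder, tagging each
# with a timestamp so reruns never overwrite an earlier delivery.

archive_processed() {
    src_dir=$1
    arc_dir=$2
    stamp=$(date +%Y%m%d_%H%M%S)
    mkdir -p "$arc_dir" || return 1
    for f in "$src_dir"/*.csv; do
        [ -e "$f" ] || continue      # glob matched nothing; skip
        base=$(basename "$f" .csv)
        mv "$f" "$arc_dir/${base}_${stamp}.csv"
    done
}
```

A scheduler (cron on UNIX, or the Windows equivalent for the batch-file variants) would call `archive_processed /data/processed /data/archive` after each successful load.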

Environment: Erwin 4.2, Informatica Power Center 9.1/8.6.1, Informatica Data Replication 9.1.1 (IDR), Oracle 10g/11g, Teradata 13.1, SQL Developer, Crontab scheduler, SQL Server 2008 R2, SQL Server Management Studio, SQL Server Visual Studio 2007, PL/SQL, SQL Assistant, Windows XP, Sun Solaris 5.1.0, UNIX scripting, OBIEE, IBM Vendavo Profit Analyzer.

Confidential, HOUSTON, TX

Informatica/OBIEE lead

Responsibilities:

  • Extensively involved in gathering business requirements and translating them into technical requirements.
  • Responsible for loading the data into base model repository and Dimensional model.
  • Used Informatica’s Data Transformation (B2B) tool to retrieve unstructured (XML) data.
  • Responsible in strictly maintaining naming standards and warehouse standards for future development.
  • Extensively wrote XQuery queries to retrieve data from XML stored in SQL Server.
  • Actively worked on Informatica Administration Console.
  • Coordinated and led the ETL development team and assisted the OBIEE team with RPD and dashboard customization.
  • Worked closely with the OBIEE team in building the RPD and dashboard.
  • Customized the OBIEE dashboard.
  • Configured SFDC license in administration console.
  • Extensively wrote Teradata BTEQ scripts.
  • Retrieved data from JD Edwards and loaded into Oracle.
  • Created a customized OBIEE model in the RPD to surface its data in dashboards.
  • Configured the initial sync for replication jobs using the Data Replication console.
  • Created Informatica IDR jobs using the Data Replication console.
  • Scheduled the replication jobs using the replication console.
  • Extracted data from various sources such as flat files and Oracle using Informatica PowerCenter.
  • Designed and Developed Stored Procedures/Views in Oracle.
  • Created SSIS packages using BIDS.
  • Imported sources and targets for the SSIS packages into BIDS.
  • Achieved performance improvements by tuning SQL queries, XQueries, and extraction procedures between Oracle and PowerCenter.
  • Provided 24/7 production support for the application as needed.
  • Developed complex Power Center mappings and IDQ plans by using different transformations and components respectively to meet the data integration and data quality requirements for various clients.
  • Used the Informatica XML Source Qualifier transformation to read data from an XML (XSD-defined) source and write data to an XML target.
  • Created and monitored workflows/sessions using Informatica Workflow Manager/Workflow Monitor to load data into the target Oracle database.
  • Performed unit testing, integration testing, and user acceptance testing to proactively identify data discrepancies and inaccuracies.
  • Involved in performance tuning at the source, target, mapping, and session levels.
  • Delivered and signed off deliverables pertaining to the transactional data warehouse.
  • Migrated the Informatica code using deployment groups.
  • Prepared design documents, ETL specifications, and migration documents.
  • Introduced the concept of an OBIEE dashboard to track technical details, citing the continuous requirement changes and rework needed.
  • Maintained a daily tech tracker for updates from the team regarding their objects, issues, and progress.
  • Involved in the Informatica PowerCenter 8.1.1 SP4 repository upgrade to Informatica PowerCenter 8.6.1.
  • Involved in providing Informatica technical support to the team members as well as the business.
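A BTEQ extract of the kind mentioned above is typically driven from shell with a here-document. This is a hedged sketch, not the original code: the logon string, database, and table (`tdprod`, `edw.orders`) are placeholders, and `${BTEQ:-bteq}` allows the client to be stubbed (e.g. with `cat`) outside a Teradata host.

```shell
#!/bin/sh
# Sketch: export a daily slice of a table to a flat file via BTEQ.

run_bteq_export() {
    out_file=$1
    "${BTEQ:-bteq}" <<EOF
.LOGON tdprod/etl_user,etl_password
.EXPORT FILE = $out_file
SELECT order_id, order_dt
FROM   edw.orders
WHERE  load_dt = CURRENT_DATE;
.EXPORT RESET
.LOGOFF
.QUIT
EOF
}
```

With `BTEQ=cat run_bteq_export /tmp/orders.dat` the generated script can be inspected without a Teradata connection; in production the exit code from bteq would gate the downstream load.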

Environment: Informatica Power center 9.1/8.6.1, Informatica Data Transformation (B2B), Informatica Data Quality 8.6.2, Informatica IDR, Oracle 11g/10g, SQL Server 2008, SSIS, Windows 2003/2008, Sun Solaris, Red Hat Linux, OBIEE, Micro Strategy, UNIX shell scripting.

Confidential, HOUSTON, TX

BI Lead / Netezza Implementation Consultant

Responsibilities:

  • Provided architectural design of Netezza to the client.
  • Provided Netezza framework design to the client (pros and cons).
  • Extensively assisted the framework implementation team with evolving client requirements and the WM-specific framework design.
  • Interacted with the business users, assisted the ELT developers with code development.
  • Assisted the Informatica lead in understanding the framework requirement.
  • Worked closely with data modelers in requirement gatherings.
  • Assisted the ELT developers in understanding the mapping documents.
  • Created NZ-SQL procedures to be invoked within the framework.
  • Extensively created shell scripts to configure the meta, DDL, data, and xfr files in the UNIX directories.
  • Mentored the client on different BI tools.
  • Evaluated team members’ performance.
  • Created stored procedures (NZ-SQL).
  • Scheduled the ELT load in Autosys.
  • Debugged and corrected the xfr files developed by other ELT developers.
  • Fixed numerous bugs with load issues.
  • Optimized the NZ-SQL queries.
  • Converted Oracle DDLs to Netezza DDLs.
  • Created the format of the unit test documents per the Netezza framework.
  • Assisted ELT developers in creating the unit test documents.
  • Managed user and folder permissions for the developers.
  • Purged old repository objects weekly.
  • Created shell script for repository backup weekly.
  • Developed data Mappings between source systems to Landing and from Standardization to warehouse components using Mapping Designer.
  • Did performance tuning on the ELT code developed by ELT developers.
  • Debugged the framework error logs.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
  • Resolved issues raised by the client and actuarial users, validating data in the database and in the application functionality.
  • Worked closely with QlikView developers.
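The weekly repository backup script mentioned above can be sketched as a pmrep wrapper of the following shape. The repository and domain names and the credential handling are placeholders, and `${PMREP:-pmrep}` lets the commands be dry-run (stubbed with `echo`) outside an Informatica environment.

```shell
#!/bin/sh
# Sketch: weekly PowerCenter repository backup via pmrep (cron-driven).

backup_repo() {
    rep=$1
    dir=$2
    stamp=$(date +%Y%m%d)
    mkdir -p "$dir" || return 1
    # Connect first; backup -f overwrites an existing file of the same name.
    "${PMREP:-pmrep}" connect -r "$rep" -d Dom -n "$INFA_USER" -x "$INFA_PWD" &&
    "${PMREP:-pmrep}" backup -o "$dir/${rep}_${stamp}.rep" -f
}
```

A crontab entry such as `0 2 * * 0 /infa/scripts/repo_backup.sh` (path hypothetical) would run it early every Sunday; the same script can prune old backups with `find ... -mtime`.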

Environment: Informatica Power center 8.6.1, Trillium Data Quality, Netezza TwinFin 6 (Production), Netezza TwinFin 3 and Netezza Skimmer (Non-production), QlikView reporting on Netezza, Oracle 11g/10g, Windows 2003/2008, Sun Solaris, Red Hat Linux, SUSE Linux, MicroStrategy, Crystal Reports, Autosys scheduler, and UNIX shell scripting.
