
Sr. Informatica Administrator/Developer/Data Analyst Resume


Woodlands, TX

SUMMARY

  • 10+ years of IT experience with expertise in analysis, design, development, and implementation of data warehouses, data marts, and Decision Support Systems (DSS) involving RDBMSs such as Oracle and MS SQL Server on Windows and UNIX platforms.
  • Expertise in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using the Erwin data modeling tool and dimensional modeling techniques (Kimball and Inmon), with Star and Snowflake schemas addressing Slowly Changing Dimensions (SCDs).
  • Experience in developing and maintaining logical and physical Data models for Data Warehouse applications using tools like Erwin and ER/Studio.
  • Extensively used Informatica Client Tools - Designer, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager and Workflow Monitor.
  • Expertise in implementing complex business rules by creating robust mappings, mapplets, and shortcuts.
  • Extensive knowledge on Master Data Management (MDM) concepts.
  • Extensive experience in designing, managing, and administering MDM/DIW objects using the Kalido DIW/MDM 8.5/9 tool.
  • Worked on designing catalogs, categories, sub-categories, and user roles using Kalido MDM 9.
  • Populated the MDM tables using staging table feeds or file feeds.
  • Developed Windows batch scripts to load master data into the Kalido MDM tables.
  • Trained business users on the MDM interface: keying in master data, authorizing it, and publishing it.
  • Implemented data security for the MDM categories so that only users with the assigned role can enter master data in the Kalido MDM interface and authorize or approve it.
  • Built reusable transformations using unconnected and connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, etc.
  • Involved in all phases of data warehouse project life cycle. Designed and developed ETL Architecture to load data from various sources like Oracle, Flat files, XML files and MS SQL Server into Oracle, XML and SQL server targets.
  • Extensive experience in implementing CDC using Informatica PowerExchange 8.x/7.x.
  • Used IBM Information Analyzer extensively for complete source-system profiling and analysis: validating data values and column/table relationships and drilling down to exception rows for further analysis, ensuring data projects contain trusted information and lowering the risk of propagating bad data.
  • Worked on QualityStage jobs using the Match, Standardize, and Survive stages; used the QualityStage plug-in within DataStage jobs.
  • Created and verified QualityStage jobs for matching and de-duplication of data.
  • Extensively worked with Oracle PL/SQL Stored Procedures, Functions and Triggers and involved in Query Optimization.
  • Experience in data loads using Toad for Oracle, SQL*Loader imports, and UTL_FILE, based on file formats (see the SQL*Loader sketch after this list).
  • Worked on configuring the Hyperion log files, such as the application log, Essbase log, configtool log, EAS install log, and Essbase server install log.
  • Created the Salesforce connections in Informatica PowerCenter.
  • Converted the SSIS packages into Informatica mappings.
  • Extensively worked on Repository Manager, Server Manager, Workflow Manager, and Workflow Monitor.
  • Familiar with Business Objects 5.x.
  • Worked with the Informatica Data Quality (IDQ) 8.6.1 toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Identified and eliminated duplicates in datasets through the IDQ 8.6.1 Edit Distance, Jaro Distance, and Mixed Field Matcher components, enabling a single view of customers and helping control mailing-list costs by preventing duplicate mailings.
  • Used Pentaho Data Integration Designer to create ETL transformations.
  • Developed several reports that exploited sub-reports and report grouping in Pentaho.
  • Created dashboards in Pentaho using Pentaho Dashboard Designer.
  • Designed and developed a series of complex Business Intelligence solutions using Pentaho Report Designer.
  • Designed and developed mappings with optimal performance using Aggregator, Joiner, B2B Data Transformation, Sequence Generator, cached and uncached connected and unconnected Lookup, source/target pre- and post-load Stored Procedure, Update Strategy, and Union transformations, etc.
  • Designed, developed, implemented and maintained Informatica Power Center and IDQ 8.6.1 application for matching and merging process.
  • Utilized Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.
  • Used Informatica Data Explorer (IDE) for data profiling over metadata and Informatica Data Quality (IDQ) 8.6.1 for data quality measurement.
  • Developed mappings and workflows that supported the mappings in PowerCenter, IDQ, and MDM.
  • Exported the IDQ mappings and mapplets to PowerCenter and automated the scheduling process.
  • Knowledge of data profiling using Informatica Data Quality (IDQ).
  • Experienced with DVO, Informatica Data Quality (IDQ) tools for Data Analysis / Data Profiling and Data Governance.
  • Designed, tested, and deployed plans using IDQ 8.5.
  • Provided production support, fixing problems according to their priority.
  • Experienced in the Software Development Life Cycle (SDLC), which includes requirements collection, design, development, testing, deployment, and post-deployment support.
  • Experienced in Data Quality Management, Master Data Management and Metadata Management
  • Experienced in software development using Waterfall and Agile software development methodologies
  • Experienced in Data Warehouse Application Maintenance, which includes operational excellence, change management, application performance management, data privacy management
  • Experienced in data warehousing applications development for Airlines and Manufacturing industries
  • Acquired working knowledge of the OBIEE repository (RPD) and the Data Warehouse Administration Console (DAC).
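
As a minimal illustration of the SQL*Loader file loads referenced above (see the bullet on data loads), the sketch below loads a pipe-delimited flat file into a staging table. The connect string, paths, file layout, and column names are hypothetical placeholders, not values from these projects; the control-file contents are shown as comments so the whole script is valid shell.

    #!/bin/sh
    # Hypothetical SQL*Loader load of a delimited flat file into a staging table.
    # stg_customer.ctl is assumed to contain a layout along these lines:
    #   LOAD DATA
    #   INFILE '/data/inbound/customers_20150101.dat'
    #   APPEND
    #   INTO TABLE stg_customer
    #   FIELDS TERMINATED BY '|' TRAILING NULLCOLS
    #   (customer_id, customer_name, city, load_date SYSDATE)

    # sqlldr exits non-zero on errors; the log and bad files hold the details.
    sqlldr userid=etl_user/etl_pass@ORCL control=stg_customer.ctl \
           log=stg_customer.log bad=stg_customer.bad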

TECHNICAL SKILLS

Databases: Oracle 11g/10g, SQL Server 2012/2008 R2/2008/2005, DB2, Teradata 14/13 (SQL Assistant), MySQL 5.0/4.1, MS Access. Editors: SQL Navigator, Toad

ETL Tools: Informatica PowerCenter 9.x/8.x/7.x, Kalido DIW/MDM 9

Data Modeling Tools: ERWIN, ER/Studio

Programming Skills: C++, Shell Scripting, PL/SQL, Perl, FORTRAN, HTML, JavaScript, J2EE, CSS

Methodologies: Data Modeling (Logical, Physical), Dimensional Modeling (Star/Snowflake)

Reporting Tools: MS SQL Server Reporting Services 2008 R2, Developer 2000 (Forms 5.0, Reports 6i), Crystal Reports 10, OBIEE, Business Objects, and Cognos.

Operating Systems: UNIX (Sun Solaris, HP-UX), Windows 95/98/2000/NT/XP

PROFESSIONAL EXPERIENCE

Confidential, Woodlands, TX

Sr. Informatica Administrator/Developer/ Data Analyst

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Used the Update Strategy Transformation to update the Target Dimension tables.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Upgraded Informatica from 9.5.1 to 9.6.1 on Linux servers for Dev/Test and Prod environments.
  • Involved in the development of Informatica mappings and tuned them for better performance.
  • Migrated data from Oracle 11g to Oracle Exadata.
  • Created the Oracle Exadata database, users, base tables, and views using a proper distribution key structure.
  • Worked on data integration from various sources into the ODS using ODI (Oracle Data Integrator).
  • Configured PowerExchange for SAP R/3.
  • Retrieved data from SAP R/3.
  • Used Informatica PowerConnect for Oracle Exadata to pull data from the Exadata data warehouse.
  • Calculated the KPIs and worked with the end users on OBIEE report changes.
  • Created the RPD for OBIEE.
  • Developed mapping parameters and variables to support connection for the target database as Oracle Exadata and source database as Oracle OLTP database.
  • Created Mapplets and used them in different Mappings.
  • Used designer debugger to test the data flow and fix the mappings. Tuned Informatica Mappings and Sessions for optimum performance.
  • Provided detailed technical, process, and support documentation, including daily process rollback, detailed specifications, and a thorough document for each project covering the workflows and their dependencies.
  • Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
  • Worked extensively in PL/SQL to migrate the data from Oracle to Oracle Exadata database.
  • Extensively used Teradata utilities such as FastLoad and MultiLoad to load data into the target database.
  • Bulk-loaded Teradata tables using the TPump utility.
  • Created Schema objects like Indexes, Views, and Sequences.
  • Extracted data from BRD files, flat files, and Oracle and loaded it through Informatica.
  • Worked with crontab for job scheduling.
  • Provided production support and issue resolution.
  • Involved in Unit testing of Mappings, Workflows and debugging mappings for failed sessions.
  • Created partitions, SQL overrides in Source Qualifiers, and session partitions to improve performance.
  • Performed unit testing and system testing of the mappings developed and documented with various scenarios.
  • Wrote UNIX shell scripts for repository backups, job scheduling on crontab, etc. (see the sketch after this list).
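
A minimal sketch of the crontab-scheduled repository backup described above, using the standard pmrep command-line client. The repository, domain, credentials, and paths are hypothetical placeholders.

    #!/bin/sh
    # Hypothetical nightly repository backup, scheduled with a crontab entry like:
    #   0 2 * * * /opt/infa/scripts/repo_backup.sh >> /opt/infa/logs/repo_backup.log 2>&1
    STAMP=$(date +%Y%m%d)
    BACKUP_FILE=/opt/infa/backups/PC_REPO_${STAMP}.rep

    # Connect to the repository, then write a backup file (-f overwrites).
    pmrep connect -r PC_REPO -d Domain_Dev -n admin_user -x admin_pass || exit 1
    pmrep backup -o "$BACKUP_FILE" -f

    # Keep only the last week of backups.
    find /opt/infa/backups -name 'PC_REPO_*.rep' -mtime +7 -exec rm -f {} \;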

Environment: Informatica PowerCenter 9.6.1/9.5.1 HF2, Informatica PowerExchange 8.6.1, Informatica Data Quality 8.6.1, Teradata, SAP R/3, ODI (Oracle Data Integrator), SQL Server 2008 R2, Oracle 11g, Oracle Exadata, OBIEE, PL/SQL, Linux, PuTTY, WinSCP.

Confidential, Woonsocket, RI

Sr. Informatica Developer/ Data Analyst

Responsibilities:

  • Worked with business analysts for requirement gathering (BRD), business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Interacted with technical, functional, and business audiences across different phases of the project life cycle.
  • Developed mappings for fact and dimension tables using various transformations to extract data from different sources: Oracle, SQL Server, and XML files.
  • Designed and developed complex ETL mappings by making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
  • Created workflows using various tasks such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, and Assignment, and worked on scheduling of the workflows.
  • Used mapping parameters and variables.
  • Prepared mapping specification document, which gives the data flow and transformation logic for populating each column in the data warehouse table.
  • Used debugger to analyze the data flow between source and target to fix the data issues.
  • Developed PL/SQL procedures and functions to facilitate specific requirements.
  • Optimized and Tuned SQL queries and PL/SQL blocks to eliminate Full Table scans to reduce Disk I/O and Sorts.
  • Involved in DBMS development, including building data migration scripts using Oracle SQL*Loader and UTL packages.
  • Implemented an audit and reconcile process to ensure the data warehouse matches the source systems in all reporting perspectives.
  • Prepared the Standard Operating Procedure (Knowledge Transfer) document, which provides the information required for the maintenance and operation of the application.
  • Created the release requests for QA Builds to include all the release requirements and involved in the implementation of QA, UAT and Production releases.
  • Developed UNIX shell scripts that send the reports to the client over the network via file transfer protocols (FTP and SFTP) and generate a log file keeping a history of the transfers (see the sketch after this list).
  • Provided data loading, monitoring, System support and worked on data issues raised by end user during its production support phase.
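
A minimal sketch of the report-delivery scripting described above, assuming key-based SFTP authentication; the host, user, report name, and paths are hypothetical placeholders.

    #!/bin/sh
    # Hypothetical report delivery over SFTP with a history log.
    REPORT=/data/reports/daily_sales_$(date +%Y%m%d).csv
    LOG=/data/logs/report_transfer_history.log

    # "-b -" makes sftp read batch commands from stdin and abort on errors.
    printf 'cd /inbound/reports\nput %s\nbye\n' "$REPORT" |
        sftp -b - etluser@client-host

    if [ $? -eq 0 ]; then
        echo "$(date '+%Y-%m-%d %H:%M:%S') SENT   $REPORT" >> "$LOG"
    else
        echo "$(date '+%Y-%m-%d %H:%M:%S') FAILED $REPORT" >> "$LOG"
        exit 1
    fi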

Environment: Informatica PowerCenter 9.6.1/9.5.1, Oracle 10g, SQL Server, Teradata, DB2 Loader, Flat Files, UNIX Shell Scripting, TOAD, AppWorx 6.0.

Confidential, San Ramon, CA

Sr. Informatica/ Oracle Developer

Responsibilities:

  • Worked with business analysts to identify appropriate sources for data warehouse and prepared the Business Release Documents, documented business rules, functional and technical designs, test cases, and user guides.
  • Actively involved in the Design and development of the STAR schema data model.
  • Implemented slowly changing and rapidly changing dimension methodologies; created aggregate fact tables for the creation of ad-hoc reports.
  • Extensively worked on Connected & Unconnected Lookups, Router, Expressions, Source Qualifier, Aggregator, Filter, Sequence Generator, etc.
  • Created and maintained surrogate keys on the master tables to handle SCD type 2 changes effectively.
  • Retrieved data from unstructured sources such as XML using B2B Data Transformation.
  • Created mappings using the B2B DT visual studio with the script pane and target schema for XML and Excel source data.
  • Created the metadata (reporting) layer for OBIEE and OBIA.
  • Modified OBIA dashboards for HR, INV, MFG, Financials, Spend Analysis, and Supply Chain.
  • Modified OBIA dashboard for metrics and standard reports.
  • Scheduled the jobs using the Data Warehouse Administration Console (DAC).
  • Configured Informatica and OBIEE workflows in DAC.
  • Extracted data from legacy systems: Salesforce, SalasVision, and Charles River (trading platform).
  • Integrated the data from Oracle into Salesforce (SFDC) using Informatica Cloud.
  • Mentored Informatica developers on project for development, implementation, performance tuning of mappings and code reviews.
  • Used SQL tools like TOAD to run SQL queries and validate the data in warehouse and mart.
  • Developed Informatica mappings/mapplets, sessions, Workflows for data loads and automated data loads using UNIX shell scripts.
  • Used various lookup caches (static, dynamic, persistent, and non-persistent) in the Lookup transformation.
  • Involved in debugging mappings, recovering sessions and developing error-handling methods.
  • Successfully migrated objects to the production environment while providing both technical and functional support.
  • Installed/configured Informatica's ILM product suite for data masking (data privacy), file archive load, data discovery, and data visualization for Data Archive.
  • Created projects in ILM for data masking with different parameters like commit interval, encryption key and degree of parallelism.
  • Experience in Big Data with a deep understanding of the Hadoop Distributed File System ecosystem (MapReduce, Pig, Hive, Sqoop, HBase, Cloudera Manager), ETL, and RDBMS.
  • Optimized data transformation processes in Hadoop and Big Data environments.
  • Loaded data from flat files into Big Data (1010data).
  • Retrieved data from the Hadoop Distributed File System and loaded it into Oracle.
  • Working knowledge of the Hadoop file system.
  • Used PowerExchange CDC (change data capture) to pull data from Oracle sources.
  • Developed E-MAIL tasks to send mails to production support and operations.
  • Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager.
  • Installed/configured Teradata PowerConnect for FastExport for Informatica.
  • Resolved data skew in Teradata.
  • Created jobs in Informatica Fast Clone (data replication) to get data from Oracle and load it into Teradata.
  • Extensively wrote Teradata BTEQ scripts (see the sketch after this list).
  • Worked on different types of prompts (value, text, date, time, etc.) in Cognos reporting.
  • Implemented drill-through on the parent-child hierarchy based on ID in Cognos reporting.
  • Worked with Framework Manager, Cognos Connection, query reporting, and Report Studio.
  • Worked on standalone and embedded Cognos reports.
  • Designed and developed UNIX Scripts to automate the tasks.
  • Resolved memory related issues like DTM buffer size, cache size to optimize session runs.
  • Worked on SQL Server Integration Services (SSIS) to integrate and analyze data from multiple homogeneous and heterogeneous information sources (CSV, Excel, Oracle db).
  • Worked with Hyperion Essbase, Hyperion Planning, and HAL.
  • Experienced in developing Hyperion Essbase and Hyperion Planning cubes and load rules from text files and Oracle, and in developing calc scripts.
  • Extracted and Transformed data into Staging Servers named as PStaging Server for historical data and TStaging Server for load increment.
  • Performed Loading operation of historical data using full load and incremental load into Enterprise Data Warehouse.
  • Created SSIS packages to extract, transform, and load data using transformations such as Lookup, Derived Column, Conditional Split, Aggregate, Pivot, Slowly Changing Dimension, Merge Join, and Union All.
  • Extensively worked on Power Center 9.5.1 Designer client tools like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Designed & developed Informatica mappings, mapplets, worklets and workflows to create load-ready data files for loading Oracle E-Business Suite (EBS).
  • Did data validations in Informatica mappings and loaded the target validation tables.
  • Developed custom logging so users can see when each executing SSIS package inserts a row into the custom logging table.
  • Created data objects in Informatica Data Quality.
  • Previewed the data objects in CSV files and ran the validation scripts in Excel.
  • Created expression rules and ran profiles on the source data.
  • Created scorecards for redundant data and created validation rules using word wrapper in IDQ.
  • Created reference tables in Oracle 11g and used them as database dictionaries in IDQ.
  • Did data profiling for the column/across table data validation.
  • Parsed the target data in IDQ using parser transformation.
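
A minimal sketch of the Teradata BTEQ scripting mentioned above. The TDPID, logon, and table names are hypothetical placeholders.

load_sales.bteq:

    .LOGON tdprod/etl_user,etl_pass
    DELETE FROM dw.sales_fact WHERE load_dt = CURRENT_DATE;
    INSERT INTO dw.sales_fact (order_id, product_id, amount, load_dt)
    SELECT order_id, product_id, amount, CURRENT_DATE
    FROM   stg.sales_stage;
    .IF ERRORCODE <> 0 THEN .QUIT 1
    .LOGOFF
    .QUIT 0

Run from a shell wrapper so BTEQ's exit code drives the batch job's status:

    bteq < load_sales.bteq > load_sales.out 2>&1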

Environment: Informatica PowerCenter 9.5.1 HF3/9.1.0 HF1, Informatica IDQ 9.6.x, Informatica Cloud, Oracle 11g, SQL Server 2012/2008 R2, Erwin 4.0, TOAD 9.x, Salesforce.com (SFDC), Big Data Edition, Hadoop, Oracle EBS, Shell Scripting, Teradata 14, SQL Assistant, B2B DT, Oracle SQL*Loader, Cognos E-Business Suite, PL/SQL, SSIS, Sun Solaris UNIX, Windows XP.

Confidential, Houston, TX

Senior Informatica Developer

Responsibilities:

  • Worked with business analysts for requirement gathering (BRD), business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Interacted with technical, functional, and business audiences across different phases of the project life cycle.
  • Generated Hyperion reports against Essbase (PA files).
  • Resolved Essbase database conflicts from Hyperion by removing the conflicts in Hyperion Administration Services and in the Excel spreadsheet add-in.
  • Developed mappings for fact and dimension tables using various transformations to extract data from different sources: Oracle, SQL Server, and XML files.
  • Integrated data from external sources such as SWIFT and XML using the Informatica B2B Data Exchange tool.
  • Generated events, errors, transactions, values, etc., on the B2B dashboard for business decisions.
  • Worked with the Data Integration Hub of Informatica B2B.
  • Performed managed file transfer (MFT) from the legacy server to the Informatica server.
  • Designed and developed complex ETL mappings by making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
  • Worked on Migration of mappings from Data Stage to Informatica.
  • Created workflows using various tasks such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, and Assignment, and worked on scheduling of the workflows.
  • Used mapping parameters and variables.
  • Prepared mapping specification document, which gives the data flow and transformation logic for populating each column in the data warehouse table.
  • Used debugger to analyze the data flow between source and target to fix the data issues.
  • Developed PL/SQL procedures and functions to facilitate specific requirements.
  • Optimized and Tuned SQL queries and PL/SQL blocks to eliminate Full Table scans to reduce Disk I/O and Sorts.
  • Implemented expression, encryption, and SSN-replacement data masking techniques in the Data Masking transformation of Informatica PowerCenter.
  • Performed data masking for the limited trust zone using the Data Masking transformation of Informatica PowerCenter.
  • Created and reviewed the data masking rule sets.
  • Worked extensively on Cognos cube development for the financial reports.
  • Changed Cognos financial reports based on end-user requirements by applying appropriate filters.
  • Created the Cognos connections for web portal and content store.
  • Developed a query subject in Framework Manager in Cognos.
  • Integrated the Salesforce data into the target Oracle database using Informatica Cloud.
  • Validated the Salesforce target data in the Force.com application.
  • Created Invoices, Cash Receipts, RMA, and RMA Start records in Salesforce from Oracle EBS.
  • Extensive knowledge of Master Data Management (MDM) concepts.
  • Extensive experience in designing, managing, and administering MDM/DIW objects using the Kalido DIW/MDM 8.5/9 tool.
  • Configured the Business Objects universe.
  • Made changes to BO financial reports based on user input.
  • Calculated the KPIs and worked with the end users on OBIEE report changes.
  • Created the RPD for OBIEE.
  • Worked on creating physical layer, business model, mapping and presentation layer in OBIEE.
  • Created source system containers for OBIEE.
  • Created subject areas in containers for OBIEE.
  • Created narrative reports in OBIEE.
  • Performed an administrative role with monitoring and support for several projects in the Staging and Production environments for Informatica and OBIEE.
  • Created the metadata layer for OBIEE.
  • Installed and configured the SAP and JD Edwards (PowerExchange) adapters for Informatica.
  • Configured Informatica for the SAP connector.
  • Extracted data from SAP and loaded it into Oracle EBS.
  • Retrieved data from SAP using Informatica PowerExchange.
  • Configured the SAP IDoc connector for Informatica.
  • Retrieved data from SAP IDocs using the Informatica connector.
  • Configured the Informatica PowerExchange add-on (PowerConnect) for SAP HANA.
  • Worked on designing catalogs, categories, sub-categories, and user roles using Kalido MDM 9.
  • Used SQL Loader to load huge flat files into Oracle database.
  • Implemented an audit and reconcile process to ensure the data warehouse matches the source systems in all reporting perspectives (see the sketch after this list).
  • Prepared the Standard Operating Procedure (Knowledge Transfer) document, which provides the information required for the maintenance and operation of the application.
  • Created the release requests for QA Builds to include all the release requirements and involved in the implementation of QA, UAT and Production releases.
  • Developed UNIX shell scripts that send the reports to the client over the network via FTP and SFTP and generate a log file keeping a history of the transfers.
  • Provided data loading, monitoring, System support and worked on data issues raised by end user during its production support phase.
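
A minimal sketch of the audit-and-reconcile check described above (see the bullet on audit and reconcile), comparing yesterday's source row count with the warehouse count. The connect strings, tables, and mail address are hypothetical placeholders.

    #!/bin/sh
    # Hypothetical source-vs-warehouse row count reconciliation.
    run_count() {  # $1 = connect string, $2 = count query
        printf 'set heading off feedback off pagesize 0\n%s\n' "$2" |
            sqlplus -s "$1"
    }

    SRC_CNT=$(run_count etl_user/etl_pass@SRCDB \
        "SELECT COUNT(*) FROM orders WHERE created_dt >= TRUNC(SYSDATE)-1 AND created_dt < TRUNC(SYSDATE);")
    DW_CNT=$(run_count etl_user/etl_pass@DWDB \
        "SELECT COUNT(*) FROM dw_order_fact WHERE load_dt = TRUNC(SYSDATE)-1;")

    if [ "$SRC_CNT" -ne "$DW_CNT" ]; then
        echo "Reconcile FAILED: source=$SRC_CNT warehouse=$DW_CNT" |
            mailx -s "DW audit mismatch" dw-support@example.com
        exit 1
    fi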

Environment: Informatica PowerCenter 9.1.0 HF1, Oracle 11g, SQL Server 2012/2008 R2, SSIS, Netezza 7.2.0, Oracle SQL*Loader, Flat Files, UNIX Shell Scripting, TOAD, Informatica 9, Salesforce.com (SFDC), OBIEE, Business Objects.

Confidential, Norfolk, VA

ETL/Informatica Developer

Responsibilities:

  • Performed analysis and requirements gathering by coordinating with business users.
  • Prepared high-level and low-level design specifications.
  • Prepared source-to-target mapping documents for ease of ETL implementation.
  • Prepared specifications to create the new data model by coordinating with the Data Architect and DBA.
  • Created and reviewed Informatica mappings, sessions and workflows.
  • Created test data, test scripts, perform testing and review test results.
  • Extensively worked on creating and reviewing SQL scripts and UNIX shell scripts.
  • Tuned embedded SQL in Informatica mappings.
  • Extensively tuned Informatica mappings, sessions by following best practices to avoid unnecessary caching of records by the lookups.
  • Created and used native SQL for performing updates where Informatica is taking more time on huge dataset tables.
  • Modified and maintained the Oracle metadata tables to support the Essbase cubes.
  • Created SSIS packages using BIDS.
  • Imported data from AS/400 into the dbo schema of SQL Server 2008 through the Import Wizard and stored the operation as an SSIS package.
  • Worked with the Business Objects developers in building the universes.
  • Made changes to BO reports based on user input.
  • Configured the FastExport utility of Teradata.
  • Loaded data using the Teradata utilities FLoad and MLoad.
  • Retrieved data from Teradata using the FastExport utility (see the sketch after this list).
  • Purged old data using post-SQL in Informatica mappings.
  • Created Crystal Reports connections to Oracle and SQL Server using the designer.
  • Optimized the Crystal Reports queries for faster report retrieval.
  • Involved in development of Informatica interfaces.
  • Involved in periodic team Meetings to get regular updates from the team.
  • Involved in pulling data from XML files, flat Files, SQL Server into Data warehouse and then Data Mart.
  • Participated in logical and physical modeling and STAR schema design for the data marts.
  • Wrote complex SQL scripts to avoid Informatica Joiners, Unions and Look-ups to improve the performance as the volume of the data was heavy.
  • Provided requirements to the Business Objects development team to create reports.
  • Involved in working with the Business Objects team to create required Universe measures to auto populate values at the report level.

Environment: Informatica PowerCenter 8.6, Windows XP/2003, UNIX, XML, Oracle 10g, Teradata 14, IBM Mainframes, SQL Server, EJB, JSP, JDBC, Visio, Cognos, SQL and PL/SQL, PowerExchange, Business Objects XI edition.

Confidential, Bloomington, IL

ETL/Informatica Developer

Responsibilities:

  • Coordinated the meetings with Architects and Business analysts for requirement gathering, business analysis to understand the business requirement and to prepare Technical Specification documents (TSD) to code ETL Mappings for new requirement changes.
  • Worked with source-system owners on day-to-day ETL progress monitoring and on data warehouse target schema (star schema) design and maintenance.
  • Worked with the Team Lead and Data Modeler to design the data model using Erwin 7.1.
  • Extensively worked on multiple sources like Oracle 10g, DB2, MS SQL server, XML and delimited flat files to staging database and from staging to the target XML and Oracle Data Warehouse database.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica PowerCenter 8.6.1/7.1.4.
  • Worked closely with the Cognos reports development team in configuring the cubes.
  • Created Cognos cubes and denormalized the data for faster access.
  • Customized the Cognos reports.
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data; most transformations were used, such as Source Qualifier, Router, Aggregator, connected and unconnected Lookup, Union, Filter, and Sequence Generator.
  • Worked on requirements gathering for the B2B XML feeds (Customer and Product).
  • Developed XML feeds using the XML Parser transformation for large-scale data transmission.
  • Developed functional and Technical design documents for implementing Informatica web services.
  • Debugging invalid mappings using break points, testing of stored procedures and functions, testing of Informatica sessions, batches and the target Data.
  • Used SQL tools like TOAD 9.6.1 to run SQL queries and validate the data in warehouse.
  • Used UNIX shell scripts for automation of ETL batch jobs (see the sketch after this list).
  • Involved in bulk data loading, using a J2EE web service to FTP the XML feeds (Customer and Product) to B2B.
  • Participated in defining and implementing a data masking environment in the Onsite / Offshore model.
  • Performed various operations like scheduling, publishing and retrieving reports from corporate documents using the business objects reporting.
  • Conducted interviews and organized sessions with Business and SMEs located at various theaters across the globe to obtain domain level information and conduct analysis.
  • Involved in preparing test script documentation for load testing and in ping tests conducted on servers from different theaters across the globe.
  • Participated in weekly end-user meetings to discuss data quality and performance issues, ways to improve data accuracy, and new requirements.

Environment: Informatica PowerCenter 8.1, Oracle 10g/9i, Linux, XML, SQL Server, PL/SQL, TOAD 9.6.1, UNIX, Business Objects XI, Cognos E-Business Suite, Cognos BI Suite 6.0/7.2, Shell Scripts, SQL Navigator.

Confidential, Pittsburg, PA

Informatica Developer/ Data Analyst

Responsibilities:

  • Worked with business analysts for requirement gathering (BRD), business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Interacted with technical, functional, and business audiences across different phases of the project life cycle.
  • Developed mappings for fact and dimension tables using various transformations to extract data from different sources: Oracle, SQL Server, and XML files.
  • Designed and developed complex ETL mappings by making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
  • Created workflows using various tasks such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, and Assignment, and worked on scheduling of the workflows.
  • Used mapping parameters and variables.
  • Prepared mapping specification document, which gives the data flow and transformation logic for populating each column in the data warehouse table.
  • Used debugger to analyze the data flow between source and target to fix the data issues.
  • Developed PL/SQL procedures and functions to facilitate specific requirements.
  • Optimized and Tuned SQL queries and PL/SQL blocks to eliminate Full Table scans to reduce Disk I/O and Sorts.
  • Worked closely with Crystal Reports developers in populating the denormalized reporting-layer data.
  • Involved in DBMS development, including building data migration scripts using Oracle SQL*Loader and UTL packages.
  • Implemented an audit and reconcile process to ensure the data warehouse matches the source systems in all reporting perspectives.
  • Prepared the Standard Operating Procedure (Knowledge Transfer) document, which provides the information required for the maintenance and operation of the application.
  • Created the release requests for QA Builds to include all the release requirements and involved in the implementation of QA, UAT and Production releases.
  • Developed UNIX shell scripts that send the reports to the client over the network via FTP and SFTP and generate a log file keeping a history of the transfers.
  • Provided data loading, monitoring, System support and worked on data issues raised by end user during its production support phase.

Environment: Informatica PowerCenter 8.1.1 SP4, Oracle 10g, Oracle SQL*Loader, Flat Files, UNIX Shell Scripting, TOAD, AppWorx 6.0, Crystal Reports.

Confidential

Sr. Informatica Developer

Responsibilities:

  • Moved data from legacy systems to Oracle Server 7.3 using SQL*Loader and PL/SQL procedures.
  • Involved in data modeling.
  • Created the physical data model.
  • Designed the application security.
  • Created all Oracle schema objects.
  • Wrote triggers and stored procedures.
  • Analyzed various Schemas for Implementation and Modeled the Data Warehousing Data marts using Star Schema.
  • Created mappings using the Transformations such as the Source qualifier, Aggregator, Expression, lookup, Filter, Router, Rank, Sequence Generator, Update Strategy etc.
  • Extracted data from flat files and an Oracle database and applied business logic to load it into the central Oracle database.
  • Developed complex mappings and mapplets in Informatica to load the data using different transformations.
  • Created and Monitored Sessions and Batches using Server Manager.
  • Extensively used various performance-tuning techniques to improve session performance (partitioning, etc.).
  • Successfully moved the Sessions and Batches from the development to production environment.
  • Extensively used Informatica client tools - Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Server Manager.
  • Generated completion messages and status reports using Workflow manager.
  • Created workflows and worklets for designed mappings.
  • Developed mappings and loaded data on to the relational database. Worked on Dimension as well as Fact tables.
  • Extensively used PL/SQL procedures/functions to build business rules (see the sketch after this list).
  • Created User conditions and Filters to improve the report generation.
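
A minimal sketch of the kind of PL/SQL business-rule procedure mentioned above (see the PL/SQL bullet); the schema, tables, columns, and rule are hypothetical placeholders.

create_credit_rule.sql:

    CREATE OR REPLACE PROCEDURE apply_credit_hold (p_cust_id IN NUMBER) AS
      v_balance NUMBER;
    BEGIN
      SELECT balance INTO v_balance FROM accounts WHERE cust_id = p_cust_id;
      -- Illustrative rule: put customers with large balances on credit hold.
      IF v_balance > 10000 THEN
        UPDATE accounts SET credit_hold = 'Y' WHERE cust_id = p_cust_id;
      END IF;
    END;
    /

Deployed from the shell:

    sqlplus -s app_owner/app_pass@ORCL @create_credit_rule.sql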

Environment: Informatica PowerCenter 8.1, Oracle 8i, Flat files, SQL, PL/SQL, UNIX, Developer 2000, SQL*Loader, PL/SQL procedures, Windows 2000.
