
Lead ETL/DAC Architect Resume


San Clemente, CA

SUMMARY:

  • Over 11 years of experience in analysis, design, development, and implementation of client/server and data warehouse applications using Oracle, MS SQL Server, DB2, and Netezza on Windows and UNIX platforms.
  • Strong experience in designing and developing Business Intelligence solutions in Data Warehouse/Decision Support Systems, using Informatica PowerCenter 9.5/9.1/8.6.1/8.1/7.1/6.2/6.1 against OLTP and OLAP systems.
  • Data Modeling experience using Star Schema/Snowflake modeling, OLAP/ROLAP tools, Fact and Dimensions tables, Physical and logical data modeling, and Oracle Designer.
  • Experience in documenting Design specs, Unit test plan and deployment plan.
  • Experience in Installation, Configuration, and Administration of Informatica Power Center 9.x/8.x/7.x/6.x and Power Mart 5.x/6.x Client, Server.
  • Experience in Repository Configuration, creating Transformations and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Extensively worked on the Data Warehouse Administration Console (DAC).
  • Experience in data integration of various data sources from databases such as MS Access, Oracle, and SQL Server, and from formats such as flat files, CSV, COBOL, and XML files.
  • Experience with Oracle as a PL/SQL developer.
  • Extensively worked on both Windows (Batch and PowerShell) and UNIX scripting.
  • Worked extensively with Data Services, collaborating proactively and iteratively on the profiling and cleansing of data across heterogeneous sources.
  • Experience in the creation and customization of ETL extracts and load processes using Informatica.
  • Experience in Performance tuning of Informatica (sources, mappings, targets and sessions) and tuning the SQL queries.
  • Worked on data migration of Informatica Mappings, Sessions, and Workflows to Data Integrator.
  • Experience in design and implementation using ETL tools like Informatica (Power Center) Designer, Repository Manager and Workflow / Server Manager.
  • Extensively worked on Installation and configuration of Financials, Supply Chain Analytics, Procurement Analytics and Spend Analytics through Informatica.
  • Experience in data masking using SQL scripts (a brief sketch follows this list).
  • Thorough knowledge of Teradata.
  • Extensively worked on Teradata utilities - SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Queryman - to design and develop data flow paths for loading, transforming, and maintaining the data warehouse.
  • Experience in Teradata parallel support and Unix Shell scripting.
  • Strong experience in creating Database Objects such as Tables, Views, Functions, Stored Procedures, Indexes, Triggers, Cursors in Teradata.
  • Experience in using the Data Warehouse Administration Console (DAC) to schedule Informatica jobs and manage metadata.
  • Expertise in design, development, and implementation of Informatica applications including Business Analysis, Dimensional Modelling, Repository Management (RPD), Extraction Transformation and Load (ETL).
  • Proficient in the Physical Layer, Business Model and Mapping Layer, and Presentation Layer using the OBIEE Administration Tool.
  • Developed effective working relationships with client team to understand support requirements, develop tactical and strategic plans to implement technology solutions, and effectively manage client expectations.
  • Excellent interpersonal and communication skills, technically competent and result-oriented with problem solving skills and ability to work independently and use sound judgment.
  • Developed excellent professional skills by working independently and also as a team member to analyze the Functional/ Business requirements and to prepare test plans, test scripts.
  • Strong conceptual, business, and analytical skills; a highly motivated team player with the ability to provide work estimates and meet deadlines.
  • Excellent written and verbal communication skills.
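
The SQL-script data masking noted above is illustrated by the minimal sketch below; the CUSTOMER_STG table, its columns, and the masking rules are hypothetical stand-ins rather than actual project code.

-- Hypothetical example: mask personally identifiable data in a staging table
-- before it is promoted to non-production environments.
UPDATE customer_stg
   SET ssn        = 'XXX-XX-' || SUBSTR(ssn, -4),            -- keep only the last 4 digits
       email_addr = 'user' || customer_id || '@masked.example.com',
       birth_date = TRUNC(birth_date, 'YYYY')                -- keep the year, drop month/day
 WHERE mask_flag = 'Y';
COMMIT;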

TECHNICAL SKILLS:

ETL Tool: Informatica 9.5/9.1/8.6.1/8.1/7.1/6.2/6.1

Data Modeling: Toad, Star-Schema, Snowflake-Schema, Erwin.

DBMS: Oracle 8i/9i/10g/11g, MS Access, SQL Server 2003/2005, Teradata 13.1, Netezza, IBM DB2.

Languages & Scripting: SQL, PL/SQL, Java, XML, HTML, Perl, Batch Scripting, PowerShell, Korn Shell Scripting, C, C++, JCL.

Reporting Tools: OBIEE 10g/11g, Business Objects 3.1/4.1, Cognos

Schedulers: DAC 10g/11g, Control M, Autosys and ROBOT.

Operating Systems: Windows 95/98/00/NT/XP/Vista, Sun Solaris, UNIX.

PROFESSIONAL EXPERIENCE:

Confidential, San Clemente, CA

Lead ETL/DAC Architect

Responsibilities:

  • Verify out-of-box OBIA implementation and validate standard metrics.
  • Perform detailed fit-gap analysis between requirements and out-of-box functionality.
  • Design and build extended OBIA data model to cover additional required functionality.
  • Design and build extended OBIA ETL mappings to cover additional required functionality.
  • Design and build three OBIEE reports - P&L Statement, Balance Sheet, and Trial Balance.
  • Participate in Integration and user acceptance testing.

Environment: Informatica Power Center 9.5, Oracle 11g, OBIEE 11g, Oracle EBS Apps R12, DAC (Data Warehouse Administration Console), Toad, SQL Developer, Flat Files, UNIX.

Confidential, Madison, WI

Lead ETL Architect

Responsibilities:

  • Involved in data design and modeling by specifying the physical infrastructure, system study, design, and development by applying Ralph Kimball methodology of dimensional modeling and using Erwin.
  • Imported various Application Sources, created Targets and Transformations using Informatica Power Center Designer (Source analyzer, Warehouse developer, Transformation developer, Mapplet designer, and Mapping designer).
  • Actively involved in gathering requirements from the end users.
  • Used ETL (Informatica) to load data from sources like Oracle database and Flat Files.
  • Designed and developed Informatica Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables into target tables.
  • Worked with various transformations to solve Slowly Changing Dimension problems using Informatica Power Center.
  • Developed and scheduled Workflows using task developer, worklet designer, and workflow designer in Workflow manager and monitored the results in Workflow monitor.
  • Did performance tuning at source, transformation, target, and workflow levels.
  • Used pmcmd and UNIX shell scripts for workflow automation.
  • Involved in creating and managing global and local repositories and assigning permissions using Repository Manager; also migrated repositories between development, testing, and production systems.
  • Extensively used Stored Procedures, Functions and PL/ SQL Programming.
  • Extensively used Normalizer transformation while loading the data from the COBOL copybooks to the staging area.
  • Integrated data from Finance and other enterprise systems and transformed data using Informatica (ETL).
  • Involved in installation and configuration of Informatica Power Center and DVO (Data Validation Option).
  • Proficiency in identifying the issues/defects and resolving them with best practices.
  • Responsible for configuring the Source and lookup files.
  • Administration of all components that make up the Data Warehouse and Informatica applications and related tools including software installation, patching, configuration, monitoring, and tuning.
  • Provided extensive support, attending service calls and production support calls for the database.
  • Developed and implemented data warehouse and data mart techniques for target structures such as Star Schemas, Snowflake Schemas, and highly normalized data models.
  • Identified major issues in the data warehouse and delivered fixes on time.
  • Diagnose and resolve Data Warehouse access and performance issues.
  • Worked extensively with DVO to generate Jasper reports for all test results.
  • Developed data Mappings between source systems and warehouse components.
  • Created and defined the DDL for the staging-area tables (a brief sketch follows this list).
  • Responsible for creating shared and reusable objects in the Informatica shared folder and for updating those objects with new requirements and changes.
  • Analyzed systems, met with end users and business units, in order to define and meet the requirements.
  • Automated mappings to run using UNIX shell scripts, which included Pre and Post-session jobs.
  • Documented user requirements, translated requirements into system solutions, and developed implementation plans and schedules.
  • Migrated objects between Development, Test, and Production repositories.
  • Supported the Quality Assurance team in testing and validating the Informatica workflows.
  • Performed unit testing and development testing at the ETL level on my mappings.
  • Installed and configured Informatica Tools on client machine on Windows NT environment.
  • Used Informatica PowerAnalyzer and Informatica Analytics for developing dashboards, alerts, reports, indicators, and ad hoc query reporting.
  • Trained users on how to access and run the jobs using Informatica Workflow Manager.
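
A minimal sketch of the staging-area DDL referenced above; the table and columns are hypothetical.

-- Hypothetical staging table sitting between the flat-file/COBOL sources and the
-- warehouse dimensions; truncated and reloaded each run, so constraints are minimal.
CREATE TABLE stg_customer
(
    customer_src_id   VARCHAR2(30)  NOT NULL,
    customer_name     VARCHAR2(100),
    address_line1     VARCHAR2(100),
    city              VARCHAR2(50),
    state_cd          VARCHAR2(2),
    load_dt           DATE DEFAULT SYSDATE,
    CONSTRAINT pk_stg_customer PRIMARY KEY (customer_src_id)
);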

Environment: Oracle 11g, Business Objects 3.1/4.1, Informatica Power Center 9.5/9.1 (Repository Manager, Designer, Workflow Manager and Workflow Monitor), DVO 9.5, ANSI SQL, PLSQL, SQL*PLUS, SQL*Loader, TOAD, SQL Developer, Unix(shell scripting), Control M

Confidential, North Wales, PA

Business Intelligence Lead

Responsibilities:

  • Studied the existing environment and gathered requirements by querying the clients on project forecasting.
  • Performed data modeling and designed the data warehouse and data marts using a star schema methodology with conformed, granular dimensions and fact tables.
  • Prepared user requirement documentation for mappings and additional functionality. Extensively used ETL to load data with PowerCenter/PowerConnect from source systems such as flat files and Excel files into staging tables, and loaded the data into the target Oracle database.
  • Worked on OBIEE 11g to build the RPD/Dashboards/Reports.
  • Created PL/SQL packages, Stored Procedures and Triggers for data transformation on the data warehouse.
  • Design and Development of data validation, load processes, test cases, error control routines, audit and log controls using PL/SQL, SQL.
  • Developed performance utilization charts, optimized and tuned SQL and designed physical databases. Assisted developers with Teradata load utilities and SQL.
  • Tuned long-running queries and coded with Teradata tools and utilities.
  • Converted batch jobs from the BULKLOAD utility to the TPump utility.
  • Created proper primary indexes (PIs), taking into consideration both the planned access of data and even distribution of data across all available AMPs.
  • Implemented slowly changing dimensions methodology to keep track of historical data.
  • Used Teradata utilities like MultiLoad, Tpump, and Fast Load to load data into Teradata data warehouse from Oracle and DB2 databases.
  • Wrote conversion code per the business logic using BTEQ scripts.
  • Tuned user queries for performance and improved the execution of frequently used SQL operations.
  • Resolved various defects in a set of wrapper scripts that executed the Teradata BTEQ, MultiLoad, and FastLoad utilities and UNIX shell scripts.
  • Extensively used SQL, PL/SQL and Teradata in creation of Triggers, Functions, Indexes, Views, Cursors and Stored Procedures.
  • Used Update Strategy and target load plans to load data into Type 2/Type 1 dimensions (a Type 2 sketch follows this list).
  • Created and used reusable Mapplets and transformations using Informatica Power Center.
  • Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.
  • Designed and developed ETL routines using Informatica Power Center; within the Informatica mappings, made extensive use of Lookups, Aggregators, Rank transformations, stored procedures and functions, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers for data flow management into multiple targets.
  • Developed Joiner transformation for extracting data from multiple sources.
  • Preparation of technical specification for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables in Data Marts and defining ETL standards.
  • Designed and developed pre-session, post-session, and batch execution routines using the Informatica Server to run Informatica sessions.
  • Documented the complete Mappings.
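
A minimal SQL sketch of the Type 2 dimension load referenced above (close out the changed current row, then insert the new version); the table and column names are hypothetical, and on the project this logic was driven through Informatica Update Strategy transformations rather than hand-written SQL.

-- Step 1: expire the current row when a tracked attribute has changed.
UPDATE dim_customer
   SET current_flag     = 'N',
       effective_end_dt = CURRENT_DATE - 1
 WHERE current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_src_id = dim_customer.customer_src_id
                  AND s.customer_name  <> dim_customer.customer_name);

-- Step 2: insert the new current version for changed or brand-new customers
-- (the surrogate key is assumed to be generated separately).
INSERT INTO dim_customer
       (customer_src_id, customer_name, effective_start_dt, effective_end_dt, current_flag)
SELECT s.customer_src_id, s.customer_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_src_id = s.customer_src_id
                      AND d.current_flag = 'Y');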

Environment: Informatica Power Center 8.6.1, Teradata 13.1, Oracle 11g, OBIEE 11g, DB2, SQL Server 2005, Toad, SQL Developer, Flat Files, UNIX, Autosys.

Confidential, Englewood, CO

Lead ETL/DAC Architect

Responsibilities:

  • Integrated data from Finance and other enterprise systems and transformed data using Informatica (ETL).
  • Involved in installation and configuration of Informatica and DAC.
  • Proficiency in identifying the issues/defects and resolving them with best practices.
  • Analyzed the OBIEE Analytics metadata repository (RPD), which consists of the Physical Layer, Business Model and Mapping Layer, and Presentation Layer.
  • Responsible for configuring the Source and lookup files.
  • Administration of all components that make up the Data Warehouse and Informatica applications and related tools including software installation, patching, configuration, monitoring, and tuning.
  • Provided extensive support, attending service calls and production support calls for the database.
  • Developed and implemented data warehouse and data mart techniques for target structures such as Star Schemas, Snowflake Schemas, and highly normalized data models.
  • Identified major issues in the data warehouse and delivered fixes on time.
  • Monitored scheduled jobs in DAC scheduler on different project plans across the warehouse.
  • Diagnose and resolve Data Warehouse access and performance issues.
  • Customized the existing Informatica repository, adding new mappings and workflows based on the requirements.
  • Extensively worked as a Techno-Functional consultant in providing CRM solution with excellent business domain experience in Siebel Telecom, Finance and Siebel Call Center, Sales and Marketing Applications.
  • Analysis, Design and Development in SQL, Informatica and OBIEE 11g/10g.
  • Developed and merged the repository, performed performance tuning, and enforced repository best practices.
  • Troubleshooting and debugging of defects and production issues in the areas of SQL, Informatica, and OBIEE 11g/10g.
  • Worked in conjunction with System Administrators for migration, deployment, and performance tuning of repositories.
  • Designed and Developed Custom Mappings and Workflows in Informatica to bring the data from other data source systems and Integrated using Universal Adapters in DAC.
  • Implemented incremental logic for stage-load mappings and insert/update logic for all fact mappings (a brief sketch follows this list).
  • Monitored Incremental and Full load of Data through Data Warehouse Administration Console (DAC) and Informatica Workflow Monitor.
  • Imported and exported execution plans in DAC from one instance to another.
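
A minimal sketch of the incremental-extract and insert/update pattern referenced above; the tables are hypothetical and the $$LAST_EXTRACT_DATE placeholder stands in for the refresh date that DAC supplies to the Informatica mapping.

-- Incremental extract (source-qualifier style filter): pull only rows changed
-- since the last successful run.
SELECT invoice_id, customer_id, invoice_amount, last_update_date
  FROM src_invoice
 WHERE last_update_date > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS');

-- Insert/update logic for the fact target, shown here as a MERGE for illustration.
MERGE INTO fact_invoice tgt
USING stg_invoice stg
   ON (tgt.integration_id = stg.integration_id)
 WHEN MATCHED THEN
   UPDATE SET tgt.invoice_amount = stg.invoice_amount,
              tgt.etl_load_dt    = SYSDATE
 WHEN NOT MATCHED THEN
   INSERT (integration_id, invoice_amount, etl_load_dt)
   VALUES (stg.integration_id, stg.invoice_amount, SYSDATE);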

Environment: Oracle 11g, DAC 11g, OBIEE 11g, EBS11.6.10, Siebel Analytics/CRM/OBIEE/BA, RPD, Informatica Power Center 9.1 (Repository Manager, Designer, Workflow Manager and Workflow Monitor), ANSI SQL, PLSQL, SQL*PLUS, SQL*Loader, TOAD, Unix(shell scripting).

Confidential, Chicago, IL

Lead ETL Analyst

Responsibilities:

  • Involved in data design and modeling by specifying the physical infrastructure, system study, design, and development by applying Ralph Kimball methodology of dimensional modeling and using Erwin.
  • Imported various Application Sources, created Targets and Transformations using Informatica Power Center Designer (Source analyzer, Warehouse developer, Transformation developer, Mapplet designer, and Mapping designer).
  • Actively involved in gathering requirements from the end users.
  • Used ETL (Informatica) to load data from sources like Oracle database and Flat Files.
  • Designed and developed Informatica Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables into target tables.
  • Worked with various transformations to solve Slowly Changing Dimension problems using Informatica Power Center.
  • Developed and scheduled Workflows using task developer, worklet designer, and workflow designer in Workflow manager and monitored the results in Workflow monitor.
  • Did performance tuning at source, transformation, target, and workflow levels.
  • Used pmcmd and UNIX shell scripts for workflow automation.
  • Involved in creating and managing global and local repositories and assigning permissions using Repository Manager; also migrated repositories between development, testing, and production systems.
  • Worked with Source Analyzer, Warehouse Designer, Transformation designer, Mapping designer and Workflow Manager.
  • Extensively used stored procedures, functions, and PL/SQL programming (a brief sketch follows this list).
  • Extensively used Normalizer transformation while loading the data from the COBOL copybooks to the staging area.
  • Used transformations such as the source qualifier, normalizer, aggregators, connected & unconnected lookups, filters, sorter, stored procedures, router & sequence generator.
  • Checked and tuned the performance of application.
  • Developed data Mappings between source systems and warehouse components.
  • Created and defined the DDL for the staging-area tables.
  • Responsible for creating shared and reusable objects in the Informatica shared folder and for updating those objects with new requirements and changes.
  • Analyzed systems, met with end users and business units, in order to define and meet the requirements.
  • Automated mappings to run using UNIX shell scripts, which included Pre and Post-session jobs.
  • Documented user requirements, translated requirements into system solutions, and developed implementation plans and schedules.
  • Migrated objects between Development, Test, and Production repositories.
  • Supported the Quality Assurance team in testing and validating the Informatica workflows.
  • Performed unit testing and development testing at the ETL level on my mappings.
  • Installed and configured Informatica Tools on client machine on Windows NT environment.
  • Used Informatica PowerAnalyzer and Informatica Analytics for developing dashboards, alerts, reports, indicators, and ad hoc query reporting.
  • Trained users on how to access and run the jobs using Informatica Workflow Manager.
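
A minimal sketch of the kind of PL/SQL procedure used around the loads; the ETL_BATCH_LOG control table and its columns are hypothetical.

-- Hypothetical post-load procedure: close out the batch-control row for a
-- workflow run, recording the number of rows it processed.
CREATE OR REPLACE PROCEDURE log_batch_complete (
    p_workflow_name IN VARCHAR2,
    p_rows_loaded   IN NUMBER
) AS
BEGIN
    UPDATE etl_batch_log
       SET status      = 'COMPLETE',
           rows_loaded = p_rows_loaded,
           end_time    = SYSDATE
     WHERE workflow_name = p_workflow_name
       AND status        = 'RUNNING';

    IF SQL%ROWCOUNT = 0 THEN
        -- No open batch row found; insert one so the run is still traceable.
        INSERT INTO etl_batch_log (workflow_name, status, rows_loaded, end_time)
        VALUES (p_workflow_name, 'COMPLETE', p_rows_loaded, SYSDATE);
    END IF;

    COMMIT;
END log_batch_complete;
/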

Environment: Informatica Power Center 9.5/9.1, Oracle 11g, Toad, Erwin, SQL, PL/SQL, SQL*Plus, MS SQL Server 2008, UNIX.

Confidential, Addison, IL

Lead ETL/DAC Architect

Responsibilities:

  • Implemented Financials, Supply Chain, Order Management and Procurement.
  • Integrated data from Finance and other enterprise systems and transformed data using Informatica (ETL).
  • Involved in installation and configuration of Informatica and DAC.
  • Analyzed the OBIEE Analytics metadata repository (RPD), which consists of the Physical Layer, Business Model and Mapping Layer, and Presentation Layer.
  • Responsible for configuring the Source and lookup files.
  • Develop the overall data warehouse architecture and physical implementation standards.
  • Act as evangelist for BI benefits across the organization; promote BI usage to relevant departments.
  • Administration of all components that make up the Data Warehouse and Informatica applications and related tools including software installation, patching, configuration, monitoring, and tuning.
  • Design, create and tune physical database objects (tables, views, indexes) to support logical and dimensional models. Maintain the referential integrity of the database.
  • Develop and implement Data Warehouse, data mart techniques for target structures such as Star Schemas, Snowflake Schemas, and highly normalized data models.
  • Diagnose and resolve Data Warehouse access and performance issues.
  • Customized the existing Informatica repository, adding new mappings and workflows based on the requirements.
  • Analyzed and developed complex Workflows and SmartScripts; performed sophisticated systems analysis in the enterprise environment.
  • Extensively worked as a Techno-Functional consultant in providing CRM solution with excellent business domain experience in Siebel Telecom, Finance and Siebel Call Center, Sales and Marketing Applications.
  • Analysis, design, and development in SQL, Informatica, and OBIEE 11g/10g.
  • Developed and merged the repository, performed performance tuning, and enforced repository best practices.
  • Troubleshooting and debugging of defects and production issues in the areas of SQL, Informatica, and OBIEE 11g/10g.
  • Worked in conjunction with System Administrators for migration, deployment, and performance tuning of repositories.
  • Extensively used ETL processes to load data from flat files into the target database by applying business logic on transformation mapping for inserting and updating records when loaded.
  • Experience in Architecting large BI implementations over Enterprise Data Warehouse.
  • Experience with Analysis, design, development and implementation of projects for corporate Data Warehouse.
  • Extensive BI and Data warehousing experience using Siebel Analytics 7.x/OBIEE and Informatica 9.x/8.x/7.x/6.x.
  • Designed and developed custom mappings and workflows and registered them in DAC.
  • Experience in Preparing Design Specification documents based on functional requirements and also involved in the preparation of Technical Design Documents.
  • Worked on extracting data from Oracle EBS, OLTP flat files and integration and loading data to Data Warehouse (OLAP) system.
  • Designed and Developed Custom Mappings and Workflows in Informatica to bring the data from other data source systems and Integrated using Universal Adapters in DAC.
  • Implemented incremental logic for stage-load mappings and insert/update logic for all fact mappings.
  • Created time-series measures (Year Ago, Month Ago, and Year-to-Date) using the AGO and TODATE functions.
  • Modified Dynamic Session Variables used for Data Visibility, Configured Security at the repository level, Catalog level & the Application level.
  • Monitored Incremental and Full load of Data through Data Warehouse Administration Console (DAC) and Informatica Workflow Monitor.
  • Developed Informatica Mappings using corresponding Source, Targets and Transformations like Source Qualifier, Sequence Generator, Filter, Router, Joiner, Lookup, Expression, Update Strategy, and Aggregator.
  • Defined the Primary, Foreign keys and created Simple and Complex joins between various Dimension and Fact Tables.
  • Imported and exported execution plans in DAC from one instance to another.
  • Built the Physical Layer, Business Model and Mapping Layer, and Presentation Layer of a repository using star and snowflake schemas.
  • Involved in Performance Tuning of reports by identifying the indexes required on the backend tables and also from the data available in the Cache.
  • Designed and Developed Sessions and Workflows.
  • Exclusively Involved in Unit Testing, Peer Testing & Regression Testing.
  • Added new tables and tasks and rebuilt the subject areas in DAC based on the changes made to the Informatica repository.
  • Created functions, procedures and triggers in PL/SQL.
  • Created mappings using PL/SQL packages to implement business rules while loading data.
  • Wrote test scripts to compare OLTP vs. OLAP data for ETL testing (a brief sketch follows this list).
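
A minimal sketch of an OLTP-versus-OLAP reconciliation test of the kind referenced above; the source table, database link, and warehouse fact are hypothetical stand-ins.

-- Compare row counts and amount totals between the OLTP source and the
-- warehouse fact by period; any row returned flags a period to investigate.
SELECT src.period_name,
       src.src_rows, tgt.dw_rows,
       src.src_amount - tgt.dw_amount AS amount_diff
  FROM (SELECT period_name, COUNT(*) AS src_rows, SUM(amount) AS src_amount
          FROM gl_journal_lines@oltp_link              -- hypothetical source via DB link
         GROUP BY period_name) src
  JOIN (SELECT period_name, COUNT(*) AS dw_rows, SUM(amount) AS dw_amount
          FROM w_gl_journal_f                          -- hypothetical warehouse fact
         GROUP BY period_name) tgt
    ON src.period_name = tgt.period_name
 WHERE src.src_rows   <> tgt.dw_rows
    OR src.src_amount <> tgt.dw_amount;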

Environment: Oracle Applications R12, General Ledger, Account Payables, Accounts Receivable, Inventory, Order Management, Oracle Purchasing, OWB, Oracle 11g/10g/9i (9.2.0), OBAW, OBIEE 10.1.3.4.1/11g, EBS 11.5.10, Siebel Analytics/CRM/OBIEE/BA, RPD, Apex 4.0, ETL, Informatica Power Center 8.x (Repository Manager, Designer, Workflow Manager and Workflow Monitor), DAC 10.1.3.4.1, XML/BI Publisher 5.6.3, Discoverer 11g, MSCA, ANSI SQL, PLSQL, SQL*PLUS, SQL*Loader, TOAD, UNIX (shell scripting).

Confidential, Marlborough, MA

Lead ETL / Data Analyst

Responsibilities:

  • Studied the existing environment and gathered requirements by querying the clients on various aspects.
  • Identification of various data sources and the development environment.
  • Performed data modeling and design for the data warehouse and data marts using a star schema methodology with conformed, granular dimensions and fact tables.
  • Prepared user requirement documentation for mappings and additional functionality. Extensively used ETL to load data with PowerCenter/PowerConnect from source systems such as flat files and Excel files into staging tables, and loaded the data into the target Oracle database.
  • Created PL/SQL packages, Stored Procedures and Triggers for data transformation on the data warehouse.
  • Design and Development of data validation, load processes, test cases, error control routines, audit and log controls using PL/SQL, SQL.
  • Used Update Strategy and target load plans to load data into Type 2/Type 1 dimensions.
  • Created and used reusable Mapplets and transformations using Informatica Power Center.
  • Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.
  • Designed and developed ETL routines using Informatica Power Center; within the Informatica mappings, made extensive use of Lookups, Aggregators, Rank transformations, stored procedures and functions, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers for data flow management into multiple targets.
  • Developed Joiner transformation for extracting data from multiple sources.
  • Preparation of technical specification for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables in Data Marts and defining ETL standards.
  • Designed and developed pre-session, post-session, and batch execution routines using the Informatica Server to run Informatica sessions.
  • Documented the complete Mappings.

Environment: Informatica Power Center 8.6.1, Oracle 11g, DB2, SQL Server 2005, Toad, Sql Developer, Cognos, OBIEE, Flat Files, UNIX, Autosys.

Confidential, Clearwater, FL

Lead ETL/ DAC Analyst

Responsibilities:

  • Implemented HR, Financials, Order Management and Procurement.
  • Integrated data from Finance and other enterprise systems and transformed data using Informatica (ETL).
  • Involved in installation and configuration of Informatica and DAC.
  • Analyzed the OBIEE Analytics metadata repository (RPD), which consists of the Physical Layer, Business Model and Mapping Layer, and Presentation Layer.
  • Responsible for configuring the Source and lookup files.
  • Develop the overall data warehouse architecture and physical implementation standards.
  • Act as evangelist for BI benefits across the organization; promote BI usage to relevant departments.
  • Administration of all components that make up the Data Warehouse and Informatica applications and related tools including software installation, patching, configuration, monitoring, and tuning.
  • Design, create and tune physical database objects (tables, views, indexes) to support logical and dimensional models. Maintain the referential integrity of the database.
  • Develop and implement Data Warehouse, data mart techniques for target structures such as Star Schemas, Snowflake Schemas, and highly normalized data models.
  • Diagnose and resolve Data Warehouse access and performance issues.
  • Customized the existing Informatica repository, adding new mappings and workflows based on the requirements.
  • Worked on extracting data from Oracle EBS, OLTP flat files and integration and loading data to Data Warehouse (OLAP) system.
  • Designed and Developed Custom Mappings and Workflows in Informatica to bring the data from other data source systems and Integrated using Universal Adapters in DAC.
  • Implemented incremental logic for stage-load mappings and insert/update logic for all fact mappings.
  • Created time-series measures (Year Ago, Month Ago, and Year-to-Date) using the AGO and TODATE functions.
  • Modified Dynamic Session Variables used for Data Visibility, Configured Security at the repository level, Catalog level & the Application level.
  • Designed the Data model and Load strategy to get data from different systems and use it for the Online Registration database.
  • Prepared the Technical Specs, Reverse Engineered Logical designs using Erwin, Flow diagrams and Timeline sheets to facilitate development of Informatica Flows.
  • Implemented Star Schema for De-normalizing data for faster data retrieval for Online Systems.
  • Designed and Developed Mappings, sessions and workflows in Informatica.
  • Extracted, transformed data from various sources such as Flat files, Oracle 11g and transferred data to the target data warehouse Oracle 11g and Flat files.
  • Customized Project, HR, Procurement and Finance inbuilt mappings.
  • Designed and Deployed UNIX Shell Scripts.
  • Designed and developed Informatica Mapping for data load and data cleansing
  • Responsible for Pre and Post migration planning for optimizing Data load performance, capacity planning and user support.
  • Performed migration of mappings and workflows from Development to Test and to Production Servers.
  • Partitioned sources and used persistent caches for Lookups to improve session performance.
  • Worked closely with QA team during the testing phase and fixed bugs that were reported.
  • Worked with Data / Data Warehouse Architect on logical and physical model designs.
  • Performed impact analysis for systems and database modifications.
  • Designed and developed Informatica Mappings to load data from Source systems to ODS and then to Data Mart.
  • Extracted data from the Sales department to flat files and loaded the data into the target database.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables in an efficient manner.
  • Performed data validation, reconciliation, and error handling in the load process (a brief sketch follows this list).
  • Maintained Development, Test, and Production mapping migration using Repository Manager; also used Repository Manager to maintain metadata, security, and reporting.
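
A minimal sketch of the validation and error-handling step referenced above, assuming hypothetical staging and reject tables.

-- Route rows that fail basic validation into a reject table with a reason code,
-- so the load continues with clean rows and rejects can be reviewed and reloaded.
INSERT INTO stg_order_reject (order_src_id, reject_reason, load_dt)
SELECT s.order_src_id,
       CASE
           WHEN s.order_amount IS NULL    THEN 'MISSING AMOUNT'
           WHEN s.customer_src_id IS NULL THEN 'MISSING CUSTOMER'
           ELSE 'INVALID ORDER DATE'
       END,
       SYSDATE
  FROM stg_order s
 WHERE s.order_amount IS NULL
    OR s.customer_src_id IS NULL
    OR s.order_dt > SYSDATE;
COMMIT;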

Environment: Informatica Power Center 9.1/8.6.1, Oracle 11g/10g, OBIEE, SQL Server, XML, Flat Files, Toad, SQL Developer, DAC (Data Warehouse Administration Console), Shell Scripting.

Confidential, Denver

Sr.ETL Developer

Responsibilities:

  • Extracted data as flat files from GSI and a DB2 database, and applied business logic to load them into the standardization area, the central repository, and on through to the data marts and flat files.
  • Interviewed the business users and data architects to gather requirements and identify data needs.
  • Participated in discussions with the business solutions team on creating and implementing design plans such as flow chart diagrams and conceptual and logical diagrams, and on defining terms based on the needs of the project.
  • Helped identify potential data sources.
  • Produced data results through techniques and tools such as basic SQL queries, data mining, and multidimensional analysis.
  • Installed and configured Informatica Power Center and Server 8.6.
  • Designed the mappings and workflows and implemented the logic using the transformations.
  • Used Informatica power center for (ETL) extraction, transformation and loading data from heterogeneous source systems.
  • Developed complex mappings using transformations such as the Source qualifier, Aggregator, Expression, Lookups, Filter, Router, Sequence Generator, Update Strategy, and Joiner
  • Designed and developed Oracle PL/SQL and shell scripts for data import/export, data conversions, and data cleansing (a brief sketch follows this list).
  • Maintained Sequential load through ROBOT job scheduler.
  • Tuned SQL queries as required; created PL/SQL stored procedures, indexes, and views.
  • Performed unit testing and documented the results.
  • Worked closely with the QA team during the testing phase and fixed bugs that were reported.
  • Designed the build docs of the entire release/project for production support.
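
A minimal sketch of the SQL-based data cleansing referenced above; the columns and standardization rules are hypothetical.

-- Standardize customer attributes in place before they move onward: trim
-- whitespace, normalize case, default missing codes, and strip non-digits.
UPDATE stg_customer
   SET customer_name = INITCAP(TRIM(customer_name)),
       state_cd      = UPPER(TRIM(state_cd)),
       country_cd    = NVL(TRIM(country_cd), 'US'),
       phone_num     = REGEXP_REPLACE(phone_num, '[^0-9]', '')
 WHERE load_dt = TRUNC(SYSDATE);
COMMIT;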

Environment: Informatica 8.6.1, DB2 4 (iSeries Navigator, TOAD for DB2), Oracle 11g, AS400, Cognos, UNIX, MS Access, MS Excel 2007, Robot.

Confidential, Denver

Sr.ETL Developer

Responsibilities:

  • Extracted data as flat file from Sales Connect, SFDC and oracle database, applied business logic to load them in the Standardization area, Central Repository and all the way to data marts.
  • Developed mappings/Reusable Objects/Transformation/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center 8.6.1.
  • Used Java Transformations as both active and passive for calling java web services and generating the output data for updating the backend database.
  • Knowledge in upgrading from Informatica version 7.1.1 to 8.6.1.
  • Created reusable transformations and mapplets and used them in mappings.
  • Used Informatica Power Center 8.6.1 for extraction, loading and transformation (ETL) of data in the data warehouse.
  • Implemented Informatica recommendations, methodologies and best practices.
  • Implemented populate slowly changing dimension to maintain current information and history information in dimension tables.
  • Involved in real time data feed in Morningstar for foreign exchange.
  • Involved in creation of Folders, Users, Deployment Group using Repository Manager.
  • Worked on different data sources such as Oracle, SQL Server, Flat files etc.
  • Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, and Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored procedure transformations.
  • Developed PL/SQL and UNIX Shell Scripts for scheduling the sessions in Informatica.
  • Built Autosys boxes that schedule the Informatica jobs at frequent intervals.
  • Created e-mail notification tasks using post-session scripts.
  • Worked with command line program pmcmd to interact with the server to start and stop sessions and batches, to stop the Informatica server and recover the sessions.
  • Wrote SQL, PL/SQL, stored procedures & triggers for implementing business rules and transformations.
  • Created procedures to drop and recreate the indexes in the target data warehouse before and after the sessions (a brief sketch follows this list).
  • Maintained Full and Incremental loads through Autosys on different environments like Development, Stage, Production and Test environment.
  • Created deployment groups, migrated the code into different environments.
  • Wrote ETL specs and documentation to describe program development, logic, coding, testing, changes, and corrections.
  • Involved in managing the Unix servers and defining the file systems/directory structures on the Unix box for various parameters and deciding the disk space and memory requirements.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Documented technical design documents and error logics.
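
A minimal sketch of the pre-session index-drop procedure referenced above; a matching post-session procedure recreated the indexes from stored DDL (not shown). Object names are hypothetical.

-- Drop the non-unique indexes on a target table so the bulk load runs faster;
-- unique/PK-backing indexes are left in place.
CREATE OR REPLACE PROCEDURE drop_target_indexes (p_table_name IN VARCHAR2) AS
BEGIN
    FOR ix IN (SELECT index_name
                 FROM user_indexes
                WHERE table_name = UPPER(p_table_name)
                  AND uniqueness = 'NONUNIQUE') LOOP
        EXECUTE IMMEDIATE 'DROP INDEX ' || ix.index_name;
    END LOOP;
END drop_target_indexes;
/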

Environment: Informatica 8.6.1, Oracle 11g (TOAD and SQL Developer), Teradata, Cognos & Tableau, UNIX, MS Access, MS Excel 2007, Autosys.

Confidential, New Jersey

Sr.ETL Developer

Responsibilities:

  • Interacting with business representatives for requirement analysis and to define business and functional specifications.
  • Designing and building data marts for Disability, Life, Accident, and underwriting divisions.
  • Loading Data to the Interface tables from multiple data sources such as SQL server, Text files and Excel Spreadsheets using SQL Loader, Informatica and ODBC connections.
  • Creating necessary repositories to handle the metadata in the ETL process.
  • Designing the target warehouse using Star Schema.
  • Also involved in tasks such as development, setting up the new environments, and generating Microsoft SQL Server reports (SSIS).
  • Designing and introducing new fact and dimension tables in the existing model and deciding the granularity of the fact tables (a brief sketch follows this list).
  • Installed and configured Informatica Power Center and Server 8.6.
  • Designing different regional data marts and loaded subset of warehouse data.
  • Writing Unix Shell Scripts to load data from different sources.
  • Implementing Aggregate, Filter, Join, and Expression, Lookup, Sequence generator and Update Strategy transformations.
  • Implementing Variables and Parameters in Transformations to calculate billing data in billing Domain.
  • Used Java Transformations to invoke web services and Java Servlets.
  • Tuning performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Simplified the data flow by using a Router transformation to check multiple conditions at the same time.
  • Creating sessions, sequential and concurrent batches for proper execution of mappings using workflow manager.
  • Optimizing Query Performance, Session Performance and Reliability.
  • Designing Database schemas using Erwin Design Tool.
  • Using Informatica Designer designed and developed Source Entities and Target warehouse Entity for Oracle.
  • Versioning the whole process and retiring the old records using the built-in DD_UPDATE, DD_DELETE, and DD_INSERT flags.
  • Developing Mapplets using corresponding Source, Targets and Transformations.
  • Testing all the applications, transporting the data to the target warehouse Oracle tables on the server, scheduling and running the extraction and load processes, and monitoring sessions using Informatica Workflow Manager.
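
A minimal sketch of a new fact table defined at a chosen grain (one row per date, policy, and coverage) with foreign keys to the existing dimensions; all table and column names are hypothetical.

-- Hypothetical monthly-grain claims fact for the disability data mart.
CREATE TABLE fact_claims_monthly
(
    date_key       NUMBER(8)   NOT NULL,   -- YYYYMMDD surrogate from dim_date
    policy_key     NUMBER(10)  NOT NULL,
    coverage_key   NUMBER(10)  NOT NULL,
    claim_count    NUMBER(10),
    paid_amount    NUMBER(14,2),
    CONSTRAINT pk_fact_claims    PRIMARY KEY (date_key, policy_key, coverage_key),
    CONSTRAINT fk_fc_date        FOREIGN KEY (date_key)     REFERENCES dim_date (date_key),
    CONSTRAINT fk_fc_policy      FOREIGN KEY (policy_key)   REFERENCES dim_policy (policy_key),
    CONSTRAINT fk_fc_coverage    FOREIGN KEY (coverage_key) REFERENCES dim_coverage (coverage_key)
);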

Environment: Informatica 8.6/8.1, Oracle 9i (TOAD), MS SQL SERVER 2005, MS SQL Reporting Services (SSIS), MS ACCESS, MS EXCEL 2007(PIVOT tables), Autosys.

Confidential, NJ

ETL Developer

Responsibilities:

  • Analyzed various Schemas for Implementation and Modeled the Data Warehousing Data marts using Star Schema.
  • Created mappings using the Transformations such as the Source qualifier, Aggregator, Expression, lookup, Filter, Router, Rank, Sequence Generator, Update Strategy etc.
  • Extracted data from flat files and an Oracle database, and applied business logic to load it into the central Oracle database.
  • Developed complex mappings and mapplets in Informatica to load the data using different transformations.
  • Created and Monitored Sessions and Batches using Server Manager.
  • Extensively used various performance tuning techniques to improve session performance (e.g., partitioning).
  • Successfully moved the Sessions and Batches from the development to production environment.
  • Extensively used Informatica client tools - Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Server Manager.
  • Generated completion messages and status reports using Workflow manager.
  • Created workflows and worklets for designed mappings.
  • Developed mappings and loaded data on to the relational database. Worked on Dimension as well as Fact tables.
  • Extensively used PL/SQL Procedures/Functions to build business rules.
  • Created User conditions and Filters to improve the report generation.

Environment: ETL - Informatica Power Center 6.2, Oracle 8i, Flat Files, SQL, PL/SQL, UNIX.

Confidential

ETL Developer

Responsibilities:

  • Designed and created a multi-dimensional schema.
  • Installed and Configured Informatica Server and Power Center, Repository and Server Manager.
  • Worked on Informatica tool - Source Analyzer, Data warehousing designer, Mapping Designer and Mapplet, and Transformations for ETL Processes to load operational data into multi-dimensional database.
  • Worked closely with various DW Analyst/Developers working on specific data marts, multiple business units to identify key information that will enhance business decision-making.
  • Responsible for tuning ETL procedures and STAR Schema and SNOWFLAKE Schema to optimize load and query Performance.
  • Created and Monitored Batches and Sessions using Server Manager.
  • Wrote UNIX scripts and PL/SQL scripts for implementing business rules.
  • Used Informatica Server Manager to load the transformed data onto the multi-dimensional schema.
  • Generated Unix Scripts for Data warehouse applications and maintaining batch processing.
  • Developed database triggers and stored procedures in Oracle (a brief sketch follows this list).
  • Involved in creating STAR Schema for OLAP cubes.
  • Responsible for training end users in installing and configuring Informatica, creating sessions and batches, and browsing the repository.
  • Involved in organizing production and documentation of the complete project.
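
A minimal sketch of the kind of Oracle trigger used on the warehouse tables; the table and audit columns are hypothetical.

-- Row-level trigger that stamps audit columns whenever the ETL process
-- inserts or updates a dimension row.
CREATE OR REPLACE TRIGGER trg_dim_product_audit
BEFORE INSERT OR UPDATE ON dim_product
FOR EACH ROW
BEGIN
    IF INSERTING THEN
        :NEW.created_dt := SYSDATE;
    END IF;
    :NEW.updated_dt := SYSDATE;
    :NEW.updated_by := USER;
END;
/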

Environment: Informatica PowerCenter (Repository Manager, Designer, Server Manager), Business Objects 5.1.1, Oracle 8i, SQL Server, Shell Scripting, UNIX.
