
ETL Conversion Track Resume


Richardson, TX

SUMMARY

  • 8 years of extensive experience in Information Technology with special emphasis on design and development of data warehousing using Informatica PowerCenter 9.1/9.0.1/8.6/8.5/8.1.1/7.1.2/7.1.1/7.0/6.2.
  • Over 5 years of working experience in all the phases of the data warehouse life cycle, involving design, development, analysis, and testing across the various areas of data warehousing.
  • Extensively used Informatica client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager/Monitor, Informatica Repository Manager, and Informatica Server Manager.
  • Proficient in Oracle 10g/9i, SQL, PL/SQL, SQL*Plus, SQL Developer, and Toad.
  • Created ETL mappings using Informatica PowerCenter to move data from multiple sources such as XML, SQL Server, flat files, and Oracle into common target areas such as Staging, the Data Warehouse, and Data Marts.
  • Implemented Slowly Changing Dimension methodology for accessing the full history of accounts and transaction information.
  • Extensive experience in developing complex mappings using varied transformations such as Source Qualifier, connected and unconnected Lookups, Router, Filter, Sorter, Expression, Aggregator, Joiner, Union, Update Strategy, and Sequence Generator.
  • Worked with various SQL editors such as TOAD.
  • Good understanding of and exposure to data warehousing techniques such as Star-Schema Modeling, Snowflake Modeling, OLAP, fact tables, and dimension tables.
  • Hands-on experience in tuning mappings, and in identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions.
  • Experience in installing and configuring Oracle BI Applications (Informatica, DAC, OBIEE).
  • Experienced with coordinating cross-functional teams, project management, and presenting technical ideas to diverse groups.
  • Ability to work independently as well as in a team, in a fun-filled and challenging environment.

TECHNICAL SKILLS

Databases: Teradata, Oracle 11g/10g/9i/8i, MS Access 2000, MS SQL Server 7.0, DB2

ETL Tools: Informatica B2B Data Exchange, B2B Data Transformation, PowerCenter 8.x.x/7.1.3/6.2

Data Modeling: Star-Schema Modeling, Snowflake Modeling

Languages: PL/SQL, HTML, DHTML, XML, Java

Tools and Servers: Toad 9.6.1.1, Data Transformation Studio 4.0, MS Access Reports

Other Tools: Microsoft Visual InterDev 6.0/1.0, MS FrontPage 98, SQL*Plus, SQL*Loader, FTP

Operating Systems: Windows 95/98/NT/2000/XP, Linux, UNIX

PROFESSIONAL EXPERIENCE

Confidential, Richardson, TX

ETL Conversion Track

Responsibilities:

  • Analyzed and identified the list of all the tables across the EDW, based on complexity and data volume, to shortlist the eligible ETLs (using the Complexity Calculator document) for conversion from ELT to ETL logic.
  • Analyzed the current code where the ELT logic (Teradata queries) and pushdown optimization are used in the Informatica mappings and workflows.
  • Created the design document, including the high-level design flow diagrams.
  • Implemented the new ETL logic for converting the existing ELT logic; made use of the dynamic and persistent properties of the Lookup transformation wherever applicable for building the Informatica caches.
  • Implemented the native MERGE INTO feature, new in Informatica 9.5.1, for the SCD Type-1 logic for better performance (see the sketch after this list).
  • Parameterized all the session-level relational and loader connections as well as the file names and file paths; defined the entries for these in the job control table, which are used by the script that invokes the Informatica workflows.
  • Enabled concurrent execution (only with the unique instance name property at the workflow level) for the reporting workflows to load the reporting tables concurrently into two different databases (PROD1 and PROD2).
  • The scripts that invoke the Informatica workflows are defined in the $U (scheduling tool) jobs.
  • Extended support to QA testing, created the deployment documents, and ensured successful code deployment in the Prod environment.
  • Extended QA and PROD support for the daily cycles to fix issues, if any.
  • Identified bugs in the existing logic during analysis, coordinated with the SME and Operations teams, and raised CRs accordingly in the defect tracking tool.
  • Tracked all the issues encountered in HP Quality Center for each phase/release, and identified the lessons learned to improve the code quality in the next phases/releases.
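
A minimal sketch of the MERGE-based SCD Type-1 load referenced above, in Teradata-style SQL; the table and column names are hypothetical, not the project's actual objects:

    -- Hedged sketch: hypothetical object names.
    -- SCD Type-1: overwrite changed attributes in place, insert new keys.
    MERGE INTO edw.customer_dim tgt
    USING stg.customer_stg src
      ON (tgt.customer_id = src.customer_id)
    WHEN MATCHED THEN UPDATE SET
         customer_name = src.customer_name,
         region_code   = src.region_code,
         update_ts     = CURRENT_TIMESTAMP
    WHEN NOT MATCHED THEN INSERT
         (customer_id, customer_name, region_code, insert_ts, update_ts)
    VALUES
         (src.customer_id, src.customer_name, src.region_code,
          CURRENT_TIMESTAMP, CURRENT_TIMESTAMP);

When pushed into the database, a single MERGE replaces separate update and insert passes over the target, which is where the performance gain comes from.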

Environment: Informatica PowerCenter v9.5.1, Oracle 11g, Teradata SQL Assistant 14.0.1, SQL, PL/SQL, Microsoft Visio, Dollar U, Toad, UNIX, Perl, HP Quality Center, PVCS, Kintana deployment tool.

Confidential, Pleasanton, CA

Claims Data Warehouse

Responsibilities:

  • Analyzed the requirements and converted the specifications into the technical design documentation.
  • Created the data flow diagrams for the high-level design; performed source data analysis and identified the lineage of the data elements (as part of the detailed design) to load the staging and reporting tables.
  • Worked on the Change Requests for the scheduled release time frames.
  • Involved in the design decision reviews for the implementation of the ETL process for the CA region from different versions of the Xcelys (Oracle DB) and GA Tapestry: Clarity (Teradata DB) source systems.
  • Implemented regionalization logic in the same ETL code base even though the business rules differ for ROC (regions other than CA) versus CA regions. Created and used a persistent lookup cache on a table (XL CDW RGN IDENTIFIER) that needs to be looked up across the various sessions loading all the staging tables, to determine the REGION CODE (NC/SC) based on the SOURCE SYS CODE (CA/ROC); see the lookup sketch after this list.
  • Implemented conditional logic at the workflow level, based on the business rules, to look up the persistent cache when processing the multiple cycles per day in an incremental fashion, and to process the next day's cycle by dropping and recreating the persistent cache. (The table is truncated before starting the next day's first cycle.)
  • Responsible for handling all the financial data extracts (General Ledger, Accounts Payable, and so on) from source to stage and stage to reporting tables.
  • Implemented changes to fix the workflows per the CRs, e.g.: (a) converting from regionalization logic to shared logic (e.g., certain finance VENDOR tables); (b) converting from Type 1 to Type 2, and then applying a historical fix to the existing data using either SQL scripts or one-time mappings (ETL).
  • For example, while the source system has the vendor data for the CO and HI regions, the CDW has only the CO region.
  • Identified and fixed the lookup transformations in certain mappings to process only the incremental dataset, achieving better performance and reducing the workflow run time.
  • Upgraded and maintained the consistency of the CDW table structures and the ETL code for every version and release upgrade to the tables on the Xcelys source system side, and released to Prod on the planned dates.
  • Created Informatica deployment groups and Service Requests (Remedy tool) to migrate the code to the QA/UAT/PROD environments.
  • Created the parameter file based on the configurations (Regional Subject Area, Line Nbr, Param Line) set in the ETL SESSION PARAMETER table, and defined all parameters at the workflow level.
  • Parameterized the mappings across the regions for loading the stage, dim, and fact tables.
  • Enabled concurrent execution of the workflows to allow parallel execution of the same set of sessions pointing to different parameter files, to run cycles for each of the regions.
  • Worked with reusable transformations, mapplets, mappings, sessions, worklets, and workflows.
  • Created the unit test case document with the test results and screenshots.
  • Managed the cycle loads in the QA and UAT environments on a daily basis to ensure the data reconciles for counts and amounts, across the source-to-staging and staging-to-reporting levels, for all the claims, finance, and shared (across all regions) tables.
  • Worked on fixing the emergency RFCs by identifying the issues and testing before promoting to Prod.
  • Worked with the HP Quality Center tool to identify the assigned defects and update the progress.
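
A minimal sketch of the lookup query that the persistent cache described above would be built from; the underscored object names are an assumption (the resume writes them with spaces), and the ports are illustrative:

    -- Hedged sketch: assumes underscore-separated names for the table and columns.
    -- Lookup source query; the persistent cache maps SOURCE_SYS_CODE to REGION_CODE.
    SELECT SOURCE_SYS_CODE,   -- lookup condition port (CA / ROC)
           REGION_CODE        -- return port (NC / SC)
    FROM   XL_CDW_RGN_IDENTIFIER;

Because the cache is persistent, each staging session reuses the same cache file instead of re-querying the table.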

Environment: Informatica PowerCenter v8.6.1, Oracle 11g, Teradata SQL Assistant 14.0.1, SQL, PL/SQL, Microsoft Visio, Tivoli, SQL Developer, Remedy tool (for creating SRs).

Confidential, San Ramon, CA

Recovery Management System

Responsibilities:

  • Derived the technical specifications document based on the functional specifications.
  • Worked with B2B Data Transformation to split the COBOL source files (Account Master, Account History, Account Transactions, Customer Master, Score Master), and deployed the projects to the ServiceDB on the server.
  • Used the CM console command to split the original file into multiple files, using the DT service name.
  • Worked with Informatica PowerCenter to load the data through the various stages until it is loaded into the core DB.
  • Loaded the data into delimited flat files to provide the TSYS data to the Financing team.
  • Wrote views in the Teradata database to handle the Change Data Capture mechanism (see the sketch after this list).
  • Used the pushdown optimization technique in Informatica, which enables faster processing of the huge data loads.
  • Used the MLOAD utility for the loader connections to load the COBOL source data into the Teradata database.
  • Used the indirect file loading method to process the data catch-ups for the Recovery Management System.
  • Used parameter files widely at every layer of the data loads (SRC → B2B → STG → WRK → PRE-CORE → CORE), and avoided any hardcoding in the PowerCenter workflows.
  • Performed unit testing in DEV, prepared the migration checklist documentation, and provided QA/UAT and production support.
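
A minimal sketch of the kind of change-data-capture view described above, in Teradata SQL; every object name here is hypothetical:

    -- Hedged sketch: hypothetical object names.
    -- The view exposes only rows that arrived after the last successful extract,
    -- using a timestamp recorded in an ETL control table.
    REPLACE VIEW rms_vw.account_txn_cdc AS
    SELECT t.*
    FROM   rms_db.account_txn t
    WHERE  t.load_ts > (SELECT c.last_extract_ts
                        FROM   rms_db.etl_control c
                        WHERE  c.table_name = 'ACCOUNT_TXN');

Downstream PowerCenter sessions can then read the view like an ordinary source, with the CDC filtering handled in the database.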

Environment: B2B Data Transformation Studio 8.6.2, Informatica PowerCenter v8.6.1, Teradata SQL Assistant 12, Windows 2000, ANSI SQL, UNIX scripting, PVCS

Confidential, Palo Alto, CA

Everest BI

Responsibilities:

  • Involved in the design and development of Keystone Phase III to extract the data from the SFDC source system into the EDW database and then into the Sales BI and Marketing data marts.
  • Developed the mapping documents and the ETL logic for loading into the global data warehouse.
  • Worked with the Salesforce UI, Apex Explorer, and Apex Data Loader.
  • Created Informatica mappings to populate the data into dimensions and facts using various transformations.
  • Handled the historical data loads into the Type 2 dimensions and facts, with the Type 1 dimensions and facts as the sources.
  • Handled the logical deletes in the Type 1 and Type 2 dimension and fact tables.
  • Implemented restartability in the Type 2 ETL logic by including a PROCESS FLAG attribute in the Type 1 dim/fact table that is updated to 'Y' whenever the record is processed, and by choosing source-based commit at the session level. This attribute is reset to null in the Type 1 mapping whenever a source row passes through the UPDATE flow (see the sketch after this list).
  • Avoided hardcoding at the session/workflow level; parameterized and defined the variables in a database table so that minimal or no changes are needed while promoting the code across environments such as QA/UAT/Production. An Informatica mapping then uses these values to generate the parameter file.
  • Worked on the condensation of the historical data in the Siebel system (data not handled by the conversion process into the salesforce.com system): identified the proposed set of critical Siebel columns, and then migrated them to the EDW database directly.
  • Documented the Informatica ETL design and mappings, and prepared the unit test case documents.
  • Created and monitored workflow tasks and sessions using Informatica PowerCenter.
  • Used workflow tasks such as Session, Email, Command, Decision, and Event-Wait.
  • Identified the tracks where the performance was poor and worked to tune such mappings and workflows.
  • Used the Informatica scheduler to set the dependencies between the jobs.
  • Involved in creating UNIX shell scripts for tasks such as FTP and moving and archiving files.
  • Created UNIX shell scripts to call Informatica workflows using the pmcmd command.
  • Maintained an exclusive folder called Workflow Schedule in the Informatica repository for scheduling the dependencies of the workflows, and invoked each of the workflows from the command tasks.
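
A minimal sketch of the PROCESS FLAG bookkeeping behind the restartability described above; the table and column names are hypothetical, and on the project this logic lived inside the Informatica mappings rather than in standalone SQL:

    -- Hedged sketch: hypothetical object names.
    -- The Type 2 flow reads only Type 1 rows not yet processed, so a restart
    -- naturally resumes where the failed run left off:
    SELECT *
    FROM   edw.account_dim_t1
    WHERE  PROCESS_FLAG IS NULL;

    -- Once a row reaches the Type 2 target, it is marked as processed:
    UPDATE edw.account_dim_t1
    SET    PROCESS_FLAG = 'Y'
    WHERE  account_key = :processed_key;

    -- The Type 1 mapping resets the flag whenever a source row updates the
    -- record, making it eligible for Type 2 processing again:
    UPDATE edw.account_dim_t1
    SET    PROCESS_FLAG = NULL
    WHERE  account_key = :updated_key;

Combined with source-based commit, a restarted session does not reprocess rows already flagged 'Y'.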

Environment: Informatica PowerCenter v8.6.1, Salesforce.com UI, Apex Explorer, Apex Data Loader, Oracle 10g, Erwin, PVCS, Windows 2000, Oracle PL/SQL, SQL Server 2000, UNIX scripting

Confidential, San Jose, CA

Deep Application Vulnerability

Responsibilities:

  • Involved in the analysis and design of the MRS system, which includes various metrics such as SHIPMENTS (SHP), INSTALLBASE (IB), ONTIME DELIVERY (OTI), and PROBLEM REPORTS (NPR).
  • Involved in the design and development for the CUSTOMER project on the OTI metric.
  • Involved in the production support work and responsible for running the monthly loads for all the MRS and Customer metrics (Shipments, Install Base, Returns, On-time Delivery, Number of Problem Reports, Software Problem Reports, Software Fix Quality, Cumulative Software Fix, Fixed Response Time, Outages), and for generating the monthly, quarterly, release-specific, and release-specific quarterly metrics for the customers.
  • Implemented Slowly Changing Dimension methodology for accessing the full history.
  • Performed performance tuning for high-volume data load mappings using SQL overrides in Lookups as well as source filters in Source Qualifiers (see the sketch after this list).
  • Involved in the requirements gathering; verified thoroughly for any miscalculations and loopholes from the business functionality point of view, and recommended the better approach to the client.
  • Worked extensively with the $U scheduling tool to set up the dependencies of the Informatica workflows and the UNIX shell scripts that send the e-mails after processing the ETLs.
  • Involved in the unit testing; supported the QA and UAT testing.
  • Responsible for deploying the code to production, and then supported the post-production issues.
  • Worked on designing the standard error handling in the projects for effectively reporting the IT or business errors on the abnormalities, respectively, with auto-generated e-mails.
  • Worked extensively on different types of transformations such as Lookup (connected and unconnected), Router, Expression, Filter, Aggregator, Update Strategy, Source Qualifier, and Joiner.
  • Worked extensively on worklets, mapplets, and tasks such as Link, Email, and Command; pre-SQL and post-SQL statements at the mapping level; and pre-session and post-session commands at the workflow level, while developing the mappings and the workflows.
  • Implemented HTML tags in the UNIX shell scripting while sending mail from the script that is in turn invoked from the Dollar-U tailer task.
  • Developed the technical documents for solution design, code migration, naming conventions, and the source-to-target mapping documents, for the various subject areas I worked on.
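
A minimal sketch of the kind of Lookup SQL override used in the tuning described above, in Oracle-style SQL; the object names and the filter are hypothetical:

    -- Hedged sketch: hypothetical object names and filter.
    -- Restricting the lookup source query shrinks the lookup cache, so the
    -- session builds and scans far less data.
    SELECT ord_id,                                   -- lookup condition port
           ship_dt                                   -- return port
    FROM   mrs.shipment_fact
    WHERE  load_month = TO_CHAR(SYSDATE, 'YYYYMM');  -- cache only the current month

The same idea applies to the Source Qualifier: pushing a source filter into the generated SQL keeps unneeded rows from ever entering the pipeline.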

Environment: Informatica PowerCenter v8.1.1, Oracle 9i, System Architect, PVCS, Dollar-U, Windows 2000, Oracle PL/SQL, SQL*Loader, UNIX scripting
