
Sr. ETL/Informatica Lead Resume


San Francisco, CA

SUMMARY

  • Over 8 years of IT experience in data warehousing development.
  • Involved in all phases of the data warehouse life cycle, including design, development, analysis, and testing using ETL, data modeling, and Online Analytical Processing.
  • In-depth knowledge of data warehousing ETL using Informatica Power Center 8.6/8.1/7.1 (Informatica Designer, Repository Administration Console, Repository Manager, Workflow Manager, Workflow Monitor) on diverse databases such as Oracle, SQL Server, and DB2
  • Hands-on experience with transformations (Source Qualifier, Aggregator, Filter, Router, Update Strategy, Rank, Joiner, Expression, Sequence Generator, Normalizer, Stored Procedure, Lookup, Sorter, and Union)
  • Extracted data from various sources such as relational sources (Oracle, SQL Server, DB2, and MS Access) and file formats (flat files, .csv)
  • Strong knowledge of Informatica Workflow Manager for creating and scheduling workflows and worklets.
  • Implemented performance tuning at the Source, Confidential, Mapping, Session, and System levels
  • Strong experience in coding with SQL, PL/SQL, procedures/functions, triggers, and packages.
  • Expertise in loading data into Oracle using SQL*Loader.
  • Experienced in the use of agile approaches, including Extreme Programming (XP), Test-Driven Development, Scrum, Lean Analysis, mixed methods, RAD, and RUP.
  • Knowledge of Salesforce.com and cloud computing
  • Involved in Informatica installation on UNIX and Windows operating systems
  • Highly skilled in various operating systems such as UNIX (Sun Solaris, HP-UX), Linux (Red Hat 7.1/2.1 AS), AIX, Windows XP/NT/2000/98, and MS-DOS.
  • Experience in writing UNIX shell scripts
  • Strong knowledge of the entire Software Development Life Cycle (SDLC), including waterfall and Agile

TECHNICAL SKILLS

ETL Tools: Informatica 7.1/8.1.1/8.6/9.0.1, IDQ 8.6.1, SQL*Loader

Databases: Oracle 10g/9i/8.x, SQL Server 2005/2008, Teradata, DB2, MS Access 2000/97

Data Modeler: ERWIN 4.0/3.x, Oracle Designer

Languages: C, C++, Java, Shell Scripting, SQL, PL/SQL, XML, HTML, SOQL

Cloud Computing: Salesforce, BigMachines

Operating Systems: Sun Solaris 9/8/2.7, HP-UX, AIX, Red Hat Linux 2.1 AS/7.1, Windows XP/NT/2000/98

Scheduling Tools: Tidal, Autosys, Crontab, and Control-M

PROFESSIONAL EXPERIENCE

Confidential, San Francisco, CA

Sr. ETL/Informatica Lead

Responsibilities:

  • Involved in gathering and analyzing the client's requirements for CPQ (Configure Price Quote) system automation integrated with BigMachines.
  • Involved in designing the ETL process for loading data to BigMachines and writing the technical design document.
  • Extensively worked on designing the data model required for staging and loading the data to BigMachines.
  • Extensively worked on developing the ETL process using Informatica, developing mappings to extract data from PIM, transform it, and prepare the load files for BigMachines.
  • Extensively used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager, and Informatica Workflow Manager. Used Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows.
  • Created ETL mappings using Informatica Power Center to move data from multiple sources such as XML, DB2, Teradata, MS SQL Server, flat files, and Oracle into a common Confidential area such as data marts and the data warehouse.
  • Estimated and planned development work using Agile software development.
  • Identified and eliminated duplicates in datasets through the IDQ 8.6.1 components Edit Distance, Jaro Distance, and Mixed Field Matcher. This enabled the creation of a single view of customers and helped control mailing-list costs by preventing duplicate mail pieces.
  • Extensively worked with flat files in preparing the load files for BigMachines.
  • Worked on the deployment process for generating XML files.
  • Extensively worked on shell scripts for editing the load files, FTPing them to BigMachines, and archiving them to a designated location.
  • Automated the loading process to BigMachines using shell scripts invoked from Informatica sessions.
  • Created, launched & scheduled Workflows/sessions.
  • Extensively worked on Salesforce, migrating data from Siebel to Salesforce.
  • Involved in analyzing, defining, and documenting data requirements by interacting with the client and Salesforce team for the Salesforce objects.
  • Developed Data Design Document and Mapping Document to present to client for the Informatica process for the Salesforce objects.
  • Created and edited custom objects and custom fields in Salesforce and checked the field level Securities.
  • Uploaded data from operational source system (Oracle 8i) to Teradata.
  • Extracted data from various source systems into the Landing Zone area by creating Informatica mappings using Teradata FastLoad connections.
  • Enabled bulk data load for loading into Salesforce and monitored the jobs in the Salesforce administration setup.
  • Worked on Teradata utilities such as FLOAD, MLOAD, and FEXP, and created batch jobs using BTEQ.
  • Good knowledge of SOQL for validating data in Salesforce.
  • Imported Metadata from Teradata tables.
  • Informatica Data Quality (IDQ 8.6.1) is the tool used here for data quality measurement.
  • Involved in the Account, Contact, Training and Demo data migration from Siebel to Salesforce.
  • Gained good knowledge of Customer Relationship Management and the quote approval process in SFDC.
  • Used Informatica Repository Manager to back up and migrate metadata across development, test, and production systems.
  • Automated job processing and established automatic email notification to the concerned person.

Environment: Informatica Power Center 9.0.1, IDQ 8.6.1, Oracle 10g, BigMachines, Tidal 5.3.1, Teradata 13.1, Salesforce, SOQL, UNIX (Solaris), Windows NT/2000, TOAD, Force Developer
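The shell-scripted load-file handling described above (prepare, FTP to BigMachines, archive) could be sketched roughly as follows. The directory names, file pattern, and host are illustrative placeholders, not the actual project values:

```shell
#!/bin/sh
# Hypothetical post-load step: stage a load file, push it out, and archive it
# with a timestamp. All paths and names below are illustrative.
LOAD_DIR=./bm_out
ARCHIVE_DIR=./bm_archive
STAMP=$(date +%Y%m%d_%H%M%S)

mkdir -p "$LOAD_DIR" "$ARCHIVE_DIR"
: > "$LOAD_DIR/quotes_extract.csv"   # stand-in for a real Informatica load file

for f in "$LOAD_DIR"/*.csv; do
  [ -e "$f" ] || continue            # nothing staged for this run
  # In the real job the file would be FTPed to the BigMachines host here, e.g.:
  #   ftp -n bigmachines-host < put_commands.txt
  mv "$f" "$ARCHIVE_DIR/$(basename "$f" .csv)_$STAMP.csv"
done
```

Driving a script like this from an Informatica post-session command keeps the archive step tied to a successful workflow run.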

Confidential, Charlotte, NC

Sr. ETL/Informatica Lead

Responsibilities:

  • Coordinated with business users to understand business needs and developed the technical design document covering all the business requirements.
  • Involved in requirement definition and analysis in support of the data warehouse, and analysis of the existing system.
  • Good understanding of data modeling techniques such as star schema and snowflake schema.
  • Developed ETL mappings, transformations using Informatica Power Center 8.6
  • Extensively used Informatica client tools - Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager.
  • Designed and developed transformation rules (business rules) to generate consolidated data using Informatica ETL (Power center) tool.
  • Maintained stored definitions, transformation rules and targets definitions using Informatica repository Manager.
  • Implemented Tidal Scheduler for scheduling of Informatica workflows.
  • Developed data Mappings between source systems and warehouse components using Mapping Designer.
  • Worked extensively on different types of transformations such as Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Lookup, Sorter, Normalizer, Union, Stored Procedure, and Sequence Generator.
  • Created jobs for automation of the Informatica workflows and for DOS copying and moving of files using Tidal Scheduler.
  • Wrote PL/SQL stored procedures to implement complex logic.
  • Created, launched & scheduled Workflows/sessions.
  • Involved in performance tuning of mappings and sessions; scheduled the batches using UNIX.
  • Developed UNIX shell scripts to run the workflows from the backend using the pmcmd command.
  • Used Informatica Repository Manager to back up and migrate metadata across development, test, and production systems.
  • Automated job processing and established automatic email notification to the concerned persons.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
  • Supported monthly and weekly loads into the DW using Informatica.
  • Good knowledge of Informatica architecture.

Environment: Informatica Power Center 8.6.1, Oracle 10g, PL/SQL, MS SQL Server 2008, SQL*Loader, MS Access, UNIX (Solaris), Windows NT/2000, TOAD, Tidal 5.3.1, Erwin, Business Objects
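A minimal sketch of the kind of backend trigger script described above. The service, domain, user, folder, and workflow names are assumptions for illustration; `pmcmd startworkflow` with `-sv/-d/-u/-p/-f` is the standard PowerCenter command-line form:

```shell
#!/bin/sh
# Hedged sketch of running a workflow from the backend with pmcmd.
# Service, domain, user, folder, and workflow names are placeholders.
INFA_SVC=${INFA_SVC:-IS_DEV}
INFA_DOMAIN=${INFA_DOMAIN:-Domain_Dev}
INFA_USER=${INFA_USER:-etl_batch}
FOLDER=${FOLDER:-DW_LOADS}
WORKFLOW=${1:-wf_daily_load}

# Build the command; -wait blocks until the workflow completes so the exit
# status can gate downstream steps.
CMD="pmcmd startworkflow -sv $INFA_SVC -d $INFA_DOMAIN -u $INFA_USER -p \$INFA_PWD -f $FOLDER -wait $WORKFLOW"
echo "$CMD"
# In the scheduled job the command would run directly:
#   $CMD || { echo "$WORKFLOW failed"; exit 1; }
```

Checking the exit status of the blocking call is what lets a scheduler chain dependent loads safely.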

Confidential, Minneapolis, MN

ETL/Informatica Developer

Responsibilities:

  • Involved in various stages of data warehouse design, mainly dimensional modeling, physical design, performance tuning, aggregation, indexing, and partitioning.
  • Designed, developed, implemented, and maintained Informatica PowerCenter and IDQ 8.6.1 applications for the matching and merging process.
  • Understood the business rules completely and implemented data transformation methodologies.
  • Involved in designing the ETL specification documents (mapping documents).
  • Designed mappings with SCD Type 2 to keep track of historical data.
  • Performed the ETL process on data stored in the OLTP DB and flat files to DB2 for the ODS and EDW using Power Center 8.6.
  • Identified bottlenecks at the mapping and session levels to improve load performance.
  • Used mapplets and reusable transformations to prevent redundancy of transformation usage and improve maintainability.
  • Involved in tuning the mappings and SQL queries for maximum efficiency.
  • Automated worklet and session schedules using UNIX shell scripts.
  • Wrote pre-session shell scripts to check session mode (enable/disable) before running/scheduling batches.
  • Scheduled the session tasks comprising different mappings for data conversion and extraction in order to load data into the Confidential database.
  • Migrated mappings, sessions, and workflows from development to testing and then to production environments.
  • Utilized Informatica IDQ 8.6.1 to complete initial data profiling and matching/removal of duplicate data.
  • Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, and the Confidential-based commit interval.
  • Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the Confidential database.

Environment: Informatica Power Center 8.6, Oracle 10g, IDQ 8.6.1, PL/SQL, MS SQL Server 2008, SQL*Loader, MS Access, UNIX (Solaris), Windows NT/2000, TOAD, OBIEE, Jira
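The pre-session enable/disable check mentioned above might look something like this sketch; the flag-file location, name, and format are assumptions, not the actual project convention:

```shell
#!/bin/sh
# Hypothetical pre-session command: proceed only when the session's flag
# file says ENABLED. Flag path and format are illustrative.
FLAG_FILE=./session_flags/s_m_load_ods.flag
mkdir -p "$(dirname "$FLAG_FILE")"
[ -f "$FLAG_FILE" ] || echo ENABLED > "$FLAG_FILE"   # default for this sketch

case "$(cat "$FLAG_FILE")" in
  ENABLED) RC=0; echo "session enabled - proceeding" ;;
  *)       RC=1; echo "session disabled - skipping load" ;;
esac
# A real pre-session script would end with `exit $RC`; a non-zero status
# makes the Informatica session fail fast instead of loading.
```

This lets operators switch individual loads off for a cycle by editing one flag file, without touching the workflow definition.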

Confidential, Mayfield, OH

ETL/Informatica Developer

Responsibilities:

  • Coordinated with business users to understand business needs and developed the technical design document covering all the business requirements.
  • Developed various mappings & mapplets to load data using different transformations.
  • Used Informatica as an ETL tool to extract data from source systems, aggregate the data, and load it into the database.
  • Implemented complex logic using PL/SQL code.
  • Developed reusable mapplets and transformations using Informatica Designer.
  • Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, Joiner, and Rank transformations.
  • Involved in developing SQL and PL/SQL code through various procedures, functions, triggers, cursors, and packages to implement the business logic of the database in Oracle.
  • Experience in migrating data using PL/SQL procedures.
  • Good hands on experience with advanced concepts like hints, partition exchange concepts and tuning queries.
  • Developed UNIX shell scripts to run the workflows from the backend using PMCMD prompt.
  • Expertise in using SQL*Loader for loading external data into the Oracle database.
  • Used PVCS as version control tool.
  • Co-ordinated with Informatica admin group for deployment of code from Dev to Test to Prod.
  • Coordinated with business analysts to understand the requirements and for clarifications on requirements.
  • Good experience working on huge databases with terabytes of data.
  • Supported for monthly and weekly loads into DW using Informatica.
  • Good knowledge on Informatica architecture.

Environment: Informatica Power Center 8.6, Oracle 10g, PL/SQL, MS SQL Server 2000, SQL*Loader, MS Access, UNIX (Solaris), Windows NT/2000, TOAD, Erwin, Business Objects
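As a rough illustration of the SQL*Loader usage above, a control file can be generated and handed to sqlldr along these lines; the staging table, columns, and data file are hypothetical:

```shell
#!/bin/sh
# Sketch of loading an external flat file into Oracle with SQL*Loader.
# Table, column, and file names are made up for illustration.
cat > customers.ctl <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(cust_id, cust_name, cust_city)
EOF
# Real invocation (needs an Oracle client and credentials):
#   sqlldr userid="$ORA_USER/$ORA_PWD" control=customers.ctl log=customers.log
```

APPEND versus TRUNCATE/REPLACE in the control file decides whether each run adds to or resets the staging table.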

Confidential, Scotts Valley, CA

Informatica Developer

Responsibilities:

  • Using Informatica Designer, designed and developed source entities for Oracle and SQL Server.
  • Involved in Business/Data analyses and gathering the requirements from different user levels of the organization.
  • Populated the Staging tables with various Sources like Flat files (Fixed Width and Delimited), Relational Tables (SQL Server, and Oracle), MS Access.
  • Field level validations like Data Cleansing and Data Scrubbing were applied on various Data Sources.
  • Good knowledge of shell scripting and various UNIX commands such as grep, awk, and sed.
  • Expertise in using SQL*Loader for loading external data into the Oracle database.
  • Extensively used Informatica PowerCenter Designer tools such as Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
  • Created Mappings using Mapping Designer to transform data extracted from various sources using Transformations like Aggregator, Expression, Stored Procedure, Filter, Joiner, Lookup, Sequence Generator, Source Qualifier, and Update Strategy transformations.
  • Worked in creating mapplet for most re-usable logic for error handling.
  • Implemented Variables and Parameters in the mapping and parameterized the mappings.
  • Used Debugger to validate mappings and also to obtain troubleshooting information about data by inserting Breakpoints.
  • Used Informatica Workflow Manager to test the application and transport the data from Source tables to Confidential tables and also to Schedule, Run Extraction, Load Process, and Monitored Sessions in Workflow Monitor.
  • Extensively used Workflow Manager for maintaining Sessions by performing tasks such as monitoring, editing, scheduling, copying, aborting, and deleting.
  • Used pmcmd to run workflows at the command prompt, created cron jobs to automate scheduling of sessions, and wrote UNIX shell scripts to create pre- and post-session commands.
  • Wrote SQL Queries, PL/SQL Procedures, Functions, and Triggers for implementing business logic and for validating the data loaded into the Confidential tables using query tool TOAD.

Environment: Informatica Power Center 8.1.1, Oracle 10g, SQL*Loader, PL/SQL, MS SQL Server 2000, MS Access, UNIX Shell Scripts, UNIX (Solaris), Windows NT/2000, TOAD 7.0, Erwin, Cognos Series 7, DB2
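The cron-based scheduling described above amounts to a crontab entry along these lines; the schedule, wrapper-script path, workflow name, and log path are all placeholders:

```shell
#!/bin/sh
# Illustrative cron entry for a nightly session run: 01:30 every day,
# appending output to a log. Path and names are assumptions.
CRON_LINE='30 1 * * * /opt/etl/scripts/run_wf.sh wf_nightly_extract >> /var/log/etl/nightly.log 2>&1'
echo "$CRON_LINE"
# Would be installed with something like:
#   (crontab -l; echo "$CRON_LINE") | crontab -
```

Redirecting both stdout and stderr into one log keeps pre/post-session script output reviewable after unattended runs.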

Confidential, MA

ETL Developer

Responsibilities:

  • Involved in Business/Data analyses and gathering the requirements from different user levels of the organization.
  • Designed Slowly Changing Dimension Strategy for the Data Warehouse.
  • Using Informatica Designer designed and developed Source Entities for Oracle, SQL Server and Confidential warehouse Entity for Oracle.
  • Populated the Staging tables with various Sources like Flat files (Fixed Width and Delimited), Relational Tables (SQL Server, and Oracle), MS Access.
  • Performed various validations like Data Cleansing and Data Scrubbing on various Data Sources.
  • Expertise in writing SQL and PL/SQL queries for validating various business logic.
  • Implemented the Recovery option while loading data into Confidential database.
  • Expertise in doing weekly, monthly or daily aggregations depending on the client requirement.
  • Expertise in using Stored procedure transformation in the logics to reduce traffic in the mappings.
  • Worked with business users to understand the requirements and in UAT.
  • Worked in offshore and onsite model for distribution of work among the team.
  • Extensively used Informatica PowerCenter Designer tools such as Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
  • Created mappings using Mapping Designer to transform data extracted from various sources using transformations such as Aggregator, Expression, Stored Procedure, External Procedure, Filter, Joiner, Lookup, Sequence Generator, Source Qualifier, and Update Strategy.
  • Created Type II Slowly Changing Dimensions using Lookup and Update Strategy transformations.
  • Used Debugger to validate mappings and also to obtain troubleshooting information about data by inserting Breakpoints.
  • Used Informatica Workflow Manager to test the application and transport the data from Source tables to Warehouse tables and also to Schedule, Run Extraction, Load Process, and Monitored Sessions in Workflow Monitor.

Environment: Informatica Power Center 8.1, Oracle 9i, SQL*Loader, PL/SQL, SQL Trace, MS SQL Server 2000, MS Access, UNIX Shell Scripts, UNIX (Solaris 5.8), Windows NT/2000, TOAD 7.0, Cognos

Confidential, Columbus, OH

Informatica Developer

Responsibilities:

  • The data migration included identifying various databases where the information/data lay scattered, understanding the complex business rules that need to be implemented and planning the data transformation methodology.
  • Worked on complete SDLC from Extraction, Transformation and Loading of data using Informatica.
  • Involved in the analysis, design, and development of all the interfaces using Informatica PowerMart tools on the interface team, and interfaced with all the other tracks for business-related issues.
  • Performed regression testing of Informatica mappings & workflows after an upgrade from Power Center v7.1 to v8.1.
  • Created complicated formulae and running totals for business measure calculations.
  • Responsible for documenting the data extraction process into a reporting database.
  • Provided support for reporting and backup support for data, and developed ETL transformations using Informatica PowerMart.
  • Involved in developing SQL and PL/SQL codes through various procedures, functions, and packages to implement the business logics of database in Oracle.
  • Extensive experience in implementing data cleanup procedures, transformations, and stored procedures, and executing test plans for loading the data successfully into the targets.
  • Used Lookup Transformation to access data from tables, which are not the source for mapping and also used Unconnected Lookup to improve performance.
  • Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Sequence generator.
  • Configured the mappings to handle the updates to preserve the existing records using Update Strategy Transformation.
  • Defined Confidential Load Order Plan for loading data correctly into different Confidential Tables.

Environment: Informatica 7.1.3, DB2, Sybase, XML, Flat files, Oracle 9i/8i, Unix, PL/SQL, Windows NT, Brio 6.0
