
Senior ETL Informatica Developer Resume


Minneapolis, MN

SUMMARY

  • Over 8 years of solid understanding and expertise in data warehouse development, testing, deployment, maintenance, and production support of data integration using Informatica PowerCenter 10.1, IDQ, and DWBI ETL.
  • Expertise in the Extraction, Transformation and Loading (ETL) process and in dimensional data modeling: Star schema/Snowflake modeling, Fact and Dimension tables, multidimensional modeling, and de-normalization techniques. Thorough understanding of the Kimball and Inmon methodologies.
  • Extensively used Informatica PowerCenter (10.1/9.x/8.x) for ETL (Extraction, Transformation and Loading) of data from multiple source database systems to Data Warehouses in UNIX/Windows environment.
  • Solid experience in Developing/Optimizing/Tuning mappings using Informatica.
  • Extensively worked with Informatica, IDQ and MDM ETL Tool.
  • Worked on various transformation types, including Lookup, Update Strategy, Stored Procedure, Joiner, Filter, Aggregator, Rank, Router, Normalizer, Sorter, External Procedure, Sequence Generator, and Source Qualifier.
  • Expertise using Debugger, Mapping wizards, Workflow Manager and Workflow Monitor.
  • Sound knowledge in creating Tasks, Sessions, Worklets and Workflows in the Worklet/Workflow Designer.
  • Created reusable Transformations, Mapplets, Sessions and Worklets, and used the shared folder concept with shortcuts wherever possible to avoid redundancy.
  • Expertise in design and implementation of Slowly Changing Dimensions (SCD) Type 1 and Type 2.
  • Experience in Informatica mapping specification documentation, tuning mappings to increase performance, proficient in creating and scheduling workflows and expertise in Automation of ETL processes with scheduling tools such as Autosys and Control-M.
  • Extensive experience using tools and technologies such as Oracle 11g/10g/9i/8i, SQL Server, DB2 UDB, Sybase, TERADATA, MS Access, Flat files, SQL Developer, SQL Navigator, SQL*Loader, PL/SQL, Sun Solaris, Erwin, Toad, stored procedures, and triggers.
  • Experienced in the development, modification and automation of the ETL processes using UNIX shell scripting.
  • Strong data modeling experience using Star/Snowflake schemas, re-engineering, dimensional data modeling, Fact & Dimension tables, and physical & logical data modeling.
  • Experience in writing complex sub queries, PL/SQL programs (functions, procedures, packages), stored procedures, and shell scripting to run pre-session and post session commands.
  • Worked with heterogeneous data sources like Oracle, SQL Server 2008/2005, flat files, XML files, DB2 UDB, Main Frames and COBOL files.
  • Performed error handling as part of production support in Informatica as well as UNIX.
  • Experience in installation, configuration, and administration of Informatica PowerCenter 10.1/9.x/8.x, PowerExchange 8.1, and PowerMart 5.x/6.x client and server.
  • Extensive experience in writing SQL scripts to validate the database systems and backend database testing.
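
The Slowly Changing Dimension Type 2 pattern mentioned above can be sketched as an expire-then-insert step. This is a minimal illustration only: the table and column names (dim_customer, stg_customer, curr_flag, etc.) are hypothetical, not taken from any engagement listed here, and in PowerCenter the same logic is normally built with Lookup and Update Strategy transformations rather than hand-written SQL.

```shell
#!/bin/sh
# Minimal SCD Type 2 sketch: close out the current dimension row, then
# insert the new version with an open-ended end date. All object names
# are hypothetical placeholders. The function only prints the SQL so it
# can be reviewed or piped to a database client.

scd2_sql() {
    # $1 = natural key of the changed record
    cat <<EOF
UPDATE dim_customer
   SET end_date = CURRENT_DATE, curr_flag = 'N'
 WHERE customer_id = $1 AND curr_flag = 'Y';

INSERT INTO dim_customer (customer_id, name, eff_date, end_date, curr_flag)
SELECT customer_id, name, CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM stg_customer
 WHERE customer_id = $1;
EOF
}
```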

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter (10.1/9.x/8.x), Informatica Data Quality (IDQ), Master Data Management (MDM)

Data Modeling Tools: Erwin Data Modeler, TOAD Data Modeler, Microsoft Visio

Data Modeling: Dimensional Data Modeling using Star Schema and Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling

RDBMS: Oracle 11g/10g/9i/8i, TERADATA, SQL Server, DB2

Operating System: UNIX, Windows

Scheduling Tools: Control-M, Autosys, BMC

OLAP Tools: Cognos 8.x/7.0, MicroStrategy, OBIEE

PROFESSIONAL EXPERIENCE

Confidential, Minneapolis, MN

Senior ETL Informatica Developer

Responsibilities:

  • Understanding the business rules and sourcing the data from multiple source systems.
  • Created Filewatcher jobs to setup the dependency between Cloud and PowerCenter jobs.
  • Created and developed mappings to load the data from staging tables to EDW DataMart tables based on Source to Staging mapping design document.
  • Development of scripts for loading the data into the base tables in EDW using FastLoad, MultiLoad and BTEQ utilities of TERADATA.
  • Worked on TERADATA SQL, BTEQ, MultiLoad, FastLoad, and FastExport for ad-hoc queries, and built UNIX shell scripts to drive ETL interfaces through BTEQ, FastLoad, and FastExport. Created numerous Volatile, Global, Set, and MultiSet tables. Created batch jobs for FastExport.
  • Created shell scripts for FastExport and FastLoad.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Wrote numerous BTEQ scripts to run complex queries on the TERADATA database. Used volatile table and derived queries for breaking up complex queries into simpler queries.
  • Moved data from TERADATA to Oracle using SQL*Loader.
  • Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
  • Implemented Slowly Changing Dimensions Type 1 and Type 2.
  • Developed mappings between Legacy applications to Facets claim system applications.
  • Extracted, Transformed and Loaded data into Oracle database using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Filter, Router and Expression).
  • Hands On experience creating, converting oracle scripts (SQL, PL/SQL) to TERADATA scripts.
  • Configured rules for the PowerCenter operations team covering file-not-received monitoring, process-not-started alerts, reject records, and long-running jobs.
  • Worked with Informatica support on issues with Proactive Monitoring interface.
  • Optimized and performance-tuned metadata resources to achieve faster response times.
  • Created views in Oracle and SQL Server to support data assurance analysis.
  • Participated in Code reviews with SME’s.
  • Designed, Developed, Deployed and implemented ETL mappings using Informatica.
  • Prepared technical design/specifications for data Extraction, Transformation and Loading.
  • Analyzed the business systems, gathered requirements from the users and documented business needs for decision support data.
  • Developed Informatica mappings to extract data, stage it in Oracle, and populate the warehouse.
  • Supported very complex mainframe and Informatica applications effectively.
  • Wrote Shell Scripts for event automation and scheduling.
  • Created UNIX scripts to automate activities such as starting, stopping, and aborting Informatica workflows using the pmcmd command.
  • Migrated workflows, mappings, and repository objects from development to QA to production.
  • Used various Informatica error handling techniques to debug failed sessions.
  • Provided production support including error handling and validation of mappings.
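
The pmcmd automation described above can be sketched as a small wrapper script. This is an illustrative sketch only: the domain, service, and account names are placeholder values, not configuration from this project, and by default the function prints the command rather than running it.

```shell
#!/bin/sh
# Hedged sketch of a pmcmd wrapper for starting/stopping/aborting
# Informatica workflows. The INFA_* defaults are hypothetical
# placeholders. With DRY_RUN=1 (the default) the command is printed,
# not executed; pmcmd reads the password from the INFA_PASSWD
# environment variable via -uv, so it never appears on the command line.

DRY_RUN=${DRY_RUN:-1}
INFA_SERVICE="${INFA_SERVICE:-IS_PROD}"    # hypothetical Integration Service
INFA_DOMAIN="${INFA_DOMAIN:-Domain_PROD}"  # hypothetical domain name
INFA_USER="${INFA_USER:-etl_ops}"          # hypothetical service account

run_pmcmd() {
    # $1 = startworkflow | stopworkflow | abortworkflow
    # $2 = repository folder, $3 = workflow name
    cmd="pmcmd $1 -sv $INFA_SERVICE -d $INFA_DOMAIN -u $INFA_USER -uv INFA_PASSWD -f $2 -wait $3"
    if [ "$DRY_RUN" -eq 1 ]; then
        echo "$cmd"
    else
        $cmd
    fi
}
```

A scheduler such as Control-M or Autosys would then call, for example, `run_pmcmd startworkflow DWH_FOLDER wf_daily_load` (folder and workflow names hypothetical).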

Environment: Informatica PowerCenter 10.1, Data Quality (IDQ), Oracle 11g, TERADATA, BTEQ, FastExport, FastLoad, Metadata, SQL Server 2012, PL/SQL, SQL*Loader, XML, Toad, Unix, Win NT

Confidential, San Jose, CA

Senior Informatica Analyst

Responsibilities:

  • Performed installation, configuration, applying hot fixes, patches and version upgrades of Informatica products and provided on-call support for the ETL applications.
  • Performed the data profiling and analysis making use of Informatica Data Quality (IDQ).
  • Responsible for all aspects of data warehouse operations using Informatica 9.5 / 9.1, Microsoft SQL Server, Oracle and DB2 environments.
  • Worked with the Business analysts and the DBA for requirements gathering, business analysis, testing, and metrics and project coordination.
  • Data Profiling, performing analysis on high volume data to design and build TERADATA and Tableau reports on product, sales, and operational performance of business for users.
  • Developed ETL Informatica mappings to load data into the staging area, extracting from flat files and databases and loading into the Oracle 10g target database.
  • Contributed to Data Migration and the Implementation of Roadmap defined by the business case.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Proficient in importing/exporting large amounts of data from files to TERADATA and vice versa.
  • Developed the DW ETL scripts using BTEQ, Stored Procedures, and Macros in TERADATA.
  • Development of scripts for loading the data into the base tables in EDW using FastLoad, MultiLoad and BTEQ utilities of TERADATA.
  • Created numerous scripts with the TERADATA utilities BTEQ, MultiLoad and FastLoad.
  • Troubleshot errors in ILM jobs with Informatica support.
  • Enforced coding standards, best practices and formalized the code reviews.
  • Developed complex Informatica mappings, mapplets, transformations and work flows.
  • Worked extensively on the performance tuning of ETL workflows and SQL scripts.
  • Provided on-call support to resolve production issues in time and meet the SLA.
  • Developed and re-engineered database objects like Tables, Views, Stored Procedures, Triggers, Functions, Indexes and Constraints in T-SQL and PL/SQL.
  • Involved in migration project to migrate data from data warehouses on Oracle/DB2 to TERADATA.
  • Wrote, tested and implemented TERADATA FastLoad, MultiLoad and BTEQ scripts, DML and DDL.
  • Worked on Star Schemas and slowly changing dimensions.
  • Used the TERADATA utilities FastLoad, MultiLoad, and TPump to load data.
  • Wrote BTEQ scripts to transform data.
  • Installed and administered knowledge modules for BMC Patrol monitoring of Oracle databases and UNIX servers.
  • Designed data warehouse and data marts using Relational and Dimensional data modeling.
  • Created technical specifications using the Informatica Analyst tool and Excel spreadsheets.
  • Supported Business Objects reports with Informatica Data Services and PowerCenter.
  • Provided technical guidance to TWAI application developers and project managers.
  • Experience in managing and protecting personally identifiable information (PII) data.
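
The FastLoad scripting mentioned above can be sketched as a small generator. All identifiers here (server tdprod, database edw_stg, table and column names, the input path) are hypothetical placeholders, and the LOGON line keeps literal $TD_USER/$TD_PASSWD tokens to be substituted before use; the function only prints the control script, which in practice would be piped to the fastload utility.

```shell
#!/bin/sh
# Hedged sketch: emit a TERADATA FastLoad control script for a
# pipe-delimited staging load. Every object name is a placeholder; the
# credential tokens are left literal for later substitution.

make_fastload_script() {
    # $1 = target staging table, $2 = pipe-delimited input file
    cat <<EOF
LOGON tdprod/\$TD_USER,\$TD_PASSWD;
DATABASE edw_stg;
BEGIN LOADING $1 ERRORFILES ${1}_err1, ${1}_err2;
SET RECORD VARTEXT "|";
DEFINE sale_id (VARCHAR(18)), amount (VARCHAR(18))
FILE=$2;
INSERT INTO $1 (sale_id, amount) VALUES (:sale_id, :amount);
END LOADING;
LOGOFF;
EOF
}
```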

Environment: Informatica PowerCenter 9.6, Data Quality (IDQ), TERADATA, Metadata, Oracle 10g/9i, DB2, PL/SQL, Toad, Flat files, BMC, XML, SQL Server 2008, Unix shell scripting, PowerExchange, Control-M.

Confidential, Columbus, OH

Senior ETL Developer

Responsibilities:

  • Analyzed the functional specifications provided by the data architect and created Technical System Design Documents and Source to Target mapping documents.
  • Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.
  • Performed Source System Data Profiling using Informatica Data Explorer (IDE).
  • Involved in designing Staging and Data mart environments and built DDL scripts to reverse engineer the logical/physical data model using Erwin.
  • Extracted data from SAP using PowerExchange and loaded data into SAP systems.
  • Translated the business processes/SAS code into Informatica mappings for building the data mart.
  • Used Informatica PowerCenter to load data from different sources like flat files and Oracle, TERADATA into the Oracle Data Warehouse.
  • Implemented pushdown optimization, pipeline partitioning, and persistent cache for better performance.
  • Developed reusable transformations and Mapplets to use in multiple mappings.
  • Implemented the Slowly Changing Dimensions (SCD) methodology to keep track of historical data.
  • Assisted the QC team in carrying out its QC process of testing the ETL components.
  • Created pre-session and post-session shell scripts and email notifications.
  • Involved in complete cycle from Extraction, Transformation and Loading of data using Informatica best practices.
  • Created mappings using Data Services to load data into SAP HANA.
  • Developed reports in Cognos Reportnet.
  • Contributed to the design and development of Cognos framework model.
  • Involved in data quality checks by interacting with the business analysts.
  • Performed unit testing and tuned the mappings for better performance.
  • Maintained documentation of ETL processes to support knowledge transfer to other team members.
  • Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.
  • Participated in production support.
  • Informatica Power Exchange for Mainframe was used to read/write VSAM files from/to the Mainframe.
  • Basic Informatica administration such as creating folders, users, privileges, server setting optimization, and deployment groups, etc.
  • Designed and developed Mappings for loading MDM HUB.
  • Designed Audit table for ETL and developed Error Handling Processes for Bureau Submission.
  • Used Informatica IDQ to do data profiling of the source and check for the accuracy of data using dashboard.
  • Managed Change Control Implementation and coordinating daily, monthly releases and reruns.
  • Responsible for Code Migration, Code Review, Test Plans, Test Scenarios, Test Cases as part of Unit/Integrations testing, UAT testing.
  • Used TERADATA utilities such as MultiLoad, FastLoad and TPump.
  • Created BTEQ scripts.
  • Used UNIX scripts for automating processes.
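
The pre-session scripting and file-dependency handling described above can be sketched as a simple file watcher: poll for a trigger file and fail if it never arrives, so a session does not load from a missing feed. The paths and retry settings below are illustrative, not taken from any project listed here.

```shell
#!/bin/sh
# Hedged sketch of a pre-session "file watcher": poll for a trigger
# file and return nonzero if it never arrives, which fails the calling
# session or scheduler job. Retry defaults are placeholder values.

wait_for_file() {
    # $1 = file to wait for, $2 = max attempts, $3 = seconds between polls
    file=$1
    tries=${2:-30}
    pause=${3:-60}
    i=0
    while [ "$i" -lt "$tries" ]; do
        if [ -f "$file" ]; then
            return 0
        fi
        i=$((i + 1))
        sleep "$pause"
    done
    echo "ERROR: trigger file $file not found after $tries attempts" >&2
    return 1
}
```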

Environment: Informatica PowerCenter 9.1.1, Informatica Developer Client, IDQ, MDM, PowerExchange, SAP HANA, Oracle 11g, PL/SQL, Toad, Cognos, SQL Server 2005/2008, XML, UNIX, Windows XP, TERADATA.

Confidential

Informatica Developer

Responsibilities:

  • Extensively used ETL to load data from different source systems like Flat files into the Staging table and load the data into the target database.
  • Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Developed Informatica mappings, re-usable transformations, re-usable mappings and mapplets for data load to data warehouse and database.
  • Performed multi-source Extraction, Transformation and Loading into the target structures using Informatica.
  • Worked closely with multiple business units and a data solutions engineer to identify key information that will enhance business decision-making.
  • Performance-tuned mappings, sessions and the database.
  • Wrote stored procedures to accomplish tasks.
  • Created sessions to move the data at specific intervals and on demand.
  • Responsible for monitoring all the sessions that are running, scheduled, completed and failed.
  • Debugged the mapping for failed session to fix data and code issues.
  • As part of daily loading into the warehouse, used UNIX shell scripting to trigger the workflows in a particular order.
  • Involved in coding UNIX shell scripting to execute pre/post session commands.
  • Involved in enhancement and maintenance activities of the data warehouse, including performance tuning and rewriting stored procedures for code enhancements.
  • Analyzed and Created Facts and Dimension Tables.
  • Used different Data Warehouse techniques like Star-Schema, Snow-Flake Schema.
  • Developed multiple dimensions (drill-down hierarchies) and logical objects in the business model & mapping layer, and created presentation catalogs in the presentation layer.
  • Informatica Data Quality was used for cleansing and matching the customer data. Real-time address cleansing was achieved using Informatica PowerConnect for Web Services.
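
The ordered daily-load triggering described above can be sketched as a runner that stops at the first failure, so downstream workflows never run against incomplete data. The step commands here are placeholders; in practice each would wrap a pmcmd startworkflow call.

```shell
#!/bin/sh
# Hedged sketch: execute load steps in a fixed order, stopping at the
# first failure. Steps are passed as shell command strings; real jobs
# would be pmcmd invocations. Placeholder logic only.

run_steps() {
    for step in "$@"; do
        if ! sh -c "$step"; then
            echo "FAILED: $step" >&2
            return 1
        fi
        echo "OK: $step"
    done
}
```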

Environment: Informatica PowerCenter 9.1/8.6, PL/SQL, Oracle 9i, Toad, Erwin, Unix, SQL Server 2005, Autosys, Windows Server 2003, Visio 2003.

Confidential

ETL Informatica Developer

Responsibilities:

  • This application is mainly used for managing resources effectively and handling project allocations in the sector. It provides interfaces for Resource Management, which offers a cross-project hierarchical pool of people to assign to projects by skill set for a specific month, and Program Management, which streamlines program initiation and the identification of sub-projects and skill capacity.
  • Worked on ETL optimization, troubleshooting and analyzing Informatica mappings.
  • Source systems data from the distributed environment was extracted, transformed and loaded into the Data warehouse using Informatica.
  • Handled development, testing, implementation, and training of users.
  • Performed requirements gathering and Gap analysis for the ETL load.
  • Created mappings for Historical and Incremental Loads.
  • Used Version Control to check in and checkout versions of objects.
  • Designed the Source Definition, Target Definition and Transformation for building mappings using the designer tools.
  • Involved in the design, development and implementation of mappings using Informatica PowerCenter designer and creating Design Documents.
  • Worked on different Transformations like Source Qualifier, Joiner, Router, Aggregator, Lookup, Expression and Update Strategy to load data into target tables.

Environment: Informatica PowerCenter 8.1, Oracle, MS PowerPoint, MS Access, Microsoft Excel, Erwin.
