
ETL Developer (Informatica) Resume

Oklahoma

PROFESSIONAL SUMMARY:

  • Over 6 years of experience in IT industry with a strong background in Development, Analysis, Testing, Data Warehouse and Business Intelligence tools.
  • 5+ years of experience with Informatica PowerCenter 9.6/8.6/7.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor).
  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
  • Experience using PL/SQL for ETL, performance tuning, and building BI architecture.
  • Strong experience working with XML, COBOL, and flat files, and loading and retrieving data from different source systems.
  • Extensive knowledge of dimensional data modeling: star and snowflake schemas, fact and dimension tables.
  • Experience in creating various transformations using Aggregator, Look Up, Update Strategy, Joiner, Filter, Sequence Generator, Normalizer, Sorter, Router, XML, Stored Procedure in Informatica PowerCenter Designer.
  • Hands on experience on Informatica Mapping performance tuning, identifying and removing performance bottlenecks.
  • Strong knowledge of the Hadoop ecosystem (HDFS, HBase, MapReduce, Hive, Pig, NoSQL, etc.); worked on development of Big Data POC projects using Hadoop, HDFS, MapReduce, and Hive.
  • Developed Big Data workflows using custom Pig, Hive, and Sqoop jobs.
  • Worked closely with clients on planning and brainstorming to migrate the current RDBMS to Hadoop.
  • Built ETL data flows that run natively on Hadoop and developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Proficient in using the SQL Server platform to build data integration and workflow solutions, and Extract, Transform, and Load (ETL) solutions for data warehousing applications.
  • Worked in a development role, creating IDQ rules/mapplets and IDQ mappings using transformations such as Standardizer, Match, Parser, Labeler, and SQL, along with profiles and scorecards in IDQ 9.6.1 Developer and Informatica Analyst.
  • Real-time experience in data profiling, address validation, and developing mappings using different transformations in Informatica Developer (IDQ).
  • Strong experience in writing UNIX shell scripts.
  • Experience with Informatica Power Exchange for loading/retrieving data from mainframe systems.
  • Experience with PL/SQL packages, shell scripts to execute those packages, and PL/SQL blocks to debug packages in the production environment; migrated bulk data.
  • Worked on Power Exchange for change data capture (CDC).
  • Experience in the development and implementation of projects from the initial phase through deployment and production.
  • Extensive experience with the Trizetto FACETS 4.71 and FACETS 5.01 data models and in loading data into Trizetto Keyword XPF files.

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.x/8.x/7.x, IDQ 9.6, Power Exchange 9.x/8.x

Databases: Oracle 9i/10g/11g, SQL Server 2000/2005/2008, Sybase, IBM DB2, Teradata

Languages & Scripting: SQL, T-SQL, PL/SQL, UNIX Shell Scripting, HTML, XML, CSS, C, C++

Tools: Toad, Erwin, Autosys, iSeries Navigator, OBIEE

DB Skills: Stored Procedures, Triggers, Views, Cursors, Functions and Packages

OS: UNIX, Linux, Windows 95/98/2000/XP/7

Reporting Tools: OBIEE, Tableau, BI Apps, Cognos 8.x/7.x

PROFESSIONAL EXPERIENCE:

Confidential, Oklahoma

ETL Developer (Informatica)

Responsibilities:

  • Documented the ETL process; interacted with report developers and business personnel.
  • Worked in an Agile environment as a core team member: coordinated with other team members, participated in Scrum meetings, backlog refinement, iteration planning, demos, and retrospectives, and developed production-ready code in iterations/sprints.
  • Extensively worked with Data Analyst and Data Modeler to design and to understand the structures of the Fact and Dimension tables and Source to Target Mapping.
  • Worked with Informatica PowerCenter, Informatica Data Validation Option, Informatica Data Quality, SAP HANA Studio, Business Objects Web Intelligence, and MicroStrategy.
  • Hands-on experience with SQL queries; created stored procedures, packages, triggers, views, and materialized views using Toad, SQL Developer, and PL/SQL Developer.
  • Worked as a data analyst and analyzed data arriving as flat files and XML files from different source-provider clients.
  • Implemented Slowly Changing Dimensions Type-1, Type-2, and Type-3 approaches for loading the target tables from various sources.
  • Designed and developed a robust end-to-end ETL process involving transformations such as SQL, Static and Dynamic Lookup, Update Strategy, Router, Aggregator, Union, Sequence Generator, Filter, Expression, and Stored Procedure for efficient extraction, transformation, and loading of data to the Data Mart (Data Warehouse), implementing the complex logic for computing the facts.
  • Implemented complex business rules by creating reusable transformations, mappings, mapplets, sessions, tasks, workflows, parameter variables, and workflow variables. Designed various tasks in Informatica Workflow Manager, such as session, command, email, event-raise, and event-wait.
  • Loaded data into SAP HANA using Informatica 9.6.1.
  • Designed and developed complex mappings using Informatica power center and Informatica developer (IDQ).
  • Created profiles and completed initial and ad hoc data profiling using Informatica Data Quality (IDQ).
  • Created Scorecards in IDQ Developer and Analyst to identify the trend of the quality of the data.
  • Extensively worked on data profiling and data quality rules development.
  • Created mappings in Informatica Developer (IDQ) using Parser, Standardizer and Labeler, Exception, Merge Transformations.
  • Validated MAC addresses and phone numbers using Informatica Data Quality.
  • Experienced in managing bad and duplicate records using Informatica PowerCenter and Informatica Data Quality.
  • Migrated new and modified Oracle and Informatica objects to the QA environment using a UNIX box and SVN revision numbers.
  • Designed and developed workflows for staging areas.
  • Developed workflows to extract data from various sources such as flat files, Oracle, and SQL Server.
  • Involved in data feeds (dropping flat files into a particular directory through a workflow).
  • Edited various JSON, XML, and bean files.
  • Performed unit testing of mappings to ensure successful execution of the data-loading processes and fixed errors to meet the requirements. Debugged mappings using breakpoints, testing stored procedures, functions, sessions, and batches, and checking the target data.
  • Worked extensively at various levels of performance tuning in Informatica, from the mapping level to the session level and parallel processing.
  • Worked closely with the client on planning and brainstorming to migrate the current RDBMS to Hadoop.
  • Built ETL data flows that run natively on Hadoop and developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Involved in QA testing of Business Objects reports.
  • Created and worked on unit test cases using Toad and TestLink.
  • Prepared release instructions to build the Dev, QA, and Prod environments and handed them over to the Integration and UNIX teams.
  • Worked on production support for data backfill.
Environment: Informatica PowerCenter 9.6.1, Informatica Data Quality 9.6.1/9.1, Oracle 11g/12c, PL/SQL, SQL, Business Objects 4.1, MicroStrategy 10, SAP HANA, Tidal.
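
The flat-file data feed described above (dropping files into a workflow's source directory) can be sketched as a small shell step. Directory names and the `*.dat` pattern are hypothetical placeholders, not the actual project layout:

```shell
#!/bin/sh
# Hypothetical data-feed step: move files landed by a source provider into
# the PowerCenter source-file directory so the workflow can pick them up.
# Directory paths and the *.dat pattern are illustrative placeholders.
stage_feed() {
    landing=$1
    srcdir=$2
    for f in "$landing"/*.dat; do
        [ -e "$f" ] || continue          # glob matched nothing; skip
        mv "$f" "$srcdir"/ && echo "staged: $(basename "$f")"
    done
}
```

A wrapper script would then start the workflow (for example with pmcmd startworkflow) once the files are staged.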

Confidential, Plantation, FL

ETL Developer (Informatica)

Responsibilities:

  • Worked in an Agile environment as a core team member: coordinated with other team members, participated in Scrum meetings, backlog refinement, iteration planning, demos, and retrospectives, and developed production-ready code in iterations/sprints.
  • Extensively used Version One to create user stories, acceptance criteria and tasks and estimated the time to complete the story.
  • Worked with Business Users and Business Analyst for requirement gathering and business analysis.
  • Extensively worked with Data Analyst and Data Modeler to design and to understand the structures of the Fact and Dimension tables and Source to Target Mapping.
  • Designed and developed a robust end-to-end ETL process involving transformations such as SQL, Static and Dynamic Lookup, Update Strategy, Router, Aggregator, Union, Sequence Generator, Filter, Expression, and Stored Procedure for efficient extraction, transformation, and loading of data to the Data Mart (Data Warehouse), implementing the complex logic for computing the facts.
  • Used IDQ transformations such as Labeler, Standardizer, Parser, Address Validator (Address Doctor), Match, and Exception for standardizing, profiling, and scoring the data.
  • Used the IDQ Analyst tool to correct bad table records and reprocessed the records in Analyst.
  • Worked on Power Exchange for change data capture (CDC).
  • Used Power Exchange to source copybook definitions and then to row-test the data from data files and VSAM files.
  • Created custom reports using various tools including Crystal Reports.
  • Implemented Slowly Changing Dimensions Type-1, Type-2, and Type-3 approaches for loading the target tables.
  • Implemented complex business rules by creating reusable transformations, mappings, mapplets, sessions, tasks, workflows, parameter variables, and workflow variables. Designed various tasks in Informatica Workflow Manager, such as session, command, email, event-raise, and event-wait.
  • Experience in data profiling and data quality rules development using Informatica Developer, and in building scorecards using Informatica Analyst.
  • Worked extensively on creating variables and parameter files for mappings and sessions to pass values and source/target information, and to migrate easily across environments and databases.
  • Created labels to migrate the code between different environments.
  • Prepared release notes to build Dev, SIT, UAT, Perf and Prod environments and handed them over to Integration and UNIX teams.
  • Performed unit testing of mappings to ensure successful execution of the data-loading processes and fixed errors to meet the requirements. Debugged mappings using breakpoints, testing stored procedures, functions, sessions, and batches, and checking the target data.
  • Worked extensively at various levels of performance tuning in Informatica, from the mapping level to the session level and parallel processing.
  • Enhanced the code as per requirement changes and resolved production issues.
  • Used UNIX commands and shell scripting to interact with the server, move flat files, and load files onto the server. Used pmcmd commands in pre- and post-session tasks to populate the parameter files and to execute the tasks.
  • Worked extensively on shell scripting to prepare wrappers and JIL files to schedule jobs using Autosys.
  • Participated in SIT, UAT and PERF testing and provided the support.

Environment: Informatica PowerCenter 9.6, IDQ 9.6, Informatica Analyst Tool, Embarcadero, Power Connect, Oracle Data Integrator, Flat files, TOAD, UNIX scripting, Windows XP Professional, Hyperion, Microsoft DTS packages.
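
The parameter-file and pmcmd usage described above can be sketched roughly as follows. The folder, workflow, and session names, the parameter names, and the connection details are hypothetical, not the project's actual values:

```shell
#!/bin/sh
# Sketch of a session wrapper: write a PowerCenter parameter file, then
# start the workflow with pmcmd. Folder/workflow/session names and the
# $$RUN_DATE / $$SRC_FILE_DIR parameters are illustrative placeholders.
write_parmfile() {
    parmfile=$1
    run_date=$2
    cat > "$parmfile" <<EOF
[MyFolder.WF:wf_load_dm.ST:s_m_load_fact]
\$\$RUN_DATE=$run_date
\$\$SRC_FILE_DIR=/infa/SrcFiles
EOF
}

# The wrapper would then launch the workflow, e.g.:
# write_parmfile /infa/BWParam/wf_load_dm.parm "$(date +%Y%m%d)"
# pmcmd startworkflow -sv INT_SVC -d DOMAIN -u "$INFA_USER" -p "$INFA_PASS" \
#     -f MyFolder -paramfile /infa/BWParam/wf_load_dm.parm -wait wf_load_dm
```

Keeping environment-specific values in the parameter file is what lets the same mapping migrate across Dev, SIT, UAT, and Prod without code changes.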

Confidential, Grand Rapids

ETL Developer (Informatica)

Responsibilities:

  • Installed and configured PowerCenter 9.6.1 on windows platform.
  • Creation and maintenance of Informatica users and privileges.
  • Created Groups, roles, privileges and assigned them to each user group.
  • Ensure accuracy and integrity of archived data through analysis, coding and validation.
  • Created clear documentation of the process to be followed during the retirement of legacy applications.
  • Involved in the requirement definition and analysis in support of Data Warehouse efforts.
  • Extensively used Informatica Client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager.
  • Developed data Mappings between source systems and warehouse components using Mapping Designer.
  • Worked extensively on testing and performance tuning of mappings and logic.
  • Worked on Performance tuning of Oracle queries using SQL trace, SQL plan, various indexes and join types.
  • Worked on Power Exchange for change data capture (CDC).
  • Developed, tested, and implemented program logic.
  • Created, launched, and scheduled workflows/sessions. Involved in the performance tuning of mappings and sessions.
  • Created documentation describing program development, logic, coding, testing, changes, and corrections.
  • Provided support in administration of the Informatica repository Manager and assisted in the design and architecture of multiple repositories to support multiple application development and QA environments.
  • Provided Production support.

Environment: Informatica PowerCenter 9.6, IDQ 9.6, Informatica Analyst Tool, Power Connect, Oracle Data Integrator, Flat files, TOAD, UNIX scripting, Windows XP Professional, Hyperion, Microsoft DTS packages.

Confidential, Richmond, Virginia

ETL Developer (Informatica)

Responsibilities:

  • Worked with architect, created logical data object (LDO) and custom data object (CDO).
  • Ensure accuracy and integrity of archived data through analysis, coding and validation.
  • Created clear documentation of the process to be followed during the retirement of legacy applications and archiving of Health System data.
  • Involved in the requirement definition and analysis in support of Data Warehouse efforts.
  • Described the capabilities of DTS and summarized the business problems it addresses.
  • Used IDQ transformations such as Labeler, Standardizer, Parser, Address Validator (Address Doctor), Match, and Exception for standardizing, profiling, and scoring the data.
  • Used the IDQ Analyst tool to correct bad table records and reprocessed the records in Analyst.
  • Developed ETL mappings and transformations using Informatica Developer.
  • Extensively used Informatica Client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager.
  • Imported the IDQ mappings into PowerCenter and then added all the business rules.
  • Used transformations such as Expression, Filter, Joiner, Aggregator, Lookup, Update Strategy, Sequence Generator, Normalizer, and Router to load consistent data into the database.
  • Developed data Mappings between source systems and warehouse components using Mapping Designer.
  • Described specific data conversion and data transformation issues that can arise when using DTS.
  • Experienced in data profiling and data quality rules development using Informatica Data Quality tools.
  • Worked extensively on testing and performance tuning of mappings and logic.
  • Worked on Performance tuning of Oracle queries using SQL trace, SQL plan, various indexes and join types.
  • Worked on Power Exchange for change data capture (CDC).
  • Used Power Exchange to source copybook definitions and then to row-test the data from data files and VSAM files.
  • Developed, tested, and implemented program logic.
  • Utilized ETL tools such as the Informatica product suite.
  • Worked on building UNIX shell scripts to check and update Teradata/Oracle tables and to append multiple CSVs into a single CSV, and used them in DataStage jobs.
  • Created custom reports using various tools including Crystal Reports.
  • Created, launched, and scheduled workflows/sessions. Involved in the performance tuning of mappings and sessions.
  • Created documentation describing program development, logic, coding, testing, changes, and corrections.
  • Implemented extreme programming using a fast-paced Agile methodology, involving task completion, user stories, and iterations.
  • Provided support in administration of the Informatica repository Manager and assisted in the design and architecture of multiple repositories to support multiple application development and QA environments.
  • Provided Production support.

Environment: Informatica PowerCenter 9.6, IDQ 9.6, Informatica Analyst Tool, Power Connect, Teradata, Oracle Data Integrator, Flat files, TOAD, UNIX scripting, Windows XP Professional, Hyperion, Microsoft DTS packages.
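
Appending multiple CSV extracts into a single CSV while keeping only one header row, as mentioned above, is typically a few lines of shell. This is a generic sketch, not the project's actual script:

```shell
#!/bin/sh
# Merge several CSV extracts into one file, keeping the header row from
# the first file only; `tail -n +2` skips the header of each later file.
merge_csvs() {
    out=$1
    shift
    first=1
    for f in "$@"; do
        if [ "$first" -eq 1 ]; then
            cat "$f" > "$out"
            first=0
        else
            tail -n +2 "$f" >> "$out"
        fi
    done
}
```

The merged file can then be fed to a DataStage or Informatica job as a single flat-file source.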

Confidential, Dallas, Texas

ETL Developer (Informatica)

Responsibilities:

  • Extensively interacted with users and helped determine the data needed to address business users' analytical requirements; designed data marts to support these analyses.
  • Involved in handling and selecting heterogeneous data sources such as Oracle, DB2, SQL Server, and flat files.
  • Worked with Unified Modeling Language (UML) design concepts to create platform-independent models.
  • Worked extensively on translating business requirements into data warehouse designs and developed ETL logic based on the requirements using Informatica PowerCenter 8.1.
  • Contributed to the logical and physical design of dimensional models and created a third-normal-form snowflake schema.
  • Extensively worked with all the client components of Informatica: Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Developed complex mappings to populate and incrementally load the source data to the staging area using Joiner, Sorter, Connected Lookup, Router, Filter, Update Strategy, and Expression transformations in Informatica Designer, with performance in mind.
  • Developed ETL designs using transformations such as Source Qualifier, Aggregator, Sorter, Joiner, Lookup, Stored Procedure, Router, Filter, Transaction Control, Sequence Generator, Expression, Java, and XML as needed for source-to-target data mappings and loading the target tables.
  • Worked extensively with variables and parameter files in mappings and sessions to pass values and to control the environment and source/target information.
  • Used IDQ transformations such as Labeler, Standardizer, Parser, Address Validator (Address Doctor), Match, and Exception for standardizing, profiling, and scoring the data.
  • Contributed to performance tuning of the existing project from the source to the target level and debugged invalid mappings using breakpoints, testing stored procedures, functions, sessions, and batches, and checking the target data.
  • Used shell scripts to handle flat files at the source level.
  • Worked with stored procedures and packages in PL/SQL and with UNIX shell scripting for automated execution of jobs in the production environment.
  • Wrote complex queries in SQL and PL/SQL.
  • Used pmcmd commands to execute tasks and populate parameter files, and UNIX shell scripts to automate the design.
  • Involved in various levels of performance tuning in Informatica, from the mapping level to the session level and parallel processing.
  • Involved in designing the complete decision support system using MicroStrategy by creating different types of reports for trend analysis using filters, conditions, and calculations.

Environment: Informatica PowerCenter 9.6, IDQ 9.1, IBM WebSphere Information Server (DataStage Designer), Power Connect, SQL Server Integration Services (SSIS), Teradata, Oracle 11g, Flat files, TOAD, UNIX scripting, Autosys, Windows XP Professional, Hyperion.

Confidential, Lynchburg, VA

ETL Developer(Informatica)

Responsibilities:

  • Interacted with business users and team leads for better understanding of business specifications and requirements, identifying data sources and Development strategies.
  • Analyzed business requirements, technical specifications, source repositories and physical data models for ETL mappings.
  • Worked extensively with mappings using expressions, aggregators, filters, lookup, joiner, update strategy and stored procedure transformations.
  • Coordinated and developed documentation related to ETL design and development.
  • Used update strategy transformation to effectively migrate data from source to target.
  • Used Informatica Data Quality as a tool for data quality measurement.
  • Used IDQ transformations such as Labeler, Standardizer, Parser, Address Validator (Address Doctor), Match, and Exception for standardizing, profiling, and scoring the data.
  • Worked on Power Exchange for change data capture (CDC).
  • Created stored procedures and invoked from Informatica.
  • Used Informatica debugger to debug the mappings.
  • Performed partitions on sources to improve the performance.
  • Optimized the mappings and implemented the complex business rules by creating reusable transformations and mapplet.
  • Used Informatica Workflow Manager to create and run batches and sessions, and scheduled them to run at specified times.
  • Involved in Performance Tuning of mappings, SQL overrides and stored procedures.
  • Improved the performance of ETL process by indexing and caching.
  • Created workflows, tasks, database connections, FTP connections using workflow manager.
  • Created UNIX shell scripts for automation of the ETL process.
  • Analyzed Session log files to resolve errors in mapping and managed session configuration.

Environment: Informatica PowerCenter 8.6, IDQ 9.1, Oracle 10g, SQL Server 2005, Oracle Designer, Toad, PL/SQL, Linux, Erwin and Windows 2000 / XP.
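
Analyzing session log files for errors, as in the last bullet above, is often wrapped in a small check like the one below. The severity tags are the usual PowerCenter log markers; the exact patterns a real wrapper greps for would depend on the project:

```shell
#!/bin/sh
# Scan a PowerCenter session log for error markers and return non-zero so
# a scheduler or wrapper script can flag the run for investigation.
check_session_log() {
    log=$1
    if grep -Eq 'ERROR|FATAL' "$log"; then
        echo "errors found in $(basename "$log")"
        return 1
    fi
    echo "clean: $(basename "$log")"
    return 0
}
```

A wrapper can call this after each session and escalate (page, email, rerun) on a non-zero return.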

Confidential

SQL Server Database

Responsibilities:

  • Involved in gathering and analysis of user requirements.
  • Created functional and technical specifications for projects.
  • Extracted data from different sources such as SQL Server, flat files, Teradata, and composite views, then transformed and loaded it into the required database through SSIS.
  • Involved in performance tuning and monitoring of T-SQL blocks.
  • Created and managed event handlers, package configurations, logging, and system and user-defined variables for SSIS packages.
  • Used DDL and DML for writing triggers, stored procedures, and data manipulation.
  • Responsible for the logical and physical design of the database.
  • Used Crystal Reports to design and develop reports.
  • Created database maintenance plans for the performance of SQL Server including database integrity checks, update database statistics, re-indexing and data backups.
  • Migrated bulk data using BCP and DTS from flat files.
  • Created documentation as required.
  • Involved in Performance tuning and testing on stored procedures, indexes, and triggers.
  • Performed daily backup of entire database and recovery.

Environment: MS SQL Server 2005, SQL Server Integration Services (SSIS) Windows NT/XP, Crystal Reports, Query Analyzer, Enterprise Manager.
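
The BCP bulk migration mentioned above boils down to invocations of the bcp utility. The sketch below only builds and echoes the command (a dry run) rather than executing it; the server and table names are placeholders:

```shell
#!/bin/sh
# Build a bcp command line for bulk-loading a comma-delimited flat file
# into a SQL Server table. Flags: -c character mode, -t field terminator,
# -S server, -T trusted (Windows) authentication. Names are placeholders.
bcp_load_cmd() {
    table=$1
    datafile=$2
    server=$3
    echo "bcp $table in $datafile -c -t, -S $server -T"
}
```

In a real migration the echoed command would be executed per file, with -U/-P substituted for -T where SQL authentication is required.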
