Etl Informatica Developer Resume

Woonsocket, RI

SUMMARY:

  • Over 8 years of experience in information technology, with a strong background in developing Data Warehousing projects using ETL (Extract, Transform, Load) with Informatica PowerCenter and Teradata.
  • Experience with Informatica tools: Mapping Designer, Mapplet Designer, Transformation Developer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Experience in creating transformations and mappings using Informatica Designer.
  • Strong experience in developing sessions, workflows, and worklets using the Workflow Manager tool (Task Developer, Workflow Designer, and Worklet Designer).
  • Working experience with Informatica performance tuning involving sources, targets, sessions, and transformations.
  • Working Experience on complex mappings using the Informatica transformations like Lookup, Rank, Normalizer, Sorter, Aggregator, Filter, Expression, Router, Source Qualifier, Update Strategy, SQL, and Joiner.
  • Working experience in developing ETL programs for data extraction, transformation, and loading using Informatica PowerCenter.
  • Good experience in documenting the ETL process flow for better maintenance and analyzing the process flow.
  • Knowledge on Oracle database supporting OLTP and Data Warehouses.
  • Good experience in developing SQL with various databases like Oracle, SQL server.
  • Having good experience in using PL/SQL, Triggers, Functions, and Packages for the Oracle Database Development.
  • Knowledge of Teradata RDBMS Architecture.
  • Experience in UNIX environments, writing UNIX shell scripts to run Informatica workflows and control the ETL flow.
  • Experience in using the ETL tool Informatica PowerCenter and in Teradata development.
  • Experience in designing and implementing Data Warehouse applications, mainly transformation processes, using the ETL tool DataStage; also actively involved in database querying, data analysis, production support, reviews, and on-call support.
  • Good exposure to Data Warehouse concepts and the SDLC.
  • Experience in Data Warehouse development; worked with data migration, data conversion, and Extraction/Transformation/Loading (ETL) using Ascential DataStage.
  • Involved in Data Migration between Teradata, MS SQL server and Oracle.
  • Extensively worked on Datastage Parallel Extender and Server Edition.
  • Programming experience in BTEQ, Stored Procedures, Macros, and Triggers.
  • Strong analytical skills; developed DataStage jobs that are easy to understand, run with short execution times, and use a minimal number of stages.
  • Extensively used DataStage tools (DataStage Designer, DataStage Administrator, and DataStage Director). Experience in forward/reverse engineering using Erwin.
  • Knowledge and experience in Data warehousing applications, responsible for Extraction, cleansing, Transformation and Loading of data from various sources to the data warehouse.
  • Experience and knowledge in Teradata RDBMS using the FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant, and BTEQ utilities.
  • Good knowledge of Data Warehousing concepts such as Star Schema and Snowflake Schema, Slowly Changing Dimensions (SCD), Surrogate Keys, Data Marts, and the Kimball methodology in relational, dimensional, and multidimensional data modelling.
  • Good experience working with SCD Type 1 and Type 2: SCD1 overwrites the existing values, keeping no history, while SCD2 preserves both the original and the new values as history.
  • Experience on development of UNIX platforms and Windows XP.
  • Basic Knowledge in Master Data Management(MDM) and Informatica Data Quality(IDQ).
  • Having great Interpersonal and communication skills.
  • Good problem solving, analytical, written, and oral communication skills.
  • Ability to work both independently and as a team.
  • Good decision-making and exceptional problem-solving skills, with the ability to weigh alternative solutions and make confident, accurate decisions.
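The SCD Type 1 vs. Type 2 behavior described above can be sketched with a minimal example. The customer dimension, its columns, and the sample values are hypothetical, and SQLite is used here only as a lightweight stand-in for Oracle/Teradata:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical customer dimension; current_flag supports SCD Type 2.
cur.execute("""CREATE TABLE dim_customer (
    cust_key INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    cust_id  INTEGER,                            -- natural key
    city     TEXT,
    current_flag INTEGER DEFAULT 1)""")
cur.execute("INSERT INTO dim_customer (cust_id, city) VALUES (101, 'Boston')")

# SCD Type 1: overwrite in place -- no history is kept.
cur.execute("UPDATE dim_customer SET city = 'Woonsocket' WHERE cust_id = 101")

# SCD Type 2: expire the current row, then insert a new current row,
# so both the old and new values remain queryable.
cur.execute("UPDATE dim_customer SET current_flag = 0 "
            "WHERE cust_id = 101 AND current_flag = 1")
cur.execute("INSERT INTO dim_customer (cust_id, city) VALUES (101, 'Providence')")

rows = cur.execute("SELECT cust_key, city, current_flag "
                   "FROM dim_customer ORDER BY cust_key").fetchall()
```

After both steps, the dimension holds one expired row (Type 2 history) plus one current row; the Type 1 change left no trace of the original 'Boston' value.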

TECHNICAL SKILLS:

Teradata Utilities: BTEQ, FastLoad, MultiLoad, FastExport, TPump, SQL Assistant, Teradata Parallel Transporter (TPT), stored procedures.

Operating Systems: Windows XP/2003/7, UNIX.

Languages: SQL, PL/SQL, XML, UNIX Shell Scripting.

Databases: Oracle 12c/11g, Teradata 14.10/13.x, DB2.

ETL Tools: Informatica PowerCenter 9.5.1 (Source Analyzer, Repository Manager, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor, Warehouse Designer, and Informatica Server).

PROFESSIONAL EXPERIENCE:

Confidential, Woonsocket, RI

ETL Informatica Developer

Responsibilities:

  • Involved in the business meetings to understand their requirements.
  • Designed, developed, and maintained the database for a Data Warehouse project.
  • Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Used Informatica PowerCenter to design and develop the Extraction, Transformation, and Load (ETL) process for data migration.
  • Designed and developed various mappings using transformations such as Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter, and Sequence Generator.
  • Worked on designing the reusable mapplets.
  • Analyzed business requirements, designed and wrote technical specifications to design/redesign solutions, and created and maintained source-to-target mapping documents for the ETL development team.
  • Created mappings implementing slowly changing dimensions.
  • Developed DataStage designs and handled execution, testing, and deployment on the client server.
  • Designed workflows and sessions using the Workflow Manager and monitored them with the Workflow Monitor.
  • Worked with various workflow tasks such as Session, Event-Raise, Event-Wait, Decision, E-mail, Command, Worklet, Assignment, and Timer, and with workflow scheduling.
  • Created Informatica workflows to load data into the OLTP databases according to the business rules.
  • Wrote UNIX scripts for business needs.
  • Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the requirements.
  • Track the status of work from time to time and provide timely updates to team members.
  • Provided prompt responses to queries posted by business users and supporting teams.
  • Used DataStage Designer to develop jobs for extracting, cleansing, transforming, integrating, and loading data into the Data Warehouse.

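A common consistency check of the kind described above is reconciling a staging table against its target: compare row counts, then find staged keys that never reached the target. This sketch uses hypothetical table and column names, with SQLite standing in for the actual Oracle/Teradata databases:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE dwh_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])
# Simulate a load that dropped one row on the way to the warehouse.
cur.executemany("INSERT INTO dwh_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.0)])

# Count check: totals should match after a clean load.
stg_count = cur.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM dwh_orders").fetchone()[0]

# Set difference: which staged keys are missing from the target?
missing = cur.execute("""SELECT order_id FROM stg_orders
                         WHERE order_id NOT IN
                               (SELECT order_id FROM dwh_orders)""").fetchall()
```

A mismatch between the two counts, or a non-empty `missing` set, flags rows to reload or investigate before signing off on the daily run.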
Environment: DWH, PL/SQL, UNIX Shell Scripting, Informatica PowerCenter 9.5.1, Teradata 14.10, TPT, BTEQ, FastLoad, MultiLoad, flat files, Oracle 12c, SQL Server, UNIX.

Confidential, Raton, FL

ETL Informatica Developer

Responsibilities:

  • Developed code to load the data from Flat File to stage and stage to DWH.
  • Involved in Data Extraction, Transformation and Loading using BTEQ and Stored Procedures.
  • Loaded data from several flat file sources using Teradata MLOAD and FLOAD in Teradata 14.
  • Wrote Teradata SQL, BTEQ, MultiLoad, FastLoad, and FastExport scripts for ad-hoc queries, and built UNIX shell scripts to drive the ETL interfaces (BTEQ, FastLoad).
  • Developed mappings with Oracle as the target, formatting the target data according to the requirements.
  • Designed and developed Informatica mappings for data loads and data cleansing, and created mappings to incorporate incremental loads.
  • Developed the reusable Mapplets to include the Audit rules.
  • Worked with Business analysts to understand business requirements.
  • Helped transform business requirements into effective technology solutions by creating technical specifications (source-to-target documents) for the ETL from the functional specifications.
  • While coding mappings to implement complex logic, worked mostly with Lookup, Aggregator, and Expression transformations.
  • Worked with Rank, Lookup, Joiner, Sorter, and Aggregator transformations, using static and dynamic memory caches for better session throughput.
  • Migrated objects between environments using XML export/import in the Repository Manager.
  • Involved in creating Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions and Target Data.
  • Solely responsible for the daily loads.
  • Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the requirements.
  • Created stored procedures and sequences to insert keys into database tables.
  • Wrote several database scripts to be executed from UNIX.
  • Developed scripts to load data into the base tables in the EDW, and to move data from source to staging and from staging to target tables using Teradata FastLoad and MultiLoad.
  • Performed performance tuning and query optimization of Teradata SQL.
  • Created tables, views in Teradata according to the requirements in all environments.
  • Extensively Used UNIX Scripting for the business needs.
  • Guided the testing team and the development team and monitored the Implementation.
  • Provided support for the Nightly jobs.

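The incremental-load pattern mentioned above (pick up only rows changed since the previous run, rather than reloading everything) can be sketched as follows. The table names, sample dates, and the watermark variable are hypothetical, with SQLite again standing in for the real databases:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_sales (sale_id INTEGER, updated_on TEXT)")
cur.execute("CREATE TABLE tgt_sales (sale_id INTEGER, updated_on TEXT)")
cur.executemany("INSERT INTO src_sales VALUES (?, ?)",
                [(1, '2015-01-01'), (2, '2015-01-05'), (3, '2015-01-09')])

# Watermark: the high-water mark recorded by the previous run
# (in a real ETL job this lives in a control table, not a variable).
last_loaded = '2015-01-04'

# Incremental load: only rows newer than the watermark move to the target.
cur.execute("INSERT INTO tgt_sales SELECT * FROM src_sales "
            "WHERE updated_on > ?", (last_loaded,))
loaded = cur.execute("SELECT sale_id FROM tgt_sales ORDER BY sale_id").fetchall()
```

After the run, the watermark would be advanced to the newest `updated_on` loaded, so the next execution starts where this one left off.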
Environment: DWH, PL/SQL, UNIX Shell Scripting, Informatica PowerCenter 9.5.1, Teradata 14.10, TPT, BTEQ, FastLoad, MultiLoad, flat files, Oracle 11g, SQL Server, UNIX.

Confidential, Bentonville, AR

ETL Informatica Developer

Responsibilities:

  • Loaded data from several flat file sources using Teradata MLOAD and FLOAD in Teradata 14.
  • Wrote Teradata SQL, BTEQ, MultiLoad, FastLoad, and FastExport scripts for ad-hoc queries, and built UNIX shell scripts to drive the ETL interfaces (BTEQ, FastLoad).
  • Developed and designed mappings and transformation logic in Informatica to implement the business rules and standards for bringing source data from many systems into the data warehouse.
  • Developed DataStage designs and handled execution, testing, and deployment on the client server.
  • Used Informatica PowerCenter to design and develop the Extraction, Transformation, and Load (ETL) process for data migration.
  • Responsible for Tuning the Performance of the ETL mappings.
  • Designed workflows and sessions using the Workflow Manager and monitored them with the Workflow Monitor.
  • Wrote UNIX scripts for business needs.
  • Ran and monitored jobs using DataStage Director and checked the logs.
  • Created Informatica workflows to load data into the OLTP databases according to the business rules.
  • Created mappings implementing slowly changing dimensions.
  • Designed and developed various mappings using transformations such as Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter, and Sequence Generator.
  • Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the requirements.
  • Track the status of work from time to time and provide timely updates to team members.
  • Provided prompt responses to queries posted by business users and supporting teams.

Environment: DWH, PL/SQL, UNIX Shell Scripting, Informatica PowerCenter 9.5.1, Teradata 14.10, TPT, BTEQ, FastLoad, MultiLoad, flat files, Oracle 12c, SQL Server, UNIX.

Confidential

Informatica Developer

Responsibilities:

  • Implement procedures to maintain, monitor, backup and recovery operations for ETL environment.
  • Conduct ETL optimization, troubleshooting and debugging.
  • Maintain ownership of release activities interacting with ETL projects.
  • Create and maintain data models working with the Data Architect.
  • Created mapping documents for data mart deliverables.
  • Used Framework Manager to build models and packages and publish packages to the ReportNet Server.
  • Created data flow diagrams in Microsoft Visio for the full run and the reprocess (partial) run of the workflows to be created in Informatica, taking the dependencies into account.
  • Maintained mapping documents throughout development lifecycle.
  • Stored reformatted data from relational, flat file, and XML sources using Informatica (ETL).
  • Developed mapping to load the data in slowly changing dimension.
  • Wrote UNIX shell scripts for Informatica ETL tool to run the Sessions.
  • Migrated the Informatica code from version 7.1.4 to 8.1.1 and tested to verify that results were maintained.
  • Worked on dimensional modelling to design and develop star schemas using ERwin 4.0, identifying fact and dimension tables.
  • Analyzed data relationships graphically and changed displays using Cognos Power Play functionality by means of drill down, slice and dice, rank, sort, forecast, and nest information to gain greater insight into trends, causes, and effects.
  • Loaded consolidated data using SQL*Loader in parallel and direct mode.
  • Documented user requirements, translated them into system solutions, and developed the implementation plan and schedule.
  • Implemented best practices in ETL design and development, loading data into highly normalized tables and star schemas.
  • Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.
  • Involved in creation of Informatica users and repository backup using Server Manager.
  • Generated reports for end client using various Query tools like Cognos.

Environment: Informatica Power Center 8.x/ 7.x, PowerBuilder, Oracle 9i/10g, Sybase, Oracle SQL Developer, Cognos, Quest-Toad, Siebel, Shell Scripts, SQL*Loader, UNIX (IBM AIX, Solaris), DB2 UDB, and Windows 2000.
