
ETL Informatica/Talend Developer Resume


Plano, TX

SUMMARY

  • 7+ years of experience in the IT industry involving Software Analysis, Design, Implementation, Coding, Development, Testing and Maintenance, with a focus on Data Warehousing applications using ETL tools such as Talend and Informatica.
  • Created complex mappings using the concept of Slowly Changing Dimensions; implemented business logic and captured rows deleted from the source system.
  • Developed efficient mappings for data extraction/transformation/loading (ETL) from different sources to a target data warehouse.
  • Extensive experience in Installation, Configuration and Updates of SQL Server.
  • Involved in large data migrations and transfers using utilities such as Data Transformation Services (DTS), SSIS, Bulk Copy Program (BCP) and Bulk Insert.
  • Designed and deployed reports for end-user requests using a Web Interface & SSRS.
  • Experience in solving Real Time issues with Index fragmentation, DBCC checks, Query Tuning, Error and Event Handling.
  • Expertise in SQL Server Storage Structures and Security Architecture for databases residing on SAN storage.
  • Experienced across the SDLC in the Design, Development and Staging phases of projects, supported by Data-Flow diagrams, Process Models and E-R Diagrams.
  • Strong understanding of RDBMS concepts as well as Data Modeling Concepts.
  • Responsible for development, support and maintenance of the ETL Informatica PowerCenter system.
  • Excellent understanding and best practice of Data Warehousing Concepts, involved in Full Development life cycle of Data Warehousing. Expertise in enhancements/bug fixes, performance tuning, troubleshooting, impact analysis and research skills.
  • Expert in data warehouse performance tuning.
  • Created complex mappings in Talend using components such as tMap, tJoin, tReplicate, tParallelize, tAggregateRow, tDie, tUnique, tFlowToIterate, tSort, tFilterRow, tWarn, tBuffer and tContextLoad.
  • Extensive experience in Relational and Dimensional Data modeling for creating Logical and Physical Designs of Databases and ER Diagrams using data modeling tools like ERWIN and ER Studio.
  • Performed Data and Database Migration, including Mainframe to PC database conversions, as well as data mapping, retrieval, cleansing, consolidation and reporting for client review.
  • Worked exclusively on implementing the types of slowly changing dimensions (SCDs) - Type 1 and Type 2 in different mappings as per the requirements.
  • Experience in UNIX Shell Scripting.
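The Type 1/Type 2 slowly changing dimension pattern mentioned above can be sketched in a few lines of Python. This is a minimal illustration with hypothetical column names (`customer_id`, `city`, `current_flag`), not code from any project listed here:

```python
from datetime import date

def apply_scd_type2(dim_rows, incoming, key="customer_id"):
    """Type 2 SCD sketch: expire the current version of a changed row
    and insert a new current version, preserving history."""
    today = date.today().isoformat()
    current = {r[key]: r for r in dim_rows if r["current_flag"] == "Y"}
    for row in incoming:
        existing = current.get(row[key])
        if existing is None:
            # brand-new key: insert as the first, current version
            dim_rows.append({**row, "eff_date": today, "end_date": None, "current_flag": "Y"})
        elif any(existing[c] != row[c] for c in row if c != key):
            # attribute changed: close out the old version, add a new one
            existing["end_date"] = today
            existing["current_flag"] = "N"
            dim_rows.append({**row, "eff_date": today, "end_date": None, "current_flag": "Y"})
    return dim_rows
```

A Type 1 load would instead overwrite the attributes in place with no expiry columns.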

TECHNICAL SKILLS

Programming Languages: C, C++, C#, Python, SQL, T-SQL, PL/SQL, HTML, UNIX Shell scripting.

Databases: Oracle 10g/11g, SQL Server 2012/2014, MySQL, Teradata.

ETL/Informatica Tools: Informatica PowerCenter 7.x/8.x/9.x, Informatica Data Quality, Informatica PowerExchange, Informatica BDE, Pentaho Data Integration, Web services, SSIS, Talend RTX 4.1, Informatica B2B Data Transformation Studio 8.6.

Web Application Servers: IBM WebSphere, Tomcat and LDAP.

Other Tools: ETL Testing Informatica, Jira.

Operating Systems: Windows XP/7, MS-DOS, Linux, Ubuntu and UNIX.

PROFESSIONAL EXPERIENCE

Confidential, Plano TX

ETL Informatica/Talend Developer

Responsibilities:

  • Involved in the ETL design and its documentation.
  • Developed jobs in Talend Enterprise Edition covering stage, source, intermediate, conversion and target loads.
  • Worked with Data Mapping Team to understand the source to target mapping rules.
  • Analyzed the requirements and framed the business logic for the ETL process using Talend.
  • Designing, developing and deploying end-to-end Data Integration solution.
  • Designed and Implemented the ETL process using Talend Enterprise Big Data Edition to load the data from Source to Target Database.
  • Involved in Data Extraction from Oracle, Flat files and XML files using Talend by using Java as Backend Language.
  • Worked on Talend ETL using features such as context variables, database components like tMSSQLInput and tOracleOutput, file components and ELT components.
  • Followed the organization defined Naming conventions for naming the Flat file structure, Talend Jobs and daily batches for executing the Talend Jobs.
  • Worked on Context variables and defined contexts for database connections, file paths for easily migrating to different environments in a project.
  • Implemented error handling in Talend to validate data integrity and data completeness for the data from the flat files.
  • Used Talend components such as tMap, tFileExist, tFileCompare, tELTAggregate, tOracleInput and tOracleOutput.
  • Designed and implemented ETL for data loads from heterogeneous sources to SQL Server and Oracle target databases, and for Fact and Slowly Changing Dimension tables (SCD Type 1 and SCD Type 2) to capture changes.
  • Conducted introductory and hands-on sessions of Hadoop HDFS architecture, Hive, Talend, Pig for other teams.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Talend Integration Suite.
  • Used ELT where appropriate: ELT leverages the target system to do the transformation, so the data is copied to the target and then transformed in place.
  • Involved in automation of FTP process in Talend and FTPing the Files in UNIX.
  • Created the Talend Development Standards document, which describes the general guidelines for Talend developers, the naming conventions to be used in Transformations, and the development and production environment structures.
  • Extracted data from Oracle as one of the source databases.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations
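As an illustration of the flat-file error handling described above, the following Python sketch routes rows into clean and reject flows, much as a tMap/tFilterRow reject link does in Talend. The file layout and column names are hypothetical:

```python
import csv
import io

REQUIRED = ["account_id", "amount", "posted_date"]  # hypothetical flat-file layout

def split_clean_and_rejects(file_text):
    """Validate completeness of each flat-file row: rows with all required
    fields go to the clean flow, the rest to a reject flow with a reason."""
    clean, rejects = [], []
    for row in csv.DictReader(io.StringIO(file_text)):
        missing = [c for c in REQUIRED if not row.get(c)]
        if missing:
            rejects.append({"row": row, "error": "missing " + ",".join(missing)})
        else:
            clean.append(row)
    return clean, rejects
```

In the actual jobs the reject flow would typically be written to an error table or file for reconciliation.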

Environment: Talend Data Integration 5.6.1, Oracle 11g, MS SQL Server 2012/2008, PL/SQL, T-SQL, Agile Methodology, TOAD, AIX, Shell Scripts, Autosys, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), SharePoint, MS Access, Perl and Windows Server 2012.

Confidential, Tempe, AZ

ETL Informatica/Talend Developer

Responsibilities:

  • Involved in requirements gathering, data modeling and designed Technical, Functional & ETL Design documents.
  • Implemented mappings to extract data from various sources (Oracle, XML files) and load it into Oracle and Teradata using Teradata utilities such as MLoad, TPump and FastExport.
  • Designed and implemented slowly changing dimension mappings to maintain history.
  • Used PowerExchange to capture the changes in the source data.
  • Used the Teradata FastLoad utility for bulk loading, and the TPump and MLoad utilities for loading smaller and larger volumes of data, respectively.
  • Scheduled jobs through Autosys/Control-M for monthly and daily job cycles.
  • INC/Remedy Ticket handling and problem ticket analysis.
  • Implemented ETL Balancing Process to compare and balance data directly from source and warehouse tables for reconciliation purposes.
  • Worked with XML sources and targets.
  • Worked on various backend Procedures and Functions using PL/SQL.
  • Extensively used the advanced features of PL/SQL like Records, Tables, Object types and Dynamic SQL.
  • Created and worked with generic stored procedures for various purposes like truncate data from stage tables, insert a record into the control table, generate parameter files etc.
  • Provided architecture oversight to various project teams, a key leadership role covering the architecture redesign, data governance, security, new development and major enhancements.
  • As release architect, managed various release activities including environment planning, release planning and dependencies, project dependencies and architecture reviews.
  • Designed and developed Staging and Error tables to identify and isolate duplicates and unusable data from source systems.
  • Simplified the development and maintenance of ETL by creating Mapplets, Re-usable Transformations to prevent redundancy.
  • Wrote and implemented generic UNIX and FTP Scripts for various purposes like running workflows, archiving files, to execute SQL commands and procedures, move inbound/outbound files.
  • Designed and executed test scripts to validate end-to-end business scenarios.
  • Used session partitions, dynamic cache memory, and index cache to improve the performance of ETL jobs.
  • Resolved complex technical and functional issues/bugs identified during implementation, testing and post production phases.
  • Identified & documented data integration issues and other data quality issues like duplicate data, non-conformed data, and unclean data.
  • Assisted team members in functional and Integration testing.
  • Automated and scheduled Workflows, UNIX scripts and other jobs for the daily, weekly, monthly data loads using Autosys Scheduler.
  • Created standard reports using Business Objects features like Combined Queries, Drill Down, Drill Up, Cross Tab, Master Detail etc.
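The ETL balancing process described above amounts to comparing per-table totals (row counts or sums) pulled directly from source and warehouse. A simple illustrative Python sketch, with table names invented for the example:

```python
def balance_check(source_totals, warehouse_totals):
    """Compare per-table totals from source and warehouse;
    any table whose totals differ is flagged for reconciliation."""
    mismatches = {}
    for table in set(source_totals) | set(warehouse_totals):
        src = source_totals.get(table, 0)
        wh = warehouse_totals.get(table, 0)
        if src != wh:
            mismatches[table] = {"source": src, "warehouse": wh}
    return mismatches
```

In practice the totals would come from COUNT(*)/SUM() queries run against both systems inside the ETL job.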

Environment: Talend Platform for Big Data 5.6.2, Enterprise Platform for Data Integration and MDM (v6.1.1, 5.5.1, 5.6.1), UNIX, Oracle 11g, SQL Server 2012, Microsoft SQL Server Management Studio, Windows XP.

Confidential

ETL Informatica/Talend Developer

Responsibilities:

  • Developed a number of complex Informatica Mappings, Mapplets and Reusable Transformations for different types of tests on Customer information and for Monthly and Yearly loading of data.
  • Analyzing the source data coming from different sources and working with business users and developers to develop the Model.
  • Involved in Dimensional modeling to design and develop a STAR schema, using ERwin to design Fact and Dimension tables.
  • Extracted, Transformed and Loaded OLTP data into the Staging area and Data Warehouse using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup (Connected & Unconnected), Source Qualifier, Filter, Update Strategy, Stored Procedure, Router, and Expression).
  • Using Workflow Manager for Workflow and Session Management, database connection management and Scheduling of jobs to be run in the batch process.
  • Fixing invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and Target Data.
  • Extracted data from various sources like IMS Data Flat Files and Oracle.
  • Extensively used environment SQL commands in workflows prior to extracting the data in the ETL tool.
  • Wrote complex SQL scripts to avoid Informatica joiners and Look-ups to improve the performance, as the volume of the data was heavy.
  • Involved in design, development, administration and maintenance of the Application.
  • Created and maintained several database objects, including stored procedures, database triggers and views.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Involved in designing and customizing several forms.
  • Organizing and managing the databases for optimum performance levels.
  • Generated several reports.
  • Involved in walk through of code to implement performance tuning.
  • Involved in Analysis and documentation.
  • Created Sessions, reusable Worklets and Batches in Workflow Manager and Scheduled the batches and sessions at specified frequency.
  • Used SQL*Loader for bulk loading.
  • Created Stored Procedures for data transformation purpose.
  • Monitored the sessions using Workflow Monitor.
  • Used stored procedures to create a standard Time dimension, drop and create indexes before and after loading data into the targets.
  • Removed bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
  • Captured data error records corrected and loaded into target system.
  • Created Mappings, Mapplets and Transformations, which remove any duplicate records in source.
  • Implemented efficient and effective performance tuning procedures and performed benchmarking; these sessions set a baseline against which improvements were measured.
  • Tuned the source and target systems based on performance details; once source and target were optimized, sessions were run again to determine the impact of the changes.
  • Used Calculations, Variables, Break points, Drill down, Slice and Dice and Alerts for creating Business Objects reports.
  • Configured workflows with an Email task that would send mail with the session log on session failure and on target failed rows.
  • Used Server Manager to create schedules, monitor sessions and send error messages to the concerned personnel in case of process failures.
  • Designed and developed the logic for handling slowly changing dimension table loads by flagging records using the Update Strategy transformation.
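The standard Time dimension load mentioned above (normally populated by a stored procedure) can be illustrated with a short Python sketch. The column set shown is a typical one, not the actual table layout:

```python
from datetime import date, timedelta

def build_time_dimension(start, end):
    """Generate one row per calendar day between start and end
    (inclusive), with the usual derived date attributes."""
    rows, d = [], start
    while d <= end:
        rows.append({
            "date_key": int(d.strftime("%Y%m%d")),  # smart surrogate key, e.g. 20200115
            "year": d.year,
            "quarter": (d.month - 1) // 3 + 1,
            "month": d.month,
            "day_of_week": d.isoweekday(),          # 1 = Monday ... 7 = Sunday
        })
        d += timedelta(days=1)
    return rows
```

The stored-procedure version would INSERT these rows once, ahead of fact loads, so every fact date has a matching dimension key.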

Environment: Informatica PowerCenter 8.6.1, SSIS, SSRS, IDQ 8.6, Python Scripting, Teradata, MS SQL Server 2005/2008, Oracle 10g/11g, TOAD, SQL, Windows 7, Unix, Linux.

Confidential

ETL Informatica/Talend Developer

Responsibilities:

  • Developed stored procedures and database triggers; used user exits and PL/SQL interfaces to access foreign functions.
  • Used triggers for block processing, interface events, master-detail relationships, message handling, mouse events, navigation, query-time processing, transactional processing, validation and key processing.
  • Extensively involved in coding PL/SQL packages, procedures, functions and database objects according to business rules.
  • Used arrays in stored procedures and processed them.
  • Wrote extensive sub-queries, PL/SQL procedures, functions, database triggers and packages.
  • Wrote and modified shell scripts to automate data loading.
  • Developed scripts to create Oracle objects such as tables, views, indexes, synonyms, materialized views, sequences, cursors and partitions, and used dynamic SQL.
  • Tracked bugs in code and logged them into Test Director; prepared the test plan for all releases; implemented and updated system documentation.
  • Performed system testing and regression testing for the new releases and made the corresponding changes to tests.
  • Wrote test scripts for various initiatives after analyzing the BRD (Business Requirement Document) and user interface documents.
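The array processing in stored procedures mentioned above typically follows the PL/SQL BULK COLLECT ... LIMIT pattern, fetching rows in fixed-size arrays rather than one at a time. A rough Python analogue of that batching logic (function name and batch size are illustrative):

```python
import itertools

def fetch_in_batches(cursor_rows, limit=1000):
    """Pull rows from an iterable cursor in fixed-size batches,
    mirroring PL/SQL's BULK COLLECT ... LIMIT instead of row-by-row fetches."""
    it = iter(cursor_rows)
    batches = []
    while True:
        batch = list(itertools.islice(it, limit))
        if not batch:
            break  # cursor exhausted
        batches.append(batch)
    return batches
```

Batching this way bounds memory per fetch while still cutting the per-row round-trip overhead.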

Environment: Informatica PowerCenter 7.1, MS SQL 2000, Oracle 9i, SQL, PL/SQL, SQL Navigator, UNIX Shell Scripting, Windows XP and 2000, SQL Server, TOAD, AutoSys.
