
Talend Developer Resume

Portland, OR


  • Over 8 years of strong experience in Analysis, Design and Development of Business Intelligence solutions in Data Warehousing using Talend 6.2.1/6.0.1/5.6 and Informatica PowerCenter 9.5/9.1/8.6/8.1 as ETL tools on Windows and UNIX based operating systems.
  • Experience in developing ETL mappings, transformations and implementing source and target definitions in Talend.
  • Strong understanding of RDBMS concepts and experience in writing PL/SQL and SQL statements in databases.
  • Highly proficient in the integration of various data sources involving multiple relational databases like Oracle, MS SQL Server, Teradata, DB2 and non-relational sources like COBOL files and flat files.
  • Converted large XML files into multiple XML files as required by downstream applications.
  • Significant experience with Data Extraction, Transformation and Loading (ETL) from disparate data sources such as multiple relational databases, and worked on integrating data from flat files, CSV files and XML files into a common reporting and analytical data model.
  • Experience in troubleshooting and implementing performance tuning at various levels such as Source, Target, Mapping, Session and System in the ETL process. Heavily worked on performance tuning of long-running Informatica mappings using pushdown optimization and session partitioning.
  • Deep understanding of OLTP, OLAP and data warehousing environments and tuning both kinds of systems for peak performance.
  • Experience in Service oriented development using Talend ESB.
  • Strong understanding of the principles of DW using fact tables, dimension tables and star/snowflake schema modeling.
  • Worked in designing and developing the Logical and physical model using Data modeling tool (ERWIN).
  • Developed slowly changing dimension (SCD) mappings using Type I, Type II, and Type III methods.
  • Expertise on Exception Handling Mappings for Data Quality, Data Cleansing and Data Validation.
  • Strong Data Analysis and Data Profiling background using Informatica Analyst, Informatica Data Explorer (IDE) and data cleansing background using Informatica Data Quality (IDQ).
  • Experience in working with Standardizer, Parser, Match, and Merge & Consolidation transformations using IDQ.
  • Worked with SQL/PL-SQL to write complex SQL queries, Stored Procedures, Triggers, Functions & PL/SQL packages.
  • Hands on Experience in working wif Hadoop ecosystems like Hive, Pig, Sqoop, Map Reduce.
  • Worked on UNIX shell scripts using Korn shell (ksh) for scheduling sessions, automation of processes, and pre- and post-session scripts.
  • Excellent work experience in Agile methodology and development.
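A minimal sketch of the kind of pre-session shell check described above; every directory, file name and log path here is a hypothetical placeholder, not taken from any actual project:

```shell
#!/bin/sh
# Hypothetical pre-session check: before an ETL session starts, verify that
# the source trigger file has landed, and log the outcome either way.
# All directory and file names below are illustrative placeholders.
check_trigger() {
    src_dir="$1"       # landing directory for source files (hypothetical)
    trigger="$2"       # trigger file the upstream system drops (hypothetical)
    log_file="$3"      # run log for the scheduler to inspect (hypothetical)

    mkdir -p "$(dirname "$log_file")"
    if [ -f "$src_dir/$trigger" ]; then
        echo "$(date '+%Y-%m-%d %H:%M:%S') OK trigger $trigger found" >> "$log_file"
        return 0
    else
        echo "$(date '+%Y-%m-%d %H:%M:%S') MISSING trigger $trigger" >> "$log_file"
        return 1
    fi
}
```

A scheduler would call a script like this before the session and gate the run on its exit status.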


ETL/Middleware Tools: Talend 6.2.1/6.2.0/5.6, Informatica Power Center 9.5/9.1/8.6/8.1

Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, Fact and Dimension tables, Physical and Logical Data Modeling.

Business Intelligence Tools: Business Objects 6.0, Cognos 8BI/7.0, Sybase, OBIEE 11g/10.1.3.x

RDBMS: Oracle 10g/9i, Netezza, Teradata, MS SQL Server, DB2, MySQL, MS Access.

Programming Skills: T-SQL, Oracle PL/SQL, Unix Shell Scripting, HTML, DHTML, XML, Java, C#.Net

Modeling Tool: Erwin 5.0/4.1, MS Visio.

Tools: TOAD, SQL Plus, SQL*Loader, Quality Assurance, SoapUI, FishEye, Subversion, SharePoint, Ipswitch, Teradata SQL Assistant.

Operating Systems: Windows 8/7/XP/NT/2x, Unix-AIX.

Scheduling Tool: Zena, Maestro, Control M, Informatica Scheduling.


Talend Developer

Confidential, Portland, OR

Roles & Responsibilities:

  • Involved in data analysis and handling ad-hoc requests by interacting with business analysts, clients and customers, and resolving issues as part of production support.
  • Interacting with the clients on a regular basis to discuss day-to-day issues and matters.
  • Provided support for around 600 Talend jobs (developed using Talend 5.6), of which a few run every 10 minutes.
  • Worked on Talend ETL and used features such as Context variables, Database components like tMSSQLInput, tOracleOutput, file components, ELT components etc.
  • Worked on Talend ETL to load data from various sources to Oracle DB. Used tMap, tReplicate, tFilterRow, tSortRow and various other features in Talend.
  • On-Call/Production Support provided during day-time and off-hours.
  • Documenting the regular activities to be done as part of AMS.
  • Acknowledged tickets and fixed issues within the SLA.
  • Created complex mappings in Talend using tHash, tDenormalize, tMap, tUniqRow, tPivotToColumnsDelimited, as well as custom components such as tUnpivotRow.
  • Used tStatsCatcher, tDie, tLogRow to create a generic joblet to store processing stats into a Database table to record job history.
  • Used Talend components such as tMap, tFileExist, tFileCompare, tELTAggregate, tOracleInput, tOracleOutput etc.
  • Created Talend Mappings to populate the data into dimensions and fact tables.
  • Converted large XML files into multiple XML files as required by downstream applications.
  • Generated the reports required by the states per the requirements and submitted them to the states within the deadlines.
  • Involved in automation of the FTP process in Talend and FTPing files in UNIX.
  • Developing UNIX shell scripts for automating and enhancing/streamlining existing manual procedures.
  • Involved in the development of Talend Jobs and preparation of design documents, technical specification documents.
  • Worked on incidents and provided temporary solutions so that there would be no impact to the business.
  • Created PBIs for analyzing issues and providing permanent solutions.
  • Created PKEs for documenting the permanent solutions, and followed up with the right stakeholders to get approvals and fix the issue permanently in a specific release.
  • Analyzed issues and provided the estimated duration to fix them to the appropriate stakeholders in the project tracker.
  • Processed ad-hoc business requests for loading data into the production DB2 database using Talend jobs.
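The UNIX FTP automation mentioned above typically boils down to generating a batch command file for the ftp client. A minimal sketch, with host, credentials and file names as made-up placeholders:

```shell
#!/bin/sh
# Hypothetical sketch of UNIX FTP automation: emit the batch commands that
# would be fed to `ftp -n`. Host, credentials and paths are placeholders,
# not values from any real environment.
build_ftp_commands() {
    host="$1"; user="$2"; pass="$3"; local_file="$4"; remote_dir="$5"
    cat <<EOF
open $host
user $user $pass
cd $remote_dir
put $local_file
bye
EOF
}

# A real job would pipe the output to the client: build_ftp_commands ... | ftp -n
```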

Environment: Talend 6.2.1, SSIS, SQL Server 2014, XML, DB2 database, T-SQL, Cognos, Unix, Maestro, Remedy ticketing tool, AutoSys Tool.

Talend Developer

Confidential, Herndon, VA.

Roles and Responsibilities:

  • Developed, documented and executed unit test plans for the components.
  • Documented the developed code and ran the jobs while keeping track of source and target row counts.
  • Used most of the components such as the source connections, tMap, tAggregateRow, tJoin, tFilterRow, tSortRow, tOracleSCD, target connections, tMemorizeRows, tNormalize, tFlowToIterate, tPivotToColumnsDelimited for developing jobs involving complex business logic.
  • Analyzed business requirements and prepared the design documents.
  • Involved in Data Extraction from SQL Server, Flat files using Talend.
  • Responsible for Pre-Post migration planning for optimizing Data load performance, capacity planning and user support.
  • Successfully loaded files to Hive and HDFS from SQL Server.
  • Created Hive/Pig scripts for ETL purposes.
  • Exported final tables from HDFS to SQL Server using Sqoop.
  • Prepared ETL flow of data from Staging to Data Mart.
  • Pushed data as delimited files into HDFS using Talend Big Data studio.
  • Created the Database Design Document for SQL Server database tables, which will be further used by DBAs to create the tables.
  • Created and validated source layout using Talend.
  • Worked extensively on customization after upgrade to meet the current version requirements.
  • Cleansed, de-duplicated and transformed the data in the XML and loaded it to SQL Server tables.
  • Performed complex lookup operations with other tables to derive new fields.
  • Wrote unit test cases and executed them to check that the application met requirements.
  • Involved in transferring files to customers using FTP via Zena.
  • Involved in the preparation of Test cases and Test Scripts.
  • Performed Unit and regression testing for the Application.
  • Handled User Acceptance Testing (UAT) with the downstream feed users for validation of feeds.
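The Sqoop export step above is essentially a one-line CLI call. The sketch below only assembles the command string, so it can be inspected without a Hadoop cluster; the JDBC URL, table name and HDFS directory are hypothetical placeholders:

```shell
#!/bin/sh
# Hypothetical Sqoop export (HDFS -> SQL Server). Assembles the command as a
# string for inspection; a real run would execute it on a cluster edge node.
# The connection string, table name and export dir are illustrative only.
build_sqoop_export() {
    jdbc_url="$1"; table="$2"; export_dir="$3"
    printf 'sqoop export --connect %s --table %s --export-dir %s --input-fields-terminated-by ,\n' \
        "$jdbc_url" "$table" "$export_dir"
}
```

For example, `build_sqoop_export "jdbc:sqlserver://dbhost:1433" sales_final /data/final/sales` prints the command that would push the delimited HDFS files into the target table.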

Environment: Talend 5.5, Hadoop, HDFS, Hive, Impala, Oracle 11g, SQL Server 2012, XML files, SQL Developer, PL/SQL, MS Visual Studio 2012, Unix.

Talend Developer

Confidential, Chevy Chase, MD

Roles & Responsibilities:

  • Performed major role in understanding the business requirements and designing and loading data into data warehouse (ETL).
  • Created and developed a series of jobs for handling different cases of input data in the same source table. Analysis of certain existing jobs, which were producing errors and modifying them to produce correct results. Used repository context variables in jobs.
  • Developed jobs to populate Reference data tables which provide codes and descriptions for dimension tables in the database.
  • Used components like tJoin, tMap, tFilterRow, tAggregateRow, tSortRow, Target Connections and Source Connections.
  • Mapping source files and generating Target files in multiple formats like XML, Excel, CSV etc.
  • Transformed the data and reports retrieved from various sources and generated derived fields.
  • Reviewed the design and requirements documents with architects and business analysts to finalize the design.
  • Created WSDL data services using Talend ESB.
  • Created REST services using tRESTRequest and tRESTResponse components.
  • Used the tESBConsumer component to call a method from an invoked Web Service.
  • Implemented a few Java functionalities using tJava and tJavaFlex components.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for performance.
  • Attending the technical review meetings.
  • Implemented Star Schema for De-normalizing data for faster data retrieval for Online Systems.
  • Involved in unit testing and system testing and preparing Unit Test Plan (UTP) and System Test Plan (STP) documents.
  • Responsible for monitoring all the jobs that are scheduled, running, completed and failed. Involved in debugging failed jobs using the debugger to validate the jobs and gain troubleshooting information about data and error conditions.
  • Performed metadata validation, reconciliation and appropriate error handling in ETL processes.
  • Developed various reusable jobs and used them as sub-jobs in other jobs.
  • Used context variables to increase the efficiency of the jobs.
  • Extensive use of SQL commands within the TOAD environment to create target tables.

Environment: Talend 5.1, Oracle 11g, DB2, Sybase, TOAD, SQL, UNIX.

ETL Developer



  • Loaded data from source systems and sent it to a JMS queue for loading into target systems using XML Generator and Parser transformations.
  • Worked with Informatica Designer, Repository Manager, Repository Server, Workflow Manager/Server Manager and Workflow Monitor.
  • Reviewed the design and requirements documents with architects and business analysts to finalize the design.
  • Used mapplets in mappings, thereby saving valuable design time and effort.
  • Created Logical objects in Informatica Developer tool and exported them to Power Center 9.5 and used them in PC mappings.
  • Built common rules in Analyst tools for analysts to use in mapping specifications and profiling on tables.
  • Created pre/post sessions to save the last generated numbers for surrogate keys (SKs).
  • Used Informatica workflow manager, monitor and log files to detect errors.
  • Used SQL Override in Sorter, Filter & in Source Qualifier Transformation.
  • Employed Normal Join, Full Outer Join, Detail Outer Join and Master Outer Join in the Joiner transformation.
  • Extensively worked on various re-usable tasks, workflows, Worklets, mapplets, and re-usable transformations.
  • Worked on Slowly Changing Dimension Type 2.
  • Developed workflow sequences to control the execution sequence of various jobs and to email support personnel.
  • Involved in unit testing and documenting the jobs and work flows.
  • Set Standards for Naming Conventions and Best Practices for Informatica Mapping Development.
  • Used database objects like Sequence generators and Stored Procedures for accomplishing the Complex logical situations.
  • Created various UNIX shell scripts for Job automation of data loads.
  • Worked on all phases of the SDLC, from requirements and design through development and testing.

Environment: Informatica Power Center 9.1, Oracle 11g/10g, DB2, Teradata, Teradata SQL Assistant, flat files, SQL Server, SQL, PL/SQL, Unix, Windows 7.

ETL Developer


Roles & Responsibilities:

  • Gathered the business requirements from Business Analyst.
  • Worked on analyzing Source to Target mapping Excel document.
  • Extensively worked on Filter, Router, Sequence Generator, Look Ups, Update Strategy, Joiner, Source Qualifier, Expression, Sorter, and Aggregator.
  • Used mapping variables, mapping parameters, and parameter files for capturing delta loads.
  • Worked on various tasks like Session, E-Mail task and Command task.
  • Worked on Informatica Scheduler for scheduling the delta loads and master loads.
  • Wrote the scripts needed for the business specifications.
  • Facilitated performance tuning of the process at the mapping, session, source and target levels.
  • Used mapplets in mappings, thereby saving valuable design time and effort.
  • Worked on various lookup caches like static, dynamic, persistent and shared caches.
  • Worked on UNIX to import parameter files into Workflow Manager.
  • Worked on session logs, the Informatica Debugger and performance logs for error handling when workflows and sessions failed.
  • Performed functionality, back-end and regression testing during the various phases of the application, and data integrity/back-end testing by executing SQL statements.

ETL Developer


Roles & Responsibilities:

  • Involved in creating entity relational and dimensional relational data models using Data modeling tool Erwin.
  • Designed the target schema definition and Extraction, Transformation and Loading (ETL) using DataStage.
  • Mapping Data Items from Source Systems to the Target System.
  • Used the Data Stage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into data warehouse database.
  • Worked on programs for scheduling data loading and transformations using DataStage from a legacy system to Oracle 9i using SQL*Loader and PL/SQL.
  • Designing and Developing PL/SQL Procedures, functions, and packages to create Summary tables.
  • Used the Data Stage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions (on an ad hoc or scheduled basis).
  • Worked with DataStage Manager for importing metadata from jobs, creating new job categories and creating new data elements.
  • Developing Shell Scripts to automate file manipulation and data loading procedures.

Environment: DataStage 7.0 (Manager, Designer, Director), ERWIN, Oracle 9i, PL/SQL, SQL* Loader, SQL Server 2000, Windows NT 4.0, UNIX.
