
Sr. Talend Developer Resume

Phoenix, AZ


  • Around 7 years of experience in the IT industry, with progressive experience in product specifications, design, analysis, development, documentation, coding, and implementation of business technology solutions for data warehousing applications.
  • Extensive experience in the development and maintenance of a corporate-wide ETL solution using SQL, PL/SQL, and Talend 4.x/5.x/6.x on UNIX and Windows platforms.
  • Strong experience with Talend Data Integration and Talend Big Data, including Data Mapper, Joblets, metadata, and Talend components and jobs.
  • Experience in developing various ETL mappings and transformations and in implementing source and target definitions in Talend.
  • Extensively created mappings in Talend using tMap, tJoin, tConvertType, tLogCatcher, tNormalize, tDenormalize, tJava, tWarn, tMysqlSCD, tGlobalMap, tDie, etc.
  • Good experience with MapReduce and the Hadoop ecosystem: Pig, Hive, HANA, Sqoop, etc.
  • Strong understanding of data warehouse principles using fact tables, dimension tables, and star/snowflake schema modeling.
  • Extensively used ETL methodology for data migration, data profiling, extraction, transformation, and loading with Talend; designed data conversions from a wide variety of source systems such as SQL Server, Oracle, and DB2, and from non-relational sources such as XML, flat files, and mainframe files.
  • Experience with scheduling tools: AutoSys, Control-M, and Job Conductor (Talend Admin Console).
  • Implemented slowly changing dimension (SCD) mappings using Type I, Type II, and Type III methods.
  • Strong understanding of RDBMS concepts and experience writing PL/SQL and SQL statements against databases.
  • Talend administration experience with hands-on big data experience.
  • Extensively worked on performance tuning of Informatica mappings using pushdown optimization and session partitioning.
  • Experience with databases such as MySQL and Oracle.
  • Hands-on experience running Hadoop streaming jobs to process terabytes of XML-format data using Flume and Kafka.
  • Experience in troubleshooting and implementing performance tuning at various levels (source, target, mapping, session, and system) in the ETL process.
  • Knowledge of writing, testing, and implementing SQL queries, stored procedures, functions, and triggers using Oracle PL/SQL.
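The Type 2 slowly changing dimension handling mentioned above can be illustrated with a minimal sketch. This is illustrative Java (not Talend-generated code) with hypothetical field names, showing the core Type 2 rule: when a tracked attribute changes, the current row is end-dated and a new current version is inserted.

```java
import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

// Minimal in-memory sketch of Type 2 SCD logic: a changed tracked
// attribute expires the current row and inserts a new version.
public class ScdType2Sketch {
    static class DimRow {
        final int customerKey;      // business key (hypothetical)
        final String city;          // tracked attribute (hypothetical)
        LocalDate effectiveFrom;
        LocalDate effectiveTo;      // null while the row is current
        boolean current;

        DimRow(int key, String city, LocalDate from) {
            this.customerKey = key;
            this.city = city;
            this.effectiveFrom = from;
            this.current = true;
        }
    }

    // Apply one incoming source record against the dimension table.
    static void applyChange(List<DimRow> dim, int key, String newCity, LocalDate today) {
        DimRow active = dim.stream()
                .filter(r -> r.customerKey == key && r.current)
                .findFirst().orElse(null);
        if (active == null) {
            dim.add(new DimRow(key, newCity, today));   // brand-new member
        } else if (!active.city.equals(newCity)) {
            active.current = false;                     // expire old version
            active.effectiveTo = today;
            dim.add(new DimRow(key, newCity, today));   // insert new version
        }                                               // unchanged: no-op
    }

    public static void main(String[] args) {
        List<DimRow> dim = new ArrayList<>();
        applyChange(dim, 101, "Phoenix", LocalDate.of(2020, 1, 1));
        applyChange(dim, 101, "Houston", LocalDate.of(2021, 6, 1)); // city changed
        System.out.println(dim.size() + " rows, current city: "
                + dim.stream().filter(r -> r.current).findFirst().get().city);
    }
}
```

A Type 1 change would instead overwrite `city` in place, and Type 3 would keep the prior value in an extra "previous" column on the same row.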


ETL/Middleware Tools: Talend 5.5/5.6/6.2/7.1, Informatica PowerCenter 9.5.1/9.1.1/8.6.1/7.1.1

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling.

Business Intelligence Tools: Business Objects 6.0, Cognos 8 BI/7.0, Sybase, OBIEE 11g/10.1.3.x

RDBMS: Oracle 11g/10g/9i, MS SQL Server 2014/2008/2005/2000, MySQL, MS Access.

Programming Skills: SQL, Oracle PL/SQL, Unix Shell Scripting, HTML, DHTML, XML, Java.

Modeling Tools: Erwin 4.1/5.0, MS Visio.

Tools: TOAD, SQL*Plus, SQL*Loader, SoapUI, FishEye, Subversion, SharePoint, IP switch user, SQL Assistant.

Operating Systems: Windows 8/7/XP/NT/2x, Unix-AIX, Sun Solaris 8.0/9.0.


Sr. Talend Developer

Confidential, Phoenix, AZ


  • Involved in the design and development of business requirements; analyzed application requirements and provided recommended designs.
  • Participated actively in end-user meetings and gathered requirements.
  • Worked on the Data Integration team to perform data and application integration, with the goal of moving high-volume data more effectively, efficiently, and with high performance to support business-critical projects.
  • Performed data manipulations using various Talend components such as tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSqlInput, and many more; created reusable transformations and robust mappings using Talend components such as tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.
  • Used Jira for agile sprints and story management.
  • Created standards and best practices for Talend ETL components and jobs.
  • Performed extraction, transformation, and loading of data from various file formats such as .csv, .xls, .txt, and other delimited formats using Talend Open Studio.
  • Deployed Talend jobs to the Talend Administration Console (TAC) through Publisher and Job Conductor.
  • Developed data pipelines with Amazon AWS to extract data from weblogs and store it in HDFS.
  • Implemented tHDFSInput and tHDFSGet to read and retrieve files from HDFS.
  • Implemented tHDFSOutput and tHDFSPut to write and put files to HDFS.
  • Created Talend jobs for data comparison between tables across different databases, identifying and reporting discrepancies to the respective teams.
  • Used a GitHub version-control repository for source code management and CI/CD procedures.
  • Worked with production support to finalize the scheduling of workflows and database scripts using AutoSys.
  • Used scheduled jobs with dependencies while providing knowledge transfer to the support team on all new developments going into production.
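The cross-database comparison jobs above follow a simple key-and-row diff pattern. A minimal Java sketch under stated assumptions: the two tables have already been pulled into memory as primary-key-to-row-hash maps (real Talend jobs would read them via the database input components), and the table names and keys are hypothetical.

```java
import java.util.*;

// Illustrative sketch of cross-database table comparison: the two tables
// are simulated as maps of primary key -> row hash; the diff reports keys
// missing on either side and keys whose row contents differ.
public class TableDiffSketch {

    static Map<String, List<Integer>> diff(Map<Integer, String> source,
                                           Map<Integer, String> target) {
        List<Integer> missingInTarget = new ArrayList<>();
        List<Integer> mismatched = new ArrayList<>();
        for (Map.Entry<Integer, String> e : source.entrySet()) {
            String t = target.get(e.getKey());
            if (t == null) missingInTarget.add(e.getKey());       // row absent in target
            else if (!t.equals(e.getValue())) mismatched.add(e.getKey()); // row differs
        }
        List<Integer> missingInSource = new ArrayList<>();
        for (Integer k : target.keySet())
            if (!source.containsKey(k)) missingInSource.add(k);   // row absent in source

        Map<String, List<Integer>> report = new LinkedHashMap<>();
        report.put("missingInTarget", missingInTarget);
        report.put("missingInSource", missingInSource);
        report.put("mismatched", mismatched);
        return report;
    }

    public static void main(String[] args) {
        // Hypothetical snapshots of the same table in Oracle and SQL Server.
        Map<Integer, String> oracle = Map.of(1, "a", 2, "b", 3, "c");
        Map<Integer, String> sqlServer = Map.of(1, "a", 2, "B", 4, "d");
        System.out.println(diff(oracle, sqlServer));
    }
}
```

In a Talend job the same shape is typically built with two database inputs feeding a tMap with an inner-join reject flow, but the reconciliation logic is the same.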

Environment: Talend 7.1, Oracle, SQL Server, Unix, Jira, GitHub, Control M, HDFS, TAC, Java.

Talend Developer

Confidential, Houston, TX


  • Created complex mappings using different transformations such as Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the data mart.
  • Participated actively in end-user meetings and gathered requirements.
  • Deployed Talend jobs to the Talend Administration Console (TAC) through Publisher and Job Conductor.
  • Implemented tExcelInput and tMap for reading complex Excel workbooks.
  • Used the tKafkaInput component to read real-time data from Kafka.
  • Worked on the Talend Administration Console (TAC) to schedule jobs and add users.
  • Used various components such as tKafkaConnection, tFilterRow, tSetGlobalVar, tSortRow, tUniqRow, tMap, etc.
  • Wrote complex SQL queries to ingest data from various sources and integrated them with Talend.
  • Created many complex ETL jobs for data exchange to and from database servers and various other systems, including RDBMS, XML, CSV, and flat-file structures.
  • Developed mappings to extract data from different sources such as DB2 and XML files and load it into the data mart.
  • Administered the Talend server: user administration and creation, creation of Talend folders and assignment of privileges, and creation of Unix-level folders.
  • Automated SFTP process by exchanging SSH keys between UNIX servers.
  • Converted user inputs into ETL documentation.

Environment: Talend 6.3, SQL Server, HDFS, GitHub, TAC, Unix, Jira, Control M, Java.

Talend/ETL Developer

Confidential, Houston, TX


  • Implemented file transfer protocol operations using Talend Studio to transfer files between network folders.
  • Experienced in fixing errors using Talend's debug mode.
  • Worked with most Talend components, such as source connections, tMap, tJoin, tUnite, tFilterRow, tOracleSCD, target connections, tFlowToIterate, tSortRow, and tPivotToColumnsDelimited, to develop jobs involving complex business logic.
  • Used tOracleSCD components to implement Type 1 and Type 2 SCDs to update slowly changing dimension tables.
  • Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
  • Successfully loaded files into Hive and HDFS from SQL Server.
  • Pushed data as delimited files into HDFS using Talend Big Data Studio.
  • Exported final tables from HDFS to SQL Server using Sqoop.
  • Performed Unit and Regression Testing for the application.
  • Developed Interfaces using UNIX Shell Scripts to automate the Bulk Load and Update.
  • Handled User Acceptance Testing (UAT) with the downstream feed users for validation of feeds.
  • Developed UNIX shell scripts for automating and enhancing/streamlining existing manual procedures.
  • Responsible for monitoring all scheduled, running, completed, and failed jobs; debugged failed jobs using the debugger to validate them and gather troubleshooting information about data and error conditions.

Environment: Talend 5.5/5.6, SSIS, SQL Server, Hadoop, Hive, XML files, Visual Studio 2012, Unix, Oracle 11g, AutoSys.

SQL/ETL Developer



  • Responsible for designing and developing mappings, mapplets, sessions, and workflows for loading data from source to target databases using Informatica PowerCenter; tuned mappings to improve performance.
  • Created database objects such as views, indexes, user-defined functions, triggers, and stored procedures; tuned mappings and SQL queries for better performance and efficiency.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Involved in the ETL process from development through testing and production environments.
  • Extracted data from various sources such as flat files and Oracle and loaded it into target systems using Informatica 8.x.
  • Used Informatica PowerCenter Workflow Manager to create sessions and batches that run with the logic embedded in the mappings.
  • Automated existing ETL operations using AutoSys.
  • Created and executed shell scripts in a Unix environment.
  • Created and ran workflows using Workflow Manager in Informatica; maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Created tables and partitions in the Oracle database.

Environment: Informatica PowerCenter 8.x, Oracle, SQL Developer, MS Access, PL/SQL, UNIX Shell Scripting, SQL Server 2005, Windows XP.
