
Lead Talend Developer Resume


Austin, TX

SUMMARY

  • Around 10 years of experience in the IT industry involving Software Analysis, Design, Implementation, Coding, Development, Testing and Maintenance, with a focus on Data Warehousing applications using ETL tools such as Talend and Informatica.
  • 3+ years of experience using Talend Open Studio and Talend Administration Center (TAC).
  • Highly Proficient in Agile, Scrum and Waterfall software development life cycle.
  • Extensively used ETL methodology for performing Data Profiling, Data Migration, Extraction, Transformation and Loading using Talend, and designed data conversions from a wide variety of source systems including Netezza, Oracle, Teradata and non-relational sources such as flat files.
  • Strong knowledge in improving data quality through Match analysis, analyzing MDM system performance and tuning performance by changing the existing Match Columns and Match rules.
  • Experience in using cloud components and connectors to make API calls for accessing data from cloud storage like Salesforce in Talend Open Studio.
  • Experience with various components in Talend like tMap, tJava, tFileOutputDelimited, tAggregateRow, tDie, tJoin, tParallelize and tSleep.
  • Experience in creating Joblets in Talend for processes reused across most jobs, such as First Name, Last Name and Email standardization (a minimal routine sketch follows this list).
  • Experience in monitoring and scheduling using Job Conductor (Talend Administration Center), AutoSys and UNIX (Korn & Bourne shell) scripting.
  • Experience in creating/building Java MDM User Exits at different processes (Stage, Load, Merge & Unmerge).
  • Experienced in creating Triggers on TAC server to schedule Talend jobs to run on server.
  • Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, Metadata Manager).
  • Experience in developing Informatica mappings using transformations like Source Qualifier, Connected and Unconnected Lookup, Normalizer, Router, Filter, Expression, Aggregator, Stored Procedure, Sequence Generator, Sorter, Joiner, Update Strategy, Union Transformations.
  • Hands-on involvement with many components in the palette to design Jobs, and used Context Variables to parameterize Talend Jobs.
  • Expertise in processing semi-structured data such as XML, JSON, RC and CSV in Hive/Impala using the Talend ETL tool.
  • Tracked daily, weekly and monthly data loads, resolved issues in data loads and provided timely updates to clients.
  • Experience in developing extensions and customizations to the services (SIF) tier of Informatica MDM Hub Server.
  • Experienced in Unit Testing, Code Review and Code Migration from Sandbox to Prod.
  • Involved in Data Analysis for source and target systems, with a good understanding of Data Warehousing concepts: Staging Tables, Dimensions, Facts, and Star and Snowflake Schemas.
  • Experience working with Teradata Parallel Transporter (TPT), BTEQ, FastLoad and MultiLoad.
  • Extensive knowledge with Teradata SQL Assistant.
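
A minimal sketch, in plain Java, of the kind of standardization routine the Joblets above would call. The class and method names are illustrative assumptions, not taken from an actual project; in Talend such code would typically live under the project's "routines" package and be invoked from tMap or tJavaRow.

    // Illustrative Talend-style routine for name and email standardization.
    // Names used here are hypothetical.
    public class StandardizationRoutines {

        // Trim, collapse internal whitespace and title-case a name field.
        public static String standardizeName(String name) {
            if (name == null || name.trim().isEmpty()) {
                return null;
            }
            String[] parts = name.trim().toLowerCase().split("\\s+");
            StringBuilder sb = new StringBuilder();
            for (String part : parts) {
                if (sb.length() > 0) {
                    sb.append(' ');
                }
                sb.append(Character.toUpperCase(part.charAt(0))).append(part.substring(1));
            }
            return sb.toString();
        }

        // Lower-case and trim an email; return null if it fails a basic format check.
        public static String standardizeEmail(String email) {
            if (email == null) {
                return null;
            }
            String cleaned = email.trim().toLowerCase();
            return cleaned.matches("^[\\w.+-]+@[\\w-]+(\\.[\\w-]+)+$") ? cleaned : null;
        }

        public static void main(String[] args) {
            System.out.println(standardizeName("  jOHN   doE "));            // John Doe
            System.out.println(standardizeEmail(" John.Doe@Example.COM "));  // john.doe@example.com
        }
    }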

PROFESSIONAL EXPERIENCE

Confidential, Austin, TX

Lead Talend Developer

Responsibilities:

  • Developed complex ETL mappings for Stage, Dimension, Fact and Data Mart loads; involved in data extraction from various databases and files using Talend.
  • Created Talend jobs using the dynamic schema feature, and used Big Data components (Hive components) for extracting data from Hive sources.
  • Performance tuning: used tMap cache properties, multi-threading and the tParallelize component for better performance with huge source data, and tuned the source SQL queries to restrict unwanted data entering the ETL process.
  • Worked with the Master Data Management (MDM) team to load data from external source systems to the MDM hub.
  • ELT components and pushdown optimization: moved transformation logic to the database side instead of handling it on the Talend side; when database tables are properly indexed and data volumes are huge, the ELT method can prove a much better option in terms of job performance.
  • Used many Talend components in job designs; a few to mention are tJava, tOracle components, tXMLMap, delimited file components, tLogRow and related logging components.
  • Worked on Joblets (reusable code) and Java routines in Talend.
  • Provided Informatica MDM Hub development support and detailed technical knowledge of Informatica MDM products.
  • Implemented a Talend POC to extract data from the Salesforce API as XML objects and .csv files and load the data into a SQL Server database.
  • Experience with AWS cloud services (EC2, S3, RDS, Redshift, IAM).
  • Experience in migration of DataStage jobs to Talend jobs producing the same functionality with improved performance.
  • Implemented Error Logging, Error Recovery and Performance Enhancements, and created a generic Audit Process for various application teams (a minimal audit-routine sketch follows this list).
  • Experience in using Repository Manager for migration of source code from lower to higher environments.
  • Created Projects in TAC, assigned appropriate roles to Developers and integrated SVN (Subversion).
  • Worked on Custom Component design and embedded the custom components in Talend Studio.
  • Provided on-call support once the project was deployed to further phases.
  • Used the Talend Administration Center Job Conductor to schedule ETL Jobs on a daily, weekly, monthly and yearly basis (Cron Trigger).
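
A minimal sketch, in plain Java, of the generic audit process mentioned above: each job run appends one record (job name, status, row count, timestamp) to an audit file. The class name, file path and record layout are assumptions for illustration; in the actual jobs this logic would typically sit in a routine called from tJava or a post-job step, with the target supplied by a context variable.

    import java.io.FileWriter;
    import java.io.IOException;
    import java.time.LocalDateTime;
    import java.time.format.DateTimeFormatter;

    // Hypothetical generic audit routine; names are illustrative.
    public class AuditLogger {

        private static final DateTimeFormatter TS =
                DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

        // Append one pipe-delimited audit record: job name, status, rows processed, timestamp.
        public static void logRun(String jobName, String status, long rowCount, String auditFile)
                throws IOException {
            String record = String.join("|",
                    jobName,
                    status,
                    Long.toString(rowCount),
                    LocalDateTime.now().format(TS));
            try (FileWriter writer = new FileWriter(auditFile, true)) { // append mode
                writer.write(record + System.lineSeparator());
            }
        }

        public static void main(String[] args) throws IOException {
            logRun("stg_customer_load", "SUCCESS", 125000L, "audit.log");
        }
    }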

Environment: Talend Open Studio 6.1.1/6.2.1, Talend Enterprise Platform for Data Management 6.1.1/5.5.1/5.6.1, Netezza, UNIX, Cloud, Oracle, Microsoft SQL Server Management Studio, Windows XP

Confidential, Dallas, TX

Lead Talend Developer

Responsibilities:

  • Participated in JAD sessions with business users and SMEs for a better understanding of the reporting requirements.
  • Designed and developed end-to-end ETL processes from various source systems to the Staging area, and from Staging to Data Marts.
  • Analyzed source data to assess data quality using Talend Data Quality.
  • Broad design, development and testing experience with Talend Integration Suite and knowledge in Performance Tuning of mappings.
  • Developed jobs in Talend Enterprise edition from source to stage, intermediate, conversion and target.
  • Involved in writing SQL queries and used joins to access data from Oracle and MySQL.
  • Solid experience in implementing complex business rules by creating reusable transformations and robust mappings using Talend components like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.
  • Developed Talend jobs to populate the claims data to data warehouse - star schema.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and incremental loads, and unit tested the mappings.
  • Prepared User Guides and TDD for MDM Hub configuration and IDD Applications.
  • Hands-on experience in Talend Big Data integration and cloud using Hadoop, Hive and Amazon AWS.
  • Implemented installation and configuration of a multi-node cluster in the cloud using AWS EC2.
  • Used tStatsCatcher, tDie and tLogRow to create a generic joblet that stores processing stats in a database table to record job history. Integrated Java code inside Talend Studio using components like tJavaRow, tJava, tJavaFlex and routines.
  • Experienced in using Talend's debug mode to debug jobs and fix errors. Created complex mappings using tHashOutput, tHashInput, tNormalize, tDenormalize, tMap, tUniqueRow, tPivotToColumnsDelimited, etc.
  • Used the tRunJob component to run a child job from a parent job and to pass parameters from parent to child. Created Context Variables and Groups to run Talend jobs against different environments.
  • Used the tParallelize component and the multi-thread execution option to run subjobs in parallel, which increases job performance.
  • Implemented FTP operations using Talend Studio to transfer files between network folders as well as to an FTP server, using components like tFileCopy, tFileArchive, tFileDelete, tCreateTemporaryFile, tFTPDelete, tFTPCopy, tFTPRename, tFTPPut, tFTPGet, etc. Experienced in building a Talend job outside of Talend Studio as well as on the TAC server.
  • Experienced in writing expressions within tMap as per the business need.
  • Handled insert and update strategy using tMap (a simplified routing sketch follows this list). Used ETL methodologies and best practices to create Talend ETL jobs.
  • Extracted data from flat files and databases, applied business logic and loaded it into the staging database as well as flat files.
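
A simplified Java sketch of the tMap insert/update routing mentioned above: incoming rows whose business key already exists in the target lookup are routed to the update flow, the rest to the insert flow. The keys and class name are hypothetical examples, not project data.

    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;

    // Illustrative tMap-style insert/update routing.
    public class UpsertRouter {

        public static void main(String[] args) {
            // Keys already present in the target table (in Talend, the lookup flow).
            Set<String> existingKeys = new HashSet<>(Arrays.asList("CUST-001", "CUST-002"));

            // Incoming source rows, represented here by their business keys.
            List<String> incomingKeys = Arrays.asList("CUST-001", "CUST-003");

            for (String key : incomingKeys) {
                if (existingKeys.contains(key)) {
                    System.out.println(key + " -> update flow");
                } else {
                    System.out.println(key + " -> insert flow");
                }
            }
        }
    }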

Environment: Talend 5.5, Oracle 11g, Teradata SQL Assistant, HDFS, MS SQL Server 2012/2008, PL/SQL, Agile Methodology, Informatica, TOAD, ERwin, AIX, Shell Scripts, AutoSys, SVN

Confidential, Irvine, CA

Sr Talend Developer

Responsibilities:

  • Deployed and scheduled Talend jobs in the Administration Center and monitored their execution.
  • Created separate branches within the Talend repository for Development, Production and Deployment.
  • Excellent knowledge of the Talend Administration Center, Talend installation, and the use of context and globalMap variables in Talend.
  • Developed mappings/transformations/joblets and designed ETL jobs/packages using Talend Integration Suite (TIS) in Talend 6.1.
  • Used Talend joblets and commonly used Talend components like tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput & tHashOutput and many more.
  • Responsible for configuring SVN with Talend projects and created multiple users for accessing SVN repositories.
  • Created Hive databases and tables over the HDFS data and wrote HiveQL queries on the tables.
  • Scheduled Hadoop and UNIX jobs using Oozie.
  • Responsible for building the data model for ODS/OLAP logical and physical design.
  • Modified, installed and prepared technical documentation for system software applications.
  • Developed POCs for bulk load options and web service APIs within Talend.
  • Heavily used Talend for building ODS & OLAP structures, data movements and XML & JSON processing.
  • Responsible for generating high-volume web service SOAP requests, running them through the SOAP service and loading the SOAP responses into a PostgreSQL database (a request-building sketch follows this list).
  • Set up and managed transactional log shipping, SQL Server mirroring, failover clustering and replication.
  • Designed the architecture of Talend jobs to run in parallel from an execution standpoint to reduce run time.
  • Handled issues related to cluster startup, node failures and several Java-specific errors on the system.
  • Performed troubleshooting on all tools, maintained multiple servers and provided backup for all file and script management servers.
  • Wrote backup and recovery shell scripts to provide failover capabilities.
  • Worked extensively on the Talend Administration Center and scheduled jobs in Job Conductor, an option that is not available in Talend Open Studio.
  • Hands-on experience with many components in the palette to design jobs, and used Context Variables/Groups to parameterize Talend jobs.
  • Experience in using Repository Manager for migration of source code from lower to higher environments.
  • Created Projects in TAC, assigned appropriate roles to Developers and integrated SVN (Subversion).
  • Worked on Custom Component design and embedded the custom components in Talend Studio.
  • Used the Talend Administration Center Job Conductor to schedule ETL Jobs on a daily, weekly, monthly and yearly basis (Cron Trigger).
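
A minimal Java sketch of generating a SOAP request and posting it to a service, as described above. The endpoint URL, namespace and operation names are placeholders for illustration, not the actual service used on the project; the response handling stands in for the step that loaded the payload into PostgreSQL.

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.Scanner;

    // Hypothetical SOAP request builder/poster; names are illustrative.
    public class SoapRequestSketch {

        public static void main(String[] args) throws Exception {
            String envelope =
                    "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\" "
                  + "xmlns:svc=\"http://example.com/claims\">"
                  + "<soapenv:Body>"
                  + "<svc:getClaim><svc:claimId>12345</svc:claimId></svc:getClaim>"
                  + "</soapenv:Body>"
                  + "</soapenv:Envelope>";

            HttpURLConnection conn =
                    (HttpURLConnection) new URL("http://example.com/claims-service").openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
            conn.setDoOutput(true);
            try (OutputStream out = conn.getOutputStream()) {
                out.write(envelope.getBytes(StandardCharsets.UTF_8));
            }

            // Read the SOAP response; in the real job this payload was parsed
            // and loaded into a PostgreSQL table.
            try (Scanner in = new Scanner(conn.getInputStream(), StandardCharsets.UTF_8.name())) {
                StringBuilder response = new StringBuilder();
                while (in.hasNextLine()) {
                    response.append(in.nextLine());
                }
                System.out.println(response);
            }
        }
    }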

Environment: Talend DI 6.1, Linux, Shell Scripting, Oracle, SQL Server 2010/2012

Confidential, Detroit MI

Sr. Talend/ETL Developer

Responsibilities:

  • Worked with Data mapping team to understand the source to target mapping rules.
  • Analyzed the requirements and framed the business logic and implemented it using Talend.
  • Involved in ETL design and documentation.
  • Analyzed and performed data integration using Talend open integration suite.
  • Worked on the design, development and testing of Talend mappings.
  • Created ETL job infrastructure using Talend Open Studio.
  • Worked on Talend components like tReplace, tMap, tSortRow, tFilterColumn, tFilterRow, tJava, tJavaRow, tConvertType, etc.
  • Used database components like tMSSqlInput, tMSSqlRow, tMSSqlOutput, tOracleOutput, tOracleInput, etc.
  • Worked with various file components like tFileCopy, tFileCompare, tFileExist, tFileDelete and tFileRename.
  • Worked on improving the performance of Talend jobs.
  • Created triggers for a Talend job to run automatically on server.
  • Worked on Exporting and Importing of Talend jobs.
  • Created jobs to pass parameters from child job to parent job.
  • Exported jobs to Nexus and SVN repository.
  • Implemented an update strategy on tables and used tJava and tJavaRow components to read data from tables so that only newly inserted data is pulled from the source tables (an incremental-load sketch follows this list).
  • Observed statistics of Talend jobs in AMC to improve performance and identify the scenarios in which errors were occurring.
  • Created Generic and Repository schemas.
  • Developed a project-specific 'Deployment' job responsible for deploying Talend JAR files onto the Windows environment as a zip file; this zip file is later unzipped and the files are deployed to the UNIX box.
  • This deployment job is also responsible for maintaining versioning of the Talend jobs deployed in the UNIX environment.
  • Developed shell scripts in the UNIX environment to support scheduling of the Talend jobs.
  • Monitored the daily, weekly and ad hoc runs that load data into the target systems.
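
A minimal Java sketch of the incremental-load pattern referenced above: the last successful load timestamp is kept outside the job and used to restrict the source extract to newly inserted rows. The file, property key and table names are hypothetical; on the project this was driven through tJava/tJavaRow and context variables rather than a standalone class.

    import java.io.FileReader;
    import java.io.IOException;
    import java.util.Properties;

    // Hypothetical incremental-extract helper; names are illustrative.
    public class IncrementalExtractSketch {

        public static void main(String[] args) throws IOException {
            Properties state = new Properties();
            try (FileReader reader = new FileReader("last_run.properties")) {
                state.load(reader);
            }
            // Fall back to a full extract if no previous run is recorded.
            String lastRun = state.getProperty("last_load_ts", "1900-01-01 00:00:00");

            // Build the filtered extract query; in the Talend job this string
            // would be passed to a database input component via a context variable.
            String extractQuery =
                    "SELECT * FROM src_orders WHERE created_ts > TO_TIMESTAMP('"
                  + lastRun + "', 'YYYY-MM-DD HH24:MI:SS')";

            System.out.println(extractQuery);
        }
    }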

Environment: Talend 5.5.2, UNIX, Shell script, SQL Server, Oracle, Business Objects, ERwin, SVN, Redgate, Capterra.
