
Talend Developer Resume


NJ

SUMMARY

  • ETL Developer with 7+ years of experience in the IT industry involving software analysis, design, implementation, coding, development, testing, and maintenance, with a focus on data warehousing applications using the ETL tool Talend.
  • Experience with cloud platforms such as AWS.
  • Experience in data modeling/architecture and database administration, with specialization in various ETL platforms.
  • Experienced in ETL methodology for performing data migration, extraction, transformation, and loading using Talend; designed data conversions from a large variety of source systems including Oracle 10g/9i/8i/7.x, DB2, SQL Server, Teradata, and Hive, as well as non-relational sources such as delimited files, positional files, flat files, JSON, and XML.
  • Created complex mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tJava, tJavaFlex, tAggregateRow, tDie, tWarn, tLogCatcher, etc.
  • Extensively used Talend components like tFileInputDelimited, tParquetInput, tSparkRow, tSetGlobalVar, tMap, tReplicate, tJoin, tFileList, tSortRow, tBufferInput, tBufferOutput, tDenormalize, tNormalize, tParseRecordSet, tUniqRow, tS3Put, tS3Get, tS3FileList, tRedshiftInput, tRedshiftOutput, tRedshiftRow, tSnowflakeInput, tSnowflakeOutput, and tSnowflakeRow.
  • Extensively worked on Talend bulk components like tMysqlBulkExec, tMysqlOutputBulk, tOracleBulkExec, and tOracleOutputBulkExec for multiple databases.
  • Experience in CI/CD processes.
  • Extensive experience in the development and maintenance of a corporate-wide ETL solution using SQL, PL/SQL, and Talend 5.x/6.x/7.x/8.x on Unix and Windows platforms.
  • Experience in working with standard jobs, batch jobs, and streaming jobs using Talend for Big Data and Talend Real-Time Big Data.
  • Experienced in ETL migration projects converting code from one ETL tool to another, involving DWH migration at the same time.
  • Extracted data from multiple operational sources and implemented SCDs (Type 1/Type 2/ Type 3) using Talend.
  • Experience in working with Agile methodology; experienced in creating user stories in JIRA and participating in daily standups, sprint retrospectives, and sprint planning.
  • Extensively worked on Error logging components like tLogCatcher, tStatCatcher, tAssertCatcher, tFlowMeter, tFlowMeterCatcher.
  • Well versed in developing various database objects like packages, stored procedures, functions, triggers, tables, indexes, constraints, and views in Oracle 11g/10g. Well versed in writing complex queries using joins, subqueries, correlated subqueries, and PL/SQL procedures, functions, and triggers.
  • Experience in creating dashboards and charts/reports using Tableau and Power BI; proficient in data collection, data cleansing, data modeling, and interpretation of large data sets.
  • Skilled at database management using SQL Server to create complex Stored procedures and triggers.
  • Application areas: Business Intelligence, Data Warehousing, ETL, Data Integration, Operational Datastore, Application Interfaces, ERP, Master Data Management, Data Governance.
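
The SCD handling listed above (Type 1/2/3) can be illustrated with a minimal Type 2 sketch; the class, field, and key names here are hypothetical and the logic is in-memory rather than against a warehouse table, but the core move is the same: when a tracked attribute changes, the current row is end-dated and a new current row is inserted.

```java
import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

// Minimal in-memory illustration of SCD Type 2 (hypothetical names).
public class Scd2Sketch {
    static class DimRow {
        final String key;        // business key
        final String attribute;  // tracked attribute
        final LocalDate from;    // effective start date
        LocalDate to;            // effective end date (null = current row)
        DimRow(String key, String attribute, LocalDate from) {
            this.key = key; this.attribute = attribute; this.from = from; this.to = null;
        }
        boolean isCurrent() { return to == null; }
    }

    // Apply an incoming change: end-date the current row, then insert a new one.
    static void applyChange(List<DimRow> dim, String key, String newValue, LocalDate asOf) {
        for (DimRow row : dim) {
            if (row.key.equals(key) && row.isCurrent()) {
                if (row.attribute.equals(newValue)) return; // no change: nothing to do
                row.to = asOf;                              // close the old version
                break;
            }
        }
        dim.add(new DimRow(key, newValue, asOf));           // open the new version
    }

    public static void main(String[] args) {
        List<DimRow> dim = new ArrayList<>();
        dim.add(new DimRow("C001", "NJ", LocalDate.of(2020, 1, 1)));
        applyChange(dim, "C001", "IL", LocalDate.of(2021, 6, 1));
        for (DimRow r : dim) {
            System.out.println(r.key + " " + r.attribute + " " + r.from + " -> " + r.to);
        }
    }
}
```

In a Talend job the same comparison would typically be configured declaratively in a tMap plus an SCD output component rather than hand-coded.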

TECHNICAL SKILLS

ETL Tools: Talend Studio Data Integration & Big Data 8/7.0.1/6.4/6.3/6.2.1, Talend Administrator Console, Talend Management Console.

Databases: Microsoft SQL Server, Oracle, Redshift.

Languages: C, SQL, PL/SQL, Java.

BI Tools: Talend, Tableau and Power BI

Office Tools: Microsoft Office Suite - MS Word, MS Excel, MS PowerPoint.

Packages: TOAD, SQL*Plus, SQL*Loader, SoapUI, Subversion, Ipswitch, Teradata SQL Assistant.

Other: Manual Testing (System, Regression, User Acceptance Testing), MS-Visio, AWS, ESP scheduling Tool.

PROFESSIONAL EXPERIENCE

Confidential, NJ

Talend Developer

Responsibilities:

  • Worked closely with Business Analysts to review the business specifications of the project and gather the ETL requirements.
  • Worked closely with Data Architects in designing tables and was involved in modifying technical specifications.
  • Participated in Requirement gathering, Business Analysis, User meetings and translating user inputs into ETL mapping documents.
  • Developed standards for ETL framework for the ease of reusing similar logic across the board.
  • Responsible for creating fact, lookup, dimension, staging tables and other database objects like views, stored procedure, function, indexes, and constraints.
  • Worked with cloud platforms such as AWS.
  • Developed complex Talend ETL jobs to migrate the data from various files to database.
  • Implemented custom error handling in Talend jobs and worked on different methods of logging.
  • Followed the organization-defined naming conventions for naming the various file structures, Talend jobs, and daily batches for executing the Talend jobs.
  • Performed unit testing, captured run metrics in audit tables, and supported Talend jobs through SIT and UAT.
  • Exposure to ETL methodology for supporting data extraction, transformation, and loading processes in a corporate-wide ETL solution using Talend Open Studio for Data Integration.
  • Worked on real-time Big Data integration projects leveraging Talend Data Integration components.
  • Analyzed and performed data integration using Talend open integration suite.
  • Wrote complex SQL queries to ingest data from various sources and integrated them with Talend.
  • Worked on Talend Management Console (TMC) for scheduling jobs and adding users.
  • Worked on context variables and defined contexts for database connections and file paths, for easy migration between environments in a project.
  • Developed mappings to extract data from different sources such as DB2 and XML files and load it into the data mart.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Scheduled and automated ETL processes with the scheduling tools in TMC and TAC.
  • Used the most commonly used Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput & tHashOutput, and many more).
  • Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, JSON, and Flat file structures.
  • Developed stored procedures to automate the testing process, easing QA efforts and reducing test timelines for data comparison on tables.
  • Worked Extensively on Talend Management Console, Talend Admin Console and Schedule Jobs in Job Conductor.
  • Involved in production deployment activities; created the deployment guide for migrating code to production and prepared production run books.
  • Created Talend development standards: a document describing the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
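
The context-variable pattern above (one set of connection and path values per environment) can be sketched in plain Java; the environment names and keys below are hypothetical, and in Talend itself context groups or tContextLoad would supply these values at run time:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

// Hypothetical sketch of per-environment context values, analogous to
// Talend context groups loaded at run time (e.g. via tContextLoad).
public class ContextSketch {
    static Map<String, Properties> contexts = new HashMap<>();

    static {
        Properties dev = new Properties();
        dev.setProperty("db_host", "dev-db.internal");
        dev.setProperty("file_path", "/data/dev/in");
        Properties prod = new Properties();
        prod.setProperty("db_host", "prod-db.internal");
        prod.setProperty("file_path", "/data/prod/in");
        contexts.put("DEV", dev);
        contexts.put("PROD", prod);
    }

    // Resolve a context variable for the selected environment.
    static String context(String env, String key) {
        return contexts.get(env).getProperty(key);
    }

    public static void main(String[] args) {
        String env = args.length > 0 ? args[0] : "DEV";
        System.out.println("Connecting to " + context(env, "db_host")
                + ", reading from " + context(env, "file_path"));
    }
}
```

The point of the pattern is that the job logic never hard-codes a host or path; promoting a job from DEV to PROD only switches which set of values is loaded.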

Environment: Talend 8/7.0.1/6.4/6.3, XML files, DB2, Oracle 11g, SQL Server 2008, SQL, MongoDB, MS Access, Unix, AWS Redshift, Shell Scripts, Autosys, Talend Administrator Console, Talend Management Console.

Confidential, Chicago, IL

Talend Developer

Responsibilities:

  • Worked closely with Business Analysts to review the business specifications of the project and gather the ETL requirements.
  • Worked on SSAS in creating data sources, data source views, named queries, calculated columns, cubes, dimensions, roles and deploying of analysis services projects.
  • Performed SSAS cube analysis using MS Excel and PowerPivot.
  • Implemented SQL Server Analysis Services (SSAS) OLAP cubes with dimensional data modeling using star and snowflake schemas.
  • Hands-on experience with AWS.
  • Developed standards for ETL framework for the ease of reusing similar logic across the board.
  • Responsible for creating fact, lookup, dimension, staging tables and other database objects like views, stored procedure, function, indexes, and constraints.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Implemented custom error handling in Talend jobs and worked on different methods of logging.
  • Followed the organization-defined naming conventions for naming the flat file structures, Talend jobs, and daily batches for executing the Talend jobs.
  • Exposure to ETL methodology for supporting data extraction, transformation, and loading processes in a corporate-wide ETL solution using Talend Open Studio for Data Integration.
  • Worked on real-time Big Data integration projects leveraging Talend Data Integration components.
  • Analyzed and performed data integration using Talend open integration suite.
  • Wrote complex SQL queries to ingest data from various sources and integrated them with Talend.
  • Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
  • Used Snowflake to build data-intensive applications.
  • Working experience with CI/CD processes.
  • Worked on context variables and defined contexts for database connections and file paths, for easy migration between environments in a project.
  • Developed mappings to extract data from different sources such as DB2 and XML files and load it into the data mart.
  • Created complex mappings using different transformations like Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the data mart.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Scheduling and Automation of ETL processes with scheduling tool in Autosys and TAC.
  • Used the most commonly used Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput & tHashOutput, and many more).
  • Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.
  • Developed stored procedures to automate the testing process, easing QA efforts and reducing test timelines for data comparison on tables.
  • Automated SFTP process by exchanging SSH keys between Unix servers.
  • Worked Extensively on Talend Admin Console and Schedule Jobs in Job Conductor.
  • Involved in production deployment activities; created the deployment guide for migrating code to production and prepared production run books.
  • Created Talend development standards: a document describing the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
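
A filter-then-aggregate step of the kind listed above (for example a tFilterRow feeding a tAggregateRow, or Filter and Aggregator transformations) can be sketched with Java streams; the record shape and values below are hypothetical:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical sketch of a filter + aggregate step, analogous to
// tFilterRow followed by tAggregateRow in a Talend job.
public class AggregateSketch {
    static class Sale {
        final String region;
        final double amount;
        Sale(String region, double amount) { this.region = region; this.amount = amount; }
    }

    // Drop non-positive amounts, then sum amounts per region.
    static Map<String, Double> totalsByRegion(List<Sale> rows) {
        return rows.stream()
                .filter(s -> s.amount > 0)                      // filter step
                .collect(Collectors.groupingBy(s -> s.region,   // aggregate step
                        Collectors.summingDouble(s -> s.amount)));
    }

    public static void main(String[] args) {
        List<Sale> rows = List.of(
                new Sale("NJ", 100.0),
                new Sale("NJ", 50.0),
                new Sale("IL", 75.0),
                new Sale("IL", -5.0)); // rejected by the filter
        System.out.println(totalsByRegion(rows));
    }
}
```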

Environment: Talend 5.x/5.6, XML files, DB2, Oracle 11g, SQL Server 2008, SQL, MongoDB, MS Access, Unix, AWS, Shell Scripts, TOAD, Autosys, Talend Administrator Console.

Confidential, Minneapolis, MN

Talend Developer

Responsibilities:

  • Collaborated with the Data Integration Team to perform data and application integration, with the goal of moving more data more effectively, efficiently, and with high performance to assist in business-critical projects involving large-scale data extraction.
  • Wrote T-SQL (DDL and DML) statements creating tables, user-defined functions, views, complex stored procedures, common table expressions (CTEs), temporary tables, clustered/non-clustered indexes, unique/check constraints, relational database models, SQL joins, and triggers to facilitate efficient data manipulation and consistent data storage.
  • Used Snowflake to build data-intensive applications.
  • Experience with cloud platforms such as AWS.
  • Worked on Tool Migration POC to Migrate Code from SSIS to Talend.
  • Created ETL design and mapping document of the code for tool migration.
  • Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
  • Created complex mappings in Talend using tHash, tDenormalize, tMap, and tUniqRow.
  • Stored processing statistics in a database table to record job history.
  • Performed data manipulations using various Talend components like tMSSqlInput, tMSSqlOutput, tOracleInput, tOracleOutput, tMap, tJavaRow, tJava, tFileExist, tFileCopy, tFileList, tDie, tSendMail, and many more.
  • Worked on web services using Talend components like tSOAP, tREST, tWebService, tWebServiceInput, etc.
  • Responsible for tuning ETL mappings, workflows, and the underlying data model to optimize load and query performance.
  • Developed unit test cases for all enhancements/ new requirements and executed the tests.
  • Worked on application design and development solutions that meet the client's product requirements.
  • Developed solutions to meet design specifications from the client.
  • Built features that meet design solutions and that are compliant with client design practices.
  • Worked with ETL Development teams and provided the recommendations to optimize the data extraction and loading using ETL Talend jobs.
  • Worked on end-to-end development of software products from requirement analysis to system study, designing, coding, testing (Unit & Performance), documentation and implementation.
  • Worked on Talend Management Console for job scheduling, server monitoring, task creation, plan creation, job deployment, etc.
  • Implemented performance tuning in mappings and sessions by identifying bottlenecks and implementing effective transformation logic.
  • Developed an ETL framework for data masking, audit, balance, control, and validation architecture.
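
The data-masking part of such a framework can be sketched as a small reusable rule; the specific rule below (mask all but the last few characters) is a hypothetical example, not the project's actual masking logic:

```java
// Hypothetical sketch of a simple masking rule, the kind of reusable
// routine a data-masking framework might apply to sensitive columns.
public class MaskSketch {
    // Keep the last `visible` characters, replace the rest with '*'.
    static String mask(String value, int visible) {
        if (value == null || value.length() <= visible) return value;
        int hidden = value.length() - visible;
        return "*".repeat(hidden) + value.substring(hidden);
    }

    public static void main(String[] args) {
        System.out.println(mask("4111111111111111", 4)); // card-like value
        System.out.println(mask("abc", 4));              // too short: unchanged
    }
}
```

In Talend such a rule would typically live in a custom Java routine so every job applies the same masking consistently.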

Environment: Talend for Data Integration 7.0.1, Talend Management Console, SQL Server Integration Services (SSIS), MS SQL Server 2012/2008.

Confidential

Talend Developer

Responsibilities:

  • Involved in building the ETL architecture and Source to Target mapping to load data into Oracle DB.
  • Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSQL, and many more.
  • Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
  • Designed ETL processes using Talend to load data from sources to targets through data transformations.
  • Monitored and supported the Talend jobs scheduled through Talend Administration Center (TAC).
  • Developed the Talend mappings using various transformations, Sessions and Workflows.
  • Used Talend to Extract, Transform and Load data into Netezza Data Warehouse from various sources like Oracle and flat files.
  • Created multiple Joblets (reusable code) and Java routines in Talend.
  • Used Snowflake to build data-intensive applications.
  • Experienced in writing SQL queries and used joins to access data from Oracle and MySQL.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in dimensional modeling (star schema) of the data warehouse; used Erwin to design the business process, dimensions, and measured facts.
  • Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, Load) processes using Talend.
  • Developed Talend ESB services and deployed them on ESB servers on different instances using tREST and ESB components.
  • Performed unit testing and code reviews and moved code into UAT and PROD.
  • Designed the Talend ETL flow to load data into Hive tables and created the Talend jobs to load data into Oracle and Hive tables.
  • Worked with high volumes of data, tracking performance analysis on Talend job runs and sessions.
  • Prepared Talend job-level LLD documents and worked with the modeling team to understand the Big Data Hive table structures and physical design.
  • Conducted code reviews of code developed by teammates before moving it into QA.
  • Used Talend reusable components like routines, context variables, and global map variables.
  • Modified existing mappings for enhancements of new business requirements.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Worked as a fully contributing team member, under broad guidance, with independent planning and execution responsibilities.
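
The job-run performance tracking mentioned above can be sketched as a small audit record captured around a run; the names are hypothetical, and a real job would persist this record to an audit table:

```java
import java.time.Duration;
import java.time.Instant;

// Hypothetical sketch of capturing run metrics for a job execution,
// the kind of record typically written to an audit table.
public class RunMetricsSketch {
    static class RunMetrics {
        final String jobName;
        final Instant start;
        Instant end;
        long rowsProcessed;
        RunMetrics(String jobName, Instant start) { this.jobName = jobName; this.start = start; }
        long elapsedMillis() { return Duration.between(start, end).toMillis(); }
    }

    public static void main(String[] args) {
        RunMetrics m = new RunMetrics("load_customers", Instant.now());
        for (int i = 0; i < 1000; i++) m.rowsProcessed++; // stand-in for real row processing
        m.end = Instant.now();
        System.out.println(m.jobName + ": " + m.rowsProcessed
                + " rows in " + m.elapsedMillis() + " ms");
    }
}
```

In Talend the same figures are usually collected via tStatCatcher/tFlowMeterCatcher and written to stats tables rather than hand-rolled.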

Environment: Talend Open Studio (v6.1.1), Talend Enterprise Platform for Data Management (v6.1.1/5.5.1/5.6.1), UNIX, Hadoop, Oracle, Microsoft SQL Server Management Studio.

Confidential

Talend Developer

Responsibilities:

  • Assisted in requirement analysis and planning of delivering projects using Waterfall model.
  • The ETL tool Informatica was used to load strategic source data to build the data marts.
  • Created an operational data store; designed the metadata build-up for performing data mapping.
  • Also involved in mass data loads, refreshing data in various applications, performance evaluations, and modifying existing code to accommodate new features.
  • Used various transformations like Aggregator, Router, Expression, Source Qualifier, Filter, Lookup, Joiner, Sorter, XML Source Qualifier, Stored Procedure, and Update Strategy.
  • Worked extensively on flat files, as the data from various legacy systems arrived as flat files.
  • Set up test and production environments for all mappings and sessions.
  • Created and configured Sessions in Workflow Manager and Server Manager.
  • Debugged the sessions using Debugger and monitored Workflows, Worklets and Tasks by Workflow Monitor.
  • Created and used mapping parameters, mapping variables using Informatica mapping designer to simplify the mappings.
  • Worked on Informatica - Repository Manager, Designer, Workflow Manager & Workflow Monitor.
  • Integrated data into CDW by sourcing it from different sources like SQL, Flat Files and Mainframes (DB2) using Power Exchange.
  • Extensively worked on integrating data from Mainframes to Informatica Power Exchange.
  • Extensively worked on Informatica tools such as Source Analyzer, Data Warehouse Designer, Transformation Designer, Mapplet Designer, and Mapping Designer to design, develop, and test complex mappings and mapplets to load data from external flat files and RDBMS.
  • Processed output XML files, removed empty delta files, and FTPed the output XML files to a different server.
  • Worked with the Business Analyst team during the functional design and technical design phases.
  • Designed the mappings between sources (external files and databases) to operational staging targets.
  • Extensively used various transformations like Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookup, Filter, Router, Expression, Rank, Union, Normalizer, XML transformations, Update Strategy, and Sequence Generator.
  • Used the XML transformation to load data from XML files.
  • Worked on Informatica Schedulers to schedule the workflows.
  • Extensively worked with target XSDs to generate the output XML files.
  • Created mappings to read parameterized data from tables to create parameter files.

Environment: Informatica Power Center 8.6.1, Power Exchange 8.6.1, Windows, IBM DB2 8.x, Mainframes, SQL Server 2008.
