
Talend Lead Resume


Oakbrook, IL

SUMMARY

  • Around 7 years of strong experience in the analysis, design and development of Business Intelligence solutions in data warehousing, using Talend 5.6/6.3 and Informatica PowerCenter 9.5/9.1/8.6/8.1 as ETL tools on Windows and UNIX based operating systems.
  • Certified Talend Open Studio for Data Integration Consultant.
  • Exposure to ETL methodology supporting data extraction, transformation and loading in a corporate-wide ETL solution using Talend Open Source for Data Integration 5.6.
  • Experience in developing ETL mappings and transformations and implementing source and target definitions in Talend.
  • Converted large XML files into multiple XML files as required by downstream applications.
  • Significant experience with data extraction, transformation and loading (ETL) from disparate data sources such as multiple relational databases; also integrated data from flat files, CSV files and XML files into a common reporting and analytical data model.
  • Hands-on experience with Hadoop ecosystem tools such as Hive, Pig, Sqoop and MapReduce.
  • Strong understanding of RDBMS concepts and experience in writing PL/SQL and SQL statements against databases.
  • Used Kafka for real-time loads.
  • Strong understanding of data warehousing principles using fact tables, dimension tables and star/snowflake schema modeling.
  • Developed slowly changing dimension (SCD) mappings using Type 1, Type 2 and Type 3 methods (see the SQL sketch after this list).
  • Experience in troubleshooting and performance tuning at the source, target, mapping, session and system levels of the ETL process; heavily tuned long-running Informatica mappings using pushdown optimization and session partitioning.
  • Worked with SQL/PL-SQL to write complex SQL queries, stored procedures, triggers, functions and PL/SQL packages.
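
A minimal SQL sketch of the Type 2 pattern referenced above, assuming a hypothetical CUSTOMER_DIM dimension keyed on the CUSTOMER_ID business key, a STG_CUSTOMER staging table and a CUSTOMER_DIM_SEQ sequence; in practice this logic was implemented inside Informatica/Talend mappings rather than as standalone SQL:

  -- Expire the current dimension rows whose tracked attributes changed (hypothetical names)
  UPDATE customer_dim d
     SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
         d.current_flag = 'N'
   WHERE d.current_flag = 'Y'
     AND EXISTS (SELECT 1
                   FROM stg_customer s
                  WHERE s.customer_id = d.customer_id
                    AND (s.customer_name <> d.customer_name
                         OR s.customer_city <> d.customer_city));

  -- Insert a new current version for changed keys and for brand-new business keys
  INSERT INTO customer_dim
        (customer_sk, customer_id, customer_name, customer_city,
         eff_start_dt, eff_end_dt, current_flag)
  SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.customer_name, s.customer_city,
         TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM stg_customer s
   WHERE NOT EXISTS (SELECT 1
                       FROM customer_dim d
                      WHERE d.customer_id  = s.customer_id
                        AND d.current_flag = 'Y');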

TECHNICAL SKILLS

ETL/Middleware Tools: Talend 5.1/5.5/5.6/6.3, Informatica PowerCenter 9.5.1/9.1.1/8.6.1/7.1.1

Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, Fact and Dimension tables, Physical and Logical Data Modeling.

Business Intelligence Tools: Business Objects 6.0, Cognos 8 BI/7.0, Sybase, OBIEE 11g/10.1.3.x

RDBMS: Oracle 10g/9i, Netezza, Teradata, MS SQL Server, DB2, MySQL, MS Access.

Programming Skills: SQL, Oracle PL/SQL, Unix Shell Scripting, HTML, DHTML, XML, Java, .Net

Modeling Tool: Erwin 4.1/5.0, MS Visio.

Tools: TOAD, SQL*Plus, SQL*Loader, Quality Assurance, SoapUI, Fisheye, Subversion, SharePoint, Ipswitch, Teradata SQL Assistant.

Operating Systems: Windows 8/7/XP/NT/2x, Unix-AIX, Sun Solaris 8.0/9.0.

Scheduling Tools: Zena, Maestro, Control-M, Informatica Scheduler.

PROFESSIONAL EXPERIENCE

Talend Lead

Confidential, Oakbrook, IL

Responsibilities:

  • Rebuilt DataStage jobs in Talend as part of a cloud migration.
  • Supported production Talend jobs.
  • Deployed Talend jobs to TAC through the Publisher and Job Conductor.
  • Scheduled Talend jobs in TAC through the CRON scheduler.
  • Applied dependencies between jobs using execution plans in TAC.
  • Used tExcelInput and tMap to read complex Excel workbooks.
  • Supported complex jobs involving more than 100 components.
  • Updated R scripts to run in the cloud environment.
  • Developed custom code using Talend routines.
  • Built a reusable job to load data from the job server to Redshift, using the tRedshiftRow component to issue the COPY command (see the SQL sketch after this list).
  • Developed jobs for loading PII data into Redshift tables.
  • Used various components such as tFilterRow, tSortRow, tUniqRow, tMap and tDie.
  • Unlocked jobs in TAC.
  • Used the tRedshiftBulkExec component to bulk-load PII data into Redshift tables.
  • Removed inactive user sessions from TAC.
  • Worked in an onsite-offshore model.
  • Helped the offshore team resolve functional and code issues.
  • Used tHDFSInput and tHDFSGet to read and retrieve files from HDFS.
  • Used the tHDFSPut and tHDFSOutput components to put and write files to HDFS.
  • Reviewed offshore jobs and provided comments on functionality.
  • Used the command line to access HDFS and AWS S3 buckets.
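
A minimal sketch of the COPY statement issued through the reusable tRedshiftRow job, assuming a hypothetical target table, S3 prefix and IAM role; actual table names, credentials and load options varied per feed:

  -- Bulk-load delimited, gzipped files staged in S3 into a Redshift table (hypothetical names)
  COPY analytics.customer_pii
  FROM 's3://etl-staging-bucket/customer_pii/'
  IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
  DELIMITER '|'
  GZIP
  TIMEFORMAT 'auto'
  TRUNCATECOLUMNS
  MAXERROR 10;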

Environment: Talend 6.3, Redshift, Teradata, DataStage, SQL Server, AWS, Unix, Jira, GitHub, crontab, TAC, R, HDFS, AWS EMR.

Talend Developer

Confidential, Richmond, VA

Responsibilities:

  • Developed real-time jobs in Talend to pull data from SQL Server into PostgreSQL (see the upsert sketch after this list).
  • Used the tKafkaInput component to read real-time data from Kafka.
  • Effectively used Talend context variables for defining job variables, FTP connections and relational connections.
  • Performed unit testing at various levels of the ETL code and stored procedures, and was actively involved in team code reviews.
  • Created child jobs and invoked them from parent jobs using tRunJob.
  • Used Jira and VersionOne for agile sprints and story management.
  • Prepared migration documents to deploy jobs from Development to QA/UAT and then to Production environments.
  • Developed custom code using Talend routines.
  • Used the GitHub version control repository for source code management as well as for the CI/CD procedure.
  • Used Talend components such as tKafkaConnection, tContextLoad, tReplicate, tFilterRow, tExtractJSONField, tJavaRow and tMap for developing jobs.
  • Developed jobs to send data to and read data from AWS S3 buckets using components such as tS3Connection, tS3BucketExist, tS3Get and tS3Put.
  • Used crontab for scheduling Talend jobs.
  • Used the tFlowMeter, tStatCatcher, tLogCatcher, tDie and tWarn components for logging and error-handling mechanisms.
  • Supported the testing team in running Talend jobs manually on UNIX.
  • Used AWS EC2 instances to store and schedule Talend jobs.
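
A minimal sketch of the target-side upsert the real-time flow effectively performed, assuming a hypothetical orders table keyed on order_id; PostgreSQL 9.5 supports INSERT ... ON CONFLICT, and in the job itself the equivalent logic was driven by the Talend output components:

  -- Apply one Kafka message to the PostgreSQL target as an insert-or-update
  -- (hypothetical table/columns; :values are bind placeholders)
  INSERT INTO orders (order_id, customer_id, status, amount, updated_at)
  VALUES (:order_id, :customer_id, :status, :amount, now())
  ON CONFLICT (order_id)
  DO UPDATE SET customer_id = EXCLUDED.customer_id,
                status      = EXCLUDED.status,
                amount      = EXCLUDED.amount,
                updated_at  = EXCLUDED.updated_at;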

Environment: Talend 6.3, SQL Server 2012, PostgreSQL 9.5.2, AWS, Unix, Jira, VersionOne, GitHub, crontab.

Talend Developer

Confidential, Warwick, RI

Responsibilities:

  • Interacted with the clients on a regular basis to discuss day-to-day issues and matters.
  • Used the tWaitForFile component for file-watch jobs.
  • Effectively used Talend context variables for defining job variables, FTP connections and relational connections.
  • Worked effectively in an onsite-offshore model.
  • Performed unit testing at various levels of the ETL and was actively involved in team code reviews.
  • Identified problems in existing production data and developed one-time scripts to correct them.
  • Involved in dimensional modeling (star schema) of the data warehouse (see the star-schema sketch after this list).
  • Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
  • Pulled data from MongoDB, applied the business rules and stored it in a DB2 database.
  • Used components such as the source connections, tMap, tJoin, tFilterRow, tSortRow, tOracleSCD, target connections, tFlowToIterate and tPivotToColumnsDelimited for developing jobs involving complex business logic.
  • Used tOracleSCD components to implement Type 1 and Type 2 SCD updates to slowly changing dimension tables.
  • Prepared migration documents to move jobs from development to testing and then to production repositories.
  • Provided on-call/production support during the day and off-hours.
  • Acknowledged tickets and fixed issues within the SLA.
  • Developed UNIX shell scripts to automate and streamline existing manual procedures.
  • Created PBIs for analysing issues and providing permanent solutions.
  • Created PKEs documenting the permanent solutions, and followed up with the right stakeholders to get approvals and fix the issues permanently in a specific release.
  • Processed ad-hoc business requests to load data into the production DB2 database using Talend jobs.
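
A minimal SQL sketch of the star-schema shape behind the dimensional model mentioned above, with hypothetical date and product dimensions and a sales fact; the production model had more dimensions and conformed surrogate keys:

  -- Hypothetical dimension and fact tables illustrating the star-schema layout
  CREATE TABLE date_dim (
      date_sk       INTEGER       NOT NULL PRIMARY KEY,
      calendar_dt   DATE          NOT NULL,
      fiscal_month  VARCHAR(7)    NOT NULL
  );

  CREATE TABLE product_dim (
      product_sk    INTEGER       NOT NULL PRIMARY KEY,
      product_id    VARCHAR(20)   NOT NULL,  -- business key
      product_name  VARCHAR(100)
  );

  CREATE TABLE sales_fact (
      date_sk       INTEGER       NOT NULL REFERENCES date_dim (date_sk),
      product_sk    INTEGER       NOT NULL REFERENCES product_dim (product_sk),
      sales_amt     DECIMAL(18,2) NOT NULL,
      quantity      INTEGER       NOT NULL
  );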

Environment: Talend 5.6, SSIS, SQL Server 2012, XML files, DB2 database, MongoDB, PL/SQL, Cognos, Unix, Maestro, Remedy ticketing tool, AQT, BRIO tool.

Talend Developer

Confidential, Naperville, IL

Responsibilities:

  • Developed, documented and executed unit test plans for the components.
  • Documented the developed code and ran the jobs while keeping track of source and target row counts.
  • Converted large XML files into multiple XML files as required by downstream applications.
  • Used components such as the source connections, tMap, tAggregateRow, tJoin, tFilterRow, tSortRow, tOracleSCD, target connections, tMemorizeRows, tNormalize, tFlowToIterate and tPivotToColumnsDelimited for developing jobs involving complex business logic.
  • Analysed business requirements and prepared the design documents.
  • Involved in data extraction from SQL Server and flat files using Talend.
  • Responsible for pre- and post-migration planning to optimize data load performance, capacity planning and user support.
  • Successfully loaded files into Hive and HDFS from SQL Server.
  • Created Hive/Pig scripts for ETL purposes (see the HiveQL sketch after this list).
  • Exported final tables from HDFS to SQL Server using Sqoop.
  • Prepared the ETL flow of data from staging to the data mart.
  • Pushed data as delimited files into HDFS using the Talend Big Data studio.
  • Created the database design document for SQL Server database tables, which was then used by DBAs to create the tables.
  • Created and validated source layouts using Talend.
  • Worked extensively on customization after upgrades to meet current version requirements.
  • Performed complex lookup operations with other tables to derive new fields.
  • Wrote unit test cases and executed them to check that the application met requirements.
  • Transferred files to the customer using FTP via Zena.
  • Involved in the preparation of test cases and test scripts.
  • Performed unit and regression testing for the application.
  • Handled User Acceptance Testing (UAT) with the downstream feed users for validation of feeds.
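
A minimal HiveQL sketch of the ETL pattern used in the Hive scripts, assuming hypothetical table names and an HDFS path for the delimited files landed by Talend; the production scripts added cleansing, partitioning and the hand-off to Sqoop:

  -- External table over the pipe-delimited files Talend pushed into HDFS (hypothetical path/columns)
  CREATE EXTERNAL TABLE stg_orders (
      order_id     STRING,
      customer_id  STRING,
      order_dt     STRING,
      amount       DOUBLE
  )
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
  STORED AS TEXTFILE
  LOCATION '/data/staging/orders';

  -- Cleansed table later exported to SQL Server via Sqoop
  CREATE TABLE dm_orders STORED AS ORC AS
  SELECT order_id,
         customer_id,
         TO_DATE(order_dt) AS order_dt,
         amount
  FROM   stg_orders
  WHERE  order_id IS NOT NULL;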

Environment: Talend 5.5, Hadoop, HDFS, Hive, Impala, Oracle 11g, SQL Server 2012, XML files, SQL Developer, PL/SQL, MS Visual Studio 2012, Unix.

Talend Developer

Confidential, Houston, TX

Responsibilities:

  • Played a major role in understanding the business requirements and in designing and loading data into the data warehouse (ETL).
  • Created and developed a series of jobs for handling different cases of input data in the same source table; analysed certain existing jobs that were producing errors and modified them to produce correct results; used repository context variables in jobs.
  • Developed jobs to populate reference data tables that provide codes and descriptions for dimension tables in the database.
  • Used components such as tJoin, tMap, tFilterRow, tAggregateRow, tSortRow, target connections and source connections.
  • Mapped source files and generated target files in multiple formats such as XML, Excel and CSV.
  • Transformed the data and reports retrieved from various sources and generated derived fields.
  • Reviewed the design and requirements documents with architects and business analysts to finalize the design.
  • Implemented Java functionality using the tJava and tJavaFlex components.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for performance (see the PL/SQL sketch after this list).
  • Attended technical review meetings.
  • Implemented a star schema, denormalizing data for faster data retrieval by online systems.
  • Involved in unit testing and system testing, and prepared Unit Test Plan (UTP) and System Test Plan (STP) documents.
  • Responsible for monitoring all scheduled, running, completed and failed jobs; debugged failed jobs using the debugger to validate them and gather troubleshooting information about data and error conditions.
  • Performed metadata validation, reconciliation and appropriate error handling in ETL processes.
  • Developed various reusable jobs and used them as sub-jobs in other jobs.
  • Used context variables to increase the efficiency of the jobs.
  • Made extensive use of SQL commands within the TOAD environment to create target tables.
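
A minimal PL/SQL sketch of the drop-and-rebuild approach the procedures took around large loads, assuming a hypothetical fact table and index; the actual procedures were parameterized per table and included error handling:

  -- Drop a fact-table index before a bulk load and recreate it afterwards (hypothetical names)
  CREATE OR REPLACE PROCEDURE rebuild_sales_fact_idx (p_phase IN VARCHAR2) AS
  BEGIN
      IF p_phase = 'PRE_LOAD' THEN
          EXECUTE IMMEDIATE 'DROP INDEX sales_fact_dt_idx';
      ELSIF p_phase = 'POST_LOAD' THEN
          EXECUTE IMMEDIATE
              'CREATE INDEX sales_fact_dt_idx ON sales_fact (date_sk) NOLOGGING PARALLEL 4';
      END IF;
  END rebuild_sales_fact_idx;
  /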

Environment: Talend 5.1, Oracle 11g, DB2, Sybase, TOAD, SQL, UNIX.

ETL Developer

Confidential

Responsibilities:

  • Loaded data from source systems and sent it to a JMS queue for loading into target systems using XML Generator and XML Parser transformations.
  • Worked with Informatica Designer, Repository Manager, Repository Server, Workflow Manager/Server Manager and Workflow Monitor.
  • Reviewed the design and requirements documents with architects and business analysts to finalize the design.
  • Used mapplets in mappings, thereby saving valuable design time and effort.
  • Created logical objects in the Informatica Developer tool, exported them to PowerCenter 9.5 and used them in PowerCenter mappings.
  • Built common rules in the Analyst tool for analysts to use in mapping specifications and for profiling tables.
  • Created pre-/post-session SQL to save the last generated surrogate key (SK) numbers (see the SQL sketch after this list).
  • Used Informatica Workflow Manager, Workflow Monitor and log files to detect errors.
  • Used SQL overrides in the Sorter, Filter and Source Qualifier transformations.
  • Employed Normal Join, Full Outer Join, Detail Outer Join and Master Outer Join in the Joiner transformation.
  • Worked extensively on various reusable tasks, workflows, worklets, mapplets and reusable transformations.
  • Worked on slowly changing dimension Type 2.
  • Developed workflow sequences to control the execution order of various jobs and to email support personnel.
  • Involved in unit testing and in documenting the jobs and workflows.
  • Set standards for naming conventions and best practices for Informatica mapping development.
  • Used database objects such as sequence generators and stored procedures to handle complex logic.
  • Created various UNIX shell scripts for job automation of data loads.
  • Worked on all phases of the SDLC, from requirements and design through development and testing.
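
A minimal sketch of the post-session SQL used to persist the last generated surrogate keys, assuming a hypothetical ETL_SK_CONTROL table and CUSTOMER_DIM target; in PowerCenter this ran as post-session SQL (or via a stored procedure) after each load:

  -- Hypothetical control table holding the last surrogate key issued per target table
  CREATE TABLE etl_sk_control (
      table_name  VARCHAR2(30)  PRIMARY KEY,
      last_sk     NUMBER        NOT NULL,
      updated_dt  DATE          NOT NULL
  );

  -- Post-session SQL: record the highest key loaded so the next run can resume from it
  UPDATE etl_sk_control
     SET last_sk    = (SELECT MAX(customer_sk) FROM customer_dim),
         updated_dt = SYSDATE
   WHERE table_name = 'CUSTOMER_DIM';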

Environment: Informatica PowerCenter 9.1, Oracle 11g/10g, DB2, Teradata, Teradata SQL Assistant, XML files, SQL Server, SQL, PL/SQL, Unix, Windows 7.

ETL Developer

Confidential 

Responsibilities:

  • Gathered the business requirements from the Business Analyst.
  • Worked on analysing the source-to-target mapping Excel document.
  • Worked extensively with the Filter, Router, Sequence Generator, Lookup, Update Strategy, Joiner, Source Qualifier, Expression, Sorter and Aggregator transformations.
  • Used mapping variables, mapping parameters and parameter files for capturing delta loads.
  • Worked on various tasks such as Session, E-Mail and Command tasks.
  • Worked on the Informatica Scheduler for scheduling the delta loads and master loads.
  • Wrote the scripts needed for the business specifications.
  • Performed performance tuning of the process at the mapping, session, source and target levels.
  • Used mapplets in mappings, thereby saving valuable design time and effort.
  • Worked on various lookup caches: static, dynamic, persistent and shared.
  • Worked on UNIX to import parameter files into Workflow Manager.
  • Worked with session logs, the Informatica Debugger and performance logs for error handling when workflows and sessions failed.
  • Performed functionality, back-end and regression testing during the various phases of the application, and performed data integrity/back-end testing by executing SQL statements (see the reconciliation sketch after this list).
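
A minimal sketch of the kind of reconciliation SQL executed during back-end/data-integrity testing, assuming hypothetical source and target order tables and a :load_dt placeholder; real checks also compared column-level sums and sampled records:

  -- Compare row counts between source and target for a given load date (hypothetical names)
  SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt
    FROM src_orders
   WHERE load_dt = :load_dt
  UNION ALL
  SELECT 'TARGET', COUNT(*)
    FROM tgt_orders
   WHERE load_dt = :load_dt;

  -- Keys present in the source but missing from the target
  SELECT s.order_id
    FROM src_orders s
    LEFT JOIN tgt_orders t ON t.order_id = s.order_id
   WHERE t.order_id IS NULL;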

Environment: Informatica PowerCenter 8.6, Business Objects 6.5.1, Oracle 9i, Java, Spring, Hibernate, SQL*Plus, Toad, Windows 2000, SQL Server 2000, PL/SQL, UNIX, AutoSys, Erwin 6.1.
