Talend Support Engineer/Developer Resume
Northbrook, IL
SUMMARY:
- Experienced in Design, Development, Maintenance, Enhancements and Production Support of Business Intelligence Enterprise Applications.
- Experienced in both the open-source and enterprise versions of the Talend Data Integration suite and Talend for Big Data for design and development of ETL/big data code.
- Experienced in ETL migration projects converting code from one ETL tool to another, with data warehouse (DWH) migration at the same time.
- Developed efficient mappings for data extraction/transformation/loading (ETL) from different sources to a target data warehouse.
- Extracted data from multiple operational sources for loading staging area, Data warehouse, Data Marts using SCDs (Type 1/Type 2/ Type 3) loads.
- Experience in adhering to software methodologies like Waterfall and Agile.
- Experience in developing Enterprise BI solution applications using Talend DI, Talend DQ, Talend Big Data Platform 7.0.1.
- Extensively used Talend components like tMap, tReplicate, tJoin, tFileList, tSortRow, tBufferInput, tBufferOutput, tDenormalize, tNormalize, tParseRecordSet, tUniqueRow, tS3Put, tS3Get, tS3FileList, tRedshiftInput, tRedshiftOutput, tRedshiftRow.
- Extensively worked on Error logging components like tLogCatcher, tStatCatcher, tAssertCatcher, tFlowMeter, tFlowMeterCatcher.
- Extensively worked on Talend bulk components for different databases.
- Worked on AMC tables for Error Handling and Logging of Talend Jobs.
- Worked with different cloud-based data warehouses like SQL and Redshift.
- Experienced with AWS S3. Worked on S3 Buckets. Migrated data to S3 using Talend.
- Experience in Data warehouse design using Star and Snowflake dimensional models.
- Working experience with job scheduling tools like Autosys, Control-M, TAC.
- Experience in Debugging, Error Handling and Performance Tuning of sources, targets, Jobs etc.
- Extensive experience in Relational and Dimensional Data modelling for creating Logical and Physical Design of Database and ER Diagrams using data modelling tools like ERWIN and ER Studio.
- Worked extensively on the modelling tools like Erwin data modeler and Microsoft Visio.
- Good knowledge of normalization and de-normalization techniques for optimum performance, and of XML data and XSD schema design.
- Experience in converting the Stored Procedures logic into ETL requirements.
- Well versed in writing complex queries using JOINS, Subqueries, Correlated Subqueries, PL/SQL procedures, functions, triggers and cursors.
- Experience in creating Indexes and Partition tables to improve query performance.
- Experience in writing UNIX shell scripts.
- Experienced in scheduling ETL jobs on a daily, weekly, monthly and yearly basis in TAC execution plans using the CRON trigger.
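The cron-triggered TAC scheduling above typically pairs each job with a small UNIX shell wrapper that runs the launcher exported from a Talend build and logs the outcome. A minimal sketch; the job name, launcher path and log format are illustrative assumptions, not from a real project:

```shell
#!/bin/sh
# Hypothetical wrapper for a cron-triggered Talend job: run the job launcher,
# capture its exit code, and print a timestamped status line for the job log.

JOB_NAME="daily_load"   # illustrative job name

log_status() {
    # $1 = job name, $2 = exit code; prints a timestamped status line
    if [ "$2" -eq 0 ]; then
        status="SUCCESS"
    else
        status="FAILURE (rc=$2)"
    fi
    printf '%s %s %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$1" "$status"
}

# In a real deployment the next line would run the exported launcher,
# e.g.: ./daily_load_run.sh; rc=$?
rc=0
log_status "$JOB_NAME" "$rc"
```

A TAC execution plan would then schedule this wrapper with a CRON trigger, and the status lines feed whatever monitoring the operations team uses.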
TECHNICAL SKILLS:
ETL Tools: Talend Big Data platform, Talend Data Integration
ETL Scheduling Tool: TAC, Autosys, Control-M
Databases: Oracle (10g, 11g & 12c), MS SQL Server, Hive, MySQL, Redshift.
DB Tool: Toad, SQL Developer, WinSQL, SQL Assistant.
Other Tools: JIRA, Erwin, Putty, Bitbucket
Operating Systems: Windows, Linux
Scripting: Unix, SQL
PROFESSIONAL EXPERIENCE:
Confidential, Northbrook, Il
Talend Support Engineer/Developer
Responsibilities:
- Designed various jobs for extracting data from various sources involving flat files and relational tables.
- Worked with Data Mapping Team to understand the source to target mapping rules.
- Analyzed requirements and framed the business logic for the ETL process using Talend; involved in Talend job design and development, and provided testing documentation.
- Analyzed existing DataStage jobs and built equivalent Talend jobs.
- Developed jobs using Talend for data loading, importing Source/Target tables from the respective databases and flat files.
- Implemented changes to existing jobs according to Change Requests and implemented error handling to provide detailed error messages.
- Used Talend troubleshooting features and DataStage to understand job errors, and used the tMap expression editor to evaluate complex expressions and inspect transformed data to resolve mapping issues.
- Created complex mappings in Talend using components like: tMap, tJoin, tReplicate, tParallelize, tAggregateRow, tDie, tUnique, tFlowToIterate, tSort, tFilterRow, tWarn, tBuffer, tContextLoad.
- Worked on Talend ETL features such as context variables, database components like tDBInput, tDBOutput and tDBRow, file components, ELT components, etc.
- Followed the organization-defined naming conventions for naming the flat file structures, Talend jobs and the daily batches that execute the Talend jobs.
- Worked on Context variables and defined contexts for database connections, file paths for easily migrating to different environments in a project.
- Implemented error handling in Talend to validate data integrity and data completeness for the data from the flat files.
- Used Talend components such as tMap, tDBRow, tCheckpoint, tFileExist, tFileCompare, tELTAggregate, tRedshiftInput and tRedshiftBulkExec.
- Designed and implemented ETL for weekly and quarterly data loads from heterogeneous sources such as Salesforce and multiple files into target databases, covering Fact tables and Slowly Changing Dimensions (SCD Type 1 and Type 2) to capture changes.
- Responsible for development, support and maintenance of ETL (Extract, Transform and Load) processes using Talend Integration Suite.
- Created Talend Development Testing Standards. This document describes the general guidelines for Talend developers, the naming conventions to be used in the Transformations and development and production environment structures.
- Optimized the performance of the mappings by various tests on sources, targets and transformation.
Environment: Talend Big Data platform 7.0.1, Talend Administration Centre, IBM DataStage, AWS S3, Redshift, Bitbucket, Putty, Aginity Workbench for Pure Analytics, MS Office, UNIX shell scripting
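The per-environment context handling described in this role can be sketched as one key=value file per environment, the format tContextLoad reads. A hedged illustration; the environment names and file naming convention are assumptions, not from any actual project:

```shell
#!/bin/sh
# Sketch of per-environment Talend context files: the same job migrates
# between dev, QA and prod by loading a different key=value file.

context_file() {
    # $1 = environment name; echoes the context file a job would load
    case "$1" in
        dev|qa|prod) echo "context_$1.properties" ;;
        *)           echo "unknown environment: $1" >&2; return 1 ;;
    esac
}

# A launcher would pass the file to the job, e.g. (hypothetical job name):
#   ./daily_load.sh --context_param contextFile="$(context_file prod)"
context_file dev   # prints context_dev.properties
```

Keeping database connections and file paths in these files, rather than hard-coded in jobs, is what makes the environment migration mentioned above a configuration change instead of a code change.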
Confidential, San Francisco, CA
Talend Developer
Responsibilities:
- Involved in discussions with system analysts and system architects to understand requirements and design technical specification document.
- Designed data flow, work flow diagrams and prepared technical documents.
- Analyzed existing business processes, estimated the impact of new processes on them, and developed data integration jobs to transform and transmit data between enterprise systems.
- Developed jobs to consume and process Flat files received from External vendors and job interfaces to transmit data between POS and SAP CRM systems.
- Developed jobs to consume IDOCs from SAP CRM and load them into the Netezza staging area, and to load data from Netezza Stage into SAP CRM through RFC calls using BAPIs.
- Developed jobs to consume flat files and XML files, and implemented SCD Type 1 and Type 2 dimensions to capture changes using Talend.
- Created jobs to perform record count validation and schema validation and worked on Error handling techniques and tuning the ETL flow for better performance.
- Redesigned master data jobs to reduce data traffic between POS and SAP CRM systems.
- Developed an error logging module to capture both system errors and logical errors, including email notification and moving files to error directories.
- Worked extensively in TAC (Talend Administration Center), scheduling jobs in the Job Conductor.
- Extensively used Talend components tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tOracleInput, tOracleOutput, tFileList, tDelimited.
- Scheduled daily, weekly, monthly and yearly jobs on the Tidal Enterprise Scheduler.
Environment: Talend Big Data platform 7.0.1, Talend Administration Centre, Oracle 12c, Netezza, SAP CRM, SQL Server, UNIX shell scripting.
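The record-count validation jobs mentioned in this role reduce to one idea: the data-row count of the delimited source file must match the count the target load reports. A minimal sketch of that check; the file layout (single header line) and function names are illustrative assumptions:

```shell
#!/bin/sh
# Sketch of record-count validation between a delimited source file and
# the row count reported by the target load.

source_count() {
    # $1 = delimited file with one header line; prints the data-row count
    n=$(wc -l < "$1")
    echo $((n - 1))
}

validate_load() {
    # $1 = source file, $2 = rows loaded into the target table
    if [ "$(source_count "$1")" -eq "$2" ]; then
        echo "OK"
    else
        echo "COUNT MISMATCH"
    fi
}
```

In a Talend flow the same comparison would typically be done with tFlowMeter/tFlowMeterCatcher counts rather than a shell script, with a mismatch routed to the error-handling module described above.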
Confidential, Dallas, TX
Talend Developer
Responsibilities:
- Involved in gathering requirements from users and in modifying various technical specifications from the Development phase through the Maintenance phase.
- Designing, developing and deploying end-to-end Data Integration solutions.
- Designed and Implemented the ETL process using Talend Enterprise Big Data Edition to load the data from Source to Target Database.
- Involved in Data Extraction from Oracle, Flat files and XML files using Talend.
- Working with Big Data components like tHiveInput, tHiveOutput, tHDFSOutput, tHiveRow, tHiveLoad, tHiveConnection.
- Implemented SCD Type 1 and Type 2 dimensions to capture changes using Talend.
- Created jobs to perform record count validation and schema validation.
- Developed ETL mappings for various sources (.TXT, .CSV, .XML) and loaded the data from these sources into relational tables with Talend Enterprise Edition.
- Worked on global context variables and context variables, and used over 100 components in Talend to create jobs.
- Worked on Error handling techniques and tuning the ETL flow for better performance.
- Worked extensively in TAC (Admin Console), scheduling jobs in the Job Conductor.
- Performed unit testing on Talend jobs.
- Scheduling the ETL mappings on daily, weekly, monthly and yearly basis.
Environment: Talend, Oracle 12c, SQL Server 12.0, Hadoop 2.6.0, Talend Administration Center (TAC), Git, UNIX shell scripting.
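The Hive loads in this role (tHiveLoad and related components) essentially issue HiveQL LOAD DATA statements against staging tables. A hedged sketch of building such a statement; the HDFS path and table name are hypothetical, and no Hive cluster is invoked here:

```shell
#!/bin/sh
# Sketch of the HiveQL a staging load generates; the statement string is
# built but not executed (a real run would pass it to beeline or hive -e).

hive_load_stmt() {
    # $1 = HDFS path of the source file, $2 = target Hive table
    printf "LOAD DATA INPATH '%s' INTO TABLE %s;" "$1" "$2"
}

hive_load_stmt /staging/orders.csv stg.orders
```

Note that LOAD DATA INPATH moves (not copies) the file into the table's warehouse directory, which is one reason staging paths are usually kept separate from landing paths.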
Confidential
ETL/Reports Developer
Responsibilities:
- Involved in requirements gathering by interacting with the users and other management personnel to get a better understanding of the business process.
- Converted Functional Requirements into Technical specifications.
- Designed and implemented SSIS packages to migrate data from heterogeneous data sources to staging database and load the data to data warehouse or data marts.
- Created reports using Microsoft SQL Server 2014 Reporting Services (SSRS), working in both Report Designer and Report Builder.
- Created T-SQL queries, complex stored procedures and user-defined functions; designed and implemented database triggers (DDL, DML), views and indexes.
- Worked on Tuning SQL queries, Maintaining Data Integrity and Data Consistency, Performance Tuning and Query optimization.
- Monitored MS SQL Server databases and tuned them using the Index Tuning Wizard, SQL Profiler and Windows Performance Monitor for optimal performance.
- Created database maintenance plans for SQL Server performance, including database integrity checks, statistics updates, re-indexing and data backups.
- Developed stored procedures for loading data into DW tables from different sources like Flat files, XML documents and different RDBMS (MySQL, Oracle).
- Optimized the performance of costly, long-running jobs by tuning SQL queries and creating indexes.
Environment: Microsoft Visual Studio 2012, Oracle 11g R2, MySQL 5.5, SQL Server 2008 R2, RHEL 5.6, Erwin Data Modeler 8, Crystal Reports XI, Apache Subversion (SVN) 1.6.
Confidential
PL/SQL Developer
Responsibilities:
- Developed customizations using PL/SQL procedures, functions and packages.
- Implemented business rules using database triggers.
- Loaded data to and from flat files and Excel sheets using import/export options in SQL Developer.
- Involved in Data migration process from MySQL to Oracle database.
- Performed data export/import tasks to move data between different DBs using Data Pump utility.
- Performed bulk loading activities using SQL*Loader utility.
- Created UNIX shell scripts to automate SQL scripts execution.
- Developed business reports using Crystal Reports.
- Used the File Transfer Protocol (FTP) to transfer files from one server to another.
- Prepared the technical documentation on changes done.
Environment: Oracle 11g, MySQL 5.1, Crystal Reports XI, Oracle SQL Developer 2.1, RHEL 5.4, TOAD v10.1
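The SQL*Loader bulk loads in this role are driven by a control file. A hedged sketch of generating one and the corresponding sqlldr invocation; the table, columns, file names and credentials are illustrative, and sqlldr itself is not executed here:

```shell
#!/bin/sh
# Sketch of a minimal SQL*Loader control file for a comma-delimited load.
# Everything below is illustrative; only the control file is written.

write_ctl() {
    # $1 = path for the control file
    cat > "$1" <<'EOF'
LOAD DATA
INFILE 'employees.csv'
APPEND INTO TABLE employees
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(emp_id, emp_name, hire_date DATE 'YYYY-MM-DD')
EOF
}

write_ctl /tmp/employees.ctl
# A real run would then be (hypothetical credentials):
#   sqlldr userid=scott/tiger control=/tmp/employees.ctl log=employees.log
```

For very large files, direct-path loading (`direct=true` on the sqlldr command line) bypasses much of the SQL engine and is the usual choice for the kind of bulk loading described above.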