Sr. Etl/ Talend Developer Resume

Pasadena, CA

SUMMARY

  • 8 years of experience in the IT industry spanning Software Analysis, Design, Implementation, Coding, Development, Testing, and Maintenance, with a focus on Data Warehousing applications using ETL tools such as Talend and Informatica.
  • 3 years of experience using Talend Integration Suite (6.1/5.x) / Talend Open Studio (6.1/5.x) and 2+ years of experience with Talend Admin Console (TAC).
  • Experience working with Data Warehousing concepts like Kimball/Inmon methodologies, OLAP, OLTP, Star Schema, Snowflake Schema, and Logical/Physical/Dimensional Data Modeling.
  • In-depth understanding of Gap Analysis, i.e., As-Is and To-Be business processes, and experience in converting these requirements into Technical Specifications and Test Plans.
  • Highly proficient in Agile, Test-Driven, Iterative, Scrum, and Waterfall software development life cycles.
  • Extensively used ETL methodology for performing Data Profiling, Data Migration, Extraction, Transformation, and Loading using Talend, and designed data conversions from a wide variety of source systems including Netezza, Oracle, DB2, SQL Server, Teradata, Hive, and Hana, as well as non-relational sources like flat files, XML, and Mainframe files.
  • Experience in analyzing data using HiveQL and Pig Latin in HDFS.
  • Extracted data from multiple operational sources for loading staging area, Data warehouse, Data Marts using SCDs (Type 1/Type 2/ Type 3) loads.
  • Extensively created mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlScd, tFilter, tGlobalmap, tDie, etc.
  • Extensive experience in using Talend features such as context variables, triggers, and database and flat-file connectors like tMysqlInput, tMysqlConnection, tOracle, tMSSqlInput, tMSSqlOutput, tMysqlRow, tFileCopy, tFileInputDelimited, and tFileExist.
  • Experience in using cloud components and connectors to make API calls for accessing data from cloud storage (Google Drive, SalesForce, Amazon S3, Drop Box) in Talend Open Studio.
  • Experience in creating Joblets in Talend for processes reused across most jobs in a project, such as Start Job and Commit Job steps.
  • Experience in monitoring and scheduling using Autosys, Control-M & Job Conductor (Talend Admin Console), and in UNIX (Korn & Bourne Shell) scripting.
  • Expertise in creating sub jobs in parallel to maximize the performance and reduce overall job execution time with the use of parallelize component of Talend in TIS and using the Multithreaded Executions in TOS.
  • Experienced in creating Triggers on TAC server to schedule Talend jobs to run on server.
  • Strong experience in Extraction, Transformation, loading (ETL) data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Designer, Workflow Manager, Workflow Monitor, Metadata Manager).
  • Experience in developing Informatica mappings using transformations like Source Qualifier, Connected and Unconnected Lookup, Normalizer, Router, Filter, Expression, Aggregator, Stored Procedure, Sequence Generator, Sorter, Joiner, Update Strategy, Union Transformations.
  • Strong hands-on experience using Teradata utilities like SQL Assistant, BTEQ, FastLoad, MultiLoad, FastExport, TPump & TPT.
  • Worked extensively on Error Handling, Performance Analysis and Performance Tuning of Informatica ETL Components, Teradata Utilities, UNIX Scripts, SQL Scripts etc.
  • Strong decision-making and interpersonal skills with a results-oriented dedication to goals.

TECHNICAL SKILLS

ETL Tools: Talend Open Studio (TOS) for Data Integration … Informatica Power Center 9.x/8.x/7.x

Databases: Microsoft SQL Server, Teradata & Teradata utilities (BTEQ, FASTLOAD, FASTEXPORT, MULTILOAD, TPUMP), … DB2, Hive, Sybase.

Programming: T-SQL, PL/SQL, HTML, XML.

Environment: Windows, UNIX (SunSolaris10, HP, AIX) & Linux

Scripting Languages: Korn shell script & Windows batch scripting, JavaScript

Other Tools: SQL Developer, Agility Workbench, Teradata SQL Assistant, SQL*Plus, Toad, SQL Navigator, Putty, MS-Office, VMWare Workstation.

PROFESSIONAL EXPERIENCE

Sr. ETL/ Talend Developer

Confidential, Pasadena, CA

Responsibilities:

  • Acquire and interpret business requirements, create technical artifacts, and determine the most efficient/appropriate solution design, thinking from an enterprise-wide view.
  • Worked on the Data Integration team to perform data and application integration, with the goal of moving large volumes of data efficiently and with high performance to support business-critical projects involving heavy data extraction.
  • Perform technical analysis, ETL design, development, testing, and deployment of IT solutions as needed by business or IT.
  • Participate in designing the overall logical & physical Data Warehouse/Data Mart data models and data architectures to support business requirements.
  • Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tMysqlRow, tMysqlInput, tMysqlOutput, tMSSqlInput, and many more.
  • Analyzed source data to assess data quality using Talend Data Quality.
  • Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed.
  • Worked on migration projects to move data from data warehouses on SQL Server to MySQL.
  • Optimized SQL queries for MySQL (5.7) to support MicroStrategy reports.
  • Developed and scheduled jobs in Talend Integration Suite.
  • Wrote MySQL queries for joins and other table modifications.
  • Used Talend reusable components like routines, context variables, and globalMap variables.
  • Responsible for tuning ETL mappings, workflows, and the underlying data model to optimize load and query performance.
  • Developed Talend ESB services and deployed them on ESB servers on different instances.
  • Implementing fast and efficient data acquisition using Big Data processing techniques and tools.
  • Monitored and supported the Talend jobs scheduled through Talend Admin Center (TAC).
  • Worked on Complex Data Migration from MySQL tables to JSON Structure.

Environment: Talend 6.3.1, TAC, MySQL, TOAD, Aginity, SQL Server 2012, XML, SQL, Hive, Pig, PL/SQL, HP ALM, JIRA.

Sr. ETL/Talend Developer

Confidential, Houston, TX

Responsibilities:

  • Worked with Data mapping team to understand the source to target mapping rules.
  • Analyzed the requirements and framed the business logic and implemented it using Talend.
  • Involved in ETL design and documentation.
  • Analyzed and performed data integration using Talend open integration suite.
  • Worked on the design, development and testing of Talend mappings.
  • Created ETL job infrastructure using Talend Open Studio.
  • Worked on Talend components like tReplace, tMap, tSortRow, tFilterColumn, tFilterRow, tJava, tJavaRow, tConvertType, etc.
  • Used database components like tMSSqlInput, tMSSqlRow, tMSSqlOutput, tOracleOutput, tOracleInput, etc.
  • Worked with various file components like tFileCopy, tFileCompare, tFileExist, tFileDelete, and tFileRename.
  • Worked on improving the performance of Talend jobs.
  • Created triggers for a Talend job to run automatically on server.
  • Worked on Exporting and Importing of Talend jobs.
  • Created jobs to pass parameters from child job to parent job.
  • Exported jobs to Nexus and SVN repository.
  • Observed Talend job statistics in AMC to improve performance and identify the scenarios causing errors.
  • Created Generic and Repository schemas.
  • Built a deployment job responsible for maintaining versioning of the Talend jobs deployed in the UNIX environment.
  • Developed shell scripts in UNIX environment to support scheduling of the Talend jobs.
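As a hedged illustration of the scheduling bullets above (not code from the project itself), a UNIX wrapper for a Talend job often follows the shape below. The job name, launcher path, and log layout are hypothetical, and the launcher is stubbed with an echo so the sketch is self-contained:

```shell
#!/bin/sh
# Sketch of a UNIX wrapper for scheduling an exported Talend job.
# JOB_CMD stands in for the <jobname>_run.sh launcher that a Talend
# build export produces; it is stubbed here so the script runs standalone.
JOB_NAME=SampleTalendJob
JOB_CMD="echo Running $JOB_NAME"          # e.g. /opt/jobs/${JOB_NAME}_run.sh
LOG_FILE="${JOB_NAME}_$(date +%Y%m%d).log"

# Run the job, capturing stdout and stderr for troubleshooting.
$JOB_CMD > "$LOG_FILE" 2>&1
RC=$?

# A non-zero exit code is surfaced so cron/Autosys can flag the failure.
if [ "$RC" -ne 0 ]; then
    echo "$JOB_NAME failed with exit code $RC; see $LOG_FILE" >&2
else
    echo "$JOB_NAME completed successfully"
fi
```

A scheduler (cron, Autosys, or a TAC trigger) would invoke this wrapper rather than the job launcher directly, so every run leaves a timestamped log behind.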

Environment: Talend 5.5.2, UNIX, Shell script, SQL Server, Oracle, Business Objects, ERwin, SVN, Redgate, Capterra.

Informatica/Talend ETL Developer

Confidential, Newark, NJ

Responsibilities:

  • Attended POC for Talend open studio.
  • Created the ODS jobs using Talend Open Studio.
  • Used Talend open studio to execute jobs for ODS.
  • Debugged numerous issues in Talend.
  • Worked closely with the administrators with the configuration of Talend Open studio.
  • Developed and tested all the backend programs, Informatica mappings and update processes.
  • Developed Informatica mappings to load data into various dimensions and fact tables from various source systems.
  • Developed, tested stored procedures, Cursors, Functions and Packages in PL/SQL for Data ETL.
  • Used Power Exchange along with Power Center to leverage data by avoiding manual coding on data extraction programs.
  • Responsible for developing and testing the new conformed dimensions that were used by the conformed fact.
  • Responsible for validating the Informatica mappings against the pre-defined ETL design standards.
  • Developed incremental and updateable loading through Informatica mappings.
  • Used debugger and breakpoints to view transformations output and debug mappings.
  • Documented all the mappings and the transformations involved in the ETL process.
  • Used UNIX shell scripting for scheduling tasks.
  • Extracted huge volumes of data from legacy systems and uploaded into Oracle using SQL*Loader and shell scripts.
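The SQL*Loader bullet above follows a common pattern: a shell script writes (or templates) a control file and then invokes sqlldr. The sketch below shows that shape under stated assumptions; the staging table, columns, and delimited layout are illustrative, and the actual sqlldr call is left commented out since it needs an Oracle client and credentials:

```shell
#!/bin/sh
# Sketch of the SQL*Loader load pattern: build a control file, then
# invoke sqlldr. Table and column names are illustrative only.
DATA_FILE=customers.dat
CTL_FILE=customers.ctl

# Sample pipe-delimited extract from a legacy system.
cat > "$DATA_FILE" <<'EOF'
101|Acme Corp|2020-01-15
102|Globex|2020-01-16
EOF

# Control file describing how the flat file maps onto the staging table.
cat > "$CTL_FILE" <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND INTO TABLE stg_customers
FIELDS TERMINATED BY '|'
(customer_id, customer_name, load_date DATE "YYYY-MM-DD")
EOF

# Real invocation (commented out: requires Oracle client and credentials):
# sqlldr userid="$DB_USER/$DB_PASS" control="$CTL_FILE" log=customers.log
echo "control file $CTL_FILE written"
```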

Environment: Informatica 9.1, Power Exchange, Talend Open Studio 5.1, Oracle 10g, SQL Server, ERwin, PL/SQL, UNIX shell scripts, TOAD.

Informatica Developer

Confidential

Responsibilities:

  • Reviewed and translated BRD/BSD documents into technical specification designs (TSD).
  • Developed the technical specification design (TSD) for the VA Claims track.
  • Coordinated with SME for technical clarifications.
  • Extensively worked on BTEQ scripts to load huge volumes of data into the EDW.
  • Integrated data into the EDW by sourcing it from different systems such as SAP, mainframe copybooks (COBOL files), DB2, and Teradata tables.
  • Loaded data into some of the DB2 cross-reference (X-ref) tables.
  • Developed Informatica mappings with DB2 targets to support the ESI application.
  • Loaded data into Landing Zone (LZ) Teradata tables, applied transformations, and then loaded the data into the conformed staging area (CSA).
  • Analyzed the source systems before starting the development.
  • Developed shell scripts to parameterize the date values for the incremental extracts.
  • Loaded data from the CSA to the EDW by cross-referencing the RDM codes.
  • Extracted data using BTEQ scripting along with the FastExport and MultiLoad utilities.
  • Extensively worked with Informatica 8.6.1 to extract data and load it into the LZ.
  • Coordinated with SIT team for testing different scenarios.
  • Developed unit test cases with different scenarios.
  • Implemented Audit balancing using BTEQ scripting.
  • Designed Teradata LZ tables with appropriate Primary index.
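One way to read the "parameterize the date values" bullet above: a small shell script computes the incremental extract window and writes it to a parameter file that the BTEQ or Informatica job picks up. The variable names and file layout below are assumptions for illustration, not taken from the project:

```shell
#!/bin/sh
# Sketch of parameterizing date values for an incremental extract.
# In practice LAST_RUN_DATE would come from a control table or file
# maintained by the previous successful run; it is hard-coded here.
LAST_RUN_DATE=2020-01-15
EXTRACT_TO=$(date +%Y-%m-%d)

# Write a parameter file that a BTEQ script or ETL job can source.
cat > extract_window.param <<EOF
EXTRACT_FROM=$LAST_RUN_DATE
EXTRACT_TO=$EXTRACT_TO
EOF

# The downstream job would then source it, e.g.:
#   . ./extract_window.param
#   bteq < incremental_extract.btq   # WHERE updated_ts > '$EXTRACT_FROM'
cat extract_window.param
```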

Environment: Informatica Power Center 9.1/8.6.1, Power Exchange 9.1/8.6.1, SAP R3 Power Connect/ Power Exchange, Windows, Teradata 12.x, IBM DB2 8.x, Oracle 10g, Toad for IBM DB2, BTEQ, ER studio.
