
ETL Talend Developer Resume

SUMMARY

  • 9+ years of experience in ETL development using Talend Enterprise Edition for Data Integration/Data Quality.
  • Experienced in ETL methodology for data migration, data profiling, extraction, transformation, and loading using Talend; designed data conversions from a wide variety of source systems, including Oracle 10g, DB2, Netezza, SQL Server, and HANA, and non-relational sources such as flat files, XML, and mainframe files.
  • Involved in code migrations from Dev to QA and Production, and provided operational instructions for deployments.
  • Hands-on experience migrating Informatica ETL processes to Talend Studio ETL processes.
  • Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlSCD, tFilter, globalMap, tDie, etc.
  • Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, spanning project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
  • Experienced in designing ETL processes using Informatica to load data from sources to targets through data transformations.
  • Experienced in cloud technologies such as AWS S3 and Redshift.
  • Experienced in Talend administration, installation, and configuration; worked extensively with Talend Big Data to load data into HDFS, S3, Hive, and Redshift.
  • Excellent experience designing and developing multi-layer web-based information systems using web services, including Java and JSP.
  • Extensive experience developing stored procedures, functions, views, triggers, and SQL queries using SQL Server T-SQL and Oracle PL/SQL.
  • Experienced in integrating various data sources, such as Oracle (SQL, PL/SQL), Netezza, SQL Server, and MS Access, into the staging area.
  • Experienced in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Experienced in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
  • Extensively worked with TAC (Talend Administrator Console) for scheduling jobs using the execution plan.
  • Extensive experience writing UNIX shell scripts and automating ETL processes with them, and used Netezza utilities to load data and execute SQL scripts from UNIX.
  • Extensively worked with the Netezza database to implement data cleanup and performance-tuning techniques.
  • Experienced in using automation/scheduling tools such as Autosys, Control-M, and ESP.
  • Experienced in data migration, consolidating data from different applications into a single application.
  • Responsible for data migration from MySQL Server to Oracle databases.
  • Experienced in batch scripting on Windows and worked extensively with slowly changing dimensions (SCDs).
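The shell-driven ETL automation described above can be sketched as a minimal batch wrapper. Everything here is illustrative: the staging directory is a hypothetical path, and the actual database load step (e.g. a Netezza utility such as nzload) is site-specific and stubbed out with an echo.

```shell
#!/bin/sh
# Minimal sketch of a shell ETL wrapper for a delimited flat-file feed.
# STAGE_DIR and the loader invocation are hypothetical placeholders.

STAGE_DIR=${STAGE_DIR:-/tmp/etl_stage}
ARCHIVE_DIR="$STAGE_DIR/archive"

process_file() {
    file="$1"
    [ -f "$file" ] || { echo "missing input: $file" >&2; return 1; }

    # Basic profiling: record count (excluding the header row) drives a sanity check.
    rows=$(($(wc -l < "$file") - 1))
    [ "$rows" -gt 0 ] || { echo "empty feed: $file" >&2; return 1; }

    # A real job would invoke the database loader here.
    echo "loading $rows rows from $file"

    # Archive the processed feed with a date suffix for traceability.
    mkdir -p "$ARCHIVE_DIR"
    mv "$file" "$ARCHIVE_DIR/$(basename "$file").$(date +%Y%m%d)"
}
```

A scheduler such as Autosys, Control-M, or ESP would then call this wrapper once per feed.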

TECHNICAL SKILLS

Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS, LINUX.

ETL Tools: Talend DI, TOS, Informatica PowerCenter 9.x/8.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager, and Informatica Server).

Databases: Oracle 12c/11g, MS SQL Server 2012, DB2 v8.1, Netezza, AWS Redshift, S3, PostgreSQL.

Methodologies: Agile, Waterfall.

Languages: SQL, PL/SQL, UNIX, Linux, Shell scripts, JSP, Web Services, Eclipse.

Scheduling Tools: Autosys, Control-M, ESP, TAC.

Testing Tools: QTP, WinRunner, LoadRunner, Quality Center, ICDQ.

PROFESSIONAL EXPERIENCE

ETL Talend Developer

Confidential

Responsibilities:

  • Involved in creating jobs to ingest multiple client data sets using Talend Data Integration (DI) and Talend Big Data Spark job components.
  • Developed and maintained ETL (Extract, Transform, and Load) mappings to extract data from on-premises source systems and load it to the cloud.
  • Stored and read Parquet and CSV files using tFileOutputParquet, tFileInputParquet, tFileInputDelimited, and tFileOutputDelimited.
  • Ran jobs on TAC based on events from Amazon S3, using Lambda functions and AWS S3 events.
  • Experienced in using the Zeppelin tool and a Parquet viewer for data analytics and data visualization.
  • Developed jobs to extract data from Redshift using the UNLOAD command and load data into Redshift using the COPY command via the tRedshiftRow component.
  • Built Talend Big Data Spark jobs to process large volumes of data, perform the necessary transformations, and do roll-ups on huge raw files.
  • Used tHDFSConfiguration and tS3Configuration components to access data from S3 and HDFS when running jobs on Amazon EMR.
  • Monitored and supported the Talend jobs scheduled through Talend Admin Center (TAC).
  • Worked on Informatica Power Center 10.2 for Batch processing and integration of data from multiple sources.
  • Created technical mapping documents for Informatica jobs.
  • Involved in different phases of the SDLC, from low-level technical design through development and testing, and offered warranty support for the production environment.
  • Created and utilized cubes using AWS services and migrated the HDP cluster to Amazon EMR within project scope.
  • Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
  • Implemented SCD methodology including Type 1 and Type 2 changes.
  • Implemented a complex end-to-end ETL framework for the Enterprise Data Warehouse using Informatica 10.2 to pull data from source feeds, apply the necessary rules, and transform the data for easy and quick reporting.
  • Involved in requirements gathering and whiteboard analysis, and prepared technical designs for ETL workflows.
  • Developed complex jobs using transformations such as Expression, Joiner, Aggregator, and Update Strategy.
  • Developed UNIX shell scripts to automate repetitive database processes and maintained shell scripts.
  • Defined batch schedules in ESP for daily, weekly, and monthly data processing.
  • Provided maintenance and enhancement support for completed deliverables.
  • Involved in deploying jobs across all environments (DEV/QA/UAT/PROD) using Talend Repo Manager (TRM) and in resolving production issues.
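The Redshift extract/load pattern mentioned above pairs UNLOAD to S3 with COPY from S3. A hedged sketch that generates the two statements (the table, S3 prefix, and IAM role passed in are placeholders; in the actual job the generated SQL would run through a tRedshiftRow component or a psql session):

```shell
#!/bin/sh
# Sketch: generate Redshift UNLOAD and COPY statements for a given table.
# All arguments (table, S3 prefix, IAM role) are hypothetical placeholders.

make_unload_sql() {
    table="$1"; s3_prefix="$2"; iam_role="$3"
    cat <<EOF
UNLOAD ('SELECT * FROM $table')
TO '$s3_prefix'
IAM_ROLE '$iam_role'
DELIMITER '|' GZIP ALLOWOVERWRITE;
EOF
}

make_copy_sql() {
    table="$1"; s3_prefix="$2"; iam_role="$3"
    cat <<EOF
COPY $table
FROM '$s3_prefix'
IAM_ROLE '$iam_role'
DELIMITER '|' GZIP;
EOF
}
```

Keeping the delimiter and compression options identical in both statements lets the same S3 files round-trip between clusters.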

Environment: Talend DI 6.3.1/7.1.1, Informatica Power Center 10.2, Oracle 11g, PL/SQL, SQL Server, AWS Redshift, S3, TOAD, ESP Scheduler, TAC, Putty, WinSCP, XML, Flat Files, UNIX Scripting, Linux.

Talend Developer

Confidential

Responsibilities:

  • Worked with Data mapping team to understand the source to target mapping rules.
  • Involved in ETL design and documentation.
  • Created Talend development standards: a document describing general guidelines for Talend developers, naming conventions to be used in transformations, and development and production environment structures.
  • Analyzed and performed data integration using Talend open integration suite.
  • Worked on the design, development and testing of Talend mappings.
  • Created ETL job infrastructure using Talend Open Studio.
  • Worked on Talend ETL using features such as context variables and Talend components like tReplace, tMap, tSortRow, tFilterColumn, tFilterRow, tJavaRow, tConvertType, tJava, tXMLMap, tFileInputDelimited, etc.
  • Worked with various file components, such as tFileCopy, tFileCompare, tFileExist, tFileDelete, tFileRename, and tFileInputXML.
  • Worked on improving the performance of Talend jobs.
  • Experienced in Microsoft Azure Cloud Services (PaaS & IaaS), Application Insights, Document DB, Internet of Things (IoT), Azure Monitoring, Key Vault, Visual Studio Online (VSO), and SQL Azure.
  • Created triggers for a Talend job to run automatically on server.
  • Worked on Exporting and Importing of Talend jobs using Talend Admin Console.
  • Created jobs to pass parameters from child job to parent job.
  • Monitored the daily runs, weekly runs, and ad hoc runs that load data into the target systems.
  • Created Talend jobs using the dynamic schema feature.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Worked on Joblets (reusable code) and Java routines in Talend.
  • Used the Talend Admin Console Job Conductor to schedule ETL jobs on a daily, weekly, monthly, and yearly basis (Cron trigger).
  • Involved in end-to-end testing of jobs.
  • Wrote SQL queries to retrieve data from various sources and integrated them with Talend.
  • Scheduled the Talend jobs using AWS CloudWatch.
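The server-to-server file copies above (tFileCopy and the FTP components) are typically paired with an integrity check. A minimal shell sketch of that pattern, with a plain local cp standing in for the remote transfer (a real job would use scp/sftp or Talend's FTP components):

```shell
#!/bin/sh
# Sketch: copy a file and verify its integrity via checksum comparison.
# cp is a stand-in for the real transfer mechanism (scp/sftp/tFTPPut).

copy_verified() {
    src="$1"; dest="$2"
    cp "$src" "$dest" || return 1
    # cksum is POSIX; md5sum could be used where available.
    a=$(cksum < "$src"); b=$(cksum < "$dest")
    [ "$a" = "$b" ] || { echo "checksum mismatch for $dest" >&2; return 1; }
    echo "transferred $(basename "$src") OK"
}
```

Failing on mismatch lets the scheduler retry the transfer instead of loading a truncated file downstream.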

Environment: Talend Data Management Platform 6.4, DataStage, Azure, Netezza, SQL Server 2012, UNIX, XML, UNIX Scripting, Linux, Toad, Oracle 10g, SQL Server.
