
Sr. Talend Developer Resume


Branchville, NJ

SUMMARY:

  • 8+ years of IT experience in the analysis, design, development, testing, and implementation of business application systems.
  • Strong experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI, and client/server applications.
  • 3+ years of experience with Talend ETL Enterprise Edition for Big Data, Data Integration, and Data Quality.
  • Experience implementing Microsoft Business Intelligence (BI) platforms, including SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS) on SQL Server 2005.
  • Experience in Big Data technologies such as Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, DynamoDB, Elasticsearch, and Spark SQL.
  • Experienced in ETL methodology for performing data migration, data profiling, extraction, transformation, and loading using Talend; designed data conversions from a large variety of Confidential sources, including Oracle, DB2, Netezza, SQL Server, Teradata, Hive, and SAP HANA, and from non-relational sources such as flat files, XML, and mainframe files.
  • Designed tables, indexes and constraints using TOAD and loaded data into the database using SQL*Loader.
  • Involved in code migrations from Dev to QA and Production, and provided operational instructions for deployments.
  • Worked hands-on on migrating DataStage 8.7 ETL processes to Talend Studio.
  • Strong Data Warehousing ETL experience of using Informatica 9.x/8.x/7.x Power Center Client tools - Mapping Designer, Repository manager, Workflow Manager/Monitor and Server tools - Informatica Server, Repository Server manager.
  • Experience using SQL*Plus and TOAD as interfaces to databases to analyze, view, and alter data.
  • Expertise in Data Warehouse/Data mart, ODS, OLTP and OLAP implementations teamed with project scope, Analysis, requirements gathering, data modeling, Effort Estimation, ETL Design, development, System testing, Implementation and production support.
  • Hands-on experience with Pentaho Business Intelligence Server and Studio.
  • Expertise in using transformations like Joiner, Expression, Connected and Unconnected Lookups, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Java Transformation, Router, and Sequence Generator.
  • Experienced in writing Hive queries to load data into HDFS.
  • Extensive experience with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server, and BIRT Report Designer.
  • Experienced in designing ETL processes using Informatica to load data from sources to targets through data transformations.
  • Hands on experience in developing and monitoring SSIS/SSRS Packages and outstanding knowledge of high availability SQL Server solutions, including replication.
  • Hands-on experience in deploying DTS and SSIS packages.
  • Designing SSIS packages for extraction, transformation and loading of data.
  • Configured SSRS and generated reports using SSRS.
  • Excellent experience designing and developing multi-layer web-based information systems using web services, including Java and JSP.
  • Strong experience in Dimensional Modeling using Star and Snowflake Schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.
  • Extensive experience in developing Stored Procedures, Functions, Views and Triggers, Complex SQL queries using SQL Server, TSQL and Oracle PL/SQL.
  • Experienced with Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport, SQL Assistant.
  • Experienced in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Extensive experience in writing UNIX shell scripts and automating ETL processes with UNIX shell scripting; also used Netezza utilities to load data and execute SQL scripts from UNIX.
  • Proficient in the integration of various data sources with multiple relational databases like MS SQL Server 8.0/7.0, UDB, MS Access, Netezza, Teradata, VSAM files, and flat files into the staging area, ODS, Data Warehouse, and Data Mart.
  • Experienced in using Automation Scheduling tools like Autosys and Control-M.
  • Experience in data migration, consolidating data from different applications into a single application.
  • Responsible for data migration from MySQL Server to Oracle databases.
  • Experienced in batch scripting on windows and worked extensively with slowly changing dimensions.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.
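As a hedged illustration of the UNIX shell automation of ETL described above, the sketch below batches SQL scripts through the Netezza `nzsql` CLI. The directory layout, database name, and the `NZSQL` override are assumptions for illustration, not details from any specific engagement.

```shell
# Minimal sketch: run every *.sql script in a directory against Netezza,
# stopping on the first failure. NZSQL and NZ_DB are overridable so the
# runner can be dry-run or pointed at another environment (assumption).
run_sql_batch() {
    dir="$1"
    nzsql_cmd="${NZSQL:-nzsql}"   # Netezza CLI; stub-able for testing
    db="${NZ_DB:-EDW}"            # placeholder database name
    for script in "$dir"/*.sql; do
        [ -e "$script" ] || { echo "no SQL scripts in $dir" >&2; return 1; }
        echo "running $script"
        # -d selects the database, -f executes a script file
        "$nzsql_cmd" -d "$db" -f "$script" || { echo "FAILED: $script" >&2; return 1; }
    done
    echo "all scripts completed"
}
```

Wrapping each load in an exit-code check like this is what lets a scheduler (Autosys, Control-M) treat a partial failure as a failed job rather than silently moving on.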

PROFESSIONAL EXPERIENCE:

Sr. Talend Developer

Confidential, Branchville, NJ

Responsibilities:

  • Developed test transactions in adherence to specifications.
  • Analyzed and debugged program code for revision of programs.
  • Maintained detailed documentation of program development stages and coding.
  • Reviewed documents comprising of software installation and operating processes.
  • Conducted technical training session regarding program use.
  • Participated in Requirement gathering, Business Analysis, User meetings and translating user inputs into ETL mapping documents.
  • Designed and customized data models for a Data Warehouse supporting data from multiple sources in real time.
  • Involved in building the data ingestion architecture and source-to-target mapping to load data into the Data Warehouse.
  • Extensively leveraged Talend Big Data components (tHDFSOutput, tPigMap, tHive, tHDFSCon) for data ingestion and data curation from several heterogeneous data sources.
  • Worked with Data mapping team to understand the source to target mapping rules.
  • Prepared both high-level and low-level mapping documents.
  • Analyzed the requirements and framed the business logic and implemented it using Talend.
  • Involved in ETL design and documentation.
  • Developed Talend jobs from the mapping documents and loaded the data into the warehouse.
  • Involved in end-to-end Testing of Talend jobs.
  • Analyzed and performed data integration using Talend Cloud hybrid integration suite.
  • Experience in working with large data warehouses, mapping and extracting data from legacy systems, Redshift/SQL Server UDB databases.
  • Worked on the design, development and testing of Talend mappings.
  • Wrote complex SQL queries to take data from various sources and integrated it with Talend.
  • Worked on Context variables and defined contexts for database connections, file paths for easily migrating to different environments in a project.
  • Created ETL job infrastructure using Talend Open Studio.
  • Worked on Talend components like tReplace, tMap, tSortRow, tFilterColumn, tFilterRow, etc.
  • Used Database components like tMSSQLInput, tOracleOutput etc.
  • Worked with various File components like tFileCopy, tFileCompare, tFileExist.
  • Developed standards for ETL framework for the ease of reusing similar logic across the board.
  • Analyzed requirements, created designs, and delivered documented solutions adhering to the prescribed Agile development methodology and tools.
  • Developed mappings to extract data from different sources, such as DB2 and XML files, and load it into the Data Mart.
  • Created complex mappings by using different transformations like Filter, Router, lookups, stored procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Mart.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Scheduled and automated ETL processes with the scheduling tools in TIC and TAC.
  • Scheduled the workflows using Shell script.
  • Created Talend Development Standards, a document describing general guidelines for Talend developers, naming conventions to be used in transformations, and development and production environment structures.
  • Troubleshot databases, Joblets, mappings, sources, and targets to find bottlenecks and improve performance.
  • Involved rigorously in data cleansing and data validation to identify and fix corrupted data.
  • Migrated Talend mappings/Jobs/Joblets from Development to Test and to Production environments.
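The context-variable setup described above pays off at deployment time. As a minimal sketch, assuming a job exported from Talend Studio whose generated `*_run.sh` launcher accepts the standard `--context` and `--context_param` flags (the job path and parameter names here are hypothetical):

```shell
# Launch an exported Talend job under a named context (Dev/Test/Prod),
# optionally overriding individual context parameters on the command line.
run_talend_job() {
    job_script="$1"   # e.g. /opt/jobs/LoadWarehouse/LoadWarehouse_run.sh (placeholder)
    env_name="$2"     # must match a context defined in the job
    shift 2
    sh "$job_script" --context="$env_name" "$@"
}

# Example: run the Prod context but point the input directory elsewhere
# run_talend_job /opt/jobs/LoadWarehouse_run.sh Prod --context_param in_dir=/data/in
```

Keeping connections and file paths in contexts means the same job binary promotes from Dev to QA to Production with only this flag changing.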

Talend Developer

Confidential, FL- Edison, NJ

Role & Responsibilities:

  • Created Talend Mappings to populate the data into dimensions and fact tables.
  • Extensively worked on creating repositories in GIT, assigning roles to developers, and creating the same projects in TAC.
  • Broad design, development, and testing experience with the Talend Big Data Integration Suite, and knowledge of performance tuning of mappings.
  • Development and deployment of staging and Data Warehouse scripts.
  • Writing specifications for ETL processes.
  • Deployed, scheduled and Monitored Talend batch jobs using TAC.
  • Extensively used components like tWaitForFile, tIterate, tFlow, tFlowToIterate, tHashOutput, tHashInput, tMap, tRunJob, tJava, tNormalize, and tFile components to create Talend jobs.
  • Used tXMLMap to create XMLs based on an XSD and created nested-loop XMLs using the tAdvancedXml component as part of development.
  • Developed optimal strategies for distributing the web log data over the cluster, and for importing and exporting the stored web log data into HDFS and Hive using Sqoop.
  • Collected and aggregated large amounts of web log data from different sources such as webservers, mobile and network devices using Apache Flume and stored the data into HDFS for analysis.
  • Implemented Change Data Capture (CDC) technology in Talend to load deltas to the data warehouse.
  • Experienced in using Talend database components, File components and processing components based up on requirements.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Talend Integration Suite.
  • Coordinated with the offshore team, providing guidance and clarifications related to reports and underlying queries.
  • Performed validation checks and deployed reports to the customer's staging environment (Client, Business Objects).
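A CDC-style delta load of the kind described above can be sketched with Sqoop's incremental import. The connection string, table, and key column below are placeholders, and `SQOOP` is overridable so the command can be dry-run:

```shell
# Pull only rows with a key greater than the last loaded value into HDFS,
# using Sqoop's append-mode incremental import. All connection details
# are hypothetical; the last value is tracked externally (e.g. a control table).
sqoop_delta_import() {
    last_value="$1"   # highest ORDER_ID already loaded
    "${SQOOP:-sqoop}" import \
        --connect "jdbc:oracle:thin:@dbhost:1521:ORCL" \
        --username etl_user \
        --table ORDERS \
        --target-dir /data/raw/orders \
        --incremental append \
        --check-column ORDER_ID \
        --last-value "$last_value" \
        -m 4
}
```

With `--incremental lastmodified` instead of `append`, the same pattern picks up updated rows by timestamp rather than new rows by key.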

Environment: Talend DI (6.3.1/6.0), Talend BDE, HDFS, UNIX, Oracle, Microsoft SQL Server Management Studio, Windows XP

Talend Developer

Confidential - Tucson, AZ

Role & Responsibilities:

  • Interacting with the clients on a regular basis to discuss day-to-day issues and matters.
  • On-Call/Production Support provided during day-time and off-hours.
  • Acted as administrator setting up Development, QA, UAT, and PROD environments for Talend and Postgres, and documented install plans at client locations.
  • Set up an ETL framework and best practices around Talend for data integration implementations.
  • Responsible for installing Talend on multiple environments, creating projects, setting up user roles, setting up job servers, configuring TAC options, adding Talend jobs, handling job failures, on-call support, scheduling, etc.
  • Excellent experience working with tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHBaseInput, tHBaseOutput, tSqoopImport, and tSqoopExport.
  • Developed jobs to expose HDFS files as Hive tables and views depending on the schema versions.
  • Created Hive tables and partitions and implemented incremental imports to perform ad-hoc queries on structured data. Developed jobs to move inbound files to HDFS file locations based on monthly, weekly, daily, and hourly partitioning.
  • Worked extensively on the design, development, and deployment of Talend jobs to extract data, filter it, and load it into the data lake.
  • Managed and reviewed Hadoop log files; hands-on executing Linux and HDFS commands.
  • Responsible for writing Talend Routines in Java.
  • Developed an ODS/OLAP data model in Erwin and created source-to-target mapping documents.
  • Experience working with web services using tSOAP components to send XML requests and receive XML responses; expert in reading XML files in a loop and sending them to a web service endpoint to generate output XML files, and used advanced XML mappers to parse multiple loop elements.
  • Responsible for digging into PL/SQL code for investigating data issues.
  • Involved in the development of Talend Jobs and preparation of design documents, technical specification documents.
  • Implemented job parallelism in Talend BDE 6.0.1.
  • Experience working with Big data components for extracting and loading data into HDFS file system.
  • Production Support activities like application checkout, batch cycle monitoring and resolving User Queries.
  • Responsible for deploying code to different environments using GIT.
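The log review described above is easy to script. This is a minimal sketch, with the log directory and error patterns as assumptions, that flags any job log containing an error marker and returns a non-zero status so a scheduler can alert on it:

```shell
# Scan a directory of job logs for ERROR/FAILED/Exception markers.
# Prints OK or PROBLEM per log; returns 1 if any log has a problem.
scan_job_logs() {
    log_dir="$1"
    status=0
    for log in "$log_dir"/*.log; do
        [ -e "$log" ] || { echo "no logs in $log_dir" >&2; return 1; }
        if grep -Eq 'ERROR|FAILED|Exception' "$log"; then
            echo "PROBLEM: $log"
            status=1
        else
            echo "OK: $log"
        fi
    done
    return $status
}
```

The same pattern applies to files pulled down with `hdfs dfs -get` or tailed from YARN application logs; only the source of the log files changes.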

Environment: Talend DI (5.6.2), Talend BDE, UNIX, Oracle, Postgres, Jaspersoft Reports, Windows XP

Datawarehouse Developer

Confidential

Role & Responsibilities:

  • Involved in the extraction, transformation, and loading of data from 10 Confidential sources into our data warehouse.
  • Worked with the offshore team on day-to-day tasks, reviewed the work done by them, and got status updates in the daily meetings.
  • Created Jobs for Matching and Merging to cleanse the data coming from Source system.
  • Worked with different sources such as Oracle, SQL Server, and flat files.
  • Developed Informatica mappings, sessions and workflows as per the business rules and loading requirements.
  • Wrote test case scenarios and unit tested the code developed in Informatica.
  • Worked on the project documentation, prepared the source-to-target mapping specs with the business logic, and was involved in data modeling.
  • Worked on migrating data warehouses from existing SQL Server to Oracle database.
  • Optimized and Tuned SQL queries used in the source qualifier of certain mappings to eliminate Full Table scans.
  • Used Command line Shell to run the Informatica jobs.
  • Used debugger and breakpoints to view transformations output and debug mappings.
  • Implemented Performance tuning in Mappings/Jobs by identifying the bottlenecks and Implemented effective transformation Logic.
  • Created Workflows using various tasks like sessions, control, decision, e-mail, command, Mapplets/Joblets and assignment and worked on scheduling of the workflows.
  • Verified the logs to confirm that all relevant jobs completed successfully and on time, and involved in production support to resolve production issues.
  • Wrote SQL, PL/SQL, stored procedures, triggers and cursors for implementing business rules.
  • Migrated the code and release documents from DEV to QA (UAT) and to Production.
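Running Informatica workflows from the command-line shell, as mentioned above, is typically done with `pmcmd`. This hedged sketch uses placeholder service, domain, folder, and credential names, and lets `PMCMD` be stubbed for testing:

```shell
# Start an Informatica workflow and wait for it to finish so the caller's
# exit status reflects the workflow result. Service/domain/folder names and
# the credential defaults are hypothetical placeholders.
start_workflow() {
    workflow="$1"
    "${PMCMD:-pmcmd}" startworkflow \
        -sv IntSvc -d Domain_ETL \
        -u "${INFA_USER:-etl_user}" -p "${INFA_PASS:-changeme}" \
        -f DW_LOADS -wait "$workflow"
}

# Example: start_workflow wf_LOAD_SALES
```

The `-wait` flag is what makes this usable from a scheduler: without it `pmcmd` returns immediately and a failed load would still look like a successful job step.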

Environment: Talend 5.x, Oracle 10g, SQL Server, ER/Studio, TOAD, SQL Developer, Windows XP, Unix

Datawarehouse Developer

Confidential

Role & Responsibilities:

  • Involved in discussions with various business users and data analysts from the ISD team.
  • Worked in an Agile project life cycle environment.
  • Conducted source system and source table analysis to develop the necessary data.
  • Assisted the Data Architect in creating the logical/physical data models.
  • Extensively worked on Teradata tables, created Teradata BTEQ procedures and used TPT Reader/Writer utilities for Bulk Loading.
  • Created several staging, historical, and daily incremental ETL maps.
  • Worked extensively on Aggregated/Summarized data.
  • Created Informatica Mappings for Error Handling/Audit Balance control flows.
  • Prototyped ETL mappings and workflows for Slowly Changing Dimensions (SCDs) and Pushdown Optimization (PDO), with the corresponding SQL code (aggregations, transformations, rollups, inserts, updates, deletes).
  • Performance tuning and troubleshooting various Informatica mappings.

Environment: Windows 7, Linux, Informatica Power Center 9.5, Erwin, Informatica Power Exchange 9.1.0/9.5, Informatica DVO 9.5, Teradata 13, Oracle 10G, Flat Files, XML Files, ESP scheduling tool, Teradata SQL Assistant.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 8.x/9.x, Talend 6.3.1/6.1/5.x

DW Tools: Erwin, ER/Studio, MS Visio, Teradata studio, Teradata SQL Assistant

RDBMS: Oracle 10G/9i/8.x, Postgres, MS SQL Server, MS Access

Languages: SQL, PL/SQL, C, C++, VB, Shell Scripting, Java and XML.

Operating Systems: Microsoft Windows, MS-DOS, UNIX and Linux

Development Tools: TOAD, SQL Plus, SQL Developer, MS Visual Studio, Autosys
