
Sr. Etl/talend Developer Resume


Jacksonville, FL

SUMMARY:

  • 8+ years of IT experience in the analysis, design, development, testing and implementation of business application systems.
  • Highly skilled ETL engineer with 8+ years of software development experience in tools such as Informatica, SSIS and Talend.
  • Strong experience in the analysis, design, development, testing and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP and client/server applications.
  • 3+ years of experience with Talend Enterprise Edition for Big Data, Data Integration and Data Quality.
  • Experience in Big Data technologies like Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, DynamoDB, Elasticsearch and Spark SQL.
  • Experienced in ETL methodology for performing data migration, data profiling, extraction, transformation and loading using Talend; designed data conversions from a wide variety of source systems including Oracle 10g/9i/8i/7.x, DB2, Netezza, SQL Server, Teradata, Hive and HANA, as well as non-relational sources like flat files, XML and mainframe files.
  • Involved in code migrations from Dev to QA and Production, and provided operational instructions for deployments.
  • Worked hands-on on migrating DataStage 8.7 ETL processes to Talend Studio.
  • Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc.
  • Strong Data Warehousing ETL experience of using Informatica 9.x/8.x/7.x Power Center Client tools - Mapping Designer, Repository manager, Workflow Manager/Monitor and Server tools - Informatica Server, Repository Server manager.
  • Expertise in Data Warehouse/Data mart, ODS, OLTP and OLAP implementations teamed with project scope, Analysis, requirements gathering, data modeling, Effort Estimation, ETL Design, development, System testing, Implementation and production support.
  • Hands-on experience with Pentaho Business Intelligence Server and Studio.
  • Expertise in using transformations like Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Java Transformation, Router and Sequence Generator.
  • Experienced in writing Hive queries to load data into HDFS.
  • Extensive experience with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server and BIRT Report Designer.
  • Experienced in designing ETL processes using Informatica to load data from sources to targets through data transformations.
  • Hands-on experience in developing and monitoring SSIS/SSRS packages, with outstanding knowledge of high-availability SQL Server solutions, including replication.
  • Excellent experience in designing and developing multi-layer web-based information systems using Web Services, Java and JSP.
  • Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.
  • Expertise in working with relational databases such as Oracle 12c/11g/10g/9i/8.x, SQL Server 2012/2008/2005, DB2 8.0/7.0, UDB, MS Access, Teradata and Netezza.
  • Extensive experience in developing stored procedures, functions, views, triggers and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
  • Experienced with Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport and SQL Assistant (a minimal FastLoad sketch appears after this list).
  • Experienced in integrating various data sources like Oracle SQL/PL/SQL, Netezza, SQL Server and MS Access into the staging area.
  • Experienced in resolving ongoing maintenance issues and bug fixes; monitored Informatica sessions and performance-tuned mappings and sessions.
  • Experienced in all phases of data warehouse development, from requirements gathering through code development, unit testing and documentation.
  • Extensive experience in writing UNIX shell scripts and automating ETL processes with them, including using Netezza utilities to load and execute SQL scripts from UNIX.
  • Proficient in integrating various data sources such as Oracle 12c/11g/10g/9i, MS SQL Server, DB2, Teradata, VSAM files and flat files into the staging area, ODS, data warehouse and data marts.
  • Extensively worked with Netezza databases to implement data cleanup and performance tuning techniques.
  • Experienced in using Automation Scheduling tools like Autosys and Control-M.
  • Experience in migrating data from multiple applications into a single application.
  • Responsible for data migration from MySQL Server to Oracle databases.
  • Experienced in batch scripting on Windows and worked extensively with slowly changing dimensions.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.
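
The Teradata-utilities bullet above can be illustrated with a small shell wrapper around FastLoad. This is a minimal sketch, assuming a pipe-delimited input file and hypothetical logon, table and file names; it is not drawn from any specific engagement on this resume.

#!/bin/sh
# Minimal FastLoad sketch: bulk-load a pipe-delimited file into an
# empty Teradata staging table. All object names are hypothetical.
fastload <<'EOF'
LOGON tdprod/etl_user,etl_password;
DROP TABLE stg.customer_err1;
DROP TABLE stg.customer_err2;
SET RECORD VARTEXT "|";
DEFINE
  cust_id   (VARCHAR(18)),
  cust_name (VARCHAR(100))
FILE = /data/in/customer.dat;
BEGIN LOADING stg.customer_stg
  ERRORFILES stg.customer_err1, stg.customer_err2;
INSERT INTO stg.customer_stg (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
END LOADING;
LOGOFF;
EOF
# FastLoad exits non-zero on failure, so a scheduler such as
# Autosys or Control-M can trap the return code.
exit $?

FastLoad requires the target table to be empty, which is why it suits truncate-and-reload staging tables; MultiLoad or TPT would be the usual choice for incremental loads.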

TECHNICAL SKILLS:

Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS

ETL Tools: Talend (TOS, TIS), Informatica PowerCenter 9.x/8.x/7.x/6.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager and Informatica Server), SSIS, Ab Initio

Databases: Oracle 12c/11g/10g/9i/8i, MS SQL Server 2012/2008/2005, DB2 v8.1, Netezza, Teradata, HBase

Methodologies: Data Modeling - Logical/Physical

Dimensional Modeling: Star / Snowflake

Languages: SQL, PL/SQL, UNIX Shell Scripting, C++, SOAP UI, JSP, Web Services, JavaScript, HTML, Eclipse

Scheduling Tools: Autosys, Control-M

Testing Tools: QTP, WinRunner, LoadRunner, Quality Center, Test Director, Clear test, Clear case

PROFESSIONAL EXPERIENCE:

Confidential, JACKSONVILLE, FL

Sr. ETL/Talend Developer

Responsibilities:

  • Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to data marts.
  • Analyzed source data with Talend Data Quality to assess data quality.
  • Broad design, development and testing experience with Talend Integration Suite, and knowledge of performance tuning of mappings.
  • Developed jobs in Talend Enterprise Edition across the stage, source, intermediate, conversion and target layers.
  • Involved in writing SQL Queries and used Joins to access data from Oracle, and MySQL.
  • Solid experience implementing complex business rules by creating reusable transformations and robust mappings using Talend components like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.
  • Developed Talend jobs to populate the claims data to data warehouse - star schema.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Used tStatsCatcher, tDie and tLogRow to create a generic joblet that stores processing stats into a database table to record job history.
  • Integrated Java code inside Talend Studio by using components like tJavaRow, tJava, tJavaFlex and routines.
  • Experienced in using Talend's debug mode to debug jobs and fix errors; created complex mappings using tHashOutput, tHashInput, tNormalize, tDenormalize, tMap, tUniqRow, tPivotToColumnsDelimited, etc.
  • Used the tRunJob component to run a child job from a parent job and to pass parameters from parent to child.
  • Created context variables and groups to run Talend jobs against different environments (see the launcher sketch after this list).
  • Used the tParallelize component and the multi-thread execution option to run subjobs in parallel, increasing job performance.
  • Implemented FTP operations using Talend Studio to transfer files between network folders as well as to FTP servers, using components like tFileCopy, tFileArchive, tFileDelete, tCreateTemporaryFile, tFTPDelete, tFTPCopy, tFTPRename, tFTPPut, tFTPGet, etc.
  • Experienced in building Talend jobs outside of Talend Studio as well as on the TAC server.
  • Experienced in writing expressions within tMap as per business needs; handled insert and update strategies using tMap. Used ETL methodologies and best practices to create Talend ETL jobs.
  • Extracted data from flat files and databases, applied business logic, and loaded the results into the staging database as well as flat files.
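
As the context-variables bullet above notes, jobs exported from a Talend Studio build ship with a generated *_run.sh launcher that accepts --context and --context_param switches. A minimal wrapper sketch, assuming a hypothetical job named LoadClaimsDW built with DEV/QA/PROD context groups and a load_date context parameter:

#!/bin/sh
# Run an exported Talend job against a chosen environment.
# JOB_HOME points at the unzipped build; all names are hypothetical.
JOB_HOME=/opt/etl/jobs/LoadClaimsDW
ENV=${1:-DEV}    # context group: DEV | QA | PROD

"$JOB_HOME/LoadClaimsDW_run.sh" \
    --context="$ENV" \
    --context_param load_date="$(date +%Y-%m-%d)" \
    --context_param src_dir=/data/incoming
rc=$?

if [ $rc -ne 0 ]; then
    echo "Talend job LoadClaimsDW failed with exit code $rc" >&2
fi
exit $rc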

Environment: Talend 6.2, Oracle 11g, Teradata SQL Assistant, HDFS, MS SQL Server 2012/2008, PL/SQL, Shell Scripts, AutoSys, SVN.

Confidential, NJ

Sr. ETL/Talend Developer

  • Involved in end-to-end development of the implementation and rollout.
  • Implemented File Transfer Protocol (FTP) operations using Talend Studio to transfer files between network folders.
  • Worked on data migration using export/import.
  • Created Talend jobs using the dynamic schema feature.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Used a wide range of Talend components in job designs, including tJava, tOracle, tXMLMap, delimited file components, tLogRow and logging components.
  • Worked on joblets (reusable code) and Java routines in Talend.
  • Implemented error handling in Talend to validate data integrity and data completeness for the data from flat files.
  • Coordinated with the business to gather requirements and prepared Functional Specification documents.
  • Created Talend development standards: a document describing general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
  • Worked on Talend ETL and used features such as Context variables, Database components like tMSSQLInput, tOracleOutput, file components, ELT components etc.
  • Involved in automating the FTP process in Talend and transferring the files via FTP on UNIX.
  • Optimized the performance of mappings through various tests on sources, targets and transformations.
  • Used the Talend Administration Center (TAC) Job Conductor to schedule ETL jobs on a daily, weekly, monthly and yearly basis (cron trigger).
  • Involved in end-to-end testing of jobs.
  • Wrote complex SQL queries to take data from various sources and integrated it with Talend.
  • Involved in designing logical/physical data models and reverse engineering the entire subject area across the schema.
  • Developed over 90 mappings to support the business logic including the historical data for reporting needs.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Used transformations like Router, Update Strategy, Lookups, Normalizer, Filter, Joiner and Aggregator.
  • Developed Type 1 and Type 2 mappings for current and historical data (see the SCD sketch after this list).
  • Incorporated business logic for Incremental data loads on a daily basis.
  • Wrote complex PL/SQL procedures for specific requirements.
  • Used Parameter Variables and Mapping variables for incremental data feeds.
  • Used Shared folders for Source, Targets and Lookups for reusability of the objects.
  • Scheduled Informatica jobs with the third-party scheduling tool Autosys.
  • Involved in migrating Informatica from version 8.6 to 9.6.
  • Performed the administrator role in migrating objects from one environment to another (DEV/QA/PROD).
  • Provided on-call support for production maintenance.
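
The Type 1/Type 2 bullet above follows the standard expire-then-insert pattern. A minimal SCD Type 2 sketch in plain SQL run through sqlplus, assuming hypothetical dim_customer and stg_customer tables keyed on cust_id with one tracked attribute:

#!/bin/sh
# SCD Type 2 sketch against Oracle; table, column and sequence
# names are hypothetical illustrations, not project objects.
sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'SQL'
WHENEVER SQLERROR EXIT FAILURE
-- Step 1: expire the current row for every key whose tracked
-- attribute changed in staging.
UPDATE dim_customer d
   SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.cust_id = d.cust_id
                  AND s.cust_addr <> d.cust_addr);

-- Step 2: insert a new current version for new keys and for the
-- keys just expired in step 1.
INSERT INTO dim_customer
       (cust_key, cust_id, cust_addr, eff_start_dt, eff_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.cust_addr,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                    WHERE d.cust_id = s.cust_id
                      AND d.current_flag = 'Y');
COMMIT;
SQL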

Environment: Talend 6.4, Informatica PowerCenter 8.6/9.6, Oracle 11g, DB2 UDB, SQL Server 2008, Netezza, MySQL, SQL, PL/SQL, TOAD, XML, flat files, UNIX, Autosys.

Confidential, Alpharetta GA

Sr. ETL Developer/Talend Developer

  • Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.
  • Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSQLInput and many more.
  • Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
  • Designed ETL processes using Talend to load data from sources to targets through data transformations.
  • Extensive experience with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server and BIRT Report Designer.
  • Developed advanced Oracle stored procedures and handled SQL performance tuning.
  • Involved in creating mapping documents with the transformation logic for implementing a few enhancements to the existing system.
  • Monitored and supported Talend jobs scheduled through the Talend Administration Center (TAC).
  • Developed Talend mappings using various transformations, sessions and workflows; Teradata was the target database, and the sources were a combination of flat files, Oracle tables, Excel files and a Teradata database.
  • Loaded data into Teradata target tables using Teradata utilities (FastLoad, MultiLoad and FastExport) and queried the target database using Teradata SQL and BTEQ for validation.
  • Used Talend to Extract, Transform and Load data into Netezza Data Warehouse from various sources like Oracle and flat files.
  • Created connections to databases like SQL Server, Oracle and Netezza, as well as application connections.
  • Created mapping documents to outline data flow from sources to targets.
  • Prepared Talend job-level LLD documents and worked with the modeling team to understand the Big Data Hive table structures and physical design.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Responsible for the development, support and maintenance of ETL (Extract, Transform and Load) processes using Talend.
  • Maintained source definitions, transformation rules and target definitions using Informatica Repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings.
  • Developed mapping parameters and variables to support SQL override.
  • Developed Talend ESB services and deployed them on ESB servers on different instances.
  • Created mapplets and reusable transformations to use in different mappings.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Developed Talend jobs to load data into Hive tables and HDFS files, and to integrate Hive tables with the Teradata system (a Hive load sketch follows this list).
  • Worked on different tasks in workflows like sessions, event raise, event wait, decision, e-mail, command, worklets, assignment, timer and workflow scheduling.
  • Performed unit testing and code reviews, and moved code into UAT and PROD.
  • Designed the Talend ETL flow to load data into Hive tables and created Talend jobs to load data into Oracle and Hive tables.
  • Migrated code into QA (testing) and supported the QA team and UAT (user acceptance testing).
  • Created detailed unit test documents with all possible test cases/scripts.
  • Worked with high volumes of data and tracked performance analysis of Talend job runs and sessions.
  • Conducted code reviews of work developed by my teammates before moving the code into QA.
  • Experienced in batch scripting on Windows, including 32-bit Windows commands, quoting and escaping.
  • Used Talend reusable components like routines, context variable and globalMap variables.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Knowledge of Teradata utility scripts like FastLoad and MultiLoad to load data from various source systems into Teradata.
  • Modified existing mappings for enhancements of new business requirements.
  • Worked on migration projects to migrate data from Oracle/DB2 data warehouses to Netezza.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Configured Hive tables to load the profitability system in the Talend ETL repository and created the Hadoop connection for the HDFS cluster in the Talend ETL repository.
  • Worked as a fully contributing team member, under broad guidance, with independent planning and execution responsibilities.
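
A minimal sketch of the Hive loads referenced above: land a delimited extract in HDFS and expose it to Hive before downstream Talend jobs pick it up. The database, table and path names are hypothetical.

#!/bin/sh
# Stage a pipe-delimited extract in HDFS and load it into Hive.
SRC=/data/out/profitability.dat
HDFS_DIR=/user/etl/stage/profitability

hdfs dfs -mkdir -p "$HDFS_DIR"
hdfs dfs -put -f "$SRC" "$HDFS_DIR/"

hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS stage.profitability (
    acct_id    STRING,
    period_mth STRING,
    net_rev    DECIMAL(18,2)
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '${HDFS_DIR}';

-- Materialize into the warehouse table downstream jobs read from.
INSERT OVERWRITE TABLE dw.profitability
SELECT acct_id, period_mth, net_rev FROM stage.profitability;
"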

Environment: Talend 6.1, TOS, TIS, Hive, Pig, Hadoop 2.2, Sqoop, PL/SQL, Oracle 12c/11g, Erwin, Autosys, SQL Server 2012, Teradata, Netezza, Sybase, SSIS, UNIX.

Confidential, Indianapolis, IN

Informatica Developer

  • As a consultant, studied the existing data marts to understand and integrate the new source of data.
  • Managed the offshore support group in India for support issues as well as small enhancements to the data warehouse.
  • Prepared weekly status reports and coordinated weekly status calls with the technology lead and the business.
  • Designed and created new Informatica jobs to implement new business logic into the existing process.
  • Used Informatica modules (Repository Manager, Designer, Workflow Manager and Workflow Monitor) to accomplish the end-to-end ETL process.
  • Performed data profiling on sources during mapping development to analyze the content, quality and structure of source data.
  • Created required scripts/transformations to extract the source data from various sources such as Oracle, Flat Files etc.
  • Used all the complex functionality of Informatica (Mapplets, Stored Procedures, Normalizer, Update Strategy, Router, Joiner, Java, SQL Transformation, etc.) to interpret the business logic into the ETL mappings.
  • Designed and developed complex aggregate, joiner, lookup transformations to implement the business rules in the ETL mappings to load the target Facts and Dimensions.
  • Defined a Target Load Order Plan for loading data into target tables.
  • Used Mapplets and reusable transformations to prevent redundancy of transformation usage and improve maintainability.
  • Created complex Informatica mappings, as well as simple mappings with complex SQL, based on the needs of the business user.
  • Used Informatica features to implement Type 1, 2 and 3 changes in slowly changing dimensions and Change Data Capture (CDC).
  • Created various database triggers; created and configured workflows, worklets and sessions to transport data to target systems using Informatica Workflow Manager.
  • Fine-tuned the session performance using Session partitioning for long running sessions.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Used Versioning, Labels and Deployment group in the production move process.
  • Automated workflows with UNIX scripts using pmcmd and pmserver commands.
  • Setup Permissions for Groups and Users in all Environments (Dev, UAT and Prod).
  • Created tables, views, primary keys, indexes, constraints, sequences, grants and synonyms.
  • Developed optimized PL/SQL code for server-side packages to centralize the application; procedures containing PL/SQL were created, stored in the database, and fired when the contents of the database changed.
  • Used debugger to test the mapping and fixed the bugs.
  • Conducted design and code reviews and produced extensive documentation of standards, best practices and ETL procedures.
  • Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Sessions and scheduling them to run at specified time.
  • Developed Oracle Stored Procedures, Packages and Functions and utilized in ETL Process.
  • Handled performance tuning of Informatica mappings at various levels to meet the established standard throughput.
  • Analyzed the Target Data mart for accuracy of data for the pre-defined reporting needs.
  • Wrote complex SQL queries to interpret the reporting needs into the ETL process, and worked on SQL tuning to achieve maximum throughput.
  • Assisted in all aspects of the project to meet the scheduled delivery time.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Wrote UNIX shell scripts to work with flat files, define parameter files, and create pre- and post-session commands (see the pmcmd sketch after this list).
  • Used Autosys Tool to schedule shell scripts and Informatica jobs.
  • Performed Unit and Grid Integration Testing to validate results with end users.
  • Worked as a part of a team and provided 7 x 24 production support.
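
The parameter-file and pmcmd bullets above combine naturally into one wrapper. A minimal sketch with hypothetical service, domain, folder and workflow names; pmcmd's -pv switch reads the password from the named environment variable:

#!/bin/sh
# Build a session parameter file, then launch the workflow and wait.
PARAMFILE=/opt/etl/param/wf_daily_load.param

cat > "$PARAMFILE" <<EOF
[DW_FOLDER.WF:wf_daily_load]
\$\$LOAD_DATE=$(date +%Y-%m-%d)
\$\$SRC_DIR=/data/incoming
EOF

pmcmd startworkflow \
    -sv INT_SVC_DEV -d DOM_DEV \
    -u "$INFA_USER" -pv INFA_PASS \
    -f DW_FOLDER -paramfile "$PARAMFILE" \
    -wait wf_daily_load \
  || { echo "wf_daily_load failed" >&2; exit 1; }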

Environment: Talend 5.5.1, Informatica PowerCenter 9.5, Erwin, MS Visio, Oracle 11g, SQL, PL/SQL, Oracle SQL Developer, SQL Server 2008, flat files, XML, mainframe COBOL files, Autosys, UNIX Shell Scripting, Subversion.

Confidential, Nashville, TN

Informatica Developer

  • Performed extraction, transformation and loading into the database using Informatica; involved in the logical and physical modeling of the drugs database.
  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files to target Oracle Data Warehouse database.
  • Based on the requirements, created functional design documents and technical design specification documents for ETL.
  • Created tables, views, indexes, sequences and constraints.
  • Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
  • Transferred data to the database using SQL*Loader (a SQL*Loader sketch follows this list).
  • Involved in testing of Stored Procedures and Functions. Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Implemented SCD methodology including Type 1 and Type 2 changes.
  • Extracted and loaded data from legacy systems, Oracle, and SQL Server sources.
  • Involved in design and development of data validation, load process and error control routines.
  • Used pmcmd to run workflows and created cron jobs to automate the scheduling of sessions.
  • Involved in ETL process from development to testing and production environments.
  • Analyzed the database for performance issues and conducted detailed tuning activities for improvement.
  • Generated monthly and quarterly drugs inventory/purchase reports.
  • Coordinated database requirements with Oracle programmers and wrote reports for sales data.
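
A minimal sketch of the SQL*Loader transfer mentioned above, with the kind of cron line used to automate it; the table, file and connect-string names are hypothetical:

#!/bin/sh
# Nightly flat-file load via SQL*Loader. Scheduled from cron, e.g.:
#   0 2 * * * /opt/etl/bin/load_drugs.sh >> /var/log/etl/load_drugs.log 2>&1
CTL=/tmp/drugs_load.ctl

cat > "$CTL" <<'EOF'
LOAD DATA
INFILE '/data/in/drugs.csv'
APPEND INTO TABLE drugs_stg
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(drug_id, drug_name, qty_on_hand, unit_cost)
EOF

sqlldr userid="$ORA_CONN" control="$CTL" \
       log=/var/log/etl/drugs_load.log bad=/var/log/etl/drugs_load.bad
exit $?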

Environment: Informatica PowerCenter 7.1, Oracle 9i, SQL Server 2005, XML, SQL, PL/SQL, UNIX Shell Scripting.

Confidential, Rockaway, NJ

ETL Developer

  • Involved in extracting the data from the Flat Files and Siebel data into staging area.
  • Implemented Slowly Changing Dimension (SCD Type 1 and Type 2) designs for the data warehouse.
  • Developed several Complex Informatica Mappings, Mapplets and Reusable Transformations for other mappings.
  • Developed PL/SQL procedures to process business logic in the database and used them as Stored Procedure transformations.
  • Tuned SQL queries, procedures, functions and packages using EXPLAIN PLAN (see the sketch after this list).
  • Extracted data from source systems to a staging database running on Teradata using utilities like MultiLoad and FastLoad.
  • Extensively worked on performance tuning of programs, ETL procedures and processes; performed error checking and testing of the ETL procedures and programs using Informatica session logs.
  • Involved in fixing invalid Mappings, Unit and Integration Testing of Informatica Tasks, Workflows and the Target Data.
  • Created and scheduled sessions; jobs were scheduled in DAC on an on-demand, timed, and run-once basis.
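
The EXPLAIN PLAN bullet above follows a simple loop: generate the plan, read it for full scans and bad join orders, adjust the query or indexes, and repeat. A minimal sketch via sqlplus, with a hypothetical query and connect string:

#!/bin/sh
# Show the optimizer's plan for a candidate query.
sqlplus -s "$ORA_CONN" <<'SQL'
EXPLAIN PLAN FOR
SELECT o.order_id, c.cust_name
  FROM orders o
  JOIN customers c ON c.cust_id = o.cust_id
 WHERE o.order_dt >= TRUNC(SYSDATE) - 7;

-- Read the plan: look for unintended full scans, join order, cost.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
SQL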

Environment: Informatica Power Center 8.6/8.5, Oracle9i, SAP, SAP BI 7.0, SQL Server, Sun Solaris.
