
Talend ETL Developer Resume


Orlando, FL

SUMMARY

  • 5+ years of IT experience in the analysis, design, development, testing, and implementation of business application systems.
  • Experience developing and leading end-to-end implementations of Big Data projects using Talend Big Data, with comprehensive experience in the Hadoop ecosystem, including MapReduce, Hadoop Distributed File System (HDFS), and Hive.
  • Extensive experience in ETL methodology for performing data profiling, data migration, extraction, transformation, and loading using Talend; designed data conversions from a wide variety of source systems, including Oracle, DB2, Netezza, SQL Server, Teradata, Hive, HANA, flat files, XML, Mainframe files, and ActiveMQ.
  • Good experience with relational database management systems and in integrating data from various sources such as Oracle, MS SQL Server, MySQL, and flat files.
  • Hands-on experience with the Hadoop technology stack (HDFS, MapReduce, Hive, HBase, Pig, Sqoop, Oozie, Flume, and Spark).
  • Experience with NoSQL databases such as HBase and Cassandra.
  • Excellent knowledge of the deployment process from DEV to QA, UAT, and PROD using both the deployment-group and import/export methods.
  • Excellent working experience with Waterfall and Agile methodologies.
  • Familiar with the design and implementation of the data warehouse life cycle, with excellent knowledge of entity-relationship/multidimensional modeling (star schema, snowflake schema) and Slowly Changing Dimensions (SCD Type 1, Type 2, and Type 3).
  • Debugged ETL job errors and performed ETL sanity checks and production deployments in the Talend Administration Center (TAC) using SVN.
  • Experience in troubleshooting and performance tuning at various levels (source, target, mapping, session, and system) in the ETL process.
  • Experience in converting stored-procedure logic into ETL requirements.
  • Good communication and interpersonal skills, with the ability to learn quickly, good analytical reasoning, and adaptability to new and challenging technological environments.
  • Experience in Big Data technologies such as Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, DynamoDB, Elasticsearch, and Spark SQL.
  • Designed tables, indexes, and constraints using TOAD and loaded data into the database using SQL*Loader.
  • Involved in code migrations from Dev to QA and Production, providing operational instructions for deployments.
  • Hands-on experience migrating DataStage 8.7 ETL processes to Talend Studio.
  • Strong data warehousing ETL experience using Informatica PowerCenter 9.x/8.x/7.x client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Experience using SQL*Plus and TOAD as database interfaces to analyze, view, and alter data.
  • Expertise in data warehouse/data mart, ODS, OLTP, and OLAP implementations, spanning project scope, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
  • Hands-on experience with Pentaho Business Intelligence Server/Studio.
  • Expertise in using transformations such as Joiner, Expression, Connected and Unconnected Lookups, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Java Transformation, Router, and Sequence Generator.
  • Experienced in writing Hive queries to load data into HDFS (a minimal sketch follows this list).
  • Extensive experience with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server, and BIRT Report Designer.
  • Experienced in designing ETL processes using Informatica to load data from sources to targets through transformations.
  • Hands-on experience in developing and monitoring SSIS/SSRS packages, with outstanding knowledge of high-availability SQL Server solutions, including replication.
  • Hands-on experience in deploying DTS and SSIS packages.
  • Excellent experience designing and developing multi-layer web-based information systems using web services, Java, and JSP.
  • Strong experience in dimensional modeling using star and snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.
  • Extensive experience in developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
  • Experienced with the Teradata utilities FastLoad, MultiLoad, FastExport, SQL Assistant, and BTEQ scripting (a BTEQ sketch also follows this list).
  • Experienced in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions.
  • Extensive experience in writing UNIX shell scripts to automate ETL processes, and in using Netezza utilities to load data and execute SQL scripts from UNIX.
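
A minimal sketch of the Hive-to-HDFS pattern referenced above; the table, columns, and output path are hypothetical placeholders:

    -- Export a filtered, aggregated result set from a Hive table to HDFS.
    -- Table and path names are placeholders, not taken from any actual project.
    INSERT OVERWRITE DIRECTORY '/data/exports/daily_sales'
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    SELECT store_id,
           sale_date,
           SUM(amount) AS total_amount
    FROM   sales_staging
    WHERE  sale_date = '2016-01-31'
    GROUP BY store_id, sale_date;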
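
And a minimal BTEQ sketch of the kind of Teradata scripting referenced above; the logon string, databases, and tables are assumptions:

    .LOGON tdprod/etl_user,etl_password
    -- Move staged rows into the target table (placeholder names).
    INSERT INTO dw.customer_dim
    SELECT * FROM stg.customer_stage;

    -- Abort the script with a nonzero return code if the insert failed.
    .IF ERRORCODE <> 0 THEN .QUIT 8

    -- Quick record-count check on the target.
    SELECT COUNT(*) FROM dw.customer_dim;

    .LOGOFF
    .QUIT 0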

TECHNICAL SKILLS

Development Tools: Talend 6.2, Talend ESB 6.2, Informatica, Eclipse, PuTTY, FileZilla, SQL Developer, TOAD, SQL Assistant, Aginity Workbench for Netezza, Quality Center, Harvest, SQL Server Management Studio.

Programming Languages: Core Java, SQL, PL/SQL, C, C++

Operating Platforms: MS Windows 10/2000/XP/NT, Mac OS, UNIX, and Linux.

Web Technologies: HTML, XML, CSS, XSD, JavaScript and JSON.

PROFESSIONAL EXPERIENCE

Confidential, Orlando, FL

Talend ETL Developer

Responsibilities:

  • Designed and implemented ETL for data loads from heterogeneous sources into SQL Server and Oracle target databases, covering fact tables and Slowly Changing Dimensions (SCD Type 1 and SCD Type 2); a sketch of the Type 2 pattern follows this list.
  • Utilized Big Data components such as tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHBaseInput, tHBaseOutput, tSqoopImport, and tSqoopExport.
  • Created Talend jobs to retrieve data from legacy sources and to retrieve user data from flat files on a monthly and weekly basis.
  • Wrote Hive queries to fetch data from HBase and transfer it to HDFS through Hive (see the Hive-over-HBase sketch after this list).
  • Optimized the performance of the mappings through various tests on sources, targets, and transformations.
  • Used debugger and breakpoints to view transformations output and debug mappings.
  • Developed ETL mappings for XML, CSV, and TXT sources, loading data from these sources into relational tables with Talend; developed Joblets for reusability and to improve performance.
  • Imported data from different sources such as HDFS and HBase into Spark RDDs.
  • Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs, Scala, and Python.
  • Involved in unit testing and system testing and preparing Unit Test Plan (UTP) and System Test Plan (STP) documents.
  • Migrated the code and release documents from DEV to QA (UAT) and to Production.
  • Troubleshot, debugged, and resolved Talend issues while maintaining the health and performance of the ETL environment.
  • Scheduled Talend jobs with the Talend Administration Center, setting up best practices and a migration strategy.
  • Created Talend Mappings to populate the data into dimensions and fact tables.
  • Broad design, development, and testing experience with Talend Integration Suite, with knowledge of performance tuning of mappings.
  • Experienced in Talend Data Integration, Talend Platform Setup on Windows and UNIX systems.
  • Created complex mappings in Talend 6.0.1/5.5 using tMap, tJoin, tReplicate, tParallelize, tJava, tJavaRow, tJavaFlex, tAggregateRow, tDie, tWarn, tLogCatcher, etc.
  • Created Joblets in Talend for processes used in most jobs in a project, such as the job-start and job-commit steps.
  • Experience in using Repository Manager for migration of source code from lower to higher environments.
  • Developed jobs to move inbound files to the vendor server location on monthly, weekly, and daily frequencies.
  • Implemented Change Data Capture in Talend to load deltas into a data warehouse.
  • Created jobs to perform record count validation and schema validation.
  • Created context variables to pass values throughout the process, from parent to child jobs and from child to parent jobs.
  • Developed Joblets that are reused in different processes in the flow.
  • Developed an error-logging module that captures both system and logical errors, sends email notifications, and moves failing files to error directories.
  • Provided production support by running jobs and fixing bugs.
  • Experienced in using Talend database, file, and processing components based on requirements.
  • Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, Load) processes using Talend Integration Suite.
  • Performed unit testing and integration testing after development and had the code reviewed.
  • Responsible for code migrations from Dev to QA and Production, providing operational instructions for deployments.
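
A minimal sketch of the SCD Type 2 load pattern referenced in the first bullet, in SQL Server syntax; the dimension and staging tables, columns, and schemas are hypothetical:

    -- Step 1: expire the current dimension row when a tracked attribute changed.
    UPDATE d
    SET    d.effective_end_date = GETDATE(),
           d.is_current = 0
    FROM   dbo.customer_dim AS d
    JOIN   stg.customer AS s
           ON s.customer_id = d.customer_id
    WHERE  d.is_current = 1
      AND  (s.city <> d.city OR s.segment <> d.segment);

    -- Step 2: insert a new current row for new and changed customers
    -- (changed customers no longer have a current row after step 1).
    INSERT INTO dbo.customer_dim
           (customer_id, city, segment, effective_start_date,
            effective_end_date, is_current)
    SELECT s.customer_id, s.city, s.segment, GETDATE(), NULL, 1
    FROM   stg.customer AS s
    LEFT JOIN dbo.customer_dim AS d
           ON d.customer_id = s.customer_id AND d.is_current = 1
    WHERE  d.customer_id IS NULL;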
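
Likewise, a sketch of the Hive-over-HBase pattern used to pull HBase data out to HDFS; the HBase table name, column family, and column mapping are assumptions:

    -- Map an existing HBase table into Hive, then copy it out to HDFS.
    CREATE EXTERNAL TABLE claims_hbase (
        row_key    STRING,
        member_id  STRING,
        claim_amt  DOUBLE
    )
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES (
        'hbase.columns.mapping' = ':key,info:member_id,info:claim_amt'
    )
    TBLPROPERTIES ('hbase.table.name' = 'claims');

    -- Export the mapped rows to an HDFS directory (placeholder path).
    INSERT OVERWRITE DIRECTORY '/data/claims_extract'
    SELECT * FROM claims_hbase;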

Confidential, Whippany, NJ

Sr. Talend ETL Consultant

Responsibilities:

  • Created complex mappings in Talend 6.2 using tMap, tJoin, tReplicate, tParallelize, tFixedFlowInput, tAggregateRow, tFilterRow, tIterateToFlow, tFlowToIterate, tDie, tWarn, tLogCatcher, tHiveInput, tHiveOutput, etc.
  • Designed and developed end-to-end ETL processes from various source systems into the staging area, and from staging into the data marts.
  • Designed ETL processes using Talend to load data from sources to targets through transformations.
  • Developed Talend jobs to populate claims data into the data warehouse (star schema, snowflake schema, and hybrid schema).
  • Integrated java code inside Talend studio by using components like tJavaRow, tJava, tJavaFlex and Routines.
  • Developed complex ETL jobs from various sources such as SQL Server, PostgreSQL, and flat files, and loaded them into target databases using Talend Open Studio.
  • Implemented File Transfer Protocol operations using Talend Studio to transfer files in between network folders.
  • Worked on analyzing the Hadoop cluster and various big data analytics tools, including Pig, the HBase database, and Sqoop.
  • Handled importing of data from various sources using Sqoop; performed transformations, cleaning, and filtering using Hive and MapReduce; and loaded the final data into HDFS.
  • Excellent knowledge of NoSQL databases such as MongoDB and Cassandra.
  • Developed jobs in Talend Enterprise Edition across the stage, source, intermediate, conversion, and target layers.
  • Developed UNIX shell scripts to automate and streamline existing manual procedures.
  • Migrated data from relational databases (Oracle, Teradata) and external sources to HDFS using Sqoop, Flume, and Spark.
  • Developed Talend ETL jobs to push data into Talend MDM and jobs to extract data from MDM.
  • Developed Talend ESB services and deployed them on ESB servers on different instances.
  • Developed data validation rules in Talend MDM to confirm the golden record.
  • Designed both managed and external tables in Hive to optimize performance (see the sketch after this list).
  • Experienced in creating generic schemas and creating context groups and variables to run jobs against different environments such as Dev, Test, and Prod.
  • Created complex user provisioning for company employees, run on a daily basis.
  • Experienced in working with the TAC (Talend Administration Center).
  • Developed an error-logging module that captures both system and logical errors, sends email notifications, and moves failing files to error directories.
  • Handled errors at the control-flow and data-flow levels in SSIS packages.
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings.
  • Developed mapping parameters and variables to support SQL override.
  • Created mapplets & reusable transformations to use them in different mappings.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Developed Talend jobs to load data into Hive tables and HDFS files, and jobs to integrate the Hive tables with the Teradata system.
  • Worked on different workflow tasks such as sessions, event-raise, event-wait, decision, email, command, worklets, assignment, and timer tasks, as well as workflow scheduling.
  • Performed unit testing and code reviews, and moved code into UAT and PROD.
  • Designed the Talend ETL flow to load data into Hive tables, and created Talend jobs to load data into Oracle and Hive tables.
  • Migrated the code into QA (Testing) and supported QA team and UAT (User).
  • Created detailed Unit Test Document with all possible Test cases/Scripts.
  • Worked with high volumes of data and tracked the performance of Talend job runs and sessions.
  • Conducted code reviews of teammates' work before moving the code into QA.
  • Experience in batch scripting on Windows, Windows 32-bit commands, quoting, and escaping.
  • Used Talend reusable components such as routines, context variables, and globalMap variables.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Knowledge of Teradata utility scripts such as FastLoad and MultiLoad for loading data from various source systems into Teradata.
  • Modified existing mappings for enhancements driven by new business requirements.
  • Worked on migration projects to move data from data warehouses on Oracle/DB2 to Netezza.
  • Prepared migration documents to move the mappings from development to testing and then to production repositories.
  • Configured Hive tables to load the profitability system in the Talend ETL repository and created the Hadoop connection for the HDFS cluster in the Talend ETL repository.
  • Worked as a fully contributing team member under broad guidance, with independent planning and execution responsibilities.
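
A minimal sketch of the managed-versus-external Hive table distinction mentioned above; the table names, partition column, and HDFS location are hypothetical:

    -- Managed table: Hive owns the data; DROP TABLE deletes the files.
    CREATE TABLE claims_managed (
        claim_id   STRING,
        claim_amt  DOUBLE
    )
    PARTITIONED BY (load_date STRING)
    STORED AS ORC;

    -- External table: Hive tracks only metadata; DROP TABLE leaves the
    -- underlying HDFS files in place, which suits shared staging data.
    CREATE EXTERNAL TABLE claims_external (
        claim_id   STRING,
        claim_amt  DOUBLE
    )
    PARTITIONED BY (load_date STRING)
    STORED AS ORC
    LOCATION '/data/staging/claims';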

Confidential

ETL Developer

Responsibilities:

  • Developed complex ETL mappings using Informatica to move data from the operational database to the data warehouse.
  • Developed complex mappings using Filter, Sorter, Aggregator, Lookup, Stored Procedure, Expression, Joiner, and Router transformations to populate target tables efficiently.
  • Used the Informatica client tools Designer, Workflow Manager, and Workflow Monitor.
  • Used SQL in a DB2 environment to write queries to extract data.
  • Worked on the drug re-pricing project to make the re-pricing process more efficient in handling different file formats and in analyzing the files using Informatica.
  • Used Informatica to transform, process, and load data into the data warehouse.
  • Loaded data from ODS sources into staging tables, and from staging tables into delimited flat files.
  • Wrote SQL overrides and used filter conditions in the Source Qualifier, thereby improving mapping performance in Informatica Designer (a sketch follows this list).
  • Tuned ETL mappings by identifying performance bottlenecks and deploying established solutions
  • Created ETL mappings and workflows using Informatica to extract data from flat files (CSV, text) and load it into the warehouse.
  • Created staging tables to load the data in warehouse to perform ETL tasks using Informatica
  • Worked with BAs to finalize the data model and the functional and detailed technical requirements.
  • Created technical documentation describing ETL task implementation
  • Established a process to develop and deploy code in a team
  • Worked on production support tickets to fix issues.
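
A sketch of the kind of Source Qualifier SQL override described above, pushing the join and filter down to the DB2 source instead of filtering inside the mapping; all table and column names are hypothetical:

    -- SQL override placed in the Source Qualifier so DB2 performs the
    -- join and filtering, reducing the rows pulled into the mapping.
    SELECT c.claim_id,
           c.member_id,
           d.drug_code,
           c.claim_amt
    FROM   ods.claims c
    JOIN   ods.drugs  d
           ON d.drug_id = c.drug_id
    WHERE  c.claim_status = 'ADJUDICATED'
      AND  c.claim_date >= CURRENT DATE - 30 DAYS;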

Confidential

SQL Developer

Responsibilities:

  • Created and managed schema objects such as Tables, Views, Indexes and referential integrity depending on user requirements.
  • Actively involved in the complete software development life cycle for the design of a database for a new financial accounting system.
  • Successfully implemented the physical design of the new database in MS SQL Server 2008/2005.
  • Used MS SQL Server 2008/2005 to design, implement, and manage data warehouses, OLAP cubes, and reporting solutions to improve asset management, incident management, data center services, system event support, and billing.
  • Utilized T-SQL daily in creating custom views for data and business analysis.
  • Utilized dynamic T-SQL within functions, stored procedures, views, and tables (a sketch follows this list).
  • Used the SQL Server Profiler tool to monitor the performance of SQL Server particularly to analyze the performance of the stored procedures.
  • Optimized stored procedures and functions to handle business-critical calculations.
  • Implemented data collection and transformation between heterogeneous sources such as flat files, Excel, and SQL Server 2008/2005 using SSIS.
  • Migrated all DTS packages to SQL Server Integration Services (SSIS) and modified the packages to take advantage of the advanced features of SSIS.
  • Defined Check constraints, rules, indexes and views based on the business requirements.
  • Extensively used SQL Reporting Services and Report Builder Model to generate custom reports.
  • Designed and deployed reports with drop-down menu options and linked reports.
  • Developed drill-down and drill-through reports from multidimensional objects such as star and snowflake schemas using SSRS and PerformancePoint Server.
  • Created subscriptions to deliver reports on a daily basis, and managed and troubleshot report-server issues.
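
A minimal sketch of the parameterized dynamic T-SQL pattern mentioned above; the table, columns, and parameter are hypothetical:

    -- Build the statement dynamically but pass values as parameters
    -- through sp_executesql, avoiding concatenation of user input.
    DECLARE @sql  NVARCHAR(MAX),
            @dept NVARCHAR(50) = N'Billing';

    SET @sql = N'SELECT account_id, balance
                 FROM   dbo.accounts
                 WHERE  department = @dept;';

    EXEC sys.sp_executesql @sql, N'@dept NVARCHAR(50)', @dept = @dept;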
