
Talend Developer Resume


Farmington, CT

PROFESSIONAL SUMMARY:

  • Around 7 years of experience in the IT industry spanning software analysis, design, coding, development, testing and maintenance, with a focus on data warehousing applications built with ETL tools such as Talend and Informatica.
  • 3+ years of experience using Talend Open Studio and the Talend Administration Center (TAC).
  • Highly proficient in the Agile, Scrum and Waterfall software development life cycles.
  • Extensively used ETL methodology for data profiling, data migration, extraction, transformation and loading using Talend, and designed data conversions from a wide variety of source systems including Netezza, Oracle and Teradata as well as non-relational sources such as flat files.
  • Experience in using cloud components and connectors to make API calls for accessing data from cloud platforms such as Salesforce in Talend Open Studio.
  • Experience with various Talend components such as tMap, tJava, tFileOutputDelimited, tAggregateRow, tDie, tJoin, tParallelize and tSleep.
  • Experience in creating Joblets in Talend for processes reused across most jobs, such as First Name, Last Name and Email standardization.
  • Experience in monitoring and scheduling using Job Conductor (Talend Administration Center), AutoSys and UNIX (Korn and Bourne shell) scripting.
  • Experienced in creating triggers on the TAC server to schedule Talend jobs to run on the server.
  • Strong experience in extraction, transformation and loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, Metadata Manager).
  • Experience in developing Informatica mappings using transformations such as Source Qualifier, Connected and Unconnected Lookup, Normalizer, Router, Filter, Expression, Aggregator, Stored Procedure, Sequence Generator, Sorter, Joiner, Update Strategy and Union.
  • Hands-on involvement with many of the palette components used to design jobs, and used context variables to parameterize Talend jobs.
  • Tracked daily, weekly and monthly data loads, resolved data load issues and provided timely updates to clients.
  • Experienced in Unit Testing, Code Review and Code Migration from Sandbox to Prod.
  • Involved in data analysis for source and target systems, with a good understanding of data warehousing concepts, staging tables, dimensions, facts, and Star and Snowflake schemas.
  • Experience working with Teradata utilities such as Teradata Parallel Transporter (TPT), BTEQ, FastLoad and MultiLoad.
  • Extensive knowledge with Teradata SQL Assistant.
  • Developed BTEQ scripts to load data from the Teradata staging area to the data warehouse, and from the data warehouse to data marts for specific reporting requirements (a minimal BTEQ sketch follows this list).
  • Experience in cloud data migration using AWS, Jenkins and Snowflake.
  • Proficient in the integration of various data sources with multiple relational databases such as Oracle 12c/11g/10g/9i, MySQL, Netezza and Teradata, and flat files, into the staging area, data warehouse and data marts.
  • Highly adaptive to a team environment and proven ability to work in a fast-paced environment with great communication skills.
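
A minimal sketch of the staging-to-warehouse BTEQ pattern mentioned above, assuming hypothetical table and column names (stg_db.customer_stg, dw_db.customer_dim) and a placeholder logon rather than any actual client objects:

    .LOGON tdprod/etl_user,<password>;

    /* Insert rows present in staging but not yet in the warehouse table */
    INSERT INTO dw_db.customer_dim (customer_id, first_name, last_name, email, load_dt)
    SELECT s.customer_id, s.first_name, s.last_name, s.email, CURRENT_DATE
    FROM   stg_db.customer_stg s
    LEFT JOIN dw_db.customer_dim d
           ON d.customer_id = s.customer_id
    WHERE  d.customer_id IS NULL;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;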

TECHNICAL SKILLS:

Languages: C#, Java, C/C++, HTML, SQL, PL/SQL, UNIX Shell Scripting, Python.

Databases: Oracle, Netezza, MySQL, MS SQL Server, Teradata and Snowflake.

Tools: Talend, Informatica, TAC, Jenkins, Tableau, AutoSys.

Web Technologies: HTML 5, XML, JavaScript, CSS 3, AngularJS.

SDLC Methodologies: Agile, Waterfall.

Operating Systems: Windows, Unix/Linux.

WORK EXPERIENCE:

TALEND DEVELOPER

Confidential, Farmington, CT

Responsibilities:

  • Worked on developing and enhancing ETL code for Dedupe, CRM, PRM, Consumer and New Mover implementations for several clients such as Sarasota, St. Elizabeth, Robert Wood Johnson and Ohio State University, according to the client requirements and S2T documents.
  • Participated in requirement gathering, business analysis and client meetings, and translated client inputs into ETL mapping documents.
  • Followed Agile methodologies and was actively involved in sprint planning and daily stand-ups.
  • Created ETL test data for all ETL mapping rules to test the functionality of Talend mappings.
  • Worked with the Salesforce cloud platform to edit the schema and to update, load and delete patient and consumer data on the platform.
  • Worked with SFTP for file transfer, file evaluation, creating directories, executing scripts and file archival.
  • Created views and complex queries and mapped them through ETL job implementations.
  • Performed tuning and optimization of complex SQL queries in Oracle and Netezza.
  • Created all database objects required for the jobs in Oracle and Netezza.
  • Wrote several Netezza SQL scripts to load data between Netezza tables.
  • Extensively worked on migrating all patient data from Netezza to Snowflake.
  • Modified existing Talend mappings to load to Snowflake.
  • Recreated existing Netezza objects in Snowflake (a minimal DDL and load sketch follows this list).
  • Worked in a Scrum Agile process with two-week iterations delivering new Snowflake objects and migrating data at each iteration.
  • Wrote shell scripts for File Validations and File Archive.
  • Worked on developing the custom ETL codes based on the client requirement documents and source to target documents.
  • Extensively worked with the components in the Talend palette (tMap, tJava, tFileOutputDelimited, tAggregateRow, tDie, tJoin, tParallelize and tSleep) while developing the jobs for data loads.
  • Published, deployed and executed the ETL jobs on the Talend Administration Center.
  • Provided support for ETL jobs during daily, weekly and monthly production runs and handled any issues during those runs.
  • Involved in production and deployment activities, created the deployment guide for migrating the code to production, and prepared production run books.
  • Actively communicated with the clients about the missing files, file discrepancies and data load issues.
  • Performed unit testing and communicated with data analysts during QA before moving code from Sandbox to Production.
  • Actively created, assigned and resolved JIRA tickets.
  • Expertise working in JIRA, Confluence and Bitbucket Server.
  • Took care of all production moves from Sandbox.
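
A minimal sketch of the Netezza-to-Snowflake object recreation and load pattern referenced above, assuming hypothetical table, stage and path names (EDW.PATIENT_DIM, @EDW.PATIENT_STAGE) rather than the actual project objects:

    -- Recreate a Netezza table definition in Snowflake
    CREATE TABLE IF NOT EXISTS EDW.PATIENT_DIM (
        PATIENT_ID   NUMBER(38,0),
        FIRST_NAME   VARCHAR(100),
        LAST_NAME    VARCHAR(100),
        EMAIL        VARCHAR(255),
        LOAD_DT      TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
    );

    -- Bulk-load the data extracted from Netezza through an internal stage
    COPY INTO EDW.PATIENT_DIM (PATIENT_ID, FIRST_NAME, LAST_NAME, EMAIL)
    FROM @EDW.PATIENT_STAGE/patient_extract/
    FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
    ON_ERROR = 'ABORT_STATEMENT';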

Environment: Talend, UNIX, Oracle, Netezza, Salesforce, Snowflake, OKTA, Jenkins, Java, Putty, JIRA, TAC, WinSCP, MS Excel and MySQL.

TALEND DEVELOPER/ L2

Confidential, Plano, TX

Responsibilities:

  • Involved in Reviewing the project scope, requirements, architecture diagram, proof of concept (POC) design and development guidelines on Talend.
  • Reviewed and prepared the technical design documents (TDD), and created source-to-target mappings as per the requirements.
  • Designed and implemented ETL for data loads from heterogeneous sources into Teradata and Oracle target databases, including fact loads and Slowly Changing Dimensions (SCD Type 1 and Type 2) to capture changes (an illustrative SCD Type 2 sketch follows this list).
  • Developed mappings to pull data from sources, apply transformations, and load data into target databases such as Teradata and Oracle.
  • Developed scripts for loading data into the base tables, and for loading data from source to staging and from staging to target tables, using the Teradata FastLoad, MultiLoad and BTEQ utilities.
  • Involved in various projects such as Data Migration, Data warehousing & Automation Projects.
  • Involved in data extraction from various sources such as relational databases and flat files, and loaded the data to the Salesforce platform.
  • Extensively used the tMap component for lookup and join functions, along with tJava, tOracleInput, tFileInputDelimited and other components.
  • Worked extensively on the Talend Administration Center and scheduled jobs in Job Conductor, an option that is not available in Talend Open Studio.
  • Mainly involved in performance tuning of Talend jobs.
  • Hands-on experience with many of the palette components used to design jobs, and used context variables to parameterize Talend jobs.
  • Hands on Experience in creating Generic schemas, Context Groups and Variables to run jobs against different environments like Dev, Test and Prod.
  • Worked with Parallel connectors for Parallel Processing to improve job performance while working with bulk data sources in Talend.
  • Scheduling the ETL Jobs on daily, weekly, monthly and yearly basis.
  • Experience in Agile methodology.
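
A minimal sketch of the SCD Type 2 load pattern referenced above, written as generic SQL with hypothetical table and column names (stg_db.customer_stg, dw_db.customer_dim); in the project this logic was implemented through the ETL mappings and Teradata utilities:

    -- Step 1: expire the current dimension row when a tracked attribute changed
    UPDATE dw_db.customer_dim
    SET    eff_end_dt   = CURRENT_DATE - 1,
           current_flag = 'N'
    WHERE  current_flag = 'Y'
    AND    customer_id IN (
             SELECT s.customer_id
             FROM   stg_db.customer_stg s
             JOIN   dw_db.customer_dim d
                    ON  d.customer_id  = s.customer_id
                    AND d.current_flag = 'Y'
             WHERE  COALESCE(s.email, '') <> COALESCE(d.email, ''));

    -- Step 2: insert brand-new customers and the new version of changed customers
    INSERT INTO dw_db.customer_dim
           (customer_id, email, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.customer_id, s.email, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_db.customer_stg s
    LEFT JOIN dw_db.customer_dim d
           ON  d.customer_id  = s.customer_id
           AND d.current_flag = 'Y'
    WHERE  d.customer_id IS NULL;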

Environment: Talend Enterprise for Big Data, UNIX, Oracle, Teradata, Salesforce, OKTA, MySQL, Putty, FileZilla, TAC, Java, SVN, JIRA.

ETL DEVELOPER

Confidential

Responsibilities:

  • Developed ETL and source to target mappings.
  • Developed the transformation/business logic to load data into data warehouse.
  • Created Several Informatica Mappings to populate the data into dimensions and fact tables.
  • Involved in the development of Informatica mappings, tuned them for optimum performance, and worked on dependencies and batch design.
  • Identified the fact tables and slowly changing dimension tables.
  • Extensively used Informatica PowerCenter to load data from multiple sources into the staging area (Oracle 9i); worked with pre- and post-session tasks and extracted data from the transaction system into the staging area.
  • Extensively used the Source Qualifier transformation to filter data at the source level rather than at the transformation level (an illustrative source-level filter follows this list). Created different transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookup, Filter, Stored Procedure, Update Strategy and Sequence Generator.
  • Used Debugger to test the data flow and fix the mappings.
  • Tuned the workflows and mappings.
  • Used Informatica designer for designing mappings and mapplets to extract data from Oracle sources.
  • Used various transformations such as Expression, Filter, Router, Joiner, Lookup, Update Strategy and Source Qualifier in many mappings.
  • Designed Complex mappings for Slowly Changing Dimensions using Lookup (connected and unconnected), Update strategy and filter transformations for retaining consistent historical data.
  • Wrote PL/SQL procedures for processing business logic in the database.
  • Worked on query-based and cost-based optimization.
  • Created workflows using Workflow manager for different tasks like sending email notifications, timer that triggers when an event occurs, and sessions to run a mapping.
  • Executed Workflows and Sessions using Workflow Monitor.
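
A minimal sketch of the source-level filtering mentioned above, expressed as the kind of SQL override a Source Qualifier could carry; the table and column names (src.orders) are hypothetical and used only for illustration:

    -- Filter at the source database so the mapping reads fewer rows,
    -- instead of dropping them later in a Filter transformation
    SELECT t.order_id,
           t.customer_id,
           t.order_amt,
           t.order_dt
    FROM   src.orders t
    WHERE  t.order_dt >= TRUNC(SYSDATE) - 7
    AND    t.status   = 'COMPLETE';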

Environment: Informatica PowerCenter 9.5.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), SQL, PL/SQL, MS SQL Server, Oracle 11g/10g, Flat files, Shell Scripting, UNIX, Windows.

ETL DEVELOPER

Confidential

Responsibilities:

  • Studied the existing environment and gathered the requirements by querying the clients on various aspects.
  • Identified the various data sources and the development environment.
  • Extensively worked on extracting data from various flat files (fixed-width, delimited), applying the business logic and then loading the data into the Oracle databases.
  • Developed complex mappings using transformations such as the Source qualifier, Aggregator, Expression, Lookups, Filter, Router, Sequence Generator, Update Strategy, and Joiner.
  • Used Informatica PowerCenter for extraction, transformation and loading (ETL) of data from heterogeneous source systems.
  • Participated in discussions with the business solutions team on creating and implementing plans for designs such as flow chart diagrams and conceptual and logical diagrams, and on defining terms based on the needs of the project.
  • Designed and developed complex mappings to load the Historical, Weekly and Daily files to Oracle database.
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
  • Developed PL/SQL and UNIX Shell Scripts for scheduling the sessions in Informatica.
  • Tuned SQL queries as required and created PL/SQL stored procedures, indexes and views (an illustrative tuning sketch follows this list).
  • Performed unit testing and documented the results.
  • Worked closely with the QA team during the testing phase and fixed bugs that were reported.
  • Designed the build documents for the entire release/project for production support.
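
A minimal sketch of the query-tuning and database-object work mentioned above, assuming hypothetical schema, table and column names (stg.claims, rpt.v_recent_claims):

    -- Index to support the most frequent filter and join columns
    CREATE INDEX idx_claims_member_dt
        ON stg.claims (member_id, service_dt);

    -- View exposing only the recent rows needed for reporting
    CREATE OR REPLACE VIEW rpt.v_recent_claims AS
    SELECT c.claim_id,
           c.member_id,
           c.claim_amt,
           c.service_dt
    FROM   stg.claims c
    WHERE  c.service_dt >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -3);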

Environment: Informatica PowerCenter 9.5/9.1, Oracle 11g/10g, UNIX, MS Excel 2010, AutoSys.
