
Sr. Talend Developer Resume


Altamonte Springs, FL

SUMMARY

  • Over 7 years of IT experience in data mining, analysis, development, and implementation of data warehouses and data services applications.
  • Expertise in designing and building data warehouses, data marts, and decision support systems using star and snowflake schema concepts.
  • Extensive experience in software development and project management with J2EE technologies, Spring, and web technologies, along with maintenance programming and testing.
  • Strong experience with big data technologies: Hadoop, HDFS, Spark, and the wider Hadoop ecosystem.
  • Good knowledge of and experience in utilizing Hibernate core interfaces, annotations, SQL, and JDBC when implementing persistence layers.
  • Experience with operating systems such as UNIX, Linux, and Windows.
  • Hands-on ETL experience performing data profiling, migration, extraction, transformation, and loading using Talend, including data conversion from source systems such as Oracle, DB2, SQL Server, Teradata, and Hive, as well as XML and mainframe files.
  • Knowledge of object-oriented programming in core Java SE, including the Collections API, threads, multithreading, generics, reflection, data structures, and algorithms.
  • Understanding of Hadoop architecture and components such as HDFS, NameNode, DataNode, JobTracker, TaskTracker, Spark, ZooKeeper, YARN, and MapReduce.
  • Experience making API calls to cloud services such as Jira and Amazon S3 from Talend Open Studio using cloud components and connectors.
  • Familiarity with CDC and daily load strategies for data warehouses and data marts, along with surrogate keys and core data warehouse concepts.
  • Knowledge of Salesforce connectors; implemented SFDC CDC for tables such as Account, Opportunity, Case History, and User.
  • Extensive experience creating sessions/tasks, worklets, and workflows with the Workflow Manager tools: Task Developer, Workflow Designer, and Worklet Designer.
  • Fully experienced in creating source-to-target documentation, identifying performance bottlenecks, and tuning ETL loads for better performance.
  • Understanding of UNIX shell scripting, FTP, and file management in various UNIX environments.
  • Expert understanding of the data warehouse project development lifecycle, including documentation of all phases of DWH projects.

TECHNICAL SKILLS

Operating systems: Windows, Mac OS X, Linux (RHEL 5/7), UNIX

ETL tools: Talend Open Studio, Talend Data Fabric, Informatica, mainframe systems

Databases: Teradata, Oracle 11g, MySQL, DB2, HBase, and MS SQL Server

Methodologies: SDLC, Agile, Waterfall

Scheduling tools: Autosys, TIC, TAC

Languages: SQL, PL/SQL, UNIX shell scripting, C++, JSP, JavaScript, HTML

Version control and CI: Subversion (SVN), TFS, GitHub, Nexus, and Jenkins

Tools: JUnit, Eclipse, Adobe PageMaker, MS Office, Sublime Text, PuTTY, Confluence, Postman, Android Studio

PROFESSIONAL EXPERIENCE

Confidential, Altamonte Springs, FL

Sr. Talend Developer

Responsibilities:

  • Implemented a hub-and-spoke data warehouse model for Confidential, used to build multiple data marts (spokes) from legacy systems. The project had two deliverables: the Oracle DW (hub) and the mart (spoke). The hub ETL team used Oracle staging structures to extract and load data from DB2 into the Oracle DW; the mart ETL moved that data from the Oracle DW into the campaign management data mart (spoke). The project followed Agile methodology to deliver the business needs.
  • Worked with business analysts to design business requirements documents and created a data dictionary used to map business requirements to attributes when designing a logical data model implementing a star schema.
  • Produced column-level report analysis for Oracle tables, primary key analysis, and cross-domain analysis.
  • Designed and developed ETL processes using Talend Open Studio (Data Integration), worked in parallel on the latest enterprise version in development, and acted as Talend administrator: creating new projects, scheduling jobs, and migrating code to higher environments.
  • Developed complex ETL mappings for stage, dimension, fact, and data mart loads, and advised the DW team on improving ETL job performance.
  • Performed data extraction from various databases and files using Talend, employing frequently used components such as tMap, tDie, tConvertType, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput & tHashOutput, tFilterRow, tAggregateRow, tFileExist, tFileList, and many more.
  • Developed, supported, and maintained ETL (Extract, Transform, and Load) processes using Talend Integration Suite, and created many complex ETL jobs for data exchange to and from database servers and other systems, including RDBMS, XML, CSV, and flat file structures.
  • Implemented error logging, error recovery, and performance enhancements, and created an audit process for various application teams.
  • Hands-on experience with many components for designing jobs; used context variables/groups to parameterize Talend jobs (a short sketch follows this list).
  • Migrated source code from lower to higher environments using Repository Manager.
  • Performed unit testing and regression testing, supported QA testing, and handled check-in/check-out of code and deployments across the various project environments.
  • Prepared ETL mapping documents for every mapping, as well as data migration documents.
  • Performed unit testing and user acceptance testing to verify that data extracted from different source systems loaded accurately into targets according to user requirements.
  • Responsible for prioritizing issues, assigning them to the production support team, and planning the deployment of the corresponding fixes.
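
As a minimal sketch of the context-variable pattern mentioned above: a Talend context group supplies per-environment values (Dev, QA, Prod) so that connection details are never hard-coded in the job. The plain-Java illustration below mimics that mechanism with a properties file; the file and parameter names are hypothetical, not taken from the project.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

// Illustration of environment-specific job parameterization, mirroring how
// Talend context groups hand host/schema values to a job at run time.
public class ContextLoader {
    public static void main(String[] args) throws IOException {
        // Pick the context file by environment name, e.g. "Dev" or "Prod".
        String env = args.length > 0 ? args[0] : "Dev";
        Properties context = new Properties();
        try (FileInputStream in = new FileInputStream("context_" + env + ".properties")) {
            context.load(in);
        }
        // In a generated Talend job these would be read as context.dbHost, etc.
        String dbHost = context.getProperty("dbHost");
        String dbSchema = context.getProperty("dbSchema");
        System.out.printf("Connecting to %s / schema %s%n", dbHost, dbSchema);
    }
}
```

In an exported Talend job the equivalent values are selected with the --context flag, so the same job binary runs unchanged in every environment.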

Environment: Talend 7.1, SQL Workbench, Git, Jenkins, Spark SQL, Putty, JIRA, Teradata, Oracle 11g, Cloudera, Hive, HDFS, Sqoop, TOAD, UNIX.

Confidential, Louisville, KY

Talend Developer

Responsibilities:

  • Worked in all phases of the development life cycle, with extensive involvement in definition and design meetings and functional and technical walkthroughs.
  • Created Talend jobs to copy files from one server to another utilizing Talend FTP components, and created and managed source-to-target mapping documents for all fact and dimension tables.
  • Utilized ETL methodologies and best practices to create Talend ETL jobs, and set up an ETL framework around Talend for the STC big data implementation.
  • Developed and deployed physical objects, including custom tables, custom views, stored procedures, and indexes, to SQL Server for the staging and data mart environments.
  • Worked extensively with tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHBaseInput, tHBaseOutput, tSqoopImport, and tSqoopExport.
  • Developed jobs to expose HDFS files as Hive tables and views depending on the schema versions; created Hive tables and partitions and implemented incremental imports to support ad-hoc queries on structured data; developed jobs to move inbound files to HDFS locations based on monthly, weekly, daily, and hourly partitioning (a sketch follows the environment line below).
  • Provided best practices and took delivery standards to the next level.
  • Worked exclusively on Talend Administration Center (TAC) for scheduling jobs and adding users.
  • Worked with Apache Spark for streaming data from large datasets.
  • Based on user requirements, worked on Standard, Big Data Spark, and Big Data Streaming jobs, and worked extensively on the design, development, and deployment of Talend jobs to extract, filter, and load data into the data lake.
  • Managed and reviewed Hadoop log files; hands-on experience executing Linux and HDFS commands.
  • Responsible for writing Talend routines in Java (see the routine sketch after this list).
  • Created source-to-target mapping documents and developed the ODS/OLAP data model in Erwin.
  • Responsible for digging into PL/SQL code to investigate data issues, and implemented job parallelism in Talend BDE 6.0.1.
  • Responsible for deploying code to different environments using Git.
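
To make the routines bullet concrete: Talend user routines are static Java helpers, generated under the routines package, that can be called from tMap and tJava expressions. The sketch below is illustrative only; the class and method names are hypothetical, not taken from the project.

```java
package routines;

import java.text.SimpleDateFormat;
import java.util.Date;

// Example user routines callable from a tMap expression, e.g.
// MyRoutines.safeFormat(row1.saleDate, "yyyy-MM-dd").
public class MyRoutines {

    // Null-safe date formatting; returns empty string instead of throwing NPE.
    public static String safeFormat(Date value, String pattern) {
        if (value == null) {
            return "";
        }
        return new SimpleDateFormat(pattern).format(value);
    }

    // Null-safe trim used to normalize incoming flat-file columns.
    public static String safeTrim(String value) {
        return value == null ? "" : value.trim();
    }
}
```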

Environment: Talend Big Data 6.0.1/6.3, Hortonworks HDP 2.3, Hive, HDFS, Sqoop, Spark, Jaspersoft Professional 6, Oracle 11g, web services (SOAP), Git, Jira, Jenkins, and AWS
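
For the partitioned incremental Hive loads described in this role, the underlying DDL pattern looks roughly like the following sketch. Table, column, and path names are hypothetical, and it assumes HiveServer2 is reachable through the standard Hive JDBC driver; the Talend tHive* components issue equivalent statements.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

// Registers an HDFS location as a partitioned external Hive table, then adds
// one new partition per incremental run instead of reloading history.
public class HivePartitionLoad {
    public static void main(String[] args) throws SQLException {
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hiveserver:10000/default", "etl_user", "");
             Statement stmt = conn.createStatement()) {
            stmt.execute(
                "CREATE EXTERNAL TABLE IF NOT EXISTS inbound_events (" +
                "  event_id BIGINT, payload STRING)" +
                " PARTITIONED BY (load_date STRING, load_hour STRING)" +
                " STORED AS TEXTFILE LOCATION '/data/inbound/events'");
            // Hourly incremental import: expose only the newly landed files.
            stmt.execute(
                "ALTER TABLE inbound_events ADD IF NOT EXISTS" +
                " PARTITION (load_date='2016-05-01', load_hour='09')" +
                " LOCATION '/data/inbound/events/2016-05-01/09'");
        }
    }
}
```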

Confidential, Charlotte, NC

Talend Developer

Responsibilities:

  • Worked with business analysts and application architects to understand the project requirements, both functional and technical.
  • Provided technical designs and reviewed them with the client teams.
  • Developed templates for dimension tables, slowly changing dimension tables, and fact tables.
  • Designed and developed ETL logic to implement CDC by tracking changes in the critical fields required by the user (see the sketch after this list).
  • Developed standard and reusable mappings and mapplets using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Router, Lookup (connected/unconnected), and Filter.
  • Identified performance issues in existing sources, targets, and mappings by analyzing the data flow and evaluating transformations, and tuned them accordingly, with extensive use of persistent cache to reduce session processing time.
  • Maintained warehouse metadata, naming standards, and warehouse standards for future application development.
  • Experienced in using Workflow Manager to create, validate, test, and run sequential and concurrent sessions, schedule them to run at specified times, and read data from different sources and write it to target databases.
  • Standardized the coding process for creating ETL jobs from the enterprise layer to the stage layer for faster delivery.
  • Developed UNIX shell scripts for automating processes.
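
A stripped-down illustration of the CDC rule referenced above: compare the incoming record to the current target row on the tracked critical fields only, and flag an update when any of them differ. The entity and field names are hypothetical.

```java
import java.util.Objects;

// Field-level change detection: only the tracked fields (name, tier) drive
// the update decision; untracked fields such as email are ignored.
public class CdcCheck {

    record Customer(String id, String name, String tier, String email) {}

    static boolean hasChanged(Customer incoming, Customer current) {
        return !Objects.equals(incoming.name(), current.name())
            || !Objects.equals(incoming.tier(), current.tier());
    }

    public static void main(String[] args) {
        Customer before = new Customer("42", "Acme", "GOLD", "a@x.com");
        Customer after = new Customer("42", "Acme", "PLATINUM", "a@x.com");
        System.out.println(hasChanged(after, before)); // true: tier changed
    }
}
```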

Environment: Informatica 8.5, Oracle 9i/10g, Teradata, SQL*Plus, Teradata SQL Assistant, Control-M, PVCS.

Confidential

Talend Developer 

Responsibilities:

  • Worked on analysis of the requirements.
  • Responsible for writing ETL scripts using the Teradata FastExport, MultiLoad, FastLoad, and BTEQ utilities.
  • Tuned the mappings for optimum performance, dependencies, and batch design.
  • Used File Manager and Teradata utilities to port test data.
  • Assisted the team with production moves and scheduling; ran extraction and load processes and monitored sessions using the CA7 scheduler.
  • Worked on code changes and supported production issues.
  • Participated in the entire life cycle of the replenishment workbench, i.e., until the component moved to production.
  • Implemented Type II slowly changing dimension techniques on changing attributes, plus retry logic to reprocess records in the next run when foreign key data was missing in the first run due to timing issues (see the sketch after this list).
  • Read XML files in a loop and sent them to a web service endpoint to generate output XML files (a sketch follows the environment line below).
  • Performed production support activities such as application checkout, batch cycle monitoring, and resolving user queries.
  • Generated reports for forecasting purposes using the BI Query tool.
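
A minimal sketch of the Type II SCD handling described above: when a tracked attribute changes, the current dimension row is closed out and a new current row is inserted, preserving history. All names here are hypothetical; in production the same logic ran as ETL against Teradata, with rejected foreign-key misses re-driven on the next run.

```java
import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

// Type II slowly changing dimension: expire the current version, insert a new one.
public class ScdType2 {

    static class DimRow {
        String naturalKey;
        String attribute;        // the tracked, slowly changing attribute
        LocalDate effectiveFrom;
        LocalDate effectiveTo;   // null while the row is the current version
    }

    static void applyChange(List<DimRow> dimension, String key, String newValue) {
        for (DimRow row : dimension) {
            if (row.naturalKey.equals(key) && row.effectiveTo == null) {
                if (row.attribute.equals(newValue)) {
                    return;                        // no change: nothing to do
                }
                row.effectiveTo = LocalDate.now(); // expire the current version
            }
        }
        DimRow fresh = new DimRow();               // insert the new current version
        fresh.naturalKey = key;
        fresh.attribute = newValue;
        fresh.effectiveFrom = LocalDate.now();
        dimension.add(fresh);
    }

    public static void main(String[] args) {
        List<DimRow> dim = new ArrayList<>();
        applyChange(dim, "42", "GOLD");     // first load: inserts version 1
        applyChange(dim, "42", "PLATINUM"); // change: expires v1, inserts v2
        System.out.println(dim.size());     // 2 rows: full history preserved
    }
}
```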

Environment: Teradata SQL Assistant, Oracle 8i, PL/SQL, Teradata, DB2, z/OS, JCL, IBM Data Studio, BI Query.
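
The XML-loop pattern from this role, sketched in plain Java: iterate over the input XML files and POST each one to a web service endpoint, writing the response back out as an XML file. The directory and endpoint URL are hypothetical placeholders.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Loop over inbound XML files, send each to the service, persist the reply.
public class XmlForwarder {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        try (DirectoryStream<Path> xmlFiles =
                 Files.newDirectoryStream(Paths.get("/data/inbound"), "*.xml")) {
            for (Path xml : xmlFiles) {
                HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://example.com/service/endpoint"))
                    .header("Content-Type", "application/xml")
                    .POST(HttpRequest.BodyPublishers.ofFile(xml))
                    .build();
                HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
                // Persist the service reply next to the input, as <name>.out.xml.
                Files.writeString(Paths.get(xml + ".out.xml"), response.body());
            }
        }
    }
}
```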
