Talend Developer Resume

Chicago, IL

SUMMARY

  • Over 8 years of IT experience in software analysis, design, and development of client-server applications, providing Business Intelligence solutions in data warehousing for decision support systems and OLAP/OLTP application development, with data analysis across various industries in the role of a Talend/DataStage ETL expert.
  • Expertise with frequently used Talend Data Integration 6.2.1 components (tOracleInput, tMysqlInput, tMap, tSoap, tESBConsumer, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput & tHashOutput, tDie, and more).
  • Experienced in working with the Hortonworks distribution of Hadoop: HDFS, MapReduce, Hive, Sqoop, Flume, Pig, HBase, and MongoDB.
  • Participated in all phases of development life-cycle with extensive involvement in the definition and design meetings, functional and technical walkthroughs.
  • Good experience in all phases of software development life cycle (SDLC) including system design, development, integration, testing, deployment and delivery of applications.
  • Experience in Big Data technologies like Hadoop/MapReduce, Pig, Hive, and Sqoop.
  • Experience in all aspects of the Software Development Life Cycle (SDLC), serving multiple clients in healthcare and insurance.
  • Well versed with Relational and Dimensional Modelling techniques like Star Schema, Snowflake Schema, Fact and Dimensional Tables.
  • Extensive experience in integrating heterogeneous data sources such as SQL Server, Oracle, Teradata, flat files, and Excel files, and loading the data into data warehouses and data marts using Talend Studio.
  • Experience in dealing with structured and semi-structured data in HDFS.
  • Experienced in Talend Service Oriented Web Services using SOAP, REST and XML/HTTP technologies using Talend ESB components.
  • Experienced in scheduling Talend jobs using Talend Administration Console (TAC).
  • Excellent understanding and knowledge of NOSQL databases like HBase and Cassandra.
  • Created joblets in Talend for processes reused across most jobs in a project, such as Start Job and Commit Job.
  • Expert in using Talend troubleshooting and DataStage to understand errors in jobs; used the tMap/expression editor to evaluate complex expressions and inspect the transformed data to resolve mapping issues.
  • Expertise in Informatica MDM Hub Match and Merge Rules, Batch Jobs and Batch Groups.
  • Familiar with data architecture, including data ingestion, pipeline design, Hadoop information architecture, data modelling, data mining, machine learning, and advanced data processing, with experience optimizing ETL workflows.
  • Experience in the Implementation of full lifecycle in Data warehouse, ODS and Data marts with Dimensional modeling techniques, Star Schema and Snowflake Schema.
  • Good knowledge of DataStage client components: DataStage Director, DataStage Manager, and DataStage Designer.
  • Developed efficient mappings for data extraction/transformation/loading (ETL) from different sources to a target data warehouse.
  • Extensive experience in using Talend features such as context variables, triggers, connectors for Database and flat files.
  • Strong knowledge in implementing Change Data Capture (CDC) and Slowly Changing Dimensions (SCD) Types 1, 2, and 3.
  • Created test plans, test data for extraction and transformation processes and resolved data issues following the data standards.
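
As a rough illustration of the flat-file Change Data Capture work listed above, a minimal shell sketch can diff two sorted extracts; the file names, the pipe-delimited layout, and the comm-based approach are illustrative assumptions, not details from this resume:

```shell
#!/bin/sh
# Sketch: file-based CDC on flat-file extracts. Compare yesterday's and
# today's sorted key|value dumps and emit the changed rows.
WORK_DIR="$(mktemp -d)"

# Demo extracts (key|attribute); in practice these come from the source system.
printf '1|alice\n2|bob\n3|carol\n'   > "$WORK_DIR/extract_prev.txt"
printf '1|alice\n2|robert\n4|dave\n' > "$WORK_DIR/extract_curr.txt"

# comm requires sorted input.
sort "$WORK_DIR/extract_prev.txt" -o "$WORK_DIR/prev.sorted"
sort "$WORK_DIR/extract_curr.txt" -o "$WORK_DIR/curr.sorted"

# -13 suppresses lines unique to prev and lines common to both,
# leaving only rows new or changed in today's extract.
comm -13 "$WORK_DIR/prev.sorted" "$WORK_DIR/curr.sorted"
```

The rows emitted by `comm -13` are the inserts/updates a downstream ETL job would then apply to the target.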

TECHNICAL SKILLS

Programming Languages: PL/SQL, SQL, Java, Scala

Tools and Technologies: Talend 6.2, TIBCO Spotfire, Informatica, FileZilla, Netezza, Harvest, and SQL Server Management Studio

Big Data Ecosystems: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop

Databases: Oracle 11g/10g, SQL Server, Teradata, DB2 and Greenplum.

DB Tools: SQL Developer, TOAD, PL/SQL Developer.

Scripting: Unix Shell Script, Visual Basic, HTML, XML, JavaScript

Operating Systems: Unix/Linux, HP-UX, Windows.

ETL Tools: Talend Enterprise Edition

Web Technologies: HTML, XML, CSS, JavaScript and JSON

PROFESSIONAL EXPERIENCE

Talend Developer

Confidential, Chicago, IL

Responsibilities:

  • Created complex jobs using transformations such as tMap, Oracle components, tLogCatcher, tStatCatcher, tFlowMeterCatcher, file-delimited components, and error-handling components (tWarn, tDie).
  • Designed the Data model and Load strategy to get data from different systems and create a Data Lake for TCS Aviation.
  • Data ingestion to and from HDFS and HAWQ/Teradata for storage and analytics with end user reporting.
  • Implemented File Transfer Protocol operations using Talend Studio to transfer files in between network folders.
  • Developed complex ETL mappings for Stage, Dimensions, Facts and Data marts load
  • Design, develop, test, implement and support of Data Warehousing ETL using Talend and Hadoop Technologies.
  • Responsible for MDM of customer data using Talend MDM, covering customers, suppliers, products, assets, agencies, stores, address standardization, reference data, and employees; MDM is about creating and managing the golden records of the business.
  • Configured match rule set property by enabling search by rules in MDM according to Business Rules.
  • Interacted with business team to understand business needs and to gather requirements.
  • Designed target tables as per the requirement from the reporting team and also designed Extraction, Transformation and Loading (ETL) using Talend.
  • Created Talend jobs to populate the data into dimensions and fact tables.
  • Created Talend jobs to load data into various Oracle tables. Utilized Oracle stored procedures and wrote Java code to capture global map variables and use them in the job.
  • Loaded and transformed data into HDFS from large sets of structured data in Oracle/SQL Server using Talend Big Data Studio.
  • Used Big Data components (Hive components) for extracting data from hive sources.
  • Develop stored procedures/views in Snowflake and use in Talend for loading Dimensions and Facts.
  • Developed simple to complex MapReduce jobs using Hive and Pig for analyzing the data.
  • Designed, developed and improved complex ETL structures to extract transform and load data from multiple data sources into data warehouse and other databases based on business requirements.
  • Develop merge scripts to UPSERT data into Snowflake from an ETL source.
  • Responsible for understanding and deriving new requirements from Business Analysts/Stakeholders.
  • Identified performance issues in existing sources, targets and Jobs by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Performed migration of mappings and workflows from Development to Test and to Production Servers.
  • Supported team using Talend as ETL tool to transform and load the data from different databases.
  • Wrote ETL jobs to read from web APIs using REST and HTTP calls and loaded into HDFS using java and Talend.
  • Worked on designing the table layout in Hive and scripts to write the data into Hadoop.
  • Responsible for modelling the new requirements based on BDD method for ELT applications.
  • Created many complex ETL jobs for data exchange with the database server and various other systems, including RDBMS, XML, CSV, and flat-file structures.
  • Created and reviewed scripts to create new tables, views, queries for new enhancement in the applications using TOAD.
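
Exported Talend jobs ship with a launcher script; as a hedged sketch of the kind of wrapper used to run and log such a job from the command line, the following stands in (the job path is a hypothetical placeholder, and a stub command is substituted so the sketch runs standalone):

```shell
#!/bin/sh
# Sketch: run an exported Talend job and record its outcome in a log.
# Default command is a stub; a real call would look like
#   /opt/talend/jobs/load_dims/load_dims_run.sh   (hypothetical path)
JOB_CMD="${1:-echo running-stub-job}"
LOG_FILE="${2:-$(mktemp)}"

echo "START $(date '+%Y-%m-%d %H:%M:%S')" >> "$LOG_FILE"
if sh -c "$JOB_CMD" >> "$LOG_FILE" 2>&1; then
    STATUS=SUCCESS
else
    STATUS=FAILED
fi
echo "END $STATUS $(date '+%Y-%m-%d %H:%M:%S')" >> "$LOG_FILE"
echo "$STATUS"
```

The `END SUCCESS|FAILED` trailer gives downstream monitoring scripts a single line to grep for.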

Talend Developer

Confidential, Milwaukee, WI

Responsibilities:

  • Design and Implement ETL for data load from Source to target databases and for Fact and Slowly Changing Dimensions (SCD) Type1, Type 2, Type 3 to capture the changes.
  • Extensive experience in extraction of data from various sources like relational databases Oracle, SQL Server, and Flat Files.
  • Continuous Delivery/Continuous Integration (CD/CI) using Jenkins/CloudBees, and hosting of the Quality Control app.
  • Analyzed the requirements and framed the business logic for the ETL process using Talend.
  • Design, develop, test, implement and support of Data Warehousing ETL using Talend and Hadoop Technologies.
  • Prepare high level design documents, detail design documents, business requirement documents, technical specifications, table level specs and test plan documents.
  • Interacting with the client for requirements gathering and analysis for the Talend Migration projects.
  • Analyzed the Migration requirements and framed the business logic and implemented it using Talend.
  • Creating Spark SQL queries for faster requests.
  • Developed jobs in Talend Enterprise Edition from stage to source, intermediate, conversion, and target layers.
  • Worked on Talend ETL to load data from various sources to Oracle DB. Used tMap, tReplicate, tFilterRow, tSortRow, and various other features in Talend.
  • Extract data from legacy systems to staging area and then cleanse, homogenize, process and load into the data warehouse.
  • Use JIRA to create, implement and deploy ETL related stories.
  • Involved in writing SQL Queries and used Joins to access data from Oracle, and MySQL.
  • Imported, exported data from various databases ORACLE, and MYSQL into HDFS using Talend.
  • Worked on Talend ETL and used features such as Context variables, Database components like tMSSQLInput, tOracleOutput, file components, ELT components etc.
  • Followed the organization defined Naming conventions for naming the Flat file structure, Talend Jobs and daily batches for executing the Talend Jobs.
  • Experience in creating joblets in Talend for processes reused across most jobs in a project, such as Start Job and Commit Job.
  • Worked with various file components like tFileCopy, tFileCompare, tFileExist, tFileDelete, and tFileRename.
  • Developed ETL mappings for XML, .CSV, and .TXT sources, loading the data from these sources into relational tables with Talend ETL; developed joblets for reusability and to improve performance.
  • Created UNIX script to automate the process for long running jobs and failure jobs status reporting.
  • Implemented Error handling in Talend to validate the data Integrity and data completeness for the data from the Flat File
  • Used Talend components such as tMap, tFileExist, tFileCompare, tELTAggregate, tOracleInput, tOracleOutput, etc.
  • Performed ETL using different sources like databases, flat files, XML, and Avro files.
  • Wrote ETL jobs to read from web APIs using REST and HTTP calls and loaded into HDFS using java and Talend.
  • Prepared migration document to move the mappings from development to testing and then to production repositories
  • Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
  • Created context variables and groups to run Talend jobs against different environments.
  • Used debugger and breakpoints to view transformations output and debug mappings.
  • Implemented Performance tuning in Mappings and Sessions by identifying the bottlenecks and Implemented effective transformation Logic.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Talend Integration Suite.
  • Involved in automation of FTP process in Talend and FTPing the Files in UNIX.
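
The failed-job status reporting mentioned above can be sketched as a small shell scan over job logs; the `END SUCCESS|FAILED` trailer convention and the demo file names are assumptions for illustration, not details from this resume:

```shell
#!/bin/sh
# Sketch: scan a directory of job logs and report failures.
LOG_DIR="${1:-$(mktemp -d)}"

# Demo logs, created only when the directory is empty.
if [ -z "$(ls -A "$LOG_DIR" 2>/dev/null)" ]; then
    echo "END SUCCESS 2024-01-01 02:00:11" > "$LOG_DIR/load_dim.log"
    echo "END FAILED 2024-01-01 02:05:42"  > "$LOG_DIR/load_fact.log"
fi

TOTAL=0
FAILED=0
for f in "$LOG_DIR"/*.log; do
    TOTAL=$((TOTAL + 1))
    # Each job log ends with an "END SUCCESS|FAILED ..." trailer line.
    if grep -q '^END FAILED' "$f"; then
        FAILED=$((FAILED + 1))
        echo "FAILED: $(basename "$f")"
    fi
done
echo "jobs=$TOTAL failed=$FAILED"
```

In practice the summary would be mailed to an operations list rather than printed.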

Environment: Talend 5.x, JIRA, Java, Git, SQL Server, MySQL, PuTTY, XML, JUnit, Hadoop, Jenkins, Apache Pig, Hive, Elasticsearch, Web Services, Microsoft Office, SoapUI/REST.

Data Stage Developer

Confidential

Responsibilities:

  • Extracted data from heterogeneous sources like .dat files, Excel files, .csv files, flat files, Oracle, etc., and loaded it into target Oracle tables.
  • Extensively used the Sequential File, Dataset, Join, Look-up, transformer, Funnel, Xml and Connector stages provided by Data Stage to perform transformation and load the data.
  • Used different parallel job stages like Join, Merge, Lookup, Filter, Column and Row Generator, Transformer, Modify, Aggregator, Remove Duplicates, Teradata Enterprise, Teradata Load, and Teradata MultiLoad/TPump/FastExport stages.
  • Used Teradata utilities like FastLoad, MultiLoad, the Parallel Transporter (TPT) API, and BTEQ to efficiently handle data.
  • Experience in analyzing the data generated by the business process, defining the granularity, source to target mapping of the data elements, creating Indexes and Aggregate tables for the data warehouse design and development.
  • Proficient in writing, implementation and testing of triggers, procedures and functions in PL/SQL and Oracle.
  • Designed various Data stage jobs for loading the EDW staging layer using Lookups, Join, CDC, Transformer, DB2 connector, Sort, Aggregator, Sequential file stages.
  • Prepared Technical Design Specification Documentation for Data Extraction, Data Transformation, and Data Loading in to Enterprise Data warehouse (EDW).
  • Developed complex ETL programs to implement SCD Type 2 dimensional tables using CDC (DataMirror) and the built-in SCD stages.
  • Used diverse partitioning methods like Auto, Hash, Same, Entire etc.
  • Created master sequencers using User Variable Activity, Job Activity, Execute Command, Start Loop Activity, End Loop Activity, Nested Condition, and Exception Handler stages.
  • Proven track record in troubleshooting of Data Stage jobs and involved in creating functional and scope documents.
  • Involved in Analysis, Development, Unit Testing and Performance tuning of the jobs
  • Created and tested ETL processes composed of multiple data stage jobs using job sequencer and used shell scripts to automate the process.
  • Redesigned and modified existing program logic to improve overall system performance.
  • Worked on DataStage admin activities like creating ODBC connections to various data sources, server startup and shutdown, creating environment variables, creating DataStage projects, message handler files, and the DSParams file.

Environment: Ascential DataStage 7.5, QualityStage 7.5.2, UNIX, Parallel Extender/Enterprise Edition, Oracle 9i, DB2, SQL Navigator, Cognos 7.1, XML, SQL, PL/SQL, SQL*Loader, Erwin.

ORACLE Developer

Confidential 

Responsibilities:

  • Involved in requirement gathering and analysis, designing, implementing, testing and maintenance stages of Software Development Life Cycle.
  • Created Entity Relationship (ER) diagrams for the proposed database.
  • Have extensive experience in developing efficient PL/SQL code for OLTP environment and ETL code for the Data Warehouse environment.
  • Used SQL*Loader to load data from flat files into the staging tables.
  • Performed code reviews in Oracle PL/SQL procedures, functions and triggers.
  • Wrote Perl scripts to automate the process of scheduled bill payments.
  • Responsible for development of database stored procedures, functions, packages and triggers using PL/SQL and defining test plans, test data/parameters, and evaluate test results to ensure that system functionality/outputs meet system specifications.
  • Involved in writing complex SQL queries and report generation.
  • Conducted product QA and support for Customer Acceptance Testing (CAT).
  • Wrote high performance queries/programs using Collections, Bulk Binds, Objects, Nested tables, REF Cursors, pipeline functions etc.
  • Wrote UNIX scripts to schedule data loads using CRONTAB.
  • Created various connection managers (OLE DB connection, flat file connection, FTP Connection) available in SSIS to collect data from heterogeneous sources such as SQL Server, flat files
  • Involved in developing and customizing UNIX shell scripts and Windows batch scripts to automate data manipulation in the database and to schedule batch jobs and cron jobs.
  • Worked with senior executives to define business needs, development and testing strategy, and success criteria to deliver business goals.
  • Developed complex Oracle Forms providing extensive GUI features (multi-select, drag and drop, graphical charts, automated system alerts and notifications, etc.).
  • Created Test Plan for QA and implementation plan for Production implementation once the unit test is done.
  • Wrote control files and loaded the data into the Oracle database with SQL*Loader.
  • Involved in testing of Oracle back-end objects like database triggers, stored procedures, sequences, and synonyms.
  • Responsible for developing GUI forms using Visual Basic and VB.NET as per the design specification.
  • Troubleshot performance issues and bugs within packages, forms, and reports using DBMS_OUTPUT, Forms debugger, EXPLAIN PLAN, and TKPROF.
  • Maintained different versions of software (version control) using CVS.
  • Developed procedures using dynamic SQL.

Environment: Oracle 11g, SQL, PL/SQL, SQL Server, MS Visio, MS SQL Server Integration Services 2005 (SSIS), T-SQL
