
Sr. Talend Developer Resume

Chicago, IL

SUMMARY:

  • 7 years of total experience in information technology, including Data Modeling, Database Design, Programming, Development and Implementation using Talend and Informatica PowerCenter.
  • Experience in using Talend Data Integration tool as applied to BI data analytics, reporting and dashboard projects.
  • Good Experience in Talend DI Administration, Talend Data Quality and Talend Data Mapping.
  • Experience in Talend Big Data Integration for business demands to work towards Hadoop and NoSQL databases.
  • Created complex mappings in Talend using components such as tMap, tJoin, tReplicate, tParallelize, tAggregateRow, tDie, tUnique, tFlowToIterate, tSort and tFilterRow.
  • Experience in Hadoop Big Data Integration with Data stage ETL on performing data extract, loading and transformation process for automobile ERP data.
  • Experience in using Talend Big Data components to create connections to various third-party tools used for transferring, storing or analyzing big data, such as Sqoop, MongoDB and BigQuery, to quickly load, extract, transform and process large and diverse data sets.
  • Created Talend Mappings to populate data into dimensions and fact tables.
  • Successfully deployed the Talend Administration Center WAR file on a WebLogic 10.3 server and presented Talend Administration Center functionality to the client.
  • Designed ETL Jobs/Packages using Talend Open Studio (TOS)
  • Worked on Talend Administration Console (TAC) for scheduling jobs and adding users
  • Used ETL methodologies and best practices to create Talend ETL jobs.
  • Extensive experience in using Microsoft BI Studio products (SSIS, SSAS, SSRS) to implement ETL methodology in data extraction, transformation and loading.
  • Expert in Data Warehouse development from inception through implementation and ongoing support; strong understanding of BI application design and development principles.
  • Used Normalization and De-Normalization techniques.
  • Generated complex Transact-SQL (T-SQL) queries, subqueries, correlated subqueries and dynamic SQL; programmed in SQL Server using stored procedures, triggers, user-defined functions, views and common table expressions (CTEs).
  • Experienced interface integration engineer designing, implementing and testing interfaces.
  • Wrote complex stored procedures, audit triggers and views; created SQL scripts for data loads, upgrades, data migrations and data validations.
  • Proficient in Performance Tuning/Query optimization using indexes
  • Assisted with automation of manual processes by writing VBA code and using macros and formulas to speed processes and maximize accuracy.
  • Tuned the performance of stored procedures and large T-SQL queries using clustered indexes and efficient coding standards.
  • Performed large data transfers to and from SQL Server databases using utilities/tools such as DTS, SSIS, BCP and Bulk Insert.
  • Highly experienced in the whole cycle of DTS/SQL Server Integration Services (SSIS 2005/2008) packages (developing, deploying, scheduling, troubleshooting and monitoring) for data transfers and ETL purposes across different servers.
  • Experience in providing Logging, Error handling by using Event Handler, and Custom Logging for SSIS Packages.
  • Data integration technical lead overseeing interface builds, testing, client data validation and implementation support for migration projects
  • Processed output XML files, removed empty delta files and FTPed the output XML files to a different server.
  • Scheduling and Monitoring ETL Processes using DTS Exec utilities and batch files.
  • Expertise in generating reports using SQL Server Reporting Services, Crystal Reports, MS Excel spreadsheets and Power Pivot.
  • Expert in designing enterprise reports using SQL Server Reporting Services (SSRS 2000/2005/2008); generated drill-down, parameterized, linked, sub-report and matrix reports with dynamic filters and charts in SSRS 2005/2008.
  • Experience in creating Ad-hoc reports, data driven subscription reports by using Report Builder in SSRS.
  • Good knowledge of Data Marts, Operational Data Store (ODS), OLAP, Dimensional Data Modeling, Star Schema Modeling, Snow-Flake Modeling for FACT and Dimensions Tables using Analysis Services.
  • Writing MDX Scripts to create datasets to perform reporting and included interactive drill down reports, report models and dashboard reports.
  • Created dashboard pages in SharePoint Server that used different types of web parts and Excel Services for reports.
  • Created ETL mappings with Talend Integration Suite to pull data from sources, apply transformations, and load data into target databases.
  • Developed mappings/transformations/Joblets and designed ETL jobs/packages using Talend Integration Suite (TIS) in Talend 5.2.2.
  • Created complex mappings in Talend 5.2.2 using tMap, tJoin, tReplicate, tParallelize, tJava, tJavaFlex, tAggregateRow, tDie, tWarn, tLogCatcher, etc.
  • Created Scorecard/Dashboard pages in Performance point server (PPS)
  • Used ProClarity tool to analyze the data in the cube by using different features like Chart view, Decomposition Tree, Performance Maps, etc.
  • Defined group level security standards for Tableau.
  • Used Tableau to visually analyze data and create concise and actionable dashboards.
  • Generated reports from Analysis Services cubes using the ProClarity tool.
  • Very strong background in a disciplined software development life cycle (SDLC) process; excellent analytical, programming and problem-solving skills.
  • Good team player, Excellent interpersonal and communication skills combined with self-motivation, initiative and the ability to think outside the box.
  • Expertise in defining the business process flow and gathering business requirements.
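As an illustration of the tMap-centric mapping work listed above, reusable cleansing logic in Talend typically lives in custom Java routines that tMap expressions can call. A minimal sketch, assuming a hypothetical routine class (all names here are illustrative, not from an actual project):

```java
// Hypothetical Talend-style custom routine; in Talend, routines live
// under Code > Routines in the Repository and are plain Java classes.
class StringUtilsRoutine {

    // Null-safe trim, callable from a tMap expression, e.g.
    // StringUtilsRoutine.safeTrim(row1.customerName)
    public static String safeTrim(String input) {
        return input == null ? "" : input.trim();
    }

    // Null-safe upper-casing, useful for lookup key matching
    public static String normalizeKey(String input) {
        return safeTrim(input).toUpperCase();
    }
}
```

Centralizing such helpers in one routine keeps tMap expressions short and avoids repeating null checks across mappings.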

TECHNICAL PROFICIENCY:

Data Modeling Tools: Star Schema Modeling, Snow Flake Modeling

Data Warehousing ETL Tools: Informatica PowerCenter 8.x/7.x, Talend … Data Cleaner 2.2

Databases: MS SQL Server … Sybase, MySQL, Oracle, Hadoop

Database Tools: Enterprise Manager, Query Analyzer, SQL Profiler, Data Transformation Services, Crystal Reports 9.0 / 10

Languages: PL/SQL, SQL, Java, C

Web Tools: HTML, DHTML, XML, JAVA SCRIPT

Operating Systems: Linux, Microsoft … NT/98/95

Others: MS Word, Excel, VISIO 2007

WORK EXPERIENCE:

Confidential, Chicago, IL

Sr. Talend Developer

Responsibilities:

  • Interacted with the end users to get the business requirements, reporting needs and created the business requirement documents.
  • Deployed and scheduled Talend jobs in the Administration Console and monitored their execution.
  • Created separate branches within the Talend repository for Development, Production and Deployment.
  • Excellent knowledge of the Talend Administration Console, Talend installation, and the use of context and global map variables in Talend.
  • Designed the architecture of Talend jobs to run in parallel from an execution standpoint to reduce run time.
  • Collaborated successfully with project managers, client resources and other data integration engineers to deliver projects on time.
  • Used Talend Admin Console Job conductor to schedule ETL Jobs on daily, weekly, monthly and yearly basis (CronTrigger).
  • Expertise in processing semi-structured data (XML, JSON, RC and CSV) in Hive/Impala using the Talend ETL tool.
  • Provided on-call support once the project was deployed to subsequent phases.
  • Developed Java routines under the Code node of the Repository and called them in expression filters.
  • Tuned JVM parameters and cursor size in Talend as part of performance tuning.
  • Reviewed/prepared technical design documents (TDD) and created source-to-target mappings per the requirements.
  • Designed and developed ETL processes using Talend Open Studio (Data Integration) and also worked on the latest Enterprise versions.
  • Designed and implemented ETL to load data from source to target databases, including Fact and Slowly Changing Dimension (SCD) Type 1 and Type 2 loads to capture changes.
  • Designed and developed ETL processes using Talend Platform for Big Data, and developed ETL mappings for staging, dimension, fact and data-mart loads.
  • Involved in Data Extraction for various Databases & Files using Talend Open Studio & Big Data Edition.
  • Processed output XML files, removed empty delta files and FTPed the output XML files to a different server.
  • Subject matter expert for all data integration on the Tahoe project, including data flow diagrams.
  • Approved technical design documentation for all data integration on the Tahoe project.
  • Designed, developed and deployed end-to-end data integration solutions.
  • Worked on Talend with Java as the backend language.
  • Extensively used the tMap component for lookup and join functions.
  • Worked on Joblets for reusable code.
  • Wrote backup and recovery shell scripts to provide failover capabilities
  • Performed administration of Sun Solaris and built high-end Sun servers for the production environment using Solaris 10 and Jumpstart.
  • Decommissioned old servers and kept track of decommissioned and new servers using an inventory list.
  • Monitored System Activities like CPU, Memory, Disk and Swap space usage to avoid any performance issues.
  • Software package and patch administration, involving adding and removing software packages and updating patches
  • Handling problems or requirements as per the ticket (Request Tracker) created.
  • Experience in writing monitoring/start-up shell scripts for UNIX and Linux.
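The Java routines called from expression filters, as described above, generally reduce to a static method returning a boolean that the filter evaluates per row. A hedged sketch under that assumption; the class, method and field names are invented for illustration:

```java
// Hypothetical routine backing an expression filter / tFilterRow;
// a filter expression might read: FilterRoutines.isValidId(row1.id)
class FilterRoutines {

    // Accept only rows whose id parses as a positive integer
    public static boolean isValidId(String id) {
        if (id == null) return false;
        try {
            return Integer.parseInt(id.trim()) > 0;
        } catch (NumberFormatException e) {
            return false;
        }
    }
}
```

Keeping the try/catch inside the routine means the filter expression itself stays a single, readable call.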

Environment: Talend Platform for Data Management 5.6.1, UNIX Scripting, Toad, Oracle 10g, SQL Server, Redshift

Confidential, Chicago, IL

Sr.Talend Developer

Responsibilities:

  • Worked extensively with the Talend Administration Console and scheduled jobs in the Job Conductor, an option not available in Talend Open Studio.
  • Mainly involved in performance tuning of existing jobs; created an audit process and error-capturing methodology for various application teams.
  • Hands-on experience with many of the components in the palette for designing jobs; used context variables/groups to parameterize Talend jobs.
  • Implemented error logging, error recovery and performance enhancements, and created a generic audit process for various application teams.
  • Experience in using Repository Manager for Migration of Source code from Lower to higher environments.
  • Managed change requests for the middleware data integration team, including scheduling production code deployments.
  • Created projects in TAC, assigned appropriate roles to developers and integrated SVN (Subversion).
  • Extensively used various transformations such as Source Qualifier, Joiner, Aggregator, connected and unconnected Lookups, Filter, Router, Expression, Rank, Union, Normalizer, XML transformations, Update Strategy and Sequence Generator.
  • Worked on custom component design and embedded the custom components in Talend Studio.
  • In parallel to Development acted as a Talend Admin: Creating Projects/ Scheduling Jobs /Migration to Higher Environments & Version Upgrades.
  • Created Talend Jobs to populate the data into dimensions and fact table.
  • Created complex jobs in Talend using tMap, tJoin, tDie, tConvertType, tFlowMeter, tLogCatcher, tReplicate, tParallelize, tJava, tJavaFlex, tAggregateRow, tWarn, etc., and created Talend jobs to populate data into dimension and fact tables.
  • Developed high level data dictionary of ETL data mapping and transformations from a series of complex Talend data integration jobs.
  • Used XML transformations to load data from XML files. Worked on Informatica schedulers to schedule the workflows. Extensively worked with target XSDs to generate the output XML files.
  • Created Bash/Korn shell scripts to monitor system resources and system maintenance.
  • Wrote Perl and Python scripts to generate statistics and monitor processes.
  • Deployed Active/Active MySQL clusters using Redhat Cluster Suite for supporting internal applications built on the LAMP stack.
  • Automated system management tasks like user account creation, file system size monitoring, monitor system resources and system maintenance using Shell, and Perl scripts.
  • Used DTS/ SSIS and T-SQL stored procedures to transfer data from OLTP databases to staging area and finally transfer into data marts and performed action in XML
  • Installed, configured & upgraded Talend Studio and provided extensive support in code deploy, change management and application level troubleshooting for the Dev, Test, Pre-Prod & Production environment.
  • Deployed applications (EAR/WAR) files on multiple Servers/Clusters and maintained Load balancing.
  • Worked with DBAs on installation of Oracle database, RDBMS database, restoration and log generation. Perform security patching of Linux servers.
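The generic audit process described above usually records, per job run, the job name, row counts and a status captured from components such as tLogCatcher and tFlowMeter. A minimal sketch of such an audit record, with all class and field names invented for illustration (a real implementation would persist this to a database table):

```java
import java.time.Instant;

// Hypothetical audit record of the kind a generic audit/error-capture
// joblet might populate; names and fields are illustrative only.
class AuditRecord {
    final String jobName;
    final long rowsProcessed;
    final String status;
    final Instant loggedAt = Instant.now();  // capture time of logging

    AuditRecord(String jobName, long rowsProcessed, String status) {
        this.jobName = jobName;
        this.rowsProcessed = rowsProcessed;
        this.status = status;
    }

    // Flat, pipe-delimited line suitable for a console audit trail
    String toLogLine() {
        return jobName + "|" + rowsProcessed + "|" + status;
    }
}
```

Making the record generic (job name plus counts and status, rather than job-specific fields) is what lets one audit joblet serve many application teams.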

Environment: Talend 5.5, Oracle 11g, Teradata SQL Assistant, HDFS, MS SQL Server 2012/2008, PL/SQL, Agile Methodology, Informatica, TOAD, ERwin, AIX, Shell Scripts, AutoSys, SVN

Confidential, Cleveland, Ohio

Talend Developer

Responsibilities:

  • Created logical data models adhering to data warehouse design principles.
  • Extensively used ETL to load data from different source systems having different formats to target Oracle BI.
  • Involved in the migration from DataStage 8.7 to Talend 5.6.2.
  • Worked on the design, development and testing of Talend mappings.
  • Created ETL job infrastructure using Talend Open Studio.
  • Worked on Talend components such as tReplace, tMap, tSort, tFilterColumn, tFilterRow, tJava, tJavaRow, tConvertType, etc.
  • Used Database components like tMSSQLInput, tMsSqlRow, tMsSqlOutput, tOracleOutput, tOracleInput etc.
  • Worked with various file components such as tFileCopy, tFileCompare, tFileExist, tFileDelete and tFileRename.
  • Worked on improving the performance of Talend jobs.
  • Created triggers for a Talend job to run automatically on server.
  • Worked on Exporting and Importing of Talend jobs.
  • Worked closely with the administrators with the configuration of Talend Open studio.
  • Monitoring the entire Batch cycle of application and working on high priority incidents and Problem Tickets
  • Implemented manual workarounds where manual effort was required until a fix went live.
  • Created Execution Tasks in Talend Administration Center on jobs that are either saved in SVN or in Pre-Generated Studio as Zip files.
  • Experienced in versioning, importing and exporting Talend jobs. Set up Triggers for Talend jobs in Job conductor.
  • Held direct meetings with business users to understand all the issues they reported or were facing, and provided them with short- and long-term workarounds.
  • Integrated java code inside Talend studio by using components like tJavaRow, tJava, tJavaFlex and Routines.
  • Experienced in using debug mode of Talend to debug a job to fix errors.
  • Created complex mappings using tHashOutput, tHashInput, tNormalize, tDenormalize, tMap, tUniqueRow, tPivotToColumnsDelimited, etc.
  • Used tRunJob component to run child job from a parent job and to pass parameters from parent to child job.
  • Created Context Variables and Groups to run Talend jobs against different environments.
  • Used the tParallelize component and the multi-thread execution option to run subjobs in parallel, which increases job performance.
  • Involved in developing SPRFs with the development team and implemented performance tuning on DataStage jobs that were running with poor performance in production.
  • Wrote the shell scripts to monitor the health check of Hadoop daemon services and respond accordingly to any warning or failure conditions.
  • Involved in creation of IA projects, importing metadata, Mapping Data source, Assigning roles to user, adding users to projects.
  • Effectively used the Standardize stage to standardize source data using existing rule sets such as name and address.
  • Effectively worked on the AVI stage to validate address information coming from the source and generated valid and invalid address reports.
  • Created Unix scripts to automate the process for long running jobs and failure jobs status reporting.
  • Used Pig as ETL tool to do transformations, event joins, filter boot traffic and some pre-aggregations before storing the data onto HDFS.
  • Worked on custom Pig Loaders and Storage classes to work with a variety of data formats such as JSON, Compressed CSV, etc.
  • Involved heavily in writing complex SQL queries per the given requirements, including complex DB2 joins, subqueries, stored procedures and macros.
  • Helped business users by writing complex, efficient Teradata SQL to get detailed data for data mining.
  • Worked as a primary and secondary support member to fix daily cycle issues as well as other incidents and service calls.
  • Performance tuning, SQL query enhancements, code enhancements to achieve performance targets.
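The tJavaRow integration mentioned above places per-row Java directly in the job flow; Talend generates the input/output row structures from the component schema, so they are modeled here as a plain class to keep the sketch self-contained. All names are illustrative, not from an actual job:

```java
// Hypothetical stand-in for the row structures Talend generates;
// in a real tJavaRow body the fields come from the component schema.
class RowTransform {

    static class Row {
        String firstName;
        String lastName;
        String fullName;
    }

    // Equivalent of a tJavaRow body such as:
    //   output_row.fullName = input_row.firstName + " " + input_row.lastName;
    static Row transform(Row in) {
        Row out = new Row();
        out.firstName = in.firstName;
        out.lastName = in.lastName;
        out.fullName = in.firstName + " " + in.lastName;
        return out;
    }
}
```

tJavaRow is the usual choice when a derivation is awkward to express in a tMap expression but still needs to run once per row.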

Environment: Talend Data Integration 6.1/5.5.1, Talend Enterprise Big Data Edition 5.5.1, Talend Administration Center, Oracle 11g, Hive, HDFS, Sqoop, Netezza, SQL Navigator, Toad, Control-M, Putty, WinSCP

Confidential

Talend Developer

Responsibilities:

  • Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts.
  • Analyzed source data to assess data quality using Talend Data Quality.
  • Broad design, development and testing experience with Talend Integration Suite and knowledge in Performance Tuning of mappings.
  • Created cross-platform Talend DI jobs to read data from multiple sources such as Hive, HANA, Teradata, DB2 and Oracle.
  • Developed jobs in Talend Enterprise edition from stage to source, intermediate, conversion and target.
  • Involved in writing SQL Queries and used Joins to access data from Oracle, and MySQL.
  • Solid experience in implementing complex business rules by creating re-usable transformations and robust mappings using Talend transformations like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite etc.
  • Developed Talend jobs to populate the claims data to data warehouse - star schema.
  • Recreated existing Netezza objects in Snowflake.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Used tStatsCatcher, tDie, tLogRow to create a generic joblet to store processing stats into a Database table to record job history.
  • Handled importing of data from various data sources, performed transformations using Hive, MapReduce, Spark and loaded data into HDFS.
  • Integrated java code inside Talend studio by using components like tJavaRow, tJava, tJavaFlex and Routines.
  • Experienced in using the debug mode of Talend to debug a job and fix errors. Created complex mappings using tHashOutput, tHashInput, tNormalize, tDenormalize, tMap, tUniqueRow, tPivotToColumnsDelimited, etc.
  • Used tRunJob component to run child job from a parent job and to pass parameters from parent to child job.
  • Created Context Variables and Groups to run Talend jobs against different environments.
  • Used the tParallelize component and the multi-thread execution option to run subjobs in parallel, which increases job performance.
  • Implemented FTP operations using Talend Studio to transfer files between network folders as well as to an FTP server, using components such as tFileCopy, tFileArchive, tFileDelete, tCreateTemporaryFile, tFTPDelete, tFTPCopy, tFTPRename, tFTPPut, tFTPGet, etc.
  • Experienced in building a Talend job outside of Talend Studio as well as on the TAC server.
  • Experienced in writing expressions within tMap per the business need. Handled insert and update strategy using tMap. Used ETL methodologies and best practices to create Talend ETL jobs.
  • Extracted data from flat files/ databases applied business logic to load them in the staging database as well as flat files.
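The SCD Type 2 mappings above hinge on one decision per incoming row: ignore an unchanged record, insert a brand-new business key, or expire the current version and insert a new one. A sketch of that decision in plain Java, under simplified assumptions (a single tracked attribute; all names illustrative):

```java
// Hypothetical SCD Type 2 change decision; a real Talend job would
// compare all tracked attributes and then maintain effective-date
// and current-flag columns on the dimension row.
class ScdType2 {

    enum Action { IGNORE, INSERT_NEW, EXPIRE_AND_INSERT }

    // currentValue == null means the business key was not found in
    // the dimension, i.e. a brand-new record.
    static Action decide(String currentValue, String incomingValue) {
        if (currentValue == null) return Action.INSERT_NEW;
        if (currentValue.equals(incomingValue)) return Action.IGNORE;
        return Action.EXPIRE_AND_INSERT;
    }
}
```

The Type 1 variant differs only in the last branch: instead of expiring and inserting, it overwrites the current row in place.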

Environment: Talend, Oracle, Teradata SQL Assistant, Linux, HDFS, MS SQL Server, PL/SQL, Shell Scripts, AutoSys, SVN.

Confidential

Talend Developer

Responsibilities:

  • Involved in End-End development of the implementation and Roll out.
  • Implemented File Transfer Protocol operations using Talend Studio to transfer files in between network folders.
  • Worked on data migration using export/import.
  • Created Talend jobs using the dynamic schema feature.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Used many Talend components across job designs, including tJava, tOracle, tXMLMap, delimited file components, tLogRow and logging components.
  • Worked on Joblets (reusable code) and Java routines in Talend.
  • Implemented Error handling in Talend to validate the data Integrity and data completeness for the data from the Flat File.
  • Coordinated with the business to gather requirements and prepared the Functional Specification document.
  • Created Talend Development Standards: a document describing the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
  • Worked in a Scrum Agile process with two-week iterations delivering new Snowflake objects and migrating data at each iteration.
  • Worked on Talend ETL and used features such as Context variables, Database components like tMSSQLInput, tOracle Output, file components, ELT components etc.
  • Involved in automating the FTP process in Talend and FTPing the files on UNIX.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations.
  • Used Talend Admin Console Job conductor to schedule ETL Jobs on daily, weekly, monthly and yearly basis (Cron Trigger)
  • Involved in end-to-end testing of jobs.
  • Wrote complex SQL queries to take data from various sources and integrated it with Talend.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Developed over 90 mappings to support the business logic including the historical data for reporting needs.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Used transformations like Router, Update Strategy, Lookups, Normalizer, Filter, Joiner and Aggregator.
  • Developed Type-1 and Type-2 mappings for current and historical data.
  • Incorporated business logic for Incremental data loads on a daily basis.
  • Used Parameter Variables and Mapping variables for incremental data feeds.
  • Used Shared folders for Source, Targets and Lookups for reusability of the objects.
  • Performed administrator role in migrating the objects from one environment to the other DEV/QA/PROD.
  • On-call support for production maintenance
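The daily incremental loads above typically filter source rows against a last-run watermark held in a parameter or context variable. A minimal, self-contained sketch of that filter, with all class and field names invented for illustration:

```java
import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

// Hypothetical incremental-load filter; in a Talend job the watermark
// would come from a context variable set by the scheduler and be
// advanced after a successful run.
class IncrementalLoad {

    static class Record {
        final String id;
        final LocalDate updatedAt;
        Record(String id, LocalDate updatedAt) {
            this.id = id;
            this.updatedAt = updatedAt;
        }
    }

    // Keep only records modified strictly after the watermark date
    static List<Record> newerThan(List<Record> source, LocalDate watermark) {
        List<Record> out = new ArrayList<>();
        for (Record r : source) {
            if (r.updatedAt.isAfter(watermark)) {
                out.add(r);
            }
        }
        return out;
    }
}
```

In practice the same filter is pushed into the source query's WHERE clause where possible, so only the delta crosses the network.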

Environment: Talend, Oracle, Linux, SQL, UNIX, PL/SQL, TOAD, MySQL, AutoSys, Netezza, XML, Flat files.
