
Sr. Talend / ETL Developer Resume


Minneapolis, MN

SUMMARY:

  • Proficient ETL Developer with over 8 years of experience in data warehousing and in Talend product installation, configuration, and maintenance.
  • Experience with multiple Talend platforms, including Talend DI, Talend Big Data Integration, Talend DQ, TAC, and associated technologies.
  • Good exposure across the SDLC, including requirement gathering, design, development, testing, debugging, deployment, documentation, and production support.
  • Good hands-on experience with Unix shell scripting and Windows batch scripting.
  • Experience designing and deploying ETL solutions for large-scale OLAP and OLTP instances using Talend.
  • Developed efficient mappings for data extraction/transformation/loading (ETL) from different sources to a target data warehouse.
  • Experience in Big Data technologies such as Hadoop/MapReduce, Pig, Hive, and Sqoop.
  • Optimized Talend mappings and sessions to improve performance.
  • Experience in integration of various data sources like Teradata, SQL Server, Oracle, DB2, Flat Files and source files like delimited files, Excel, Positional and CSV files.
  • Good Knowledge on Data Warehousing concepts like Star Schema, Dimensions and Fact tables.
  • Hands-on experience with the components in the Talend palette for designing jobs; used context variables to parameterize Talend jobs.
  • Experience in working with parallel extender for splitting bulk data into subsets to distribute the data to all available processors to achieve best job performance.
  • Expert in using various Talend components: processing components (tFilterRow, tAggregateRow, tMap, tJoin), custom code components (tJava, tJavaRow), log and error components (tDie, tLogCatcher, tLogRow), database components (tOracleInput, tOracleOutput, tOracleCommit, tOracleConnection, tOracleSP, tOracleRollback), file management components (tFileList, tFileCopy), orchestration components (tWaitForFile), file components (tFileInputDelimited, tFileOutputDelimited), and miscellaneous components (tRowGenerator).
  • Hands-on experience creating routines (user-defined functions).
  • Experience performing column- and table-level analyses on data and generating reports.
  • Good communication skills, interpersonal skills, team coordination and versed with Software Development processes.
  • Proficient in planning, estimation, implementation planning, offshore coordination, and time management.
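
The routines mentioned above are, in practice, plain Java utility classes whose static methods components can call from expressions (e.g. inside a tMap). A minimal sketch of such a routine; the class and method names here are illustrative, not taken from the resume:

```java
// Hypothetical Talend routine: a plain Java class with static helpers
// callable from component expressions, e.g. in a tMap output:
//   StringUtil.coalesce(row1.middleName, "")
public class StringUtil {

    // Return the value, or the fallback when it is null or blank.
    public static String coalesce(String value, String fallback) {
        return (value == null || value.trim().isEmpty()) ? fallback : value;
    }

    // Normalize free-text codes before a lookup join: trim and upper-case.
    public static String normalizeCode(String code) {
        return code == null ? null : code.trim().toUpperCase();
    }
}
```

Keeping such logic in a routine rather than repeating it in each tMap keeps the jobs consistent and the expressions short.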

TECHNICAL SKILLS:

ETL/Middleware Tools: Talend 5.5/5.6/6.2/6.4, Informatica PowerCenter 9.5.1/9.1.1/8.6.1/7.1.1

Data Modeling: Dimensional data modeling, Star Schema modeling, Snowflake modeling, fact and dimension tables.

RDBMS: Oracle 11g/10g/9i, Netezza, Teradata, Redshift, MS SQL Server 2014/2008/2005/2000, DB2, MySQL, MS Access.

Programming Skills: SQL, Oracle PL/SQL, Unix Shell Scripting, HTML, DHTML, XML, Pig, Hive and Java.

Version Control: Git and SVN

Tools: MySQL Workbench, SQL Developer, and other Talend-related tools.

Operating Systems: Windows 8/7/XP/NT/2x, Unix-AIX, Sun Solaris 8.0/9.0.

WORK EXPERIENCE:

Confidential, Minneapolis, MN

Sr. Talend / ETL Developer

Responsibilities:

  • Developed complex ETL mappings for stage, dimension, fact, and data mart loads; worked on data migration using export/import.
  • Involved in Reviewing the project scope, requirements, architecture diagram, proof of concept (POC) design and development guidelines on Talend.
  • Designed and implemented ETL for data loads from heterogeneous sources into SQL Server and Oracle target databases, including fact tables and Slowly Changing Dimensions (SCD Types 1, 2, and 3) to capture changes.
  • Used components like tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHBaseInput, tHBaseOutput, tSqoopImport, and tSqoopExport.
  • Performance tuning: used tMap cache properties, multi-threading, and tParallelize components for better performance with large source volumes; tuned SQL source queries to filter unwanted data early in the ETL process.
  • Extensively used the tMap component for lookup and joiner functions, along with tJava, tOracle, tXML, delimited-file, tLogRow, and logging components in many jobs; created and worked with over 100 components across jobs.
  • Implemented File Transfer Protocol (FTP) operations in Talend Studio to transfer files between network folders using components like tFTPConnection, tFTPFileList, tFTPGet, and tFTPPut.
  • Designed, developed and improved complex ETL structures to extract transform and load data from multiple data sources into data warehouse and other databases based on business requirements.
  • Used custom code components like tJava, tJavaRow, and tJavaFlex.
  • Created Talend jobs to load data into various Oracle tables; utilized Oracle stored procedures and wrote Java code to capture global map variables and use them in the job.
  • Troubleshooting, debugging & fixing Talend specific issues, while maintaining the health and performance of the ETL environment.
  • Used the Talend Administration Center (TAC) to set up new users, projects, and tasks across multiple TAC environments (Dev, Test, Prod, DR).
  • Scheduled ETL jobs in TAC using file-based and time-based triggers.
  • Experience in Agile methodology.
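
The global-map hand-off mentioned above (capturing a value in one component and reading it downstream) boils down to a shared `java.util.Map`. A self-contained sketch of the pattern, with a plain HashMap standing in for the `globalMap` that Talend injects into tJava/tJavaRow; the key and value here are illustrative:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of Talend's globalMap hand-off between components; a plain
// HashMap stands in for the map Talend injects into every tJava/tJavaRow.
public class GlobalMapSketch {

    static final Map<String, Object> globalMap = new HashMap<>();

    // Upstream component (e.g. a tJavaRow after a tOracleInput calling a
    // stored procedure) stores a value for later steps.
    static void captureLoadId(long id) {
        globalMap.put("LAST_LOAD_ID", id);
    }

    // Downstream component reads it back, casting as Talend-generated code does.
    static long readLoadId() {
        return (Long) globalMap.get("LAST_LOAD_ID");
    }

    public static void main(String[] args) {
        captureLoadId(42L);
        System.out.println("resuming from load id " + readLoadId());
    }
}
```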

Environment: Talend Enterprise for Big Data (v6.0.1, 5.6.2/5.6.1), UNIX, SQL, Hadoop, Hive, Pig, Oracle, Unix shell scripting, Microsoft SQL Server Management Studio.

Confidential, Harrisburg, PA

Sr. Talend / ETL Developer

Responsibilities:

  • Involved in understanding the ETL mapping document and Source to Target mappings.
  • Designed and developed ETL processes using Talend Platform for Big Data, working on the latest enterprise versions.
  • Created Talend Mappings to populate the data into dimensions and fact tables.
  • Worked on Talend with Java as the backend language.
  • Prepared the ETL functional documents and test case documents.
  • Studied the data sources to identify methods of data extraction.
  • Experienced in processing data files through Talend products.
  • Worked with multiple sources such as Relational databases, Flat files for Extraction using tMap and tJoin.
  • Worked on Talend components like tMap, tOracleInput/tOracleOutput, tFileInput/tFileOutput, tAggregateRow, tSortRow, tFilterRow, tFilterColumn, tSplitRow, tNormalize, and tJavaRow.
  • Implemented exception handling in Talend using components like tDie, tSendMail, tWarn, and tLogCatcher.
  • Working knowledge on the reusable components like Contexts, Global variables and Environment variables in Talend.
  • Prepared the test plans and test cases for different functionalities involved in the application.
  • Worked with Microsoft SQL Server Management Studio/TOAD while implementing unit testing.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Implemented custom error handling in Talend jobs and worked on different methods of logging.
  • Created Talend jobs to load data into various Oracle tables; utilized Oracle stored procedures and wrote Java code to capture global map variables and use them in the job.

Environment: Talend Platform for Big Data 5.6.2, Talend Enterprise Platform for Data Integration (v5.6.1), UNIX, SQL Server 2012, Microsoft SQL Server Management Studio, Windows XP.

Confidential, Newark, NJ

Talend / ETL Developer

Responsibilities:

  • Worked closely with business analysts to review the business specifications of the project and to gather ETL requirements.
  • Developed jobs and joblets in Talend.
  • Designed ETL Jobs/Packages using Talend Integration Suite (TIS).
  • Created complex mappings in Talend using tHashInput/tHashOutput, tDenormalize, tMap, tUniqRow, and tPivotToColumnsDelimited, as well as a custom component, tUnpivotRow.
  • Used tStatsCatcher, tDie, and tLogRow in a generic joblet that stores processing stats in a database table to record job history.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Implemented custom error handling in Talend jobs and worked on different methods of logging.
  • Created ETL/Talend jobs both design and code to process data to target databases.
  • Created Talend jobs to load data into various Oracle tables; utilized Oracle stored procedures and wrote Java code to capture global map variables and use them in the job.
  • Loaded data from various source systems (Oracle, DB2, flat files, XML files, etc.) into staging tables and then into the target database.
  • Troubleshot long-running jobs and fixed the issues.
  • Prepared ETL mapping documents for every mapping, and a data migration document for smooth transfer of the project from development to testing and then to production.
  • Performed Unit testing and System testing to validate data loads in the target.
  • Developed data quality test plans and manually executed ETL and BI test cases.
  • Understood the business functionality in detail.
  • Tested the entire ETL flow from source to target.
  • Developed FTP processes for use by the testing team during their testing.
  • Understood the various process plans, business processes, and functionality in detail.
  • Initiated knowledge sharing sessions to help circulate the existing knowledge better among the teams.
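
The pivot/unpivot components above reshape rows between wide and tall layouts. A minimal Java sketch of the unpivot step (the idea behind a custom tUnpivotRow-style component), with all field and class names illustrative:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of an unpivot: one wide row {id, q1, q2, q3} becomes one
// tall output row [id, columnName, value] per measure column.
public class UnpivotSketch {

    static List<String[]> unpivot(String id, Map<String, String> measures) {
        List<String[]> rows = new ArrayList<>();
        for (Map.Entry<String, String> e : measures.entrySet()) {
            rows.add(new String[] { id, e.getKey(), e.getValue() });
        }
        return rows;
    }

    public static void main(String[] args) {
        Map<String, String> wide = new LinkedHashMap<>(); // preserves column order
        wide.put("q1", "100");
        wide.put("q2", "150");
        wide.put("q3", "90");
        for (String[] row : unpivot("cust-1", wide)) {
            System.out.println(String.join(",", row));
        }
    }
}
```

The pivot direction (tPivotToColumnsDelimited) is the inverse: grouping tall key/value rows back into one wide row per id.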

Environment: Talend 5.1, Oracle 11g, DB2, Sybase, MS Excel, MS Access, TOAD, SQL, UNIX.

Confidential

ETL Developer/Informatica

Responsibilities:

  • Involved in ETL implementation, bug fixes, enhancements, and support for BIRST’s customers.
  • Worked with business analysts to analyze problem statements and propose technical solutions using Talend and Informatica PowerCenter.
  • Worked on HVR; created various channels for change data capture (CDC).
  • Analysis of certain existing Dimension models, ETL mappings, Dashboards, Reports which were producing errors and giving wrong data set results and modifying them to produce correct results.
  • Monitored Job and query executions and collected performance data to maximize the performance.
  • Wrote test cases for ETL to compare source and target database systems and check all transformation rules.
  • Performed Verification, Validation, and Transformations on the Input data.
  • Tested the messages published by Informatica and the data loaded into various databases.
  • Involved in performance tuning of the ETL and Reports.
  • Led the team as module lead for ETL projects.
  • Designed and developed medium-to-complex Informatica PowerCenter mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Rank, Sequence Generator, Stored Procedure, and Update Strategy.
  • Applied normalization up to 3NF, and denormalization where needed for performance.
  • Involved in implementation of the Test cases and Test Scripts.
  • Tested the data and data integrity among various sources and targets.
  • Verified that all data were synchronized after troubleshooting, and used SQL to verify/validate test cases.
  • Extensively worked on designing the database structure to suit business needs.
  • Involved in Unit testing.
  • Involved in preparation of High level and low-level documents.
  • Extensively worked in the performance tuning which includes removing ETL bottlenecks.
  • Performed various EQA & IQA activities at account level.
  • Performed various Project Management activities at account level.

Environment: Erwin r7.3, SQL/MS SQL Server, MS Analysis Services, Windows NT, MS Visio, XML, Informatica.

Confidential

SQL Developer

Responsibilities:

  • Handled security issues related to logins, database users, application roles, and linked servers.
  • Performance tuning of SQL queries and stored procedures using SQL profiler and Index tuning advisor.
  • Administered all SQL Server database objects, logins, users, and permissions on each registered server.
  • Resolved deadlock issues with databases/servers in real time.
  • Wrote scripts to generate the daily backup report, verify completion of all routine backups, monitor log space utilization, etc.
  • Worked on DTS packages and DTS Import/Export for transferring data from various heterogeneous sources to SQL Server.
  • Created tables, relationships, triggers and indexes for enforcing business rules.
  • Used SQL Profiler to identify slow-running queries and for performance tuning.
  • Wrote complex SQL queries, including inner joins, outer joins, and update queries.
  • Developed reports for payment and BI Count to show organizational and seasonal comparison.
  • Performed incremental and full database recovery, including complex recovery scenarios.

Environment: SQL server 2000 Enterprise Edition, Windows 2000/NT, UNIX, Excel, SQL Profiler, Replication, DTS, MS Access, T-SQL, Crystal Reports.
