Sr. ETL/Talend Developer Resume
Boston, MA
SUMMARY
- Senior ETL/Talend developer with 7+ years of experience in development and production environments. Experience in developing ETL processes for enterprise data warehouses and BI reports.
- Experience in Talend Open Studio and Talend Integration Suite.
- Experience working with data warehousing concepts such as OLAP, OLTP, Star Schema, Snowflake Schema, and Logical/Physical/Dimensional Data Modeling.
- Extensively used ETL methodology for data profiling, data migration, extraction, transformation, and loading using Talend, and designed data conversions from a wide variety of source systems including Netezza, Oracle, DB2, SQL Server, Teradata, Hive, and HANA, as well as non-relational sources such as flat files, XML, and mainframe files.
- Extracted data from multiple operational sources and loaded staging areas, data warehouses, and data marts using SCD (Type 1/Type 2/Type 3) loads.
- Extensively created mappings in Talend using tMap, tJoin, tReplicate, tConvertType, tFlowMeter, tLogCatcher, tNormalize, tDenormalize, tJava, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc.
- Experienced in implementing an ETL engine in Java to handle incremental loads (a minimal sketch follows this summary).
- Excellent experience with Talend ETL, including Context Variables, database components such as tMSSQLInput and tOracleOutput, tMap, tFileCopy, tFileCompare, tSalesforceOutput, tSalesforceBulkExec, tSalesforceInput, file components such as tFileExist, ELT components, etc.
- Capable of processing large sets of structured, semi-structured and unstructured data and supporting systems application architecture.
- Worked with Big Data components such as tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHBaseInput, tHBaseOutput, tSqoopImport, and tSqoopExport.
- Able to assess business rules, collaborate with stakeholders and perform source-to-target data mapping, design and review.
- Experience in monitoring and scheduling using AutoSys, Control-M, and Job Conductor (Talend Admin Console), and in UNIX (Korn & Bourne shell) scripting.
- Experienced in creating triggers on the TAC server to schedule Talend jobs to run on the server.
- Strong experience in extracting, transforming, and loading (ETL) data from various sources into data warehouses and data marts using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, Repository Manager).
- Experience in developing Informatica mappings using transformations like Source Qualifier, Connected and Unconnected Lookup, Normalizer, Router, Filter, Expression, Aggregator, Stored Procedure, Sequence Generator, Sorter, Joiner, Update Strategy, Union Transformations.
- Experience in developing analytic reports and dashboards using Business Objects and Tableau.
- Able to perform independently in complex troubleshooting, root-cause analysis and solution development.
- Proven team player with good communication skills and a quick learner.
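As a minimal illustration of the incremental-load pattern mentioned above (not code from any actual engagement), the Java sketch below reads a watermark from a control table, pulls only changed rows, merges them, and advances the watermark. The ORDERS/DW_ORDERS tables, the ETL_CONTROL table, connection details, and SQL are all hypothetical placeholders, and an Oracle JDBC driver is assumed to be on the classpath.

```java
import java.sql.*;

/**
 * Minimal sketch of an incremental (delta) load:
 * read the last successful run timestamp from a control table,
 * pull only rows changed since then, and advance the watermark.
 * Table, column, and connection details are illustrative only.
 */
public class IncrementalLoad {

    public static void main(String[] args) throws SQLException {
        try (Connection src = DriverManager.getConnection("jdbc:oracle:thin:@//src-host:1521/SRC", "user", "pwd");
             Connection dwh = DriverManager.getConnection("jdbc:oracle:thin:@//dwh-host:1521/DWH", "user", "pwd")) {

            // 1. Read the high-water mark left by the previous run
            //    (a first run would need a seed row in etl_control).
            Timestamp lastRun;
            try (Statement st = dwh.createStatement();
                 ResultSet rs = st.executeQuery("SELECT last_run_ts FROM etl_control WHERE job_name = 'ORDERS_LOAD'")) {
                lastRun = rs.next() ? rs.getTimestamp(1) : new Timestamp(0L);
            }

            // 2. Extract only rows changed since the last run and merge them into the target.
            try (PreparedStatement extract = src.prepareStatement(
                     "SELECT order_id, customer_id, amount FROM orders WHERE updated_at > ?");
                 PreparedStatement upsert = dwh.prepareStatement(
                     "MERGE INTO dw_orders t USING (SELECT ? AS order_id, ? AS customer_id, ? AS amount FROM dual) s "
                   + "ON (t.order_id = s.order_id) "
                   + "WHEN MATCHED THEN UPDATE SET t.customer_id = s.customer_id, t.amount = s.amount "
                   + "WHEN NOT MATCHED THEN INSERT (order_id, customer_id, amount) VALUES (s.order_id, s.customer_id, s.amount)")) {
                extract.setTimestamp(1, lastRun);
                try (ResultSet rows = extract.executeQuery()) {
                    while (rows.next()) {
                        upsert.setLong(1, rows.getLong("order_id"));
                        upsert.setLong(2, rows.getLong("customer_id"));
                        upsert.setBigDecimal(3, rows.getBigDecimal("amount"));
                        upsert.addBatch();
                    }
                }
                upsert.executeBatch();
            }

            // 3. Advance the watermark so the next run picks up where this one stopped.
            try (PreparedStatement update = dwh.prepareStatement(
                     "UPDATE etl_control SET last_run_ts = ? WHERE job_name = 'ORDERS_LOAD'")) {
                update.setTimestamp(1, new Timestamp(System.currentTimeMillis()));
                update.executeUpdate();
            }
        }
    }
}
```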
PROFESSIONAL EXPERIENCE
Confidential, Boston MA
Sr. ETL/Talend Developer
Responsibilities:
- Responsible for designing and implementing ETL processes to load data from different sources, perform data mining, and analyze data using visualization/reporting tools to leverage the performance of OpenStack.
- Created joblets in Talend for processes reused across most jobs in a project, such as Start Job and Commit Job.
- Used several Talend components such as tMap, tReplicate, tFilterRow, tSortRow, tWaitForFile, tSalesforceOutput, tSalesforceBulkExec, and tSalesforceInput for the ETL process.
- Involved in design and development of complex ETL mapping.
- Implemented error handling in Talend to validate data integrity and completeness for data from flat files.
- Created Talend mappings to efficiently populate data into dimension and fact tables.
- Used the Talend Admin Console Job Conductor to schedule ETL jobs on a daily, weekly, monthly, and yearly basis.
- Extensively used ETL to load data from flat files, XML, Oracle, and MySQL sources into the data warehouse database.
- Proficient in writing complex Java code using tJava, tJavaRow, and tJavaFlex, and handled heap space and other memory-related issues in Talend.
- Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics (see the query sketch after this role).
- Created HBase tables to load large sets of structured, semi-structured, and unstructured data coming from UNIX and NoSQL sources.
- Developed reports and dashboards using Tableau for quick reviews presented to business and IT users.
- Developed ad-hoc reports using Tableau Desktop and Excel.
- Developed visualizations using sets, parameters, calculated fields, dynamic sorting, filtering, and parameter-driven analysis.
Environment: Talend Open Studio and Talend Integration Suite, tMap, tReplicate, tFilterRow, tSortRow, tWaitForFile, tSalesforceOutput, tSalesforceBulkExec, tSalesforceInput, Oracle, SQL, Tableau, and Hadoop.
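The following is a hypothetical sketch of the kind of Hive trend-check query referenced above, run through the standard Hive JDBC driver. The host, database, table names (stg_sales_current, edw_product_ref), and the 1.5x threshold are illustrative assumptions, not actual project artifacts.

```java
import java.sql.*;

/**
 * Illustrative sketch: run a Hive query over JDBC that compares freshly loaded
 * data against an EDW reference table to surface products selling well above
 * their historical weekly average. All names are hypothetical.
 */
public class HiveTrendCheck {

    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String sql =
            "SELECT f.product_id, "
          + "       SUM(f.sales_amt)        AS current_week_sales, "
          + "       MAX(r.avg_weekly_sales) AS historical_avg "
          + "FROM   stg_sales_current f "
          + "JOIN   edw_product_ref r ON f.product_id = r.product_id "
          + "GROUP  BY f.product_id "
          + "HAVING SUM(f.sales_amt) > 1.5 * MAX(r.avg_weekly_sales)";

        try (Connection conn = DriverManager.getConnection("jdbc:hive2://hive-host:10000/analytics", "user", "");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery(sql)) {
            while (rs.next()) {
                // Emerging-trend candidates: current sales exceed 1.5x the historical average.
                System.out.printf("%d  current=%.2f  avg=%.2f%n",
                        rs.getLong(1), rs.getDouble(2), rs.getDouble(3));
            }
        }
    }
}
```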
Confidential
Talend Developer
Responsibilities:
- Interacted with business users to identify process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.
- Worked on integration projects from the requirements-gathering phase through implementation, enabling customers to see data on a daily or weekly basis with minimal manual intervention instead of spending many manual hours.
- Developed Talend jobs to push data from all feeds into a consolidated staging area, which is the source for the MDM process.
- Worked with the most commonly used Talend components (tMap, tDie, tConvertType, tSOAP, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput and tHashOutput, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, and many more).
- Implemented different matching rules and data validation rules to derive the golden record (an illustrative matching sketch follows this role).
- Created many complex ETL jobs for data exchange to and from the database server and various other systems, including RDBMS, XML, CSV, and flat file structures.
- Responsible for developing, supporting, and maintaining ETL (Extract, Transform, and Load) processes using Talend Integration Suite.
- Conducted JAD sessions with business users and SMEs for a better understanding of the reporting requirements.
- Developed Talend jobs to populate the claims data to data warehouse - star schema.
- Used the Talend Admin Console Job Conductor to schedule ETL jobs on a daily, weekly, monthly, and yearly basis.
- Worked extensively on the Talend Admin Console and scheduled jobs in Job Conductor.
- Experienced with Java transformations for calling Hive views to extract data from Hadoop systems.
- Expert in developing UNIX shell scripts. Created Talend mappings using the transformations.
- Involved in analyzing and extracting MongoDB application collections into the ODS using Hive views.
- Prepared ETL mapping documents for every mapping and a data migration document for smooth transfer of the project from the development to the testing environment and then to production.
- Responsible for prioritizing issues, assigning them to the production support team, and planning the deployment of fixes.
Environment: Talend components (tMap, tDie, tConvertType, tSOAP, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput and tHashOutput, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, and many more), Oracle.
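Below is a simplified, self-contained Java sketch of the matching and survivorship idea behind a golden record (it uses records, so Java 16+ is assumed). The Customer fields, the email/name/postal-code rules, and the most-recently-updated survivorship policy are illustrative assumptions rather than the actual MDM rules used on the project.

```java
import java.util.*;

/**
 * Simplified sketch of matching + survivorship for a "golden" customer record:
 * exact match on a normalized email wins, otherwise fall back to a
 * name + postal-code comparison. Fields and rules are illustrative only.
 */
public class GoldenRecordMatcher {

    record Customer(String sourceSystem, String name, String email, String postalCode, Date lastUpdated) {}

    static boolean matches(Customer a, Customer b) {
        // Rule 1: exact match on normalized email.
        if (a.email() != null && b.email() != null
                && a.email().trim().equalsIgnoreCase(b.email().trim())) {
            return true;
        }
        // Rule 2: same postal code and same normalized name.
        return Objects.equals(a.postalCode(), b.postalCode())
                && normalize(a.name()).equals(normalize(b.name()));
    }

    static String normalize(String s) {
        return s == null ? "" : s.toLowerCase().replaceAll("[^a-z0-9]", "");
    }

    /** Survivorship: keep the most recently updated record among the matched group. */
    static Customer golden(List<Customer> matched) {
        return matched.stream()
                .max(Comparator.comparing(Customer::lastUpdated))
                .orElseThrow();
    }

    public static void main(String[] args) {
        Customer crm = new Customer("CRM", "John O'Hara", "j.ohara@example.com", "02110", new Date(1_600_000_000_000L));
        Customer web = new Customer("WEB", "JOHN OHARA",  "J.OHARA@example.com", "02110", new Date(1_700_000_000_000L));
        if (matches(crm, web)) {
            System.out.println("Golden record from: " + golden(List.of(crm, web)).sourceSystem());
        }
    }
}
```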
Confidential
Talend Developer
Responsibilities:
- Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts.
- Developed high-level and detailed technical and functional documents, including detailed design documentation, functional test specifications with use cases, and unit test documents.
- Developed jobs in Talend Enterprise edition from stage to source, intermediate, conversion and target.
- Developed PL/SQL triggers and master tables for automatic creation of primary keys.
- Involved in Talend Data Integration, Talend Platform Setup on Windows and UNIX systems.
- Created joblets in Talend for processes reused across most jobs in a project, such as Start Job and Commit Job.
- Created complex mappings in Talend using tHash, tDenormalize, tMap, tJoin, tReplicate, tParallelize, tJava, tJavaRow, tUniqRow, and tPivotToColumnsDelimited, as well as custom components such as tUnpivotRow.
- Used tStatsCatcher, tDie, tLogRow, tWarn, and tLogCatcher to create a generic joblet that stores processing stats in a database table to record job history.
- Created Talend Mappings to populate the data into dimensions and fact tables.
- Developed complex Talend ETL jobs to migrate the data from flat files to database.
- Implemented custom error handling in Talend jobs and worked on different methods of logging.
- Prepared ETL mapping documents for every mapping and a data migration document for smooth transfer of the project from the development to the testing environment and then to production.
- Developed an error logging module to capture both system errors and logical errors, including email notification and moving files to error directories (a minimal sketch follows this role).
- Created a Talend ETL job to receive attachment files from a POP email account using tPOP, tFileList, and tFileInputMail, then loaded data from the attachments into the database and archived the files.
- Created jobs and job variable files for Teradata TPT and loaded data using the tbuild command from the command line.
- Implemented agile development methodology using XP, Scrum and Kanban/Continuous Flow.
- Created FTP scripts and Conversion scripts to convert data into flat files to be used for Talend jobs.
Environment: Talend 6.0.1/5.5, Oracle 11g, Teradata V 13.0, Teradata SQL Assistant, MS SQL Server 2012/2008, DB2, TOAD, Erwin, AIX, Shell Scripts.
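A minimal, JDK-only sketch of the error-handling pattern described above: log the failure, quarantine the offending feed file, and hand off to a notification hook. The /data/feeds/error path and the notifySupport stub are placeholders; a real implementation would plug in JavaMail or a ticketing API for the notification step.

```java
import java.io.IOException;
import java.nio.file.*;
import java.time.LocalDateTime;
import java.util.logging.Logger;

/**
 * Minimal sketch: when a feed file fails validation or loading, record the
 * failure and move the file to an error directory so it is not reprocessed.
 * Paths and the notification hook are illustrative only.
 */
public class FeedErrorHandler {

    private static final Logger LOG = Logger.getLogger(FeedErrorHandler.class.getName());
    private static final Path ERROR_DIR = Paths.get("/data/feeds/error");

    public static void handleFailure(Path feedFile, Exception cause) {
        LOG.severe(() -> LocalDateTime.now() + " load failed for " + feedFile + ": " + cause.getMessage());
        try {
            Files.createDirectories(ERROR_DIR);
            // Move the bad file aside, keeping its name, so the scheduler skips it next run.
            Files.move(feedFile, ERROR_DIR.resolve(feedFile.getFileName()),
                       StandardCopyOption.REPLACE_EXISTING);
        } catch (IOException ioe) {
            LOG.severe("Could not quarantine " + feedFile + ": " + ioe.getMessage());
        }
        notifySupport(feedFile, cause);
    }

    /** Placeholder for the email notification step (e.g. JavaMail or a ticketing API). */
    private static void notifySupport(Path feedFile, Exception cause) {
        System.out.println("Would email support: " + feedFile + " failed with " + cause.getMessage());
    }
}
```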
Confidential
SAP BO Developer
Responsibilities:
- Created new hierarchies and modified existing ones in the universes to meet users' drill-analysis reporting needs.
- Defined aliases and contexts to resolve loops and tested them to ensure that correct results were retrieved.
- Performed integrity testing of the universes (universe structure checking, object parsing, joins parsing, conditions parsing, cardinality checking, loops checking, and contexts checking) after any modifications to them in terms of structure, classes, and objects.
- Exported the universe to the Repository to make resources available to the users.
- Created new classes and objects and made structural changes to universes in Designer, including adding new objects and updating tables and joins.
- Created different types of reports such as Master/detail, Cross Tab and Charts.
- Used Prompts, Conditions to restrict the data returned by Query and used Filters to restrict the data to be displayed on the report.
- Used Alerts for highlighting desired data in the report.
- Worked extensively with major BO functionality such as breaks and sections.
- Exported reports into Excel, PDF, CSV, and XML formats as per the client requirements.
- Created graphical representations of reports, such as bar charts, 3D charts, pie charts, column charts, line charts, and list boxes, using the Dashboard Design tool.