Sr. ETL/Talend Developer Resume
SUMMARY
- Around 6 years of experience in the full software project development life cycle, covering design and application development of Enterprise Data Warehouses on large-scale development efforts using industry-standard tools such as Talend and Informatica.
- Identify business requirements from the client, create functional and technical designs, and analyze and request data extracts according to application requirements.
- Read information from COBOL copybook files to extract client data and stage it into structured Oracle tables through Informatica mappings.
- Support administrators in creating new repositories from the Informatica and Talend administrator consoles.
- Design ETL processes using Informatica PowerCenter and Talend Studio to load data from sources to targets through transformations and components such as tMap, tUnite, tAggregateRow, Source Qualifier, Expression, Connected and Unconnected Lookup, Normalizer, Sorter, Rank, Aggregator, Union, Joiner, Update Strategy, Filter, Sequence Generator, Stored Procedure, SQL, etc.
- Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.
- Create Talend jobs for the SDLC process through Talend Studio using tDBSCD components.
- Extensive experience in designing, developing, and testing processes for loading initial data into a data warehouse.
- Validate legacy system data by writing scripts and PowerCenter designs that route invalid data into exception tables.
- Develop SQL validation scripts to detect data loaded into parent tables by the ETL process but missing from child tables, and vice versa (see the SQL sketch following this summary).
- Improve and maximize the performance of case-notes loads from the data collection module by checking source/target bottlenecks, tuning SQL, partitioning at the session level, and adjusting column data types to store high volumes of data.
- Develop complex SQL queries, with knowledge of all layers of the application, for test-case scenarios to pull cases for test data.
- Lead the team through the unit test process and report status updates to the manager.
- Develop a "Data Conversion Job Flowchart" in Microsoft Visio 2016 for the CA-7 batch scheduler and Talend Management Console to automate Informatica jobs according to the application driver.
- Develop reports using Cognos after every run to display load statistics.
- Create access for ETL developers based on roles.
- Create security alerts and send security emails.
- Assist in creating procedures to update database and Informatica sequences.
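The parent/child validation scripts mentioned above generally amount to anti-join checks in both directions. A minimal sketch, assuming hypothetical CASE_HEADER (parent) and CASE_NOTES (child) tables keyed on CASE_ID; actual table and column names would come from the data model:

```sql
-- Parent rows loaded by the ETL process with no matching child rows
-- (table and column names are illustrative placeholders).
SELECT p.case_id
FROM   case_header p
LEFT JOIN case_notes c
       ON c.case_id = p.case_id
WHERE  c.case_id IS NULL;

-- The "vice versa" check: child rows whose parent key is missing from the parent table.
SELECT c.case_id
FROM   case_notes c
LEFT JOIN case_header p
       ON p.case_id = c.case_id
WHERE  p.case_id IS NULL;
```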
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter, PowerMart, Oracle Warehouse Builder, AutoSys 3.5, JCL, SAS, SQL*Plus, SQL*Loader, Talend Data Integration 7.3.1/8.0, Talend Enterprise Big Data Edition 5.5.1.
Reporting tools: Tableau, Cognos, Business Objects.
Databases: Oracle 11g/10g/9i/8i/8.0/7.0, MS SQL Server 2000/7.0, SQL Server 2005/2008, IBM DB2/UDB 7.2, MS Access 2000, MySQL, Teradata, Aurora DB (AWS).
Operating Systems: Windows XP/2000/NT/9x, UNIX (Korn shell), Sun Solaris 2.6/7/8, AIX 5.2, MS-DOS 6.22.
Programming: Java, SQL, PL/SQL, C, C++.
Version Control: Git, CVS, SVN, Bitbucket.
Unix Tools: VI Editor, F-Secure SSH Client, SFTP, NDM.
PROFESSIONAL EXPERIENCE
Confidential
Sr. ETL/ Talend Developer
Responsibilities:
- Participated in all phases of the development life cycle, with extensive involvement in definition and design meetings and functional and technical walkthroughs.
- Created Talend jobs to copy files from one server to another using Talend FTP components.
- Created and managed source-to-target mapping documents for all Fact and Dimension tables.
- Used ETL methodologies and best practices to create Talend ETL jobs. Followed and enhanced programming and naming standards.
- Created and deployed physical objects including custom tables, custom views, stored procedures, and Indexes to SQL Server for Staging and Data-Mart environment.
- Designed and implemented ETL for data loads from heterogeneous sources to SQL Server and Oracle target databases, and for Fact and Slowly Changing Dimension tables (SCD Type 1 and SCD Type 2); see the SCD Type 2 sketch after this list.
- Utilized Big Data components such as tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumns, tPigStoreResult, tHiveLoad, tHiveInput, tHBaseInput, tHBaseOutput, tSqoopImport, and tSqoopExport.
- Extensively used the tMap component, which performs lookup and join functions, along with tJava, tOracle, tFileInputXML, tFileInputDelimited, tLogRow, and tLogCatcher components in many jobs; created and worked with over 100 components across these jobs.
- Used the most common Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput, and many more).
- Integrated Salesforce.com with an external application using SOAP and REST-based web services.
- Implemented a multi-channel service desk including Email-to-Case, Web-to-Case, CTI integration using Ingenious Open CTI, Live Agent setup, and case escalation and assignment rules.
- Worked on customizing the Service Console; used REST APIs and Web Services Description Language (WSDL) definitions in the application to access data from external systems and websites.
- Developed stored procedures to automate the testing process, easing QA efforts and reducing test timelines for data comparison across tables (see the PL/SQL sketch after this list).
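For the SCD Type 2 loads noted above, the underlying pattern is to expire the current dimension row and insert a new version whenever a tracked attribute changes. A minimal Oracle-style sketch, assuming hypothetical DIM_CUSTOMER and STG_CUSTOMER tables and a single tracked attribute (surrogate-key generation omitted); in the actual jobs this logic was handled by Talend/Informatica SCD components rather than hand-written SQL:

```sql
-- Expire the current version of any customer whose tracked attribute changed
-- (illustrative table/column names; effective dates simplified to SYSDATE).
UPDATE dim_customer d
SET    d.current_flag = 'N',
       d.effective_end_date = SYSDATE
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.customer_address <> d.customer_address);

-- Insert a new current version for changed and brand-new customers.
INSERT INTO dim_customer
       (customer_id, customer_address, effective_start_date, effective_end_date, current_flag)
SELECT s.customer_id, s.customer_address, SYSDATE, NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y'
                   AND    d.customer_address = s.customer_address);
```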
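The test-automation stored procedure mentioned in the last bullet could look roughly like the PL/SQL sketch below, which compares row counts between two tables passed in by name; the procedure name and the check it performs are illustrative assumptions, and the real procedure is project-specific:

```sql
-- Hypothetical PL/SQL sketch: compare row counts of two tables and report a mismatch.
CREATE OR REPLACE PROCEDURE compare_row_counts (
    p_source_table IN VARCHAR2,
    p_target_table IN VARCHAR2
) AS
    v_source_count NUMBER;
    v_target_count NUMBER;
BEGIN
    -- DBMS_ASSERT guards the dynamic SQL against invalid object names.
    EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM ' || DBMS_ASSERT.simple_sql_name(p_source_table)
        INTO v_source_count;
    EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM ' || DBMS_ASSERT.simple_sql_name(p_target_table)
        INTO v_target_count;

    IF v_source_count = v_target_count THEN
        DBMS_OUTPUT.put_line('PASS: both tables have ' || v_source_count || ' rows.');
    ELSE
        DBMS_OUTPUT.put_line('FAIL: ' || p_source_table || '=' || v_source_count ||
                             ', ' || p_target_table || '=' || v_target_count);
    END IF;
END compare_row_counts;
/
```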
Environment: Talend Data Integration 7.3.1/8.0, Talend Enterprise Big Data Edition 5.5.1, Talend Administrator Console, Oracle 11g, Hive, HDFS, Sqoop, Netezza, SQL Navigator, Toad, Control-M, PuTTY, WinSCP.
Confidential
ETL Developer
Responsibilities:
- Extensively involved in the requirements analysis and gathering process and in the documentation of functional/technical specifications as required for the project.
- Worked interactively with the Line of Business group to maintain the integrity of building the Public Exchange, ensuring the required standards were met.
- Primary responsibilities included designing the ODS layer for the data mart, which involved preparing data models using Microsoft Visio and ETL constructs (mappings, sessions, workflows) using Informatica PowerCenter 9.5.
- Worked directly in the Informatica server environment, configuring parameter files and maintaining file manipulations as required to build the ETL mappings and sessions.
- Worked on the CA-7 agent to maintain the scheduling requirements for the project.
- Involved in support activities for the data mart throughout the project timeframe as needed.
- Worked on performance tuning of SQL, ETL, and other processes to optimize session performance.
- Worked with different caches such as index cache, data cache, lookup cache (static, dynamic, and persistent), and joiner cache while developing the mappings.
- Used SQL tools like TOAD to run queries to view and validate the data loaded into the warehouse.
- Created stored procedures to transform the data and worked extensively in SQL and PL/SQL for various transformation needs while loading data into the data warehouse.
- Responsible for data management and data cleansing activities using Informatica Data Quality (IDQ).
- Performed data quality analysis to validate the input data based on the cleansing rules.
- Used parameter files to define values for mapping parameters and variables.
- Created FTP scripts and Conversion scripts to convert data into flat files to be used for Informatica sessions.
- Worked with pre- and post-session SQL commands to drop and recreate indexes on the data warehouse using the Source Qualifier transformation of Informatica PowerCenter (see the SQL sketch after this list).
- Extensively involved in code deployment from Dev to Testing.
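A minimal sketch of the pre-/post-session SQL pattern referenced above, assuming a placeholder index IDX_FACT_SALES_CUST on a FACT_SALES table: the pre-session command drops the index so the bulk load is not slowed by index maintenance, and the post-session command rebuilds it afterwards.

```sql
-- Pre-session SQL (placeholder names): drop the index before the bulk load.
DROP INDEX idx_fact_sales_cust;

-- Post-session SQL: recreate the index once the load has finished.
CREATE INDEX idx_fact_sales_cust
    ON fact_sales (customer_key);
```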
Environment: Informatica PowerCenter 9.5, Linux, SQL Server, Oracle, Cognos, Toad 10.1.6, SQL Server Management Studio, Windows XP Professional, Microsoft Visio 2008, WinSCP, MyEclipse, Java.