ETL/ Talend Developer Resume
Newark, NJ
SUMMARY
- 7+ years of experience across the full software project development life cycle, including the design and development of Enterprise Data Warehouse applications on large-scale development efforts using industry-standard ETL tools such as Talend and Informatica.
- 3+ years of experience using Talend Data Integration/Big Data Integration (6.1/5.x) and Talend Data Quality.
- Extensive knowledge of the business processes and operations of the Health Care, Manufacturing, Mortgage, Financial, Retail and Insurance sectors.
- Extensive experience in ETL methodology for performing Data Profiling, Data Migration, Extraction, Transformation and Loading using Talend; designed data conversions from a wide variety of source systems including Oracle, DB2, SQL Server, Teradata, Hive and HANA, and non-relational sources such as flat files, XML and Mainframe files.
- Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tFlowToIterate, tAggregate, tSortRow, tRowGenerator, tNormalize, tDenormalize, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlSCD, tFilter, tDie, buffer components, etc.; a custom Java routine of the kind used alongside these components is sketched after this summary.
- Created Talend ETL jobs to receive attachment files from POP email using tPop, tFileList and tFileInputMail, then loaded the data from the attachments into a database and archived the files.
- Strong understanding of NoSQL databases like HBase, MongoDB.
- Expertise in data modeling techniques such as dimensional modeling (Star Schema and Snowflake) and Slowly Changing Dimensions (SCD Type 1, Type 2, and Type 3).
- Excellent working experience in Waterfall and Agile methodologies. Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, Collect Statistics, Hints and SQL Trace in both Teradata and Oracle.
- Well versed in Talend Big Data, Hadoop and Hive; used Talend Big Data components such as tHDFSInput, tHDFSOutput, tPigLoad and tHiveInput.
- Experience in the design and development of ETL (Extract, Transform and Load) processes to support data transformation and processing in a corporate-wide ETL solution using Informatica PowerCenter and the IDQ tool.
- Created mappings using Lookup, Aggregator, Joiner, Expression, Filter, Router, Update Strategy and Normalizer transformations. Developed reusable Transformations and Mapplets.
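
The summary above mentions custom Java routines used alongside Talend components such as tMap and tJavaRow. Below is a minimal, illustrative sketch of such a user routine; the class and method names (DataCleansing, safeTrim, defaultCode) are hypothetical and not taken from the resume.

```java
package routines;

// Illustrative Talend user routine: static helpers callable from tMap expressions,
// e.g. DataCleansing.defaultCode(row1.customer_code). All names here are hypothetical.
public class DataCleansing {

    // Trim a string defensively; Talend passes null for empty database columns.
    public static String safeTrim(String value) {
        return value == null ? null : value.trim();
    }

    // Default a nullable code to "UNKNOWN" so downstream lookups never miss on null.
    public static String defaultCode(String code) {
        String trimmed = safeTrim(code);
        return (trimmed == null || trimmed.isEmpty()) ? "UNKNOWN" : trimmed;
    }

    // Small demonstration outside of Talend.
    public static void main(String[] args) {
        System.out.println(defaultCode("  NJ01  "));  // prints "NJ01"
        System.out.println(defaultCode(null));        // prints "UNKNOWN"
    }
}
```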
PROFESSIONAL EXPERIENCE
ETL/ Talend Developer
Confidential, Newark, NJ
Responsibilities:
- Participated in all phases of development life-cycle with extensive involvement in the definition and design meetings, functional and technical walkthroughs.
- Created Talend jobs to copy files from one server to another, utilizing Talend FTP components.
- Created and managed source-to-target mapping documents for all Fact and Dimension tables.
- Used ETL methodologies and best practices to create Talend ETL jobs. Followed and enhanced programming and naming standards.
- Created and deployed physical objects including custom tables, custom views, stored procedures, and Indexes to SQL Server for Staging and Data-Mart environment.
- Designed and implemented ETL for data loads from heterogeneous sources to SQL Server and Oracle target databases, covering Fact tables and Slowly Changing Dimensions (SCD Type 1 and SCD Type 2); the SCD Type 2 close-and-insert pattern is sketched after this list.
- Used the most common Talend components (tMap, tDie, tConvertType, tRowGenerator, tHashInput, tHashOutput and many more).
- Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.
- Created Implicit, local and global Context variables in the job. Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
- Worked on various Talend components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, tDie etc.
- Developed stored procedures to automate the testing process, easing QA efforts and reducing test timelines for data comparison across tables.
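
As a companion to the SCD bullet above, the following is a minimal sketch of the close-and-insert pattern behind an SCD Type 2 load, written as plain JDBC rather than as a generated Talend job; the table, column and connection details (dim_customer, customer_id, the Oracle URL) are assumptions for illustration only.

```java
import java.sql.Connection;
import java.sql.Date;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Hypothetical sketch of the SCD Type 2 pattern implemented in the Talend jobs via
// tMap/tMysqlSCD; all table, column and connection names are placeholders.
public class ScdType2Sketch {

    public static void upsertCustomer(Connection conn, long customerId,
                                      String newAddress, Date loadDate) throws Exception {
        // 1. Expire the current version of the dimension row.
        try (PreparedStatement close = conn.prepareStatement(
                "UPDATE dim_customer SET current_flag = 'N', end_date = ? " +
                "WHERE customer_id = ? AND current_flag = 'Y'")) {
            close.setDate(1, loadDate);
            close.setLong(2, customerId);
            close.executeUpdate();
        }
        // 2. Insert the new version with an open-ended effective date.
        try (PreparedStatement insert = conn.prepareStatement(
                "INSERT INTO dim_customer (customer_id, address, start_date, end_date, current_flag) " +
                "VALUES (?, ?, ?, NULL, 'Y')")) {
            insert.setLong(1, customerId);
            insert.setString(2, newAddress);
            insert.setDate(3, loadDate);
            insert.executeUpdate();
        }
    }

    public static void main(String[] args) throws Exception {
        // Connection details are placeholders.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//host:1521/service", "user", "password")) {
            conn.setAutoCommit(false);
            upsertCustomer(conn, 1001L, "123 Main St, Newark NJ",
                    new Date(System.currentTimeMillis()));
            conn.commit();
        }
    }
}
```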
ETL/ Talend Developer
Confidential, Chesterfield, MO
Responsibilities:
- Participated in JAD sessions with business users and SMEs for a better understanding of the reporting requirements.
- Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts.
- Analyzed source data with Talend Data Quality to assess the quality of the data.
- Broad design, development and testing experience with Talend Integration Suite and knowledge of performance tuning of mappings.
- Developed jobs in Talend Enterprise Edition spanning the stage, source, intermediate, conversion and target layers.
- Involved in writing SQL queries and used joins to access data from Oracle and MySQL.
- Used tStatsCatcher, tDie, tLogRow to create a generic joblet to store processing stats.
- Solid experience in implementing complex business rules by creating reusable transformations and robust mappings using Talend transformations like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.
- Developed Talend jobs to populate the claims data to data warehouse - star schema.
- Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
- Integrated Java code inside Talend Studio using components like tJavaRow, tJava, tJavaFlex and Routines.
- Experienced in using Talend's debug mode to debug jobs and fix errors. Created complex mappings using tHashOutput, tHashInput, tNormalize, tDenormalize, tMap, tUniqRow, tPivotToColumnsDelimited, etc.
- Used tRunJob component to run child job from a parent job and to pass parameters from parent to child job.
- Created Context Variables and Groups to run Talend jobs against different environments.
- Used the tParallelize component and the multi-thread execution option to run subjobs in parallel, which improves job performance; a plain-Java analogue of this pattern is sketched after this list.
- Implemented FTP operations using Talend Studio to transfer files between network folders as well as to an FTP server, using components like tFileCopy, tFileArchive, tFileDelete, tCreateTemporaryFile, tFTPDelete, tFTPCopy, tFTPRename, tFTPPut, tFTPGet, etc.
- Experienced in building Talend jobs outside of Talend Studio as well as on the TAC server.
- Experienced in writing expressions within tMap as per business needs. Handled insert and update strategies using tMap. Used ETL methodologies and best practices to create Talend ETL jobs.
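
The parallel-subjob bullet above refers to Talend's tParallelize component and multi-thread execution. As a rough analogue, the sketch below shows the same idea in plain Java with an ExecutorService; the subjob names (dim_product, fact_claims, etc.) are placeholders, not jobs from the resume.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Plain-Java analogue of running independent Talend subjobs in parallel
// (the tParallelize / multi-thread execution pattern). The "subjobs" here are
// placeholder Runnables, not generated Talend code.
public class ParallelSubjobsSketch {
    public static void main(String[] args) throws InterruptedException {
        List<Runnable> dimensionLoads = List.of(
                () -> System.out.println("load dim_product"),
                () -> System.out.println("load dim_customer"),
                () -> System.out.println("load dim_date"));

        ExecutorService pool = Executors.newFixedThreadPool(dimensionLoads.size());
        for (Runnable load : dimensionLoads) {
            pool.submit(load);                 // independent dimension loads run concurrently
        }
        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.MINUTES);

        // The fact load starts only after every dimension subjob has finished.
        System.out.println("load fact_claims");
    }
}
```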
Informatica Developer
Confidential, Minneapolis, MN
Responsibilities:
- Presented the IT solution implementation approach as dictated by the Business Requirements Documents, including requirements classification and methodology.
- Presented the risks, dependencies and outstanding items that required attention from the project stakeholders. Responsible for designing, developing and unit testing the mappings.
- Developed mappings using Informatica PowerCenter Designer to load data from various source systems to target database as per the business rules.
- Used various transformations like Source Qualifier, Aggregators, Connected & unconnected lookups, Filters, Sequence generators, Routers, Update Strategy, Expression, etc.
- Involved in developing test cases for the Informatica mappings and update processes.
- Responsible for monitoring all sessions that were running, scheduled, completed or failed; debugged the mappings of failed sessions.
- Used Mapplets and Reusable Transformations to avoid redundant transformation logic and improve maintainability.
- Performed unit, system and integration testing for the jobs and validated the test results by executing queries in Toad; a sample reconciliation check of this kind is sketched after this list.
- Prepared test plans for both unit and system tests; designed, developed and unit tested the mappings.
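
Following up on the testing bullet above, here is a minimal sketch of a row-count reconciliation check of the kind that could be run against source and target tables after an Informatica session; the table names (stg_orders, fact_orders) and connection details are assumptions, not items from the resume.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Hypothetical row-count reconciliation check run (via Toad or a small utility)
// after a session completes; table names and credentials are placeholders.
public class LoadReconciliation {

    private static long count(Connection conn, String table) throws Exception {
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//host:1521/service", "user", "password")) {
            long staged = count(conn, "stg_orders");   // rows landed in staging
            long loaded = count(conn, "fact_orders");  // rows loaded to the target
            System.out.printf("staged=%d loaded=%d %s%n",
                    staged, loaded, staged == loaded ? "MATCH" : "MISMATCH");
        }
    }
}
```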
ETL Informatica Developer
Confidential, Minneapolis, MN
Responsibilities:
- Understood the existing PL/SQL procedures and re-engineered the logic into Informatica requirements
- Extracted source definitions from Oracle and flat files
- Developed mappings and workflows as per the new target databases
- Converted Oracle stored procedures into Type 1 and Type 2 mappings as per the new business requirements.
- Loaded data from files into Netezza stage-area tables using the NZLOAD utility and into HDFS files using UNIX scripts; both steps are sketched after this list.
- Understood Oracle GoldenGate Data Integration application errors and supported the Oracle DBA team in fixing the issues.
- Unit testing and System Testing
- Scheduling the ETL jobs using Informatica scheduler
- Monitoring the daily/weekly DW ETL workflows
- Fixing the issues occurring on daily ETL load
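
The NZLOAD/HDFS bullet above describes file loads driven from UNIX scripts. The sketch below wraps the two command-line steps in Java purely for illustration; the database, table, file paths, credentials and the exact nzload flag set shown are assumptions rather than details from the resume.

```java
import java.io.IOException;
import java.util.List;

// Sketch of the file-load steps driven from UNIX scripts: nzload into a Netezza
// stage table and hdfs dfs -put into HDFS. All names, paths and flags are placeholders.
public class FileLoadSketch {

    private static void run(List<String> command) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(command).inheritIO().start();
        if (p.waitFor() != 0) {
            throw new IllegalStateException("Command failed: " + command);
        }
    }

    public static void main(String[] args) throws Exception {
        // Load the delimited file into the Netezza staging table (flags are indicative).
        run(List.of("nzload", "-db", "STAGEDB", "-u", "etl_user", "-pw", "secret",
                "-t", "STG_CLAIMS", "-df", "/data/incoming/claims.dat", "-delim", "|"));
        // Copy the same file into HDFS for downstream processing.
        run(List.of("hdfs", "dfs", "-put", "-f",
                "/data/incoming/claims.dat", "/landing/claims/"));
    }
}
```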
SQL Developer
Confidential
Responsibilities:
- Analyzed reports and fixed bugs in stored procedures using SSRS. Used complex expressions to group data, filter and parameterize reports.
- Created linked reports and managed snapshots using SSRS. Performed various calculations using complex expressions in the reports and created report models.
- Generated complex SSRS reports, including reports with cascading parameters, snapshot reports, drill-down reports, drill-through reports, parameterized reports, report models and ad hoc reports, based on the Business Requirements Document.
- Conducted performance tuning of complex SQL queries and stored procedures using SQL Profiler and the Index Tuning Wizard.
- Provided production support to analyze and fix problems and errors on a daily basis by modifying SSIS packages and stored procedures when necessary.
- Designed and developed tables, stored procedures, triggers and SQL scripts using T-SQL, Perl and shell scripting for enhancements and maintenance of various database modules; a sample parameterized stored-procedure call is sketched below.
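
As a companion to the stored-procedure and parameterized-report bullets above, the following sketch calls a hypothetical T-SQL stored procedure of the kind that would back an SSRS report dataset, via JDBC; the procedure name (dbo.usp_GetSalesByRegion), parameters, result columns and connection string are illustrative assumptions only.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

// Hypothetical call to a parameterized T-SQL stored procedure backing an SSRS
// report dataset; procedure, parameter and column names are illustrative only.
public class ReportDatasetCall {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:sqlserver://host:1433;databaseName=ReportingDM;encrypt=false",
                "report_user", "password");
             CallableStatement cs = conn.prepareCall("{call dbo.usp_GetSalesByRegion(?, ?)}")) {
            cs.setString(1, "Northeast");   // cascading report parameter: region
            cs.setInt(2, 2015);             // cascading report parameter: fiscal year
            try (ResultSet rs = cs.executeQuery()) {
                while (rs.next()) {
                    // Placeholder result columns for the report dataset.
                    System.out.println(rs.getString("product") + " -> "
                            + rs.getBigDecimal("sales_amt"));
                }
            }
        }
    }
}
```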