Sr. Etl/ Talend Developer Resume
Lombard, IL
SUMMARY
- 6+ years of IT experience in the analysis, design, development, testing and implementation of business application systems.
- Highly skilled ETL engineer with 6+ years of software development experience in tools such as Informatica, SSIS and Talend.
- Strong experience in the analysis, design, development, testing and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI and client/server applications.
- 2+ years' experience with Talend ETL Enterprise Edition for Big Data, Data Integration and Data Quality.
- Experience in Big Data technologies such as Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, DynamoDB, Elasticsearch and Spark SQL.
- Experienced in ETL methodology for performing data migration, data profiling, extraction, transformation and loading using Talend; designed data conversions from a large variety of source systems including Confidential 10g/9i/8i/7.x, DB2, Netezza, SQL Server, Confidential, Hive, Hana and non-relational sources such as flat files, XML and mainframe files.
- Involved in code migrations from Dev to QA and Production and provided operational instructions for deployments.
- Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregateRow, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tWarn, tMysqlSCD, tFilterRow, tGlobalMap, tDie, etc.
- Strong Data Warehousing ETL experience using Informatica 9.x/8.x/7.x Power Center client tools - Mapping Designer, Repository Manager, Workflow Manager/Monitor - and server tools - Informatica Server, Repository Server Manager.
- Expertise in Data Warehouse/Data Mart, ODS, OLTP and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation and production support.
- Expertise in using transformations such as Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Java Transformation, Router and Sequence Generator.
- Experienced in writing Hive queries to load data into HDFS (a brief sketch follows this summary).
- Experienced in designing ETL processes using Informatica to load data from sources to targets through data transformations.
- Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER-Studio.
- Expertise in working with relational databases such as Confidential 12c/11g/10g/9i/8.x, SQL Server 2012/2008/2005, DB2 8.0/7.0, UDB, MS Access and Netezza.
- Extensive experience in developing stored procedures, functions, views, triggers and complex SQL queries using SQL Server T-SQL and Confidential PL/SQL.
- Experienced in integrating various data sources such as Confidential SQL, PL/SQL, Netezza, SQL Server and MS Access into the staging area.
- Experienced in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
- Experienced in all phases of data warehouse development, from requirements gathering through code development, unit testing and documentation.
- Extensive experience in writing UNIX shell scripts and automating ETL processes with shell scripting; used Netezza utilities to load data and execute SQL scripts from UNIX.
- Proficient in the integration of various data sources with multiple relational databases like Oracle 12c/11g/10g/9i, MS SQL Server, DB2, Confidential, VSAM files and flat files into the staging area, ODS, Data Warehouse and Data Mart.
- Extensively worked with the Netezza database to implement data cleanup and performance tuning techniques.
- Experienced in using Automation Scheduling tools like Autosys and Control-M.
- Experience in data migration, consolidating data from different applications into a single application.
- Responsible for data migration from MySQL Server to Confidential databases.
- Experienced in batch scripting on windows and worked extensively with slowly changing dimensions.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.
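As referenced above, a minimal sketch of a shell-driven Hive load used to move flat-file data into HDFS; the file paths, database and table names are illustrative placeholders, not from a specific project:

```bash
#!/bin/bash
# Minimal sketch of a shell-driven Hive load; paths and table names are
# illustrative placeholders, not taken from an actual project.
FEED_FILE=/data/inbound/customer_feed_$(date +%Y%m%d).csv

# Land the delimited feed file in HDFS.
hdfs dfs -put -f "$FEED_FILE" /landing/customer_feed/

# Load the staged file into a Hive staging table.
hive -e "LOAD DATA INPATH '/landing/customer_feed/$(basename "$FEED_FILE")' INTO TABLE stg.customer_feed;"
```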
TECHNICAL SKILLS
Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS
ETL Tools: Talend, TOS, TIS, Informatica Power Center 9.x/8.x/7.x/6.x (Designer, Workflow Manager, Workflow Monitor, Repository manager and Informatica Server), SSIS, Ab-Initio.
Databases: Confidential 12c/11g/10g/9i/8i, MS SQL Server 2012/2008/2005, DB2 v8.1, Netezza, Confidential, HBase.
Methodologies: Data Modeling - Logical/Physical, Dimensional Modeling - Star/Snowflake
Languages: SQL, PL/SQL, UNIX shell scripting, C++, Web Services, JavaScript, HTML.
Scheduling Tools: Autosys, Control-M
Testing Tools: QTP, WinRunner, LoadRunner, Quality Center, Test Director, Clear test, Clear case.
PROFESSIONAL EXPERIENCE
Confidential, Lombard, IL
Sr. ETL/ Talend Developer
Responsibilities:
- Worked in the Data Integration Team to perform data and application integration, with the goal of moving more data more effectively, efficiently and with high performance to support business-critical projects involving large data extractions.
- Performed technical analysis, ETL design, development, testing and deployment of IT solutions as needed by the business or IT.
- Participated in designing the overall logical and physical Data Warehouse/Data Mart data models and data architectures to support business requirements.
- Performed data manipulations using various Talend components such as tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSQLInput and many more.
- Analyzed source data to assess data quality using Talend Data Quality.
- Troubleshot data integration issues and bugs, analyzed reasons for failure, implemented optimal solutions, and revised procedures and documentation as needed.
- Worked on migration projects to migrate data warehouses from Confidential/DB2 to Netezza.
- Used SQL queries and other data analysis methods, as well as the Talend Enterprise Data Quality platform, to profile and compare data used to decide how to measure business rules and data quality.
- Wrote Netezza SQL queries for table joins and modifications (see the sketch at the end of this section).
- Used Talend reusable components such as routines, context variables and globalMap variables.
- Responsible for tuning ETL mappings, workflows and the underlying data model to optimize load and query performance.
- Monitored and supported the Talend jobs scheduled through Talend Admin Center (TAC).
- Developed Confidential PL/SQL code and stored procedures, and worked on performance tuning of SQL.
Environment: Talend 6.1/5.6, Informatica Power Center 9.6.1, Netezza, Confidential 12c, DB2, Aginity, Business Objects 4.1, SQL Server 2012, XML, HBase, Spark, Hive, Pig, SQL, PL/SQL, JIRA.
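A minimal sketch of the kind of Netezza SQL run from the command line during these loads, as mentioned above; the database, table and column names are hypothetical:

```bash
#!/bin/bash
# Illustrative Netezza join query run through nzsql; the database, table and
# column names are hypothetical placeholders.
nzsql -d NZ_DW -c "
  SELECT f.order_id, f.order_amt, d.customer_name
  FROM   orders_fact f
  JOIN   customer_dim d ON f.customer_key = d.customer_key
  WHERE  f.load_date = CURRENT_DATE;
"
```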
Confidential, Irvine, CA
Sr. ETL /Talend Developer
Responsibilities:
- Implemented File Transfer Protocol operations using Talend Studio to transfer files between network folders.
- Worked on data migration using export/import.
- Created Talend jobs using the dynamic schema feature.
- Used a wide range of Talend components in job designs, including tJava, tOracle, tXMLMap, delimited file components, tLogRow and log-handling components, among others.
- Coordinated with the business to gather requirements and prepared Functional Specification documents.
- Automated the extraction of data from Netezza tables into external flat files and loaded the data into AWS S3 cloud storage in compressed (.csv) format (see the sketch at the end of this section).
- Implemented several Redshift query performance improvement measures to reduce the time taken by the data load into Redshift.
- Created Talend Development Standards. This document describes the general guidelines for Talend developers, the naming conventions to use in the Transformations and development and production environment structures.
- Worked on Talend ETL and used features such as Context variables, Database components like tMSSQLInput, tOracleOutput, file components, ELT components etc.
- Optimized the performance of the mappings by various tests on sources, targets and transformations.
- Involved in end-to-end testing of jobs.
- Loaded data from selected tables into AWS Redshift.
- Wrote complex SQL queries to extract data from various sources and integrated them with Talend.
- Developed complex Talend ETL jobs to migrate the data from flat files to database.
- Used transformations like Router, Update Strategy, Lookups, Normalizer, Filter, Joiner and Aggregator.
- Incorporated business logic for Incremental data loads on a daily basis.
- Wrote complex PL/SQL procedures for specific requirements.
- Used Shared folders for Source, Targets and Lookups for reusability of the objects.
- Scheduled the Informatica jobs from third party scheduling tool Autosys Scheduler.
- Involved in migrating Informatica from version 8.6 to 9.6.
- Performed an administrator role in migrating objects between environments (DEV/QA/PROD).
- Platform: Informatica 9.6, DB2 UDB, UNIX, Autosys, SQL Server 2008.
Environment: Informatica Power Center 8.6.1, 9.6.1, Confidential 11g, SQL, PL/SQL, TOAD, MY SQL, Unix, Hive, HBase, Spark, Pig, Autosys, Xml, Flat files.
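A minimal sketch of the Netezza-to-S3-to-Redshift flow referenced above; the table, bucket, cluster and IAM role names are placeholders, and the production jobs were parameterized rather than hard-coded:

```bash
#!/bin/bash
# Sketch of the Netezza extract -> S3 -> Redshift COPY flow; all object names
# (table, bucket, cluster, IAM role) are hypothetical placeholders.
TABLE=sales_txn
OUTFILE=/staging/${TABLE}_$(date +%Y%m%d).csv

# Unload the Netezza table to a comma-delimited flat file and compress it.
nzsql -d NZ_DW -A -t -F ',' -c "SELECT * FROM ${TABLE}" -o "$OUTFILE"
gzip -f "$OUTFILE"

# Stage the compressed file in S3.
aws s3 cp "${OUTFILE}.gz" "s3://etl-staging-bucket/${TABLE}/"

# Load the staged file into Redshift with a COPY command.
psql -h redshift-cluster.example.com -p 5439 -d dw -c "
  COPY ${TABLE}
  FROM 's3://etl-staging-bucket/${TABLE}/$(basename "$OUTFILE").gz'
  IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
  CSV GZIP;
"
```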
Confidential
Informatica Developer
Responsibilities:
- Performed extraction, transformation and loading of data into the database using Informatica. Involved in logical and physical modeling of the database.
- Designed ETL processes using Informatica to load data from Confidential and flat files into the target Confidential Data Warehouse database.
- Based on the requirements, created functional design documents and technical design specification documents for ETL.
- Created tables, views, indexes, sequences and constraints.
- Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
- Transferred data to the database using SQL*Loader.
- Involved in testing of Stored Procedures and Functions. Designed and developed table structures, stored procedures, and functions to implement business rules.
- Implemented SCD methodology including Type 1 and Type 2 changes.
- Extracted and loaded data from legacy systems, Confidential, and SQL Server sources.
- Involved in design and development of data validation, load process and error control routines.
- Used pmcmd to run workflows and created cron jobs to automate scheduling of sessions (see the sketch at the end of this section).
- Involved in ETL process from development to testing and production environments.
- Analyzed the database for performance issues and conducted detailed tuning activities for improvement.
- Generated monthly and quarterly inventory/purchase reports.
- Coordinated database requirements with Confidential programmers and wrote reports for sales data.
- Involved in creating database objects such as tables, stored procedures, views, triggers and user-defined functions for the project I was working on.
- Analyzed client requirements and translated them into technical requirements.
- Gathered requirements from end users and was involved in developing the logical model and implementing requirements in SQL Server 2000.
- Performed data migration (import and export using BCP) from text files to SQL Server.
- Responsible for creating reports based on the requirements using reporting services.
- Identified the database tables for defining the queries for the reports.
- Worked on SQL server queries, stored procedures, triggers and joins.
- Defined report layouts for formatting the report design as per the need.
- Identified and defined the datasets for report generation.
- Formatted the reports using global variables and expressions.
- Deployed generated reports onto the report server so they could be accessed through a browser.
- Maintained data integrity by performing validation checks.
Environment: Informatica Power Center 7.1, Oracle 9, Control-M, SQL Server 2005, XML, SQL, PL/SQL, UNIX Shell Script, MS SQL 2000, Windows Server 2000, SQL Query Analyzer, Enterprise Manager, MS Access 2000 and Windows NT platform.
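A minimal sketch of the pmcmd wrapper and cron entry described above; the domain, integration service, folder and workflow names are hypothetical:

```bash
#!/bin/bash
# run_wf_load_sales.sh - illustrative pmcmd wrapper; the domain, integration
# service, folder and workflow names are hypothetical placeholders.
pmcmd startworkflow \
  -sv INT_SVC_DEV -d Domain_Dev -u "$INFA_USER" -p "$INFA_PWD" \
  -f DWH_Folder -wait wf_load_sales_daily

# Example crontab entry (assumed schedule) to run the wrapper at 02:00 daily:
# 0 2 * * * /opt/etl/scripts/run_wf_load_sales.sh >> /var/log/etl/wf_load_sales.log 2>&1
```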