Sr. Talend Developer Resume
Sr. Talend Developer, NJ
SUMMARY
- 8+ years of IT experience across all phases of the data warehouse life cycle, including requirement gathering, design, development, analysis, documentation, and testing of data warehouses using ETL, data modeling, and online analytical processing.
- Experience in development and design of ETL methodology for supporting data transformations & processing in a corporate wide ETL Solution using Informatica Power Center, SQL, PL/SQL and UNIX.
- Solid experience implementing complex business rules by creating reusable transformations and robust mappings/mapplets using transformations such as Unconnected and Connected Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
- Experience in modifying Visualforce pages to be supported in Lightning Experience, with a good understanding of Lightning mode and its features.
- Experience in Data Warehousing and Worked on various projects involving Data warehousing using Informatica Power Center (Workflow Manager, Workflow Monitor, Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer & Transformation Developer).
- Hands-on involvement with many of the palette components used to design jobs, and used context variables to parameterize Talend jobs.
- Experienced with Talend Data Fabric ETL components, including context variables and the MySQL, Oracle, and Hive database components.
- Worked on Lightning Process builder flows, Connect API, Chatter and quick Action
- Tracked daily data loads and monthly data extracts and sent them to the client for verification.
- Strong experience in designing and developing Business Intelligence solutions in Data Warehousing using ETL Tools.
- Excellent understanding and best practice of Data Warehousing Concepts involved in Full Development life cycle of Data Warehousing.
- Experienced in analyzing, designing and developing ETL strategies and processes, writing ETL specifications.
- Involved in extracting user's Data from various Data sources into Hadoop Distributed File Systems (HDFS)
- Experience with the MapReduce programming model, Pig, and the installation and configuration of Hadoop, HBase, Hive, Pig, Sqoop, and Flume using Linux commands.
- Experienced in using Talend Data Fabric tools (Talend DI, Talend MDM, Talend DQ, Talend Data Preparation, ESB, TAC).
- Experienced in working with different data sources like Flat files, Spreadsheet files, log files and Databases.
- Knowledge in Data Flow Diagrams, Process Models, E-R diagrams with modeling tools like ERwin & ERStudio.
- Experience in AWS S3, RDS (MySQL) and Redshift cluster configuration.
- Extensive experience in J2EE platform including, developing both front end & back end applications using Java, Servlets, JSP, EJB, AJAX, Spring, Struts, Hibernate, JAXB, JMS, JDBC, Web Services.
- Strong understanding of data modeling (relational, dimensional, star and snowflake schemas) and data analysis for implementing data warehouses on Windows and UNIX.
- Extensive experience developing stored procedures, functions, views, triggers, and complex queries using SQL Server.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Worked in all phases of BW/BI full life cycles including Analysis, Design, Development, Testing, Deployment, Post-Production Support/Maintenance, Documentation and End-User Training.
- Highly Proficient in Agile, Test Driven, Iterative, Scrum and Waterfall software development life cycle.
- Highly motivated with the ability to work effectively in teams as well as independently.
TECHNICAL SKILLS
ETL Tools: Enterprise Manager, Informatica PowerCenter 9.5/9.1/8.6.1/8.1.1/7.1.2/6.2, SSIS, Profiler, AutoSys, Talend 6.2.1/6.0.1, Talend Open Studio, Big Data/DQ/DI, Talend Administration Center
Programming Languages: SQL, T-SQL, PL/SQL, C, C++, Java, VC++.Net, VB Script
Databases: MS SQL Server 2000/2005/2008/2012/2014, SAP HANA, Oracle, Redshift, Snowflake
Operating Systems: Windows XP Professional/Standard, Windows 98, UNIX
Reporting Tools: MS SQL Server Reporting Services, Tableau 8/9, Crystal Reports, Excel, Macros
PROFESSIONAL EXPERIENCE
Confidential, NJ
Sr. Talend Developer
Responsibilities:
- Developed custom components and multi-threaded configurations for flat files by writing Java code in Talend.
- Worked in the Data Integration Team to perform data and application integration with a goal of moving high volume data more effectively, efficiently and with high performance to assist in business-critical projects.
- Interacted with solution architects and business analysts to gather requirements and update the solution architecture document; created mappings and sessions to implement technical enhancements.
- Deployed and scheduled Talend jobs in the Administration Center and monitored their execution.
- Created separate branches within the Talend repository for development, production, and deployment.
- Excellent knowledge of the Talend Administration Center, Talend installation, and the use of context and globalMap variables in Talend.
- Review requirements to help build valid and appropriate DQ rules and implement DQ Rules using Talend DI jobs.
- Create cross-platform Talend DI jobs to read data from multiple sources like Hive, Hana, Teradata, DB2, Oracle, ActiveMQ, Redshift, Snowflake.
- Create Talend Jobs for data comparison between tables across different databases, identify and report discrepancies to the respective teams.
- Performed Talend administrative tasks such as upgrades, creating and managing user profiles and projects, managing access, monitoring, and setting up TAC notifications.
- Observed Talend job statistics in the Activity Monitoring Console (AMC) to improve performance and identify the scenarios causing errors.
- Created Generic and Repository schemas.
- Performed data manipulations using various Talend components such as tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSqlInput, and many more.
- Implemented complex business rules by creating reusable transformations and robust mappings using Talend transformations such as tConvertType, tSortRow, tReplace, tAggregateRow, and tUnite.
- Created standards and best practices for Talend ETL components and jobs.
- Extracted, transformed, and loaded data from various file formats such as .csv, .xls, .txt, and other delimited formats using Talend Open Studio.
- Worked with HiveQL to retrieve data from the Hive database, and worked with Redshift and Snowflake databases.
- Responsible for developing a data pipeline on Amazon AWS to extract data from weblogs and store it in HDFS.
- Executed Hive queries on Parquet tables stored in Hive to perform data analysis to meet the business requirements.
- Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed.
- Responsible to tune ETL mappings, Workflows and underlying data model to optimize load and query performance.
- Configure Talend Administration Center (TAC) for scheduling and deployment.
- Created and scheduled execution plans to build job flows.
- Worked with production support in finalizing scheduling of workflows and database scripts using AutoSys.
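Custom logic like the Java-based data manipulations above is typically packaged as a Talend routine: a plain Java class whose static methods are callable from tMap expressions. A minimal, hypothetical sketch (class and method names are illustrative, not from an actual project):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;

// Hypothetical Talend-style routine: static helpers callable from tMap expressions.
// In a real project this would live under the repository's Code > Routines node.
public class StringUtilsRoutine {

    // Null-safe trim: tMap expressions often need to guard against null inputs.
    public static String safeTrim(String value) {
        return value == null ? "" : value.trim();
    }

    // Normalize a date string from one pattern to another, returning null on bad
    // input so a tMap reject flow can capture the row.
    public static String normalizeDate(String value, String inPattern, String outPattern) {
        if (value == null || value.trim().isEmpty()) {
            return null;
        }
        try {
            SimpleDateFormat in = new SimpleDateFormat(inPattern);
            in.setLenient(false); // reject impossible dates such as month 13
            SimpleDateFormat out = new SimpleDateFormat(outPattern);
            return out.format(in.parse(value.trim()));
        } catch (ParseException e) {
            return null;
        }
    }
}
```

In a tMap expression this might be invoked as `StringUtilsRoutine.normalizeDate(row1.birth_dt, "MM/dd/yyyy", "yyyy-MM-dd")`, with null results routed to a reject output.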
Environment: Talend 6.1/5.6, Netezza, Redshift, IBM DB2, TOAD, Aginity, BusinessObjects 4.1, MLOAD, SQL Server 2012, XML, SQL, PL/SQL, Hive, Pig, HP ALM, JIRA, Amazon EC2, Apache Hadoop 1.0.1, MapReduce, HDFS, CentOS 6.4, HBase, Kafka, Scala, Elasticsearch, Oozie, Flume, Java (JDK 1.6), Eclipse, Sqoop, Ganglia, Snowflake
Confidential, Topeka, KS
Talend Developer
Responsibilities:
- Worked in the Data Integration Team to perform data and application integration with a goal of moving high volume data more effectively, efficiently and with high performance to assist in business-critical projects.
- Excellent knowledge of the Talend Administration Center, Talend installation, and the use of context and globalMap variables in Talend.
- Review requirements to help build valid and appropriate DQ rules and implement DQ Rules using Talend DI jobs.
- Create cross-platform Talend DI jobs to read data from multiple sources like Hive, Hana, Teradata, DB2, Oracle, ActiveMQ.
- Worked on Salesforce Lightning components, building customized components to replace existing ones.
- Create Talend Jobs for data comparison between tables across different databases, identify and report discrepancies to the respective teams.
- Performed Talend administrative tasks such as upgrades, creating and managing user profiles and projects, managing access, monitoring, and setting up TAC notifications.
- Observed Talend job statistics in the Activity Monitoring Console (AMC) to improve performance and identify the scenarios causing errors.
- Created Generic and Repository schemas.
- Performed data manipulations using various Talend components such as tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSqlInput, and many more.
- Implemented complex business rules by creating reusable transformations and robust mappings using Talend transformations such as tConvertType, tSortRow, tReplace, tAggregateRow, and tUnite.
- Created standards and best practices for Talend ETL components and jobs.
- Extracted, transformed, and loaded data from various file formats such as .csv, .xls, .txt, and other delimited formats using Talend Open Studio.
- Worked with HiveQL to retrieve data from the Hive database.
- Created many Lightning Components and server-side controllers to meet the business requirements
- Responsible for developing data pipeline with Amazon AWS to extract the data from weblogs and store in HDFS.
- Involved in developing Salesforce Lightning apps, components, controllers, and events; skilled in understanding and implementing the new Salesforce Lightning Experience.
- Executed Hive queries on Parquet tables stored in Hive to perform data analysis to meet the business requirements.
- Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed.
- Responsible to tune ETL mappings, Workflows and underlying data model to optimize load and query performance.
- Configure Talend Administration Center (TAC) for scheduling and deployment.
- Created and scheduled execution plans to build job flows.
- Worked with production support in finalizing scheduling of workflows and database scripts using AutoSys.
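The cross-database comparison jobs described above boil down to matching rows by key on both sides and flagging discrepancies. A simplified, hypothetical Java sketch of that core check, with each table reduced to a key-to-value map (real jobs would compare full rows streamed from live connections):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Objects;

// Hypothetical row-diff helper: all names are illustrative, not from a real project.
public class RowDiff {

    // Compare two tables represented as primaryKey -> rowValue maps and report
    // discrepancies: keys missing on either side, and keys whose values differ.
    public static List<String> diff(Map<String, String> source, Map<String, String> target) {
        List<String> issues = new ArrayList<>();
        for (Map.Entry<String, String> e : source.entrySet()) {
            if (!target.containsKey(e.getKey())) {
                issues.add("missing in target: " + e.getKey());
            } else if (!Objects.equals(e.getValue(), target.get(e.getKey()))) {
                issues.add("value mismatch: " + e.getKey());
            }
        }
        // Rows present only in the target are also discrepancies.
        for (String key : target.keySet()) {
            if (!source.containsKey(key)) {
                issues.add("missing in source: " + key);
            }
        }
        return issues;
    }
}
```

In a Talend job the same effect is often achieved with two inputs joined in a tMap on the key, with inner-join rejects feeding the discrepancy report.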
Environment: Talend 6.1/5.6, Netezza, Oracle 12c, IBM DB2, TOAD, Aginity, BusinessObjects 4.1, MLOAD, SQL Server 2012, XML, SQL, PL/SQL, Hive, Pig, HP ALM, JIRA, Amazon EC2, Apache Hadoop 1.0.1, MapReduce, HDFS, CentOS 6.4, HBase, Kafka, Scala, Elasticsearch, Oozie, Flume, Java (JDK 1.6), Eclipse, Sqoop, Ganglia, UNIX shell scripting
Confidential, San Jose, CA
ETL Developer
Responsibilities:
- Acquire and interpret business requirements, create technical artifacts, and determine the most efficient/appropriate solution design, thinking from an enterprise-wide view.
- Worked in the Data Integration Team to perform data and application integration, with the goal of moving high volumes of data effectively, efficiently, and with high performance to assist in business-critical projects involving large data extractions.
- Perform technical analysis, ETL design, development, testing, and deployment of IT solutions as needed by business or IT.
- Participate in designing the overall logical & physical Data warehouse/Data-mart data model and data architectures to support business requirements
- Performed data manipulations using various Talend components like tMap, tJavarow, tjava, tMysqlRow, tMysqlInput, tMysqlOutput, tMSSQLInput and many more.
- Analyzed source data to assess data quality using Talend Data Quality.
- Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed.
- Worked on migration projects to move data from SQL Server data warehouses to MySQL.
- Used SQL queries and other data analysis methods, as well as Talend Enterprise Data Quality Platform for profiling and comparison of data, which will be used to make decisions regarding how to measure business rules and quality of the data.
- Optimized SQL queries for MySQL 5.7 to support MicroStrategy reports.
- Developed and scheduled jobs in Talend Integration Suite.
- Wrote MySQL queries for joins and other table modifications.
- Used Talend reusable components such as routines, context variables, and globalMap variables.
- Responsible to tune ETL mappings, Workflows and underlying data model to optimize load and query Performance.
- Developed Talend ESB services and deployed them on ESB servers on different instances.
- Implementing fast and efficient data acquisition using Big Data processing techniques and tools.
- Monitored and supported the Talend jobs scheduled through Talend Admin Center (TAC).
- Worked on Complex Data Migration from MySQL tables to JSON Structure.
- Worked on historical data migration from various data sources, such as SQL Server and Oracle, to MySQL tables using Talend.
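A MySQL-to-JSON migration like the one above maps each relational row to a JSON object. A minimal, hypothetical sketch of that per-row conversion using only the JDK (a production job would more likely use a Talend component such as tWriteJSONField or a JSON library):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical helper: serialize one row (column -> value) into a JSON object string.
// Names are illustrative; escaping is deliberately minimal (backslashes and quotes only).
public class RowToJson {

    public static String toJson(LinkedHashMap<String, Object> row) {
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, Object> e : row.entrySet()) {
            if (!first) sb.append(",");
            first = false;
            sb.append("\"").append(escape(e.getKey())).append("\":");
            Object v = e.getValue();
            if (v == null) {
                sb.append("null");              // SQL NULL -> JSON null
            } else if (v instanceof Number || v instanceof Boolean) {
                sb.append(v);                   // numbers/booleans stay unquoted
            } else {
                sb.append("\"").append(escape(v.toString())).append("\"");
            }
        }
        return sb.append("}").toString();
    }

    private static String escape(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"");
    }
}
```

A LinkedHashMap preserves column order, so the JSON fields come out in the same order as the source table's columns; control characters would need additional escaping in real use.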
Environment: Talend 6.3.1, TAC, MySQL, TOAD, Aginity, SQL Server 2012, XML, SQL, PL/SQL, Hive, Pig, HP ALM, JIRA.
Confidential, Boston, MA
Administrator (Linux, Solaris)
Responsibilities:
- Installed, configured, and upgraded operating systems: Sun Solaris 8/9/10 on x86 servers and Red Hat Linux 6.x on Sun and HP DL585 servers.
- Installed patches, security fixes, e-fixes, and packages on AIX, Sun Solaris, and Red Hat Linux.
- Performed basic network troubleshooting via various built-in command line tools.
- Monitoring and managing logs on multiple platforms (Solaris 10 and RHEL) using command line tools including processes, swap management for performance tuning.
- Wrote shell scripts for system maintenance and file management, along with Python and Ruby scripts.
- Strong skills in OpenStack installation; used this open-source cloud software to manage large pools of compute, storage, and networking resources throughout a datacenter.
- Good experience with and understanding of the OpenStack ecosystem and Amazon cloud services.
- Automated deployments and configurations using Ansible, Puppet/Puppet Enterprise, Chef, and other tools.
- Worked with Magento, Magento extensions, backends, SEO platforms, networking, and server systems; integrated e-commerce APIs, inventory and payment systems, and marketplace APIs such as eBay and Amazon with Magento.
- Experience with security practices, including PCI-DSS compliance of web ecommerce systems.
- Performed installation, configuration, and maintenance of Red Hat Linux 4.x, 5.x, and 6.x; performed remote installations using JumpStart, NIM, and Kickstart.
- Working experience with WebLogic and Exalogic in different environments.
- Assisted in development and execution of RPM packages for installation of in house applications.
- Provided technical assistance for analysis and complete automation of business processes by usage of shell scripts.
- Coordinated with application development teams for preparation of shell scripts and RPM packages.
- Installed, configured and maintained VIO server on IBM AIX 5.3 based P-Series 570/ 550 servers.
- Expertise in configuration, management, installation, and troubleshooting of Brocade and Confidential fabric switches.
- Experience virtualizing SUSE servers using vSphere 4.1; managed ESX servers using vCenter and worked with ESX clusters and vMotion.
- Maintained CPU and memory resources in micro-partitioned VIO server and client environments.
- Experience setting up OpenStack alongside Amazon cloud services in IT organizations as Infrastructure-as-a-Service (IaaS) on Red Hat servers.
- Participated in complete process of functionality testing of all versions of AutoSys applications.
- Communicated with team members for regular updating of software product development packages for Solaris and Linux systems.
- Implemented processes for development of application installation methods for SUN, AIX and Windows applications
- Coordinated with technical teams for management of software upgrading and migration projects.
- Experience with migration from EMC to Hitachi storage.
- Configured and administered DNS and LDAP in an AIX server environment.
- Performed Red Hat Linux kernel and memory upgrades and undertook Red Hat Linux Kickstart installations. Attended ITIL training and applied it in projects.
- Set up full networking services and protocols on UNIX, including NIS/NFS, DNS, SSH, DHCP, NIDS, TCP/IP, ARP, applications, and print servers to ensure optimal networking, application, and printing functionality.
Environment: Red Hat Enterprise Linux 4.x/5.x/6.x, SUSE Linux 9/10, Tivoli Storage Manager, Oracle 9i, Logical Volume Manager for Linux, VMware ESX Server 2.x, Samba 3.0, Veritas NetBackup, Apache 2.0, HACMP, HMC, ILO, RAID, WebLogic, IBM P570, P590, P595, Hitachi disk arrays.
Confidential, Richardson, TX
Linux Administrator
Responsibilities:
- Worked with IBM AIX 5.1 under P Series and RS6000 system environment.
- IBM AIX 5.1 System installation and configuration, problem determination, solutions design and implementation, Maintenance, Performance tuning, disk mirroring using LVM, backup, disaster recovery, trouble shooting and user management.
- Writing Shell scripts for system maintenance and file management and python scripts
- Worked as a Red Hat Linux/UNIX system administrator for the UNIX Engineering Group, handling 150 servers; deployed MySQL database platforms in various production environments.
- Experience with WebLogic (VG4) across about 135 Linux environments.
- Configured WebLogic Server clusters with cluster-wide optimizations.
- Experience virtualizing SUSE servers using vSphere 4.1; managed ESX servers using vCenter and worked with ESX clusters and vMotion.
- Expertise in Linux command line
- Maintained and troubleshot SAN and fibre card related errors on AIX; maintained and troubleshot storage disks from IBM, Hitachi, and EMC disk arrays.
- Installed SUSE Linux through Autoyast via NFS and HTTP on Dell Servers
- Installed and worked with VMware Workstation 5.0.0 and installed VMware Tools.
Environment: Red Hat Linux 3.x/4.x/5.x, IBM AIX 4.0/5.0/5.1/5.2, LPAR, HACMP 4.2.x/4.3/4.3.1/4.4, Tivoli Storage Manager, Apache Server 1.x/2.x, VMware 2.x, NIM, Veritas NetBackup, JBoss, WebLogic 5.1/6.1/7.0/8.1, Hitachi, EMC.
