Sr. ETL/Talend Developer Resume
Phoenix, AZ
SUMMARY:
- 8+ years of experience in Analysis, Design, Development, Testing, Implementation, Enhancement and Support of ETL applications, including strong experience in OLTP and OLAP environments as a Data Warehouse/Business Intelligence Consultant.
- 4+ years of experience in Talend Open Studio (6.x/5.x) for Data Integration, Data Quality and Big Data.
- 2+ years of experience with Talend Administration Center (TAC).
- Experienced working with Hadoop and its ecosystem services: HDFS, MapReduce, Hive, Sqoop, Flume, HBase, and MongoDB.
- Experience in dealing with structured and semi-structured data in HDFS.
- Experienced in Data Migration from MySQL Tables to JSON format.
- Experience working with Data Warehousing concepts like Ralph Kimball Methodology, Bill Inmon Methodology, OLAP, OLTP, Star Schema, Snowflake Schema, Fact Table, Dimension Table, Logical Data Modeling, Physical Modeling, and Dimensional Data Modeling.
- Expertise in ETL designs involving source databases (Oracle, DB2, SQL Server) and flat files (fixed-width, delimited), and target databases (Oracle, Teradata) and flat files.
- Developed ETL mappings for XML, .csv, and .txt sources and loaded data from these sources into relational tables with Talend; developed Joblets for reusability and to improve performance.
- Extensive experience in ETL methodology for performing Data Profiling, Data Migration, Extraction, Transformation and Loading using Talend, and designed data conversions from a wide variety of source systems.
- Extensively created mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tJava, tJavaRow, tDie, tAggregateRow, tWarn, tLogCatcher, tMysqlSCD, tFilterRow, tGlobalMap, etc.
- Experience in Talend Administration Center (TAC) for scheduling and deployment.
- Experience in using AWS cloud components and connectors to make API calls for accessing data from cloud storage (Amazon S3, Redshift) in Talend Enterprise Edition.
- Worked on Talend DQ module for Data Profiling on source tables.
- Knowledge of Hadoop architecture and various components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
- Expertise in designing and developing complex mappings from varied transformation logic like Unconnected and Connected lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Union, Sequence Generator, Sorter, Rank, Normalizer etc.
- Strong Technical skills in range of technologies which includes Talend, Informatica, Teradata, Oracle, Sybase, DB2, COBOL, PL/SQL and Unix.
- Experience in writing database scripts such as SQL queries, PL/SQL Stored Procedures, Indexes, Functions, Views, and Triggers.
- Experience in Waterfall and Agile methodologies.
- Performed data validation through unit testing, integration testing, and system testing.
- Extensive experience in onshore-offshore coordination, design reviews, code reviews, and implementing standards.
- Extensive experience in J2EE platform including, developing both front end & back end applications using Java, Servlets, JSP, EJB, AJAX, Spring, Struts, Hibernate, JAXB, JMS, JDBC, Web Services.
- Excellent interpersonal and communication skills; experienced in working with senior-level managers, business customers, and developers across multiple disciplines.
TECHNICAL SKILLS:
Data Warehousing: Talend Data Management Platform 6.x
Big Data: HDFS, Hive, Pig, Spark, HBase, Scala, Impala, Sqoop, Flume
Databases: Teradata, MySQL 5.7, Oracle 11g/10g/9i, MS SQL Server 2012/2008, Sybase, MS Access.
Tools: ERwin, ER Studio, Visio, Toad, Teradata SQL Assistant, SQL Plus, Data Studio, MS Office.
Methodologies: Kimball/Inmon Dimensional Data Modeling, Star Schema, Snowflake Schema, Agile, Waterfall.
Languages: SQL, T-SQL, PL/SQL, C, JAVA, UNIX Shell Scripting.
Operating Systems: Unix (AIX, HP-UX), LINUX, Windows
PROFESSIONAL EXPERIENCE:
Confidential, Boise, ID
Talend Big Data Developer
Responsibilities:
- Involved in various projects such as Data Migration, Data warehousing & Automation Projects.
- Loaded and transformed large sets of structured data from Oracle and SQL Server into HDFS using Talend Big Data studio.
- Used Big Data (Hive) components for extracting data from Hive sources.
- Wrote HiveQL queries using joins and implemented them in the tHiveInput component (a join-query sketch follows this list).
- Used DOM4J and SAX parsers to flatten XML files.
- Utilized Big Data components like tHiveInput, tHiveOutput, tHDFSOutput, tHiveRow, tHiveLoad, tHiveConnection, tOracleInput, tOracleOutput, tPreJob, tPostJob, tLogRow.
- Deployed and scheduled Talend jobs in the Administration Console and monitored their execution.
- Created separate branches within the Talend repository for Development, Production, and Deployment.
- Excellent knowledge of the Talend Administration Console and Talend installation.
- Developed custom components and multi-threaded flat-file processing by writing Java code in Talend (a minimal threading sketch follows this list).
- Hands-on experience with many components from the palette to design jobs; used context variables to parameterize Talend jobs.
- Involved in a large data migration of 80+ MySQL tables to JSON format using Talend, including complex restructuring of the JSON (see the row-to-JSON sketch after this list).
- Stored MapReduce program output in Amazon S3 and developed a script to move the data to Redshift for generating a dashboard using QlikView.
- Created Talend jobs for data comparison between tables across different databases, identifying and reporting discrepancies to the respective teams (see the comparison sketch after this list).
- Configured Talend Administration Center (TAC) for scheduling and deployment of jobs.
- Involved in performance tuning of Talend jobs.
- Hands-on experience in creating generic schemas, context groups, and variables to run jobs against different environments like Dev, Test, and Prod (a context-variable snippet follows this list).
- Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSqlInput, and many more.
- Implemented complex business rules by creating reusable transformations and robust mappings using Talend components like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.
- Created standards and best practices for Talend ETL components and jobs.
- Extracted, transformed, and loaded data from various file formats (.csv, .xls, .txt) and delimited formats using Talend Data Management Platform.
- Worked with parallel connectors for parallel processing to improve job performance while working with bulk data sources in Talend.
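The HiveQL join work above ran through tHiveInput; purely as an illustration, a minimal Java sketch using the Hive JDBC driver, where the HiveServer2 host and the orders/customers tables are assumptions (tHiveInput accepts the same query text directly):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJoinQuery {
    public static void main(String[] args) throws Exception {
        // HiveServer2 endpoint and schema are placeholders for this sketch.
        String url = "jdbc:hive2://hive-host:10000/default";
        String hql = "SELECT o.order_id, c.customer_name, o.order_total "
                   + "FROM orders o JOIN customers c ON o.customer_id = c.customer_id "
                   + "WHERE o.order_date >= '2017-01-01'";
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(url, "etl_user", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(hql)) {
            while (rs.next()) {
                System.out.printf("%s,%s,%s%n",
                        rs.getString(1), rs.getString(2), rs.getString(3));
            }
        }
    }
}
```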
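The custom multi-threaded flat-file processing refers to plain Java of roughly this shape; a minimal producer/consumer sketch, assuming a hypothetical customers.dat file and a worker count of four:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class ParallelFlatFileReader {
    private static final String EOF = "__EOF__"; // sentinel marking end of input
    private static final int WORKERS = 4;        // assumed worker count

    public static void main(String[] args) throws Exception {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>(10_000);
        ExecutorService pool = Executors.newFixedThreadPool(WORKERS);
        for (int i = 0; i < WORKERS; i++) {
            pool.submit(() -> {
                try {
                    String record;
                    while (!EOF.equals(record = queue.take())) {
                        process(record); // transform/validate one delimited record
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }
        try (BufferedReader in = new BufferedReader(new FileReader("customers.dat"))) {
            String line;
            while ((line = in.readLine()) != null) {
                queue.put(line); // hand each record to the worker pool
            }
        }
        for (int i = 0; i < WORKERS; i++) {
            queue.put(EOF); // one sentinel per worker so all of them exit
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }

    private static void process(String record) {
        // placeholder: parse fields and apply business rules here
    }
}
```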
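The MySQL-to-JSON migration itself was built from Talend components; as a sketch of the underlying transformation only, a JDBC-plus-Jackson version, where the connection details and the customers table are hypothetical:

```java
import java.io.File;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class TableToJson {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        ArrayNode rows = mapper.createArrayNode();
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:mysql://db-host:3306/sales", "etl_user", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM customers")) {
            ResultSetMetaData md = rs.getMetaData();
            while (rs.next()) {
                ObjectNode row = rows.addObject();
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    // treat every column as a string for simplicity
                    row.put(md.getColumnLabel(i), rs.getString(i));
                }
            }
        }
        mapper.writerWithDefaultPrettyPrinter()
              .writeValue(new File("customers.json"), rows);
    }
}
```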
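A simplified sketch of the cross-database comparison jobs: compare row counts for the same logical table in two databases and flag discrepancies (URLs, credentials, and table names are placeholders; the real jobs compared column-level data in Talend):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class TableCountCompare {
    static long count(String url, String user, String pw, String table) throws SQLException {
        try (Connection c = DriverManager.getConnection(url, user, pw);
             Statement s = c.createStatement();
             ResultSet rs = s.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws SQLException {
        long src = count("jdbc:oracle:thin:@ora-host:1521:ORCL", "etl", "pw", "SALES.ORDERS");
        long tgt = count("jdbc:mysql://mysql-host:3306/sales", "etl", "pw", "orders");
        System.out.println(src == tgt
                ? "Counts match: " + src
                : "DISCREPANCY: source=" + src + ", target=" + tgt);
    }
}
```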
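In Talend, context-group variables surface inside tJava as typed fields on the generated context object; a snippet with assumed variable names (dbHost, dbName), each environment's context supplying its own values:

```java
// Body of a tJava component. context.dbHost and context.dbName are assumed
// context-group variables; Dev/Test/Prod contexts resolve them differently.
String url = "jdbc:oracle:thin:@" + context.dbHost + ":1521:" + context.dbName;
globalMap.put("jdbcUrl", url); // share the resolved URL with downstream components
System.out.println("Running against: " + url);
```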
Environment: Talend 6.3.1, Talend Data Management Platform, Talend Administration Console, HDFS, Hive, Impala, XML, MySQL 5.7, Amazon S3, Redshift, UNIX Shell Scripting.
Confidential, Phoenix, AZ
Sr. ETL/ Talend Developer
Responsibilities:
- Worked closely with Business Analysts to review the business specifications of the project and to gather the ETL requirements.
- Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
- Analyzed the source data to assess data quality using Talend Data Quality.
- Designed and implemented ETL to load data from heterogeneous sources into SQL Server and Oracle target databases, including fact tables and Slowly Changing Dimensions (SCD Type 2) to capture changes (an SCD Type 2 sketch follows this list).
- Used components like tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHBaseInput, tHBaseOutput, tSqoopImport, and tSqoopExport.
- Migrated on-premise database structures to the Amazon Redshift data warehouse.
- Performance tuning: used tMap cache properties, multi-threading, and tParallelize components for better performance with large source data; tuned source SQL queries to filter out unwanted data early in the ETL process.
- Extensively used the tMap component for join logic, along with tJava, tOracle, XML and delimited-file components, tLogRow, and others; worked with over 100 components across my jobs.
- Implemented File Transfer Protocol (FTP) operations using Talend Studio to transfer files between network folders using components like tFTPConnection, tFTPFileList, tFTPGet, and tFTPPut.
- Designed, developed and improved complex ETL structures to extract transform and load data from multiple data sources into data warehouse and other databases based on business requirements.
- Used custom code components like tJava, tJavaRow, and tJavaFlex.
- Created Talend jobs to load data into various Oracle tables; utilized Oracle stored procedures and wrote Java code to capture globalMap variables and use them in the job (a globalMap snippet follows this list).
- Experienced in using debug mode of Talend to debug a job to fix errors.
- Used TAC (Talend Administration Center) and set up new users, projects, and tasks within multiple TAC environments (Dev, Test, Prod, DR).
- Scheduled the ETL jobs in TAC using file-based and time-based triggers.
- Experience in Agile methodology.
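The SCD Type 2 loads were configured in Talend; as a sketch of the equivalent SQL Server logic (the dimension, staging table, and column names are hypothetical), first expire changed rows, then insert new current versions:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class ScdType2Load {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://dw-host;databaseName=EDW"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "etl", "pw")) {
            conn.setAutoCommit(false);
            try (Statement stmt = conn.createStatement()) {
                // Step 1: close out current rows whose tracked attribute changed.
                stmt.executeUpdate(
                    "UPDATE d SET d.effective_end = GETDATE(), d.is_current = 0 " +
                    "FROM dim_customer d JOIN stg_customer s " +
                    "  ON d.customer_id = s.customer_id " +
                    "WHERE d.is_current = 1 AND d.address <> s.address");
                // Step 2: insert a fresh current version for new and changed keys
                // (both now lack a current row after step 1).
                stmt.executeUpdate(
                    "INSERT INTO dim_customer " +
                    "  (customer_id, address, effective_start, effective_end, is_current) " +
                    "SELECT s.customer_id, s.address, GETDATE(), NULL, 1 " +
                    "FROM stg_customer s " +
                    "LEFT JOIN dim_customer d " +
                    "  ON d.customer_id = s.customer_id AND d.is_current = 1 " +
                    "WHERE d.customer_id IS NULL");
            }
            conn.commit();
        }
    }
}
```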
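The globalMap capture mentioned above follows Talend's standard pattern of reading per-component statistics; a tJava snippet, assuming a component labeled tOracleOutput_1:

```java
// Body of a tJava component placed after the Oracle load. Talend publishes
// per-component stats into globalMap; the key prefix matches the component name.
Integer rowsLoaded = (Integer) globalMap.get("tOracleOutput_1_NB_LINE");
globalMap.put("loadCount", rowsLoaded); // reuse later, e.g. in an audit insert
System.out.println("Rows loaded into Oracle: " + rowsLoaded);
```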
Environment: Talend Enterprise for Big Data (v6.0.1, 5.6.2/5.6.1), UNIX, SQL, Hadoop, Hive, Oracle, UNIX Shell Scripting, Microsoft SQL Server Management Studio.
Confidential, Hartford, CT
ETL/Talend Developer
Responsibilities:
- Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts.
- Developed high-level and detailed technical and functional documents, including detailed design documentation, functional test specifications with use cases, and unit test documents.
- Developed jobs in Talend Enterprise Edition covering stage, intermediate, conversion, and target layers.
- Developed PL/SQL triggers and master tables for automatic creation of primary keys (a trigger sketch follows this list).
- Involved in Talend Data Integration and Talend platform setup on Windows and UNIX systems.
- Created joblets in Talend for processes reused across most jobs in a project, such as Start Job and Commit Job.
- Created complex mappings in Talend using tHash, tDenormalize, tMap, tJoin, tReplicate, tParallelize, tJava, tJavaRow, tUniqRow, and tPivotToColumnsDelimited, as well as custom components such as tUnpivotRow.
- Used tStatsCatcher, tDie, tLogRow, tWarn, and tLogCatcher to create a generic joblet that stores processing stats in a database table to record job history.
- Created Talend Mappings to populate the data into dimensions and fact tables.
- Developed complex Talend ETL jobs to migrate the data from flat files to database.
- Implemented custom error handling in Talend jobs and worked on different methods of logging.
- Prepared ETL mapping documents for every mapping, and data migration documents for the smooth transfer of the project from development to testing and then to production.
- Developed an error logging module to capture both system errors and logical errors, including email notification and moving files to error directories.
- Created a Talend ETL job to receive attachment files from POP e-mail using tPOP, tFileList, and tFileInputMail, then loaded data from the attachments into a database and archived the files.
- Worked in all phases of Data Integration from heterogeneous sources, legacy systems to Target Database.
- Worked with different caches such as index cache, data cache, lookup cache (static, dynamic, and persistent), and join cache while developing the mappings.
- Involved in UNIX shell scripting and configuring jobs in Control-M.
- Implemented Error handling in Talend to validate the data integrity and data completeness for the data from the flat file.
- Created jobs and job variable files for Teradata TPT and loaded data using the tbuild command from the command line.
- Implemented agile development methodology using XP, Scrum and Kanban/Continuous Flow.
- Created FTP scripts and Conversion scripts to convert data into flat files to be used for Talend jobs.
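The primary-key triggers followed the standard pre-12c Oracle sequence-plus-trigger pattern; a sketch that creates one via JDBC (the sequence, table, and column names are hypothetical):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreatePkTrigger {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@ora-host:1521:ORCL"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "etl", "pw");
             Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE SEQUENCE orders_seq START WITH 1 INCREMENT BY 1");
            // BEFORE INSERT trigger pulls the next sequence value whenever
            // the primary key is not supplied by the caller.
            stmt.execute(
                "CREATE OR REPLACE TRIGGER orders_pk_trg " +
                "BEFORE INSERT ON orders FOR EACH ROW " +
                "BEGIN " +
                "  IF :NEW.order_id IS NULL THEN " +
                "    SELECT orders_seq.NEXTVAL INTO :NEW.order_id FROM dual; " +
                "  END IF; " +
                "END;");
        }
    }
}
```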
Environment: Talend 6.0.1/5.5, Oracle 11g, Teradata V 13.0, Teradata SQL Assistant, MS SQL Server 2012/2008, DB2, TOAD, ERwin, AIX, Shell Scripts.
Confidential
Java/J2EE Programmer
Responsibilities:
- Analyzed Business Requirements and Identified mapping documents required for system and functional testing efforts for all test scenarios.
- Performed Requirement Gathering & Analysis by actively soliciting, analyzing and negotiating customer requirements and prepared the requirements specification document for the application using Microsoft Word.
- Developed Use Case diagrams, business flow diagrams, Activity/State diagrams.
- Adopted J2EE design patterns like Service Locator, Session Facade and Singleton.
- Configured the application using Spring, Hibernate, DAOs, Action classes, and Java Server Pages.
- Developed the application using Spring Framework that uses Model View Controller (MVC) architecture with JSP as the view.
- Developed presentation layer using JSP, HTML and CSS, JavaScript.
- Extensively used Spring IoC for dependency injection (an IoC example follows this list).
- Designed and developed pricing region services using Oracle.
- Used JavaScript for client-side logic across the application.
- Developed Servlets and Java Server Pages (JSP) to route submittals to the EJB components and render retrieved information, using the Session Facade and Service Locator design patterns.
- Developed J2EE components on Eclipse IDE.
- Used JDBC to invoke stored procedures and for database connectivity to the SQL database (a CallableStatement example follows this list).
- Used Oracle 11g database for table creation and wrote SQL queries using joins and stored procedures.
- Wrote complex SQL queries and reviewed SQL queries for other team members.
- Developed JUnit test cases for unit testing of the code (a sample test follows this list).
- Deployed the application on WebSphere Application Server 6.0 (WAS).
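As an illustration of the Spring IoC usage above, a minimal controller wired by constructor injection; PricingService and the /pricing mapping are hypothetical names, not the project's actual API:

```java
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.RequestMapping;

// Hypothetical service contract; the implementation is registered as a Spring bean.
interface PricingService {
    List<String> findPricingRegions();
}

@Controller
public class PricingController {
    private final PricingService pricingService;

    @Autowired // Spring IoC injects the collaborator; the controller never constructs it
    public PricingController(PricingService pricingService) {
        this.pricingService = pricingService;
    }

    @RequestMapping("/pricing")
    public String showPricing(Model model) {
        model.addAttribute("regions", pricingService.findPricingRegions());
        return "pricing"; // resolved to a JSP view by the configured view resolver
    }
}
```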
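Invoking stored procedures over JDBC used the standard CallableStatement escape syntax; a sketch with a hypothetical GET_ORDER_TOTAL procedure on DB2:

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class StoredProcCall {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:db2://db2-host:50000/SAMPLE"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "app", "pw");
             CallableStatement cs = conn.prepareCall("{call GET_ORDER_TOTAL(?, ?)}")) {
            cs.setInt(1, 1001);                        // IN: order id
            cs.registerOutParameter(2, Types.DECIMAL); // OUT: order total
            cs.execute();
            System.out.println("Order total: " + cs.getBigDecimal(2));
        }
    }
}
```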
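The unit tests followed the usual JUnit 4 shape; a self-contained example with a hypothetical invoice-number formatter as the code under test:

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class InvoiceNumberFormatterTest {
    // Hypothetical code under test: pads ids to a fixed-width invoice number.
    static String format(int id) {
        return String.format("INV-%06d", id);
    }

    @Test
    public void padsShortIdsToSixDigits() {
        assertEquals("INV-000042", format(42));
    }

    @Test
    public void keepsSixDigitIdsIntact() {
        assertEquals("INV-123456", format(123456));
    }
}
```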
Environment: J2EE, Java 1.7, Spring framework, Hibernate, JSP 2.0, JSR303, JDBC, AJAX, JAX-WS Web services, SOAP, XML, Java Beans, Apache Axis2, JQuery, JavaScript, AngularJS, IBM DB2, IBM Pure Query.
Confidential
JAVA Developer
Responsibilities:
- Responsible for creating applications using Java (Core Java, J2EE, Spring, Hibernate).
- Used various Core Java concepts such as Multi-Threading, Exception Handling, Collection APIs to implement various features and enhancements.
- Created and injected Spring services, Spring controllers and DAOs to achieve dependency injection and to wire objects of business classes.
- Used Hibernate, an object/relational mapping (ORM) solution, to map data representation from the MVC model to the Oracle relational data model with a SQL-based schema (an entity-mapping example follows this list).
- Created Front-end Applications using HTML, CSS, JavaScript, XML and JSON.
- Contributed to the design direction of our product set.
- Worked with next-generation technologies like AJAX and jQuery to enable more efficient development and more responsive interfaces.
- Contributed positively to the overall team dynamic; participated in stand-up meetings, planning and design sessions, and other business development work.
- Implemented page designs in standards-compliant HTML and CSS.
- Designed and implemented functionality using technologies including JavaScript, AJAX, and JavaScript frameworks such as jQuery.
- Created cross-browser compatible and standards-compliant CSS based page layouts.
- Successfully implemented the entire Invoice module using Oracle Business Intelligence (BI) Publisher.
- Under the guidance of an Oracle consultant, successfully implemented Java code to produce invoices and automatically send them to the respective customers.
- Successfully developed unique custom opcodes for Balance Inquiry and Balance Transfer with zero defects.
- Improved the functionality of default invoicing opcodes to optimize and enhance the overall invoicing process of the Oracle Billing and Revenue Management (BRM) product.
- Highly involved in making updates to the current website and simultaneously handling other projects.
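The Hibernate mapping work used JPA-style annotations to tie model classes to Oracle tables; a minimal entity sketch, where the table, sequence, and column names are hypothetical:

```java
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.SequenceGenerator;
import javax.persistence.Table;

@Entity
@Table(name = "CUSTOMER")
public class Customer {
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "cust_seq")
    @SequenceGenerator(name = "cust_seq", sequenceName = "CUSTOMER_SEQ", allocationSize = 1)
    @Column(name = "CUSTOMER_ID")
    private Long id; // surrogate key drawn from an Oracle sequence

    @Column(name = "CUSTOMER_NAME", nullable = false)
    private String name;

    public Long getId() { return id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
```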
Environment: Java/J2EE, Oracle 10g, SQL, PL/SQL, JSP, EJB, Struts, Hibernate, WebLogic 8.0, HTML, AJAX, JavaScript, JDBC, XML, JMS, XSLT, UML, JUnit, Log4j, Eclipse 6.0.