
Sr. Talend Developer Resume

Boston, MA

SUMMARY

  • 7+ years of IT experience in the analysis, design, implementation, and maintenance of business applications, specializing in data warehouse/data mart development using Talend and Informatica.
  • Experienced in leading and developing end-to-end implementations of Big Data projects using Talend Big Data, with comprehensive experience in the Hadoop ecosystem, including MapReduce, the Hadoop Distributed File System (HDFS), and Hive.
  • Extensive experience in ETL methodology for defining and changing business logic using data profiling, data migration, extraction, transformation, and loading in Talend.
  • 5 years of experience using the Talend Data Integration tool on BI data analytics, reporting, and dashboard projects; 3+ years of experience with Talend Open Studio, Talend Cloud Hybrid Studio, and the Talend Enterprise platform for Data Management (v6.1, 6.3, 6.5, 7.1) and Data Integration.
  • Experienced in developing complex jobs that consume web services such as REST and SOAP APIs.
  • Hands-on experience with relational database management systems; experienced in integrating data from various data sources such as Oracle, MS SQL Server, MySQL, and flat files.
  • Designed jobs with various data conversions for a wide variety of source systems, including Oracle, DB2, SQL Server, Teradata, Hive, Hana, flat files, XML and mainframe files, and ActiveMQ.
  • Experience with NoSQL databases such as HBase and Cassandra.
  • Hands-on experience with the Hadoop technology stack (HDFS, MapReduce, Hive, HBase, Pig, Sqoop, Oozie, Flume, and Spark).
  • Excellent knowledge of deployment processes from DEV to QA, UAT, and PROD using both the Deployment Group and Import/Export methods.
  • Experienced in all phases of the Agile/Scrum SDLC, from requirement gathering through development, implementation, testing, and support. Familiar with the design and implementation of the data warehouse life cycle, with excellent knowledge of entity-relationship/multidimensional modeling (star schema, snowflake schema) and Slowly Changing Dimensions (SCD Type 1, Type 2, and Type 3); a minimal SCD Type 2 sketch follows this list.
  • Experience in translating stored procedure logic into ETL requirements.
  • Experienced in debugging ETL job errors, running ETL sanity checks, and handling production deployment in the TAC (Talend Administration Center) using SVN.
  • Experience in Big Data technologies such as Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, DynamoDB, Elasticsearch, and Spark SQL.
  • Experienced in implementing performance tuning and troubleshooting at various levels of ETL processes, such as source, target, mapping, session, and system.
  • Good communication and interpersonal skills, ability to learn quickly, with good analytical reasoning, and adaptable to new and challenging technological environments.
  • Experienced in ETL methodology for performing data migration, data profiling, extraction, transformation, and loading using Talend; designed data conversions from a large variety of source systems, including Oracle …, DB2, Netezza, SQL Server, Teradata, Hive, and Hana, and non-relational sources such as flat files, XML, and mainframe files.
  • Involved in code migrations from DEV to QA and production, and in providing operational instructions for deployments.
  • Hands-on experience migrating DataStage 8.7 ETL processes to Talend Studio.
  • Data warehousing ETL experience using Informatica 9.x/8.x/7.x PowerCenter client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Expertise in data warehouse/data mart, ODS, OLTP, and OLAP implementations, combined with project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
  • Expertise in using transformations such as Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Rank, Update Strategy, Java Transformation, Router, and Sequence Generator.
  • Experienced in writing Hive queries to load data into HDFS.
  • Extensive experience with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server, and BIRT Report Designer.
  • Experienced in designing ETL processes using Informatica to load data from sources to targets through data transformations.
  • Hands-on experience in developing and monitoring SSIS/SSRS packages, with outstanding knowledge of high-availability SQL Server solutions, including replication.
  • Hands-on experience in developing and deploying DTS and SSIS packages.
  • Extensive experience in analyzing functional data mart and data warehouse requirements for BI and designing star/snowflake schemas.
  • Excellent experience in designing and developing multi-layer web-based information systems using web services, including Java and JSP.
  • Extensive working experience with normalization and de-normalization techniques for both OLTP and OLAP systems, creating database objects such as tables, constraints, and indexes using DDL, DML, and DCL commands.
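To illustrate the SCD handling mentioned above, here is a minimal Java/JDBC sketch of SCD Type 2 logic: expire the current dimension row, then insert a new version. The table and column names (dim_customer, customer_id, address, current_flag) are hypothetical assumptions, not taken from any specific project.

```java
import java.sql.*;

// Minimal SCD Type 2 sketch: expire the current dimension row and
// insert a new version when a tracked attribute changes.
// All table and column names are hypothetical examples.
public class ScdType2Loader {

    public static void applyChange(Connection conn, long customerId,
                                   String newAddress) throws SQLException {
        conn.setAutoCommit(false);
        try (PreparedStatement expire = conn.prepareStatement(
                 "UPDATE dim_customer SET current_flag = 'N', end_date = CURRENT_DATE "
               + "WHERE customer_id = ? AND current_flag = 'Y' AND address <> ?");
             PreparedStatement insert = conn.prepareStatement(
                 "INSERT INTO dim_customer (customer_id, address, start_date, end_date, current_flag) "
               + "VALUES (?, ?, CURRENT_DATE, NULL, 'Y')")) {

            expire.setLong(1, customerId);
            expire.setString(2, newAddress);
            int expired = expire.executeUpdate();

            // Insert a new version only if an existing row was expired;
            // brand-new keys would need a separate insert path, omitted here.
            if (expired > 0) {
                insert.setLong(1, customerId);
                insert.setString(2, newAddress);
                insert.executeUpdate();
            }
            conn.commit();
        } catch (SQLException e) {
            conn.rollback();
            throw e;
        }
    }
}
```

In practice the same pattern is usually expressed in a tMap with an SCD output component or a database MERGE; the raw JDBC form is shown only to make the row-versioning explicit.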

TECHNICAL SKILLS

Development Tools: Talend 7.1.1, Talend 6.2, Talend ESB 6.2, Confidential, SQL Server (2005/2008/2008 R2/2012), DB2, Postgres, MS Access, NoSQL databases (HBase), PuTTY, SQL Server Management Studio, Kafka, FileZilla, SQL Developer, Informatica, Eclipse, Toad, Teradata SQL Assistant, Aginity Workbench for Netezza, Quality Center, Harvest, SSIS

Methodologies: Star/Snowflake Schema, Relational Modeling, OLAP, Dimensional Modeling, Agile (JIRA), Waterfall.

Languages: T-SQL, Core Java, SQL, PL/SQL, C, C++

Reporting Tools: SSRS, Power BI, Tableau

Other Tools: MS Project, SQL Profiler, Toad, TFS 2010, AWS, Azure.

Operating Systems: Windows XP, Windows 7, Windows 8, Win 2003/2008/2008 R2, UNIX (Solaris, HP, Linux)

PROFESSIONAL EXPERIENCE

Confidential, Boston, MA

Sr. Talend Developer

Responsibilities:

  • Worked on Talend 7.1.1 Enterprise Edition.
  • Wrote Talend routines in Java (a minimal routine sketch follows this list).
  • Used the FTP and SFTP protocols to connect to and acquire source data.
  • Hands-on experience working with Hive, PostgreSQL, and the NoSQL database Cassandra.
  • Involved in creating and deploying Talend jobs, as well as preparing design documents and technical specification documents.
  • Wrote custom code in tJava components to load source data into MongoDB.
  • Implemented data quality checks for incoming files using the MetaServlet.
  • Executed wrapper scripts in Talend to connect to the AWS server.
  • Worked with AWS S3 buckets to acquire JSON files.
  • Designed documents depicting the flow of data from source to target.
  • Used data profiling to check the complexity of the source data.
  • Implemented data quality checks on incoming data before development.
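A minimal sketch of a Talend routine of the kind referenced above: routines are plain Java classes in the routines package whose static methods can be called from tMap expressions or tJava code. The class name, method names, and validation rules here are illustrative assumptions.

```java
package routines;

// A minimal Talend-style routine: a plain Java class with static
// methods that jobs can call from tMap expressions or tJava code.
// Names and rules here are illustrative assumptions.
public class DataQualityRoutines {

    // Returns a trimmed, upper-cased value, or null when the input
    // is null/blank, so downstream filters can reject empty fields.
    public static String normalize(String value) {
        if (value == null || value.trim().isEmpty()) {
            return null;
        }
        return value.trim().toUpperCase();
    }

    // Simple check usable in a tMap filter expression, e.g.:
    //   DataQualityRoutines.isValidZip(row1.zip)
    public static boolean isValidZip(String zip) {
        return zip != null && zip.matches("\\d{5}(-\\d{4})?");
    }
}
```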

Skills Used: Talend 7.1.1 Enterprise Edition, MDM, Oracle BI, XML files, Flat files, JSON, REST API, SOAP API, ZENA, EBX, Agile Methodology, AWS

Confidential

Sr. ETL Talend Consultant

Responsibilities:

  • Collaborated with the Data Integration team to move data more efficiently, and with elevated performance, to assist in business-critical projects requiring massive data extraction.
  • Performed technical analysis, ETL design, development, testing, and deployment of IT solutions as needed by the business or IT.
  • Explored previous ETL metadata and mappings; developed and maintained SQL code as needed for SQL Server and Oracle.
  • Created e-text RTF templates in Oracle BI to publish the data extracted from Oracle.
  • Worked on web services using Talend components such as tSOAP, tREST, tWebService, tWebServiceInput, etc.
  • Examined the staging area for profiling and comparison of data, which was used to make decisions about how to measure business rules and the quality of the data.
  • Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs, Scala, and Python.
  • Utilized Big Data components such as tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHBaseInput, and tHBaseOutput.
  • Implemented migration and data integration processes with the Talend Big Data Integration Suite 6.5 as part of a migration from SSIS to Talend, moving the destination database from SQL Server to AWS Redshift, along with new ETL application development.
  • Troubleshot data integration issues and bugs, analyzed reasons for failure, implemented optimal solutions, and revised procedures and documentation as needed.
  • Developed an ETL framework covering data masking, audit, balance, control, and validation architecture.
  • Worked on end-to-end development of software products from requirement analysis to system study, designing, coding, testing (Unit & Performance), documentation and implementation.
  • Created complex mappings using transformations such as Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the data mart.
  • Provided production support by running the jobs and fixing bugs.
  • Implemented Kafka consumers to read JSON messages from the provided Kafka topic (a minimal consumer sketch follows this list).
  • Worked on Zena job scheduling: server monitoring, task creation, plan creation, job deployment, etc.
  • Participated in and contributed to code reviews, shared modules, and investigated reusable code.
  • Experienced with Agile using JIRA, and with Confluence for project-related document sharing and team collaboration.
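A minimal sketch of the Kafka consumption described above, assuming plain string-serialized JSON payloads parsed with Jackson; the broker address, group id, and topic name (claims-json) are placeholder assumptions.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

// Minimal consumer that polls a topic and parses each JSON payload.
// Broker, group id, and topic name are placeholder values.
public class JsonTopicReader {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "etl-json-reader");
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        ObjectMapper mapper = new ObjectMapper();
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("claims-json"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Parse the message body and pull out one field.
                    JsonNode payload = mapper.readTree(record.value());
                    System.out.println("claim id: " + payload.path("id").asText());
                }
            }
        }
    }
}
```

In a Talend job the same read is typically wired up with a tKafkaInput component followed by tExtractJSONFields; the raw-Java form just makes the poll/parse loop explicit.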

Skills Used: Talend Big Data 6.4.1, Talend MDM, Oracle BI, XML files, Flat files, JSON, REST API, SOAP API, ZENA, EBX, Agile Methodology, AWS

Confidential

Sr. ETL Talend Consultant

Responsibilities:

  • Created complex mappings in Talend 6.2 using tMap, tJoin, tReplicate, tParallelize, tFixedFlowInput, tAggregateRow, tFilterRow, tIterateToFlow, tFlowToIterate, tDie, tWarn, tLogCatcher, tHiveInput, tHiveOutput, etc.
  • Built complex ETL processes to transfer data from different sources such as Oracle, SQL Server, and Excel files and load it into pre-staging and staging databases for populating production.
  • Designed ETL processes using Talend to load data from various sources to targets, performing data transformations wherever required per business needs.
  • Used the Talend Platform for Big Data to perform ETL on the data ingestion process.
  • Developed Talend jobs to populate the claims data to the data warehouse: star schema, snowflake schema, and hybrid schema.
  • Developed complex ETL jobs from various sources such as SQL Server, PostgreSQL, and other files, and loaded them into target databases using the Talend Open Studio ETL tool.
  • Integrated Java code inside Talend Studio using components such as tJavaRow, tJava, tJavaFlex, and routines.
  • Implemented File Transfer Protocol operations using Talend Studio to transfer files between network folders.
  • Performed transformations, cleaning, and filtering on imported data using Hive and MapReduce, and loaded the final data into HDFS.
  • Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, the HBase database, and Sqoop.
  • Excellent knowledge of the NoSQL databases MongoDB and Cassandra.
  • Handled importing data from various data sources using Sqoop, performed transformations using Hive and MapReduce, and loaded the data into HDFS.
  • Developed jobs in Talend Enterprise Edition from stage to source, intermediate, conversion, and target.
  • Developed Talend ETL jobs to push data into Talend MDM and developed jobs to extract data from MDM.
  • Developed Talend ESB services and deployed them on ESB servers on different instances.
  • Developed data validation rules in Talend MDM to confirm golden records.
  • Developed UNIX shell scripts to automate and streamline existing manual procedures.
  • Migrated data from relational databases (Oracle, Teradata) and external data to HDFS using Sqoop, Flume, and Spark.
  • Designed both managed and external tables in Hive to optimize performance (see the HiveQL-over-JDBC sketch after this list).
  • Created complex user-provisioning jobs for company employees to run on a daily basis.
  • Experienced in working with the TAC (Talend Administration Center).
  • Developed an error-logging module to capture both system errors and logical errors, with email notification and the moving of files to error directories.
  • Debugged errors at the control and data flow levels in SSIS packages and made modifications wherever necessary.
  • Used various transformations such as Update Strategy, Joiner, Stored Procedure, Union, Filter, Expression, Sequence Generator to develop robust mappings.
  • Developed mapping parameters and variables to support SQL override.
  • Created mapplets & reusable transformations to use them in different mappings.
  • Created Talend jobs to load data into Hive tables and HDFS files, and developed Talend jobs to integrate with the Teradata system from Hive tables.
  • Worked on different tasks in workflows such as sessions, event raise, event wait, decision, e-mail, command, worklets, assignment, timer, and scheduling of the workflow.
  • Performed unit testing and code reviews, and moved code into UAT and PROD.
  • Designed the Talend ETL flow to load data into Hive tables, and created Talend jobs to load data into Oracle and Hive tables.
  • Migrated code into QA (testing) and supported the QA team and UAT (users).
  • Worked with high volumes of data, tracking performance on Talend job runs and sessions.
  • Developed detailed Unit Test Document with all possible Test cases/Scripts.
  • Conducted code reviews developed by my teammates before moving the code into QA.
  • Experience in batch scripting on Windows: Windows 32-bit commands, quoting, and escaping.
  • Used Talend reusable components such as routines, context variables, and globalMap variables.
  • Knowledge on Teradata Utility scripts like FastLoad, MultiLoad to load data from various source systems to Teradata.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Helped with solution design and planning by identifying the business problem and end-user consumption patterns.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Configured the Hive tables to load the profitability system in the Talend ETL repository and created the Hadoop connection for the HDFS cluster in the Talend ETL repository.
  • Experience building and deploying applications in AWS (S3, Hive, Glue, EMR, AWS Batch, DynamoDB, Redshift, CloudWatch, RDS, Lambda, SNS, SQS, etc.).
  • Experienced in building file-sourced ETL feeds in Talend Open Studio.
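The managed-vs-external Hive table design noted above can be sketched as HiveQL issued through the HiveServer2 JDBC driver; the host, credentials, HDFS paths, and table/column names below are placeholder assumptions.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Minimal sketch: create an external Hive table over files already in
// HDFS, then load its contents into a managed ORC table. Host, schema,
// paths, and column names are placeholder assumptions.
public class HiveTableSetup {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hiveserver2-host:10000/default", "etl_user", "");
             Statement stmt = conn.createStatement()) {

            // External table: Hive tracks only metadata; dropping it
            // leaves the underlying HDFS files untouched.
            stmt.execute(
                "CREATE EXTERNAL TABLE IF NOT EXISTS stg_claims (" +
                " claim_id STRING, amount DOUBLE, claim_dt STRING)" +
                " ROW FORMAT DELIMITED FIELDS TERMINATED BY ','" +
                " LOCATION '/data/staging/claims'");

            // Managed ORC table for the warehouse layer.
            stmt.execute(
                "CREATE TABLE IF NOT EXISTS dw_claims (" +
                " claim_id STRING, amount DOUBLE, claim_dt STRING) STORED AS ORC");

            // Populate the managed table from the external staging table.
            stmt.execute("INSERT INTO TABLE dw_claims SELECT * FROM stg_claims");
        }
    }
}
```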

Skills Used: Talend Big Data 6.2, Talend MDM, Hive, Oracle 11g, AWS, XML files, Flat files, HL7 files, JSON, Talend Administration Center, IMS, Agile Methodology.

Confidential

ETL Talend Developer

Responsibilities:

  • Designed and implemented ETL for data loads from heterogeneous sources to SQL Server and Oracle target databases, including fact tables and Slowly Changing Dimensions (SCD Type 1 and SCD Type 2).
  • Utilized Big Data components such as tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHBaseInput, tHBaseOutput, tSqoopImport, and tSqoopExport.
  • Created Talend jobs to retrieve data from legacy sources and to retrieve user data from flat files on a monthly and weekly basis.
  • Wrote Hive queries to fetch data from HBase and transfer it to HDFS through Hive.
  • Used the debugger and breakpoints to view transformation output and debug mappings.
  • Developed ETL mappings for XML, CSV, and TXT sources, loading data from these sources into relational tables with Talend; developed jobs for reusability and to improve performance.
  • Imported data from different sources such as HDFS/HBase into Spark RDDs.
  • Involved in unit testing and system testing and preparing Unit Test Plan (UTP) and System Test Plan (STP) documents.
  • Migrated the code and release documents from DEV to QA (UAT) and to Production.
  • Troubleshot, debugged, and resolved Talend issues while maintaining the health and performance of the ETL environment.
  • Scheduled Talend jobs with the Talend Administration Center, setting up best practices and a migration strategy.
  • Created Talend Mappings to populate the data into dimensions and fact tables.
  • Broad design, development and testing experience with Talend Integration Suite and knowledge in Performance Tuning of mappings.
  • Experienced in Talend Data Integration, Talend Platform Setup on Windows and UNIX systems.
  • Created jobs in Talend for processes that can be reused across most jobs in a project, such as Start job and Commit job.
  • Experience using the Repository Manager for migration of source code from lower to higher environments.
  • Developed jobs to move inbound files to vendor server locations on monthly, weekly, and daily frequencies.
  • Created jobs to perform record-count validation and schema validation (a tJava validation sketch follows this list).
  • Created contexts to pass values throughout the process from parent to child jobs and from child to parent jobs.
  • Developed Jobs that are reused in different processes in the flow.
  • Developed an error-logging module to capture both system errors and logical errors, with email notification and the moving of files to error directories.
  • Experienced in using Talend database components, file components, and processing components based on requirements.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Talend Integration Suite.
  • Performed unit testing and integration testing after development and got the code reviewed.
  • Responsible for code migrations from DEV to QA and production, and for providing operational instructions for deployments.
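A sketch of the record-count validation noted above, written as the body of a hypothetical tJava component: the globalMap keys assume upstream components named tFileRowCount_1 and tDBOutput_1, and the tolerance is assumed to arrive from the parent job as a string context variable. All of these names are illustrative.

```java
// Hypothetical tJava body: compare the rows counted in the source file
// (tFileRowCount_1) against the rows written by the target component
// (tDBOutput_1), and fail the job when they diverge beyond the
// tolerance passed in from the parent job as a context variable.
int sourceCount = (Integer) globalMap.get("tFileRowCount_1_COUNT");
int loadedCount = (Integer) globalMap.get("tDBOutput_1_NB_LINE");
int tolerance   = Integer.parseInt(context.countTolerance); // assumed string context variable

if (Math.abs(sourceCount - loadedCount) > tolerance) {
    // Throwing here routes control to the job's tLogCatcher/tDie error handling.
    throw new RuntimeException("Record count validation failed: source="
            + sourceCount + ", loaded=" + loadedCount);
}
System.out.println("Record count validation passed: " + loadedCount + " rows");
```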

Skills Used: Talend 6.0.1, Oracle 11g, Hive, Sqoop, Teradata V13.0, FastLoad, MultiLoad, Teradata SQL Assistant, MS SQL Server 2012/2008, PL/SQL, Agile Methodology, T-SQL, SSIS, TOAD, Erwin, AIX, Shell Scripts, Autosys

Confidential

ETL Developer/SSIS

Responsibilities:

  • Involved in Data Model design and enhancements of Data Marts based on the ETL project requirements.
  • Created ETL design and process flow documents from Design Mapping Specification (DMS).
  • Led the group in analyzing DB requirements. Worked collaboratively with technical staff and business experts to develop solutions to business requirements and client problems.
  • Worked in a system with two different databases, Oracle and SQL Server, for legacy and current applications.
  • Developed complex T-SQL queries using joins, stored procedures, views, and triggers to implement business rules and transformations (a stored-procedure call sketch follows this list).
  • Designed and developed ETL data models in data warehouse and data mart environments.
  • Involved in creation of users, groups, projects, workbooks and the appropriate permission sets for Tableau server logons and security checks.
  • Developed mappings to load Slowly Changing Dimension Type 1 and Type 2 tables and fact tables.
  • Worked on Performance Tuning for packages. This included identifying data-quality issues and suggesting methods for their resolution.
  • Troubleshot various report issues including connection problems and data errors.
  • Worked on developing data designs and processing flows to support customer service functions.
  • Worked on different transformations such as Fuzzy Lookup, Conditional Split, and Derived Column in SSIS packages.
  • Created Tableau reports using different visualizations to analyze trends.
  • Installed, upgraded, maintained, and supported all Tableau components.
  • Deployed the SSIS packages and scheduled them through jobs in all tiers (DEV, UAT, and PROD), including setting the configuration files in the specified tables, and created technical specification documents for all enhancements.
  • Analyzed the data warehousing requirements with the Business Analyst and Database teams to configure the data marts and development environments.
  • Responsible for maintaining cubes using SSAS and populating the cubes with data.
  • Created ad hoc reports for users in Tableau by connecting various data sources.
  • Used SSIS to create ETL packages to validate, extract, transform, and load data into data warehouse and data mart databases, and processed SSAS cubes to store data in OLAP databases.
  • Generated daily, weekly, and monthly reports for analysis using SSRS, including on-the-fly reports for end users.
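As a small illustration of the T-SQL work above, the following Java/JDBC sketch calls a hypothetical SQL Server stored procedure with an output parameter; the procedure name (dbo.usp_LoadDailySales), its parameters, and the connection details are assumptions for illustration.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

// Minimal sketch: invoke a hypothetical T-SQL stored procedure that
// loads one business day of data into a mart and reports the row count.
public class RunDailyLoad {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://db-host:1433;databaseName=SalesMart;encrypt=false";
        try (Connection conn = DriverManager.getConnection(url, "etl_user", "secret");
             CallableStatement cs = conn.prepareCall("{call dbo.usp_LoadDailySales(?, ?)}")) {

            cs.setDate(1, java.sql.Date.valueOf("2015-06-01")); // business date (in)
            cs.registerOutParameter(2, Types.INTEGER);          // rows loaded (out)
            cs.execute();

            System.out.println("Rows loaded: " + cs.getInt(2));
        }
    }
}
```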

Skills Used: Microsoft SQL Server, T-SQL, SSRS, SSIS, MS Visio, MDX, Windows server, Oracle, Toad, SQL Developer, Tableau
