
Sr. Talend Developer Resume


Pekin, IL

SUMMARY:

  • Results-driven IT professional with 5+ years of experience in Talend Open Studio (7.x/6.x/5.x) for Data Integration, Data Quality and Big Data, spanning the Analysis, Design, Development, Testing, Implementation, Enhancement and Support of ETL applications, with strong experience in OLTP and OLAP environments.
  • Experience working with Data Warehousing concepts such as OLAP, OLTP, Star Schema, Snowflake Schema, Logical Data Modeling, Physical Modeling and Dimensional Data Modeling.
  • 4+ years of experience with Talend Enterprise Edition for Big Data, Data Integration and Data Quality.
  • Widespread experience in using Talend features such as context variables, triggers, connectors for Database and flat files.
  • Hands-on experience with many of the palette components used to design jobs, and used Context Variables to parameterize Talend jobs.
  • Experienced with Talend Data Fabric components, including Context Variables and the MySQL, Oracle and Hive database components.
  • Tracked daily data loads and monthly data extracts, and sent them to clients for verification.
  • Strong experience in designing and developing Business Intelligence solutions in Data Warehousing using ETL Tools.
  • Excellent understanding of Data Warehousing concepts and best practices across the full development life cycle of a Data Warehouse.
  • Experienced in analyzing, designing and developing ETL strategies and processes, and writing ETL specifications.
  • Experienced in designing ETL processes using Informatica to load data from Sources to Targets through data Transformations.
  • Involved in extracting user data from various data sources into the Hadoop Distributed File System (HDFS).
  • Experience with the MapReduce programming model, Pig, and the installation and configuration of Hadoop, HBase, Hive, Pig, Sqoop and Flume using Linux commands.
  • Experienced in using Talend Data Fabric tools ( Talend DI, Talend MDM, Talend DQ, Talend Data Preparation, ESB, TAC).
  • Experienced in working with different data sources like Flat files, Spreadsheet files, log files and Databases.
  • Knowledge of Data Flow Diagrams, Process Models and E-R diagrams with modeling tools such as Erwin and ER/Studio.
  • Strong understanding of Data Modeling (relational, dimensional, star and snowflake schemas) and data analysis for Data Warehouse implementations on Windows and Unix.
  • Extensive experience developing Stored Procedures, Functions, Views, Triggers and complex queries using SQL Server.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Worked in all phases of BW/BI full life cycles including Analysis, Design, Development, Testing, Deployment, Post-Production Support/Maintenance, Documentation and End-User Training.
  • Highly Proficient in Agile, Test Driven, Iterative, Scrum and Waterfall software development life cycle. Highly motivated with the ability to work effectively in teams as well as independently.

PROFESSIONAL EXPERIENCE:

Sr. Talend Developer

Confidential - Pekin, IL

  • Developed ETL processes to facilitate data archival from a variety of source databases into a central repository, primarily using the Talend Data Integration set of tools.
  • Worked on the Data Integration team to perform data and application integration, with the goal of moving high-volume data effectively and with high performance to support business-critical projects.
  • Developed custom components and multi-threaded configurations with a flat file by writing JAVA code in Talend.
  • Deployed and scheduled Talend jobs in the Administration Center and monitored their execution.
  • Created separate branches within the Talend repository for Development, Production and Deployment.
  • Configured match rule set property by enabling search by rules in MDM according to Business Rules.
  • Excellent knowledge of the Talend Administration Center (TAC), Talend installation, and the use of Context and globalMap variables in Talend.
  • Reviewed requirements to help build valid and appropriate DQ rules, and implemented the DQ rules using Talend DI jobs.
  • Created Snowflake Schemas by normalizing the dimension tables as appropriate and creating a Sub Dimension named Demographic as a subset to the Customer Dimension.
  • Created cross-platform Talend DI jobs to read data from multiple sources such as Hive, HANA, Teradata, DB2, Oracle and ActiveMQ.
  • Created Talend jobs for data comparison between tables across different databases, identifying and reporting discrepancies to the respective teams.
  • Performed Talend administrative tasks such as upgrades, creating and managing user profiles and projects, managing access, monitoring, and setting up TAC notifications.
  • Transformed data in Snowflake while loading it into target tables.
  • Configured the Talend Administration Center (TAC) for scheduling and deployment.
  • Reviewed Talend job statistics in AMC to improve performance and identify the scenarios in which errors occur.
  • Simplified ETL for basic transformations by storing pre-transformed data when reordering columns during a data load.
  • Performed data manipulations using various Talend components such as tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSqlInput and many more.
  • Created standard and best practices for Talend ETL components and jobs.
  • Extracted, transformed and loaded data from various file formats such as .csv, .xls, .txt and other delimited formats using Talend Open Studio.
  • Implemented complex business rules by creating reusable transformations and robust mappings with Talend components such as tConvertType, tSortRow, tAggregateRow and tUnite.
  • Responsible for developing data pipeline with Amazon AWS to extract the data from weblogs and store in HDFS.
  • Tuned ETL mappings, workflows and the underlying data model to optimize load and query performance.
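As an illustration of the kind of per-row Java logic a tJavaRow or tJava component holds, a minimal sketch of field cleanup during a load (the class, method and rules are hypothetical, not taken from an actual job):

```java
// Hypothetical sketch of per-row cleanup logic of the kind a Talend
// tJavaRow component applies: trim, collapse whitespace, default nulls.
public class RowTransform {
    // Normalize a raw name field; empty or null values fall back to a default.
    public static String normalizeName(String raw) {
        if (raw == null) return "UNKNOWN";
        String cleaned = raw.trim().replaceAll("\\s+", " ");
        return cleaned.isEmpty() ? "UNKNOWN" : cleaned.toUpperCase();
    }

    public static void main(String[] args) {
        System.out.println(normalizeName("  john   doe "));  // prints JOHN DOE
        System.out.println(normalizeName(null));             // prints UNKNOWN
    }
}
```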

ETL/Talend developer

Confidential - Houston, TX

  • Worked closely with Business Analysts to review the business specifications of the project and to gather the ETL requirements.
  • Participated in all phases of development life-cycle with extensive involvement in the definition and design meetings, functional and technical walkthroughs.
  • Created Talend jobs to copy file from one server to another and utilized Talend FTP components.
  • Used ETL methodologies and best practices to create Talend ETL jobs.
  • Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
  • Created and deployed physical objects including custom tables, custom views, stored procedures, and Indexes to SQL Server for Staging and Data-Mart environment.
  • Designed and implemented ETL to load data from heterogeneous sources into SQL Server and Oracle target databases, including Fact tables and Slowly Changing Dimensions (SCD Type 1 and SCD Type 2).
  • Developed stored procedures and views in Snowflake and used them in Talend for loading Dimensions and Facts.
  • Used the most common Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput and many more).
  • Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.
  • Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
  • Automated SFTP process by exchanging SSH keys between UNIX servers.
  • Worked extensively in the Talend Admin Console and scheduled jobs in Job Conductor.
  • Used tStatsCatcher, tDie, tLogRow to create a generic joblet to store processing stats into a Database table to record job history.
  • Created Talend Mappings to populate the data into dimensions and fact tables.
  • Set up new users, projects and tasks within multiple TAC environments (Dev, Test, Prod and DR).
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Implemented custom error handling in Talend jobs and worked on different methods of logging.
  • Designed and coded ETL/Talend jobs to process data into target databases.
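The SCD Type 2 loads mentioned above keep history by expiring the current dimension row and inserting a new version. A minimal in-memory sketch of that versioning rule (the class and field names are illustrative, not from the actual mappings):

```java
// Illustrative sketch of SCD Type 2 versioning: when a tracked attribute
// changes, expire the current dimension row and append a new current one.
public class Scd2 {
    static class DimRow {
        final String key, attr;
        final java.time.LocalDate validFrom;
        java.time.LocalDate validTo;   // null while the row is current
        boolean current = true;
        DimRow(String key, String attr, java.time.LocalDate from) {
            this.key = key; this.attr = attr; this.validFrom = from;
        }
    }

    static void apply(java.util.List<DimRow> dim, String key, String attr,
                      java.time.LocalDate asOf) {
        for (DimRow r : dim) {
            if (r.current && r.key.equals(key)) {
                if (r.attr.equals(attr)) return;      // unchanged: no new version
                r.current = false; r.validTo = asOf;  // expire the old version
                break;
            }
        }
        dim.add(new DimRow(key, attr, asOf));         // insert the new current version
    }

    public static void main(String[] args) {
        java.util.List<DimRow> dim = new java.util.ArrayList<>();
        apply(dim, "C1", "Houston", java.time.LocalDate.of(2020, 1, 1));
        apply(dim, "C1", "Dallas",  java.time.LocalDate.of(2020, 3, 1));
        System.out.println(dim.size());  // prints 2: expired row plus current row
    }
}
```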

Talend Developer

Confidential - Marietta, GA

  • Interacted with Solution Architects and Business Analysts to gather requirements and update Solution Architect Documents.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Built the jobs according to the ETL specification documents.
  • Created and managed Source to Target mapping documents for all Facts and Dimension tables.
  • Analyzed source data to assess data quality using Talend Data Quality.
  • Involved in writing SQL Queries and used Joins to access Data from Oracle, and MySQL.
  • Assisted in migrating the existing data center into the AWS environment.
  • Utilized Big Data components like tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHbaseInput, tHbaseOutput, tSqoopImport and tSqoopExport.
  • Conducted JAD sessions with business users and SME's for better understanding of the reporting requirements.
  • Developed Talend jobs to populate the claims data to data warehouse - star schema.
  • Used Talend Admin Console Job conductor to schedule ETL Jobs on daily, weekly, monthly and yearly basis.
  • Worked on various Talend components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, tDie etc.
  • Developed jobs in Talend Enterprise Edition across the stage, intermediate, conversion and target layers.
  • Designed ETL process using Talend Tool to load from Sources to Targets through data Transformations.
  • Developed Talend jobs to populate the claims data to data warehouse - star schema, snowflake schema, Hybrid Schema.
  • Created Context Variables and Groups to run Talend jobs against different environments.
  • Broad design, development and testing experience with Talend Integration Suite, and knowledge of performance tuning of mappings.
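The Context Variables and Groups mentioned above let the same job run against different environments. At run time a context group amounts to a set of named variables resolved per environment, sketched here with java.util.Properties (hosts and keys are made up for illustration):

```java
import java.io.StringReader;
import java.util.Properties;

// Hypothetical sketch of environment-specific context resolution, in the
// spirit of Talend context groups: the same keys, different values per env.
public class ContextGroups {
    static Properties loadContext(String env) throws Exception {
        // Illustrative values; a real job would read these from context files.
        String dev  = "db.host=dev-db.local\ndb.port=1521\n";
        String prod = "db.host=prod-db.local\ndb.port=1521\n";
        Properties p = new Properties();
        p.load(new StringReader("PROD".equals(env) ? prod : dev));
        return p;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(loadContext("PROD").getProperty("db.host"));  // prints prod-db.local
    }
}
```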

Talend Developer

Confidential - Union Beach, NJ

  • Involved in the preparation of documentation for ETL, standards, procedures and naming conventions.
  • Responsible for profiling the source systems' data using different DQ techniques.
  • Designed and developed end-to-end ETL processes from various source systems to the Staging area, and from Staging to the Data Marts.
  • Developed DI jobs on Talend ETL to implement address validation, cleansing and standardization, using components such as tRecordMatching, tFuzzyMatch and tMatchGroup along with other DI, DP and DQ components, and features such as context variables and database components.
  • Responsible for understanding & deriving the new requirements from Business Analysts/Stakeholders.
  • Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
  • Used Talend Admin Console Job conductor to schedule ETL Jobs on daily, weekly, monthly and yearly basis.
  • Responsible for data ingestion into the Data Lake from multiple source systems using Talend Big Data.
  • Assisted in gathering business requirements and worked closely with various Application and Business teams to develop the Data Model and ETL procedures for the Data Warehouse design.
  • Designed and developed a star schema model for the target database using Erwin data modeling.
  • Extensively used ETL Informatica tool to extract data stored in MS SQL 2000, Excel, and Flat files and finally loaded into a single Data Warehouse.
  • Developed the model using Talend MDM, developed the DI jobs to populate data into the REF/XREF tables, and created the data stewardship tasks.
  • Used Talend's debug mode to step through jobs and fix errors.
  • Developed, supported and maintained ETL (Extract, Transform and Load) processes using Talend Integration Suite.
  • Responsible for ingesting data from multiple source systems into Hive.
  • Responsible for running Talend jobs using TAC.
  • Analyzed and performed data integration using the Talend Cloud hybrid integration suite.
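The tFuzzyMatch-style record matching used for address standardization commonly rests on edit distance. A minimal Levenshtein sketch of that idea (the threshold and field choice are illustrative, not taken from the jobs described above):

```java
// Illustrative edit-distance matching of the kind behind fuzzy record
// matching: two strings "match" when few single-character edits separate them.
public class Fuzzy {
    static int levenshtein(String a, String b) {
        int[][] d = new int[a.length() + 1][b.length() + 1];
        for (int i = 0; i <= a.length(); i++) d[i][0] = i;
        for (int j = 0; j <= b.length(); j++) d[0][j] = j;
        for (int i = 1; i <= a.length(); i++)
            for (int j = 1; j <= b.length(); j++) {
                int cost = a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1;
                d[i][j] = Math.min(Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1),
                                   d[i - 1][j - 1] + cost);
            }
        return d[a.length()][b.length()];
    }

    // Case-insensitive match under a caller-chosen edit-distance threshold.
    static boolean matches(String a, String b, int maxDist) {
        return levenshtein(a.toLowerCase(), b.toLowerCase()) <= maxDist;
    }

    public static void main(String[] args) {
        System.out.println(levenshtein("kitten", "sitting"));  // prints 3
    }
}
```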
