
Hadoop And Talend Etl Developer Resume


Birmingham, AL

SUMMARY:

  • A Hadoop, ETL and Talend developer with 8+ years of experience in Development and Production environments.
  • Extensive experience developing ETL for enterprise data warehouses and BI reports.
  • Extensive experience installing, configuring, and testing Hadoop ecosystem components.
  • Capable of processing large sets of structured, semi-structured and unstructured data and supporting systems application architecture.
  • Able to assess business rules, collaborate with stakeholders and perform source-to-target data mapping, design and review.
  • Familiar with data architecture including data ingestion pipeline design, Hadoop information architecture, data modeling and data mining, machine learning and advanced data processing. Experience optimizing ETL workflows.
  • Expertise in Java and big data technologies, with proven project leadership, teamwork and communication skills.
  • Extensive experience with Talend Open Studio and Talend Integration Suite.
  • Excellent experience with Talend ETL, using context variables, database components such as tMSSqlInput and tOracleOutput, tMap, file components such as tFileCopy, tFileCompare and tFileExist, Salesforce components such as tSalesforceInput, tSalesforceOutput and tSalesforceBulkExec, and ELT components.
  • Good knowledge of NoSQL databases such as HBase.
  • Extensive experience developing analytic reports and dashboards.
  • Worked on multiple projects spanning the full data warehouse life cycle; extensively developed programs supporting data extraction, transformation and loading.
  • Strong business understanding of verticals like Banking, Brokerage, Mutual Funds and Telecom.
  • Independently perform complex troubleshooting, root-cause analysis and solution development.
  • Able to meet deadlines and handle multiple tasks; a motivated, flexible team player with good communication skills, a quick grasp of new concepts and an analytical approach to problem solving.

PROFESSIONAL EXPERIENCE:

Hadoop and Talend ETL developer

Confidential -Birmingham, AL

Responsibilities:

  • Responsible for designing and implementing ETL process to load data from different sources, perform data mining and analyze data using visualization/reporting tools to leverage the performance of OpenStack.
  • Used several features of Talend such as tMap, tReplicate, tFilterRow, tSortRow, tWaitForFile, tSalesforceOutput, tSalesforceBulkExec and tSalesforceInput for the ETL process.
  • Involved in design and development of complex ETL mapping.
  • Implemented error handling in Talend to validate data integrity and data completeness for data from flat files.
  • Extensively used ETL to load data from flat files, XML, Oracle and MySQL sources into the data warehouse database.
  • Imported data frequently from MySQL to HDFS using Sqoop.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL
  • Developed reports and dashboards using Tableau for quick reviews presented to business and IT users.
  • Developed ad-hoc reports using Tableau Desktop and Excel.
  • Developed visualizations using sets, parameters, calculated fields, dynamic sorting, filtering and parameter-driven analysis.
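The flat-file error handling described above can be sketched in plain Java. This is a minimal stand-in for what a tMap with a reject link does in Talend; the pipe-delimited three-field layout (id, name, amount) is a hypothetical example, not the actual job schema:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of row-level validation for flat-file loads: a row is rejected
// when a field is missing (completeness) or the amount is not numeric
// (integrity). In Talend this logic would route rows to a reject flow.
public class RowValidator {
    public static boolean isValid(String row) {
        String[] fields = row.split("\\|", -1);
        if (fields.length != 3) return false;            // completeness: field count
        if (fields[0].isEmpty() || fields[1].isEmpty()) return false;
        try {
            Double.parseDouble(fields[2]);               // integrity: numeric amount
        } catch (NumberFormatException e) {
            return false;
        }
        return true;
    }

    // Keeps only the rows that pass validation; rejects would be logged
    // or written to an error table in the real job.
    public static List<String> accepted(List<String> rows) {
        List<String> ok = new ArrayList<>();
        for (String r : rows) if (isValid(r)) ok.add(r);
        return ok;
    }
}
```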

Hadoop and Talend ETL developer

Confidential - Fultondale, AL

Responsibilities:

  • Worked as part of the Global Citi KYC (Know Your Customer) Data Migration team to support its data migration initiative and development on Hadoop.
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Managed and reviewed Hadoop log files.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
  • Tested raw data and executed performance scripts.
  • Supported code/design analysis, strategy development and project planning.
  • Designed and developed ETL process using Talend.
  • Supported fully metadata driven ETL Framework built on top of Talend.
  • Participated in all phases of SDLC including business requirements collection, analysis, sizing, design, development, testing, deployment and production support.
  • Defined and created data model, tables, views, queries etc. to support business requirements.
  • Designed, developed, debugged, tested and promoted Java/ETL code into various environments from DEV through to PROD.
  • Used several features of Talend such as tMap, tReplicate, tFilterRow, tSortRow and tWaitForFile for the ETL process.
  • Implemented error handling in Talend to validate data integrity and data completeness for data from flat files.
  • Developed reports and dashboards using Tableau for quick reviews presented to business and IT users.
  • Developed POCs by building Tableau reports and dashboards, matching chart types to requirements, applying color patterns per users' needs, and standardizing dashboard size, look and feel.
  • Developed Ad-hoc reports using Tableau Desktop, Excel.
  • Prototyped data visualizations using charts, drill-downs and parameterized controls in Tableau to highlight the value of analytics in executive decision support.
  • Developed visualizations using sets, parameters, calculated fields, dynamic sorting, filtering and parameter-driven analysis.
  • Involved in design and development of complex ETL mapping.
  • Collaborated with the infrastructure, network, database, application and BI teams to ensure data quality and availability.
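The MapReduce parsing jobs described above ran on a Hadoop cluster; as a minimal stand-in with no Hadoop dependency, the core map-then-aggregate step can be sketched with plain Java collections. The comma-delimited layout and the choice of the second field as the key are assumptions for illustration:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Simplified stand-in for a MapReduce parse-and-count job:
// "map" each raw record to a key, then "reduce" by summing
// occurrences per key, as the staging-table loads did at scale.
public class RecordCounter {
    public static Map<String, Integer> countByKey(List<String> rawRecords) {
        Map<String, Integer> counts = new HashMap<>();
        for (String record : rawRecords) {
            String[] fields = record.split(",");
            if (fields.length < 2) continue;     // skip malformed rows
            String key = fields[1].trim();       // hypothetical key column
            counts.merge(key, 1, Integer::sum);  // reduce: sum per key
        }
        return counts;
    }
}
```

On a real cluster the same map and reduce steps would be split across `Mapper` and `Reducer` classes and run over HDFS input splits.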

Hadoop developer

Confidential - Richmond, VA

Responsibilities:

  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Developed multiple MapReduce jobs in Java for data cleaning.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Managed and reviewed Hadoop log files.
  • Tested raw data and executed performance scripts.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
  • Designed and developed ETL process using Talend.
  • Used several features of Talend such as tMap, tReplicate, tFilterRow, tSortRow and tWaitForFile for the ETL process.
  • Implemented error handling in Talend to validate data integrity and data completeness for data from flat files.
  • Involved in design and development of complex ETL mapping.
  • Monitored workflows and fixed errors.
  • Supported code/design analysis, strategy development and project planning.

Data warehouse and Hadoop developer

Confidential - Dayton, OH

Responsibilities:

  • Developed ETL processes for the data warehouse using Talend and SQL.
  • Interacted with project lead to understand the requirement, documented business functions, and developed ETL process.
  • Monitored ETL jobs for performance and tuned run times.
  • Staggered ETL jobs for better run time.
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Managed and reviewed Hadoop log files.
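Staggering ETL jobs, as noted above, offsets their start times so concurrent loads do not contend for the same warehouse resources. A minimal sketch of computing those offsets (the gap value in the test below is illustrative, not a production setting):

```java
// Sketch of staggered ETL scheduling: each of n jobs starts a fixed
// gap after the previous one instead of all launching at once.
public class JobStagger {
    // Returns the start offset (in minutes) for each of jobCount jobs,
    // spacing consecutive launches gapMinutes apart.
    public static long[] startOffsets(int jobCount, long gapMinutes) {
        long[] offsets = new long[jobCount];
        for (int i = 0; i < jobCount; i++) {
            offsets[i] = i * gapMinutes;  // job i waits i * gap before starting
        }
        return offsets;
    }
}
```

The offsets could then feed a scheduler (for example `ScheduledExecutorService.schedule`) so that job *i* launches after its computed delay.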

Data warehouse developer

Confidential

Responsibilities:

  • Extensively used ETL to load data from flat files, XML and Oracle sources into the data warehouse database.
  • Involved in data model design for the data warehouse.
  • Involved in requirements gathering and business analysis.
