Talend ETL Consultant Resume

Raritan, NJ

SUMMARY:

  • More than ten years of experience as an ETL Developer in data warehousing, Big Data (Hadoop), and Business Intelligence (BI) solution design and development
  • Extensively worked on job scheduling using the Talend Administration Center (TAC) Job Conductor; installed a new standby production environment and migrated five applications to it within three months
  • Experience working with Talend, Cloud (AWS), Big Data, Python, MDM, Data Quality, and Data Integration
  • Experience in Informatica, Amazon S3, Vertica, Teradata, and Redshift
  • Experience in Talend installation, configuration, and administration
  • Extensively worked on Talend components that are used for Data Integration, Data Mapping, Big Data and Data Quality.
  • Experience in analysis, design (OOAD, UML, Design Patterns), development, testing, and rollout of application software.
  • Experienced in data integration, metadata management, ETL, data modeling tool sets.
  • Experience with other utilities/tools/scripts such as Korn shell scripting, Perl 5.0, SQL*Plus, SQL*Loader, Export and Import utilities, TOAD 9.1, and Visio 10.0
  • Extensive hands-on experience shipping new products, enhancements, and feature releases to production on a biweekly cycle
  • Extensive hands-on experience with cutover, impact analysis, rollout plans, and risk mitigation plans for every release
  • Managed Change Control Board (CCB) review meetings for every release
  • Data modeling and Oracle performance optimization
  • Experience with the Cloudera 5.2 platform and hands-on experience in Hive, Pig, Spark, and HBase
  • Experience in core Java programming
  • Business Intelligence experience covering data analysis, solution design, metadata management, and MDM

PROFESSIONAL EXPERIENCE:

Confidential, Raritan, NJ

Talend ETL Consultant

Responsibilities:

  • Designed and oversaw development of a data warehouse in Amazon Redshift; built a new test environment that replicates production for troubleshooting complex production scenarios
  • Planned a new pre-production environment and led a new initiative to run Talend on the AWS cloud
  • Implemented best practices for context files and context variables
  • Verified RDS, Amazon Redshift, EC2 instances, and S3 bucket policies while creating the new AWS account and setting up the pre-production environment
  • Spearheaded all phases of each biweekly release: CCB review, migration, deployment, testing, and go-live
  • Delegated Talend installation, migration, and the move of all current jobs to the new AWS environment on the Redshift database
  • Oversaw junior developers' configuration of Talend jobs in Talend Data Fabric 6.4.2
  • Programmed MDM data quality components such as NameID and address verification, address formatting, and ZIP code validation for data stewardship
  • Pioneered development of dynamic, highly configurable Talend jobs that integrate sales data from multiple source systems into a data lake, then derive an interim layer used to build facts and dimensions for reporting in Tableau and Qlik Sense
  • Orchestrated daily, weekly, and monthly product sales jobs across territories; added users in the Talend Administration Center (TAC) and the Tidal scheduler
  • Chaired the team's design and development of ETL jobs/packages using Talend Open Studio (TOS)
  • Organized Talend code migration to the Talend execution server via the Nexus repository; published and deployed Talend jobs from Nexus using Jenkins; leveraged Talend components such as tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, and tHashOutput
  • Produced many complex ETL jobs for data exchange to and from databases (Oracle, MySQL) and file structures (XML, CSV)
  • Controlled Talend jobs with implicit, explicit, local, and global context variables to handle sensitive data
  • Headed the design team using Talend's native connector for AWS Redshift to migrate, load, and extract data to and from Amazon Redshift; inserted data from S3 into the Redshift database using the COPY command in Talend, as sketched after this list
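
The S3-to-Redshift loads above rest on Redshift's COPY command, which Talend's Redshift components issue under the hood. A minimal sketch follows; the table name, bucket path, and IAM role are illustrative placeholders, not values from this engagement.

```sql
-- Bulk-load a staging table from S3 into Redshift; all names are hypothetical.
COPY stg_sales
FROM 's3://example-bucket/sales/2018/'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'
FORMAT AS CSV
IGNOREHEADER 1
GZIP;
```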

Confidential, San Francisco, CA

Senior Talend Consultant

Responsibilities:

  • Participated in all phases of the development life cycle, with extensive involvement in definition and design meetings and functional and technical walkthroughs
  • Led design and development of Talend jobs in Talend Open Studio 6.4.2 for Talend integration with Redshift
  • Designed and oversaw development of a data warehouse in Amazon Redshift; developed a new test environment replicating production for troubleshooting complex production scenarios
  • Designed Talend jobs with implicit, local, and global context variables to decouple sensitive data and eliminate hardcoding
  • Leveraged the Talend Administration Center (TAC) for job scheduling and user management; handled Talend installation, configuration, troubleshooting, and maintenance
  • Proposed and implemented Talend code migration between environments via the Nexus repository, including deployment
  • Peer-reviewed development and deployment of physical objects, including custom tables, custom views, stored procedures, and indexes, to SQL Server for the staging and data mart environments
  • Designed ETL jobs/packages using Talend Open Studio (TOS); troubleshot issues through Talend log files
  • Configured Talend's native connector for AWS Redshift to migrate, load, and extract data to and from Amazon Redshift, as sketched after this list
  • Ingested data from S3 into the AWS Redshift database using the COPY command via Talend
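
The extract side of the Redshift work above uses UNLOAD, COPY's counterpart. A minimal sketch, again with a hypothetical query, bucket path, and IAM role:

```sql
-- Export query results from Redshift back to S3; all names are hypothetical.
UNLOAD ('SELECT order_id, territory, net_sales FROM fact_sales')
TO 's3://example-bucket/exports/fact_sales_'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-unload'
FORMAT AS CSV
HEADER
ALLOWOVERWRITE;
```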

Confidential, Southlake, TX

Senior Talend Consultant

Responsibilities:

  • Conducted design meetings and attended functional and technical walkthroughs
  • Configured Talend jobs with big data components in Talend Open Studio 6.2.1
  • Actively used big data appliances (Cloudera 5.2) with Talend ETL; migrated Talend code between environments via the Artifactory repository, with Jenkins for deployment
  • Designed Talend jobs using Sqoop, file list, and Hive Hadoop-ecosystem components, including the big data components tImpalaInput, tImpalaRow, tHDFSConnection, tHiveConnection, tHiveLoad, and tVerticaOutput, moving data into and out of the HDFS file system (see the HiveQL sketch after this list)
  • Implemented data loads from the Talend server to the HDFS file system
  • Designed Talend jobs following ETL methodologies and best practices; deployed physical objects, including custom tables, custom views, stored procedures, and indexes, to SQL Server for the staging and data mart environments
  • Actively troubleshot Talend jobs by reviewing component log files for HDFS file loads; designed and developed implicit, local, and global context variables; leveraged Talend components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, and tDie
  • Leveraged Talend's native connector for AWS Redshift to migrate, load, and extract data to and from Amazon Redshift
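
The Hive work above lands files in HDFS and exposes them as tables. A minimal HiveQL sketch of that pattern, with hypothetical table and path names:

```sql
-- HiveQL: expose HDFS files as an external table; all names are hypothetical.
CREATE EXTERNAL TABLE IF NOT EXISTS staging.sales_raw (
  order_id   BIGINT,
  territory  STRING,
  net_sales  DECIMAL(18,2)
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/staging/sales_raw';

-- Roughly what a tHiveLoad-style load issues for a file already on HDFS.
LOAD DATA INPATH '/landing/sales/sales_extract.csv' INTO TABLE staging.sales_raw;
```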

Confidential, Nashville, TN

BI/Talend Developer

Responsibilities:

  • Developed ETL jobs in Talend for loading data into Amazon Redshift using Talend Redshift components for initial and incremental load
  • Designed and created a data warehouse in Amazon Redshift for ad hoc reporting
  • Designed, created, and deployed physical objects, including custom tables, custom views, stored procedures, and indexes, to SQL Server for the staging and data mart environments
  • Designed and implemented ETL for data loads from heterogeneous sources to SQL Server and Oracle target databases, covering fact tables and Type 1 and Type 2 slowly changing dimensions (SCDs); a Type 2 sketch follows this list
  • Designed ETL jobs/packages using Talend Open Studio (TOS)
  • Configured Talend components such as tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, and tHashOutput
  • Designed Talend jobs using components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, and tDie, alongside big data appliances (Cloudera 5.2)
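
The Type 2 dimension loads above follow the standard expire-and-insert pattern. A minimal SQL sketch, assuming a hypothetical dim_customer dimension, a stg_customer staged feed, and a single tracked attribute:

```sql
-- SCD Type 2, expire-and-insert; tables and columns are hypothetical.
-- Step 1: close the current row for customers whose attribute changed.
UPDATE dim_customer
SET end_date = CURRENT_TIMESTAMP, is_current = 0
FROM stg_customer s
WHERE dim_customer.customer_id = s.customer_id
  AND dim_customer.is_current = 1
  AND dim_customer.segment <> s.segment;

-- Step 2: insert a fresh current row for changed and brand-new customers.
INSERT INTO dim_customer (customer_id, segment, start_date, end_date, is_current)
SELECT s.customer_id, s.segment, CURRENT_TIMESTAMP, NULL, 1
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current = 1
WHERE d.customer_id IS NULL;
```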

Confidential, Brentwood, TN

Senior Talend Consultant

Responsibilities:

  • Designed Talend jobs by following ETL methodologies and best practices
  • Implemented variables and parameters in mappings per Talend best practices, covering context file design, error handling, and file archival
  • Developed Talend jobs to copy files from one server to another using Talend FTP components
  • Designed source-to-target mapping documents for all fact and dimension tables (a star schema sketch follows this list); leveraged ETL methodologies and best practices to create Talend ETL jobs
  • Designed ETL jobs/packages using Talend Open Studio (TOS)
  • Designed and developed complex ETL jobs for data exchange to and from the database server and various other systems, including RDBMS, XML, CSV, and flat file structures
  • Designed implicit, local, and global context variables in jobs, following best practices from AWS and Talend
  • Implemented complex Talend components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, and tDie to build a solid ingestion framework
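
The fact and dimension mappings above target a conventional star schema. A minimal DDL sketch; every table and column name here is a hypothetical stand-in:

```sql
-- Star schema skeleton behind the mapping documents; names are hypothetical.
CREATE TABLE dim_product (
  product_key  INT PRIMARY KEY,       -- surrogate key assigned by the ETL
  product_id   VARCHAR(20) NOT NULL,  -- natural/business key from the source
  product_name VARCHAR(100)
);

CREATE TABLE dim_date (
  date_key     INT PRIMARY KEY,       -- e.g. 20160301
  calendar_dt  DATE NOT NULL
);

CREATE TABLE fact_sales (
  product_key  INT NOT NULL REFERENCES dim_product (product_key),
  date_key     INT NOT NULL REFERENCES dim_date (date_key),
  units_sold   INT,
  net_sales    DECIMAL(18,2)
);
```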

Confidential, Carmel, IN

ETL Developer

Responsibilities:

  • Defined downstream targets for MDM, designing attributes and sync methodologies
  • Followed agile best practices for enhanced productivity and produced high-quality software
  • Designed and implemented a metadata management framework to capture, manage, and publish rich metadata
  • Designed, developed, and tested Informatica mappings; gathered big data requirements and performed design analysis
  • Oversaw loading of data from disparate non-Hadoop sources, both structured and unstructured; designed logical and physical data models
  • Administered and provided integration support to DataStage teams for downstream integration
  • Configured the IBM Initiate broker for data sync with sources and for message processing based on source priority
  • Developed high-level designs using design patterns, object-oriented programming, and OOAD concepts
  • Customized code design for integration with the downstream ODS and data warehouse for BI needs
  • Devised a new composite services design for lookup and update services in MDM Server and troubleshot MDM responses

Confidential, New York City, NY

ETL Developer

Responsibilities:

  • Led DW/BI best-practice sessions and suggested design changes to resolve processing/querying bottlenecks; developed Informatica ETL mappings, exception handling, compliance tracking, and data reconciliation
  • Architected an MDM solution for customer information across various source systems; delivered an MDM data stewardship and data governance program
  • Evaluated and studied metadata about the master data for MDM and the transactional data
  • Regulated Talend job creation to load data from various sources into the Oracle database; involved in requirements analysis and use case development using Rational tools, OOAD, and RUP
  • Analyzed ETL test data and tested all ETL mapping rules against the functionality of the Informatica mappings
  • Architected physical and logical data models for the source system using Erwin 7.3; tailored the product for each client with customizations to the data warehouse, ETL mappings, OBIEE Answers, dashboards, and BI Publisher
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN
  • Performed bulk data loads from multiple data sources and flat files into the Teradata RDBMS using BTEQ, MultiLoad, FastLoad, and UNIX shell scripting (a BTEQ sketch follows this list)
  • Verified Informatica mappings, workflows, and Teradata scripts; provided data quality support during migration of the fact and dimension tables from Oracle to the Netezza platform
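
The Teradata loads above are typically driven by small BTEQ batch scripts. A minimal sketch; the logon, file path, staging table, and columns are hypothetical placeholders:

```sql
-- BTEQ: bulk-insert a delimited flat file; all names are hypothetical.
.LOGON tdprod/etl_user,etl_password;

.IMPORT VARTEXT ',' FILE = /data/in/sales_extract.csv;
.QUIET ON
.REPEAT *
USING (order_id VARCHAR(18), territory VARCHAR(10), net_sales VARCHAR(20))
INSERT INTO stg_sales_daily (order_id, territory, net_sales)
VALUES (:order_id, :territory, :net_sales);

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
```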

Confidential, Hartford, CT

Data Analyst

Responsibilities:

  • Designed database objects, including tables, views, procedures, functions, triggers, and packages, using Oracle tools such as SQL*Plus and PL/SQL
  • Designed, developed, and maintained the database in Oracle
  • Developed various complex stored procedures, functions, packages, and triggers in PL/SQL; wrote SQL*Loader scripts to load data into the Oracle database and used SQL scripts extensively to validate data integrity
  • Used database triggers extensively to add default functionality, such as database and non-database methods of running totals and sequence generation (a trigger sketch follows this list)
  • Developed complex stored procedures, functions, packages, and triggers in PL/SQL as well as shell scripts; developed Perl scripts for creating batch executables
  • Designed and developed SQL statements, views, cursors, and schemas; performed SQL query tuning
  • Monitored databases and optimized database performance and resource use, including selecting optimal physical implementations of databases
  • Maintained high availability and integrity of databases, including referential integrity checking, multiple access schemes, and database capacity planning
  • Designed and developed PL/SQL procedures per requirements; created functional designs and data mappings between the data mart and the source systems
  • Developed use case, activity, sequence, and ER diagrams in MS Project
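
A common shape for the sequence-generation triggers above is a BEFORE INSERT trigger that fills in a surrogate key. A minimal PL/SQL sketch with a hypothetical table and sequence:

```sql
-- PL/SQL: populate a surrogate key on insert; names are hypothetical.
CREATE SEQUENCE customer_seq START WITH 1 INCREMENT BY 1;

CREATE OR REPLACE TRIGGER trg_customer_bi
BEFORE INSERT ON customers
FOR EACH ROW
WHEN (NEW.customer_id IS NULL)
BEGIN
  SELECT customer_seq.NEXTVAL INTO :NEW.customer_id FROM dual;
END;
/
```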

Confidential

Programmer Analyst (Java and PL/SQL)

Responsibilities:

  • Worked with business users to conduct thorough requirements analysis
  • Designed and scheduled sessions to run on demand, on time, or only once using Informatica Server Manager
  • Designed a Java user interface for the Account entity where users log in and browse account details
  • Designed and developed database objects such as tables, views, procedures, functions, triggers, and packages using Oracle tools such as SQL*Plus and PL/SQL
  • Developed complex stored procedures, functions, packages, and triggers in PL/SQL
  • Designed various SQL*Loader scripts to load data into the Oracle database and used SQL scripts extensively to validate data integrity (a validation sketch follows this list)
  • Implemented database triggers to add default functionality, such as database and non-database methods of running totals and sequence generation
  • Designed and developed complex stored procedures, functions, packages, and triggers in PL/SQL as well as shell scripts
  • Developed Perl scripts for creating batch executables; performed performance tuning and worked with views, cursors, and schemas
  • Monitored databases and optimized database performance and resource use, including selecting optimal physical implementations of databases
  • Maintained high availability and integrity of databases, including referential integrity checking, multiple access schemes, and database capacity planning
  • Performed back-end testing using SQL queries and analyzed server performance on UNIX
  • Audited and confirmed that all artifacts complied with corporate SDLC policies and guidelines
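
The post-load validation scripts above usually reconcile row counts and look for orphaned keys. A minimal sketch against hypothetical staging and target tables:

```sql
-- Post-load integrity checks; table names are hypothetical.
-- 1) Row-count reconciliation between the staged feed and the target.
SELECT (SELECT COUNT(*) FROM stg_accounts) AS staged_rows,
       (SELECT COUNT(*) FROM accounts)     AS loaded_rows
FROM dual;

-- 2) Orphan check: accounts whose owning customer is missing.
SELECT a.account_id
FROM accounts a
LEFT JOIN customers c ON c.customer_id = a.customer_id
WHERE c.customer_id IS NULL;
```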
