
Senior Talend Developer Resume


Atlanta, GA

SUMMARY

  • 6+ years of IT experience in the analysis, design, development, testing and implementation of business application systems.
  • Highly skilled ETL Engineer with 6+ years of software development in tools like Informatica, SSIS and Talend, with 2+ years of experience on Talend Enterprise Edition for Big Data/Data Integration/Data Quality.
  • Strong experience in the analysis, design, development, testing and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI and client/server applications.
  • Strong experience in Dimensional Modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER Studio.
  • Experience in Big Data technologies like Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, DynamoDB, Elasticsearch and Spark SQL.
  • Experienced in ETL methodology for performing Data Migration, Data Profiling, Extraction, Transformation and Loading using Talend; designed data conversions from a large variety of source systems including Oracle 10g/9i/8i/7.x, DB2, SQL Server, Hive and non-relational sources such as flat files, XML and Mainframe files.
  • Involved in code migrations from Dev. to QA and production and providing operational instructions for deployments.
  • Worked hands-on on the migration of DataStage 8.7 ETL processes to Talend Studio.
  • Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc.
  • Strong Data Warehousing ETL experience of using Informatica Power Center Client tools - Mapping Designer, Repository manager, Workflow Manager/Monitor and Server tools - Informatica Server, Repository Server manager.
  • Expertise in Data Warehouse/Data Mart, ODS, OLTP and OLAP implementations, spanning project scope, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation and production support.

TECHNICAL SKILLS

  • UNIX, Win XP/NT 4.0, Sun Solaris 2.6/2.7, HP-UX 10.20/9.0.
  • Talend (TOS, TIS), Informatica Power Center 9.x/8.x/7.x/6.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Informatica Server), SSIS, Ab Initio, Informatica Power Mart 9/8.x (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager), Metadata
  • Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Physical and Logical Data Modeling, ERwin, ER Studio and Oracle Data Modeler
  • Oracle 12c/11g/10g/9i/8i, MS SQL Server 2012/2008/2005, DB2 UDB 8.5/v8.1, HBase, Teradata 12/13/13.10/14, SQL*Plus, SQL*Loader, TOAD, SQL Assistant
  • Amazon Web Services Cloud, S3, EMR, EC2.
  • Data Modeling - Logical/Physical, Dimensional Modeling - Star/Snowflake
  • SQL, PL/SQL, UNIX, Shell scripts, C++, Web Services, Java Script, HTML, Perl, Teradata Procedure
  • Informatica Workflow Manager, Autosys, Control-M
  • QTP, WinRunner, LoadRunner, Quality Center, TestDirector, Clear Test, ClearCase

PROFESSIONAL EXPERIENCE

Confidential, Atlanta, GA

Senior Talend Developer

Responsibilities:

  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSqlInput and many more.
  • Designed and customized data models for a Data Warehouse supporting data from multiple sources in real time.
  • Designed ETL process using Talend Tool to load from Sources to Targets through data Transformations.
  • Developed advanced Oracle stored procedures and handled SQL performance tuning.
  • Involved in creating the mapping documents with the transformation logic for implementing few enhancements to the existing system.
  • Monitored and supported the Talend jobs scheduled through Talend Admin Center (TAC).
  • Developed Talend mappings using various transformations, sessions and workflows; Teradata was the target database, while the sources were a combination of flat files, Oracle tables, Excel files and a Teradata database.
  • Loaded data into Teradata target tables using Teradata utilities (FastLoad, MultiLoad and FastExport) and queried the target database using Teradata SQL and BTEQ for validation.
  • Involved in a huge Data Migration from 60+ MySQL Tables to JSON format using Talend.
  • Used Talend to Extract, Transform and Load data into Data Warehouse from various sources like Oracle and flat files.
  • Created mapping documents to outline data flow from sources to targets.
  • Prepared Talend job-level LLD documents and worked with the modeling team to understand the Big Data Hive table structures and physical design.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
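The MySQL-to-JSON migration mentioned above boils down to serializing each source row as one JSON document. A minimal Python sketch of that transformation, outside Talend (the table columns and sample rows are hypothetical):

```python
import json

def rows_to_json(columns, rows):
    """Serialize database rows (tuples) into JSON documents, one per row --
    mirroring a tMysqlInput -> JSON output flow."""
    return [json.dumps(dict(zip(columns, row)), sort_keys=True) for row in rows]

# Hypothetical customer rows, standing in for a MySQL result set.
columns = ["customer_id", "name", "city"]
rows = [(1, "Acme Corp", "Atlanta"), (2, "Globex", "Marietta")]

for doc in rows_to_json(columns, rows):
    print(doc)
```

In a real migration the rows would come from a MySQL cursor and the documents would be written to files, but the column-to-key mapping is the core of the job.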

Environment: Talend, TOS, TIS, Hive, Pig, Hadoop 2.2, Sqoop, PL/SQL, Oracle 12c/11g, Erwin, Autosys, SQL Server 2012, Teradata, Sybase, SSIS, UNIX, Profiles, Role hierarchy, Workflow & Approval processes, Data Loader, Reports, Custom Objects, Custom Tabs, Data Management, Record types.

Confidential, Los Angeles, CA

ETL Developer

Responsibilities:

  • Designed, developed, tested, implemented and supported data warehousing ETL using Informatica.
  • Used JIRA to create, implement and deploy ETL related stories.
  • Participated in daily scrums and bi-weekly iteration planning as part of the agile environment.
  • Research, analyze and prepare logical and physical data models for new applications and optimize the data structures to enhance data load times and end-user data access response times.
  • Used SQL tools like TOAD to run SQL queries and validated the data.
  • Created Java programs to read data from web services and load it into Teradata.
  • Developed stored procedures/views in Teradata and used in Informatica for loading scoring tables.
  • Performed ETL using different sources like MYSQL tables, CSV, fixed length files and loaded into Teradata, HDFS and hive targets.
  • Managed and scheduled ETL jobs using Informatica Workflow Manager in development and production environments.
  • Prepared high-level design documents, detailed design documents, system requirement documents, technical specifications, table-level specs and test plan documents.
  • Extracted data from legacy systems to a staging area, then cleansed, homogenized, processed and loaded it into the data warehouse.
  • Developed MERGE scripts to UPSERT data into Teradata for an ETL source.
  • Worked on writing Pig/Hive/HCatalog scripts to process huge data files such as web clicks data.
  • Used GIT as version control for the code and implemented branching for different environments.
  • Provided 24x7 production support for the ETL processes.
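The Teradata MERGE-based UPSERT above follows update-if-matched / insert-if-not-matched semantics. A small Python sketch of that logic (the key and column names are illustrative):

```python
def upsert(target, incoming, key):
    """Apply MERGE-style UPSERT semantics to a list of row dicts: rows whose
    key matches an existing target row are updated, the rest are inserted."""
    index = {row[key]: i for i, row in enumerate(target)}
    for row in incoming:
        if row[key] in index:
            target[index[row[key]]].update(row)   # WHEN MATCHED THEN UPDATE
        else:
            target.append(dict(row))              # WHEN NOT MATCHED THEN INSERT
    return target

target = [{"id": 1, "score": 10}]
staged = [{"id": 1, "score": 15}, {"id": 2, "score": 7}]
print(upsert(target, staged, "id"))
# Two rows: id 1 updated to score 15, id 2 newly inserted.
```

In Teradata the same decision is made per row by the MERGE statement's WHEN MATCHED / WHEN NOT MATCHED clauses, with the staged rows coming from the ETL load table.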

Environment: Informatica 8.6/IDQ, JIRA, Java, Maven, GIT, FLUX, Teradata, MySQL, Putty, XML, JUNIT, Hadoop, Apache Pig, Hive, Web Services, OBIEE, Microsoft Office, Oracle 10g/11g

Confidential

Informatica Developer

Responsibilities:

  • Worked on Informatica tools like Power Center Designer, Workflow Manager and Workflow Monitor.
  • Performed data manipulations using various Informatica transformations like Joiner, Expression, Lookup, Sorter, Aggregator, Filter, Update Strategy, Normalizer and Sequence Generator.
  • Developed Informatica Type-1 and Type-2 (SCD) mappings based on the requirements.
  • Involved in Mapping, Session and database level optimization techniques to improve the performance.
  • Involved in creating complex workflows utilizing various tasks like Event Wait, Command, Email, Decision and Session.
  • Extracted data from Flat files and Oracle to load them into Teradata.
  • Worked on WLM Scheduler to automate the Workflows.
  • Prepared SDLC Work book and conducted walk through before moving to SIT, UAT and Production.
  • Involved in the performance tuning of the database and Informatica.
  • Debugged invalid mappings using breakpoints; tested stored procedures, functions, Informatica sessions, batches and the target data.
  • Wrote stored procedures in PL/SQL and UNIX shell scripts for automated execution of jobs.
  • Identified performance bottlenecks and resolved those issues by Query optimization, dropping Indexes and Constraints, bulk loading.
  • Created reusable transformations and Mapplets to use in multiple mappings.
  • Created complex/ad-hoc reports using Business Objects reporter and exported to the repository.
  • Defects were tracked, reviewed and analyzed.
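A Type-2 mapping of the kind listed above expires the current dimension row and inserts a new version when a tracked attribute changes. A minimal Python sketch of that versioning logic (the dimension layout and the tracked `city` column are hypothetical):

```python
from datetime import date

def scd_type2(dimension, natural_key, new_row, today):
    """Slowly Changing Dimension Type-2: close out the current version of a
    changed row (set its end_date) and append a new open-ended version."""
    for row in dimension:
        if row[natural_key] == new_row[natural_key] and row["end_date"] is None:
            if row["city"] == new_row["city"]:
                return dimension             # no attribute change: nothing to do
            row["end_date"] = today          # expire the current version
    # Insert the new version, valid from today with no end date.
    dimension.append({**new_row, "start_date": today, "end_date": None})
    return dimension

dim = [{"cust_id": 7, "city": "Dallas",
        "start_date": date(2012, 1, 1), "end_date": None}]
scd_type2(dim, "cust_id", {"cust_id": 7, "city": "Austin"}, date(2013, 6, 1))
```

A Type-1 mapping would instead overwrite `city` in place, keeping no history; Type-2 preserves the full change history at the cost of extra rows.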

Environment: Informatica Power Center 9.5, Teradata V12, Oracle 11g, WLM, UNIX, Windows XP.

Confidential

Informatica Developer

Responsibilities:

  • Worked on Informatica - Repository Manager, Designer, Workflow Manager & Workflow Monitor.
  • Integrated data into CDW by sourcing it from different sources like SQL, Flat Files and Mainframes (DB2) using Power Exchange.
  • Extensively worked on integrating data from Mainframes to Informatica Power Exchange.
  • Extensively worked on Informatica tools such as Source Analyzer, Data Warehouse Designer, Transformation Designer, Mapplet Designer and Mapping Designer to design, develop and test complex mappings and Mapplets to load data from external flat files and RDBMS.
  • Post-processed the output XML files to remove empty delta files and FTPed them to a different server.
  • Worked with the Business Analyst team during the functional design and technical design phases. Designed the mappings between sources (external files and databases) to operational staging targets.
  • Extensively used various transformations like Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookups, Filter, Router, Expression, Rank, Union, Normalizer, XML Transformations, Update Strategy and Sequence Generator.
  • Used the XML transformation to load data from XML files. Worked with Informatica schedulers to schedule the workflows.
  • Extensively worked with target XSDs to generate the output XML files. Created mappings to read parameterized data from tables to create parameter files. Experienced in coordinating with the offshore team.
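Reading records out of an XML source, as the XML transformation above does, amounts to flattening repeating elements into rows. A sketch using Python's standard library (the `<order>` element names are hypothetical, not taken from the project):

```python
import xml.etree.ElementTree as ET

SAMPLE = """<orders>
  <order id="1001"><amount>250.00</amount></order>
  <order id="1002"><amount>75.50</amount></order>
</orders>"""

def parse_orders(xml_text):
    """Flatten each repeating <order> element into a row dict, the way an
    XML source definition maps elements and attributes to output ports."""
    root = ET.fromstring(xml_text)
    return [{"id": o.get("id"), "amount": float(o.findtext("amount"))}
            for o in root.findall("order")]

print(parse_orders(SAMPLE))
```

Generating output XML against a target XSD is the reverse mapping: rows back to elements, validated against the schema.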

Environment: Informatica Power Center, Power Exchange, Windows, IBM DB2, Mainframes, SQL Server, Erwin.
